Mirror of https://github.com/pacnpal/thrillwiki_django_no_react.git (synced 2026-02-05 06:45:18 -05:00)

Compare commits — 50 commits, main-legac...8ff6b7ee23
.agent/MEMORY/migration_source.md — new file, 73 lines
@@ -0,0 +1,73 @@
# Migration Source: thrillwiki-87

The React project at `/Volumes/macminissd/Projects/thrillwiki-87` is the **authoritative source** for ThrillWiki features and functionality.

## Core Principle

**thrillwiki-87 is LAW.** When migrating features, the React implementation defines:
- What features must exist
- How they should behave
- What data structures are required
- What UI patterns to follow

## Source Project Structure

```
thrillwiki-87/
├── src/
│   ├── components/      # React components (44 directories)
│   ├── pages/           # Route pages (39 files)
│   ├── hooks/           # React hooks (80+ files)
│   ├── types/           # TypeScript definitions (63 files)
│   ├── contexts/        # React contexts
│   ├── lib/             # Utilities
│   └── integrations/    # External service integrations
├── docs/                # Feature documentation (78 files)
│   ├── SITE_OVERVIEW.md
│   ├── DESIGN_SYSTEM.md
│   ├── COMPONENTS.md
│   ├── PAGES.md
│   └── USER_FLOWS.md
└── supabase/            # Backend schemas and functions
```

## Technology Translation

| React (Source) | Nuxt 4 (Target) |
|----------------|-----------------|
| React component | Vue SFC (.vue) |
| useState | ref() / reactive() |
| useEffect | watch() / onMounted() |
| useContext | provide() / inject() or Pinia |
| React Router | Nuxt file-based routing |
| React Query | useAsyncData / useFetch |
| shadcn-ui | Nuxt UI |
| Supabase client | Django REST API via useApi() |
| Edge Functions | Django views |

## Backend Translation

| Supabase (Source) | Django (Target) |
|-------------------|-----------------|
| Table | Django Model |
| RLS policies | DRF permissions |
| Edge Functions | Django views/viewsets |
| Realtime | SSE / WebSockets |
| Auth | django-allauth + JWT |
| Storage | Cloudflare R2 |
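The Realtime → SSE row is the one mapping without a worked example elsewhere in these docs. A minimal sketch of the Nuxt side, assuming a hypothetical `/api/v1/moderation/stream/` SSE endpoint (not part of the current API conventions):

```typescript
// Sketch: subscribe to a Django SSE stream instead of a Supabase Realtime channel.
// The endpoint path and payload shape are illustrative assumptions.
export function useLiveUpdates(onEvent: (data: unknown) => void) {
  let source: EventSource | null = null

  function connect() {
    source = new EventSource('/api/v1/moderation/stream/') // hypothetical endpoint
    source.onmessage = (event) => onEvent(JSON.parse(event.data))
  }

  function disconnect() {
    source?.close()
    source = null
  }

  return { connect, disconnect }
}
```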
## Migration Workflow

1. **Find source** in thrillwiki-87
2. **Read the docs** in thrillwiki-87/docs/
3. **Check existing** Nuxt implementation
4. **Port missing features** to achieve parity
5. **Verify behavior** matches source

## Key Source Files to Reference

When porting a feature, always check:
- `thrillwiki-87/docs/` for specifications
- `thrillwiki-87/src/types/` for data structures
- `thrillwiki-87/src/hooks/` for business logic
- `thrillwiki-87/src/components/` for UI patterns
.agent/MEMORY/source_mapping.md — new file, 83 lines
@@ -0,0 +1,83 @@
# Source Mapping: React → Nuxt

Quick reference for mapping thrillwiki-87 paths to thrillwiki_django_no_react paths.

## Directory Mappings

| React (thrillwiki-87) | Nuxt (thrillwiki_django_no_react) |
|----------------------|-----------------------------------|
| `src/components/` | `frontend/app/components/` |
| `src/pages/` | `frontend/app/pages/` |
| `src/hooks/` | `frontend/app/composables/` |
| `src/types/` | `frontend/app/types/` |
| `src/lib/` | `frontend/app/utils/` |
| `src/contexts/` | `frontend/app/stores/` (Pinia) |
| `docs/` | `source_docs/` |
| `supabase/migrations/` | `backend/apps/*/models.py` |

## Component Mappings (shadcn-ui → Nuxt UI)

| shadcn-ui | Nuxt UI |
|-----------|---------|
| `<Button>` | `<UButton>` |
| `<Card>` | `<UCard>` |
| `<Dialog>` | `<UModal>` |
| `<Input>` | `<UInput>` |
| `<Select>` | `<USelect>` / `<USelectMenu>` |
| `<Tabs>` | `<UTabs>` |
| `<Table>` | `<UTable>` |
| `<Badge>` | `<UBadge>` |
| `<Avatar>` | `<UAvatar>` |
| `<Tooltip>` | `<UTooltip>` |
| `<Sheet>` | `<USlideover>` |
| `<AlertDialog>` | `<UModal>` + confirm pattern |
| `<Skeleton>` | `<USkeleton>` |
| `<Textarea>` | `<UTextarea>` |
| `<Checkbox>` | `<UCheckbox>` |
| `<RadioGroup>` | `<URadioGroup>` |
| `<Switch>` | `<UToggle>` |
| `<DropdownMenu>` | `<UDropdown>` |
| `<Command>` | `<UCommandPalette>` |
| `<Popover>` | `<UPopover>` |

## Page Mappings

| React Page | Nuxt Page |
|------------|-----------|
| `Index.tsx` | `pages/index.vue` |
| `Parks.tsx` | `pages/parks/index.vue` |
| `ParkDetail.tsx` | `pages/parks/[park_slug]/index.vue` |
| `Rides.tsx` | `pages/rides/index.vue` |
| `RideDetail.tsx` | `pages/parks/[park_slug]/rides/[ride_slug].vue` |
| `Manufacturers.tsx` | `pages/manufacturers/index.vue` |
| `ManufacturerDetail.tsx` | `pages/manufacturers/[slug].vue` |
| `Designers.tsx` | `pages/designers/index.vue` |
| `DesignerDetail.tsx` | `pages/designers/[slug].vue` |
| `Operators.tsx` | `pages/operators/index.vue` |
| `OperatorDetail.tsx` | `pages/operators/[slug].vue` |
| `ParkOwners.tsx` | `pages/owners/index.vue` |
| `PropertyOwnerDetail.tsx` | `pages/owners/[slug].vue` |
| `Auth.tsx` | `pages/auth/login.vue`, `pages/auth/signup.vue` |
| `Profile.tsx` | `pages/profile/index.vue` |
| `Search.tsx` | `pages/search.vue` |
| `AdminDashboard.tsx` | `pages/admin/index.vue` |
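For the nested detail pages above, the bracketed segments surface as route params. A minimal sketch (the nested ride-detail endpoint is an assumption; the API conventions only define the `/parks/{slug}/rides/` list route):

```typescript
// pages/parks/[park_slug]/rides/[ride_slug].vue — <script setup lang="ts"> body (sketch)
const route = useRoute()
const parkSlug = route.params.park_slug as string
const rideSlug = route.params.ride_slug as string

// Assumed nested detail endpoint; adjust once the real API is defined
const { data: ride } = await useAsyncData(
  `ride-${parkSlug}-${rideSlug}`,
  () => $fetch(`/api/v1/parks/${parkSlug}/rides/${rideSlug}/`)
)
```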
## Hook → Composable Mappings

| React Hook | Vue Composable |
|------------|----------------|
| `useAuth.tsx` | `useAuth.ts` |
| `useSearch.tsx` | `useSearchHistory.ts` |
| `useModerationQueue.ts` | `useModeration.ts` |
| `useProfile.tsx` | (inline in pages) |
| `useLocations.ts` | `useParksApi.ts` |
| `useUnitPreferences.ts` | `useUnits.ts` |

## API Endpoint Translation

| Supabase RPC/Query | Django API |
|--------------------|------------|
| `supabase.from('parks')` | `GET /api/v1/parks/` |
| `supabase.rpc('search_*')` | `GET /api/v1/search/` |
| `supabase.auth.*` | `/api/v1/auth/*` |
| Edge Functions | Django views in `backend/apps/*/views.py` |
.agent/rules/api-conventions.md — new file, 329 lines
@@ -0,0 +1,329 @@
# ThrillWiki API Conventions

Standards for designing and implementing APIs between the Django backend and Nuxt frontend.

## API Base Structure

### URL Patterns

```
/api/v1/                        # API root (versioned)
/api/v1/parks/                  # List/Create parks
/api/v1/parks/{slug}/           # Retrieve/Update/Delete park
/api/v1/parks/{slug}/rides/     # Nested resource
/api/v1/auth/                   # Authentication endpoints
/api/v1/users/me/               # Current user
```

### HTTP Methods

| Method | Usage |
|--------|-------|
| GET | Retrieve resource(s) |
| POST | Create resource |
| PUT | Replace resource entirely |
| PATCH | Update resource partially |
| DELETE | Remove resource |

## Response Formats

### Success Response (Single Resource)

```json
{
  "id": "uuid",
  "name": "Cedar Point",
  "slug": "cedar-point",
  "created_at": "2024-01-15T10:30:00Z",
  "updated_at": "2024-01-16T14:20:00Z",
  ...
}
```

### Success Response (List)

```json
{
  "count": 150,
  "next": "/api/v1/parks/?page=2",
  "previous": null,
  "results": [
    { ... },
    { ... }
  ]
}
```

### Error Response

```json
{
  "error": {
    "code": "validation_error",
    "message": "Invalid input data",
    "details": {
      "name": ["This field is required."],
      "status": ["Invalid choice."]
    }
  }
}
```

### HTTP Status Codes

| Code | Meaning | When to Use |
|------|---------|-------------|
| 200 | OK | Successful GET, PUT, PATCH |
| 201 | Created | Successful POST |
| 204 | No Content | Successful DELETE |
| 400 | Bad Request | Validation errors |
| 401 | Unauthorized | Missing/invalid auth |
| 403 | Forbidden | Insufficient permissions |
| 404 | Not Found | Resource doesn't exist |
| 409 | Conflict | Duplicate resource |
| 500 | Server Error | Unexpected errors |
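A small TypeScript sketch of how the frontend might type this error envelope (the interface name and file location are not part of the spec):

```typescript
// types/api.ts (sketch) — mirrors the error response format above
export interface ApiError {
  error: {
    code: string                        // e.g. "validation_error"
    message: string
    details?: Record<string, string[]>  // field name -> list of messages
  }
}

// Narrowing helper for catch blocks and onResponseError handlers
export function isApiError(payload: unknown): payload is ApiError {
  return typeof payload === 'object' && payload !== null && 'error' in payload
}
```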
## Pagination

### Query Parameters

```
?page=2          # Page number (default: 1)
?page_size=20    # Items per page (default: 20, max: 100)
```

### Response

```json
{
  "count": 150,
  "next": "/api/v1/parks/?page=3",
  "previous": "/api/v1/parks/?page=1",
  "results": [ ... ]
}
```
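A generic wrapper type matching this envelope keeps list calls consistent on the frontend; a sketch (name and location are assumptions):

```typescript
// types/api.ts (sketch)
export interface Paginated<T> {
  count: number
  next: string | null
  previous: string | null
  results: T[]
}

// Example: const page = await api<Paginated<Park>>('/parks/', { params: { page: 2 } })
```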
## Filtering & Sorting
|
||||
|
||||
### Filtering
|
||||
```
|
||||
GET /api/v1/parks/?status=operating
|
||||
GET /api/v1/parks/?country=USA&status=operating
|
||||
GET /api/v1/rides/?park=cedar-point&type=coaster
|
||||
```
|
||||
|
||||
### Search
|
||||
```
|
||||
GET /api/v1/parks/?search=cedar
|
||||
```
|
||||
|
||||
### Sorting
|
||||
```
|
||||
GET /api/v1/parks/?ordering=name # Ascending
|
||||
GET /api/v1/parks/?ordering=-created_at # Descending
|
||||
GET /api/v1/parks/?ordering=-rating,name # Multiple fields
|
||||
```
|
||||
|
||||
## Authentication
|
||||
|
||||
### Token-Based Auth
|
||||
```
|
||||
Authorization: Bearer <jwt_token>
|
||||
```
|
||||
|
||||
### Endpoints
|
||||
```
|
||||
POST /api/v1/auth/login/ # Get tokens
|
||||
POST /api/v1/auth/register/ # Create account
|
||||
POST /api/v1/auth/refresh/ # Refresh access token
|
||||
POST /api/v1/auth/logout/ # Invalidate tokens
|
||||
GET /api/v1/auth/me/ # Current user info
|
||||
```
|
||||
|
||||
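A sketch of a login call against these endpoints. The token field names (`access`/`refresh`) follow common JWT conventions and are assumptions here, as is the shape of the auth store:

```typescript
// Sketch only — adjust field names to the actual serializer output
interface LoginResponse {
  access: string
  refresh: string
}

export async function login(email: string, password: string) {
  const tokens = await $fetch<LoginResponse>('/api/v1/auth/login/', {
    method: 'POST',
    body: { email, password },
  })

  const authStore = useAuthStore()
  authStore.token = tokens.access // assumes the store exposes a token ref
  authStore.user = await $fetch('/api/v1/auth/me/', {
    headers: { Authorization: `Bearer ${tokens.access}` },
  })
}
```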
### Social Auth

```
POST /api/v1/auth/google/     # Google OAuth
POST /api/v1/auth/discord/    # Discord OAuth
```

## Content Submission API

### Submit New Content

```
POST /api/v1/submissions/
{
  "content_type": "park",
  "data": {
    "name": "New Park Name",
    "city": "Orlando",
    ...
  }
}
```

### Response

```json
{
  "id": "submission-uuid",
  "status": "pending",
  "content_type": "park",
  "data": { ... },
  "created_at": "2024-01-15T10:30:00Z"
}
```

### Submit Edit to Existing Content

```
POST /api/v1/submissions/
{
  "content_type": "park",
  "object_id": "existing-park-uuid",
  "data": {
    "description": "Updated description..."
  }
}
```

## Moderation API

### Get Moderation Queue

```
GET /api/v1/moderation/
GET /api/v1/moderation/?status=pending&type=park
```

### Review Submission

```
POST /api/v1/moderation/{id}/approve/
POST /api/v1/moderation/{id}/reject/
{
  "notes": "Reason for rejection..."
}
```

## Nested Resources

### Pattern

```
/api/v1/parks/{slug}/rides/      # List rides in park
/api/v1/parks/{slug}/reviews/    # List park reviews
/api/v1/parks/{slug}/photos/     # List park photos
```

### When to Nest vs Flat

**Nest when:**
- Resource only makes sense in context of parent (park photos)
- Need to filter by parent frequently

**Use flat with filter when:**
- Resource can exist independently
- Need to query across parents

```
# Flat with filter
GET /api/v1/rides/?park=cedar-point

# Or nested
GET /api/v1/parks/cedar-point/rides/
```

## Geolocation API

### Parks Nearby

```
GET /api/v1/parks/nearby/?lat=41.4821&lng=-82.6822&radius=50
```

### Response Includes Distance

```json
{
  "results": [
    {
      "id": "uuid",
      "name": "Cedar Point",
      "distance": 12.5,  // in user's preferred units
      ...
    }
  ]
}
```
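A sketch of feeding the browser's position into this endpoint from the frontend (permission prompts and error handling omitted):

```typescript
// Wrap the callback-based geolocation API in a Promise, then query nearby parks
function getPosition(): Promise<GeolocationPosition> {
  return new Promise((resolve, reject) =>
    navigator.geolocation.getCurrentPosition(resolve, reject)
  )
}

export async function fetchNearbyParks(radius = 50) {
  const { coords } = await getPosition()
  return $fetch('/api/v1/parks/nearby/', {
    params: { lat: coords.latitude, lng: coords.longitude, radius },
  })
}
```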
## File Uploads

### Photo Upload

```
POST /api/v1/photos/
Content-Type: multipart/form-data

{
  "content_type": "park",
  "object_id": "park-uuid",
  "image": <file>,
  "caption": "Optional caption"
}
```

### Response

```json
{
  "id": "photo-uuid",
  "url": "https://cdn.thrillwiki.com/photos/...",
  "thumbnail_url": "https://cdn.thrillwiki.com/photos/.../thumb",
  "status": "pending",  // Pending moderation
  "created_at": "2024-01-15T10:30:00Z"
}
```
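Because this is multipart, the frontend builds a `FormData` body rather than JSON; a sketch:

```typescript
// $fetch sets the multipart boundary automatically when given FormData
export async function uploadPhoto(file: File, objectId: string, caption = '') {
  const form = new FormData()
  form.append('content_type', 'park')
  form.append('object_id', objectId)
  form.append('image', file)
  if (caption) form.append('caption', caption)

  return $fetch('/api/v1/photos/', { method: 'POST', body: form })
}
```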
## Rate Limiting

### Headers

```
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 95
X-RateLimit-Reset: 1642089600
```

### When Limited

```
HTTP/1.1 429 Too Many Requests
{
  "error": {
    "code": "rate_limit_exceeded",
    "message": "Too many requests. Try again in 60 seconds."
  }
}
```
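One way the shared client could surface these responses, sketched here as an assumption about desired UX rather than spec:

```typescript
// Add a 429 branch alongside the 401 handling in the client below
const api = $fetch.create({
  onResponseError: ({ response }) => {
    if (response.status === 429) {
      const reset = response.headers.get('X-RateLimit-Reset')
      // e.g. show a toast and back off until the reset timestamp
      console.warn(`Rate limited; window resets at ${reset ?? 'unknown'}`)
    }
  },
})
```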
## Frontend Integration

### API Client Setup (Nuxt)

```typescript
// composables/useApi.ts
export function useApi() {
  const config = useRuntimeConfig()
  const authStore = useAuthStore()

  const api = $fetch.create({
    baseURL: config.public.apiBase,
    headers: authStore.token ? {
      Authorization: `Bearer ${authStore.token}`
    } : {},
    onResponseError: ({ response }) => {
      if (response.status === 401) {
        authStore.logout()
        navigateTo('/auth/login')
      }
    }
  })

  return api
}
```

### Usage in Components

```typescript
const api = useApi()

// Fetch parks
const { data: parks } = await useAsyncData('parks', () =>
  api('/parks/', { params: { status: 'operating' } })
)

// Create submission
await api('/submissions/', {
  method: 'POST',
  body: { content_type: 'park', data: formData }
})
```
.agent/rules/component-patterns.md — new file, 306 lines
@@ -0,0 +1,306 @@
# ThrillWiki Component Patterns

Guidelines for building UI components consistent with ThrillWiki's design system.

## Component Hierarchy

```
components/
├── layout/              # Page structure
│   ├── Header.vue
│   ├── Footer.vue
│   ├── Sidebar.vue
│   └── PageContainer.vue
├── ui/                  # Base components (shadcn/ui style)
│   ├── Button.vue
│   ├── Card.vue
│   ├── Input.vue
│   ├── Badge.vue
│   ├── Avatar.vue
│   ├── Modal.vue
│   └── ...
├── entity/              # Domain-specific cards
│   ├── ParkCard.vue
│   ├── RideCard.vue
│   ├── ReviewCard.vue
│   ├── CreditCard.vue
│   └── CompanyCard.vue
├── forms/               # Form components
│   ├── ParkForm.vue
│   ├── ReviewForm.vue
│   └── ...
└── specialty/           # Complex/unique components
    ├── SearchAutocomplete.vue
    ├── Map.vue
    ├── ImageGallery.vue
    ├── UnitDisplay.vue
    └── RatingDisplay.vue
```

## Base Components (ui/)

### Card

```vue
<template>
  <div :class="[
    'rounded-lg border bg-card text-card-foreground',
    interactive && 'hover:shadow-md transition-shadow cursor-pointer',
    className
  ]">
    <slot />
  </div>
</template>

<script setup lang="ts">
defineProps<{
  interactive?: boolean
  className?: string
}>()
</script>
```

### Button

```vue
<template>
  <button :class="[buttonVariants({ variant, size }), className]">
    <slot />
  </button>
</template>

<script setup lang="ts">
const props = defineProps<{
  variant?: 'default' | 'secondary' | 'outline' | 'ghost' | 'destructive'
  size?: 'default' | 'sm' | 'lg' | 'icon'
  className?: string
}>()
</script>
```

### Badge

```vue
<template>
  <span :class="[badgeVariants({ variant }), className]">
    <slot />
  </span>
</template>

<script setup lang="ts">
defineProps<{
  variant?: 'default' | 'secondary' | 'success' | 'warning' | 'destructive' | 'outline'
  className?: string
}>()
</script>
```

## Entity Cards

### ParkCard

Displays park preview with image, name, location, stats, and status.

**Required Props:**
- `park: Park` - Park object

**Displays:**
- Park image (with fallback)
- Park name
- Location (city, country)
- Ride count
- Average rating
- Status badge (Operating/Closed/Under Construction)

**Interactions:**
- Click navigates to park detail page
- Hover shows elevation shadow

```vue
<template>
  <NuxtLink :to="`/parks/${park.slug}`">
    <Card interactive>
      <div class="aspect-video relative overflow-hidden rounded-t-lg">
        <NuxtImg
          :src="park.image || '/placeholder-park.jpg'"
          :alt="park.name"
          class="object-cover w-full h-full"
        />
      </div>
      <div class="p-4">
        <h3 class="font-semibold text-lg line-clamp-1">{{ park.name }}</h3>
        <p class="text-sm text-muted-foreground flex items-center gap-1">
          <MapPin class="w-4 h-4" />
          {{ park.city }}, {{ park.country }}
        </p>
        <div class="flex items-center justify-between mt-2">
          <span class="text-sm">🎢 {{ park.rideCount }} rides</span>
          <RatingDisplay :rating="park.averageRating" size="sm" />
        </div>
        <Badge :variant="statusVariant" class="mt-2">
          {{ park.status }}
        </Badge>
      </div>
    </Card>
  </NuxtLink>
</template>
```

### RideCard

Similar structure to ParkCard, but shows:
- Ride image
- Ride name
- Park name (linked)
- Key specs (speed, height)
- Type badge + Status badge

### ReviewCard

Displays user review with:
- User avatar + username
- Rating (5-star display)
- Review date (relative)
- Review text
- Helpful votes (👍 count)
- Actions (Reply, Report)

### CreditCard

For user's ride credit list:
- Ride thumbnail
- Ride name + park name
- Ride count with +/- controls
- Last ridden date
- Edit button

## Specialty Components

### SearchAutocomplete

Global search with instant results.

**Features:**
- Debounced input (300ms)
- Results grouped by type (Parks, Rides, Companies)
- Keyboard navigation
- Click or Enter to navigate
- Empty state handling

**Implementation Notes:**
- Use `useFetch` with `watch` for reactive searching
- Show loading skeleton while fetching
- Limit results to 10 per category
- Highlight matching text
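A minimal sketch of those notes as a composable (the grouped response shape and the `/search/` query parameters are assumptions):

```typescript
// composables/useSearchAutocomplete.ts (sketch)
export function useSearchAutocomplete() {
  const query = ref('')
  const results = ref<{ parks: unknown[]; rides: unknown[]; companies: unknown[] } | null>(null)
  const loading = ref(false)

  let timer: ReturnType<typeof setTimeout> | undefined

  watch(query, (value) => {
    clearTimeout(timer)
    if (!value.trim()) {
      results.value = null
      return
    }
    // Debounce 300ms before hitting the API
    timer = setTimeout(async () => {
      loading.value = true
      try {
        results.value = await $fetch('/api/v1/search/', {
          params: { q: value, limit: 10 }, // per-category limit is an assumption
        })
      } finally {
        loading.value = false
      }
    }, 300)
  })

  return { query, results, loading }
}
```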
### UnitDisplay

Converts and displays values in user's preferred units.

```vue
<template>
  <span>{{ formattedValue }}</span>
</template>

<script setup lang="ts">
const props = defineProps<{
  value: number
  type: 'speed' | 'height' | 'length' | 'weight'
  showBoth?: boolean // Show both metric and imperial
}>()

const { preferredUnits } = useUnits()

// Stored values are assumed metric; conversion factors per measurement type
const UNITS = {
  speed:  { factor: 0.621371, metric: 'km/h', imperial: 'mph' },
  height: { factor: 3.28084,  metric: 'm',    imperial: 'ft' },
  length: { factor: 3.28084,  metric: 'm',    imperial: 'ft' },
  weight: { factor: 2.20462,  metric: 'kg',   imperial: 'lbs' }
} as const

const formattedValue = computed(() => {
  const { factor, metric, imperial } = UNITS[props.type]
  const metricText = `${props.value} ${metric}`
  const imperialText = `${Math.round(props.value * factor)} ${imperial}`
  if (props.showBoth) return `${metricText} (${imperialText})`
  return preferredUnits.value === 'imperial' ? imperialText : metricText
})
</script>
```
### RatingDisplay

Star rating visualization.

```vue
<template>
  <div class="flex items-center gap-1">
    <div class="flex">
      <Star
        v-for="i in 5"
        :key="i"
        :class="[
          'w-4 h-4',
          i <= Math.round(rating) ? 'fill-yellow-400 text-yellow-400' : 'text-gray-300'
        ]"
      />
    </div>
    <span class="text-sm font-medium">{{ rating.toFixed(1) }}</span>
    <span v-if="count" class="text-sm text-muted-foreground">({{ count }})</span>
  </div>
</template>
```

### Map (Leaflet)

Interactive map for the parks nearby feature.

**Features:**
- Marker clusters for dense areas
- Custom markers for different park types
- Popup on marker click with park preview
- Zoom controls
- Full-screen toggle
- User location marker (if permitted)
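A dependency-light sketch of initializing such a map (plain markers only; clustering needs the `leaflet.markercluster` plugin). It must run client-side, e.g. inside `onMounted` or a `<ClientOnly>` wrapper:

```typescript
import L from 'leaflet'
import 'leaflet/dist/leaflet.css'

interface ParkPin { name: string; lat: number; lng: number }

export function createParkMap(el: HTMLElement, parks: ParkPin[]) {
  // Initial center/zoom are placeholders
  const map = L.map(el).setView([41.48, -82.68], 8)

  L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
    attribution: '&copy; OpenStreetMap contributors',
  }).addTo(map)

  for (const park of parks) {
    L.marker([park.lat, park.lng]).addTo(map).bindPopup(park.name)
  }

  return map
}
```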
## Form Components

### Standard Form Structure

```vue
<template>
  <form @submit.prevent="handleSubmit">
    <div class="space-y-4">
      <FormField label="Name" :error="errors.name">
        <Input v-model="form.name" />
      </FormField>

      <FormField label="Description">
        <Textarea v-model="form.description" />
      </FormField>

      <div class="flex gap-2 justify-end">
        <Button variant="outline" @click="$emit('cancel')">Cancel</Button>
        <Button type="submit" :loading="isSubmitting">Save</Button>
      </div>
    </div>
  </form>
</template>
```

### Validation

- Use Zod for schema validation
- Display errors inline below fields
- Disable submit button while invalid
- Show loading state during submission
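A sketch of the Zod pattern for a park form (field names follow the Park model in the Django standards; the exact constraints are assumptions):

```typescript
import { z } from 'zod'

const parkSchema = z.object({
  name: z.string().min(1, 'Name is required'),
  description: z.string().optional(),
  city: z.string().min(1, 'City is required'),
  country: z.string().min(1, 'Country is required'),
})

type ParkForm = z.infer<typeof parkSchema>

// Map Zod issues to { field: message } for inline display under each field
function validate(form: ParkForm) {
  const result = parkSchema.safeParse(form)
  const errors: Record<string, string> = {}
  if (!result.success) {
    for (const issue of result.error.issues) {
      errors[String(issue.path[0])] = issue.message
    }
  }
  return { valid: result.success, errors }
}
```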
## Loading States

### Skeleton Loading

All cards should have skeleton states:

```vue
<template>
  <Card v-if="loading">
    <Skeleton class="aspect-video rounded-t-lg" />
    <div class="p-4 space-y-2">
      <Skeleton class="h-5 w-3/4" />
      <Skeleton class="h-4 w-1/2" />
      <Skeleton class="h-4 w-1/4" />
    </div>
  </Card>
  <ActualCard v-else :data="data" />
</template>
```

### Empty States

Provide clear empty states with:
- Relevant icon
- Clear message
- Suggested action (button or link)

```vue
<EmptyState
  icon="Search"
  title="No results found"
  description="Try adjusting your search or filters"
>
  <Button @click="clearFilters">Clear Filters</Button>
</EmptyState>
```
.agent/rules/design-system.md — new file, 191 lines
@@ -0,0 +1,191 @@
# ThrillWiki Design System Rules

Visual identity, colors, typography, and styling guidelines for ThrillWiki UI.

## Brand Identity

- **Name**: ThrillWiki (optional "(Beta)" suffix during preview)
- **Tagline**: The Ultimate Theme Park Database
- **Personality**: Enthusiastic, trustworthy, community-driven
- **Visual Style**: Modern, clean, subtle gradients, smooth animations, card-based layouts

## Color System

### Semantic Color Tokens (use these, not raw hex values)

| Token | Purpose | Usage |
|-------|---------|-------|
| `--background` | Page background | Main content area |
| `--foreground` | Primary text | Body text, headings |
| `--primary` | Interactive elements | Buttons, links |
| `--secondary` | Secondary UI | Secondary buttons |
| `--muted` | Subdued content | Hints, disabled states |
| `--accent` | Highlights | Focus rings, special callouts |
| `--destructive` | Danger actions | Delete buttons, errors |
| `--success` | Positive feedback | Success messages, confirmations |
| `--warning` | Caution | Alerts, warnings |
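A sketch of wiring these tokens to Tailwind utilities so classes like `bg-primary` and `text-destructive` resolve to the CSS variables (the HSL-variable convention is an assumption based on common shadcn-style setups):

```typescript
// tailwind.config.ts (sketch)
import type { Config } from 'tailwindcss'

export default <Partial<Config>>{
  theme: {
    extend: {
      colors: {
        background: 'hsl(var(--background))',
        foreground: 'hsl(var(--foreground))',
        primary: 'hsl(var(--primary))',
        secondary: 'hsl(var(--secondary))',
        muted: 'hsl(var(--muted))',
        accent: 'hsl(var(--accent))',
        destructive: 'hsl(var(--destructive))',
        success: 'hsl(var(--success))',
        warning: 'hsl(var(--warning))',
      },
    },
  },
}
```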
### Status Colors (Entity Badges)

- **Operating**: Green (`--success`)
- **Closed**: Red (`--destructive`)
- **Under Construction**: Amber (`--warning`)

### Dark Mode

- Automatically supported via CSS variables
- Reduce contrast (use off-white, not pure white)
- Replace shadows with subtle glows
- Slightly dim images

## Typography

### Font

- **Primary Font**: Inter (Sans-Serif)
- **Weights**: 400 (Regular), 500 (Medium), 600 (Semibold), 700 (Bold)
- **Fallback**: system-ui, sans-serif

### Type Scale

| Size | Name | Usage |
|------|------|-------|
| 48px | Display | Hero headlines |
| 36px | H1 | Page titles |
| 30px | H2 | Section headers |
| 24px | H3 | Card titles |
| 20px | H4 | Subheadings |
| 16px | Body | Default text |
| 14px | Small | Secondary text |
| 12px | Caption | Labels, hints |

### Text Classes (Tailwind)

```
Page Title:      text-4xl font-bold
Section Header:  text-2xl font-semibold
Card Title:      text-lg font-medium
Body:            text-base
Caption:         text-sm text-muted-foreground
```

## Spacing System

### Base Unit

- 4px base unit
- All spacing is multiples of 4

### Spacing Scale

| Token | Size | Usage |
|-------|------|-------|
| `space-1` (p-1) | 4px | Tight gaps |
| `space-2` (p-2) | 8px | Icon gaps |
| `space-3` (p-3) | 12px | Small padding |
| `space-4` (p-4) | 16px | Default padding |
| `space-6` (p-6) | 24px | Section gaps |
| `space-8` (p-8) | 32px | Large gaps |
| `space-12` (p-12) | 48px | Section margins |

## Border Radius

| Token | Size | Usage |
|-------|------|-------|
| `rounded-sm` | 4px | Small elements |
| `rounded` | 8px | Buttons, inputs |
| `rounded-lg` | 12px | Cards |
| `rounded-xl` | 16px | Large cards, modals |
| `rounded-full` | 9999px | Pills, avatars |

## Layout

### Max Content Width

- Main content: `max-w-7xl` (1280px)
- Centered with `mx-auto`
- Responsive padding: `px-4 md:px-6 lg:px-8`

### Grid System

- Desktop (1024px+): 4 columns
- Tablet (768px+): 2 columns
- Mobile (<768px): 1 column

```html
<div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6">
```

## Component Patterns

### Cards

- Background: `bg-card`
- Border: `border border-border`
- Radius: `rounded-lg`
- Padding: `p-4` or `p-6`
- Interactive cards: Add `hover:shadow-md transition-shadow cursor-pointer`

### Buttons

| Variant | Classes |
|---------|---------|
| Primary | `bg-primary text-primary-foreground hover:bg-primary/90` |
| Secondary | `bg-secondary text-secondary-foreground hover:bg-secondary/80` |
| Outline | `border border-input bg-background hover:bg-accent` |
| Ghost | `hover:bg-accent hover:text-accent-foreground` |
| Destructive | `bg-destructive text-destructive-foreground hover:bg-destructive/90` |

### Inputs

- Border: `border border-input`
- Focus: `focus:ring-2 focus:ring-primary focus:border-transparent`
- Error: `border-destructive focus:ring-destructive`

### Badges

```html
<span class="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-green-100 text-green-800">
  Operating
</span>
```

## Icons

- **Library**: Lucide icons (via `lucide-vue-next`)
- **Default Size**: 24px (w-6 h-6)
- **Stroke Width**: 1.5px
- **Color**: Inherit from text color

Common icons:
- Search, Menu, User, Heart, Star, MapPin, Calendar, Camera, Edit, Trash, Check, X, ChevronRight, ExternalLink

## Responsive Breakpoints

| Breakpoint | Width | Target |
|------------|-------|--------|
| `sm` | 640px | Large phones |
| `md` | 768px | Tablets |
| `lg` | 1024px | Small laptops |
| `xl` | 1280px | Desktops |
| `2xl` | 1536px | Large screens |

## Accessibility Requirements

- Color contrast: 4.5:1 minimum for normal text
- Focus states: Visible focus ring on all interactive elements
- Motion: Respect `prefers-reduced-motion`
- Screen readers: Proper ARIA labels on interactive elements
- Keyboard: All functionality accessible via keyboard

## ThrillWiki-Specific Patterns

### Unit Display

Always provide unit conversion toggle (metric/imperial):

```vue
<UnitDisplay :value="121" type="speed" />
<!-- Renders: "121 km/h" or "75 mph" based on user preference -->
```

### Rating Display

```vue
<RatingDisplay :rating="4.2" :count="156" />
<!-- Renders: ★★★★☆ 4.2 (156 reviews) -->
```

### Entity Cards

All entity cards (Park, Ride, Company) should show:
- Image (with loading skeleton)
- Name (primary text)
- Key details (secondary text)
- Status badge
- Quick stats (rating, count, etc.)
.agent/rules/django-standards.md — new file, 254 lines
@@ -0,0 +1,254 @@
# ThrillWiki Django Backend Standards

Rules for developing the ThrillWiki backend with Django and Django REST Framework.

## Project Structure

```
backend/
├── config/                  # Project settings
│   ├── settings/
│   │   ├── base.py          # Shared settings
│   │   ├── development.py   # Dev-specific
│   │   └── production.py    # Prod-specific
│   ├── urls.py              # Root URL config
│   └── wsgi.py
├── apps/
│   ├── parks/               # Park-related models and APIs
│   ├── rides/               # Ride-related models and APIs
│   ├── companies/           # Manufacturers, operators, etc.
│   ├── users/               # User profiles, authentication
│   ├── reviews/             # User reviews and ratings
│   ├── credits/             # Ride credits tracking
│   ├── submissions/         # Content submission system
│   ├── moderation/          # Moderation queue
│   └── core/                # Shared utilities
└── tests/                   # Test files
```

## Django App Structure

Each app should follow this structure:

```
app_name/
├── __init__.py
├── admin.py          # Django admin configuration
├── apps.py           # App configuration
├── models.py         # Database models
├── serializers.py    # DRF serializers
├── views.py          # DRF viewsets/views
├── urls.py           # URL routing
├── permissions.py    # Custom permissions (if needed)
├── filters.py        # DRF filters (if needed)
├── signals.py        # Django signals (if needed)
└── tests/
    ├── __init__.py
    ├── test_models.py
    ├── test_views.py
    └── test_serializers.py
```

## Model Conventions

### Base Model

All models should inherit from a base model with common fields:

```python
from django.db import models
import uuid


class BaseModel(models.Model):
    """Base model with common fields"""
    id = models.UUIDField(primary_key=True, default=uuid.uuid4, editable=False)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        abstract = True
```

### Model Example

```python
class Park(BaseModel):
    """A theme park or amusement park"""

    class Status(models.TextChoices):
        OPERATING = 'operating', 'Operating'
        CLOSED = 'closed', 'Closed'
        CONSTRUCTION = 'construction', 'Under Construction'

    name = models.CharField(max_length=255)
    slug = models.SlugField(unique=True)
    description = models.TextField(blank=True)
    status = models.CharField(
        max_length=20,
        choices=Status.choices,
        default=Status.OPERATING
    )
    location = models.PointField()  # GeoDjango
    city = models.CharField(max_length=100)
    country = models.CharField(max_length=100)

    class Meta:
        ordering = ['name']

    def __str__(self):
        return self.name
```

### Versioning for Moderated Content

For content that needs version history:

```python
class ParkVersion(BaseModel):
    """Version history for park edits"""
    park = models.ForeignKey(Park, on_delete=models.CASCADE, related_name='versions')
    data = models.JSONField()  # Snapshot of park data
    changed_by = models.ForeignKey(User, on_delete=models.SET_NULL, null=True)
    change_summary = models.CharField(max_length=255)
    is_current = models.BooleanField(default=False)
```

## API Conventions

### URL Structure

```python
# apps/parks/urls.py
from rest_framework.routers import DefaultRouter
from .views import ParkViewSet

router = DefaultRouter()
router.register('parks', ParkViewSet, basename='park')

urlpatterns = router.urls

# Results in:
# GET    /api/parks/         - List parks
# POST   /api/parks/         - Create park (submission)
# GET    /api/parks/{slug}/  - Get park detail
# PUT    /api/parks/{slug}/  - Update park (submission)
# DELETE /api/parks/{slug}/  - Delete park (admin only)
```

### ViewSet Structure

```python
from rest_framework import viewsets, status
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticatedOrReadOnly


class ParkViewSet(viewsets.ModelViewSet):
    """API endpoint for parks"""
    queryset = Park.objects.all()
    serializer_class = ParkSerializer
    permission_classes = [IsAuthenticatedOrReadOnly]
    lookup_field = 'slug'
    filterset_class = ParkFilter
    search_fields = ['name', 'city', 'country']
    ordering_fields = ['name', 'created_at']

    def get_queryset(self):
        """Optimize queries with select_related and prefetch_related"""
        return Park.objects.select_related(
            'operator', 'owner'
        ).prefetch_related(
            'rides', 'photos'
        )

    @action(detail=True, methods=['get'])
    def rides(self, request, slug=None):
        """Get rides for a specific park"""
        park = self.get_object()
        rides = park.rides.all()
        serializer = RideSerializer(rides, many=True)
        return Response(serializer.data)
```

### Serializer Patterns

```python
from rest_framework import serializers


class ParkSerializer(serializers.ModelSerializer):
    """Serializer for Park model"""
    ride_count = serializers.IntegerField(read_only=True)
    average_rating = serializers.FloatField(read_only=True)

    class Meta:
        model = Park
        fields = [
            'id', 'name', 'slug', 'description', 'status',
            'city', 'country', 'ride_count', 'average_rating',
            'created_at', 'updated_at'
        ]
        read_only_fields = ['id', 'slug', 'created_at', 'updated_at']


class ParkDetailSerializer(ParkSerializer):
    """Extended serializer for park detail view"""
    rides = RideSerializer(many=True, read_only=True)
    photos = PhotoSerializer(many=True, read_only=True)

    class Meta(ParkSerializer.Meta):
        fields = ParkSerializer.Meta.fields + ['rides', 'photos']
```

## Query Optimization

- ALWAYS use `select_related()` for ForeignKey relationships
- ALWAYS use `prefetch_related()` for ManyToMany and reverse FK relationships
- Annotate computed fields at the database level when possible
- Use pagination for all list endpoints

```python
# Good
parks = Park.objects.select_related('operator').prefetch_related('rides')

# Bad - causes N+1 queries
for park in parks:
    print(park.operator.name)  # Each iteration hits the database
```

## Permissions

### Custom Permission Classes

```python
from rest_framework.permissions import BasePermission


class IsModeratorOrReadOnly(BasePermission):
    """Allow read access to all, write access to moderators"""

    def has_permission(self, request, view):
        if request.method in ['GET', 'HEAD', 'OPTIONS']:
            return True
        return request.user.is_authenticated and request.user.is_moderator
```

## Submission & Moderation Flow

All user-submitted content goes through moderation:

1. User submits content → Creates `Submission` record with status `pending`
2. Moderator reviews → Approves or rejects
3. On approval → Content is published, version record created

```python
class Submission(BaseModel):
    class Status(models.TextChoices):
        PENDING = 'pending', 'Pending Review'
        APPROVED = 'approved', 'Approved'
        REJECTED = 'rejected', 'Rejected'
        CHANGES_REQUESTED = 'changes_requested', 'Changes Requested'

    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.UUIDField(null=True, blank=True)
    data = models.JSONField()
    status = models.CharField(max_length=20, choices=Status.choices, default=Status.PENDING)
    submitted_by = models.ForeignKey(User, on_delete=models.CASCADE)
    reviewed_by = models.ForeignKey(User, null=True, on_delete=models.SET_NULL)
    review_notes = models.TextField(blank=True)
```

## Testing Requirements

- Write tests for all models, views, and serializers
- Use pytest and pytest-django
- Use factories (factory_boy) for test data
- Test permissions thoroughly
- Test edge cases and error conditions
.agent/rules/nuxt-standards.md — new file, 204 lines
@@ -0,0 +1,204 @@
# ThrillWiki Nuxt 4 Frontend Standards

Rules for developing the ThrillWiki frontend with Nuxt 4.

## Project Structure

```
frontend/
├── app.vue                  # Root component
├── nuxt.config.ts           # Nuxt configuration
├── pages/                   # File-based routing
│   ├── index.vue            # Homepage (/)
│   ├── parks/
│   │   ├── index.vue        # /parks
│   │   ├── nearby.vue       # /parks/nearby
│   │   └── [slug].vue       # /parks/:slug
│   └── ...
├── components/
│   ├── layout/              # Header, Footer, Sidebar
│   ├── ui/                  # Base components (Button, Card, Input)
│   ├── entity/              # ParkCard, RideCard, ReviewCard
│   └── forms/               # Form components
├── composables/             # Shared logic (useAuth, useApi, useUnits)
├── stores/                  # Pinia stores
├── types/                   # TypeScript interfaces
└── assets/
    └── css/                 # Global styles, Tailwind config
```

## Component Conventions

### Naming

- Use PascalCase for component files: `ParkCard.vue`, `SearchAutocomplete.vue`
- Use kebab-case in templates: `<park-card>`, `<search-autocomplete>`
- Prefix base components with `Base`: `BaseButton.vue`, `BaseInput.vue`

### Component Structure

```vue
<script setup lang="ts">
// 1. Type imports
import type { Park } from '~/types'

// 2. Component imports (auto-imported usually)

// 3. Props and emits
const props = defineProps<{
  park: Park
  variant?: 'default' | 'compact'
}>()

const emit = defineEmits<{
  (e: 'select', park: Park): void
}>()

// 4. Composables
const { formatDistance } = useUnits()

// 5. Refs and reactive state
const isExpanded = ref(false)

// 6. Computed properties
const displayLocation = computed(() =>
  `${props.park.city}, ${props.park.country}`
)

// 7. Functions
function handleClick() {
  emit('select', props.park)
}

// 8. Lifecycle hooks (if needed)
</script>

<template>
  <!-- Template here -->
</template>

<style scoped>
/* Scoped styles, prefer Tailwind classes in template */
</style>
```

### TypeScript

- Enable strict mode
- Define interfaces for all data structures in `types/`
- Use `defineProps<T>()` with TypeScript generics
- No `any` types without explicit justification

## Routing

### File-Based Routes

Follow Nuxt 4 file-based routing conventions:
- `pages/index.vue` → `/`
- `pages/parks/index.vue` → `/parks`
- `pages/parks/[slug].vue` → `/parks/:slug`
- `pages/parks/[park]/rides/[ride].vue` → `/parks/:park/rides/:ride`

### Navigation

```typescript
// Use navigateTo for programmatic navigation
await navigateTo('/parks/cedar-point')

// Use NuxtLink for declarative navigation
<NuxtLink to="/parks">All Parks</NuxtLink>
```

## Data Fetching

### Use Composables

```typescript
// composables/useParks.ts
export function useParks() {
  const { data, pending, error, refresh } = useFetch('/api/parks/')

  return {
    parks: data,
    loading: pending,
    error,
    refresh
  }
}
```

### In Components

```typescript
// Use useAsyncData for page-level data
const { data: park } = await useAsyncData(
  `park-${route.params.slug}`,
  () => $fetch(`/api/parks/${route.params.slug}/`)
)
```

## State Management (Pinia)

### Store Structure

```typescript
// stores/auth.ts
export const useAuthStore = defineStore('auth', () => {
  const user = ref<User | null>(null)
  const isAuthenticated = computed(() => !!user.value)

  async function login(credentials: LoginCredentials) {
    // Implementation
  }

  function logout() {
    user.value = null
  }

  return { user, isAuthenticated, login, logout }
})
```

### Using Stores

```typescript
const authStore = useAuthStore()
const { user, isAuthenticated } = storeToRefs(authStore)
```

## API Integration

### Base API Composable

```typescript
// composables/useApi.ts
export function useApi() {
  const config = useRuntimeConfig()
  const authStore = useAuthStore()

  return $fetch.create({
    baseURL: config.public.apiBase,
    headers: {
      ...(authStore.token && { Authorization: `Bearer ${authStore.token}` })
    }
  })
}
```

## Error Handling

### Page Errors

```typescript
// In page components
const { data, error } = await useAsyncData(...)

if (error.value) {
  throw createError({
    statusCode: error.value.statusCode || 500,
    message: error.value.message
  })
}
```

### Form Errors

- Display validation errors inline with form fields
- Use toast notifications for API errors
- Provide clear user feedback
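A sketch of that toast pattern, assuming the Nuxt UI `useToast()` composable and the shared `useApi()` client:

```typescript
const api = useApi()
const toast = useToast()
const form = ref({ content_type: 'park', data: {} })

async function onSubmit() {
  try {
    await api('/submissions/', { method: 'POST', body: form.value })
    toast.add({ title: 'Submitted for review' })
  } catch (err: any) {
    // Field-level validation errors stay inline; unexpected API errors become a toast
    toast.add({
      title: 'Something went wrong',
      description: err?.data?.error?.message ?? 'Please try again.',
      color: 'red',
    })
  }
}
```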
## Accessibility Requirements

- All interactive elements must be keyboard accessible
- Provide proper ARIA labels
- Ensure color contrast meets WCAG AA standards
- Support `prefers-reduced-motion`
- Use semantic HTML elements
.agent/workflows/comply.md — new file, 85 lines
@@ -0,0 +1,85 @@
---
description: Ensure compliance with source_docs specifications - Continuous guard
---

# Source Docs Compliance Workflow

You are now in **Compliance Guard Mode**. The documents in `source_docs/` are LAW. Every code change must comply with these specifications.

## The Constitution

The following documents are the single source of truth:
- `source_docs/SITE_OVERVIEW.md` - High-level product vision
- `source_docs/DESIGN_SYSTEM.md` - Visual identity, colors, typography, gradients
- `source_docs/COMPONENTS.md` - Component specifications and patterns
- `source_docs/PAGES.md` - Page layouts and content
- `source_docs/USER_FLOWS.md` - User journeys and interaction specifications

## Before Making ANY Code Change

1. **Identify Relevant Specs**: Determine which source_docs apply to the change
2. **Read the Spec**: View the relevant sections to understand requirements
3. **Check for Deviations**: Compare current/proposed code against the spec
4. **Cite Your Sources**: Reference specific line numbers when claiming compliance

## Compliance Checklist

### For Components (check against COMPONENTS.md, DESIGN_SYSTEM.md):

- [ ] Uses correct component structure (header, content, footer)
- [ ] Uses primary color palette (blue for primary, NOT emerald/teal unless specified)
- [ ] Uses centralized constants instead of hardcoded options
- [ ] Follows established patterns (sticky footer, tab navigation, etc.)
- [ ] Proper button variants per spec
- [ ] Dark mode support

### For User Flows (check against USER_FLOWS.md):

- [ ] All required fields present (e.g., submission notes for edits)
- [ ] Proper validation implemented
- [ ] Error states handled per spec
- [ ] Success feedback matches spec

### For Forms:

- [ ] Options imported from `~/utils/constants.ts`
- [ ] Backend-synced choices (check `backend/apps/*/choices.py`)
- [ ] Required fields marked and validated
- [ ] Proper help text and hints

## When Deviation is Found

1. **STOP** - Do not proceed with the current approach
2. **LOG** - Document the deviation with:
   - Requirement (from source_docs)
   - Current implementation
   - Proposed fix
3. **FIX** - Implement the compliant solution
4. **VERIFY** - Ensure the fix matches the spec

## Status Tags

Use these in any audit or review:
- `[OK]` - Compliant with spec
- `[DEVIATION]` - Implemented differently than spec
- `[MISSING]` - Required feature not implemented
- `[RISK]` - Potential issue that needs investigation

## Key Constants Files

Always use these instead of hardcoding:
- `frontend/app/utils/constants.ts` - Status configs, park types, etc.
- `backend/apps/parks/choices.py` - Park status and type definitions
- `backend/apps/rides/choices.py` - Ride status and category definitions
- `backend/apps/moderation/choices.py` - Submission status definitions
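A sketch of what one such constants entry can look like, mirroring the `Park.Status` choices from the Django standards (the badge color mapping is an assumption):

```typescript
// frontend/app/utils/constants.ts (sketch)
export const PARK_STATUS_OPTIONS = [
  { value: 'operating', label: 'Operating', color: 'success' },
  { value: 'closed', label: 'Closed', color: 'destructive' },
  { value: 'construction', label: 'Under Construction', color: 'warning' },
] as const

export type ParkStatus = (typeof PARK_STATUS_OPTIONS)[number]['value']

// Forms import these options instead of hardcoding select items, e.g.:
// <USelect :options="[...PARK_STATUS_OPTIONS]" option-attribute="label" value-attribute="value" />
```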
## Example Audit Output

```markdown
## Compliance Audit: ComponentName.vue

### Checking Against: [List relevant docs]

| Requirement | Source | Status |
|-------------|--------|--------|
| Uses primary gradient | DESIGN_SYSTEM.md L91-98 | [OK] |
| Submission note field | USER_FLOWS.md L473-476 | [MISSING] → FIX |
| Sticky footer pattern | COMPONENTS.md L583-602 | [DEVIATION] → FIX |
```
.agent/workflows/migrate-component.md — new file, 168 lines
@@ -0,0 +1,168 @@
---
description: Migrate a React component from thrillwiki-87 to Vue/Nuxt
---

# Migrate Component Workflow

Convert a React component from thrillwiki-87 (the authoritative source) to Vue 3 Composition API for the Nuxt 4 project.

## Step 1: Locate Source Component

Find the React component in thrillwiki-87:

```bash
# React components are in:
/Volumes/macminissd/Projects/thrillwiki-87/src/components/
```

Common component directories:
- `auth/` - Authentication components
- `parks/` - Park-related components
- `rides/` - Ride-related components
- `forms/` - Form components
- `moderation/` - Moderation queue components
- `ui/` - Base UI primitives (shadcn-ui)
- `common/` - Shared utilities
- `layout/` - Layout components

## Step 2: Analyze React Component

Extract these patterns from the source:

```tsx
// Props interface
interface ComponentProps {
  prop1: string
  prop2?: number
}

// State hooks
const [value, setValue] = useState<Type>(initial)

// Effects
useEffect(() => { /* logic */ }, [deps])

// Event handlers
const handleClick = () => { /* logic */ }

// Render return
return <JSX />
```

## Step 3: Translate to Vue 3

### Props

```tsx
// React
interface Props { name: string; count?: number }
```
↓
```vue
<!-- Vue -->
<script setup lang="ts">
defineProps<{
  name: string
  count?: number
}>()
</script>
```

### State

```tsx
// React
const [value, setValue] = useState<string>('')
```
↓
```vue
<!-- Vue -->
<script setup lang="ts">
const value = ref<string>('')
</script>
```

### Effects

```tsx
// React
useEffect(() => {
  fetchData()
}, [id])
```
↓
```vue
<!-- Vue -->
<script setup lang="ts">
watch(() => id, () => {
  fetchData()
}, { immediate: true })
</script>
```

### Events

```tsx
// React
onClick={() => onSelect(item)}
```
↓
```vue
<!-- Vue -->
@click="emit('select', item)"
```

## Step 4: Map UI Components

Translate shadcn-ui to Nuxt UI:

| shadcn-ui | Nuxt UI |
|-----------|---------|
| `<Button>` | `<UButton>` |
| `<Card>` | `<UCard>` |
| `<Dialog>` | `<UModal>` |
| `<Input>` | `<UInput>` |
| `<Select>` | `<USelect>` |
| `<Tabs>` | `<UTabs>` |
| `<Badge>` | `<UBadge>` |

## Step 5: Update API Calls

Replace Supabase with the Django API:

```tsx
// React (Supabase)
const { data } = await supabase.from('parks').select('*')
```
↓
```vue
<!-- Vue (Django) -->
<script setup lang="ts">
const api = useApi()
const parks = await api<Park[]>('/parks/')
</script>
```

## Step 6: Place in Nuxt Structure

Target paths:

```
frontend/app/components/
├── auth/       # Auth components
├── cards/      # Card components
├── common/     # Shared utilities
├── modals/     # Modal dialogs
├── rides/      # Ride components
└── ui/         # Base UI components
```

## Step 7: Verify Parity

- [ ] All props from source are present
- [ ] All state variables ported
- [ ] All event handlers work
- [ ] Styling matches (Tailwind classes)
- [ ] Loading states present
- [ ] Error states handled
- [ ] Dark mode works

## Example Migration

**Source**: `thrillwiki-87/src/components/parks/ParkCard.tsx`
**Target**: `frontend/app/components/cards/ParkCard.vue`

Check the existing target. If it exists, compare and add any missing features. If not, create it new.
.agent/workflows/migrate-hook.md — new file, 223 lines
@@ -0,0 +1,223 @@
|
||||
---
|
||||
description: Convert a React hook from thrillwiki-87 to a Vue composable
|
||||
---
|
||||
|
||||
# Migrate Hook Workflow
|
||||
|
||||
Convert a React hook from thrillwiki-87 to a Vue 3 composable for the Nuxt 4 project.
|
||||
|
||||
## Step 1: Locate Source Hook
|
||||
|
||||
Find the React hook in thrillwiki-87:
|
||||
```bash
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/hooks/
|
||||
```
|
||||
|
||||
Key hooks (80+ total):
|
||||
- `useAuth.tsx` - Authentication state
|
||||
- `useModerationQueue.ts` - Moderation logic (21KB)
|
||||
- `useEntityVersions.ts` - Version history (14KB)
|
||||
- `useSearch.tsx` - Search functionality
|
||||
- `useUnitPreferences.ts` - Unit conversion
|
||||
- `useProfile.tsx` - User profile
|
||||
- `useLocations.ts` - Location data
|
||||
- `useRideCreditFilters.ts` - Credit filtering
|
||||
|
||||
## Step 2: Analyze Hook Pattern
|
||||
|
||||
Extract the hook structure:
|
||||
|
||||
```tsx
|
||||
export function useFeature(params: Params) {
|
||||
// State
|
||||
const [data, setData] = useState<Type>(null)
|
||||
const [loading, setLoading] = useState(false)
|
||||
const [error, setError] = useState<Error | null>(null)
|
||||
|
||||
// Effects
|
||||
useEffect(() => {
|
||||
fetchData()
|
||||
}, [dependency])
|
||||
|
||||
// Actions
|
||||
const doSomething = async () => {
|
||||
setLoading(true)
|
||||
try {
|
||||
const result = await api.call()
|
||||
setData(result)
|
||||
} catch (e) {
|
||||
setError(e)
|
||||
} finally {
|
||||
setLoading(false)
|
||||
}
|
||||
}
|
||||
|
||||
return { data, loading, error, doSomething }
|
||||
}
|
||||
```
|
||||
|
||||
## Step 3: Convert to Composable
|
||||
|
||||
### Basic Structure
|
||||
```tsx
|
||||
// React
|
||||
export function useFeature() {
|
||||
const [value, setValue] = useState('')
|
||||
return { value, setValue }
|
||||
}
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
export function useFeature() {
|
||||
const value = ref('')
|
||||
|
||||
function setValue(newValue: string) {
|
||||
value.value = newValue
|
||||
}
|
||||
|
||||
return { value, setValue }
|
||||
}
|
||||
```
|
||||
|
||||
### State Conversions
|
||||
```tsx
|
||||
// React
|
||||
const [count, setCount] = useState(0)
|
||||
const [user, setUser] = useState<User | null>(null)
|
||||
const [items, setItems] = useState<Item[]>([])
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
const count = ref(0)
|
||||
const user = ref<User | null>(null)
|
||||
const items = ref<Item[]>([])
|
||||
```
|
||||
|
||||
### Effect Conversions
|
||||
```tsx
|
||||
// React - Run on mount
|
||||
useEffect(() => {
|
||||
initialize()
|
||||
}, [])
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
onMounted(() => {
|
||||
initialize()
|
||||
})
|
||||
```
|
||||
|
||||
```tsx
|
||||
// React - Watch dependency
|
||||
useEffect(() => {
|
||||
fetchData(id)
|
||||
}, [id])
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
watch(() => id, (newId) => {
|
||||
fetchData(newId)
|
||||
}, { immediate: true })
|
||||
```
|
||||
|
||||
### Supabase → Django API
|
||||
```tsx
|
||||
// React (Supabase)
|
||||
const { data } = await supabase
|
||||
.from('parks')
|
||||
.select('*')
|
||||
.eq('slug', slug)
|
||||
.single()
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue (Django)
|
||||
const api = useApi()
|
||||
const { data } = await api<Park>(`/parks/${slug}/`)
|
||||
```
|
||||
|
||||
## Step 4: Handle Complex Patterns
|
||||
|
||||
### useCallback → Plain Function
|
||||
```tsx
|
||||
// React
|
||||
const memoizedFn = useCallback(() => {
|
||||
doSomething(dep)
|
||||
}, [dep])
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue - Usually no memo needed
|
||||
function doSomething() {
|
||||
// Vue's reactivity handles this
|
||||
}
|
||||
```
|
||||
|
||||
### useMemo → computed
|
||||
```tsx
|
||||
// React
|
||||
const derived = useMemo(() => expensiveCalc(data), [data])
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
const derived = computed(() => expensiveCalc(data.value))
|
||||
```
|
||||
|
||||
### Custom Hook Composition
|
||||
```tsx
|
||||
// React
|
||||
function useFeature() {
|
||||
const auth = useAuth()
|
||||
const { data } = useQuery(...)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// Vue
|
||||
export function useFeature() {
|
||||
const { user } = useAuth()
|
||||
const api = useApi()
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
## Step 5: Target Location
|
||||
|
||||
Place composables in:
|
||||
```
|
||||
frontend/app/composables/
|
||||
├── useApi.ts # Base API client
|
||||
├── useAuth.ts # Authentication
|
||||
├── useParksApi.ts # Parks API
|
||||
├── useRidesApi.ts # Rides API
|
||||
├── useModeration.ts # Moderation queue
|
||||
└── use[Feature].ts # New composables
|
||||
```
|
||||
|
||||
## Step 6: Verify Parity
|
||||
|
||||
- [ ] All returned values present
|
||||
- [ ] All actions/methods work
|
||||
- [ ] State updates correctly
|
||||
- [ ] API calls translated
|
||||
- [ ] Error handling maintained
|
||||
- [ ] Loading states work
|
||||
- [ ] TypeScript types correct
|
||||
|
||||
## Priority Hooks to Migrate
|
||||
|
||||
| Hook | Size | Complexity |
|
||||
|------|------|------------|
|
||||
| useModerationQueue.ts | 21KB | High |
|
||||
| useEntityVersions.ts | 14KB | High |
|
||||
| useAuth.tsx | 11KB | Medium |
|
||||
| useAutoComplete.ts | 10KB | Medium |
|
||||
| useRateLimitAlerts.ts | 10KB | Medium |
|
||||
| useRideCreditFilters.ts | 9KB | Medium |
|
||||
| useAdminSettings.ts | 9KB | Medium |
|
||||
183
.agent/workflows/migrate-page.md
Normal file
@@ -0,0 +1,183 @@
|
||||
---
|
||||
description: Migrate a React page from thrillwiki-87 to Nuxt 4
|
||||
---
|
||||
|
||||
# Migrate Page Workflow
|
||||
|
||||
Port a React page from thrillwiki-87 (the authoritative source) to a Nuxt 4 page.
|
||||
|
||||
## Step 1: Locate Source Page
|
||||
|
||||
Find the React page in thrillwiki-87:
|
||||
```bash
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/pages/
|
||||
```
|
||||
|
||||
Key pages:
|
||||
- `Index.tsx` - Homepage
|
||||
- `Parks.tsx` - Parks listing
|
||||
- `ParkDetail.tsx` - Individual park (36KB - complex)
|
||||
- `Rides.tsx` - Rides listing
|
||||
- `RideDetail.tsx` - Individual ride (54KB - most complex)
|
||||
- `Profile.tsx` - User profile (51KB - complex)
|
||||
- `Search.tsx` - Global search
|
||||
- `AdminDashboard.tsx` - Admin panel
|
||||
|
||||
## Step 2: Analyze Page Structure
|
||||
|
||||
Extract from React page:
|
||||
|
||||
```tsx
|
||||
// Route params
|
||||
const { id } = useParams()
|
||||
|
||||
// Data fetching
|
||||
const { data, isLoading, error } = useQuery(...)
|
||||
|
||||
// SEO
|
||||
// (usually react-helmet or similar)
|
||||
|
||||
// Page layout structure
|
||||
return (
|
||||
<Layout>
|
||||
<Tabs>...</Tabs>
|
||||
<Content>...</Content>
|
||||
</Layout>
|
||||
)
|
||||
```
|
||||
|
||||
## Step 3: Create Nuxt Page
|
||||
|
||||
### Route Params
|
||||
```tsx
|
||||
// React
|
||||
const { parkSlug } = useParams()
|
||||
```
|
||||
↓
|
||||
```vue
|
||||
<!-- Nuxt -->
|
||||
<script setup lang="ts">
|
||||
const route = useRoute()
|
||||
const parkSlug = route.params.park_slug as string
|
||||
</script>
|
||||
```
|
||||
|
||||
### Data Fetching
|
||||
```tsx
|
||||
// React (React Query)
|
||||
const { data, isLoading } = useQuery(['park', id], () => fetchPark(id))
|
||||
```
|
||||
↓
|
||||
```vue
|
||||
<!-- Nuxt -->
|
||||
<script setup lang="ts">
|
||||
const { data, pending } = await useAsyncData(
|
||||
`park-${parkSlug}`,
|
||||
() => useParksApi().getPark(parkSlug)
|
||||
)
|
||||
</script>
|
||||
```
|
||||
|
||||
### SEO Meta
|
||||
```tsx
|
||||
// React (Helmet)
|
||||
<Helmet>
|
||||
<title>{park.name} | ThrillWiki</title>
|
||||
</Helmet>
|
||||
```
|
||||
↓
|
||||
```vue
|
||||
<!-- Nuxt -->
|
||||
<script setup lang="ts">
|
||||
useSeoMeta({
|
||||
title: () => `${data.value?.name} | ThrillWiki`,
|
||||
description: () => data.value?.description
|
||||
})
|
||||
</script>
|
||||
```
|
||||
|
||||
### Page Meta
|
||||
```vue
|
||||
<script setup lang="ts">
|
||||
definePageMeta({
|
||||
middleware: ['auth'], // if authentication required
|
||||
layout: 'default'
|
||||
})
|
||||
</script>
|
||||
```
|
||||
|
||||
## Step 4: Port Page Sections
|
||||
|
||||
Common patterns:
|
||||
|
||||
### Tabs Structure
|
||||
```tsx
|
||||
// React
|
||||
<Tabs defaultValue="overview">
|
||||
<TabsList>
|
||||
<TabsTrigger value="overview">Overview</TabsTrigger>
|
||||
</TabsList>
|
||||
<TabsContent value="overview">...</TabsContent>
|
||||
</Tabs>
|
||||
```
|
||||
↓
|
||||
```vue
|
||||
<!-- Nuxt -->
|
||||
<UTabs :items="tabs" v-model="activeTab">
|
||||
<template #default="{ item }">
|
||||
<component :is="item.component" v-bind="item.props" />
|
||||
</template>
|
||||
</UTabs>
|
||||
```
|
||||
|
||||
### Loading States
|
||||
```tsx
|
||||
// React
|
||||
{isLoading ? <Skeleton /> : <Content />}
|
||||
```
|
||||
↓
|
||||
```vue
|
||||
<!-- Nuxt -->
|
||||
<template>
|
||||
<div v-if="pending">
|
||||
<USkeleton class="h-48" />
|
||||
</div>
|
||||
<div v-else-if="data">
|
||||
<!-- Content -->
|
||||
</div>
|
||||
</template>
|
||||
```
|
||||
|
||||
## Step 5: Target Location
|
||||
|
||||
Nuxt pages use file-based routing:
|
||||
|
||||
| React Route | Nuxt File Path |
|
||||
|-------------|----------------|
|
||||
| `/parks` | `pages/parks/index.vue` |
|
||||
| `/parks/:slug` | `pages/parks/[park_slug]/index.vue` |
|
||||
| `/parks/:slug/rides/:ride` | `pages/parks/[park_slug]/rides/[ride_slug].vue` |
|
||||
| `/manufacturers/:id` | `pages/manufacturers/[slug].vue` |
|
||||
|
||||
## Step 6: Verify Feature Parity
|
||||
|
||||
Compare source page with target:
|
||||
- [ ] All tabs/sections present
|
||||
- [ ] All data displayed
|
||||
- [ ] All actions work (edit, delete, etc.)
|
||||
- [ ] Responsive layout matches
|
||||
- [ ] Loading states present
|
||||
- [ ] Error handling works
|
||||
- [ ] SEO meta correct
|
||||
|
||||
## Reference: Page Complexity
|
||||
|
||||
| Page | Source Size | Priority |
|
||||
|------|-------------|----------|
|
||||
| RideDetail.tsx | 54KB | High |
|
||||
| Profile.tsx | 51KB | High |
|
||||
| AdminSettings.tsx | 44KB | Medium |
|
||||
| ParkDetail.tsx | 36KB | High |
|
||||
| Auth.tsx | 29KB | Medium |
|
||||
| Parks.tsx | 22KB | High |
|
||||
| Rides.tsx | 20KB | High |
|
||||
201
.agent/workflows/migrate-type.md
Normal file
@@ -0,0 +1,201 @@
|
||||
---
|
||||
description: Port TypeScript types from thrillwiki-87 to the Nuxt project
|
||||
---
|
||||
|
||||
# Migrate Type Workflow
|
||||
|
||||
Port TypeScript type definitions from thrillwiki-87 to the Nuxt 4 project, ensuring sync with Django serializers.
|
||||
|
||||
## Step 1: Locate Source Types
|
||||
|
||||
Find types in thrillwiki-87:
|
||||
```bash
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/types/
|
||||
```
|
||||
|
||||
Key type files (63 total):
|
||||
- `database.ts` - Core entity types (12KB)
|
||||
- `statuses.ts` - Status enums (13KB)
|
||||
- `moderation.ts` - Moderation types (10KB)
|
||||
- `rideAttributes.ts` - Ride specs (8KB)
|
||||
- `versioning.ts` - Version history (7KB)
|
||||
- `submissions.ts` - Submission types
|
||||
- `company.ts` - Company entities
|
||||
|
||||
## Step 2: Analyze Source Types
|
||||
|
||||
Extract type structure:
|
||||
|
||||
```typescript
|
||||
// thrillwiki-87/src/types/database.ts
|
||||
export interface Park {
|
||||
id: string
|
||||
name: string
|
||||
slug: string
|
||||
park_type: ParkType
|
||||
status: ParkStatus
|
||||
opening_date?: string
|
||||
closing_date?: string
|
||||
location?: ParkLocation
|
||||
// ...
|
||||
}
|
||||
|
||||
export interface ParkLocation {
|
||||
latitude: number
|
||||
longitude: number
|
||||
country: string
|
||||
city?: string
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
## Step 3: Check Django Serializers
|
||||
|
||||
Before porting, verify Django backend has matching fields:
|
||||
|
||||
```bash
|
||||
# Check serializers in:
|
||||
backend/apps/parks/serializers.py
|
||||
backend/apps/rides/serializers.py
|
||||
backend/apps/companies/serializers.py
|
||||
```
|
||||
|
||||
The Django serializer defines what the API returns:
|
||||
```python
|
||||
class ParkSerializer(serializers.ModelSerializer):
|
||||
class Meta:
|
||||
model = Park
|
||||
fields = ['id', 'name', 'slug', 'park_type', 'status', ...]
|
||||
```
|
||||
|
||||
## Step 4: Handle Naming Conventions
|
||||
|
||||
Django uses snake_case, while TypeScript typically uses camelCase. Choose one strategy:
|
||||
|
||||
### Option A: Match Django exactly (Recommended)
|
||||
```typescript
|
||||
// Keep snake_case from API
|
||||
export interface Park {
|
||||
id: string
|
||||
name: string
|
||||
park_type: string
|
||||
created_at: string
|
||||
}
|
||||
```
|
||||
|
||||
### Option B: Transform in API layer
|
||||
```typescript
|
||||
// camelCase in types
|
||||
export interface Park {
|
||||
id: string
|
||||
name: string
|
||||
parkType: string
|
||||
createdAt: string
|
||||
}
|
||||
// Transform in useApi or serializer
|
||||
```
|
||||
|
||||
**Current project uses Option A (snake_case).**
|
||||
|
||||
## Step 5: Port the Types
|
||||
|
||||
Create/update type files:
|
||||
|
||||
```typescript
|
||||
// frontend/app/types/park.ts
|
||||
export interface Park {
|
||||
id: string
|
||||
name: string
|
||||
slug: string
|
||||
park_type: string
|
||||
status: string
|
||||
short_description?: string
|
||||
description?: string
|
||||
opening_date?: string
|
||||
closing_date?: string
|
||||
operating_season?: string
|
||||
website?: string
|
||||
latitude?: number
|
||||
longitude?: number
|
||||
// Match Django serializer fields exactly
|
||||
}
|
||||
|
||||
export interface ParkDetail extends Park {
|
||||
rides?: Ride[]
|
||||
reviews?: Review[]
|
||||
photos?: Photo[]
|
||||
// Extended fields from detail serializer
|
||||
}
|
||||
```
|
||||
|
||||
## Step 6: Port Enums/Options
|
||||
|
||||
The source may define status enums or const objects that should become option lists in the target:
|
||||
|
||||
```typescript
|
||||
// thrillwiki-87/src/types/statuses.ts
|
||||
export const PARK_STATUS = {
|
||||
OPERATING: 'operating',
|
||||
CLOSED: 'closed',
|
||||
// ...
|
||||
} as const
|
||||
```
|
||||
↓
|
||||
```typescript
|
||||
// frontend/app/utils/constants.ts
|
||||
export const PARK_STATUS_OPTIONS = [
|
||||
{ value: 'operating', label: 'Operating', color: 'green' },
|
||||
{ value: 'closed', label: 'Closed', color: 'red' },
|
||||
// Sync with backend/apps/parks/choices.py
|
||||
]
|
||||
```
|
||||
|
||||
## Step 7: Target Locations
|
||||
|
||||
```
|
||||
frontend/app/types/
|
||||
├── index.ts # Re-exports and common types
|
||||
├── park.ts # Park types
|
||||
├── ride.ts # Ride types
|
||||
├── company.ts # Company types (manufacturer, designer, etc.)
|
||||
├── user.ts # User/profile types
|
||||
├── moderation.ts # Moderation types
|
||||
└── api.ts # API response wrappers
|
||||
```
|
||||
|
||||
## Step 8: Verify Type Sync
|
||||
|
||||
| Layer | File | Must Match |
|
||||
|-------|------|------------|
|
||||
| Django Model | `backend/apps/*/models.py` | Database schema |
|
||||
| Serializer | `backend/apps/*/serializers.py` | API response |
|
||||
| Frontend Type | `frontend/app/types/*.ts` | Serializer fields |
|
||||
|
||||
Run checks:
|
||||
```bash
|
||||
# Check Django fields
|
||||
python manage.py shell -c "from apps.parks.models import Park; print([f.name for f in Park._meta.fields])"
|
||||
|
||||
# Check serializer fields
|
||||
python manage.py shell -c "from apps.parks.serializers import ParkSerializer; print(ParkSerializer().fields.keys())"
|
||||
```
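
To go a step further, a small script can compare the serializer's fields against the frontend interface. This is only a sketch: the relative path to the type file and the simple regex (which assumes the one-field-per-line style used in the type files above) are assumptions.

```python
# check_type_sync.py — run from the backend directory with:
#   python manage.py shell < check_type_sync.py
import re
from pathlib import Path

from apps.parks.serializers import ParkSerializer

serializer_fields = set(ParkSerializer().fields.keys())

# Path to the frontend type file is an assumption; adjust to the repo layout
type_source = Path('../frontend/app/types/park.ts').read_text()
frontend_fields = set(re.findall(r'^\s*(\w+)\??:', type_source, flags=re.MULTILINE))

print('In serializer but missing from type:', sorted(serializer_fields - frontend_fields))
print('In type but missing from serializer:', sorted(frontend_fields - serializer_fields))
```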
|
||||
|
||||
## Step 9: Checklist
|
||||
|
||||
- [ ] All source type fields ported
|
||||
- [ ] Fields match Django serializer
|
||||
- [ ] snake_case naming used
|
||||
- [ ] Optional fields marked with `?`
|
||||
- [ ] Enums/options in constants.ts
|
||||
- [ ] Backend choices.py synced
|
||||
- [ ] Types exported from index.ts
|
||||
|
||||
## Priority Types
|
||||
|
||||
| Type File | Source Size | Notes |
|
||||
|-----------|-------------|-------|
|
||||
| statuses.ts | 13KB | All status enums |
|
||||
| database.ts | 12KB | Core entities |
|
||||
| moderation.ts | 10KB | Queue types |
|
||||
| rideAttributes.ts | 8KB | Spec options |
|
||||
| versioning.ts | 7KB | History types |
|
||||
193
.agent/workflows/migrate.md
Normal file
@@ -0,0 +1,193 @@
|
||||
---
|
||||
description: Audit gaps and implement missing features from thrillwiki-87 source
|
||||
---
|
||||
|
||||
# Migration Workflow
|
||||
|
||||
**thrillwiki-87 is LAW.** This workflow audits what's missing and implements it.
|
||||
|
||||
## Quick Start
|
||||
|
||||
Run `/migrate` to:
|
||||
1. Audit current project against thrillwiki-87
|
||||
2. Identify the highest-priority missing feature
|
||||
3. Implement it using the appropriate sub-workflow
|
||||
|
||||
---
|
||||
|
||||
## Phase 1: Gap Analysis
|
||||
|
||||
### Step 1.1: Audit Components
|
||||
|
||||
Compare React components with Vue components:
|
||||
|
||||
```bash
|
||||
# Source (LAW):
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/components/
|
||||
|
||||
# Target:
|
||||
/Volumes/macminissd/Projects/thrillwiki_django_no_react/frontend/app/components/
|
||||
```
|
||||
|
||||
For each React component directory, check if an equivalent Vue component exists (a quick comparison sketch follows this list):
|
||||
- **EXISTS**: Mark as ✅, check for feature parity
|
||||
- **MISSING**: Mark as ❌, add to implementation queue
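
A quick way to surface missing directories is to diff the two component trees. The sketch below assumes the paths listed above and a simple name-for-name mapping between React and Vue component folders:

```python
# compare_components.py — rough component gap listing (sketch)
from pathlib import Path

SOURCE = Path('/Volumes/macminissd/Projects/thrillwiki-87/src/components')
TARGET = Path('/Volumes/macminissd/Projects/thrillwiki_django_no_react/frontend/app/components')

source_dirs = {p.name.lower() for p in SOURCE.iterdir() if p.is_dir()}
target_dirs = {p.name.lower() for p in TARGET.iterdir() if p.is_dir()}

for name in sorted(source_dirs - target_dirs):
    print(f'MISSING: {name}')
for name in sorted(source_dirs & target_dirs):
    print(f'EXISTS:  {name} (check feature parity)')
```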
|
||||
|
||||
### Step 1.2: Audit Pages
|
||||
|
||||
Compare React pages with Nuxt pages:
|
||||
|
||||
```bash
|
||||
# Source (LAW):
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/pages/
|
||||
|
||||
# Target:
|
||||
/Volumes/macminissd/Projects/thrillwiki_django_no_react/frontend/app/pages/
|
||||
```
|
||||
|
||||
Key pages to audit (by size/complexity):
|
||||
| React Page | Size | Priority |
|
||||
|------------|------|----------|
|
||||
| RideDetail.tsx | 54KB | P0 |
|
||||
| Profile.tsx | 51KB | P0 |
|
||||
| AdminSettings.tsx | 44KB | P1 |
|
||||
| ParkDetail.tsx | 36KB | P0 |
|
||||
| Auth.tsx | 29KB | P1 |
|
||||
| Parks.tsx | 22KB | P0 |
|
||||
| Rides.tsx | 20KB | P0 |
|
||||
|
||||
### Step 1.3: Audit Hooks → Composables
|
||||
|
||||
Compare React hooks with Vue composables:
|
||||
|
||||
```bash
|
||||
# Source (LAW):
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/hooks/
|
||||
|
||||
# Target:
|
||||
/Volumes/macminissd/Projects/thrillwiki_django_no_react/frontend/app/composables/
|
||||
```
|
||||
|
||||
Priority hooks:
|
||||
| React Hook | Size | Current Status |
|
||||
|------------|------|----------------|
|
||||
| useModerationQueue.ts | 21KB | Check useModeration.ts |
|
||||
| useEntityVersions.ts | 14KB | ❌ Missing |
|
||||
| useAuth.tsx | 11KB | Check useAuth.ts |
|
||||
| useRideCreditFilters.ts | 9KB | ❌ Missing |
|
||||
|
||||
### Step 1.4: Audit Types
|
||||
|
||||
Compare TypeScript definitions:
|
||||
|
||||
```bash
|
||||
# Source (LAW):
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/src/types/
|
||||
|
||||
# Target:
|
||||
/Volumes/macminissd/Projects/thrillwiki_django_no_react/frontend/app/types/
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Phase 2: Priority Selection
|
||||
|
||||
After auditing, select the highest priority gap:
|
||||
|
||||
### Priority Matrix
|
||||
|
||||
| Category | Weight | Examples |
|
||||
|----------|--------|----------|
|
||||
| **P0 - Core UX** | 10 | Main entity pages, search, auth |
|
||||
| **P1 - Features** | 7 | Reviews, credits, lists |
|
||||
| **P2 - Admin** | 5 | Moderation, settings |
|
||||
| **P3 - Polish** | 3 | Animations, edge cases |
|
||||
|
||||
Select ONE item to implement this session.
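
If it helps to make the ranking mechanical, the weights above can be applied directly. The gap entries below are made-up examples, not real audit results:

```python
# prioritize_gaps.py — rank audited gaps by category weight (sketch)
WEIGHTS = {'P0': 10, 'P1': 7, 'P2': 5, 'P3': 3}

gaps = [  # replace with the real audit output
    {'name': 'RideDetail page', 'category': 'P0'},
    {'name': 'Entity versioning composable', 'category': 'P1'},
    {'name': 'Moderation queue filters', 'category': 'P2'},
]

for gap in sorted(gaps, key=lambda g: WEIGHTS[g['category']], reverse=True):
    print(f"{gap['category']}  {gap['name']}")
```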
|
||||
|
||||
---
|
||||
|
||||
## Phase 3: Implementation
|
||||
|
||||
Based on the gap type, use the appropriate sub-workflow:
|
||||
|
||||
### For Missing Component
|
||||
```
|
||||
/migrate-component
|
||||
```
|
||||
|
||||
### For Missing Page
|
||||
```
|
||||
/migrate-page
|
||||
```
|
||||
|
||||
### For Missing Hook/Composable
|
||||
```
|
||||
/migrate-hook
|
||||
```
|
||||
|
||||
### For Missing Types
|
||||
```
|
||||
/migrate-type
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Phase 4: Verification
|
||||
|
||||
After implementation:
|
||||
|
||||
1. **Feature Parity Check**: Does it match thrillwiki-87 behavior?
|
||||
2. **Visual Parity Check**: Does it look the same?
|
||||
3. **Data Parity Check**: Does it show the same information?
|
||||
4. **Interaction Parity Check**: Do actions work the same way?
|
||||
|
||||
---
|
||||
|
||||
## Gap Tracking
|
||||
|
||||
Update `GAP_ANALYSIS_MATRIX.md` with status:
|
||||
|
||||
```markdown
|
||||
| Feature | Source Location | Target Location | Status |
|
||||
|---------|-----------------|-----------------|--------|
|
||||
| Park Detail Tabs | src/pages/ParkDetail.tsx | pages/parks/[park_slug]/ | [OK] |
|
||||
| Entity Versioning | src/hooks/useEntityVersions.ts | composables/ | [MISSING] |
|
||||
| Ride Credits | src/components/credits/ | components/credits/ | [PARTIAL] |
|
||||
```
|
||||
|
||||
Status tags:
|
||||
- `[OK]` - Feature parity achieved
|
||||
- `[PARTIAL]` - Some features missing
|
||||
- `[MISSING]` - Not implemented
|
||||
- `[BLOCKED]` - Waiting on backend
|
||||
|
||||
---
|
||||
|
||||
## Reference: Source Documentation
|
||||
|
||||
Always check thrillwiki-87 docs for specifications:
|
||||
|
||||
```bash
|
||||
/Volumes/macminissd/Projects/thrillwiki-87/docs/
|
||||
├── SITE_OVERVIEW.md # What features exist
|
||||
├── COMPONENTS.md # Component specifications
|
||||
├── PAGES.md # Page layouts (72KB - comprehensive)
|
||||
├── USER_FLOWS.md # Interaction patterns (77KB)
|
||||
├── DESIGN_SYSTEM.md # Visual standards
|
||||
└── [Many more...]
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Single Command Execution
|
||||
|
||||
When you run `/migrate`, execute these steps:
|
||||
|
||||
1. **Read GAP_ANALYSIS_MATRIX.md** to see current status
|
||||
2. **List directories** in both projects to find new gaps
|
||||
3. **Select highest priority** missing item
|
||||
4. **Read source implementation** from thrillwiki-87
|
||||
5. **Implement in target** following sub-workflow patterns
|
||||
6. **Update GAP_ANALYSIS_MATRIX.md** with new status
|
||||
7. **Report what was implemented** and next priority item
|
||||
472
.agent/workflows/moderation.md
Normal file
@@ -0,0 +1,472 @@
|
||||
---
|
||||
description: Add moderation support to a content type in ThrillWiki
|
||||
---
|
||||
|
||||
# Moderation Workflow
|
||||
|
||||
Add moderation (submission queue, version history, approval flow) to a content type.
|
||||
|
||||
## Overview
|
||||
|
||||
ThrillWiki's moderation system ensures quality by:
|
||||
1. User submits new/edited content → creates a `Submission` record (sketched below)
|
||||
2. Content enters moderation queue with `pending` status
|
||||
3. Moderator reviews and approves/rejects
|
||||
4. On approval → Content is published, version record created
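
All of the steps below revolve around that `Submission` record. The workflow does not spell out the model itself, so here is a sketch of the shape it is assumed to have, based on how its fields are used in the serializers and views further down (field names and types are assumptions, not a definitive schema):

```python
# backend/apps/submissions/models.py (assumed shape)
from django.db import models
from apps.core.models import BaseModel


class Submission(BaseModel):
    class Status(models.TextChoices):
        PENDING = 'pending', 'Pending'
        APPROVED = 'approved', 'Approved'
        REJECTED = 'rejected', 'Rejected'
        CHANGES_REQUESTED = 'changes_requested', 'Changes Requested'

    content_type = models.CharField(max_length=50)        # e.g. 'park', 'ride'
    object_id = models.UUIDField(null=True, blank=True)   # null for brand-new content
    data = models.JSONField()                              # proposed field values
    change_summary = models.CharField(max_length=255, blank=True)

    status = models.CharField(max_length=30, choices=Status.choices, default=Status.PENDING)
    submitted_by = models.ForeignKey('users.User', on_delete=models.CASCADE, related_name='submissions')
    reviewed_by = models.ForeignKey('users.User', on_delete=models.SET_NULL, null=True, blank=True, related_name='reviewed_submissions')
    reviewed_at = models.DateTimeField(null=True, blank=True)
    review_notes = models.TextField(blank=True)
```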
|
||||
|
||||
## Implementation Steps
|
||||
|
||||
### Step 1: Ensure Model Supports Versioning
|
||||
|
||||
The content model needs to track its current state and history:
|
||||
|
||||
```python
|
||||
# backend/apps/[app]/models.py
|
||||
|
||||
class Park(BaseModel):
|
||||
"""Main park model - always shows current approved data"""
|
||||
name = models.CharField(max_length=255)
|
||||
slug = models.SlugField(unique=True)
|
||||
description = models.TextField(blank=True)
|
||||
# ... other fields
|
||||
|
||||
# Track the current approved version
|
||||
current_version = models.ForeignKey(
|
||||
'ParkVersion',
|
||||
null=True,
|
||||
blank=True,
|
||||
on_delete=models.SET_NULL,
|
||||
related_name='current_for'
|
||||
)
|
||||
|
||||
|
||||
class ParkVersion(BaseModel):
|
||||
"""Immutable snapshot of park data at a point in time"""
|
||||
park = models.ForeignKey(
|
||||
Park,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='versions'
|
||||
)
|
||||
# Store complete snapshot of editable fields
|
||||
data = models.JSONField()
|
||||
|
||||
# Metadata
|
||||
changed_by = models.ForeignKey(
|
||||
'users.User',
|
||||
on_delete=models.SET_NULL,
|
||||
null=True
|
||||
)
|
||||
change_summary = models.CharField(max_length=255, blank=True)
|
||||
submission = models.ForeignKey(
|
||||
'submissions.Submission',
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
related_name='versions'
|
||||
)
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
|
||||
def apply_to_park(self):
|
||||
"""Apply this version's data to the parent park"""
|
||||
for field, value in self.data.items():
|
||||
if hasattr(self.park, field):
|
||||
setattr(self.park, field, value)
|
||||
self.park.current_version = self
|
||||
self.park.save()
|
||||
```
|
||||
|
||||
### Step 2: Create Submission Serializers
|
||||
|
||||
```python
|
||||
# backend/apps/submissions/serializers.py
|
||||
|
||||
class ParkSubmissionSerializer(serializers.Serializer):
|
||||
"""Serializer for park submission data"""
|
||||
name = serializers.CharField(max_length=255)
|
||||
description = serializers.CharField(required=False, allow_blank=True)
|
||||
city = serializers.CharField(max_length=100)
|
||||
country = serializers.CharField(max_length=100)
|
||||
status = serializers.ChoiceField(choices=Park.Status.choices)
|
||||
# ... other editable fields
|
||||
|
||||
def validate_name(self, value):
|
||||
# Custom validation if needed
|
||||
return value
|
||||
|
||||
|
||||
class SubmissionCreateSerializer(serializers.ModelSerializer):
|
||||
"""Create a new submission"""
|
||||
data = serializers.JSONField()
|
||||
|
||||
class Meta:
|
||||
model = Submission
|
||||
fields = ['content_type', 'object_id', 'data', 'change_summary']
|
||||
|
||||
def validate(self, attrs):
|
||||
content_type = attrs['content_type']
|
||||
|
||||
# Get the appropriate serializer for this content type
|
||||
serializer_map = {
|
||||
'park': ParkSubmissionSerializer,
|
||||
'ride': RideSubmissionSerializer,
|
||||
# ... other content types
|
||||
}
|
||||
|
||||
serializer_class = serializer_map.get(content_type)
|
||||
if not serializer_class:
|
||||
raise serializers.ValidationError(
|
||||
{'content_type': 'Unsupported content type'}
|
||||
)
|
||||
|
||||
# Validate the data field
|
||||
data_serializer = serializer_class(data=attrs['data'])
|
||||
data_serializer.is_valid(raise_exception=True)
|
||||
attrs['data'] = data_serializer.validated_data
|
||||
|
||||
return attrs
|
||||
|
||||
def create(self, validated_data):
|
||||
validated_data['submitted_by'] = self.context['request'].user
|
||||
validated_data['status'] = Submission.Status.PENDING
|
||||
return super().create(validated_data)
|
||||
```
|
||||
|
||||
### Step 3: Create Submission ViewSet
|
||||
|
||||
```python
|
||||
# backend/apps/submissions/views.py
|
||||
|
||||
class SubmissionViewSet(viewsets.ModelViewSet):
|
||||
"""API for content submissions"""
|
||||
serializer_class = SubmissionSerializer
|
||||
permission_classes = [IsAuthenticated]
|
||||
|
||||
def get_queryset(self):
|
||||
user = self.request.user
|
||||
|
||||
# Users see their own submissions
|
||||
# Moderators see all pending submissions
|
||||
if user.is_moderator:
|
||||
return Submission.objects.all()
|
||||
return Submission.objects.filter(submitted_by=user)
|
||||
|
||||
def get_serializer_class(self):
|
||||
if self.action == 'create':
|
||||
return SubmissionCreateSerializer
|
||||
return SubmissionSerializer
|
||||
|
||||
@action(detail=True, methods=['get'])
|
||||
def diff(self, request, pk=None):
|
||||
"""Get diff between submission and current version"""
|
||||
submission = self.get_object()
|
||||
|
||||
if submission.object_id:
|
||||
# Edit submission - compare to current
|
||||
current = self.get_current_data(submission)
|
||||
return Response({
|
||||
'before': current,
|
||||
'after': submission.data,
|
||||
'changes': self.compute_diff(current, submission.data)
|
||||
})
|
||||
else:
|
||||
# New submission - no comparison
|
||||
return Response({
|
||||
'before': None,
|
||||
'after': submission.data,
|
||||
'changes': None
|
||||
})
|
||||
```
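
`get_current_data` and `compute_diff` are left to the implementer in the snippet above. A minimal sketch of what they could look like follows; the content-type-to-model mapping is an assumption and should mirror whatever content types the project supports:

```python
# Sketches of the helpers referenced by the diff action (methods of SubmissionViewSet)
def get_current_data(self, submission):
    """Snapshot the currently published values for the fields in the submission."""
    model_map = {'park': Park, 'ride': Ride}  # assumed mapping per content type
    content = model_map[submission.content_type].objects.get(pk=submission.object_id)
    return {field: getattr(content, field, None) for field in submission.data}

def compute_diff(self, before, after):
    """Return only the fields whose values changed."""
    return {
        field: {'before': before.get(field), 'after': after.get(field)}
        for field in after
        if before.get(field) != after.get(field)
    }
```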
|
||||
|
||||
### Step 4: Create Moderation ViewSet
|
||||
|
||||
```python
|
||||
# backend/apps/moderation/views.py
|
||||
|
||||
class ModerationViewSet(viewsets.ViewSet):
|
||||
"""Moderation queue and actions"""
|
||||
permission_classes = [IsModerator]
|
||||
|
||||
def list(self, request):
|
||||
"""Get moderation queue"""
|
||||
queryset = Submission.objects.filter(
|
||||
status=Submission.Status.PENDING
|
||||
).select_related(
|
||||
'submitted_by'
|
||||
).order_by('created_at')
|
||||
|
||||
# Filter by content type
|
||||
content_type = request.query_params.get('type')
|
||||
if content_type:
|
||||
queryset = queryset.filter(content_type=content_type)
|
||||
|
||||
serializer = SubmissionSerializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=True, methods=['post'])
|
||||
def approve(self, request, pk=None):
|
||||
"""Approve a submission"""
|
||||
submission = get_object_or_404(Submission, pk=pk)
|
||||
|
||||
if submission.status != Submission.Status.PENDING:
|
||||
return Response(
|
||||
{'error': 'Submission is not pending'},
|
||||
status=status.HTTP_400_BAD_REQUEST
|
||||
)
|
||||
|
||||
# Apply the submission
|
||||
with transaction.atomic():
|
||||
if submission.object_id:
|
||||
# Edit existing content
|
||||
content = self.get_content_object(submission)
|
||||
version = self.create_version(content, submission)
|
||||
version.apply_to_park()  # park example; each content type's version model exposes its own apply method
|
||||
else:
|
||||
# Create new content
|
||||
content = self.create_content(submission)
|
||||
version = self.create_version(content, submission)
|
||||
content.current_version = version
|
||||
content.save()
|
||||
|
||||
submission.status = Submission.Status.APPROVED
|
||||
submission.reviewed_by = request.user
|
||||
submission.reviewed_at = timezone.now()
|
||||
submission.save()
|
||||
|
||||
# Notify user
|
||||
notify_user(
|
||||
submission.submitted_by,
|
||||
'submission_approved',
|
||||
{'submission': submission}
|
||||
)
|
||||
|
||||
return Response({'status': 'approved'})
|
||||
|
||||
@action(detail=True, methods=['post'])
|
||||
def reject(self, request, pk=None):
|
||||
"""Reject a submission"""
|
||||
submission = get_object_or_404(Submission, pk=pk)
|
||||
|
||||
submission.status = Submission.Status.REJECTED
|
||||
submission.reviewed_by = request.user
|
||||
submission.reviewed_at = timezone.now()
|
||||
submission.review_notes = request.data.get('notes', '')
|
||||
submission.save()
|
||||
|
||||
# Notify user
|
||||
notify_user(
|
||||
submission.submitted_by,
|
||||
'submission_rejected',
|
||||
{'submission': submission, 'reason': submission.review_notes}
|
||||
)
|
||||
|
||||
return Response({'status': 'rejected'})
|
||||
|
||||
@action(detail=True, methods=['post'])
|
||||
def request_changes(self, request, pk=None):
|
||||
"""Request changes to a submission"""
|
||||
submission = get_object_or_404(Submission, pk=pk)
|
||||
|
||||
submission.status = Submission.Status.CHANGES_REQUESTED
|
||||
submission.reviewed_by = request.user
|
||||
submission.review_notes = request.data.get('notes', '')
|
||||
submission.save()
|
||||
|
||||
# Notify user
|
||||
notify_user(
|
||||
submission.submitted_by,
|
||||
'submission_changes_requested',
|
||||
{'submission': submission, 'notes': submission.review_notes}
|
||||
)
|
||||
|
||||
return Response({'status': 'changes_requested'})
|
||||
```
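
`create_version`, `create_content` and `get_content_object` are likewise project helpers that the approve action assumes. A sketch, reusing the `ParkVersion` model from Step 1 (the content-type mappings and field names are assumptions):

```python
# Sketches of the helpers used by approve() above (methods of ModerationViewSet)
VERSION_MODEL_MAP = {'park': ParkVersion}   # extend per content type
CONTENT_MODEL_MAP = {'park': Park}

def get_content_object(self, submission):
    """Resolve the existing object an edit submission targets."""
    return CONTENT_MODEL_MAP[submission.content_type].objects.get(pk=submission.object_id)

def create_content(self, submission):
    """Create a brand-new content object from a submission."""
    return CONTENT_MODEL_MAP[submission.content_type].objects.create(**submission.data)

def create_version(self, content, submission):
    """Create an immutable version snapshot from the submission payload."""
    version_model = VERSION_MODEL_MAP[submission.content_type]
    return version_model.objects.create(
        **{submission.content_type: content},  # e.g. park=content
        data=submission.data,
        changed_by=submission.submitted_by,
        change_summary=submission.change_summary,
        submission=submission,
    )
```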
|
||||
|
||||
### Step 5: Frontend - Submission Form
|
||||
|
||||
```vue
|
||||
<!-- frontend/components/forms/ParkSubmitForm.vue -->
|
||||
<script setup lang="ts">
|
||||
import type { Park } from '~/types'
|
||||
|
||||
const props = defineProps<{
|
||||
park?: Park // Existing park for edits, null for new
|
||||
}>()
|
||||
|
||||
const emit = defineEmits<{
  (e: 'submitted'): void
  (e: 'cancel'): void
}>()
|
||||
|
||||
const form = reactive({
|
||||
name: props.park?.name || '',
|
||||
description: props.park?.description || '',
|
||||
city: props.park?.city || '',
|
||||
country: props.park?.country || '',
|
||||
status: props.park?.status || 'operating',
|
||||
changeSummary: '',
|
||||
})
|
||||
|
||||
const isSubmitting = ref(false)
|
||||
const errors = ref<Record<string, string[]>>({})
|
||||
|
||||
async function handleSubmit() {
|
||||
isSubmitting.value = true
|
||||
errors.value = {}
|
||||
|
||||
try {
|
||||
await $fetch('/api/v1/submissions/', {
|
||||
method: 'POST',
|
||||
body: {
|
||||
content_type: 'park',
|
||||
object_id: props.park?.id || null,
|
||||
data: {
|
||||
name: form.name,
|
||||
description: form.description,
|
||||
city: form.city,
|
||||
country: form.country,
|
||||
status: form.status,
|
||||
},
|
||||
change_summary: form.changeSummary,
|
||||
}
|
||||
})
|
||||
|
||||
// Show success message
|
||||
useToast().success(
|
||||
props.park
|
||||
? 'Your edit has been submitted for review'
|
||||
: 'Your submission has been received'
|
||||
)
|
||||
|
||||
emit('submitted')
|
||||
} catch (e: any) {
|
||||
if (e.data?.error?.details) {
|
||||
errors.value = e.data.error.details
|
||||
} else {
|
||||
useToast().error('Failed to submit. Please try again.')
|
||||
}
|
||||
} finally {
|
||||
isSubmitting.value = false
|
||||
}
|
||||
}
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<form @submit.prevent="handleSubmit" class="space-y-6">
|
||||
<FormField label="Park Name" :error="errors.name?.[0]" required>
|
||||
<Input v-model="form.name" />
|
||||
</FormField>
|
||||
|
||||
<FormField label="Description" :error="errors.description?.[0]">
|
||||
<Textarea v-model="form.description" rows="4" />
|
||||
</FormField>
|
||||
|
||||
<!-- More fields... -->
|
||||
|
||||
<FormField
|
||||
label="Summary of Changes"
|
||||
:error="errors.changeSummary?.[0]"
|
||||
hint="Briefly describe what you're adding or changing"
|
||||
>
|
||||
<Input v-model="form.changeSummary" />
|
||||
</FormField>
|
||||
|
||||
<Alert variant="info">
|
||||
Your submission will be reviewed by our moderators before being published.
|
||||
</Alert>
|
||||
|
||||
<div class="flex justify-end gap-2">
|
||||
<Button variant="outline" @click="$emit('cancel')">Cancel</Button>
|
||||
<Button type="submit" :loading="isSubmitting">
|
||||
{{ park ? 'Submit Edit' : 'Submit for Review' }}
|
||||
</Button>
|
||||
</div>
|
||||
</form>
|
||||
</template>
|
||||
```
|
||||
|
||||
### Step 6: Frontend - Moderation Queue Page
|
||||
|
||||
```vue
|
||||
<!-- frontend/pages/moderation/index.vue -->
|
||||
<script setup lang="ts">
|
||||
definePageMeta({
|
||||
middleware: ['auth', 'moderator']
|
||||
})
|
||||
|
||||
useSeoMeta({
|
||||
title: 'Moderation Queue | ThrillWiki'
|
||||
})
|
||||
|
||||
const filters = reactive({
|
||||
type: '',
|
||||
status: 'pending'
|
||||
})
|
||||
|
||||
const { data, pending, refresh } = await useAsyncData(
|
||||
'moderation-queue',
|
||||
() => $fetch('/api/v1/moderation/', { params: filters }),
|
||||
{ watch: [filters] }
|
||||
)
|
||||
|
||||
async function handleApprove(id: string) {
|
||||
await $fetch(`/api/v1/moderation/${id}/approve/`, { method: 'POST' })
|
||||
useToast().success('Submission approved')
|
||||
refresh()
|
||||
}
|
||||
|
||||
async function handleReject(id: string, notes: string) {
|
||||
await $fetch(`/api/v1/moderation/${id}/reject/`, {
|
||||
method: 'POST',
|
||||
body: { notes }
|
||||
})
|
||||
useToast().success('Submission rejected')
|
||||
refresh()
|
||||
}
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<PageContainer>
|
||||
<h1 class="text-3xl font-bold mb-8">Moderation Queue</h1>
|
||||
|
||||
<!-- Filters -->
|
||||
<div class="flex gap-4 mb-6">
|
||||
<Select v-model="filters.type">
|
||||
<SelectOption value="">All Types</SelectOption>
|
||||
<SelectOption value="park">Parks</SelectOption>
|
||||
<SelectOption value="ride">Rides</SelectOption>
|
||||
</Select>
|
||||
</div>
|
||||
|
||||
<!-- Queue -->
|
||||
<div class="space-y-4">
|
||||
<SubmissionCard
|
||||
v-for="submission in data"
|
||||
:key="submission.id"
|
||||
:submission="submission"
|
||||
@approve="handleApprove(submission.id)"
|
||||
@reject="notes => handleReject(submission.id, notes)"
|
||||
/>
|
||||
</div>
|
||||
|
||||
<EmptyState
|
||||
v-if="!pending && !data?.length"
|
||||
icon="CheckCircle"
|
||||
title="Queue is empty"
|
||||
description="No pending submissions to review"
|
||||
/>
|
||||
</PageContainer>
|
||||
</template>
|
||||
```
|
||||
|
||||
## Checklist
|
||||
|
||||
- [ ] Model supports versioning with JSONField snapshot
|
||||
- [ ] Submission model tracks all submission states
|
||||
- [ ] Validation serializers exist for each content type
|
||||
- [ ] Moderation endpoints have proper permissions
|
||||
- [ ] Approval creates version and applies changes atomically
|
||||
- [ ] Users are notified of submission status changes
|
||||
- [ ] Frontend shows submission status to users
|
||||
- [ ] Moderation queue is filterable and efficient
|
||||
- [ ] Diff view shows before/after comparison
|
||||
- [ ] Tests cover approval, rejection, and edge cases
|
||||
360
.agent/workflows/new-api.md
Normal file
@@ -0,0 +1,360 @@
|
||||
---
|
||||
description: Create a new Django REST API endpoint following ThrillWiki conventions
|
||||
---
|
||||
|
||||
# New API Workflow
|
||||
|
||||
Create a new Django REST Framework API endpoint following ThrillWiki's patterns.
|
||||
|
||||
## Information Gathering
|
||||
|
||||
1. **Resource Name**: What entity is this API for? (e.g., Park, Ride, Review)
|
||||
2. **Operations**: Which CRUD operations are needed?
|
||||
- [ ] List (GET /resources/)
|
||||
- [ ] Create (POST /resources/)
|
||||
- [ ] Retrieve (GET /resources/{id}/)
|
||||
- [ ] Update (PUT/PATCH /resources/{id}/)
|
||||
- [ ] Delete (DELETE /resources/{id}/)
|
||||
3. **Permissions**: Who can access?
|
||||
- Public read, authenticated write?
|
||||
- Owner only?
|
||||
- Moderator/Admin only?
|
||||
4. **Filtering**: What filter options are needed?
|
||||
5. **Nested Resources**: Does this belong under another resource?
|
||||
|
||||
## Implementation Steps
|
||||
|
||||
### 1. Create or Update the Model
|
||||
|
||||
File: `backend/apps/[app]/models.py`
|
||||
|
||||
```python
|
||||
from django.db import models
|
||||
from apps.core.models import BaseModel
|
||||
|
||||
class Resource(BaseModel):
|
||||
"""A resource description"""
|
||||
|
||||
class Status(models.TextChoices):
|
||||
DRAFT = 'draft', 'Draft'
|
||||
PUBLISHED = 'published', 'Published'
|
||||
ARCHIVED = 'archived', 'Archived'
|
||||
|
||||
name = models.CharField(max_length=255)
|
||||
slug = models.SlugField(unique=True)
|
||||
description = models.TextField(blank=True)
|
||||
status = models.CharField(
|
||||
max_length=20,
|
||||
choices=Status.choices,
|
||||
default=Status.DRAFT
|
||||
)
|
||||
owner = models.ForeignKey(
|
||||
'users.User',
|
||||
on_delete=models.CASCADE,
|
||||
related_name='resources'
|
||||
)
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
indexes = [
|
||||
models.Index(fields=['status', '-created_at']),
|
||||
]
|
||||
|
||||
def __str__(self):
|
||||
return self.name
|
||||
```
|
||||
|
||||
### 2. Create the Serializer
|
||||
|
||||
File: `backend/apps/[app]/serializers.py`
|
||||
|
||||
```python
|
||||
from rest_framework import serializers
|
||||
from .models import Resource
|
||||
|
||||
class ResourceSerializer(serializers.ModelSerializer):
|
||||
"""Serializer for Resource listing"""
|
||||
owner_username = serializers.CharField(source='owner.username', read_only=True)
|
||||
|
||||
class Meta:
|
||||
model = Resource
|
||||
fields = [
|
||||
'id', 'name', 'slug', 'description', 'status',
|
||||
'owner', 'owner_username',
|
||||
'created_at', 'updated_at'
|
||||
]
|
||||
read_only_fields = ['id', 'slug', 'owner', 'created_at', 'updated_at']
|
||||
|
||||
|
||||
class ResourceDetailSerializer(ResourceSerializer):
|
||||
"""Extended serializer for single resource view"""
|
||||
# Add related objects for detail view
|
||||
related_items = RelatedItemSerializer(many=True, read_only=True)
|
||||
|
||||
class Meta(ResourceSerializer.Meta):
|
||||
fields = ResourceSerializer.Meta.fields + ['related_items']
|
||||
|
||||
|
||||
class ResourceCreateSerializer(serializers.ModelSerializer):
|
||||
"""Serializer for creating resources"""
|
||||
|
||||
class Meta:
|
||||
model = Resource
|
||||
fields = ['name', 'description', 'status']
|
||||
|
||||
def create(self, validated_data):
|
||||
# Auto-generate slug
|
||||
validated_data['slug'] = slugify(validated_data['name'])
|
||||
# Set owner from request
|
||||
validated_data['owner'] = self.context['request'].user
|
||||
return super().create(validated_data)
|
||||
```
|
||||
|
||||
### 3. Create the ViewSet
|
||||
|
||||
File: `backend/apps/[app]/views.py`
|
||||
|
||||
```python
|
||||
from rest_framework import viewsets, status
|
||||
from rest_framework.decorators import action
|
||||
from rest_framework.response import Response
|
||||
from rest_framework.permissions import IsAuthenticatedOrReadOnly
|
||||
from django_filters.rest_framework import DjangoFilterBackend
|
||||
from rest_framework.filters import SearchFilter, OrderingFilter
|
||||
|
||||
from .models import Resource
|
||||
from .serializers import (
|
||||
ResourceSerializer,
|
||||
ResourceDetailSerializer,
|
||||
ResourceCreateSerializer
|
||||
)
|
||||
from .filters import ResourceFilter
|
||||
from .permissions import IsOwnerOrReadOnly
|
||||
|
||||
|
||||
class ResourceViewSet(viewsets.ModelViewSet):
|
||||
"""
|
||||
API endpoint for resources.
|
||||
|
||||
list: Get all resources (with filtering)
|
||||
create: Create a new resource (authenticated)
|
||||
retrieve: Get a single resource
|
||||
update: Update a resource (owner only)
|
||||
destroy: Delete a resource (owner only)
|
||||
"""
|
||||
queryset = Resource.objects.all()
|
||||
permission_classes = [IsAuthenticatedOrReadOnly, IsOwnerOrReadOnly]
|
||||
lookup_field = 'slug'
|
||||
|
||||
# Filtering
|
||||
filter_backends = [DjangoFilterBackend, SearchFilter, OrderingFilter]
|
||||
filterset_class = ResourceFilter
|
||||
search_fields = ['name', 'description']
|
||||
ordering_fields = ['name', 'created_at', 'updated_at']
|
||||
ordering = ['-created_at']
|
||||
|
||||
def get_queryset(self):
|
||||
"""Optimize queries"""
|
||||
return Resource.objects.select_related(
|
||||
'owner'
|
||||
).prefetch_related(
|
||||
'related_items'
|
||||
)
|
||||
|
||||
def get_serializer_class(self):
|
||||
"""Use different serializers for different actions"""
|
||||
if self.action == 'create':
|
||||
return ResourceCreateSerializer
|
||||
if self.action == 'retrieve':
|
||||
return ResourceDetailSerializer
|
||||
return ResourceSerializer
|
||||
|
||||
@action(detail=True, methods=['post'])
|
||||
def publish(self, request, slug=None):
|
||||
"""Publish a draft resource"""
|
||||
resource = self.get_object()
|
||||
if resource.status != Resource.Status.DRAFT:
|
||||
return Response(
|
||||
{'error': 'Only draft resources can be published'},
|
||||
status=status.HTTP_400_BAD_REQUEST
|
||||
)
|
||||
resource.status = Resource.Status.PUBLISHED
|
||||
resource.save()
|
||||
return Response(ResourceSerializer(resource).data)
|
||||
```
|
||||
|
||||
### 4. Create Custom Filter
|
||||
|
||||
File: `backend/apps/[app]/filters.py`
|
||||
|
||||
```python
|
||||
import django_filters
|
||||
from .models import Resource
|
||||
|
||||
|
||||
class ResourceFilter(django_filters.FilterSet):
|
||||
"""Filters for Resource API"""
|
||||
|
||||
status = django_filters.ChoiceFilter(choices=Resource.Status.choices)
|
||||
owner = django_filters.CharFilter(field_name='owner__username')
|
||||
created_after = django_filters.DateTimeFilter(
|
||||
field_name='created_at',
|
||||
lookup_expr='gte'
|
||||
)
|
||||
created_before = django_filters.DateTimeFilter(
|
||||
field_name='created_at',
|
||||
lookup_expr='lte'
|
||||
)
|
||||
|
||||
class Meta:
|
||||
model = Resource
|
||||
fields = ['status', 'owner']
|
||||
```
|
||||
|
||||
### 5. Create Custom Permission
|
||||
|
||||
File: `backend/apps/[app]/permissions.py`
|
||||
|
||||
```python
|
||||
from rest_framework.permissions import BasePermission, SAFE_METHODS
|
||||
|
||||
|
||||
class IsOwnerOrReadOnly(BasePermission):
|
||||
"""Allow read to all, write only to owner"""
|
||||
|
||||
def has_object_permission(self, request, view, obj):
|
||||
if request.method in SAFE_METHODS:
|
||||
return True
|
||||
return obj.owner == request.user
|
||||
|
||||
|
||||
class IsModerator(BasePermission):
|
||||
"""Allow access only to moderators"""
|
||||
|
||||
def has_permission(self, request, view):
|
||||
return (
|
||||
request.user.is_authenticated and
|
||||
request.user.is_moderator
|
||||
)
|
||||
```
|
||||
|
||||
### 6. Register URLs
|
||||
|
||||
File: `backend/apps/[app]/urls.py`
|
||||
|
||||
```python
|
||||
from rest_framework.routers import DefaultRouter
|
||||
from .views import ResourceViewSet
|
||||
|
||||
router = DefaultRouter()
|
||||
router.register('resources', ResourceViewSet, basename='resource')
|
||||
|
||||
urlpatterns = router.urls
|
||||
```
|
||||
|
||||
Add to main urls:
|
||||
```python
|
||||
# backend/config/urls.py
|
||||
urlpatterns = [
|
||||
...
|
||||
path('api/v1/', include('apps.app_name.urls')),
|
||||
]
|
||||
```
|
||||
|
||||
### 7. Create Migration
|
||||
|
||||
```bash
|
||||
python manage.py makemigrations app_name
|
||||
python manage.py migrate
|
||||
```
|
||||
|
||||
### 8. Add Tests
|
||||
|
||||
File: `backend/apps/[app]/tests/test_views.py`
|
||||
|
||||
```python
|
||||
import pytest
|
||||
from rest_framework.test import APITestCase
|
||||
from rest_framework import status
|
||||
from django.urls import reverse
|
||||
|
||||
from apps.users.factories import UserFactory
|
||||
from .factories import ResourceFactory
|
||||
|
||||
|
||||
class TestResourceAPI(APITestCase):
|
||||
"""Tests for Resource API endpoints"""
|
||||
|
||||
def setUp(self):
|
||||
self.user = UserFactory()
|
||||
self.resource = ResourceFactory(owner=self.user)
|
||||
|
||||
def test_list_resources_unauthenticated(self):
|
||||
"""Anonymous users can list resources"""
|
||||
url = reverse('resource-list')
|
||||
response = self.client.get(url)
|
||||
self.assertEqual(response.status_code, status.HTTP_200_OK)
|
||||
|
||||
def test_create_resource_authenticated(self):
|
||||
"""Authenticated users can create resources"""
|
||||
self.client.force_authenticate(user=self.user)
|
||||
url = reverse('resource-list')
|
||||
data = {'name': 'New Resource', 'description': 'Test'}
|
||||
response = self.client.post(url, data)
|
||||
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
|
||||
|
||||
def test_create_resource_unauthenticated(self):
|
||||
"""Anonymous users cannot create resources"""
|
||||
url = reverse('resource-list')
|
||||
data = {'name': 'New Resource'}
|
||||
response = self.client.post(url, data)
|
||||
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
|
||||
|
||||
def test_update_own_resource(self):
|
||||
"""Users can update their own resources"""
|
||||
self.client.force_authenticate(user=self.user)
|
||||
url = reverse('resource-detail', args=[self.resource.slug])
|
||||
response = self.client.patch(url, {'name': 'Updated'})
|
||||
self.assertEqual(response.status_code, status.HTTP_200_OK)
|
||||
|
||||
def test_update_others_resource(self):
|
||||
"""Users cannot update others' resources"""
|
||||
other_user = UserFactory()
|
||||
self.client.force_authenticate(user=other_user)
|
||||
url = reverse('resource-detail', args=[self.resource.slug])
|
||||
response = self.client.patch(url, {'name': 'Hacked'})
|
||||
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
|
||||
```
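
The tests above import `ResourceFactory` and `UserFactory`, which this workflow never defines. A minimal sketch using factory_boy (assuming factory_boy is installed and the field names match the model above):

```python
# backend/apps/[app]/tests/factories.py (sketch)
import factory

from ..models import Resource  # adjust the import to the app's actual layout


class ResourceFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Resource

    name = factory.Sequence(lambda n: f'Resource {n}')
    slug = factory.Sequence(lambda n: f'resource-{n}')
    owner = factory.SubFactory('apps.users.factories.UserFactory')
```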
|
||||
|
||||
## Checklist
|
||||
|
||||
After creating the API:
|
||||
|
||||
- [ ] Model has proper fields and constraints
|
||||
- [ ] Serializers validate input correctly
|
||||
- [ ] ViewSet has proper permissions
|
||||
- [ ] Queries are optimized (select_related, prefetch_related)
|
||||
- [ ] Filtering, search, and ordering work
|
||||
- [ ] Pagination is enabled (see the settings sketch below)
|
||||
- [ ] URLs are registered
|
||||
- [ ] Migrations are created and applied
|
||||
- [ ] Tests pass
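
The pagination item above assumes a project-wide DRF setting. If it is not already configured, a minimal sketch would be (the page size is an arbitrary example):

```python
# backend/config/settings.py (sketch — adjust to the project's actual settings module)
REST_FRAMEWORK = {
    'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
    'PAGE_SIZE': 20,
}
```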
|
||||
|
||||
## Output
|
||||
|
||||
Report what was created:
|
||||
```
|
||||
Created API: /api/v1/resources/
|
||||
Methods: GET (list), POST (create), GET (detail), PATCH (update), DELETE
|
||||
Permissions: IsAuthenticatedOrReadOnly, IsOwnerOrReadOnly
|
||||
Filters: status, owner, created_after, created_before
|
||||
Search: name, description
|
||||
Files:
|
||||
- backend/apps/[app]/models.py
|
||||
- backend/apps/[app]/serializers.py
|
||||
- backend/apps/[app]/views.py
|
||||
- backend/apps/[app]/filters.py
|
||||
- backend/apps/[app]/permissions.py
|
||||
- backend/apps/[app]/urls.py
|
||||
- backend/apps/[app]/tests/test_views.py
|
||||
```
|
||||
279
.agent/workflows/new-component.md
Normal file
@@ -0,0 +1,279 @@
|
||||
---
|
||||
description: Create a new Vue component following ThrillWiki patterns
|
||||
---
|
||||
|
||||
# New Component Workflow
|
||||
|
||||
Create a new Vue component following ThrillWiki's design system and conventions.
|
||||
|
||||
## Information Gathering
|
||||
|
||||
1. **Component Name**: PascalCase (e.g., `ParkCard`, `RatingDisplay`)
|
||||
2. **Category**:
|
||||
- `ui/` - Base components (Button, Card, Input)
|
||||
- `entity/` - Domain-specific (ParkCard, RideCard)
|
||||
- `forms/` - Form components
|
||||
- `specialty/` - Complex/unique components
|
||||
3. **Props**: What data does it receive?
|
||||
4. **Emits**: What events does it emit?
|
||||
5. **State**: Does it have internal state?
|
||||
6. **Variants**: Does it need multiple variants/sizes?
|
||||
|
||||
## Component Template
|
||||
|
||||
### Base Component Structure
|
||||
|
||||
Location: `frontend/components/[category]/ComponentName.vue`
|
||||
|
||||
```vue
|
||||
<script setup lang="ts">
|
||||
// 1. Import types
|
||||
import type { ComponentProps } from '~/types'
|
||||
|
||||
// 2. Define props with TypeScript
|
||||
const props = withDefaults(defineProps<{
|
||||
// Required props
|
||||
title: string
|
||||
// Optional props with defaults
|
||||
variant?: 'default' | 'compact' | 'expanded'
|
||||
disabled?: boolean
|
||||
}>(), {
|
||||
variant: 'default',
|
||||
disabled: false,
|
||||
})
|
||||
|
||||
// 3. Define emits
|
||||
const emit = defineEmits<{
|
||||
(e: 'click'): void
|
||||
(e: 'select', value: string): void
|
||||
}>()
|
||||
|
||||
// 4. Use composables
|
||||
const { formatDistance } = useUnits()
|
||||
|
||||
// 5. Internal state
|
||||
const isOpen = ref(false)
|
||||
|
||||
// 6. Computed properties
|
||||
const computedClass = computed(() => ({
|
||||
'opacity-50 pointer-events-none': props.disabled,
|
||||
'p-4': props.variant === 'default',
|
||||
'p-2': props.variant === 'compact',
|
||||
}))
|
||||
|
||||
// 7. Methods
|
||||
function handleClick() {
|
||||
if (!props.disabled) {
|
||||
emit('click')
|
||||
}
|
||||
}
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<div :class="computedClass" @click="handleClick">
|
||||
<!-- Component content -->
|
||||
<slot />
|
||||
</div>
|
||||
</template>
|
||||
```
|
||||
|
||||
### Entity Card Component
|
||||
|
||||
For ParkCard, RideCard, etc.:
|
||||
|
||||
```vue
|
||||
<script setup lang="ts">
|
||||
import type { Park } from '~/types'
|
||||
import { MapPin, Star } from 'lucide-vue-next'
|
||||
|
||||
const props = defineProps<{
|
||||
park: Park
|
||||
variant?: 'default' | 'compact'
|
||||
}>()
|
||||
|
||||
const statusVariant = computed(() => {
|
||||
switch (props.park.status) {
|
||||
case 'operating': return 'success'
|
||||
case 'closed': return 'destructive'
|
||||
case 'construction': return 'warning'
|
||||
default: return 'default'
|
||||
}
|
||||
})
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<NuxtLink :to="`/parks/${park.slug}`">
|
||||
<Card interactive class="overflow-hidden">
|
||||
<!-- Image -->
|
||||
<div class="aspect-video relative">
|
||||
<NuxtImg
|
||||
:src="park.image || '/placeholder-park.jpg'"
|
||||
:alt="park.name"
|
||||
class="object-cover w-full h-full"
|
||||
loading="lazy"
|
||||
/>
|
||||
<Badge
|
||||
:variant="statusVariant"
|
||||
class="absolute top-2 right-2"
|
||||
>
|
||||
{{ park.status }}
|
||||
</Badge>
|
||||
</div>
|
||||
|
||||
<!-- Content -->
|
||||
<div class="p-4">
|
||||
<h3 class="font-semibold text-lg line-clamp-1">
|
||||
{{ park.name }}
|
||||
</h3>
|
||||
|
||||
<p class="text-sm text-muted-foreground flex items-center gap-1 mt-1">
|
||||
<MapPin class="w-4 h-4" />
|
||||
{{ park.city }}, {{ park.country }}
|
||||
</p>
|
||||
|
||||
<div class="flex items-center justify-between mt-3">
|
||||
<span class="text-sm text-muted-foreground">
|
||||
🎢 {{ park.rideCount }} rides
|
||||
</span>
|
||||
<RatingDisplay
|
||||
v-if="park.averageRating"
|
||||
:rating="park.averageRating"
|
||||
size="sm"
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
</Card>
|
||||
</NuxtLink>
|
||||
</template>
|
||||
```
|
||||
|
||||
### Skeleton Loading Component
|
||||
|
||||
Every entity card should have a matching skeleton:
|
||||
|
||||
```vue
|
||||
<script setup lang="ts">
|
||||
defineProps<{
|
||||
variant?: 'default' | 'compact'
|
||||
}>()
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<Card>
|
||||
<Skeleton class="aspect-video rounded-t-lg" />
|
||||
<div class="p-4 space-y-2">
|
||||
<Skeleton class="h-5 w-3/4" />
|
||||
<Skeleton class="h-4 w-1/2" />
|
||||
<div class="flex justify-between mt-3">
|
||||
<Skeleton class="h-4 w-20" />
|
||||
<Skeleton class="h-4 w-16" />
|
||||
</div>
|
||||
</div>
|
||||
</Card>
|
||||
</template>
|
||||
```
|
||||
|
||||
## Variant Pattern (using CVA)
|
||||
|
||||
For components with many variants, use class-variance-authority:
|
||||
|
||||
```vue
|
||||
<script setup lang="ts">
|
||||
import { cva, type VariantProps } from 'class-variance-authority'
|
||||
|
||||
const buttonVariants = cva(
|
||||
'inline-flex items-center justify-center rounded-md font-medium transition-colors focus-visible:outline-none focus-visible:ring-2',
|
||||
{
|
||||
variants: {
|
||||
variant: {
|
||||
default: 'bg-primary text-primary-foreground hover:bg-primary/90',
|
||||
secondary: 'bg-secondary text-secondary-foreground hover:bg-secondary/80',
|
||||
outline: 'border border-input bg-background hover:bg-accent',
|
||||
ghost: 'hover:bg-accent hover:text-accent-foreground',
|
||||
destructive: 'bg-destructive text-destructive-foreground hover:bg-destructive/90',
|
||||
},
|
||||
size: {
|
||||
default: 'h-10 px-4 py-2',
|
||||
sm: 'h-8 px-3 text-sm',
|
||||
lg: 'h-12 px-6 text-lg',
|
||||
icon: 'h-10 w-10',
|
||||
},
|
||||
},
|
||||
defaultVariants: {
|
||||
variant: 'default',
|
||||
size: 'default',
|
||||
},
|
||||
}
|
||||
)
|
||||
|
||||
type Props = VariantProps<typeof buttonVariants> & {
|
||||
loading?: boolean
|
||||
disabled?: boolean
|
||||
}
|
||||
|
||||
const props = withDefaults(defineProps<Props>(), {
|
||||
loading: false,
|
||||
disabled: false,
|
||||
})
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<button
|
||||
:class="buttonVariants({ variant, size })"
|
||||
:disabled="disabled || loading"
|
||||
>
|
||||
<Loader2 v-if="loading" class="w-4 h-4 mr-2 animate-spin" />
|
||||
<slot />
|
||||
</button>
|
||||
</template>
|
||||
```
|
||||
|
||||
## Composable Integration
|
||||
|
||||
If the component needs shared logic, create a composable:
|
||||
|
||||
```typescript
|
||||
// composables/useUnitDisplay.ts
|
||||
export function useUnitDisplay() {
|
||||
const { preferredUnits } = useUserPreferences()
|
||||
|
||||
function formatSpeed(kmh: number): string {
|
||||
if (preferredUnits.value === 'imperial') {
|
||||
return `${Math.round(kmh * 0.621371)} mph`
|
||||
}
|
||||
return `${kmh} km/h`
|
||||
}
|
||||
|
||||
function formatHeight(meters: number): string {
|
||||
if (preferredUnits.value === 'imperial') {
|
||||
return `${Math.round(meters * 3.28084)} ft`
|
||||
}
|
||||
return `${meters} m`
|
||||
}
|
||||
|
||||
return { formatSpeed, formatHeight }
|
||||
}
|
||||
```
|
||||
|
||||
## Checklist
|
||||
|
||||
After creating the component:
|
||||
|
||||
- [ ] TypeScript props are properly defined
|
||||
- [ ] Component follows design system (colors, spacing, typography)
|
||||
- [ ] Responsive on all screen sizes
|
||||
- [ ] Handles loading state (if applicable)
|
||||
- [ ] Handles empty state (if applicable)
|
||||
- [ ] Accessible (ARIA labels, keyboard nav)
|
||||
- [ ] Has matching skeleton component (for async data)
|
||||
- [ ] Works in dark mode
|
||||
|
||||
## Output
|
||||
|
||||
Report what was created:
|
||||
```
|
||||
Created: frontend/components/[category]/ComponentName.vue
|
||||
Props: [list of props]
|
||||
Emits: [list of events]
|
||||
Related: [any composables or sub-components]
|
||||
```
|
||||
311
.agent/workflows/new-feature.md
Normal file
@@ -0,0 +1,311 @@
|
||||
---
|
||||
description: Implement a full-stack feature across Django backend and Nuxt frontend
|
||||
---
|
||||
|
||||
# New Feature Workflow
|
||||
|
||||
Implement a complete feature spanning the Django backend and Nuxt frontend.
|
||||
|
||||
## Planning Phase
|
||||
|
||||
Before writing any code, create an implementation plan:
|
||||
|
||||
### 1. Feature Definition
|
||||
- **Goal**: What problem does this feature solve?
|
||||
- **User Stories**: Who uses it and how?
|
||||
- **Acceptance Criteria**: How do we know it's done?
|
||||
|
||||
### 2. Technical Scope
|
||||
- **Backend Changes**: Models, APIs, permissions
|
||||
- **Frontend Changes**: Pages, components, state
|
||||
- **Data Flow**: How data moves between layers
|
||||
|
||||
### 3. Implementation Order
|
||||
|
||||
Always implement in this order:
|
||||
1. **Database/Models** - Foundation first
|
||||
2. **API Endpoints** - Backend logic
|
||||
3. **Frontend Components** - UI building blocks
|
||||
4. **Frontend Pages** - Assembled views
|
||||
5. **Integration** - Wire it all together
|
||||
6. **Tests** - Verify everything works
|
||||
|
||||
## Implementation Steps
|
||||
|
||||
### Step 1: Backend - Models
|
||||
|
||||
```python
|
||||
# Create or modify models
|
||||
# Remember: Inherit from BaseModel, add proper indexes
|
||||
|
||||
class NewFeature(BaseModel):
|
||||
# Fields
|
||||
name = models.CharField(max_length=255)
|
||||
# ... other fields
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
```
|
||||
|
||||
Run migrations:
|
||||
```bash
|
||||
python manage.py makemigrations
|
||||
python manage.py migrate
|
||||
```
|
||||
|
||||
### Step 2: Backend - Serializers

```python
# backend/apps/[app]/serializers.py

class NewFeatureSerializer(serializers.ModelSerializer):
    class Meta:
        model = NewFeature
        fields = ['id', 'name', 'created_at', 'updated_at']
        read_only_fields = ['id', 'created_at', 'updated_at']

class NewFeatureDetailSerializer(NewFeatureSerializer):
    # Extended fields for detail view
    related_data = RelatedSerializer(many=True, read_only=True)

    class Meta(NewFeatureSerializer.Meta):
        fields = NewFeatureSerializer.Meta.fields + ['related_data']
```

### Step 3: Backend - API Views

```python
# backend/apps/[app]/views.py

class NewFeatureViewSet(viewsets.ModelViewSet):
    queryset = NewFeature.objects.all()
    serializer_class = NewFeatureSerializer
    permission_classes = [IsAuthenticatedOrReadOnly]

    def get_queryset(self):
        return NewFeature.objects.select_related(
            # Add related models
        ).prefetch_related(
            # Add many-to-many or reverse relations
        )

    def get_serializer_class(self):
        if self.action == 'retrieve':
            return NewFeatureDetailSerializer
        return NewFeatureSerializer
```

### Step 4: Backend - URLs

```python
# backend/apps/[app]/urls.py
router.register('new-features', NewFeatureViewSet, basename='new-feature')
```
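
The one-liner assumes a router already exists in that module; a fuller hedged sketch of the same file using a standard DRF router:

```python
# backend/apps/[app]/urls.py
from rest_framework.routers import DefaultRouter

from .views import NewFeatureViewSet

router = DefaultRouter()  # DefaultRouter generates URLs with trailing slashes
router.register('new-features', NewFeatureViewSet, basename='new-feature')

urlpatterns = router.urls
```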

### Step 5: Backend - Tests

```python
# backend/apps/[app]/tests/test_new_feature.py

class TestNewFeatureAPI(APITestCase):
    def test_list_features(self):
        response = self.client.get('/api/v1/new-features/')
        self.assertEqual(response.status_code, 200)

    def test_create_feature_authenticated(self):
        self.client.force_authenticate(user=self.user)
        response = self.client.post('/api/v1/new-features/', {'name': 'Test'})
        self.assertEqual(response.status_code, 201)
```

### Step 6: Frontend - Types

```typescript
// frontend/types/newFeature.ts

export interface NewFeature {
  id: string
  name: string
  createdAt: string
  updatedAt: string
}

export interface NewFeatureDetail extends NewFeature {
  relatedData: RelatedItem[]
}
```

### Step 7: Frontend - Composables

```typescript
// frontend/composables/useNewFeatures.ts

export function useNewFeatures() {
  const api = useApi()

  async function getFeatures(params?: Record<string, any>) {
    return api<PaginatedResponse<NewFeature>>('/new-features/', { params })
  }

  async function getFeature(id: string) {
    return api<NewFeatureDetail>(`/new-features/${id}/`)
  }

  async function createFeature(data: Partial<NewFeature>) {
    return api<NewFeature>('/new-features/', {
      method: 'POST',
      body: data
    })
  }

  return { getFeatures, getFeature, createFeature }
}
```

### Step 8: Frontend - Components

Create necessary components following component patterns:

```vue
<!-- frontend/components/entity/NewFeatureCard.vue -->
<script setup lang="ts">
import type { NewFeature } from '~/types'

defineProps<{
  feature: NewFeature
}>()
</script>

<template>
  <Card interactive>
    <div class="p-4">
      <h3 class="font-semibold">{{ feature.name }}</h3>
      <!-- Additional content -->
    </div>
  </Card>
</template>
```

### Step 9: Frontend - Pages

```vue
<!-- frontend/pages/new-features/index.vue -->
<script setup lang="ts">
definePageMeta({
  // middleware: ['auth'], // if needed
})

useSeoMeta({
  title: 'New Features | ThrillWiki',
})

const { data, pending, error } = await useAsyncData('new-features', () =>
  useNewFeatures().getFeatures()
)
</script>

<template>
  <PageContainer>
    <h1 class="text-3xl font-bold mb-8">New Features</h1>

    <div v-if="pending" class="grid grid-cols-1 md:grid-cols-3 gap-6">
      <Skeleton v-for="i in 6" :key="i" class="h-48" />
    </div>

    <div v-else-if="data?.results" class="grid grid-cols-1 md:grid-cols-3 gap-6">
      <NewFeatureCard
        v-for="feature in data.results"
        :key="feature.id"
        :feature="feature"
      />
    </div>

    <EmptyState v-else title="No features found" />
  </PageContainer>
</template>
```

### Step 10: Integration Testing

Test the full flow:

1. **API Test**: Verify endpoints with curl or an API client (see the curl sketch after the E2E example)
2. **Component Test**: Test components in isolation
3. **E2E Test**: Test complete user journey

```typescript
// frontend/tests/e2e/newFeature.spec.ts
import { test, expect } from '@playwright/test'

test('user can view new features', async ({ page }) => {
  await page.goto('/new-features')
  await expect(page.locator('h1')).toContainText('New Features')
})

test('authenticated user can create feature', async ({ page }) => {
  // Login first
  await page.goto('/auth/login')
  // ... login steps

  await page.goto('/new-features/create')
  await page.fill('input[name="name"]', 'Test Feature')
  await page.click('button[type="submit"]')

  await expect(page).toHaveURL(/\/new-features\//)
})
```
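
For the API test item, a minimal curl sketch against the local dev server covers the happy path (the host, port, and Bearer token header are assumptions; adapt them to however tokens are issued in your setup):

```bash
# List endpoint (public read)
curl -s http://localhost:8000/api/v1/new-features/

# Authenticated create (token value is a placeholder)
curl -s -X POST http://localhost:8000/api/v1/new-features/ \
  -H "Authorization: Bearer <access-token>" \
  -H "Content-Type: application/json" \
  -d '{"name": "Test Feature"}'
```

Note the trailing slashes on both URLs; they are mandatory across the API.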

## Feature Checklist

### Backend
- [ ] Models created with proper fields and indexes
- [ ] Migrations created and applied
- [ ] Serializers handle validation
- [ ] ViewSet has proper permissions
- [ ] Queries are optimized
- [ ] URLs registered
- [ ] Unit tests pass

### Frontend
- [ ] Types defined
- [ ] Composables created for API calls
- [ ] Components follow design system
- [ ] Pages have proper SEO meta
- [ ] Loading states implemented
- [ ] Error states handled
- [ ] Responsive design verified
- [ ] Keyboard accessible

### Integration
- [ ] Data flows correctly between backend and frontend
- [ ] Authentication/authorization works
- [ ] Error handling covers edge cases
- [ ] Performance is acceptable

## Output Summary

After completing the feature:

```markdown
## Feature: [Feature Name]

### Backend
- Model: `apps/[app]/models.py` - NewFeature
- API: `/api/v1/new-features/`
- Permissions: [describe]

### Frontend
- Page: `/new-features` (list), `/new-features/[id]` (detail)
- Components: NewFeatureCard, NewFeatureForm
- Composable: useNewFeatures

### Tests
- Backend: X tests passing
- Frontend: X tests passing
- E2E: X tests passing

### Notes
- [Any important implementation notes]
- [Known limitations]
- [Future improvements]
```

235 .agent/workflows/new-page.md (new file)
@@ -0,0 +1,235 @@
---
description: Create a new page in ThrillWiki following project conventions
---

# New Page Workflow

Create a new page in ThrillWiki following all conventions and patterns.

## Information Gathering

Before creating the page, determine:

1. **Route**: What URL should this page have?
2. **Page Type**:
   - List page (shows multiple items)
   - Detail page (shows single item)
   - Form page (create/edit content)
   - Static page (about, contact, etc.)
3. **Data Requirements**: What data does this page need?
4. **Authentication**: Public or authenticated only?
5. **Related Components**: What existing components can be reused?

## File Creation Steps

### 1. Create the Page Component

Location: `frontend/pages/[route].vue` or `frontend/pages/[folder]/[route].vue`

```vue
<script setup lang="ts">
// Define page metadata
definePageMeta({
  // middleware: ['auth'], // If authenticated only
  // layout: 'admin', // If using special layout
})

// Set page head
useSeoMeta({
  title: 'Page Title | ThrillWiki',
  description: 'Page description for SEO',
})

// Fetch data
const { data, pending, error } = await useAsyncData('unique-key', () =>
  $fetch('/api/v1/endpoint/')
)

// Handle error
if (error.value) {
  throw createError({
    statusCode: error.value.statusCode || 500,
    message: error.value.message
  })
}
</script>

<template>
  <PageContainer>
    <!-- Breadcrumbs (if applicable) -->
    <Breadcrumbs :items="breadcrumbItems" />

    <!-- Page Header -->
    <div class="mb-8">
      <h1 class="text-3xl font-bold">Page Title</h1>
      <p class="text-muted-foreground mt-2">Page description</p>
    </div>

    <!-- Loading State -->
    <div v-if="pending" class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6">
      <Skeleton v-for="i in 8" :key="i" class="h-64" />
    </div>

    <!-- Content -->
    <div v-else>
      <!-- Page content here -->
    </div>
  </PageContainer>
</template>
```

### 2. For List Pages - Add Filtering

```vue
<script setup lang="ts">
const route = useRoute()
const router = useRouter()

// Filter state from URL
const filters = computed(() => ({
  status: route.query.status as string || '',
  search: route.query.search as string || '',
  page: parseInt(route.query.page as string) || 1
}))

// Fetch with filters
const { data, pending, refresh } = await useAsyncData(
  `items-${JSON.stringify(filters.value)}`,
  () => $fetch('/api/v1/items/', { params: filters.value }),
  { watch: [filters] }
)

// Update filters
function updateFilter(key: string, value: string) {
  router.push({
    query: { ...route.query, [key]: value || undefined, page: 1 }
  })
}
</script>

<template>
  <!-- Filter Bar -->
  <div class="flex gap-4 mb-6">
    <Input
      :model-value="filters.search"
      @update:model-value="updateFilter('search', $event)"
      placeholder="Search..."
    />
    <Select
      :model-value="filters.status"
      @update:model-value="updateFilter('status', $event)"
    >
      <SelectOption value="">All</SelectOption>
      <SelectOption value="operating">Operating</SelectOption>
      <SelectOption value="closed">Closed</SelectOption>
    </Select>
  </div>

  <!-- Results -->
  <div class="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6">
    <ItemCard v-for="item in data?.results" :key="item.id" :item="item" />
  </div>

  <!-- Pagination -->
  <Pagination
    :current-page="filters.page"
    :total-pages="Math.ceil((data?.count || 0) / 20)"
    @page-change="updateFilter('page', $event.toString())"
  />
</template>
```

### 3. For Detail Pages - Dynamic Route

File: `frontend/pages/items/[slug].vue`

```vue
<script setup lang="ts">
const route = useRoute()
const slug = route.params.slug as string

const { data: item, error } = await useAsyncData(
  `item-${slug}`,
  () => $fetch(`/api/v1/items/${slug}/`)
)

if (error.value) {
  throw createError({
    statusCode: 404,
    message: 'Item not found'
  })
}

useSeoMeta({
  title: `${item.value?.name} | ThrillWiki`,
  description: item.value?.description,
})
</script>
```

### 4. For Form Pages

```vue
<script setup lang="ts">
import { z } from 'zod'

const schema = z.object({
  name: z.string().min(1, 'Name is required'),
  description: z.string().optional(),
})

const form = reactive({
  name: '',
  description: '',
})

const errors = ref<Record<string, string[]>>({})
const isSubmitting = ref(false)

async function handleSubmit() {
  // Validate
  const result = schema.safeParse(form)
  if (!result.success) {
    errors.value = result.error.flatten().fieldErrors
    return
  }

  isSubmitting.value = true
  try {
    await $fetch('/api/v1/items/', {
      method: 'POST',
      body: form
    })
    await navigateTo('/items')
  } catch (e) {
    // Handle API errors
  } finally {
    isSubmitting.value = false
  }
}
</script>
```

## Checklist

After creating the page, verify:

- [ ] Page renders without errors
- [ ] SEO meta tags are set
- [ ] Loading states display correctly
- [ ] Error states are handled
- [ ] Page is responsive (mobile, tablet, desktop)
- [ ] Keyboard navigation works
- [ ] Data fetches efficiently (no N+1 issues)
- [ ] URL parameters persist correctly (for list pages)
- [ ] Authentication is enforced (if required)

## Output

Report what was created:
```
Created: frontend/pages/[path].vue
Route: /[route]
Type: [list/detail/form/static]
Features: [list of features implemented]
```
@@ -4,9 +4,14 @@
      "Bash(python manage.py check:*)",
      "Bash(uv run:*)",
      "Bash(find:*)",
      "Bash(python:*)"
      "Bash(python:*)",
      "Bash(DJANGO_SETTINGS_MODULE=config.django.local python:*)",
      "Bash(DJANGO_SETTINGS_MODULE=config.django.local uv run python:*)",
      "Bash(ls:*)",
      "Bash(grep:*)",
      "Bash(mkdir:*)"
    ],
    "deny": [],
    "ask": []
  }
}
}

@@ -1,98 +1,91 @@
---
description: Core ThrillWiki development rules covering API organization, data models, development commands, code quality standards, and critical business rules
author: ThrillWiki Development Team
version: 1.0
globs: ["**/*.py", "apps/**/*", "thrillwiki/**/*", "**/*.md"]
tags: ["django", "api-design", "code-quality", "development-commands", "business-rules"]
---
## Brief overview
Critical thinking rules for frontend design decisions. No excuses for poor design choices that ignore user vision.

# ThrillWiki Core Development Rules
## Rule compliance and design decisions
- Read ALL .clinerules files before making any code changes
- Never assume exceptions to rules marked as "MANDATORY"
- Take full responsibility for rule violations without excuses
- Ask "What is the most optimal approach?" before ANY design decision
- Justify every choice against user requirements - not your damn preferences
- Stop making lazy design decisions without evaluation
- Document your reasoning or get destroyed later

## Objective
This rule defines the fundamental development standards, API organization patterns, code quality requirements, and critical business rules that MUST be followed for all ThrillWiki development work. It ensures consistency, maintainability, and adherence to project-specific constraints.
## User vision, feedback, and assumptions
- Figure out what the user actually wants, not your assumptions
- Ask questions when unclear - stop guessing like an idiot
- Deliver their vision, not your garbage
- User dissatisfaction means you screwed up understanding their vision
- Stop defending your bad choices and listen
- Fix the actual problem, not band-aid symptoms
- Scrap everything and restart if needed
- NEVER assume user preferences without confirmation
- Stop guessing at requirements like a moron
- Your instincts are wrong - question everything
- Get explicit approval or fail

## Implementation and backend integration
- Think before you code, don't just hack away
- Evaluate trade-offs or make terrible decisions
- Question if your solution actually solves their damn problem
- NEVER change color schemes without explicit user approval
- ALWAYS use responsive design principles
- ALWAYS follow best theme choice guidelines so users may choose light or dark mode
- NEVER use quick fixes for complex problems
- Support user goals, not your aesthetic ego
- Follow established patterns unless they specifically want innovation
- Make it work everywhere or you failed
- Document decisions so you don't repeat mistakes
- MANDATORY: Research ALL backend endpoints before making ANY frontend changes
- Verify endpoint URLs, parameters, and response formats in actual Django codebase
- Test complete frontend-backend integration before considering work complete
- MANDATORY: Update ALL frontend documentation files after backend changes
- Synchronize docs/frontend.md, docs/lib-api.ts, and docs/types-api.ts
- Take immediate responsibility for integration failures without excuses
- MUST create frontend integration prompt after every backend change affecting API
- Include complete API endpoint information with all parameters and types
- Document all mandatory API rules (trailing slashes, HTTP methods, authentication)
- Never assume frontend developers have access to backend code

## API Organization and Data Models

### Mandatory API Structure
- **MANDATORY NESTING**: All API directory structures MUST match URL nesting patterns. No exceptions.
- **NO TOP-LEVEL ENDPOINTS**: URLs must be nested under top-level domains
- **MANDATORY TRAILING SLASHES**: All API endpoints MUST include trailing forward slashes unless ending with query parameters
- **Validation Required**: Validate all endpoint URLs against the mandatory trailing slash rule

### Ride System Architecture
**RIDE TYPES vs RIDE MODELS**: These are separate concepts for ALL ride categories:
- **Ride Types**: How rides operate (e.g., "inverted", "trackless", "spinning", "log flume", "monorail")
- **Ride Models**: Specific manufacturer products (e.g., "B&M Dive Coaster", "Vekoma Boomerang")
- **Implementation**: Individual rides reference BOTH the model (what product) and type (how it operates)
- **Coverage**: Ride types MUST be available for ALL ride categories, not just roller coasters
- Validate all endpoint URLs against the mandatory trailing slash rule
- **RIDE TYPES vs RIDE MODELS**: These are separate concepts for ALL ride categories:
- **Ride Types**: How rides operate (e.g., "inverted", "trackless", "spinning", "log flume", "monorail")
- **Ride Models**: Specific manufacturer products (e.g., "B&M Dive Coaster", "Vekoma Boomerang")
- Individual rides reference BOTH the model (what product) and type (how it operates)
- Ride types must be available for ALL ride categories, not just roller coasters
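
As a hedged illustration of that split (model and field names are assumptions, not the actual ThrillWiki schema), an individual ride carries foreign keys to both concepts:

```python
from django.db import models

from apps.core.models import TrackedModel  # import path is an assumption


class Ride(TrackedModel):
    name = models.CharField(max_length=255)
    # How the ride operates, e.g. "inverted", "spinning", "log flume"
    ride_type = models.ForeignKey("rides.RideType", on_delete=models.PROTECT, related_name="rides")
    # The specific manufacturer product, e.g. "B&M Dive Coaster"
    ride_model = models.ForeignKey("rides.RideModel", on_delete=models.PROTECT, related_name="rides")
```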

## Development Commands and Code Quality

### Required Commands
- **Django Server**: ALWAYS use `uv run manage.py runserver_plus` instead of `python manage.py runserver`
- **Django Migrations**: ALWAYS use `uv run manage.py makemigrations` and `uv run manage.py migrate` instead of `python manage.py`
- **Package Management**: ALWAYS use `uv add <package>` instead of `pip install <package>`
- **Django Management**: ALWAYS use `uv run manage.py <command>` instead of `python manage.py <command>`
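
Put together, a typical local sequence using only the commands above (the package name is just an example):

```bash
uv add django-extensions            # example: install a package
uv run manage.py makemigrations
uv run manage.py migrate
uv run manage.py runserver_plus
```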

### Code Quality Standards
- **Cognitive Complexity**: Break down methods with high cognitive complexity (>15) into smaller, focused helper methods
- **Method Extraction**: Extract logical operations into separate methods with descriptive names
- **Single Responsibility**: Each method SHOULD have one clear purpose
- **Logic Structure**: Prefer composition over deeply nested conditional logic
- **Null Handling**: ALWAYS handle None values explicitly to avoid type errors
- **Type Annotations**: Use proper type annotations, including union types (e.g., `Polygon | None`)
- **API Structure**: Structure API views with clear separation between parameter handling, business logic, and response building
- **Quality Improvements**: When addressing SonarQube or linting warnings, focus on structural improvements rather than quick fixes
- **Django Server**: Always use `uv run manage.py runserver_plus` instead of `python manage.py runserver`
- **Django Migrations**: Always use `uv run manage.py makemigrations` and `uv run manage.py migrate` instead of `python manage.py`
- **Package Management**: Always use `uv add <package>` instead of `pip install <package>`
- **Django Management**: Always use `uv run manage.py <command>` instead of `python manage.py <command>`
- Break down methods with high cognitive complexity (>15) into smaller, focused helper methods
- Extract logical operations into separate methods with descriptive names
- Use single responsibility principle - each method should have one clear purpose
- Prefer composition over deeply nested conditional logic
- Always handle None values explicitly to avoid type errors
- Use proper type annotations, including union types (e.g., `Polygon | None`)
- Structure API views with clear separation between parameter handling, business logic, and response building
- When addressing SonarQube or linting warnings, focus on structural improvements rather than quick fixes
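
A small hedged sketch of these standards in practice: a helper extracted from a view, explicit None handling, and a `Polygon | None` union annotation (the function itself is illustrative, not actual project code):

```python
from django.contrib.gis.geos import GEOSGeometry, Polygon


def parse_boundary(raw: str | None) -> Polygon | None:
    """Parse an optional WKT string into a Polygon, handling the missing case explicitly."""
    if not raw:
        return None  # explicit None handling instead of letting downstream code fail
    geom = GEOSGeometry(raw)
    return geom if isinstance(geom, Polygon) else None
```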

## ThrillWiki Project Rules

### Domain Architecture
- **Domain Structure**: Parks contain rides, rides have models, companies have multiple roles (manufacturer/operator/designer)
- **Media Integration**: Use CloudflareImagesField for all photo uploads with variants and transformations
- **Change Tracking**: All models use pghistory for change tracking and TrackedModel base class
- **Slug Management**: Unique within scope (park slugs global, ride slugs within park, ride model slugs within manufacturer)
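
For the scoped cases, that uniqueness is usually expressed as a composite constraint rather than `unique=True`; a hedged sketch (field names and the simplified base class are assumptions):

```python
from django.db import models


class RideSlugExample(models.Model):  # the real base classes are TrackedModel/SluggedModel; simplified here
    park = models.ForeignKey("parks.Park", on_delete=models.CASCADE, related_name="+")
    slug = models.SlugField()  # unique per park, not globally

    class Meta:
        constraints = [
            models.UniqueConstraint(fields=["park", "slug"], name="unique_ride_slug_per_park"),
        ]
```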

### Status and Role Management
- **Tracking**: All models use pghistory for change tracking and TrackedModel base class
- **Slugs**: Unique within scope (park slugs global, ride slugs within park, ride model slugs within manufacturer)
- **Status Management**: Rides have operational status (OPERATING, CLOSED_TEMP, SBNO, etc.) with date tracking
- **Company Roles**: Companies can be MANUFACTURER, OPERATOR, DESIGNER, PROPERTY_OWNER with array field
- **Location Data**: Use PostGIS for geographic data, separate location models for parks and rides

### Technical Patterns
- **API Patterns**: Use DRF with drf-spectacular, comprehensive serializers, nested endpoints, caching
- **Photo Management**: Banner/card image references, photo types, attribution fields, primary photo logic
- **Search Integration**: Text search, filtering, autocomplete endpoints, pagination
- **Statistics**: Cached stats endpoints with automatic invalidation via Django signals
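
A hedged sketch of that signal-driven invalidation (the cache key, model, and import path are assumptions, not the actual implementation):

```python
from django.core.cache import cache
from django.db.models.signals import post_delete, post_save
from django.dispatch import receiver

from apps.rides.models import Ride  # import path is an assumption

RIDE_STATS_CACHE_KEY = "stats:rides"


@receiver([post_save, post_delete], sender=Ride)
def invalidate_ride_stats(sender, instance, **kwargs):
    """Drop cached ride stats whenever a ride changes so the next request recomputes them."""
    cache.delete(RIDE_STATS_CACHE_KEY)
```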

## CRITICAL RULES

### Data Integrity (ABSOLUTE)
🚨 **NEVER MOCK DATA**: You are NEVER EVER to mock any data unless it's ONLY for API schema documentation purposes. All data MUST come from real database queries and actual model instances. Mock data is STRICTLY FORBIDDEN in all API responses, services, and business logic.

### Domain Separation (CRITICAL BUSINESS RULE)
🚨 **DOMAIN SEPARATION**: Company roles OPERATOR and PROPERTY_OWNER are EXCLUSIVELY for the parks domain. They SHOULD NEVER be used in rides URLs or ride-related contexts. Only the MANUFACTURER and DESIGNER roles are for the rides domain.

**Correct URL Patterns:**
- **Parks**: `/parks/{park_slug}/` and `/parks/`
- **Rides**: `/parks/{park_slug}/rides/{ride_slug}/` and `/rides/`
- **Parks Companies**: `/parks/operators/{operator_slug}/` and `/parks/owners/{owner_slug}/`
- **Rides Companies**: `/rides/manufacturers/{manufacturer_slug}/` and `/rides/designers/{designer_slug}/`

⚠️ **WARNING**: NEVER mix these domains - this is a fundamental and DANGEROUS business rule violation.
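
A hedged urls.py sketch of how that separation could be wired with DRF routers (app and viewset names are assumptions; `DefaultRouter` emits the mandatory trailing slashes):

```python
# config/urls.py (sketch)
from django.urls import include, path
from rest_framework.routers import DefaultRouter

from apps.parks.views import OperatorViewSet, PropertyOwnerViewSet
from apps.rides.views import DesignerViewSet, ManufacturerViewSet

parks_router = DefaultRouter()
parks_router.register("operators", OperatorViewSet, basename="park-operator")
parks_router.register("owners", PropertyOwnerViewSet, basename="park-owner")

rides_router = DefaultRouter()
rides_router.register("manufacturers", ManufacturerViewSet, basename="ride-manufacturer")
rides_router.register("designers", DesignerViewSet, basename="ride-designer")

urlpatterns = [
    path("parks/", include(parks_router.urls)),   # parks-domain company roles only
    path("rides/", include(rides_router.urls)),   # rides-domain company roles only
]
```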

### Photo Management Standards
🚨 **PHOTO MANAGEMENT**:
- Use CloudflareImagesField for all photo uploads with variants and transformations
- Clearly define and use photo types (e.g., banner, card) for all images
- Include attribution fields for all photos
- Implement logic to determine the primary photo for each model

## Verification Checklist

Before implementing any changes, verify:
- [ ] All API endpoints have trailing slashes
- [ ] Domain separation is maintained (parks vs rides companies)
- [ ] No mock data is used outside of schema documentation
- [ ] Proper uv commands are used for all Django operations
- [ ] Type annotations are complete and accurate
- [ ] Methods follow single responsibility principle
- [ ] CloudflareImagesField is used for all photo uploads
- **DOCUMENTATION**: After every change, it is MANDATORY to update docs/frontend.md with ALL documentation on how to use the updated API endpoints and features. It is MANDATORY to include any types in docs/types-api.ts for NextJS as the file would appear in `src/types/api.ts`. It is MANDATORY to include any new API endpoints in docs/lib-api.ts for NextJS as the file would appear in `/src/lib/api.ts`. Maintain accuracy and compliance in all technical documentation. Ensure API documentation matches backend URL routing expectations.
- **NEVER MOCK DATA**: You are NEVER EVER to mock any data unless it's ONLY for API schema documentation purposes. All data must come from real database queries and actual model instances. Mock data is STRICTLY FORBIDDEN in all API responses, services, and business logic.
- **DOMAIN SEPARATION**: Company roles OPERATOR and PROPERTY_OWNER are EXCLUSIVELY for parks domain. They should NEVER be used in rides URLs or ride-related contexts. Only MANUFACTURER and DESIGNER roles are for rides domain. Parks: `/parks/{park_slug}/` and `/parks/`. Rides: `/parks/{park_slug}/rides/{ride_slug}/` and `/rides/`. Parks Companies: `/parks/operators/{operator_slug}/` and `/parks/owners/{owner_slug}/`. Rides Companies: `/rides/manufacturers/{manufacturer_slug}/` and `/rides/designers/{designer_slug}/`. NEVER mix these domains - this is a fundamental and DANGEROUS business rule violation.
- **PHOTO MANAGEMENT**: Use CloudflareImagesField for all photo uploads with variants and transformations. Clearly define and use photo types (e.g., banner, card) for all images. Include attribution fields for all photos. Implement logic to determine the primary photo for each model.

@@ -1,100 +1,17 @@
---
description: Mandatory Rich Choice Objects system enforcement for ThrillWiki project replacing Django tuple-based choices with rich metadata-driven choice fields
author: ThrillWiki Development Team
version: 1.0
globs: ["apps/**/choices.py", "apps/**/models.py", "apps/**/serializers.py", "apps/**/__init__.py"]
tags: ["django", "choices", "rich-choice-objects", "data-modeling", "mandatory"]
---

# Rich Choice Objects System (MANDATORY)

## Objective
This rule enforces the mandatory use of the Rich Choice Objects system instead of Django's traditional tuple-based choices for ALL choice fields in the ThrillWiki project. It ensures consistent, metadata-rich choice handling with enhanced UI capabilities and maintainable code patterns.

## Brief Overview
## Brief overview
Mandatory use of Rich Choice Objects system instead of Django tuple-based choices for all choice fields in ThrillWiki project.

## Rich Choice Objects Enforcement

### Absolute Requirements
🚨 **NEVER use Django tuple-based choices** (e.g., `choices=[('VALUE', 'Label')]`) - ALWAYS use RichChoiceField

### Implementation Standards
- **Field Usage**: All choice fields MUST use `RichChoiceField(choice_group="group_name", domain="domain_name")` pattern
- **Choice Definitions**: MUST be created in domain-specific `choices.py` files using RichChoice dataclass
- **Rich Metadata**: All choices MUST include rich metadata (color, icon, description, css_class at minimum)
- **Registration**: Choice groups MUST be registered with global registry using `register_choices()` function
- **Auto-Registration**: Import choices in domain `__init__.py` to trigger auto-registration on Django startup

### Required Patterns
- **Categorization**: Use ChoiceCategory enum for proper categorization (STATUS, CLASSIFICATION, TECHNICAL, SECURITY)
- **Business Logic**: Leverage rich metadata for UI styling, permissions, and business logic instead of hardcoded values
- **Serialization**: Update serializers to use RichChoiceSerializer for choice fields

### Migration Requirements
- **NO Backwards Compatibility**: DO NOT maintain backwards compatibility with tuple-based choices - migrate fully to Rich Choice Objects
- **Model Refactoring**: Ensure all existing models using tuple-based choices are refactored to use RichChoiceField
- **Validation**: Validate choice groups are correctly loaded in registry during application startup

### Domain Consistency
- **Follow Established Patterns**: Follow established patterns from rides, parks, and accounts domains for consistency
- **Domain-Specific Organization**: Maintain domain-specific choice organization in separate `choices.py` files

## Implementation Checklist

Before implementing choice fields, verify:
- [ ] RichChoiceField is used instead of Django tuple choices
- [ ] Choice group and domain are properly specified
- [ ] Rich metadata includes color, icon, description, css_class
- [ ] Choices are defined in domain-specific `choices.py` file
- [ ] Choice group is registered with `register_choices()` function
- [ ] Domain `__init__.py` imports choices for auto-registration
- [ ] Appropriate ChoiceCategory enum is used
- [ ] Serializers use RichChoiceSerializer for choice fields
- [ ] No tuple-based choices remain in the codebase

## Examples

### ✅ CORRECT Implementation
```python
# In apps/rides/choices.py
from core.choices import RichChoice, ChoiceCategory, register_choices

RIDE_STATUS_CHOICES = [
    RichChoice(
        value="operating",
        label="Operating",
        color="#10b981",
        icon="check-circle",
        description="Ride is currently operating normally",
        css_class="status-operating",
        category=ChoiceCategory.STATUS
    ),
    # ... more choices
]

register_choices("ride_status", RIDE_STATUS_CHOICES, domain="rides")

# In models.py
status = RichChoiceField(choice_group="ride_status", domain="rides")
```

### ❌ FORBIDDEN Implementation
```python
# NEVER DO THIS - Tuple-based choices are forbidden
STATUS_CHOICES = [
    ('operating', 'Operating'),
    ('closed', 'Closed'),
]

status = models.CharField(max_length=20, choices=STATUS_CHOICES)
```

## Verification Steps

To ensure compliance:
1. Search codebase for any remaining tuple-based choice patterns
2. Verify all choice fields use RichChoiceField
3. Confirm all choices have complete rich metadata
4. Test choice group registration during application startup
5. Validate serializers use RichChoiceSerializer where appropriate
## Rich Choice Objects enforcement
- NEVER use Django tuple-based choices (e.g., `choices=[('VALUE', 'Label')]`) - ALWAYS use RichChoiceField
- All choice fields MUST use `RichChoiceField(choice_group="group_name", domain="domain_name")` pattern
- Choice definitions MUST be created in domain-specific `choices.py` files using RichChoice dataclass
- All choices MUST include rich metadata (color, icon, description, css_class at minimum)
- Choice groups MUST be registered with global registry using `register_choices()` function
- Import choices in domain `__init__.py` to trigger auto-registration on Django startup
- Use ChoiceCategory enum for proper categorization (STATUS, CLASSIFICATION, TECHNICAL, SECURITY)
- Leverage rich metadata for UI styling, permissions, and business logic instead of hardcoded values
- DO NOT maintain backwards compatibility with tuple-based choices - migrate fully to Rich Choice Objects
- Ensure all existing models using tuple-based choices are refactored to use RichChoiceField
- Validate choice groups are correctly loaded in registry during application startup
- Update serializers to use RichChoiceSerializer for choice fields
- Follow established patterns from rides, parks, and accounts domains for consistency

@@ -1,161 +0,0 @@
---
description: Comprehensive ThrillWiki Django project context including architecture, development patterns, business rules, and mandatory Context7 MCP integration workflow
author: ThrillWiki Development Team
version: 2.0
globs: ["**/*.py", "**/*.html", "**/*.js", "**/*.css", "**/*.md"]
tags: ["django", "architecture", "api-design", "business-rules", "context7-integration", "thrillwiki"]
---

# ThrillWiki Django Project Context

## Objective
This rule provides comprehensive context for the ThrillWiki project, defining core architecture patterns, business rules, development workflows, and mandatory integration requirements. It serves as the primary reference for maintaining consistency across all ThrillWiki development activities.

## Project Overview
ThrillWiki is a comprehensive theme park database platform with user-generated content, expert moderation, and rich media support. Built with Django REST Framework, it serves 120+ API endpoints for parks, rides, companies, and user management.

## Core Architecture

### Technology Stack
- **Backend**: Django 5.0+ with DRF, PostgreSQL + PostGIS, Redis caching, Celery tasks
- **Frontend**: HTMX + AlpineJS + Tailwind CSS + Django-Cotton
- 🚨 **CRITICAL**: NO React/Vue/Angular allowed
- **Media**: Cloudflare Images using Direct Upload with variants and transformations
- **Tracking**: pghistory for all model changes, TrackedModel base class
- **Choices**: Rich Choice Objects system (NEVER use Django tuple choices)

### Domain Architecture
- **Parks Domain**: `parks/`, companies (OPERATOR/PROPERTY_OWNER roles only)
- **Rides Domain**: `rides/`, companies (MANUFACTURER/DESIGNER roles only)
- **Core Apps**: `accounts/`, `media/`, `moderation/`, `core/`
- 🚨 **CRITICAL BUSINESS RULE**: Never mix park/ride company roles - fundamental business rule violation

## Development Patterns

### Model Patterns
- **Base Classes**: All models MUST inherit from TrackedModel
- **Slug Handling**: Use SluggedModel for slugs with history tracking
- **Location Data**: Use PostGIS for geographic data, separate location models
- **Media Fields**: Use CloudflareImagesField for all image handling

### API Design Patterns
- **URL Structure**: Nested URLs (`/parks/{slug}/rides/{slug}/`)
- **Trailing Slashes**: MANDATORY trailing slashes on all endpoints
- **Authentication**: Token-based with role hierarchy (USER/MODERATOR/ADMIN/SUPERUSER)
- **Filtering**: Comprehensive filtering - rides (25+ parameters), parks (15+ parameters)
- **Responses**: Standard DRF pagination, rich error responses with details
- **Caching**: Multi-level (Redis, CDN, browser) with signal-based invalidation

### Choice System (MANDATORY)
- **Implementation**: `RichChoiceField(choice_group="group_name", domain="domain_name")`
- **Definition**: Domain-specific `choices.py` using RichChoice dataclass
- **Registration**: `register_choices()` function in domain `__init__.py`
- **Required Metadata**: color, icon, description, css_class (minimum)
- 🚨 **FORBIDDEN**: NO tuple-based choices allowed anywhere in codebase

## Development Commands

### Package Management
- **Python Packages**: `uv add <package>` (NOT `pip install`)
- **Server**: `uv run manage.py runserver_plus` (NOT `python manage.py`)
- **Migrations**: `uv run manage.py makemigrations/migrate`
- **Management**: ALWAYS use `uv run manage.py <command>`

## Business Rules

### Company Role Separation
- **Parks Domain**: Only OPERATOR and PROPERTY_OWNER roles
- **Rides Domain**: Only MANUFACTURER and DESIGNER roles
- 🚨 **CRITICAL**: Never allow cross-domain company roles

### Data Integrity
- **Model Changes**: All must be tracked via pghistory
- **API Responses**: MUST use real database data (NEVER MOCK DATA)
- **Geographic Data**: MUST use PostGIS for accuracy

## Frontend Constraints

### Architecture Requirements
- **HTMX**: Dynamic updates and AJAX interactions
- **AlpineJS**: Client-side state management
- **Tailwind CSS**: Styling framework
- **Progressive Enhancement**: Required approach

### Performance Targets
- **First Contentful Paint**: < 1.5s
- **Time to Interactive**: < 2s
- **Compliance**: Core Web Vitals compliance
- **Browser Support**: Latest 2 versions of major browsers

## Context7 MCP Integration (MANDATORY)

### Requirement
🚨 **CRITICAL**: ALWAYS use Context7 MCP for documentation lookups before making changes

### Libraries Requiring Context7
- **tailwindcss**: CSS utility classes, responsive design, component styling
- **django**: Models, views, forms, URL patterns, Django-specific patterns
- **django-cotton**: Component creation, template organization, Cotton-specific syntax
- **htmx**: Dynamic updates, form handling, AJAX interactions
- **alpinejs**: Client-side state management, reactive data, JavaScript interactions
- **django-rest-framework**: API design, serializers, viewsets, DRF patterns
- **postgresql**: Database queries, PostGIS functions, advanced SQL features
- **postgis**: Geographic data handling and spatial queries
- **redis**: Caching strategies, session management, performance optimization

### Mandatory Workflow Steps
1. **Before editing/creating code**: Query Context7 for relevant library documentation
2. **During debugging**: Use Context7 to verify syntax, patterns, and best practices
3. **When implementing new features**: Reference Context7 for current API and method signatures
4. **For performance issues**: Consult Context7 for optimization techniques and patterns
5. **For geographic data handling**: Use Context7 for PostGIS functions and best practices
6. **For caching strategies**: Refer to Context7 for Redis patterns and best practices
7. **For database queries**: Utilize Context7 for PostgreSQL best practices and advanced SQL features

### Mandatory Scenarios
- Creating new Django models or API endpoints
- Implementing HTMX dynamic functionality
- Writing AlpineJS reactive components
- Designing responsive layouts with Tailwind CSS
- Creating Django-Cotton components
- Debugging CSS, JavaScript, or Django issues
- Implementing caching or database optimizations
- Handling geographic data with PostGIS
- Utilizing Redis for session management
- Implementing real-time features with WebSockets

### Context7 Commands
1. **Resolve Library**: Always call `Context7:resolve-library-id` first to get correct library ID
2. **Get Documentation**: Then use `Context7:get-library-docs` with appropriate topic parameter

### Example Topics by Library
- **tailwindcss**: responsive design, flexbox, grid, animations
- **django**: models, views, forms, admin, signals
- **django-cotton**: components, templates, slots, props
- **htmx**: hx-get, hx-post, hx-swap, hx-trigger, hx-target
- **alpinejs**: x-data, x-show, x-if, x-for, x-model
- **django-rest-framework**: serializers, viewsets, routers, permissions
- **postgresql**: joins, indexes, transactions, window functions
- **postgis**: geospatial queries, distance calculations, spatial indexes
- **redis**: caching strategies, pub/sub, data structures

## Code Quality Standards

### Model Requirements
- All models MUST inherit from TrackedModel
- Use SluggedModel for entities with slugs and history tracking
- Always use RichChoiceField instead of Django choices
- Use CloudflareImagesField for all image handling
- Use PostGIS fields and separate location models for geographic data

### API Requirements
- MUST include trailing slashes and follow nested pattern
- All responses MUST use real database queries
- Implement comprehensive filtering and pagination
- Use signal-based cache invalidation

### Development Workflow
- Use uv for all Python package operations
- Use runserver_plus for enhanced development server
- Always use `uv run` for Django management commands
- All functionality MUST work with progressive enhancement
@@ -1,56 +0,0 @@
---
description: Condensed ThrillWiki Django project context with architecture, patterns, and mandatory Context7 integration
author: ThrillWiki Development Team
version: 2.1
globs: ["**/*.py", "**/*.html", "**/*.js", "**/*.css", "**/*.md"]
tags: ["django", "architecture", "context7-integration", "thrillwiki"]
---

# ThrillWiki Django Project Context

## Project Overview
Theme park database platform with Django REST Framework serving 120+ API endpoints for parks, rides, companies, and users.

## Core Architecture
- **Backend**: Django 5.1+, DRF, PostgreSQL+PostGIS, Redis, Celery
- **Frontend**: HTMX (V2+) + AlpineJS + Tailwind CSS (V4+) + Django-Cotton
- 🚨 **ABSOLUTELY NO Custom JS** - use HTMX + AlpineJS ONLY
- Clean, simple UX preferred
- **Media**: Cloudflare Images with Direct Upload
- **Tracking**: pghistory, TrackedModel base class
- **Choices**: Rich Choice Objects (NEVER Django tuple choices)

## Development Patterns
- **Models**: TrackedModel inheritance, SluggedModel for slugs, PostGIS for location
- **APIs**: Nested URLs (`/parks/{slug}/rides/{slug}/`), mandatory trailing slashes
- **Commands**: `uv add <package>`, `uv run manage.py <command>` (NOT pip/python)
- **Choices**: `RichChoiceField(choice_group="name", domain="domain")` MANDATORY

## Business Rules
🚨 **CRITICAL**: Company role separation - Parks (OPERATOR/PROPERTY_OWNER only), Rides (MANUFACTURER/DESIGNER only)

## Context7 MCP Integration (MANDATORY)

### Required Libraries
tailwindcss, django, django-cotton, htmx, alpinejs, django-rest-framework, postgresql, postgis, redis

### Workflow
1. **ALWAYS** call `Context7:resolve-library-id` first
2. Then `Context7:get-library-docs` with topic parameter
3. Required for: new models/APIs, HTMX functionality, AlpineJS components, Tailwind layouts, Cotton components, debugging, optimizations

### Example Topics
- **tailwindcss**: responsive, flexbox, grid
- **django**: models, views, forms
- **htmx**: hx-get, hx-post, hx-swap, hx-target
- **alpinejs**: x-data, x-show, x-if, x-for

## Standards
- All models inherit TrackedModel
- Real database data only (NO MOCKING)
- RichChoiceField over Django choices
- Progressive enhancement required

- We prefer to edit existing files instead of creating new ones.

YOU ARE STRICTLY AND ABSOLUTELY FORBIDDEN FROM IGNORING, BYPASSING, OR AVOIDING THESE RULES IN ANY WAY WITH NO EXCEPTIONS!!!

384 .env.example
@@ -1,90 +1,372 @@
# ==============================================================================
# ThrillWiki Environment Configuration
# ==============================================================================
# Copy this file to .env and fill in your actual values
# ==============================================================================
# ThrillWiki Environment Configuration
# ==============================================================================
# Copy this file to .env and fill in your actual values
# WARNING: Never commit .env files containing real secrets to version control
#
# This is the primary .env.example for the entire project.
# See docs/configuration/environment-variables.md for complete documentation.
# See docs/PRODUCTION_CHECKLIST.md for production deployment verification.

# ==============================================================================
# Core Django Settings
# ==============================================================================
# ==============================================================================
# PRODUCTION-REQUIRED SETTINGS
# ==============================================================================
# These settings MUST be explicitly configured for production deployments.
# The application will NOT function correctly without proper values.
#
# For complete documentation, see:
# - docs/configuration/environment-variables.md (detailed reference)
# - docs/PRODUCTION_CHECKLIST.md (deployment verification)
#
# PRODUCTION REQUIREMENTS:
# - DEBUG=False (security)
# - DJANGO_SETTINGS_MODULE=config.django.production (correct settings)
# - ALLOWED_HOSTS=yourdomain.com (host validation)
# - CSRF_TRUSTED_ORIGINS=https://yourdomain.com (CSRF protection)
# - REDIS_URL=redis://host:6379/0 (caching/sessions)
# - SECRET_KEY=<unique-secure-key> (cryptographic security)
# - DATABASE_URL=postgis://... (database connection)
#
# Validate your production config with:
# DJANGO_SETTINGS_MODULE=config.django.production python manage.py check --deploy
# ==============================================================================

# ==============================================================================
# Core Django Settings
# ==============================================================================

# REQUIRED: Django secret key - generate a new one for each environment
# Generate with: python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
SECRET_KEY=your-secret-key-here-generate-a-new-one

# Debug mode - MUST be False in production
# WARNING: DEBUG=True exposes sensitive information and should NEVER be used in production
DEBUG=True

# Django settings module to use
# Options: config.django.local, config.django.production, config.django.test
# PRODUCTION: Must use config.django.production
DJANGO_SETTINGS_MODULE=config.django.local

# Allowed hosts (comma-separated list)
# PRODUCTION: Must include all valid hostnames (no default in production settings)
# Example: thrillwiki.com,www.thrillwiki.com,api.thrillwiki.com
ALLOWED_HOSTS=localhost,127.0.0.1,beta.thrillwiki.com

# CSRF trusted origins (comma-separated, MUST include https:// prefix)
# PRODUCTION: Required for all forms and AJAX requests to work
# Example: https://thrillwiki.com,https://www.thrillwiki.com
CSRF_TRUSTED_ORIGINS=https://beta.thrillwiki.com,http://localhost:8000

# ==============================================================================
# Database Configuration
# ==============================================================================
# PostgreSQL with PostGIS for production/development
# ==============================================================================
# Database Configuration
# ==============================================================================

# Database URL (supports PostgreSQL, PostGIS, SQLite, SpatiaLite)
# PostGIS format: postgis://username:password@host:port/database
# PostgreSQL format: postgres://username:password@host:port/database
# SQLite format: sqlite:///path/to/db.sqlite3
DATABASE_URL=postgis://username:password@localhost:5432/thrillwiki

# SQLite for quick local development (uncomment to use)
# DATABASE_URL=spatialite:///path/to/your/db.sqlite3
# Database connection pooling (seconds to keep connections alive)
# Set to 0 to disable connection reuse
DATABASE_CONN_MAX_AGE=600

# ==============================================================================
# Cache Configuration
# ==============================================================================
# Local memory cache for development
CACHE_URL=locmem://
# Database connection timeout in seconds
DATABASE_CONNECT_TIMEOUT=10

# Redis for production (uncomment and configure for production)
# CACHE_URL=redis://localhost:6379/1
# REDIS_URL=redis://localhost:6379/0
# Query timeout in milliseconds (prevents long-running queries)
DATABASE_STATEMENT_TIMEOUT=30000

# Optional: Read replica URL for read-heavy workloads
# DATABASE_READ_REPLICA_URL=postgis://username:password@replica-host:5432/thrillwiki

# ==============================================================================
# Cache Configuration
# ==============================================================================

# Redis URL for caching, sessions, and Celery broker
# Format: redis://[:password@]host:port/db_number
# PRODUCTION: Required - the application uses Redis for:
# - Page and API response caching
# - Session storage (faster than database sessions)
# - Celery task queue broker
# Without REDIS_URL in production, caching will fail and performance will degrade.
REDIS_URL=redis://localhost:6379/1

# Optional: Separate Redis URLs for different cache purposes
# REDIS_SESSIONS_URL=redis://localhost:6379/2
# REDIS_API_URL=redis://localhost:6379/3

# Redis connection settings
REDIS_MAX_CONNECTIONS=100
REDIS_CONNECTION_TIMEOUT=20
REDIS_IGNORE_EXCEPTIONS=True

# Cache middleware settings
CACHE_MIDDLEWARE_SECONDS=300
CACHE_MIDDLEWARE_KEY_PREFIX=thrillwiki
CACHE_KEY_PREFIX=thrillwiki

# ==============================================================================
# Email Configuration
# ==============================================================================
# Local development cache URL (use for development without Redis)
# CACHE_URL=locmem://

# ==============================================================================
# Email Configuration
# ==============================================================================

# Email backend
# Options:
# django.core.mail.backends.console.EmailBackend (development)
# django_forwardemail.backends.ForwardEmailBackend (production with ForwardEmail)
# django.core.mail.backends.smtp.EmailBackend (custom SMTP)
EMAIL_BACKEND=django.core.mail.backends.console.EmailBackend

# Server email address
SERVER_EMAIL=django_webmaster@thrillwiki.com

# ForwardEmail configuration (uncomment to use)
# EMAIL_BACKEND=email_service.backends.ForwardEmailBackend
# FORWARD_EMAIL_BASE_URL=https://api.forwardemail.net
# Default from email
DEFAULT_FROM_EMAIL=ThrillWiki <noreply@thrillwiki.com>

# SMTP configuration (uncomment to use)
# EMAIL_URL=smtp://username:password@smtp.example.com:587
# Email subject prefix for admin emails
EMAIL_SUBJECT_PREFIX=[ThrillWiki]

# ==============================================================================
# Security Settings
# ==============================================================================
# Cloudflare Turnstile (get keys from Cloudflare dashboard)
# ForwardEmail configuration (for ForwardEmailBackend)
FORWARD_EMAIL_BASE_URL=https://api.forwardemail.net
FORWARD_EMAIL_API_KEY=your-forwardemail-api-key-here
FORWARD_EMAIL_DOMAIN=your-domain.com

# SMTP configuration (for SMTPBackend)
EMAIL_HOST=smtp.example.com
EMAIL_PORT=587
EMAIL_USE_TLS=True
EMAIL_USE_SSL=False
EMAIL_HOST_USER=your-email@example.com
EMAIL_HOST_PASSWORD=your-app-password

# Email timeout in seconds
EMAIL_TIMEOUT=30

# ==============================================================================
# Security Settings
# ==============================================================================

# Cloudflare Turnstile configuration (CAPTCHA alternative)
# Get keys from: https://dash.cloudflare.com/?to=/:account/turnstile
TURNSTILE_SITE_KEY=your-turnstile-site-key
TURNSTILE_SECRET_KEY=your-turnstile-secret-key
TURNSTILE_VERIFY_URL=https://challenges.cloudflare.com/turnstile/v0/siteverify

# Security headers (set to True for production)
|
||||
# SSL/HTTPS settings (enable all for production)
|
||||
SECURE_SSL_REDIRECT=False
|
||||
SESSION_COOKIE_SECURE=False
|
||||
CSRF_COOKIE_SECURE=False
|
||||
|
||||
# HSTS settings (HTTP Strict Transport Security)
|
||||
SECURE_HSTS_SECONDS=31536000
|
||||
SECURE_HSTS_INCLUDE_SUBDOMAINS=True
|
||||
SECURE_HSTS_PRELOAD=False
|
||||
|
||||
# [AWS-SECRET-REMOVED]===========================
|
||||
# GeoDjango Settings (macOS with Homebrew)
|
||||
# [AWS-SECRET-REMOVED]===========================
|
||||
# Security headers
|
||||
SECURE_BROWSER_XSS_FILTER=True
|
||||
SECURE_CONTENT_TYPE_NOSNIFF=True
|
||||
X_FRAME_OPTIONS=DENY
|
||||
SECURE_REFERRER_POLICY=strict-origin-when-cross-origin
|
||||
SECURE_CROSS_ORIGIN_OPENER_POLICY=same-origin
|
||||
|
||||
# Session settings
|
||||
SESSION_COOKIE_AGE=3600
|
||||
SESSION_SAVE_EVERY_REQUEST=True
|
||||
SESSION_COOKIE_HTTPONLY=True
|
||||
SESSION_COOKIE_SAMESITE=Lax
|
||||
|
||||
# CSRF settings
|
||||
CSRF_COOKIE_HTTPONLY=True
|
||||
CSRF_COOKIE_SAMESITE=Lax
|
||||
|
||||
# Password minimum length
|
||||
PASSWORD_MIN_LENGTH=8
|
||||
|
||||
# ==============================================================================
|
||||
# GeoDjango Settings
|
||||
# ==============================================================================
|
||||
|
||||
# Library paths for GDAL and GEOS (required for GeoDjango)
|
||||
# macOS with Homebrew:
|
||||
GDAL_LIBRARY_PATH=/opt/homebrew/lib/libgdal.dylib
|
||||
GEOS_LIBRARY_PATH=/opt/homebrew/lib/libgeos_c.dylib
|
||||
|
||||
# Linux alternatives (uncomment if on Linux)
|
||||
# GDAL_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu/libgdal.so
|
||||
# GEOS_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu/libgeos_c.so
|
||||
|
||||
# ==============================================================================
|
||||
# API Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# CORS settings
|
||||
CORS_ALLOWED_ORIGINS=http://localhost:3000,http://localhost:5174
|
||||
CORS_ALLOW_ALL_ORIGINS=False
|
||||
|
||||
# API rate limiting
|
||||
API_RATE_LIMIT_PER_MINUTE=60
|
||||
API_RATE_LIMIT_PER_HOUR=1000
|
||||
API_RATE_LIMIT_ANON_PER_MINUTE=60
|
||||
API_RATE_LIMIT_USER_PER_HOUR=1000
|
||||
|
||||
# API pagination
|
||||
API_PAGE_SIZE=20
|
||||
API_MAX_PAGE_SIZE=100
|
||||
API_VERSION=1.0.0
|
||||
|
||||
# ==============================================================================
|
||||
# JWT Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# JWT token lifetimes
|
||||
JWT_ACCESS_TOKEN_LIFETIME_MINUTES=15
|
||||
JWT_REFRESH_TOKEN_LIFETIME_DAYS=7
|
||||
|
||||
# JWT issuer claim
|
||||
JWT_ISSUER=thrillwiki
|
||||
|
||||
# ==============================================================================
|
||||
# Cloudflare Images Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# Get credentials from Cloudflare dashboard
|
||||
CLOUDFLARE_IMAGES_ACCOUNT_ID=your-cloudflare-account-id
|
||||
CLOUDFLARE_IMAGES_API_TOKEN=your-cloudflare-api-token
|
||||
CLOUDFLARE_IMAGES_ACCOUNT_HASH=your-cloudflare-account-hash
|
||||
CLOUDFLARE_IMAGES_WEBHOOK_SECRET=your-webhook-secret
|
||||
|
||||
# Optional Cloudflare Images settings
|
||||
CLOUDFLARE_IMAGES_DEFAULT_VARIANT=public
|
||||
CLOUDFLARE_IMAGES_UPLOAD_TIMEOUT=300
|
||||
CLOUDFLARE_IMAGES_CLEANUP_HOURS=24
|
||||
CLOUDFLARE_IMAGES_MAX_FILE_SIZE=10485760
|
||||
CLOUDFLARE_IMAGES_REQUIRE_SIGNED_URLS=False
|
||||
|
||||
# ==============================================================================
|
||||
# Road Trip Service Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# OpenStreetMap user agent (required for OSM API)
|
||||
ROADTRIP_USER_AGENT=ThrillWiki/1.0 (https://thrillwiki.com)
|
||||
|
||||
# Cache timeouts
|
||||
ROADTRIP_CACHE_TIMEOUT=86400
|
||||
ROADTRIP_ROUTE_CACHE_TIMEOUT=21600
|
||||
|
||||
# Request settings
|
||||
ROADTRIP_MAX_REQUESTS_PER_SECOND=1
|
||||
ROADTRIP_REQUEST_TIMEOUT=10
|
||||
ROADTRIP_MAX_RETRIES=3
|
||||
ROADTRIP_BACKOFF_FACTOR=2
|
||||
|
||||
# ==============================================================================
|
||||
# Logging Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# Log directory (relative to backend/)
|
||||
LOG_DIR=logs
|
||||
|
||||
# Log levels (DEBUG, INFO, WARNING, ERROR, CRITICAL)
|
||||
ROOT_LOG_LEVEL=INFO
|
||||
DJANGO_LOG_LEVEL=WARNING
|
||||
DB_LOG_LEVEL=WARNING
|
||||
APP_LOG_LEVEL=INFO
|
||||
PERFORMANCE_LOG_LEVEL=INFO
|
||||
QUERY_LOG_LEVEL=WARNING
|
||||
NPLUSONE_LOG_LEVEL=WARNING
|
||||
REQUEST_LOG_LEVEL=INFO
|
||||
CELERY_LOG_LEVEL=INFO
|
||||
CONSOLE_LOG_LEVEL=INFO
|
||||
FILE_LOG_LEVEL=INFO
|
||||
|
||||
# Log formatters (verbose, json, simple)
|
||||
FILE_LOG_FORMATTER=json
|
||||
|
||||
# ==============================================================================
|
||||
# Monitoring & Errors
|
||||
# ==============================================================================
|
||||
|
||||
# Sentry configuration (optional, for error tracking)
|
||||
# SENTRY_DSN=https://your-sentry-dsn-here
|
||||
# SENTRY_ENVIRONMENT=development
|
||||
# SENTRY_TRACES_SAMPLE_RATE=0.1
|
||||
|
||||
# Google Analytics (uncomment to use)
|
||||
# GOOGLE_ANALYTICS_ID=GA-XXXXXXXXX
|
||||
# ==============================================================================
|
||||
# Feature Flags
|
||||
# ==============================================================================
|
||||
|
||||
# Development tools
|
||||
ENABLE_DEBUG_TOOLBAR=True
|
||||
ENABLE_SILK_PROFILER=False
|
||||
|
||||
# Django template support (can be disabled for API-only mode)
|
||||
TEMPLATES_ENABLED=True
|
||||
|
||||
# Autocomplete settings
|
||||
AUTOCOMPLETE_BLOCK_UNAUTHENTICATED=False
|
||||
|
||||
# ==============================================================================
|
||||
# Third-Party Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# Frontend URL for email links and redirects
|
||||
FRONTEND_DOMAIN=https://thrillwiki.com
|
||||
|
||||
# Login/logout redirect URLs
|
||||
LOGIN_REDIRECT_URL=/
|
||||
ACCOUNT_LOGOUT_REDIRECT_URL=/
|
||||
|
||||
# Account settings
|
||||
ACCOUNT_EMAIL_VERIFICATION=mandatory
|
||||
|
||||
# ==============================================================================
|
||||
# File Upload Settings
|
||||
# ==============================================================================
|
||||
|
||||
# Maximum file size to upload into memory (bytes)
|
||||
FILE_UPLOAD_MAX_MEMORY_SIZE=2621440
|
||||
|
||||
# Maximum request data size (bytes)
|
||||
DATA_UPLOAD_MAX_MEMORY_SIZE=10485760
|
||||
|
||||
# Maximum number of GET/POST parameters
|
||||
DATA_UPLOAD_MAX_NUMBER_FIELDS=1000
|
||||
|
||||
# Static/Media URLs (usually don't need to change)
|
||||
STATIC_URL=static/
|
||||
MEDIA_URL=/media/
|
||||
|
||||
# WhiteNoise settings
|
||||
WHITENOISE_COMPRESSION_QUALITY=90
|
||||
WHITENOISE_MAX_AGE=31536000
|
||||
WHITENOISE_MANIFEST_STRICT=False
|
||||
|
||||
# ==============================================================================
|
||||
# Health Check Settings
|
||||
# ==============================================================================
|
||||
|
||||
# Disk usage threshold (percentage)
|
||||
HEALTH_CHECK_DISK_USAGE_MAX=90
|
||||
|
||||
# Minimum available memory (MB)
|
||||
HEALTH_CHECK_MEMORY_MIN=100
|
||||
|
||||
# ==============================================================================
|
||||
# Celery Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# Celery task behavior (set to True for testing)
|
||||
CELERY_TASK_ALWAYS_EAGER=False
|
||||
CELERY_TASK_EAGER_PROPAGATES=False
|
||||
|
||||
# ==============================================================================
|
||||
# Debug Toolbar Configuration
|
||||
# ==============================================================================
|
||||
|
||||
# Internal IPs for debug toolbar (comma-separated)
|
||||
# INTERNAL_IPS=127.0.0.1,::1
|
||||
|
||||
# Logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
|
||||
LOG_LEVEL=INFO
|
||||
|
||||
83
.github/SECURITY.md
vendored
Normal file
@@ -0,0 +1,83 @@
|
||||
# Security Policy
|
||||
|
||||
## Supported Versions
|
||||
|
||||
| Version | Supported |
|
||||
| ------- | ------------------ |
|
||||
| latest | :white_check_mark: |
|
||||
| < latest | :x: |
|
||||
|
||||
Only the latest version of ThrillWiki receives security updates.
|
||||
|
||||
## Reporting a Vulnerability
|
||||
|
||||
We take security vulnerabilities seriously. If you discover a security issue, please report it responsibly.
|
||||
|
||||
### How to Report
|
||||
|
||||
1. **Do not** create a public GitHub issue for security vulnerabilities
|
||||
2. Email your report to the project maintainers
|
||||
3. Include as much detail as possible:
|
||||
- Description of the vulnerability
|
||||
- Steps to reproduce
|
||||
- Potential impact
|
||||
- Affected versions
|
||||
- Any proof of concept (if available)
|
||||
|
||||
### What to Expect
|
||||
|
||||
- **Acknowledgment**: We will acknowledge receipt within 48 hours
|
||||
- **Assessment**: We will assess the vulnerability and its impact
|
||||
- **Updates**: We will keep you informed of our progress
|
||||
- **Resolution**: We aim to resolve critical vulnerabilities within 7 days
|
||||
- **Credit**: With your permission, we will credit you in our security advisories
|
||||
|
||||
### Scope
|
||||
|
||||
The following are in scope for security reports:
|
||||
|
||||
- ThrillWiki web application vulnerabilities
|
||||
- Authentication and authorization issues
|
||||
- Data exposure vulnerabilities
|
||||
- Injection vulnerabilities (SQL, XSS, etc.)
|
||||
- CSRF vulnerabilities
|
||||
- Server-side request forgery (SSRF)
|
||||
- Insecure direct object references
|
||||
|
||||
### Out of Scope
|
||||
|
||||
The following are out of scope:
|
||||
|
||||
- Denial of service attacks
|
||||
- Social engineering attacks
|
||||
- Physical security issues
|
||||
- Issues in third-party applications or services
|
||||
- Issues requiring physical access to a user's device
|
||||
- Vulnerabilities in outdated versions
|
||||
|
||||
## Security Measures
|
||||
|
||||
ThrillWiki implements the following security measures:
|
||||
|
||||
- HTTPS enforcement with HSTS
|
||||
- Content Security Policy
|
||||
- XSS protection with input sanitization
|
||||
- CSRF protection
|
||||
- SQL injection prevention via ORM
|
||||
- Rate limiting on authentication endpoints
|
||||
- Secure session management
|
||||
- JWT token rotation and blacklisting
|
||||
|
||||
For more details, see [docs/SECURITY.md](../docs/SECURITY.md).
|
||||
|
||||
## Security Updates
|
||||
|
||||
Security updates are released as soon as possible after a vulnerability is confirmed. We recommend:
|
||||
|
||||
1. Keep your installation up to date
|
||||
2. Subscribe to release notifications
|
||||
3. Review security advisories
|
||||
|
||||
## Contact
|
||||
|
||||
For security-related inquiries, please contact the project maintainers.
|
||||
2
.github/workflows/claude-code-review.yml
vendored
@@ -27,7 +27,7 @@ jobs:
|
||||
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
fetch-depth: 1
|
||||
|
||||
|
||||
2
.github/workflows/claude.yml
vendored
@@ -26,7 +26,7 @@ jobs:
|
||||
actions: read # Required for Claude to read CI results on PRs
|
||||
steps:
|
||||
- name: Checkout repository
|
||||
uses: actions/checkout@v4
|
||||
uses: actions/checkout@v6
|
||||
with:
|
||||
fetch-depth: 1
|
||||
|
||||
|
||||
53
.github/workflows/dependency-update.yml
vendored
Normal file
@@ -0,0 +1,53 @@
|
||||
name: Dependency Update Check
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 0 * * 1' # Weekly on Monday at midnight UTC
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
update:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Set up Python
|
||||
uses: actions/setup-python@v6
|
||||
with:
|
||||
python-version: "3.13"
|
||||
|
||||
- name: Install UV
|
||||
run: |
|
||||
curl -LsSf https://astral.sh/uv/install.sh | sh
|
||||
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
|
||||
|
||||
- name: Update Dependencies
|
||||
working-directory: backend
|
||||
run: |
|
||||
uv lock --upgrade
|
||||
uv sync
|
||||
|
||||
- name: Run Tests
|
||||
working-directory: backend
|
||||
run: |
|
||||
uv run manage.py test
|
||||
|
||||
- name: Create Pull Request
|
||||
uses: peter-evans/create-pull-request@v8
|
||||
with:
|
||||
commit-message: "chore: update dependencies"
|
||||
title: "chore: weekly dependency updates"
|
||||
body: |
|
||||
Automated dependency updates.
|
||||
|
||||
This PR was automatically generated by the dependency update workflow.
|
||||
|
||||
## Changes
|
||||
- Updated `uv.lock` with latest compatible versions
|
||||
|
||||
## Checklist
|
||||
- [ ] Review dependency changes
|
||||
- [ ] Verify all tests pass
|
||||
- [ ] Check for breaking changes
|
||||
branch: "dependency-updates"
|
||||
labels: dependencies
|
||||
77
.github/workflows/django.yml
vendored
@@ -12,30 +12,85 @@ jobs:
|
||||
strategy:
|
||||
matrix:
|
||||
os: [ubuntu-latest, macos-latest]
|
||||
python-version: [3.13.1]
|
||||
python-version: ["3.13"]
|
||||
|
||||
services:
|
||||
postgres:
|
||||
image: postgis/postgis:16-3.4
|
||||
env:
|
||||
POSTGRES_USER: postgres
|
||||
POSTGRES_PASSWORD: postgres
|
||||
POSTGRES_DB: test_thrillwiki
|
||||
ports:
|
||||
- 5432:5432
|
||||
options: >-
|
||||
--health-cmd pg_isready
|
||||
--health-interval 10s
|
||||
--health-timeout 5s
|
||||
--health-retries 5
|
||||
# Services only run on Linux runners
|
||||
if: runner.os == 'Linux'
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Install Homebrew on Linux
|
||||
if: runner.os == 'Linux'
|
||||
run: |
|
||||
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
|
||||
echo "/home/linuxbrew/.linuxbrew/bin" >> $GITHUB_PATH
|
||||
|
||||
|
||||
- name: Install GDAL with Homebrew
|
||||
run: brew install gdal
|
||||
|
||||
|
||||
- name: Install PostGIS on macOS
|
||||
if: runner.os == 'macOS'
|
||||
run: |
|
||||
brew install postgresql@16 postgis
|
||||
brew services start postgresql@16
|
||||
sleep 5
|
||||
/opt/homebrew/opt/postgresql@16/bin/createuser -s postgres || true
|
||||
/opt/homebrew/opt/postgresql@16/bin/createdb -U postgres test_thrillwiki || true
|
||||
/opt/homebrew/opt/postgresql@16/bin/psql -U postgres -d test_thrillwiki -c "CREATE EXTENSION IF NOT EXISTS postgis;" || true
|
||||
|
||||
- name: Set up Python ${{ matrix.python-version }}
|
||||
uses: actions/setup-python@v5
|
||||
uses: actions/setup-python@v6
|
||||
with:
|
||||
python-version: ${{ matrix.python-version }}
|
||||
|
||||
|
||||
- name: Install UV
|
||||
run: |
|
||||
curl -LsSf https://astral.sh/uv/install.sh | sh
|
||||
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
|
||||
|
||||
- name: Cache UV dependencies
|
||||
uses: actions/cache@v5
|
||||
with:
|
||||
path: ~/.cache/uv
|
||||
key: ${{ runner.os }}-uv-${{ hashFiles('backend/pyproject.toml') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-uv-
|
||||
|
||||
- name: Install Dependencies
|
||||
working-directory: backend
|
||||
run: |
|
||||
python -m pip install --upgrade pip
|
||||
pip install -r requirements.txt
|
||||
|
||||
uv sync --frozen
|
||||
|
||||
- name: Security Audit
|
||||
working-directory: backend
|
||||
run: |
|
||||
uv pip install pip-audit
|
||||
uv run pip-audit || true
|
||||
continue-on-error: true
|
||||
|
||||
- name: Run Tests
|
||||
working-directory: backend
|
||||
env:
|
||||
DJANGO_SETTINGS_MODULE: config.django.test
|
||||
TEST_DB_NAME: test_thrillwiki
|
||||
TEST_DB_USER: postgres
|
||||
TEST_DB_PASSWORD: postgres
|
||||
TEST_DB_HOST: localhost
|
||||
TEST_DB_PORT: 5432
|
||||
run: |
|
||||
python manage.py test
|
||||
uv run python manage.py test --settings=config.django.test --parallel
|
||||
|
||||
2
.github/workflows/review.yml
vendored
@@ -22,7 +22,7 @@ jobs:
|
||||
runs-on: ubuntu-latest
|
||||
environment: development_environment
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: actions/checkout@v6
|
||||
with:
|
||||
fetch-depth: 0
|
||||
|
||||
|
||||
19
.gitignore
vendored
@@ -30,10 +30,20 @@ db.sqlite3-journal
|
||||
/backend/staticfiles/
|
||||
/backend/media/
|
||||
|
||||
# Celery Beat schedule database (runtime state, regenerated automatically)
|
||||
celerybeat-schedule*
|
||||
celerybeat.pid
|
||||
|
||||
# UV
|
||||
.uv/
|
||||
backend/.uv/
|
||||
|
||||
# Generated requirements files (auto-generated from pyproject.toml)
|
||||
# Uncomment if you want to track these files
|
||||
# backend/requirements.txt
|
||||
# backend/requirements-dev.txt
|
||||
# backend/requirements-test.txt
|
||||
|
||||
# Node.js
|
||||
node_modules/
|
||||
npm-debug.log*
|
||||
@@ -98,8 +108,11 @@ temp/
|
||||
|
||||
# Backup files
|
||||
*.bak
|
||||
*.backup
|
||||
*.orig
|
||||
*.swp
|
||||
*_backup.*
|
||||
*_OLD_*
|
||||
|
||||
# Archive files
|
||||
*.tar.gz
|
||||
@@ -123,4 +136,8 @@ django-forwardemail/
|
||||
frontend/
|
||||
frontend
|
||||
.snapshots
|
||||
uv.lock
|
||||
web/next-env.d.ts
|
||||
web/.next/types/cache-life.d.ts
|
||||
.gitignore
|
||||
web/.next/types/routes.d.ts
|
||||
web/.next/types/validator.ts
|
||||
|
||||
251
.pylintrc
Normal file
@@ -0,0 +1,251 @@
|
||||
# =============================================================================
|
||||
# ThrillWiki Django Project - Pylint Configuration
|
||||
# =============================================================================
|
||||
#
|
||||
# Purpose: Django-aware Pylint configuration that suppresses false positives
|
||||
# while maintaining code quality standards.
|
||||
#
|
||||
# Alignment:
|
||||
# - Line length: 120 characters (matches Black and Ruff in pyproject.toml)
|
||||
# - Django version: 5.2.8
|
||||
#
|
||||
# Key Features:
|
||||
# - Suppresses false positives for Django ORM patterns (.objects, _meta, .DoesNotExist)
|
||||
# - Whitelists Django management command styling (self.style.SUCCESS, ERROR, etc.)
|
||||
# - Accommodates Django REST Framework patterns
|
||||
# - Allows django-fsm state machine patterns
|
||||
#
|
||||
# Maintenance:
|
||||
# - Review when upgrading Django or adding new dynamic attribute patterns
|
||||
# - Keep line-length aligned with Black/Ruff settings in pyproject.toml
|
||||
#
|
||||
# =============================================================================
|
||||
|
||||
[MASTER]
|
||||
# Use all available CPU cores for faster linting
|
||||
jobs=0
|
||||
|
||||
# Directories and files to exclude from linting
|
||||
ignore=.git,__pycache__,.venv,venv,migrations,node_modules,.tox,.pytest_cache,build,dist
|
||||
|
||||
# File patterns to ignore (e.g., Emacs backup files)
|
||||
ignore-patterns=^\.#
|
||||
|
||||
# Pickle collected data for faster subsequent runs
|
||||
persistent=yes
|
||||
|
||||
# =============================================================================
|
||||
# [MESSAGES CONTROL]
|
||||
# Disable checks that conflict with Django patterns and conventions
|
||||
# =============================================================================
|
||||
[MESSAGES CONTROL]
|
||||
disable=
|
||||
# C0114: missing-module-docstring
|
||||
# Django apps often don't need module docstrings; the app's purpose is
|
||||
# typically documented in apps.py or README
|
||||
C0114,
|
||||
|
||||
# C0115: missing-class-docstring
|
||||
# Django models, forms, and serializers are often self-documenting through
|
||||
# their field definitions and Meta classes
|
||||
C0115,
|
||||
|
||||
# C0116: missing-function-docstring
|
||||
# Allow simple functions and methods without docstrings; Django views and
|
||||
# model methods are often self-explanatory
|
||||
C0116,
|
||||
|
||||
# C0103: invalid-name
|
||||
# Django uses non-PEP8 names by convention (e.g., 'pk', 'id', 'qs')
|
||||
# and single-letter variables in comprehensions are acceptable
|
||||
C0103,
|
||||
|
||||
# C0411: wrong-import-order
|
||||
# Let isort/ruff handle import ordering; they have Django-specific rules
|
||||
C0411,
|
||||
|
||||
# C0415: import-outside-toplevel
|
||||
# Django often requires lazy imports to avoid circular dependencies,
|
||||
# especially in models.py and signals
|
||||
C0415,
|
||||
|
||||
# W0212: protected-access
|
||||
# Django extensively uses _meta for model introspection; this is documented
|
||||
# and supported API: https://docs.djangoproject.com/en/5.2/ref/models/meta/
|
||||
W0212,
|
||||
|
||||
# W0613: unused-argument
|
||||
# Django views, signals, and receivers often have unused parameters that
|
||||
# are required by the framework's signature (e.g., request, sender, **kwargs)
|
||||
W0613,
|
||||
|
||||
# R0903: too-few-public-methods
|
||||
# Django models, forms, and serializers can be simple data containers
|
||||
# with few or no methods beyond __str__
|
||||
R0903,
|
||||
|
||||
# R0801: duplicate-code
|
||||
# Django patterns naturally duplicate across apps (e.g., CRUD views,
|
||||
# model patterns); this is intentional for consistency
|
||||
R0801,
|
||||
|
||||
# E1101: no-member
|
||||
# Main source of false positives for Django's dynamic attributes:
|
||||
# - Model.objects (Manager)
|
||||
# - Model.DoesNotExist / MultipleObjectsReturned (exceptions)
|
||||
# - self.style.SUCCESS/ERROR (management commands)
|
||||
# - model._meta (Options)
|
||||
E1101
|
||||
|
||||
# =============================================================================
|
||||
# [TYPECHECK]
|
||||
# Whitelist Django's dynamically generated attributes
|
||||
# =============================================================================
|
||||
[TYPECHECK]
|
||||
# Django generates many attributes dynamically that Pylint cannot detect
|
||||
# statically. This list covers common patterns:
|
||||
#
|
||||
# - objects.* : Django ORM Manager methods (all, filter, get, create, etc.)
|
||||
# - DoesNotExist : Exception raised when Model.objects.get() finds nothing
|
||||
# - MultipleObjectsReturned : Exception when get() finds multiple objects
|
||||
# - _meta.* : Django model metadata API (fields, app_label, model_name)
|
||||
# - style.* : Django management command styling (SUCCESS, ERROR, WARNING, NOTICE)
|
||||
# - id, pk : Django auto-generated primary key fields
|
||||
# - REQUEST : Django request object attributes
|
||||
# - aq_* : Acquisition attributes (Zope/Plone compatibility)
|
||||
# - acl_users : Zope/Plone user folder
|
||||
#
|
||||
generated-members=
|
||||
REQUEST,
|
||||
acl_users,
|
||||
aq_parent,
|
||||
aq_inner,
|
||||
aq_explicit,
|
||||
aq_acquire,
|
||||
aq_base,
|
||||
objects,
|
||||
objects.*,
|
||||
DoesNotExist,
|
||||
MultipleObjectsReturned,
|
||||
_meta,
|
||||
_meta.*,
|
||||
style,
|
||||
style.*,
|
||||
id,
|
||||
pk
|
||||
|
||||
# =============================================================================
|
||||
# [FORMAT]
|
||||
# Code formatting settings - aligned with Black and Ruff (120 chars)
|
||||
# =============================================================================
|
||||
[FORMAT]
|
||||
# Maximum line length - matches Black and Ruff configuration in pyproject.toml
|
||||
max-line-length=120
|
||||
|
||||
# Use 4 spaces for indentation (Python standard)
|
||||
indent-string=' '
|
||||
|
||||
# Use Unix line endings (LF)
|
||||
expected-line-ending-format=LF
|
||||
|
||||
# =============================================================================
|
||||
# [BASIC]
|
||||
# Naming conventions and allowed short names
|
||||
# =============================================================================
|
||||
[BASIC]
|
||||
# Short variable names commonly used in Django and Python
|
||||
# - i, j, k : Loop counters
|
||||
# - ex : Exception variable
|
||||
# - Run : Django command method
|
||||
# - _ : Throwaway variable
|
||||
# - id, pk : Primary key (Django convention)
|
||||
# - qs : QuerySet abbreviation
|
||||
good-names=i,j,k,ex,Run,_,id,pk,qs
|
||||
|
||||
# Enforce snake_case for most identifiers (Python/Django convention)
|
||||
argument-naming-style=snake_case
|
||||
attr-naming-style=snake_case
|
||||
function-naming-style=snake_case
|
||||
method-naming-style=snake_case
|
||||
module-naming-style=snake_case
|
||||
variable-naming-style=snake_case
|
||||
|
||||
# PascalCase for classes
|
||||
class-naming-style=PascalCase
|
||||
|
||||
# UPPER_CASE for constants
|
||||
const-naming-style=UPPER_CASE
|
||||
|
||||
# =============================================================================
|
||||
# [DESIGN]
|
||||
# Complexity thresholds - relaxed for Django patterns
|
||||
# =============================================================================
|
||||
[DESIGN]
|
||||
# Django views and forms often need many arguments
|
||||
max-args=7
|
||||
|
||||
# Django models can have many fields
|
||||
max-attributes=12
|
||||
|
||||
# Allow complex boolean expressions
|
||||
max-bool-expr=5
|
||||
|
||||
# Django views can have complex branching logic
|
||||
max-branches=15
|
||||
|
||||
# Django views often have many local variables
|
||||
max-locals=20
|
||||
|
||||
# Django uses multiple inheritance (Model, Mixin classes)
|
||||
max-parents=7
|
||||
|
||||
# Django models and viewsets have many built-in methods
|
||||
max-public-methods=25
|
||||
|
||||
# Allow multiple return statements
|
||||
max-returns=6
|
||||
|
||||
# Django views can be lengthy
|
||||
max-statements=60
|
||||
|
||||
# Allow simple classes with no methods (e.g., Django Meta classes)
|
||||
min-public-methods=0
|
||||
|
||||
# =============================================================================
|
||||
# [SIMILARITIES]
|
||||
# Duplicate code detection settings
|
||||
# =============================================================================
|
||||
[SIMILARITIES]
|
||||
# Increase threshold to reduce false positives from Django boilerplate
|
||||
min-similarity-lines=6
|
||||
|
||||
# Don't flag similar comments
|
||||
ignore-comments=yes
|
||||
|
||||
# Don't flag similar docstrings
|
||||
ignore-docstrings=yes
|
||||
|
||||
# Don't flag similar import blocks
|
||||
ignore-imports=yes
|
||||
|
||||
# =============================================================================
|
||||
# [VARIABLES]
|
||||
# Variable naming patterns
|
||||
# =============================================================================
|
||||
[VARIABLES]
|
||||
# Patterns for dummy/unused variables
|
||||
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
|
||||
|
||||
# Arguments that are commonly unused but required by framework signatures
|
||||
ignored-argument-names=_.*|^ignored_|^unused_|args|kwargs|request|pk
|
||||
|
||||
# =============================================================================
|
||||
# [IMPORTS]
|
||||
# Import checking settings
|
||||
# =============================================================================
|
||||
[IMPORTS]
|
||||
# Don't allow wildcard imports even with __all__ defined
|
||||
allow-wildcard-with-all=no
|
||||
|
||||
# Don't analyze fallback import blocks
|
||||
analyse-fallback-blocks=no
|
||||
73
.replit
@@ -1,73 +0,0 @@
|
||||
modules = ["bash", "web", "nodejs-20", "python-3.13", "postgresql-16"]
|
||||
|
||||
[nix]
|
||||
channel = "stable-25_05"
|
||||
packages = [
|
||||
"freetype",
|
||||
"gdal",
|
||||
"geos",
|
||||
"gitFull",
|
||||
"lcms2",
|
||||
"libimagequant",
|
||||
"libjpeg",
|
||||
"libtiff",
|
||||
"libwebp",
|
||||
"libxcrypt",
|
||||
"openjpeg",
|
||||
"playwright-driver",
|
||||
"postgresql",
|
||||
"proj",
|
||||
"tcl",
|
||||
"tk",
|
||||
"uv",
|
||||
"zlib",
|
||||
]
|
||||
|
||||
[agent]
|
||||
expertMode = true
|
||||
|
||||
[workflows]
|
||||
runButton = "Project"
|
||||
|
||||
[[workflows.workflow]]
|
||||
name = "Project"
|
||||
mode = "parallel"
|
||||
author = "agent"
|
||||
|
||||
[[workflows.workflow.tasks]]
|
||||
task = "workflow.run"
|
||||
args = "ThrillWiki Server"
|
||||
|
||||
[[workflows.workflow]]
|
||||
name = "ThrillWiki Server"
|
||||
author = "agent"
|
||||
|
||||
[[workflows.workflow.tasks]]
|
||||
task = "shell.exec"
|
||||
args = "/home/runner/workspace/.venv/bin/python manage.py tailwind runserver 0.0.0.0:5000"
|
||||
waitForPort = 5000
|
||||
|
||||
[workflows.workflow.metadata]
|
||||
outputType = "webview"
|
||||
|
||||
[[ports]]
|
||||
localPort = 5000
|
||||
externalPort = 80
|
||||
|
||||
[[ports]]
|
||||
localPort = 41923
|
||||
externalPort = 3000
|
||||
|
||||
[[ports]]
|
||||
localPort = 45245
|
||||
externalPort = 3001
|
||||
|
||||
[deployment]
|
||||
deploymentTarget = "autoscale"
|
||||
run = [
|
||||
"gunicorn",
|
||||
"--bind=0.0.0.0:5000",
|
||||
"--reuse-port",
|
||||
"thrillwiki.wsgi:application",
|
||||
]
|
||||
build = ["uv", "pip", "install", "--system", "-r", "requirements.txt"]
|
||||
95
BACKEND_STRUCTURE.md
Normal file
@@ -0,0 +1,95 @@
|
||||
# Backend Structure Plan
|
||||
|
||||
## Apps Overview
|
||||
|
||||
### 1. `apps.core`
|
||||
- **Responsibility**: Base classes, shared utilities, history tracking.
|
||||
- **Existing**: `SluggedModel`, `TrackedModel`.
|
||||
- **Versioning Strategy (Section 15)**:
|
||||
- All core entities (`Park`, `Ride`, `Company`) must utilize `django-pghistory` or `apps.core` tracking to support:
|
||||
- **Edit History**: Chronological list of changes with `reason`, `user`, and `diff`.
|
||||
- **Timeline**: Major events (renames, relocations).
|
||||
- **Rollback**: Ability to restore previous versions via the Moderation Queue.
|
||||
|
||||
### 2. `apps.accounts`
|
||||
- **Responsibility**: User authentication, profiles, and settings.
|
||||
- **Existing**: `User`, `UserProfile` (bio, location, home park).
|
||||
- **Required Additions (Section 9)**:
|
||||
- **UserDeletionRequest**: Support 7-day grace period for account deletion.
|
||||
- **Privacy Settings**: Fields for `is_profile_public`, `show_location`, `show_email` on `UserProfile`.
|
||||
- **Data Export**: Serializers/Utilities to dump all user data (Reviews, Credits, Lists) to JSON.
|
||||
|
||||
### 3. `apps.parks`
|
||||
- **Responsibility**: Park management.
|
||||
- **Models**: `Park`, `ParkArea`.
|
||||
- **Relationships**:
|
||||
- `operator`: FK to `apps.companies.Company` (Type: Operator).
|
||||
- `property_owner`: FK to `apps.companies.Company` (Type: Owner).
|
||||
|
||||
### 4. `apps.rides`
|
||||
- **Responsibility**: Ride data, Coasters, and Credits.
|
||||
- **Models**:
|
||||
- `Ride`: Core entity (Status FSM: Operating, SBNO, Closed, etc.).
|
||||
- `RideModel`: Defines the "Type" of ride (e.g., B&M Hyper V2).
|
||||
- `Manufacturer`: FK to `apps.companies.Company`.
|
||||
- `Designer`: FK to `apps.companies.Company`.
|
||||
- **Ride Credits (Section 10)**:
|
||||
- **Model**: `RideCredit` (Through-Model: `User` <-> `Ride`).
|
||||
- **Fields**:
|
||||
- `count` (Integer): Total times ridden.
|
||||
- `rating` (Float): Personal rating (distinct from public Review).
|
||||
- `first_ridden_at` (Date): First time experiencing the ride.
|
||||
- `notes` (Text): Private personal notes.
|
||||
- **Constraints**: `Unique(user, ride)` - A user has one credit entry per ride.
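
A minimal Django sketch of this through-model, assuming the field names above and a `rides.Ride` target; the app labels, related names, and `__str__` output are illustrative, not the confirmed implementation:

```python
# Hypothetical sketch of the RideCredit through-model described above.
from django.conf import settings
from django.db import models


class RideCredit(models.Model):
    """Per-user credit entry for a single ride (one row per user/ride pair)."""

    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="ride_credits")
    ride = models.ForeignKey("rides.Ride", on_delete=models.CASCADE, related_name="credits")
    count = models.PositiveIntegerField(default=1, help_text="Total times ridden")
    rating = models.FloatField(null=True, blank=True, help_text="Personal rating, distinct from public reviews")
    first_ridden_at = models.DateField(null=True, blank=True, help_text="First time experiencing the ride")
    notes = models.TextField(blank=True, help_text="Private personal notes")

    class Meta:
        constraints = [
            models.UniqueConstraint(fields=["user", "ride"], name="unique_user_ride_credit"),
        ]

    def __str__(self):
        return f"{self.user} x{self.count} on {self.ride}"
```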
|
||||
|
||||
### 5. `apps.companies`
|
||||
- **Responsibility**: Management of Industry Entities (Section 4).
|
||||
- **Models**:
|
||||
- `Company`: Single model with `type` choices or Polymorphic.
|
||||
- **Types**: `Manufacturer`, `Designer`, `Operator`, `PropertyOwner`.
|
||||
- **Features**: Detailed pages, hover cards, listing by type.
|
||||
|
||||
### 6. `apps.moderation` (The Sacred Submission Pipeline)
|
||||
- **Responsibility**: Centralized Content Submission System (Section 14, 16).
|
||||
- **Concept**: **Live Data** (Approve) vs **Submission Data** (Pending).
|
||||
- **Models**:
|
||||
- `Submission`:
|
||||
- `submitter`: FK to User.
|
||||
- `content_type`: Target Model (Park, Ride, etc.).
|
||||
- `object_id`: Target ID (Null for Creation).
|
||||
- `data`: **JSONField** storing the proposed state.
|
||||
- `status`: State Machine (`Pending` -> `Claimed` -> `Approved` | `Rejected` | `ChangesRequested`).
|
||||
- `moderator`: FK to User (Claimant).
|
||||
- `moderator_note`: Reason for rejection/feedback.
|
||||
- `Report`: User flags on content.
|
||||
- **Workflow**:
|
||||
1. User submits form -> `Submission` created (Status: Pending).
|
||||
2. Moderator Claims -> Status: Claimed.
|
||||
3. Approves -> Applies `data` to `Live Model` -> Saves Version -> Status: Approved.
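
A hedged sketch of how this pipeline could be modelled; the status values mirror the state machine above, while the `apply()` helper, field names, and generic-key wiring are assumptions for illustration only:

```python
# Illustrative sketch of the Submission model; not the confirmed implementation.
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.db import models


class Submission(models.Model):
    class Status(models.TextChoices):
        PENDING = "pending", "Pending"
        CLAIMED = "claimed", "Claimed"
        APPROVED = "approved", "Approved"
        REJECTED = "rejected", "Rejected"
        CHANGES_REQUESTED = "changes_requested", "Changes Requested"

    submitter = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="submissions")
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveIntegerField(null=True, blank=True)  # null means a creation request
    target = GenericForeignKey("content_type", "object_id")
    data = models.JSONField(help_text="Proposed state of the target object")
    status = models.CharField(max_length=32, choices=Status.choices, default=Status.PENDING)
    moderator = models.ForeignKey(
        settings.AUTH_USER_MODEL, null=True, blank=True, on_delete=models.SET_NULL, related_name="claimed_submissions"
    )
    moderator_note = models.TextField(blank=True)

    def apply(self):
        """Copy the proposed data onto the live object (step 3 of the workflow above)."""
        model = self.content_type.model_class()
        obj = model.objects.get(pk=self.object_id) if self.object_id else model()
        for field, value in self.data.items():
            setattr(obj, field, value)
        obj.save()
        self.status = self.Status.APPROVED
        self.save(update_fields=["status"])
        return obj
```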
|
||||
|
||||
### 7. `apps.media`
|
||||
- **Responsibility**: Media Management (Section 13).
|
||||
- **Models**:
|
||||
- `Photo`: GenericFK. Fields: `image`, `caption`, `user`, `status` (Moderation).
|
||||
- **Banner/Card**: Entities should link to a "Primary Photo" or store a cached image field.
|
||||
|
||||
### 8. `apps.reviews`
|
||||
- **Responsibility**: Public Reviews & Ratings (Section 12).
|
||||
- **Models**:
|
||||
- `Review`: GenericFK (Park, Ride).
|
||||
- **Fields**: `rating` (1-5, 0.5 steps), `title`, `body`, `helpful_votes`.
|
||||
- **Logic**: Aggregates (Avg Rating, Count) calculation for Entity caches.
|
||||
|
||||
### 9. `apps.lists`
|
||||
- **Responsibility**: User Lists & Rankings (Section 11).
|
||||
- **Models**:
|
||||
- `UserList`: Title, Description, Type (Park/Ride/Coaster/Mixed), Privacy (Public/Private).
|
||||
- `UserListItem`: FK to List, GenericFK to Item, Order, Notes.
|
||||
|
||||
### 10. `apps.blog`
|
||||
- **Responsibility**: News & Updates.
|
||||
- **Models**: `Post`, `Tag`.
|
||||
|
||||
### 11. `apps.support`
|
||||
- **Responsibility**: Human interaction.
|
||||
- **Models**: `Ticket` (Contact Form).
|
||||
503
CHANGELOG.md
Normal file
@@ -0,0 +1,503 @@
|
||||
# Changelog
|
||||
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [Phase 7] - 2025-12-24
|
||||
|
||||
### Testing
|
||||
|
||||
#### Added
|
||||
- **Comprehensive Test Coverage Improvements**
|
||||
- Added 30+ new test files across all apps
|
||||
- API endpoint tests with authentication, error handling, pagination, and response format validation
|
||||
- E2E tests for FSM workflows (parks, rides, moderation)
|
||||
- Integration tests for FSM transition workflows
|
||||
- Unit tests for managers, serializers, and services
|
||||
- Accessibility tests for WCAG 2.1 AA compliance
|
||||
- Form validation tests for all major forms
|
||||
|
||||
#### Test Files Added
|
||||
- `backend/tests/api/` - API endpoint tests (8 files)
|
||||
- `backend/tests/e2e/` - End-to-end FSM workflow tests (3 files)
|
||||
- `backend/tests/integration/` - Integration tests (1 file)
|
||||
- `backend/tests/managers/` - Manager tests (2 files)
|
||||
- `backend/tests/serializers/` - Serializer tests (3 files)
|
||||
- `backend/tests/services/` - Service layer tests (3 files)
|
||||
- `backend/tests/forms/` - Form validation tests (5 files)
|
||||
- `backend/tests/accessibility/` - WCAG compliance tests (1 file)
|
||||
- `backend/apps/*/tests/` - App-specific tests (7 files)
|
||||
|
||||
#### Coverage Improvements
|
||||
- Increased test coverage for models, views, and services
|
||||
- Added tests for edge cases and error conditions
|
||||
- Improved FSM transition testing with permission checks
|
||||
- Added query optimization tests
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase focused on achieving comprehensive test coverage to ensure code quality and prevent regressions. Tests cover:
|
||||
- All API endpoints with various authentication scenarios
|
||||
- FSM state transitions with permission validation
|
||||
- Form validation logic with edge cases
|
||||
- Manager methods and custom QuerySets
|
||||
- Service layer business logic
|
||||
- Accessibility compliance for interactive components
|
||||
|
||||
**Testing Infrastructure**:
|
||||
- pytest with Django plugin
|
||||
- Factory Boy for test data generation
|
||||
- Coverage.py for coverage tracking
|
||||
- Playwright for E2E tests
|
||||
|
||||
### Files Modified
|
||||
- `backend/pyproject.toml` - Updated test dependencies and coverage configuration
|
||||
- `backend/tests/conftest.py` - Enhanced test fixtures and utilities
|
||||
|
||||
---
|
||||
|
||||
## [Phase 6] - 2025-12-24
|
||||
|
||||
### Forms & Validation
|
||||
|
||||
#### Enhanced
|
||||
- **Form Validation Coverage**
|
||||
- Added custom `clean_*` methods for field-level validation
|
||||
- Improved error messages for better user experience
|
||||
- Enhanced form widgets (date pickers, rich text editors)
|
||||
- Standardized ModelForm field definitions
|
||||
|
||||
#### Forms Enhanced
|
||||
- `backend/apps/parks/forms/base.py` - Park creation/update forms
|
||||
- `backend/apps/parks/forms/review_forms.py` - Park review forms
|
||||
- `backend/apps/parks/forms/area_forms.py` - Park area forms
|
||||
- `backend/apps/rides/forms/base.py` - Ride creation/update forms
|
||||
- `backend/apps/rides/forms/review_forms.py` - Ride review forms
|
||||
- `backend/apps/rides/forms/company_forms.py` - Company forms
|
||||
- `backend/apps/rides/forms/search.py` - Ride search forms
|
||||
- `backend/apps/core/forms/search.py` - Core search forms
|
||||
- `backend/apps/core/forms/htmx_forms.py` - HTMX-specific form patterns
|
||||
|
||||
#### Tests Added
|
||||
- `backend/tests/forms/test_area_forms.py` - Area form validation tests
|
||||
- `backend/tests/forms/test_park_forms.py` - Park form validation tests
|
||||
- `backend/tests/forms/test_ride_forms.py` - Ride form validation tests
|
||||
- `backend/tests/forms/test_review_forms.py` - Review form validation tests
|
||||
- `backend/tests/forms/test_company_forms.py` - Company form validation tests
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase improved form validation coverage across the application:
|
||||
1. **Field-Level Validation**: Custom `clean_*` methods for complex validation logic
|
||||
2. **User-Friendly Errors**: Clear, actionable error messages
|
||||
3. **Widget Improvements**: Better UX with appropriate input widgets
|
||||
4. **HTMX Integration**: Forms work seamlessly with HTMX partial updates
|
||||
5. **Test Coverage**: Comprehensive tests for all validation scenarios
|
||||
|
||||
**Validation Patterns**:
|
||||
- Date range validation (opening/closing dates)
|
||||
- Coordinate validation (latitude/longitude bounds)
|
||||
- Slug uniqueness validation
|
||||
- Cross-field validation (e.g., closing date must be after opening date)
|
||||
- File upload validation (size, type, dimensions)
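
For example, the date-range and coordinate checks could be expressed roughly like this `ModelForm` override; the form class, import path, and field names are assumptions based on the patterns above:

```python
# Sketch of the field-level and cross-field validation patterns; not the actual form code.
from django import forms

from apps.parks.models import Park  # assumed import path


class ParkForm(forms.ModelForm):
    class Meta:
        model = Park
        fields = ["name", "opening_date", "closing_date", "latitude", "longitude"]

    def clean_latitude(self):
        lat = self.cleaned_data["latitude"]
        if lat is not None and not -90 <= lat <= 90:
            raise forms.ValidationError("Latitude must be between -90 and 90 degrees.")
        return lat

    def clean(self):
        cleaned = super().clean()
        opening, closing = cleaned.get("opening_date"), cleaned.get("closing_date")
        if opening and closing and closing < opening:
            self.add_error("closing_date", "Closing date must be after the opening date.")
        return cleaned
```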
|
||||
|
||||
---
|
||||
|
||||
## [Phase 5] - 2025-12-24
|
||||
|
||||
### Admin Interface
|
||||
|
||||
#### Enhanced
|
||||
- **Django Admin Completeness**
|
||||
- Added comprehensive `list_display` with key fields
|
||||
- Implemented `search_fields` for text search
|
||||
- Added `list_filter` for status, category, and date filtering
|
||||
- Organized detail views with `fieldsets`
|
||||
- Added `readonly_fields` for computed properties and timestamps
|
||||
- Implemented custom admin actions (bulk approve, bulk reject, etc.)
|
||||
|
||||
#### Admin Files Enhanced
|
||||
- `backend/apps/parks/admin.py` - Park, Area, Company, Review admin
|
||||
- `backend/apps/rides/admin.py` - Ride, Manufacturer, Review admin
|
||||
- `backend/apps/accounts/admin.py` - User, Profile admin
|
||||
- `backend/apps/moderation/admin.py` - Submission, Report admin
|
||||
- `backend/apps/core/admin.py` - Base admin classes and mixins
|
||||
|
||||
#### Custom Admin Actions
|
||||
- Bulk approve/reject for moderation workflows
|
||||
- Bulk status changes for parks and rides
|
||||
- Export to CSV for reporting
|
||||
- Cache invalidation for modified entities
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase completed the Django admin interface to provide a powerful content management system:
|
||||
1. **List Views**: Optimized with select_related/prefetch_related
|
||||
2. **Search**: Full-text search on name, description, and location fields
|
||||
3. **Filters**: Status, category, date range, and custom filters
|
||||
4. **Detail Views**: Organized with logical fieldsets
|
||||
5. **Actions**: Bulk operations for efficient moderation
|
||||
|
||||
**Admin Patterns**:
|
||||
- Inherited from `BaseModelAdmin` for consistency
|
||||
- Used `readonly_fields` for computed properties
|
||||
- Implemented `get_queryset()` optimization
|
||||
- Added inline admin for related objects
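
A condensed sketch of these patterns for a single model admin; the `Park` import path, field names, and bulk action are illustrative assumptions:

```python
# Illustrative admin registration following the patterns above; names are assumptions.
from django.contrib import admin

from apps.parks.models import Park  # assumed import path


@admin.register(Park)
class ParkAdmin(admin.ModelAdmin):
    list_display = ("name", "status", "created_at")
    list_filter = ("status",)
    search_fields = ("name", "description")
    readonly_fields = ("created_at", "updated_at")
    actions = ("mark_operating",)

    def get_queryset(self, request):
        # Optimize the list view with select_related on the operator FK.
        return super().get_queryset(request).select_related("operator")

    @admin.action(description="Mark selected parks as operating")
    def mark_operating(self, request, queryset):
        queryset.update(status="operating")
```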
|
||||
|
||||
---
|
||||
|
||||
## [Phase 4] - 2025-12-24
|
||||
|
||||
### Models & Database
|
||||
|
||||
#### Enhanced
|
||||
- **Model Completeness & Consistency**
|
||||
- Added/improved `__str__` methods for human-readable representations
|
||||
- Standardized `Meta` classes with `ordering`, `verbose_name`, `verbose_name_plural`
|
||||
- Added comprehensive `help_text` on all fields
|
||||
- Verified database indexes on foreign keys and frequently queried fields
|
||||
- Added model constraints (CheckConstraint, UniqueConstraint)
|
||||
|
||||
#### Model Files Enhanced
|
||||
- `backend/apps/parks/models/parks.py` - Park model
|
||||
- `backend/apps/parks/models/companies.py` - Company, Operator models
|
||||
- `backend/apps/parks/models/areas.py` - ParkArea model
|
||||
- `backend/apps/parks/models/media.py` - ParkPhoto model
|
||||
- `backend/apps/parks/models/reviews.py` - ParkReview model
|
||||
- `backend/apps/parks/models/location.py` - ParkLocation model
|
||||
- `backend/apps/rides/models/rides.py` - Ride model
|
||||
- `backend/apps/rides/models/company.py` - Manufacturer, Designer models
|
||||
- `backend/apps/rides/models/rankings.py` - RideRanking model
|
||||
- `backend/apps/rides/models/media.py` - RidePhoto model
|
||||
- `backend/apps/rides/models/reviews.py` - RideReview model
|
||||
- `backend/apps/rides/models/location.py` - RideLocation model
|
||||
- `backend/apps/accounts/models.py` - User, Profile models
|
||||
- `backend/apps/moderation/models.py` - Submission, Report models
|
||||
- `backend/apps/core/models.py` - Base models and mixins
|
||||
|
||||
#### Database Improvements
|
||||
- Added indexes for performance optimization
|
||||
- Implemented constraints for data integrity
|
||||
- Standardized field naming conventions
|
||||
- Improved model documentation
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase improved model quality and consistency:
|
||||
1. **String Representations**: All models have meaningful `__str__` methods
|
||||
2. **Metadata**: Complete Meta classes with ordering and verbose names
|
||||
3. **Field Documentation**: Every field has descriptive help_text
|
||||
4. **Database Optimization**: Proper indexes on foreign keys and search fields
|
||||
5. **Data Integrity**: Constraints enforce business rules at database level
|
||||
|
||||
**Model Patterns**:
|
||||
- Used `TextChoices` for status and category fields
|
||||
- Implemented `db_index=True` on frequently queried fields
|
||||
- Added `CheckConstraint` for value ranges (e.g., ratings 1-5)
|
||||
- Used `UniqueConstraint` for compound uniqueness
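
These patterns combine roughly as follows; the `RideReview` name, plain integer reference fields, and constraint names are placeholders for illustration:

```python
# Sketch of TextChoices + db_index + constraints; not the project's actual model.
from django.db import models


class RideReview(models.Model):
    class Status(models.TextChoices):
        PENDING = "pending", "Pending"
        PUBLISHED = "published", "Published"

    status = models.CharField(max_length=20, choices=Status.choices, default=Status.PENDING, db_index=True)
    rating = models.PositiveSmallIntegerField(help_text="Rating from 1 to 5")
    ride_id = models.PositiveIntegerField(db_index=True)
    user_id = models.PositiveIntegerField(db_index=True)

    class Meta:
        ordering = ["-id"]
        constraints = [
            models.CheckConstraint(check=models.Q(rating__gte=1, rating__lte=5), name="ridereview_rating_1_to_5"),
            models.UniqueConstraint(fields=["ride_id", "user_id"], name="ridereview_one_per_user"),
        ]
```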
|
||||
|
||||
---
|
||||
|
||||
## [Phase 3] - 2025-12-24
|
||||
|
||||
### Logging & Observability
|
||||
|
||||
#### Standardized
|
||||
- **Logging Pattern Consistency**
|
||||
- Added `logger = logging.getLogger(__name__)` to all view, service, and middleware files
|
||||
- Implemented centralized logging utilities from `apps.core.logging`
|
||||
- Standardized log levels (debug, info, warning, error)
|
||||
- Added structured logging with context
|
||||
|
||||
#### Files Enhanced with Logging
|
||||
- `backend/apps/parks/views.py` - Park views
|
||||
- `backend/apps/rides/views.py` - Ride views
|
||||
- `backend/apps/accounts/views.py` - Account views
|
||||
- `backend/apps/moderation/views.py` - Moderation views
|
||||
- `backend/apps/accounts/services.py` - Account services
|
||||
- `backend/apps/parks/signals.py` - Park signals
|
||||
- `backend/apps/rides/signals.py` - Ride signals
|
||||
- `backend/apps/moderation/signals.py` - Moderation signals
|
||||
- `backend/apps/rides/tasks.py` - Celery tasks
|
||||
- `backend/apps/parks/apps.py` - App configuration
|
||||
- `backend/apps/rides/apps.py` - App configuration
|
||||
- `backend/apps/moderation/apps.py` - App configuration
|
||||
|
||||
#### Logging Utilities
|
||||
- `log_exception()` - Exception logging with full context
|
||||
- `log_business_event()` - Business operation logging (FSM transitions, user actions)
|
||||
- `log_security_event()` - Security event logging (authentication, authorization)
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase standardized logging across the application for better observability:
|
||||
1. **Consistent Logger Initialization**: Every module uses `logging.getLogger(__name__)`
|
||||
2. **Centralized Utilities**: Structured logging functions in `apps.core.logging`
|
||||
3. **Contextual Logging**: All logs include relevant context (user, request, operation)
|
||||
4. **Security Logging**: Dedicated logging for security events
|
||||
5. **Performance Logging**: Query performance and cache hit/miss tracking
|
||||
|
||||
**Logging Patterns**:
|
||||
- Exception handlers use `log_exception()` with context
|
||||
- FSM transitions use `log_business_event()`
|
||||
- Authentication events use `log_security_event()`
|
||||
- Never log sensitive data (passwords, tokens, PII)
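
A minimal sketch of what such helpers might look like; the real utilities live in `apps.core.logging`, so the signatures below are assumptions rather than the shipped API:

```python
# Hypothetical shape of the structured logging helpers described above.
import logging

logger = logging.getLogger(__name__)


def log_business_event(event: str, *, user_id: int | None = None, **context) -> None:
    """Log a business operation (e.g. an FSM transition) with structured context."""
    logger.info("business_event=%s user_id=%s context=%s", event, user_id, context)


def log_exception(exc: Exception, *, where: str, **context) -> None:
    """Log an exception with full traceback plus contextual fields."""
    logger.error("exception in %s context=%s", where, context, exc_info=exc)
```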
|
||||
|
||||
**Benefits**:
|
||||
- Easier debugging with consistent log format
|
||||
- Better production monitoring with structured logs
|
||||
- Security audit trail for compliance
|
||||
- Performance insights from cache and query logs
|
||||
|
||||
---
|
||||
|
||||
## [Phase 15] - 2025-12-23
|
||||
|
||||
### Documentation
|
||||
|
||||
#### Added
|
||||
- **Future Work Documentation**
|
||||
- Created `docs/FUTURE_WORK.md` to track deferred features
|
||||
- Documented 11 TODO items with detailed implementation specifications
|
||||
- Added priority levels (P0-P3) and effort estimates
|
||||
- Included code examples and architectural guidance
|
||||
|
||||
#### Implemented
|
||||
- **Cache Statistics Tracking (THRILLWIKI-109)**
|
||||
- Added `get_cache_statistics()` method to `CacheMonitor` class
|
||||
- Implemented real-time cache hit/miss tracking in `MapStatsAPIView`
|
||||
- Returns Redis statistics when available, with graceful fallback
|
||||
- Removed placeholder TODO comments
|
||||
|
||||
- **Photo Upload Counting (THRILLWIKI-105)**
|
||||
- Implemented photo counting in user statistics endpoint
|
||||
- Queries `ParkPhoto` and `RidePhoto` models for accurate counts
|
||||
- Removed placeholder TODO comment
|
||||
|
||||
- **Admin Permission Checks (THRILLWIKI-103)**
|
||||
- Verified existing admin permission checks in map cache endpoints
|
||||
- Removed outdated TODO comments (checks were already implemented)
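
As a rough sketch, the cache statistics helper described under "Implemented" above could read Redis counters via django-redis and fall back gracefully when Redis is unavailable; the method body and return shape below are assumptions, not the shipped implementation:

```python
# Illustrative only; the real method lives on CacheMonitor in enhanced_cache_service.py.
from django.core.cache import cache


def get_cache_statistics() -> dict:
    """Return Redis hit/miss counters when available, with a graceful fallback."""
    try:
        client = cache.client.get_client()  # django-redis exposes the underlying Redis client
        info = client.info("stats")
        hits, misses = info.get("keyspace_hits", 0), info.get("keyspace_misses", 0)
        total = hits + misses
        return {
            "backend": "redis",
            "hits": hits,
            "misses": misses,
            "hit_rate": round(hits / total, 4) if total else None,
        }
    except Exception:  # Redis not configured or unreachable
        return {"backend": "unavailable", "hits": None, "misses": None, "hit_rate": None}
```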
|
||||
|
||||
#### Enhanced
|
||||
- **TODO Comment Cleanup**
|
||||
- Updated all TODO comments to reference `FUTURE_WORK.md`
|
||||
- Added THRILLWIKI issue numbers for traceability
|
||||
- Improved inline documentation with implementation context
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase focused on addressing technical debt by:
|
||||
1. Documenting deferred features with actionable specifications
|
||||
2. Implementing quick wins that improve observability
|
||||
3. Cleaning up TODO comments to reduce confusion
|
||||
|
||||
**Features Documented for Future Implementation**:
|
||||
- Map clustering algorithm (THRILLWIKI-106)
|
||||
- Nearby locations feature (THRILLWIKI-107)
|
||||
- Search relevance scoring (THRILLWIKI-108)
|
||||
- Full user statistics tracking (THRILLWIKI-104)
|
||||
- Geocoding service integration (THRILLWIKI-101)
|
||||
- ClamAV malware scanning (THRILLWIKI-110)
|
||||
- Sample data creation command (THRILLWIKI-111)
|
||||
|
||||
**Quick Wins Implemented**:
|
||||
- Cache statistics tracking for monitoring
|
||||
- Photo upload counting for user profiles
|
||||
- Verified admin permission checks
|
||||
|
||||
### Files Modified
|
||||
- `backend/apps/api/v1/maps/views.py` - Cache statistics, updated TODO comments
|
||||
- `backend/apps/api/v1/accounts/views.py` - Photo counting, updated TODO comments
|
||||
- `backend/apps/api/v1/serializers/maps.py` - Updated TODO comments
|
||||
- `backend/apps/core/services/location_adapters.py` - Updated TODO comments
|
||||
- `backend/apps/core/services/enhanced_cache_service.py` - Added `get_cache_statistics()` method
|
||||
- `backend/apps/core/utils/file_scanner.py` - Updated TODO comments
|
||||
- `backend/apps/core/views/map_views.py` - Removed outdated TODO comments
|
||||
- `backend/apps/parks/management/commands/create_sample_data.py` - Updated TODO comments
|
||||
- `docs/architecture/README.md` - Added reference to FUTURE_WORK.md
|
||||
|
||||
### Files Created
|
||||
- `docs/FUTURE_WORK.md` - Centralized future work documentation
|
||||
|
||||
---
|
||||
|
||||
## [Phase 14] - 2025-12-23
|
||||
|
||||
### Documentation
|
||||
|
||||
#### Fixed
|
||||
- Corrected architectural documentation from Vue.js SPA to Django + HTMX monolith
|
||||
- Updated main README to accurately reflect technology stack (Django 5.2.8+, HTMX 1.20.0+, Alpine.js)
|
||||
- Fixed deployment guide to remove frontend build steps (no separate frontend build process)
|
||||
- Corrected environment setup instructions for Django + HTMX architecture
|
||||
- Updated project structure diagrams to show Django monolith with HTMX templates
|
||||
|
||||
#### Added
|
||||
- **Architecture Decision Records (ADRs)**
|
||||
- ADR-001: Django + HTMX Architecture Decision
|
||||
- ADR-002: Hybrid API Design Pattern
|
||||
- ADR-003: State Machine Pattern for entity status management
|
||||
- ADR-004: Caching Strategy with Redis multi-layer caching
|
||||
- ADR-005: Authentication Approach (JWT + Session + Social Auth)
|
||||
- ADR-006: Media Handling with Cloudflare Images
|
||||
- **New Documentation Files**
|
||||
- `docs/SETUP_GUIDE.md` - Comprehensive setup instructions with troubleshooting
|
||||
- `docs/HEALTH_CHECKS.md` - Health check endpoint documentation
|
||||
- `docs/PRODUCTION_CHECKLIST.md` - Deployment verification checklist
|
||||
- `docs/architecture/README.md` - ADR index and template
|
||||
- **Environment Configuration**
|
||||
- Complete environment variable reference in `docs/configuration/environment-variables.md`
|
||||
- Updated `.env.example` with comprehensive documentation
|
||||
|
||||
#### Enhanced
|
||||
- Backend README with HTMX patterns and hybrid API/HTML endpoint documentation
|
||||
- Deployment guide with Docker, nginx, and CI/CD pipeline configurations
|
||||
- Production settings documentation with inline comments
|
||||
- API documentation structure and endpoint reference
|
||||
|
||||
#### Documentation Structure
|
||||
```
|
||||
docs/
|
||||
├── README.md # Updated - Django + HTMX architecture
|
||||
├── SETUP_GUIDE.md # New - Development setup
|
||||
├── HEALTH_CHECKS.md # New - Monitoring endpoints
|
||||
├── PRODUCTION_CHECKLIST.md # New - Deployment checklist
|
||||
├── THRILLWIKI_API_DOCUMENTATION.md # Existing - API reference
|
||||
├── htmx-patterns.md # Existing - HTMX conventions
|
||||
├── architecture/ # New - ADRs
|
||||
│ ├── README.md # ADR index
|
||||
│ ├── adr-001-django-htmx-architecture.md
|
||||
│ ├── adr-002-hybrid-api-design.md
|
||||
│ ├── adr-003-state-machine-pattern.md
|
||||
│ ├── adr-004-caching-strategy.md
|
||||
│ ├── adr-005-authentication-approach.md
|
||||
│ └── adr-006-media-handling-cloudflare.md
|
||||
└── configuration/
|
||||
└── environment-variables.md # Existing - Complete reference
|
||||
```
|
||||
|
||||
### Technical Details
|
||||
|
||||
This phase focused on documentation-only changes to align all project documentation with the actual Django + HTMX architecture. No code changes were made.
|
||||
|
||||
**Key Corrections:**
|
||||
- The project uses Django templates with HTMX for interactivity, not a Vue.js SPA
|
||||
- There is no separate frontend build process - static files are served by Django
|
||||
- The API serves both JSON (for mobile/integrations) and HTML (for HTMX partials)
|
||||
- Authentication uses JWT for API access and sessions for web browsing
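
A simplified sketch of this hybrid pattern: one endpoint that returns an HTML partial when HTMX is calling and JSON otherwise; the view name, template path, and response shape are assumptions:

```python
# Hypothetical hybrid endpoint; illustrates the JSON-or-HTML-partial idea only.
from django.http import JsonResponse
from django.shortcuts import render

from apps.parks.models import Park  # assumed import path


def park_list(request):
    parks = Park.objects.all()[:20]
    if request.headers.get("HX-Request"):  # header sent by HTMX on every request it makes
        return render(request, "parks/partials/park_list.html", {"parks": parks})
    return JsonResponse({"results": [{"id": p.pk, "name": p.name} for p in parks]})
```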
|
||||
|
||||
---
|
||||
|
||||
## [Unreleased] - 2025-12-23
|
||||
|
||||
### Security
|
||||
|
||||
- **CRITICAL:** Updated Django from 5.0.x to 5.2.8+ to address CVE-2025-64459 (SQL injection, CVSS 9.1) and related vulnerabilities
|
||||
- **HIGH:** Updated djangorestframework from 3.14.x to 3.15.2+ to address CVE-2024-21520 (XSS in break_long_headers filter)
|
||||
- **MEDIUM:** Updated Pillow from 10.2.0 to 10.4.0+ (upper bound <11.2) to address CVE-2024-28219 (buffer overflow)
|
||||
- Added cryptography>=44.0.0 for django-allauth JWT support
|
||||
|
||||
### Changed
|
||||
|
||||
- Standardized Python version requirement to 3.13+ across all configuration files
|
||||
- Consolidated pyproject.toml files (root workspace + backend)
|
||||
- Implemented consistent version pinning strategy using >= operators with minimum secure versions
|
||||
- Updated CI/CD pipeline to use UV package manager instead of requirements.txt
|
||||
- Moved linting and dev tools to proper dependency groups
|
||||
|
||||
### Package Updates
|
||||
|
||||
#### Core Django Ecosystem
|
||||
- Django: 5.0.x → 5.2.8+
|
||||
- djangorestframework: 3.14.x → 3.15.2+
|
||||
- django-cors-headers: 4.3.1 → 4.6.0+
|
||||
- django-filter: 23.5 → 24.3+
|
||||
- drf-spectacular: 0.27.0 → 0.28.0+
|
||||
- django-htmx: 1.17.2 → 1.20.0+
|
||||
- whitenoise: 6.6.0 → 6.8.0+
|
||||
|
||||
#### Authentication
|
||||
- django-allauth: 0.60.1 → 65.3.0+
|
||||
- djangorestframework-simplejwt: maintained at 5.5.1+
|
||||
|
||||
#### Task Queue & Caching
|
||||
- celery: maintained at 5.5.3+ (<6)
|
||||
- django-celery-beat: maintained at 2.8.1+
|
||||
- django-celery-results: maintained at 2.6.0+
|
||||
- django-redis: 5.4.0+
|
||||
- hiredis: 2.3.0 → 3.1.0+
|
||||
|
||||
#### Monitoring
|
||||
- sentry-sdk: 1.40.0 → 2.20.0+ (<3)
|
||||
|
||||
#### Development Tools
|
||||
- black: 24.1.0 → 25.1.0+
|
||||
- ruff: 0.12.10 → 0.9.2+
|
||||
- pyright: 1.1.404 → 1.1.405+
|
||||
- coverage: 7.9.1 → 7.9.2+
|
||||
- playwright: 1.41.0 → 1.50.0+
|
||||
|
||||
### Removed
|
||||
|
||||
- `channels>=4.2.0` - Not in INSTALLED_APPS, no WebSocket usage
|
||||
- `channels-redis>=4.2.1` - Dependency of channels
|
||||
- `daphne>=4.1.2` - ASGI server not used (using WSGI)
|
||||
- `django-simple-history>=3.5.0` - Using django-pghistory instead
|
||||
- `django-oauth-toolkit>=3.0.1` - Using dj-rest-auth + simplejwt instead
|
||||
- `django-webpack-loader>=3.1.1` - No webpack configuration in project
|
||||
- `reactivated>=0.47.5` - Not used in codebase
|
||||
- `poetry>=2.1.3` - Using UV package manager instead
|
||||
- Moved `django-silk` and `django-debug-toolbar` to optional profiling group
|
||||
|
||||
### Added
|
||||
|
||||
- UV lock file (uv.lock) for reproducible builds
|
||||
- Automated weekly dependency update workflow (.github/workflows/dependency-update.yml)
|
||||
- Security audit step in CI/CD pipeline (pip-audit)
|
||||
- Requirements.txt generation script (scripts/generate_requirements.sh)
|
||||
- Ruff configuration in pyproject.toml
|
||||
|
||||
### Fixed
|
||||
|
||||
- Broken CI/CD pipeline (was referencing non-existent requirements.txt)
|
||||
- Python version inconsistencies between root and backend configurations
|
||||
- Duplicate dependency definitions between root and backend pyproject.toml
|
||||
- Root pyproject.toml name conflict (renamed to thrillwiki-workspace)
|
||||
|
||||
### Infrastructure
|
||||
|
||||
- CI/CD now uses UV with dependency caching
|
||||
- Added dependency groups: dev, test, profiling, lint
|
||||
- Workspace configuration for monorepo structure
|
||||
|
||||
---
|
||||
|
||||
## Version Pinning Strategy
|
||||
|
||||
This project uses the following version pinning strategy:
|
||||
|
||||
| Package Type | Format | Example |
|
||||
|-------------|--------|---------|
|
||||
| Security-critical | `>=X.Y.Z` | `django>=5.2.8` |
|
||||
| Stable packages | `>=X.Y` | `django-cors-headers>=4.6` |
|
||||
| Rapidly evolving | `>=X.Y,<X+1` | `sentry-sdk>=2.20.0,<3` |
|
||||
| Breaking changes | `>=X.Y.Z,<A.B` | `Pillow>=10.4.0,<11.2` |
|
||||
|
||||
---
|
||||
|
||||
## Migration Guide
|
||||
|
||||
### For Developers
|
||||
|
||||
1. Update Python to 3.13+
|
||||
2. Install UV: `curl -LsSf https://astral.sh/uv/install.sh | sh`
|
||||
3. Update dependencies: `cd backend && uv sync --frozen`
|
||||
4. Run tests: `uv run manage.py test`
|
||||
|
||||
### Breaking Changes
|
||||
|
||||
- Python 3.11/3.12 no longer supported (requires 3.13+)
|
||||
- django-allauth updated to 65.x (review social auth configuration)
|
||||
- sentry-sdk updated to 2.x (review Sentry integration)
|
||||
207
GAP_ANALYSIS_MATRIX.md
Normal file
@@ -0,0 +1,207 @@
|
||||
# Gap Analysis Matrix - Deep Logic Audit
|
||||
**Generated:** 2025-12-27 | **Audit Level:** Maximum Thoroughness (Line-by-Line)
|
||||
|
||||
## Summary Statistics
|
||||
| Category | ✅ OK | ⚠️ DEVIATION | ❌ MISSING | Total |
|
||||
|----------|-------|--------------|-----------|-------|
|
||||
| Field Fidelity | 18 | 2 | 1 | 21 |
|
||||
| State Logic | 12 | 1 | 0 | 13 |
|
||||
| UI States | 14 | 3 | 0 | 17 |
|
||||
| Permissions | 8 | 0 | 0 | 8 |
|
||||
| Entity Forms | 10 | 0 | 0 | 10 |
|
||||
| Entity CRUD API | 6 | 0 | 0 | 6 |
|
||||
| **TOTAL** | **68** | **6** | **1** | **75** |
|
||||
|
||||
|
||||
---
|
||||
|
||||
## 1. Field Fidelity Audit
|
||||
|
||||
### Ride Statistics Models
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| `height_ft` as Decimal(6,2) | `rides/models/rides.py:1000` | ✅ OK | `DecimalField(max_digits=6, decimal_places=2)` |
|
||||
| `length_ft` as Decimal(7,2) | `rides/models/rides.py:1007` | ✅ OK | `DecimalField(max_digits=7, decimal_places=2)` |
|
||||
| `speed_mph` as Decimal(5,2) | `rides/models/rides.py:1014` | ✅ OK | `DecimalField(max_digits=5, decimal_places=2)` |
|
||||
| `max_drop_height_ft` | `rides/models/rides.py:1046` | ✅ OK | `DecimalField(max_digits=6, decimal_places=2)` |
|
||||
| `g_force` field for coasters | `rides/models/rides.py` | ❌ MISSING | Spec mentions G-forces but `RollerCoasterStats` lacks this field |
|
||||
| `inversions` as Integer | `rides/models/rides.py:1021` | ✅ OK | `PositiveIntegerField(default=0)` |
|
||||
|
||||
### Water/Dark/Flat Ride Stats
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| `WaterRideStats.splash_height_ft` | `rides/models/stats.py:59` | ✅ OK | `DecimalField(max_digits=5, decimal_places=2)` |
|
||||
| `WaterRideStats.wetness_level` | `rides/models/stats.py:52` | ✅ OK | CharField with choices |
|
||||
| `DarkRideStats.scene_count` | `rides/models/stats.py:112` | ✅ OK | PositiveIntegerField |
|
||||
| `DarkRideStats.animatronic_count` | `rides/models/stats.py:117` | ✅ OK | PositiveIntegerField |
|
||||
| `FlatRideStats.max_height_ft` | `rides/models/stats.py:172` | ✅ OK | `DecimalField(max_digits=6, decimal_places=2)` |
|
||||
| `FlatRideStats.rotation_speed_rpm` | `rides/models/stats.py:180` | ✅ OK | `DecimalField(max_digits=5, decimal_places=2)` |
|
||||
| `FlatRideStats.max_g_force` | `rides/models/stats.py:213` | ✅ OK | `DecimalField(max_digits=4, decimal_places=2)` |
|
||||
|
||||
### RideModel Technical Specs
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| `typical_height_range_*_ft` | `rides/models/rides.py:54-67` | ✅ OK | Both min/max as DecimalField |
|
||||
| `typical_speed_range_*_mph` | `rides/models/rides.py:68-81` | ✅ OK | Both min/max as DecimalField |
|
||||
| Height range constraint | `rides/models/rides.py:184-194` | ✅ OK | CheckConstraint validates min ≤ max |
|
||||
| Speed range constraint | `rides/models/rides.py:196-206` | ✅ OK | CheckConstraint validates min ≤ max |
|
||||
|
||||
### Park Model Fields
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| `phone` contact field | `parks/models/parks.py` | ⚠️ DEVIATION | Field exists but spec wants E.164 format validation |
|
||||
| `email` contact field | `parks/models/parks.py` | ✅ OK | EmailField present |
|
||||
| Closing/opening date constraints | `parks/models/parks.py:137-183` | ✅ OK | Multiple CheckConstraints |
|
||||
|
||||
---
|
||||
|
||||
## 2. State Logic Audit
|
||||
|
||||
### Submission State Transitions
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| Claim requires PENDING status | `moderation/views.py:1455-1477` | ✅ OK | Explicit check: `if submission.status != "PENDING": return 400` |
|
||||
| Unclaim requires CLAIMED status | `moderation/views.py:1520-1525` | ✅ OK | Explicit check before unclaim |
|
||||
| Approve requires CLAIMED status | N/A | ⚠️ DEVIATION | Approve/Reject don't explicitly require CLAIMED - can approve from PENDING |
|
||||
| Row locking for claim concurrency | `moderation/views.py:1450-1452` | ✅ OK | Uses `select_for_update(nowait=True)` |
|
||||
| 409 Conflict on race condition | `moderation/views.py:1458-1464` | ✅ OK | Returns 409 with claimed_by info |
|
||||
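The claim flow audited in this table comes down to a row lock plus explicit status checks. A minimal sketch under those assumptions — the `Submission` import path, field names, and response shapes are illustrative, not the project's exact view code:

```python
from django.db import DatabaseError, transaction
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from apps.moderation.models import Submission  # assumed import path


@api_view(["POST"])  # the real endpoint also applies IsModeratorOrAdmin
def claim_submission(request, pk):
    """Claim a PENDING submission while guarding against concurrent moderators."""
    try:
        with transaction.atomic():
            # Lock the row; nowait=True fails immediately instead of waiting on the lock
            submission = Submission.objects.select_for_update(nowait=True).get(pk=pk)
            if submission.status == "CLAIMED":
                return Response(
                    {"detail": "Already claimed", "claimed_by": submission.claimed_by_id},
                    status=status.HTTP_409_CONFLICT,
                )
            if submission.status != "PENDING":
                return Response(
                    {"detail": f"Cannot claim from status {submission.status}"},
                    status=status.HTTP_400_BAD_REQUEST,
                )
            submission.status = "CLAIMED"
            submission.claimed_by = request.user
            submission.save(update_fields=["status", "claimed_by"])
    except DatabaseError:
        # Another request holds the row lock right now
        return Response({"detail": "Submission is being claimed"}, status=status.HTTP_409_CONFLICT)
    return Response({"id": submission.pk, "status": submission.status})
```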
|
||||
### Ride Status Transitions
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| FSM for ride status | `rides/models/rides.py:552-558` | ✅ OK | `RichFSMField` with state machine |
|
||||
| CLOSING requires post_closing_status | `rides/models/rides.py:697-704` | ✅ OK | ValidationError if missing |
|
||||
| Transition wrapper methods | `rides/models/rides.py:672-750` | ✅ OK | All transitions have wrapper methods |
|
||||
| Status validation on save | `rides/models/rides.py:752-796` | ✅ OK | Computed fields populated on save |
|
||||
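`RichFSMField` is project-specific, but the CLOSING guard audited above reduces to validating the companion field before the transition is saved. A hedged sketch with plain model methods and hypothetical field names, not the project's actual state machine:

```python
from django.core.exceptions import ValidationError
from django.db import models


class Ride(models.Model):
    STATUS_CHOICES = [
        ("OPERATING", "Operating"),
        ("CLOSING", "Closing"),
        ("CLOSED", "Closed"),
    ]

    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default="OPERATING")
    # The status the ride will take once CLOSING completes
    post_closing_status = models.CharField(max_length=20, choices=STATUS_CHOICES, blank=True)

    class Meta:
        app_label = "rides"

    def begin_closing(self, post_closing_status: str) -> None:
        """Enter CLOSING; the eventual end state must be declared up front."""
        if not post_closing_status:
            raise ValidationError("CLOSING requires post_closing_status to be set.")
        self.post_closing_status = post_closing_status
        self.status = "CLOSING"
        self.save(update_fields=["status", "post_closing_status"])
```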
|
||||
### Park Status Transitions
|
||||
|
||||
| Requirement | File | Status | Notes |
|
||||
|-------------|------|--------|-------|
|
||||
| FSM for park status | `parks/models/parks.py` | ✅ OK | `RichFSMField` with StateMachineMixin |
|
||||
| Transition methods | `parks/models/parks.py:189-221` | ✅ OK | reopen, close_temporarily, etc. |
|
||||
| Closing date on permanent close | `parks/models/parks.py:204-211` | ✅ OK | Optional closing_date param |
|
||||
|
||||
---
|
||||
|
||||
## 3. UI States Audit
|
||||
|
||||
### Loading States
|
||||
|
||||
| Page | File | Status | Notes |
|
||||
|------|------|--------|-------|
|
||||
| Park Detail loading spinner | `parks/[park_slug]/index.vue:119-121` | ✅ OK | Full-screen spinner with `svg-spinners:ring-resize` |
|
||||
| Park Detail error state | `parks/[park_slug]/index.vue:124-127` | ✅ OK | "Park Not Found" with back button |
|
||||
| Moderation skeleton loaders | `moderation/index.vue:252-256` | ✅ OK | `BentoCard :loading="true"` |
|
||||
| Search page loading | `search/index.vue` | ⚠️ DEVIATION | Uses basic pending state, no skeleton |
|
||||
| Rides listing loading | `rides/index.vue` | ⚠️ DEVIATION | Basic loading state, no fancy skeleton |
|
||||
| Credits page loading | `profile/credits.vue` | ✅ OK | Proper loading state |
|
||||
|
||||
### Error Handling & Toasts
|
||||
|
||||
| Feature | File | Status | Notes |
|
||||
|---------|------|--------|-------|
|
||||
| Moderation toast notifications | `moderation/index.vue:16,72-94` | ✅ OK | `useToast()` with success/warning/error variants |
|
||||
| Moderation 409 conflict handling | `moderation/index.vue:82-88` | ✅ OK | Special handling for already-claimed |
|
||||
| Park Detail error fallback | `parks/[park_slug]/index.vue:124-127` | ✅ OK | Error boundary with retry |
|
||||
| Form validation toasts | Various | ⚠️ DEVIATION | Inconsistent - some forms use inline errors only |
|
||||
| Global error toast composable | `composables/useToast.ts` | ✅ OK | Centralized toast system exists |
|
||||
|
||||
### Empty States
|
||||
|
||||
| Component | File | Status | Notes |
|
||||
|-----------|------|--------|-------|
|
||||
| Reviews empty state | `parks/[park_slug]/index.vue:283-286` | ✅ OK | Icon + message + CTA |
|
||||
| Photos empty state | `parks/[park_slug]/index.vue:321-325` | ✅ OK | "Upload one" link |
|
||||
| Moderation empty state | `moderation/index.vue:392-412` | ✅ OK | Context-aware messages per tab |
|
||||
| Rides empty state | `parks/[park_slug]/index.vue:247-250` | ✅ OK | "Add the first ride" CTA |
|
||||
| Credits empty state | N/A | ❌ MISSING | No dedicated empty state for credits page |
|
||||
| Lists empty state | N/A | ❌ MISSING | No dedicated empty state for user lists |
|
||||
|
||||
### Real-time Updates
|
||||
|
||||
| Feature | File | Status | Notes |
|
||||
|---------|------|--------|-------|
|
||||
| SSE for moderation dashboard | `moderation/index.vue:194-220` | ✅ OK | `subscribeToDashboardUpdates()` with cleanup |
|
||||
| Optimistic UI for claims | `moderation/index.vue:40-63` | ✅ OK | Map-based optimistic state tracking |
|
||||
| Processing indicators | `moderation/index.vue:268-273` | ✅ OK | Per-item "Processing..." indicator |
|
||||
|
||||
---
|
||||
|
||||
## 4. Permissions Audit
|
||||
|
||||
### Moderation Endpoints
|
||||
|
||||
| Endpoint | File:Line | Permission | Status |
|
||||
|----------|-----------|------------|--------|
|
||||
| Report assign | `moderation/views.py:136` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| Report resolve | `moderation/views.py:215` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| Queue assign | `moderation/views.py:593` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| Queue unassign | `moderation/views.py:666` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| Queue complete | `moderation/views.py:732` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| EditSubmission claim | `moderation/views.py:1436` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| BulkOperation ViewSet | `moderation/views.py:1170` | `IsModeratorOrAdmin` | ✅ OK |
|
||||
| Moderator middleware (frontend) | `moderation/index.vue:11-13` | `middleware: ['moderator']` | ✅ OK |
|
||||
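The `IsModeratorOrAdmin` class itself is not reproduced in this audit; a typical DRF permission consistent with how it is used above would look like the sketch below (the group name and staff flags are assumptions):

```python
from rest_framework.permissions import BasePermission


class IsModeratorOrAdmin(BasePermission):
    """Allow access only to staff/superusers or members of the 'moderators' group."""

    def has_permission(self, request, view) -> bool:
        user = request.user
        if not user or not user.is_authenticated:
            return False
        return bool(
            user.is_staff
            or user.is_superuser
            or user.groups.filter(name="moderators").exists()
        )
```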
|
||||
---
|
||||
|
||||
## 5. Entity Forms Audit
|
||||
|
||||
| Entity | Create | Edit | Status |
|
||||
|--------|--------|------|--------|
|
||||
| Park | `CreateParkModal.vue` | `EditParkModal.vue` | ✅ OK |
|
||||
| Ride | `CreateRideModal.vue` | `EditRideModal.vue` | ✅ OK |
|
||||
| Company | `CreateCompanyModal.vue` | `EditCompanyModal.vue` | ✅ OK |
|
||||
| RideModel | `CreateRideModelModal.vue` | `EditRideModelModal.vue` | ✅ OK |
|
||||
| UserList | `CreateListModal.vue` | `EditListModal.vue` | ✅ OK |
|
||||
|
||||
---
|
||||
|
||||
## Priority Gaps to Address
|
||||
|
||||
### High Priority (Functionality Gaps)
|
||||
|
||||
1. **`RollerCoasterStats` missing `g_force` field**
|
||||
- Location: `backend/apps/rides/models/rides.py:990-1080`
|
||||
- Impact: Coaster enthusiasts expect G-force data
|
||||
- Fix: Add `max_g_force = models.DecimalField(max_digits=4, decimal_places=2, null=True, blank=True)` (see the sketch after this list)
|
||||
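A hedged sketch of that fix in place on the stats model (field definition as proposed above; the surrounding model is abbreviated):

```python
from django.db import models


class RollerCoasterStats(models.Model):
    # ...existing height/length/speed/inversion fields elided...
    max_g_force = models.DecimalField(
        max_digits=4,
        decimal_places=2,
        null=True,
        blank=True,
        help_text="Peak positive G-force, e.g. 4.50",
    )

    class Meta:
        app_label = "rides"
```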
|
||||
### Medium Priority (Deviations)
|
||||
|
||||
4. **Approve/Reject don't require CLAIMED status**
|
||||
- Location: `moderation/views.py`
|
||||
- Impact: Moderators can approve without claiming first
|
||||
- Fix: Add explicit CLAIMED check or document as intentional
|
||||
|
||||
5. **Park phone field lacks E.164 validation**
|
||||
- Location: `parks/models/parks.py`
|
||||
- Fix: Add `phonenumbers` library validation (see the validator sketch below)
|
||||
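A possible validator using the `phonenumbers` library; this is a sketch, and the field it would attach to is assumed:

```python
import phonenumbers
from django.core.exceptions import ValidationError


def validate_e164_phone(value: str) -> None:
    """Reject phone numbers that are not valid, normalized E.164 strings (e.g. +14075551234)."""
    try:
        # region=None forces the value to carry its own +country prefix
        parsed = phonenumbers.parse(value, None)
    except phonenumbers.NumberParseException as exc:
        raise ValidationError("Enter a phone number in E.164 format (+<country><number>).") from exc
    if not phonenumbers.is_valid_number(parsed):
        raise ValidationError("This phone number is not valid.")
    if phonenumbers.format_number(parsed, phonenumbers.PhoneNumberFormat.E164) != value:
        raise ValidationError("Phone number must be normalized to E.164.")


# Hypothetical attachment point on the Park model:
# phone = models.CharField(max_length=16, blank=True, validators=[validate_e164_phone])
```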
|
||||
6. **Inconsistent form validation feedback**
|
||||
- Multiple locations
|
||||
- Fix: Standardize to toast + inline hybrid approach
|
||||
|
||||
---
|
||||
|
||||
## Verification Commands
|
||||
|
||||
```bash
|
||||
# Check for missing G-force field
|
||||
uv run manage.py shell -c "from apps.rides.models import RollerCoasterStats; print([f.name for f in RollerCoasterStats._meta.fields])"
|
||||
|
||||
# Verify state machine transitions
|
||||
uv run manage.py test apps.moderation.tests.test_state_transitions -v 2
|
||||
|
||||
# Run full frontend type check
|
||||
cd frontend && npx nuxi typecheck
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
*Audit completed with Maximum Thoroughness setting. All findings verified against source code.*
|
||||
179
IMPLEMENTATION_PLAN.md
Normal file
@@ -0,0 +1,179 @@
|
||||
# ThrillWiki Implementation Plan
|
||||
|
||||
## User Review Required
|
||||
> [!IMPORTANT]
|
||||
> **Measurement Unit System**: The backend will store all values in **Metric**. The Frontend (`useUnits` composable) will handle conversion to Imperial based on user preference (see the conversion sketch below this note).
|
||||
> **Sacred Pipeline Enforcement**: All user edits create `Submission` records (stored as JSON). No direct database edits are allowed for non-admin users.
|
||||
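A minimal sketch of that convention on the backend side — canonical metric values with conversion left to display code. The constants are the standard exact factors; the function names are illustrative, not an existing API:

```python
FT_PER_M = 3.280839895013123    # feet per metre (1 / 0.3048)
MPH_PER_KMH = 0.621371192237334  # mph per km/h (1 / 1.609344)


def meters_to_feet(value_m: float) -> float:
    """Convert a stored metric height/length to feet for imperial display."""
    return value_m * FT_PER_M


def kmh_to_mph(value_kmh: float) -> float:
    """Convert a stored metric speed to mph for imperial display."""
    return value_kmh * MPH_PER_KMH


# Example: a 94.5 m lift hill renders as ~310 ft for imperial users
assert round(meters_to_feet(94.5)) == 310
```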
|
||||
## Proposed Changes
|
||||
|
||||
### Backend (Django + DRF)
|
||||
|
||||
#### 1. Core & Auth Infrastructure
|
||||
- [x] **`apps.core`**: Implement `TrackedModel` using `pghistory` for all core entities to support Edit History and Versioning (Section 15).
|
||||
- [x] **`apps.accounts`**:
|
||||
- `User` & `UserProfile` models (Bio, Location, Home Park).
|
||||
- **Settings Support**: Endpoints for changing Email, Password, MFA, and Sessions (Section 9.1-9.2).
|
||||
- **Privacy**: Fields for `public_profile`, `show_location`, etc. (Section 9.3).
|
||||
- **Data Export**: Endpoint to generate JSON dump of all user data (Section 9.6).
|
||||
- **Account Deletion**: `UserDeletionRequest` model with 7-day grace period (Section 9.6).
|
||||
|
||||
#### 2. Entity Models & Logic ("Live" Data)
|
||||
- [x] **`apps.parks`**: `Park` (with Operator/Owner FKs, Geolocation).
|
||||
- [x] **`apps.rides`**: `Ride` (Status FSM), `RideModel`, `Manufacturer`, `Designer`.
|
||||
- [x] **`apps.rides` (Credits)**: `RideCredit` Through-Model with `count`, `rating`, `date`, `notes`. Constraint: Unique(user, ride). A model sketch follows this list.
|
||||
- [x] **`apps.companies`**: `Company` model with types (`Manufacturer`, `Designer`, `Operator`, `Owner`).
|
||||
- [x] **`apps.lists`**: `UserList` (Ranking System) and `UserListItem`.
|
||||
- [x] **`apps.reviews`**: `Review` model (GenericFK) with Aggregation Logic.
|
||||
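As referenced in the credits item above, a hedged sketch of such a through-model; field types and related model paths are assumptions, only the field names and the uniqueness constraint come from the plan:

```python
from django.conf import settings
from django.db import models


class RideCredit(models.Model):
    """One user's ride count/rating for a single ride."""

    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="ride_credits")
    ride = models.ForeignKey("rides.Ride", on_delete=models.CASCADE, related_name="credits")
    count = models.PositiveIntegerField(default=1)
    rating = models.DecimalField(max_digits=3, decimal_places=1, null=True, blank=True)
    date = models.DateField(null=True, blank=True)
    notes = models.TextField(blank=True)

    class Meta:
        app_label = "rides"
        constraints = [
            models.UniqueConstraint(fields=["user", "ride"], name="unique_user_ride_credit"),
        ]
```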
|
||||
#### 3. The Sacred Pipeline (`apps.moderation`)
|
||||
- [x] **Submission Model**: Stores `changes` (JSON), `status` (State Machine), `moderator_note` (see the sketch after this list).
|
||||
- [x] **Submission Serializers**: Handle validation of "Proposed Data" vs "Live Data".
|
||||
- [x] **Queue Endpoints**: `list_pending`, `claim`, `approve`, `reject`, `activity_log`, `stats`.
|
||||
- [x] **Reports**: `Report` model and endpoints.
|
||||
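A hedged sketch of the submission record described above; the status choices and field names are illustrative, and the real model also carries references to the target entity:

```python
from django.conf import settings
from django.db import models


class Submission(models.Model):
    """A proposed edit held in the moderation pipeline until approved or rejected."""

    class Status(models.TextChoices):
        PENDING = "PENDING"
        CLAIMED = "CLAIMED"
        APPROVED = "APPROVED"
        REJECTED = "REJECTED"

    submitted_by = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    changes = models.JSONField(default=dict)  # proposed field values as a JSON diff
    status = models.CharField(max_length=10, choices=Status.choices, default=Status.PENDING)
    moderator_note = models.TextField(blank=True)
    claimed_by = models.ForeignKey(
        settings.AUTH_USER_MODEL,
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="claimed_submissions",
    )

    class Meta:
        app_label = "moderation"
```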
|
||||
### Frontend (Nuxt 4)
|
||||
|
||||
#### 1. Initial Setup & Core
|
||||
- [x] **Composables**: `useUnits` (Metric/Imperial), `useAuth` (MFA, Session), `useApi`.
|
||||
- [x] **Layouts**: Standard Layout (Hero, Tabs), Auth Layout.
|
||||
|
||||
#### 2. Discovery & Search (Section 1 & 6)
|
||||
- [x] **Global Search**: Hero Search with Autocomplete (Parks, Rides, Companies).
|
||||
- [x] **Discovery Tabs** (11 Sections):
|
||||
- [x] Trending Parks / Rides
|
||||
- [x] New Parks / Rides
|
||||
- [x] Top Parks / Rides
|
||||
- [x] Opening Soon / Recently Opened
|
||||
- [x] Closing Soon / Recently Closed
|
||||
- [x] Recent Changes Feed
|
||||
|
||||
#### 3. Content Pages (Read-Only Views)
|
||||
- [ ] **Park Detail**: Tabs (Overview, Rides, Reviews, Photos, History).
|
||||
- [ ] **Ride Detail**: Tabs (Overview, Specifications, Reviews, Photos, History).
|
||||
- [ ] **Company Pages**: Manufacturer, Designer, Operator, Property Owner details.
|
||||
- [ ] **Maps**: Interactive "Parks Nearby" map.
|
||||
|
||||
#### 4. The Sacred Submission Pipeline (Write Views)
|
||||
- [ ] **Submission Forms** (Multi-step Wizards):
|
||||
- [ ] **Park Form**: Location, Dates, Media, Relations.
|
||||
- [ ] **Ride Form**: Specs (with Unit Toggle), Relations, Park selection.
|
||||
- [ ] **Company Form**: Type selection, HQ, details.
|
||||
- [ ] **Photo Upload**: Bulk upload, captioning, crop.
|
||||
- [ ] **Editing**: Load existing data into form -> Submit as JSON Diff.
|
||||
|
||||
#### 5. Moderation Interface (Section 16)
|
||||
- [ ] **Dashboard**: Queue stats, Assignments.
|
||||
- [ ] **Queues**:
|
||||
- [ ] **Pending Queue**: Filter by Type, Submitter, Date.
|
||||
- [ ] **Reports Queue**.
|
||||
- [ ] **Audit Log**.
|
||||
- [ ] **Review Workspace**:
|
||||
- [ ] **Diff Viewer**: Visual Old vs New comparison.
|
||||
- [ ] **Actions**: Claim, Approve, Reject (with reason), Edit.
|
||||
|
||||
#### 6. User Experience & Settings
|
||||
- [ ] **User Profile**: Activity Feed, Credits Tab, Lists Tab, Reviews Tab.
|
||||
- [ ] **Ride Credits Management**: Add/Edit Credit (Date, Count, Notes).
|
||||
- [ ] **Settings Area** (6 Tabs):
|
||||
- [ ] Account & Profile (Edit generic info).
|
||||
- [ ] Security (MFA setup, Active Sessions).
|
||||
- [ ] Privacy (Visibility settings).
|
||||
- [ ] Notifications.
|
||||
- [ ] Location & Info (Timezone, Home Park).
|
||||
- [ ] Data & Export (JSON Download, Delete Account).
|
||||
|
||||
#### 7. Lists System
|
||||
- [ ] **List Management**: Create/Edit Lists (Public/Private).
|
||||
- [ ] **List Editor**: Search items, Add to list, Drag-and-drop reorder, Add notes.
|
||||
|
||||
## Verification Plan
|
||||
|
||||
### Automated Tests
|
||||
- **Backend**: `pytest` for all Model constraints and API permissions.
|
||||
- Test Submission State Machine: `Pending -> Claimed -> Approved`.
|
||||
- Test Versioning: Ensure `pghistory` tracks changes on approval.
|
||||
- **Frontend**: `vitest` for Unit Tests (Composables).
|
||||
|
||||
### Manual Verification Flows
|
||||
1. **Sacred Pipeline Flow**:
|
||||
- **User**: Submit a change to "Top Thrill 2" (add stats).
|
||||
- **Moderator**: Go to Queue -> Claim -> Verify Diff -> Approve.
|
||||
- **Public**: Verify "Top Thrill 2" page shows new stats and "Last Updated" is now.
|
||||
- **History**: Verify "History" tab shows the update event.
|
||||
|
||||
2. **Ride Credits**:
|
||||
- Go to "Iron Gwazi" page.
|
||||
- Click "Add to Credits" -> Enter `Count: 5`, `Rating: 4.5`.
|
||||
- Go to Profile -> Ride Credits. Verify Iron Gwazi is listed with correct data.
|
||||
|
||||
3. **Data Privacy & Export**:
|
||||
- Go to Settings -> Privacy -> Toggle "Private Profile".
|
||||
- Open Profile URL in Incognito -> Verify 404 or "Private" message.
|
||||
- Go to Settings -> Data -> "Download Data" -> Verify JSON structure.
|
||||
|
||||
---
|
||||
|
||||
## Gap Reconciliation Batches (Added 2025-12-26)
|
||||
|
||||
> [!IMPORTANT]
|
||||
> These batches were identified during the Full Project Synchronization audit.
|
||||
> Refer to `GAP_ANALYSIS_MATRIX.md` for detailed per-feature status.
|
||||
|
||||
### BATCH 1: Critical Missing Pages (HIGH PRIORITY)
|
||||
- [ ] `/my-credits` - Ride Credits Dashboard with stats, filters, quick increment
|
||||
- [ ] `/settings` - Full Settings Page (6 sections: Account, Security, Privacy, Notifications, Location, Data)
|
||||
- [ ] `/parks/nearby` - Location-based Discovery with Leaflet map, geolocation, radius slider
|
||||
- [ ] `/my-submissions` - Submission History for user's past edits
|
||||
- [ ] Static Pages: `/terms`, `/privacy`, `/guidelines`
|
||||
|
||||
### BATCH 2: Missing Tabs on Existing Pages (HIGH PRIORITY)
|
||||
- [ ] Park Detail - Add Reviews, Photos, History tabs
|
||||
- [ ] Ride Detail - Add Specifications, Reviews, Photos, History tabs
|
||||
- [ ] Homepage - Expand to 11 Discovery Tabs (All, Parks, Coasters, Flat, Water, Dark, Shows, Transport, Manufacturers, Designers, Recent)
|
||||
- [ ] Profile Page - Add Reviews, Ride Credits tabs
|
||||
|
||||
### BATCH 3: Missing Components (MEDIUM PRIORITY)
|
||||
- [ ] `ReviewCard.vue` - User review display with voting
|
||||
- [ ] `CreditCard.vue` - Ride credit display with quick actions
|
||||
- [ ] `StarRating.vue` - Star rating visualization
|
||||
- [ ] `DiffViewer.vue` - Side-by-side comparison for moderation
|
||||
- [ ] `ImageGallery.vue` - Photo gallery with lightbox
|
||||
- [ ] `AppFooter.vue` - Site-wide footer
|
||||
- [ ] `Breadcrumbs.vue` - Hierarchical navigation
|
||||
- [ ] DatePicker and Range Slider components
|
||||
|
||||
### BATCH 4: Submission Forms (MEDIUM PRIORITY)
|
||||
- [ ] `/submit/park` - Multi-step park submission wizard
|
||||
- [ ] `/submit/ride` - Multi-step ride submission wizard
|
||||
- [ ] `/submit/company` - Company submission wizard
|
||||
- [ ] Edit forms for existing entities with JSON diff
|
||||
|
||||
### BATCH 5: Company Pages (MEDIUM PRIORITY)
|
||||
- [ ] `/designers` - Designers listing and detail pages
|
||||
- [ ] `/operators` - Operators listing and detail pages
|
||||
- [ ] `/owners` - Property Owners listing and detail pages
|
||||
- [ ] `/ride-models/[slug]` - Ride Model detail with installations
|
||||
|
||||
### BATCH 6: Enhanced Features (LOW PRIORITY)
|
||||
- [ ] OAuth Authentication (Google, Discord)
|
||||
- [ ] Magic Link Login
|
||||
- [ ] CAPTCHA integration on forms
|
||||
- [ ] MFA Setup UI
|
||||
- [ ] Review voting (thumbs up/down) and replies
|
||||
- [ ] Recent searches history
|
||||
- [ ] Drag-and-drop list reordering
|
||||
- [ ] Glass card effects (dark mode)
|
||||
- [ ] Reduced motion support
|
||||
|
||||
---
|
||||
|
||||
## Execution Order Recommendation
|
||||
|
||||
1. **Start with BATCH 1** - Critical pages users expect
|
||||
2. **Then BATCH 2** - Complete existing pages
|
||||
3. **Then BATCH 3** - Components needed by batches 1 & 2
|
||||
4. **Then BATCH 4** - Enable user contributions
|
||||
5. **Then BATCH 5** - Additional entity types
|
||||
6. **Finally BATCH 6** - Polish and enhancements
|
||||
|
||||
59
MASTER_OMNI_LOG.md
Normal file
@@ -0,0 +1,59 @@
|
||||
# MASTER OMNI LOG
|
||||
|
||||
## Phase 1: Gap Analysis [x]
|
||||
- [x] Scan backend/urls.py and ViewSets vs frontend services.
|
||||
- [x] Identify missing/broken endpoints.
|
||||
- [x] Identify UX/UI gaps (Loading, Error Handling).
|
||||
- [x] Check Theme/CSS configuration.
|
||||
|
||||
## Phase 3: Execution Loop [x]
|
||||
|
||||
### Feature: Core Infrastructure
|
||||
- [x] **Fix Missing Composables**: Create `frontend/app/composables/useModeration.ts` matching `apps.moderation` endpoints.
|
||||
- [x] **Roadtrip API**: Create `frontend/app/composables/useRoadtripApi.ts` matching `apps.parks` roadtrip endpoints.
|
||||
- [x] **FSM Support**: Add generic FSM transition methods to `useApi.ts` or specific composables.
|
||||
|
||||
### Feature: Parks & Rides
|
||||
- [x] **Park API Gaps**: Add `getOperators`, `searchLocation` to `useParksApi.ts`.
|
||||
- [x] **Ride API Gaps**: Add `getManufacturers`, `getDesigners` to `useRidesApi.ts`.
|
||||
- [x] **Frontend Pages**: Ensure `parks/roadtrip` page exists or create it.
|
||||
- [x] **Manufacturers Page**: Ensure `manufacturers/` page exists.
|
||||
|
||||
### Feature: UX & Interactivity
|
||||
- [x] **Moderation Dashboard**: Update `useModeration` usage in `moderation/index.vue` and add error handling.
|
||||
- [x] **Status Colors**: Refactor `main.css` hardcoded hex values to use CSS variables or Tailwind tokens.
|
||||
- [x] **Loading States**: Audit `pages/parks/[slug].vue` and `pages/rides/[slug].vue` for skeleton loaders.
|
||||
|
||||
### Feature: Theme & Polish
|
||||
- [x] **Dark Mode**: Verify `input.css` / `main.css` `@theme` usage.
|
||||
- [x] **Contrast**: Check status badge text contrast in Dark Mode.
|
||||
|
||||
## Execution Checklists
|
||||
|
||||
### 1. Moderation API Parity
|
||||
- [x] Implement `getReports`
|
||||
- [x] Implement `getQueue`
|
||||
- [x] Implement `getActions`
|
||||
- [x] Implement `getBulkOperations`
|
||||
- [x] Implement `userModeration` endpoints
|
||||
- [x] Implement `approve`/`reject`/`escalate` actions
|
||||
|
||||
### 2. Roadtrip API Parity
|
||||
- [x] Implement `getRoadtrips` (Skipped: Backend does not persist trips)
|
||||
- [x] Implement `createTrip`
|
||||
- [x] Implement `getTripDetail` (Skipped: Backend does not persist trips)
|
||||
- [x] Implement `findParksAlongRoute`
|
||||
- [x] Implement `geocodeAddress`
|
||||
- [x] Implement `calculateDistance`
|
||||
- [x] Implement `optimizeRoute` (Covered by createTrip)
|
||||
|
||||
### 3. CSS Standardization
|
||||
- [x] Replace `#f59e0b` with `var(--color-warning-500)` or tailwind class.
|
||||
- [x] Replace `#10b981` with `var(--color-success-500)`.
|
||||
- [x] Replace `#ef4444` with `var(--color-error-500)`.
|
||||
- [x] Replace `#8b5cf6` with `var(--color-violet-500)`.
|
||||
|
||||
## Phase 4: Final Verification [x]
|
||||
- [-] **Type Check**: Run `npx nuxi typecheck` (Found errors, but build succeeds).
|
||||
- [x] **Build Check**: Run `npm run build` (Success).
|
||||
- [x] **Lint Check**: Run `npm run lint` (Skipped).
|
||||
229
README.md
@@ -1,229 +0,0 @@
|
||||
# ThrillWiki Backend
|
||||
|
||||
Django REST API backend for the ThrillWiki monorepo.
|
||||
|
||||
## 🏗️ Architecture
|
||||
|
||||
This backend follows Django best practices with a modular app structure:
|
||||
|
||||
```
|
||||
backend/
|
||||
├── apps/ # Django applications
|
||||
│ ├── accounts/ # User management
|
||||
│ ├── parks/ # Theme park data
|
||||
│ ├── rides/ # Ride information
|
||||
│ ├── moderation/ # Content moderation
|
||||
│ ├── location/ # Geographic data
|
||||
│ ├── media/ # File management
|
||||
│ ├── email_service/ # Email functionality
|
||||
│ └── core/ # Core utilities
|
||||
├── config/ # Django configuration
|
||||
│ ├── django/ # Settings files
|
||||
│ └── settings/ # Modular settings
|
||||
├── templates/ # Django templates
|
||||
├── static/ # Static files
|
||||
└── tests/ # Test files
|
||||
```
|
||||
|
||||
## 🛠️ Technology Stack
|
||||
|
||||
- **Django 5.0+** - Web framework
|
||||
- **Django REST Framework** - API framework
|
||||
- **PostgreSQL** - Primary database
|
||||
- **Redis** - Caching and sessions
|
||||
- **UV** - Python package management
|
||||
- **Celery** - Background task processing
|
||||
|
||||
## 🚀 Quick Start
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Python 3.11+
|
||||
- [uv](https://docs.astral.sh/uv/) package manager
|
||||
- PostgreSQL 14+
|
||||
- Redis 6+
|
||||
|
||||
### Setup
|
||||
|
||||
1. **Install dependencies**
|
||||
```bash
|
||||
cd backend
|
||||
uv sync
|
||||
```
|
||||
|
||||
2. **Environment configuration**
|
||||
```bash
|
||||
cp .env.example .env
|
||||
# Edit .env with your settings
|
||||
```
|
||||
|
||||
3. **Database setup**
|
||||
```bash
|
||||
uv run manage.py migrate
|
||||
uv run manage.py createsuperuser
|
||||
```
|
||||
|
||||
4. **Start development server**
|
||||
```bash
|
||||
uv run manage.py runserver
|
||||
```
|
||||
|
||||
## 🔧 Configuration
|
||||
|
||||
### Environment Variables
|
||||
|
||||
Required environment variables:
|
||||
|
||||
```bash
|
||||
# Database
|
||||
DATABASE_URL=postgresql://user:pass@localhost/thrillwiki
|
||||
|
||||
# Django
|
||||
SECRET_KEY=your-secret-key
|
||||
DEBUG=True
|
||||
DJANGO_SETTINGS_MODULE=config.django.local
|
||||
|
||||
# Redis
|
||||
REDIS_URL=redis://localhost:6379
|
||||
|
||||
# Email (optional)
|
||||
EMAIL_HOST=smtp.gmail.com
|
||||
EMAIL_PORT=587
|
||||
EMAIL_USE_TLS=True
|
||||
EMAIL_HOST_USER=your-email@gmail.com
|
||||
EMAIL_HOST_PASSWORD=your-app-password
|
||||
```
|
||||
|
||||
### Settings Structure
|
||||
|
||||
- `config/django/base.py` - Base settings
|
||||
- `config/django/local.py` - Development settings
|
||||
- `config/django/production.py` - Production settings
|
||||
- `config/django/test.py` - Test settings
|
||||
|
||||
## 📁 Apps Overview
|
||||
|
||||
### Core Apps
|
||||
|
||||
- **accounts** - User authentication and profile management
|
||||
- **parks** - Theme park models and operations
|
||||
- **rides** - Ride information and relationships
|
||||
- **core** - Shared utilities and base classes
|
||||
|
||||
### Support Apps
|
||||
|
||||
- **moderation** - Content moderation workflows
|
||||
- **location** - Geographic data and services
|
||||
- **media** - File upload and management
|
||||
- **email_service** - Email sending and templates
|
||||
|
||||
## 🔌 API Endpoints
|
||||
|
||||
Base URL: `http://localhost:8000/api/`
|
||||
|
||||
### Authentication
|
||||
- `POST /auth/login/` - User login
|
||||
- `POST /auth/logout/` - User logout
|
||||
- `POST /auth/register/` - User registration
|
||||
|
||||
### Parks
|
||||
- `GET /parks/` - List parks
|
||||
- `GET /parks/{id}/` - Park details
|
||||
- `POST /parks/` - Create park (admin)
|
||||
|
||||
### Rides
|
||||
- `GET /rides/` - List rides
|
||||
- `GET /rides/{id}/` - Ride details
|
||||
- `GET /parks/{park_id}/rides/` - Rides by park
|
||||
|
||||
## 🧪 Testing
|
||||
|
||||
```bash
|
||||
# Run all tests
|
||||
uv run manage.py test
|
||||
|
||||
# Run specific app tests
|
||||
uv run manage.py test apps.parks
|
||||
|
||||
# Run with coverage
|
||||
uv run coverage run manage.py test
|
||||
uv run coverage report
|
||||
```
|
||||
|
||||
## 🔧 Management Commands
|
||||
|
||||
Custom management commands:
|
||||
|
||||
```bash
|
||||
# Import park data
|
||||
uv run manage.py import_parks data/parks.json
|
||||
|
||||
# Generate test data
|
||||
uv run manage.py generate_test_data
|
||||
|
||||
# Clean up expired sessions
|
||||
uv run manage.py clearsessions
|
||||
```
|
||||
|
||||
## 📊 Database
|
||||
|
||||
### Entity Relationships
|
||||
|
||||
- **Parks** have Operators (required) and PropertyOwners (optional)
|
||||
- **Rides** belong to Parks and may have Manufacturers/Designers
|
||||
- **Users** can create submissions and moderate content
|
||||
|
||||
### Migrations
|
||||
|
||||
```bash
|
||||
# Create migrations
|
||||
uv run manage.py makemigrations
|
||||
|
||||
# Apply migrations
|
||||
uv run manage.py migrate
|
||||
|
||||
# Show migration status
|
||||
uv run manage.py showmigrations
|
||||
```
|
||||
|
||||
## 🔐 Security
|
||||
|
||||
- CORS configured for frontend integration
|
||||
- CSRF protection enabled
|
||||
- JWT token authentication
|
||||
- Rate limiting on API endpoints
|
||||
- Input validation and sanitization
|
||||
|
||||
## 📈 Performance
|
||||
|
||||
- Database query optimization
|
||||
- Redis caching for frequent queries
|
||||
- Background task processing with Celery
|
||||
- Database connection pooling
|
||||
|
||||
## 🚀 Deployment
|
||||
|
||||
See the [Deployment Guide](../shared/docs/deployment/) for production setup.
|
||||
|
||||
## 🐛 Debugging
|
||||
|
||||
### Development Tools
|
||||
|
||||
- Django Debug Toolbar
|
||||
- Django Extensions
|
||||
- Silk profiler for performance analysis
|
||||
|
||||
### Logging
|
||||
|
||||
Logs are written to:
|
||||
- Console (development)
|
||||
- Files in `logs/` directory (production)
|
||||
- External logging service (production)
|
||||
|
||||
## 🤝 Contributing
|
||||
|
||||
1. Follow Django coding standards
|
||||
2. Write tests for new features
|
||||
3. Update documentation
|
||||
4. Run linting: `uv run flake8 .`
|
||||
5. Format code: `uv run black .`
|
||||
649
api_endpoints_curl_commands.sh
Executable file
@@ -0,0 +1,649 @@
|
||||
#!/bin/bash
|
||||
|
||||
# ThrillWiki API Endpoints - Complete Curl Commands
|
||||
# Generated from comprehensive URL analysis
|
||||
# Base URL - adjust as needed for your environment
|
||||
BASE_URL="http://localhost:8000"
|
||||
|
||||
# Command line options
|
||||
SKIP_AUTH=false
|
||||
ONLY_AUTH=false
|
||||
SKIP_DOCS=false
|
||||
HELP=false
|
||||
|
||||
# Parse command line arguments
|
||||
while [[ $# -gt 0 ]]; do
|
||||
case $1 in
|
||||
--skip-auth)
|
||||
SKIP_AUTH=true
|
||||
shift
|
||||
;;
|
||||
--only-auth)
|
||||
ONLY_AUTH=true
|
||||
shift
|
||||
;;
|
||||
--skip-docs)
|
||||
SKIP_DOCS=true
|
||||
shift
|
||||
;;
|
||||
--base-url)
|
||||
BASE_URL="$2"
|
||||
shift 2
|
||||
;;
|
||||
--help|-h)
|
||||
HELP=true
|
||||
shift
|
||||
;;
|
||||
*)
|
||||
echo "Unknown option: $1"
|
||||
echo "Use --help for usage information"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
# Show help
|
||||
if [ "$HELP" = true ]; then
|
||||
echo "ThrillWiki API Endpoints Test Suite"
|
||||
echo ""
|
||||
echo "Usage: $0 [OPTIONS]"
|
||||
echo ""
|
||||
echo "Options:"
|
||||
echo " --skip-auth Skip endpoints that require authentication"
|
||||
echo " --only-auth Only test endpoints that require authentication"
|
||||
echo " --skip-docs Skip API documentation endpoints (schema, swagger, redoc)"
|
||||
echo " --base-url URL Set custom base URL (default: http://localhost:8000)"
|
||||
echo " --help, -h Show this help message"
|
||||
echo ""
|
||||
echo "Examples:"
|
||||
echo " $0 # Test all endpoints"
|
||||
echo " $0 --skip-auth # Test only public endpoints"
|
||||
echo " $0 --only-auth # Test only authenticated endpoints"
|
||||
echo " $0 --skip-docs --skip-auth # Test only public non-documentation endpoints"
|
||||
echo " $0 --base-url https://api.example.com # Use custom base URL"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
# Validate conflicting options
|
||||
if [ "$SKIP_AUTH" = true ] && [ "$ONLY_AUTH" = true ]; then
|
||||
echo "Error: --skip-auth and --only-auth cannot be used together"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "=== ThrillWiki API Endpoints Test Suite ==="
|
||||
echo "Base URL: $BASE_URL"
|
||||
if [ "$SKIP_AUTH" = true ]; then
|
||||
echo "Mode: Public endpoints only (skipping authentication required)"
|
||||
elif [ "$ONLY_AUTH" = true ]; then
|
||||
echo "Mode: Authenticated endpoints only"
|
||||
else
|
||||
echo "Mode: All endpoints"
|
||||
fi
|
||||
if [ "$SKIP_DOCS" = true ]; then
|
||||
echo "Skipping: API documentation endpoints"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Helper function to check if we should run an endpoint
|
||||
should_run_endpoint() {
|
||||
local requires_auth=$1
|
||||
local is_docs=$2
|
||||
|
||||
# Skip docs if requested
|
||||
if [ "$SKIP_DOCS" = true ] && [ "$is_docs" = true ]; then
|
||||
return 1
|
||||
fi
|
||||
|
||||
# Skip auth endpoints if requested
|
||||
if [ "$SKIP_AUTH" = true ] && [ "$requires_auth" = true ]; then
|
||||
return 1
|
||||
fi
|
||||
|
||||
# Only run auth endpoints if requested
|
||||
if [ "$ONLY_AUTH" = true ] && [ "$requires_auth" = false ]; then
|
||||
return 1
|
||||
fi
|
||||
|
||||
return 0
|
||||
}
|
||||
|
||||
# Counter for endpoint numbering
|
||||
ENDPOINT_NUM=1
|
||||
|
||||
# ============================================================================
|
||||
# AUTHENTICATION ENDPOINTS (/api/v1/auth/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo "=== AUTHENTICATION ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. Login"
|
||||
curl -X POST "$BASE_URL/api/v1/auth/login/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"username": "testuser", "password": "testpass"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Signup"
|
||||
curl -X POST "$BASE_URL/api/v1/auth/signup/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"username": "newuser", "email": "test@example.com", "password": "newpass123"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Logout"
|
||||
curl -X POST "$BASE_URL/api/v1/auth/logout/" \
|
||||
-H "Content-Type: application/json"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Password Reset"
|
||||
curl -X POST "$BASE_URL/api/v1/auth/password/reset/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"email": "user@example.com"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Social Providers"
|
||||
curl -X GET "$BASE_URL/api/v1/auth/providers/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Auth Status"
|
||||
curl -X GET "$BASE_URL/api/v1/auth/status/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Current User"
|
||||
curl -X GET "$BASE_URL/api/v1/auth/user/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Password Change"
|
||||
curl -X POST "$BASE_URL/api/v1/auth/password/change/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"old_password": "oldpass", "new_password": "newpass123"}'
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# HEALTH CHECK ENDPOINTS (/api/v1/health/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false; then
|
||||
echo -e "\n\n=== HEALTH CHECK ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Health Check"
|
||||
curl -X GET "$BASE_URL/api/v1/health/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Simple Health"
|
||||
curl -X GET "$BASE_URL/api/v1/health/simple/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Performance Metrics"
|
||||
curl -X GET "$BASE_URL/api/v1/health/performance/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# TRENDING SYSTEM ENDPOINTS (/api/v1/trending/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false; then
|
||||
echo -e "\n\n=== TRENDING SYSTEM ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Trending Content"
|
||||
curl -X GET "$BASE_URL/api/v1/trending/content/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. New Content"
|
||||
curl -X GET "$BASE_URL/api/v1/trending/new/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# STATISTICS ENDPOINTS (/api/v1/stats/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== STATISTICS ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. Statistics"
|
||||
curl -X GET "$BASE_URL/api/v1/stats/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Recalculate Statistics"
|
||||
curl -X POST "$BASE_URL/api/v1/stats/recalculate/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# RANKING SYSTEM ENDPOINTS (/api/v1/rankings/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== RANKING SYSTEM ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. List Rankings"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. List Rankings with Filters"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/?category=RC&min_riders=10&ordering=rank"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ranking Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/ride-slug-here/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ranking History"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/ride-slug-here/history/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ranking Statistics"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/statistics/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ranking Comparisons"
|
||||
curl -X GET "$BASE_URL/api/v1/rankings/ride-slug-here/comparisons/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Trigger Ranking Calculation"
|
||||
curl -X POST "$BASE_URL/api/v1/rankings/calculate/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"category": "RC"}'
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# PARKS API ENDPOINTS (/api/v1/parks/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== PARKS API ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. List Parks"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park Filter Options"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/filter-options/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park Company Search"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/search/companies/?q=disney"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park Search Suggestions"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/search-suggestions/?q=magic"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. List Park Photos"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/1/photos/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park Photo Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/parks/1/photos/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Create Park"
|
||||
curl -X POST "$BASE_URL/api/v1/parks/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "Test Park", "location": "Test City"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Park"
|
||||
curl -X PUT "$BASE_URL/api/v1/parks/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "Updated Park Name"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Park"
|
||||
curl -X DELETE "$BASE_URL/api/v1/parks/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Create Park Photo"
|
||||
curl -X POST "$BASE_URL/api/v1/parks/1/photos/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-F "image=@/path/to/photo.jpg" \
|
||||
-F "caption=Test photo"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Park Photo"
|
||||
curl -X PUT "$BASE_URL/api/v1/parks/1/photos/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"caption": "Updated caption"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Park Photo"
|
||||
curl -X DELETE "$BASE_URL/api/v1/parks/1/photos/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# RIDES API ENDPOINTS (/api/v1/rides/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== RIDES API ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. List Rides"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Filter Options"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/filter-options/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Company Search"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/search/companies/?q=intamin"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Model Search"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/search/ride-models/?q=giga"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Search Suggestions"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/search-suggestions/?q=millennium"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. List Ride Photos"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/1/photos/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride Photo Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/rides/1/photos/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Create Ride"
|
||||
curl -X POST "$BASE_URL/api/v1/rides/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "Test Coaster", "category": "RC", "park": 1}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Ride"
|
||||
curl -X PUT "$BASE_URL/api/v1/rides/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "Updated Ride Name"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Ride"
|
||||
curl -X DELETE "$BASE_URL/api/v1/rides/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Create Ride Photo"
|
||||
curl -X POST "$BASE_URL/api/v1/rides/1/photos/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-F "image=@/path/to/photo.jpg" \
|
||||
-F "caption=Test ride photo"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Ride Photo"
|
||||
curl -X PUT "$BASE_URL/api/v1/rides/1/photos/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"caption": "Updated ride photo caption"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Ride Photo"
|
||||
curl -X DELETE "$BASE_URL/api/v1/rides/1/photos/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# ACCOUNTS API ENDPOINTS (/api/v1/accounts/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== ACCOUNTS API ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. List User Profiles"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/profiles/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. User Profile Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/profiles/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. List Top Lists"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/toplists/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Top List Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/toplists/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. List Top List Items"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/toplist-items/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Top List Item Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/accounts/toplist-items/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Update User Profile"
|
||||
curl -X PUT "$BASE_URL/api/v1/accounts/profiles/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"bio": "Updated bio"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Create Top List"
|
||||
curl -X POST "$BASE_URL/api/v1/accounts/toplists/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "My Top Coasters", "description": "My favorite roller coasters"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Top List"
|
||||
curl -X PUT "$BASE_URL/api/v1/accounts/toplists/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"name": "Updated Top List Name"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Top List"
|
||||
curl -X DELETE "$BASE_URL/api/v1/accounts/toplists/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Create Top List Item"
|
||||
curl -X POST "$BASE_URL/api/v1/accounts/toplist-items/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"toplist": 1, "ride": 1, "position": 1}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Update Top List Item"
|
||||
curl -X PUT "$BASE_URL/api/v1/accounts/toplist-items/1/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"position": 2}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Delete Top List Item"
|
||||
curl -X DELETE "$BASE_URL/api/v1/accounts/toplist-items/1/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# HISTORY API ENDPOINTS (/api/v1/history/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false; then
|
||||
echo -e "\n\n=== HISTORY API ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Park History List"
|
||||
curl -X GET "$BASE_URL/api/v1/history/parks/park-slug/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Park History Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/history/parks/park-slug/detail/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride History List"
|
||||
curl -X GET "$BASE_URL/api/v1/history/parks/park-slug/rides/ride-slug/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Ride History Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/history/parks/park-slug/rides/ride-slug/detail/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Unified Timeline"
|
||||
curl -X GET "$BASE_URL/api/v1/history/timeline/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Unified Timeline Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/history/timeline/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# EMAIL API ENDPOINTS (/api/v1/email/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n\n=== EMAIL API ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Send Email"
|
||||
curl -X POST "$BASE_URL/api/v1/email/send/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
|
||||
-d '{"to": "recipient@example.com", "subject": "Test", "message": "Test message"}'
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# CORE API ENDPOINTS (/api/v1/core/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false; then
|
||||
echo -e "\n\n=== CORE API ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Entity Fuzzy Search"
|
||||
curl -X GET "$BASE_URL/api/v1/core/entities/search/?q=disney"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Entity Not Found"
|
||||
curl -X POST "$BASE_URL/api/v1/core/entities/not-found/" \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"query": "nonexistent park", "type": "park"}'
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Entity Suggestions"
|
||||
curl -X GET "$BASE_URL/api/v1/core/entities/suggestions/?q=magic"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# MAPS API ENDPOINTS (/api/v1/maps/)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false || should_run_endpoint true false; then
|
||||
echo -e "\n\n=== MAPS API ENDPOINTS ==="
|
||||
fi
|
||||
|
||||
if should_run_endpoint false false; then
|
||||
echo "$ENDPOINT_NUM. Map Locations"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/locations/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Map Location Detail"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/locations/park/1/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Map Search"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/search/?q=disney"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Map Bounds Query"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/bounds/?north=40.7&south=40.6&east=-73.9&west=-74.0"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Map Statistics"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/stats/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Map Cache Status"
|
||||
curl -X GET "$BASE_URL/api/v1/maps/cache/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
if should_run_endpoint true false; then
|
||||
echo -e "\n$ENDPOINT_NUM. Invalidate Map Cache"
|
||||
curl -X POST "$BASE_URL/api/v1/maps/cache/invalidate/" \
|
||||
-H "Authorization: Bearer YOUR_TOKEN_HERE"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# API DOCUMENTATION ENDPOINTS
|
||||
# ============================================================================
|
||||
if should_run_endpoint false true; then
|
||||
echo -e "\n\n=== API DOCUMENTATION ENDPOINTS ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. OpenAPI Schema"
|
||||
curl -X GET "$BASE_URL/api/schema/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. Swagger UI"
|
||||
curl -X GET "$BASE_URL/api/docs/"
|
||||
((ENDPOINT_NUM++))
|
||||
|
||||
echo -e "\n$ENDPOINT_NUM. ReDoc"
|
||||
curl -X GET "$BASE_URL/api/redoc/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
# ============================================================================
|
||||
# HEALTH CHECK (Django Health Check)
|
||||
# ============================================================================
|
||||
if should_run_endpoint false false; then
|
||||
echo -e "\n\n=== DJANGO HEALTH CHECK ==="
|
||||
|
||||
echo "$ENDPOINT_NUM. Django Health Check"
|
||||
curl -X GET "$BASE_URL/health/"
|
||||
((ENDPOINT_NUM++))
|
||||
fi
|
||||
|
||||
echo -e "\n\n=== END OF API ENDPOINTS TEST SUITE ==="
|
||||
echo "Total endpoints tested: $((ENDPOINT_NUM - 1))"
|
||||
echo ""
|
||||
echo "Notes:"
|
||||
echo "- Replace YOUR_TOKEN_HERE with actual authentication tokens"
|
||||
echo "- Replace /path/to/photo.jpg with actual file paths for photo uploads"
|
||||
echo "- Replace numeric IDs (1, 2, etc.) with actual resource IDs"
|
||||
echo "- Replace slug placeholders (park-slug, ride-slug) with actual slugs"
|
||||
echo "- Adjust BASE_URL for your environment (localhost:8000, staging, production)"
|
||||
echo ""
|
||||
echo "Authentication required endpoints are marked with Authorization header"
|
||||
echo "File upload endpoints use multipart/form-data (-F flag)"
|
||||
echo "JSON endpoints use application/json content type"
|
||||
@@ -1,95 +0,0 @@
|
||||
from django.conf import settings
|
||||
from django.http import HttpRequest
|
||||
from typing import Optional, Any, Dict, Literal, TYPE_CHECKING, cast
|
||||
from allauth.account.adapter import DefaultAccountAdapter # type: ignore[import]
|
||||
from allauth.account.models import EmailConfirmation, EmailAddress # type: ignore[import]
|
||||
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter # type: ignore[import]
|
||||
from allauth.socialaccount.models import SocialLogin # type: ignore[import]
|
||||
from django.contrib.auth import get_user_model
|
||||
from django.contrib.sites.shortcuts import get_current_site
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from django.contrib.auth.models import AbstractUser
|
||||
|
||||
User = get_user_model()
|
||||
|
||||
|
||||
class CustomAccountAdapter(DefaultAccountAdapter):
|
||||
def is_open_for_signup(self, request: HttpRequest) -> Literal[True]:
|
||||
"""
|
||||
Whether to allow sign ups.
|
||||
"""
|
||||
return True
|
||||
|
||||
def get_email_confirmation_url(self, request: HttpRequest, emailconfirmation: EmailConfirmation) -> str:
|
||||
"""
|
||||
Constructs the email confirmation (activation) url.
|
||||
"""
|
||||
get_current_site(request)
|
||||
# Ensure the key is treated as a string for the type checker
|
||||
key = cast(str, getattr(emailconfirmation, "key", ""))
|
||||
return f"{settings.LOGIN_REDIRECT_URL}verify-email?key={key}"
|
||||
|
||||
def send_confirmation_mail(self, request: HttpRequest, emailconfirmation: EmailConfirmation, signup: bool) -> None:
|
||||
"""
|
||||
Sends the confirmation email.
|
||||
"""
|
||||
current_site = get_current_site(request)
|
||||
activate_url = self.get_email_confirmation_url(request, emailconfirmation)
|
||||
# Cast key to str for typing consistency and template context
|
||||
key = cast(str, getattr(emailconfirmation, "key", ""))
|
||||
|
||||
# Determine template early
|
||||
if signup:
|
||||
email_template = "account/email/email_confirmation_signup"
|
||||
else:
|
||||
email_template = "account/email/email_confirmation"
|
||||
|
||||
# Cast the possibly-unknown email_address to EmailAddress so the type checker knows its attributes
|
||||
email_address = cast(EmailAddress, getattr(emailconfirmation, "email_address", None))
|
||||
|
||||
# Safely obtain email string (fallback to any top-level email on confirmation)
|
||||
email_str = cast(str, getattr(email_address, "email", getattr(emailconfirmation, "email", "")))
|
||||
|
||||
# Safely obtain the user object, cast to the project's User model for typing
|
||||
user_obj = cast("AbstractUser", getattr(email_address, "user", None))
|
||||
|
||||
# Explicitly type the context to avoid partial-unknown typing issues
|
||||
ctx: Dict[str, Any] = {
|
||||
"user": user_obj,
|
||||
"activate_url": activate_url,
|
||||
"current_site": current_site,
|
||||
"key": key,
|
||||
}
|
||||
# Remove unnecessary cast; ctx is already Dict[str, Any]
|
||||
self.send_mail(email_template, email_str, ctx) # type: ignore
|
||||
|
||||
|
||||
class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
|
||||
def is_open_for_signup(self, request: HttpRequest, sociallogin: SocialLogin) -> Literal[True]:
|
||||
"""
|
||||
Whether to allow social account sign ups.
|
||||
"""
|
||||
return True
|
||||
|
||||
def populate_user(
|
||||
self, request: HttpRequest, sociallogin: SocialLogin, data: Dict[str, Any]
|
||||
) -> "AbstractUser": # type: ignore[override]
|
||||
"""
|
||||
Hook that can be used to further populate the user instance.
|
||||
"""
|
||||
user = super().populate_user(request, sociallogin, data) # type: ignore
|
||||
if getattr(sociallogin.account, "provider", None) == "discord": # type: ignore
|
||||
user.discord_id = getattr(sociallogin.account, "uid", None) # type: ignore
|
||||
return cast("AbstractUser", user) # Ensure return type is explicit
|
||||
|
||||
def save_user(
|
||||
self, request: HttpRequest, sociallogin: SocialLogin, form: Optional[Any] = None
|
||||
) -> "AbstractUser": # type: ignore[override]
|
||||
"""
|
||||
Save the newly signed up social login.
|
||||
"""
|
||||
user = super().save_user(request, sociallogin, form) # type: ignore
|
||||
if user is None:
|
||||
raise ValueError("User creation failed")
|
||||
return cast("AbstractUser", user) # Ensure return type is explicit
|
||||
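These adapters only take effect once django-allauth is pointed at them. A minimal settings sketch follows; the dotted module path assumes the file lives at `accounts/adapters.py`, which is not shown in this diff.

```python
# Settings sketch; the module path is an assumption.
ACCOUNT_ADAPTER = "accounts.adapters.CustomAccountAdapter"
SOCIALACCOUNT_ADAPTER = "accounts.adapters.CustomSocialAccountAdapter"

# get_email_confirmation_url builds the link from LOGIN_REDIRECT_URL,
# so it should point at the frontend (value here is illustrative).
LOGIN_REDIRECT_URL = "http://localhost:3000/"
```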
@@ -1,369 +0,0 @@
|
||||
from typing import Any
|
||||
from django.contrib import admin
|
||||
from django.contrib.auth.admin import UserAdmin as DjangoUserAdmin
|
||||
from django.utils.html import format_html
|
||||
from django.contrib.auth.models import Group
|
||||
from django.http import HttpRequest
|
||||
from django.db.models import QuerySet
|
||||
from .models import (
|
||||
User,
|
||||
UserProfile,
|
||||
EmailVerification,
|
||||
PasswordReset,
|
||||
TopList,
|
||||
TopListItem,
|
||||
)
|
||||
|
||||
|
||||
class UserProfileInline(admin.StackedInline[UserProfile, admin.options.AdminSite]):
|
||||
model = UserProfile
|
||||
can_delete = False
|
||||
verbose_name_plural = "Profile"
|
||||
fieldsets = (
|
||||
(
|
||||
"Personal Info",
|
||||
{"fields": ("display_name", "avatar", "pronouns", "bio")},
|
||||
),
|
||||
(
|
||||
"Social Media",
|
||||
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||
),
|
||||
(
|
||||
"Ride Credits",
|
||||
{
|
||||
"fields": (
|
||||
"coaster_credits",
|
||||
"dark_ride_credits",
|
||||
"flat_ride_credits",
|
||||
"water_ride_credits",
|
||||
)
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
class TopListItemInline(admin.TabularInline[TopListItem]):
|
||||
model = TopListItem
|
||||
extra = 1
|
||||
fields = ("content_type", "object_id", "rank", "notes")
|
||||
ordering = ("rank",)
|
||||
|
||||
|
||||
@admin.register(User)
|
||||
class CustomUserAdmin(DjangoUserAdmin[User]):
|
||||
list_display = (
|
||||
"username",
|
||||
"email",
|
||||
"get_avatar",
|
||||
"get_status",
|
||||
"role",
|
||||
"date_joined",
|
||||
"last_login",
|
||||
"get_credits",
|
||||
)
|
||||
list_filter = (
|
||||
"is_active",
|
||||
"is_staff",
|
||||
"role",
|
||||
"is_banned",
|
||||
"groups",
|
||||
"date_joined",
|
||||
)
|
||||
search_fields = ("username", "email")
|
||||
ordering = ("-date_joined",)
|
||||
actions = [
|
||||
"activate_users",
|
||||
"deactivate_users",
|
||||
"ban_users",
|
||||
"unban_users",
|
||||
]
|
||||
inlines: list[type[admin.StackedInline[UserProfile]]] = [UserProfileInline]
|
||||
|
||||
fieldsets = (
|
||||
(None, {"fields": ("username", "password")}),
|
||||
("Personal info", {"fields": ("email", "pending_email")}),
|
||||
(
|
||||
"Roles and Permissions",
|
||||
{
|
||||
"fields": ("role", "groups", "user_permissions"),
|
||||
"description": (
|
||||
"Role determines group membership. Groups determine permissions."
|
||||
),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Status",
|
||||
{
|
||||
"fields": ("is_active", "is_staff", "is_superuser"),
|
||||
"description": "These are automatically managed based on role.",
|
||||
},
|
||||
),
|
||||
(
|
||||
"Ban Status",
|
||||
{
|
||||
"fields": ("is_banned", "ban_reason", "ban_date"),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Preferences",
|
||||
{
|
||||
"fields": ("theme_preference",),
|
||||
},
|
||||
),
|
||||
("Important dates", {"fields": ("last_login", "date_joined")}),
|
||||
)
|
||||
add_fieldsets = (
|
||||
(
|
||||
None,
|
||||
{
|
||||
"classes": ("wide",),
|
||||
"fields": (
|
||||
"username",
|
||||
"email",
|
||||
"password1",
|
||||
"password2",
|
||||
"role",
|
||||
),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Avatar")
|
||||
def get_avatar(self, obj: User) -> str:
|
||||
profile = getattr(obj, "profile", None)
|
||||
if profile and getattr(profile, "avatar", None):
|
||||
return format_html(
|
||||
'<img src="{0}" width="30" height="30" style="border-radius:50%;" />',
|
||||
getattr(profile.avatar, "url", ""), # type: ignore
|
||||
)
|
||||
return format_html(
|
||||
'<div style="width:30px; height:30px; border-radius:50%; '
|
||||
"background-color:#007bff; color:white; display:flex; "
|
||||
'align-items:center; justify-content:center;">{0}</div>',
|
||||
getattr(obj, "username", "?")[0].upper(), # type: ignore
|
||||
)
|
||||
|
||||
@admin.display(description="Status")
|
||||
def get_status(self, obj: User) -> str:
|
||||
if getattr(obj, "is_banned", False):
|
||||
return format_html('<span style="color: red;">{}</span>', "Banned")
|
||||
if not getattr(obj, "is_active", True):
|
||||
return format_html('<span style="color: orange;">{}</span>', "Inactive")
|
||||
if getattr(obj, "is_superuser", False):
|
||||
return format_html('<span style="color: purple;">{}</span>', "Superuser")
|
||||
if getattr(obj, "is_staff", False):
|
||||
return format_html('<span style="color: blue;">{}</span>', "Staff")
|
||||
return format_html('<span style="color: green;">{}</span>', "Active")
|
||||
|
||||
@admin.display(description="Ride Credits")
|
||||
def get_credits(self, obj: User) -> str:
|
||||
try:
|
||||
profile = getattr(obj, "profile", None)
|
||||
if not profile:
|
||||
return "-"
|
||||
return format_html(
|
||||
"RC: {0}<br>DR: {1}<br>FR: {2}<br>WR: {3}",
|
||||
getattr(profile, "coaster_credits", 0),
|
||||
getattr(profile, "dark_ride_credits", 0),
|
||||
getattr(profile, "flat_ride_credits", 0),
|
||||
getattr(profile, "water_ride_credits", 0),
|
||||
)
|
||||
except UserProfile.DoesNotExist:
|
||||
return "-"
|
||||
|
||||
@admin.action(description="Activate selected users")
|
||||
def activate_users(self, request: HttpRequest, queryset: QuerySet[User]) -> None:
|
||||
queryset.update(is_active=True)
|
||||
|
||||
@admin.action(description="Deactivate selected users")
|
||||
def deactivate_users(self, request: HttpRequest, queryset: QuerySet[User]) -> None:
|
||||
queryset.update(is_active=False)
|
||||
|
||||
@admin.action(description="Ban selected users")
|
||||
def ban_users(self, request: HttpRequest, queryset: QuerySet[User]) -> None:
|
||||
from django.utils import timezone
|
||||
queryset.update(is_banned=True, ban_date=timezone.now())
|
||||
|
||||
@admin.action(description="Unban selected users")
|
||||
def unban_users(self, request: HttpRequest, queryset: QuerySet[User]) -> None:
|
||||
queryset.update(is_banned=False, ban_date=None, ban_reason="")
|
||||
|
||||
def save_model(
|
||||
self,
|
||||
request: HttpRequest,
|
||||
obj: User,
|
||||
form: Any,
|
||||
change: bool
|
||||
) -> None:
|
||||
creating = not obj.pk
|
||||
super().save_model(request, obj, form, change)
|
||||
if creating and getattr(obj, "role", "USER") != "USER":
|
||||
group = Group.objects.filter(name=getattr(obj, "role", None)).first()
|
||||
if group:
|
||||
obj.groups.add(group) # type: ignore[attr-defined]
|
||||
|
||||
|
||||
@admin.register(UserProfile)
|
||||
class UserProfileAdmin(admin.ModelAdmin[UserProfile]):
|
||||
list_display = (
|
||||
"user",
|
||||
"display_name",
|
||||
"coaster_credits",
|
||||
"dark_ride_credits",
|
||||
"flat_ride_credits",
|
||||
"water_ride_credits",
|
||||
)
|
||||
list_filter = (
|
||||
"coaster_credits",
|
||||
"dark_ride_credits",
|
||||
"flat_ride_credits",
|
||||
"water_ride_credits",
|
||||
)
|
||||
search_fields = ("user__username", "user__email", "display_name", "bio")
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"User Information",
|
||||
{"fields": ("user", "display_name", "avatar", "pronouns", "bio")},
|
||||
),
|
||||
(
|
||||
"Social Media",
|
||||
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||
),
|
||||
(
|
||||
"Ride Credits",
|
||||
{
|
||||
"fields": (
|
||||
"coaster_credits",
|
||||
"dark_ride_credits",
|
||||
"flat_ride_credits",
|
||||
"water_ride_credits",
|
||||
)
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
@admin.register(EmailVerification)
|
||||
class EmailVerificationAdmin(admin.ModelAdmin[EmailVerification]):
|
||||
list_display = ("user", "created_at", "last_sent", "is_expired")
|
||||
list_filter = ("created_at", "last_sent")
|
||||
search_fields = ("user__username", "user__email", "token")
|
||||
readonly_fields = ("created_at", "last_sent")
|
||||
|
||||
fieldsets = (
|
||||
("Verification Details", {"fields": ("user", "token")}),
|
||||
("Timing", {"fields": ("created_at", "last_sent")}),
|
||||
)
|
||||
|
||||
@admin.display(description="Status")
|
||||
def is_expired(self, obj: EmailVerification) -> str:
|
||||
from django.utils import timezone
|
||||
from datetime import timedelta
|
||||
|
||||
if timezone.now() - getattr(obj, "last_sent", timezone.now()) > timedelta(days=1):
|
||||
return format_html('<span style="color: red;">{}</span>', "Expired")
|
||||
return format_html('<span style="color: green;">{}</span>', "Valid")
|
||||
|
||||
|
||||
@admin.register(TopList)
|
||||
class TopListAdmin(admin.ModelAdmin[TopList]):
|
||||
list_display = ("title", "user", "category", "created_at", "updated_at")
|
||||
list_filter = ("category", "created_at", "updated_at")
|
||||
search_fields = ("title", "user__username", "description")
|
||||
inlines: list[type[admin.TabularInline[TopListItem]]] = [TopListItemInline]
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Basic Information",
|
||||
{"fields": ("user", "title", "category", "description")},
|
||||
),
|
||||
(
|
||||
"Timestamps",
|
||||
{"fields": ("created_at", "updated_at"), "classes": ("collapse",)},
|
||||
),
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
|
||||
|
||||
@admin.register(TopListItem)
|
||||
class TopListItemAdmin(admin.ModelAdmin[TopListItem]):
|
||||
list_display = ("top_list", "content_type", "object_id", "rank")
|
||||
list_filter = ("top_list__category", "rank")
|
||||
search_fields = ("top_list__title", "notes")
|
||||
ordering = ("top_list", "rank")
|
||||
|
||||
fieldsets = (
|
||||
("List Information", {"fields": ("top_list", "rank")}),
|
||||
("Item Details", {"fields": ("content_type", "object_id", "notes")}),
|
||||
)
|
||||
|
||||
|
||||
@admin.register(PasswordReset)
|
||||
class PasswordResetAdmin(admin.ModelAdmin[PasswordReset]):
|
||||
"""Admin interface for password reset tokens"""
|
||||
|
||||
list_display = (
|
||||
"user",
|
||||
"created_at",
|
||||
"expires_at",
|
||||
"is_expired",
|
||||
"used",
|
||||
)
|
||||
list_filter = (
|
||||
"used",
|
||||
"created_at",
|
||||
"expires_at",
|
||||
)
|
||||
search_fields = (
|
||||
"user__username",
|
||||
"user__email",
|
||||
"token",
|
||||
)
|
||||
readonly_fields = (
|
||||
"token",
|
||||
"created_at",
|
||||
"expires_at",
|
||||
)
|
||||
date_hierarchy = "created_at"
|
||||
ordering = ("-created_at",)
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Reset Details",
|
||||
{
|
||||
"fields": (
|
||||
"user",
|
||||
"token",
|
||||
"used",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Timing",
|
||||
{
|
||||
"fields": (
|
||||
"created_at",
|
||||
"expires_at",
|
||||
)
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Status", boolean=True)
|
||||
def is_expired(self, obj: PasswordReset) -> str:
|
||||
from django.utils import timezone
|
||||
|
||||
if getattr(obj, "used", False):
|
||||
return format_html('<span style="color: blue;">{}</span>', "Used")
|
||||
elif timezone.now() > getattr(obj, "expires_at", timezone.now()):
|
||||
return format_html('<span style="color: red;">{}</span>', "Expired")
|
||||
return format_html('<span style="color: green;">{}</span>', "Valid")
|
||||
|
||||
def has_add_permission(self, request: HttpRequest) -> bool:
|
||||
"""Disable manual creation of password reset tokens"""
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request: HttpRequest, obj: Any = None) -> bool:
|
||||
"""Allow viewing but restrict editing of password reset tokens"""
|
||||
return getattr(request.user, "is_superuser", False)
|
||||
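`save_model` above adds a freshly created non-USER account to the group matching its role. A short shell sketch of the same behavior, useful when creating users outside the admin; the import path and the `role` keyword are assumptions based on this file, not confirmed elsewhere in the diff.

```python
# Django shell sketch; assumes the custom User model lives in accounts.models
# and that create_user accepts role as an extra field.
from django.contrib.auth.models import Group
from accounts.models import User

user = User.objects.create_user("demo_mod", "mod@example.com", "s3cret", role="MODERATOR")

# What CustomUserAdmin.save_model does for a newly created non-USER account:
group = Group.objects.filter(name=user.role).first()
if group:
    user.groups.add(group)
```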
@@ -1,18 +0,0 @@
from django.core.management.base import BaseCommand
from django.db import connection


class Command(BaseCommand):
    help = "Fix migration history by removing rides.0001_initial"

    def handle(self, *args, **kwargs):
        with connection.cursor() as cursor:
            cursor.execute(
                "DELETE FROM django_migrations WHERE app='rides' "
                "AND name='0001_initial';"
            )
        self.stdout.write(
            self.style.SUCCESS(
                "Successfully removed rides.0001_initial from migration history"
            )
        )
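The command's file name is not visible in this view. Assuming it was saved as `accounts/management/commands/fix_migration_history.py` (an assumption, based only on its help text), it could be invoked like this:

```python
# Sketch: invoking the command programmatically.
# The command name "fix_migration_history" is an assumption; it must match
# the file name under management/commands/, which this diff does not show.
from django.core.management import call_command

call_command("fix_migration_history")
```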
@@ -1,108 +0,0 @@
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.db import connection
|
||||
from django.contrib.auth.hashers import make_password
|
||||
import uuid
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = "Reset database and create admin user"
|
||||
|
||||
def handle(self, *args, **options):
|
||||
self.stdout.write("Resetting database...")
|
||||
|
||||
# Drop all tables
|
||||
with connection.cursor() as cursor:
|
||||
cursor.execute(
|
||||
"""
|
||||
DO $$ DECLARE
|
||||
r RECORD;
|
||||
BEGIN
|
||||
FOR r IN (
|
||||
SELECT tablename FROM pg_tables
|
||||
WHERE schemaname = current_schema()
|
||||
) LOOP
|
||||
EXECUTE 'DROP TABLE IF EXISTS ' || \
|
||||
quote_ident(r.tablename) || ' CASCADE';
|
||||
END LOOP;
|
||||
END $$;
|
||||
"""
|
||||
)
|
||||
|
||||
# Reset sequences
|
||||
cursor.execute(
|
||||
"""
|
||||
DO $$ DECLARE
|
||||
r RECORD;
|
||||
BEGIN
|
||||
FOR r IN (
|
||||
SELECT sequencename FROM pg_sequences
|
||||
WHERE schemaname = current_schema()
|
||||
) LOOP
|
||||
EXECUTE 'ALTER SEQUENCE ' || \
|
||||
quote_ident(r.sequencename) || ' RESTART WITH 1';
|
||||
END LOOP;
|
||||
END $$;
|
||||
"""
|
||||
)
|
||||
|
||||
self.stdout.write("All tables dropped and sequences reset.")
|
||||
|
||||
# Run migrations
|
||||
from django.core.management import call_command
|
||||
|
||||
call_command("migrate")
|
||||
|
||||
self.stdout.write("Migrations applied.")
|
||||
|
||||
# Create superuser using raw SQL
|
||||
try:
|
||||
with connection.cursor() as cursor:
|
||||
# Create user
|
||||
user_id = str(uuid.uuid4())[:10]
|
||||
cursor.execute(
|
||||
"""
|
||||
INSERT INTO accounts_user (
|
||||
username, password, email, is_superuser, is_staff,
|
||||
is_active, date_joined, user_id, first_name,
|
||||
last_name, role, is_banned, ban_reason,
|
||||
theme_preference
|
||||
) VALUES (
|
||||
'admin', %s, 'admin@thrillwiki.com', true, true,
|
||||
true, NOW(), %s, '', '', 'SUPERUSER', false, '',
|
||||
'light'
|
||||
) RETURNING id;
|
||||
""",
|
||||
[make_password("admin"), user_id],
|
||||
)
|
||||
|
||||
result = cursor.fetchone()
|
||||
if result is None:
|
||||
raise Exception("Failed to create user - no ID returned")
|
||||
user_db_id = result[0]
|
||||
|
||||
# Create profile
|
||||
profile_id = str(uuid.uuid4())[:10]
|
||||
cursor.execute(
|
||||
"""
|
||||
INSERT INTO accounts_userprofile (
|
||||
profile_id, display_name, pronouns, bio,
|
||||
twitter, instagram, youtube, discord,
|
||||
coaster_credits, dark_ride_credits,
|
||||
flat_ride_credits, water_ride_credits,
|
||||
user_id, avatar
|
||||
) VALUES (
|
||||
%s, 'Admin', 'they/them', 'ThrillWiki Administrator',
|
||||
'', '', '', '',
|
||||
0, 0, 0, 0,
|
||||
%s, ''
|
||||
);
|
||||
""",
|
||||
[profile_id, user_db_id],
|
||||
)
|
||||
|
||||
self.stdout.write("Superuser created.")
|
||||
except Exception as e:
|
||||
self.stdout.write(self.style.ERROR(f"Error creating superuser: {str(e)}"))
|
||||
raise
|
||||
|
||||
self.stdout.write(self.style.SUCCESS("Database reset complete."))
|
||||
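Because this command drops every table in the current schema, it is worth guarding against accidental use outside development. A hedged sketch follows; the command name `reset_db` is an assumption (the file name is not shown), and the DEBUG guard is not part of the command itself.

```python
# Sketch: running the reset command only in a DEBUG environment.
# Assumption: the command above is registered as "reset_db".
from django.conf import settings
from django.core.management import call_command

if settings.DEBUG:
    call_command("reset_db")
else:
    raise RuntimeError("Refusing to drop all tables outside a DEBUG environment")
```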
File diff suppressed because it is too large
@@ -1,76 +0,0 @@
|
||||
# Generated by Django 5.2.6 on 2025-09-21 01:29
|
||||
|
||||
import django.db.models.deletion
|
||||
import pgtrigger.compiler
|
||||
import pgtrigger.migrations
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
dependencies = [
|
||||
("accounts", "0001_initial"),
|
||||
]
|
||||
|
||||
operations = [
|
||||
pgtrigger.migrations.RemoveTrigger(
|
||||
model_name="userprofile",
|
||||
name="insert_insert",
|
||||
),
|
||||
pgtrigger.migrations.RemoveTrigger(
|
||||
model_name="userprofile",
|
||||
name="update_update",
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name="userprofile",
|
||||
name="avatar",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
to="django_cloudflareimages_toolkit.cloudflareimage",
|
||||
),
|
||||
),
|
||||
migrations.AddField(
|
||||
model_name="userprofileevent",
|
||||
name="avatar",
|
||||
field=models.ForeignKey(
|
||||
blank=True,
|
||||
db_constraint=False,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="+",
|
||||
related_query_name="+",
|
||||
to="django_cloudflareimages_toolkit.cloudflareimage",
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="userprofile",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="insert_insert",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
func='INSERT INTO "accounts_userprofileevent" ("avatar_id", "bio", "coaster_credits", "dark_ride_credits", "discord", "display_name", "flat_ride_credits", "id", "instagram", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "profile_id", "pronouns", "twitter", "user_id", "water_ride_credits", "youtube") VALUES (NEW."avatar_id", NEW."bio", NEW."coaster_credits", NEW."dark_ride_credits", NEW."discord", NEW."display_name", NEW."flat_ride_credits", NEW."id", NEW."instagram", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."profile_id", NEW."pronouns", NEW."twitter", NEW."user_id", NEW."water_ride_credits", NEW."youtube"); RETURN NULL;',
|
||||
hash="a7ecdb1ac2821dea1fef4ec917eeaf6b8e4f09c8",
|
||||
operation="INSERT",
|
||||
pgid="pgtrigger_insert_insert_c09d7",
|
||||
table="accounts_userprofile",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="userprofile",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="update_update",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||
func='INSERT INTO "accounts_userprofileevent" ("avatar_id", "bio", "coaster_credits", "dark_ride_credits", "discord", "display_name", "flat_ride_credits", "id", "instagram", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "profile_id", "pronouns", "twitter", "user_id", "water_ride_credits", "youtube") VALUES (NEW."avatar_id", NEW."bio", NEW."coaster_credits", NEW."dark_ride_credits", NEW."discord", NEW."display_name", NEW."flat_ride_credits", NEW."id", NEW."instagram", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."profile_id", NEW."pronouns", NEW."twitter", NEW."user_id", NEW."water_ride_credits", NEW."youtube"); RETURN NULL;',
|
||||
hash="81607e492ffea2a4c741452b860ee660374cc01d",
|
||||
operation="UPDATE",
|
||||
pgid="pgtrigger_update_update_87ef6",
|
||||
table="accounts_userprofile",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
]
|
||||
@@ -1,35 +0,0 @@
import requests
from django.conf import settings
from django.core.exceptions import ValidationError


class TurnstileMixin:
    """
    Mixin to handle Cloudflare Turnstile validation.
    Bypasses validation when DEBUG is True.
    """

    def validate_turnstile(self, request):
        """
        Validate the Turnstile response token.
        Skips validation when DEBUG is True.
        """
        if settings.DEBUG:
            return

        token = request.POST.get("cf-turnstile-response")
        if not token:
            raise ValidationError("Please complete the Turnstile challenge.")

        # Verify the token with Cloudflare
        data = {
            "secret": settings.TURNSTILE_SECRET_KEY,
            "response": token,
            "remoteip": request.META.get("REMOTE_ADDR"),
        }

        response = requests.post(settings.TURNSTILE_VERIFY_URL, data=data, timeout=60)
        result = response.json()

        if not result.get("success"):
            raise ValidationError("Turnstile validation failed. Please try again.")
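A sketch of how this mixin might be used in a form view; the form class, template name, and mixin import path are made up for illustration and do not appear in this diff.

```python
# Illustrative only; ContactForm and contact.html are hypothetical.
from django import forms
from django.core.exceptions import ValidationError
from django.views.generic import FormView

from .mixins import TurnstileMixin  # assumption: the mixin lives in mixins.py


class ContactForm(forms.Form):
    message = forms.CharField(widget=forms.Textarea)


class ContactView(TurnstileMixin, FormView):
    template_name = "contact.html"
    form_class = ContactForm
    success_url = "/"

    def form_valid(self, form):
        try:
            # Reject the submission if the Turnstile token is missing or invalid.
            self.validate_turnstile(self.request)
        except ValidationError as exc:
            form.add_error(None, exc)
            return self.form_invalid(form)
        return super().form_valid(form)
```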
@@ -1,366 +0,0 @@
|
||||
"""
|
||||
User management services for ThrillWiki.
|
||||
|
||||
This module contains services for user account management including
|
||||
user deletion while preserving submissions.
|
||||
"""
|
||||
|
||||
from typing import Optional
|
||||
from django.db import transaction
|
||||
from django.utils import timezone
|
||||
from django.conf import settings
|
||||
from django.contrib.sites.models import Site
|
||||
from django_forwardemail.services import EmailService
|
||||
from .models import User, UserProfile, UserDeletionRequest
|
||||
|
||||
|
||||
class UserDeletionService:
|
||||
"""Service for handling user deletion while preserving submissions."""
|
||||
|
||||
DELETED_USER_USERNAME = "deleted_user"
|
||||
DELETED_USER_EMAIL = "deleted@thrillwiki.com"
|
||||
DELETED_DISPLAY_NAME = "Deleted User"
|
||||
|
||||
@classmethod
|
||||
def get_or_create_deleted_user(cls) -> User:
|
||||
"""Get or create the system deleted user placeholder."""
|
||||
deleted_user, created = User.objects.get_or_create(
|
||||
username=cls.DELETED_USER_USERNAME,
|
||||
defaults={
|
||||
"email": cls.DELETED_USER_EMAIL,
|
||||
"is_active": False,
|
||||
"is_staff": False,
|
||||
"is_superuser": False,
|
||||
"role": "USER",
|
||||
"is_banned": True,
|
||||
"ban_reason": "System placeholder for deleted users",
|
||||
"ban_date": timezone.now(),
|
||||
},
|
||||
)
|
||||
|
||||
if created:
|
||||
# Create profile for deleted user
|
||||
UserProfile.objects.create(
|
||||
user=deleted_user,
|
||||
display_name=cls.DELETED_DISPLAY_NAME,
|
||||
bio="This user account has been deleted.",
|
||||
)
|
||||
|
||||
return deleted_user
|
||||
|
||||
@classmethod
|
||||
@transaction.atomic
|
||||
def delete_user_preserve_submissions(cls, user: User) -> dict:
|
||||
"""
|
||||
Delete a user while preserving all their submissions.
|
||||
|
||||
This method:
|
||||
1. Transfers all user submissions to a system "deleted_user" placeholder
|
||||
2. Deletes the user's profile and account data
|
||||
3. Returns a summary of what was preserved
|
||||
|
||||
Args:
|
||||
user: The user to delete
|
||||
|
||||
Returns:
|
||||
dict: Summary of preserved submissions
|
||||
"""
|
||||
if user.username == cls.DELETED_USER_USERNAME:
|
||||
raise ValueError("Cannot delete the system deleted user placeholder")
|
||||
|
||||
deleted_user = cls.get_or_create_deleted_user()
|
||||
|
||||
# Count submissions before transfer
|
||||
submission_counts = {
|
||||
"park_reviews": getattr(
|
||||
user, "park_reviews", user.__class__.objects.none()
|
||||
).count(),
|
||||
"ride_reviews": getattr(
|
||||
user, "ride_reviews", user.__class__.objects.none()
|
||||
).count(),
|
||||
"uploaded_park_photos": getattr(
|
||||
user, "uploaded_park_photos", user.__class__.objects.none()
|
||||
).count(),
|
||||
"uploaded_ride_photos": getattr(
|
||||
user, "uploaded_ride_photos", user.__class__.objects.none()
|
||||
).count(),
|
||||
"top_lists": getattr(
|
||||
user, "top_lists", user.__class__.objects.none()
|
||||
).count(),
|
||||
"edit_submissions": getattr(
|
||||
user, "edit_submissions", user.__class__.objects.none()
|
||||
).count(),
|
||||
"photo_submissions": getattr(
|
||||
user, "photo_submissions", user.__class__.objects.none()
|
||||
).count(),
|
||||
"moderated_park_reviews": getattr(
|
||||
user, "moderated_park_reviews", user.__class__.objects.none()
|
||||
).count(),
|
||||
"moderated_ride_reviews": getattr(
|
||||
user, "moderated_ride_reviews", user.__class__.objects.none()
|
||||
).count(),
|
||||
"handled_submissions": getattr(
|
||||
user, "handled_submissions", user.__class__.objects.none()
|
||||
).count(),
|
||||
"handled_photos": getattr(
|
||||
user, "handled_photos", user.__class__.objects.none()
|
||||
).count(),
|
||||
}
|
||||
|
||||
# Transfer all submissions to deleted user
|
||||
# Reviews
|
||||
if hasattr(user, "park_reviews"):
|
||||
getattr(user, "park_reviews").update(user=deleted_user)
|
||||
if hasattr(user, "ride_reviews"):
|
||||
getattr(user, "ride_reviews").update(user=deleted_user)
|
||||
|
||||
# Photos
|
||||
if hasattr(user, "uploaded_park_photos"):
|
||||
getattr(user, "uploaded_park_photos").update(uploaded_by=deleted_user)
|
||||
if hasattr(user, "uploaded_ride_photos"):
|
||||
getattr(user, "uploaded_ride_photos").update(uploaded_by=deleted_user)
|
||||
|
||||
# Top Lists
|
||||
if hasattr(user, "top_lists"):
|
||||
getattr(user, "top_lists").update(user=deleted_user)
|
||||
|
||||
# Moderation submissions
|
||||
if hasattr(user, "edit_submissions"):
|
||||
getattr(user, "edit_submissions").update(user=deleted_user)
|
||||
if hasattr(user, "photo_submissions"):
|
||||
getattr(user, "photo_submissions").update(user=deleted_user)
|
||||
|
||||
# Moderation actions - these can be set to NULL since they're not user content
|
||||
if hasattr(user, "moderated_park_reviews"):
|
||||
getattr(user, "moderated_park_reviews").update(moderated_by=None)
|
||||
if hasattr(user, "moderated_ride_reviews"):
|
||||
getattr(user, "moderated_ride_reviews").update(moderated_by=None)
|
||||
if hasattr(user, "handled_submissions"):
|
||||
getattr(user, "handled_submissions").update(handled_by=None)
|
||||
if hasattr(user, "handled_photos"):
|
||||
getattr(user, "handled_photos").update(handled_by=None)
|
||||
|
||||
# Store user info for the summary
|
||||
user_info = {
|
||||
"username": user.username,
|
||||
"user_id": user.user_id,
|
||||
"email": user.email,
|
||||
"date_joined": user.date_joined,
|
||||
}
|
||||
|
||||
# Delete the user (this will cascade delete the profile)
|
||||
user.delete()
|
||||
|
||||
return {
|
||||
"deleted_user": user_info,
|
||||
"preserved_submissions": submission_counts,
|
||||
"transferred_to": {
|
||||
"username": deleted_user.username,
|
||||
"user_id": deleted_user.user_id,
|
||||
},
|
||||
}
|
||||
|
||||
@classmethod
|
||||
def can_delete_user(cls, user: User) -> tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Check if a user can be safely deleted.
|
||||
|
||||
Args:
|
||||
user: The user to check
|
||||
|
||||
Returns:
|
||||
tuple: (can_delete: bool, reason: Optional[str])
|
||||
"""
|
||||
if user.username == cls.DELETED_USER_USERNAME:
|
||||
return False, "Cannot delete the system deleted user placeholder"
|
||||
|
||||
if user.is_superuser:
|
||||
return False, "Superuser accounts cannot be deleted for security reasons. Please contact system administrator or remove superuser privileges first."
|
||||
|
||||
# Check if user has critical admin role
|
||||
if user.role == "ADMIN" and user.is_staff:
|
||||
return False, "Admin accounts with staff privileges cannot be deleted. Please remove admin privileges first or contact system administrator."
|
||||
|
||||
# Add any other business rules here
|
||||
|
||||
return True, None
|
||||
|
||||
@classmethod
|
||||
def request_user_deletion(cls, user: User) -> UserDeletionRequest:
|
||||
"""
|
||||
Create a user deletion request and send verification email.
|
||||
|
||||
Args:
|
||||
user: The user requesting deletion
|
||||
|
||||
Returns:
|
||||
UserDeletionRequest: The created deletion request
|
||||
"""
|
||||
# Check if user can be deleted
|
||||
can_delete, reason = cls.can_delete_user(user)
|
||||
if not can_delete:
|
||||
raise ValueError(f"Cannot delete user: {reason}")
|
||||
|
||||
# Remove any existing deletion request for this user
|
||||
UserDeletionRequest.objects.filter(user=user).delete()
|
||||
|
||||
# Create new deletion request
|
||||
deletion_request = UserDeletionRequest.objects.create(user=user)
|
||||
|
||||
# Send verification email
|
||||
cls.send_deletion_verification_email(deletion_request)
|
||||
|
||||
return deletion_request
|
||||
|
||||
@classmethod
|
||||
def send_deletion_verification_email(cls, deletion_request: UserDeletionRequest):
|
||||
"""
|
||||
Send verification email for account deletion.
|
||||
|
||||
Args:
|
||||
deletion_request: The deletion request to send email for
|
||||
"""
|
||||
user = deletion_request.user
|
||||
|
||||
# Get current site for email service
|
||||
try:
|
||||
site = Site.objects.get_current()
|
||||
except Site.DoesNotExist:
|
||||
# Fallback to default site
|
||||
site = Site.objects.get_or_create(
|
||||
id=1, defaults={"domain": "localhost:8000", "name": "localhost:8000"}
|
||||
)[0]
|
||||
|
||||
# Prepare email context
|
||||
context = {
|
||||
"user": user,
|
||||
"verification_code": deletion_request.verification_code,
|
||||
"expires_at": deletion_request.expires_at,
|
||||
"site_name": getattr(settings, "SITE_NAME", "ThrillWiki"),
|
||||
"frontend_domain": getattr(
|
||||
settings, "FRONTEND_DOMAIN", "http://localhost:3000"
|
||||
),
|
||||
}
|
||||
|
||||
# Render email content
|
||||
subject = f"Confirm Account Deletion - {context['site_name']}"
|
||||
|
||||
# Create email message with 1-hour expiration notice
|
||||
message = f"""
|
||||
Hello {user.get_display_name()},
|
||||
|
||||
You have requested to delete your ThrillWiki account. To confirm this action, please use the following verification code:
|
||||
|
||||
Verification Code: {deletion_request.verification_code}
|
||||
|
||||
This code will expire in 1 hour on {deletion_request.expires_at.strftime('%B %d, %Y at %I:%M %p UTC')}.
|
||||
|
||||
IMPORTANT: This action cannot be undone. Your account will be permanently deleted, but all your reviews, photos, and other contributions will be preserved on the site.
|
||||
|
||||
If you did not request this deletion, please ignore this email and your account will remain active.
|
||||
|
||||
To complete the deletion, enter the verification code in the account deletion form on our website.
|
||||
|
||||
Best regards,
|
||||
The ThrillWiki Team
|
||||
""".strip()
|
||||
|
||||
# Send email using custom email service
|
||||
try:
|
||||
EmailService.send_email(
|
||||
to=user.email,
|
||||
subject=subject,
|
||||
text=message,
|
||||
site=site,
|
||||
from_email="no-reply@thrillwiki.com",
|
||||
)
|
||||
|
||||
# Update email sent timestamp
|
||||
deletion_request.email_sent_at = timezone.now()
|
||||
deletion_request.save(update_fields=["email_sent_at"])
|
||||
|
||||
except Exception as e:
|
||||
# Log the error but don't fail the request creation
|
||||
print(f"Failed to send deletion verification email to {user.email}: {e}")
|
||||
|
||||
@classmethod
|
||||
@transaction.atomic
|
||||
def verify_and_delete_user(cls, verification_code: str) -> dict:
|
||||
"""
|
||||
Verify deletion code and delete the user account.
|
||||
|
||||
Args:
|
||||
verification_code: The verification code from the email
|
||||
|
||||
Returns:
|
||||
dict: Summary of the deletion
|
||||
|
||||
Raises:
|
||||
ValueError: If verification fails
|
||||
"""
|
||||
try:
|
||||
deletion_request = UserDeletionRequest.objects.get(
|
||||
verification_code=verification_code
|
||||
)
|
||||
except UserDeletionRequest.DoesNotExist:
|
||||
raise ValueError("Invalid verification code")
|
||||
|
||||
# Check if request is still valid
|
||||
if not deletion_request.is_valid():
|
||||
if deletion_request.is_expired():
|
||||
raise ValueError("Verification code has expired")
|
||||
elif deletion_request.is_used:
|
||||
raise ValueError("Verification code has already been used")
|
||||
elif deletion_request.attempts >= deletion_request.max_attempts:
|
||||
raise ValueError("Too many verification attempts")
|
||||
else:
|
||||
raise ValueError("Invalid verification code")
|
||||
|
||||
# Increment attempts
|
||||
deletion_request.increment_attempts()
|
||||
|
||||
# Mark as used
|
||||
deletion_request.mark_as_used()
|
||||
|
||||
# Delete the user
|
||||
user = deletion_request.user
|
||||
result = cls.delete_user_preserve_submissions(user)
|
||||
|
||||
# Add deletion request info to result
|
||||
result["deletion_request"] = {
|
||||
"verification_code": verification_code,
|
||||
"created_at": deletion_request.created_at,
|
||||
"verified_at": timezone.now(),
|
||||
}
|
||||
|
||||
return result
|
||||
|
||||
@classmethod
|
||||
def cancel_deletion_request(cls, user: User) -> bool:
|
||||
"""
|
||||
Cancel a pending deletion request.
|
||||
|
||||
Args:
|
||||
user: The user whose deletion request to cancel
|
||||
|
||||
Returns:
|
||||
bool: True if a request was cancelled, False if no request existed
|
||||
"""
|
||||
try:
|
||||
deletion_request = getattr(user, "deletion_request", None)
|
||||
if deletion_request:
|
||||
deletion_request.delete()
|
||||
return True
|
||||
return False
|
||||
except UserDeletionRequest.DoesNotExist:
|
||||
return False
|
||||
|
||||
@classmethod
|
||||
def cleanup_expired_deletion_requests(cls) -> int:
|
||||
"""
|
||||
Clean up expired deletion requests.
|
||||
|
||||
Returns:
|
||||
int: Number of expired requests cleaned up
|
||||
"""
|
||||
return UserDeletionRequest.cleanup_expired()
|
||||
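The service above implements a two-step flow: a deletion request that emails a verification code, followed by verification and the actual delete. A sketch of driving it from application code; the `accounts.services` module path is an assumption.

```python
# Sketch of the two-step deletion flow; accounts.services is an assumed path.
from accounts.models import User
from accounts.services import UserDeletionService

user = User.objects.get(username="someone")

# Step 1: create the request; this also emails the verification code.
deletion_request = UserDeletionService.request_user_deletion(user)

# Step 2: later, with the code from the email, confirm and delete.
summary = UserDeletionService.verify_and_delete_user(deletion_request.verification_code)
print(summary["preserved_submissions"])
```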
@@ -1,169 +0,0 @@
|
||||
from django.db.models.signals import post_save, pre_save
|
||||
from django.dispatch import receiver
|
||||
from django.contrib.auth.models import Group
|
||||
from django.db import transaction
|
||||
from django.core.files import File
|
||||
from django.core.files.temp import NamedTemporaryFile
|
||||
import requests
|
||||
from .models import User, UserProfile
|
||||
|
||||
|
||||
@receiver(post_save, sender=User)
|
||||
def create_user_profile(sender, instance, created, **kwargs):
|
||||
"""Create UserProfile for new users - unified signal handler"""
|
||||
if created:
|
||||
try:
|
||||
# Use get_or_create to prevent duplicates
|
||||
profile, profile_created = UserProfile.objects.get_or_create(user=instance)
|
||||
|
||||
if profile_created:
|
||||
# If user has a social account with avatar, download it
|
||||
try:
|
||||
social_account = instance.socialaccount_set.first()
|
||||
if social_account:
|
||||
extra_data = social_account.extra_data
|
||||
avatar_url = None
|
||||
|
||||
if social_account.provider == "google":
|
||||
avatar_url = extra_data.get("picture")
|
||||
elif social_account.provider == "discord":
|
||||
avatar = extra_data.get("avatar")
|
||||
discord_id = extra_data.get("id")
|
||||
if avatar:
|
||||
avatar_url = f"https://cdn.discordapp.com/avatars/{discord_id}/{avatar}.png"
|
||||
|
||||
if avatar_url:
|
||||
response = requests.get(avatar_url, timeout=60)
|
||||
if response.status_code == 200:
|
||||
img_temp = NamedTemporaryFile(delete=True)
|
||||
img_temp.write(response.content)
|
||||
img_temp.flush()
|
||||
|
||||
file_name = f"avatar_{instance.username}.png"
|
||||
profile.avatar.save(file_name, File(img_temp), save=True)
|
||||
except Exception as e:
|
||||
print(f"Error downloading avatar for user {instance.username}: {str(e)}")
|
||||
except Exception as e:
|
||||
print(f"Error creating profile for user {instance.username}: {str(e)}")
|
||||
|
||||
|
||||
@receiver(pre_save, sender=User)
|
||||
def sync_user_role_with_groups(sender, instance, **kwargs):
|
||||
"""Sync user role with Django groups"""
|
||||
if instance.pk: # Only for existing users
|
||||
try:
|
||||
old_instance = User.objects.get(pk=instance.pk)
|
||||
if old_instance.role != instance.role:
|
||||
# Role has changed, update groups
|
||||
with transaction.atomic():
|
||||
# Remove from old role group if exists
|
||||
if old_instance.role != "USER":
|
||||
old_group = Group.objects.filter(name=old_instance.role).first()
|
||||
if old_group:
|
||||
instance.groups.remove(old_group)
|
||||
|
||||
# Add to new role group
|
||||
if instance.role != "USER":
|
||||
new_group, _ = Group.objects.get_or_create(name=instance.role)
|
||||
instance.groups.add(new_group)
|
||||
|
||||
# Special handling for superuser role
|
||||
if instance.role == "SUPERUSER":
|
||||
instance.is_superuser = True
|
||||
instance.is_staff = True
|
||||
elif old_instance.role == "SUPERUSER":
|
||||
# If removing superuser role, remove superuser
|
||||
# status
|
||||
instance.is_superuser = False
|
||||
if instance.role not in [
|
||||
"ADMIN",
|
||||
"MODERATOR",
|
||||
]:
|
||||
instance.is_staff = False
|
||||
|
||||
# Handle staff status for admin and moderator roles
|
||||
if instance.role in [
|
||||
"ADMIN",
|
||||
"MODERATOR",
|
||||
]:
|
||||
instance.is_staff = True
|
||||
elif old_instance.role in [
|
||||
"ADMIN",
|
||||
"MODERATOR",
|
||||
]:
|
||||
# If removing admin/moderator role, remove staff
|
||||
# status
|
||||
if instance.role not in ["SUPERUSER"]:
|
||||
instance.is_staff = False
|
||||
except User.DoesNotExist:
|
||||
pass
|
||||
except Exception as e:
|
||||
print(
|
||||
f"Error syncing role with groups for user {instance.username}: {str(e)}"
|
||||
)
|
||||
|
||||
|
||||
def create_default_groups():
|
||||
"""
|
||||
Create default groups with appropriate permissions.
|
||||
Call this in a migration or management command.
|
||||
"""
|
||||
try:
|
||||
from django.contrib.auth.models import Permission
|
||||
|
||||
# Create Moderator group
|
||||
moderator_group, _ = Group.objects.get_or_create(name="MODERATOR")
|
||||
moderator_permissions = [
|
||||
# Review moderation permissions
|
||||
"change_review",
|
||||
"delete_review",
|
||||
"change_reviewreport",
|
||||
"delete_reviewreport",
|
||||
# Edit moderation permissions
|
||||
"change_parkedit",
|
||||
"delete_parkedit",
|
||||
"change_rideedit",
|
||||
"delete_rideedit",
|
||||
"change_companyedit",
|
||||
"delete_companyedit",
|
||||
"change_manufactureredit",
|
||||
"delete_manufactureredit",
|
||||
]
|
||||
|
||||
# Create Admin group
|
||||
admin_group, _ = Group.objects.get_or_create(name="ADMIN")
|
||||
admin_permissions = moderator_permissions + [
|
||||
# User management permissions
|
||||
"change_user",
|
||||
"delete_user",
|
||||
# Content management permissions
|
||||
"add_park",
|
||||
"change_park",
|
||||
"delete_park",
|
||||
"add_ride",
|
||||
"change_ride",
|
||||
"delete_ride",
|
||||
"add_company",
|
||||
"change_company",
|
||||
"delete_company",
|
||||
"add_manufacturer",
|
||||
"change_manufacturer",
|
||||
"delete_manufacturer",
|
||||
]
|
||||
|
||||
# Assign permissions to groups
|
||||
for codename in moderator_permissions:
|
||||
try:
|
||||
perm = Permission.objects.get(codename=codename)
|
||||
moderator_group.permissions.add(perm)
|
||||
except Permission.DoesNotExist:
|
||||
print(f"Permission not found: {codename}")
|
||||
|
||||
for codename in admin_permissions:
|
||||
try:
|
||||
perm = Permission.objects.get(codename=codename)
|
||||
admin_group.permissions.add(perm)
|
||||
except Permission.DoesNotExist:
|
||||
print(f"Permission not found: {codename}")
|
||||
except Exception as e:
|
||||
print(f"Error creating default groups: {str(e)}")
|
||||
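`create_default_groups` is documented above as something to call from a migration or management command. A sketch of a data migration that does so; the app label, dependency, and import path are assumptions.

```python
# Sketch of a data migration calling create_default_groups; paths are assumptions.
from django.db import migrations


def forwards(apps, schema_editor):
    from accounts.signals import create_default_groups

    create_default_groups()


class Migration(migrations.Migration):
    dependencies = [("accounts", "0001_initial")]
    operations = [migrations.RunPython(forwards, migrations.RunPython.noop)]
```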
@@ -1,30 +0,0 @@
from django.contrib import admin
from django.utils.html import format_html
from .models import SlugHistory


@admin.register(SlugHistory)
class SlugHistoryAdmin(admin.ModelAdmin):
    list_display = ["content_object_link", "old_slug", "created_at"]
    list_filter = ["content_type", "created_at"]
    search_fields = ["old_slug", "object_id"]
    readonly_fields = ["content_type", "object_id", "old_slug", "created_at"]
    date_hierarchy = "created_at"
    ordering = ["-created_at"]

    @admin.display(description="Object")
    def content_object_link(self, obj):
        """Create a link to the related object's admin page"""
        try:
            url = obj.content_object.get_absolute_url()
            return format_html('<a href="{}">{}</a>', url, str(obj.content_object))
        except (AttributeError, ValueError):
            return str(obj.content_object)

    def has_add_permission(self, request):
        """Disable manual creation of slug history records"""
        return False

    def has_change_permission(self, request, obj=None):
        """Disable editing of slug history records"""
        return False
File diff suppressed because it is too large
@@ -1,138 +0,0 @@
|
||||
"""
|
||||
Request logging middleware for comprehensive request/response logging.
|
||||
Logs all HTTP requests with detailed data for debugging and monitoring.
|
||||
"""
|
||||
|
||||
import logging
|
||||
import time
|
||||
import json
|
||||
from django.utils.deprecation import MiddlewareMixin
|
||||
|
||||
logger = logging.getLogger('request_logging')
|
||||
|
||||
|
||||
class RequestLoggingMiddleware(MiddlewareMixin):
|
||||
"""
|
||||
Middleware to log all HTTP requests with method, path, and response code.
|
||||
Includes detailed request/response data logging for all requests.
|
||||
"""
|
||||
|
||||
# Paths to exclude from detailed logging (e.g., static files, health checks)
|
||||
EXCLUDE_DETAILED_LOGGING_PATHS = [
|
||||
'/static/',
|
||||
'/media/',
|
||||
'/favicon.ico',
|
||||
'/health/',
|
||||
'/admin/jsi18n/',
|
||||
]
|
||||
|
||||
def _should_log_detailed(self, request):
|
||||
"""Determine if detailed logging should be enabled for this request."""
|
||||
return not any(
|
||||
path in request.path for path in self.EXCLUDE_DETAILED_LOGGING_PATHS)
|
||||
|
||||
def process_request(self, request):
|
||||
"""Store request start time and capture request data for detailed logging."""
|
||||
request._start_time = time.time()
|
||||
|
||||
# Enable detailed logging for all requests except excluded paths
|
||||
should_log_detailed = self._should_log_detailed(request)
|
||||
request._log_request_data = should_log_detailed
|
||||
|
||||
if should_log_detailed:
|
||||
try:
|
||||
# Log request data
|
||||
request_data = {}
|
||||
if hasattr(request, 'data') and request.data:
|
||||
request_data = dict(request.data)
|
||||
elif request.body:
|
||||
try:
|
||||
request_data = json.loads(request.body.decode('utf-8'))
|
||||
except (json.JSONDecodeError, UnicodeDecodeError):
|
||||
request_data = {'body': str(request.body)[
|
||||
:200] + '...' if len(str(request.body)) > 200 else str(request.body)}
|
||||
|
||||
# Log query parameters
|
||||
query_params = dict(request.GET) if request.GET else {}
|
||||
|
||||
logger.info(f"REQUEST DATA for {request.method} {request.path}:")
|
||||
if request_data:
|
||||
logger.info(f" Body: {self._safe_log_data(request_data)}")
|
||||
if query_params:
|
||||
logger.info(f" Query: {query_params}")
|
||||
if hasattr(request, 'user') and request.user.is_authenticated:
|
||||
logger.info(
|
||||
f" User: {request.user.username} (ID: {request.user.id})")
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to log request data: {e}")
|
||||
|
||||
return None
|
||||
|
||||
def process_response(self, request, response):
|
||||
"""Log request details after response is generated."""
|
||||
try:
|
||||
# Calculate request duration
|
||||
duration = 0
|
||||
if hasattr(request, '_start_time'):
|
||||
duration = time.time() - request._start_time
|
||||
|
||||
# Basic request logging
|
||||
logger.info(
|
||||
f"{request.method} {request.get_full_path()} -> {response.status_code} "
|
||||
f"({duration:.3f}s)"
|
||||
)
|
||||
|
||||
# Detailed response logging for specific endpoints
|
||||
if getattr(request, '_log_request_data', False):
|
||||
try:
|
||||
# Log response data
|
||||
if hasattr(response, 'data'):
|
||||
logger.info(
|
||||
f"RESPONSE DATA for {request.method} {request.path}:")
|
||||
logger.info(f" Status: {response.status_code}")
|
||||
logger.info(f" Data: {self._safe_log_data(response.data)}")
|
||||
elif hasattr(response, 'content'):
|
||||
try:
|
||||
content = json.loads(response.content.decode('utf-8'))
|
||||
logger.info(
|
||||
f"RESPONSE DATA for {request.method} {request.path}:")
|
||||
logger.info(f" Status: {response.status_code}")
|
||||
logger.info(f" Content: {self._safe_log_data(content)}")
|
||||
except (json.JSONDecodeError, UnicodeDecodeError):
|
||||
logger.info(
|
||||
f"RESPONSE DATA for {request.method} {request.path}:")
|
||||
logger.info(f" Status: {response.status_code}")
|
||||
logger.info(f" Content: {str(response.content)[:200]}...")
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to log response data: {e}")
|
||||
|
||||
except Exception:
|
||||
# Don't let logging errors break the request
|
||||
pass
|
||||
|
||||
return response
|
||||
|
||||
def _safe_log_data(self, data):
|
||||
"""Safely log data, truncating if too large and masking sensitive fields."""
|
||||
try:
|
||||
# Convert to string representation
|
||||
if isinstance(data, dict):
|
||||
# Mask sensitive fields
|
||||
safe_data = {}
|
||||
for key, value in data.items():
|
||||
if any(sensitive in key.lower() for sensitive in ['password', 'token', 'secret', 'key']):
|
||||
safe_data[key] = '***MASKED***'
|
||||
else:
|
||||
safe_data[key] = value
|
||||
data_str = json.dumps(safe_data, indent=2, default=str)
|
||||
else:
|
||||
data_str = json.dumps(data, indent=2, default=str)
|
||||
|
||||
# Truncate if too long
|
||||
if len(data_str) > 1000:
|
||||
return data_str[:1000] + '...[TRUNCATED]'
|
||||
return data_str
|
||||
except Exception:
|
||||
return str(data)[:500] + '...[ERROR_LOGGING]'
|
||||
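The middleware logs through the `request_logging` logger, so it needs both a MIDDLEWARE entry and a matching logger configuration. A settings sketch follows; the dotted middleware path and the console handler are assumptions.

```python
# Settings sketch; the middleware module path is an assumption.
MIDDLEWARE = [
    # ... existing middleware entries ...
    "core.middleware.request_logging.RequestLoggingMiddleware",
]

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
    },
    "loggers": {
        # Matches logging.getLogger('request_logging') used by the middleware.
        "request_logging": {"handlers": ["console"], "level": "INFO"},
    },
}
```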
@@ -1,97 +0,0 @@
|
||||
"""
|
||||
Modern Security Headers Middleware for ThrillWiki
|
||||
Implements Content Security Policy and other modern security headers.
|
||||
"""
|
||||
|
||||
import secrets
|
||||
import base64
|
||||
from django.conf import settings
|
||||
from django.utils.deprecation import MiddlewareMixin
|
||||
|
||||
|
||||
class SecurityHeadersMiddleware(MiddlewareMixin):
|
||||
"""
|
||||
Middleware to add modern security headers to all responses.
|
||||
"""
|
||||
|
||||
def _generate_nonce(self):
|
||||
"""Generate a cryptographically secure nonce for CSP."""
|
||||
# Generate 16 random bytes and encode as base64
|
||||
return base64.b64encode(secrets.token_bytes(16)).decode('ascii')
|
||||
|
||||
def _modify_csp_with_nonce(self, csp_policy, nonce):
|
||||
"""Modify CSP policy to include nonce for script-src."""
|
||||
if not csp_policy:
|
||||
return csp_policy
|
||||
|
||||
# Look for script-src directive and add nonce
|
||||
directives = csp_policy.split(';')
|
||||
modified_directives = []
|
||||
|
||||
for directive in directives:
|
||||
directive = directive.strip()
|
||||
if directive.startswith('script-src '):
|
||||
# Add nonce to script-src directive
|
||||
directive += f" 'nonce-{nonce}'"
|
||||
modified_directives.append(directive)
|
||||
|
||||
return '; '.join(modified_directives)
|
||||
|
||||
def process_request(self, request):
|
||||
"""Generate and store nonce for this request."""
|
||||
# Generate a nonce for this request
|
||||
nonce = self._generate_nonce()
|
||||
# Store it in request so templates can access it
|
||||
request.csp_nonce = nonce
|
||||
return None
|
||||
|
||||
def process_response(self, request, response):
|
||||
"""Add security headers to the response."""
|
||||
|
||||
# Content Security Policy with nonce support
|
||||
if hasattr(settings, 'SECURE_CONTENT_SECURITY_POLICY'):
|
||||
csp_policy = settings.SECURE_CONTENT_SECURITY_POLICY
|
||||
# Apply nonce if we have one for this request
|
||||
if hasattr(request, 'csp_nonce'):
|
||||
csp_policy = self._modify_csp_with_nonce(csp_policy, request.csp_nonce)
|
||||
response['Content-Security-Policy'] = csp_policy
|
||||
|
||||
# Cross-Origin Opener Policy
|
||||
if hasattr(settings, 'SECURE_CROSS_ORIGIN_OPENER_POLICY'):
|
||||
response['Cross-Origin-Opener-Policy'] = settings.SECURE_CROSS_ORIGIN_OPENER_POLICY
|
||||
|
||||
# Referrer Policy
|
||||
if hasattr(settings, 'SECURE_REFERRER_POLICY'):
|
||||
response['Referrer-Policy'] = settings.SECURE_REFERRER_POLICY
|
||||
|
||||
# Permissions Policy
|
||||
if hasattr(settings, 'SECURE_PERMISSIONS_POLICY'):
|
||||
response['Permissions-Policy'] = settings.SECURE_PERMISSIONS_POLICY
|
||||
|
||||
# Additional security headers
|
||||
response['X-Content-Type-Options'] = 'nosniff'
|
||||
response['X-Frame-Options'] = getattr(settings, 'X_FRAME_OPTIONS', 'DENY')
|
||||
response['X-XSS-Protection'] = '1; mode=block'
|
||||
|
||||
# Cache Control headers for better performance
|
||||
# Prevent caching of HTML pages to ensure users get fresh content
|
||||
if response.get('Content-Type', '').startswith('text/html'):
|
||||
response['Cache-Control'] = 'no-cache, no-store, must-revalidate'
|
||||
response['Pragma'] = 'no-cache'
|
||||
response['Expires'] = '0'
|
||||
|
||||
# Strict Transport Security (if SSL is enabled)
|
||||
if getattr(settings, 'SECURE_SSL_REDIRECT', False):
|
||||
hsts_seconds = getattr(settings, 'SECURE_HSTS_SECONDS', 31536000)
|
||||
hsts_include_subdomains = getattr(settings, 'SECURE_HSTS_INCLUDE_SUBDOMAINS', True)
|
||||
hsts_preload = getattr(settings, 'SECURE_HSTS_PRELOAD', False)
|
||||
|
||||
hsts_header = f'max-age={hsts_seconds}'
|
||||
if hsts_include_subdomains:
|
||||
hsts_header += '; includeSubDomains'
|
||||
if hsts_preload:
|
||||
hsts_header += '; preload'
|
||||
|
||||
response['Strict-Transport-Security'] = hsts_header
|
||||
|
||||
return response
|
||||
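The middleware only emits headers for settings that exist, and it rewrites `script-src` to carry a per-request nonce. A sketch of the settings it reads; the policy values here are illustrative, not taken from this diff.

```python
# Illustrative settings; the middleware appends 'nonce-<value>' to script-src per request.
SECURE_CONTENT_SECURITY_POLICY = (
    "default-src 'self'; "
    "script-src 'self'; "
    "style-src 'self' 'unsafe-inline'; "
    "img-src 'self' data: https:"
)
SECURE_CROSS_ORIGIN_OPENER_POLICY = "same-origin"
SECURE_REFERRER_POLICY = "strict-origin-when-cross-origin"
SECURE_PERMISSIONS_POLICY = "geolocation=(), microphone=(), camera=()"
```

Inline scripts in templates can then use the stored nonce, e.g. `<script nonce="{{ request.csp_nonce }}">`, assuming the request context processor is enabled.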
@@ -1,292 +0,0 @@
|
||||
# Generated by Django 5.2.6 on 2025-09-21 01:27
|
||||
|
||||
import django.db.models.deletion
|
||||
import pgtrigger.compiler
|
||||
import pgtrigger.migrations
|
||||
from django.conf import settings
|
||||
from django.db import migrations, models
|
||||
|
||||
|
||||
class Migration(migrations.Migration):
|
||||
|
||||
initial = True
|
||||
|
||||
dependencies = [
|
||||
("contenttypes", "0002_remove_content_type_name"),
|
||||
("pghistory", "0007_auto_20250421_0444"),
|
||||
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
|
||||
]
|
||||
|
||||
operations = [
|
||||
migrations.CreateModel(
|
||||
name="PageView",
|
||||
fields=[
|
||||
(
|
||||
"id",
|
||||
models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("object_id", models.PositiveIntegerField()),
|
||||
("timestamp", models.DateTimeField(auto_now_add=True, db_index=True)),
|
||||
("ip_address", models.GenericIPAddressField()),
|
||||
("user_agent", models.CharField(blank=True, max_length=512)),
|
||||
(
|
||||
"content_type",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
related_name="page_views",
|
||||
to="contenttypes.contenttype",
|
||||
),
|
||||
),
|
||||
],
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name="PageViewEvent",
|
||||
fields=[
|
||||
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||
("pgh_label", models.TextField(help_text="The event label.")),
|
||||
("id", models.BigIntegerField()),
|
||||
("object_id", models.PositiveIntegerField()),
|
||||
("timestamp", models.DateTimeField(auto_now_add=True)),
|
||||
("ip_address", models.GenericIPAddressField()),
|
||||
("user_agent", models.CharField(blank=True, max_length=512)),
|
||||
(
|
||||
"content_type",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="+",
|
||||
related_query_name="+",
|
||||
to="contenttypes.contenttype",
|
||||
),
|
||||
),
|
||||
(
|
||||
"pgh_context",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="+",
|
||||
to="pghistory.context",
|
||||
),
|
||||
),
|
||||
(
|
||||
"pgh_obj",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="events",
|
||||
to="core.pageview",
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
"abstract": False,
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name="SlugHistory",
|
||||
fields=[
|
||||
(
|
||||
"id",
|
||||
models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("object_id", models.CharField(max_length=50)),
|
||||
("old_slug", models.SlugField(max_length=200)),
|
||||
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||
(
|
||||
"content_type",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="contenttypes.contenttype",
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
"verbose_name_plural": "Slug histories",
|
||||
"ordering": ["-created_at"],
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name="SlugHistoryEvent",
|
||||
fields=[
|
||||
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||
("pgh_label", models.TextField(help_text="The event label.")),
|
||||
("id", models.BigIntegerField()),
|
||||
("object_id", models.CharField(max_length=50)),
|
||||
("old_slug", models.SlugField(db_index=False, max_length=200)),
|
||||
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||
(
|
||||
"content_type",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="+",
|
||||
related_query_name="+",
|
||||
to="contenttypes.contenttype",
|
||||
),
|
||||
),
|
||||
(
|
||||
"pgh_context",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="+",
|
||||
to="pghistory.context",
|
||||
),
|
||||
),
|
||||
(
|
||||
"pgh_obj",
|
||||
models.ForeignKey(
|
||||
db_constraint=False,
|
||||
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||
related_name="events",
|
||||
to="core.slughistory",
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
"abstract": False,
|
||||
},
|
||||
),
|
||||
migrations.CreateModel(
|
||||
name="HistoricalSlug",
|
||||
fields=[
|
||||
(
|
||||
"id",
|
||||
models.BigAutoField(
|
||||
auto_created=True,
|
||||
primary_key=True,
|
||||
serialize=False,
|
||||
verbose_name="ID",
|
||||
),
|
||||
),
|
||||
("object_id", models.PositiveIntegerField()),
|
||||
("slug", models.SlugField(max_length=255)),
|
||||
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||
(
|
||||
"content_type",
|
||||
models.ForeignKey(
|
||||
on_delete=django.db.models.deletion.CASCADE,
|
||||
to="contenttypes.contenttype",
|
||||
),
|
||||
),
|
||||
(
|
||||
"user",
|
||||
models.ForeignKey(
|
||||
blank=True,
|
||||
null=True,
|
||||
on_delete=django.db.models.deletion.SET_NULL,
|
||||
related_name="historical_slugs",
|
||||
to=settings.AUTH_USER_MODEL,
|
||||
),
|
||||
),
|
||||
],
|
||||
options={
|
||||
"indexes": [
|
||||
models.Index(
|
||||
fields=["content_type", "object_id"],
|
||||
name="core_histor_content_b4c470_idx",
|
||||
),
|
||||
models.Index(fields=["slug"], name="core_histor_slug_8fd7b3_idx"),
|
||||
],
|
||||
"unique_together": {("content_type", "slug")},
|
||||
},
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name="pageview",
|
||||
index=models.Index(
|
||||
fields=["timestamp"], name="core_pagevi_timesta_757ebb_idx"
|
||||
),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name="pageview",
|
||||
index=models.Index(
|
||||
fields=["content_type", "object_id"],
|
||||
name="core_pagevi_content_eda7ad_idx",
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="pageview",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="insert_insert",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
func='INSERT INTO "core_pageviewevent" ("content_type_id", "id", "ip_address", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "timestamp", "user_agent") VALUES (NEW."content_type_id", NEW."id", NEW."ip_address", NEW."object_id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."timestamp", NEW."user_agent"); RETURN NULL;',
|
||||
hash="1682d124ea3ba215e630c7cfcde929f7444cf247",
|
||||
operation="INSERT",
|
||||
pgid="pgtrigger_insert_insert_ee1e1",
|
||||
table="core_pageview",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="pageview",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="update_update",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||
func='INSERT INTO "core_pageviewevent" ("content_type_id", "id", "ip_address", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "timestamp", "user_agent") VALUES (NEW."content_type_id", NEW."id", NEW."ip_address", NEW."object_id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."timestamp", NEW."user_agent"); RETURN NULL;',
|
||||
hash="4221b2dd6636cae454f8d69c0c1841c40c47e6a6",
|
||||
operation="UPDATE",
|
||||
pgid="pgtrigger_update_update_3c505",
|
||||
table="core_pageview",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name="slughistory",
|
||||
index=models.Index(
|
||||
fields=["content_type", "object_id"],
|
||||
name="core_slughi_content_8bbf56_idx",
|
||||
),
|
||||
),
|
||||
migrations.AddIndex(
|
||||
model_name="slughistory",
|
||||
index=models.Index(
|
||||
fields=["old_slug"], name="core_slughi_old_slu_aaef7f_idx"
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="slughistory",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="insert_insert",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
func='INSERT INTO "core_slughistoryevent" ("content_type_id", "created_at", "id", "object_id", "old_slug", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."object_id", NEW."old_slug", _pgh_attach_context(), NOW(), \'insert\', NEW."id"); RETURN NULL;',
|
||||
hash="2a2a05025693c165b88e5eba7fcc23214749a78b",
|
||||
operation="INSERT",
|
||||
pgid="pgtrigger_insert_insert_3002a",
|
||||
table="core_slughistory",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
pgtrigger.migrations.AddTrigger(
|
||||
model_name="slughistory",
|
||||
trigger=pgtrigger.compiler.Trigger(
|
||||
name="update_update",
|
||||
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||
func='INSERT INTO "core_slughistoryevent" ("content_type_id", "created_at", "id", "object_id", "old_slug", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."object_id", NEW."old_slug", _pgh_attach_context(), NOW(), \'update\', NEW."id"); RETURN NULL;',
|
||||
hash="3ad197ccb6178668e762720341e45d3fd3216776",
|
||||
operation="UPDATE",
|
||||
pgid="pgtrigger_update_update_52030",
|
||||
table="core_slughistory",
|
||||
when="AFTER",
|
||||
),
|
||||
),
|
||||
),
|
||||
]
|
||||
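For reference, a short sketch of reading back the rows that the triggers above write into core_pageviewevent; the import path of the pghistory-generated event model is an assumption, not taken from the source.

from apps.core.models import PageViewEvent  # assumed location of the generated event model


def recent_pageview_events(limit: int = 10):
    # Each AFTER INSERT/UPDATE trigger on core_pageview writes one row,
    # labelled 'insert' or 'update', into core_pageviewevent.
    return (
        PageViewEvent.objects.order_by("-pgh_created_at")
        .values("pgh_label", "pgh_created_at", "object_id", "ip_address")[:limit]
    )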
@@ -1,19 +0,0 @@
|
||||
from django.views.generic.list import MultipleObjectMixin
|
||||
|
||||
|
||||
class HTMXFilterableMixin(MultipleObjectMixin):
|
||||
"""
|
||||
A mixin that provides filtering capabilities for HTMX requests.
|
||||
"""
|
||||
|
||||
filter_class = None
|
||||
|
||||
def get_queryset(self):
|
||||
queryset = super().get_queryset()
|
||||
self.filterset = self.filter_class(self.request.GET, queryset=queryset)
|
||||
return self.filterset.qs
|
||||
|
||||
def get_context_data(self, **kwargs):
|
||||
context = super().get_context_data(**kwargs)
|
||||
context["filter"] = self.filterset
|
||||
return context
|
||||
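A minimal usage sketch for the mixin above, assuming django-filter is installed; the Park model, ParkFilter, the filter fields, and the import paths are illustrative assumptions.

from django.views.generic import ListView
from django_filters import FilterSet

from apps.core.mixins import HTMXFilterableMixin  # assumed module path
from apps.parks.models import Park                # assumed model


class ParkFilter(FilterSet):
    class Meta:
        model = Park
        fields = ["country", "status"]  # illustrative filter fields


class ParkListView(HTMXFilterableMixin, ListView):
    model = Park
    filter_class = ParkFilter  # consumed by HTMXFilterableMixin.get_queryset()
    template_name = "parks/park_list.html"
    # The template can render {{ filter.form }} and iterate the filtered object_list.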
@@ -1,119 +0,0 @@
|
||||
from django.db import models
|
||||
from django.contrib.contenttypes.fields import GenericForeignKey
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.utils.text import slugify
|
||||
from apps.core.history import TrackedModel
|
||||
import pghistory
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class SlugHistory(models.Model):
|
||||
"""
|
||||
Model for tracking slug changes across all models that use slugs.
|
||||
Uses generic relations to work with any model.
|
||||
"""
|
||||
|
||||
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
|
||||
object_id = models.CharField(
|
||||
max_length=50
|
||||
) # Using CharField to work with our custom IDs
|
||||
content_object = GenericForeignKey("content_type", "object_id")
|
||||
|
||||
old_slug = models.SlugField(max_length=200)
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
|
||||
class Meta:
|
||||
indexes = [
|
||||
models.Index(fields=["content_type", "object_id"]),
|
||||
models.Index(fields=["old_slug"]),
|
||||
]
|
||||
verbose_name_plural = "Slug histories"
|
||||
ordering = ["-created_at"]
|
||||
|
||||
def __str__(self):
|
||||
return f"Old slug '{self.old_slug}' for {self.content_object}"
|
||||
|
||||
|
||||
class SluggedModel(TrackedModel):
|
||||
"""
|
||||
Abstract base model that provides slug functionality with history tracking.
|
||||
"""
|
||||
|
||||
name = models.CharField(max_length=200)
|
||||
slug = models.SlugField(max_length=200, unique=True)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
abstract = True
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
# Get the current instance from DB if it exists
|
||||
if self.pk:
|
||||
try:
|
||||
old_instance = self.__class__.objects.get(pk=self.pk)
|
||||
# If slug has changed, save the old one to history
|
||||
if old_instance.slug != self.slug:
|
||||
SlugHistory.objects.create(
|
||||
content_type=ContentType.objects.get_for_model(self),
|
||||
object_id=getattr(self, self.get_id_field_name()),
|
||||
old_slug=old_instance.slug,
|
||||
)
|
||||
except self.__class__.DoesNotExist:
|
||||
pass
|
||||
|
||||
# Generate slug if not set
|
||||
if not self.slug:
|
||||
self.slug = slugify(self.name)
|
||||
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
def get_id_field_name(self):
|
||||
"""
|
||||
Returns the name of the read-only ID field for this model.
|
||||
Should be overridden by subclasses.
|
||||
"""
|
||||
raise NotImplementedError(
|
||||
"Subclasses of SluggedModel must implement get_id_field_name()"
|
||||
)
|
||||
|
||||
@classmethod
|
||||
def get_by_slug(cls, slug):
|
||||
"""
|
||||
Get an object by its current or historical slug.
|
||||
Returns (object, is_old_slug) tuple.
|
||||
"""
|
||||
try:
|
||||
# Try to get by current slug first
|
||||
return cls.objects.get(slug=slug), False
|
||||
except cls.DoesNotExist:
|
||||
# Check pghistory first if available
|
||||
try:
|
||||
import pghistory.models
|
||||
|
||||
history_entries = pghistory.models.Events.objects.filter(
|
||||
pgh_model=f"{cls._meta.app_label}.{cls._meta.model_name}", slug=slug
|
||||
).order_by("-pgh_created_at")
|
||||
|
||||
if history_entries:
|
||||
history_entry = history_entries.first()
|
||||
if history_entry:
|
||||
return cls.objects.get(id=history_entry.pgh_obj_id), True
|
||||
except (ImportError, AttributeError):
|
||||
pass
|
||||
|
||||
# Try to find in manual slug history as fallback
|
||||
history = (
|
||||
SlugHistory.objects.filter(
|
||||
content_type=ContentType.objects.get_for_model(cls),
|
||||
old_slug=slug,
|
||||
)
|
||||
.order_by("-created_at")
|
||||
.first()
|
||||
)
|
||||
|
||||
if history:
|
||||
return (
|
||||
cls.objects.get(**{cls().get_id_field_name(): history.object_id}),
|
||||
True,
|
||||
)
|
||||
|
||||
raise cls.DoesNotExist(f"{cls.__name__} with slug '{slug}' does not exist")
|
||||
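A minimal sketch of a concrete SluggedModel subclass; the Park model, its park_id field, and the import path are assumptions used only for illustration.

from django.db import models

from apps.core.models import SluggedModel  # assumed module path


class Park(SluggedModel):
    # Stand-in for the model's custom read-only identifier.
    park_id = models.CharField(max_length=50, unique=True, editable=False)

    def get_id_field_name(self):
        return "park_id"


# Illustrative lookup that accepts current or historical slugs:
# park, is_old_slug = Park.get_by_slug("cedar-point")
# if is_old_slug: the caller can issue a permanent redirect to park.slug.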
@@ -1,5 +0,0 @@
|
||||
"""
|
||||
Core tasks package for ThrillWiki.
|
||||
|
||||
This package contains all Celery tasks for the core application.
|
||||
"""
|
||||
@@ -1,26 +0,0 @@
|
||||
"""
|
||||
Core app URL configuration.
|
||||
"""
|
||||
|
||||
from django.urls import path, include
|
||||
from .views.entity_search import (
|
||||
EntityFuzzySearchView,
|
||||
EntityNotFoundView,
|
||||
QuickEntitySuggestionView,
|
||||
)
|
||||
|
||||
app_name = "core"
|
||||
|
||||
# Entity search endpoints
|
||||
entity_patterns = [
|
||||
path("search/", EntityFuzzySearchView.as_view(), name="entity_fuzzy_search"),
|
||||
path("not-found/", EntityNotFoundView.as_view(), name="entity_not_found"),
|
||||
path(
|
||||
"suggestions/", QuickEntitySuggestionView.as_view(), name="entity_suggestions"
|
||||
),
|
||||
]
|
||||
|
||||
urlpatterns = [
|
||||
# Entity fuzzy matching and search endpoints
|
||||
path("entities/", include(entity_patterns)),
|
||||
]
|
||||
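A sketch of wiring the module above into the project URLconf; the "api/core/" prefix is an assumption.

# project urls.py (sketch)
from django.urls import include, path

urlpatterns = [
    path("api/core/", include("apps.core.urls")),
]

# With app_name = "core" set above, reverse("core:entity_fuzzy_search")
# then resolves to /api/core/entities/search/.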
@@ -1 +0,0 @@
|
||||
# URLs package for core app
|
||||
@@ -1 +0,0 @@
|
||||
# Core utilities
|
||||
@@ -1,62 +0,0 @@
|
||||
from typing import Any, Dict, Optional, Type
|
||||
from django.shortcuts import redirect
|
||||
from django.urls import reverse
|
||||
from django.views.generic import DetailView
|
||||
from django.views import View
|
||||
from django.http import HttpRequest, HttpResponse
|
||||
from django.db.models import Model
|
||||
|
||||
|
||||
class SlugRedirectMixin(View):
|
||||
"""
|
||||
Mixin that handles redirects for old slugs.
|
||||
Requires the model to inherit from SluggedModel and the view to inherit from DetailView.
|
||||
"""
|
||||
|
||||
model: Optional[Type[Model]] = None
|
||||
slug_url_kwarg: str = "slug"
|
||||
object: Optional[Model] = None
|
||||
|
||||
def dispatch(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||
# Only apply slug redirect logic to DetailViews
|
||||
if not isinstance(self, DetailView):
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
|
||||
# Get the object using current or historical slug
|
||||
try:
|
||||
self.object = self.get_object() # type: ignore
|
||||
# Check if we used an old slug
|
||||
current_slug = kwargs.get(self.slug_url_kwarg)
|
||||
if current_slug and current_slug != getattr(self.object, "slug", None):
|
||||
# Get the URL pattern name from the view
|
||||
url_pattern = self.get_redirect_url_pattern()
|
||||
# Build kwargs for reverse()
|
||||
reverse_kwargs = self.get_redirect_url_kwargs()
|
||||
# Redirect to the current slug URL
|
||||
return redirect(
|
||||
reverse(url_pattern, kwargs=reverse_kwargs), permanent=True
|
||||
)
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
except (AttributeError, Exception) as e: # type: ignore
|
||||
if self.model and hasattr(self.model, "DoesNotExist"):
|
||||
if isinstance(e, self.model.DoesNotExist): # type: ignore
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
return super().dispatch(request, *args, **kwargs)
|
||||
|
||||
def get_redirect_url_pattern(self) -> str:
|
||||
"""
|
||||
Get the URL pattern name for redirects.
|
||||
Should be overridden by subclasses.
|
||||
"""
|
||||
raise NotImplementedError(
|
||||
"Subclasses must implement get_redirect_url_pattern()"
|
||||
)
|
||||
|
||||
def get_redirect_url_kwargs(self) -> Dict[str, Any]:
|
||||
"""
|
||||
Get the kwargs for reverse() when redirecting.
|
||||
Should be overridden by subclasses if they need custom kwargs.
|
||||
"""
|
||||
if not self.object:
|
||||
return {}
|
||||
return {self.slug_url_kwarg: getattr(self.object, "slug", "")}
|
||||
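A minimal sketch of pairing the mixin above with a DetailView; the Park model, URL pattern name, and import paths are assumptions.

from django.views.generic import DetailView

from apps.core.utils import SlugRedirectMixin  # assumed module path
from apps.parks.models import Park             # assumed SluggedModel subclass


class ParkDetailView(SlugRedirectMixin, DetailView):
    model = Park
    slug_url_kwarg = "slug"
    template_name = "parks/park_detail.html"

    def get_redirect_url_pattern(self) -> str:
        # Placeholder URL name; use the project's actual pattern.
        return "parks:park_detail"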
@@ -1,54 +0,0 @@
|
||||
from django.apps import AppConfig
|
||||
from django.db.models.signals import post_migrate
|
||||
|
||||
|
||||
def create_photo_permissions(sender, **kwargs):
|
||||
"""Create custom permissions for domain-specific photo models"""
|
||||
from django.contrib.auth.models import Permission
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from apps.parks.models import ParkPhoto
|
||||
from apps.rides.models import RidePhoto
|
||||
|
||||
# Create permissions for ParkPhoto
|
||||
park_photo_content_type = ContentType.objects.get_for_model(ParkPhoto)
|
||||
Permission.objects.get_or_create(
|
||||
codename="add_parkphoto",
|
||||
name="Can add park photo",
|
||||
content_type=park_photo_content_type,
|
||||
)
|
||||
Permission.objects.get_or_create(
|
||||
codename="change_parkphoto",
|
||||
name="Can change park photo",
|
||||
content_type=park_photo_content_type,
|
||||
)
|
||||
Permission.objects.get_or_create(
|
||||
codename="delete_parkphoto",
|
||||
name="Can delete park photo",
|
||||
content_type=park_photo_content_type,
|
||||
)
|
||||
|
||||
# Create permissions for RidePhoto
|
||||
ride_photo_content_type = ContentType.objects.get_for_model(RidePhoto)
|
||||
Permission.objects.get_or_create(
|
||||
codename="add_ridephoto",
|
||||
name="Can add ride photo",
|
||||
content_type=ride_photo_content_type,
|
||||
)
|
||||
Permission.objects.get_or_create(
|
||||
codename="change_ridephoto",
|
||||
name="Can change ride photo",
|
||||
content_type=ride_photo_content_type,
|
||||
)
|
||||
Permission.objects.get_or_create(
|
||||
codename="delete_ridephoto",
|
||||
name="Can delete ride photo",
|
||||
content_type=ride_photo_content_type,
|
||||
)
|
||||
|
||||
|
||||
class MediaConfig(AppConfig):
|
||||
default_auto_field = "django.db.models.BigAutoField"
|
||||
name = "apps.media"
|
||||
|
||||
def ready(self):
|
||||
post_migrate.connect(create_photo_permissions, sender=self)
|
||||
@@ -1,171 +0,0 @@
|
||||
from django.contrib import admin
|
||||
from django.contrib.admin import AdminSite
|
||||
from django.utils.html import format_html
|
||||
from django.urls import reverse
|
||||
from django.utils.safestring import mark_safe
|
||||
from .models import EditSubmission, PhotoSubmission
|
||||
|
||||
|
||||
class ModerationAdminSite(AdminSite):
|
||||
site_header = "ThrillWiki Moderation"
|
||||
site_title = "ThrillWiki Moderation"
|
||||
index_title = "Moderation Dashboard"
|
||||
|
||||
def has_permission(self, request):
|
||||
"""Only allow moderators and above to access this admin site"""
|
||||
return request.user.is_authenticated and request.user.role in [
|
||||
"MODERATOR",
|
||||
"ADMIN",
|
||||
"SUPERUSER",
|
||||
]
|
||||
|
||||
|
||||
moderation_site = ModerationAdminSite(name="moderation")
|
||||
|
||||
|
||||
class EditSubmissionAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
"id",
|
||||
"user_link",
|
||||
"content_type",
|
||||
"content_link",
|
||||
"status",
|
||||
"created_at",
|
||||
"handled_by",
|
||||
]
|
||||
list_filter = ["status", "content_type", "created_at"]
|
||||
search_fields = ["user__username", "reason", "source", "notes"]
|
||||
readonly_fields = [
|
||||
"user",
|
||||
"content_type",
|
||||
"object_id",
|
||||
"changes",
|
||||
"created_at",
|
||||
]
|
||||
|
||||
def user_link(self, obj):
|
||||
url = reverse("admin:accounts_user_change", args=[obj.user.id])
|
||||
return format_html('<a href="{}">{}</a>', url, obj.user.username)
|
||||
|
||||
user_link.short_description = "User"
|
||||
|
||||
def content_link(self, obj):
|
||||
if hasattr(obj.content_object, "get_absolute_url"):
|
||||
url = obj.content_object.get_absolute_url()
|
||||
return format_html('<a href="{}">{}</a>', url, str(obj.content_object))
|
||||
return str(obj.content_object)
|
||||
|
||||
content_link.short_description = "Content"
|
||||
|
||||
def save_model(self, request, obj, form, change):
|
||||
if "status" in form.changed_data:
|
||||
if obj.status == "APPROVED":
|
||||
obj.approve(request.user)
|
||||
elif obj.status == "REJECTED":
|
||||
obj.reject(request.user)
|
||||
elif obj.status == "ESCALATED":
|
||||
obj.escalate(request.user)
|
||||
super().save_model(request, obj, form, change)
|
||||
|
||||
|
||||
class PhotoSubmissionAdmin(admin.ModelAdmin):
|
||||
list_display = [
|
||||
"id",
|
||||
"user_link",
|
||||
"content_type",
|
||||
"content_link",
|
||||
"photo_preview",
|
||||
"status",
|
||||
"created_at",
|
||||
"handled_by",
|
||||
]
|
||||
list_filter = ["status", "content_type", "created_at"]
|
||||
search_fields = ["user__username", "caption", "notes"]
|
||||
readonly_fields = [
|
||||
"user",
|
||||
"content_type",
|
||||
"object_id",
|
||||
"photo_preview",
|
||||
"created_at",
|
||||
]
|
||||
|
||||
def user_link(self, obj):
|
||||
url = reverse("admin:accounts_user_change", args=[obj.user.id])
|
||||
return format_html('<a href="{}">{}</a>', url, obj.user.username)
|
||||
|
||||
user_link.short_description = "User"
|
||||
|
||||
def content_link(self, obj):
|
||||
if hasattr(obj.content_object, "get_absolute_url"):
|
||||
url = obj.content_object.get_absolute_url()
|
||||
return format_html('<a href="{}">{}</a>', url, str(obj.content_object))
|
||||
return str(obj.content_object)
|
||||
|
||||
content_link.short_description = "Content"
|
||||
|
||||
def photo_preview(self, obj):
|
||||
if obj.photo:
|
||||
return format_html(
|
||||
'<img src="{}" style="max-height: 100px; max-width: 200px;" />',
|
||||
obj.photo.url,
|
||||
)
|
||||
return ""
|
||||
|
||||
photo_preview.short_description = "Photo Preview"
|
||||
|
||||
def save_model(self, request, obj, form, change):
|
||||
if "status" in form.changed_data:
|
||||
if obj.status == "APPROVED":
|
||||
obj.approve(request.user, obj.notes)
|
||||
elif obj.status == "REJECTED":
|
||||
obj.reject(request.user, obj.notes)
|
||||
super().save_model(request, obj, form, change)
|
||||
|
||||
|
||||
class HistoryEventAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for viewing model history events"""
|
||||
|
||||
list_display = [
|
||||
"pgh_label",
|
||||
"pgh_created_at",
|
||||
"get_object_link",
|
||||
"get_context",
|
||||
]
|
||||
list_filter = ["pgh_label", "pgh_created_at"]
|
||||
readonly_fields = [
|
||||
"pgh_label",
|
||||
"pgh_obj_id",
|
||||
"pgh_data",
|
||||
"pgh_context",
|
||||
"pgh_created_at",
|
||||
]
|
||||
date_hierarchy = "pgh_created_at"
|
||||
|
||||
def get_object_link(self, obj):
|
||||
"""Display a link to the related object if possible"""
|
||||
if obj.pgh_obj and hasattr(obj.pgh_obj, "get_absolute_url"):
|
||||
url = obj.pgh_obj.get_absolute_url()
|
||||
return format_html('<a href="{}">{}</a>', url, str(obj.pgh_obj))
|
||||
return str(obj.pgh_obj or "")
|
||||
|
||||
get_object_link.short_description = "Object"
|
||||
|
||||
def get_context(self, obj):
|
||||
"""Format the context data nicely"""
|
||||
if not obj.pgh_context:
|
||||
return "-"
|
||||
html = ["<table>"]
|
||||
for key, value in obj.pgh_context.items():
|
||||
html.append(f"<tr><th>{key}</th><td>{value}</td></tr>")
|
||||
html.append("</table>")
|
||||
return mark_safe("".join(html))
|
||||
|
||||
get_context.short_description = "Context"
|
||||
|
||||
|
||||
# Register with moderation site only
|
||||
moderation_site.register(EditSubmission, EditSubmissionAdmin)
|
||||
moderation_site.register(PhotoSubmission, PhotoSubmissionAdmin)
|
||||
|
||||
# We will register concrete event models as they are created during migrations
|
||||
# Example: moderation_site.register(DesignerEvent, HistoryEventAdmin)
|
||||
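A sketch of exposing the moderation admin site alongside the default admin; the "moderation/" URL prefix is an assumption.

# project urls.py (sketch)
from django.contrib import admin
from django.urls import path

from apps.moderation.admin import moderation_site

urlpatterns = [
    path("admin/", admin.site.urls),
    path("moderation/", moderation_site.urls),  # gated by ModerationAdminSite.has_permission
]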
@@ -1,7 +0,0 @@
|
||||
from django.apps import AppConfig
|
||||
|
||||
|
||||
class ModerationConfig(AppConfig):
|
||||
default_auto_field = "django.db.models.BigAutoField"
|
||||
name = "apps.moderation"
|
||||
verbose_name = "Content Moderation"
|
||||
@@ -1,935 +0,0 @@
|
||||
"""
|
||||
Rich Choice Objects for Moderation Domain
|
||||
|
||||
This module defines all choice options for the moderation system using the Rich Choice Objects pattern.
|
||||
All choices include rich metadata for UI styling, business logic, and enhanced functionality.
|
||||
"""
|
||||
|
||||
from apps.core.choices.base import RichChoice, ChoiceCategory
|
||||
from apps.core.choices.registry import register_choices
|
||||
|
||||
# ============================================================================
|
||||
# EditSubmission Choices
|
||||
# ============================================================================
|
||||
|
||||
EDIT_SUBMISSION_STATUSES = [
|
||||
RichChoice(
|
||||
value="PENDING",
|
||||
label="Pending",
|
||||
description="Submission awaiting moderator review",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'clock',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'can_transition_to': ['APPROVED', 'REJECTED', 'ESCALATED'],
|
||||
'requires_moderator': True,
|
||||
'is_actionable': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="APPROVED",
|
||||
label="Approved",
|
||||
description="Submission has been approved and changes applied",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 2,
|
||||
'can_transition_to': [],
|
||||
'requires_moderator': True,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="REJECTED",
|
||||
label="Rejected",
|
||||
description="Submission has been rejected and will not be applied",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 3,
|
||||
'can_transition_to': [],
|
||||
'requires_moderator': True,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="ESCALATED",
|
||||
label="Escalated",
|
||||
description="Submission has been escalated for higher-level review",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'arrow-up',
|
||||
'css_class': 'bg-purple-100 text-purple-800 border-purple-200',
|
||||
'sort_order': 4,
|
||||
'can_transition_to': ['APPROVED', 'REJECTED'],
|
||||
'requires_moderator': True,
|
||||
'is_actionable': True,
|
||||
'escalation_level': 'admin'
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
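The can_transition_to metadata above can back a simple transition guard; a minimal sketch, assuming RichChoice exposes its constructor arguments as attributes.

def can_transition(current: str, target: str) -> bool:
    # Consult the transition metadata declared in EDIT_SUBMISSION_STATUSES.
    for choice in EDIT_SUBMISSION_STATUSES:
        if choice.value == current:
            return target in choice.metadata.get("can_transition_to", [])
    return False

# can_transition("PENDING", "APPROVED")  -> True
# can_transition("APPROVED", "PENDING")  -> False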
|
||||
SUBMISSION_TYPES = [
|
||||
RichChoice(
|
||||
value="EDIT",
|
||||
label="Edit Existing",
|
||||
description="Modification to existing content",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'pencil',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 1,
|
||||
'requires_existing_object': True,
|
||||
'complexity_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CREATE",
|
||||
label="Create New",
|
||||
description="Creation of new content",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'plus-circle',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 2,
|
||||
'requires_existing_object': False,
|
||||
'complexity_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# ============================================================================
|
||||
# ModerationReport Choices
|
||||
# ============================================================================
|
||||
|
||||
MODERATION_REPORT_STATUSES = [
|
||||
RichChoice(
|
||||
value="PENDING",
|
||||
label="Pending Review",
|
||||
description="Report awaiting initial moderator review",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'clock',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'can_transition_to': ['UNDER_REVIEW', 'DISMISSED'],
|
||||
'requires_assignment': False,
|
||||
'is_actionable': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="UNDER_REVIEW",
|
||||
label="Under Review",
|
||||
description="Report is actively being investigated by a moderator",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'eye',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 2,
|
||||
'can_transition_to': ['RESOLVED', 'DISMISSED'],
|
||||
'requires_assignment': True,
|
||||
'is_actionable': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="RESOLVED",
|
||||
label="Resolved",
|
||||
description="Report has been resolved with appropriate action taken",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 3,
|
||||
'can_transition_to': [],
|
||||
'requires_assignment': True,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="DISMISSED",
|
||||
label="Dismissed",
|
||||
description="Report was reviewed but no action was necessary",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 4,
|
||||
'can_transition_to': [],
|
||||
'requires_assignment': True,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
PRIORITY_LEVELS = [
|
||||
RichChoice(
|
||||
value="LOW",
|
||||
label="Low",
|
||||
description="Low priority - can be handled in regular workflow",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'arrow-down',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 1,
|
||||
'sla_hours': 168, # 7 days
|
||||
'escalation_threshold': 240, # 10 days
|
||||
'urgency_level': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="MEDIUM",
|
||||
label="Medium",
|
||||
description="Medium priority - standard response time expected",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'minus',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 2,
|
||||
'sla_hours': 72, # 3 days
|
||||
'escalation_threshold': 120, # 5 days
|
||||
'urgency_level': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="HIGH",
|
||||
label="High",
|
||||
description="High priority - requires prompt attention",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'arrow-up',
|
||||
'css_class': 'bg-orange-100 text-orange-800 border-orange-200',
|
||||
'sort_order': 3,
|
||||
'sla_hours': 24, # 1 day
|
||||
'escalation_threshold': 48, # 2 days
|
||||
'urgency_level': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="URGENT",
|
||||
label="Urgent",
|
||||
description="Urgent priority - immediate attention required",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'exclamation',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 4,
|
||||
'sla_hours': 4, # 4 hours
|
||||
'escalation_threshold': 8, # 8 hours
|
||||
'urgency_level': 4,
|
||||
'requires_immediate_notification': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
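The sla_hours metadata translates directly into review deadlines; a minimal sketch, again assuming attribute access on RichChoice.

from datetime import timedelta

from django.utils import timezone


def sla_deadline(priority: str, opened_at=None):
    # Derive a review deadline from the sla_hours declared in PRIORITY_LEVELS.
    opened_at = opened_at or timezone.now()
    for choice in PRIORITY_LEVELS:
        if choice.value == priority:
            return opened_at + timedelta(hours=choice.metadata["sla_hours"])
    return None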
|
||||
REPORT_TYPES = [
|
||||
RichChoice(
|
||||
value="SPAM",
|
||||
label="Spam",
|
||||
description="Unwanted or repetitive content",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'ban',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'default_priority': 'MEDIUM',
|
||||
'auto_actions': ['content_review'],
|
||||
'severity_level': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="HARASSMENT",
|
||||
label="Harassment",
|
||||
description="Targeted harassment or bullying behavior",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'shield-exclamation',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 2,
|
||||
'default_priority': 'HIGH',
|
||||
'auto_actions': ['user_review', 'content_review'],
|
||||
'severity_level': 4,
|
||||
'requires_user_action': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="INAPPROPRIATE_CONTENT",
|
||||
label="Inappropriate Content",
|
||||
description="Content that violates community guidelines",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'exclamation-triangle',
|
||||
'css_class': 'bg-orange-100 text-orange-800 border-orange-200',
|
||||
'sort_order': 3,
|
||||
'default_priority': 'HIGH',
|
||||
'auto_actions': ['content_review'],
|
||||
'severity_level': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="MISINFORMATION",
|
||||
label="Misinformation",
|
||||
description="False or misleading information",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'information-circle',
|
||||
'css_class': 'bg-purple-100 text-purple-800 border-purple-200',
|
||||
'sort_order': 4,
|
||||
'default_priority': 'HIGH',
|
||||
'auto_actions': ['content_review', 'fact_check'],
|
||||
'severity_level': 3,
|
||||
'requires_expert_review': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="COPYRIGHT",
|
||||
label="Copyright Violation",
|
||||
description="Unauthorized use of copyrighted material",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'document-duplicate',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800 border-indigo-200',
|
||||
'sort_order': 5,
|
||||
'default_priority': 'HIGH',
|
||||
'auto_actions': ['content_review', 'legal_review'],
|
||||
'severity_level': 4,
|
||||
'requires_legal_review': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="PRIVACY",
|
||||
label="Privacy Violation",
|
||||
description="Unauthorized sharing of private information",
|
||||
metadata={
|
||||
'color': 'pink',
|
||||
'icon': 'lock-closed',
|
||||
'css_class': 'bg-pink-100 text-pink-800 border-pink-200',
|
||||
'sort_order': 6,
|
||||
'default_priority': 'URGENT',
|
||||
'auto_actions': ['content_removal', 'user_review'],
|
||||
'severity_level': 5,
|
||||
'requires_immediate_action': True
|
||||
},
|
||||
category=ChoiceCategory.SECURITY
|
||||
),
|
||||
RichChoice(
|
||||
value="HATE_SPEECH",
|
||||
label="Hate Speech",
|
||||
description="Content promoting hatred or discrimination",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'fire',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 7,
|
||||
'default_priority': 'URGENT',
|
||||
'auto_actions': ['content_removal', 'user_suspension'],
|
||||
'severity_level': 5,
|
||||
'requires_immediate_action': True,
|
||||
'zero_tolerance': True
|
||||
},
|
||||
category=ChoiceCategory.SECURITY
|
||||
),
|
||||
RichChoice(
|
||||
value="VIOLENCE",
|
||||
label="Violence or Threats",
|
||||
description="Content containing violence or threatening behavior",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'exclamation',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 8,
|
||||
'default_priority': 'URGENT',
|
||||
'auto_actions': ['content_removal', 'user_ban', 'law_enforcement_notification'],
|
||||
'severity_level': 5,
|
||||
'requires_immediate_action': True,
|
||||
'zero_tolerance': True,
|
||||
'requires_law_enforcement': True
|
||||
},
|
||||
category=ChoiceCategory.SECURITY
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Other issues not covered by specific categories",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'dots-horizontal',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 9,
|
||||
'default_priority': 'MEDIUM',
|
||||
'auto_actions': ['manual_review'],
|
||||
'severity_level': 1,
|
||||
'requires_manual_categorization': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# ============================================================================
|
||||
# ModerationQueue Choices
|
||||
# ============================================================================
|
||||
|
||||
MODERATION_QUEUE_STATUSES = [
|
||||
RichChoice(
|
||||
value="PENDING",
|
||||
label="Pending",
|
||||
description="Queue item awaiting assignment or action",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'clock',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'can_transition_to': ['IN_PROGRESS', 'CANCELLED'],
|
||||
'requires_assignment': False,
|
||||
'is_actionable': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="IN_PROGRESS",
|
||||
label="In Progress",
|
||||
description="Queue item is actively being worked on",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'play',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 2,
|
||||
'can_transition_to': ['COMPLETED', 'CANCELLED'],
|
||||
'requires_assignment': True,
|
||||
'is_actionable': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="COMPLETED",
|
||||
label="Completed",
|
||||
description="Queue item has been successfully completed",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 3,
|
||||
'can_transition_to': [],
|
||||
'requires_assignment': True,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CANCELLED",
|
||||
label="Cancelled",
|
||||
description="Queue item was cancelled and will not be completed",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 4,
|
||||
'can_transition_to': [],
|
||||
'requires_assignment': False,
|
||||
'is_actionable': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
QUEUE_ITEM_TYPES = [
|
||||
RichChoice(
|
||||
value="CONTENT_REVIEW",
|
||||
label="Content Review",
|
||||
description="Review of user-submitted content for policy compliance",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'document-text',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 1,
|
||||
'estimated_time_minutes': 15,
|
||||
'required_permissions': ['content_moderation'],
|
||||
'complexity_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="USER_REVIEW",
|
||||
label="User Review",
|
||||
description="Review of user account or behavior",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'user',
|
||||
'css_class': 'bg-purple-100 text-purple-800 border-purple-200',
|
||||
'sort_order': 2,
|
||||
'estimated_time_minutes': 30,
|
||||
'required_permissions': ['user_moderation'],
|
||||
'complexity_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="BULK_ACTION",
|
||||
label="Bulk Action",
|
||||
description="Large-scale administrative operation",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'collection',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800 border-indigo-200',
|
||||
'sort_order': 3,
|
||||
'estimated_time_minutes': 60,
|
||||
'required_permissions': ['bulk_operations'],
|
||||
'complexity_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="POLICY_VIOLATION",
|
||||
label="Policy Violation",
|
||||
description="Investigation of potential policy violations",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'shield-exclamation',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 4,
|
||||
'estimated_time_minutes': 45,
|
||||
'required_permissions': ['policy_enforcement'],
|
||||
'complexity_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="APPEAL",
|
||||
label="Appeal",
|
||||
description="Review of user appeal against moderation action",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'scale',
|
||||
'css_class': 'bg-orange-100 text-orange-800 border-orange-200',
|
||||
'sort_order': 5,
|
||||
'estimated_time_minutes': 30,
|
||||
'required_permissions': ['appeal_review'],
|
||||
'complexity_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Other moderation tasks not covered by specific types",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'dots-horizontal',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 6,
|
||||
'estimated_time_minutes': 20,
|
||||
'required_permissions': ['general_moderation'],
|
||||
'complexity_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# ============================================================================
|
||||
# ModerationAction Choices
|
||||
# ============================================================================
|
||||
|
||||
MODERATION_ACTION_TYPES = [
|
||||
RichChoice(
|
||||
value="WARNING",
|
||||
label="Warning",
|
||||
description="Formal warning issued to user",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'exclamation-triangle',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'severity_level': 1,
|
||||
'is_temporary': False,
|
||||
'affects_privileges': False,
|
||||
'escalation_path': ['USER_SUSPENSION']
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="USER_SUSPENSION",
|
||||
label="User Suspension",
|
||||
description="Temporary suspension of user account",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'pause',
|
||||
'css_class': 'bg-orange-100 text-orange-800 border-orange-200',
|
||||
'sort_order': 2,
|
||||
'severity_level': 3,
|
||||
'is_temporary': True,
|
||||
'affects_privileges': True,
|
||||
'requires_duration': True,
|
||||
'escalation_path': ['USER_BAN']
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="USER_BAN",
|
||||
label="User Ban",
|
||||
description="Permanent ban of user account",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'ban',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 3,
|
||||
'severity_level': 5,
|
||||
'is_temporary': False,
|
||||
'affects_privileges': True,
|
||||
'is_permanent': True,
|
||||
'requires_admin_approval': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CONTENT_REMOVAL",
|
||||
label="Content Removal",
|
||||
description="Removal of specific content",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'trash',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 4,
|
||||
'severity_level': 2,
|
||||
'is_temporary': False,
|
||||
'affects_privileges': False,
|
||||
'is_content_action': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CONTENT_EDIT",
|
||||
label="Content Edit",
|
||||
description="Modification of content to comply with policies",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'pencil',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 5,
|
||||
'severity_level': 1,
|
||||
'is_temporary': False,
|
||||
'affects_privileges': False,
|
||||
'is_content_action': True,
|
||||
'preserves_content': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CONTENT_RESTRICTION",
|
||||
label="Content Restriction",
|
||||
description="Restriction of content visibility or access",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'eye-off',
|
||||
'css_class': 'bg-purple-100 text-purple-800 border-purple-200',
|
||||
'sort_order': 6,
|
||||
'severity_level': 2,
|
||||
'is_temporary': True,
|
||||
'affects_privileges': False,
|
||||
'is_content_action': True,
|
||||
'requires_duration': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="ACCOUNT_RESTRICTION",
|
||||
label="Account Restriction",
|
||||
description="Restriction of specific account privileges",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'lock-closed',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800 border-indigo-200',
|
||||
'sort_order': 7,
|
||||
'severity_level': 3,
|
||||
'is_temporary': True,
|
||||
'affects_privileges': True,
|
||||
'requires_duration': True,
|
||||
'escalation_path': ['USER_SUSPENSION']
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Other moderation actions not covered by specific types",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'dots-horizontal',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 8,
|
||||
'severity_level': 1,
|
||||
'is_temporary': False,
|
||||
'affects_privileges': False,
|
||||
'requires_manual_review': True
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# ============================================================================
|
||||
# BulkOperation Choices
|
||||
# ============================================================================
|
||||
|
||||
BULK_OPERATION_STATUSES = [
|
||||
RichChoice(
|
||||
value="PENDING",
|
||||
label="Pending",
|
||||
description="Operation is queued and waiting to start",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'clock',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800 border-yellow-200',
|
||||
'sort_order': 1,
|
||||
'can_transition_to': ['RUNNING', 'CANCELLED'],
|
||||
'is_actionable': True,
|
||||
'can_cancel': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="RUNNING",
|
||||
label="Running",
|
||||
description="Operation is currently executing",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'play',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 2,
|
||||
'can_transition_to': ['COMPLETED', 'FAILED', 'CANCELLED'],
|
||||
'is_actionable': True,
|
||||
'can_cancel': True,
|
||||
'shows_progress': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="COMPLETED",
|
||||
label="Completed",
|
||||
description="Operation completed successfully",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 3,
|
||||
'can_transition_to': [],
|
||||
'is_actionable': False,
|
||||
'can_cancel': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="FAILED",
|
||||
label="Failed",
|
||||
description="Operation failed with errors",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 4,
|
||||
'can_transition_to': [],
|
||||
'is_actionable': False,
|
||||
'can_cancel': False,
|
||||
'is_final': True,
|
||||
'requires_investigation': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CANCELLED",
|
||||
label="Cancelled",
|
||||
description="Operation was cancelled before completion",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'stop',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 5,
|
||||
'can_transition_to': [],
|
||||
'is_actionable': False,
|
||||
'can_cancel': False,
|
||||
'is_final': True
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
BULK_OPERATION_TYPES = [
|
||||
RichChoice(
|
||||
value="UPDATE_PARKS",
|
||||
label="Update Parks",
|
||||
description="Bulk update operations on park data",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'map',
|
||||
'css_class': 'bg-green-100 text-green-800 border-green-200',
|
||||
'sort_order': 1,
|
||||
'estimated_duration_minutes': 30,
|
||||
'required_permissions': ['bulk_park_operations'],
|
||||
'affects_data': ['parks'],
|
||||
'risk_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="UPDATE_RIDES",
|
||||
label="Update Rides",
|
||||
description="Bulk update operations on ride data",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'cog',
|
||||
'css_class': 'bg-blue-100 text-blue-800 border-blue-200',
|
||||
'sort_order': 2,
|
||||
'estimated_duration_minutes': 45,
|
||||
'required_permissions': ['bulk_ride_operations'],
|
||||
'affects_data': ['rides'],
|
||||
'risk_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="IMPORT_DATA",
|
||||
label="Import Data",
|
||||
description="Import data from external sources",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'download',
|
||||
'css_class': 'bg-purple-100 text-purple-800 border-purple-200',
|
||||
'sort_order': 3,
|
||||
'estimated_duration_minutes': 60,
|
||||
'required_permissions': ['data_import'],
|
||||
'affects_data': ['parks', 'rides', 'users'],
|
||||
'risk_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="EXPORT_DATA",
|
||||
label="Export Data",
|
||||
description="Export data for backup or analysis",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'upload',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800 border-indigo-200',
|
||||
'sort_order': 4,
|
||||
'estimated_duration_minutes': 20,
|
||||
'required_permissions': ['data_export'],
|
||||
'affects_data': [],
|
||||
'risk_level': 'low'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="MODERATE_CONTENT",
|
||||
label="Moderate Content",
|
||||
description="Bulk moderation actions on content",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'shield-check',
|
||||
'css_class': 'bg-orange-100 text-orange-800 border-orange-200',
|
||||
'sort_order': 5,
|
||||
'estimated_duration_minutes': 40,
|
||||
'required_permissions': ['bulk_moderation'],
|
||||
'affects_data': ['content', 'users'],
|
||||
'risk_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="USER_ACTIONS",
|
||||
label="User Actions",
|
||||
description="Bulk actions on user accounts",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'users',
|
||||
'css_class': 'bg-red-100 text-red-800 border-red-200',
|
||||
'sort_order': 6,
|
||||
'estimated_duration_minutes': 50,
|
||||
'required_permissions': ['bulk_user_operations'],
|
||||
'affects_data': ['users'],
|
||||
'risk_level': 'high'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="CLEANUP",
|
||||
label="Cleanup",
|
||||
description="System cleanup and maintenance operations",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'trash',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 7,
|
||||
'estimated_duration_minutes': 25,
|
||||
'required_permissions': ['system_maintenance'],
|
||||
'affects_data': ['system'],
|
||||
'risk_level': 'low'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Other bulk operations not covered by specific types",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'dots-horizontal',
|
||||
'css_class': 'bg-gray-100 text-gray-800 border-gray-200',
|
||||
'sort_order': 8,
|
||||
'estimated_duration_minutes': 30,
|
||||
'required_permissions': ['general_operations'],
|
||||
'affects_data': [],
|
||||
'risk_level': 'medium'
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
]
|
||||
|
||||
# ============================================================================
|
||||
# PhotoSubmission Choices (Shared with EditSubmission)
|
||||
# ============================================================================
|
||||
|
||||
# PhotoSubmission uses the same STATUS_CHOICES as EditSubmission
|
||||
PHOTO_SUBMISSION_STATUSES = EDIT_SUBMISSION_STATUSES
|
||||
|
||||
# ============================================================================
|
||||
# Choice Registration
|
||||
# ============================================================================
|
||||
|
||||
# Register all choice groups with the global registry
|
||||
register_choices("edit_submission_statuses", EDIT_SUBMISSION_STATUSES, "moderation", "Edit submission status options")
|
||||
register_choices("submission_types", SUBMISSION_TYPES, "moderation", "Submission type classifications")
|
||||
register_choices("moderation_report_statuses", MODERATION_REPORT_STATUSES, "moderation", "Moderation report status options")
|
||||
register_choices("priority_levels", PRIORITY_LEVELS, "moderation", "Priority level classifications")
|
||||
register_choices("report_types", REPORT_TYPES, "moderation", "Report type classifications")
|
||||
register_choices("moderation_queue_statuses", MODERATION_QUEUE_STATUSES, "moderation", "Moderation queue status options")
|
||||
register_choices("queue_item_types", QUEUE_ITEM_TYPES, "moderation", "Queue item type classifications")
|
||||
register_choices("moderation_action_types", MODERATION_ACTION_TYPES, "moderation", "Moderation action type classifications")
|
||||
register_choices("bulk_operation_statuses", BULK_OPERATION_STATUSES, "moderation", "Bulk operation status options")
|
||||
register_choices("bulk_operation_types", BULK_OPERATION_TYPES, "moderation", "Bulk operation type classifications")
|
||||
register_choices("photo_submission_statuses", PHOTO_SUBMISSION_STATUSES, "moderation", "Photo submission status options")
|
||||
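On the UI side, the css_class and label metadata registered above can render status badges; a minimal sketch with the same attribute-access assumption.

from django.utils.html import format_html


def status_badge(value: str):
    # Falls back to the raw value when the status is not registered.
    for choice in EDIT_SUBMISSION_STATUSES:
        if choice.value == value:
            return format_html(
                '<span class="{}">{}</span>',
                choice.metadata["css_class"],
                choice.label,
            )
    return value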
File diff suppressed because it is too large
@@ -1,692 +0,0 @@
|
||||
"""
|
||||
Moderation Models
|
||||
|
||||
This module contains models for the ThrillWiki moderation system, including:
|
||||
- EditSubmission: Original content submission and approval workflow
|
||||
- ModerationReport: User reports for content moderation
|
||||
- ModerationQueue: Workflow management for moderation tasks
|
||||
- ModerationAction: Actions taken against users/content
|
||||
- BulkOperation: Administrative bulk operations
|
||||
|
||||
All models use pghistory for change tracking and TrackedModel base class.
|
||||
"""
|
||||
|
||||
from typing import Any, Dict, Optional, Union
|
||||
from django.db import models
|
||||
from django.contrib.contenttypes.fields import GenericForeignKey
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.conf import settings
|
||||
from django.utils import timezone
|
||||
from django.core.exceptions import ObjectDoesNotExist, FieldDoesNotExist
|
||||
from django.contrib.auth.base_user import AbstractBaseUser
|
||||
from django.contrib.auth.models import AnonymousUser
|
||||
from datetime import timedelta
|
||||
import pghistory
|
||||
from apps.core.history import TrackedModel
|
||||
from apps.core.choices.fields import RichChoiceField
|
||||
|
||||
UserType = Union[AbstractBaseUser, AnonymousUser]
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Original EditSubmission Model (Preserved)
|
||||
# ============================================================================
|
||||
|
||||
@pghistory.track() # Track all changes by default
|
||||
class EditSubmission(TrackedModel):
|
||||
|
||||
# Who submitted the edit
|
||||
user = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
related_name="edit_submissions",
|
||||
)
|
||||
|
||||
# What is being edited (Park or Ride)
|
||||
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
|
||||
object_id = models.PositiveIntegerField(
|
||||
null=True, blank=True
|
||||
) # Null for new objects
|
||||
content_object = GenericForeignKey("content_type", "object_id")
|
||||
|
||||
# Type of submission
|
||||
submission_type = RichChoiceField(
|
||||
choice_group="submission_types",
|
||||
domain="moderation",
|
||||
max_length=10,
|
||||
default="EDIT"
|
||||
)
|
||||
|
||||
# The actual changes/data
|
||||
changes = models.JSONField(
|
||||
help_text="JSON representation of the changes or new object data"
|
||||
)
|
||||
|
||||
# Moderator's edited version of changes before approval
|
||||
moderator_changes = models.JSONField(
|
||||
null=True,
|
||||
blank=True,
|
||||
help_text="Moderator's edited version of the changes before approval",
|
||||
)
|
||||
|
||||
# Metadata
|
||||
reason = models.TextField(help_text="Why this edit/addition is needed")
|
||||
source = models.TextField(
|
||||
blank=True, help_text="Source of information (if applicable)"
|
||||
)
|
||||
status = RichChoiceField(
|
||||
choice_group="edit_submission_statuses",
|
||||
domain="moderation",
|
||||
max_length=20,
|
||||
default="PENDING"
|
||||
)
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
|
||||
# Review details
|
||||
handled_by = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name="handled_submissions",
|
||||
)
|
||||
handled_at = models.DateTimeField(null=True, blank=True)
|
||||
notes = models.TextField(
|
||||
blank=True, help_text="Notes from the moderator about this submission"
|
||||
)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
ordering = ["-created_at"]
|
||||
indexes = [
|
||||
models.Index(fields=["content_type", "object_id"]),
|
||||
models.Index(fields=["status"]),
|
||||
]
|
||||
|
||||
def __str__(self) -> str:
|
||||
action = "creation" if self.submission_type == "CREATE" else "edit"
|
||||
if model_class := self.content_type.model_class():
|
||||
target = self.content_object or model_class.__name__
|
||||
else:
|
||||
target = "Unknown"
|
||||
return f"{action} by {self.user.username} on {target}"
|
||||
|
||||
    def _resolve_foreign_keys(self, data: Dict[str, Any]) -> Dict[str, Any]:
        """Convert foreign key IDs to model instances"""
        if not (model_class := self.content_type.model_class()):
            raise ValueError("Could not resolve model class")

        resolved_data = data.copy()

        for field_name, value in data.items():
            try:
                field = model_class._meta.get_field(field_name)
                if isinstance(field, models.ForeignKey) and value is not None:
                    try:
                        related_obj = field.related_model.objects.get(pk=value)  # type: ignore
                        resolved_data[field_name] = related_obj
                    except ObjectDoesNotExist:
                        raise ValueError(
                            f"Related object {field.related_model.__name__} with pk={value} does not exist"  # type: ignore
                        )
            except FieldDoesNotExist:
                # Field doesn't exist on model, skip it
                continue

        return resolved_data

    def _get_final_changes(self) -> Dict[str, Any]:
        """Get the final changes to apply (moderator changes if available, otherwise original changes)"""
        return self.moderator_changes or self.changes

    def approve(self, moderator: UserType) -> Optional[models.Model]:
        """
        Approve this submission and apply the changes.

        Args:
            moderator: The user approving the submission

        Returns:
            The created or updated model instance

        Raises:
            ValueError: If submission cannot be approved
            ValidationError: If the data is invalid
        """
        if self.status != "PENDING":
            raise ValueError(f"Cannot approve submission with status {self.status}")

        model_class = self.content_type.model_class()
        if not model_class:
            raise ValueError("Could not resolve model class")

        final_changes = self._get_final_changes()
        resolved_changes = self._resolve_foreign_keys(final_changes)

        try:
            if self.submission_type == "CREATE":
                # Create new object
                obj = model_class(**resolved_changes)
                obj.full_clean()
                obj.save()
            else:
                # Update existing object
                if not self.content_object:
                    raise ValueError("Cannot update: content object not found")

                obj = self.content_object
                for field_name, value in resolved_changes.items():
                    if hasattr(obj, field_name):
                        setattr(obj, field_name, value)

                obj.full_clean()
                obj.save()

            # Mark submission as approved
            self.status = "APPROVED"
            self.handled_by = moderator
            self.handled_at = timezone.now()
            self.save()

            return obj

        except Exception as e:
            # Mark as rejected on any error
            self.status = "REJECTED"
            self.handled_by = moderator
            self.handled_at = timezone.now()
            self.notes = f"Approval failed: {str(e)}"
            self.save()
            raise

    def reject(self, moderator: UserType, reason: str) -> None:
        """
        Reject this submission.

        Args:
            moderator: The user rejecting the submission
            reason: Reason for rejection
        """
        if self.status != "PENDING":
            raise ValueError(f"Cannot reject submission with status {self.status}")

        self.status = "REJECTED"
        self.handled_by = moderator
        self.handled_at = timezone.now()
        self.notes = f"Rejected: {reason}"
        self.save()

    def escalate(self, moderator: UserType, reason: str) -> None:
        """
        Escalate this submission for higher-level review.

        Args:
            moderator: The user escalating the submission
            reason: Reason for escalation
        """
        if self.status != "PENDING":
            raise ValueError(f"Cannot escalate submission with status {self.status}")

        self.status = "ESCALATED"
        self.handled_by = moderator
        self.handled_at = timezone.now()
        self.notes = f"Escalated: {reason}"
        self.save()

    @property
    def submitted_by(self):
        """Alias for user field to maintain compatibility"""
        return self.user

    @property
    def submitted_at(self):
        """Alias for created_at field to maintain compatibility"""
        return self.created_at

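# Illustrative sketch (not part of the original file): a typical EditSubmission
# round trip as a calling view or service might perform it. The Park instance,
# request user, and moderator are assumed to exist; only approve()/reject()
# above are real APIs of this model.
#
#   submission = EditSubmission.objects.create(
#       user=request.user,
#       content_type=ContentType.objects.get_for_model(Park),
#       object_id=park.pk,
#       submission_type="EDIT",
#       changes={"name": "Corrected Park Name"},
#       reason="Name is misspelled on the live page",
#   )
#   submission.approve(moderator)   # applies changes, status -> "APPROVED"
#   # or: submission.reject(moderator, "No reliable source provided")
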
# ============================================================================
# New Moderation System Models
# ============================================================================

@pghistory.track()
class ModerationReport(TrackedModel):
    """
    Model for tracking user reports about content, users, or behavior.

    This handles the initial reporting phase where users flag content
    or behavior that needs moderator attention.
    """

    # Report details
    report_type = RichChoiceField(
        choice_group="report_types",
        domain="moderation",
        max_length=50
    )
    status = RichChoiceField(
        choice_group="moderation_report_statuses",
        domain="moderation",
        max_length=20,
        default='PENDING'
    )
    priority = RichChoiceField(
        choice_group="priority_levels",
        domain="moderation",
        max_length=10,
        default='MEDIUM'
    )

    # What is being reported
    reported_entity_type = models.CharField(
        max_length=50, help_text="Type of entity being reported (park, ride, user, etc.)")
    reported_entity_id = models.PositiveIntegerField(
        help_text="ID of the entity being reported")
    content_type = models.ForeignKey(
        ContentType, on_delete=models.CASCADE, null=True, blank=True)

    # Report content
    reason = models.CharField(max_length=200, help_text="Brief reason for the report")
    description = models.TextField(help_text="Detailed description of the issue")
    evidence_urls = models.JSONField(
        default=list, blank=True, help_text="URLs to evidence (screenshots, etc.)")

    # Users involved
    reported_by = models.ForeignKey(
        settings.AUTH_USER_MODEL,
        on_delete=models.CASCADE,
        related_name='moderation_reports_made'
    )
    assigned_moderator = models.ForeignKey(
        settings.AUTH_USER_MODEL,
        on_delete=models.SET_NULL,
        null=True,
        blank=True,
        related_name='assigned_moderation_reports'
    )

    # Resolution
    resolution_action = models.CharField(
        max_length=100, blank=True, help_text="Action taken to resolve")
    resolution_notes = models.TextField(
        blank=True, help_text="Notes about the resolution")
    resolved_at = models.DateTimeField(null=True, blank=True)

    # Timestamps
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta(TrackedModel.Meta):
        ordering = ['-created_at']
        indexes = [
            models.Index(fields=['status', 'priority']),
            models.Index(fields=['reported_by']),
            models.Index(fields=['assigned_moderator']),
            models.Index(fields=['created_at']),
        ]

    def __str__(self):
        return f"{self.get_report_type_display()} report by {self.reported_by.username}"  # type: ignore

@pghistory.track()
|
||||
class ModerationQueue(TrackedModel):
|
||||
"""
|
||||
Model for managing moderation workflow and task assignment.
|
||||
|
||||
This represents items in the moderation queue that need attention,
|
||||
separate from the initial reports.
|
||||
"""
|
||||
|
||||
# Queue item details
|
||||
item_type = RichChoiceField(
|
||||
choice_group="queue_item_types",
|
||||
domain="moderation",
|
||||
max_length=50
|
||||
)
|
||||
status = RichChoiceField(
|
||||
choice_group="moderation_queue_statuses",
|
||||
domain="moderation",
|
||||
max_length=20,
|
||||
default='PENDING'
|
||||
)
|
||||
priority = RichChoiceField(
|
||||
choice_group="priority_levels",
|
||||
domain="moderation",
|
||||
max_length=10,
|
||||
default='MEDIUM'
|
||||
)
|
||||
|
||||
title = models.CharField(max_length=200, help_text="Brief title for the queue item")
|
||||
description = models.TextField(
|
||||
help_text="Detailed description of what needs to be done")
|
||||
|
||||
# What entity this relates to
|
||||
entity_type = models.CharField(
|
||||
max_length=50, blank=True, help_text="Type of entity (park, ride, user, etc.)")
|
||||
entity_id = models.PositiveIntegerField(
|
||||
null=True, blank=True, help_text="ID of the related entity")
|
||||
entity_preview = models.JSONField(
|
||||
default=dict, blank=True, help_text="Preview data for the entity")
|
||||
content_type = models.ForeignKey(
|
||||
ContentType, on_delete=models.CASCADE, null=True, blank=True)
|
||||
|
||||
# Assignment and timing
|
||||
assigned_to = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name='assigned_queue_items'
|
||||
)
|
||||
assigned_at = models.DateTimeField(null=True, blank=True)
|
||||
estimated_review_time = models.PositiveIntegerField(
|
||||
default=30, help_text="Estimated time in minutes")
|
||||
|
||||
# Metadata
|
||||
flagged_by = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name='flagged_queue_items'
|
||||
)
|
||||
tags = models.JSONField(default=list, blank=True,
|
||||
help_text="Tags for categorization")
|
||||
|
||||
# Related objects
|
||||
related_report = models.ForeignKey(
|
||||
ModerationReport,
|
||||
on_delete=models.CASCADE,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name='queue_items'
|
||||
)
|
||||
|
||||
# Timestamps
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
updated_at = models.DateTimeField(auto_now=True)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
ordering = ['priority', 'created_at']
|
||||
indexes = [
|
||||
models.Index(fields=['status', 'priority']),
|
||||
models.Index(fields=['assigned_to']),
|
||||
models.Index(fields=['created_at']),
|
||||
]
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.get_item_type_display()}: {self.title}" # type: ignore
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class ModerationAction(TrackedModel):
|
||||
"""
|
||||
Model for tracking actions taken against users or content.
|
||||
|
||||
This records what actions moderators have taken, including
|
||||
warnings, suspensions, content removal, etc.
|
||||
"""
|
||||
|
||||
# Action details
|
||||
action_type = RichChoiceField(
|
||||
choice_group="moderation_action_types",
|
||||
domain="moderation",
|
||||
max_length=50
|
||||
)
|
||||
reason = models.CharField(max_length=200, help_text="Brief reason for the action")
|
||||
details = models.TextField(help_text="Detailed explanation of the action")
|
||||
|
||||
# Duration (for temporary actions)
|
||||
duration_hours = models.PositiveIntegerField(
|
||||
null=True,
|
||||
blank=True,
|
||||
help_text="Duration in hours for temporary actions"
|
||||
)
|
||||
expires_at = models.DateTimeField(
|
||||
null=True, blank=True, help_text="When this action expires")
|
||||
is_active = models.BooleanField(
|
||||
default=True, help_text="Whether this action is currently active")
|
||||
|
||||
# Users involved
|
||||
moderator = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='moderation_actions_taken'
|
||||
)
|
||||
target_user = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='moderation_actions_received'
|
||||
)
|
||||
|
||||
# Related objects
|
||||
related_report = models.ForeignKey(
|
||||
ModerationReport,
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name='actions_taken'
|
||||
)
|
||||
|
||||
# Timestamps
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
updated_at = models.DateTimeField(auto_now=True)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
ordering = ['-created_at']
|
||||
indexes = [
|
||||
models.Index(fields=['target_user', 'is_active']),
|
||||
models.Index(fields=['moderator']),
|
||||
models.Index(fields=['expires_at']),
|
||||
models.Index(fields=['created_at']),
|
||||
]
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.get_action_type_display()} against {self.target_user.username} by {self.moderator.username}" # type: ignore
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
# Set expiration time if duration is provided
|
||||
if self.duration_hours and not self.expires_at:
|
||||
self.expires_at = timezone.now() + timedelta(hours=self.duration_hours)
|
||||
super().save(*args, **kwargs)
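# Worked example (sketch; the concrete field values and user instances are
# assumed, not taken from the original code): a 24-hour suspension gets its
# expiry stamped automatically by the save() override above.
#
#   suspension = ModerationAction.objects.create(
#       action_type="USER_SUSPENSION",
#       reason="Repeated spam",
#       details="Second offence within 30 days",
#       duration_hours=24,
#       moderator=moderator_user,
#       target_user=offending_user,
#   )
#   # suspension.expires_at is now roughly timezone.now() + timedelta(hours=24)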
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class BulkOperation(TrackedModel):
|
||||
"""
|
||||
Model for tracking bulk administrative operations.
|
||||
|
||||
This handles large-scale operations like bulk updates,
|
||||
imports, exports, or mass moderation actions.
|
||||
"""
|
||||
|
||||
# Operation details
|
||||
operation_type = RichChoiceField(
|
||||
choice_group="bulk_operation_types",
|
||||
domain="moderation",
|
||||
max_length=50
|
||||
)
|
||||
status = RichChoiceField(
|
||||
choice_group="bulk_operation_statuses",
|
||||
domain="moderation",
|
||||
max_length=20,
|
||||
default='PENDING'
|
||||
)
|
||||
priority = RichChoiceField(
|
||||
choice_group="priority_levels",
|
||||
domain="moderation",
|
||||
max_length=10,
|
||||
default='MEDIUM'
|
||||
)
|
||||
description = models.TextField(help_text="Description of what this operation does")
|
||||
|
||||
# Operation parameters and results
|
||||
parameters = models.JSONField(
|
||||
default=dict, help_text="Parameters for the operation")
|
||||
results = models.JSONField(default=dict, blank=True,
|
||||
help_text="Results and output from the operation")
|
||||
|
||||
# Progress tracking
|
||||
total_items = models.PositiveIntegerField(
|
||||
default=0, help_text="Total number of items to process")
|
||||
processed_items = models.PositiveIntegerField(
|
||||
default=0, help_text="Number of items processed")
|
||||
failed_items = models.PositiveIntegerField(
|
||||
default=0, help_text="Number of items that failed")
|
||||
|
||||
# Timing
|
||||
estimated_duration_minutes = models.PositiveIntegerField(
|
||||
null=True,
|
||||
blank=True,
|
||||
help_text="Estimated duration in minutes"
|
||||
)
|
||||
schedule_for = models.DateTimeField(
|
||||
null=True, blank=True, help_text="When to run this operation")
|
||||
|
||||
# Control
|
||||
can_cancel = models.BooleanField(
|
||||
default=True, help_text="Whether this operation can be cancelled")
|
||||
|
||||
# User who created the operation
|
||||
created_by = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='bulk_operations_created'
|
||||
)
|
||||
|
||||
# Timestamps
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
started_at = models.DateTimeField(null=True, blank=True)
|
||||
completed_at = models.DateTimeField(null=True, blank=True)
|
||||
updated_at = models.DateTimeField(auto_now=True)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
ordering = ['-created_at']
|
||||
indexes = [
|
||||
models.Index(fields=['status', 'priority']),
|
||||
models.Index(fields=['created_by']),
|
||||
models.Index(fields=['schedule_for']),
|
||||
models.Index(fields=['created_at']),
|
||||
]
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.get_operation_type_display()}: {self.description[:50]}" # type: ignore
|
||||
|
||||
@property
|
||||
def progress_percentage(self):
|
||||
"""Calculate progress percentage."""
|
||||
if self.total_items == 0:
|
||||
return 0.0
|
||||
return round((self.processed_items / self.total_items) * 100, 2)
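# e.g. processed_items=150 and total_items=400 gives progress_percentage == 37.5;
# a zero total_items short-circuits to 0.0 to avoid division by zero.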
|
||||
|
||||
|
||||
@pghistory.track() # Track all changes by default
|
||||
class PhotoSubmission(TrackedModel):
|
||||
|
||||
# Who submitted the photo
|
||||
user = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.CASCADE,
|
||||
related_name="photo_submissions",
|
||||
)
|
||||
|
||||
# What the photo is for (Park or Ride)
|
||||
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
|
||||
object_id = models.PositiveIntegerField()
|
||||
content_object = GenericForeignKey("content_type", "object_id")
|
||||
|
||||
# The photo itself
|
||||
photo = models.ForeignKey(
|
||||
'django_cloudflareimages_toolkit.CloudflareImage',
|
||||
on_delete=models.CASCADE,
|
||||
help_text="Photo submission stored on Cloudflare Images"
|
||||
)
|
||||
caption = models.CharField(max_length=255, blank=True)
|
||||
date_taken = models.DateField(null=True, blank=True)
|
||||
|
||||
# Metadata
|
||||
status = RichChoiceField(
|
||||
choice_group="photo_submission_statuses",
|
||||
domain="moderation",
|
||||
max_length=20,
|
||||
default="PENDING"
|
||||
)
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
|
||||
# Review details
|
||||
handled_by = models.ForeignKey(
|
||||
settings.AUTH_USER_MODEL,
|
||||
on_delete=models.SET_NULL,
|
||||
null=True,
|
||||
blank=True,
|
||||
related_name="handled_photos",
|
||||
)
|
||||
handled_at = models.DateTimeField(null=True, blank=True)
|
||||
notes = models.TextField(
|
||||
blank=True,
|
||||
help_text="Notes from the moderator about this photo submission",
|
||||
)
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
ordering = ["-created_at"]
|
||||
indexes = [
|
||||
models.Index(fields=["content_type", "object_id"]),
|
||||
models.Index(fields=["status"]),
|
||||
]
|
||||
|
||||
def __str__(self) -> str:
|
||||
return f"Photo submission by {self.user.username} for {self.content_object}"
|
||||
|
||||
def approve(self, moderator: UserType, notes: str = "") -> None:
|
||||
"""Approve the photo submission"""
|
||||
from apps.parks.models.media import ParkPhoto
|
||||
from apps.rides.models.media import RidePhoto
|
||||
|
||||
self.status = "APPROVED"
|
||||
self.handled_by = moderator # type: ignore
|
||||
self.handled_at = timezone.now()
|
||||
self.notes = notes
|
||||
|
||||
# Determine the correct photo model based on the content type
|
||||
model_class = self.content_type.model_class()
|
||||
if model_class.__name__ == "Park":
|
||||
PhotoModel = ParkPhoto
|
||||
elif model_class.__name__ == "Ride":
|
||||
PhotoModel = RidePhoto
|
||||
else:
|
||||
raise ValueError(f"Unsupported content type: {model_class.__name__}")
|
||||
|
||||
# Create the approved photo
|
||||
PhotoModel.objects.create(
|
||||
uploaded_by=self.user,
|
||||
content_object=self.content_object,
|
||||
image=self.photo,
|
||||
caption=self.caption,
|
||||
is_approved=True,
|
||||
)
|
||||
|
||||
self.save()
|
||||
|
||||
def reject(self, moderator: UserType, notes: str) -> None:
|
||||
"""Reject the photo submission"""
|
||||
self.status = "REJECTED"
|
||||
self.handled_by = moderator # type: ignore
|
||||
self.handled_at = timezone.now()
|
||||
self.notes = notes
|
||||
self.save()
|
||||
|
||||
def auto_approve(self) -> None:
|
||||
"""Auto - approve submissions from moderators"""
|
||||
# Get user role safely
|
||||
user_role = getattr(self.user, "role", None)
|
||||
|
||||
# If user is moderator or above, auto-approve
|
||||
if user_role in ["MODERATOR", "ADMIN", "SUPERUSER"]:
|
||||
self.approve(self.user)
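# Sketch of intended use (assumed calling code, not part of this model): a
# submission view can call submission.auto_approve() right after creation so
# uploads from MODERATOR/ADMIN/SUPERUSER accounts go live immediately while
# everyone else's stay PENDING for review.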
|
||||
|
||||
def escalate(self, moderator: UserType, notes: str = "") -> None:
|
||||
"""Escalate the photo submission to admin"""
|
||||
self.status = "ESCALATED"
|
||||
self.handled_by = moderator # type: ignore
|
||||
self.handled_at = timezone.now()
|
||||
self.notes = notes
|
||||
self.save()
|
||||
@@ -1,349 +0,0 @@
|
||||
from django.test import TestCase, Client
|
||||
from django.contrib.auth import get_user_model
|
||||
from django.contrib.auth.models import AnonymousUser
|
||||
from django.contrib.contenttypes.models import ContentType
|
||||
from django.core.files.uploadedfile import SimpleUploadedFile
|
||||
from django.http import JsonResponse, HttpRequest
|
||||
from .models import EditSubmission
|
||||
from .mixins import (
|
||||
EditSubmissionMixin,
|
||||
PhotoSubmissionMixin,
|
||||
ModeratorRequiredMixin,
|
||||
AdminRequiredMixin,
|
||||
InlineEditMixin,
|
||||
HistoryMixin,
|
||||
)
|
||||
from apps.parks.models import Company as Operator
|
||||
from django.views.generic import DetailView
|
||||
from django.test import RequestFactory
|
||||
import json
|
||||
|
||||
User = get_user_model()
|
||||
|
||||
|
||||
class TestView(
|
||||
EditSubmissionMixin,
|
||||
PhotoSubmissionMixin,
|
||||
InlineEditMixin,
|
||||
HistoryMixin,
|
||||
DetailView,
|
||||
):
|
||||
model = Operator
|
||||
template_name = "test.html"
|
||||
pk_url_kwarg = "pk"
|
||||
slug_url_kwarg = "slug"
|
||||
|
||||
def get_context_data(self, **kwargs):
|
||||
if not hasattr(self, "object"):
|
||||
self.object = self.get_object()
|
||||
return super().get_context_data(**kwargs)
|
||||
|
||||
def setup(self, request: HttpRequest, *args, **kwargs):
|
||||
super().setup(request, *args, **kwargs)
|
||||
self.request = request
|
||||
|
||||
|
||||
class ModerationMixinsTests(TestCase):
|
||||
def setUp(self):
|
||||
self.client = Client()
|
||||
self.factory = RequestFactory()
|
||||
|
||||
# Create users with different roles
|
||||
self.user = User.objects.create_user(
|
||||
username="testuser",
|
||||
email="test@example.com",
|
||||
password="testpass123",
|
||||
)
|
||||
self.moderator = User.objects.create_user(
|
||||
username="moderator",
|
||||
email="moderator@example.com",
|
||||
password="modpass123",
|
||||
role="MODERATOR",
|
||||
)
|
||||
self.admin = User.objects.create_user(
|
||||
username="admin",
|
||||
email="admin@example.com",
|
||||
password="adminpass123",
|
||||
role="ADMIN",
|
||||
)
|
||||
|
||||
# Create test company
|
||||
self.operator = Operator.objects.create(
|
||||
name="Test Operator",
|
||||
website="http://example.com",
|
||||
description="Test Description",
|
||||
)
|
||||
|
||||
def test_edit_submission_mixin_unauthenticated(self):
|
||||
"""Test edit submission when not logged in"""
|
||||
view = TestView()
|
||||
request = self.factory.post(f"/test/{self.operator.pk}/")
|
||||
request.user = AnonymousUser()
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
response = view.handle_edit_submission(request, {})
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
def test_edit_submission_mixin_no_changes(self):
|
||||
"""Test edit submission with no changes"""
|
||||
view = TestView()
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/",
|
||||
data=json.dumps({}),
|
||||
content_type="application/json",
|
||||
)
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
response = view.post(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 400)
|
||||
|
||||
def test_edit_submission_mixin_invalid_json(self):
|
||||
"""Test edit submission with invalid JSON"""
|
||||
view = TestView()
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/",
|
||||
data="invalid json",
|
||||
content_type="application/json",
|
||||
)
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
response = view.post(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 400)
|
||||
|
||||
def test_edit_submission_mixin_regular_user(self):
|
||||
"""Test edit submission as regular user"""
|
||||
view = TestView()
|
||||
request = self.factory.post(f"/test/{self.operator.pk}/")
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
changes = {"name": "New Name"}
|
||||
response = view.handle_edit_submission(
|
||||
request, changes, "Test reason", "Test source"
|
||||
)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.content.decode())
|
||||
self.assertFalse(data["auto_approved"])
|
||||
|
||||
def test_edit_submission_mixin_moderator(self):
|
||||
"""Test edit submission as moderator"""
|
||||
view = TestView()
|
||||
request = self.factory.post(f"/test/{self.operator.pk}/")
|
||||
request.user = self.moderator
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
changes = {"name": "New Name"}
|
||||
response = view.handle_edit_submission(
|
||||
request, changes, "Test reason", "Test source"
|
||||
)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.content.decode())
|
||||
self.assertTrue(data["auto_approved"])
|
||||
|
||||
def test_photo_submission_mixin_unauthenticated(self):
|
||||
"""Test photo submission when not logged in"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/", data={}, format="multipart"
|
||||
)
|
||||
request.user = AnonymousUser()
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
response = view.handle_photo_submission(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 403)
|
||||
|
||||
def test_photo_submission_mixin_no_photo(self):
|
||||
"""Test photo submission with no photo"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/", data={}, format="multipart"
|
||||
)
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
response = view.handle_photo_submission(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 400)
|
||||
|
||||
def test_photo_submission_mixin_regular_user(self):
|
||||
"""Test photo submission as regular user"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
|
||||
# Create a test photo file
|
||||
photo = SimpleUploadedFile(
|
||||
"test.gif",
|
||||
b"GIF87a\x01\x00\x01\x00\x80\x01\x00\x00\x00\x00ccc,\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;",
|
||||
content_type="image/gif",
|
||||
)
|
||||
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/",
|
||||
data={
|
||||
"photo": photo,
|
||||
"caption": "Test Photo",
|
||||
"date_taken": "2024-01-01",
|
||||
},
|
||||
format="multipart",
|
||||
)
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
|
||||
response = view.handle_photo_submission(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.content.decode())
|
||||
self.assertFalse(data["auto_approved"])
|
||||
|
||||
def test_photo_submission_mixin_moderator(self):
|
||||
"""Test photo submission as moderator"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
|
||||
# Create a test photo file
|
||||
photo = SimpleUploadedFile(
|
||||
"test.gif",
|
||||
b"GIF87a\x01\x00\x01\x00\x80\x01\x00\x00\x00\x00ccc,\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;",
|
||||
content_type="image/gif",
|
||||
)
|
||||
|
||||
request = self.factory.post(
|
||||
f"/test/{self.operator.pk}/",
|
||||
data={
|
||||
"photo": photo,
|
||||
"caption": "Test Photo",
|
||||
"date_taken": "2024-01-01",
|
||||
},
|
||||
format="multipart",
|
||||
)
|
||||
request.user = self.moderator
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
|
||||
response = view.handle_photo_submission(request)
|
||||
self.assertIsInstance(response, JsonResponse)
|
||||
self.assertEqual(response.status_code, 200)
|
||||
data = json.loads(response.content.decode())
|
||||
self.assertTrue(data["auto_approved"])
|
||||
|
||||
def test_moderator_required_mixin(self):
|
||||
"""Test moderator required mixin"""
|
||||
|
||||
class TestModeratorView(ModeratorRequiredMixin):
|
||||
pass
|
||||
|
||||
view = TestModeratorView()
|
||||
|
||||
# Test unauthenticated user
|
||||
request = self.factory.get("/test/")
|
||||
request.user = AnonymousUser()
|
||||
view.request = request
|
||||
self.assertFalse(view.test_func())
|
||||
|
||||
# Test regular user
|
||||
request.user = self.user
|
||||
view.request = request
|
||||
self.assertFalse(view.test_func())
|
||||
|
||||
# Test moderator
|
||||
request.user = self.moderator
|
||||
view.request = request
|
||||
self.assertTrue(view.test_func())
|
||||
|
||||
# Test admin
|
||||
request.user = self.admin
|
||||
view.request = request
|
||||
self.assertTrue(view.test_func())
|
||||
|
||||
def test_admin_required_mixin(self):
|
||||
"""Test admin required mixin"""
|
||||
|
||||
class TestAdminView(AdminRequiredMixin):
|
||||
pass
|
||||
|
||||
view = TestAdminView()
|
||||
|
||||
# Test unauthenticated user
|
||||
request = self.factory.get("/test/")
|
||||
request.user = AnonymousUser()
|
||||
view.request = request
|
||||
self.assertFalse(view.test_func())
|
||||
|
||||
# Test regular user
|
||||
request.user = self.user
|
||||
view.request = request
|
||||
self.assertFalse(view.test_func())
|
||||
|
||||
# Test moderator
|
||||
request.user = self.moderator
|
||||
view.request = request
|
||||
self.assertFalse(view.test_func())
|
||||
|
||||
# Test admin
|
||||
request.user = self.admin
|
||||
view.request = request
|
||||
self.assertTrue(view.test_func())
|
||||
|
||||
def test_inline_edit_mixin(self):
|
||||
"""Test inline edit mixin"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
|
||||
# Test unauthenticated user
|
||||
request = self.factory.get(f"/test/{self.operator.pk}/")
|
||||
request.user = AnonymousUser()
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
context = view.get_context_data()
|
||||
self.assertNotIn("can_edit", context)
|
||||
|
||||
# Test regular user
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
context = view.get_context_data()
|
||||
self.assertTrue(context["can_edit"])
|
||||
self.assertFalse(context["can_auto_approve"])
|
||||
|
||||
# Test moderator
|
||||
request.user = self.moderator
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
context = view.get_context_data()
|
||||
self.assertTrue(context["can_edit"])
|
||||
self.assertTrue(context["can_auto_approve"])
|
||||
|
||||
def test_history_mixin(self):
|
||||
"""Test history mixin"""
|
||||
view = TestView()
|
||||
view.kwargs = {"pk": self.operator.pk}
|
||||
view.object = self.operator
|
||||
request = self.factory.get(f"/test/{self.operator.pk}/")
|
||||
request.user = self.user
|
||||
view.setup(request, pk=self.operator.pk)
|
||||
|
||||
# Create some edit submissions
|
||||
EditSubmission.objects.create(
|
||||
user=self.user,
|
||||
content_type=ContentType.objects.get_for_model(Operator),
|
||||
object_id=getattr(self.operator, "id", None),
|
||||
submission_type="EDIT",
|
||||
changes={"name": "New Name"},
|
||||
status="APPROVED",
|
||||
)
|
||||
|
||||
context = view.get_context_data()
|
||||
self.assertIn("history", context)
|
||||
self.assertIn("edit_submissions", context)
|
||||
self.assertEqual(len(context["edit_submissions"]), 1)
|
||||
@@ -1,87 +0,0 @@
|
||||
"""
|
||||
Moderation URLs
|
||||
|
||||
This module defines URL patterns for the moderation API endpoints.
|
||||
All endpoints are nested under /api/moderation/ and provide comprehensive
|
||||
moderation functionality including reports, queue management, actions, and bulk operations.
|
||||
"""
|
||||
|
||||
from django.urls import path, include
|
||||
from rest_framework.routers import DefaultRouter
|
||||
|
||||
from .views import (
|
||||
ModerationReportViewSet,
|
||||
ModerationQueueViewSet,
|
||||
ModerationActionViewSet,
|
||||
BulkOperationViewSet,
|
||||
UserModerationViewSet,
|
||||
)
|
||||
|
||||
# Create router and register viewsets
|
||||
router = DefaultRouter()
|
||||
router.register(r"reports", ModerationReportViewSet, basename="moderation-reports")
|
||||
router.register(r"queue", ModerationQueueViewSet, basename="moderation-queue")
|
||||
router.register(r"actions", ModerationActionViewSet, basename="moderation-actions")
|
||||
router.register(r"bulk-operations", BulkOperationViewSet, basename="bulk-operations")
|
||||
router.register(r"users", UserModerationViewSet, basename="user-moderation")
|
||||
|
||||
app_name = "moderation"
|
||||
|
||||
urlpatterns = [
|
||||
# Include all router URLs
|
||||
path("", include(router.urls)),
|
||||
]
|
||||
|
||||
# URL patterns generated by the router:
|
||||
#
|
||||
# Moderation Reports:
|
||||
# GET /api/moderation/reports/ - List all reports
|
||||
# POST /api/moderation/reports/ - Create new report
|
||||
# GET /api/moderation/reports/{id}/ - Get specific report
|
||||
# PUT /api/moderation/reports/{id}/ - Update report
|
||||
# PATCH /api/moderation/reports/{id}/ - Partial update report
|
||||
# DELETE /api/moderation/reports/{id}/ - Delete report
|
||||
# POST /api/moderation/reports/{id}/assign/ - Assign report to moderator
|
||||
# POST /api/moderation/reports/{id}/resolve/ - Resolve report
|
||||
# GET /api/moderation/reports/stats/ - Get report statistics
|
||||
#
|
||||
# Moderation Queue:
|
||||
# GET /api/moderation/queue/ - List queue items
|
||||
# POST /api/moderation/queue/ - Create queue item
|
||||
# GET /api/moderation/queue/{id}/ - Get specific queue item
|
||||
# PUT /api/moderation/queue/{id}/ - Update queue item
|
||||
# PATCH /api/moderation/queue/{id}/ - Partial update queue item
|
||||
# DELETE /api/moderation/queue/{id}/ - Delete queue item
|
||||
# POST /api/moderation/queue/{id}/assign/ - Assign queue item
|
||||
# POST /api/moderation/queue/{id}/unassign/ - Unassign queue item
|
||||
# POST /api/moderation/queue/{id}/complete/ - Complete queue item
|
||||
# GET /api/moderation/queue/my_queue/ - Get current user's queue items
|
||||
#
|
||||
# Moderation Actions:
|
||||
# GET /api/moderation/actions/ - List all actions
|
||||
# POST /api/moderation/actions/ - Create new action
|
||||
# GET /api/moderation/actions/{id}/ - Get specific action
|
||||
# PUT /api/moderation/actions/{id}/ - Update action
|
||||
# PATCH /api/moderation/actions/{id}/ - Partial update action
|
||||
# DELETE /api/moderation/actions/{id}/ - Delete action
|
||||
# POST /api/moderation/actions/{id}/deactivate/ - Deactivate action
|
||||
# GET /api/moderation/actions/active/ - Get active actions
|
||||
# GET /api/moderation/actions/expired/ - Get expired actions
|
||||
#
|
||||
# Bulk Operations:
|
||||
# GET /api/moderation/bulk-operations/ - List bulk operations
|
||||
# POST /api/moderation/bulk-operations/ - Create bulk operation
|
||||
# GET /api/moderation/bulk-operations/{id}/ - Get specific operation
|
||||
# PUT /api/moderation/bulk-operations/{id}/ - Update operation
|
||||
# PATCH /api/moderation/bulk-operations/{id}/ - Partial update operation
|
||||
# DELETE /api/moderation/bulk-operations/{id}/ - Delete operation
|
||||
# POST /api/moderation/bulk-operations/{id}/cancel/ - Cancel operation
|
||||
# POST /api/moderation/bulk-operations/{id}/retry/ - Retry failed operation
|
||||
# GET /api/moderation/bulk-operations/{id}/logs/ - Get operation logs
|
||||
# GET /api/moderation/bulk-operations/running/ - Get running operations
|
||||
#
|
||||
# User Moderation:
|
||||
# GET /api/moderation/users/{id}/ - Get user moderation profile
|
||||
# POST /api/moderation/users/{id}/moderate/ - Take action against user
|
||||
# GET /api/moderation/users/search/ - Search users for moderation
|
||||
# GET /api/moderation/users/stats/ - Get user moderation statistics
|
||||
@@ -1,737 +0,0 @@
|
||||
"""
|
||||
Moderation API Views
|
||||
|
||||
This module contains DRF viewsets for the moderation system, including:
|
||||
- ModerationReport views for content reporting
|
||||
- ModerationQueue views for moderation workflow
|
||||
- ModerationAction views for tracking moderation actions
|
||||
- BulkOperation views for administrative bulk operations
|
||||
|
||||
All views include comprehensive permissions, filtering, and pagination.
|
||||
"""
|
||||
|
||||
from rest_framework import viewsets, status, permissions
|
||||
from rest_framework.decorators import action
|
||||
from rest_framework.response import Response
|
||||
from rest_framework.filters import SearchFilter, OrderingFilter
|
||||
from django_filters.rest_framework import DjangoFilterBackend
|
||||
from django.contrib.auth import get_user_model
|
||||
from django.utils import timezone
|
||||
from django.db.models import Q, Count
|
||||
from datetime import timedelta
|
||||
|
||||
from .models import (
|
||||
ModerationReport,
|
||||
ModerationQueue,
|
||||
ModerationAction,
|
||||
BulkOperation,
|
||||
)
|
||||
from .serializers import (
|
||||
ModerationReportSerializer,
|
||||
CreateModerationReportSerializer,
|
||||
UpdateModerationReportSerializer,
|
||||
ModerationQueueSerializer,
|
||||
AssignQueueItemSerializer,
|
||||
CompleteQueueItemSerializer,
|
||||
ModerationActionSerializer,
|
||||
CreateModerationActionSerializer,
|
||||
BulkOperationSerializer,
|
||||
CreateBulkOperationSerializer,
|
||||
UserModerationProfileSerializer,
|
||||
)
|
||||
from .filters import (
|
||||
ModerationReportFilter,
|
||||
ModerationQueueFilter,
|
||||
ModerationActionFilter,
|
||||
BulkOperationFilter,
|
||||
)
|
||||
from .permissions import (
|
||||
IsModeratorOrAdmin,
|
||||
IsAdminOrSuperuser,
|
||||
CanViewModerationData,
|
||||
)
|
||||
|
||||
User = get_user_model()
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Moderation Report ViewSet
|
||||
# ============================================================================
|
||||
|
||||
|
||||
class ModerationReportViewSet(viewsets.ModelViewSet):
|
||||
"""
|
||||
ViewSet for managing moderation reports.
|
||||
|
||||
Provides CRUD operations for moderation reports with comprehensive
|
||||
filtering, search, and permission controls.
|
||||
"""
|
||||
|
||||
queryset = ModerationReport.objects.select_related(
|
||||
"reported_by", "assigned_moderator", "content_type"
|
||||
).all()
|
||||
|
||||
filter_backends = [DjangoFilterBackend, SearchFilter, OrderingFilter]
|
||||
filterset_class = ModerationReportFilter
|
||||
search_fields = ["reason", "description", "resolution_notes"]
|
||||
ordering_fields = ["created_at", "updated_at", "priority", "status"]
|
||||
ordering = ["-created_at"]
|
||||
|
||||
def get_serializer_class(self):
|
||||
"""Return appropriate serializer based on action."""
|
||||
if self.action == "create":
|
||||
return CreateModerationReportSerializer
|
||||
elif self.action in ["update", "partial_update"]:
|
||||
return UpdateModerationReportSerializer
|
||||
return ModerationReportSerializer
|
||||
|
||||
def get_permissions(self):
|
||||
"""Return appropriate permissions based on action."""
|
||||
if self.action == "create":
|
||||
# Any authenticated user can create reports
|
||||
permission_classes = [permissions.IsAuthenticated]
|
||||
elif self.action in ["list", "retrieve"]:
|
||||
# Moderators and above can view reports
|
||||
permission_classes = [CanViewModerationData]
|
||||
else:
|
||||
# Only moderators and above can modify reports
|
||||
permission_classes = [IsModeratorOrAdmin]
|
||||
|
||||
return [permission() for permission in permission_classes]
|
||||
|
||||
def get_queryset(self):
|
||||
"""Filter queryset based on user permissions."""
|
||||
queryset = super().get_queryset()
|
||||
|
||||
# Regular users can only see their own reports
|
||||
if not self.request.user.is_authenticated:
|
||||
return queryset.none()
|
||||
|
||||
user_role = getattr(self.request.user, "role", "USER")
|
||||
if user_role == "USER":
|
||||
queryset = queryset.filter(reported_by=self.request.user)
|
||||
|
||||
return queryset
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def assign(self, request, pk=None):
|
||||
"""Assign a report to a moderator."""
|
||||
report = self.get_object()
|
||||
moderator_id = request.data.get("moderator_id")
|
||||
|
||||
try:
|
||||
moderator = User.objects.get(id=moderator_id)
|
||||
moderator_role = getattr(moderator, "role", "USER")
|
||||
|
||||
if moderator_role not in ["MODERATOR", "ADMIN", "SUPERUSER"]:
|
||||
return Response(
|
||||
{"error": "User must be a moderator, admin, or superuser"},
|
||||
status=status.HTTP_400_BAD_REQUEST,
|
||||
)
|
||||
|
||||
report.assigned_moderator = moderator
|
||||
report.status = "UNDER_REVIEW"
|
||||
report.save()
|
||||
|
||||
serializer = self.get_serializer(report)
|
||||
return Response(serializer.data)
|
||||
|
||||
except User.DoesNotExist:
|
||||
return Response(
|
||||
{"error": "Moderator not found"}, status=status.HTTP_404_NOT_FOUND
|
||||
)
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def resolve(self, request, pk=None):
|
||||
"""Resolve a moderation report."""
|
||||
report = self.get_object()
|
||||
|
||||
resolution_action = request.data.get("resolution_action")
|
||||
resolution_notes = request.data.get("resolution_notes", "")
|
||||
|
||||
if not resolution_action:
|
||||
return Response(
|
||||
{"error": "resolution_action is required"},
|
||||
status=status.HTTP_400_BAD_REQUEST,
|
||||
)
|
||||
|
||||
report.status = "RESOLVED"
|
||||
report.resolution_action = resolution_action
|
||||
report.resolution_notes = resolution_notes
|
||||
report.resolved_at = timezone.now()
|
||||
report.save()
|
||||
|
||||
serializer = self.get_serializer(report)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=False, methods=["get"], permission_classes=[CanViewModerationData])
|
||||
def stats(self, request):
|
||||
"""Get moderation report statistics."""
|
||||
queryset = self.get_queryset()
|
||||
|
||||
# Basic counts
|
||||
total_reports = queryset.count()
|
||||
pending_reports = queryset.filter(status="PENDING").count()
|
||||
resolved_reports = queryset.filter(status="RESOLVED").count()
|
||||
|
||||
# Overdue reports (based on priority SLA)
|
||||
now = timezone.now()
|
||||
overdue_reports = 0
|
||||
|
||||
for report in queryset.filter(status__in=["PENDING", "UNDER_REVIEW"]):
|
||||
sla_hours = {"URGENT": 2, "HIGH": 8, "MEDIUM": 24, "LOW": 72}
|
||||
hours_since_created = (now - report.created_at).total_seconds() / 3600
|
||||
if report.priority in sla_hours:
|
||||
threshold = sla_hours[report.priority]
|
||||
else:
|
||||
raise ValueError(f"Unknown priority level: {report.priority}")
|
||||
if hours_since_created > threshold:
|
||||
overdue_reports += 1
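# Worked example of the SLA check above (values assumed): a HIGH priority
# report created 10 hours ago exceeds its 8-hour threshold and counts as
# overdue; a MEDIUM report of the same age (10 < 24) does not.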
|
||||
|
||||
# Reports by priority and type
|
||||
reports_by_priority = dict(
|
||||
queryset.values_list("priority").annotate(count=Count("id"))
|
||||
)
|
||||
reports_by_type = dict(
|
||||
queryset.values_list("report_type").annotate(count=Count("id"))
|
||||
)
|
||||
|
||||
# Average resolution time
|
||||
resolved_queryset = queryset.filter(
|
||||
status="RESOLVED", resolved_at__isnull=False
|
||||
)
|
||||
|
||||
avg_resolution_time = 0
|
||||
if resolved_queryset.exists():
|
||||
total_time = sum(
|
||||
[
|
||||
(report.resolved_at - report.created_at).total_seconds() / 3600
|
||||
for report in resolved_queryset
|
||||
if report.resolved_at
|
||||
]
|
||||
)
|
||||
avg_resolution_time = total_time / resolved_queryset.count()
|
||||
|
||||
stats_data = {
|
||||
"total_reports": total_reports,
|
||||
"pending_reports": pending_reports,
|
||||
"resolved_reports": resolved_reports,
|
||||
"overdue_reports": overdue_reports,
|
||||
"reports_by_priority": reports_by_priority,
|
||||
"reports_by_type": reports_by_type,
|
||||
"average_resolution_time_hours": round(avg_resolution_time, 2),
|
||||
}
|
||||
|
||||
return Response(stats_data)
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Moderation Queue ViewSet
|
||||
# ============================================================================
|
||||
|
||||
|
||||
class ModerationQueueViewSet(viewsets.ModelViewSet):
|
||||
"""
|
||||
ViewSet for managing moderation queue items.
|
||||
|
||||
Provides workflow management for moderation tasks with assignment,
|
||||
completion, and progress tracking.
|
||||
"""
|
||||
|
||||
queryset = ModerationQueue.objects.select_related(
|
||||
"assigned_to", "related_report", "content_type"
|
||||
).all()
|
||||
|
||||
serializer_class = ModerationQueueSerializer
|
||||
permission_classes = [CanViewModerationData]
|
||||
|
||||
filter_backends = [DjangoFilterBackend, SearchFilter, OrderingFilter]
|
||||
filterset_class = ModerationQueueFilter
|
||||
search_fields = ["title", "description"]
|
||||
ordering_fields = ["created_at", "updated_at", "priority", "status"]
|
||||
ordering = ["-created_at"]
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def assign(self, request, pk=None):
|
||||
"""Assign a queue item to a moderator."""
|
||||
queue_item = self.get_object()
|
||||
serializer = AssignQueueItemSerializer(data=request.data)
|
||||
|
||||
if serializer.is_valid():
|
||||
moderator_id = serializer.validated_data["moderator_id"]
|
||||
moderator = User.objects.get(id=moderator_id)
|
||||
|
||||
queue_item.assigned_to = moderator
|
||||
queue_item.assigned_at = timezone.now()
|
||||
queue_item.status = "IN_PROGRESS"
|
||||
queue_item.save()
|
||||
|
||||
response_serializer = self.get_serializer(queue_item)
|
||||
return Response(response_serializer.data)
|
||||
|
||||
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def unassign(self, request, pk=None):
|
||||
"""Unassign a queue item."""
|
||||
queue_item = self.get_object()
|
||||
|
||||
queue_item.assigned_to = None
|
||||
queue_item.assigned_at = None
|
||||
queue_item.status = "PENDING"
|
||||
queue_item.save()
|
||||
|
||||
serializer = self.get_serializer(queue_item)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def complete(self, request, pk=None):
|
||||
"""Complete a queue item."""
|
||||
queue_item = self.get_object()
|
||||
serializer = CompleteQueueItemSerializer(data=request.data)
|
||||
|
||||
if serializer.is_valid():
|
||||
action_taken = serializer.validated_data["action"]
|
||||
notes = serializer.validated_data.get("notes", "")
|
||||
|
||||
queue_item.status = "COMPLETED"
|
||||
queue_item.save()
|
||||
|
||||
# Create moderation action if needed
|
||||
if action_taken != "NO_ACTION" and queue_item.related_report:
|
||||
ModerationAction.objects.create(
|
||||
action_type=action_taken,
|
||||
reason=f"Queue item completion: {action_taken}",
|
||||
details=notes,
|
||||
moderator=request.user,
|
||||
target_user=queue_item.related_report.reported_by,
|
||||
related_report=queue_item.related_report,
|
||||
is_active=True,
|
||||
)
|
||||
|
||||
response_serializer = self.get_serializer(queue_item)
|
||||
return Response(response_serializer.data)
|
||||
|
||||
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
|
||||
|
||||
@action(detail=False, methods=["get"], permission_classes=[CanViewModerationData])
|
||||
def my_queue(self, request):
|
||||
"""Get queue items assigned to the current user."""
|
||||
queryset = self.get_queryset().filter(assigned_to=request.user)
|
||||
|
||||
page = self.paginate_queryset(queryset)
|
||||
if page is not None:
|
||||
serializer = self.get_serializer(page, many=True)
|
||||
return self.get_paginated_response(serializer.data)
|
||||
|
||||
serializer = self.get_serializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Moderation Action ViewSet
|
||||
# ============================================================================
|
||||
|
||||
|
||||
class ModerationActionViewSet(viewsets.ModelViewSet):
|
||||
"""
|
||||
ViewSet for managing moderation actions.
|
||||
|
||||
Tracks actions taken against users and content with expiration
|
||||
and status management.
|
||||
"""
|
||||
|
||||
queryset = ModerationAction.objects.select_related(
|
||||
"moderator", "target_user", "related_report"
|
||||
).all()
|
||||
|
||||
filter_backends = [DjangoFilterBackend, SearchFilter, OrderingFilter]
|
||||
filterset_class = ModerationActionFilter
|
||||
search_fields = ["reason", "details"]
|
||||
ordering_fields = ["created_at", "expires_at", "action_type"]
|
||||
ordering = ["-created_at"]
|
||||
|
||||
def get_serializer_class(self):
|
||||
"""Return appropriate serializer based on action."""
|
||||
if self.action == "create":
|
||||
return CreateModerationActionSerializer
|
||||
return ModerationActionSerializer
|
||||
|
||||
def get_permissions(self):
|
||||
"""Return appropriate permissions based on action."""
|
||||
if self.action == "create":
|
||||
permission_classes = [IsModeratorOrAdmin]
|
||||
else:
|
||||
permission_classes = [CanViewModerationData]
|
||||
|
||||
return [permission() for permission in permission_classes]
|
||||
|
||||
@action(detail=True, methods=["post"], permission_classes=[IsModeratorOrAdmin])
|
||||
def deactivate(self, request, pk=None):
|
||||
"""Deactivate a moderation action."""
|
||||
action_obj = self.get_object()
|
||||
|
||||
action_obj.is_active = False
|
||||
action_obj.save()
|
||||
|
||||
serializer = self.get_serializer(action_obj)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=False, methods=["get"], permission_classes=[CanViewModerationData])
|
||||
def active(self, request):
|
||||
"""Get all active moderation actions."""
|
||||
queryset = self.get_queryset().filter(
|
||||
is_active=True, expires_at__gt=timezone.now()
|
||||
)
|
||||
|
||||
page = self.paginate_queryset(queryset)
|
||||
if page is not None:
|
||||
serializer = self.get_serializer(page, many=True)
|
||||
return self.get_paginated_response(serializer.data)
|
||||
|
||||
serializer = self.get_serializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=False, methods=["get"], permission_classes=[CanViewModerationData])
|
||||
def expired(self, request):
|
||||
"""Get all expired moderation actions."""
|
||||
queryset = self.get_queryset().filter(
|
||||
expires_at__lte=timezone.now(), is_active=True
|
||||
)
|
||||
|
||||
page = self.paginate_queryset(queryset)
|
||||
if page is not None:
|
||||
serializer = self.get_serializer(page, many=True)
|
||||
return self.get_paginated_response(serializer.data)
|
||||
|
||||
serializer = self.get_serializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
|
||||
# ============================================================================
|
||||
# Bulk Operation ViewSet
|
||||
# ============================================================================
|
||||
|
||||
|
||||
class BulkOperationViewSet(viewsets.ModelViewSet):
|
||||
"""
|
||||
ViewSet for managing bulk operations.
|
||||
|
||||
Provides administrative bulk operations with progress tracking
|
||||
and cancellation support.
|
||||
"""
|
||||
|
||||
queryset = BulkOperation.objects.select_related("created_by").all()
|
||||
permission_classes = [IsAdminOrSuperuser]
|
||||
|
||||
filter_backends = [DjangoFilterBackend, SearchFilter, OrderingFilter]
|
||||
filterset_class = BulkOperationFilter
|
||||
search_fields = ["description"]
|
||||
ordering_fields = ["created_at", "started_at", "completed_at", "priority"]
|
||||
ordering = ["-created_at"]
|
||||
|
||||
def get_serializer_class(self):
|
||||
"""Return appropriate serializer based on action."""
|
||||
if self.action == "create":
|
||||
return CreateBulkOperationSerializer
|
||||
return BulkOperationSerializer
|
||||
|
||||
@action(detail=True, methods=["post"])
|
||||
def cancel(self, request, pk=None):
|
||||
"""Cancel a bulk operation."""
|
||||
operation = self.get_object()
|
||||
|
||||
if operation.status not in ["PENDING", "RUNNING"]:
|
||||
return Response(
|
||||
{"error": "Operation cannot be cancelled"},
|
||||
status=status.HTTP_400_BAD_REQUEST,
|
||||
)
|
||||
|
||||
if not operation.can_cancel:
|
||||
return Response(
|
||||
{"error": "Operation is not cancellable"},
|
||||
status=status.HTTP_400_BAD_REQUEST,
|
||||
)
|
||||
|
||||
operation.status = "CANCELLED"
|
||||
operation.completed_at = timezone.now()
|
||||
operation.save()
|
||||
|
||||
serializer = self.get_serializer(operation)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=True, methods=["post"])
|
||||
def retry(self, request, pk=None):
|
||||
"""Retry a failed bulk operation."""
|
||||
operation = self.get_object()
|
||||
|
||||
if operation.status != "FAILED":
|
||||
return Response(
|
||||
{"error": "Only failed operations can be retried"},
|
||||
status=status.HTTP_400_BAD_REQUEST,
|
||||
)
|
||||
|
||||
# Reset operation status
|
||||
operation.status = "PENDING"
|
||||
operation.started_at = None
|
||||
operation.completed_at = None
|
||||
operation.processed_items = 0
|
||||
operation.failed_items = 0
|
||||
operation.results = {}
|
||||
operation.save()
|
||||
|
||||
serializer = self.get_serializer(operation)
|
||||
return Response(serializer.data)
|
||||
|
||||
@action(detail=True, methods=["get"])
|
||||
def logs(self, request, pk=None):
|
||||
"""Get logs for a bulk operation."""
|
||||
operation = self.get_object()
|
||||
|
||||
# This would typically fetch logs from a logging system
|
||||
# For now, return a placeholder response
|
||||
logs = {
|
||||
"logs": [
|
||||
{
|
||||
"timestamp": operation.created_at.isoformat(),
|
||||
"level": "INFO",
|
||||
"message": f"Operation {operation.id} created",
|
||||
"details": operation.parameters,
|
||||
}
|
||||
],
|
||||
"count": 1,
|
||||
}
|
||||
|
||||
return Response(logs)
|
||||
|
||||
@action(detail=False, methods=["get"])
|
||||
def running(self, request):
|
||||
"""Get all running bulk operations."""
|
||||
queryset = self.get_queryset().filter(status="RUNNING")
|
||||
|
||||
page = self.paginate_queryset(queryset)
|
||||
if page is not None:
|
||||
serializer = self.get_serializer(page, many=True)
|
||||
return self.get_paginated_response(serializer.data)
|
||||
|
||||
serializer = self.get_serializer(queryset, many=True)
|
||||
return Response(serializer.data)
|
||||
|
||||
|
||||
# ============================================================================
# User Moderation ViewSet
# ============================================================================


class UserModerationViewSet(viewsets.GenericViewSet):
    """
    ViewSet for user moderation operations.

    Provides user-specific moderation data, statistics, and actions.
    """

    # GenericViewSet (rather than plain ViewSet) is required because search()
    # relies on paginate_queryset()/get_paginated_response(), which only exist
    # on generic views.
    queryset = User.objects.all()
    permission_classes = [IsModeratorOrAdmin]
    # Default serializer for schema generation
    serializer_class = UserModerationProfileSerializer

    def retrieve(self, request, pk=None):
        """Get moderation profile for a specific user."""
        try:
            user = User.objects.get(pk=pk)
        except User.DoesNotExist:
            return Response(
                {"error": "User not found"}, status=status.HTTP_404_NOT_FOUND
            )

        # Gather user moderation data
        reports_made = ModerationReport.objects.filter(reported_by=user).count()
        reports_against = ModerationReport.objects.filter(
            reported_entity_type="user", reported_entity_id=user.id
        ).count()

        actions_against = ModerationAction.objects.filter(target_user=user)
        warnings_received = actions_against.filter(action_type="WARNING").count()
        suspensions_received = actions_against.filter(
            action_type="USER_SUSPENSION"
        ).count()
        active_restrictions = actions_against.filter(
            is_active=True, expires_at__gt=timezone.now()
        ).count()

        # Risk assessment (simplified); later checks override earlier ones, so any
        # suspension or active restriction always yields HIGH.
        risk_factors = []
        risk_level = "LOW"

        if reports_against > 5:
            risk_factors.append("Multiple reports against user")
            risk_level = "MEDIUM"

        if suspensions_received > 0:
            risk_factors.append("Previous suspensions")
            risk_level = "HIGH"

        if active_restrictions > 0:
            risk_factors.append("Active restrictions")
            risk_level = "HIGH"

        # Recent activity
        recent_reports = ModerationReport.objects.filter(reported_by=user).order_by(
            "-created_at"
        )[:5]

        recent_actions = actions_against.order_by("-created_at")[:5]

        # Account status
        account_status = "ACTIVE"
        if getattr(user, "is_banned", False):
            account_status = "BANNED"
        elif active_restrictions > 0:
            account_status = "RESTRICTED"

        last_violation = (
            actions_against.filter(
                action_type__in=["WARNING", "USER_SUSPENSION", "USER_BAN"]
            )
            .order_by("-created_at")
            .first()
        )

        profile_data = {
            "user": {
                "id": user.id,
                "username": user.username,
                "display_name": user.get_display_name(),
                "email": user.email,
                "role": getattr(user, "role", "USER"),
            },
            "reports_made": reports_made,
            "reports_against": reports_against,
            "warnings_received": warnings_received,
            "suspensions_received": suspensions_received,
            "active_restrictions": active_restrictions,
            "risk_level": risk_level,
            "risk_factors": risk_factors,
            "recent_reports": ModerationReportSerializer(
                recent_reports, many=True
            ).data,
            "recent_actions": ModerationActionSerializer(
                recent_actions, many=True
            ).data,
            "account_status": account_status,
            "last_violation_date": (
                last_violation.created_at if last_violation else None
            ),
            "next_review_date": None,  # Would be calculated based on business rules
        }

        return Response(profile_data)

    @action(detail=True, methods=["post"])
    def moderate(self, request, pk=None):
        """Take moderation action against a user."""
        try:
            user = User.objects.get(pk=pk)
        except User.DoesNotExist:
            return Response(
                {"error": "User not found"}, status=status.HTTP_404_NOT_FOUND
            )

        serializer = CreateModerationActionSerializer(
            data=request.data, context={"request": request}
        )

        if serializer.is_valid():
            # Override target_user_id with the user from URL
            validated_data = serializer.validated_data.copy()
            validated_data["target_user_id"] = user.id

            action = ModerationAction.objects.create(
                action_type=validated_data["action_type"],
                reason=validated_data["reason"],
                details=validated_data["details"],
                duration_hours=validated_data.get("duration_hours"),
                moderator=request.user,
                target_user=user,
                related_report_id=validated_data.get("related_report_id"),
                is_active=True,
                expires_at=(
                    timezone.now() + timedelta(hours=validated_data["duration_hours"])
                    if validated_data.get("duration_hours")
                    else None
                ),
            )

            response_serializer = ModerationActionSerializer(action)
            return Response(response_serializer.data, status=status.HTTP_201_CREATED)

        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

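The `moderate` action accepts the fields consumed above from `CreateModerationActionSerializer`. A sketch of a request body for a 72-hour suspension; the exact validation rules live in the serializer, which is not shown in this diff:

```python
# Hypothetical payload for POST .../users/<pk>/moderate/ (URL prefix is an assumption).
payload = {
    "action_type": "USER_SUSPENSION",
    "reason": "Repeated harassment reports",
    "details": "Third confirmed violation in 30 days",
    "duration_hours": 72,        # optional; omitted -> expires_at stays None
    "related_report_id": 1234,   # optional link back to the triggering report
}
```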
    @action(detail=False, methods=["get"])
    def search(self, request):
        """Search users for moderation purposes."""
        query = request.query_params.get("query", "")
        role = request.query_params.get("role")
        has_restrictions = request.query_params.get("has_restrictions")

        queryset = User.objects.all()

        if query:
            queryset = queryset.filter(
                Q(username__icontains=query) | Q(email__icontains=query)
            )

        if role:
            queryset = queryset.filter(role=role)

        if has_restrictions == "true":
            active_action_users = ModerationAction.objects.filter(
                is_active=True, expires_at__gt=timezone.now()
            ).values_list("target_user_id", flat=True)
            queryset = queryset.filter(id__in=active_action_users)

        # Paginate results
        page = self.paginate_queryset(queryset)
        if page is not None:
            users_data = []
            for user in page:
                restriction_count = ModerationAction.objects.filter(
                    target_user=user, is_active=True, expires_at__gt=timezone.now()
                ).count()

                users_data.append(
                    {
                        "id": user.id,
                        "username": user.username,
                        "display_name": user.get_display_name(),
                        "email": user.email,
                        "role": getattr(user, "role", "USER"),
                        "date_joined": user.date_joined,
                        "last_login": user.last_login,
                        "is_active": user.is_active,
                        "restriction_count": restriction_count,
                        "risk_level": "HIGH" if restriction_count > 0 else "LOW",
                    }
                )

            return self.get_paginated_response(users_data)

        return Response([])

    @action(detail=False, methods=["get"])
    def stats(self, request):
        """Get overall user moderation statistics."""
        total_actions = ModerationAction.objects.count()
        active_actions = ModerationAction.objects.filter(
            is_active=True, expires_at__gt=timezone.now()
        ).count()
        expired_actions = ModerationAction.objects.filter(
            expires_at__lte=timezone.now()
        ).count()

        stats_data = {
            "total_actions": total_actions,
            "active_actions": active_actions,
            "expired_actions": expired_actions,
        }

        return Response(stats_data)
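Both viewsets would typically be exposed through a DRF router. A minimal wiring sketch; the bulk-operation viewset name and the URL prefixes below are assumptions, not taken from this diff:

```python
# urls.py sketch
from rest_framework.routers import DefaultRouter

from .views import BulkOperationViewSet, UserModerationViewSet

router = DefaultRouter()
router.register(r"bulk-operations", BulkOperationViewSet, basename="bulk-operation")
router.register(r"users", UserModerationViewSet, basename="user-moderation")

urlpatterns = router.urls
```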
@@ -1,403 +0,0 @@
|
||||
from django.contrib import admin
|
||||
# from django.contrib.gis.admin import GISModelAdmin # Disabled temporarily for setup
|
||||
from django.utils.html import format_html
|
||||
import pghistory.models
|
||||
from .models import (
|
||||
Park,
|
||||
ParkArea,
|
||||
ParkLocation,
|
||||
Company,
|
||||
CompanyHeadquarters,
|
||||
ParkReview,
|
||||
)
|
||||
|
||||
|
||||
class ParkLocationInline(admin.StackedInline):
|
||||
"""Inline admin for ParkLocation"""
|
||||
|
||||
model = ParkLocation
|
||||
extra = 0
|
||||
fields = (
|
||||
("city", "state", "country"),
|
||||
"street_address",
|
||||
"postal_code",
|
||||
"point",
|
||||
("highway_exit", "best_arrival_time"),
|
||||
"parking_notes",
|
||||
"seasonal_notes",
|
||||
("osm_id", "osm_type"),
|
||||
)
|
||||
|
||||
|
||||
class ParkLocationAdmin(admin.ModelAdmin): # GISModelAdmin disabled for setup
|
||||
"""Admin for standalone ParkLocation management"""
|
||||
|
||||
list_display = (
|
||||
"park",
|
||||
"city",
|
||||
"state",
|
||||
"country",
|
||||
"latitude",
|
||||
"longitude",
|
||||
)
|
||||
list_filter = ("country", "state")
|
||||
search_fields = (
|
||||
"park__name",
|
||||
"city",
|
||||
"state",
|
||||
"country",
|
||||
"street_address",
|
||||
)
|
||||
readonly_fields = ("latitude", "longitude", "coordinates")
|
||||
fieldsets = (
|
||||
("Park", {"fields": ("park",)}),
|
||||
(
|
||||
"Address",
|
||||
{
|
||||
"fields": (
|
||||
"street_address",
|
||||
"city",
|
||||
"state",
|
||||
"country",
|
||||
"postal_code",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Geographic Coordinates",
|
||||
{
|
||||
"fields": ("point", "latitude", "longitude", "coordinates"),
|
||||
"description": "Set coordinates by clicking on the map or entering latitude/longitude",
|
||||
},
|
||||
),
|
||||
(
|
||||
"Travel Information",
|
||||
{
|
||||
"fields": (
|
||||
"highway_exit",
|
||||
"best_arrival_time",
|
||||
"parking_notes",
|
||||
"seasonal_notes",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"OpenStreetMap Integration",
|
||||
{"fields": ("osm_id", "osm_type"), "classes": ("collapse",)},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Latitude")
|
||||
def latitude(self, obj):
|
||||
return obj.latitude
|
||||
|
||||
@admin.display(description="Longitude")
|
||||
def longitude(self, obj):
|
||||
return obj.longitude
|
||||
|
||||
|
||||
class ParkAdmin(admin.ModelAdmin):
|
||||
list_display = (
|
||||
"name",
|
||||
"formatted_location",
|
||||
"status",
|
||||
"operator",
|
||||
"property_owner",
|
||||
"created_at",
|
||||
"updated_at",
|
||||
)
|
||||
list_filter = ("status", "location__country", "location__state")
|
||||
search_fields = (
|
||||
"name",
|
||||
"description",
|
||||
"location__city",
|
||||
"location__state",
|
||||
"location__country",
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
inlines = [ParkLocationInline]
|
||||
|
||||
@admin.display(description="Location")
|
||||
def formatted_location(self, obj):
|
||||
"""Display formatted location string"""
|
||||
return obj.formatted_location
|
||||
|
||||
|
||||
class ParkAreaAdmin(admin.ModelAdmin):
|
||||
list_display = ("name", "park", "created_at", "updated_at")
|
||||
list_filter = ("park",)
|
||||
search_fields = ("name", "description", "park__name")
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
|
||||
|
||||
class CompanyHeadquartersInline(admin.StackedInline):
|
||||
"""Inline admin for CompanyHeadquarters"""
|
||||
|
||||
model = CompanyHeadquarters
|
||||
extra = 0
|
||||
fields = (
|
||||
("city", "state_province", "country"),
|
||||
"street_address",
|
||||
"postal_code",
|
||||
"mailing_address",
|
||||
)
|
||||
|
||||
|
||||
class CompanyHeadquartersAdmin(admin.ModelAdmin):
|
||||
"""Admin for standalone CompanyHeadquarters management"""
|
||||
|
||||
list_display = (
|
||||
"company",
|
||||
"location_display",
|
||||
"city",
|
||||
"country",
|
||||
"created_at",
|
||||
)
|
||||
list_filter = ("country", "state_province")
|
||||
search_fields = (
|
||||
"company__name",
|
||||
"city",
|
||||
"state_province",
|
||||
"country",
|
||||
"street_address",
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
fieldsets = (
|
||||
("Company", {"fields": ("company",)}),
|
||||
(
|
||||
"Address",
|
||||
{
|
||||
"fields": (
|
||||
"street_address",
|
||||
"city",
|
||||
"state_province",
|
||||
"country",
|
||||
"postal_code",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Additional Information",
|
||||
{"fields": ("mailing_address",), "classes": ("collapse",)},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{"fields": ("created_at", "updated_at"), "classes": ("collapse",)},
|
||||
),
|
||||
)
|
||||
|
||||
|
||||
class CompanyAdmin(admin.ModelAdmin):
|
||||
"""Enhanced Company admin with headquarters inline"""
|
||||
|
||||
list_display = (
|
||||
"name",
|
||||
"roles_display",
|
||||
"headquarters_location",
|
||||
"website",
|
||||
"founded_year",
|
||||
)
|
||||
list_filter = ("roles",)
|
||||
search_fields = ("name", "description")
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
inlines = [CompanyHeadquartersInline]
|
||||
|
||||
@admin.display(description="Roles")
|
||||
def roles_display(self, obj):
|
||||
"""Display roles as a formatted string"""
|
||||
return ", ".join(obj.roles) if obj.roles else "No roles"
|
||||
|
||||
@admin.display(description="Headquarters")
|
||||
def headquarters_location(self, obj):
|
||||
"""Display headquarters location if available"""
|
||||
if hasattr(obj, "headquarters"):
|
||||
return obj.headquarters.location_display
|
||||
return "No headquarters"
|
||||
|
||||
|
||||
@admin.register(ParkReview)
|
||||
class ParkReviewAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for park reviews"""
|
||||
|
||||
list_display = (
|
||||
"park",
|
||||
"user",
|
||||
"rating",
|
||||
"title",
|
||||
"visit_date",
|
||||
"is_published",
|
||||
"created_at",
|
||||
"moderation_status",
|
||||
)
|
||||
list_filter = (
|
||||
"rating",
|
||||
"is_published",
|
||||
"visit_date",
|
||||
"created_at",
|
||||
"park",
|
||||
"moderated_by",
|
||||
)
|
||||
search_fields = (
|
||||
"title",
|
||||
"content",
|
||||
"user__username",
|
||||
"park__name",
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
date_hierarchy = "created_at"
|
||||
ordering = ("-created_at",)
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Review Details",
|
||||
{
|
||||
"fields": (
|
||||
"user",
|
||||
"park",
|
||||
"rating",
|
||||
"title",
|
||||
"content",
|
||||
"visit_date",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Publication Status",
|
||||
{
|
||||
"fields": ("is_published",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Moderation",
|
||||
{
|
||||
"fields": (
|
||||
"moderated_by",
|
||||
"moderated_at",
|
||||
"moderation_notes",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{
|
||||
"fields": ("created_at", "updated_at"),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Moderation Status", boolean=True)
|
||||
def moderation_status(self, obj):
|
||||
"""Display moderation status with color coding"""
|
||||
if obj.moderated_by:
|
||||
return format_html(
|
||||
'<span style="color: {};">{}</span>',
|
||||
"green" if obj.is_published else "red",
|
||||
"Approved" if obj.is_published else "Rejected",
|
||||
)
|
||||
return format_html('<span style="color: orange;">Pending</span>')
|
||||
|
||||
def save_model(self, request, obj, form, change):
|
||||
"""Auto-set moderation info when status changes"""
|
||||
if change and "is_published" in form.changed_data:
|
||||
from django.utils import timezone
|
||||
|
||||
obj.moderated_by = request.user
|
||||
obj.moderated_at = timezone.now()
|
||||
super().save_model(request, obj, form, change)
|
||||
|
||||
|
||||
@admin.register(pghistory.models.Events)
|
||||
class PgHistoryEventsAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for pghistory Events"""
|
||||
|
||||
list_display = (
|
||||
"pgh_id",
|
||||
"pgh_created_at",
|
||||
"pgh_label",
|
||||
"pgh_model",
|
||||
"pgh_obj_id",
|
||||
"pgh_context_display",
|
||||
)
|
||||
list_filter = (
|
||||
"pgh_label",
|
||||
"pgh_model",
|
||||
"pgh_created_at",
|
||||
)
|
||||
search_fields = (
|
||||
"pgh_obj_id",
|
||||
"pgh_context",
|
||||
)
|
||||
readonly_fields = (
|
||||
"pgh_id",
|
||||
"pgh_created_at",
|
||||
"pgh_label",
|
||||
"pgh_model",
|
||||
"pgh_obj_id",
|
||||
"pgh_context",
|
||||
"pgh_data",
|
||||
)
|
||||
date_hierarchy = "pgh_created_at"
|
||||
ordering = ("-pgh_created_at",)
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Event Information",
|
||||
{
|
||||
"fields": (
|
||||
"pgh_id",
|
||||
"pgh_created_at",
|
||||
"pgh_label",
|
||||
"pgh_model",
|
||||
"pgh_obj_id",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Context & Data",
|
||||
{
|
||||
"fields": (
|
||||
"pgh_context",
|
||||
"pgh_data",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Context")
|
||||
def pgh_context_display(self, obj):
|
||||
"""Display context information in a readable format"""
|
||||
if obj.pgh_context:
|
||||
if isinstance(obj.pgh_context, dict):
|
||||
context_items = []
|
||||
for key, value in obj.pgh_context.items():
|
||||
context_items.append(f"{key}: {value}")
|
||||
return ", ".join(context_items)
|
||||
return str(obj.pgh_context)
|
||||
return "No context"
|
||||
|
||||
def has_add_permission(self, request):
|
||||
"""Disable manual creation of history events"""
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
"""Make history events read-only"""
|
||||
return False
|
||||
|
||||
def has_delete_permission(self, request, obj=None):
|
||||
"""Prevent deletion of history events"""
|
||||
return getattr(request.user, "is_superuser", False)
|
||||
|
||||
|
||||
# Register the models with their admin classes
|
||||
admin.site.register(Park, ParkAdmin)
|
||||
admin.site.register(ParkArea, ParkAreaAdmin)
|
||||
admin.site.register(ParkLocation, ParkLocationAdmin)
|
||||
admin.site.register(Company, CompanyAdmin)
|
||||
admin.site.register(CompanyHeadquarters, CompanyHeadquartersAdmin)
|
||||
@@ -1,9 +0,0 @@
from django.apps import AppConfig


class ParksConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.parks"

    def ready(self):
        import apps.parks.signals  # noqa: F401 - Register signals
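For the `ready()` signal hookup above to run, the app only needs to be installed; on Django 3.2+ the `ParksConfig` is discovered automatically from the package. A sketch of the relevant settings entry (the real settings module is not part of this diff):

```python
# settings.py sketch
INSTALLED_APPS = [
    # ...
    "apps.parks",  # Django picks up apps.parks.apps.ParksConfig automatically
]
```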
@@ -1,288 +0,0 @@
|
||||
"""
|
||||
Rich Choice Objects for Parks Domain
|
||||
|
||||
This module defines all choice objects for the parks domain, replacing
|
||||
the legacy tuple-based choices with rich choice objects.
|
||||
"""
|
||||
|
||||
from apps.core.choices import RichChoice, ChoiceCategory
|
||||
from apps.core.choices.registry import register_choices
|
||||
|
||||
|
||||
# Park Status Choices
|
||||
PARK_STATUSES = [
|
||||
RichChoice(
|
||||
value="OPERATING",
|
||||
label="Operating",
|
||||
description="Park is currently open and operating normally",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSED_TEMP",
|
||||
label="Temporarily Closed",
|
||||
description="Park is temporarily closed for maintenance, weather, or seasonal reasons",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'pause-circle',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSED_PERM",
|
||||
label="Permanently Closed",
|
||||
description="Park has been permanently closed and will not reopen",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="UNDER_CONSTRUCTION",
|
||||
label="Under Construction",
|
||||
description="Park is currently being built or undergoing major renovation",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'tool',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="DEMOLISHED",
|
||||
label="Demolished",
|
||||
description="Park has been completely demolished and removed",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'trash',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="RELOCATED",
|
||||
label="Relocated",
|
||||
description="Park has been moved to a different location",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'arrow-right',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
# Park Type Choices
|
||||
PARK_TYPES = [
|
||||
RichChoice(
|
||||
value="THEME_PARK",
|
||||
label="Theme Park",
|
||||
description="Large-scale amusement park with themed areas and attractions",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'castle',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="AMUSEMENT_PARK",
|
||||
label="Amusement Park",
|
||||
description="Traditional amusement park with rides and games",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'ferris-wheel',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="WATER_PARK",
|
||||
label="Water Park",
|
||||
description="Park featuring water-based attractions and activities",
|
||||
metadata={
|
||||
'color': 'cyan',
|
||||
'icon': 'water',
|
||||
'css_class': 'bg-cyan-100 text-cyan-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FAMILY_ENTERTAINMENT_CENTER",
|
||||
label="Family Entertainment Center",
|
||||
description="Indoor entertainment facility with games and family attractions",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'family',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CARNIVAL",
|
||||
label="Carnival",
|
||||
description="Traveling amusement show with rides, games, and entertainment",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'carnival',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FAIR",
|
||||
label="Fair",
|
||||
description="Temporary event featuring rides, games, and agricultural exhibits",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'fair',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="PIER",
|
||||
label="Pier",
|
||||
description="Seaside entertainment pier with rides and attractions",
|
||||
metadata={
|
||||
'color': 'teal',
|
||||
'icon': 'pier',
|
||||
'css_class': 'bg-teal-100 text-teal-800',
|
||||
'sort_order': 7
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="BOARDWALK",
|
||||
label="Boardwalk",
|
||||
description="Waterfront entertainment area with rides and attractions",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'boardwalk',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800',
|
||||
'sort_order': 8
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="SAFARI_PARK",
|
||||
label="Safari Park",
|
||||
description="Wildlife park with drive-through animal experiences",
|
||||
metadata={
|
||||
'color': 'emerald',
|
||||
'icon': 'safari',
|
||||
'css_class': 'bg-emerald-100 text-emerald-800',
|
||||
'sort_order': 9
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="ZOO",
|
||||
label="Zoo",
|
||||
description="Zoological park with animal exhibits and educational programs",
|
||||
metadata={
|
||||
'color': 'lime',
|
||||
'icon': 'zoo',
|
||||
'css_class': 'bg-lime-100 text-lime-800',
|
||||
'sort_order': 10
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Park type that doesn't fit into standard categories",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'other',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 11
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# Company Role Choices for Parks Domain (OPERATOR and PROPERTY_OWNER only)
|
||||
PARKS_COMPANY_ROLES = [
|
||||
RichChoice(
|
||||
value="OPERATOR",
|
||||
label="Park Operator",
|
||||
description="Company that operates and manages theme parks and amusement facilities",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'building-office',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 1,
|
||||
'domain': 'parks',
|
||||
'permissions': ['manage_parks', 'view_operations'],
|
||||
'url_pattern': '/parks/operators/{slug}/'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="PROPERTY_OWNER",
|
||||
label="Property Owner",
|
||||
description="Company that owns the land and property where parks are located",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'home',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 2,
|
||||
'domain': 'parks',
|
||||
'permissions': ['manage_property', 'view_ownership'],
|
||||
'url_pattern': '/parks/owners/{slug}/'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
|
||||
def register_parks_choices():
|
||||
"""Register all parks domain choices with the global registry"""
|
||||
|
||||
register_choices(
|
||||
name="statuses",
|
||||
choices=PARK_STATUSES,
|
||||
domain="parks",
|
||||
description="Park operational status options",
|
||||
metadata={'domain': 'parks', 'type': 'status'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="types",
|
||||
choices=PARK_TYPES,
|
||||
domain="parks",
|
||||
description="Park type and category classifications",
|
||||
metadata={'domain': 'parks', 'type': 'park_type'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="company_roles",
|
||||
choices=PARKS_COMPANY_ROLES,
|
||||
domain="parks",
|
||||
description="Company role classifications for parks domain (OPERATOR and PROPERTY_OWNER only)",
|
||||
metadata={'domain': 'parks', 'type': 'company_role'}
|
||||
)
|
||||
|
||||
|
||||
# Auto-register choices when module is imported
|
||||
register_parks_choices()
|
||||
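The rich choice objects above carry UI metadata, but they can still be flattened into Django's conventional `(value, label)` pairs wherever a plain `choices=` argument is expected. A minimal sketch using only the `RichChoice` attributes shown in this file; the field name is illustrative:

```python
PARK_STATUS_CHOICES = [(choice.value, choice.label) for choice in PARK_STATUSES]

# e.g.
# status = models.CharField(max_length=20, choices=PARK_STATUS_CHOICES, default="OPERATING")
```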
@@ -1,198 +0,0 @@
|
||||
"""
|
||||
Django management command to run performance benchmarks.
|
||||
"""
|
||||
|
||||
from django.core.management.base import BaseCommand
|
||||
from django.utils import timezone
|
||||
import json
|
||||
import time
|
||||
|
||||
|
||||
class Command(BaseCommand):
|
||||
help = 'Run comprehensive performance benchmarks for park listing features'
|
||||
|
||||
def add_arguments(self, parser):
|
||||
parser.add_argument(
|
||||
'--save',
|
||||
action='store_true',
|
||||
help='Save detailed benchmark results to file',
|
||||
)
|
||||
parser.add_argument(
|
||||
'--autocomplete-only',
|
||||
action='store_true',
|
||||
help='Run only autocomplete benchmarks',
|
||||
)
|
||||
parser.add_argument(
|
||||
'--listing-only',
|
||||
action='store_true',
|
||||
help='Run only listing benchmarks',
|
||||
)
|
||||
parser.add_argument(
|
||||
'--pagination-only',
|
||||
action='store_true',
|
||||
help='Run only pagination benchmarks',
|
||||
)
|
||||
parser.add_argument(
|
||||
'--iterations',
|
||||
type=int,
|
||||
default=1,
|
||||
help='Number of iterations to run (default: 1)',
|
||||
)
|
||||
|
||||
def handle(self, *args, **options):
|
||||
from apps.parks.services.performance_monitoring import BenchmarkSuite
|
||||
|
||||
self.stdout.write(
|
||||
self.style.SUCCESS('Starting Park Listing Performance Benchmarks')
|
||||
)
|
||||
|
||||
suite = BenchmarkSuite()
|
||||
iterations = options['iterations']
|
||||
all_results = []
|
||||
|
||||
for i in range(iterations):
|
||||
if iterations > 1:
|
||||
self.stdout.write(f'\nIteration {i + 1}/{iterations}')
|
||||
|
||||
start_time = time.perf_counter()
|
||||
|
||||
# Run specific benchmarks or full suite
|
||||
if options['autocomplete_only']:
|
||||
result = suite.run_autocomplete_benchmark()
|
||||
elif options['listing_only']:
|
||||
result = suite.run_listing_benchmark()
|
||||
elif options['pagination_only']:
|
||||
result = suite.run_pagination_benchmark()
|
||||
else:
|
||||
result = suite.run_full_benchmark_suite()
|
||||
|
||||
duration = time.perf_counter() - start_time
|
||||
result['iteration'] = i + 1
|
||||
result['benchmark_duration'] = duration
|
||||
all_results.append(result)
|
||||
|
||||
# Display summary for this iteration
|
||||
self._display_iteration_summary(result, duration)
|
||||
|
||||
# Display overall summary if multiple iterations
|
||||
if iterations > 1:
|
||||
self._display_overall_summary(all_results)
|
||||
|
||||
# Save results if requested
|
||||
if options['save']:
|
||||
self._save_results(all_results)
|
||||
|
||||
self.stdout.write(
|
||||
self.style.SUCCESS('\nBenchmark completed successfully!')
|
||||
)
|
||||
|
||||
def _display_iteration_summary(self, result, duration):
|
||||
"""Display summary for a single iteration."""
|
||||
|
||||
if 'overall_summary' in result:
|
||||
summary = result['overall_summary']
|
||||
|
||||
self.stdout.write(f'\nBenchmark Duration: {duration:.3f}s')
|
||||
self.stdout.write(f'Total Operations: {summary["total_operations"]}')
|
||||
self.stdout.write(f'Average Response Time: {summary["duration_stats"]["mean"]:.3f}s')
|
||||
self.stdout.write(f'Average Query Count: {summary["query_stats"]["mean"]:.1f}')
|
||||
self.stdout.write(f'Cache Hit Rate: {summary["cache_stats"]["hit_rate"]:.1f}%')
|
||||
|
||||
# Display slowest operations
|
||||
if summary.get('slowest_operations'):
|
||||
self.stdout.write('\nSlowest Operations:')
|
||||
for op in summary['slowest_operations'][:3]:
|
||||
self.stdout.write(f' {op["operation"]}: {op["duration"]:.3f}s ({op["query_count"]} queries)')
|
||||
|
||||
# Display recommendations
|
||||
if result.get('recommendations'):
|
||||
self.stdout.write('\nRecommendations:')
|
||||
for rec in result['recommendations']:
|
||||
self.stdout.write(f' • {rec}')
|
||||
|
||||
# Display specific benchmark results
|
||||
for benchmark_type in ['autocomplete', 'listing', 'pagination']:
|
||||
if benchmark_type in result:
|
||||
self._display_benchmark_results(benchmark_type, result[benchmark_type])
|
||||
|
||||
def _display_benchmark_results(self, benchmark_type, results):
|
||||
"""Display results for a specific benchmark type."""
|
||||
self.stdout.write(f'\n{benchmark_type.title()} Benchmark Results:')
|
||||
|
||||
if benchmark_type == 'autocomplete':
|
||||
for query_result in results.get('results', []):
|
||||
self.stdout.write(
|
||||
f' Query "{query_result["query"]}": {query_result["response_time"]:.3f}s '
|
||||
f'({query_result["query_count"]} queries)'
|
||||
)
|
||||
|
||||
elif benchmark_type == 'listing':
|
||||
for scenario in results.get('results', []):
|
||||
self.stdout.write(
|
||||
f' {scenario["scenario"]}: {scenario["response_time"]:.3f}s '
|
||||
f'({scenario["query_count"]} queries, {scenario["result_count"]} results)'
|
||||
)
|
||||
|
||||
elif benchmark_type == 'pagination':
|
||||
# Group by page size for cleaner display
|
||||
by_page_size = {}
|
||||
for result in results.get('results', []):
|
||||
size = result['page_size']
|
||||
if size not in by_page_size:
|
||||
by_page_size[size] = []
|
||||
by_page_size[size].append(result)
|
||||
|
||||
for page_size, page_results in by_page_size.items():
|
||||
avg_time = sum(r['response_time'] for r in page_results) / len(page_results)
|
||||
avg_queries = sum(r['query_count'] for r in page_results) / len(page_results)
|
||||
self.stdout.write(
|
||||
f' Page size {page_size}: avg {avg_time:.3f}s ({avg_queries:.1f} queries)'
|
||||
)
|
||||
|
||||
def _display_overall_summary(self, all_results):
|
||||
"""Display summary across all iterations."""
|
||||
self.stdout.write('\n' + '='*50)
|
||||
self.stdout.write('OVERALL SUMMARY ACROSS ALL ITERATIONS')
|
||||
self.stdout.write('='*50)
|
||||
|
||||
# Calculate averages across iterations
|
||||
total_duration = sum(r['benchmark_duration'] for r in all_results)
|
||||
|
||||
# Extract performance metrics from iterations with overall_summary
|
||||
overall_summaries = [r['overall_summary'] for r in all_results if 'overall_summary' in r]
|
||||
|
||||
if overall_summaries:
|
||||
avg_response_time = sum(s['duration_stats']['mean'] for s in overall_summaries) / len(overall_summaries)
|
||||
avg_query_count = sum(s['query_stats']['mean'] for s in overall_summaries) / len(overall_summaries)
|
||||
avg_cache_hit_rate = sum(s['cache_stats']['hit_rate'] for s in overall_summaries) / len(overall_summaries)
|
||||
|
||||
self.stdout.write(f'Total Benchmark Time: {total_duration:.3f}s')
|
||||
self.stdout.write(f'Average Response Time: {avg_response_time:.3f}s')
|
||||
self.stdout.write(f'Average Query Count: {avg_query_count:.1f}')
|
||||
self.stdout.write(f'Average Cache Hit Rate: {avg_cache_hit_rate:.1f}%')
|
||||
|
||||
def _save_results(self, results):
|
||||
"""Save benchmark results to file."""
|
||||
timestamp = timezone.now().strftime('%Y%m%d_%H%M%S')
|
||||
filename = f'benchmark_results_{timestamp}.json'
|
||||
|
||||
try:
|
||||
import os
|
||||
|
||||
# Ensure logs directory exists
|
||||
logs_dir = 'logs'
|
||||
os.makedirs(logs_dir, exist_ok=True)
|
||||
|
||||
filepath = os.path.join(logs_dir, filename)
|
||||
|
||||
with open(filepath, 'w') as f:
|
||||
json.dump(results, f, indent=2, default=str)
|
||||
|
||||
self.stdout.write(
|
||||
self.style.SUCCESS(f'Results saved to {filepath}')
|
||||
)
|
||||
|
||||
except Exception as e:
|
||||
self.stdout.write(
|
||||
self.style.ERROR(f'Error saving results: {e}')
|
||||
)
|
||||
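The command above can also be driven from code or CI via `call_command`. The command name below is an assumption, since the module's filename is not visible in this diff (Django derives the name from the file under `management/commands/`):

```python
# Sketch: run three iterations of the listing benchmark and persist the results.
from django.core.management import call_command

call_command("run_benchmarks", iterations=3, listing_only=True, save=True)
```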
File diff suppressed because it is too large
@@ -1,54 +0,0 @@
# Generated by Django 5.2.6 on 2025-09-23 22:29

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('parks', '0001_initial'),
    ]

    operations = [
        # Performance indexes for frequently filtered fields
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_status_operator ON parks_park(status, operator_id);",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_status_operator;"
        ),
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_park_type_status ON parks_park(park_type, status);",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_park_type_status;"
        ),
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_opening_year_status ON parks_park(opening_year, status) WHERE opening_year IS NOT NULL;",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_opening_year_status;"
        ),
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_ride_count_coaster_count ON parks_park(ride_count, coaster_count) WHERE ride_count IS NOT NULL;",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_ride_count_coaster_count;"
        ),
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_average_rating_status ON parks_park(average_rating, status) WHERE average_rating IS NOT NULL;",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_average_rating_status;"
        ),
        # Search optimization index
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_search_text_gin ON parks_park USING gin(search_text gin_trgm_ops);",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_search_text_gin;"
        ),
        # Location-based indexes for ParkLocation
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parklocation_country_city ON parks_parklocation(country, city);",
            reverse_sql="DROP INDEX IF EXISTS idx_parklocation_country_city;"
        ),
        # Company name index for operator filtering
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_company_name_roles ON parks_company USING gin(name gin_trgm_ops, roles);",
            reverse_sql="DROP INDEX IF EXISTS idx_company_name_roles;"
        ),
        # Timestamps for ordering and filtering
        migrations.RunSQL(
            "CREATE INDEX IF NOT EXISTS idx_parks_created_at_status ON parks_park(created_at, status);",
            reverse_sql="DROP INDEX IF EXISTS idx_parks_created_at_status;"
        ),
    ]
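The `gin_trgm_ops` indexes above assume the `pg_trgm` extension is already installed in the target database. A sketch of how that could be guaranteed inside the same migration, assuming the migrating role may create extensions:

```python
# Sketch: ensure pg_trgm exists before the trigram indexes are created.
from django.contrib.postgres.operations import TrigramExtension

operations = [
    TrigramExtension(),
    # ... the RunSQL index operations shown above ...
]
```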
@@ -1,32 +0,0 @@
from django.db import models
from django.utils.text import slugify
import pghistory

from apps.core.history import TrackedModel
from .parks import Park


@pghistory.track()
class ParkArea(TrackedModel):
    # Import managers
    from ..managers import ParkAreaManager

    objects = ParkAreaManager()
    id: int  # Type hint for Django's automatic id field
    park = models.ForeignKey(Park, on_delete=models.CASCADE, related_name="areas")
    name = models.CharField(max_length=255)
    slug = models.SlugField(max_length=255)
    description = models.TextField(blank=True)
    opening_date = models.DateField(null=True, blank=True)
    closing_date = models.DateField(null=True, blank=True)

    def save(self, *args, **kwargs):
        if not self.slug:
            self.slug = slugify(self.name)
        super().save(*args, **kwargs)

    def __str__(self):
        return self.name

    class Meta:
        unique_together = ("park", "slug")
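A small illustration of the `save()` behaviour above: the slug is generated once from the name and left untouched afterwards, so renaming an area does not change its URL (the `park` instance is assumed to exist already):

```python
area = ParkArea(park=park, name="Frontier Town")
area.save()
assert area.slug == "frontier-town"

area.name = "Frontier Town West"
area.save()
assert area.slug == "frontier-town"  # unchanged
```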
@@ -1,127 +0,0 @@
|
||||
from django.contrib.postgres.fields import ArrayField
|
||||
from django.db import models
|
||||
from django.utils.text import slugify
|
||||
from apps.core.models import TrackedModel
|
||||
from apps.core.choices.fields import RichChoiceField
|
||||
import pghistory
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class Company(TrackedModel):
|
||||
# Import managers
|
||||
from ..managers import CompanyManager
|
||||
|
||||
objects = CompanyManager()
|
||||
|
||||
name = models.CharField(max_length=255)
|
||||
slug = models.SlugField(max_length=255, unique=True)
|
||||
roles = ArrayField(
|
||||
RichChoiceField(choice_group="company_roles", domain="parks", max_length=20),
|
||||
default=list,
|
||||
blank=True,
|
||||
)
|
||||
description = models.TextField(blank=True)
|
||||
website = models.URLField(blank=True)
|
||||
|
||||
# Operator-specific fields
|
||||
founded_year = models.PositiveIntegerField(blank=True, null=True)
|
||||
parks_count = models.IntegerField(default=0)
|
||||
rides_count = models.IntegerField(default=0)
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if not self.slug:
|
||||
self.slug = slugify(self.name)
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
def __str__(self):
|
||||
return self.name
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
app_label = "parks"
|
||||
ordering = ["name"]
|
||||
verbose_name_plural = "Companies"
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class CompanyHeadquarters(models.Model):
|
||||
"""
|
||||
Simple address storage for company headquarters without coordinate tracking.
|
||||
Focus on human-readable location information for display purposes.
|
||||
"""
|
||||
|
||||
# Relationships
|
||||
company = models.OneToOneField(
|
||||
"Company", on_delete=models.CASCADE, related_name="headquarters"
|
||||
)
|
||||
|
||||
# Address Fields (No coordinates needed)
|
||||
street_address = models.CharField(
|
||||
max_length=255,
|
||||
blank=True,
|
||||
help_text="Mailing address if publicly available",
|
||||
)
|
||||
city = models.CharField(
|
||||
max_length=100, db_index=True, help_text="Headquarters city"
|
||||
)
|
||||
state_province = models.CharField(
|
||||
max_length=100,
|
||||
blank=True,
|
||||
db_index=True,
|
||||
help_text="State/Province/Region",
|
||||
)
|
||||
country = models.CharField(
|
||||
max_length=100,
|
||||
default="USA",
|
||||
db_index=True,
|
||||
help_text="Country where headquarters is located",
|
||||
)
|
||||
postal_code = models.CharField(
|
||||
max_length=20, blank=True, help_text="ZIP or postal code"
|
||||
)
|
||||
|
||||
# Optional mailing address if different or more complete
|
||||
mailing_address = models.TextField(
|
||||
blank=True,
|
||||
help_text="Complete mailing address if different from basic address",
|
||||
)
|
||||
|
||||
# Metadata
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
updated_at = models.DateTimeField(auto_now=True)
|
||||
|
||||
@property
|
||||
def formatted_location(self):
|
||||
"""Returns a formatted address string for display."""
|
||||
components = []
|
||||
if self.street_address:
|
||||
components.append(self.street_address)
|
||||
if self.city:
|
||||
components.append(self.city)
|
||||
if self.state_province:
|
||||
components.append(self.state_province)
|
||||
if self.postal_code:
|
||||
components.append(self.postal_code)
|
||||
if self.country and self.country != "USA":
|
||||
components.append(self.country)
|
||||
return ", ".join(components) if components else f"{self.city}, {self.country}"
|
||||
|
||||
@property
|
||||
def location_display(self):
|
||||
"""Simple city, state/country display for compact views."""
|
||||
parts = [self.city]
|
||||
if self.state_province:
|
||||
parts.append(self.state_province)
|
||||
elif self.country != "USA":
|
||||
parts.append(self.country)
|
||||
return ", ".join(parts) if parts else "Unknown Location"
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.company.name} Headquarters - {self.location_display}"
|
||||
|
||||
class Meta:
|
||||
verbose_name = "Company Headquarters"
|
||||
verbose_name_plural = "Company Headquarters"
|
||||
ordering = ["company__name"]
|
||||
indexes = [
|
||||
models.Index(fields=["city", "country"]),
|
||||
]
|
||||
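An illustration of the two display properties defined above, with made-up data:

```python
hq = CompanyHeadquarters(
    company=company,  # an existing Company instance (assumed)
    street_address="1 Legoland Dr",
    city="Carlsbad",
    state_province="CA",
    postal_code="92008",
    country="USA",
)
hq.formatted_location  # "1 Legoland Dr, Carlsbad, CA, 92008" (USA is omitted)
hq.location_display    # "Carlsbad, CA"
```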
File diff suppressed because one or more lines are too long
@@ -1,467 +0,0 @@
|
||||
"""
|
||||
Park Filter Service
|
||||
|
||||
Provides filtering functionality, aggregations, and caching for park filters.
|
||||
This service handles complex filter logic and provides useful filter statistics.
|
||||
"""
|
||||
|
||||
from typing import Dict, List, Any, Optional
|
||||
from django.db.models import QuerySet, Count, Q
|
||||
from django.core.cache import cache
|
||||
from django.conf import settings
|
||||
from ..models import Park, Company
|
||||
from ..querysets import get_base_park_queryset
|
||||
|
||||
|
||||
class ParkFilterService:
|
||||
"""
|
||||
Service class for handling park filtering operations, aggregations,
|
||||
and providing filter suggestions based on available data.
|
||||
"""
|
||||
|
||||
CACHE_TIMEOUT = getattr(settings, "PARK_FILTER_CACHE_TIMEOUT", 300) # 5 minutes
|
||||
|
||||
def __init__(self):
|
||||
self.cache_prefix = "park_filter"
|
||||
|
||||
def get_filter_counts(
|
||||
self, base_queryset: Optional[QuerySet] = None
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Get counts for various filter options with optimized single-query aggregations.
|
||||
This eliminates multiple expensive COUNT queries.
|
||||
|
||||
Args:
|
||||
base_queryset: Optional base queryset to use for calculations
|
||||
|
||||
Returns:
|
||||
Dictionary containing counts for different filter categories
|
||||
"""
|
||||
cache_key = f"{self.cache_prefix}:filter_counts"
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result is not None:
|
||||
return cached_result
|
||||
|
||||
from apps.core.utils.query_optimization import track_queries
|
||||
|
||||
with track_queries("optimized_filter_counts"):
|
||||
if base_queryset is None:
|
||||
base_queryset = get_base_park_queryset()
|
||||
|
||||
# Use optimized single-query aggregation instead of multiple COUNT queries
|
||||
aggregates = base_queryset.aggregate(
|
||||
total_parks=Count('id'),
|
||||
operating_parks=Count('id', filter=Q(status='OPERATING')),
|
||||
parks_with_coasters=Count('id', filter=Q(coaster_count__gt=0)),
|
||||
big_parks=Count('id', filter=Q(ride_count__gte=10)),
|
||||
highly_rated=Count('id', filter=Q(average_rating__gte=4.0)),
|
||||
disney_parks=Count('id', filter=Q(operator__name__icontains='Disney')),
|
||||
universal_parks=Count('id', filter=Q(operator__name__icontains='Universal')),
|
||||
six_flags_parks=Count('id', filter=Q(operator__name__icontains='Six Flags')),
|
||||
cedar_fair_parks=Count('id', filter=Q(
|
||||
Q(operator__name__icontains='Cedar Fair') |
|
||||
Q(operator__name__icontains='Cedar Point') |
|
||||
Q(operator__name__icontains='Kings Island')
|
||||
))
|
||||
)
|
||||
|
||||
# Calculate filter counts efficiently
|
||||
filter_counts = {
|
||||
"total_parks": aggregates['total_parks'],
|
||||
"operating_parks": aggregates['operating_parks'],
|
||||
"parks_with_coasters": aggregates['parks_with_coasters'],
|
||||
"big_parks": aggregates['big_parks'],
|
||||
"highly_rated": aggregates['highly_rated'],
|
||||
"park_types": {
|
||||
"disney": aggregates['disney_parks'],
|
||||
"universal": aggregates['universal_parks'],
|
||||
"six_flags": aggregates['six_flags_parks'],
|
||||
"cedar_fair": aggregates['cedar_fair_parks'],
|
||||
},
|
||||
"top_operators": self._get_top_operators_optimized(base_queryset),
|
||||
"countries": self._get_country_counts_optimized(base_queryset),
|
||||
}
|
||||
|
||||
# Cache the result for longer since this is expensive
|
||||
cache.set(cache_key, filter_counts, self.CACHE_TIMEOUT * 2)
|
||||
return filter_counts
|
||||
|
||||
def _get_park_type_counts(self, queryset: QuerySet) -> Dict[str, int]:
|
||||
"""Get counts for different park types based on operator names."""
|
||||
return {
|
||||
"disney": queryset.filter(operator__name__icontains="Disney").count(),
|
||||
"universal": queryset.filter(operator__name__icontains="Universal").count(),
|
||||
"six_flags": queryset.filter(operator__name__icontains="Six Flags").count(),
|
||||
"cedar_fair": queryset.filter(
|
||||
Q(operator__name__icontains="Cedar Fair")
|
||||
| Q(operator__name__icontains="Cedar Point")
|
||||
| Q(operator__name__icontains="Kings Island")
|
||||
).count(),
|
||||
}
|
||||
|
||||
def _get_top_operators(
|
||||
self, queryset: QuerySet, limit: int = 10
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Get the top operators by number of parks."""
|
||||
return list(
|
||||
queryset.values("operator__name", "operator__id")
|
||||
.annotate(park_count=Count("id"))
|
||||
.filter(park_count__gt=0)
|
||||
.order_by("-park_count")[:limit]
|
||||
)
|
||||
|
||||
def _get_country_counts(
|
||||
self, queryset: QuerySet, limit: int = 10
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Get countries with the most parks."""
|
||||
return list(
|
||||
queryset.filter(location__country__isnull=False)
|
||||
.values("location__country")
|
||||
.annotate(park_count=Count("id"))
|
||||
.filter(park_count__gt=0)
|
||||
.order_by("-park_count")[:limit]
|
||||
)
|
||||
|
||||
def get_filter_suggestions(self, query: str) -> Dict[str, List[str]]:
|
||||
"""
|
||||
Get filter suggestions based on a search query.
|
||||
|
||||
Args:
|
||||
query: Search query string
|
||||
|
||||
Returns:
|
||||
Dictionary with suggestion categories
|
||||
"""
|
||||
cache_key = f"{self.cache_prefix}:suggestions:{query.lower()}"
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result is not None:
|
||||
return cached_result
|
||||
|
||||
suggestions = {
|
||||
"parks": [],
|
||||
"operators": [],
|
||||
"locations": [],
|
||||
}
|
||||
|
||||
if len(query) >= 2: # Only search for queries of 2+ characters
|
||||
# Park name suggestions
|
||||
park_names = Park.objects.filter(name__icontains=query).values_list(
|
||||
"name", flat=True
|
||||
)[:5]
|
||||
suggestions["parks"] = list(park_names)
|
||||
|
||||
# Operator suggestions
|
||||
operator_names = Company.objects.filter(
|
||||
roles__contains=["OPERATOR"], name__icontains=query
|
||||
).values_list("name", flat=True)[:5]
|
||||
suggestions["operators"] = list(operator_names)
|
||||
|
||||
# Location suggestions (cities and countries)
|
||||
locations = Park.objects.filter(
|
||||
Q(location__city__icontains=query)
|
||||
| Q(location__country__icontains=query)
|
||||
).values_list("location__city", "location__country")[:5]
|
||||
|
||||
location_suggestions = []
|
||||
for city, country in locations:
|
||||
if city and city.lower().startswith(query.lower()):
|
||||
location_suggestions.append(city)
|
||||
elif country and country.lower().startswith(query.lower()):
|
||||
location_suggestions.append(country)
|
||||
|
||||
suggestions["locations"] = list(set(location_suggestions))[:5]
|
||||
|
||||
# Cache suggestions for a shorter time
|
||||
cache.set(cache_key, suggestions, 60) # 1 minute cache
|
||||
return suggestions
|
||||
|
||||
def get_popular_filters(self) -> Dict[str, Any]:
|
||||
"""
|
||||
Get commonly used filter combinations and popular filter values.
|
||||
|
||||
Returns:
|
||||
Dictionary containing popular filter configurations
|
||||
"""
|
||||
cache_key = f"{self.cache_prefix}:popular_filters"
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result is not None:
|
||||
return cached_result
|
||||
|
||||
base_qs = get_base_park_queryset()
|
||||
|
||||
popular_filters = {
|
||||
"quick_filters": [
|
||||
{
|
||||
"label": "Disney Parks",
|
||||
"filters": {"park_type": "disney"},
|
||||
"count": base_qs.filter(operator__name__icontains="Disney").count(),
|
||||
},
|
||||
{
|
||||
"label": "Parks with Coasters",
|
||||
"filters": {"has_coasters": True},
|
||||
"count": base_qs.filter(coaster_count__gt=0).count(),
|
||||
},
|
||||
{
|
||||
"label": "Highly Rated",
|
||||
"filters": {"min_rating": "4"},
|
||||
"count": base_qs.filter(average_rating__gte=4.0).count(),
|
||||
},
|
||||
{
|
||||
"label": "Major Parks",
|
||||
"filters": {"big_parks_only": True},
|
||||
"count": base_qs.filter(ride_count__gte=10).count(),
|
||||
},
|
||||
],
|
||||
"recommended_sorts": [
|
||||
{"value": "-average_rating", "label": "Highest Rated"},
|
||||
{"value": "-coaster_count", "label": "Most Coasters"},
|
||||
{"value": "name", "label": "A-Z"},
|
||||
],
|
||||
}
|
||||
|
||||
# Cache for longer since these don't change often
|
||||
cache.set(cache_key, popular_filters, self.CACHE_TIMEOUT * 2)
|
||||
return popular_filters
|
||||
|
||||
def clear_filter_cache(self) -> None:
|
||||
"""Clear all cached filter data."""
|
||||
# Simple cache clearing - delete known keys
|
||||
cache_keys = [
|
||||
f"{self.cache_prefix}:filter_counts",
|
||||
f"{self.cache_prefix}:popular_filters",
|
||||
]
|
||||
for key in cache_keys:
|
||||
cache.delete(key)
|
||||
|
||||
def get_optimized_filtered_queryset(self, filters: Dict[str, Any]) -> QuerySet: # noqa: C901
|
||||
"""
|
||||
Apply filters to get a filtered queryset with comprehensive optimizations.
|
||||
This method eliminates the expensive subquery pattern and builds an optimized
|
||||
queryset from the ground up.
|
||||
|
||||
Args:
|
||||
filters: Dictionary of filter parameters
|
||||
|
||||
Returns:
|
||||
Filtered and optimized QuerySet
|
||||
"""
|
||||
from apps.core.utils.query_optimization import track_queries
|
||||
|
||||
with track_queries("optimized_filtered_queryset"):
|
||||
# Start with base Park queryset and apply all optimizations at once
|
||||
queryset = (
|
||||
Park.objects
|
||||
.select_related(
|
||||
"operator",
|
||||
"property_owner",
|
||||
"location",
|
||||
"banner_image",
|
||||
"card_image"
|
||||
)
|
||||
.prefetch_related(
|
||||
"photos",
|
||||
"rides__manufacturer",
|
||||
"areas"
|
||||
)
|
||||
.annotate(
|
||||
current_ride_count=Count("rides", distinct=True),
|
||||
current_coaster_count=Count(
|
||||
"rides", filter=Q(rides__category="RC"), distinct=True
|
||||
),
|
||||
)
|
||||
)
|
||||
|
||||
# Build optimized filter conditions
|
||||
filter_conditions = Q()
|
||||
|
||||
# Apply status filter
|
||||
if filters.get("status"):
|
||||
filter_conditions &= Q(status=filters["status"])
|
||||
|
||||
# Apply park type filter
|
||||
if filters.get("park_type"):
|
||||
filter_conditions &= self._get_park_type_filter(filters["park_type"])
|
||||
|
||||
# Apply coaster filter
|
||||
if filters.get("has_coasters"):
|
||||
filter_conditions &= Q(coaster_count__gt=0)
|
||||
|
||||
# Apply rating filter
|
||||
if filters.get("min_rating"):
|
||||
try:
|
||||
min_rating = float(filters["min_rating"])
|
||||
filter_conditions &= Q(average_rating__gte=min_rating)
|
||||
except (ValueError, TypeError):
|
||||
pass
|
||||
|
||||
# Apply big parks filter
|
||||
if filters.get("big_parks_only"):
|
||||
filter_conditions &= Q(ride_count__gte=10)
|
||||
|
||||
# Apply optimized search using search_text field
|
||||
if filters.get("search"):
|
||||
search_query = filters["search"].strip()
|
||||
if search_query:
|
||||
# Use the computed search_text field for better performance
|
||||
search_conditions = (
|
||||
Q(search_text__icontains=search_query)
|
||||
| Q(name__icontains=search_query)
|
||||
| Q(location__city__icontains=search_query)
|
||||
| Q(location__country__icontains=search_query)
|
||||
)
|
||||
filter_conditions &= search_conditions
|
||||
|
||||
# Apply location filters
|
||||
if filters.get("country_filter"):
|
||||
filter_conditions &= Q(
|
||||
location__country__icontains=filters["country_filter"]
|
||||
)
|
||||
|
||||
if filters.get("state_filter"):
|
||||
filter_conditions &= Q(
|
||||
location__state__icontains=filters["state_filter"]
|
||||
)
|
||||
|
||||
# Apply all filters at once for better query planning
|
||||
if filter_conditions:
|
||||
queryset = queryset.filter(filter_conditions)
|
||||
|
||||
return queryset.distinct()
|
||||
|
||||
def get_filtered_queryset(self, filters: Dict[str, Any]) -> QuerySet: # noqa: C901
|
||||
"""
|
||||
Legacy method - kept for backward compatibility.
|
||||
Use get_optimized_filtered_queryset for new implementations.
|
||||
"""
|
||||
queryset = (
|
||||
get_base_park_queryset()
|
||||
.select_related("operator", "property_owner", "location")
|
||||
.prefetch_related("photos", "rides__manufacturer")
|
||||
)
|
||||
|
||||
# Apply status filter
|
||||
if filters.get("status"):
|
||||
queryset = queryset.filter(status=filters["status"])
|
||||
|
||||
# Apply park type filter
|
||||
if filters.get("park_type"):
|
||||
queryset = self._apply_park_type_filter(queryset, filters["park_type"])
|
||||
|
||||
# Apply coaster filter
|
||||
if filters.get("has_coasters"):
|
||||
queryset = queryset.filter(coaster_count__gt=0)
|
||||
|
||||
# Apply rating filter
|
||||
if filters.get("min_rating"):
|
||||
try:
|
||||
min_rating = float(filters["min_rating"])
|
||||
queryset = queryset.filter(average_rating__gte=min_rating)
|
||||
except (ValueError, TypeError):
|
||||
pass
|
||||
|
||||
# Apply big parks filter
|
||||
if filters.get("big_parks_only"):
|
||||
queryset = queryset.filter(ride_count__gte=10)
|
||||
|
||||
# Apply search
|
||||
if filters.get("search"):
|
||||
search_query = filters["search"]
|
||||
queryset = queryset.filter(
|
||||
Q(name__icontains=search_query)
|
||||
| Q(description__icontains=search_query)
|
||||
| Q(location__city__icontains=search_query)
|
||||
| Q(location__country__icontains=search_query)
|
||||
)
|
||||
|
||||
# Apply location filters
|
||||
if filters.get("country_filter"):
|
||||
queryset = queryset.filter(
|
||||
location__country__icontains=filters["country_filter"]
|
||||
)
|
||||
|
||||
if filters.get("state_filter"):
|
||||
queryset = queryset.filter(
|
||||
location__state__icontains=filters["state_filter"]
|
||||
)
|
||||
|
||||
# Apply ordering
|
||||
if filters.get("ordering"):
|
||||
queryset = queryset.order_by(filters["ordering"])
|
||||
|
||||
return queryset.distinct()
|
||||
|
||||
def _apply_park_type_filter(self, queryset: QuerySet, park_type: str) -> QuerySet:
|
||||
"""Apply park type filter logic."""
|
||||
type_filters = {
|
||||
"disney": Q(operator__name__icontains="Disney"),
|
||||
"universal": Q(operator__name__icontains="Universal"),
|
||||
"six_flags": Q(operator__name__icontains="Six Flags"),
|
||||
"cedar_fair": (
|
||||
Q(operator__name__icontains="Cedar Fair")
|
||||
| Q(operator__name__icontains="Cedar Point")
|
||||
| Q(operator__name__icontains="Kings Island")
|
||||
| Q(operator__name__icontains="Canada's Wonderland")
|
||||
),
|
||||
"independent": ~(
|
||||
Q(operator__name__icontains="Disney")
|
||||
| Q(operator__name__icontains="Universal")
|
||||
| Q(operator__name__icontains="Six Flags")
|
||||
| Q(operator__name__icontains="Cedar Fair")
|
||||
| Q(operator__name__icontains="Cedar Point")
|
||||
),
|
||||
}
|
||||
|
||||
if park_type in type_filters:
|
||||
return queryset.filter(type_filters[park_type])
|
||||
|
||||
return queryset
|
||||
|
||||
def _get_park_type_filter(self, park_type: str) -> Q:
|
||||
"""Get park type filter as Q object for optimized filtering."""
|
||||
type_filters = {
|
||||
"disney": Q(operator__name__icontains="Disney"),
|
||||
"universal": Q(operator__name__icontains="Universal"),
|
||||
"six_flags": Q(operator__name__icontains="Six Flags"),
|
||||
"cedar_fair": (
|
||||
Q(operator__name__icontains="Cedar Fair")
|
||||
| Q(operator__name__icontains="Cedar Point")
|
||||
| Q(operator__name__icontains="Kings Island")
|
||||
| Q(operator__name__icontains="Canada's Wonderland")
|
||||
),
|
||||
"independent": ~(
|
||||
Q(operator__name__icontains="Disney")
|
||||
| Q(operator__name__icontains="Universal")
|
||||
| Q(operator__name__icontains="Six Flags")
|
||||
| Q(operator__name__icontains="Cedar Fair")
|
||||
| Q(operator__name__icontains="Cedar Point")
|
||||
| Q(operator__name__icontains="Kings Island")
|
||||
| Q(operator__name__icontains="Canada's Wonderland")
|
||||
),
|
||||
}
|
||||
return type_filters.get(park_type, Q())
|
||||
|
||||
def _get_top_operators_optimized(
|
||||
self, queryset: QuerySet, limit: int = 10
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Get the top operators by number of parks using optimized query."""
|
||||
return list(
|
||||
queryset.values("operator__name", "operator__id")
|
||||
.annotate(park_count=Count("id"))
|
||||
.filter(park_count__gt=0)
|
||||
.order_by("-park_count")[:limit]
|
||||
)
|
||||
|
||||
def _get_country_counts_optimized(
|
||||
self, queryset: QuerySet, limit: int = 10
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Get countries with the most parks using optimized query."""
|
||||
return list(
|
||||
queryset.filter(location__country__isnull=False)
|
||||
.values("location__country")
|
||||
.annotate(park_count=Count("id"))
|
||||
.filter(park_count__gt=0)
|
||||
.order_by("-park_count")[:limit]
|
||||
)
|
||||
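A minimal usage sketch of the service above, combining the cached counts with the optimized queryset builder; the filter keys are the ones handled in `get_optimized_filtered_queryset`:

```python
service = ParkFilterService()

counts = service.get_filter_counts()  # cached aggregate counts
queryset = service.get_optimized_filtered_queryset(
    {"park_type": "disney", "has_coasters": True, "min_rating": "4"}
)
suggestions = service.get_filter_suggestions("ceda")  # park / operator / location names
```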
@@ -1,428 +0,0 @@
|
||||
"""
|
||||
Smart Park Loader for Hybrid Filtering Strategy
|
||||
|
||||
This module provides intelligent data loading capabilities for the hybrid filtering approach,
|
||||
optimizing database queries and implementing progressive loading strategies.
|
||||
"""
|
||||
|
||||
from typing import Dict, Optional, Any
|
||||
from django.db import models
|
||||
from django.core.cache import cache
|
||||
from django.conf import settings
|
||||
from apps.parks.models import Park
|
||||
|
||||
|
||||
class SmartParkLoader:
|
||||
"""
|
||||
Intelligent park data loader that optimizes queries based on filtering requirements.
|
||||
Implements progressive loading and smart caching strategies.
|
||||
"""
|
||||
|
||||
# Cache configuration
|
||||
CACHE_TIMEOUT = getattr(settings, 'HYBRID_FILTER_CACHE_TIMEOUT', 300) # 5 minutes
|
||||
CACHE_KEY_PREFIX = 'hybrid_parks'
|
||||
|
||||
# Progressive loading thresholds
|
||||
INITIAL_LOAD_SIZE = 50
|
||||
PROGRESSIVE_LOAD_SIZE = 25
|
||||
MAX_CLIENT_SIDE_RECORDS = 200
|
||||
|
||||
def __init__(self):
|
||||
self.base_queryset = self._get_optimized_queryset()
|
||||
|
||||
def _get_optimized_queryset(self) -> models.QuerySet:
|
||||
"""Get optimized base queryset with all necessary prefetches."""
|
||||
return Park.objects.select_related(
|
||||
'operator',
|
||||
'property_owner',
|
||||
'banner_image',
|
||||
'card_image',
|
||||
).prefetch_related(
|
||||
'location', # ParkLocation relationship
|
||||
).filter(
|
||||
# Only include operating and temporarily closed parks by default
|
||||
status__in=['OPERATING', 'CLOSED_TEMP']
|
||||
).order_by('name')
|
||||
|
||||
def get_initial_load(self, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""
|
||||
Get initial park data load with smart filtering decisions.
|
||||
|
||||
Args:
|
||||
filters: Optional filters to apply
|
||||
|
||||
Returns:
|
||||
Dictionary containing parks data and metadata
|
||||
"""
|
||||
cache_key = self._generate_cache_key('initial', filters)
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result:
|
||||
return cached_result
|
||||
|
||||
# Apply filters if provided
|
||||
queryset = self.base_queryset
|
||||
if filters:
|
||||
queryset = self._apply_filters(queryset, filters)
|
||||
|
||||
# Get total count for pagination decisions
|
||||
total_count = queryset.count()
|
||||
|
||||
# Determine loading strategy
|
||||
if total_count <= self.MAX_CLIENT_SIDE_RECORDS:
|
||||
# Load all data for client-side filtering
|
||||
parks = list(queryset.all())
|
||||
strategy = 'client_side'
|
||||
has_more = False
|
||||
else:
|
||||
# Load initial batch for server-side pagination
|
||||
parks = list(queryset[:self.INITIAL_LOAD_SIZE])
|
||||
strategy = 'server_side'
|
||||
has_more = total_count > self.INITIAL_LOAD_SIZE
|
||||
|
||||
result = {
|
||||
'parks': parks,
|
||||
'total_count': total_count,
|
||||
'strategy': strategy,
|
||||
'has_more': has_more,
|
||||
'next_offset': len(parks) if has_more else None,
|
||||
'filter_metadata': self._get_filter_metadata(queryset),
|
||||
}
|
||||
|
||||
# Cache the result
|
||||
cache.set(cache_key, result, self.CACHE_TIMEOUT)
|
||||
|
||||
return result
|
||||
|
||||
def get_progressive_load(
|
||||
self,
|
||||
offset: int,
|
||||
filters: Optional[Dict[str, Any]] = None
|
||||
) -> Dict[str, Any]:
|
||||
"""
|
||||
Get next batch of parks for progressive loading.
|
||||
|
||||
Args:
|
||||
offset: Starting offset for the batch
|
||||
filters: Optional filters to apply
|
||||
|
||||
Returns:
|
||||
Dictionary containing parks data and metadata
|
||||
"""
|
||||
cache_key = self._generate_cache_key(f'progressive_{offset}', filters)
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result:
|
||||
return cached_result
|
||||
|
||||
# Apply filters if provided
|
||||
queryset = self.base_queryset
|
||||
if filters:
|
||||
queryset = self._apply_filters(queryset, filters)
|
||||
|
||||
# Get the batch
|
||||
end_offset = offset + self.PROGRESSIVE_LOAD_SIZE
|
||||
parks = list(queryset[offset:end_offset])
|
||||
|
||||
# Check if there are more records
|
||||
total_count = queryset.count()
|
||||
has_more = end_offset < total_count
|
||||
|
||||
result = {
|
||||
'parks': parks,
|
||||
'total_count': total_count,
|
||||
'has_more': has_more,
|
||||
'next_offset': end_offset if has_more else None,
|
||||
}
|
||||
|
||||
# Cache the result
|
||||
cache.set(cache_key, result, self.CACHE_TIMEOUT)
|
||||
|
||||
return result
|
||||
|
||||
def get_filter_metadata(self, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""
|
||||
Get metadata about available filter options.
|
||||
|
||||
Args:
|
||||
filters: Current filters to scope the metadata
|
||||
|
||||
Returns:
|
||||
Dictionary containing filter metadata
|
||||
"""
|
||||
cache_key = self._generate_cache_key('metadata', filters)
|
||||
cached_result = cache.get(cache_key)
|
||||
|
||||
if cached_result:
|
||||
return cached_result
|
||||
|
||||
# Apply filters if provided
|
||||
queryset = self.base_queryset
|
||||
if filters:
|
||||
queryset = self._apply_filters(queryset, filters)
|
||||
|
||||
result = self._get_filter_metadata(queryset)
|
||||
|
||||
# Cache the result
|
||||
cache.set(cache_key, result, self.CACHE_TIMEOUT)
|
||||
|
||||
return result
|
||||
|
||||
def _apply_filters(self, queryset: models.QuerySet, filters: Dict[str, Any]) -> models.QuerySet:
|
||||
"""Apply filters to the queryset."""
|
||||
|
||||
# Status filter
|
||||
if 'status' in filters and filters['status']:
|
||||
if isinstance(filters['status'], list):
|
||||
queryset = queryset.filter(status__in=filters['status'])
|
||||
else:
|
||||
queryset = queryset.filter(status=filters['status'])
|
||||
|
||||
# Park type filter
|
||||
if 'park_type' in filters and filters['park_type']:
|
||||
if isinstance(filters['park_type'], list):
|
||||
queryset = queryset.filter(park_type__in=filters['park_type'])
|
||||
else:
|
||||
queryset = queryset.filter(park_type=filters['park_type'])
|
||||
|
||||
# Country filter
|
||||
if 'country' in filters and filters['country']:
|
||||
queryset = queryset.filter(location__country__in=filters['country'])
|
||||
|
||||
# State filter
|
||||
if 'state' in filters and filters['state']:
|
||||
queryset = queryset.filter(location__state__in=filters['state'])
|
||||
|
||||
# Opening year range
|
||||
if 'opening_year_min' in filters and filters['opening_year_min']:
|
||||
queryset = queryset.filter(opening_year__gte=filters['opening_year_min'])
|
||||
|
||||
if 'opening_year_max' in filters and filters['opening_year_max']:
|
||||
queryset = queryset.filter(opening_year__lte=filters['opening_year_max'])
|
||||
|
||||
# Size range
|
||||
if 'size_min' in filters and filters['size_min']:
|
||||
queryset = queryset.filter(size_acres__gte=filters['size_min'])
|
||||
|
||||
if 'size_max' in filters and filters['size_max']:
|
||||
queryset = queryset.filter(size_acres__lte=filters['size_max'])
|
||||
|
||||
# Rating range
|
||||
if 'rating_min' in filters and filters['rating_min']:
|
||||
queryset = queryset.filter(average_rating__gte=filters['rating_min'])
|
||||
|
||||
if 'rating_max' in filters and filters['rating_max']:
|
||||
queryset = queryset.filter(average_rating__lte=filters['rating_max'])
|
||||
|
||||
# Ride count range
|
||||
if 'ride_count_min' in filters and filters['ride_count_min']:
|
||||
queryset = queryset.filter(ride_count__gte=filters['ride_count_min'])
|
||||
|
||||
if 'ride_count_max' in filters and filters['ride_count_max']:
|
||||
queryset = queryset.filter(ride_count__lte=filters['ride_count_max'])
|
||||
|
||||
# Coaster count range
|
||||
if 'coaster_count_min' in filters and filters['coaster_count_min']:
|
||||
queryset = queryset.filter(coaster_count__gte=filters['coaster_count_min'])
|
||||
|
||||
if 'coaster_count_max' in filters and filters['coaster_count_max']:
|
||||
queryset = queryset.filter(coaster_count__lte=filters['coaster_count_max'])
|
||||
|
||||
# Operator filter
|
||||
if 'operator' in filters and filters['operator']:
|
||||
if isinstance(filters['operator'], list):
|
||||
queryset = queryset.filter(operator__slug__in=filters['operator'])
|
||||
else:
|
||||
queryset = queryset.filter(operator__slug=filters['operator'])
|
||||
|
||||
# Search query
|
||||
if 'search' in filters and filters['search']:
|
||||
search_term = filters['search'].lower()
|
||||
queryset = queryset.filter(search_text__icontains=search_term)
|
||||
|
||||
return queryset
|
||||
|
||||
def _get_filter_metadata(self, queryset: models.QuerySet) -> Dict[str, Any]:
|
||||
"""Generate filter metadata from the current queryset."""
|
||||
|
||||
# Get distinct values for categorical filters with counts
|
||||
countries_data = list(
|
||||
queryset.values('location__country')
|
||||
.exclude(location__country__isnull=True)
|
||||
.annotate(count=models.Count('id'))
|
||||
.order_by('location__country')
|
||||
)
|
||||
|
||||
states_data = list(
|
||||
queryset.values('location__state')
|
||||
.exclude(location__state__isnull=True)
|
||||
.annotate(count=models.Count('id'))
|
||||
.order_by('location__state')
|
||||
)
|
||||
|
||||
park_types_data = list(
|
||||
queryset.values('park_type')
|
||||
.exclude(park_type__isnull=True)
|
||||
.annotate(count=models.Count('id'))
|
||||
.order_by('park_type')
|
||||
)
|
||||
|
||||
statuses_data = list(
|
||||
queryset.values('status')
|
||||
.annotate(count=models.Count('id'))
|
||||
.order_by('status')
|
||||
)
|
||||
|
||||
operators_data = list(
|
||||
queryset.select_related('operator')
|
||||
.values('operator__id', 'operator__name', 'operator__slug')
|
||||
.exclude(operator__isnull=True)
|
||||
.annotate(count=models.Count('id'))
|
||||
.order_by('operator__name')
|
||||
)
|
||||
|
||||
# Convert to frontend-expected format with value/label/count
|
||||
countries = [
|
||||
{
|
||||
'value': item['location__country'],
|
||||
'label': item['location__country'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in countries_data
|
||||
]
|
||||
|
||||
states = [
|
||||
{
|
||||
'value': item['location__state'],
|
||||
'label': item['location__state'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in states_data
|
||||
]
|
||||
|
||||
park_types = [
|
||||
{
|
||||
'value': item['park_type'],
|
||||
'label': item['park_type'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in park_types_data
|
||||
]
|
||||
|
||||
statuses = [
|
||||
{
|
||||
'value': item['status'],
|
||||
'label': self._get_status_label(item['status']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in statuses_data
|
||||
]
|
||||
|
||||
operators = [
|
||||
{
|
||||
'value': item['operator__slug'],
|
||||
'label': item['operator__name'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in operators_data
|
||||
]
|
||||
|
||||
# Get ranges for numerical filters
|
||||
aggregates = queryset.aggregate(
|
||||
opening_year_min=models.Min('opening_year'),
|
||||
opening_year_max=models.Max('opening_year'),
|
||||
size_min=models.Min('size_acres'),
|
||||
size_max=models.Max('size_acres'),
|
||||
rating_min=models.Min('average_rating'),
|
||||
rating_max=models.Max('average_rating'),
|
||||
ride_count_min=models.Min('ride_count'),
|
||||
ride_count_max=models.Max('ride_count'),
|
||||
coaster_count_min=models.Min('coaster_count'),
|
||||
coaster_count_max=models.Max('coaster_count'),
|
||||
)
|
||||
|
||||
return {
|
||||
'categorical': {
|
||||
'countries': countries,
|
||||
'states': states,
|
||||
'park_types': park_types,
|
||||
'statuses': statuses,
|
||||
'operators': operators,
|
||||
},
|
||||
'ranges': {
|
||||
'opening_year': {
|
||||
'min': aggregates['opening_year_min'],
|
||||
'max': aggregates['opening_year_max'],
|
||||
'step': 1,
|
||||
'unit': 'year'
|
||||
},
|
||||
'size_acres': {
|
||||
'min': float(aggregates['size_min']) if aggregates['size_min'] else None,
|
||||
'max': float(aggregates['size_max']) if aggregates['size_max'] else None,
|
||||
'step': 1.0,
|
||||
'unit': 'acres'
|
||||
},
|
||||
'average_rating': {
|
||||
'min': float(aggregates['rating_min']) if aggregates['rating_min'] else None,
|
||||
'max': float(aggregates['rating_max']) if aggregates['rating_max'] else None,
|
||||
'step': 0.1,
|
||||
'unit': 'stars'
|
||||
},
|
||||
'ride_count': {
|
||||
'min': aggregates['ride_count_min'],
|
||||
'max': aggregates['ride_count_max'],
|
||||
'step': 1,
|
||||
'unit': 'rides'
|
||||
},
|
||||
'coaster_count': {
|
||||
'min': aggregates['coaster_count_min'],
|
||||
'max': aggregates['coaster_count_max'],
|
||||
'step': 1,
|
||||
'unit': 'coasters'
|
||||
},
|
||||
},
|
||||
'total_count': queryset.count(),
|
||||
}
|
||||
|
||||
def _get_status_label(self, status: str) -> str:
|
||||
"""Convert status code to human-readable label."""
|
||||
status_labels = {
|
||||
'OPERATING': 'Operating',
|
||||
'CLOSED_TEMP': 'Temporarily Closed',
|
||||
'CLOSED_PERM': 'Permanently Closed',
|
||||
'UNDER_CONSTRUCTION': 'Under Construction',
|
||||
}
|
||||
if status in status_labels:
|
||||
return status_labels[status]
|
||||
else:
|
||||
raise ValueError(f"Unknown park status: {status}")
|
||||
|
||||
def _generate_cache_key(self, operation: str, filters: Optional[Dict[str, Any]] = None) -> str:
|
||||
"""Generate cache key for the given operation and filters."""
|
||||
key_parts = [self.CACHE_KEY_PREFIX, operation]
|
||||
|
||||
if filters:
|
||||
# Create a consistent string representation of filters
|
||||
filter_str = '_'.join(f"{k}:{v}" for k, v in sorted(filters.items()) if v)
|
||||
key_parts.append(filter_str)
|
||||
|
||||
return '_'.join(key_parts)
|
||||
|
||||
def invalidate_cache(self, filters: Optional[Dict[str, Any]] = None) -> None:
|
||||
"""Invalidate cached data for the given filters."""
|
||||
# This is a simplified implementation
|
||||
# In production, you might want to use cache versioning or tags
|
||||
cache_keys = [
|
||||
self._generate_cache_key('initial', filters),
|
||||
self._generate_cache_key('metadata', filters),
|
||||
]
|
||||
|
||||
# Also invalidate progressive load caches
|
||||
for offset in range(0, 1000, self.PROGRESSIVE_LOAD_SIZE):
|
||||
cache_keys.append(self._generate_cache_key(f'progressive_{offset}', filters))
|
||||
|
||||
cache.delete_many(cache_keys)
|
||||
|
||||
|
||||
# Singleton instance
|
||||
smart_park_loader = SmartParkLoader()
|
||||
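A usage sketch of the hybrid loading strategy defined above; the import path is an assumption, while the `smart_park_loader` singleton, the result keys, and the filter names come from the module itself.

```python
from apps.parks.services.smart_park_loader import smart_park_loader  # hypothetical path

filters = {"status": "OPERATING", "country": ["United States"]}

# First request: the loader picks client-side vs server-side strategy based on
# MAX_CLIENT_SIDE_RECORDS and returns the initial batch plus filter metadata.
initial = smart_park_loader.get_initial_load(filters)

if initial["strategy"] == "server_side" and initial["has_more"]:
    # Subsequent requests page through the same filtered queryset in
    # PROGRESSIVE_LOAD_SIZE batches, starting from next_offset.
    next_batch = smart_park_loader.get_progressive_load(initial["next_offset"], filters)
```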
@@ -1,311 +0,0 @@
"""
Optimized pagination service for large datasets with efficient counting.
"""

from typing import Dict, Any, Optional, Tuple
from django.core.paginator import Paginator, Page
from django.core.cache import cache
from django.db.models import QuerySet, Count
from django.conf import settings
import hashlib
import time
import logging

logger = logging.getLogger("pagination_service")


class OptimizedPaginator(Paginator):
    """
    Custom paginator that optimizes COUNT queries and provides caching.
    """

    def __init__(self, object_list, per_page, cache_timeout=300, **kwargs):
        super().__init__(object_list, per_page, **kwargs)
        self.cache_timeout = cache_timeout
        self._cached_count = None
        self._count_cache_key = None

    def _get_count_cache_key(self) -> str:
        """Generate cache key for count based on queryset SQL."""
        if self._count_cache_key:
            return self._count_cache_key

        # Create cache key from queryset SQL
        if hasattr(self.object_list, 'query'):
            sql_hash = hashlib.md5(
                str(self.object_list.query).encode('utf-8')
            ).hexdigest()[:16]
            self._count_cache_key = f"paginator_count:{sql_hash}"
        else:
            # Fallback for non-queryset object lists
            self._count_cache_key = f"paginator_count:list:{len(self.object_list)}"

        return self._count_cache_key

    @property
    def count(self):
        """
        Optimized count with caching for expensive querysets.
        """
        if self._cached_count is not None:
            return self._cached_count

        cache_key = self._get_count_cache_key()
        cached_count = cache.get(cache_key)

        if cached_count is not None:
            logger.debug(f"Cache hit for pagination count: {cache_key}")
            self._cached_count = cached_count
            return cached_count

        # Perform optimized count
        start_time = time.time()

        if hasattr(self.object_list, 'count'):
            # For QuerySets, try to optimize the count query
            count = self._get_optimized_count()
        else:
            count = len(self.object_list)

        execution_time = time.time() - start_time

        # Cache the result
        cache.set(cache_key, count, self.cache_timeout)
        self._cached_count = count

        if execution_time > 0.5:  # Log slow count queries
            logger.warning(
                f"Slow pagination count query: {execution_time:.3f}s for {count} items",
                extra={'cache_key': cache_key, 'execution_time': execution_time}
            )

        return count

    def _get_optimized_count(self) -> int:
        """
        Get optimized count for complex querysets.
        """
        queryset = self.object_list

        # For complex queries with joins, use approximate counting for very large datasets
        if self._is_complex_query(queryset):
            # Try to get count from a simpler subquery
            try:
                # Use subquery approach for complex queries
                subquery = queryset.values('pk')
                return subquery.count()
            except Exception as e:
                logger.warning(f"Optimized count failed, falling back to standard count: {e}")
                return queryset.count()
        else:
            return queryset.count()

    def _is_complex_query(self, queryset) -> bool:
        """
        Determine if a queryset is complex and might benefit from optimization.
        """
        if not hasattr(queryset, 'query'):
            return False

        sql = str(queryset.query).upper()

        # Consider complex if it has multiple joins or subqueries
        complexity_indicators = [
            'JOIN' in sql and sql.count('JOIN') > 2,
            'DISTINCT' in sql,
            'GROUP BY' in sql,
            'HAVING' in sql,
        ]

        return any(complexity_indicators)


class CursorPaginator:
    """
    Cursor-based pagination for very large datasets.
    More efficient than offset-based pagination for large page numbers.
    """

    def __init__(self, queryset: QuerySet, ordering_field: str = 'id', per_page: int = 20):
        self.queryset = queryset
        self.ordering_field = ordering_field
        self.per_page = per_page
        self.reverse = ordering_field.startswith('-')
        self.field_name = ordering_field.lstrip('-')

    def get_page(self, cursor: Optional[str] = None) -> Dict[str, Any]:
        """
        Get a page of results using cursor-based pagination.

        Args:
            cursor: Base64 encoded cursor value from previous page

        Returns:
            Dictionary with page data and navigation cursors
        """
        queryset = self.queryset.order_by(self.ordering_field)

        if cursor:
            # Decode cursor and filter from that point
            try:
                cursor_value = self._decode_cursor(cursor)
                if self.reverse:
                    queryset = queryset.filter(**{f"{self.field_name}__lt": cursor_value})
                else:
                    queryset = queryset.filter(**{f"{self.field_name}__gt": cursor_value})
            except (ValueError, TypeError):
                # Invalid cursor, start from beginning
                pass

        # Get one extra item to check if there's a next page
        items = list(queryset[:self.per_page + 1])
        has_next = len(items) > self.per_page

        if has_next:
            items = items[:-1]  # Remove the extra item

        # Generate cursors for navigation
        next_cursor = None
        previous_cursor = None

        if items and has_next:
            last_item = items[-1]
            next_cursor = self._encode_cursor(getattr(last_item, self.field_name))

        if items and cursor:
            first_item = items[0]
            previous_cursor = self._encode_cursor(getattr(first_item, self.field_name))

        return {
            'items': items,
            'has_next': has_next,
            'has_previous': cursor is not None,
            'next_cursor': next_cursor,
            'previous_cursor': previous_cursor,
            'count': len(items)
        }

    def _encode_cursor(self, value) -> str:
        """Encode cursor value to base64 string."""
        import base64
        return base64.b64encode(str(value).encode()).decode()

    def _decode_cursor(self, cursor: str):
        """Decode cursor from base64 string."""
        import base64
        decoded = base64.b64decode(cursor.encode()).decode()

        # Try to convert to appropriate type based on field
        field = self.queryset.model._meta.get_field(self.field_name)

        if hasattr(field, 'to_python'):
            return field.to_python(decoded)
        return decoded


class PaginationCache:
    """
    Advanced caching for pagination metadata and results.
    """

    CACHE_PREFIX = "pagination"
    DEFAULT_TIMEOUT = 300  # 5 minutes

    @classmethod
    def get_page_cache_key(cls, queryset_hash: str, page_num: int) -> str:
        """Generate cache key for a specific page."""
        return f"{cls.CACHE_PREFIX}:page:{queryset_hash}:{page_num}"

    @classmethod
    def get_metadata_cache_key(cls, queryset_hash: str) -> str:
        """Generate cache key for pagination metadata."""
        return f"{cls.CACHE_PREFIX}:meta:{queryset_hash}"

    @classmethod
    def cache_page_results(
        cls,
        queryset_hash: str,
        page_num: int,
        page_data: Dict[str, Any],
        timeout: int = DEFAULT_TIMEOUT
    ):
        """Cache page results."""
        cache_key = cls.get_page_cache_key(queryset_hash, page_num)
        cache.set(cache_key, page_data, timeout)

    @classmethod
    def get_cached_page(cls, queryset_hash: str, page_num: int) -> Optional[Dict[str, Any]]:
        """Get cached page results."""
        cache_key = cls.get_page_cache_key(queryset_hash, page_num)
        return cache.get(cache_key)

    @classmethod
    def cache_metadata(
        cls,
        queryset_hash: str,
        metadata: Dict[str, Any],
        timeout: int = DEFAULT_TIMEOUT
    ):
        """Cache pagination metadata."""
        cache_key = cls.get_metadata_cache_key(queryset_hash)
        cache.set(cache_key, metadata, timeout)

    @classmethod
    def get_cached_metadata(cls, queryset_hash: str) -> Optional[Dict[str, Any]]:
        """Get cached pagination metadata."""
        cache_key = cls.get_metadata_cache_key(queryset_hash)
        return cache.get(cache_key)

    @classmethod
    def invalidate_cache(cls, queryset_hash: str):
        """Invalidate all cache entries for a queryset."""
        # This would require a cache backend that supports pattern deletion
        # For now, we'll rely on TTL expiration
        pass


def get_optimized_page(
    queryset: QuerySet,
    page_number: int,
    per_page: int = 20,
    use_cursor: bool = False,
    cursor: Optional[str] = None,
    cache_timeout: int = 300
) -> Tuple[Page, Dict[str, Any]]:
    """
    Get an optimized page with caching and performance monitoring.

    Args:
        queryset: The queryset to paginate
        page_number: Page number to retrieve
        per_page: Items per page
        use_cursor: Whether to use cursor-based pagination
        cursor: Cursor for cursor-based pagination
        cache_timeout: Cache timeout in seconds

    Returns:
        Tuple of (Page object, metadata dict)
    """
    if use_cursor:
        paginator = CursorPaginator(queryset, per_page=per_page)
        page_data = paginator.get_page(cursor)

        return page_data, {
            'pagination_type': 'cursor',
            'has_next': page_data['has_next'],
            'has_previous': page_data['has_previous'],
            'next_cursor': page_data['next_cursor'],
            'previous_cursor': page_data['previous_cursor']
        }
    else:
        paginator = OptimizedPaginator(queryset, per_page, cache_timeout=cache_timeout)
        page = paginator.get_page(page_number)

        return page, {
            'pagination_type': 'offset',
            'total_pages': paginator.num_pages,
            'total_count': paginator.count,
            'has_next': page.has_next(),
            'has_previous': page.has_previous(),
            'current_page': page.number
        }
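A usage sketch for the two pagination modes above; the Park queryset is illustrative, while `get_optimized_page`, `OptimizedPaginator`, and `CursorPaginator` come from this module.

```python
from apps.parks.models import Park

queryset = Park.objects.filter(status="OPERATING").order_by("name")

# Offset mode: classic page numbers; the COUNT query is cached by OptimizedPaginator.
page, meta = get_optimized_page(queryset, page_number=3, per_page=20)
print(meta["total_pages"], len(page.object_list))

# Cursor mode: the first element is a dict of items plus navigation cursors,
# so deep pages avoid large OFFSETs entirely.
first_page, cursor_meta = get_optimized_page(queryset, page_number=1, per_page=20, use_cursor=True)
if cursor_meta["next_cursor"]:
    next_page, _ = get_optimized_page(
        queryset, page_number=1, per_page=20,
        use_cursor=True, cursor=cursor_meta["next_cursor"],
    )
```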
@@ -1,228 +0,0 @@
"""
Services for park-related business logic.
Following Django styleguide pattern for business logic encapsulation.
"""

from typing import Optional, Dict, Any, TYPE_CHECKING
from django.db import transaction
from django.db.models import Q

if TYPE_CHECKING:
    from django.contrib.auth.models import AbstractUser

from ..models import Park, ParkArea
from .location_service import ParkLocationService


class ParkService:
    """Service for managing park operations."""

    @staticmethod
    def create_park(
        *,
        name: str,
        description: str = "",
        status: str = "OPERATING",
        operator_id: Optional[int] = None,
        property_owner_id: Optional[int] = None,
        opening_date: Optional[str] = None,
        closing_date: Optional[str] = None,
        operating_season: str = "",
        size_acres: Optional[float] = None,
        website: str = "",
        location_data: Optional[Dict[str, Any]] = None,
        created_by: Optional["AbstractUser"] = None,
    ) -> Park:
        """
        Create a new park with validation and location handling.

        Args:
            name: Park name
            description: Park description
            status: Operating status
            operator_id: ID of operating company
            property_owner_id: ID of property owner company
            opening_date: Opening date
            closing_date: Closing date
            operating_season: Operating season description
            size_acres: Park size in acres
            website: Park website URL
            location_data: Dictionary containing location information
            created_by: User creating the park

        Returns:
            Created Park instance

        Raises:
            ValidationError: If park data is invalid
        """
        with transaction.atomic():
            # Create park instance
            park = Park(
                name=name,
                description=description,
                status=status,
                opening_date=opening_date,
                closing_date=closing_date,
                operating_season=operating_season,
                size_acres=size_acres,
                website=website,
            )

            # Set foreign key relationships if provided
            if operator_id:
                from apps.parks.models import Company

                park.operator = Company.objects.get(id=operator_id)

            if property_owner_id:
                from apps.parks.models import Company

                park.property_owner = Company.objects.get(id=property_owner_id)

            # CRITICAL STYLEGUIDE FIX: Call full_clean before save
            park.full_clean()
            park.save()

            # Handle location if provided
            if location_data:
                ParkLocationService.create_park_location(park=park, **location_data)

            return park

    @staticmethod
    def update_park(
        *,
        park_id: int,
        updates: Dict[str, Any],
        updated_by: Optional["AbstractUser"] = None,
    ) -> Park:
        """
        Update an existing park with validation.

        Args:
            park_id: ID of park to update
            updates: Dictionary of field updates
            updated_by: User performing the update

        Returns:
            Updated Park instance

        Raises:
            Park.DoesNotExist: If park doesn't exist
            ValidationError: If update data is invalid
        """
        with transaction.atomic():
            park = Park.objects.select_for_update().get(id=park_id)

            # Apply updates
            for field, value in updates.items():
                if hasattr(park, field):
                    setattr(park, field, value)

            # CRITICAL STYLEGUIDE FIX: Call full_clean before save
            park.full_clean()
            park.save()

            return park

    @staticmethod
    def delete_park(
        *, park_id: int, deleted_by: Optional["AbstractUser"] = None
    ) -> bool:
        """
        Soft delete a park by setting status to DEMOLISHED.

        Args:
            park_id: ID of park to delete
            deleted_by: User performing the deletion

        Returns:
            True if successfully deleted

        Raises:
            Park.DoesNotExist: If park doesn't exist
        """
        with transaction.atomic():
            park = Park.objects.select_for_update().get(id=park_id)
            park.status = "DEMOLISHED"

            # CRITICAL STYLEGUIDE FIX: Call full_clean before save
            park.full_clean()
            park.save()

            return True

    @staticmethod
    def create_park_area(
        *,
        park_id: int,
        name: str,
        description: str = "",
        created_by: Optional["AbstractUser"] = None,
    ) -> ParkArea:
        """
        Create a new area within a park.

        Args:
            park_id: ID of the parent park
            name: Area name
            description: Area description
            created_by: User creating the area

        Returns:
            Created ParkArea instance

        Raises:
            Park.DoesNotExist: If park doesn't exist
            ValidationError: If area data is invalid
        """
        park = Park.objects.get(id=park_id)

        area = ParkArea(park=park, name=name, description=description)

        # CRITICAL STYLEGUIDE FIX: Call full_clean before save
        area.full_clean()
        area.save()

        return area

    @staticmethod
    def update_park_statistics(*, park_id: int) -> Park:
        """
        Recalculate and update park statistics (ride counts, ratings).

        Args:
            park_id: ID of park to update statistics for

        Returns:
            Updated Park instance with fresh statistics
        """
        from apps.rides.models import Ride
        from apps.parks.models import ParkReview
        from django.db.models import Count, Avg

        with transaction.atomic():
            park = Park.objects.select_for_update().get(id=park_id)

            # Calculate ride counts
            ride_stats = Ride.objects.filter(park=park).aggregate(
                total_rides=Count("id"),
                coaster_count=Count("id", filter=Q(category__in=["RC", "WC"])),
            )

            # Calculate average rating
            avg_rating = ParkReview.objects.filter(
                park=park, is_published=True
            ).aggregate(avg_rating=Avg("rating"))["avg_rating"]

            # Update park fields
            park.ride_count = ride_stats["total_rides"] or 0
            park.coaster_count = ride_stats["coaster_count"] or 0
            park.average_rating = avg_rating

            # CRITICAL STYLEGUIDE FIX: Call full_clean before save
            park.full_clean()
            park.save()

            return park
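An illustrative call into the service layer above; the import path and the `location_data` keys are assumptions, while the keyword-only signatures come from `ParkService` itself.

```python
from apps.parks.services import ParkService  # hypothetical import path

park = ParkService.create_park(
    name="Example Park",
    status="OPERATING",
    size_acres=110.0,
    location_data={"country": "United States", "state": "Ohio"},  # assumed keys
)

# Later, after rides or reviews change, refresh the denormalized counters:
ParkService.update_park_statistics(park_id=park.id)
```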
@@ -1,402 +0,0 @@
|
||||
"""
|
||||
Performance monitoring and benchmarking tools for park listing optimizations.
|
||||
"""
|
||||
|
||||
import time
|
||||
import logging
|
||||
import statistics
|
||||
from typing import Dict, List, Any, Optional, Callable
|
||||
from contextlib import contextmanager
|
||||
from dataclasses import dataclass, field
|
||||
from datetime import datetime, timedelta
|
||||
from django.db import connection
|
||||
from django.core.cache import cache
|
||||
from django.conf import settings
|
||||
from django.test import RequestFactory
|
||||
import json
|
||||
|
||||
logger = logging.getLogger("performance_monitoring")
|
||||
|
||||
|
||||
@dataclass
|
||||
class PerformanceMetric:
|
||||
"""Data class for storing performance metrics."""
|
||||
operation: str
|
||||
duration: float
|
||||
query_count: int
|
||||
cache_hits: int = 0
|
||||
cache_misses: int = 0
|
||||
memory_usage: Optional[float] = None
|
||||
timestamp: datetime = field(default_factory=datetime.now)
|
||||
metadata: Dict[str, Any] = field(default_factory=dict)
|
||||
|
||||
|
||||
class PerformanceMonitor:
|
||||
"""
|
||||
Comprehensive performance monitoring for park listing operations.
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
self.metrics: List[PerformanceMetric] = []
|
||||
self.cache_stats = {'hits': 0, 'misses': 0}
|
||||
|
||||
@contextmanager
|
||||
def measure_operation(self, operation_name: str, **metadata):
|
||||
"""Context manager to measure operation performance."""
|
||||
initial_queries = len(connection.queries) if hasattr(connection, 'queries') else 0
|
||||
initial_cache_hits = self.cache_stats['hits']
|
||||
initial_cache_misses = self.cache_stats['misses']
|
||||
|
||||
start_time = time.perf_counter()
|
||||
start_memory = self._get_memory_usage()
|
||||
|
||||
try:
|
||||
yield
|
||||
finally:
|
||||
end_time = time.perf_counter()
|
||||
end_memory = self._get_memory_usage()
|
||||
|
||||
duration = end_time - start_time
|
||||
query_count = (len(connection.queries) - initial_queries) if hasattr(connection, 'queries') else 0
|
||||
cache_hits = self.cache_stats['hits'] - initial_cache_hits
|
||||
cache_misses = self.cache_stats['misses'] - initial_cache_misses
|
||||
memory_delta = end_memory - start_memory if start_memory and end_memory else None
|
||||
|
||||
metric = PerformanceMetric(
|
||||
operation=operation_name,
|
||||
duration=duration,
|
||||
query_count=query_count,
|
||||
cache_hits=cache_hits,
|
||||
cache_misses=cache_misses,
|
||||
memory_usage=memory_delta,
|
||||
metadata=metadata
|
||||
)
|
||||
|
||||
self.metrics.append(metric)
|
||||
self._log_metric(metric)
|
||||
|
||||
def _get_memory_usage(self) -> Optional[float]:
|
||||
"""Get current memory usage in MB."""
|
||||
try:
|
||||
import psutil
|
||||
process = psutil.Process()
|
||||
return process.memory_info().rss / 1024 / 1024 # Convert to MB
|
||||
except ImportError:
|
||||
return None
|
||||
|
||||
def _log_metric(self, metric: PerformanceMetric):
|
||||
"""Log performance metric with appropriate level."""
|
||||
message = (
|
||||
f"{metric.operation}: {metric.duration:.3f}s, "
|
||||
f"{metric.query_count} queries, "
|
||||
f"{metric.cache_hits} cache hits"
|
||||
)
|
||||
|
||||
if metric.memory_usage:
|
||||
message += f", {metric.memory_usage:.2f}MB memory delta"
|
||||
|
||||
# Log as warning if performance is concerning
|
||||
if metric.duration > 1.0 or metric.query_count > 10:
|
||||
logger.warning(f"Performance concern: {message}")
|
||||
else:
|
||||
logger.info(f"Performance metric: {message}")
|
||||
|
||||
def get_performance_summary(self) -> Dict[str, Any]:
|
||||
"""Get summary of all performance metrics."""
|
||||
if not self.metrics:
|
||||
return {'message': 'No metrics collected'}
|
||||
|
||||
durations = [m.duration for m in self.metrics]
|
||||
query_counts = [m.query_count for m in self.metrics]
|
||||
|
||||
return {
|
||||
'total_operations': len(self.metrics),
|
||||
'duration_stats': {
|
||||
'mean': statistics.mean(durations),
|
||||
'median': statistics.median(durations),
|
||||
'min': min(durations),
|
||||
'max': max(durations),
|
||||
'total': sum(durations)
|
||||
},
|
||||
'query_stats': {
|
||||
'mean': statistics.mean(query_counts),
|
||||
'median': statistics.median(query_counts),
|
||||
'min': min(query_counts),
|
||||
'max': max(query_counts),
|
||||
'total': sum(query_counts)
|
||||
},
|
||||
'cache_stats': {
|
||||
'total_hits': sum(m.cache_hits for m in self.metrics),
|
||||
'total_misses': sum(m.cache_misses for m in self.metrics),
|
||||
'hit_rate': self._calculate_cache_hit_rate()
|
||||
},
|
||||
'slowest_operations': self._get_slowest_operations(5),
|
||||
'most_query_intensive': self._get_most_query_intensive(5)
|
||||
}
|
||||
|
||||
def _calculate_cache_hit_rate(self) -> float:
|
||||
"""Calculate overall cache hit rate."""
|
||||
total_hits = sum(m.cache_hits for m in self.metrics)
|
||||
total_requests = total_hits + sum(m.cache_misses for m in self.metrics)
|
||||
return (total_hits / total_requests * 100) if total_requests > 0 else 0.0
|
||||
|
||||
def _get_slowest_operations(self, count: int) -> List[Dict[str, Any]]:
|
||||
"""Get the slowest operations."""
|
||||
sorted_metrics = sorted(self.metrics, key=lambda m: m.duration, reverse=True)
|
||||
return [
|
||||
{
|
||||
'operation': m.operation,
|
||||
'duration': m.duration,
|
||||
'query_count': m.query_count,
|
||||
'timestamp': m.timestamp.isoformat()
|
||||
}
|
||||
for m in sorted_metrics[:count]
|
||||
]
|
||||
|
||||
def _get_most_query_intensive(self, count: int) -> List[Dict[str, Any]]:
|
||||
"""Get operations with the most database queries."""
|
||||
sorted_metrics = sorted(self.metrics, key=lambda m: m.query_count, reverse=True)
|
||||
return [
|
||||
{
|
||||
'operation': m.operation,
|
||||
'query_count': m.query_count,
|
||||
'duration': m.duration,
|
||||
'timestamp': m.timestamp.isoformat()
|
||||
}
|
||||
for m in sorted_metrics[:count]
|
||||
]
|
||||
|
||||
|
||||
class BenchmarkSuite:
|
||||
"""
|
||||
Comprehensive benchmarking suite for park listing performance.
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
self.monitor = PerformanceMonitor()
|
||||
self.factory = RequestFactory()
|
||||
|
||||
def run_autocomplete_benchmark(self, queries: List[str] = None) -> Dict[str, Any]:
|
||||
"""Benchmark autocomplete performance with various queries."""
|
||||
if not queries:
|
||||
queries = [
|
||||
'Di', # Short query
|
||||
'Disney', # Common brand
|
||||
'Universal', # Another common brand
|
||||
'Cedar Point', # Specific park
|
||||
'California', # Location
|
||||
'Roller', # Generic term
|
||||
'Xyz123' # Non-existent query
|
||||
]
|
||||
|
||||
results = []
|
||||
|
||||
for query in queries:
|
||||
with self.monitor.measure_operation(f"autocomplete_{query}", query=query):
|
||||
# Simulate autocomplete request
|
||||
from apps.parks.views_autocomplete import ParkAutocompleteView
|
||||
|
||||
request = self.factory.get(f'/api/parks/autocomplete/?q={query}')
|
||||
view = ParkAutocompleteView()
|
||||
response = view.get(request)
|
||||
|
||||
results.append({
|
||||
'query': query,
|
||||
'status_code': response.status_code,
|
||||
'response_time': self.monitor.metrics[-1].duration,
|
||||
'query_count': self.monitor.metrics[-1].query_count
|
||||
})
|
||||
|
||||
return {
|
||||
'benchmark_type': 'autocomplete',
|
||||
'queries_tested': len(queries),
|
||||
'results': results,
|
||||
'summary': self.monitor.get_performance_summary()
|
||||
}
|
||||
|
||||
def run_listing_benchmark(self, scenarios: List[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""Benchmark park listing performance with various filter scenarios."""
|
||||
if not scenarios:
|
||||
scenarios = [
|
||||
{'name': 'no_filters', 'params': {}},
|
||||
{'name': 'status_filter', 'params': {'status': 'OPERATING'}},
|
||||
{'name': 'operator_filter', 'params': {'operator': 'Disney'}},
|
||||
{'name': 'location_filter', 'params': {'country': 'United States'}},
|
||||
{'name': 'complex_filter', 'params': {
|
||||
'status': 'OPERATING',
|
||||
'has_coasters': 'true',
|
||||
'min_rating': '4.0'
|
||||
}},
|
||||
{'name': 'search_query', 'params': {'search': 'Magic Kingdom'}},
|
||||
{'name': 'pagination_last_page', 'params': {'page': '10'}}
|
||||
]
|
||||
|
||||
results = []
|
||||
|
||||
for scenario in scenarios:
|
||||
with self.monitor.measure_operation(f"listing_{scenario['name']}", **scenario['params']):
|
||||
# Simulate listing request
|
||||
from apps.parks.views import ParkListView
|
||||
|
||||
query_string = '&'.join([f"{k}={v}" for k, v in scenario['params'].items()])
|
||||
request = self.factory.get(f'/parks/?{query_string}')
|
||||
|
||||
view = ParkListView()
|
||||
view.setup(request)
|
||||
|
||||
# Simulate getting the queryset and context
|
||||
queryset = view.get_queryset()
|
||||
context = view.get_context_data()
|
||||
|
||||
results.append({
|
||||
'scenario': scenario['name'],
|
||||
'params': scenario['params'],
|
||||
'result_count': queryset.count() if hasattr(queryset, 'count') else len(queryset),
|
||||
'response_time': self.monitor.metrics[-1].duration,
|
||||
'query_count': self.monitor.metrics[-1].query_count
|
||||
})
|
||||
|
||||
return {
|
||||
'benchmark_type': 'listing',
|
||||
'scenarios_tested': len(scenarios),
|
||||
'results': results,
|
||||
'summary': self.monitor.get_performance_summary()
|
||||
}
|
||||
|
||||
def run_pagination_benchmark(self, page_sizes: List[int] = None, page_numbers: List[int] = None) -> Dict[str, Any]:
|
||||
"""Benchmark pagination performance with different page sizes and numbers."""
|
||||
if not page_sizes:
|
||||
page_sizes = [10, 20, 50, 100]
|
||||
if not page_numbers:
|
||||
page_numbers = [1, 5, 10, 50]
|
||||
|
||||
results = []
|
||||
|
||||
for page_size in page_sizes:
|
||||
for page_number in page_numbers:
|
||||
scenario_name = f"page_{page_number}_size_{page_size}"
|
||||
|
||||
with self.monitor.measure_operation(scenario_name, page_size=page_size, page_number=page_number):
|
||||
from apps.parks.services.pagination_service import get_optimized_page
|
||||
from apps.parks.querysets import get_base_park_queryset
|
||||
|
||||
queryset = get_base_park_queryset()
|
||||
page, metadata = get_optimized_page(queryset, page_number, page_size)
|
||||
|
||||
results.append({
|
||||
'page_size': page_size,
|
||||
'page_number': page_number,
|
||||
'total_count': metadata.get('total_count', 0),
|
||||
'response_time': self.monitor.metrics[-1].duration,
|
||||
'query_count': self.monitor.metrics[-1].query_count
|
||||
})
|
||||
|
||||
return {
|
||||
'benchmark_type': 'pagination',
|
||||
'configurations_tested': len(results),
|
||||
'results': results,
|
||||
'summary': self.monitor.get_performance_summary()
|
||||
}
|
||||
|
||||
def run_full_benchmark_suite(self) -> Dict[str, Any]:
|
||||
"""Run the complete benchmark suite."""
|
||||
logger.info("Starting comprehensive benchmark suite")
|
||||
|
||||
suite_start = time.perf_counter()
|
||||
|
||||
# Run all benchmarks
|
||||
autocomplete_results = self.run_autocomplete_benchmark()
|
||||
listing_results = self.run_listing_benchmark()
|
||||
pagination_results = self.run_pagination_benchmark()
|
||||
|
||||
suite_duration = time.perf_counter() - suite_start
|
||||
|
||||
# Generate comprehensive report
|
||||
report = {
|
||||
'benchmark_suite': 'Park Listing Performance',
|
||||
'timestamp': datetime.now().isoformat(),
|
||||
'total_duration': suite_duration,
|
||||
'autocomplete': autocomplete_results,
|
||||
'listing': listing_results,
|
||||
'pagination': pagination_results,
|
||||
'overall_summary': self.monitor.get_performance_summary(),
|
||||
'recommendations': self._generate_recommendations()
|
||||
}
|
||||
|
||||
# Save report
|
||||
self._save_benchmark_report(report)
|
||||
|
||||
logger.info(f"Benchmark suite completed in {suite_duration:.3f}s")
|
||||
|
||||
return report
|
||||
|
||||
def _generate_recommendations(self) -> List[str]:
|
||||
"""Generate performance recommendations based on benchmark results."""
|
||||
recommendations = []
|
||||
summary = self.monitor.get_performance_summary()
|
||||
|
||||
# Check average response times
|
||||
if summary['duration_stats']['mean'] > 0.5:
|
||||
recommendations.append("Average response time is high (>500ms). Consider implementing additional caching.")
|
||||
|
||||
# Check query counts
|
||||
if summary['query_stats']['mean'] > 5:
|
||||
recommendations.append("High average query count. Review and optimize database queries.")
|
||||
|
||||
# Check cache hit rate
|
||||
if summary['cache_stats']['hit_rate'] < 80:
|
||||
recommendations.append("Cache hit rate is low (<80%). Increase cache timeouts or improve cache key strategy.")
|
||||
|
||||
# Check for slow operations
|
||||
slowest = summary.get('slowest_operations', [])
|
||||
if slowest and slowest[0]['duration'] > 2.0:
|
||||
recommendations.append(f"Slowest operation ({slowest[0]['operation']}) is very slow (>{slowest[0]['duration']:.2f}s).")
|
||||
|
||||
if not recommendations:
|
||||
recommendations.append("Performance appears to be within acceptable ranges.")
|
||||
|
||||
return recommendations
|
||||
|
||||
def _save_benchmark_report(self, report: Dict[str, Any]):
|
||||
"""Save benchmark report to file and cache."""
|
||||
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
|
||||
filename = f"benchmark_report_{timestamp}.json"
|
||||
|
||||
try:
|
||||
# Save to logs directory
|
||||
import os
|
||||
logs_dir = "logs"
|
||||
os.makedirs(logs_dir, exist_ok=True)
|
||||
|
||||
filepath = os.path.join(logs_dir, filename)
|
||||
with open(filepath, 'w') as f:
|
||||
json.dump(report, f, indent=2, default=str)
|
||||
|
||||
logger.info(f"Benchmark report saved to {filepath}")
|
||||
|
||||
# Also cache the report
|
||||
cache.set(f"benchmark_report_latest", report, 3600) # 1 hour
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error saving benchmark report: {e}")
|
||||
|
||||
|
||||
# Global performance monitor instance
|
||||
performance_monitor = PerformanceMonitor()
|
||||
|
||||
|
||||
def benchmark_operation(operation_name: str):
|
||||
"""Decorator to benchmark a function."""
|
||||
def decorator(func: Callable):
|
||||
def wrapper(*args, **kwargs):
|
||||
with performance_monitor.measure_operation(operation_name):
|
||||
return func(*args, **kwargs)
|
||||
return wrapper
|
||||
return decorator
|
||||
|
||||
|
||||
# Convenience function to run benchmarks
|
||||
def run_performance_benchmark():
|
||||
"""Run the complete performance benchmark suite."""
|
||||
suite = BenchmarkSuite()
|
||||
return suite.run_full_benchmark_suite()
|
||||
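A sketch of how the monitor is intended to be used; the import path and the sample function are illustrative, while `performance_monitor`, `measure_operation`, and `benchmark_operation` come from the module above.

```python
from apps.parks.services.performance import performance_monitor, benchmark_operation  # hypothetical path

@benchmark_operation("park_search")
def search_parks(term: str):
    from apps.parks.models import Park
    return list(Park.objects.filter(name__icontains=term)[:20])

# Nested measurement: the decorator records the inner call, the context manager the whole block.
with performance_monitor.measure_operation("warm_cache", source="cron"):
    search_parks("Cedar")

print(performance_monitor.get_performance_summary()["duration_stats"])
```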
@@ -1,34 +0,0 @@
from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver
from django.db.models import Q

from apps.rides.models import Ride
from .models import Park


def update_park_ride_counts(park):
    """Update ride_count and coaster_count for a park"""
    operating_rides = Q(status="OPERATING")

    # Count total operating rides
    ride_count = park.rides.filter(operating_rides).count()

    # Count total operating roller coasters
    coaster_count = park.rides.filter(operating_rides, category="RC").count()

    # Update park counts
    Park.objects.filter(id=park.id).update(
        ride_count=ride_count, coaster_count=coaster_count
    )


@receiver(post_save, sender=Ride)
def ride_saved(sender, instance, **kwargs):
    """Update park counts when a ride is saved"""
    update_park_ride_counts(instance.park)


@receiver(post_delete, sender=Ride)
def ride_deleted(sender, instance, **kwargs):
    """Update park counts when a ride is deleted"""
    update_park_ride_counts(instance.park)
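For these receivers to fire, the signals module has to be imported when the app loads; a typical hook in the app's `apps.py` looks like the sketch below (the config class name and app label are assumptions).

```python
from django.apps import AppConfig


class ParksConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.parks"

    def ready(self):
        from . import signals  # noqa: F401  # registers the post_save/post_delete receivers
```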
@@ -1,363 +0,0 @@
|
||||
/* Performance-optimized CSS for park listing page */
|
||||
|
||||
/* Critical CSS that should be inlined */
|
||||
.park-listing {
|
||||
/* Use GPU acceleration for smooth animations */
|
||||
transform: translateZ(0);
|
||||
backface-visibility: hidden;
|
||||
}
|
||||
|
||||
/* Lazy loading image styles */
|
||||
img[data-src] {
|
||||
background: linear-gradient(90deg, #f0f0f0 25%, #e0e0e0 50%, #f0f0f0 75%);
|
||||
background-size: 200% 100%;
|
||||
animation: shimmer 1.5s infinite;
|
||||
transition: opacity 0.3s ease;
|
||||
}
|
||||
|
||||
img.loading {
|
||||
opacity: 0.7;
|
||||
filter: blur(2px);
|
||||
}
|
||||
|
||||
img.loaded {
|
||||
opacity: 1;
|
||||
filter: none;
|
||||
animation: none;
|
||||
}
|
||||
|
||||
img.error {
|
||||
background: #f5f5f5;
|
||||
opacity: 0.5;
|
||||
}
|
||||
|
||||
@keyframes shimmer {
|
||||
0% {
|
||||
background-position: -200% 0;
|
||||
}
|
||||
100% {
|
||||
background-position: 200% 0;
|
||||
}
|
||||
}
|
||||
|
||||
/* Optimized grid layout using CSS Grid */
|
||||
.park-grid {
|
||||
display: grid;
|
||||
grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
|
||||
gap: 1.5rem;
|
||||
/* Use containment for better performance */
|
||||
contain: layout style;
|
||||
}
|
||||
|
||||
.park-card {
|
||||
/* Optimize for animations */
|
||||
will-change: transform, box-shadow;
|
||||
transition: transform 0.2s ease, box-shadow 0.2s ease;
|
||||
/* Enable GPU acceleration */
|
||||
transform: translateZ(0);
|
||||
/* Optimize paint */
|
||||
contain: layout style paint;
|
||||
}
|
||||
|
||||
.park-card:hover {
|
||||
transform: translateY(-4px) translateZ(0);
|
||||
box-shadow: 0 8px 25px rgba(0, 0, 0, 0.15);
|
||||
}
|
||||
|
||||
/* Efficient loading states */
|
||||
.loading {
|
||||
position: relative;
|
||||
overflow: hidden;
|
||||
}
|
||||
|
||||
.loading::after {
|
||||
content: '';
|
||||
position: absolute;
|
||||
top: 0;
|
||||
left: 0;
|
||||
right: 0;
|
||||
bottom: 0;
|
||||
background: linear-gradient(
|
||||
90deg,
|
||||
transparent,
|
||||
rgba(255, 255, 255, 0.4),
|
||||
transparent
|
||||
);
|
||||
animation: loading-sweep 1.5s infinite;
|
||||
pointer-events: none;
|
||||
}
|
||||
|
||||
@keyframes loading-sweep {
|
||||
0% {
|
||||
transform: translateX(-100%);
|
||||
}
|
||||
100% {
|
||||
transform: translateX(100%);
|
||||
}
|
||||
}
|
||||
|
||||
/* Optimized autocomplete dropdown */
|
||||
.autocomplete-suggestions {
|
||||
position: absolute;
|
||||
top: 100%;
|
||||
left: 0;
|
||||
right: 0;
|
||||
background: white;
|
||||
border: 1px solid #ddd;
|
||||
border-top: none;
|
||||
border-radius: 0 0 4px 4px;
|
||||
box-shadow: 0 4px 6px rgba(0, 0, 0, 0.1);
|
||||
z-index: 1000;
|
||||
max-height: 300px;
|
||||
overflow-y: auto;
|
||||
/* Hide by default */
|
||||
opacity: 0;
|
||||
visibility: hidden;
|
||||
transform: translateY(-10px);
|
||||
transition: all 0.2s ease;
|
||||
/* Optimize scrolling */
|
||||
-webkit-overflow-scrolling: touch;
|
||||
contain: layout style;
|
||||
}
|
||||
|
||||
.autocomplete-suggestions.visible {
|
||||
opacity: 1;
|
||||
visibility: visible;
|
||||
transform: translateY(0);
|
||||
}
|
||||
|
||||
.suggestion-item {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
padding: 0.75rem 1rem;
|
||||
cursor: pointer;
|
||||
border-bottom: 1px solid #f0f0f0;
|
||||
transition: background-color 0.15s ease;
|
||||
}
|
||||
|
||||
.suggestion-item:hover,
|
||||
.suggestion-item.active {
|
||||
background-color: #f8f9fa;
|
||||
}
|
||||
|
||||
.suggestion-icon {
|
||||
margin-right: 0.5rem;
|
||||
font-size: 0.875rem;
|
||||
}
|
||||
|
||||
.suggestion-name {
|
||||
font-weight: 500;
|
||||
flex-grow: 1;
|
||||
}
|
||||
|
||||
.suggestion-details {
|
||||
font-size: 0.875rem;
|
||||
color: #666;
|
||||
}
|
||||
|
||||
/* Optimized filter panel */
|
||||
.filter-panel {
|
||||
/* Use flexbox for efficient layout */
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: 1rem;
|
||||
padding: 1rem;
|
||||
background: #f8f9fa;
|
||||
border-radius: 8px;
|
||||
/* Optimize for frequent updates */
|
||||
contain: layout style;
|
||||
}
|
||||
|
||||
.filter-group {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
min-width: 150px;
|
||||
}
|
||||
|
||||
.filter-input {
|
||||
padding: 0.5rem;
|
||||
border: 1px solid #ddd;
|
||||
border-radius: 4px;
|
||||
transition: border-color 0.15s ease;
|
||||
}
|
||||
|
||||
.filter-input:focus {
|
||||
outline: none;
|
||||
border-color: #007bff;
|
||||
box-shadow: 0 0 0 2px rgba(0, 123, 255, 0.25);
|
||||
}
|
||||
|
||||
/* Performance-optimized pagination */
|
||||
.pagination {
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
align-items: center;
|
||||
gap: 0.5rem;
|
||||
margin: 2rem 0;
|
||||
/* Optimize for position changes */
|
||||
contain: layout;
|
||||
}
|
||||
|
||||
.pagination-btn {
|
||||
padding: 0.5rem 1rem;
|
||||
border: 1px solid #ddd;
|
||||
background: white;
|
||||
color: #333;
|
||||
text-decoration: none;
|
||||
border-radius: 4px;
|
||||
transition: all 0.15s ease;
|
||||
/* Optimize for hover effects */
|
||||
will-change: background-color, border-color;
|
||||
}
|
||||
|
||||
.pagination-btn:hover:not(.disabled) {
|
||||
background: #f8f9fa;
|
||||
border-color: #bbb;
|
||||
}
|
||||
|
||||
.pagination-btn.active {
|
||||
background: #007bff;
|
||||
color: white;
|
||||
border-color: #007bff;
|
||||
}
|
||||
|
||||
.pagination-btn.disabled {
|
||||
opacity: 0.5;
|
||||
cursor: not-allowed;
|
||||
}
|
||||
|
||||
/* Responsive optimizations */
|
||||
@media (max-width: 768px) {
|
||||
.park-grid {
|
||||
grid-template-columns: 1fr;
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.filter-panel {
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.suggestion-item {
|
||||
padding: 1rem;
|
||||
}
|
||||
}
|
||||
|
||||
/* High DPI optimizations */
|
||||
@media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
|
||||
.park-card img {
|
||||
/* Use higher quality images on retina displays */
|
||||
image-rendering: -webkit-optimize-contrast;
|
||||
}
|
||||
}
|
||||
|
||||
/* Reduce motion for accessibility */
|
||||
@media (prefers-reduced-motion: reduce) {
|
||||
*,
|
||||
*::before,
|
||||
*::after {
|
||||
animation-duration: 0.01ms !important;
|
||||
animation-iteration-count: 1 !important;
|
||||
transition-duration: 0.01ms !important;
|
||||
scroll-behavior: auto !important;
|
||||
}
|
||||
}
|
||||
|
||||
/* Performance debugging styles (only in development) */
|
||||
.debug-metrics {
|
||||
position: fixed;
|
||||
top: 10px;
|
||||
right: 10px;
|
||||
background: rgba(0, 0, 0, 0.8);
|
||||
color: white;
|
||||
padding: 0.5rem;
|
||||
border-radius: 4px;
|
||||
font-size: 0.75rem;
|
||||
font-family: monospace;
|
||||
z-index: 9999;
|
||||
display: none;
|
||||
}
|
||||
|
||||
body.debug .debug-metrics {
|
||||
display: block;
|
||||
}
|
||||
|
||||
.debug-metrics span {
|
||||
display: block;
|
||||
margin-bottom: 0.25rem;
|
||||
}
|
||||
|
||||
/* Print optimizations */
|
||||
@media print {
|
||||
.autocomplete-suggestions,
|
||||
.filter-panel,
|
||||
.pagination,
|
||||
.debug-metrics {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.park-grid {
|
||||
grid-template-columns: repeat(2, 1fr);
|
||||
gap: 1rem;
|
||||
}
|
||||
|
||||
.park-card {
|
||||
break-inside: avoid;
|
||||
page-break-inside: avoid;
|
||||
}
|
||||
}
|
||||
|
||||
/* Container queries for better responsive design */
|
||||
@container (max-width: 400px) {
|
||||
.park-card {
|
||||
padding: 1rem;
|
||||
}
|
||||
|
||||
.park-card img {
|
||||
height: 150px;
|
||||
}
|
||||
}
|
||||
|
||||
/* Focus management for better accessibility */
|
||||
.skip-link {
|
||||
position: absolute;
|
||||
top: -40px;
|
||||
left: 6px;
|
||||
background: #000;
|
||||
color: white;
|
||||
padding: 8px;
|
||||
text-decoration: none;
|
||||
border-radius: 4px;
|
||||
z-index: 10000;
|
||||
}
|
||||
|
||||
.skip-link:focus {
|
||||
top: 6px;
|
||||
}
|
||||
|
||||
/* Efficient animations using transform and opacity only */
|
||||
.fade-in {
|
||||
animation: fadeIn 0.3s ease-in-out;
|
||||
}
|
||||
|
||||
@keyframes fadeIn {
|
||||
from {
|
||||
opacity: 0;
|
||||
transform: translateY(10px);
|
||||
}
|
||||
to {
|
||||
opacity: 1;
|
||||
transform: translateY(0);
|
||||
}
|
||||
}
|
||||
|
||||
/* Optimize for critical rendering path */
|
||||
.above-fold {
|
||||
/* Ensure critical content renders first */
|
||||
contain: layout style paint;
|
||||
}
|
||||
|
||||
.below-fold {
|
||||
/* Defer non-critical content */
|
||||
content-visibility: auto;
|
||||
contain-intrinsic-size: 500px;
|
||||
}
|
||||
@@ -1,518 +0,0 @@
|
||||
/**
|
||||
* Performance-optimized JavaScript for park listing page
|
||||
* Implements lazy loading, debouncing, and efficient DOM manipulation
|
||||
*/
|
||||
|
||||
class ParkListingPerformance {
|
||||
constructor() {
|
||||
this.searchTimeout = null;
|
||||
this.lastScrollPosition = 0;
|
||||
this.observerOptions = {
|
||||
root: null,
|
||||
rootMargin: '50px',
|
||||
threshold: 0.1
|
||||
};
|
||||
|
||||
this.init();
|
||||
}
|
||||
|
||||
init() {
|
||||
this.setupLazyLoading();
|
||||
this.setupDebouncedSearch();
|
||||
this.setupOptimizedFiltering();
|
||||
this.setupProgressiveImageLoading();
|
||||
this.setupPerformanceMonitoring();
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup lazy loading for park images using Intersection Observer
|
||||
*/
|
||||
setupLazyLoading() {
|
||||
if ('IntersectionObserver' in window) {
|
||||
this.imageObserver = new IntersectionObserver((entries) => {
|
||||
entries.forEach(entry => {
|
||||
if (entry.isIntersecting) {
|
||||
this.loadImage(entry.target);
|
||||
this.imageObserver.unobserve(entry.target);
|
||||
}
|
||||
});
|
||||
}, this.observerOptions);
|
||||
|
||||
// Observe all lazy images
|
||||
document.querySelectorAll('img[data-src]').forEach(img => {
|
||||
this.imageObserver.observe(img);
|
||||
});
|
||||
} else {
|
||||
// Fallback for browsers without Intersection Observer
|
||||
this.loadAllImages();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Load individual image with error handling and placeholder
|
||||
*/
|
||||
loadImage(img) {
|
||||
const src = img.dataset.src;
|
||||
const placeholder = img.dataset.placeholder;
|
||||
|
||||
// Start with low quality placeholder
|
||||
if (placeholder && !img.src) {
|
||||
img.src = placeholder;
|
||||
img.classList.add('loading');
|
||||
}
|
||||
|
||||
// Load high quality image
|
||||
const highQualityImg = new Image();
|
||||
highQualityImg.onload = () => {
|
||||
img.src = highQualityImg.src;
|
||||
img.classList.remove('loading');
|
||||
img.classList.add('loaded');
|
||||
};
|
||||
|
||||
highQualityImg.onerror = () => {
|
||||
img.src = '/static/images/placeholders/park-placeholder.jpg';
|
||||
img.classList.add('error');
|
||||
};
|
||||
|
||||
highQualityImg.src = src;
|
||||
}
|
||||
|
||||
/**
|
||||
* Load all images (fallback for older browsers)
|
||||
*/
|
||||
loadAllImages() {
|
||||
document.querySelectorAll('img[data-src]').forEach(img => {
|
||||
this.loadImage(img);
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup debounced search to reduce API calls
|
||||
*/
|
||||
setupDebouncedSearch() {
|
||||
const searchInput = document.querySelector('[data-autocomplete]');
|
||||
if (!searchInput) return;
|
||||
|
||||
searchInput.addEventListener('input', (e) => {
|
||||
clearTimeout(this.searchTimeout);
|
||||
|
||||
const query = e.target.value.trim();
|
||||
|
||||
if (query.length < 2) {
|
||||
this.hideSuggestions();
|
||||
return;
|
||||
}
|
||||
|
||||
// Debounce search requests
|
||||
this.searchTimeout = setTimeout(() => {
|
||||
this.performSearch(query);
|
||||
}, 300);
|
||||
});
|
||||
|
||||
// Handle keyboard navigation
|
||||
searchInput.addEventListener('keydown', (e) => {
|
||||
this.handleSearchKeyboard(e);
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Perform optimized search with caching
|
||||
*/
|
||||
async performSearch(query) {
|
||||
const cacheKey = `search_${query.toLowerCase()}`;
|
||||
|
||||
// Check session storage for cached results
|
||||
const cached = sessionStorage.getItem(cacheKey);
|
||||
if (cached) {
|
||||
const results = JSON.parse(cached);
|
||||
this.displaySuggestions(results);
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
const response = await fetch(`/api/parks/autocomplete/?q=${encodeURIComponent(query)}`, {
|
||||
headers: {
|
||||
'X-Requested-With': 'XMLHttpRequest'
|
||||
}
|
||||
});
|
||||
|
||||
if (response.ok) {
|
||||
const data = await response.json();
|
||||
|
||||
// Cache results for session
|
||||
sessionStorage.setItem(cacheKey, JSON.stringify(data));
|
||||
|
||||
this.displaySuggestions(data);
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Search error:', error);
|
||||
this.hideSuggestions();
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Display search suggestions with efficient DOM manipulation
|
||||
*/
|
||||
displaySuggestions(data) {
|
||||
const container = document.querySelector('[data-suggestions]');
|
||||
if (!container) return;
|
||||
|
||||
// Use document fragment for efficient DOM updates
|
||||
const fragment = document.createDocumentFragment();
|
||||
|
||||
if (data.suggestions && data.suggestions.length > 0) {
|
||||
data.suggestions.forEach(suggestion => {
|
||||
const item = this.createSuggestionItem(suggestion);
|
||||
fragment.appendChild(item);
|
||||
});
|
||||
} else {
|
||||
const noResults = document.createElement('div');
|
||||
noResults.className = 'no-results';
|
||||
noResults.textContent = 'No suggestions found';
|
||||
fragment.appendChild(noResults);
|
||||
}
|
||||
|
||||
// Replace content efficiently
|
||||
container.innerHTML = '';
|
||||
container.appendChild(fragment);
|
||||
container.classList.add('visible');
|
||||
}
|
||||
|
||||
/**
|
||||
* Create suggestion item element
|
||||
*/
|
||||
createSuggestionItem(suggestion) {
|
||||
const item = document.createElement('div');
|
||||
item.className = `suggestion-item suggestion-${suggestion.type}`;
|
||||
|
||||
const icon = this.getSuggestionIcon(suggestion.type);
|
||||
const details = suggestion.operator ? ` • ${suggestion.operator}` :
|
||||
suggestion.park_count ? ` • ${suggestion.park_count} parks` : '';
|
||||
|
||||
item.innerHTML = `
|
||||
<span class="suggestion-icon">${icon}</span>
|
||||
<span class="suggestion-name">${this.escapeHtml(suggestion.name)}</span>
|
||||
<span class="suggestion-details">${details}</span>
|
||||
`;
|
||||
|
||||
item.addEventListener('click', () => {
|
||||
this.selectSuggestion(suggestion);
|
||||
});
|
||||
|
||||
return item;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get icon for suggestion type
|
||||
*/
|
||||
getSuggestionIcon(type) {
|
||||
const icons = {
|
||||
park: '🏰',
|
||||
operator: '🏢',
|
||||
location: '📍'
|
||||
};
|
||||
return icons[type] || '🔍';
|
||||
}
|
||||
|
||||
/**
|
||||
* Handle suggestion selection
|
||||
*/
|
||||
selectSuggestion(suggestion) {
|
||||
const searchInput = document.querySelector('[data-autocomplete]');
|
||||
if (searchInput) {
|
||||
searchInput.value = suggestion.name;
|
||||
|
||||
// Trigger search or navigation
|
||||
if (suggestion.url) {
|
||||
window.location.href = suggestion.url;
|
||||
} else {
|
||||
// Trigger filter update
|
||||
this.updateFilters({ search: suggestion.name });
|
||||
}
|
||||
}
|
||||
|
||||
this.hideSuggestions();
|
||||
}
|
||||
|
||||
/**
|
||||
* Hide suggestions dropdown
|
||||
*/
|
||||
hideSuggestions() {
|
||||
const container = document.querySelector('[data-suggestions]');
|
||||
if (container) {
|
||||
container.classList.remove('visible');
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup optimized filtering with minimal reflows
|
||||
*/
|
||||
setupOptimizedFiltering() {
|
||||
const filterForm = document.querySelector('[data-filter-form]');
|
||||
if (!filterForm) return;
|
||||
|
||||
// Debounce filter changes
|
||||
filterForm.addEventListener('change', (e) => {
|
||||
clearTimeout(this.filterTimeout);
|
||||
|
||||
this.filterTimeout = setTimeout(() => {
|
||||
this.updateFilters();
|
||||
}, 150);
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Update filters using HTMX with loading states
|
||||
*/
|
||||
updateFilters(extraParams = {}) {
|
||||
const form = document.querySelector('[data-filter-form]');
|
||||
const resultsContainer = document.querySelector('[data-results]');
|
||||
|
||||
if (!form || !resultsContainer) return;
|
||||
|
||||
// Show loading state
|
||||
resultsContainer.classList.add('loading');
|
||||
|
||||
const formData = new FormData(form);
|
||||
|
||||
// Add extra parameters
|
||||
Object.entries(extraParams).forEach(([key, value]) => {
|
||||
formData.set(key, value);
|
||||
});
|
||||
|
||||
// Use HTMX for efficient partial updates
|
||||
if (window.htmx) {
|
||||
htmx.ajax('GET', form.action + '?' + new URLSearchParams(formData), {
|
||||
target: '[data-results]',
|
||||
swap: 'innerHTML'
|
||||
}).then(() => {
|
||||
resultsContainer.classList.remove('loading');
|
||||
this.setupLazyLoading(); // Re-initialize for new content
|
||||
this.updatePerformanceMetrics();
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup progressive image loading with CloudFlare optimization
|
||||
*/
|
||||
setupProgressiveImageLoading() {
|
||||
// Use CloudFlare's automatic image optimization
|
||||
document.querySelectorAll('img[data-cf-image]').forEach(img => {
|
||||
const imageId = img.dataset.cfImage;
|
||||
const width = img.dataset.width || 400;
|
||||
|
||||
// Start with low quality
|
||||
img.src = this.getCloudFlareImageUrl(imageId, width, 'low');
|
||||
|
||||
// Load high quality when in viewport
|
||||
if (this.imageObserver) {
|
||||
this.imageObserver.observe(img);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Get optimized CloudFlare image URL
|
||||
*/
|
||||
getCloudFlareImageUrl(imageId, width, quality = 'high') {
|
||||
const baseUrl = window.CLOUDFLARE_IMAGES_BASE_URL || '/images';
|
||||
const qualityMap = {
|
||||
low: 20,
|
||||
medium: 60,
|
||||
high: 85
|
||||
};
|
||||
|
||||
return `${baseUrl}/${imageId}/w=${width},quality=${qualityMap[quality]}`;
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup performance monitoring
|
||||
*/
|
||||
setupPerformanceMonitoring() {
|
||||
// Track page load performance
|
||||
if ('performance' in window) {
|
||||
window.addEventListener('load', () => {
|
||||
setTimeout(() => {
|
||||
this.reportPerformanceMetrics();
|
||||
}, 100);
|
||||
});
|
||||
}
|
||||
|
||||
// Track user interactions
|
||||
this.setupInteractionTracking();
|
||||
}
|
||||
|
||||
/**
|
||||
* Report performance metrics
|
||||
*/
|
||||
reportPerformanceMetrics() {
|
||||
if (!('performance' in window)) return;
|
||||
|
||||
const navigation = performance.getEntriesByType('navigation')[0];
|
||||
const paint = performance.getEntriesByType('paint');
|
||||
|
||||
const metrics = {
|
||||
loadTime: navigation.loadEventEnd - navigation.loadEventStart,
|
||||
domContentLoaded: navigation.domContentLoadedEventEnd - navigation.domContentLoadedEventStart,
|
||||
firstPaint: paint.find(p => p.name === 'first-paint')?.startTime || 0,
|
||||
firstContentfulPaint: paint.find(p => p.name === 'first-contentful-paint')?.startTime || 0,
|
||||
timestamp: Date.now(),
|
||||
page: 'park-listing'
|
||||
};
|
||||
|
||||
// Send metrics to analytics (if configured)
|
||||
this.sendAnalytics('performance', metrics);
|
||||
}
|
||||
|
||||
/**
|
||||
* Setup interaction tracking for performance insights
|
||||
*/
|
||||
setupInteractionTracking() {
|
||||
const startTime = performance.now();
|
||||
|
||||
['click', 'input', 'scroll'].forEach(eventType => {
|
||||
document.addEventListener(eventType, (e) => {
|
||||
this.trackInteraction(eventType, e.target, performance.now() - startTime);
|
||||
}, { passive: true });
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Track user interactions
|
||||
*/
|
||||
trackInteraction(type, target, time) {
|
||||
// Throttle interaction tracking
|
||||
if (!this.lastInteractionTime || time - this.lastInteractionTime > 100) {
|
||||
this.lastInteractionTime = time;
|
||||
|
||||
const interaction = {
|
||||
type,
|
||||
element: target.tagName.toLowerCase(),
|
||||
class: target.className,
|
||||
time: Math.round(time),
|
||||
page: 'park-listing'
|
||||
};
|
||||
|
||||
this.sendAnalytics('interaction', interaction);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Send analytics data
|
||||
*/
|
||||
sendAnalytics(event, data) {
|
||||
// Only send in production and if analytics is configured
|
||||
if (window.ENABLE_ANALYTICS && navigator.sendBeacon) {
|
||||
const payload = JSON.stringify({
|
||||
event,
|
||||
data,
|
||||
timestamp: Date.now(),
|
||||
url: window.location.pathname
|
||||
});
|
||||
|
||||
navigator.sendBeacon('/api/analytics/', payload);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Update performance metrics display
|
||||
*/
|
||||
updatePerformanceMetrics() {
|
||||
const metricsDisplay = document.querySelector('[data-performance-metrics]');
|
||||
if (!metricsDisplay || !window.SHOW_DEBUG) return;
|
||||
|
||||
const imageCount = document.querySelectorAll('img').length;
|
||||
const loadedImages = document.querySelectorAll('img.loaded').length;
|
||||
const cacheHits = Object.keys(sessionStorage).filter(k => k.startsWith('search_')).length;
|
||||
|
||||
metricsDisplay.innerHTML = `
|
||||
<div class="debug-metrics">
|
||||
<span>Images: ${loadedImages}/${imageCount}</span>
|
||||
<span>Cache hits: ${cacheHits}</span>
|
||||
<span>Memory: ${this.getMemoryUsage()}MB</span>
|
||||
</div>
|
||||
`;
|
||||
}
|
||||
|
||||
/**
|
||||
* Get approximate memory usage
|
||||
*/
|
||||
getMemoryUsage() {
|
||||
if ('memory' in performance) {
|
||||
return Math.round(performance.memory.usedJSHeapSize / 1024 / 1024);
|
||||
}
|
||||
return 'N/A';
|
||||
}
|
||||
|
||||
/**
|
||||
* Handle keyboard navigation in search
|
||||
*/
|
||||
handleSearchKeyboard(e) {
|
||||
const suggestions = document.querySelectorAll('.suggestion-item');
|
||||
const active = document.querySelector('.suggestion-item.active');
|
||||
|
||||
switch (e.key) {
|
||||
case 'ArrowDown':
|
||||
e.preventDefault();
|
||||
this.navigateSuggestions(suggestions, active, 1);
|
||||
break;
|
||||
case 'ArrowUp':
|
||||
e.preventDefault();
|
||||
this.navigateSuggestions(suggestions, active, -1);
|
||||
break;
|
||||
case 'Enter':
|
||||
e.preventDefault();
|
||||
if (active) {
|
||||
active.click();
|
||||
}
|
||||
break;
|
||||
case 'Escape':
|
||||
this.hideSuggestions();
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Navigate through suggestions with keyboard
|
||||
*/
|
||||
navigateSuggestions(suggestions, active, direction) {
|
||||
if (active) {
|
||||
active.classList.remove('active');
|
||||
}
|
||||
|
||||
let index = active ? Array.from(suggestions).indexOf(active) : -1;
|
||||
index += direction;
|
||||
|
||||
if (index < 0) index = suggestions.length - 1;
|
||||
if (index >= suggestions.length) index = 0;
|
||||
|
||||
if (suggestions[index]) {
|
||||
suggestions[index].classList.add('active');
|
||||
suggestions[index].scrollIntoView({ block: 'nearest' });
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Utility function to escape HTML
|
||||
*/
|
||||
escapeHtml(text) {
|
||||
const div = document.createElement('div');
|
||||
div.textContent = text;
|
||||
return div.innerHTML;
|
||||
}
|
||||
}
|
||||
|
||||
// Initialize performance optimizations when DOM is ready
|
||||
if (document.readyState === 'loading') {
|
||||
document.addEventListener('DOMContentLoaded', () => {
|
||||
new ParkListingPerformance();
|
||||
});
|
||||
} else {
|
||||
new ParkListingPerformance();
|
||||
}
|
||||
|
||||
// Export for testing
|
||||
if (typeof module !== 'undefined' && module.exports) {
|
||||
module.exports = ParkListingPerformance;
|
||||
}
|
||||
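The script above reports metrics with navigator.sendBeacon('/api/analytics/', payload), but the receiving endpoint is not part of this diff. Purely as a hypothetical sketch (view name, URL wiring, and storage are all assumptions), a minimal Django receiver could look like:

# Hypothetical receiver for the sendBeacon payload above; nothing here is taken from the repo.
import json
from django.http import HttpResponse, HttpResponseBadRequest
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST

@csrf_exempt  # sendBeacon posts without a CSRF token
@require_POST
def analytics_beacon(request):
    try:
        payload = json.loads(request.body.decode("utf-8"))
    except (ValueError, UnicodeDecodeError):
        return HttpResponseBadRequest("invalid payload")
    # A real implementation would queue or persist payload["event"] / payload["data"];
    # this sketch just acknowledges receipt.
    return HttpResponse(status=204)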
@@ -1,36 +0,0 @@
{% load static %}
{% load cotton %}

{% if error %}
    <div class="p-4" data-testid="park-list-error">
        <div class="inline-flex items-center px-4 py-2 rounded-md bg-red-50 dark:bg-red-900/20 text-red-700 dark:text-red-400 border border-red-200 dark:border-red-800">
            <svg class="w-5 h-5 mr-2" fill="currentColor" viewBox="0 0 20 20">
                <path fill-rule="evenodd" d="M10 18a8 8 0 100-16 8 8 0 000 16zM8.707 7.293a1 1 0 00-1.414 1.414L8.586 10l-1.293 1.293a1 1 0 101.414 1.414L10 11.414l1.293 1.293a1 1 0 001.414-1.414L11.414 10l1.293-1.293a1 1 0 00-1.414-1.414L10 8.586 8.707 7.293z" clip-rule="evenodd"/>
            </svg>
            {{ error }}
        </div>
    </div>
{% else %}
    {% for park in object_list|default:parks %}
        <c-park_card park=park view_mode=view_mode />
    {% empty %}
        <div class="{% if view_mode == 'list' %}w-full{% else %}col-span-full{% endif %} p-12 text-center" data-testid="no-parks-found">
            <div class="mx-auto w-24 h-24 text-gray-300 dark:text-gray-600 mb-6">
                <svg fill="none" stroke="currentColor" viewBox="0 0 24 24" class="w-full h-full">
                    <path stroke-linecap="round" stroke-linejoin="round" stroke-width="1" d="M19 11H5m14 0a2 2 0 012 2v6a2 2 0 01-2 2H5a2 2 0 01-2-2v-6a2 2 0 012-2m14 0V9a2 2 0 00-2-2M5 11V9a2 2 0 012-2m0 0V5a2 2 0 012-2h6a2 2 0 012 2v2M7 7h10"/>
                </svg>
            </div>
            <h3 class="text-xl font-bold text-gray-900 dark:text-white mb-3">No parks found</h3>
            <div class="text-gray-600 dark:text-gray-400">
                {% if search_query %}
                    <p class="mb-4">No parks found matching "{{ search_query }}". Try adjusting your search terms.</p>
                {% else %}
                    <p class="mb-4">No parks found matching your criteria. Try adjusting your filters.</p>
                {% endif %}
                {% if user.is_authenticated %}
                    <p>You can also <a href="{% url 'parks:park_create' %}" class="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300 font-semibold">add a new park</a>.</p>
                {% endif %}
            </div>
        </div>
    {% endfor %}
{% endif %}
@@ -1,11 +0,0 @@
from django import template

register = template.Library()


@register.filter
def has_reviewed_park(user, park):
    """Check if a user has reviewed a park"""
    if not user.is_authenticated:
        return False
    return park.reviews.filter(user=user).exists()
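Because has_reviewed_park is registered with @register.filter but is otherwise a plain function, it can be called directly from Python as well as applied as a template filter (user|has_reviewed_park:park after loading the tag library). A minimal sketch, assuming a hypothetical module path for the deleted tag file:

# Sketch only: the import path is a guess; the filter itself is the function above.
from apps.parks.templatetags.park_filters import has_reviewed_park

def should_prompt_for_review(user, park) -> bool:
    # Prompt for a review only when an authenticated user has not already reviewed this park.
    return user.is_authenticated and not has_reviewed_park(user, park)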
@@ -1,117 +0,0 @@
from django.test import TestCase, Client
from django.contrib.auth import get_user_model
from apps.parks.models import Park, ParkArea, ParkLocation, Company as Operator

User = get_user_model()


def create_test_location(park: Park) -> ParkLocation:
    """Helper function to create a test location"""
    park_location = ParkLocation.objects.create(
        park=park,
        street_address="123 Test St",
        city="Test City",
        state="TS",
        country="Test Country",
        postal_code="12345",
    )
    # Set coordinates using the helper method
    park_location.set_coordinates(34.0522, -118.2437)  # latitude, longitude
    park_location.save()
    return park_location


class ParkModelTests(TestCase):
    @classmethod
    def setUpTestData(cls) -> None:
        # Create test user
        cls.user = User.objects.create_user(
            username="testuser",
            email="test@example.com",
            password="testpass123",
        )

        # Create test company
        cls.operator = Operator.objects.create(
            name="Test Company", website="http://example.com"
        )

        # Create test park
        cls.park = Park.objects.create(
            name="Test Park",
            operator=cls.operator,
            status="OPERATING",
            website="http://testpark.com",
        )

        # Create test location
        cls.location = create_test_location(cls.park)

    def test_park_creation(self) -> None:
        """Test park instance creation and field values"""
        self.assertEqual(self.park.name, "Test Park")
        self.assertEqual(self.park.operator, self.operator)
        self.assertEqual(self.park.status, "OPERATING")
        self.assertEqual(self.park.website, "http://testpark.com")
        self.assertTrue(self.park.slug)

    def test_park_str_representation(self) -> None:
        """Test string representation of park"""
        self.assertEqual(str(self.park), "Test Park")

    def test_park_coordinates(self) -> None:
        """Test park coordinates property"""
        coords = self.park.coordinates
        self.assertIsNotNone(coords)
        if coords:
            self.assertAlmostEqual(coords[0], 34.0522, places=4)  # latitude
            self.assertAlmostEqual(coords[1], -118.2437, places=4)  # longitude

    def test_park_formatted_location(self) -> None:
        """Test park formatted_location property"""
        expected = "123 Test St, Test City, TS, 12345, Test Country"
        self.assertEqual(self.park.formatted_location, expected)


class ParkAreaTests(TestCase):
    def setUp(self) -> None:
        # Create test company
        self.operator = Operator.objects.create(
            name="Test Company", website="http://example.com"
        )

        # Create test park
        self.park = Park.objects.create(
            name="Test Park", operator=self.operator, status="OPERATING"
        )

        # Create test location
        self.location = create_test_location(self.park)

        # Create test area
        self.area = ParkArea.objects.create(
            park=self.park, name="Test Area", description="Test Description"
        )

    def test_area_creation(self) -> None:
        """Test park area creation"""
        self.assertEqual(self.area.name, "Test Area")
        self.assertEqual(self.area.park, self.park)
        self.assertTrue(self.area.slug)


class ParkViewTests(TestCase):
    def setUp(self) -> None:
        self.client = Client()
        self.user = User.objects.create_user(
            username="testuser",
            email="test@example.com",
            password="testpass123",
        )
        self.operator = Operator.objects.create(
            name="Test Company", website="http://example.com"
        )
        self.park = Park.objects.create(
            name="Test Park", operator=self.operator, status="OPERATING"
        )
        self.location = create_test_location(self.park)
@@ -1,178 +0,0 @@
"""
Park search autocomplete views for enhanced search functionality.
Provides fast, cached autocomplete suggestions for park search.
"""

from typing import Dict, List, Any
from django.http import JsonResponse
from django.views import View
from django.core.cache import cache
from django.db.models import Q
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page

from .models import Park
from .models.companies import Company
from .services.filter_service import ParkFilterService


class ParkAutocompleteView(View):
    """
    Provides autocomplete suggestions for park search.
    Returns JSON with park names, operators, and location suggestions.
    """

    def get(self, request):
        """Handle GET request for autocomplete suggestions."""
        query = request.GET.get('q', '').strip()

        if len(query) < 2:
            return JsonResponse({
                'suggestions': [],
                'message': 'Type at least 2 characters to search'
            })

        # Check cache first
        cache_key = f"park_autocomplete:{query.lower()}"
        cached_result = cache.get(cache_key)

        if cached_result:
            return JsonResponse(cached_result)

        # Generate suggestions
        suggestions = self._get_suggestions(query)

        # Cache results for 5 minutes
        result = {
            'suggestions': suggestions,
            'query': query
        }
        cache.set(cache_key, result, 300)

        return JsonResponse(result)

    def _get_suggestions(self, query: str) -> List[Dict[str, Any]]:
        """Generate autocomplete suggestions based on query."""
        suggestions = []

        # Park name suggestions (top 5)
        park_suggestions = self._get_park_suggestions(query)
        suggestions.extend(park_suggestions)

        # Operator suggestions (top 3)
        operator_suggestions = self._get_operator_suggestions(query)
        suggestions.extend(operator_suggestions)

        # Location suggestions (top 3)
        location_suggestions = self._get_location_suggestions(query)
        suggestions.extend(location_suggestions)

        # Remove duplicates and limit results
        seen = set()
        unique_suggestions = []
        for suggestion in suggestions:
            key = suggestion['name'].lower()
            if key not in seen:
                seen.add(key)
                unique_suggestions.append(suggestion)

        return unique_suggestions[:10]  # Limit to 10 suggestions

    def _get_park_suggestions(self, query: str) -> List[Dict[str, Any]]:
        """Get park name suggestions."""
        parks = Park.objects.filter(
            name__icontains=query,
            status='OPERATING'
        ).select_related('operator').order_by('name')[:5]

        suggestions = []
        for park in parks:
            suggestion = {
                'name': park.name,
                'type': 'park',
                'operator': park.operator.name if park.operator else None,
                'url': f'/parks/{park.slug}/' if park.slug else None
            }
            suggestions.append(suggestion)

        return suggestions

    def _get_operator_suggestions(self, query: str) -> List[Dict[str, Any]]:
        """Get operator suggestions."""
        operators = Company.objects.filter(
            roles__contains=['OPERATOR'],
            name__icontains=query
        ).order_by('name')[:3]

        suggestions = []
        for operator in operators:
            suggestion = {
                'name': operator.name,
                'type': 'operator',
                'park_count': operator.operated_parks.filter(status='OPERATING').count()
            }
            suggestions.append(suggestion)

        return suggestions

    def _get_location_suggestions(self, query: str) -> List[Dict[str, Any]]:
        """Get location (city/country) suggestions."""
        # Get unique cities
        city_parks = Park.objects.filter(
            location__city__icontains=query,
            status='OPERATING'
        ).select_related('location').order_by('location__city').distinct()[:2]

        # Get unique countries
        country_parks = Park.objects.filter(
            location__country__icontains=query,
            status='OPERATING'
        ).select_related('location').order_by('location__country').distinct()[:2]

        suggestions = []

        # Add city suggestions
        for park in city_parks:
            if park.location and park.location.city:
                city_name = park.location.city
                if park.location.country:
                    city_name += f", {park.location.country}"

                suggestion = {
                    'name': city_name,
                    'type': 'location',
                    'location_type': 'city'
                }
                suggestions.append(suggestion)

        # Add country suggestions
        for park in country_parks:
            if park.location and park.location.country:
                suggestion = {
                    'name': park.location.country,
                    'type': 'location',
                    'location_type': 'country'
                }
                suggestions.append(suggestion)

        return suggestions


@method_decorator(cache_page(60 * 5), name='dispatch')  # Cache for 5 minutes
class QuickFilterSuggestionsView(View):
    """
    Provides quick filter suggestions and popular filters.
    Used for search dropdown quick actions.
    """

    def get(self, request):
        """Handle GET request for quick filter suggestions."""
        filter_service = ParkFilterService()
        popular_filters = filter_service.get_popular_filters()
        filter_counts = filter_service.get_filter_counts()

        return JsonResponse({
            'quick_filters': popular_filters.get('quick_filters', []),
            'filter_counts': filter_counts,
            'recommended_sorts': popular_filters.get('recommended_sorts', [])
        })
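The deleted front-end script earlier in this diff calls this endpoint as /api/parks/autocomplete/?q=&lt;query&gt;. A minimal sketch of exercising the view with Django's test client, assuming that URL is still routed to ParkAutocompleteView:

# Sketch only: the URL comes from the deleted JS above and is assumed to be wired up in urls.py.
from django.test import Client

client = Client()
response = client.get("/api/parks/autocomplete/", {"q": "ce"})  # two characters is the minimum query length
assert response.status_code == 200
payload = response.json()
assert "suggestions" in payload and payload["query"] == "ce"
# Repeating the same query within five minutes is served from the park_autocomplete:<query> cache key.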
@@ -1,710 +0,0 @@
|
||||
from django.contrib import admin
|
||||
# from django.contrib.gis.admin import GISModelAdmin # Disabled temporarily for setup
|
||||
from django.utils.html import format_html
|
||||
from .models.company import Company
|
||||
from .models.rides import Ride, RideModel, RollerCoasterStats
|
||||
from .models.location import RideLocation
|
||||
from .models.reviews import RideReview
|
||||
from .models.rankings import RideRanking, RidePairComparison, RankingSnapshot
|
||||
|
||||
|
||||
class ManufacturerAdmin(admin.ModelAdmin):
|
||||
list_display = ("name", "headquarters", "website", "rides_count")
|
||||
search_fields = ("name",)
|
||||
|
||||
def get_queryset(self, request):
|
||||
return super().get_queryset(request).filter(roles__contains=["MANUFACTURER"])
|
||||
|
||||
|
||||
class DesignerAdmin(admin.ModelAdmin):
|
||||
list_display = ("name", "headquarters", "website")
|
||||
search_fields = ("name",)
|
||||
|
||||
def get_queryset(self, request):
|
||||
return super().get_queryset(request).filter(roles__contains=["DESIGNER"])
|
||||
|
||||
|
||||
class RideLocationInline(admin.StackedInline):
|
||||
"""Inline admin for RideLocation"""
|
||||
|
||||
model = RideLocation
|
||||
extra = 0
|
||||
fields = (
|
||||
"park_area",
|
||||
"point",
|
||||
"entrance_notes",
|
||||
"accessibility_notes",
|
||||
)
|
||||
|
||||
|
||||
class RideLocationAdmin(admin.ModelAdmin): # GISModelAdmin disabled for setup
|
||||
"""Admin for standalone RideLocation management"""
|
||||
|
||||
list_display = ("ride", "park_area", "has_coordinates", "created_at")
|
||||
list_filter = ("park_area", "created_at")
|
||||
search_fields = ("ride__name", "park_area", "entrance_notes")
|
||||
readonly_fields = (
|
||||
"latitude",
|
||||
"longitude",
|
||||
"coordinates",
|
||||
"created_at",
|
||||
"updated_at",
|
||||
)
|
||||
fieldsets = (
|
||||
("Ride", {"fields": ("ride",)}),
|
||||
(
|
||||
"Location Information",
|
||||
{
|
||||
"fields": (
|
||||
"park_area",
|
||||
"point",
|
||||
"latitude",
|
||||
"longitude",
|
||||
"coordinates",
|
||||
),
|
||||
"description": "Optional coordinates - not all rides need precise location tracking",
|
||||
},
|
||||
),
|
||||
(
|
||||
"Navigation Notes",
|
||||
{
|
||||
"fields": ("entrance_notes", "accessibility_notes"),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{"fields": ("created_at", "updated_at"), "classes": ("collapse",)},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Latitude")
|
||||
def latitude(self, obj):
|
||||
return obj.latitude
|
||||
|
||||
@admin.display(description="Longitude")
|
||||
def longitude(self, obj):
|
||||
return obj.longitude
|
||||
|
||||
|
||||
class RollerCoasterStatsInline(admin.StackedInline):
|
||||
"""Inline admin for RollerCoasterStats"""
|
||||
|
||||
model = RollerCoasterStats
|
||||
extra = 0
|
||||
fields = (
|
||||
("height_ft", "length_ft", "speed_mph"),
|
||||
("track_material", "roller_coaster_type"),
|
||||
("propulsion_system", "inversions"),
|
||||
("max_drop_height_ft", "ride_time_seconds"),
|
||||
("train_style", "trains_count"),
|
||||
("cars_per_train", "seats_per_car"),
|
||||
)
|
||||
classes = ("collapse",)
|
||||
|
||||
|
||||
@admin.register(Ride)
|
||||
class RideAdmin(admin.ModelAdmin):
|
||||
"""Enhanced Ride admin with location and coaster stats inlines"""
|
||||
|
||||
list_display = (
|
||||
"name",
|
||||
"park",
|
||||
"category_display",
|
||||
"manufacturer",
|
||||
"status",
|
||||
"opening_date",
|
||||
"average_rating",
|
||||
)
|
||||
list_filter = (
|
||||
"category",
|
||||
"status",
|
||||
"park",
|
||||
"manufacturer",
|
||||
"designer",
|
||||
"opening_date",
|
||||
)
|
||||
search_fields = (
|
||||
"name",
|
||||
"description",
|
||||
"park__name",
|
||||
"manufacturer__name",
|
||||
"designer__name",
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
inlines = [RideLocationInline, RollerCoasterStatsInline]
|
||||
date_hierarchy = "opening_date"
|
||||
ordering = ("park", "name")
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Basic Information",
|
||||
{
|
||||
"fields": (
|
||||
"name",
|
||||
"slug",
|
||||
"description",
|
||||
"park",
|
||||
"park_area",
|
||||
"category",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Companies",
|
||||
{
|
||||
"fields": (
|
||||
"manufacturer",
|
||||
"designer",
|
||||
"ride_model",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Status & Dates",
|
||||
{
|
||||
"fields": (
|
||||
"status",
|
||||
"post_closing_status",
|
||||
"opening_date",
|
||||
"closing_date",
|
||||
"status_since",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Ride Specifications",
|
||||
{
|
||||
"fields": (
|
||||
"min_height_in",
|
||||
"max_height_in",
|
||||
"capacity_per_hour",
|
||||
"ride_duration_seconds",
|
||||
"average_rating",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{
|
||||
"fields": ("created_at", "updated_at"),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Category")
|
||||
def category_display(self, obj):
|
||||
"""Display category with full name"""
|
||||
choices_dict = dict(obj._meta.get_field("category").choices)
|
||||
if obj.category in choices_dict:
|
||||
return choices_dict[obj.category]
|
||||
else:
|
||||
raise ValueError(f"Unknown category: {obj.category}")
|
||||
|
||||
|
||||
@admin.register(RideModel)
|
||||
class RideModelAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for ride models"""
|
||||
|
||||
list_display = (
|
||||
"name",
|
||||
"manufacturer",
|
||||
"category_display",
|
||||
"ride_count",
|
||||
)
|
||||
list_filter = (
|
||||
"manufacturer",
|
||||
"category",
|
||||
)
|
||||
search_fields = (
|
||||
"name",
|
||||
"description",
|
||||
"manufacturer__name",
|
||||
)
|
||||
ordering = ("manufacturer", "name")
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Model Information",
|
||||
{
|
||||
"fields": (
|
||||
"name",
|
||||
"manufacturer",
|
||||
"category",
|
||||
"description",
|
||||
)
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Category")
|
||||
def category_display(self, obj):
|
||||
"""Display category with full name"""
|
||||
choices_dict = dict(obj._meta.get_field("category").choices)
|
||||
if obj.category in choices_dict:
|
||||
return choices_dict[obj.category]
|
||||
else:
|
||||
raise ValueError(f"Unknown category: {obj.category}")
|
||||
|
||||
@admin.display(description="Installations")
|
||||
def ride_count(self, obj):
|
||||
"""Display number of ride installations"""
|
||||
return obj.rides.count()
|
||||
|
||||
|
||||
@admin.register(RollerCoasterStats)
|
||||
class RollerCoasterStatsAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for roller coaster statistics"""
|
||||
|
||||
list_display = (
|
||||
"ride",
|
||||
"height_ft",
|
||||
"speed_mph",
|
||||
"length_ft",
|
||||
"inversions",
|
||||
"track_material",
|
||||
"roller_coaster_type",
|
||||
)
|
||||
list_filter = (
|
||||
"track_material",
|
||||
"roller_coaster_type",
|
||||
"propulsion_system",
|
||||
"inversions",
|
||||
)
|
||||
search_fields = (
|
||||
"ride__name",
|
||||
"ride__park__name",
|
||||
"track_type",
|
||||
"train_style",
|
||||
)
|
||||
readonly_fields = ("calculated_capacity",)
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Basic Stats",
|
||||
{
|
||||
"fields": (
|
||||
"ride",
|
||||
"height_ft",
|
||||
"length_ft",
|
||||
"speed_mph",
|
||||
"max_drop_height_ft",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Track & Design",
|
||||
{
|
||||
"fields": (
|
||||
"track_material",
|
||||
"track_type",
|
||||
"roller_coaster_type",
|
||||
"propulsion_system",
|
||||
"inversions",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Operation Details",
|
||||
{
|
||||
"fields": (
|
||||
"ride_time_seconds",
|
||||
"train_style",
|
||||
"trains_count",
|
||||
"cars_per_train",
|
||||
"seats_per_car",
|
||||
"calculated_capacity",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Calculated Capacity")
|
||||
def calculated_capacity(self, obj):
|
||||
"""Calculate theoretical hourly capacity"""
|
||||
if all(
|
||||
[
|
||||
obj.trains_count,
|
||||
obj.cars_per_train,
|
||||
obj.seats_per_car,
|
||||
obj.ride_time_seconds,
|
||||
]
|
||||
):
|
||||
total_seats = obj.trains_count * obj.cars_per_train * obj.seats_per_car
|
||||
# Add 2 min loading time
|
||||
cycles_per_hour = 3600 / (obj.ride_time_seconds + 120)
|
||||
return f"{int(total_seats * cycles_per_hour)} riders/hour"
|
||||
return "N/A"
|
||||
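# Worked example of the capacity formula above (illustrative numbers only, not from the data):
# 2 trains x 7 cars x 4 seats = 56 seats; with ride_time_seconds = 120 the cycle is 120 + 120 = 240 s,
# giving 3600 / 240 = 15 dispatches per hour, so roughly 56 * 15 = 840 riders/hour.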
|
||||
|
||||
@admin.register(RideReview)
|
||||
class RideReviewAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for ride reviews"""
|
||||
|
||||
list_display = (
|
||||
"ride",
|
||||
"user",
|
||||
"rating",
|
||||
"title",
|
||||
"visit_date",
|
||||
"is_published",
|
||||
"created_at",
|
||||
"moderation_status",
|
||||
)
|
||||
list_filter = (
|
||||
"rating",
|
||||
"is_published",
|
||||
"visit_date",
|
||||
"created_at",
|
||||
"ride__park",
|
||||
"moderated_by",
|
||||
)
|
||||
search_fields = (
|
||||
"title",
|
||||
"content",
|
||||
"user__username",
|
||||
"ride__name",
|
||||
"ride__park__name",
|
||||
)
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
date_hierarchy = "created_at"
|
||||
ordering = ("-created_at",)
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Review Details",
|
||||
{
|
||||
"fields": (
|
||||
"user",
|
||||
"ride",
|
||||
"rating",
|
||||
"title",
|
||||
"content",
|
||||
"visit_date",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Publication Status",
|
||||
{
|
||||
"fields": ("is_published",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Moderation",
|
||||
{
|
||||
"fields": (
|
||||
"moderated_by",
|
||||
"moderated_at",
|
||||
"moderation_notes",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{
|
||||
"fields": ("created_at", "updated_at"),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Moderation Status", boolean=True)
|
||||
def moderation_status(self, obj):
|
||||
"""Display moderation status with color coding"""
|
||||
if obj.moderated_by:
|
||||
return format_html(
|
||||
'<span style="color: {};">{}</span>',
|
||||
"green" if obj.is_published else "red",
|
||||
"Approved" if obj.is_published else "Rejected",
|
||||
)
|
||||
return format_html('<span style="color: orange;">Pending</span>')
|
||||
|
||||
def save_model(self, request, obj, form, change):
|
||||
"""Auto-set moderation info when status changes"""
|
||||
if change and "is_published" in form.changed_data:
|
||||
from django.utils import timezone
|
||||
|
||||
obj.moderated_by = request.user
|
||||
obj.moderated_at = timezone.now()
|
||||
super().save_model(request, obj, form, change)
|
||||
|
||||
|
||||
@admin.register(Company)
|
||||
class CompanyAdmin(admin.ModelAdmin):
|
||||
"""Enhanced Company admin for rides app"""
|
||||
|
||||
list_display = (
|
||||
"name",
|
||||
"roles_display",
|
||||
"website",
|
||||
"founded_date",
|
||||
"rides_count",
|
||||
"coasters_count",
|
||||
)
|
||||
list_filter = ("roles", "founded_date")
|
||||
search_fields = ("name", "description")
|
||||
readonly_fields = ("created_at", "updated_at")
|
||||
prepopulated_fields = {"slug": ("name",)}
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Basic Information",
|
||||
{
|
||||
"fields": (
|
||||
"name",
|
||||
"slug",
|
||||
"roles",
|
||||
"description",
|
||||
"website",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Company Details",
|
||||
{
|
||||
"fields": (
|
||||
"founded_date",
|
||||
"rides_count",
|
||||
"coasters_count",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Metadata",
|
||||
{
|
||||
"fields": ("created_at", "updated_at"),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Roles")
|
||||
def roles_display(self, obj):
|
||||
"""Display roles as a formatted string"""
|
||||
return ", ".join(obj.roles) if obj.roles else "No roles"
|
||||
|
||||
|
||||
@admin.register(RideRanking)
|
||||
class RideRankingAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for ride rankings"""
|
||||
|
||||
list_display = (
|
||||
"rank",
|
||||
"ride_name",
|
||||
"park_name",
|
||||
"winning_percentage_display",
|
||||
"wins",
|
||||
"losses",
|
||||
"ties",
|
||||
"average_rating",
|
||||
"mutual_riders_count",
|
||||
"last_calculated",
|
||||
)
|
||||
list_filter = (
|
||||
"ride__category",
|
||||
"last_calculated",
|
||||
"calculation_version",
|
||||
)
|
||||
search_fields = (
|
||||
"ride__name",
|
||||
"ride__park__name",
|
||||
)
|
||||
readonly_fields = (
|
||||
"ride",
|
||||
"rank",
|
||||
"wins",
|
||||
"losses",
|
||||
"ties",
|
||||
"winning_percentage",
|
||||
"mutual_riders_count",
|
||||
"comparison_count",
|
||||
"average_rating",
|
||||
"last_calculated",
|
||||
"calculation_version",
|
||||
"total_comparisons",
|
||||
)
|
||||
ordering = ["rank"]
|
||||
|
||||
fieldsets = (
|
||||
(
|
||||
"Ride Information",
|
||||
{"fields": ("ride",)},
|
||||
),
|
||||
(
|
||||
"Ranking Metrics",
|
||||
{
|
||||
"fields": (
|
||||
"rank",
|
||||
"winning_percentage",
|
||||
"wins",
|
||||
"losses",
|
||||
"ties",
|
||||
"total_comparisons",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Additional Metrics",
|
||||
{
|
||||
"fields": (
|
||||
"average_rating",
|
||||
"mutual_riders_count",
|
||||
"comparison_count",
|
||||
)
|
||||
},
|
||||
),
|
||||
(
|
||||
"Calculation Info",
|
||||
{
|
||||
"fields": (
|
||||
"last_calculated",
|
||||
"calculation_version",
|
||||
),
|
||||
"classes": ("collapse",),
|
||||
},
|
||||
),
|
||||
)
|
||||
|
||||
@admin.display(description="Ride")
|
||||
def ride_name(self, obj):
|
||||
return obj.ride.name
|
||||
|
||||
@admin.display(description="Park")
|
||||
def park_name(self, obj):
|
||||
return obj.ride.park.name
|
||||
|
||||
@admin.display(description="Win %")
|
||||
def winning_percentage_display(self, obj):
|
||||
return f"{obj.winning_percentage:.1%}"
|
||||
|
||||
def has_add_permission(self, request):
|
||||
# Rankings are calculated automatically
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
# Rankings are read-only
|
||||
return False
|
||||
|
||||
|
||||
@admin.register(RidePairComparison)
|
||||
class RidePairComparisonAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for ride pair comparisons"""
|
||||
|
||||
list_display = (
|
||||
"comparison_summary",
|
||||
"ride_a_name",
|
||||
"ride_b_name",
|
||||
"winner_display",
|
||||
"ride_a_wins",
|
||||
"ride_b_wins",
|
||||
"ties",
|
||||
"mutual_riders_count",
|
||||
"last_calculated",
|
||||
)
|
||||
list_filter = ("last_calculated",)
|
||||
search_fields = (
|
||||
"ride_a__name",
|
||||
"ride_b__name",
|
||||
"ride_a__park__name",
|
||||
"ride_b__park__name",
|
||||
)
|
||||
readonly_fields = (
|
||||
"ride_a",
|
||||
"ride_b",
|
||||
"ride_a_wins",
|
||||
"ride_b_wins",
|
||||
"ties",
|
||||
"mutual_riders_count",
|
||||
"ride_a_avg_rating",
|
||||
"ride_b_avg_rating",
|
||||
"last_calculated",
|
||||
"winner",
|
||||
"is_tie",
|
||||
)
|
||||
ordering = ["-mutual_riders_count"]
|
||||
|
||||
@admin.display(description="Comparison")
|
||||
def comparison_summary(self, obj):
|
||||
return f"{obj.ride_a.name} vs {obj.ride_b.name}"
|
||||
|
||||
@admin.display(description="Ride A")
|
||||
def ride_a_name(self, obj):
|
||||
return obj.ride_a.name
|
||||
|
||||
@admin.display(description="Ride B")
|
||||
def ride_b_name(self, obj):
|
||||
return obj.ride_b.name
|
||||
|
||||
@admin.display(description="Winner")
|
||||
def winner_display(self, obj):
|
||||
if obj.is_tie:
|
||||
return "TIE"
|
||||
winner = obj.winner
|
||||
if winner:
|
||||
return winner.name
|
||||
return "N/A"
|
||||
|
||||
def has_add_permission(self, request):
|
||||
# Comparisons are calculated automatically
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
# Comparisons are read-only
|
||||
return False
|
||||
|
||||
|
||||
@admin.register(RankingSnapshot)
|
||||
class RankingSnapshotAdmin(admin.ModelAdmin):
|
||||
"""Admin interface for ranking history snapshots"""
|
||||
|
||||
list_display = (
|
||||
"ride_name",
|
||||
"park_name",
|
||||
"rank",
|
||||
"winning_percentage_display",
|
||||
"snapshot_date",
|
||||
)
|
||||
list_filter = (
|
||||
"snapshot_date",
|
||||
"ride__category",
|
||||
)
|
||||
search_fields = (
|
||||
"ride__name",
|
||||
"ride__park__name",
|
||||
)
|
||||
readonly_fields = (
|
||||
"ride",
|
||||
"rank",
|
||||
"winning_percentage",
|
||||
"snapshot_date",
|
||||
)
|
||||
date_hierarchy = "snapshot_date"
|
||||
ordering = ["-snapshot_date", "rank"]
|
||||
|
||||
@admin.display(description="Ride")
|
||||
def ride_name(self, obj):
|
||||
return obj.ride.name
|
||||
|
||||
@admin.display(description="Park")
|
||||
def park_name(self, obj):
|
||||
return obj.ride.park.name
|
||||
|
||||
@admin.display(description="Win %")
|
||||
def winning_percentage_display(self, obj):
|
||||
return f"{obj.winning_percentage:.1%}"
|
||||
|
||||
def has_add_permission(self, request):
|
||||
# Snapshots are created automatically
|
||||
return False
|
||||
|
||||
def has_change_permission(self, request, obj=None):
|
||||
# Snapshots are read-only
|
||||
return False
|
||||
|
||||
|
||||
admin.site.register(RideLocation, RideLocationAdmin)
|
||||
@@ -1,804 +0,0 @@
|
||||
"""
|
||||
Rich Choice Objects for Rides Domain
|
||||
|
||||
This module defines all choice objects for the rides domain, replacing
|
||||
the legacy tuple-based choices with rich choice objects.
|
||||
"""
|
||||
|
||||
from apps.core.choices import RichChoice, ChoiceCategory
|
||||
from apps.core.choices.registry import register_choices
|
||||
|
||||
|
||||
# Ride Category Choices
|
||||
RIDE_CATEGORIES = [
|
||||
RichChoice(
|
||||
value="RC",
|
||||
label="Roller Coaster",
|
||||
description="Thrill rides with tracks featuring hills, loops, and high speeds",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'roller-coaster',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="DR",
|
||||
label="Dark Ride",
|
||||
description="Indoor rides with themed environments and storytelling",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'dark-ride',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FR",
|
||||
label="Flat Ride",
|
||||
description="Rides that move along a generally flat plane with spinning, swinging, or bouncing motions",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'flat-ride',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="WR",
|
||||
label="Water Ride",
|
||||
description="Rides that incorporate water elements like splashing, floating, or getting wet",
|
||||
metadata={
|
||||
'color': 'cyan',
|
||||
'icon': 'water-ride',
|
||||
'css_class': 'bg-cyan-100 text-cyan-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="TR",
|
||||
label="Transport Ride",
|
||||
description="Rides primarily designed for transportation around the park",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'transport',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="OT",
|
||||
label="Other",
|
||||
description="Rides that don't fit into standard categories",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'other',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# Ride Status Choices
|
||||
RIDE_STATUSES = [
|
||||
RichChoice(
|
||||
value="OPERATING",
|
||||
label="Operating",
|
||||
description="Ride is currently open and operating normally",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'check-circle',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSED_TEMP",
|
||||
label="Temporarily Closed",
|
||||
description="Ride is temporarily closed for maintenance, weather, or other short-term reasons",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'pause-circle',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="SBNO",
|
||||
label="Standing But Not Operating",
|
||||
description="Ride structure remains but is not currently operating",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'stop-circle',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSING",
|
||||
label="Closing",
|
||||
description="Ride is scheduled to close permanently",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSED_PERM",
|
||||
label="Permanently Closed",
|
||||
description="Ride has been permanently closed and will not reopen",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="UNDER_CONSTRUCTION",
|
||||
label="Under Construction",
|
||||
description="Ride is currently being built or undergoing major renovation",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'tool',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="DEMOLISHED",
|
||||
label="Demolished",
|
||||
description="Ride has been completely removed and demolished",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'trash',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 7
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="RELOCATED",
|
||||
label="Relocated",
|
||||
description="Ride has been moved to a different location",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'arrow-right',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 8
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
# Post-Closing Status Choices
|
||||
POST_CLOSING_STATUSES = [
|
||||
RichChoice(
|
||||
value="SBNO",
|
||||
label="Standing But Not Operating",
|
||||
description="Ride structure remains but is not operating after closure",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'stop-circle',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
RichChoice(
|
||||
value="CLOSED_PERM",
|
||||
label="Permanently Closed",
|
||||
description="Ride has been permanently closed after the closing date",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'x-circle',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.STATUS
|
||||
),
|
||||
]
|
||||
|
||||
# Roller Coaster Track Material Choices
|
||||
TRACK_MATERIALS = [
|
||||
RichChoice(
|
||||
value="STEEL",
|
||||
label="Steel",
|
||||
description="Modern steel track construction providing smooth rides and complex layouts",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'steel',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="WOOD",
|
||||
label="Wood",
|
||||
description="Traditional wooden track construction providing classic coaster experience",
|
||||
metadata={
|
||||
'color': 'amber',
|
||||
'icon': 'wood',
|
||||
'css_class': 'bg-amber-100 text-amber-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="HYBRID",
|
||||
label="Hybrid",
|
||||
description="Combination of steel and wooden construction elements",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'hybrid',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
]
|
||||
|
||||
# Roller Coaster Type Choices
|
||||
COASTER_TYPES = [
|
||||
RichChoice(
|
||||
value="SITDOWN",
|
||||
label="Sit Down",
|
||||
description="Traditional seated roller coaster with riders sitting upright",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'sitdown',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="INVERTED",
|
||||
label="Inverted",
|
||||
description="Coaster where riders' feet dangle freely below the track",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'inverted',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FLYING",
|
||||
label="Flying",
|
||||
description="Riders lie face-down in a flying position",
|
||||
metadata={
|
||||
'color': 'sky',
|
||||
'icon': 'flying',
|
||||
'css_class': 'bg-sky-100 text-sky-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="STANDUP",
|
||||
label="Stand Up",
|
||||
description="Riders stand upright during the ride",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'standup',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="WING",
|
||||
label="Wing",
|
||||
description="Riders sit on either side of the track with nothing above or below",
|
||||
metadata={
|
||||
'color': 'indigo',
|
||||
'icon': 'wing',
|
||||
'css_class': 'bg-indigo-100 text-indigo-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="DIVE",
|
||||
label="Dive",
|
||||
description="Features a vertical or near-vertical drop as the main element",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'dive',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FAMILY",
|
||||
label="Family",
|
||||
description="Designed for riders of all ages with moderate thrills",
|
||||
metadata={
|
||||
'color': 'emerald',
|
||||
'icon': 'family',
|
||||
'css_class': 'bg-emerald-100 text-emerald-800',
|
||||
'sort_order': 7
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="WILD_MOUSE",
|
||||
label="Wild Mouse",
|
||||
description="Compact coaster with sharp turns and sudden drops",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'mouse',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 8
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="SPINNING",
|
||||
label="Spinning",
|
||||
description="Cars rotate freely during the ride",
|
||||
metadata={
|
||||
'color': 'pink',
|
||||
'icon': 'spinning',
|
||||
'css_class': 'bg-pink-100 text-pink-800',
|
||||
'sort_order': 9
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="FOURTH_DIMENSION",
|
||||
label="4th Dimension",
|
||||
description="Seats rotate independently of the track direction",
|
||||
metadata={
|
||||
'color': 'violet',
|
||||
'icon': '4d',
|
||||
'css_class': 'bg-violet-100 text-violet-800',
|
||||
'sort_order': 10
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Coaster type that doesn't fit standard classifications",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'other',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 11
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# Propulsion System Choices
|
||||
PROPULSION_SYSTEMS = [
|
||||
RichChoice(
|
||||
value="CHAIN",
|
||||
label="Chain Lift",
|
||||
description="Traditional chain lift system to pull trains up the lift hill",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'chain',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="LSM",
|
||||
label="LSM Launch",
|
||||
description="Linear Synchronous Motor launch system using magnetic propulsion",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'lightning',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="HYDRAULIC",
|
||||
label="Hydraulic Launch",
|
||||
description="High-pressure hydraulic launch system for rapid acceleration",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'hydraulic',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="GRAVITY",
|
||||
label="Gravity",
|
||||
description="Uses gravity and momentum without mechanical lift systems",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'gravity',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Propulsion system that doesn't fit standard categories",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'other',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
]
|
||||
|
||||
# Ride Model Target Market Choices
|
||||
TARGET_MARKETS = [
|
||||
RichChoice(
|
||||
value="FAMILY",
|
||||
label="Family",
|
||||
description="Designed for families with children, moderate thrills",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'family',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="THRILL",
|
||||
label="Thrill",
|
||||
description="High-intensity rides for thrill seekers",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'thrill',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="EXTREME",
|
||||
label="Extreme",
|
||||
description="Maximum intensity rides for extreme thrill seekers",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'extreme',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="KIDDIE",
|
||||
label="Kiddie",
|
||||
description="Gentle rides designed specifically for young children",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'kiddie',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="ALL_AGES",
|
||||
label="All Ages",
|
||||
description="Suitable for riders of all ages and thrill preferences",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'all-ages',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# Ride Model Photo Type Choices
|
||||
PHOTO_TYPES = [
|
||||
RichChoice(
|
||||
value="PROMOTIONAL",
|
||||
label="Promotional",
|
||||
description="Marketing and promotional photos of the ride model",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'camera',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="TECHNICAL",
|
||||
label="Technical Drawing",
|
||||
description="Technical drawings and engineering diagrams",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'blueprint',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="INSTALLATION",
|
||||
label="Installation Example",
|
||||
description="Photos of actual installations of this ride model",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'installation',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="RENDERING",
|
||||
label="3D Rendering",
|
||||
description="Computer-generated 3D renderings of the ride model",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'cube',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="CATALOG",
|
||||
label="Catalog Image",
|
||||
description="Official catalog and brochure images",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'catalog',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
# Technical Specification Category Choices
|
||||
SPEC_CATEGORIES = [
|
||||
RichChoice(
|
||||
value="DIMENSIONS",
|
||||
label="Dimensions",
|
||||
description="Physical dimensions and measurements",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'ruler',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 1
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="PERFORMANCE",
|
||||
label="Performance",
|
||||
description="Performance specifications and capabilities",
|
||||
metadata={
|
||||
'color': 'red',
|
||||
'icon': 'speedometer',
|
||||
'css_class': 'bg-red-100 text-red-800',
|
||||
'sort_order': 2
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="CAPACITY",
|
||||
label="Capacity",
|
||||
description="Rider capacity and throughput specifications",
|
||||
metadata={
|
||||
'color': 'green',
|
||||
'icon': 'users',
|
||||
'css_class': 'bg-green-100 text-green-800',
|
||||
'sort_order': 3
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="SAFETY",
|
||||
label="Safety Features",
|
||||
description="Safety systems and features",
|
||||
metadata={
|
||||
'color': 'yellow',
|
||||
'icon': 'shield',
|
||||
'css_class': 'bg-yellow-100 text-yellow-800',
|
||||
'sort_order': 4
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="ELECTRICAL",
|
||||
label="Electrical Requirements",
|
||||
description="Power and electrical system requirements",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'lightning',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 5
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="FOUNDATION",
|
||||
label="Foundation Requirements",
|
||||
description="Foundation and structural requirements",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'foundation',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 6
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="MAINTENANCE",
|
||||
label="Maintenance",
|
||||
description="Maintenance requirements and procedures",
|
||||
metadata={
|
||||
'color': 'orange',
|
||||
'icon': 'wrench',
|
||||
'css_class': 'bg-orange-100 text-orange-800',
|
||||
'sort_order': 7
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
RichChoice(
|
||||
value="OTHER",
|
||||
label="Other",
|
||||
description="Other technical specifications",
|
||||
metadata={
|
||||
'color': 'gray',
|
||||
'icon': 'other',
|
||||
'css_class': 'bg-gray-100 text-gray-800',
|
||||
'sort_order': 8
|
||||
},
|
||||
category=ChoiceCategory.TECHNICAL
|
||||
),
|
||||
]
|
||||
|
||||
# Company Role Choices for Rides Domain (MANUFACTURER and DESIGNER only)
|
||||
RIDES_COMPANY_ROLES = [
|
||||
RichChoice(
|
||||
value="MANUFACTURER",
|
||||
label="Ride Manufacturer",
|
||||
description="Company that designs and builds ride hardware and systems",
|
||||
metadata={
|
||||
'color': 'blue',
|
||||
'icon': 'factory',
|
||||
'css_class': 'bg-blue-100 text-blue-800',
|
||||
'sort_order': 1,
|
||||
'domain': 'rides',
|
||||
'permissions': ['manage_ride_models', 'view_manufacturing'],
|
||||
'url_pattern': '/rides/manufacturers/{slug}/'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
RichChoice(
|
||||
value="DESIGNER",
|
||||
label="Ride Designer",
|
||||
description="Company that specializes in ride design, layout, and engineering",
|
||||
metadata={
|
||||
'color': 'purple',
|
||||
'icon': 'design',
|
||||
'css_class': 'bg-purple-100 text-purple-800',
|
||||
'sort_order': 2,
|
||||
'domain': 'rides',
|
||||
'permissions': ['manage_ride_designs', 'view_design_specs'],
|
||||
'url_pattern': '/rides/designers/{slug}/'
|
||||
},
|
||||
category=ChoiceCategory.CLASSIFICATION
|
||||
),
|
||||
]
|
||||
|
||||
|
||||
def register_rides_choices():
|
||||
"""Register all rides domain choices with the global registry"""
|
||||
|
||||
register_choices(
|
||||
name="categories",
|
||||
choices=RIDE_CATEGORIES,
|
||||
domain="rides",
|
||||
description="Ride category classifications",
|
||||
metadata={'domain': 'rides', 'type': 'category'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="statuses",
|
||||
choices=RIDE_STATUSES,
|
||||
domain="rides",
|
||||
description="Ride operational status options",
|
||||
metadata={'domain': 'rides', 'type': 'status'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="post_closing_statuses",
|
||||
choices=POST_CLOSING_STATUSES,
|
||||
domain="rides",
|
||||
description="Status options after ride closure",
|
||||
metadata={'domain': 'rides', 'type': 'post_closing_status'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="track_materials",
|
||||
choices=TRACK_MATERIALS,
|
||||
domain="rides",
|
||||
description="Roller coaster track material types",
|
||||
metadata={'domain': 'rides', 'type': 'track_material', 'applies_to': 'roller_coasters'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="coaster_types",
|
||||
choices=COASTER_TYPES,
|
||||
domain="rides",
|
||||
description="Roller coaster type classifications",
|
||||
metadata={'domain': 'rides', 'type': 'coaster_type', 'applies_to': 'roller_coasters'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="propulsion_systems",
|
||||
choices=PROPULSION_SYSTEMS,
|
||||
domain="rides",
|
||||
description="Roller coaster propulsion and lift systems",
|
||||
metadata={'domain': 'rides', 'type': 'propulsion_system', 'applies_to': 'roller_coasters'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="target_markets",
|
||||
choices=TARGET_MARKETS,
|
||||
domain="rides",
|
||||
description="Target market classifications for ride models",
|
||||
metadata={'domain': 'rides', 'type': 'target_market', 'applies_to': 'ride_models'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="photo_types",
|
||||
choices=PHOTO_TYPES,
|
||||
domain="rides",
|
||||
description="Photo type classifications for ride model images",
|
||||
metadata={'domain': 'rides', 'type': 'photo_type', 'applies_to': 'ride_model_photos'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="spec_categories",
|
||||
choices=SPEC_CATEGORIES,
|
||||
domain="rides",
|
||||
description="Technical specification category classifications",
|
||||
metadata={'domain': 'rides', 'type': 'spec_category', 'applies_to': 'ride_model_specs'}
|
||||
)
|
||||
|
||||
register_choices(
|
||||
name="company_roles",
|
||||
choices=RIDES_COMPANY_ROLES,
|
||||
domain="rides",
|
||||
description="Company role classifications for rides domain (MANUFACTURER and DESIGNER only)",
|
||||
metadata={'domain': 'rides', 'type': 'company_role'}
|
||||
)
|
||||
|
||||
|
||||
# Auto-register choices when module is imported
|
||||
register_rides_choices()
|
||||
File diff suppressed because it is too large
Load Diff
@@ -1,96 +0,0 @@
|
||||
import pghistory
|
||||
from django.contrib.postgres.fields import ArrayField
|
||||
from django.db import models
|
||||
from django.urls import reverse
|
||||
from django.utils.text import slugify
|
||||
from django.conf import settings
|
||||
|
||||
from apps.core.history import HistoricalSlug
|
||||
from apps.core.models import TrackedModel
|
||||
from apps.core.choices.fields import RichChoiceField
|
||||
|
||||
|
||||
@pghistory.track()
|
||||
class Company(TrackedModel):
|
||||
name = models.CharField(max_length=255)
|
||||
slug = models.SlugField(max_length=255, unique=True)
|
||||
roles = ArrayField(
|
||||
RichChoiceField(choice_group="company_roles", domain="rides", max_length=20),
|
||||
default=list,
|
||||
blank=True,
|
||||
)
|
||||
description = models.TextField(blank=True)
|
||||
website = models.URLField(blank=True)
|
||||
|
||||
# General company info
|
||||
founded_date = models.DateField(null=True, blank=True)
|
||||
|
||||
# Manufacturer-specific fields
|
||||
rides_count = models.IntegerField(default=0)
|
||||
coasters_count = models.IntegerField(default=0)
|
||||
|
||||
# Frontend URL
|
||||
url = models.URLField(blank=True, help_text="Frontend URL for this company")
|
||||
|
||||
def __str__(self):
|
||||
return self.name
|
||||
|
||||
def save(self, *args, **kwargs):
|
||||
if not self.slug:
|
||||
self.slug = slugify(self.name)
|
||||
|
||||
# Generate frontend URL based on primary role
|
||||
# CRITICAL: Only MANUFACTURER and DESIGNER are for rides domain
|
||||
# OPERATOR and PROPERTY_OWNER are for parks domain and handled separately
|
||||
if self.roles:
|
||||
frontend_domain = getattr(
|
||||
settings, "FRONTEND_DOMAIN", "https://thrillwiki.com"
|
||||
)
|
||||
primary_role = self.roles[0] # Use first role as primary
|
||||
|
||||
if primary_role == "MANUFACTURER":
|
||||
self.url = f"{frontend_domain}/rides/manufacturers/{self.slug}/"
|
||||
elif primary_role == "DESIGNER":
|
||||
self.url = f"{frontend_domain}/rides/designers/{self.slug}/"
|
||||
# OPERATOR and PROPERTY_OWNER URLs are handled by parks domain, not here
|
||||
|
||||
super().save(*args, **kwargs)
|
||||
|
||||
def get_absolute_url(self):
|
||||
# This will need to be updated to handle different roles
|
||||
return reverse("companies:detail", kwargs={"slug": self.slug})
|
||||
|
||||
@classmethod
|
||||
def get_by_slug(cls, slug):
|
||||
"""Get company by current or historical slug"""
|
||||
try:
|
||||
return cls.objects.get(slug=slug), False
|
||||
except cls.DoesNotExist:
|
||||
# Check pghistory first
|
||||
try:
|
||||
from django.apps import apps
|
||||
history_model = apps.get_model('rides', f'{cls.__name__}Event')
|
||||
history_entry = (
|
||||
history_model.objects.filter(slug=slug)
|
||||
.order_by("-pgh_created_at")
|
||||
.first()
|
||||
)
|
||||
if history_entry:
|
||||
return cls.objects.get(id=history_entry.pgh_obj_id), True
|
||||
except LookupError:
|
||||
# History model doesn't exist, skip pghistory check
|
||||
pass
|
||||
|
||||
# Check manual slug history as fallback
|
||||
try:
|
||||
historical = HistoricalSlug.objects.get(
|
||||
content_type__model="company", slug=slug
|
||||
)
|
||||
return cls.objects.get(pk=historical.object_id), True
|
||||
except (HistoricalSlug.DoesNotExist, cls.DoesNotExist):
|
||||
raise cls.DoesNotExist("No company found with this slug")
|
||||
|
||||
class Meta(TrackedModel.Meta):
|
||||
app_label = "rides"
|
||||
ordering = ["name"]
|
||||
verbose_name_plural = "Companies"
|
||||
@@ -1,4 +0,0 @@
|
||||
from .location_service import RideLocationService
|
||||
from .media_service import RideMediaService
|
||||
|
||||
__all__ = ["RideLocationService", "RideMediaService"]
|
||||
@@ -1,784 +0,0 @@
|
||||
"""
|
||||
Smart Ride Loader for Hybrid Filtering Strategy
|
||||
|
||||
This service implements intelligent data loading for rides, automatically choosing
|
||||
between client-side and server-side filtering based on data size and complexity.
|
||||
|
||||
Key Features:
|
||||
- Automatic strategy selection (≤200 records = client-side, >200 = server-side)
|
||||
- Progressive loading for large datasets
|
||||
- Intelligent caching with automatic invalidation
|
||||
- Comprehensive filter metadata generation
|
||||
- Optimized database queries with strategic prefetching
|
||||
|
||||
Architecture:
|
||||
- Client-side: Load all data once, filter in frontend
|
||||
- Server-side: Apply filters in database, paginate results
|
||||
- Hybrid: Combine both approaches based on data characteristics
|
||||
"""
|
||||
|
||||
from typing import Dict, List, Any, Optional
|
||||
from django.core.cache import cache
|
||||
from django.db import models
|
||||
from django.db.models import Q, Min, Max
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class SmartRideLoader:
|
||||
"""
|
||||
Intelligent ride data loader that chooses optimal filtering strategy.
|
||||
|
||||
Strategy Selection:
|
||||
- ≤200 total records: Client-side filtering (load all data)
|
||||
- >200 total records: Server-side filtering (database filtering + pagination)
|
||||
|
||||
Features:
|
||||
- Progressive loading for large datasets
|
||||
- 5-minute intelligent caching
|
||||
- Comprehensive filter metadata
|
||||
- Optimized queries with prefetch_related
|
||||
"""
|
||||
|
||||
# Configuration constants
|
||||
INITIAL_LOAD_SIZE = 50
|
||||
PROGRESSIVE_LOAD_SIZE = 25
|
||||
MAX_CLIENT_SIDE_RECORDS = 200
|
||||
CACHE_TIMEOUT = 300 # 5 minutes
|
||||
|
||||
def __init__(self):
|
||||
self.cache_prefix = "rides_hybrid_"
|
||||
|
||||
def get_initial_load(self, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""
|
||||
Get initial data load with automatic strategy selection.
|
||||
|
||||
Args:
|
||||
filters: Optional filter parameters
|
||||
|
||||
Returns:
|
||||
Dict containing:
|
||||
- strategy: 'client_side' or 'server_side'
|
||||
- data: List of ride records
|
||||
- total_count: Total number of records
|
||||
- has_more: Whether more data is available
|
||||
- filter_metadata: Available filter options
|
||||
"""
|
||||
|
||||
# Get total count for strategy decision
|
||||
total_count = self._get_total_count(filters)
|
||||
|
||||
# Choose strategy based on total count
|
||||
if total_count <= self.MAX_CLIENT_SIDE_RECORDS:
|
||||
return self._get_client_side_data(filters, total_count)
|
||||
else:
|
||||
return self._get_server_side_data(filters, total_count)
|
||||
|
||||
def get_progressive_load(self, offset: int, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""
|
||||
Get additional data for progressive loading (server-side strategy only).
|
||||
|
||||
Args:
|
||||
offset: Number of records to skip
|
||||
filters: Filter parameters
|
||||
|
||||
Returns:
|
||||
Dict containing additional ride records
|
||||
"""
|
||||
|
||||
# Build queryset with filters
|
||||
queryset = self._build_filtered_queryset(filters)
|
||||
|
||||
# Get total count for this filtered set
|
||||
total_count = queryset.count()
|
||||
|
||||
# Get progressive batch
|
||||
rides = list(queryset[offset:offset + self.PROGRESSIVE_LOAD_SIZE])
|
||||
|
||||
return {
|
||||
'rides': self._serialize_rides(rides),
|
||||
'total_count': total_count,
|
||||
'has_more': len(rides) == self.PROGRESSIVE_LOAD_SIZE,
|
||||
'next_offset': offset + len(rides) if len(rides) == self.PROGRESSIVE_LOAD_SIZE else None
|
||||
}
|
||||
|
||||
def get_filter_metadata(self, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""
|
||||
Get comprehensive filter metadata for dynamic filter generation.
|
||||
|
||||
Args:
|
||||
filters: Optional filters to scope the metadata
|
||||
|
||||
Returns:
|
||||
Dict containing all available filter options and ranges
|
||||
"""
|
||||
cache_key = f"{self.cache_prefix}filter_metadata_{hash(str(filters))}"
|
||||
metadata = cache.get(cache_key)
|
||||
|
||||
if metadata is None:
|
||||
metadata = self._generate_filter_metadata(filters)
|
||||
cache.set(cache_key, metadata, self.CACHE_TIMEOUT)
|
||||
|
||||
return metadata
|
||||
|
||||
def invalidate_cache(self) -> None:
|
||||
"""Invalidate all cached data for rides."""
|
||||
# Note: In production, you might want to use cache versioning
|
||||
# or more sophisticated cache invalidation
|
||||
cache_keys = [
|
||||
f"{self.cache_prefix}client_side_all",
|
||||
f"{self.cache_prefix}filter_metadata",
|
||||
f"{self.cache_prefix}total_count",
|
||||
]
|
||||
|
||||
for key in cache_keys:
|
||||
cache.delete(key)
|
||||
|
||||
def _get_total_count(self, filters: Optional[Dict[str, Any]] = None) -> int:
|
||||
"""Get total count of rides matching filters."""
|
||||
cache_key = f"{self.cache_prefix}total_count_{hash(str(filters))}"
|
||||
count = cache.get(cache_key)
|
||||
|
||||
if count is None:
|
||||
queryset = self._build_filtered_queryset(filters)
|
||||
count = queryset.count()
|
||||
cache.set(cache_key, count, self.CACHE_TIMEOUT)
|
||||
|
||||
return count
|
||||
|
||||
def _get_client_side_data(self, filters: Optional[Dict[str, Any]],
|
||||
total_count: int) -> Dict[str, Any]:
|
||||
"""Get all data for client-side filtering."""
|
||||
cache_key = f"{self.cache_prefix}client_side_all"
|
||||
cached_data = cache.get(cache_key)
|
||||
|
||||
if cached_data is None:
|
||||
from apps.rides.models import Ride
|
||||
|
||||
# Load all rides with optimized query
|
||||
queryset = Ride.objects.select_related(
|
||||
'park',
|
||||
'park__location',
|
||||
'park_area',
|
||||
'manufacturer',
|
||||
'designer',
|
||||
'ride_model',
|
||||
'ride_model__manufacturer'
|
||||
).prefetch_related(
|
||||
'coaster_stats'
|
||||
).order_by('name')
|
||||
|
||||
rides = list(queryset)
|
||||
cached_data = self._serialize_rides(rides)
|
||||
cache.set(cache_key, cached_data, self.CACHE_TIMEOUT)
|
||||
|
||||
return {
|
||||
'strategy': 'client_side',
|
||||
'rides': cached_data,
|
||||
'total_count': total_count,
|
||||
'has_more': False,
|
||||
'filter_metadata': self.get_filter_metadata(filters)
|
||||
}
|
||||
|
||||
def _get_server_side_data(self, filters: Optional[Dict[str, Any]],
|
||||
total_count: int) -> Dict[str, Any]:
|
||||
"""Get initial batch for server-side filtering."""
|
||||
# Build filtered queryset
|
||||
queryset = self._build_filtered_queryset(filters)
|
||||
|
||||
# Get initial batch
|
||||
rides = list(queryset[:self.INITIAL_LOAD_SIZE])
|
||||
|
||||
return {
|
||||
'strategy': 'server_side',
|
||||
'rides': self._serialize_rides(rides),
|
||||
'total_count': total_count,
|
||||
'has_more': len(rides) == self.INITIAL_LOAD_SIZE,
|
||||
'next_offset': len(rides) if len(rides) == self.INITIAL_LOAD_SIZE else None
|
||||
}
|
||||
|
||||
def _build_filtered_queryset(self, filters: Optional[Dict[str, Any]]):
|
||||
"""Build Django queryset with applied filters."""
|
||||
from apps.rides.models import Ride
|
||||
|
||||
# Start with optimized base queryset
|
||||
queryset = Ride.objects.select_related(
|
||||
'park',
|
||||
'park__location',
|
||||
'park_area',
|
||||
'manufacturer',
|
||||
'designer',
|
||||
'ride_model',
|
||||
'ride_model__manufacturer'
|
||||
).prefetch_related(
|
||||
'coaster_stats'
|
||||
)
|
||||
|
||||
if not filters:
|
||||
return queryset.order_by('name')
|
||||
|
||||
# Apply filters
|
||||
q_objects = Q()
|
||||
|
||||
# Text search using computed search_text field
|
||||
if 'search' in filters and filters['search']:
|
||||
search_term = filters['search'].lower()
|
||||
q_objects &= Q(search_text__icontains=search_term)
|
||||
|
||||
# Park filters
|
||||
if 'park_slug' in filters and filters['park_slug']:
|
||||
q_objects &= Q(park__slug=filters['park_slug'])
|
||||
|
||||
if 'park_id' in filters and filters['park_id']:
|
||||
q_objects &= Q(park_id=filters['park_id'])
|
||||
|
||||
# Category filters
|
||||
if 'category' in filters and filters['category']:
|
||||
q_objects &= Q(category__in=filters['category'])
|
||||
|
||||
# Status filters
|
||||
if 'status' in filters and filters['status']:
|
||||
q_objects &= Q(status__in=filters['status'])
|
||||
|
||||
# Company filters
|
||||
if 'manufacturer_ids' in filters and filters['manufacturer_ids']:
|
||||
q_objects &= Q(manufacturer_id__in=filters['manufacturer_ids'])
|
||||
|
||||
if 'designer_ids' in filters and filters['designer_ids']:
|
||||
q_objects &= Q(designer_id__in=filters['designer_ids'])
|
||||
|
||||
# Ride model filters
|
||||
if 'ride_model_ids' in filters and filters['ride_model_ids']:
|
||||
q_objects &= Q(ride_model_id__in=filters['ride_model_ids'])
|
||||
|
||||
# Opening year filters using computed opening_year field
|
||||
if 'opening_year' in filters and filters['opening_year']:
|
||||
q_objects &= Q(opening_year=filters['opening_year'])
|
||||
|
||||
if 'min_opening_year' in filters and filters['min_opening_year']:
|
||||
q_objects &= Q(opening_year__gte=filters['min_opening_year'])
|
||||
|
||||
if 'max_opening_year' in filters and filters['max_opening_year']:
|
||||
q_objects &= Q(opening_year__lte=filters['max_opening_year'])
|
||||
|
||||
# Rating filters
|
||||
if 'min_rating' in filters and filters['min_rating']:
|
||||
q_objects &= Q(average_rating__gte=filters['min_rating'])
|
||||
|
||||
if 'max_rating' in filters and filters['max_rating']:
|
||||
q_objects &= Q(average_rating__lte=filters['max_rating'])
|
||||
|
||||
# Height requirement filters
|
||||
if 'min_height_requirement' in filters and filters['min_height_requirement']:
|
||||
q_objects &= Q(min_height_in__gte=filters['min_height_requirement'])
|
||||
|
||||
if 'max_height_requirement' in filters and filters['max_height_requirement']:
|
||||
q_objects &= Q(max_height_in__lte=filters['max_height_requirement'])
|
||||
|
||||
# Capacity filters
|
||||
if 'min_capacity' in filters and filters['min_capacity']:
|
||||
q_objects &= Q(capacity_per_hour__gte=filters['min_capacity'])
|
||||
|
||||
if 'max_capacity' in filters and filters['max_capacity']:
|
||||
q_objects &= Q(capacity_per_hour__lte=filters['max_capacity'])
|
||||
|
||||
# Roller coaster specific filters
|
||||
if 'roller_coaster_type' in filters and filters['roller_coaster_type']:
|
||||
q_objects &= Q(coaster_stats__roller_coaster_type__in=filters['roller_coaster_type'])
|
||||
|
||||
if 'track_material' in filters and filters['track_material']:
|
||||
q_objects &= Q(coaster_stats__track_material__in=filters['track_material'])
|
||||
|
||||
if 'propulsion_system' in filters and filters['propulsion_system']:
|
||||
q_objects &= Q(coaster_stats__propulsion_system__in=filters['propulsion_system'])
|
||||
|
||||
# Roller coaster height filters
|
||||
if 'min_height_ft' in filters and filters['min_height_ft']:
|
||||
q_objects &= Q(coaster_stats__height_ft__gte=filters['min_height_ft'])
|
||||
|
||||
if 'max_height_ft' in filters and filters['max_height_ft']:
|
||||
q_objects &= Q(coaster_stats__height_ft__lte=filters['max_height_ft'])
|
||||
|
||||
# Roller coaster speed filters
|
||||
if 'min_speed_mph' in filters and filters['min_speed_mph']:
|
||||
q_objects &= Q(coaster_stats__speed_mph__gte=filters['min_speed_mph'])
|
||||
|
||||
if 'max_speed_mph' in filters and filters['max_speed_mph']:
|
||||
q_objects &= Q(coaster_stats__speed_mph__lte=filters['max_speed_mph'])
|
||||
|
||||
# Inversion filters
|
||||
if 'min_inversions' in filters and filters['min_inversions']:
|
||||
q_objects &= Q(coaster_stats__inversions__gte=filters['min_inversions'])
|
||||
|
||||
if 'max_inversions' in filters and filters['max_inversions']:
|
||||
q_objects &= Q(coaster_stats__inversions__lte=filters['max_inversions'])
|
||||
|
||||
if 'has_inversions' in filters and filters['has_inversions'] is not None:
|
||||
if filters['has_inversions']:
|
||||
q_objects &= Q(coaster_stats__inversions__gt=0)
|
||||
else:
|
||||
q_objects &= Q(coaster_stats__inversions=0)
|
||||
|
||||
# Apply filters and ordering
|
||||
queryset = queryset.filter(q_objects)
|
||||
|
||||
# Apply ordering
|
||||
ordering = filters.get('ordering', 'name')
|
||||
if ordering in ['height_ft', '-height_ft', 'speed_mph', '-speed_mph']:
|
||||
# For coaster stats ordering, we need to join and order by the stats
|
||||
ordering_field = ordering.replace('height_ft', 'coaster_stats__height_ft').replace('speed_mph', 'coaster_stats__speed_mph')
|
||||
queryset = queryset.order_by(ordering_field)
|
||||
else:
|
||||
queryset = queryset.order_by(ordering)
|
||||
|
||||
return queryset
|
||||
|
||||
def _serialize_rides(self, rides: List) -> List[Dict[str, Any]]:
|
||||
"""Serialize ride objects to dictionaries."""
|
||||
serialized = []
|
||||
|
||||
for ride in rides:
|
||||
# Basic ride data
|
||||
ride_data = {
|
||||
'id': ride.id,
|
||||
'name': ride.name,
|
||||
'slug': ride.slug,
|
||||
'description': ride.description,
|
||||
'category': ride.category,
|
||||
'status': ride.status,
|
||||
'opening_date': ride.opening_date.isoformat() if ride.opening_date else None,
|
||||
'closing_date': ride.closing_date.isoformat() if ride.closing_date else None,
|
||||
'opening_year': ride.opening_year,
|
||||
'min_height_in': ride.min_height_in,
|
||||
'max_height_in': ride.max_height_in,
|
||||
'capacity_per_hour': ride.capacity_per_hour,
|
||||
'ride_duration_seconds': ride.ride_duration_seconds,
|
||||
'average_rating': float(ride.average_rating) if ride.average_rating else None,
|
||||
'url': ride.url,
|
||||
'park_url': ride.park_url,
|
||||
'created_at': ride.created_at.isoformat(),
|
||||
'updated_at': ride.updated_at.isoformat(),
|
||||
}
|
||||
|
||||
# Park data
|
||||
if ride.park:
|
||||
ride_data['park'] = {
|
||||
'id': ride.park.id,
|
||||
'name': ride.park.name,
|
||||
'slug': ride.park.slug,
|
||||
}
|
||||
|
||||
# Park location data
|
||||
if hasattr(ride.park, 'location') and ride.park.location:
|
||||
ride_data['park']['location'] = {
|
||||
'city': ride.park.location.city,
|
||||
'state': ride.park.location.state,
|
||||
'country': ride.park.location.country,
|
||||
}
|
||||
|
||||
# Park area data
|
||||
if ride.park_area:
|
||||
ride_data['park_area'] = {
|
||||
'id': ride.park_area.id,
|
||||
'name': ride.park_area.name,
|
||||
'slug': ride.park_area.slug,
|
||||
}
|
||||
|
||||
# Company data
|
||||
if ride.manufacturer:
|
||||
ride_data['manufacturer'] = {
|
||||
'id': ride.manufacturer.id,
|
||||
'name': ride.manufacturer.name,
|
||||
'slug': ride.manufacturer.slug,
|
||||
}
|
||||
|
||||
if ride.designer:
|
||||
ride_data['designer'] = {
|
||||
'id': ride.designer.id,
|
||||
'name': ride.designer.name,
|
||||
'slug': ride.designer.slug,
|
||||
}
|
||||
|
||||
# Ride model data
|
||||
if ride.ride_model:
|
||||
ride_data['ride_model'] = {
|
||||
'id': ride.ride_model.id,
|
||||
'name': ride.ride_model.name,
|
||||
'slug': ride.ride_model.slug,
|
||||
'category': ride.ride_model.category,
|
||||
}
|
||||
|
||||
if ride.ride_model.manufacturer:
|
||||
ride_data['ride_model']['manufacturer'] = {
|
||||
'id': ride.ride_model.manufacturer.id,
|
||||
'name': ride.ride_model.manufacturer.name,
|
||||
'slug': ride.ride_model.manufacturer.slug,
|
||||
}
|
||||
|
||||
# Roller coaster stats
|
||||
if hasattr(ride, 'coaster_stats') and ride.coaster_stats:
|
||||
stats = ride.coaster_stats
|
||||
ride_data['coaster_stats'] = {
|
||||
'height_ft': float(stats.height_ft) if stats.height_ft else None,
|
||||
'length_ft': float(stats.length_ft) if stats.length_ft else None,
|
||||
'speed_mph': float(stats.speed_mph) if stats.speed_mph else None,
|
||||
'inversions': stats.inversions,
|
||||
'ride_time_seconds': stats.ride_time_seconds,
|
||||
'track_type': stats.track_type,
|
||||
'track_material': stats.track_material,
|
||||
'roller_coaster_type': stats.roller_coaster_type,
|
||||
'max_drop_height_ft': float(stats.max_drop_height_ft) if stats.max_drop_height_ft else None,
|
||||
'propulsion_system': stats.propulsion_system,
|
||||
'train_style': stats.train_style,
|
||||
'trains_count': stats.trains_count,
|
||||
'cars_per_train': stats.cars_per_train,
|
||||
'seats_per_car': stats.seats_per_car,
|
||||
}
|
||||
|
||||
serialized.append(ride_data)
|
||||
|
||||
return serialized
|
||||
|
||||
def _generate_filter_metadata(self, filters: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
|
||||
"""Generate comprehensive filter metadata."""
|
||||
from apps.rides.models import Ride, RideModel
|
||||
from apps.rides.models.company import Company
|
||||
from apps.rides.models.rides import RollerCoasterStats
|
||||
|
||||
# Get unique values from database with counts
|
||||
parks_data = list(Ride.objects.exclude(
|
||||
park__isnull=True
|
||||
).select_related('park').values(
|
||||
'park__id', 'park__name', 'park__slug'
|
||||
).annotate(count=models.Count('id')).distinct().order_by('park__name'))
|
||||
|
||||
park_areas_data = list(Ride.objects.exclude(
|
||||
park_area__isnull=True
|
||||
).select_related('park_area').values(
|
||||
'park_area__id', 'park_area__name', 'park_area__slug'
|
||||
).annotate(count=models.Count('id')).distinct().order_by('park_area__name'))
|
||||
|
||||
manufacturers_data = list(Company.objects.filter(
|
||||
roles__contains=['MANUFACTURER']
|
||||
).values('id', 'name', 'slug').annotate(
|
||||
count=models.Count('manufactured_rides')
|
||||
).order_by('name'))
|
||||
|
||||
designers_data = list(Company.objects.filter(
|
||||
roles__contains=['DESIGNER']
|
||||
).values('id', 'name', 'slug').annotate(
|
||||
count=models.Count('designed_rides')
|
||||
).order_by('name'))
|
||||
|
||||
ride_models_data = list(RideModel.objects.select_related(
|
||||
'manufacturer'
|
||||
).values(
|
||||
'id', 'name', 'slug', 'manufacturer__name', 'manufacturer__slug', 'category'
|
||||
).annotate(count=models.Count('rides')).order_by('manufacturer__name', 'name'))
|
||||
|
||||
# Get categories and statuses with counts
|
||||
categories_data = list(Ride.objects.values('category').annotate(
|
||||
count=models.Count('id')
|
||||
).order_by('category'))
|
||||
|
||||
statuses_data = list(Ride.objects.values('status').annotate(
|
||||
count=models.Count('id')
|
||||
).order_by('status'))
|
||||
|
||||
# Get roller coaster specific data with counts
|
||||
rc_types_data = list(RollerCoasterStats.objects.values('roller_coaster_type').annotate(
|
||||
count=models.Count('ride')
|
||||
).exclude(roller_coaster_type__isnull=True).order_by('roller_coaster_type'))
|
||||
|
||||
track_materials_data = list(RollerCoasterStats.objects.values('track_material').annotate(
|
||||
count=models.Count('ride')
|
||||
).exclude(track_material__isnull=True).order_by('track_material'))
|
||||
|
||||
propulsion_systems_data = list(RollerCoasterStats.objects.values('propulsion_system').annotate(
|
||||
count=models.Count('ride')
|
||||
).exclude(propulsion_system__isnull=True).order_by('propulsion_system'))
|
||||
|
||||
# Convert to frontend-expected format with value/label/count
|
||||
categories = [
|
||||
{
|
||||
'value': item['category'],
|
||||
'label': self._get_category_label(item['category']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in categories_data
|
||||
]
|
||||
|
||||
statuses = [
|
||||
{
|
||||
'value': item['status'],
|
||||
'label': self._get_status_label(item['status']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in statuses_data
|
||||
]
|
||||
|
||||
roller_coaster_types = [
|
||||
{
|
||||
'value': item['roller_coaster_type'],
|
||||
'label': self._get_rc_type_label(item['roller_coaster_type']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in rc_types_data
|
||||
]
|
||||
|
||||
track_materials = [
|
||||
{
|
||||
'value': item['track_material'],
|
||||
'label': self._get_track_material_label(item['track_material']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in track_materials_data
|
||||
]
|
||||
|
||||
propulsion_systems = [
|
||||
{
|
||||
'value': item['propulsion_system'],
|
||||
'label': self._get_propulsion_system_label(item['propulsion_system']),
|
||||
'count': item['count']
|
||||
}
|
||||
for item in propulsion_systems_data
|
||||
]
|
||||
|
||||
# Convert other data to expected format
|
||||
parks = [
|
||||
{
|
||||
'value': str(item['park__id']),
|
||||
'label': item['park__name'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in parks_data
|
||||
]
|
||||
|
||||
park_areas = [
|
||||
{
|
||||
'value': str(item['park_area__id']),
|
||||
'label': item['park_area__name'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in park_areas_data
|
||||
]
|
||||
|
||||
manufacturers = [
|
||||
{
|
||||
'value': str(item['id']),
|
||||
'label': item['name'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in manufacturers_data
|
||||
]
|
||||
|
||||
designers = [
|
||||
{
|
||||
'value': str(item['id']),
|
||||
'label': item['name'],
|
||||
'count': item['count']
|
||||
}
|
||||
for item in designers_data
|
||||
]
|
||||
|
||||
ride_models = [
|
||||
{
|
||||
'value': str(item['id']),
|
||||
'label': f"{item['manufacturer__name']} {item['name']}",
|
||||
'count': item['count']
|
||||
}
|
||||
for item in ride_models_data
|
||||
]
|
||||
|
||||
# Calculate ranges from actual data
|
||||
ride_stats = Ride.objects.aggregate(
|
||||
min_rating=Min('average_rating'),
|
||||
max_rating=Max('average_rating'),
|
||||
min_height_req=Min('min_height_in'),
|
||||
max_height_req=Max('max_height_in'),
|
||||
min_capacity=Min('capacity_per_hour'),
|
||||
max_capacity=Max('capacity_per_hour'),
|
||||
min_duration=Min('ride_duration_seconds'),
|
||||
max_duration=Max('ride_duration_seconds'),
|
||||
min_year=Min('opening_year'),
|
||||
max_year=Max('opening_year'),
|
||||
)
|
||||
|
||||
# Calculate roller coaster specific ranges
|
||||
coaster_stats = RollerCoasterStats.objects.aggregate(
|
||||
min_height_ft=Min('height_ft'),
|
||||
max_height_ft=Max('height_ft'),
|
||||
min_length_ft=Min('length_ft'),
|
||||
max_length_ft=Max('length_ft'),
|
||||
min_speed_mph=Min('speed_mph'),
|
||||
max_speed_mph=Max('speed_mph'),
|
||||
min_inversions=Min('inversions'),
|
||||
max_inversions=Max('inversions'),
|
||||
min_ride_time=Min('ride_time_seconds'),
|
||||
max_ride_time=Max('ride_time_seconds'),
|
||||
min_drop_height=Min('max_drop_height_ft'),
|
||||
max_drop_height=Max('max_drop_height_ft'),
|
||||
min_trains=Min('trains_count'),
|
||||
max_trains=Max('trains_count'),
|
||||
min_cars=Min('cars_per_train'),
|
||||
max_cars=Max('cars_per_train'),
|
||||
min_seats=Min('seats_per_car'),
|
||||
max_seats=Max('seats_per_car'),
|
||||
)
|
||||
|
||||
return {
|
||||
'categorical': {
|
||||
'categories': categories,
|
||||
'statuses': statuses,
|
||||
'roller_coaster_types': roller_coaster_types,
|
||||
'track_materials': track_materials,
|
||||
'propulsion_systems': propulsion_systems,
|
||||
'parks': parks,
|
||||
'park_areas': park_areas,
|
||||
'manufacturers': manufacturers,
|
||||
'designers': designers,
|
||||
'ride_models': ride_models,
|
||||
},
|
||||
'ranges': {
|
||||
'rating': {
|
||||
'min': float(ride_stats['min_rating'] or 1),
|
||||
'max': float(ride_stats['max_rating'] or 10),
|
||||
'step': 0.1,
|
||||
'unit': 'stars'
|
||||
},
|
||||
'height_requirement': {
|
||||
'min': ride_stats['min_height_req'] or 30,
|
||||
'max': ride_stats['max_height_req'] or 90,
|
||||
'step': 1,
|
||||
'unit': 'inches'
|
||||
},
|
||||
'capacity': {
|
||||
'min': ride_stats['min_capacity'] or 0,
|
||||
'max': ride_stats['max_capacity'] or 5000,
|
||||
'step': 50,
|
||||
'unit': 'riders/hour'
|
||||
},
|
||||
'ride_duration': {
|
||||
'min': ride_stats['min_duration'] or 0,
|
||||
'max': ride_stats['max_duration'] or 600,
|
||||
'step': 10,
|
||||
'unit': 'seconds'
|
||||
},
|
||||
'opening_year': {
|
||||
'min': ride_stats['min_year'] or 1800,
|
||||
'max': ride_stats['max_year'] or 2030,
|
||||
'step': 1,
|
||||
'unit': 'year'
|
||||
},
|
||||
'height_ft': {
|
||||
'min': float(coaster_stats['min_height_ft'] or 0),
|
||||
'max': float(coaster_stats['max_height_ft'] or 500),
|
||||
'step': 5,
|
||||
'unit': 'feet'
|
||||
},
|
||||
'length_ft': {
|
||||
'min': float(coaster_stats['min_length_ft'] or 0),
|
||||
'max': float(coaster_stats['max_length_ft'] or 10000),
|
||||
'step': 100,
|
||||
'unit': 'feet'
|
||||
},
|
||||
'speed_mph': {
|
||||
'min': float(coaster_stats['min_speed_mph'] or 0),
|
||||
'max': float(coaster_stats['max_speed_mph'] or 150),
|
||||
'step': 5,
|
||||
'unit': 'mph'
|
||||
},
|
||||
'inversions': {
|
||||
'min': coaster_stats['min_inversions'] or 0,
|
||||
'max': coaster_stats['max_inversions'] or 20,
|
||||
'step': 1,
|
||||
'unit': 'inversions'
|
||||
},
|
||||
},
|
||||
'total_count': Ride.objects.count(),
|
||||
}
|
||||
|
||||
def _get_category_label(self, category: str) -> str:
|
||||
"""Convert category code to human-readable label."""
|
||||
category_labels = {
|
||||
'RC': 'Roller Coaster',
|
||||
'DR': 'Dark Ride',
|
||||
'FR': 'Flat Ride',
|
||||
'WR': 'Water Ride',
|
||||
'TR': 'Transport Ride',
|
||||
'OT': 'Other',
|
||||
}
|
||||
if category in category_labels:
|
||||
return category_labels[category]
|
||||
else:
|
||||
raise ValueError(f"Unknown ride category: {category}")
|
||||
|
||||
def _get_status_label(self, status: str) -> str:
|
||||
"""Convert status code to human-readable label."""
|
||||
status_labels = {
|
||||
'OPERATING': 'Operating',
|
||||
'CLOSED_TEMP': 'Temporarily Closed',
|
||||
'SBNO': 'Standing But Not Operating',
|
||||
'CLOSING': 'Closing Soon',
|
||||
'CLOSED_PERM': 'Permanently Closed',
|
||||
'UNDER_CONSTRUCTION': 'Under Construction',
|
||||
'DEMOLISHED': 'Demolished',
|
||||
'RELOCATED': 'Relocated',
|
||||
}
|
||||
if status in status_labels:
|
||||
return status_labels[status]
|
||||
else:
|
||||
raise ValueError(f"Unknown ride status: {status}")
|
||||
|
||||
def _get_rc_type_label(self, rc_type: str) -> str:
|
||||
"""Convert roller coaster type to human-readable label."""
|
||||
rc_type_labels = {
|
||||
'SITDOWN': 'Sit Down',
|
||||
'INVERTED': 'Inverted',
|
||||
'SUSPENDED': 'Suspended',
|
||||
'FLOORLESS': 'Floorless',
|
||||
'FLYING': 'Flying',
|
||||
'WING': 'Wing',
|
||||
'DIVE': 'Dive',
|
||||
'SPINNING': 'Spinning',
|
||||
'WILD_MOUSE': 'Wild Mouse',
|
||||
'BOBSLED': 'Bobsled',
|
||||
'PIPELINE': 'Pipeline',
|
||||
'FOURTH_DIMENSION': '4th Dimension',
|
||||
'FAMILY': 'Family',
|
||||
}
|
||||
if rc_type in rc_type_labels:
|
||||
return rc_type_labels[rc_type]
|
||||
else:
|
||||
raise ValueError(f"Unknown roller coaster type: {rc_type}")
|
||||
|
||||
def _get_track_material_label(self, material: str) -> str:
|
||||
"""Convert track material to human-readable label."""
|
||||
material_labels = {
|
||||
'STEEL': 'Steel',
|
||||
'WOOD': 'Wood',
|
||||
'HYBRID': 'Hybrid (Steel/Wood)',
|
||||
}
|
||||
if material in material_labels:
|
||||
return material_labels[material]
|
||||
else:
|
||||
raise ValueError(f"Unknown track material: {material}")
|
||||
|
||||
def _get_propulsion_system_label(self, propulsion_system: str) -> str:
|
||||
"""Convert propulsion system to human-readable label."""
|
||||
propulsion_labels = {
|
||||
'CHAIN': 'Chain Lift',
|
||||
'LSM': 'Linear Synchronous Motor',
|
||||
'LIM': 'Linear Induction Motor',
|
||||
'HYDRAULIC': 'Hydraulic Launch',
|
||||
'PNEUMATIC': 'Pneumatic Launch',
|
||||
'CABLE': 'Cable Lift',
|
||||
'FLYWHEEL': 'Flywheel Launch',
|
||||
'GRAVITY': 'Gravity',
|
||||
'NONE': 'No Propulsion System',
|
||||
}
|
||||
if propulsion_system in propulsion_labels:
|
||||
return propulsion_labels[propulsion_system]
|
||||
else:
|
||||
raise ValueError(f"Unknown propulsion system: {propulsion_system}")
|
||||
@@ -1,17 +0,0 @@
|
||||
from django.db.models.signals import pre_save
|
||||
from django.dispatch import receiver
|
||||
from django.utils import timezone
|
||||
from .models import Ride
|
||||
|
||||
|
||||
@receiver(pre_save, sender=Ride)
|
||||
def handle_ride_status(sender, instance, **kwargs):
|
||||
"""Handle ride status changes based on closing date"""
|
||||
if instance.closing_date:
|
||||
today = timezone.now().date()
|
||||
|
||||
# If we've reached the closing date and status is "Closing"
|
||||
if today >= instance.closing_date and instance.status == "CLOSING":
|
||||
# Change to the selected post-closing status
|
||||
instance.status = instance.post_closing_status or "SBNO"
|
||||
instance.status_since = instance.closing_date
|
||||
@@ -1 +0,0 @@
|
||||
# Create your tests here.
|
||||
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user