mirror of
https://github.com/pacnpal/thrillwiki_django_no_react.git
synced 2025-12-20 08:31:08 -05:00
feat: complete monorepo structure with frontend and shared resources
- Add complete backend/ directory with full Django application
- Add frontend/ directory with Vite + TypeScript setup ready for Next.js
- Add comprehensive shared/ directory with:
  - Complete documentation and memory-bank archives
  - Media files and avatars (letters, park/ride images)
  - Deployment scripts and automation tools
  - Shared types and utilities
- Add architecture/ directory with migration guides
- Configure pnpm workspace for monorepo development
- Update .gitignore to exclude .django_tailwind_cli/ build artifacts
- Preserve all historical documentation in shared/docs/memory-bank/
- Set up proper structure for full-stack development with shared resources
1 .gitignore vendored
@@ -114,3 +114,4 @@ temp/
 # Local development
 /uploads/
 /backups/
+.django_tailwind_cli/
372 architecture/architecture-validation.md Normal file
@@ -0,0 +1,372 @@
# ThrillWiki Monorepo Architecture Validation
|
||||||
|
|
||||||
|
This document provides a comprehensive review and validation of the proposed monorepo architecture for migrating ThrillWiki from Django-only to Django + Vue.js.
|
||||||
|
|
||||||
|
## Architecture Overview Validation
|
||||||
|
|
||||||
|
### ✅ Core Requirements Met
|
||||||
|
|
||||||
|
1. **Clean Separation of Concerns**
|
||||||
|
- Backend: Django API, business logic, database management
|
||||||
|
- Frontend: Vue.js SPA with modern tooling
|
||||||
|
- Shared: Common resources and media files
|
||||||
|
|
||||||
|
2. **Development Workflow Preservation**
|
||||||
|
- UV package management for Python maintained
|
||||||
|
- pnpm for Node.js package management
|
||||||
|
- Existing development scripts adapted
|
||||||
|
- Hot reloading for both backend and frontend
|
||||||
|
|
||||||
|
3. **Project Structure Compatibility**
|
||||||
|
- Django apps preserved under `backend/apps/`
|
||||||
|
- Configuration maintained under `backend/config/`
|
||||||
|
- Static files strategy clearly defined
|
||||||
|
- Media files centralized in `shared/media/`
|
||||||
|
|
||||||
|
## Technical Architecture Validation
|
||||||
|
|
||||||
|
### Backend Architecture ✅
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph TB
|
||||||
|
A[Django Backend] --> B[Apps Directory]
|
||||||
|
A --> C[Config Directory]
|
||||||
|
A --> D[Static Files]
|
||||||
|
|
||||||
|
B --> E[accounts]
|
||||||
|
B --> F[parks]
|
||||||
|
B --> G[rides]
|
||||||
|
B --> H[moderation]
|
||||||
|
B --> I[location]
|
||||||
|
B --> J[media]
|
||||||
|
B --> K[email_service]
|
||||||
|
B --> L[core]
|
||||||
|
|
||||||
|
C --> M[Django Settings]
|
||||||
|
C --> N[URL Configuration]
|
||||||
|
C --> O[WSGI/ASGI]
|
||||||
|
|
||||||
|
D --> P[Admin Assets]
|
||||||
|
D --> Q[Backend Static]
|
||||||
|
```
|
||||||
|
|
||||||
|
**Validation Points:**
|
||||||
|
- ✅ All 8 Django apps properly mapped to new structure
|
||||||
|
- ✅ Configuration files maintain their organization
|
||||||
|
- ✅ Static file handling preserves Django admin functionality
|
||||||
|
- ✅ UV package management integration maintained
|
||||||
|
|
||||||
|
### Frontend Architecture ✅
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph TB
|
||||||
|
A[Vue.js Frontend] --> B[Source Code]
|
||||||
|
A --> C[Build System]
|
||||||
|
A --> D[Development Tools]
|
||||||
|
|
||||||
|
B --> E[Components]
|
||||||
|
B --> F[Views/Pages]
|
||||||
|
B --> G[Router]
|
||||||
|
B --> H[State Management]
|
||||||
|
B --> I[API Layer]
|
||||||
|
|
||||||
|
C --> J[Vite]
|
||||||
|
C --> K[TypeScript]
|
||||||
|
C --> L[Tailwind CSS]
|
||||||
|
|
||||||
|
D --> M[Hot Reload]
|
||||||
|
D --> N[Dev Server]
|
||||||
|
D --> O[Build Tools]
|
||||||
|
```
|
||||||
|
|
||||||
|
**Validation Points:**
|
||||||
|
- ✅ Modern Vue.js 3 + Composition API
|
||||||
|
- ✅ TypeScript for type safety
|
||||||
|
- ✅ Vite for fast development and builds
|
||||||
|
- ✅ Tailwind CSS for styling (matching current setup)
|
||||||
|
- ✅ Pinia for state management
|
||||||
|
- ✅ Vue Router for SPA navigation
|
||||||
|
|
||||||
|
### Integration Architecture ✅
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph LR
|
||||||
|
A[Vue.js Frontend] --> B[HTTP API Calls]
|
||||||
|
B --> C[Django REST API]
|
||||||
|
C --> D[Database]
|
||||||
|
C --> E[Media Files]
|
||||||
|
E --> F[Shared Media Directory]
|
||||||
|
F --> G[Frontend Access]
|
||||||
|
```
|
||||||
|
|
||||||
|
**Validation Points:**
|
||||||
|
- ✅ RESTful API integration between frontend and backend
|
||||||
|
- ✅ Media files accessible to both systems
|
||||||
|
- ✅ Authentication handling via API tokens
|
||||||
|
- ✅ CORS configuration for cross-origin requests
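The token-based authentication and CORS points above imply backend configuration that the plan does not spell out. A minimal sketch, assuming `django-cors-headers` and Django REST Framework's `TokenAuthentication` are the tools chosen (neither package is confirmed by this document):

```python
# Sketch for backend/config/settings/security.py — package choices are assumptions.

CORS_ALLOWED_ORIGINS = [
    "http://localhost:3000",   # Vite dev server
    "https://yourdomain.com",  # production frontend
]

# Also added to the base settings:
#   INSTALLED_APPS += ["corsheaders", "rest_framework", "rest_framework.authtoken"]
#   MIDDLEWARE: insert "corsheaders.middleware.CorsMiddleware" ahead of CommonMiddleware

REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": [
        "rest_framework.authentication.TokenAuthentication",
    ],
    "DEFAULT_PERMISSION_CLASSES": [
        "rest_framework.permissions.IsAuthenticatedOrReadOnly",
    ],
}
```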
## File Migration Validation
|
||||||
|
|
||||||
|
### Critical File Mappings ✅
|
||||||
|
|
||||||
|
| Component | Current | New Location | Status |
|
||||||
|
|-----------|---------|--------------|--------|
|
||||||
|
| Django Apps | `/apps/` | `/backend/apps/` | ✅ Mapped |
|
||||||
|
| Configuration | `/config/` | `/backend/config/` | ✅ Mapped |
|
||||||
|
| Static Files | `/static/` | `/backend/static/` | ✅ Mapped |
|
||||||
|
| Media Files | `/media/` | `/shared/media/` | ✅ Mapped |
|
||||||
|
| Scripts | `/scripts/` | `/scripts/` | ✅ Preserved |
|
||||||
|
| Dependencies | `/pyproject.toml` | `/backend/pyproject.toml` | ✅ Mapped |
|
||||||
|
|
||||||
|
### Import Path Updates Required ✅
|
||||||
|
|
||||||
|
**Django Settings Updates:**
|
||||||
|
```python
|
||||||
|
# OLD
|
||||||
|
INSTALLED_APPS = [
|
||||||
|
'accounts',
|
||||||
|
'parks',
|
||||||
|
'rides',
|
||||||
|
# ...
|
||||||
|
]
|
||||||
|
|
||||||
|
# NEW
|
||||||
|
INSTALLED_APPS = [
|
||||||
|
'apps.accounts',
|
||||||
|
'apps.parks',
|
||||||
|
'apps.rides',
|
||||||
|
# ...
|
||||||
|
]
|
||||||
|
```
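
Moving each app under the `apps/` package also means each app's `AppConfig.name` must match the new dotted path, or Django will not find the app. A sketch for one app (the same change applies to all eight):

```python
# backend/apps/accounts/apps.py
from django.apps import AppConfig


class AccountsConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    # Full dotted path now that the app lives under apps/
    name = "apps.accounts"
    # Keep the original label so existing migrations and references
    # such as "accounts.User" continue to resolve.
    label = "accounts"
```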
**Media Path Updates:**
|
||||||
|
```python
|
||||||
|
# NEW
|
||||||
|
MEDIA_ROOT = BASE_DIR.parent / 'shared' / 'media'
|
||||||
|
```
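For the shared media directory to be reachable during development, the URL configuration also has to serve it. A minimal sketch of the standard Django pattern (production serving is handed to Nginx or a CDN, per the deployment guide):

```python
# backend/config/urls.py — DEBUG-only media serving sketch
from django.conf import settings
from django.conf.urls.static import static
from django.contrib import admin
from django.urls import path

urlpatterns = [
    path("admin/", admin.site.urls),
    # ... API routes ...
]

if settings.DEBUG:
    # Serve files from shared/media/ directly in development.
    urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
```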
## Development Workflow Validation
|
||||||
|
|
||||||
|
### Package Management ✅
|
||||||
|
|
||||||
|
**Backend (UV):**
|
||||||
|
- ✅ `uv add <package>` for new dependencies
|
||||||
|
- ✅ `uv run manage.py <command>` for Django commands
|
||||||
|
- ✅ `uv sync` for dependency installation
|
||||||
|
|
||||||
|
**Frontend (pnpm):**
|
||||||
|
- ✅ `pnpm add <package>` for new dependencies
|
||||||
|
- ✅ `pnpm install` for dependency installation
|
||||||
|
- ✅ `pnpm run dev` for development server
|
||||||
|
|
||||||
|
**Root Workspace:**
|
||||||
|
- ✅ `pnpm run dev` starts both servers concurrently
|
||||||
|
- ✅ Individual server commands available
|
||||||
|
- ✅ Build and test scripts coordinated
|
||||||
|
|
||||||
|
### Development Scripts ✅
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Root level coordination
|
||||||
|
pnpm run dev # Both servers
|
||||||
|
pnpm run backend:dev # Django only
|
||||||
|
pnpm run frontend:dev # Vue.js only
|
||||||
|
pnpm run build # Production build
|
||||||
|
pnpm run test # All tests
|
||||||
|
pnpm run lint # All linting
|
||||||
|
pnpm run format # Code formatting
|
||||||
|
```
|
||||||
|
|
||||||
|
## Deployment Strategy Validation
|
||||||
|
|
||||||
|
### Container Strategy ✅
|
||||||
|
|
||||||
|
**Multi-container Approach:**
|
||||||
|
- ✅ Separate containers for backend and frontend
|
||||||
|
- ✅ Shared volumes for media files
|
||||||
|
- ✅ Database and Redis containers
|
||||||
|
- ✅ Nginx reverse proxy configuration
|
||||||
|
|
||||||
|
**Build Process:**
|
||||||
|
- ✅ Backend: Django static collection + uv dependencies
|
||||||
|
- ✅ Frontend: Vite production build + asset optimization
|
||||||
|
- ✅ Shared: Media file persistence across deployments
|
||||||
|
|
||||||
|
### Platform Compatibility ✅
|
||||||
|
|
||||||
|
**Supported Deployment Platforms:**
|
||||||
|
- ✅ Docker Compose (local and production)
|
||||||
|
- ✅ Vercel (frontend + serverless backend)
|
||||||
|
- ✅ Railway (container deployment)
|
||||||
|
- ✅ DigitalOcean App Platform
|
||||||
|
- ✅ AWS ECS/Fargate
|
||||||
|
- ✅ Google Cloud Run
|
||||||
|
|
||||||
|
## Performance Considerations ✅
|
||||||
|
|
||||||
|
### Backend Optimization
|
||||||
|
- ✅ Database connection pooling
|
||||||
|
- ✅ Redis caching strategy
|
||||||
|
- ✅ Static file CDN integration
|
||||||
|
- ✅ API response optimization
|
||||||
|
|
||||||
|
### Frontend Optimization
|
||||||
|
- ✅ Code splitting and lazy loading
|
||||||
|
- ✅ Asset optimization with Vite
|
||||||
|
- ✅ Tree shaking for minimal bundle size
|
||||||
|
- ✅ Modern build targets
|
||||||
|
|
||||||
|
### Development Performance
|
||||||
|
- ✅ Hot module replacement for Vue.js
|
||||||
|
- ✅ Django auto-reload for backend changes
|
||||||
|
- ✅ Fast dependency installation with UV and pnpm
|
||||||
|
- ✅ Concurrent development servers
|
||||||
|
|
||||||
|
## Security Validation ✅
|
||||||
|
|
||||||
|
### Backend Security
|
||||||
|
- ✅ Django security middleware maintained
|
||||||
|
- ✅ CORS configuration for API access
|
||||||
|
- ✅ Authentication token management
|
||||||
|
- ✅ Input validation and sanitization
|
||||||
|
|
||||||
|
### Frontend Security
|
||||||
|
- ✅ Content Security Policy headers
|
||||||
|
- ✅ XSS protection mechanisms
|
||||||
|
- ✅ Secure API communication (HTTPS)
|
||||||
|
- ✅ Environment variable protection
|
||||||
|
|
||||||
|
### Deployment Security
|
||||||
|
- ✅ SSL/TLS termination
|
||||||
|
- ✅ Security headers configuration
|
||||||
|
- ✅ Secret management strategy
|
||||||
|
- ✅ Container security best practices
|
||||||
|
|
||||||
|
## Risk Assessment and Mitigation
|
||||||
|
|
||||||
|
### Low Risk Items ✅
|
||||||
|
- **File organization**: Clear mapping and systematic approach
|
||||||
|
- **Package management**: Both UV and pnpm are stable and well-supported
|
||||||
|
- **Development workflow**: Incremental changes to existing process
|
||||||
|
|
||||||
|
### Medium Risk Items ⚠️
|
||||||
|
- **Import path updates**: Requires careful testing of all Django apps
|
||||||
|
- **Static file handling**: Need to verify Django admin continues working
|
||||||
|
- **API integration**: New frontend-backend communication layer
|
||||||
|
|
||||||
|
**Mitigation Strategies:**
|
||||||
|
- Comprehensive testing suite for Django apps after migration
|
||||||
|
- Static file serving verification in development and production
|
||||||
|
- API endpoint testing and documentation
|
||||||
|
- Gradual migration approach with rollback capabilities
|
||||||
|
|
||||||
|
### High Risk Items 🔴
|
||||||
|
- **Data migration**: Database changes during restructuring
|
||||||
|
- **Production deployment**: New deployment process requires validation
|
||||||
|
|
||||||
|
**Mitigation Strategies:**
|
||||||
|
- Database backup before any structural changes
|
||||||
|
- Staging environment testing before production deployment
|
||||||
|
- Blue-green deployment strategy for zero-downtime migration
|
||||||
|
- Monitoring and alerting for post-migration issues
|
||||||
|
|
||||||
|
## Testing Strategy Validation
|
||||||
|
|
||||||
|
### Backend Testing ✅
|
||||||
|
```bash
|
||||||
|
# Django tests
|
||||||
|
cd backend
|
||||||
|
uv run manage.py test
|
||||||
|
|
||||||
|
# Code quality
|
||||||
|
uv run flake8 .
|
||||||
|
uv run black --check .
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend Testing ✅
|
||||||
|
```bash
|
||||||
|
# Vue.js tests
|
||||||
|
cd frontend
|
||||||
|
pnpm run test
|
||||||
|
pnpm run test:unit
|
||||||
|
pnpm run test:e2e
|
||||||
|
|
||||||
|
# Code quality
|
||||||
|
pnpm run lint
|
||||||
|
pnpm run type-check
|
||||||
|
```
|
||||||
|
|
||||||
|
### Integration Testing ✅
|
||||||
|
- API endpoint testing
|
||||||
|
- Frontend-backend communication testing
|
||||||
|
- Media file access testing
|
||||||
|
- Authentication flow testing
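A concrete shape for these integration checks, assuming Django REST Framework's test client is adopted (the endpoint paths are illustrative, not taken from the codebase):

```python
# backend/tests/test_api_integration.py — illustrative; URLs are assumptions
from django.contrib.auth import get_user_model
from rest_framework.authtoken.models import Token
from rest_framework.test import APITestCase


class ParkApiIntegrationTests(APITestCase):
    def setUp(self):
        user = get_user_model().objects.create_user("rider", password="pw")
        self.token = Token.objects.create(user=user)

    def test_parks_list_is_publicly_readable(self):
        response = self.client.get("/api/parks/")
        self.assertEqual(response.status_code, 200)

    def test_token_authenticated_request(self):
        self.client.credentials(HTTP_AUTHORIZATION=f"Token {self.token.key}")
        response = self.client.get("/api/accounts/me/")
        self.assertEqual(response.status_code, 200)
```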
## Documentation Validation ✅
|
||||||
|
|
||||||
|
### Created Documentation
|
||||||
|
- ✅ **Monorepo Structure Plan**: Complete directory organization
|
||||||
|
- ✅ **Migration Mapping**: File-by-file migration guide
|
||||||
|
- ✅ **Deployment Guide**: Comprehensive deployment strategies
|
||||||
|
- ✅ **Architecture Validation**: This validation document
|
||||||
|
|
||||||
|
### Required Updates
|
||||||
|
- ✅ Root README.md update for monorepo structure
|
||||||
|
- ✅ Development setup instructions
|
||||||
|
- ✅ API documentation for frontend integration
|
||||||
|
- ✅ Deployment runbooks
|
||||||
|
|
||||||
|
## Implementation Readiness Assessment
|
||||||
|
|
||||||
|
### Prerequisites Met ✅
|
||||||
|
- [x] Current Django project analysis complete
|
||||||
|
- [x] Monorepo structure designed
|
||||||
|
- [x] File migration strategy defined
|
||||||
|
- [x] Development workflow planned
|
||||||
|
- [x] Deployment strategy documented
|
||||||
|
- [x] Risk assessment completed
|
||||||
|
|
||||||
|
### Ready for Implementation ✅
|
||||||
|
- [x] Clear step-by-step migration plan
|
||||||
|
- [x] File mapping completeness verified
|
||||||
|
- [x] Package management strategy confirmed
|
||||||
|
- [x] Testing approach defined
|
||||||
|
- [x] Rollback strategy available
|
||||||
|
|
||||||
|
### Success Criteria Defined ✅
|
||||||
|
1. **Functional Requirements**
|
||||||
|
- All existing Django functionality preserved
|
||||||
|
- Modern Vue.js frontend operational
|
||||||
|
- API integration working correctly
|
||||||
|
- Media file handling functional
|
||||||
|
|
||||||
|
2. **Performance Requirements**
|
||||||
|
- Development servers start within reasonable time
|
||||||
|
- Build process completes successfully
|
||||||
|
- Production deployment successful
|
||||||
|
|
||||||
|
3. **Quality Requirements**
|
||||||
|
- All tests passing after migration
|
||||||
|
- Code quality standards maintained
|
||||||
|
- Documentation updated and complete
|
||||||
|
|
||||||
|
## Final Recommendation ✅
|
||||||
|
|
||||||
|
**Approval Status: APPROVED FOR IMPLEMENTATION**
|
||||||
|
|
||||||
|
The proposed monorepo architecture for ThrillWiki is comprehensive, well-planned, and ready for implementation. The plan demonstrates:
|
||||||
|
|
||||||
|
1. **Technical Soundness**: Architecture follows modern best practices
|
||||||
|
2. **Risk Management**: Potential issues identified with mitigation strategies
|
||||||
|
3. **Implementation Clarity**: Clear step-by-step migration process
|
||||||
|
4. **Operational Readiness**: Deployment and maintenance procedures defined
|
||||||
|
|
||||||
|
**Next Steps:**
|
||||||
|
1. Switch to **Code Mode** for implementation
|
||||||
|
2. Begin with directory structure creation
|
||||||
|
3. Migrate backend files systematically
|
||||||
|
4. Create Vue.js frontend application
|
||||||
|
5. Test integration between systems
|
||||||
|
6. Update deployment configurations
|
||||||
|
|
||||||
|
The architecture provides a solid foundation for scaling ThrillWiki with modern frontend technologies while preserving the robust Django backend functionality.
628 architecture/deployment-guide.md Normal file
@@ -0,0 +1,628 @@
# ThrillWiki Monorepo Deployment Guide
|
||||||
|
|
||||||
|
This document outlines deployment strategies, build processes, and infrastructure considerations for the ThrillWiki Django + Vue.js monorepo.
|
||||||
|
|
||||||
|
## Build Process Overview
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph TB
|
||||||
|
A[Source Code] --> B[Backend Build]
|
||||||
|
A --> C[Frontend Build]
|
||||||
|
B --> D[Django Static Collection]
|
||||||
|
C --> E[Vue.js Production Build]
|
||||||
|
D --> F[Backend Container]
|
||||||
|
E --> G[Frontend Assets]
|
||||||
|
F --> H[Production Deployment]
|
||||||
|
G --> H
|
||||||
|
```
|
||||||
|
|
||||||
|
## Development Environment
|
||||||
|
|
||||||
|
### Prerequisites
|
||||||
|
- Python 3.11+ with UV package manager
|
||||||
|
- Node.js 18+ with pnpm
|
||||||
|
- PostgreSQL (production) / SQLite (development)
|
||||||
|
- Redis (for caching and sessions)
|
||||||
|
|
||||||
|
### Local Development Setup
|
||||||
|
```bash
|
||||||
|
# Clone repository
|
||||||
|
git clone <repository-url>
|
||||||
|
cd thrillwiki-monorepo
|
||||||
|
|
||||||
|
# Install root dependencies
|
||||||
|
pnpm install
|
||||||
|
|
||||||
|
# Backend setup
|
||||||
|
cd backend
|
||||||
|
uv sync
|
||||||
|
uv run manage.py migrate
|
||||||
|
uv run manage.py collectstatic
|
||||||
|
|
||||||
|
# Frontend setup
|
||||||
|
cd ../frontend
|
||||||
|
pnpm install
|
||||||
|
|
||||||
|
# Start development servers
|
||||||
|
cd ..
|
||||||
|
pnpm run dev # Starts both backend and frontend
|
||||||
|
```
|
||||||
|
|
||||||
|
## Build Strategies
|
||||||
|
|
||||||
|
### 1. Containerized Deployment (Recommended)
|
||||||
|
|
||||||
|
#### Multi-stage Dockerfile for Backend
|
||||||
|
```dockerfile
|
||||||
|
# backend/Dockerfile
|
||||||
|
FROM python:3.11-slim as builder
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY pyproject.toml uv.lock ./
|
||||||
|
RUN pip install uv
|
||||||
|
RUN uv sync --no-dev
|
||||||
|
|
||||||
|
FROM python:3.11-slim as runtime
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY --from=builder /app/.venv /app/.venv
|
||||||
|
ENV PATH="/app/.venv/bin:$PATH"
|
||||||
|
|
||||||
|
COPY . .
|
||||||
|
RUN python manage.py collectstatic --noinput
|
||||||
|
|
||||||
|
EXPOSE 8000
|
||||||
|
CMD ["gunicorn", "config.wsgi:application", "--bind", "0.0.0.0:8000"]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Dockerfile for Frontend
|
||||||
|
```dockerfile
|
||||||
|
# frontend/Dockerfile
|
||||||
|
FROM node:18-alpine as builder
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY package.json pnpm-lock.yaml ./
|
||||||
|
RUN npm install -g pnpm
|
||||||
|
RUN pnpm install --frozen-lockfile
|
||||||
|
|
||||||
|
COPY . .
|
||||||
|
RUN pnpm run build
|
||||||
|
|
||||||
|
FROM nginx:alpine as runtime
|
||||||
|
COPY --from=builder /app/dist /usr/share/nginx/html
|
||||||
|
COPY nginx.conf /etc/nginx/nginx.conf
|
||||||
|
EXPOSE 80
|
||||||
|
CMD ["nginx", "-g", "daemon off;"]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Docker Compose for Development
|
||||||
|
```yaml
|
||||||
|
# docker-compose.dev.yml
|
||||||
|
version: '3.8'
|
||||||
|
|
||||||
|
services:
|
||||||
|
db:
|
||||||
|
image: postgres:15
|
||||||
|
environment:
|
||||||
|
POSTGRES_DB: thrillwiki
|
||||||
|
POSTGRES_USER: thrillwiki
|
||||||
|
POSTGRES_PASSWORD: password
|
||||||
|
volumes:
|
||||||
|
- postgres_data:/var/lib/postgresql/data
|
||||||
|
ports:
|
||||||
|
- "5432:5432"
|
||||||
|
|
||||||
|
redis:
|
||||||
|
image: redis:7-alpine
|
||||||
|
ports:
|
||||||
|
- "6379:6379"
|
||||||
|
|
||||||
|
backend:
|
||||||
|
build:
|
||||||
|
context: ./backend
|
||||||
|
dockerfile: Dockerfile.dev
|
||||||
|
ports:
|
||||||
|
- "8000:8000"
|
||||||
|
volumes:
|
||||||
|
- ./backend:/app
|
||||||
|
- ./shared/media:/app/media
|
||||||
|
environment:
|
||||||
|
- DEBUG=1
|
||||||
|
- DATABASE_URL=postgresql://thrillwiki:password@db:5432/thrillwiki
|
||||||
|
- REDIS_URL=redis://redis:6379/0
|
||||||
|
depends_on:
|
||||||
|
- db
|
||||||
|
- redis
|
||||||
|
|
||||||
|
frontend:
|
||||||
|
build:
|
||||||
|
context: ./frontend
|
||||||
|
dockerfile: Dockerfile.dev
|
||||||
|
ports:
|
||||||
|
- "3000:3000"
|
||||||
|
volumes:
|
||||||
|
- ./frontend:/app
|
||||||
|
- /app/node_modules
|
||||||
|
environment:
|
||||||
|
- VITE_API_URL=http://localhost:8000
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
postgres_data:
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Docker Compose for Production
|
||||||
|
```yaml
|
||||||
|
# docker-compose.prod.yml
|
||||||
|
version: '3.8'
|
||||||
|
|
||||||
|
services:
|
||||||
|
db:
|
||||||
|
image: postgres:15
|
||||||
|
environment:
|
||||||
|
POSTGRES_DB: ${POSTGRES_DB}
|
||||||
|
POSTGRES_USER: ${POSTGRES_USER}
|
||||||
|
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
|
||||||
|
volumes:
|
||||||
|
- postgres_data:/var/lib/postgresql/data
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
redis:
|
||||||
|
image: redis:7-alpine
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
backend:
|
||||||
|
build:
|
||||||
|
context: ./backend
|
||||||
|
dockerfile: Dockerfile
|
||||||
|
environment:
|
||||||
|
- DEBUG=0
|
||||||
|
- DATABASE_URL=${DATABASE_URL}
|
||||||
|
- REDIS_URL=${REDIS_URL}
|
||||||
|
- SECRET_KEY=${SECRET_KEY}
|
||||||
|
- ALLOWED_HOSTS=${ALLOWED_HOSTS}
|
||||||
|
volumes:
|
||||||
|
- ./shared/media:/app/media
|
||||||
|
- static_files:/app/staticfiles
|
||||||
|
depends_on:
|
||||||
|
- db
|
||||||
|
- redis
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
frontend:
|
||||||
|
build:
|
||||||
|
context: ./frontend
|
||||||
|
dockerfile: Dockerfile
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
nginx:
|
||||||
|
image: nginx:alpine
|
||||||
|
ports:
|
||||||
|
- "80:80"
|
||||||
|
- "443:443"
|
||||||
|
volumes:
|
||||||
|
- ./nginx/nginx.conf:/etc/nginx/nginx.conf
|
||||||
|
- ./nginx/ssl:/etc/nginx/ssl
|
||||||
|
- static_files:/usr/share/nginx/html/static
|
||||||
|
- ./shared/media:/usr/share/nginx/html/media
|
||||||
|
depends_on:
|
||||||
|
- backend
|
||||||
|
- frontend
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
postgres_data:
|
||||||
|
static_files:
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Static Site Generation (Alternative)
|
||||||
|
|
||||||
|
For sites with mostly static content, consider pre-rendering:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Frontend build with pre-rendering
|
||||||
|
cd frontend
|
||||||
|
pnpm run build:prerender
|
||||||
|
|
||||||
|
# Serve static files with minimal backend
|
||||||
|
```
|
||||||
|
|
||||||
|
## CI/CD Pipeline
|
||||||
|
|
||||||
|
### GitHub Actions Workflow
|
||||||
|
```yaml
|
||||||
|
# .github/workflows/deploy.yml
|
||||||
|
name: Deploy ThrillWiki
|
||||||
|
|
||||||
|
on:
|
||||||
|
push:
|
||||||
|
branches: [main]
|
||||||
|
pull_request:
|
||||||
|
branches: [main]
|
||||||
|
|
||||||
|
jobs:
|
||||||
|
test:
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
|
||||||
|
services:
|
||||||
|
postgres:
|
||||||
|
image: postgres:15
|
||||||
|
env:
|
||||||
|
POSTGRES_PASSWORD: postgres
|
||||||
|
options: >-
|
||||||
|
--health-cmd pg_isready
|
||||||
|
--health-interval 10s
|
||||||
|
--health-timeout 5s
|
||||||
|
--health-retries 5
|
||||||
|
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
|
- name: Set up Python
|
||||||
|
uses: actions/setup-python@v4
|
||||||
|
with:
|
||||||
|
python-version: '3.11'
|
||||||
|
|
||||||
|
- name: Install UV
|
||||||
|
run: pip install uv
|
||||||
|
|
||||||
|
- name: Backend Tests
|
||||||
|
run: |
|
||||||
|
cd backend
|
||||||
|
uv sync
|
||||||
|
uv run manage.py test
|
||||||
|
uv run flake8 .
|
||||||
|
uv run black --check .
|
||||||
|
|
||||||
|
- name: Set up Node.js
|
||||||
|
uses: actions/setup-node@v4
|
||||||
|
with:
|
||||||
|
node-version: '18'
|
||||||
|
|
||||||
|
- name: Install pnpm
|
||||||
|
run: npm install -g pnpm
|
||||||
|
|
||||||
|
- name: Frontend Tests
|
||||||
|
run: |
|
||||||
|
cd frontend
|
||||||
|
pnpm install --frozen-lockfile
|
||||||
|
pnpm run test
|
||||||
|
pnpm run lint
|
||||||
|
pnpm run type-check
|
||||||
|
|
||||||
|
build:
|
||||||
|
needs: test
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
if: github.ref == 'refs/heads/main'
|
||||||
|
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
|
- name: Build and push Docker images
|
||||||
|
run: |
|
||||||
|
docker build -t thrillwiki-backend ./backend
|
||||||
|
docker build -t thrillwiki-frontend ./frontend
|
||||||
|
# Push to registry
|
||||||
|
|
||||||
|
- name: Deploy to production
|
||||||
|
run: |
|
||||||
|
# Deploy using your preferred method
|
||||||
|
# (AWS ECS, GCP Cloud Run, Azure Container Instances, etc.)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Platform-Specific Deployments
|
||||||
|
|
||||||
|
### 1. Vercel Deployment (Frontend + API)
|
||||||
|
|
||||||
|
```json
|
||||||
|
// vercel.json
|
||||||
|
{
|
||||||
|
"version": 2,
|
||||||
|
"builds": [
|
||||||
|
{
|
||||||
|
"src": "frontend/package.json",
|
||||||
|
"use": "@vercel/static-build",
|
||||||
|
"config": {
|
||||||
|
"distDir": "dist"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"src": "backend/config/wsgi.py",
|
||||||
|
"use": "@vercel/python"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"routes": [
|
||||||
|
{
|
||||||
|
"src": "/api/(.*)",
|
||||||
|
"dest": "backend/config/wsgi.py"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"src": "/(.*)",
|
||||||
|
"dest": "frontend/dist/$1"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Railway Deployment
|
||||||
|
|
||||||
|
```toml
|
||||||
|
# railway.toml
|
||||||
|
[environments.production]
|
||||||
|
|
||||||
|
[environments.production.services.backend]
|
||||||
|
dockerfile = "backend/Dockerfile"
|
||||||
|
variables = { DEBUG = "0" }
|
||||||
|
|
||||||
|
[environments.production.services.frontend]
|
||||||
|
dockerfile = "frontend/Dockerfile"
|
||||||
|
|
||||||
|
[environments.production.services.postgres]
|
||||||
|
image = "postgres:15"
|
||||||
|
variables = { POSTGRES_DB = "thrillwiki" }
|
||||||
|
```
|
||||||
|
|
||||||
|
### 3. DigitalOcean App Platform
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .do/app.yaml
|
||||||
|
name: thrillwiki
|
||||||
|
services:
|
||||||
|
- name: backend
|
||||||
|
source_dir: backend
|
||||||
|
github:
|
||||||
|
repo: your-username/thrillwiki-monorepo
|
||||||
|
branch: main
|
||||||
|
run_command: gunicorn config.wsgi:application
|
||||||
|
environment_slug: python
|
||||||
|
instance_count: 1
|
||||||
|
instance_size_slug: basic-xxs
|
||||||
|
envs:
|
||||||
|
- key: DEBUG
|
||||||
|
value: "0"
|
||||||
|
|
||||||
|
- name: frontend
|
||||||
|
source_dir: frontend
|
||||||
|
github:
|
||||||
|
repo: your-username/thrillwiki-monorepo
|
||||||
|
branch: main
|
||||||
|
build_command: pnpm run build
|
||||||
|
run_command: pnpm run preview
|
||||||
|
environment_slug: node-js
|
||||||
|
instance_count: 1
|
||||||
|
instance_size_slug: basic-xxs
|
||||||
|
|
||||||
|
databases:
|
||||||
|
- name: thrillwiki-db
|
||||||
|
engine: PG
|
||||||
|
version: "15"
|
||||||
|
```
|
||||||
|
|
||||||
|
## Environment Configuration
|
||||||
|
|
||||||
|
### Environment Variables
|
||||||
|
|
||||||
|
#### Backend (.env)
|
||||||
|
```bash
|
||||||
|
# Django Settings
|
||||||
|
DEBUG=0
|
||||||
|
SECRET_KEY=your-secret-key-here
|
||||||
|
ALLOWED_HOSTS=yourdomain.com,www.yourdomain.com
|
||||||
|
|
||||||
|
# Database
|
||||||
|
DATABASE_URL=postgresql://user:password@host:port/database
|
||||||
|
|
||||||
|
# Redis
|
||||||
|
REDIS_URL=redis://host:port/0
|
||||||
|
|
||||||
|
# File Storage
|
||||||
|
MEDIA_ROOT=/app/media
|
||||||
|
STATIC_ROOT=/app/staticfiles
|
||||||
|
|
||||||
|
# Email
|
||||||
|
EMAIL_BACKEND=django.core.mail.backends.smtp.EmailBackend
|
||||||
|
EMAIL_HOST=smtp.yourmailprovider.com
|
||||||
|
EMAIL_PORT=587
|
||||||
|
EMAIL_USE_TLS=True
|
||||||
|
EMAIL_HOST_USER=your-email@yourdomain.com
|
||||||
|
EMAIL_HOST_PASSWORD=your-email-password
|
||||||
|
|
||||||
|
# Third-party Services
|
||||||
|
SENTRY_DSN=your-sentry-dsn
|
||||||
|
AWS_ACCESS_KEY_ID=your-aws-key
|
||||||
|
AWS_SECRET_ACCESS_KEY=your-aws-secret
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Frontend (.env.production)
|
||||||
|
```bash
|
||||||
|
VITE_API_URL=https://api.yourdomain.com
|
||||||
|
VITE_APP_TITLE=ThrillWiki
|
||||||
|
VITE_SENTRY_DSN=your-frontend-sentry-dsn
|
||||||
|
VITE_GOOGLE_ANALYTICS_ID=your-ga-id
|
||||||
|
```
|
||||||
|
|
||||||
|
## Performance Optimization
|
||||||
|
|
||||||
|
### Backend Optimizations
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
|
||||||
|
# Database optimization
|
||||||
|
DATABASES = {
|
||||||
|
'default': {
|
||||||
|
'ENGINE': 'django.db.backends.postgresql',
|
||||||
|
'CONN_MAX_AGE': 60,
|
||||||
|
'OPTIONS': {
|
||||||
|
'MAX_CONNS': 20,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
# Caching
|
||||||
|
CACHES = {
|
||||||
|
'default': {
|
||||||
|
'BACKEND': 'django.core.cache.backends.redis.RedisCache',
|
||||||
|
'LOCATION': 'redis://127.0.0.1:6379/1',
|
||||||
|
'OPTIONS': {
|
||||||
|
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
|
||||||
|
},
|
||||||
|
'KEY_PREFIX': 'thrillwiki'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
# Static files with CDN
|
||||||
|
AWS_S3_CUSTOM_DOMAIN = 'cdn.yourdomain.com'
|
||||||
|
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3StaticStorage'  # stock django-storages class
|
||||||
|
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'  # stock django-storages class
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend Optimizations
|
||||||
|
```typescript
|
||||||
|
// frontend/vite.config.ts
|
||||||
|
export default defineConfig({
|
||||||
|
build: {
|
||||||
|
rollupOptions: {
|
||||||
|
output: {
|
||||||
|
manualChunks: {
|
||||||
|
vendor: ['vue', 'vue-router', 'pinia'],
|
||||||
|
ui: ['@headlessui/vue', '@heroicons/vue']
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
sourcemap: false,
|
||||||
|
minify: 'terser',
|
||||||
|
terserOptions: {
|
||||||
|
compress: {
|
||||||
|
drop_console: true,
|
||||||
|
drop_debugger: true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
```
|
||||||
|
|
||||||
|
## Monitoring and Logging
|
||||||
|
|
||||||
|
### Application Monitoring
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
import sentry_sdk
|
||||||
|
from sentry_sdk.integrations.django import DjangoIntegration
|
||||||
|
|
||||||
|
sentry_sdk.init(
|
||||||
|
dsn="your-sentry-dsn",
|
||||||
|
integrations=[DjangoIntegration()],
|
||||||
|
traces_sample_rate=0.1,
|
||||||
|
send_default_pii=True
|
||||||
|
)
|
||||||
|
|
||||||
|
# Logging configuration
|
||||||
|
LOGGING = {
|
||||||
|
'version': 1,
|
||||||
|
'disable_existing_loggers': False,
|
||||||
|
'handlers': {
|
||||||
|
'file': {
|
||||||
|
'level': 'INFO',
|
||||||
|
'class': 'logging.FileHandler',
|
||||||
|
'filename': '/var/log/django/thrillwiki.log',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
'root': {
|
||||||
|
'handlers': ['file'],
|
||||||
|
},
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### Infrastructure Monitoring
|
||||||
|
- Use Prometheus + Grafana for metrics
|
||||||
|
- Implement health check endpoints
|
||||||
|
- Set up log aggregation (ELK stack or similar)
|
||||||
|
- Monitor database performance
|
||||||
|
- Track API response times
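The health check item above is straightforward on the Django side; a minimal sketch (view name and URL are assumptions, not taken from the project):

```python
# backend/apps/core/views.py — illustrative liveness/readiness probe
from django.db import connections
from django.db.utils import OperationalError
from django.http import JsonResponse


def health_check(request):
    """Report whether the application and its database are reachable."""
    try:
        connections["default"].cursor()  # verifies the database connection
        db_ok = True
    except OperationalError:
        db_ok = False
    status = 200 if db_ok else 503
    return JsonResponse(
        {"status": "ok" if db_ok else "degraded", "database": db_ok},
        status=status,
    )
```

Wired up with something like `path("healthz/", health_check)` in `config/urls.py`, this gives load balancers and monitoring a cheap endpoint to poll.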
## Security Considerations
|
||||||
|
|
||||||
|
### Production Security Checklist
|
||||||
|
- [ ] HTTPS enforced with SSL certificates
|
||||||
|
- [ ] Security headers configured (HSTS, CSP, etc.)
|
||||||
|
- [ ] Database credentials secured
|
||||||
|
- [ ] Secret keys rotated regularly
|
||||||
|
- [ ] CORS properly configured
|
||||||
|
- [ ] Rate limiting implemented
|
||||||
|
- [ ] File upload validation
|
||||||
|
- [ ] SQL injection protection
|
||||||
|
- [ ] XSS protection enabled
|
||||||
|
- [ ] CSRF protection active
|
||||||
|
|
||||||
|
### Security Headers
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
SECURE_SSL_REDIRECT = True
|
||||||
|
SECURE_HSTS_SECONDS = 31536000
|
||||||
|
SECURE_HSTS_INCLUDE_SUBDOMAINS = True
|
||||||
|
SECURE_HSTS_PRELOAD = True
|
||||||
|
SECURE_CONTENT_TYPE_NOSNIFF = True
|
||||||
|
SECURE_BROWSER_XSS_FILTER = True
|
||||||
|
X_FRAME_OPTIONS = 'DENY'
|
||||||
|
|
||||||
|
# CORS for API
|
||||||
|
CORS_ALLOWED_ORIGINS = [
|
||||||
|
"https://yourdomain.com",
|
||||||
|
"https://www.yourdomain.com",
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
## Backup and Recovery
|
||||||
|
|
||||||
|
### Database Backup Strategy
|
||||||
|
```bash
|
||||||
|
# Automated backup script
|
||||||
|
#!/bin/bash
|
||||||
|
pg_dump $DATABASE_URL | gzip > backup_$(date +%Y%m%d_%H%M%S).sql.gz
|
||||||
|
aws s3 cp backup_*.sql.gz s3://your-backup-bucket/database/
|
||||||
|
```
|
||||||
|
|
||||||
|
### Media Files Backup
|
||||||
|
```bash
|
||||||
|
# Sync media files to S3
|
||||||
|
aws s3 sync ./shared/media/ s3://your-media-bucket/media/ --delete
|
||||||
|
```
|
||||||
|
|
||||||
|
## Scaling Strategies
|
||||||
|
|
||||||
|
### Horizontal Scaling
|
||||||
|
- Load balancer configuration
|
||||||
|
- Database read replicas
|
||||||
|
- CDN for static assets
|
||||||
|
- Redis clustering
|
||||||
|
- Auto-scaling groups
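For the read-replica item, Django's database-router hook is the usual mechanism. A sketch under the assumption that a second `replica` alias is added to `DATABASES` (this guide does not define one):

```python
# backend/config/settings/database.py — hypothetical replica routing sketch


class PrimaryReplicaRouter:
    """Route reads to the replica and writes to the primary."""

    def db_for_read(self, model, **hints):
        return "replica"

    def db_for_write(self, model, **hints):
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        # Both aliases point at the same logical database.
        return True

    def allow_migrate(self, db, app_label, **hints):
        # Run migrations against the primary only.
        return db == "default"


DATABASE_ROUTERS = ["config.settings.database.PrimaryReplicaRouter"]
```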
### Vertical Scaling
|
||||||
|
- Database connection pooling
|
||||||
|
- Application server optimization
|
||||||
|
- Memory usage optimization
|
||||||
|
- CPU-intensive task optimization
|
||||||
|
|
||||||
|
## Troubleshooting Guide
|
||||||
|
|
||||||
|
### Common Issues
|
||||||
|
1. **Build failures**: Check dependencies and environment variables
|
||||||
|
2. **Database connection errors**: Verify connection strings and firewall rules
|
||||||
|
3. **Static file 404s**: Ensure collectstatic runs and paths are correct
|
||||||
|
4. **CORS errors**: Check CORS configuration and allowed origins
|
||||||
|
5. **Memory issues**: Monitor application memory usage and optimize queries
|
||||||
|
|
||||||
|
### Debug Commands
|
||||||
|
```bash
|
||||||
|
# Backend debugging
|
||||||
|
cd backend
|
||||||
|
uv run manage.py check --deploy
|
||||||
|
uv run manage.py shell
|
||||||
|
uv run manage.py dbshell
|
||||||
|
|
||||||
|
# Frontend debugging
|
||||||
|
cd frontend
|
||||||
|
pnpm run build --debug
|
||||||
|
pnpm run preview
|
||||||
|
```
|
||||||
|
|
||||||
|
This deployment guide provides a comprehensive approach to deploying the ThrillWiki monorepo across various platforms while maintaining security, performance, and scalability.
353 architecture/migration-mapping.md Normal file
@@ -0,0 +1,353 @@
# ThrillWiki Migration Mapping Document
|
||||||
|
|
||||||
|
This document provides a comprehensive mapping of files from the current Django project to the new monorepo structure.
|
||||||
|
|
||||||
|
## Root Level Files
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `manage.py` | `backend/manage.py` | Core Django management |
|
||||||
|
| `pyproject.toml` | `backend/pyproject.toml` | Python dependencies |
|
||||||
|
| `uv.lock` | `backend/uv.lock` | UV lock file |
|
||||||
|
| `.gitignore` | `.gitignore` (update) | Merge with monorepo patterns |
|
||||||
|
| `README.md` | `README.md` (update) | Update for monorepo |
|
||||||
|
| `.pre-commit-config.yaml` | `.pre-commit-config.yaml` | Root level |
|
||||||
|
|
||||||
|
## Configuration Directory
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `config/django/` | `backend/config/django/` | Django settings |
|
||||||
|
| `config/settings/` | `backend/config/settings/` | Environment settings |
|
||||||
|
| `config/urls.py` | `backend/config/urls.py` | URL configuration |
|
||||||
|
| `config/wsgi.py` | `backend/config/wsgi.py` | WSGI configuration |
|
||||||
|
| `config/asgi.py` | `backend/config/asgi.py` | ASGI configuration |
|
||||||
|
|
||||||
|
## Django Apps
|
||||||
|
|
||||||
|
### Accounts App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `accounts/` | `backend/apps/accounts/` |
|
||||||
|
| `accounts/__init__.py` | `backend/apps/accounts/__init__.py` |
|
||||||
|
| `accounts/models.py` | `backend/apps/accounts/models.py` |
|
||||||
|
| `accounts/views.py` | `backend/apps/accounts/views.py` |
|
||||||
|
| `accounts/admin.py` | `backend/apps/accounts/admin.py` |
|
||||||
|
| `accounts/apps.py` | `backend/apps/accounts/apps.py` |
|
||||||
|
| `accounts/migrations/` | `backend/apps/accounts/migrations/` |
|
||||||
|
| `accounts/tests/` | `backend/apps/accounts/tests/` |
|
||||||
|
|
||||||
|
### Parks App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `parks/` | `backend/apps/parks/` |
|
||||||
|
| `parks/__init__.py` | `backend/apps/parks/__init__.py` |
|
||||||
|
| `parks/models.py` | `backend/apps/parks/models.py` |
|
||||||
|
| `parks/views.py` | `backend/apps/parks/views.py` |
|
||||||
|
| `parks/admin.py` | `backend/apps/parks/admin.py` |
|
||||||
|
| `parks/apps.py` | `backend/apps/parks/apps.py` |
|
||||||
|
| `parks/migrations/` | `backend/apps/parks/migrations/` |
|
||||||
|
| `parks/tests/` | `backend/apps/parks/tests/` |
|
||||||
|
|
||||||
|
### Rides App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `rides/` | `backend/apps/rides/` |
|
||||||
|
| `rides/__init__.py` | `backend/apps/rides/__init__.py` |
|
||||||
|
| `rides/models.py` | `backend/apps/rides/models.py` |
|
||||||
|
| `rides/views.py` | `backend/apps/rides/views.py` |
|
||||||
|
| `rides/admin.py` | `backend/apps/rides/admin.py` |
|
||||||
|
| `rides/apps.py` | `backend/apps/rides/apps.py` |
|
||||||
|
| `rides/migrations/` | `backend/apps/rides/migrations/` |
|
||||||
|
| `rides/tests/` | `backend/apps/rides/tests/` |
|
||||||
|
|
||||||
|
### Moderation App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `moderation/` | `backend/apps/moderation/` |
|
||||||
|
| `moderation/__init__.py` | `backend/apps/moderation/__init__.py` |
|
||||||
|
| `moderation/models.py` | `backend/apps/moderation/models.py` |
|
||||||
|
| `moderation/views.py` | `backend/apps/moderation/views.py` |
|
||||||
|
| `moderation/admin.py` | `backend/apps/moderation/admin.py` |
|
||||||
|
| `moderation/apps.py` | `backend/apps/moderation/apps.py` |
|
||||||
|
| `moderation/migrations/` | `backend/apps/moderation/migrations/` |
|
||||||
|
| `moderation/tests/` | `backend/apps/moderation/tests/` |
|
||||||
|
|
||||||
|
### Location App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `location/` | `backend/apps/location/` |
|
||||||
|
| `location/__init__.py` | `backend/apps/location/__init__.py` |
|
||||||
|
| `location/models.py` | `backend/apps/location/models.py` |
|
||||||
|
| `location/views.py` | `backend/apps/location/views.py` |
|
||||||
|
| `location/admin.py` | `backend/apps/location/admin.py` |
|
||||||
|
| `location/apps.py` | `backend/apps/location/apps.py` |
|
||||||
|
| `location/migrations/` | `backend/apps/location/migrations/` |
|
||||||
|
| `location/tests/` | `backend/apps/location/tests/` |
|
||||||
|
|
||||||
|
### Media App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `media/` | `backend/apps/media/` |
|
||||||
|
| `media/__init__.py` | `backend/apps/media/__init__.py` |
|
||||||
|
| `media/models.py` | `backend/apps/media/models.py` |
|
||||||
|
| `media/views.py` | `backend/apps/media/views.py` |
|
||||||
|
| `media/admin.py` | `backend/apps/media/admin.py` |
|
||||||
|
| `media/apps.py` | `backend/apps/media/apps.py` |
|
||||||
|
| `media/migrations/` | `backend/apps/media/migrations/` |
|
||||||
|
| `media/tests/` | `backend/apps/media/tests/` |
|
||||||
|
|
||||||
|
### Email Service App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `email_service/` | `backend/apps/email_service/` |
|
||||||
|
| `email_service/__init__.py` | `backend/apps/email_service/__init__.py` |
|
||||||
|
| `email_service/models.py` | `backend/apps/email_service/models.py` |
|
||||||
|
| `email_service/views.py` | `backend/apps/email_service/views.py` |
|
||||||
|
| `email_service/admin.py` | `backend/apps/email_service/admin.py` |
|
||||||
|
| `email_service/apps.py` | `backend/apps/email_service/apps.py` |
|
||||||
|
| `email_service/migrations/` | `backend/apps/email_service/migrations/` |
|
||||||
|
| `email_service/tests/` | `backend/apps/email_service/tests/` |
|
||||||
|
|
||||||
|
### Core App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `core/` | `backend/apps/core/` |
|
||||||
|
| `core/__init__.py` | `backend/apps/core/__init__.py` |
|
||||||
|
| `core/models.py` | `backend/apps/core/models.py` |
|
||||||
|
| `core/views.py` | `backend/apps/core/views.py` |
|
||||||
|
| `core/admin.py` | `backend/apps/core/admin.py` |
|
||||||
|
| `core/apps.py` | `backend/apps/core/apps.py` |
|
||||||
|
| `core/migrations/` | `backend/apps/core/migrations/` |
|
||||||
|
| `core/tests/` | `backend/apps/core/tests/` |
|
||||||
|
|
||||||
|
## Static Files and Templates
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `static/` | `backend/static/` | Django admin and backend assets |
|
||||||
|
| `staticfiles/` | `backend/staticfiles/` | Collected static files |
|
||||||
|
| `templates/` | `backend/templates/` | Django templates (if any) |
|
||||||
|
|
||||||
|
## Media Files
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `media/` | `shared/media/` | User uploaded content |
|
||||||
|
|
||||||
|
## Scripts and Development Tools
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `scripts/` | `scripts/` | Root level scripts |
|
||||||
|
| `scripts/dev_server.sh` | `scripts/backend_dev.sh` | Rename for clarity |
|
||||||
|
|
||||||
|
## New Frontend Structure (Created)
|
||||||
|
|
||||||
|
| New Location | Purpose |
|
||||||
|
|--------------|---------|
|
||||||
|
| `frontend/` | Vue.js application root |
|
||||||
|
| `frontend/package.json` | Node.js dependencies |
|
||||||
|
| `frontend/pnpm-lock.yaml` | pnpm lock file |
|
||||||
|
| `frontend/vite.config.ts` | Vite configuration |
|
||||||
|
| `frontend/tsconfig.json` | TypeScript configuration |
|
||||||
|
| `frontend/tailwind.config.js` | Tailwind CSS configuration |
|
||||||
|
| `frontend/src/` | Vue.js source code |
|
||||||
|
| `frontend/src/main.ts` | Application entry point |
|
||||||
|
| `frontend/src/App.vue` | Root component |
|
||||||
|
| `frontend/src/components/` | Vue components |
|
||||||
|
| `frontend/src/views/` | Page components |
|
||||||
|
| `frontend/src/router/` | Vue Router configuration |
|
||||||
|
| `frontend/src/stores/` | Pinia stores |
|
||||||
|
| `frontend/src/composables/` | Vue composables |
|
||||||
|
| `frontend/src/utils/` | Utility functions |
|
||||||
|
| `frontend/src/types/` | TypeScript type definitions |
|
||||||
|
| `frontend/src/assets/` | Static assets |
|
||||||
|
| `frontend/public/` | Public assets |
|
||||||
|
| `frontend/dist/` | Build output |
|
||||||
|
|
||||||
|
## New Shared Resources (Created)
|
||||||
|
|
||||||
|
| New Location | Purpose |
|
||||||
|
|--------------|---------|
|
||||||
|
| `shared/` | Cross-platform resources |
|
||||||
|
| `shared/media/` | User uploaded files |
|
||||||
|
| `shared/docs/` | Documentation |
|
||||||
|
| `shared/types/` | Shared TypeScript types |
|
||||||
|
| `shared/constants/` | Shared constants |
|
||||||
|
|
||||||
|
## Updated Root Files
|
||||||
|
|
||||||
|
### package.json (Root)
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-monorepo",
|
||||||
|
"private": true,
|
||||||
|
"workspaces": [
|
||||||
|
"frontend"
|
||||||
|
],
|
||||||
|
"scripts": {
|
||||||
|
"dev": "concurrently \"pnpm --filter frontend dev\" \"./scripts/backend_dev.sh\"",
|
||||||
|
"build": "pnpm --filter frontend build",
|
||||||
|
"backend:dev": "./scripts/backend_dev.sh",
|
||||||
|
"frontend:dev": "pnpm --filter frontend dev",
|
||||||
|
"test": "pnpm --filter frontend test && cd backend && uv run manage.py test",
|
||||||
|
"lint": "pnpm --filter frontend lint && cd backend && uv run flake8 .",
|
||||||
|
"format": "pnpm --filter frontend format && cd backend && uv run black ."
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"concurrently": "^8.2.2"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### .gitignore (Updated)
|
||||||
|
```gitignore
|
||||||
|
# Python
|
||||||
|
__pycache__/
|
||||||
|
*.py[cod]
|
||||||
|
*$py.class
|
||||||
|
*.so
|
||||||
|
.Python
|
||||||
|
build/
|
||||||
|
develop-eggs/
|
||||||
|
dist/
|
||||||
|
downloads/
|
||||||
|
eggs/
|
||||||
|
.eggs/
|
||||||
|
lib/
|
||||||
|
lib64/
|
||||||
|
parts/
|
||||||
|
sdist/
|
||||||
|
var/
|
||||||
|
wheels/
|
||||||
|
share/python-wheels/
|
||||||
|
*.egg-info/
|
||||||
|
.installed.cfg
|
||||||
|
*.egg
|
||||||
|
MANIFEST
|
||||||
|
|
||||||
|
# Django
|
||||||
|
*.log
|
||||||
|
local_settings.py
|
||||||
|
db.sqlite3
|
||||||
|
db.sqlite3-journal
|
||||||
|
/backend/static/
|
||||||
|
/backend/media/
|
||||||
|
|
||||||
|
# UV
|
||||||
|
.uv/
|
||||||
|
|
||||||
|
# Node.js
|
||||||
|
node_modules/
|
||||||
|
npm-debug.log*
|
||||||
|
yarn-debug.log*
|
||||||
|
yarn-error.log*
|
||||||
|
pnpm-debug.log*
|
||||||
|
lerna-debug.log*
|
||||||
|
.pnpm-store/
|
||||||
|
|
||||||
|
# Vue.js / Vite
|
||||||
|
/frontend/dist/
|
||||||
|
/frontend/dist-ssr/
|
||||||
|
*.local
|
||||||
|
|
||||||
|
# Environment variables
|
||||||
|
.env
|
||||||
|
.env.local
|
||||||
|
.env.development.local
|
||||||
|
.env.test.local
|
||||||
|
.env.production.local
|
||||||
|
|
||||||
|
# IDEs
|
||||||
|
.vscode/
|
||||||
|
.idea/
|
||||||
|
*.swp
|
||||||
|
*.swo
|
||||||
|
|
||||||
|
# OS
|
||||||
|
.DS_Store
|
||||||
|
Thumbs.db
|
||||||
|
|
||||||
|
# Logs
|
||||||
|
logs/
|
||||||
|
*.log
|
||||||
|
|
||||||
|
# Coverage
|
||||||
|
coverage/
|
||||||
|
*.lcov
|
||||||
|
.nyc_output
|
||||||
|
```
|
||||||
|
|
||||||
|
## Configuration Updates Required
|
||||||
|
|
||||||
|
### Backend Django Settings
|
||||||
|
Update `INSTALLED_APPS` paths:
|
||||||
|
```python
|
||||||
|
INSTALLED_APPS = [
|
||||||
|
'django.contrib.admin',
|
||||||
|
'django.contrib.auth',
|
||||||
|
'django.contrib.contenttypes',
|
||||||
|
'django.contrib.sessions',
|
||||||
|
'django.contrib.messages',
|
||||||
|
'django.contrib.staticfiles',
|
||||||
|
|
||||||
|
# Local apps
|
||||||
|
'apps.accounts',
|
||||||
|
'apps.parks',
|
||||||
|
'apps.rides',
|
||||||
|
'apps.moderation',
|
||||||
|
'apps.location',
|
||||||
|
'apps.media',
|
||||||
|
'apps.email_service',
|
||||||
|
'apps.core',
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
Update media and static files paths:
|
||||||
|
```python
|
||||||
|
STATIC_URL = '/static/'
|
||||||
|
STATIC_ROOT = BASE_DIR / 'staticfiles'
|
||||||
|
STATICFILES_DIRS = [
|
||||||
|
BASE_DIR / 'static',
|
||||||
|
]
|
||||||
|
|
||||||
|
MEDIA_URL = '/media/'
|
||||||
|
MEDIA_ROOT = BASE_DIR.parent / 'shared' / 'media'
|
||||||
|
```
|
||||||
|
|
||||||
|
### Script Updates
|
||||||
|
Update `scripts/backend_dev.sh`:
|
||||||
|
```bash
|
||||||
|
#!/bin/bash
|
||||||
|
cd backend
|
||||||
|
lsof -ti :8000 | xargs kill -9 2>/dev/null || true
|
||||||
|
find . -type d -name "__pycache__" -exec rm -r {} + 2>/dev/null || true
|
||||||
|
uv run manage.py runserver 0.0.0.0:8000
|
||||||
|
```
|
||||||
|
|
||||||
|
## Migration Steps Summary
|
||||||
|
|
||||||
|
1. **Create new directory structure**
|
||||||
|
2. **Move backend files** to `backend/` directory
|
||||||
|
3. **Update import paths** in Django settings and apps
|
||||||
|
4. **Create frontend** Vue.js application
|
||||||
|
5. **Update scripts** and configuration files
|
||||||
|
6. **Test both backend and frontend** independently
|
||||||
|
7. **Configure API integration** between Django and Vue.js
|
||||||
|
8. **Update deployment** configurations
|
||||||
|
|
||||||
|
## Validation Checklist
|
||||||
|
|
||||||
|
- [ ] All Django apps moved to `backend/apps/`
|
||||||
|
- [ ] Configuration files updated with new paths
|
||||||
|
- [ ] Static and media file paths configured correctly
|
||||||
|
- [ ] Frontend Vue.js application created and configured
|
||||||
|
- [ ] Root package.json with workspace configuration
|
||||||
|
- [ ] Development scripts updated and tested
|
||||||
|
- [ ] Git configuration updated
|
||||||
|
- [ ] Documentation updated
|
||||||
|
- [ ] CI/CD pipelines updated (if applicable)
|
||||||
|
- [ ] Database migrations work correctly
|
||||||
|
- [ ] Both development servers start successfully
|
||||||
|
- [ ] API endpoints accessible from frontend
525 architecture/monorepo-structure-plan.md Normal file
@@ -0,0 +1,525 @@
# ThrillWiki Django + Vue.js Monorepo Architecture Plan
|
||||||
|
|
||||||
|
## Executive Summary
|
||||||
|
|
||||||
|
This document outlines the optimal monorepo directory structure for migrating the ThrillWiki Django project to a Django + Vue.js architecture. The design separates backend and frontend concerns while maintaining existing Django app organization and supporting modern development workflows.
|
||||||
|
|
||||||
|
## Current Project Analysis
|
||||||
|
|
||||||
|
### Django Apps Structure
|
||||||
|
- **accounts**: User management and authentication
|
||||||
|
- **parks**: Theme park data and operations
|
||||||
|
- **rides**: Ride information and management
|
||||||
|
- **moderation**: Content moderation system
|
||||||
|
- **location**: Geographic data handling
|
||||||
|
- **media**: File and image management
|
||||||
|
- **email_service**: Email functionality
|
||||||
|
- **core**: Core utilities and services
|
||||||
|
|
||||||
|
### Key Infrastructure
|
||||||
|
- **Package Management**: UV-based Python setup
|
||||||
|
- **Configuration**: `config/django/` for settings, `config/settings/` for modular settings
|
||||||
|
- **Development**: `scripts/dev_server.sh` with comprehensive setup
|
||||||
|
- **Static Assets**: Tailwind CSS integration, `static/` and `staticfiles/`
|
||||||
|
- **Media Handling**: Organized `media/` directory with park/ride subdirectories
|
||||||
|
|
||||||
|
## Proposed Monorepo Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
thrillwiki-monorepo/
|
||||||
|
├── README.md
|
||||||
|
├── pyproject.toml # Python dependencies (backend only)
|
||||||
|
├── package.json # Node.js dependencies (monorepo coordination)
|
||||||
|
├── pnpm-workspace.yaml # pnpm workspace configuration
|
||||||
|
├── .env.example
|
||||||
|
├── .gitignore
|
||||||
|
├──
|
||||||
|
├── backend/ # Django Backend
|
||||||
|
│ ├── manage.py
|
||||||
|
│ ├── pyproject.toml # Backend-specific dependencies
|
||||||
|
│ ├── config/
|
||||||
|
│ │ ├── django/
|
||||||
|
│ │ │ ├── base.py
|
||||||
|
│ │ │ ├── local.py
|
||||||
|
│ │ │ ├── production.py
|
||||||
|
│ │ │ └── test.py
|
||||||
|
│ │ └── settings/
|
||||||
|
│ │ ├── database.py
|
||||||
|
│ │ ├── email.py
|
||||||
|
│ │ └── security.py
|
||||||
|
│ ├── thrillwiki/
|
||||||
|
│ │ ├── __init__.py
|
||||||
|
│ │ ├── urls.py
|
||||||
|
│ │ ├── wsgi.py
|
||||||
|
│ │ ├── asgi.py
|
||||||
|
│ │ └── views.py
|
||||||
|
│ ├── apps/ # Django apps
|
||||||
|
│ │ ├── accounts/
|
||||||
|
│ │ ├── parks/
|
||||||
|
│ │ ├── rides/
|
||||||
|
│ │ ├── moderation/
|
||||||
|
│ │ ├── location/
|
||||||
|
│ │ ├── media/
|
||||||
|
│ │ ├── email_service/
|
||||||
|
│ │ └── core/
|
||||||
|
│ ├── templates/ # Django templates (API responses, admin)
|
||||||
|
│ ├── static/ # Backend static files
|
||||||
|
│ │ └── admin/ # Django admin assets
|
||||||
|
│ ├── media/ # User uploads
|
||||||
|
│ │ ├── avatars/
|
||||||
|
│ │ ├── park/
|
||||||
|
│ │ └── submissions/
|
||||||
|
│ └── tests/ # Backend tests
|
||||||
|
│
|
||||||
|
├── frontend/ # Vue.js Frontend
|
||||||
|
│ ├── package.json
|
||||||
|
│ ├── pnpm-lock.yaml
|
||||||
|
│ ├── vite.config.js
|
||||||
|
│ ├── tailwind.config.js
|
||||||
|
│ ├── index.html
|
||||||
|
│ ├── src/
|
||||||
|
│ │ ├── main.js
|
||||||
|
│ │ ├── App.vue
|
||||||
|
│ │ ├── router/
|
||||||
|
│ │ │ └── index.js
|
||||||
|
│ │ ├── stores/ # Pinia/Vuex stores
|
||||||
|
│ │ │ ├── auth.js
|
||||||
|
│ │ │ ├── parks.js
|
||||||
|
│ │ │ └── rides.js
|
||||||
|
│ │ ├── components/
|
||||||
|
│ │ │ ├── common/ # Shared components
|
||||||
|
│ │ │ ├── parks/ # Park-specific components
|
||||||
|
│ │ │ ├── rides/ # Ride-specific components
|
||||||
|
│ │ │ └── moderation/ # Moderation components
|
||||||
|
│ │ ├── views/ # Page components
|
||||||
|
│ │ │ ├── Home.vue
|
||||||
|
│ │ │ ├── parks/
|
||||||
|
│ │ │ ├── rides/
|
||||||
|
│ │ │ └── auth/
|
||||||
|
│ │ ├── composables/ # Vue 3 composables
|
||||||
|
│ │ │ ├── useAuth.js
|
||||||
|
│ │ │ ├── useApi.js
|
||||||
|
│ │ │ └── useTheme.js
|
||||||
|
│ │ ├── services/ # API service layer
|
||||||
|
│ │ │ ├── api.js
|
||||||
|
│ │ │ ├── auth.js
|
||||||
|
│ │ │ ├── parks.js
|
||||||
|
│ │ │ └── rides.js
|
||||||
|
│ │ ├── assets/
|
||||||
|
│ │ │ ├── images/
|
||||||
|
│ │ │ └── styles/
|
||||||
|
│ │ │ ├── globals.css
|
||||||
|
│ │ │ └── components/
|
||||||
|
│ │ └── utils/
|
||||||
|
│ ├── public/
|
||||||
|
│ │ ├── favicon.ico
|
||||||
|
│ │ └── images/
|
||||||
|
│ ├── dist/ # Build output
|
||||||
|
│ └── tests/ # Frontend tests
|
||||||
|
│ ├── unit/
|
||||||
|
│ └── e2e/
|
||||||
|
│
|
||||||
|
├── shared/ # Shared Resources
|
||||||
|
│ ├── docs/ # Documentation
|
||||||
|
│ │ ├── api/ # API documentation
|
||||||
|
│ │ ├── deployment/ # Deployment guides
|
||||||
|
│ │ └── development/ # Development setup
|
||||||
|
│ ├── scripts/ # Build and deployment scripts
|
||||||
|
│ │ ├── dev/
|
||||||
|
│ │ │ ├── start-backend.sh
|
||||||
|
│ │ │ ├── start-frontend.sh
|
||||||
|
│ │ │ └── start-full-stack.sh
|
||||||
|
│ │ ├── build/
|
||||||
|
│ │ │ ├── build-frontend.sh
|
||||||
|
│ │ │ └── build-production.sh
|
||||||
|
│ │ ├── deploy/
|
||||||
|
│ │ └── utils/
|
||||||
|
│ ├── config/ # Shared configuration
|
||||||
|
│ │ ├── docker/
|
||||||
|
│ │ │ ├── Dockerfile.backend
|
||||||
|
│ │ │ ├── Dockerfile.frontend
|
||||||
|
│ │ │ └── docker-compose.yml
|
||||||
|
│ │ ├── nginx/
|
||||||
|
│ │ └── ci/ # CI/CD configuration
|
||||||
|
│ │ └── github-actions/
|
||||||
|
│ └── types/ # Shared TypeScript types
|
||||||
|
│ ├── api.ts
|
||||||
|
│ ├── parks.ts
|
||||||
|
│ └── rides.ts
|
||||||
|
│
|
||||||
|
├── logs/ # Application logs
|
||||||
|
├── backups/ # Database backups
|
||||||
|
├── uploads/ # Temporary upload directory
|
||||||
|
└── dist/ # Production build output
|
||||||
|
├── backend/ # Django static files
|
||||||
|
└── frontend/ # Vue.js build
|
||||||
|
```
|
||||||
|
|
||||||
|
## Directory Organization Rationale
|
||||||
|
|
||||||
|
### 1. Clear Separation of Concerns
|
||||||
|
- **backend/**: Contains all Django-related code, maintaining existing app structure
|
||||||
|
- **frontend/**: Vue.js application with modern structure (Vite + Vue 3)
|
||||||
|
- **shared/**: Common resources, documentation, and configuration
|
||||||
|
|
||||||
|
### 2. Backend Structure (`backend/`)
|
||||||
|
- Preserves existing Django app organization under `apps/`
|
||||||
|
- Maintains UV-based Python dependency management
|
||||||
|
- Keeps configuration structure with `config/django/` and `config/settings/`
|
||||||
|
- Separates templates for API responses vs. frontend UI
|
||||||
|
|
||||||
|
### 3. Frontend Structure (`frontend/`)
|
||||||
|
- Modern Vue 3 + Vite setup with TypeScript support
|
||||||
|
- Organized by feature areas (parks, rides, auth)
|
||||||
|
- Composables for Vue 3 Composition API patterns
|
||||||
|
- Service layer for API communication with Django backend
|
||||||
|
- Tailwind CSS integration with shared design system
|
||||||
|
|
||||||
|
### 4. Shared Resources (`shared/`)
|
||||||
|
- Centralized documentation and deployment scripts
|
||||||
|
- Docker configuration for containerized deployment
|
||||||
|
- TypeScript type definitions shared between frontend and API
|
||||||
|
- CI/CD pipeline configuration
|
||||||
|
|
||||||
|
## Static File Strategy
|
||||||
|
|
||||||
|
### Development
|
||||||
|
```mermaid
|
||||||
|
graph LR
|
||||||
|
A[Vue Dev Server :3000] --> B[Vite HMR]
|
||||||
|
C[Django Dev Server :8000] --> D[Django Static Files]
|
||||||
|
E[Tailwind CSS] --> F[Both Frontend & Backend]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Production
|
||||||
|
```mermaid
|
||||||
|
graph LR
|
||||||
|
A[Vue Build] --> B[dist/frontend/]
|
||||||
|
C[Django Collectstatic] --> D[dist/backend/]
|
||||||
|
E[Nginx] --> F[Serves Both]
|
||||||
|
F --> G[Frontend Assets]
|
||||||
|
F --> H[API Endpoints]
|
||||||
|
F --> I[Media Files]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Details
|
||||||
|
|
||||||
|
1. **Development Mode**:
|
||||||
|
- Frontend: Vite dev server on port 3000 with HMR
|
||||||
|
- Backend: Django dev server on port 8000
|
||||||
|
- Proxy API calls from frontend to backend
|
||||||
|
|
||||||
|
2. **Production Mode**:
|
||||||
|
- Frontend built to `dist/frontend/`
|
||||||
|
- Django static files collected to `dist/backend/`
|
||||||
|
- Nginx serves static files and proxies API calls
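For reference, a minimal sketch of the Django settings this layout implies is shown below; the paths and settings module are assumptions drawn from the diagrams above, not the committed configuration.

```python
# backend/config/django/production.py (sketch; paths are assumptions)
from pathlib import Path

# production.py sits at backend/config/django/, so the repo root is parents[3]
BASE_DIR = Path(__file__).resolve().parents[3]

STATIC_URL = "/static/"
STATIC_ROOT = BASE_DIR / "dist" / "backend"  # collectstatic output served by Nginx

MEDIA_URL = "/media/"
MEDIA_ROOT = BASE_DIR / "media"  # shared uploads directory
```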
|
||||||
|
|
||||||
|
## Media File Management
|
||||||
|
|
||||||
|
### Current Structure Preservation
|
||||||
|
```
|
||||||
|
media/
|
||||||
|
├── avatars/ # User profile images
|
||||||
|
├── park/ # Park-specific media
|
||||||
|
│ ├── {park-slug}/
|
||||||
|
│ │ └── {ride-slug}/
|
||||||
|
└── submissions/ # User-submitted content
|
||||||
|
└── photos/
|
||||||
|
```
|
||||||
|
|
||||||
|
### Strategy
|
||||||
|
- **Development**: Django serves media files directly
|
||||||
|
- **Production**: CDN or object storage (S3/Cloudflare) integration
|
||||||
|
- **Frontend Access**: Media URLs provided via API responses
|
||||||
|
- **Upload Handling**: Django handles all file uploads, Vue.js provides UI
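As a rough illustration of the "media URLs via API responses" point, a DRF serializer could expose an absolute URL for each file; the `Photo.image` field name here is an assumption for illustration, not the app's actual schema.

```python
from rest_framework import serializers

from apps.media.models import Photo  # field names below are assumed for illustration


class PhotoSerializer(serializers.ModelSerializer):
    url = serializers.SerializerMethodField()

    class Meta:
        model = Photo
        fields = ["id", "url"]

    def get_url(self, obj):
        # Absolute URL so the Vue SPA can load the file regardless of origin
        request = self.context.get("request")
        return request.build_absolute_uri(obj.image.url) if request else obj.image.url
```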
|
||||||
|
|
||||||
|
## Development Workflow Integration
|
||||||
|
|
||||||
|
### Package Management
|
||||||
|
- **Root**: Node.js dependencies for frontend and tooling (using pnpm)
|
||||||
|
- **Backend**: UV for Python dependencies (existing approach)
|
||||||
|
- **Frontend**: pnpm for Vue.js dependencies
|
||||||
|
|
||||||
|
### Development Scripts
|
||||||
|
```bash
|
||||||
|
# Root level scripts
|
||||||
|
pnpm run dev # Start both backend and frontend
|
||||||
|
pnpm run dev:backend # Start only Django
|
||||||
|
pnpm run dev:frontend # Start only Vue.js
|
||||||
|
pnpm run build # Build for production
|
||||||
|
pnpm run test # Run all tests
|
||||||
|
|
||||||
|
# Backend specific (using UV)
|
||||||
|
cd backend && uv run manage.py runserver
|
||||||
|
cd backend && uv run manage.py test
|
||||||
|
|
||||||
|
# Frontend specific
|
||||||
|
cd frontend && pnpm run dev
|
||||||
|
cd frontend && pnpm run build
|
||||||
|
cd frontend && pnpm run test
|
||||||
|
```
|
||||||
|
|
||||||
|
### Environment Configuration
|
||||||
|
```bash
|
||||||
|
# Root .env (shared settings)
|
||||||
|
DATABASE_URL=
|
||||||
|
REDIS_URL=
|
||||||
|
SECRET_KEY=
|
||||||
|
|
||||||
|
# Backend .env (Django specific)
|
||||||
|
DJANGO_SETTINGS_MODULE=config.django.local
|
||||||
|
DEBUG=True
|
||||||
|
|
||||||
|
# Frontend .env (Vue specific)
|
||||||
|
VITE_API_BASE_URL=http://localhost:8000/api
|
||||||
|
VITE_APP_TITLE=ThrillWiki
|
||||||
|
```
|
||||||
|
|
||||||
|
### Package Manager Configuration
|
||||||
|
|
||||||
|
#### Root pnpm-workspace.yaml
|
||||||
|
```yaml
|
||||||
|
packages:
|
||||||
|
- 'frontend'
|
||||||
|
# Backend is managed separately with uv
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Root package.json
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-monorepo",
|
||||||
|
"private": true,
|
||||||
|
"packageManager": "pnpm@9.0.0",
|
||||||
|
"scripts": {
|
||||||
|
"dev": "concurrently \"pnpm run dev:backend\" \"pnpm run dev:frontend\"",
|
||||||
|
"dev:backend": "cd backend && uv run manage.py runserver",
|
||||||
|
"dev:frontend": "cd frontend && pnpm run dev",
|
||||||
|
"build": "pnpm run build:frontend && cd backend && uv run manage.py collectstatic --noinput",
|
||||||
|
"build:frontend": "cd frontend && pnpm run build",
|
||||||
|
"test": "pnpm run test:backend && pnpm run test:frontend",
|
||||||
|
"test:backend": "cd backend && uv run manage.py test",
|
||||||
|
"test:frontend": "cd frontend && pnpm run test",
|
||||||
|
"lint": "cd frontend && pnpm run lint && cd ../backend && uv run flake8 .",
|
||||||
|
"format": "cd frontend && pnpm run format && cd ../backend && uv run black ."
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"concurrently": "^8.2.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Frontend package.json
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-frontend",
|
||||||
|
"private": true,
|
||||||
|
"version": "0.1.0",
|
||||||
|
"type": "module",
|
||||||
|
"scripts": {
|
||||||
|
"dev": "vite",
|
||||||
|
"build": "vite build",
|
||||||
|
"preview": "vite preview",
|
||||||
|
"test": "vitest",
|
||||||
|
"test:e2e": "playwright test",
|
||||||
|
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix",
|
||||||
|
"format": "prettier --write src/",
|
||||||
|
"type-check": "vue-tsc --noEmit"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"vue": "^3.4.0",
|
||||||
|
"vue-router": "^4.3.0",
|
||||||
|
"pinia": "^2.1.0",
|
||||||
|
"axios": "^1.6.0"
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"@vitejs/plugin-vue": "^5.0.0",
|
||||||
|
"vite": "^5.0.0",
|
||||||
|
"vue-tsc": "^2.0.0",
|
||||||
|
"typescript": "^5.3.0",
|
||||||
|
"tailwindcss": "^3.4.0",
|
||||||
|
"autoprefixer": "^10.4.0",
|
||||||
|
"postcss": "^8.4.0",
|
||||||
|
"eslint": "^8.57.0",
|
||||||
|
"prettier": "^3.2.0",
|
||||||
|
"vitest": "^1.3.0",
|
||||||
|
"@playwright/test": "^1.42.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## File Migration Mapping
|
||||||
|
|
||||||
|
### High-Level Moves
|
||||||
|
```
|
||||||
|
Current → New Location
|
||||||
|
├── manage.py → backend/manage.py
|
||||||
|
├── pyproject.toml → backend/pyproject.toml (+ root package.json)
|
||||||
|
├── config/ → backend/config/
|
||||||
|
├── thrillwiki/ → backend/thrillwiki/
|
||||||
|
├── accounts/ → backend/apps/accounts/
|
||||||
|
├── parks/ → backend/apps/parks/
|
||||||
|
├── rides/ → backend/apps/rides/
|
||||||
|
├── moderation/ → backend/apps/moderation/
|
||||||
|
├── location/ → backend/apps/location/
|
||||||
|
├── media/ (Django app) → backend/apps/media/
|
||||||
|
├── email_service/ → backend/apps/email_service/
|
||||||
|
├── core/ → backend/apps/core/
|
||||||
|
├── templates/ → backend/templates/ (API) + frontend/src/views/ (UI)
|
||||||
|
├── static/ → backend/static/ (admin) + frontend/src/assets/
|
||||||
|
├── media/ (uploaded files) → media/ (shared, accessible to both)
|
||||||
|
├── scripts/ → shared/scripts/
|
||||||
|
├── docs/ → shared/docs/
|
||||||
|
├── tests/ → backend/tests/ + frontend/tests/
|
||||||
|
└── staticfiles/ → dist/backend/ (generated)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Detailed Backend App Moves
|
||||||
|
Each Django app moves to `backend/apps/{app_name}/` with structure preserved:
|
||||||
|
- Models, views, serializers stay the same
|
||||||
|
- Templates for API responses remain in app directories
|
||||||
|
- Static files move to frontend if UI-related
|
||||||
|
- Tests remain with respective apps
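Because each app's dotted path changes from `parks` to `apps.parks`, the corresponding `AppConfig.name` and settings entries need updating; a minimal sketch for one app (the `LOCAL_APPS` grouping is illustrative):

```python
# backend/apps/parks/apps.py
from django.apps import AppConfig


class ParksConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.parks"  # dotted path reflects the new backend/apps/ location


# config/django/base.py (fragment)
LOCAL_APPS = [
    "apps.accounts",
    "apps.parks",
    "apps.rides",
    "apps.core",
]
```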
|
||||||
|
|
||||||
|
## Build and Deployment Strategy
|
||||||
|
|
||||||
|
### Development Build Process
|
||||||
|
1. **Backend**: No build step, runs directly with Django dev server
|
||||||
|
2. **Frontend**: Vite development server with HMR
|
||||||
|
3. **Shared**: Scripts orchestrate starting both services
|
||||||
|
|
||||||
|
### Production Build Process
|
||||||
|
```mermaid
|
||||||
|
graph TD
|
||||||
|
A[CI/CD Trigger] --> B[Install Dependencies]
|
||||||
|
B --> C[Build Frontend]
|
||||||
|
B --> D[Collect Django Static]
|
||||||
|
C --> E[Generate Frontend Bundle]
|
||||||
|
D --> F[Collect Backend Assets]
|
||||||
|
E --> G[Create Docker Images]
|
||||||
|
F --> G
|
||||||
|
G --> H[Deploy to Production]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Container Strategy
|
||||||
|
- **Multi-stage Docker builds**: Separate backend and frontend images
|
||||||
|
- **Nginx**: Reverse proxy and static file serving
|
||||||
|
- **Volume mounts**: For media files and logs
|
||||||
|
- **Environment-based configuration**: Development vs. production
|
||||||
|
|
||||||
|
## API Integration Strategy
|
||||||
|
|
||||||
|
### Backend API Structure
|
||||||
|
```python
|
||||||
|
# Enhanced DRF setup for SPA
|
||||||
|
REST_FRAMEWORK = {
|
||||||
|
'DEFAULT_RENDERER_CLASSES': [
|
||||||
|
'rest_framework.renderers.JSONRenderer',
|
||||||
|
],
|
||||||
|
'DEFAULT_AUTHENTICATION_CLASSES': [
|
||||||
|
'rest_framework.authentication.SessionAuthentication',
|
||||||
|
'rest_framework.authentication.TokenAuthentication',
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
# CORS for development
|
||||||
|
CORS_ALLOWED_ORIGINS = [
|
||||||
|
"http://localhost:3000", # Vue dev server
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend API Service
|
||||||
|
```javascript
|
||||||
|
// API service with auth integration
import axios from 'axios';

class ApiService {
|
||||||
|
constructor() {
|
||||||
|
this.client = axios.create({
|
||||||
|
baseURL: import.meta.env.VITE_API_BASE_URL,
|
||||||
|
withCredentials: true,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
// Park operations
|
||||||
|
getParks(params = {}) {
|
||||||
|
return this.client.get('/parks/', { params });
|
||||||
|
}
|
||||||
|
|
||||||
|
// Ride operations
|
||||||
|
getRides(parkId, params = {}) {
|
||||||
|
return this.client.get(`/parks/${parkId}/rides/`, { params });
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Configuration Management
|
||||||
|
|
||||||
|
### Shared Environment Variables
|
||||||
|
- Database connections
|
||||||
|
- Redis/Cache settings
|
||||||
|
- Secret keys and API keys
|
||||||
|
- Feature flags
|
||||||
|
|
||||||
|
### Application-Specific Settings
|
||||||
|
- **Django**: `backend/config/django/`
|
||||||
|
- **Vue.js**: `frontend/.env` files
|
||||||
|
- **Docker**: `shared/config/docker/`
|
||||||
|
|
||||||
|
### Development vs. Production
|
||||||
|
- Development: Multiple local servers, hot reloading
|
||||||
|
- Production: Containerized deployment, CDN integration
|
||||||
|
|
||||||
|
## Benefits of This Structure
|
||||||
|
|
||||||
|
1. **Clear Separation**: Backend and frontend concerns are clearly separated
|
||||||
|
2. **Scalability**: Each part can be developed, tested, and deployed independently
|
||||||
|
3. **Modern Workflow**: Supports latest Vue 3, Vite, and Django patterns
|
||||||
|
4. **Backward Compatibility**: Preserves existing Django app structure
|
||||||
|
5. **Developer Experience**: Hot reloading, TypeScript support, modern tooling
|
||||||
|
6. **Deployment Flexibility**: Can deploy as SPA + API or traditional Django
|
||||||
|
|
||||||
|
## Implementation Phases
|
||||||
|
|
||||||
|
### Phase 1: Structure Setup
|
||||||
|
1. Create new directory structure
|
||||||
|
2. Move Django code to `backend/`
|
||||||
|
3. Initialize Vue.js frontend
|
||||||
|
4. Set up basic API integration
|
||||||
|
|
||||||
|
### Phase 2: Frontend Development
|
||||||
|
1. Create Vue.js components for existing Django templates
|
||||||
|
2. Implement routing and state management
|
||||||
|
3. Integrate with Django API endpoints
|
||||||
|
4. Add authentication flow
|
||||||
|
|
||||||
|
### Phase 3: Build & Deploy
|
||||||
|
1. Set up build processes
|
||||||
|
2. Configure CI/CD pipelines
|
||||||
|
3. Implement production deployment
|
||||||
|
4. Performance optimization
|
||||||
|
|
||||||
|
## Considerations and Trade-offs
|
||||||
|
|
||||||
|
### Advantages
|
||||||
|
- Modern development experience
|
||||||
|
- Better code organization
|
||||||
|
- Independent scaling
|
||||||
|
- Rich frontend interactions
|
||||||
|
- API-first architecture
|
||||||
|
|
||||||
|
### Challenges
|
||||||
|
- Increased complexity
|
||||||
|
- Build process coordination
|
||||||
|
- Authentication across services
|
||||||
|
- SEO considerations (if needed)
|
||||||
|
- Development environment setup
|
||||||
|
|
||||||
|
## Next Steps
|
||||||
|
|
||||||
|
1. **Validate Architecture**: Review with development team
|
||||||
|
2. **Prototype Setup**: Create basic structure with sample components
|
||||||
|
3. **Migration Planning**: Detailed plan for moving existing code
|
||||||
|
4. **Tool Selection**: Finalize Vue.js ecosystem choices (Pinia vs. Vuex, etc.)
|
||||||
|
5. **Implementation**: Begin phase-by-phase migration
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
This architecture provides a solid foundation for migrating ThrillWiki to a modern Django + Vue.js monorepo while preserving existing functionality and enabling future growth.
|
||||||
31
backend/.env.example
Normal file
@@ -0,0 +1,31 @@
|
|||||||
|
# Django Configuration
|
||||||
|
SECRET_KEY=your-secret-key-here
|
||||||
|
DEBUG=True
|
||||||
|
DJANGO_SETTINGS_MODULE=config.django.local
|
||||||
|
|
||||||
|
# Database
|
||||||
|
DATABASE_URL=postgresql://user:password@localhost:5432/thrillwiki
|
||||||
|
|
||||||
|
# Redis
|
||||||
|
REDIS_URL=redis://localhost:6379
|
||||||
|
|
||||||
|
# Email Configuration (Optional)
|
||||||
|
EMAIL_HOST=smtp.gmail.com
|
||||||
|
EMAIL_PORT=587
|
||||||
|
EMAIL_USE_TLS=True
|
||||||
|
EMAIL_HOST_USER=your-email@gmail.com
|
||||||
|
EMAIL_HOST_PASSWORD=your-app-password
|
||||||
|
|
||||||
|
# Media and Static Files
|
||||||
|
MEDIA_URL=/media/
|
||||||
|
STATIC_URL=/static/
|
||||||
|
|
||||||
|
# Security
|
||||||
|
ALLOWED_HOSTS=localhost,127.0.0.1
|
||||||
|
|
||||||
|
# API Configuration
|
||||||
|
CORS_ALLOWED_ORIGINS=http://localhost:3000
|
||||||
|
|
||||||
|
# Feature Flags
|
||||||
|
ENABLE_DEBUG_TOOLBAR=True
|
||||||
|
ENABLE_SILK_PROFILER=False
|
||||||
229
backend/README.md
Normal file
@@ -0,0 +1,229 @@
|
|||||||
|
# ThrillWiki Backend
|
||||||
|
|
||||||
|
Django REST API backend for the ThrillWiki monorepo.
|
||||||
|
|
||||||
|
## 🏗️ Architecture
|
||||||
|
|
||||||
|
This backend follows Django best practices with a modular app structure:
|
||||||
|
|
||||||
|
```
|
||||||
|
backend/
|
||||||
|
├── apps/ # Django applications
|
||||||
|
│ ├── accounts/ # User management
|
||||||
|
│ ├── parks/ # Theme park data
|
||||||
|
│ ├── rides/ # Ride information
|
||||||
|
│ ├── moderation/ # Content moderation
|
||||||
|
│ ├── location/ # Geographic data
|
||||||
|
│ ├── media/ # File management
|
||||||
|
│ ├── email_service/ # Email functionality
|
||||||
|
│ └── core/ # Core utilities
|
||||||
|
├── config/ # Django configuration
|
||||||
|
│ ├── django/ # Settings files
|
||||||
|
│ └── settings/ # Modular settings
|
||||||
|
├── templates/ # Django templates
|
||||||
|
├── static/ # Static files
|
||||||
|
└── tests/ # Test files
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🛠️ Technology Stack
|
||||||
|
|
||||||
|
- **Django 5.0+** - Web framework
|
||||||
|
- **Django REST Framework** - API framework
|
||||||
|
- **PostgreSQL** - Primary database
|
||||||
|
- **Redis** - Caching and sessions
|
||||||
|
- **UV** - Python package management
|
||||||
|
- **Celery** - Background task processing
|
||||||
|
|
||||||
|
## 🚀 Quick Start
|
||||||
|
|
||||||
|
### Prerequisites
|
||||||
|
|
||||||
|
- Python 3.11+
|
||||||
|
- [uv](https://docs.astral.sh/uv/) package manager
|
||||||
|
- PostgreSQL 14+
|
||||||
|
- Redis 6+
|
||||||
|
|
||||||
|
### Setup
|
||||||
|
|
||||||
|
1. **Install dependencies**
|
||||||
|
```bash
|
||||||
|
cd backend
|
||||||
|
uv sync
|
||||||
|
```
|
||||||
|
|
||||||
|
2. **Environment configuration**
|
||||||
|
```bash
|
||||||
|
cp .env.example .env
|
||||||
|
# Edit .env with your settings
|
||||||
|
```
|
||||||
|
|
||||||
|
3. **Database setup**
|
||||||
|
```bash
|
||||||
|
uv run manage.py migrate
|
||||||
|
uv run manage.py createsuperuser
|
||||||
|
```
|
||||||
|
|
||||||
|
4. **Start development server**
|
||||||
|
```bash
|
||||||
|
uv run manage.py runserver
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🔧 Configuration
|
||||||
|
|
||||||
|
### Environment Variables
|
||||||
|
|
||||||
|
Required environment variables:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Database
|
||||||
|
DATABASE_URL=postgresql://user:pass@localhost/thrillwiki
|
||||||
|
|
||||||
|
# Django
|
||||||
|
SECRET_KEY=your-secret-key
|
||||||
|
DEBUG=True
|
||||||
|
DJANGO_SETTINGS_MODULE=config.django.local
|
||||||
|
|
||||||
|
# Redis
|
||||||
|
REDIS_URL=redis://localhost:6379
|
||||||
|
|
||||||
|
# Email (optional)
|
||||||
|
EMAIL_HOST=smtp.gmail.com
|
||||||
|
EMAIL_PORT=587
|
||||||
|
EMAIL_USE_TLS=True
|
||||||
|
EMAIL_HOST_USER=your-email@gmail.com
|
||||||
|
EMAIL_HOST_PASSWORD=your-app-password
|
||||||
|
```
|
||||||
|
|
||||||
|
### Settings Structure
|
||||||
|
|
||||||
|
- `config/django/base.py` - Base settings
|
||||||
|
- `config/django/local.py` - Development settings
|
||||||
|
- `config/django/production.py` - Production settings
|
||||||
|
- `config/django/test.py` - Test settings
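A minimal sketch of how `local.py` might layer on top of `base.py`; the specific values are assumptions, only the override pattern matters here.

```python
# config/django/local.py
from .base import *  # noqa: F401,F403

DEBUG = True
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]

# Allow the Vite dev server on port 3000 during development
CORS_ALLOWED_ORIGINS = ["http://localhost:3000"]
```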
|
||||||
|
|
||||||
|
## 📁 Apps Overview
|
||||||
|
|
||||||
|
### Core Apps
|
||||||
|
|
||||||
|
- **accounts** - User authentication and profile management
|
||||||
|
- **parks** - Theme park models and operations
|
||||||
|
- **rides** - Ride information and relationships
|
||||||
|
- **core** - Shared utilities and base classes
|
||||||
|
|
||||||
|
### Support Apps
|
||||||
|
|
||||||
|
- **moderation** - Content moderation workflows
|
||||||
|
- **location** - Geographic data and services
|
||||||
|
- **media** - File upload and management
|
||||||
|
- **email_service** - Email sending and templates
|
||||||
|
|
||||||
|
## 🔌 API Endpoints
|
||||||
|
|
||||||
|
Base URL: `http://localhost:8000/api/`
|
||||||
|
|
||||||
|
### Authentication
|
||||||
|
- `POST /auth/login/` - User login
|
||||||
|
- `POST /auth/logout/` - User logout
|
||||||
|
- `POST /auth/register/` - User registration
|
||||||
|
|
||||||
|
### Parks
|
||||||
|
- `GET /parks/` - List parks
|
||||||
|
- `GET /parks/{id}/` - Park details
|
||||||
|
- `POST /parks/` - Create park (admin)
|
||||||
|
|
||||||
|
### Rides
|
||||||
|
- `GET /rides/` - List rides
|
||||||
|
- `GET /rides/{id}/` - Ride details
|
||||||
|
- `GET /parks/{park_id}/rides/` - Rides by park
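A quick way to exercise these endpoints from Python against a local dev server; the response shape is assumed (DRF list endpoints may return a paginated dict rather than a bare list).

```python
import requests

BASE_URL = "http://localhost:8000/api"

resp = requests.get(f"{BASE_URL}/parks/", timeout=10)
resp.raise_for_status()
data = resp.json()
parks = data["results"] if isinstance(data, dict) else data  # handle pagination either way
for park in parks:
    print(park.get("id"), park.get("name"))
```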
|
||||||
|
|
||||||
|
## 🧪 Testing
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Run all tests
|
||||||
|
uv run manage.py test
|
||||||
|
|
||||||
|
# Run specific app tests
|
||||||
|
uv run manage.py test apps.parks
|
||||||
|
|
||||||
|
# Run with coverage
|
||||||
|
uv run coverage run manage.py test
|
||||||
|
uv run coverage report
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🔧 Management Commands
|
||||||
|
|
||||||
|
Custom management commands:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Import park data
|
||||||
|
uv run manage.py import_parks data/parks.json
|
||||||
|
|
||||||
|
# Generate test data
|
||||||
|
uv run manage.py generate_test_data
|
||||||
|
|
||||||
|
# Clean up expired sessions
|
||||||
|
uv run manage.py clearsessions
|
||||||
|
```
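The `import_parks` command itself is not included in this commit, so the sketch below only illustrates the likely shape of such a command (argument parsing plus a simple load loop); field handling is deliberately simplified.

```python
import json

from django.core.management.base import BaseCommand

from apps.parks.models import Park


class Command(BaseCommand):
    help = "Import parks from a JSON file"

    def add_arguments(self, parser):
        parser.add_argument("path", help="Path to a JSON file of parks")

    def handle(self, *args, **options):
        with open(options["path"]) as fh:
            rows = json.load(fh)
        created = 0
        for row in rows:
            # Simplified: ignores required related fields such as operator
            _, was_created = Park.objects.get_or_create(name=row["name"])
            created += int(was_created)
        self.stdout.write(self.style.SUCCESS(f"Imported {created} new parks"))
```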
|
||||||
|
|
||||||
|
## 📊 Database
|
||||||
|
|
||||||
|
### Entity Relationships
|
||||||
|
|
||||||
|
- **Parks** have Operators (required) and PropertyOwners (optional)
|
||||||
|
- **Rides** belong to Parks and may have Manufacturers/Designers
|
||||||
|
- **Users** can create submissions and moderate content
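Illustrative sketch of those relationships as Django models; the field and related-model names are assumptions for readability, not the actual schema.

```python
from django.db import models


class Park(models.Model):
    name = models.CharField(max_length=255)
    operator = models.ForeignKey("parks.Operator", on_delete=models.PROTECT)  # required
    property_owner = models.ForeignKey(  # optional
        "parks.PropertyOwner", null=True, blank=True, on_delete=models.SET_NULL
    )


class Ride(models.Model):
    name = models.CharField(max_length=255)
    park = models.ForeignKey(Park, related_name="rides", on_delete=models.CASCADE)
    manufacturer = models.ForeignKey(  # optional
        "rides.Manufacturer", null=True, blank=True, on_delete=models.SET_NULL
    )
```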
|
||||||
|
|
||||||
|
### Migrations
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Create migrations
|
||||||
|
uv run manage.py makemigrations
|
||||||
|
|
||||||
|
# Apply migrations
|
||||||
|
uv run manage.py migrate
|
||||||
|
|
||||||
|
# Show migration status
|
||||||
|
uv run manage.py showmigrations
|
||||||
|
```
|
||||||
|
|
||||||
|
## 🔐 Security
|
||||||
|
|
||||||
|
- CORS configured for frontend integration
|
||||||
|
- CSRF protection enabled
|
||||||
|
- JWT token authentication
|
||||||
|
- Rate limiting on API endpoints
|
||||||
|
- Input validation and sanitization
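One plausible way to back the rate-limiting claim with plain DRF throttles; the rates are placeholders, not the project's actual limits.

```python
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.AnonRateThrottle",
        "rest_framework.throttling.UserRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "anon": "100/hour",   # placeholder rate
        "user": "1000/hour",  # placeholder rate
    },
}
```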
|
||||||
|
|
||||||
|
## 📈 Performance
|
||||||
|
|
||||||
|
- Database query optimization
|
||||||
|
- Redis caching for frequent queries
|
||||||
|
- Background task processing with Celery
|
||||||
|
- Database connection pooling
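A minimal Redis cache configuration matching the description above, assuming the `django-redis` backend is among the dependencies.

```python
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://localhost:6379/1",
        "OPTIONS": {"CLIENT_CLASS": "django_redis.client.DefaultClient"},
    }
}

# Store sessions in the same Redis cache
SESSION_ENGINE = "django.contrib.sessions.backends.cache"
SESSION_CACHE_ALIAS = "default"
```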
|
||||||
|
|
||||||
|
## 🚀 Deployment
|
||||||
|
|
||||||
|
See the [Deployment Guide](../shared/docs/deployment/) for production setup.
|
||||||
|
|
||||||
|
## 🐛 Debugging
|
||||||
|
|
||||||
|
### Development Tools
|
||||||
|
|
||||||
|
- Django Debug Toolbar
|
||||||
|
- Django Extensions
|
||||||
|
- Silk profiler for performance analysis
|
||||||
|
|
||||||
|
### Logging
|
||||||
|
|
||||||
|
Logs are written to:
|
||||||
|
- Console (development)
|
||||||
|
- Files in `logs/` directory (production)
|
||||||
|
- External logging service (production)
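A minimal `LOGGING` sketch for the console-plus-file split described above; handler names and the log path are placeholders (the `logs/` directory must already exist).

```python
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {
        "console": {"class": "logging.StreamHandler"},
        "file": {
            "class": "logging.FileHandler",
            "filename": "logs/thrillwiki.log",
        },
    },
    "root": {"handlers": ["console", "file"], "level": "INFO"},
}
```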
|
||||||
|
|
||||||
|
## 🤝 Contributing
|
||||||
|
|
||||||
|
1. Follow Django coding standards
|
||||||
|
2. Write tests for new features
|
||||||
|
3. Update documentation
|
||||||
|
4. Run linting: `uv run flake8 .`
|
||||||
|
5. Format code: `uv run black .`
|
||||||
6
backend/apps/__init__.py
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
"""
|
||||||
|
Django apps package.
|
||||||
|
|
||||||
|
This directory contains all Django applications for the ThrillWiki backend.
|
||||||
|
Each app is self-contained and follows Django best practices.
|
||||||
|
"""
|
||||||
0
backend/apps/accounts/__init__.py
Normal file
64
backend/apps/accounts/adapters.py
Normal file
@@ -0,0 +1,64 @@
|
|||||||
|
from django.conf import settings
|
||||||
|
from allauth.account.adapter import DefaultAccountAdapter
|
||||||
|
from allauth.socialaccount.adapter import DefaultSocialAccountAdapter
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.contrib.sites.shortcuts import get_current_site
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
class CustomAccountAdapter(DefaultAccountAdapter):
|
||||||
|
def is_open_for_signup(self, request):
|
||||||
|
"""
|
||||||
|
Whether to allow sign ups.
|
||||||
|
"""
|
||||||
|
return True
|
||||||
|
|
||||||
|
def get_email_confirmation_url(self, request, emailconfirmation):
|
||||||
|
"""
|
||||||
|
Constructs the email confirmation (activation) url.
|
||||||
|
"""
|
||||||
|
get_current_site(request)
|
||||||
|
return f"{settings.LOGIN_REDIRECT_URL}verify-email?key={emailconfirmation.key}"
|
||||||
|
|
||||||
|
def send_confirmation_mail(self, request, emailconfirmation, signup):
|
||||||
|
"""
|
||||||
|
Sends the confirmation email.
|
||||||
|
"""
|
||||||
|
current_site = get_current_site(request)
|
||||||
|
activate_url = self.get_email_confirmation_url(request, emailconfirmation)
|
||||||
|
ctx = {
|
||||||
|
"user": emailconfirmation.email_address.user,
|
||||||
|
"activate_url": activate_url,
|
||||||
|
"current_site": current_site,
|
||||||
|
"key": emailconfirmation.key,
|
||||||
|
}
|
||||||
|
if signup:
|
||||||
|
email_template = "account/email/email_confirmation_signup"
|
||||||
|
else:
|
||||||
|
email_template = "account/email/email_confirmation"
|
||||||
|
self.send_mail(email_template, emailconfirmation.email_address.email, ctx)
|
||||||
|
|
||||||
|
|
||||||
|
class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
|
||||||
|
def is_open_for_signup(self, request, sociallogin):
|
||||||
|
"""
|
||||||
|
Whether to allow social account sign ups.
|
||||||
|
"""
|
||||||
|
return True
|
||||||
|
|
||||||
|
def populate_user(self, request, sociallogin, data):
|
||||||
|
"""
|
||||||
|
Hook that can be used to further populate the user instance.
|
||||||
|
"""
|
||||||
|
user = super().populate_user(request, sociallogin, data)
|
||||||
|
if sociallogin.account.provider == "discord":
|
||||||
|
user.discord_id = sociallogin.account.uid
|
||||||
|
return user
|
||||||
|
|
||||||
|
def save_user(self, request, sociallogin, form=None):
|
||||||
|
"""
|
||||||
|
Save the newly signed up social login.
|
||||||
|
"""
|
||||||
|
user = super().save_user(request, sociallogin, form)
|
||||||
|
return user
|
||||||
282
backend/apps/accounts/admin.py
Normal file
@@ -0,0 +1,282 @@
|
|||||||
|
from django.contrib import admin
|
||||||
|
from django.contrib.auth.admin import UserAdmin
|
||||||
|
from django.utils.html import format_html
|
||||||
|
from django.contrib.auth.models import Group
|
||||||
|
from .models import User, UserProfile, EmailVerification, TopList, TopListItem
|
||||||
|
|
||||||
|
|
||||||
|
class UserProfileInline(admin.StackedInline):
|
||||||
|
model = UserProfile
|
||||||
|
can_delete = False
|
||||||
|
verbose_name_plural = "Profile"
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"Personal Info",
|
||||||
|
{"fields": ("display_name", "avatar", "pronouns", "bio")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Social Media",
|
||||||
|
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ride Credits",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class TopListItemInline(admin.TabularInline):
|
||||||
|
model = TopListItem
|
||||||
|
extra = 1
|
||||||
|
fields = ("content_type", "object_id", "rank", "notes")
|
||||||
|
ordering = ("rank",)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(User)
|
||||||
|
class CustomUserAdmin(UserAdmin):
|
||||||
|
list_display = (
|
||||||
|
"username",
|
||||||
|
"email",
|
||||||
|
"get_avatar",
|
||||||
|
"get_status",
|
||||||
|
"role",
|
||||||
|
"date_joined",
|
||||||
|
"last_login",
|
||||||
|
"get_credits",
|
||||||
|
)
|
||||||
|
list_filter = (
|
||||||
|
"is_active",
|
||||||
|
"is_staff",
|
||||||
|
"role",
|
||||||
|
"is_banned",
|
||||||
|
"groups",
|
||||||
|
"date_joined",
|
||||||
|
)
|
||||||
|
search_fields = ("username", "email")
|
||||||
|
ordering = ("-date_joined",)
|
||||||
|
actions = [
|
||||||
|
"activate_users",
|
||||||
|
"deactivate_users",
|
||||||
|
"ban_users",
|
||||||
|
"unban_users",
|
||||||
|
]
|
||||||
|
inlines = [UserProfileInline]
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(None, {"fields": ("username", "password")}),
|
||||||
|
("Personal info", {"fields": ("email", "pending_email")}),
|
||||||
|
(
|
||||||
|
"Roles and Permissions",
|
||||||
|
{
|
||||||
|
"fields": ("role", "groups", "user_permissions"),
|
||||||
|
"description": (
|
||||||
|
"Role determines group membership. Groups determine permissions."
|
||||||
|
),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Status",
|
||||||
|
{
|
||||||
|
"fields": ("is_active", "is_staff", "is_superuser"),
|
||||||
|
"description": "These are automatically managed based on role.",
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ban Status",
|
||||||
|
{
|
||||||
|
"fields": ("is_banned", "ban_reason", "ban_date"),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Preferences",
|
||||||
|
{
|
||||||
|
"fields": ("theme_preference",),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
("Important dates", {"fields": ("last_login", "date_joined")}),
|
||||||
|
)
|
||||||
|
add_fieldsets = (
|
||||||
|
(
|
||||||
|
None,
|
||||||
|
{
|
||||||
|
"classes": ("wide",),
|
||||||
|
"fields": (
|
||||||
|
"username",
|
||||||
|
"email",
|
||||||
|
"password1",
|
||||||
|
"password2",
|
||||||
|
"role",
|
||||||
|
),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Avatar")
|
||||||
|
def get_avatar(self, obj):
|
||||||
|
if obj.profile.avatar:
|
||||||
|
return format_html(
|
||||||
|
'<img src="{}" width="30" height="30" style="border-radius:50%;" />',
|
||||||
|
obj.profile.avatar.url,
|
||||||
|
)
|
||||||
|
return format_html(
|
||||||
|
'<div style="width:30px; height:30px; border-radius:50%; '
|
||||||
|
"background-color:#007bff; color:white; display:flex; "
|
||||||
|
'align-items:center; justify-content:center;">{}</div>',
|
||||||
|
obj.username[0].upper(),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Status")
|
||||||
|
def get_status(self, obj):
|
||||||
|
if obj.is_banned:
|
||||||
|
return format_html('<span style="color: red;">Banned</span>')
|
||||||
|
if not obj.is_active:
|
||||||
|
return format_html('<span style="color: orange;">Inactive</span>')
|
||||||
|
if obj.is_superuser:
|
||||||
|
return format_html('<span style="color: purple;">Superuser</span>')
|
||||||
|
if obj.is_staff:
|
||||||
|
return format_html('<span style="color: blue;">Staff</span>')
|
||||||
|
return format_html('<span style="color: green;">Active</span>')
|
||||||
|
|
||||||
|
@admin.display(description="Ride Credits")
|
||||||
|
def get_credits(self, obj):
|
||||||
|
try:
|
||||||
|
profile = obj.profile
|
||||||
|
return format_html(
|
||||||
|
"RC: {}<br>DR: {}<br>FR: {}<br>WR: {}",
|
||||||
|
profile.coaster_credits,
|
||||||
|
profile.dark_ride_credits,
|
||||||
|
profile.flat_ride_credits,
|
||||||
|
profile.water_ride_credits,
|
||||||
|
)
|
||||||
|
except UserProfile.DoesNotExist:
|
||||||
|
return "-"
|
||||||
|
|
||||||
|
@admin.action(description="Activate selected users")
|
||||||
|
def activate_users(self, request, queryset):
|
||||||
|
queryset.update(is_active=True)
|
||||||
|
|
||||||
|
@admin.action(description="Deactivate selected users")
|
||||||
|
def deactivate_users(self, request, queryset):
|
||||||
|
queryset.update(is_active=False)
|
||||||
|
|
||||||
|
@admin.action(description="Ban selected users")
|
||||||
|
def ban_users(self, request, queryset):
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
queryset.update(is_banned=True, ban_date=timezone.now())
|
||||||
|
|
||||||
|
@admin.action(description="Unban selected users")
|
||||||
|
def unban_users(self, request, queryset):
|
||||||
|
queryset.update(is_banned=False, ban_date=None, ban_reason="")
|
||||||
|
|
||||||
|
def save_model(self, request, obj, form, change):
|
||||||
|
creating = not obj.pk
|
||||||
|
super().save_model(request, obj, form, change)
|
||||||
|
if creating and obj.role != User.Roles.USER:
|
||||||
|
# Ensure new user with role gets added to appropriate group
|
||||||
|
group = Group.objects.filter(name=obj.role).first()
|
||||||
|
if group:
|
||||||
|
obj.groups.add(group)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(UserProfile)
|
||||||
|
class UserProfileAdmin(admin.ModelAdmin):
|
||||||
|
list_display = (
|
||||||
|
"user",
|
||||||
|
"display_name",
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
list_filter = (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
search_fields = ("user__username", "user__email", "display_name", "bio")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"User Information",
|
||||||
|
{"fields": ("user", "display_name", "avatar", "pronouns", "bio")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Social Media",
|
||||||
|
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ride Credits",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(EmailVerification)
|
||||||
|
class EmailVerificationAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("user", "created_at", "last_sent", "is_expired")
|
||||||
|
list_filter = ("created_at", "last_sent")
|
||||||
|
search_fields = ("user__username", "user__email", "token")
|
||||||
|
readonly_fields = ("created_at", "last_sent")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
("Verification Details", {"fields": ("user", "token")}),
|
||||||
|
("Timing", {"fields": ("created_at", "last_sent")}),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Status")
|
||||||
|
def is_expired(self, obj):
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
if timezone.now() - obj.last_sent > timedelta(days=1):
|
||||||
|
return format_html('<span style="color: red;">Expired</span>')
|
||||||
|
return format_html('<span style="color: green;">Valid</span>')
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(TopList)
|
||||||
|
class TopListAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("title", "user", "category", "created_at", "updated_at")
|
||||||
|
list_filter = ("category", "created_at", "updated_at")
|
||||||
|
search_fields = ("title", "user__username", "description")
|
||||||
|
inlines = [TopListItemInline]
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"Basic Information",
|
||||||
|
{"fields": ("user", "title", "category", "description")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Timestamps",
|
||||||
|
{"fields": ("created_at", "updated_at"), "classes": ("collapse",)},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
readonly_fields = ("created_at", "updated_at")
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(TopListItem)
|
||||||
|
class TopListItemAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("top_list", "content_type", "object_id", "rank")
|
||||||
|
list_filter = ("top_list__category", "rank")
|
||||||
|
search_fields = ("top_list__title", "notes")
|
||||||
|
ordering = ("top_list", "rank")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
("List Information", {"fields": ("top_list", "rank")}),
|
||||||
|
("Item Details", {"fields": ("content_type", "object_id", "notes")}),
|
||||||
|
)
|
||||||
9
backend/apps/accounts/apps.py
Normal file
@@ -0,0 +1,9 @@
|
|||||||
|
from django.apps import AppConfig
|
||||||
|
|
||||||
|
|
||||||
|
class AccountsConfig(AppConfig):
|
||||||
|
default_auto_field = "django.db.models.BigAutoField"
|
||||||
|
name = "apps.accounts"
|
||||||
|
|
||||||
|
def ready(self):
|
||||||
|
import apps.accounts.signals # noqa
|
||||||
@@ -0,0 +1,46 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp, SocialAccount, SocialToken
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Check all social auth related tables"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Check SocialApp
|
||||||
|
self.stdout.write("\nChecking SocialApp table:")
|
||||||
|
for app in SocialApp.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {
|
||||||
|
app.pk}, Provider: {
|
||||||
|
app.provider}, Name: {
|
||||||
|
app.name}, Client ID: {
|
||||||
|
app.client_id}"
|
||||||
|
)
|
||||||
|
self.stdout.write("Sites:")
|
||||||
|
for site in app.sites.all():
|
||||||
|
self.stdout.write(f" - {site.domain}")
|
||||||
|
|
||||||
|
# Check SocialAccount
|
||||||
|
self.stdout.write("\nChecking SocialAccount table:")
|
||||||
|
for account in SocialAccount.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {
|
||||||
|
account.pk}, Provider: {
|
||||||
|
account.provider}, UID: {
|
||||||
|
account.uid}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check SocialToken
|
||||||
|
self.stdout.write("\nChecking SocialToken table:")
|
||||||
|
for token in SocialToken.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {token.pk}, Account: {token.account}, App: {token.app}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check Site
|
||||||
|
self.stdout.write("\nChecking Site table:")
|
||||||
|
for site in Site.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {site.pk}, Domain: {site.domain}, Name: {site.name}"
|
||||||
|
)
|
||||||
@@ -0,0 +1,27 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Check social app configurations"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
social_apps = SocialApp.objects.all()
|
||||||
|
|
||||||
|
if not social_apps:
|
||||||
|
self.stdout.write(self.style.ERROR("No social apps found"))
|
||||||
|
return
|
||||||
|
|
||||||
|
for app in social_apps:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f"\nProvider: {
|
||||||
|
app.provider}"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
self.stdout.write(f"Name: {app.name}")
|
||||||
|
self.stdout.write(f"Client ID: {app.client_id}")
|
||||||
|
self.stdout.write(f"Secret: {app.secret}")
|
||||||
|
self.stdout.write(
|
||||||
|
f'Sites: {", ".join(str(site.domain) for site in app.sites.all())}'
|
||||||
|
)
|
||||||
@@ -0,0 +1,28 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Clean up social auth tables and migrations"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
# Drop social auth tables
|
||||||
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialapp")
|
||||||
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialapp_sites")
|
||||||
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialaccount")
|
||||||
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialtoken")
|
||||||
|
|
||||||
|
# Remove migration records
|
||||||
|
cursor.execute("DELETE FROM django_migrations WHERE app='socialaccount'")
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM django_migrations WHERE app='accounts' "
|
||||||
|
"AND name LIKE '%social%'"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Reset sequences
|
||||||
|
cursor.execute("DELETE FROM sqlite_sequence WHERE name LIKE '%social%'")
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("Successfully cleaned up social auth configuration")
|
||||||
|
)
|
||||||
@@ -0,0 +1,67 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from apps.parks.models import ParkReview, Park
|
||||||
|
from apps.rides.models import Ride
|
||||||
|
from apps.media.models import Photo
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Cleans up test users and data created during e2e testing"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
# Delete test users
|
||||||
|
test_users = User.objects.filter(username__in=["testuser", "moderator"])
|
||||||
|
count = test_users.count()
|
||||||
|
test_users.delete()
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test users"))
|
||||||
|
|
||||||
|
# Delete test reviews
|
||||||
|
reviews = ParkReview.objects.filter(
|
||||||
|
user__username__in=["testuser", "moderator"]
|
||||||
|
)
|
||||||
|
count = reviews.count()
|
||||||
|
reviews.delete()
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test reviews"))
|
||||||
|
|
||||||
|
# Delete test photos
|
||||||
|
photos = Photo.objects.filter(uploader__username__in=["testuser", "moderator"])
|
||||||
|
count = photos.count()
|
||||||
|
photos.delete()
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test photos"))
|
||||||
|
|
||||||
|
# Delete test parks
|
||||||
|
parks = Park.objects.filter(name__startswith="Test Park")
|
||||||
|
count = parks.count()
|
||||||
|
parks.delete()
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test parks"))
|
||||||
|
|
||||||
|
# Delete test rides
|
||||||
|
rides = Ride.objects.filter(name__startswith="Test Ride")
|
||||||
|
count = rides.count()
|
||||||
|
rides.delete()
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test rides"))
|
||||||
|
|
||||||
|
# Clean up test files
|
||||||
|
import os
|
||||||
|
import glob
|
||||||
|
|
||||||
|
# Clean up test uploads
|
||||||
|
media_patterns = [
|
||||||
|
"media/uploads/test_*",
|
||||||
|
"media/avatars/test_*",
|
||||||
|
"media/park/test_*",
|
||||||
|
"media/rides/test_*",
|
||||||
|
]
|
||||||
|
|
||||||
|
for pattern in media_patterns:
|
||||||
|
files = glob.glob(pattern)
|
||||||
|
for f in files:
|
||||||
|
try:
|
||||||
|
os.remove(f)
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Deleted {f}"))
|
||||||
|
except OSError as e:
|
||||||
|
self.stdout.write(self.style.WARNING(f"Error deleting {f}: {e}"))
|
||||||
|
|
||||||
|
self.stdout.write(self.style.SUCCESS("Test data cleanup complete"))
|
||||||
@@ -0,0 +1,55 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Create social apps for authentication"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Get the default site
|
||||||
|
site = Site.objects.get_or_create(
|
||||||
|
id=1,
|
||||||
|
defaults={
|
||||||
|
"domain": "localhost:8000",
|
||||||
|
"name": "ThrillWiki Development",
|
||||||
|
},
|
||||||
|
)[0]
|
||||||
|
|
||||||
|
# Create Discord app
|
||||||
|
discord_app, created = SocialApp.objects.get_or_create(
|
||||||
|
provider="discord",
|
||||||
|
defaults={
|
||||||
|
"name": "Discord",
|
||||||
|
"client_id": "1299112802274902047",
|
||||||
|
"secret": "ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if not created:
|
||||||
|
discord_app.client_id = "1299112802274902047"
|
||||||
|
discord_app.secret = "ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11"
|
||||||
|
discord_app.save()
|
||||||
|
discord_app.sites.add(site)
|
||||||
|
self.stdout.write(f'{"Created" if created else "Updated"} Discord app')
|
||||||
|
|
||||||
|
# Create Google app
|
||||||
|
google_app, created = SocialApp.objects.get_or_create(
|
||||||
|
provider="google",
|
||||||
|
defaults={
|
||||||
|
"name": "Google",
|
||||||
|
"client_id": (
|
||||||
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2."
|
||||||
|
"apps.googleusercontent.com"
|
||||||
|
),
|
||||||
|
"secret": "GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if not created:
|
||||||
|
google_app.client_id = (
|
||||||
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2."
|
||||||
|
"apps.googleusercontent.com"
|
||||||
|
)
|
||||||
|
google_app.secret = "GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue"
|
||||||
|
google_app.save()
|
||||||
|
google_app.sites.add(site)
|
||||||
|
self.stdout.write(f'{"Created" if created else "Updated"} Google app')
|
||||||
@@ -0,0 +1,58 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.contrib.auth.models import Group, Permission, User
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Creates test users for e2e testing"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
# Create regular test user
|
||||||
|
if not User.objects.filter(username="testuser").exists():
|
||||||
|
user = User.objects.create(
|
||||||
|
username="testuser",
|
||||||
|
email="testuser@example.com",
|
||||||
|
)
|
||||||
|
user.set_password("testpass123")
|
||||||
|
user.save()
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(f"Created test user: {user.get_username()}")
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.stdout.write(self.style.WARNING("Test user already exists"))
|
||||||
|
|
||||||
|
if not User.objects.filter(username="moderator").exists():
|
||||||
|
moderator = User.objects.create(
|
||||||
|
username="moderator",
|
||||||
|
email="moderator@example.com",
|
||||||
|
)
|
||||||
|
moderator.set_password("modpass123")
|
||||||
|
moderator.save()
|
||||||
|
|
||||||
|
# Create moderator group if it doesn't exist
|
||||||
|
moderator_group, created = Group.objects.get_or_create(name="Moderators")
|
||||||
|
|
||||||
|
# Add relevant permissions
|
||||||
|
permissions = Permission.objects.filter(
|
||||||
|
codename__in=[
|
||||||
|
"change_review",
|
||||||
|
"delete_review",
|
||||||
|
"change_park",
|
||||||
|
"change_ride",
|
||||||
|
"moderate_photos",
|
||||||
|
"moderate_comments",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
moderator_group.permissions.add(*permissions)
|
||||||
|
|
||||||
|
# Add user to moderator group
|
||||||
|
moderator.groups.add(moderator_group)
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f"Created moderator user: {moderator.get_username()}"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.stdout.write(self.style.WARNING("Moderator user already exists"))
|
||||||
|
|
||||||
|
self.stdout.write(self.style.SUCCESS("Test users setup complete"))
|
||||||
@@ -0,0 +1,18 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Fix migration history by removing rides.0001_initial"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM django_migrations WHERE app='rides' "
|
||||||
|
"AND name='0001_initial';"
|
||||||
|
)
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
"Successfully removed rides.0001_initial from migration history"
|
||||||
|
)
|
||||||
|
)
|
||||||
41
backend/apps/accounts/management/commands/fix_social_apps.py
Normal file
@@ -0,0 +1,41 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
import os
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Fix social app configurations"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Delete all existing social apps
|
||||||
|
SocialApp.objects.all().delete()
|
||||||
|
self.stdout.write("Deleted all existing social apps")
|
||||||
|
|
||||||
|
# Get the default site
|
||||||
|
site = Site.objects.get(id=1)
|
||||||
|
|
||||||
|
# Create Google provider
|
||||||
|
google_app = SocialApp.objects.create(
|
||||||
|
provider="google",
|
||||||
|
name="Google",
|
||||||
|
client_id=os.getenv("GOOGLE_CLIENT_ID"),
|
||||||
|
secret=os.getenv("GOOGLE_CLIENT_SECRET"),
|
||||||
|
)
|
||||||
|
google_app.sites.add(site)
|
||||||
|
self.stdout.write(
|
||||||
|
f"Created Google app with client_id: {
|
||||||
|
google_app.client_id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create Discord provider
|
||||||
|
discord_app = SocialApp.objects.create(
|
||||||
|
provider="discord",
|
||||||
|
name="Discord",
|
||||||
|
client_id=os.getenv("DISCORD_CLIENT_ID"),
|
||||||
|
secret=os.getenv("DISCORD_CLIENT_SECRET"),
|
||||||
|
)
|
||||||
|
discord_app.sites.add(site)
|
||||||
|
self.stdout.write(
|
||||||
|
f"Created Discord app with client_id: {discord_app.client_id}"
|
||||||
|
)
|
||||||
@@ -0,0 +1,54 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from PIL import Image, ImageDraw, ImageFont
|
||||||
|
import os
|
||||||
|
|
||||||
|
|
||||||
|
def generate_avatar(letter):
|
||||||
|
"""Generate an avatar for a given letter or number"""
|
||||||
|
avatar_size = (100, 100)
|
||||||
|
background_color = (0, 123, 255) # Blue background
|
||||||
|
text_color = (255, 255, 255) # White text
|
||||||
|
font_size = 100
|
||||||
|
|
||||||
|
# Create a blank image with background color
|
||||||
|
image = Image.new("RGB", avatar_size, background_color)
|
||||||
|
draw = ImageDraw.Draw(image)
|
||||||
|
|
||||||
|
# Load a font
|
||||||
|
font_path = "[AWS-SECRET-REMOVED]ans-Bold.ttf"
|
||||||
|
font = ImageFont.truetype(font_path, font_size)
|
||||||
|
|
||||||
|
# Calculate text size and position using textbbox
|
||||||
|
text_bbox = draw.textbbox((0, 0), letter, font=font)
|
||||||
|
text_width, text_height = (
|
||||||
|
text_bbox[2] - text_bbox[0],
|
||||||
|
text_bbox[3] - text_bbox[1],
|
||||||
|
)
|
||||||
|
text_position = (
|
||||||
|
(avatar_size[0] - text_width) / 2,
|
||||||
|
(avatar_size[1] - text_height) / 2,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Draw the text on the image
|
||||||
|
draw.text(text_position, letter, font=font, fill=text_color)
|
||||||
|
|
||||||
|
# Ensure the avatars directory exists
|
||||||
|
avatar_dir = "avatars/letters"
|
||||||
|
if not os.path.exists(avatar_dir):
|
||||||
|
os.makedirs(avatar_dir)
|
||||||
|
|
||||||
|
# Save the image to the avatars directory
|
||||||
|
avatar_path = os.path.join(avatar_dir, f"{letter}_avatar.png")
|
||||||
|
image.save(avatar_path)
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Generate avatars for letters A-Z and numbers 0-9"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
characters = [chr(i) for i in range(65, 91)] + [
|
||||||
|
str(i) for i in range(10)
|
||||||
|
] # A-Z and 0-9
|
||||||
|
for char in characters:
|
||||||
|
generate_avatar(char)
|
||||||
|
self.stdout.write(self.style.SUCCESS(f"Generated avatar for {char}"))
|
||||||
@@ -0,0 +1,18 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from apps.accounts.models import UserProfile
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Regenerate default avatars for users without an uploaded avatar"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
profiles = UserProfile.objects.filter(avatar="")
|
||||||
|
for profile in profiles:
|
||||||
|
# This will trigger the avatar generation logic in the save method
|
||||||
|
profile.save()
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f"Regenerated avatar for {
|
||||||
|
profile.user.username}"
|
||||||
|
)
|
||||||
|
)
|
||||||
113
backend/apps/accounts/management/commands/reset_db.py
Normal file
@@ -0,0 +1,113 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
from django.contrib.auth.hashers import make_password
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Reset database and create admin user"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
self.stdout.write("Resetting database...")
|
||||||
|
|
||||||
|
# Drop all tables
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
DO $$ DECLARE
|
||||||
|
r RECORD;
|
||||||
|
BEGIN
|
||||||
|
FOR r IN (
|
||||||
|
SELECT tablename FROM pg_tables
|
||||||
|
WHERE schemaname = current_schema()
|
||||||
|
) LOOP
|
||||||
|
EXECUTE 'DROP TABLE IF EXISTS ' || \
|
||||||
|
quote_ident(r.tablename) || ' CASCADE';
|
||||||
|
END LOOP;
|
||||||
|
END $$;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Reset sequences
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
DO $$ DECLARE
|
||||||
|
r RECORD;
|
||||||
|
BEGIN
|
||||||
|
FOR r IN (
|
||||||
|
SELECT sequencename FROM pg_sequences
|
||||||
|
WHERE schemaname = current_schema()
|
||||||
|
) LOOP
|
||||||
|
EXECUTE 'ALTER SEQUENCE ' || \
|
||||||
|
quote_ident(r.sequencename) || ' RESTART WITH 1';
|
||||||
|
END LOOP;
|
||||||
|
END $$;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
self.stdout.write("All tables dropped and sequences reset.")
|
||||||
|
|
||||||
|
# Run migrations
|
||||||
|
from django.core.management import call_command
|
||||||
|
|
||||||
|
call_command("migrate")
|
||||||
|
|
||||||
|
self.stdout.write("Migrations applied.")
|
||||||
|
|
||||||
|
# Create superuser using raw SQL
|
||||||
|
try:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
# Create user
|
||||||
|
user_id = str(uuid.uuid4())[:10]
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
INSERT INTO accounts_user (
|
||||||
|
username, password, email, is_superuser, is_staff,
|
||||||
|
is_active, date_joined, user_id, first_name,
|
||||||
|
last_name, role, is_banned, ban_reason,
|
||||||
|
theme_preference
|
||||||
|
) VALUES (
|
||||||
|
'admin', %s, 'admin@thrillwiki.com', true, true,
|
||||||
|
true, NOW(), %s, '', '', 'SUPERUSER', false, '',
|
||||||
|
'light'
|
||||||
|
) RETURNING id;
|
||||||
|
""",
|
||||||
|
[make_password("admin"), user_id],
|
||||||
|
)
|
||||||
|
|
||||||
|
result = cursor.fetchone()
|
||||||
|
if result is None:
|
||||||
|
raise Exception("Failed to create user - no ID returned")
|
||||||
|
user_db_id = result[0]
|
||||||
|
|
||||||
|
# Create profile
|
||||||
|
profile_id = str(uuid.uuid4())[:10]
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
INSERT INTO accounts_userprofile (
|
||||||
|
profile_id, display_name, pronouns, bio,
|
||||||
|
twitter, instagram, youtube, discord,
|
||||||
|
coaster_credits, dark_ride_credits,
|
||||||
|
flat_ride_credits, water_ride_credits,
|
||||||
|
user_id, avatar
|
||||||
|
) VALUES (
|
||||||
|
%s, 'Admin', 'they/them', 'ThrillWiki Administrator',
|
||||||
|
'', '', '', '',
|
||||||
|
0, 0, 0, 0,
|
||||||
|
%s, ''
|
||||||
|
);
|
||||||
|
""",
|
||||||
|
[profile_id, user_db_id],
|
||||||
|
)
|
||||||
|
|
||||||
|
self.stdout.write("Superuser created.")
|
||||||
|
except Exception as e:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.ERROR(
|
||||||
|
f"Error creating superuser: {
|
||||||
|
str(e)}"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
raise
|
||||||
|
|
||||||
|
self.stdout.write(self.style.SUCCESS("Database reset complete."))
|
||||||
@@ -0,0 +1,39 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Reset social apps configuration"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Delete all social apps using raw SQL to bypass Django's ORM
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp_sites")
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp")
|
||||||
|
|
||||||
|
# Get the default site
|
||||||
|
site = Site.objects.get(id=1)
|
||||||
|
|
||||||
|
# Create Discord app
|
||||||
|
discord_app = SocialApp.objects.create(
|
||||||
|
provider="discord",
|
||||||
|
name="Discord",
|
||||||
|
client_id="1299112802274902047",
|
||||||
|
secret="ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11",
|
||||||
|
)
|
||||||
|
discord_app.sites.add(site)
|
||||||
|
self.stdout.write(f"Created Discord app with ID: {discord_app.pk}")
|
||||||
|
|
||||||
|
# Create Google app
|
||||||
|
google_app = SocialApp.objects.create(
|
||||||
|
provider="google",
|
||||||
|
name="Google",
|
||||||
|
client_id=(
|
||||||
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com"
|
||||||
|
),
|
||||||
|
secret="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",
|
||||||
|
)
|
||||||
|
google_app.sites.add(site)
|
||||||
|
self.stdout.write(f"Created Google app with ID: {google_app.pk}")
|
||||||
@@ -0,0 +1,24 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Reset social auth configuration"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
# Delete all social apps
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp")
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp_sites")
|
||||||
|
|
||||||
|
# Reset sequences
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp'"
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp_sites'"
|
||||||
|
)
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("Successfully reset social auth configuration")
|
||||||
|
)
|
||||||
49
backend/apps/accounts/management/commands/setup_groups.py
Normal file
@@ -0,0 +1,49 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.contrib.auth.models import Group
|
||||||
|
from apps.accounts.models import User
|
||||||
|
from apps.accounts.signals import create_default_groups
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Set up default groups and permissions for user roles"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
self.stdout.write("Creating default groups and permissions...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Create default groups with permissions
|
||||||
|
create_default_groups()
|
||||||
|
|
||||||
|
# Sync existing users with groups based on their roles
|
||||||
|
users = User.objects.exclude(role=User.Roles.USER)
|
||||||
|
for user in users:
|
||||||
|
group = Group.objects.filter(name=user.role).first()
|
||||||
|
if group:
|
||||||
|
user.groups.add(group)
|
||||||
|
|
||||||
|
# Update staff/superuser status based on role
|
||||||
|
if user.role == User.Roles.SUPERUSER:
|
||||||
|
user.is_superuser = True
|
||||||
|
user.is_staff = True
|
||||||
|
elif user.role in [User.Roles.ADMIN, User.Roles.MODERATOR]:
|
||||||
|
user.is_staff = True
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("Successfully set up groups and permissions")
|
||||||
|
)
|
||||||
|
|
||||||
|
# Print summary
|
||||||
|
for group in Group.objects.all():
|
||||||
|
self.stdout.write(f"\nGroup: {group.name}")
|
||||||
|
self.stdout.write("Permissions:")
|
||||||
|
for perm in group.permissions.all():
|
||||||
|
self.stdout.write(f" - {perm.codename}")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.ERROR(
|
||||||
|
f"Error setting up groups: {
|
||||||
|
str(e)}"
|
||||||
|
)
|
||||||
|
)
|
||||||
16
backend/apps/accounts/management/commands/setup_site.py
Normal file
@@ -0,0 +1,16 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site


class Command(BaseCommand):
    help = "Set up default site"

    def handle(self, *args, **options):
        # Delete any existing sites
        Site.objects.all().delete()

        # Create default site
        site = Site.objects.create(
            id=1, domain="localhost:8000", name="ThrillWiki Development"
        )
        self.stdout.write(self.style.SUCCESS(f"Created site: {site.domain}"))
126
backend/apps/accounts/management/commands/setup_social_auth.py
Normal file
@@ -0,0 +1,126 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site
from allauth.socialaccount.models import SocialApp
from dotenv import load_dotenv
import os


class Command(BaseCommand):
    help = "Sets up social authentication apps"

    def handle(self, *args, **kwargs):
        # Load environment variables
        load_dotenv()

        # Get environment variables
        google_client_id = os.getenv("GOOGLE_CLIENT_ID")
        google_client_secret = os.getenv("GOOGLE_CLIENT_SECRET")
        discord_client_id = os.getenv("DISCORD_CLIENT_ID")
        discord_client_secret = os.getenv("DISCORD_CLIENT_SECRET")

        # DEBUG: Log environment variable values
        self.stdout.write(
            f"DEBUG: google_client_id type: {
                type(google_client_id)}, value: {google_client_id}"
        )
        self.stdout.write(
            f"DEBUG: google_client_secret type: {
                type(google_client_secret)}, value: {google_client_secret}"
        )
        self.stdout.write(
            f"DEBUG: discord_client_id type: {
                type(discord_client_id)}, value: {discord_client_id}"
        )
        self.stdout.write(
            f"DEBUG: discord_client_secret type: {
                type(discord_client_secret)}, value: {discord_client_secret}"
        )

        if not all(
            [
                google_client_id,
                google_client_secret,
                discord_client_id,
                discord_client_secret,
            ]
        ):
            self.stdout.write(
                self.style.ERROR("Missing required environment variables")
            )
            self.stdout.write(
                f"DEBUG: google_client_id is None: {google_client_id is None}"
            )
            self.stdout.write(
                f"DEBUG: google_client_secret is None: {
                    google_client_secret is None}"
            )
            self.stdout.write(
                f"DEBUG: discord_client_id is None: {
                    discord_client_id is None}"
            )
            self.stdout.write(
                f"DEBUG: discord_client_secret is None: {
                    discord_client_secret is None}"
            )
            return

        # Get or create the default site
        site, _ = Site.objects.get_or_create(
            id=1, defaults={"domain": "localhost:8000", "name": "localhost"}
        )

        # Set up Google
        google_app, created = SocialApp.objects.get_or_create(
            provider="google",
            defaults={
                "name": "Google",
                "client_id": google_client_id,
                "secret": google_client_secret,
            },
        )
        if not created:
            self.stdout.write(
                f"DEBUG: About to assign google_client_id: {google_client_id} (type: {
                    type(google_client_id)})"
            )
            if google_client_id is not None and google_client_secret is not None:
                google_app.client_id = google_client_id
                google_app.secret = google_client_secret
                google_app.save()
                self.stdout.write("DEBUG: Successfully updated Google app")
            else:
                self.stdout.write(
                    self.style.ERROR(
                        "Google client_id or secret is None, skipping update."
                    )
                )
        google_app.sites.add(site)

        # Set up Discord
        discord_app, created = SocialApp.objects.get_or_create(
            provider="discord",
            defaults={
                "name": "Discord",
                "client_id": discord_client_id,
                "secret": discord_client_secret,
            },
        )
        if not created:
            self.stdout.write(
                f"DEBUG: About to assign discord_client_id: {discord_client_id} (type: {
                    type(discord_client_id)})"
            )
            if discord_client_id is not None and discord_client_secret is not None:
                discord_app.client_id = discord_client_id
                discord_app.secret = discord_client_secret
                discord_app.save()
                self.stdout.write("DEBUG: Successfully updated Discord app")
            else:
                self.stdout.write(
                    self.style.ERROR(
                        "Discord client_id or secret is None, skipping update."
                    )
                )
        discord_app.sites.add(site)

        self.stdout.write(self.style.SUCCESS("Successfully set up social auth apps"))
@@ -0,0 +1,70 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site
from django.contrib.auth import get_user_model

User = get_user_model()


class Command(BaseCommand):
    help = "Set up social authentication through admin interface"

    def handle(self, *args, **options):
        # Get or create the default site
        site, _ = Site.objects.get_or_create(
            id=1,
            defaults={
                "domain": "localhost:8000",
                "name": "ThrillWiki Development",
            },
        )
        if not _:
            site.domain = "localhost:8000"
            site.name = "ThrillWiki Development"
            site.save()
        self.stdout.write(f'{"Created" if _ else "Updated"} site: {site.domain}')

        # Create superuser if it doesn't exist
        if not User.objects.filter(username="admin").exists():
            admin_user = User.objects.create(
                username="admin",
                email="admin@example.com",
                is_staff=True,
                is_superuser=True,
            )
            admin_user.set_password("admin")
            admin_user.save()
            self.stdout.write("Created superuser: admin/admin")

        self.stdout.write(
            self.style.SUCCESS(
                """
Social auth setup instructions:

1. Run the development server:
   python manage.py runserver

2. Go to the admin interface:
   http://localhost:8000/admin/

3. Log in with:
   Username: admin
   Password: admin

4. Add social applications:
   - Go to "Social applications" under "Social Accounts"
   - Add Discord app:
     Provider: discord
     Name: Discord
     Client id: 1299112802274902047
     Secret key: ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11
     Sites: Add "localhost:8000"

   - Add Google app:
     Provider: google
     Name: Google
     Client id: 135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com
     Secret key: GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue
     Sites: Add "localhost:8000"
"""
            )
        )
@@ -0,0 +1,61 @@
from django.core.management.base import BaseCommand
from django.test import Client
from allauth.socialaccount.models import SocialApp


class Command(BaseCommand):
    help = "Test Discord OAuth2 authentication flow"

    def handle(self, *args, **options):
        client = Client(HTTP_HOST="localhost:8000")

        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider="discord")
            self.stdout.write("Found Discord app configuration:")
            self.stdout.write(f"Client ID: {discord_app.client_id}")

            # Test login URL
            login_url = "/accounts/discord/login/"
            response = client.get(login_url, HTTP_HOST="localhost:8000")
            self.stdout.write(f"\nTesting login URL: {login_url}")
            self.stdout.write(f"Status code: {response.status_code}")

            if response.status_code == 302:
                redirect_url = response["Location"]
                self.stdout.write(f"Redirects to: {redirect_url}")

                # Parse OAuth2 parameters
                self.stdout.write("\nOAuth2 Parameters:")
                if "client_id=" in redirect_url:
                    self.stdout.write("✓ client_id parameter present")
                if "redirect_uri=" in redirect_url:
                    self.stdout.write("✓ redirect_uri parameter present")
                if "scope=" in redirect_url:
                    self.stdout.write("✓ scope parameter present")
                if "response_type=" in redirect_url:
                    self.stdout.write("✓ response_type parameter present")
                if "code_challenge=" in redirect_url:
                    self.stdout.write("✓ PKCE enabled (code_challenge present)")

            # Show callback URL
            callback_url = "http://localhost:8000/accounts/discord/login/callback/"
            self.stdout.write(
                "\nCallback URL to configure in Discord Developer Portal:"
            )
            self.stdout.write(callback_url)

            # Show frontend login URL
            frontend_url = "http://localhost:5173"
            self.stdout.write("\nFrontend configuration:")
            self.stdout.write(f"Frontend URL: {frontend_url}")
            self.stdout.write("Discord login button should use:")
            self.stdout.write("/accounts/discord/login/?process=login")

            # Show allauth URLs
            self.stdout.write("\nAllauth URLs:")
            self.stdout.write("Login URL: /accounts/discord/login/?process=login")
            self.stdout.write("Callback URL: /accounts/discord/login/callback/")

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR("Discord app not found"))
@@ -0,0 +1,23 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.contrib.sites.models import Site


class Command(BaseCommand):
    help = "Update social apps to be associated with all sites"

    def handle(self, *args, **options):
        # Get all sites
        sites = Site.objects.all()

        # Update each social app
        for app in SocialApp.objects.all():
            self.stdout.write(f"Updating {app.provider} app...")
            # Clear existing sites
            app.sites.clear()
            # Add all sites
            for site in sites:
                app.sites.add(site)
            self.stdout.write(
                f'Added sites: {", ".join(site.domain for site in sites)}'
            )
@@ -0,0 +1,42 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.conf import settings


class Command(BaseCommand):
    help = "Verify Discord OAuth2 settings"

    def handle(self, *args, **options):
        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider="discord")
            self.stdout.write("Found Discord app configuration:")
            self.stdout.write(f"Client ID: {discord_app.client_id}")
            self.stdout.write(f"Secret: {discord_app.secret}")

            # Get sites
            sites = discord_app.sites.all()
            self.stdout.write("\nAssociated sites:")
            for site in sites:
                self.stdout.write(f"- {site.domain} ({site.name})")

            # Show callback URL
            callback_url = "http://localhost:8000/accounts/discord/login/callback/"
            self.stdout.write(
                "\nCallback URL to configure in Discord Developer Portal:"
            )
            self.stdout.write(callback_url)

            # Show OAuth2 settings
            self.stdout.write("\nOAuth2 settings in settings.py:")
            discord_settings = settings.SOCIALACCOUNT_PROVIDERS.get("discord", {})
            self.stdout.write(
                f'PKCE Enabled: {
                    discord_settings.get(
                        "OAUTH_PKCE_ENABLED",
                        False)}'
            )
            self.stdout.write(f'Scopes: {discord_settings.get("SCOPE", [])}')

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR("Discord app not found"))
552
backend/apps/accounts/migrations/0001_initial.py
Normal file
@@ -0,0 +1,552 @@
|
|||||||
|
# Generated by Django 5.1.4 on 2025-08-13 21:35
|
||||||
|
|
||||||
|
import django.contrib.auth.models
|
||||||
|
import django.contrib.auth.validators
|
||||||
|
import django.db.models.deletion
|
||||||
|
import django.utils.timezone
|
||||||
|
import pgtrigger.compiler
|
||||||
|
import pgtrigger.migrations
|
||||||
|
from django.conf import settings
|
||||||
|
from django.db import migrations, models
|
||||||
|
|
||||||
|
|
||||||
|
class Migration(migrations.Migration):
|
||||||
|
|
||||||
|
initial = True
|
||||||
|
|
||||||
|
dependencies = [
|
||||||
|
("auth", "0012_alter_user_first_name_max_length"),
|
||||||
|
("contenttypes", "0002_remove_content_type_name"),
|
||||||
|
("pghistory", "0006_delete_aggregateevent"),
|
||||||
|
]
|
||||||
|
|
||||||
|
operations = [
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="User",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"password",
|
||||||
|
models.CharField(max_length=128, verbose_name="password"),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"last_login",
|
||||||
|
models.DateTimeField(
|
||||||
|
blank=True, null=True, verbose_name="last login"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_superuser",
|
||||||
|
models.BooleanField(
|
||||||
|
default=False,
|
||||||
|
help_text="Designates that this user has all permissions without explicitly assigning them.",
|
||||||
|
verbose_name="superuser status",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"username",
|
||||||
|
models.CharField(
|
||||||
|
error_messages={
|
||||||
|
"unique": "A user with that username already exists."
|
||||||
|
},
|
||||||
|
help_text="Required. 150 characters or fewer. Letters, digits and @/./+/-/_ only.",
|
||||||
|
max_length=150,
|
||||||
|
unique=True,
|
||||||
|
validators=[
|
||||||
|
django.contrib.auth.validators.UnicodeUsernameValidator()
|
||||||
|
],
|
||||||
|
verbose_name="username",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"first_name",
|
||||||
|
models.CharField(
|
||||||
|
blank=True, max_length=150, verbose_name="first name"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"last_name",
|
||||||
|
models.CharField(
|
||||||
|
blank=True, max_length=150, verbose_name="last name"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"email",
|
||||||
|
models.EmailField(
|
||||||
|
blank=True,
|
||||||
|
max_length=254,
|
||||||
|
verbose_name="email address",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_staff",
|
||||||
|
models.BooleanField(
|
||||||
|
default=False,
|
||||||
|
help_text="Designates whether the user can log into this admin site.",
|
||||||
|
verbose_name="staff status",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_active",
|
||||||
|
models.BooleanField(
|
||||||
|
default=True,
|
||||||
|
help_text="Designates whether this user should be treated as active. Unselect this instead of deleting accounts.",
|
||||||
|
verbose_name="active",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"date_joined",
|
||||||
|
models.DateTimeField(
|
||||||
|
default=django.utils.timezone.now,
|
||||||
|
verbose_name="date joined",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"user_id",
|
||||||
|
models.CharField(
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this user that remains constant even if the username changes",
|
||||||
|
max_length=10,
|
||||||
|
unique=True,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"role",
|
||||||
|
models.CharField(
|
||||||
|
choices=[
|
||||||
|
("USER", "User"),
|
||||||
|
("MODERATOR", "Moderator"),
|
||||||
|
("ADMIN", "Admin"),
|
||||||
|
("SUPERUSER", "Superuser"),
|
||||||
|
],
|
||||||
|
default="USER",
|
||||||
|
max_length=10,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("is_banned", models.BooleanField(default=False)),
|
||||||
|
("ban_reason", models.TextField(blank=True)),
|
||||||
|
("ban_date", models.DateTimeField(blank=True, null=True)),
|
||||||
|
(
|
||||||
|
"pending_email",
|
||||||
|
models.EmailField(blank=True, max_length=254, null=True),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"theme_preference",
|
||||||
|
models.CharField(
|
||||||
|
choices=[("light", "Light"), ("dark", "Dark")],
|
||||||
|
default="light",
|
||||||
|
max_length=5,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"groups",
|
||||||
|
models.ManyToManyField(
|
||||||
|
blank=True,
|
||||||
|
help_text="The groups this user belongs to. A user will get all permissions granted to each of their groups.",
|
||||||
|
related_name="user_set",
|
||||||
|
related_query_name="user",
|
||||||
|
to="auth.group",
|
||||||
|
verbose_name="groups",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"user_permissions",
|
||||||
|
models.ManyToManyField(
|
||||||
|
blank=True,
|
||||||
|
help_text="Specific permissions for this user.",
|
||||||
|
related_name="user_set",
|
||||||
|
related_query_name="user",
|
||||||
|
to="auth.permission",
|
||||||
|
verbose_name="user permissions",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"verbose_name": "user",
|
||||||
|
"verbose_name_plural": "users",
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
managers=[
|
||||||
|
("objects", django.contrib.auth.models.UserManager()),
|
||||||
|
],
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="EmailVerification",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("token", models.CharField(max_length=64, unique=True)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("last_sent", models.DateTimeField(auto_now_add=True)),
|
||||||
|
(
|
||||||
|
"user",
|
||||||
|
models.OneToOneField(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"verbose_name": "Email Verification",
|
||||||
|
"verbose_name_plural": "Email Verifications",
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="PasswordReset",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("token", models.CharField(max_length=64)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("expires_at", models.DateTimeField()),
|
||||||
|
("used", models.BooleanField(default=False)),
|
||||||
|
(
|
||||||
|
"user",
|
||||||
|
models.ForeignKey(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"verbose_name": "Password Reset",
|
||||||
|
"verbose_name_plural": "Password Resets",
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="TopList",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("title", models.CharField(max_length=100)),
|
||||||
|
(
|
||||||
|
"category",
|
||||||
|
models.CharField(
|
||||||
|
choices=[
|
||||||
|
("RC", "Roller Coaster"),
|
||||||
|
("DR", "Dark Ride"),
|
||||||
|
("FR", "Flat Ride"),
|
||||||
|
("WR", "Water Ride"),
|
||||||
|
("PK", "Park"),
|
||||||
|
],
|
||||||
|
max_length=2,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("description", models.TextField(blank=True)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("updated_at", models.DateTimeField(auto_now=True)),
|
||||||
|
(
|
||||||
|
"user",
|
||||||
|
models.ForeignKey(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
related_name="top_lists",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"ordering": ["-updated_at"],
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="TopListEvent",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"pgh_id",
|
||||||
|
models.AutoField(primary_key=True, serialize=False),
|
||||||
|
),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
("title", models.CharField(max_length=100)),
|
||||||
|
(
|
||||||
|
"category",
|
||||||
|
models.CharField(
|
||||||
|
choices=[
|
||||||
|
("RC", "Roller Coaster"),
|
||||||
|
("DR", "Dark Ride"),
|
||||||
|
("FR", "Flat Ride"),
|
||||||
|
("WR", "Water Ride"),
|
||||||
|
("PK", "Park"),
|
||||||
|
],
|
||||||
|
max_length=2,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("description", models.TextField(blank=True)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("updated_at", models.DateTimeField(auto_now=True)),
|
||||||
|
(
|
||||||
|
"pgh_context",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"pgh_obj",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to="accounts.toplist",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"user",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="TopListItem",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("updated_at", models.DateTimeField(auto_now=True)),
|
||||||
|
("object_id", models.PositiveIntegerField()),
|
||||||
|
("rank", models.PositiveIntegerField()),
|
||||||
|
("notes", models.TextField(blank=True)),
|
||||||
|
(
|
||||||
|
"content_type",
|
||||||
|
models.ForeignKey(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
to="contenttypes.contenttype",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"top_list",
|
||||||
|
models.ForeignKey(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
related_name="items",
|
||||||
|
to="accounts.toplist",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"ordering": ["rank"],
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="TopListItemEvent",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"pgh_id",
|
||||||
|
models.AutoField(primary_key=True, serialize=False),
|
||||||
|
),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("updated_at", models.DateTimeField(auto_now=True)),
|
||||||
|
("object_id", models.PositiveIntegerField()),
|
||||||
|
("rank", models.PositiveIntegerField()),
|
||||||
|
("notes", models.TextField(blank=True)),
|
||||||
|
(
|
||||||
|
"content_type",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to="contenttypes.contenttype",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"pgh_context",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"pgh_obj",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to="accounts.toplistitem",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"top_list",
|
||||||
|
models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to="accounts.toplist",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="UserProfile",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"profile_id",
|
||||||
|
models.CharField(
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this profile that remains constant",
|
||||||
|
max_length=10,
|
||||||
|
unique=True,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"display_name",
|
||||||
|
models.CharField(
|
||||||
|
help_text="This is the name that will be displayed on the site",
|
||||||
|
max_length=50,
|
||||||
|
unique=True,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"avatar",
|
||||||
|
models.ImageField(blank=True, upload_to="avatars/"),
|
||||||
|
),
|
||||||
|
("pronouns", models.CharField(blank=True, max_length=50)),
|
||||||
|
("bio", models.TextField(blank=True, max_length=500)),
|
||||||
|
("twitter", models.URLField(blank=True)),
|
||||||
|
("instagram", models.URLField(blank=True)),
|
||||||
|
("youtube", models.URLField(blank=True)),
|
||||||
|
("discord", models.CharField(blank=True, max_length=100)),
|
||||||
|
("coaster_credits", models.IntegerField(default=0)),
|
||||||
|
("dark_ride_credits", models.IntegerField(default=0)),
|
||||||
|
("flat_ride_credits", models.IntegerField(default=0)),
|
||||||
|
("water_ride_credits", models.IntegerField(default=0)),
|
||||||
|
(
|
||||||
|
"user",
|
||||||
|
models.OneToOneField(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
related_name="profile",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="toplist",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_toplistevent" ("category", "created_at", "description", "id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "title", "updated_at", "user_id") VALUES (NEW."category", NEW."created_at", NEW."description", NEW."id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."title", NEW."updated_at", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="[AWS-SECRET-REMOVED]",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_26546",
|
||||||
|
table="accounts_toplist",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="toplist",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_toplistevent" ("category", "created_at", "description", "id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "title", "updated_at", "user_id") VALUES (NEW."category", NEW."created_at", NEW."description", NEW."id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."title", NEW."updated_at", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="[AWS-SECRET-REMOVED]",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_84849",
|
||||||
|
table="accounts_toplist",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AlterUniqueTogether(
|
||||||
|
name="toplistitem",
|
||||||
|
unique_together={("top_list", "rank")},
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="toplistitem",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
|
||||||
|
hash="[AWS-SECRET-REMOVED]",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_56dfc",
|
||||||
|
table="accounts_toplistitem",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="toplistitem",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_toplistitemevent" ("content_type_id", "created_at", "id", "notes", "object_id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "rank", "top_list_id", "updated_at") VALUES (NEW."content_type_id", NEW."created_at", NEW."id", NEW."notes", NEW."object_id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."rank", NEW."top_list_id", NEW."updated_at"); RETURN NULL;',
|
||||||
|
hash="[AWS-SECRET-REMOVED]",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_2b6e3",
|
||||||
|
table="accounts_toplistitem",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
]
|
||||||
0
backend/apps/accounts/migrations/__init__.py
Normal file
35
backend/apps/accounts/mixins.py
Normal file
@@ -0,0 +1,35 @@
import requests
from django.conf import settings
from django.core.exceptions import ValidationError


class TurnstileMixin:
    """
    Mixin to handle Cloudflare Turnstile validation.
    Bypasses validation when DEBUG is True.
    """

    def validate_turnstile(self, request):
        """
        Validate the Turnstile response token.
        Skips validation when DEBUG is True.
        """
        if settings.DEBUG:
            return

        token = request.POST.get("cf-turnstile-response")
        if not token:
            raise ValidationError("Please complete the Turnstile challenge.")

        # Verify the token with Cloudflare
        data = {
            "secret": settings.TURNSTILE_SECRET_KEY,
            "response": token,
            "remoteip": request.META.get("REMOTE_ADDR"),
        }

        response = requests.post(settings.TURNSTILE_VERIFY_URL, data=data, timeout=60)
        result = response.json()

        if not result.get("success"):
            raise ValidationError("Turnstile validation failed. Please try again.")
219
backend/apps/accounts/models.py
Normal file
@@ -0,0 +1,219 @@
from django.contrib.auth.models import AbstractUser
from django.db import models
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
import os
import secrets
from apps.core.history import TrackedModel

# import pghistory


def generate_random_id(model_class, id_field):
    """Generate a random ID starting at 4 digits, expanding to 5 if needed"""
    while True:
        # Try to get a 4-digit number first
        new_id = str(secrets.SystemRandom().randint(1000, 9999))
        if not model_class.objects.filter(**{id_field: new_id}).exists():
            return new_id

        # If all 4-digit numbers are taken, try 5 digits
        new_id = str(secrets.SystemRandom().randint(10000, 99999))
        if not model_class.objects.filter(**{id_field: new_id}).exists():
            return new_id


class User(AbstractUser):
    class Roles(models.TextChoices):
        USER = "USER", _("User")
        MODERATOR = "MODERATOR", _("Moderator")
        ADMIN = "ADMIN", _("Admin")
        SUPERUSER = "SUPERUSER", _("Superuser")

    class ThemePreference(models.TextChoices):
        LIGHT = "light", _("Light")
        DARK = "dark", _("Dark")

    # Read-only ID
    user_id = models.CharField(
        max_length=10,
        unique=True,
        editable=False,
        help_text=(
            "Unique identifier for this user that remains constant even if the "
            "username changes"
        ),
    )

    role = models.CharField(
        max_length=10,
        choices=Roles.choices,
        default=Roles.USER,
    )
    is_banned = models.BooleanField(default=False)
    ban_reason = models.TextField(blank=True)
    ban_date = models.DateTimeField(null=True, blank=True)
    pending_email = models.EmailField(blank=True, null=True)
    theme_preference = models.CharField(
        max_length=5,
        choices=ThemePreference.choices,
        default=ThemePreference.LIGHT,
    )

    def __str__(self):
        return self.get_display_name()

    def get_absolute_url(self):
        return reverse("profile", kwargs={"username": self.username})

    def get_display_name(self):
        """Get the user's display name, falling back to username if not set"""
        profile = getattr(self, "profile", None)
        if profile and profile.display_name:
            return profile.display_name
        return self.username

    def save(self, *args, **kwargs):
        if not self.user_id:
            self.user_id = generate_random_id(User, "user_id")
        super().save(*args, **kwargs)


class UserProfile(models.Model):
    # Read-only ID
    profile_id = models.CharField(
        max_length=10,
        unique=True,
        editable=False,
        help_text="Unique identifier for this profile that remains constant",
    )

    user = models.OneToOneField(User, on_delete=models.CASCADE, related_name="profile")
    display_name = models.CharField(
        max_length=50,
        unique=True,
        help_text="This is the name that will be displayed on the site",
    )
    avatar = models.ImageField(upload_to="avatars/", blank=True)
    pronouns = models.CharField(max_length=50, blank=True)

    bio = models.TextField(max_length=500, blank=True)

    # Social media links
    twitter = models.URLField(blank=True)
    instagram = models.URLField(blank=True)
    youtube = models.URLField(blank=True)
    discord = models.CharField(max_length=100, blank=True)

    # Ride statistics
    coaster_credits = models.IntegerField(default=0)
    dark_ride_credits = models.IntegerField(default=0)
    flat_ride_credits = models.IntegerField(default=0)
    water_ride_credits = models.IntegerField(default=0)

    def get_avatar(self):
        """
        Return the avatar URL or serve a pre-generated avatar based on the
        first letter of the username
        """
        if self.avatar:
            return self.avatar.url
        first_letter = self.user.username.upper()
        avatar_path = f"avatars/letters/{first_letter}_avatar.png"
        if os.path.exists(avatar_path):
            return f"/{avatar_path}"
        return "/static/images/default-avatar.png"

    def save(self, *args, **kwargs):
        # If no display name is set, use the username
        if not self.display_name:
            self.display_name = self.user.username

        if not self.profile_id:
            self.profile_id = generate_random_id(UserProfile, "profile_id")
        super().save(*args, **kwargs)

    def __str__(self):
        return self.display_name


class EmailVerification(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    token = models.CharField(max_length=64, unique=True)
    created_at = models.DateTimeField(auto_now_add=True)
    last_sent = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return f"Email verification for {self.user.username}"

    class Meta:
        verbose_name = "Email Verification"
        verbose_name_plural = "Email Verifications"


class PasswordReset(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    token = models.CharField(max_length=64)
    created_at = models.DateTimeField(auto_now_add=True)
    expires_at = models.DateTimeField()
    used = models.BooleanField(default=False)

    def __str__(self):
        return f"Password reset for {self.user.username}"

    class Meta:
        verbose_name = "Password Reset"
        verbose_name_plural = "Password Resets"


# @pghistory.track()


class TopList(TrackedModel):
    class Categories(models.TextChoices):
        ROLLER_COASTER = "RC", _("Roller Coaster")
        DARK_RIDE = "DR", _("Dark Ride")
        FLAT_RIDE = "FR", _("Flat Ride")
        WATER_RIDE = "WR", _("Water Ride")
        PARK = "PK", _("Park")

    user = models.ForeignKey(
        User,
        on_delete=models.CASCADE,
        related_name="top_lists",  # Added related_name for User model access
    )
    title = models.CharField(max_length=100)
    category = models.CharField(max_length=2, choices=Categories.choices)
    description = models.TextField(blank=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta(TrackedModel.Meta):
        ordering = ["-updated_at"]

    def __str__(self):
        return (
            f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
        )


# @pghistory.track()


class TopListItem(TrackedModel):
    top_list = models.ForeignKey(
        TopList, on_delete=models.CASCADE, related_name="items"
    )
    content_type = models.ForeignKey(
        "contenttypes.ContentType", on_delete=models.CASCADE
    )
    object_id = models.PositiveIntegerField()
    rank = models.PositiveIntegerField()
    notes = models.TextField(blank=True)

    class Meta(TrackedModel.Meta):
        ordering = ["rank"]
        unique_together = [["top_list", "rank"]]

    def __str__(self):
        return f"#{self.rank} in {self.top_list.title}"
208
backend/apps/accounts/models_temp.py
Normal file
@@ -0,0 +1,208 @@
|
|||||||
|
from django.contrib.auth.models import AbstractUser
|
||||||
|
from django.db import models
|
||||||
|
from django.urls import reverse
|
||||||
|
from django.utils.translation import gettext_lazy as _
|
||||||
|
import os
|
||||||
|
import secrets
|
||||||
|
from apps.core.history import TrackedModel
|
||||||
|
import pghistory
|
||||||
|
|
||||||
|
|
||||||
|
def generate_random_id(model_class, id_field):
|
||||||
|
"""Generate a random ID starting at 4 digits, expanding to 5 if needed"""
|
||||||
|
while True:
|
||||||
|
# Try to get a 4-digit number first
|
||||||
|
new_id = str(secrets.SystemRandom().randint(1000, 9999))
|
||||||
|
if not model_class.objects.filter(**{id_field: new_id}).exists():
|
||||||
|
return new_id
|
||||||
|
|
||||||
|
# If all 4-digit numbers are taken, try 5 digits
|
||||||
|
new_id = str(secrets.SystemRandom().randint(10000, 99999))
|
||||||
|
if not model_class.objects.filter(**{id_field: new_id}).exists():
|
||||||
|
return new_id
|
||||||
|
|
||||||
|
|
||||||
|
class User(AbstractUser):
|
||||||
|
class Roles(models.TextChoices):
|
||||||
|
USER = "USER", _("User")
|
||||||
|
MODERATOR = "MODERATOR", _("Moderator")
|
||||||
|
ADMIN = "ADMIN", _("Admin")
|
||||||
|
SUPERUSER = "SUPERUSER", _("Superuser")
|
||||||
|
|
||||||
|
class ThemePreference(models.TextChoices):
|
||||||
|
LIGHT = "light", _("Light")
|
||||||
|
DARK = "dark", _("Dark")
|
||||||
|
|
||||||
|
# Read-only ID
|
||||||
|
user_id = models.CharField(
|
||||||
|
max_length=10,
|
||||||
|
unique=True,
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this user that remains constant even if the username changes",
|
||||||
|
)
|
||||||
|
|
||||||
|
role = models.CharField(
|
||||||
|
max_length=10,
|
||||||
|
choices=Roles.choices,
|
||||||
|
default=Roles.USER,
|
||||||
|
)
|
||||||
|
is_banned = models.BooleanField(default=False)
|
||||||
|
ban_reason = models.TextField(blank=True)
|
||||||
|
ban_date = models.DateTimeField(null=True, blank=True)
|
||||||
|
pending_email = models.EmailField(blank=True, null=True)
|
||||||
|
theme_preference = models.CharField(
|
||||||
|
max_length=5,
|
||||||
|
choices=ThemePreference.choices,
|
||||||
|
default=ThemePreference.LIGHT,
|
||||||
|
)
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return self.get_display_name()
|
||||||
|
|
||||||
|
def get_absolute_url(self):
|
||||||
|
return reverse("profile", kwargs={"username": self.username})
|
||||||
|
|
||||||
|
def get_display_name(self):
|
||||||
|
"""Get the user's display name, falling back to username if not set"""
|
||||||
|
profile = getattr(self, "profile", None)
|
||||||
|
if profile and profile.display_name:
|
||||||
|
return profile.display_name
|
||||||
|
return self.username
|
||||||
|
|
||||||
|
def save(self, *args, **kwargs):
|
||||||
|
if not self.user_id:
|
||||||
|
self.user_id = generate_random_id(User, "user_id")
|
||||||
|
super().save(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class UserProfile(models.Model):
|
||||||
|
# Read-only ID
|
||||||
|
profile_id = models.CharField(
|
||||||
|
max_length=10,
|
||||||
|
unique=True,
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this profile that remains constant",
|
||||||
|
)
|
||||||
|
|
||||||
|
user = models.OneToOneField(User, on_delete=models.CASCADE, related_name="profile")
|
||||||
|
display_name = models.CharField(
|
||||||
|
max_length=50,
|
||||||
|
unique=True,
|
||||||
|
help_text="This is the name that will be displayed on the site",
|
||||||
|
)
|
||||||
|
avatar = models.ImageField(upload_to="avatars/", blank=True)
|
||||||
|
pronouns = models.CharField(max_length=50, blank=True)
|
||||||
|
|
||||||
|
bio = models.TextField(max_length=500, blank=True)
|
||||||
|
|
||||||
|
# Social media links
|
||||||
|
twitter = models.URLField(blank=True)
|
||||||
|
instagram = models.URLField(blank=True)
|
||||||
|
youtube = models.URLField(blank=True)
|
||||||
|
discord = models.CharField(max_length=100, blank=True)
|
||||||
|
|
||||||
|
# Ride statistics
|
||||||
|
coaster_credits = models.IntegerField(default=0)
|
||||||
|
dark_ride_credits = models.IntegerField(default=0)
|
||||||
|
flat_ride_credits = models.IntegerField(default=0)
|
||||||
|
water_ride_credits = models.IntegerField(default=0)
|
||||||
|
|
||||||
|
def get_avatar(self):
|
||||||
|
"""Return the avatar URL or serve a pre-generated avatar based on the first letter of the username"""
|
||||||
|
if self.avatar:
|
||||||
|
return self.avatar.url
|
||||||
|
first_letter = self.user.username[0].upper()
|
||||||
|
avatar_path = f"avatars/letters/{first_letter}_avatar.png"
|
||||||
|
if os.path.exists(avatar_path):
|
||||||
|
return f"/{avatar_path}"
|
||||||
|
return "/static/images/default-avatar.png"
|
||||||
|
|
||||||
|
def save(self, *args, **kwargs):
|
||||||
|
# If no display name is set, use the username
|
||||||
|
if not self.display_name:
|
||||||
|
self.display_name = self.user.username
|
||||||
|
|
||||||
|
if not self.profile_id:
|
||||||
|
self.profile_id = generate_random_id(UserProfile, "profile_id")
|
||||||
|
super().save(*args, **kwargs)
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return self.display_name
|
||||||
|
|
||||||
|
|
||||||
|
class EmailVerification(models.Model):
|
||||||
|
user = models.OneToOneField(User, on_delete=models.CASCADE)
|
||||||
|
token = models.CharField(max_length=64, unique=True)
|
||||||
|
created_at = models.DateTimeField(auto_now_add=True)
|
||||||
|
last_sent = models.DateTimeField(auto_now_add=True)
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return f"Email verification for {self.user.username}"
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
verbose_name = "Email Verification"
|
||||||
|
verbose_name_plural = "Email Verifications"
|
||||||
|
|
||||||
|
|
||||||
|
class PasswordReset(models.Model):
|
||||||
|
user = models.ForeignKey(User, on_delete=models.CASCADE)
|
||||||
|
token = models.CharField(max_length=64)
|
||||||
|
created_at = models.DateTimeField(auto_now_add=True)
|
||||||
|
expires_at = models.DateTimeField()
|
||||||
|
used = models.BooleanField(default=False)
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return f"Password reset for {self.user.username}"
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
verbose_name = "Password Reset"
|
||||||
|
verbose_name_plural = "Password Resets"
|
||||||
|
|
||||||
|
|
||||||
|
@pghistory.track()
|
||||||
|
class TopList(TrackedModel):
|
||||||
|
class Categories(models.TextChoices):
|
||||||
|
ROLLER_COASTER = "RC", _("Roller Coaster")
|
||||||
|
DARK_RIDE = "DR", _("Dark Ride")
|
||||||
|
FLAT_RIDE = "FR", _("Flat Ride")
|
||||||
|
WATER_RIDE = "WR", _("Water Ride")
|
||||||
|
PARK = "PK", _("Park")
|
||||||
|
|
||||||
|
user = models.ForeignKey(
|
||||||
|
User,
|
||||||
|
on_delete=models.CASCADE,
|
||||||
|
related_name="top_lists", # Added related_name for User model access
|
||||||
|
)
|
||||||
|
title = models.CharField(max_length=100)
|
||||||
|
category = models.CharField(max_length=2, choices=Categories.choices)
|
||||||
|
description = models.TextField(blank=True)
|
||||||
|
created_at = models.DateTimeField(auto_now_add=True)
|
||||||
|
updated_at = models.DateTimeField(auto_now=True)
|
||||||
|
|
||||||
|
class Meta(TrackedModel.Meta):
|
||||||
|
ordering = ["-updated_at"]
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return (
|
||||||
|
f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@pghistory.track()
|
||||||
|
class TopListItem(TrackedModel):
|
||||||
|
top_list = models.ForeignKey(
|
||||||
|
TopList, on_delete=models.CASCADE, related_name="items"
|
||||||
|
)
|
||||||
|
content_type = models.ForeignKey(
|
||||||
|
"contenttypes.ContentType", on_delete=models.CASCADE
|
||||||
|
)
|
||||||
|
object_id = models.PositiveIntegerField()
|
||||||
|
rank = models.PositiveIntegerField()
|
||||||
|
notes = models.TextField(blank=True)
|
||||||
|
|
||||||
|
class Meta(TrackedModel.Meta):
|
||||||
|
ordering = ["rank"]
|
||||||
|
unique_together = [["top_list", "rank"]]
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return f"#{self.rank} in {self.top_list.title}"
|
||||||
273
backend/apps/accounts/selectors.py
Normal file
@@ -0,0 +1,273 @@
|
|||||||
|
"""
|
||||||
|
Selectors for user and account-related data retrieval.
|
||||||
|
Following Django styleguide pattern for separating data access from business logic.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Dict, Any
|
||||||
|
from django.db.models import QuerySet, Q, F, Count
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
def user_profile_optimized(*, user_id: int) -> Any:
|
||||||
|
"""
|
||||||
|
Get a user with optimized queries for profile display.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
user_id: User ID
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
User instance with prefetched related data
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
User.DoesNotExist: If user doesn't exist
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.prefetch_related(
|
||||||
|
"park_reviews", "ride_reviews", "socialaccount_set"
|
||||||
|
)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.get(id=user_id)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def active_users_with_stats() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get active users with review statistics.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of active users with review counts
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.order_by("-total_review_count")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_with_recent_activity(*, days: int = 30) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have been active in the last N days.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
days: Number of days to look back for activity
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of recently active users
|
||||||
|
"""
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=days)
|
||||||
|
|
||||||
|
return (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(last_login__gte=cutoff_date)
|
||||||
|
| Q(park_reviews__created_at__gte=cutoff_date)
|
||||||
|
| Q(ride_reviews__created_at__gte=cutoff_date)
|
||||||
|
)
|
||||||
|
.annotate(
|
||||||
|
recent_park_reviews=Count(
|
||||||
|
"park_reviews",
|
||||||
|
filter=Q(park_reviews__created_at__gte=cutoff_date),
|
||||||
|
),
|
||||||
|
recent_ride_reviews=Count(
|
||||||
|
"ride_reviews",
|
||||||
|
filter=Q(ride_reviews__created_at__gte=cutoff_date),
|
||||||
|
),
|
||||||
|
recent_total_reviews=F("recent_park_reviews") + F("recent_ride_reviews"),
|
||||||
|
)
|
||||||
|
.order_by("-last_login")
|
||||||
|
.distinct()
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def top_reviewers(*, limit: int = 10) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get top users by review count.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
limit: Maximum number of users to return
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of top reviewers
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.filter(total_review_count__gt=0)
|
||||||
|
.order_by("-total_review_count")[:limit]
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def moderator_users() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users with moderation permissions.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users who can moderate content
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(is_staff=True)
|
||||||
|
| Q(groups__name="Moderators")
|
||||||
|
| Q(
|
||||||
|
user_permissions__codename__in=[
|
||||||
|
"change_parkreview",
|
||||||
|
"change_ridereview",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.distinct()
|
||||||
|
.order_by("username")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_by_registration_date(*, start_date, end_date) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who registered within a date range.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
start_date: Start of date range
|
||||||
|
end_date: End of date range
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users registered in the date range
|
||||||
|
"""
|
||||||
|
return User.objects.filter(
|
||||||
|
date_joined__date__gte=start_date, date_joined__date__lte=end_date
|
||||||
|
).order_by("-date_joined")
|
||||||
|
|
||||||
|
|
||||||
|
def user_search_autocomplete(*, query: str, limit: int = 10) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users matching a search query for autocomplete functionality.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
query: Search string
|
||||||
|
limit: Maximum number of results
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of matching users for autocomplete
|
||||||
|
"""
|
||||||
|
return User.objects.filter(
|
||||||
|
Q(username__icontains=query)
|
||||||
|
| Q(first_name__icontains=query)
|
||||||
|
| Q(last_name__icontains=query),
|
||||||
|
is_active=True,
|
||||||
|
).order_by("username")[:limit]
|
||||||
|
|
||||||
|
|
||||||
|
def users_with_social_accounts() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have connected social accounts.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with social account connections
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(socialaccount__isnull=False)
|
||||||
|
.prefetch_related("socialaccount_set")
|
||||||
|
.distinct()
|
||||||
|
.order_by("username")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def user_statistics_summary() -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Get overall user statistics for dashboard/analytics.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dictionary containing user statistics
|
||||||
|
"""
|
||||||
|
total_users = User.objects.count()
|
||||||
|
active_users = User.objects.filter(is_active=True).count()
|
||||||
|
staff_users = User.objects.filter(is_staff=True).count()
|
||||||
|
|
||||||
|
# Users with reviews
|
||||||
|
users_with_reviews = (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(park_reviews__isnull=False) | Q(ride_reviews__isnull=False)
|
||||||
|
)
|
||||||
|
.distinct()
|
||||||
|
.count()
|
||||||
|
)
|
||||||
|
|
||||||
|
# Recent registrations (last 30 days)
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=30)
|
||||||
|
recent_registrations = User.objects.filter(date_joined__gte=cutoff_date).count()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"total_users": total_users,
|
||||||
|
"active_users": active_users,
|
||||||
|
"inactive_users": total_users - active_users,
|
||||||
|
"staff_users": staff_users,
|
||||||
|
"users_with_reviews": users_with_reviews,
|
||||||
|
"recent_registrations": recent_registrations,
|
||||||
|
"review_participation_rate": (
|
||||||
|
(users_with_reviews / total_users * 100) if total_users > 0 else 0
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def users_needing_email_verification() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who haven't verified their email addresses.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with unverified emails
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True, emailaddress__verified=False)
|
||||||
|
.distinct()
|
||||||
|
.order_by("date_joined")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_by_review_activity(*, min_reviews: int = 1) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have written at least a minimum number of reviews.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
min_reviews: Minimum number of reviews required
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with sufficient review activity
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.filter(total_review_count__gte=min_reviews)
|
||||||
|
.order_by("-total_review_count")
|
||||||
|
)
|
||||||
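A minimal usage sketch (not part of the diff) of how the selector functions above could be consumed from a view or a shell; the import path apps.accounts.selectors is an assumption based on this commit's layout.

# Hedged usage sketch -- module path is assumed, not shown in this hunk
from apps.accounts.selectors import top_reviewers, users_with_recent_activity

# Top ten reviewers by published review count
for user in top_reviewers(limit=10):
    print(user.username, user.total_review_count)

# Users active in the last week
recent = users_with_recent_activity(days=7)
print(recent.count())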
189
backend/apps/accounts/signals.py
Normal file
@@ -0,0 +1,189 @@
|
|||||||
|
from django.db.models.signals import post_save, pre_save
|
||||||
|
from django.dispatch import receiver
|
||||||
|
from django.contrib.auth.models import Group
|
||||||
|
from django.db import transaction
|
||||||
|
from django.core.files import File
|
||||||
|
from django.core.files.temp import NamedTemporaryFile
|
||||||
|
import requests
|
||||||
|
from .models import User, UserProfile
|
||||||
|
|
||||||
|
|
||||||
|
@receiver(post_save, sender=User)
|
||||||
|
def create_user_profile(sender, instance, created, **kwargs):
|
||||||
|
"""Create UserProfile for new users"""
|
||||||
|
try:
|
||||||
|
if created:
|
||||||
|
# Create profile
|
||||||
|
profile = UserProfile.objects.create(user=instance)
|
||||||
|
|
||||||
|
# If user has a social account with avatar, download it
|
||||||
|
social_account = instance.socialaccount_set.first()
|
||||||
|
if social_account:
|
||||||
|
extra_data = social_account.extra_data
|
||||||
|
avatar_url = None
|
||||||
|
|
||||||
|
if social_account.provider == "google":
|
||||||
|
avatar_url = extra_data.get("picture")
|
||||||
|
elif social_account.provider == "discord":
|
||||||
|
avatar = extra_data.get("avatar")
|
||||||
|
discord_id = extra_data.get("id")
|
||||||
|
if avatar:
|
||||||
|
avatar_url = f"https://cdn.discordapp.com/avatars/{discord_id}/{avatar}.png"
|
||||||
|
|
||||||
|
if avatar_url:
|
||||||
|
try:
|
||||||
|
response = requests.get(avatar_url, timeout=60)
|
||||||
|
if response.status_code == 200:
|
||||||
|
img_temp = NamedTemporaryFile(delete=True)
|
||||||
|
img_temp.write(response.content)
|
||||||
|
img_temp.flush()
|
||||||
|
|
||||||
|
file_name = f"avatar_{instance.username}.png"
|
||||||
|
profile.avatar.save(file_name, File(img_temp), save=True)
|
||||||
|
except Exception as e:
|
||||||
|
print(
f"Error downloading avatar for user {instance.username}: {str(e)}"
)
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error creating profile for user {instance.username}: {str(e)}")
|
||||||
|
|
||||||
|
|
||||||
|
@receiver(post_save, sender=User)
|
||||||
|
def save_user_profile(sender, instance, **kwargs):
|
||||||
|
"""Ensure UserProfile exists and is saved"""
|
||||||
|
try:
|
||||||
|
# Try to get existing profile first
|
||||||
|
try:
|
||||||
|
profile = instance.profile
|
||||||
|
profile.save()
|
||||||
|
except UserProfile.DoesNotExist:
|
||||||
|
# Profile doesn't exist, create it
|
||||||
|
UserProfile.objects.create(user=instance)
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error saving profile for user {instance.username}: {str(e)}")
|
||||||
|
|
||||||
|
|
||||||
|
@receiver(pre_save, sender=User)
|
||||||
|
def sync_user_role_with_groups(sender, instance, **kwargs):
|
||||||
|
"""Sync user role with Django groups"""
|
||||||
|
if instance.pk: # Only for existing users
|
||||||
|
try:
|
||||||
|
old_instance = User.objects.get(pk=instance.pk)
|
||||||
|
if old_instance.role != instance.role:
|
||||||
|
# Role has changed, update groups
|
||||||
|
with transaction.atomic():
|
||||||
|
# Remove from old role group if exists
|
||||||
|
if old_instance.role != User.Roles.USER:
|
||||||
|
old_group = Group.objects.filter(name=old_instance.role).first()
|
||||||
|
if old_group:
|
||||||
|
instance.groups.remove(old_group)
|
||||||
|
|
||||||
|
# Add to new role group
|
||||||
|
if instance.role != User.Roles.USER:
|
||||||
|
new_group, _ = Group.objects.get_or_create(name=instance.role)
|
||||||
|
instance.groups.add(new_group)
|
||||||
|
|
||||||
|
# Special handling for superuser role
|
||||||
|
if instance.role == User.Roles.SUPERUSER:
|
||||||
|
instance.is_superuser = True
|
||||||
|
instance.is_staff = True
|
||||||
|
elif old_instance.role == User.Roles.SUPERUSER:
|
||||||
|
# If removing superuser role, remove superuser
|
||||||
|
# status
|
||||||
|
instance.is_superuser = False
|
||||||
|
if instance.role not in [
|
||||||
|
User.Roles.ADMIN,
|
||||||
|
User.Roles.MODERATOR,
|
||||||
|
]:
|
||||||
|
instance.is_staff = False
|
||||||
|
|
||||||
|
# Handle staff status for admin and moderator roles
|
||||||
|
if instance.role in [
|
||||||
|
User.Roles.ADMIN,
|
||||||
|
User.Roles.MODERATOR,
|
||||||
|
]:
|
||||||
|
instance.is_staff = True
|
||||||
|
elif old_instance.role in [
|
||||||
|
User.Roles.ADMIN,
|
||||||
|
User.Roles.MODERATOR,
|
||||||
|
]:
|
||||||
|
# If removing admin/moderator role, remove staff
|
||||||
|
# status
|
||||||
|
if instance.role not in [User.Roles.SUPERUSER]:
|
||||||
|
instance.is_staff = False
|
||||||
|
except User.DoesNotExist:
|
||||||
|
pass
|
||||||
|
except Exception as e:
|
||||||
|
print(
f"Error syncing role with groups for user {instance.username}: {str(e)}"
)
|
||||||
|
|
||||||
|
|
||||||
|
def create_default_groups():
|
||||||
|
"""
|
||||||
|
Create default groups with appropriate permissions.
|
||||||
|
Call this in a migration or management command.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
from django.contrib.auth.models import Permission
|
||||||
|
|
||||||
|
# Create Moderator group
|
||||||
|
moderator_group, _ = Group.objects.get_or_create(name=User.Roles.MODERATOR)
|
||||||
|
moderator_permissions = [
|
||||||
|
# Review moderation permissions
|
||||||
|
"change_review",
|
||||||
|
"delete_review",
|
||||||
|
"change_reviewreport",
|
||||||
|
"delete_reviewreport",
|
||||||
|
# Edit moderation permissions
|
||||||
|
"change_parkedit",
|
||||||
|
"delete_parkedit",
|
||||||
|
"change_rideedit",
|
||||||
|
"delete_rideedit",
|
||||||
|
"change_companyedit",
|
||||||
|
"delete_companyedit",
|
||||||
|
"change_manufactureredit",
|
||||||
|
"delete_manufactureredit",
|
||||||
|
]
|
||||||
|
|
||||||
|
# Create Admin group
|
||||||
|
admin_group, _ = Group.objects.get_or_create(name=User.Roles.ADMIN)
|
||||||
|
admin_permissions = moderator_permissions + [
|
||||||
|
# User management permissions
|
||||||
|
"change_user",
|
||||||
|
"delete_user",
|
||||||
|
# Content management permissions
|
||||||
|
"add_park",
|
||||||
|
"change_park",
|
||||||
|
"delete_park",
|
||||||
|
"add_ride",
|
||||||
|
"change_ride",
|
||||||
|
"delete_ride",
|
||||||
|
"add_company",
|
||||||
|
"change_company",
|
||||||
|
"delete_company",
|
||||||
|
"add_manufacturer",
|
||||||
|
"change_manufacturer",
|
||||||
|
"delete_manufacturer",
|
||||||
|
]
|
||||||
|
|
||||||
|
# Assign permissions to groups
|
||||||
|
for codename in moderator_permissions:
|
||||||
|
try:
|
||||||
|
perm = Permission.objects.get(codename=codename)
|
||||||
|
moderator_group.permissions.add(perm)
|
||||||
|
except Permission.DoesNotExist:
|
||||||
|
print(f"Permission not found: {codename}")
|
||||||
|
|
||||||
|
for codename in admin_permissions:
|
||||||
|
try:
|
||||||
|
perm = Permission.objects.get(codename=codename)
|
||||||
|
admin_group.permissions.add(perm)
|
||||||
|
except Permission.DoesNotExist:
|
||||||
|
print(f"Permission not found: {codename}")
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error creating default groups: {str(e)}")
|
||||||
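The create_default_groups helper above is documented as something to call from a migration or management command; a minimal data-migration sketch follows, where the migration name, dependency, and import path are assumptions rather than part of this commit.

# Hedged data-migration sketch -- dependency "0001_initial" and module path are assumptions
from django.db import migrations


def seed_groups(apps, schema_editor):
    # Import at call time so the helper runs against the installed app
    from apps.accounts.signals import create_default_groups
    create_default_groups()


class Migration(migrations.Migration):
    dependencies = [("accounts", "0001_initial")]
    operations = [migrations.RunPython(seed_groups, migrations.RunPython.noop)]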
0
backend/apps/accounts/templatetags/__init__.py
Normal file
23
backend/apps/accounts/templatetags/turnstile_tags.py
Normal file
@@ -0,0 +1,23 @@
from django import template
from django.conf import settings
from django.template.loader import render_to_string

register = template.Library()


@register.simple_tag
def turnstile_widget():
    """
    Template tag to render the Cloudflare Turnstile widget.
    When DEBUG is True, renders an empty template.
    When DEBUG is False, renders the normal widget.
    Usage: {% load turnstile_tags %}{% turnstile_widget %}
    """
    if settings.DEBUG:
        template_name = "accounts/turnstile_widget_empty.html"
        context = {}
    else:
        template_name = "accounts/turnstile_widget.html"
        context = {"site_key": settings.TURNSTILE_SITE_KEY}

    return render_to_string(template_name, context)
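For context, the tag above only depends on one setting outside DEBUG; a hedged settings sketch follows, where the secret-key name is an assumption (only TURNSTILE_SITE_KEY appears in this hunk).

# Hedged settings sketch -- values are placeholders, TURNSTILE_SECRET_KEY is assumed for server-side validation
TURNSTILE_SITE_KEY = "your-cloudflare-turnstile-site-key"
TURNSTILE_SECRET_KEY = "your-cloudflare-turnstile-secret-key"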
126
backend/apps/accounts/tests.py
Normal file
@@ -0,0 +1,126 @@
|
|||||||
|
from django.test import TestCase
|
||||||
|
from django.contrib.auth.models import Group, Permission
|
||||||
|
from django.contrib.contenttypes.models import ContentType
|
||||||
|
from unittest.mock import patch, MagicMock
|
||||||
|
from .models import User, UserProfile
|
||||||
|
from .signals import create_default_groups
|
||||||
|
|
||||||
|
|
||||||
|
class SignalsTestCase(TestCase):
|
||||||
|
def setUp(self):
|
||||||
|
self.user = User.objects.create_user(
|
||||||
|
username="testuser",
|
||||||
|
email="testuser@example.com",
|
||||||
|
password="password",
|
||||||
|
)
|
||||||
|
|
||||||
|
def test_create_user_profile(self):
|
||||||
|
# Refresh user from database to ensure signals have been processed
|
||||||
|
self.user.refresh_from_db()
|
||||||
|
|
||||||
|
# Check if profile exists in database first
|
||||||
|
profile_exists = UserProfile.objects.filter(user=self.user).exists()
|
||||||
|
self.assertTrue(profile_exists, "UserProfile should be created by signals")
|
||||||
|
|
||||||
|
# Now safely access the profile
|
||||||
|
profile = UserProfile.objects.get(user=self.user)
|
||||||
|
self.assertIsInstance(profile, UserProfile)
|
||||||
|
|
||||||
|
# Test the reverse relationship
|
||||||
|
self.assertTrue(hasattr(self.user, "profile"))
|
||||||
|
# Test that we can access the profile through the user relationship
|
||||||
|
user_profile = getattr(self.user, "profile", None)
|
||||||
|
self.assertEqual(user_profile, profile)
|
||||||
|
|
||||||
|
@patch("accounts.signals.requests.get")
|
||||||
|
def test_create_user_profile_with_social_avatar(self, mock_get):
|
||||||
|
# Mock the response from requests.get
|
||||||
|
mock_response = MagicMock()
|
||||||
|
mock_response.status_code = 200
|
||||||
|
mock_response.content = b"fake-image-content"
|
||||||
|
mock_get.return_value = mock_response
|
||||||
|
|
||||||
|
# Create a social account for the user (we'll skip this test since socialaccount_set requires allauth setup)
|
||||||
|
# This test would need proper allauth configuration to work
|
||||||
|
self.skipTest("Requires proper allauth socialaccount setup")
|
||||||
|
|
||||||
|
def test_save_user_profile(self):
|
||||||
|
# Get the profile safely first
|
||||||
|
profile = UserProfile.objects.get(user=self.user)
|
||||||
|
profile.delete()
|
||||||
|
|
||||||
|
# Refresh user to clear cached profile relationship
|
||||||
|
self.user.refresh_from_db()
|
||||||
|
|
||||||
|
# Check that profile no longer exists
|
||||||
|
self.assertFalse(UserProfile.objects.filter(user=self.user).exists())
|
||||||
|
|
||||||
|
# Trigger save to recreate profile via signal
|
||||||
|
self.user.save()
|
||||||
|
|
||||||
|
# Verify profile was recreated
|
||||||
|
self.assertTrue(UserProfile.objects.filter(user=self.user).exists())
|
||||||
|
new_profile = UserProfile.objects.get(user=self.user)
|
||||||
|
self.assertIsInstance(new_profile, UserProfile)
|
||||||
|
|
||||||
|
def test_sync_user_role_with_groups(self):
|
||||||
|
self.user.role = User.Roles.MODERATOR
|
||||||
|
self.user.save()
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.ADMIN
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.ADMIN).exists())
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.SUPERUSER
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.filter(name=User.Roles.ADMIN).exists())
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.SUPERUSER).exists())
|
||||||
|
self.assertTrue(self.user.is_superuser)
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.USER
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.exists())
|
||||||
|
self.assertFalse(self.user.is_superuser)
|
||||||
|
self.assertFalse(self.user.is_staff)
|
||||||
|
|
||||||
|
def test_create_default_groups(self):
|
||||||
|
# Create some permissions for testing
|
||||||
|
content_type = ContentType.objects.get_for_model(User)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="change_review",
|
||||||
|
name="Can change review",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="delete_review",
|
||||||
|
name="Can delete review",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="change_user",
|
||||||
|
name="Can change user",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
|
||||||
|
create_default_groups()
|
||||||
|
|
||||||
|
moderator_group = Group.objects.get(name=User.Roles.MODERATOR)
|
||||||
|
self.assertIsNotNone(moderator_group)
|
||||||
|
self.assertTrue(
|
||||||
|
moderator_group.permissions.filter(codename="change_review").exists()
|
||||||
|
)
|
||||||
|
self.assertFalse(
|
||||||
|
moderator_group.permissions.filter(codename="change_user").exists()
|
||||||
|
)
|
||||||
|
|
||||||
|
admin_group = Group.objects.get(name=User.Roles.ADMIN)
|
||||||
|
self.assertIsNotNone(admin_group)
|
||||||
|
self.assertTrue(
|
||||||
|
admin_group.permissions.filter(codename="change_review").exists()
|
||||||
|
)
|
||||||
|
self.assertTrue(admin_group.permissions.filter(codename="change_user").exists())
|
||||||
48
backend/apps/accounts/urls.py
Normal file
@@ -0,0 +1,48 @@
from django.urls import path
from django.contrib.auth import views as auth_views
from allauth.account.views import LogoutView
from . import views

app_name = "accounts"

urlpatterns = [
    # Override allauth's login and signup views with our Turnstile-enabled
    # versions
    path("login/", views.CustomLoginView.as_view(), name="account_login"),
    path("signup/", views.CustomSignupView.as_view(), name="account_signup"),
    # Authentication views
    path("logout/", LogoutView.as_view(), name="logout"),
    path(
        "password_change/",
        auth_views.PasswordChangeView.as_view(),
        name="password_change",
    ),
    path(
        "password_change/done/",
        auth_views.PasswordChangeDoneView.as_view(),
        name="password_change_done",
    ),
    path(
        "password_reset/",
        auth_views.PasswordResetView.as_view(),
        name="password_reset",
    ),
    path(
        "password_reset/done/",
        auth_views.PasswordResetDoneView.as_view(),
        name="password_reset_done",
    ),
    path(
        "reset/<uidb64>/<token>/",
        auth_views.PasswordResetConfirmView.as_view(),
        name="password_reset_confirm",
    ),
    path(
        "reset/done/",
        auth_views.PasswordResetCompleteView.as_view(),
        name="password_reset_complete",
    ),
    # Profile views
    path("profile/", views.user_redirect_view, name="profile_redirect"),
    path("settings/", views.SettingsView.as_view(), name="settings"),
]
426
backend/apps/accounts/views.py
Normal file
@@ -0,0 +1,426 @@
|
|||||||
|
from django.views.generic import DetailView, TemplateView
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.shortcuts import get_object_or_404, redirect, render
|
||||||
|
from django.contrib.auth.decorators import login_required
|
||||||
|
from django.contrib.auth.mixins import LoginRequiredMixin
|
||||||
|
from django.contrib import messages
|
||||||
|
from django.core.exceptions import ValidationError
|
||||||
|
from django.template.loader import render_to_string
|
||||||
|
from django.utils.crypto import get_random_string
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
from django.contrib.sites.shortcuts import get_current_site
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
from django.contrib.sites.requests import RequestSite
|
||||||
|
from django.db.models import QuerySet
|
||||||
|
from django.http import HttpResponseRedirect, HttpResponse, HttpRequest
|
||||||
|
from django.urls import reverse
|
||||||
|
from django.contrib.auth import login
|
||||||
|
from django.core.files.uploadedfile import UploadedFile
|
||||||
|
from apps.accounts.models import (
|
||||||
|
User,
|
||||||
|
PasswordReset,
|
||||||
|
TopList,
|
||||||
|
EmailVerification,
|
||||||
|
UserProfile,
|
||||||
|
)
|
||||||
|
from apps.email_service.services import EmailService
|
||||||
|
from apps.parks.models import ParkReview
|
||||||
|
from apps.rides.models import RideReview
|
||||||
|
from allauth.account.views import LoginView, SignupView
|
||||||
|
from .mixins import TurnstileMixin
|
||||||
|
from typing import Dict, Any, Optional, Union, cast
|
||||||
|
from django_htmx.http import HttpResponseClientRefresh
|
||||||
|
from contextlib import suppress
|
||||||
|
import re
|
||||||
|
|
||||||
|
UserModel = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
class CustomLoginView(TurnstileMixin, LoginView):
|
||||||
|
def form_valid(self, form):
|
||||||
|
try:
|
||||||
|
self.validate_turnstile(self.request)
|
||||||
|
except ValidationError as e:
|
||||||
|
form.add_error(None, str(e))
|
||||||
|
return self.form_invalid(form)
|
||||||
|
|
||||||
|
response = super().form_valid(form)
|
||||||
|
return (
|
||||||
|
HttpResponseClientRefresh()
|
||||||
|
if getattr(self.request, "htmx", False)
|
||||||
|
else response
|
||||||
|
)
|
||||||
|
|
||||||
|
def form_invalid(self, form):
|
||||||
|
if getattr(self.request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
self.request,
|
||||||
|
"account/partials/login_form.html",
|
||||||
|
self.get_context_data(form=form),
|
||||||
|
)
|
||||||
|
return super().form_invalid(form)
|
||||||
|
|
||||||
|
def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||||
|
if getattr(request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
request,
|
||||||
|
"account/partials/login_modal.html",
|
||||||
|
self.get_context_data(),
|
||||||
|
)
|
||||||
|
return super().get(request, *args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class CustomSignupView(TurnstileMixin, SignupView):
|
||||||
|
def form_valid(self, form):
|
||||||
|
try:
|
||||||
|
self.validate_turnstile(self.request)
|
||||||
|
except ValidationError as e:
|
||||||
|
form.add_error(None, str(e))
|
||||||
|
return self.form_invalid(form)
|
||||||
|
|
||||||
|
response = super().form_valid(form)
|
||||||
|
return (
|
||||||
|
HttpResponseClientRefresh()
|
||||||
|
if getattr(self.request, "htmx", False)
|
||||||
|
else response
|
||||||
|
)
|
||||||
|
|
||||||
|
def form_invalid(self, form):
|
||||||
|
if getattr(self.request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
self.request,
|
||||||
|
"account/partials/signup_modal.html",
|
||||||
|
self.get_context_data(form=form),
|
||||||
|
)
|
||||||
|
return super().form_invalid(form)
|
||||||
|
|
||||||
|
def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||||
|
if getattr(request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
request,
|
||||||
|
"account/partials/signup_modal.html",
|
||||||
|
self.get_context_data(),
|
||||||
|
)
|
||||||
|
return super().get(request, *args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
@login_required
|
||||||
|
def user_redirect_view(request: HttpRequest) -> HttpResponse:
|
||||||
|
user = cast(User, request.user)
|
||||||
|
return redirect("profile", username=user.username)
|
||||||
|
|
||||||
|
|
||||||
|
def handle_social_login(request: HttpRequest, email: str) -> HttpResponse:
|
||||||
|
if sociallogin := request.session.get("socialaccount_sociallogin"):
|
||||||
|
sociallogin.user.email = email
|
||||||
|
sociallogin.save()
|
||||||
|
login(request, sociallogin.user)
|
||||||
|
del request.session["socialaccount_sociallogin"]
|
||||||
|
messages.success(request, "Successfully logged in")
|
||||||
|
return redirect("/")
|
||||||
|
|
||||||
|
|
||||||
|
def email_required(request: HttpRequest) -> HttpResponse:
|
||||||
|
if not request.session.get("socialaccount_sociallogin"):
|
||||||
|
messages.error(request, "No social login in progress")
|
||||||
|
return redirect("/")
|
||||||
|
|
||||||
|
if request.method == "POST":
|
||||||
|
if email := request.POST.get("email"):
|
||||||
|
return handle_social_login(request, email)
|
||||||
|
messages.error(request, "Email is required")
|
||||||
|
return render(
|
||||||
|
request,
|
||||||
|
"accounts/email_required.html",
|
||||||
|
{"error": "Email is required"},
|
||||||
|
)
|
||||||
|
|
||||||
|
return render(request, "accounts/email_required.html")
|
||||||
|
|
||||||
|
|
||||||
|
class ProfileView(DetailView):
|
||||||
|
model = User
|
||||||
|
template_name = "accounts/profile.html"
|
||||||
|
context_object_name = "profile_user"
|
||||||
|
slug_field = "username"
|
||||||
|
slug_url_kwarg = "username"
|
||||||
|
|
||||||
|
def get_queryset(self) -> QuerySet[User]:
|
||||||
|
return User.objects.select_related("profile")
|
||||||
|
|
||||||
|
def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
|
||||||
|
context = super().get_context_data(**kwargs)
|
||||||
|
user = cast(User, self.get_object())
|
||||||
|
|
||||||
|
context["park_reviews"] = self._get_user_park_reviews(user)
|
||||||
|
context["ride_reviews"] = self._get_user_ride_reviews(user)
|
||||||
|
context["top_lists"] = self._get_user_top_lists(user)
|
||||||
|
|
||||||
|
return context
|
||||||
|
|
||||||
|
def _get_user_park_reviews(self, user: User) -> QuerySet[ParkReview]:
|
||||||
|
return (
|
||||||
|
ParkReview.objects.filter(user=user, is_published=True)
|
||||||
|
.select_related("user", "user__profile", "park")
|
||||||
|
.order_by("-created_at")[:5]
|
||||||
|
)
|
||||||
|
|
||||||
|
def _get_user_ride_reviews(self, user: User) -> QuerySet[RideReview]:
|
||||||
|
return (
|
||||||
|
RideReview.objects.filter(user=user, is_published=True)
|
||||||
|
.select_related("user", "user__profile", "ride")
|
||||||
|
.order_by("-created_at")[:5]
|
||||||
|
)
|
||||||
|
|
||||||
|
def _get_user_top_lists(self, user: User) -> QuerySet[TopList]:
|
||||||
|
return (
|
||||||
|
TopList.objects.filter(user=user)
|
||||||
|
.select_related("user", "user__profile")
|
||||||
|
.prefetch_related("items")
|
||||||
|
.order_by("-created_at")[:5]
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class SettingsView(LoginRequiredMixin, TemplateView):
|
||||||
|
template_name = "accounts/settings.html"
|
||||||
|
|
||||||
|
def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
|
||||||
|
context = super().get_context_data(**kwargs)
|
||||||
|
context["user"] = self.request.user
|
||||||
|
return context
|
||||||
|
|
||||||
|
def _handle_profile_update(self, request: HttpRequest) -> None:
|
||||||
|
user = cast(User, request.user)
|
||||||
|
profile = get_object_or_404(UserProfile, user=user)
|
||||||
|
|
||||||
|
if display_name := request.POST.get("display_name"):
|
||||||
|
profile.display_name = display_name
|
||||||
|
|
||||||
|
if "avatar" in request.FILES:
|
||||||
|
avatar_file = cast(UploadedFile, request.FILES["avatar"])
|
||||||
|
profile.avatar.save(avatar_file.name, avatar_file, save=False)
|
||||||
|
profile.save()
|
||||||
|
|
||||||
|
user.save()
|
||||||
|
messages.success(request, "Profile updated successfully")
|
||||||
|
|
||||||
|
def _validate_password(self, password: str) -> bool:
|
||||||
|
"""Validate password meets requirements."""
|
||||||
|
return (
|
||||||
|
len(password) >= 8
|
||||||
|
and bool(re.search(r"[A-Z]", password))
|
||||||
|
and bool(re.search(r"[a-z]", password))
|
||||||
|
and bool(re.search(r"[0-9]", password))
|
||||||
|
)
|
||||||
|
|
||||||
|
def _send_password_change_confirmation(
|
||||||
|
self, request: HttpRequest, user: User
|
||||||
|
) -> None:
|
||||||
|
"""Send password change confirmation email."""
|
||||||
|
site = get_current_site(request)
|
||||||
|
context = {
|
||||||
|
"user": user,
|
||||||
|
"site_name": site.name,
|
||||||
|
}
|
||||||
|
|
||||||
|
email_html = render_to_string(
|
||||||
|
"accounts/email/password_change_confirmation.html", context
|
||||||
|
)
|
||||||
|
|
||||||
|
EmailService.send_email(
|
||||||
|
to=user.email,
|
||||||
|
subject="Password Changed Successfully",
|
||||||
|
text="Your password has been changed successfully.",
|
||||||
|
site=site,
|
||||||
|
html=email_html,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _handle_password_change(
|
||||||
|
self, request: HttpRequest
|
||||||
|
) -> Optional[HttpResponseRedirect]:
|
||||||
|
user = cast(User, request.user)
|
||||||
|
old_password = request.POST.get("old_password", "")
|
||||||
|
new_password = request.POST.get("new_password", "")
|
||||||
|
confirm_password = request.POST.get("confirm_password", "")
|
||||||
|
|
||||||
|
if not user.check_password(old_password):
|
||||||
|
messages.error(request, "Current password is incorrect")
|
||||||
|
return None
|
||||||
|
|
||||||
|
if new_password != confirm_password:
|
||||||
|
messages.error(request, "New passwords do not match")
|
||||||
|
return None
|
||||||
|
|
||||||
|
if not self._validate_password(new_password):
|
||||||
|
messages.error(
|
||||||
|
request,
|
||||||
|
"Password must be at least 8 characters and contain uppercase, lowercase, and numbers",
|
||||||
|
)
|
||||||
|
return None
|
||||||
|
|
||||||
|
user.set_password(new_password)
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
self._send_password_change_confirmation(request, user)
|
||||||
|
messages.success(
|
||||||
|
request,
|
||||||
|
"Password changed successfully. Please check your email for confirmation.",
|
||||||
|
)
|
||||||
|
return HttpResponseRedirect(reverse("account_login"))
|
||||||
|
|
||||||
|
def _handle_email_change(self, request: HttpRequest) -> None:
|
||||||
|
if new_email := request.POST.get("new_email"):
|
||||||
|
self._send_email_verification(request, new_email)
|
||||||
|
messages.success(
|
||||||
|
request, "Verification email sent to your new email address"
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
messages.error(request, "New email is required")
|
||||||
|
|
||||||
|
def _send_email_verification(self, request: HttpRequest, new_email: str) -> None:
|
||||||
|
user = cast(User, request.user)
|
||||||
|
token = get_random_string(64)
|
||||||
|
EmailVerification.objects.update_or_create(user=user, defaults={"token": token})
|
||||||
|
|
||||||
|
site = cast(Site, get_current_site(request))
|
||||||
|
verification_url = reverse("verify_email", kwargs={"token": token})
|
||||||
|
|
||||||
|
context = {
|
||||||
|
"user": user,
|
||||||
|
"verification_url": verification_url,
|
||||||
|
"site_name": site.name,
|
||||||
|
}
|
||||||
|
|
||||||
|
email_html = render_to_string("accounts/email/verify_email.html", context)
|
||||||
|
EmailService.send_email(
|
||||||
|
to=new_email,
|
||||||
|
subject="Verify your new email address",
|
||||||
|
text="Click the link to verify your new email address",
|
||||||
|
site=site,
|
||||||
|
html=email_html,
|
||||||
|
)
|
||||||
|
|
||||||
|
user.pending_email = new_email
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
def post(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||||
|
action = request.POST.get("action")
|
||||||
|
|
||||||
|
if action == "update_profile":
|
||||||
|
self._handle_profile_update(request)
|
||||||
|
elif action == "change_password":
|
||||||
|
if response := self._handle_password_change(request):
|
||||||
|
return response
|
||||||
|
elif action == "change_email":
|
||||||
|
self._handle_email_change(request)
|
||||||
|
|
||||||
|
return self.get(request, *args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
def create_password_reset_token(user: User) -> str:
|
||||||
|
token = get_random_string(64)
|
||||||
|
PasswordReset.objects.update_or_create(
|
||||||
|
user=user,
|
||||||
|
defaults={
|
||||||
|
"token": token,
|
||||||
|
"expires_at": timezone.now() + timedelta(hours=24),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
return token
|
||||||
|
|
||||||
|
|
||||||
|
def send_password_reset_email(
|
||||||
|
user: User, site: Union[Site, RequestSite], token: str
|
||||||
|
) -> None:
|
||||||
|
reset_url = reverse("password_reset_confirm", kwargs={"token": token})
|
||||||
|
context = {
|
||||||
|
"user": user,
|
||||||
|
"reset_url": reset_url,
|
||||||
|
"site_name": site.name,
|
||||||
|
}
|
||||||
|
email_html = render_to_string("accounts/email/password_reset.html", context)
|
||||||
|
|
||||||
|
EmailService.send_email(
|
||||||
|
to=user.email,
|
||||||
|
subject="Reset your password",
|
||||||
|
text="Click the link to reset your password",
|
||||||
|
site=site,
|
||||||
|
html=email_html,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def request_password_reset(request: HttpRequest) -> HttpResponse:
|
||||||
|
if request.method != "POST":
|
||||||
|
return render(request, "accounts/password_reset.html")
|
||||||
|
|
||||||
|
if not (email := request.POST.get("email")):
|
||||||
|
messages.error(request, "Email is required")
|
||||||
|
return redirect("account_reset_password")
|
||||||
|
|
||||||
|
with suppress(User.DoesNotExist):
|
||||||
|
user = User.objects.get(email=email)
|
||||||
|
token = create_password_reset_token(user)
|
||||||
|
site = get_current_site(request)
|
||||||
|
send_password_reset_email(user, site, token)
|
||||||
|
|
||||||
|
messages.success(request, "Password reset email sent")
|
||||||
|
return redirect("account_login")
|
||||||
|
|
||||||
|
|
||||||
|
def handle_password_reset(
|
||||||
|
request: HttpRequest,
|
||||||
|
user: User,
|
||||||
|
new_password: str,
|
||||||
|
reset: PasswordReset,
|
||||||
|
site: Union[Site, RequestSite],
|
||||||
|
) -> None:
|
||||||
|
user.set_password(new_password)
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
reset.used = True
|
||||||
|
reset.save()
|
||||||
|
|
||||||
|
send_password_reset_confirmation(user, site)
|
||||||
|
messages.success(request, "Password reset successfully")
|
||||||
|
|
||||||
|
|
||||||
|
def send_password_reset_confirmation(
|
||||||
|
user: User, site: Union[Site, RequestSite]
|
||||||
|
) -> None:
|
||||||
|
context = {
|
||||||
|
"user": user,
|
||||||
|
"site_name": site.name,
|
||||||
|
}
|
||||||
|
email_html = render_to_string(
|
||||||
|
"accounts/email/password_reset_complete.html", context
|
||||||
|
)
|
||||||
|
|
||||||
|
EmailService.send_email(
|
||||||
|
to=user.email,
|
||||||
|
subject="Password Reset Complete",
|
||||||
|
text="Your password has been reset successfully.",
|
||||||
|
site=site,
|
||||||
|
html=email_html,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def reset_password(request: HttpRequest, token: str) -> HttpResponse:
|
||||||
|
try:
|
||||||
|
reset = PasswordReset.objects.select_related("user").get(
|
||||||
|
token=token, expires_at__gt=timezone.now(), used=False
|
||||||
|
)
|
||||||
|
|
||||||
|
if request.method == "POST":
|
||||||
|
if new_password := request.POST.get("new_password"):
|
||||||
|
site = get_current_site(request)
|
||||||
|
handle_password_reset(request, reset.user, new_password, reset, site)
|
||||||
|
return redirect("account_login")
|
||||||
|
|
||||||
|
messages.error(request, "New password is required")
|
||||||
|
|
||||||
|
return render(request, "accounts/password_reset_confirm.html", {"token": token})
|
||||||
|
|
||||||
|
except PasswordReset.DoesNotExist:
|
||||||
|
messages.error(request, "Invalid or expired reset token")
|
||||||
|
return redirect("account_reset_password")
|
||||||
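The views module above also defines function views (request_password_reset, reset_password, email_required) that do not appear in the urls.py hunk earlier; a hypothetical wiring sketch follows, with route names chosen to match the reverse() calls in the module. This is an assumption, not part of the commit.

# Hypothetical URL wiring sketch -- routes, paths, and placement are assumptions
from django.urls import path
from apps.accounts import views

urlpatterns = [
    path("password-reset/", views.request_password_reset, name="account_reset_password"),
    path("password-reset/<str:token>/", views.reset_password, name="password_reset_confirm"),
    path("email-required/", views.email_required, name="email_required"),
]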
43
backend/apps/context_portal/alembic.ini
Normal file
@@ -0,0 +1,43 @@

# A generic Alembic configuration file.

[alembic]
# path to migration scripts
script_location = alembic

# The database URL is now set dynamically by ConPort's run_migrations function.
# sqlalchemy.url = sqlite:///your_database.db
# ... other Alembic settings ...
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
76
backend/apps/context_portal/alembic/env.py
Normal file
@@ -0,0 +1,76 @@
|
|||||||
|
from logging.config import fileConfig
|
||||||
|
|
||||||
|
from sqlalchemy import engine_from_config
|
||||||
|
from sqlalchemy import pool
|
||||||
|
|
||||||
|
from alembic import context
|
||||||
|
|
||||||
|
# this is the Alembic Config object, which provides
|
||||||
|
# access to the values within the .ini file in use.
|
||||||
|
config = context.config
|
||||||
|
|
||||||
|
# Interpret the config file for Python logging.
|
||||||
|
# This line prevents the need to have a separate logging config file.
|
||||||
|
if config.config_file_name is not None:
|
||||||
|
fileConfig(config.config_file_name)
|
||||||
|
|
||||||
|
# add your model's MetaData object here
|
||||||
|
# for 'autogenerate' support
|
||||||
|
# from myapp import mymodel
|
||||||
|
# target_metadata = mymodel.Base.metadata
|
||||||
|
target_metadata = None
|
||||||
|
|
||||||
|
# other values from the config, defined by the needs of env.py,
|
||||||
|
# can be acquired:
|
||||||
|
# my_important_option = config.get_main_option("my_important_option")
|
||||||
|
# ... etc.
|
||||||
|
|
||||||
|
|
||||||
|
def run_migrations_offline() -> None:
|
||||||
|
"""Run migrations in 'offline' mode.
|
||||||
|
|
||||||
|
This configures the context with just a URL
|
||||||
|
and not an Engine, though an Engine is acceptable
|
||||||
|
here as well. By skipping the Engine creation
|
||||||
|
we don't even need a DBAPI to be available.
|
||||||
|
|
||||||
|
Calls to context.execute() here emit the given string to the
|
||||||
|
script output.
|
||||||
|
|
||||||
|
"""
|
||||||
|
url = config.get_main_option("sqlalchemy.url")
|
||||||
|
context.configure(
|
||||||
|
url=url,
|
||||||
|
target_metadata=target_metadata,
|
||||||
|
literal_binds=True,
|
||||||
|
dialect_opts={"paramstyle": "named"},
|
||||||
|
)
|
||||||
|
|
||||||
|
with context.begin_transaction():
|
||||||
|
context.run_migrations()
|
||||||
|
|
||||||
|
|
||||||
|
def run_migrations_online() -> None:
|
||||||
|
"""Run migrations in 'online' mode.
|
||||||
|
|
||||||
|
In this scenario we need to create an Engine
|
||||||
|
and associate a connection with the context.
|
||||||
|
|
||||||
|
"""
|
||||||
|
connectable = engine_from_config(
|
||||||
|
config.get_section(config.config_ini_section, {}),
|
||||||
|
prefix="sqlalchemy.",
|
||||||
|
poolclass=pool.NullPool,
|
||||||
|
)
|
||||||
|
|
||||||
|
with connectable.connect() as connection:
|
||||||
|
context.configure(connection=connection, target_metadata=target_metadata)
|
||||||
|
|
||||||
|
with context.begin_transaction():
|
||||||
|
context.run_migrations()
|
||||||
|
|
||||||
|
|
||||||
|
if context.is_offline_mode():
|
||||||
|
run_migrations_offline()
|
||||||
|
else:
|
||||||
|
run_migrations_online()
|
||||||
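With the env.py above in place, the initial-schema migration that follows would typically be applied through the standard Alembic entry points; a minimal programmatic sketch (the ini path is taken from this commit's layout, and the database URL is injected by ConPort's run_migrations per the alembic.ini comment) is:

# Hedged sketch: applying migrations programmatically, equivalent to `alembic upgrade head`
from alembic.config import Config
from alembic import command

cfg = Config("backend/apps/context_portal/alembic.ini")  # path assumed from this commit's layout
command.upgrade(cfg, "head")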
@@ -0,0 +1,247 @@
|
|||||||
|
"""Initial schema
|
||||||
|
|
||||||
|
Revision ID: 20250617
|
||||||
|
Revises:
|
||||||
|
Create Date: 2025-06-17 15:00:00.000000
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
import json
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision = "20250617"
|
||||||
|
down_revision = None
|
||||||
|
branch_labels = None
|
||||||
|
depends_on = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
# ### commands auto-generated by Alembic - please adjust! ###
|
||||||
|
op.create_table(
|
||||||
|
"active_context",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"active_context_history",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("version", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.Column("change_source", sa.String(length=255), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"context_links",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("workspace_id", sa.String(length=1024), nullable=False),
|
||||||
|
sa.Column("source_item_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("source_item_id", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("target_item_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("target_item_id", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("relationship_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=True),
|
||||||
|
sa.Column(
|
||||||
|
"timestamp",
|
||||||
|
sa.DateTime(),
|
||||||
|
server_default=sa.text("(CURRENT_TIMESTAMP)"),
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_source_item_id"),
|
||||||
|
"context_links",
|
||||||
|
["source_item_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_source_item_type"),
|
||||||
|
"context_links",
|
||||||
|
["source_item_type"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_target_item_id"),
|
||||||
|
"context_links",
|
||||||
|
["target_item_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_target_item_type"),
|
||||||
|
"context_links",
|
||||||
|
["target_item_type"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"custom_data",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("category", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("key", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("value", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("category", "key"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"decisions",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("summary", sa.Text(), nullable=False),
|
||||||
|
sa.Column("rationale", sa.Text(), nullable=True),
|
||||||
|
sa.Column("implementation_details", sa.Text(), nullable=True),
|
||||||
|
sa.Column("tags", sa.Text(), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"product_context",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"product_context_history",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("version", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.Column("change_source", sa.String(length=255), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"progress_entries",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("status", sa.String(length=50), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=False),
|
||||||
|
sa.Column("parent_id", sa.Integer(), nullable=True),
|
||||||
|
sa.ForeignKeyConstraint(
|
||||||
|
["parent_id"], ["progress_entries.id"], ondelete="SET NULL"
|
||||||
|
),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"system_patterns",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("name", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=True),
|
||||||
|
sa.Column("tags", sa.Text(), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("name"),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Seed initial data
|
||||||
|
op.execute("INSERT INTO product_context (id, content) VALUES (1, '{}')")
|
||||||
|
op.execute("INSERT INTO active_context (id, content) VALUES (1, '{}')")
|
||||||
|
|
||||||
|
# Create FTS5 virtual table for decisions
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE VIRTUAL TABLE decisions_fts USING fts5(
|
||||||
|
summary,
|
||||||
|
rationale,
|
||||||
|
implementation_details,
|
||||||
|
tags,
|
||||||
|
content="decisions",
|
||||||
|
content_rowid="id"
|
||||||
|
);
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create triggers to keep the FTS table in sync with the decisions table
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_insert AFTER INSERT ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES (new.id, new.summary, new.rationale, new.implementation_details, new.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_delete AFTER DELETE ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (decisions_fts, rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES ('delete', old.id, old.summary, old.rationale, old.implementation_details, old.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_update AFTER UPDATE ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (decisions_fts, rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES ('delete', old.id, old.summary, old.rationale, old.implementation_details, old.tags);
|
||||||
|
INSERT INTO decisions_fts (rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES (new.id, new.summary, new.rationale, new.implementation_details, new.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create FTS5 virtual table for custom_data
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE VIRTUAL TABLE custom_data_fts USING fts5(
|
||||||
|
category,
|
||||||
|
key,
|
||||||
|
value_text,
|
||||||
|
content="custom_data",
|
||||||
|
content_rowid="id"
|
||||||
|
);
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create triggers for custom_data_fts
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_insert AFTER INSERT ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (rowid, category, key, value_text)
|
||||||
|
VALUES (new.id, new.category, new.key, new.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_delete AFTER DELETE ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (custom_data_fts, rowid, category, key, value_text)
|
||||||
|
VALUES ('delete', old.id, old.category, old.key, old.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_update AFTER UPDATE ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (custom_data_fts, rowid, category, key, value_text)
|
||||||
|
VALUES ('delete', old.id, old.category, old.key, old.value);
|
||||||
|
INSERT INTO custom_data_fts (rowid, category, key, value_text)
|
||||||
|
VALUES (new.id, new.category, new.key, new.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
# ### end Alembic commands ###
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
# ### commands auto-generated by Alembic - please adjust! ###
|
||||||
|
op.drop_table("system_patterns")
|
||||||
|
op.drop_table("progress_entries")
|
||||||
|
op.drop_table("product_context_history")
|
||||||
|
op.drop_table("product_context")
|
||||||
|
op.drop_table("decisions")
|
||||||
|
op.drop_table("custom_data")
|
||||||
|
op.drop_index(op.f("ix_context_links_target_item_type"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_target_item_id"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_source_item_type"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_source_item_id"), table_name="context_links")
|
||||||
|
op.drop_table("context_links")
|
||||||
|
op.drop_table("active_context_history")
|
||||||
|
op.drop_table("active_context")
|
||||||
|
# ### end Alembic commands ###
|
||||||
BIN
backend/apps/context_portal/context.db
Normal file
Binary file not shown.
0
backend/apps/core/__init__.py
Normal file
30
backend/apps/core/admin.py
Normal file
@@ -0,0 +1,30 @@
from django.contrib import admin
from django.utils.html import format_html
from .models import SlugHistory


@admin.register(SlugHistory)
class SlugHistoryAdmin(admin.ModelAdmin):
    list_display = ["content_object_link", "old_slug", "created_at"]
    list_filter = ["content_type", "created_at"]
    search_fields = ["old_slug", "object_id"]
    readonly_fields = ["content_type", "object_id", "old_slug", "created_at"]
    date_hierarchy = "created_at"
    ordering = ["-created_at"]

    @admin.display(description="Object")
    def content_object_link(self, obj):
        """Create a link to the related object's admin page"""
        try:
            url = obj.content_object.get_absolute_url()
            return format_html('<a href="{}">{}</a>', url, str(obj.content_object))
        except (AttributeError, ValueError):
            return str(obj.content_object)

    def has_add_permission(self, request):
        """Disable manual creation of slug history records"""
        return False

    def has_change_permission(self, request, obj=None):
        """Disable editing of slug history records"""
        return False
60
backend/apps/core/analytics.py
Normal file
@@ -0,0 +1,60 @@
from django.db import models
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.utils import timezone
from django.db.models import Count


class PageView(models.Model):
    content_type = models.ForeignKey(
        ContentType, on_delete=models.CASCADE, related_name="page_views"
    )
    object_id = models.PositiveIntegerField()
    content_object = GenericForeignKey("content_type", "object_id")

    timestamp = models.DateTimeField(auto_now_add=True, db_index=True)
    ip_address = models.GenericIPAddressField()
    user_agent = models.CharField(max_length=512, blank=True)

    class Meta:
        indexes = [
            models.Index(fields=["timestamp"]),
            models.Index(fields=["content_type", "object_id"]),
        ]

    @classmethod
    def get_trending_items(cls, model_class, hours=24, limit=10):
        """Get trending items of a specific model class based on views in the last X hours.

        Args:
            model_class: The model class to get trending items for (e.g., Park, Ride)
            hours (int): Number of hours to look back for views (default: 24)
            limit (int): Maximum number of items to return (default: 10)

        Returns:
            QuerySet: The trending items ordered by view count
        """
        content_type = ContentType.objects.get_for_model(model_class)
        cutoff = timezone.now() - timezone.timedelta(hours=hours)

        # Query through the ContentType relationship
        item_ids = (
            cls.objects.filter(content_type=content_type, timestamp__gte=cutoff)
            .values("object_id")
            .annotate(view_count=Count("id"))
            .filter(view_count__gt=0)
            .order_by("-view_count")
            .values_list("object_id", flat=True)[:limit]
        )

        # Get the actual items in the correct order
        if item_ids:
            # Materialize the sliced queryset into a concrete list of ids
            id_list = list(item_ids)
            # Use Case/When to preserve the ordering
            from django.db.models import Case, When

            preserved = Case(*[When(pk=pk, then=pos) for pos, pk in enumerate(id_list)])
            return model_class.objects.filter(pk__in=id_list).order_by(preserved)

        return model_class.objects.none()
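A short usage sketch for the classmethod above; the Park import path is an assumption based on the apps.parks.models references elsewhere in this commit.

# Hedged usage sketch -- import paths assumed from this commit's app layout
from apps.parks.models import Park
from apps.core.analytics import PageView

trending_parks = PageView.get_trending_items(Park, hours=48, limit=5)
for park in trending_parks:
    print(park.pk, park)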
1
backend/apps/core/api/__init__.py
Normal file
@@ -0,0 +1 @@
# Core API infrastructure for ThrillWiki
205
backend/apps/core/api/exceptions.py
Normal file
@@ -0,0 +1,205 @@
|
|||||||
|
"""
|
||||||
|
Custom exception handling for ThrillWiki API.
|
||||||
|
Provides standardized error responses following Django styleguide patterns.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Any, Dict, Optional
|
||||||
|
|
||||||
|
from django.http import Http404
|
||||||
|
from django.core.exceptions import (
|
||||||
|
PermissionDenied,
|
||||||
|
ValidationError as DjangoValidationError,
|
||||||
|
)
|
||||||
|
from rest_framework import status
|
||||||
|
from rest_framework.response import Response
|
||||||
|
from rest_framework.views import exception_handler
|
||||||
|
from rest_framework.exceptions import (
|
||||||
|
ValidationError as DRFValidationError,
|
||||||
|
NotFound,
|
||||||
|
PermissionDenied as DRFPermissionDenied,
|
||||||
|
)
|
||||||
|
|
||||||
|
from ..exceptions import ThrillWikiException
|
||||||
|
from ..logging import get_logger, log_exception
|
||||||
|
|
||||||
|
logger = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def custom_exception_handler(
|
||||||
|
exc: Exception, context: Dict[str, Any]
|
||||||
|
) -> Optional[Response]:
|
||||||
|
"""
|
||||||
|
Custom exception handler for DRF that provides standardized error responses.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response with standardized error format or None to fallback to default handler
|
||||||
|
"""
|
||||||
|
# Call REST framework's default exception handler first
|
||||||
|
response = exception_handler(exc, context)
|
||||||
|
|
||||||
|
if response is not None:
|
||||||
|
# Standardize the error response format
|
||||||
|
custom_response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": {
|
||||||
|
"code": _get_error_code(exc),
|
||||||
|
"message": _get_error_message(exc, response.data),
|
||||||
|
"details": _get_error_details(exc, response.data),
|
||||||
|
},
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Add request context for debugging
|
||||||
|
if hasattr(context.get("request"), "user"):
|
||||||
|
custom_response_data["error"]["request_user"] = str(context["request"].user)
|
||||||
|
|
||||||
|
# Log the error for monitoring
|
||||||
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": response.status_code},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
|
||||||
|
response.data = custom_response_data
|
||||||
|
|
||||||
|
# Handle ThrillWiki custom exceptions
|
||||||
|
elif isinstance(exc, ThrillWikiException):
|
||||||
|
custom_response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": exc.to_dict(),
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": exc.status_code},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
response = Response(custom_response_data, status=exc.status_code)
|
||||||
|
|
||||||
|
# Handle specific Django exceptions that DRF doesn't catch
|
||||||
|
elif isinstance(exc, DjangoValidationError):
|
||||||
|
custom_response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": {
|
||||||
|
"code": "VALIDATION_ERROR",
|
||||||
|
"message": "Validation failed",
|
||||||
|
"details": _format_django_validation_errors(exc),
|
||||||
|
},
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": status.HTTP_400_BAD_REQUEST},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
response = Response(custom_response_data, status=status.HTTP_400_BAD_REQUEST)
|
||||||
|
|
||||||
|
elif isinstance(exc, Http404):
|
||||||
|
custom_response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": {
|
||||||
|
"code": "NOT_FOUND",
|
||||||
|
"message": "Resource not found",
|
||||||
|
"details": str(exc) if str(exc) else None,
|
||||||
|
},
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": status.HTTP_404_NOT_FOUND},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
response = Response(custom_response_data, status=status.HTTP_404_NOT_FOUND)
|
||||||
|
|
||||||
|
elif isinstance(exc, PermissionDenied):
|
||||||
|
custom_response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": {
|
||||||
|
"code": "PERMISSION_DENIED",
|
||||||
|
"message": "Permission denied",
|
||||||
|
"details": str(exc) if str(exc) else None,
|
||||||
|
},
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": status.HTTP_403_FORBIDDEN},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
response = Response(custom_response_data, status=status.HTTP_403_FORBIDDEN)
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
def _get_error_code(exc: Exception) -> str:
|
||||||
|
"""Extract or determine error code from exception."""
|
||||||
|
if hasattr(exc, "default_code"):
|
||||||
|
return exc.default_code.upper()
|
||||||
|
|
||||||
|
if isinstance(exc, DRFValidationError):
|
||||||
|
return "VALIDATION_ERROR"
|
||||||
|
elif isinstance(exc, NotFound):
|
||||||
|
return "NOT_FOUND"
|
||||||
|
elif isinstance(exc, DRFPermissionDenied):
|
||||||
|
return "PERMISSION_DENIED"
|
||||||
|
|
||||||
|
return exc.__class__.__name__.upper()
|
||||||
|
|
||||||
|
|
||||||
|
def _get_error_message(exc: Exception, response_data: Any) -> str:
|
||||||
|
"""Extract user-friendly error message."""
|
||||||
|
if isinstance(response_data, dict):
|
||||||
|
# Handle DRF validation errors
|
||||||
|
if "detail" in response_data:
|
||||||
|
return str(response_data["detail"])
|
||||||
|
elif "non_field_errors" in response_data:
|
||||||
|
errors = response_data["non_field_errors"]
|
||||||
|
return errors[0] if isinstance(errors, list) and errors else str(errors)
|
||||||
|
elif isinstance(response_data, dict) and len(response_data) == 1:
|
||||||
|
key, value = next(iter(response_data.items()))
|
||||||
|
if isinstance(value, list) and value:
|
||||||
|
return f"{key}: {value[0]}"
|
||||||
|
return f"{key}: {value}"
|
||||||
|
|
||||||
|
# Fallback to exception message
|
||||||
|
return str(exc) if str(exc) else "An error occurred"
|
||||||
|
|
||||||
|
|
||||||
|
def _get_error_details(exc: Exception, response_data: Any) -> Optional[Dict[str, Any]]:
|
||||||
|
"""Extract detailed error information for debugging."""
|
||||||
|
if isinstance(response_data, dict) and len(response_data) > 1:
|
||||||
|
return response_data
|
||||||
|
|
||||||
|
if hasattr(exc, "detail") and isinstance(exc.detail, dict):
|
||||||
|
return exc.detail
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def _format_django_validation_errors(
|
||||||
|
exc: DjangoValidationError,
|
||||||
|
) -> Dict[str, Any]:
|
||||||
|
"""Format Django ValidationError for API response."""
|
||||||
|
if hasattr(exc, "error_dict"):
|
||||||
|
# Field-specific errors
|
||||||
|
return {
|
||||||
|
field: [str(error) for error in errors]
|
||||||
|
for field, errors in exc.error_dict.items()
|
||||||
|
}
|
||||||
|
elif hasattr(exc, "error_list"):
|
||||||
|
# Non-field errors
|
||||||
|
return {"non_field_errors": [str(error) for error in exc.error_list]}
|
||||||
|
|
||||||
|
return {"non_field_errors": [str(exc)]}
|
||||||
|
|
||||||
|
|
||||||
|
# Removed _log_api_error - using centralized logging instead
|
||||||
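For this handler to take effect, DRF needs to be pointed at it; a minimal settings sketch, with the dotted path assumed from the file location in this diff:

```python
# settings.py (sketch) -- wiring DRF to the custom handler defined above.
REST_FRAMEWORK = {
    "EXCEPTION_HANDLER": "apps.core.api.exceptions.custom_exception_handler",
}

# A failing request then yields an envelope shaped like:
# {"status": "error",
#  "error": {"code": "NOT_FOUND", "message": "Resource not found", "details": None},
#  "data": None}
```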
260
backend/apps/core/api/mixins.py
Normal file
@@ -0,0 +1,260 @@
|
|||||||
|
"""
|
||||||
|
Common mixins for API views following Django styleguide patterns.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Dict, Any, Optional
|
||||||
|
from rest_framework.request import Request
|
||||||
|
from rest_framework.response import Response
|
||||||
|
from rest_framework import status
|
||||||
|
|
||||||
|
|
||||||
|
class ApiMixin:
|
||||||
|
"""
|
||||||
|
Base mixin for API views providing standardized response formatting.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def create_response(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
data: Any = None,
|
||||||
|
message: Optional[str] = None,
|
||||||
|
status_code: int = status.HTTP_200_OK,
|
||||||
|
pagination: Optional[Dict[str, Any]] = None,
|
||||||
|
metadata: Optional[Dict[str, Any]] = None,
|
||||||
|
) -> Response:
|
||||||
|
"""
|
||||||
|
Create standardized API response.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
data: Response data
|
||||||
|
message: Optional success message
|
||||||
|
status_code: HTTP status code
|
||||||
|
pagination: Pagination information
|
||||||
|
metadata: Additional metadata
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Standardized Response object
|
||||||
|
"""
|
||||||
|
response_data = {
|
||||||
|
"status": "success" if status_code < 400 else "error",
|
||||||
|
"data": data,
|
||||||
|
}
|
||||||
|
|
||||||
|
if message:
|
||||||
|
response_data["message"] = message
|
||||||
|
|
||||||
|
if pagination:
|
||||||
|
response_data["pagination"] = pagination
|
||||||
|
|
||||||
|
if metadata:
|
||||||
|
response_data["metadata"] = metadata
|
||||||
|
|
||||||
|
return Response(response_data, status=status_code)
|
||||||
|
|
||||||
|
def create_error_response(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
message: str,
|
||||||
|
status_code: int = status.HTTP_400_BAD_REQUEST,
|
||||||
|
error_code: Optional[str] = None,
|
||||||
|
details: Optional[Dict[str, Any]] = None,
|
||||||
|
) -> Response:
|
||||||
|
"""
|
||||||
|
Create standardized error response.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
message: Error message
|
||||||
|
status_code: HTTP status code
|
||||||
|
error_code: Optional error code
|
||||||
|
details: Additional error details
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Standardized error Response object
|
||||||
|
"""
|
||||||
|
error_data = {
|
||||||
|
"code": error_code or "GENERIC_ERROR",
|
||||||
|
"message": message,
|
||||||
|
}
|
||||||
|
|
||||||
|
if details:
|
||||||
|
error_data["details"] = details
|
||||||
|
|
||||||
|
response_data = {
|
||||||
|
"status": "error",
|
||||||
|
"error": error_data,
|
||||||
|
"data": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
return Response(response_data, status=status_code)
|
||||||
|
|
||||||
|
|
||||||
|
class CreateApiMixin(ApiMixin):
|
||||||
|
"""
|
||||||
|
Mixin for create API endpoints with standardized input/output handling.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def create(self, request: Request, *args, **kwargs) -> Response:
|
||||||
|
"""Handle POST requests for creating resources."""
|
||||||
|
serializer = self.get_input_serializer(data=request.data)
|
||||||
|
serializer.is_valid(raise_exception=True)
|
||||||
|
|
||||||
|
# Create the object using the service layer
|
||||||
|
obj = self.perform_create(**serializer.validated_data)
|
||||||
|
|
||||||
|
# Serialize the output
|
||||||
|
output_serializer = self.get_output_serializer(obj)
|
||||||
|
|
||||||
|
return self.create_response(
|
||||||
|
data=output_serializer.data,
|
||||||
|
status_code=status.HTTP_201_CREATED,
|
||||||
|
message="Resource created successfully",
|
||||||
|
)
|
||||||
|
|
||||||
|
def perform_create(self, **validated_data):
|
||||||
|
"""
|
||||||
|
Override this method to implement object creation logic.
|
||||||
|
Should use service layer methods.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError("Subclasses must implement perform_create")
|
||||||
|
|
||||||
|
def get_input_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the input serializer for validation."""
|
||||||
|
return self.InputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
def get_output_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the output serializer for response."""
|
||||||
|
return self.OutputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class UpdateApiMixin(ApiMixin):
|
||||||
|
"""
|
||||||
|
Mixin for update API endpoints with standardized input/output handling.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def update(self, request: Request, *args, **kwargs) -> Response:
|
||||||
|
"""Handle PUT/PATCH requests for updating resources."""
|
||||||
|
instance = self.get_object()
|
||||||
|
serializer = self.get_input_serializer(
|
||||||
|
data=request.data, partial=kwargs.get("partial", False)
|
||||||
|
)
|
||||||
|
serializer.is_valid(raise_exception=True)
|
||||||
|
|
||||||
|
# Update the object using the service layer
|
||||||
|
updated_obj = self.perform_update(instance, **serializer.validated_data)
|
||||||
|
|
||||||
|
# Serialize the output
|
||||||
|
output_serializer = self.get_output_serializer(updated_obj)
|
||||||
|
|
||||||
|
return self.create_response(
|
||||||
|
data=output_serializer.data,
|
||||||
|
message="Resource updated successfully",
|
||||||
|
)
|
||||||
|
|
||||||
|
def perform_update(self, instance, **validated_data):
|
||||||
|
"""
|
||||||
|
Override this method to implement object update logic.
|
||||||
|
Should use service layer methods.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError("Subclasses must implement perform_update")
|
||||||
|
|
||||||
|
def get_input_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the input serializer for validation."""
|
||||||
|
return self.InputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
def get_output_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the output serializer for response."""
|
||||||
|
return self.OutputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class ListApiMixin(ApiMixin):
|
||||||
|
"""
|
||||||
|
Mixin for list API endpoints with pagination and filtering.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def list(self, request: Request, *args, **kwargs) -> Response:
|
||||||
|
"""Handle GET requests for listing resources."""
|
||||||
|
# Use selector to get filtered queryset
|
||||||
|
queryset = self.get_queryset()
|
||||||
|
|
||||||
|
# Apply pagination
|
||||||
|
page = self.paginate_queryset(queryset)
|
||||||
|
if page is not None:
|
||||||
|
serializer = self.get_output_serializer(page, many=True)
|
||||||
|
return self.get_paginated_response(serializer.data)
|
||||||
|
|
||||||
|
# No pagination
|
||||||
|
serializer = self.get_output_serializer(queryset, many=True)
|
||||||
|
return self.create_response(data=serializer.data)
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
"""
|
||||||
|
Override this method to use selector patterns.
|
||||||
|
Should call selector functions, not access model managers directly.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError(
|
||||||
|
"Subclasses must implement get_queryset using selectors"
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_output_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the output serializer for response."""
|
||||||
|
return self.OutputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class RetrieveApiMixin(ApiMixin):
|
||||||
|
"""
|
||||||
|
Mixin for retrieve API endpoints.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def retrieve(self, request: Request, *args, **kwargs) -> Response:
|
||||||
|
"""Handle GET requests for retrieving a single resource."""
|
||||||
|
instance = self.get_object()
|
||||||
|
serializer = self.get_output_serializer(instance)
|
||||||
|
|
||||||
|
return self.create_response(data=serializer.data)
|
||||||
|
|
||||||
|
def get_object(self):
|
||||||
|
"""
|
||||||
|
Override this method to use selector patterns.
|
||||||
|
Should call selector functions for optimized queries.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError(
|
||||||
|
"Subclasses must implement get_object using selectors"
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_output_serializer(self, *args, **kwargs):
|
||||||
|
"""Get the output serializer for response."""
|
||||||
|
return self.OutputSerializer(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class DestroyApiMixin(ApiMixin):
|
||||||
|
"""
|
||||||
|
Mixin for delete API endpoints.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def destroy(self, request: Request, *args, **kwargs) -> Response:
|
||||||
|
"""Handle DELETE requests for destroying resources."""
|
||||||
|
instance = self.get_object()
|
||||||
|
|
||||||
|
# Delete using service layer
|
||||||
|
self.perform_destroy(instance)
|
||||||
|
|
||||||
|
return self.create_response(
|
||||||
|
status_code=status.HTTP_204_NO_CONTENT,
|
||||||
|
message="Resource deleted successfully",
|
||||||
|
)
|
||||||
|
|
||||||
|
def perform_destroy(self, instance):
|
||||||
|
"""
|
||||||
|
Override this method to implement object deletion logic.
|
||||||
|
Should use service layer methods.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError("Subclasses must implement perform_destroy")
|
||||||
|
|
||||||
|
def get_object(self):
|
||||||
|
"""
|
||||||
|
Override this method to use selector patterns.
|
||||||
|
Should call selector functions for optimized queries.
|
||||||
|
"""
|
||||||
|
raise NotImplementedError(
|
||||||
|
"Subclasses must implement get_object using selectors"
|
||||||
|
)
|
||||||
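A sketch of a concrete endpoint built from these mixins; the serializer fields and the `park_create` service are hypothetical placeholders for the real service layer:

```python
from rest_framework import serializers
from rest_framework.views import APIView

from apps.core.api.mixins import CreateApiMixin  # path assumed from this diff


class ParkCreateApi(CreateApiMixin, APIView):
    """Hypothetical create endpoint following the InputSerializer/OutputSerializer pattern."""

    class InputSerializer(serializers.Serializer):
        name = serializers.CharField(max_length=255)
        country = serializers.CharField(max_length=100)

    class OutputSerializer(serializers.Serializer):
        id = serializers.IntegerField()
        name = serializers.CharField()
        slug = serializers.SlugField()

    def post(self, request, *args, **kwargs):
        return self.create(request, *args, **kwargs)

    def perform_create(self, **validated_data):
        # Delegate to a (hypothetical) service function rather than the ORM directly.
        from apps.parks.services import park_create
        return park_create(**validated_data)
```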
6
backend/apps/core/apps.py
Normal file
@@ -0,0 +1,6 @@
from django.apps import AppConfig


class CoreConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.core"
1
backend/apps/core/decorators/__init__.py
Normal file
@@ -0,0 +1 @@
# Decorators module
409
backend/apps/core/decorators/cache_decorators.py
Normal file
@@ -0,0 +1,409 @@
|
|||||||
|
"""
|
||||||
|
Advanced caching decorators for API views and functions.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import hashlib
|
||||||
|
import json
|
||||||
|
import time
|
||||||
|
from functools import wraps
|
||||||
|
from typing import Optional, List, Callable
|
||||||
|
from django.utils.decorators import method_decorator
|
||||||
|
from django.views.decorators.vary import vary_on_headers
|
||||||
|
from apps.core.services.enhanced_cache_service import EnhancedCacheService
|
||||||
|
import logging
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def cache_api_response(
|
||||||
|
timeout=1800, vary_on=None, key_prefix="api", cache_backend="api"
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Advanced decorator for caching API responses with flexible configuration
|
||||||
|
|
||||||
|
Args:
|
||||||
|
timeout: Cache timeout in seconds
|
||||||
|
vary_on: List of request attributes to vary cache on
|
||||||
|
key_prefix: Prefix for cache keys
|
||||||
|
cache_backend: Cache backend to use
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(view_func):
|
||||||
|
@wraps(view_func)
|
||||||
|
def wrapper(self, request, *args, **kwargs):
|
||||||
|
# Only cache GET requests
|
||||||
|
if request.method != "GET":
|
||||||
|
return view_func(self, request, *args, **kwargs)
|
||||||
|
|
||||||
|
# Generate cache key based on view, user, and parameters
|
||||||
|
cache_key_parts = [
|
||||||
|
key_prefix,
|
||||||
|
view_func.__name__,
|
||||||
|
(
|
||||||
|
str(request.user.id)
|
||||||
|
if request.user.is_authenticated
|
||||||
|
else "anonymous"
|
||||||
|
),
|
||||||
|
str(hash(frozenset(request.GET.items()))),
|
||||||
|
]
|
||||||
|
|
||||||
|
# Add URL parameters to cache key
|
||||||
|
if args:
|
||||||
|
cache_key_parts.append(str(hash(args)))
|
||||||
|
if kwargs:
|
||||||
|
cache_key_parts.append(str(hash(frozenset(kwargs.items()))))
|
||||||
|
|
||||||
|
# Add custom vary_on fields
|
||||||
|
if vary_on:
|
||||||
|
for field in vary_on:
|
||||||
|
value = getattr(request, field, "")
|
||||||
|
cache_key_parts.append(str(value))
|
||||||
|
|
||||||
|
cache_key = ":".join(cache_key_parts)
|
||||||
|
|
||||||
|
# Try to get from cache
|
||||||
|
cache_service = EnhancedCacheService()
|
||||||
|
cached_response = getattr(cache_service, cache_backend + "_cache").get(
|
||||||
|
cache_key
|
||||||
|
)
|
||||||
|
|
||||||
|
if cached_response:
|
||||||
|
logger.debug(
|
||||||
|
f"Cache hit for API view {view_func.__name__}",
|
||||||
|
extra={
|
||||||
|
"cache_key": cache_key,
|
||||||
|
"view": view_func.__name__,
|
||||||
|
"cache_hit": True,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
return cached_response
|
||||||
|
|
||||||
|
# Execute view and cache result
|
||||||
|
start_time = time.time()
|
||||||
|
response = view_func(self, request, *args, **kwargs)
|
||||||
|
execution_time = time.time() - start_time
|
||||||
|
|
||||||
|
# Only cache successful responses
|
||||||
|
if hasattr(response, "status_code") and response.status_code == 200:
|
||||||
|
getattr(cache_service, cache_backend + "_cache").set(
|
||||||
|
cache_key, response, timeout
|
||||||
|
)
|
||||||
|
logger.debug(
|
||||||
|
f"Cached API response for view {view_func.__name__}",
|
||||||
|
extra={
|
||||||
|
"cache_key": cache_key,
|
||||||
|
"view": view_func.__name__,
|
||||||
|
"execution_time": execution_time,
|
||||||
|
"cache_timeout": timeout,
|
||||||
|
"cache_miss": True,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.debug(
|
||||||
|
f"Not caching response for view {
|
||||||
|
view_func.__name__} (status: {
|
||||||
|
getattr(
|
||||||
|
response,
|
||||||
|
'status_code',
|
||||||
|
'unknown')})"
|
||||||
|
)
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def cache_queryset_result(
|
||||||
|
cache_key_template: str, timeout: int = 3600, cache_backend="default"
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Decorator for caching expensive queryset operations
|
||||||
|
|
||||||
|
Args:
|
||||||
|
cache_key_template: Template for cache key (can use format placeholders)
|
||||||
|
timeout: Cache timeout in seconds
|
||||||
|
cache_backend: Cache backend to use
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
# Generate cache key from template and arguments
|
||||||
|
try:
|
||||||
|
cache_key = cache_key_template.format(*args, **kwargs)
|
||||||
|
except (KeyError, IndexError):
|
||||||
|
# Fallback to simpler key generation
|
||||||
|
cache_key = f"{cache_key_template}:{
|
||||||
|
hash(
|
||||||
|
str(args) +
|
||||||
|
str(kwargs))}"
|
||||||
|
|
||||||
|
cache_service = EnhancedCacheService()
|
||||||
|
cached_result = getattr(cache_service, cache_backend + "_cache").get(
|
||||||
|
cache_key
|
||||||
|
)
|
||||||
|
|
||||||
|
if cached_result is not None:
|
||||||
|
logger.debug(
|
||||||
|
f"Cache hit for queryset operation: {
|
||||||
|
func.__name__}"
|
||||||
|
)
|
||||||
|
return cached_result
|
||||||
|
|
||||||
|
# Execute function and cache result
|
||||||
|
start_time = time.time()
|
||||||
|
result = func(*args, **kwargs)
|
||||||
|
execution_time = time.time() - start_time
|
||||||
|
|
||||||
|
getattr(cache_service, cache_backend + "_cache").set(
|
||||||
|
cache_key, result, timeout
|
||||||
|
)
|
||||||
|
logger.debug(
|
||||||
|
f"Cached queryset result for {func.__name__}",
|
||||||
|
extra={
|
||||||
|
"cache_key": cache_key,
|
||||||
|
"function": func.__name__,
|
||||||
|
"execution_time": execution_time,
|
||||||
|
"cache_timeout": timeout,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def invalidate_cache_on_save(model_name: str, cache_patterns: List[str] = None):
|
||||||
|
"""
|
||||||
|
Decorator to invalidate cache when model instances are saved
|
||||||
|
|
||||||
|
Args:
|
||||||
|
model_name: Name of the model
|
||||||
|
cache_patterns: List of cache key patterns to invalidate
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(self, *args, **kwargs):
|
||||||
|
result = func(self, *args, **kwargs)
|
||||||
|
|
||||||
|
# Invalidate related cache entries
|
||||||
|
cache_service = EnhancedCacheService()
|
||||||
|
|
||||||
|
# Standard model cache invalidation
|
||||||
|
instance_id = getattr(self, "id", None)
|
||||||
|
cache_service.invalidate_model_cache(model_name, instance_id)
|
||||||
|
|
||||||
|
# Custom pattern invalidation
|
||||||
|
if cache_patterns:
|
||||||
|
for pattern in cache_patterns:
|
||||||
|
if instance_id:
|
||||||
|
pattern = pattern.format(model=model_name, id=instance_id)
|
||||||
|
cache_service.invalidate_pattern(pattern)
|
||||||
|
|
||||||
|
logger.info(
|
||||||
|
f"Invalidated cache for {model_name} after save",
|
||||||
|
extra={
|
||||||
|
"model": model_name,
|
||||||
|
"instance_id": instance_id,
|
||||||
|
"patterns": cache_patterns,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
class CachedAPIViewMixin:
|
||||||
|
"""Mixin to add caching capabilities to API views"""
|
||||||
|
|
||||||
|
cache_timeout = 1800 # 30 minutes default
|
||||||
|
cache_vary_on = ["version"]
|
||||||
|
cache_key_prefix = "api"
|
||||||
|
cache_backend = "api"
|
||||||
|
|
||||||
|
@method_decorator(vary_on_headers("User-Agent", "Accept-Language"))
|
||||||
|
def dispatch(self, request, *args, **kwargs):
|
||||||
|
"""Add caching to the dispatch method"""
|
||||||
|
if request.method == "GET" and getattr(self, "enable_caching", True):
|
||||||
|
return self._cached_dispatch(request, *args, **kwargs)
|
||||||
|
return super().dispatch(request, *args, **kwargs)
|
||||||
|
|
||||||
|
def _cached_dispatch(self, request, *args, **kwargs):
|
||||||
|
"""Handle cached dispatch for GET requests"""
|
||||||
|
cache_key = self._generate_cache_key(request, *args, **kwargs)
|
||||||
|
|
||||||
|
cache_service = EnhancedCacheService()
|
||||||
|
cached_response = getattr(cache_service, self.cache_backend + "_cache").get(
|
||||||
|
cache_key
|
||||||
|
)
|
||||||
|
|
||||||
|
if cached_response:
|
||||||
|
logger.debug(f"Cache hit for view {self.__class__.__name__}")
|
||||||
|
return cached_response
|
||||||
|
|
||||||
|
# Execute view
|
||||||
|
response = super().dispatch(request, *args, **kwargs)
|
||||||
|
|
||||||
|
# Cache successful responses
|
||||||
|
if hasattr(response, "status_code") and response.status_code == 200:
|
||||||
|
getattr(cache_service, self.cache_backend + "_cache").set(
|
||||||
|
cache_key, response, self.cache_timeout
|
||||||
|
)
|
||||||
|
logger.debug(f"Cached response for view {self.__class__.__name__}")
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
def _generate_cache_key(self, request, *args, **kwargs):
|
||||||
|
"""Generate cache key for the request"""
|
||||||
|
key_parts = [
|
||||||
|
self.cache_key_prefix,
|
||||||
|
self.__class__.__name__,
|
||||||
|
request.method,
|
||||||
|
(str(request.user.id) if request.user.is_authenticated else "anonymous"),
|
||||||
|
str(hash(frozenset(request.GET.items()))),
|
||||||
|
]
|
||||||
|
|
||||||
|
if args:
|
||||||
|
key_parts.append(str(hash(args)))
|
||||||
|
if kwargs:
|
||||||
|
key_parts.append(str(hash(frozenset(kwargs.items()))))
|
||||||
|
|
||||||
|
# Add vary_on fields
|
||||||
|
for field in self.cache_vary_on:
|
||||||
|
value = getattr(request, field, "")
|
||||||
|
key_parts.append(str(value))
|
||||||
|
|
||||||
|
return ":".join(key_parts)
|
||||||
|
|
||||||
|
|
||||||
|
def smart_cache(
|
||||||
|
timeout: int = 3600,
|
||||||
|
key_func: Optional[Callable] = None,
|
||||||
|
invalidate_on: Optional[List[str]] = None,
|
||||||
|
cache_backend: str = "default",
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Smart caching decorator that adapts to function arguments
|
||||||
|
|
||||||
|
Args:
|
||||||
|
timeout: Cache timeout in seconds
|
||||||
|
key_func: Custom function to generate cache key
|
||||||
|
invalidate_on: List of signals to invalidate cache on
|
||||||
|
cache_backend: Cache backend to use
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
# Generate cache key
|
||||||
|
if key_func:
|
||||||
|
cache_key = key_func(*args, **kwargs)
|
||||||
|
else:
|
||||||
|
# Default key generation
|
||||||
|
key_data = {
|
||||||
|
"func": f"{func.__module__}.{func.__name__}",
|
||||||
|
"args": str(args),
|
||||||
|
"kwargs": json.dumps(kwargs, sort_keys=True, default=str),
|
||||||
|
}
|
||||||
|
key_string = json.dumps(key_data, sort_keys=True)
|
||||||
|
cache_key = f"smart_cache:{
|
||||||
|
hashlib.md5(
|
||||||
|
key_string.encode()).hexdigest()}"
|
||||||
|
|
||||||
|
# Try to get from cache
|
||||||
|
cache_service = EnhancedCacheService()
|
||||||
|
cached_result = getattr(cache_service, cache_backend + "_cache").get(
|
||||||
|
cache_key
|
||||||
|
)
|
||||||
|
|
||||||
|
if cached_result is not None:
|
||||||
|
logger.debug(f"Smart cache hit for {func.__name__}")
|
||||||
|
return cached_result
|
||||||
|
|
||||||
|
# Execute function
|
||||||
|
start_time = time.time()
|
||||||
|
result = func(*args, **kwargs)
|
||||||
|
execution_time = time.time() - start_time
|
||||||
|
|
||||||
|
# Cache result
|
||||||
|
getattr(cache_service, cache_backend + "_cache").set(
|
||||||
|
cache_key, result, timeout
|
||||||
|
)
|
||||||
|
|
||||||
|
logger.debug(
|
||||||
|
f"Smart cached result for {func.__name__}",
|
||||||
|
extra={
|
||||||
|
"cache_key": cache_key,
|
||||||
|
"execution_time": execution_time,
|
||||||
|
"function": func.__name__,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
# Add cache invalidation if specified
|
||||||
|
if invalidate_on:
|
||||||
|
wrapper._cache_invalidate_on = invalidate_on
|
||||||
|
wrapper._cache_backend = cache_backend
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def conditional_cache(condition_func: Callable, **cache_kwargs):
|
||||||
|
"""
|
||||||
|
Cache decorator that only caches when condition is met
|
||||||
|
|
||||||
|
Args:
|
||||||
|
condition_func: Function that returns True if caching should be applied
|
||||||
|
**cache_kwargs: Arguments passed to smart_cache
|
||||||
|
"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
cached_func = smart_cache(**cache_kwargs)(func)
|
||||||
|
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
if condition_func(*args, **kwargs):
|
||||||
|
return cached_func(*args, **kwargs)
|
||||||
|
else:
|
||||||
|
return func(*args, **kwargs)
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
# Utility functions for cache key generation
|
||||||
|
def generate_user_cache_key(user, suffix: str = ""):
|
||||||
|
"""Generate cache key based on user"""
|
||||||
|
user_id = user.id if user.is_authenticated else "anonymous"
|
||||||
|
return f"user:{user_id}:{suffix}" if suffix else f"user:{user_id}"
|
||||||
|
|
||||||
|
|
||||||
|
def generate_model_cache_key(model_instance, suffix: str = ""):
|
||||||
|
"""Generate cache key based on model instance"""
|
||||||
|
model_name = model_instance._meta.model_name
|
||||||
|
instance_id = model_instance.id
|
||||||
|
return (
|
||||||
|
f"{model_name}:{instance_id}:{suffix}"
|
||||||
|
if suffix
|
||||||
|
else f"{model_name}:{instance_id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def generate_queryset_cache_key(queryset, params: dict = None):
|
||||||
|
"""Generate cache key for queryset with parameters"""
|
||||||
|
model_name = queryset.model._meta.model_name
|
||||||
|
params_str = json.dumps(params or {}, sort_keys=True, default=str)
|
||||||
|
params_hash = hashlib.md5(params_str.encode()).hexdigest()
|
||||||
|
return f"queryset:{model_name}:{params_hash}"
|
||||||
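A short usage sketch for the decorators above; the selector, the view, and the import paths are assumptions for illustration:

```python
from rest_framework.response import Response
from rest_framework.views import APIView

from apps.core.decorators.cache_decorators import (  # path assumed from this diff
    cache_api_response,
    cache_queryset_result,
)


@cache_queryset_result("park_names:{0}", timeout=900)
def park_names_for_country(country_code):
    # Hypothetical selector; results are cached for 15 minutes per country code.
    from apps.parks.models import Park
    return list(Park.objects.filter(country=country_code).values_list("name", flat=True))


class ParkNamesApi(APIView):
    @cache_api_response(timeout=600, key_prefix="parks")
    def get(self, request, country_code):
        return Response({"names": park_names_for_country(country_code)})
```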
224
backend/apps/core/exceptions.py
Normal file
@@ -0,0 +1,224 @@
|
|||||||
|
"""
|
||||||
|
Custom exception classes for ThrillWiki.
|
||||||
|
Provides domain-specific exceptions with proper error codes and messages.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Optional, Dict, Any
|
||||||
|
|
||||||
|
|
||||||
|
class ThrillWikiException(Exception):
|
||||||
|
"""Base exception for all ThrillWiki-specific errors."""
|
||||||
|
|
||||||
|
default_message = "An error occurred"
|
||||||
|
error_code = "THRILLWIKI_ERROR"
|
||||||
|
status_code = 500
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
message: Optional[str] = None,
|
||||||
|
error_code: Optional[str] = None,
|
||||||
|
details: Optional[Dict[str, Any]] = None,
|
||||||
|
):
|
||||||
|
self.message = message or self.default_message
|
||||||
|
self.error_code = error_code or self.error_code
|
||||||
|
self.details = details or {}
|
||||||
|
super().__init__(self.message)
|
||||||
|
|
||||||
|
def to_dict(self) -> Dict[str, Any]:
|
||||||
|
"""Convert exception to dictionary for API responses."""
|
||||||
|
return {
|
||||||
|
"error_code": self.error_code,
|
||||||
|
"message": self.message,
|
||||||
|
"details": self.details,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class ValidationException(ThrillWikiException):
|
||||||
|
"""Raised when data validation fails."""
|
||||||
|
|
||||||
|
default_message = "Validation failed"
|
||||||
|
error_code = "VALIDATION_ERROR"
|
||||||
|
status_code = 400
|
||||||
|
|
||||||
|
|
||||||
|
class NotFoundError(ThrillWikiException):
|
||||||
|
"""Raised when a requested resource is not found."""
|
||||||
|
|
||||||
|
default_message = "Resource not found"
|
||||||
|
error_code = "NOT_FOUND"
|
||||||
|
status_code = 404
|
||||||
|
|
||||||
|
|
||||||
|
class PermissionDeniedError(ThrillWikiException):
|
||||||
|
"""Raised when user lacks permission for an operation."""
|
||||||
|
|
||||||
|
default_message = "Permission denied"
|
||||||
|
error_code = "PERMISSION_DENIED"
|
||||||
|
status_code = 403
|
||||||
|
|
||||||
|
|
||||||
|
class BusinessLogicError(ThrillWikiException):
|
||||||
|
"""Raised when business logic constraints are violated."""
|
||||||
|
|
||||||
|
default_message = "Business logic violation"
|
||||||
|
error_code = "BUSINESS_LOGIC_ERROR"
|
||||||
|
status_code = 400
|
||||||
|
|
||||||
|
|
||||||
|
class ExternalServiceError(ThrillWikiException):
|
||||||
|
"""Raised when external service calls fail."""
|
||||||
|
|
||||||
|
default_message = "External service error"
|
||||||
|
error_code = "EXTERNAL_SERVICE_ERROR"
|
||||||
|
status_code = 502
|
||||||
|
|
||||||
|
|
||||||
|
# Domain-specific exceptions
|
||||||
|
|
||||||
|
|
||||||
|
class ParkError(ThrillWikiException):
|
||||||
|
"""Base exception for park-related errors."""
|
||||||
|
|
||||||
|
error_code = "PARK_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class ParkNotFoundError(NotFoundError):
|
||||||
|
"""Raised when a park is not found."""
|
||||||
|
|
||||||
|
default_message = "Park not found"
|
||||||
|
error_code = "PARK_NOT_FOUND"
|
||||||
|
|
||||||
|
def __init__(self, park_slug: Optional[str] = None, **kwargs):
|
||||||
|
if park_slug:
|
||||||
|
kwargs["details"] = {"park_slug": park_slug}
|
||||||
|
kwargs["message"] = f"Park with slug '{park_slug}' not found"
|
||||||
|
super().__init__(**kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class ParkOperationError(BusinessLogicError):
|
||||||
|
"""Raised when park operation constraints are violated."""
|
||||||
|
|
||||||
|
default_message = "Invalid park operation"
|
||||||
|
error_code = "PARK_OPERATION_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class RideError(ThrillWikiException):
|
||||||
|
"""Base exception for ride-related errors."""
|
||||||
|
|
||||||
|
error_code = "RIDE_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class RideNotFoundError(NotFoundError):
|
||||||
|
"""Raised when a ride is not found."""
|
||||||
|
|
||||||
|
default_message = "Ride not found"
|
||||||
|
error_code = "RIDE_NOT_FOUND"
|
||||||
|
|
||||||
|
def __init__(self, ride_slug: Optional[str] = None, **kwargs):
|
||||||
|
if ride_slug:
|
||||||
|
kwargs["details"] = {"ride_slug": ride_slug}
|
||||||
|
kwargs["message"] = f"Ride with slug '{ride_slug}' not found"
|
||||||
|
super().__init__(**kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class RideOperationError(BusinessLogicError):
|
||||||
|
"""Raised when ride operation constraints are violated."""
|
||||||
|
|
||||||
|
default_message = "Invalid ride operation"
|
||||||
|
error_code = "RIDE_OPERATION_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class LocationError(ThrillWikiException):
|
||||||
|
"""Base exception for location-related errors."""
|
||||||
|
|
||||||
|
error_code = "LOCATION_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class InvalidCoordinatesError(ValidationException):
|
||||||
|
"""Raised when geographic coordinates are invalid."""
|
||||||
|
|
||||||
|
default_message = "Invalid geographic coordinates"
|
||||||
|
error_code = "INVALID_COORDINATES"
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
latitude: Optional[float] = None,
|
||||||
|
longitude: Optional[float] = None,
|
||||||
|
**kwargs,
|
||||||
|
):
|
||||||
|
if latitude is not None or longitude is not None:
|
||||||
|
kwargs["details"] = {"latitude": latitude, "longitude": longitude}
|
||||||
|
super().__init__(**kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class GeolocationError(ExternalServiceError):
|
||||||
|
"""Raised when geolocation services fail."""
|
||||||
|
|
||||||
|
default_message = "Geolocation service unavailable"
|
||||||
|
error_code = "GEOLOCATION_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class ReviewError(ThrillWikiException):
|
||||||
|
"""Base exception for review-related errors."""
|
||||||
|
|
||||||
|
error_code = "REVIEW_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class ReviewModerationError(BusinessLogicError):
|
||||||
|
"""Raised when review moderation constraints are violated."""
|
||||||
|
|
||||||
|
default_message = "Review moderation error"
|
||||||
|
error_code = "REVIEW_MODERATION_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class DuplicateReviewError(BusinessLogicError):
|
||||||
|
"""Raised when user tries to create duplicate reviews."""
|
||||||
|
|
||||||
|
default_message = "User has already reviewed this item"
|
||||||
|
error_code = "DUPLICATE_REVIEW"
|
||||||
|
|
||||||
|
|
||||||
|
class AccountError(ThrillWikiException):
|
||||||
|
"""Base exception for account-related errors."""
|
||||||
|
|
||||||
|
error_code = "ACCOUNT_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class InsufficientPermissionsError(PermissionDeniedError):
|
||||||
|
"""Raised when user lacks required permissions."""
|
||||||
|
|
||||||
|
default_message = "Insufficient permissions"
|
||||||
|
error_code = "INSUFFICIENT_PERMISSIONS"
|
||||||
|
|
||||||
|
def __init__(self, required_permission: Optional[str] = None, **kwargs):
|
||||||
|
if required_permission:
|
||||||
|
kwargs["details"] = {"required_permission": required_permission}
|
||||||
|
kwargs["message"] = f"Permission '{required_permission}' required"
|
||||||
|
super().__init__(**kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class EmailError(ExternalServiceError):
|
||||||
|
"""Raised when email operations fail."""
|
||||||
|
|
||||||
|
default_message = "Email service error"
|
||||||
|
error_code = "EMAIL_ERROR"
|
||||||
|
|
||||||
|
|
||||||
|
class CacheError(ThrillWikiException):
|
||||||
|
"""Raised when cache operations fail."""
|
||||||
|
|
||||||
|
default_message = "Cache operation failed"
|
||||||
|
error_code = "CACHE_ERROR"
|
||||||
|
status_code = 500
|
||||||
|
|
||||||
|
|
||||||
|
class RoadTripError(ExternalServiceError):
|
||||||
|
"""Raised when road trip planning fails."""
|
||||||
|
|
||||||
|
default_message = "Road trip planning error"
|
||||||
|
error_code = "ROADTRIP_ERROR"
|
||||||
|
|
||||||
|
def __init__(self, service_name: Optional[str] = None, **kwargs):
|
||||||
|
if service_name:
|
||||||
|
kwargs["details"] = {"service": service_name}
|
||||||
|
super().__init__(**kwargs)
|
||||||
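A sketch of how a service or selector might raise these domain exceptions so the API exception handler translates them into the standard envelope; the selector itself is a hypothetical example:

```python
from apps.core.exceptions import ParkNotFoundError  # path assumed from this diff


def park_get_by_slug(*, slug: str):
    # Hypothetical selector-style helper.
    from apps.parks.models import Park
    try:
        return Park.objects.get(slug=slug)
    except Park.DoesNotExist:
        # Bubbles up to custom_exception_handler, which responds with a 404 and
        # {"status": "error", "error": {"error_code": "PARK_NOT_FOUND", ...}, "data": None}.
        raise ParkNotFoundError(park_slug=slug)
```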
43
backend/apps/core/forms.py
Normal file
@@ -0,0 +1,43 @@
|
|||||||
|
"""Core forms and form components."""
|
||||||
|
|
||||||
|
from django.conf import settings
|
||||||
|
from django.core.exceptions import PermissionDenied
|
||||||
|
from django.utils.translation import gettext_lazy as _
|
||||||
|
|
||||||
|
from autocomplete import Autocomplete
|
||||||
|
|
||||||
|
|
||||||
|
class BaseAutocomplete(Autocomplete):
|
||||||
|
"""Base autocomplete class for consistent autocomplete behavior across the project.
|
||||||
|
|
||||||
|
This class extends django-htmx-autocomplete's base Autocomplete class to provide:
|
||||||
|
- Project-wide defaults for autocomplete behavior
|
||||||
|
- Translation strings
|
||||||
|
- Authentication enforcement
|
||||||
|
- Sensible search configuration
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Search configuration
|
||||||
|
minimum_search_length = 2 # More responsive than default 3
|
||||||
|
max_results = 10 # Reasonable limit for performance
|
||||||
|
|
||||||
|
# UI text configuration using gettext for i18n
|
||||||
|
no_result_text = _("No matches found")
|
||||||
|
narrow_search_text = _(
|
||||||
|
"Showing %(page_size)s of %(total)s matches. Please refine your search."
|
||||||
|
)
|
||||||
|
type_at_least_n_characters = _("Type at least %(n)s characters...")
|
||||||
|
|
||||||
|
# Project-wide component settings
|
||||||
|
placeholder = _("Search...")
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def auth_check(request):
|
||||||
|
"""Enforce authentication by default.
|
||||||
|
|
||||||
|
This can be overridden in subclasses if public access is needed.
|
||||||
|
Configure AUTOCOMPLETE_BLOCK_UNAUTHENTICATED in settings to disable.
|
||||||
|
"""
|
||||||
|
block_unauth = getattr(settings, "AUTOCOMPLETE_BLOCK_UNAUTHENTICATED", True)
|
||||||
|
if block_unauth and not request.user.is_authenticated:
|
||||||
|
raise PermissionDenied(_("Authentication required"))
|
||||||
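The auth check above reads an optional setting, and subclasses inherit the shared defaults; a minimal sketch of both (the concrete search wiring depends on django-htmx-autocomplete and is omitted):

```python
# settings.py (sketch): allow anonymous users to call autocomplete endpoints.
AUTOCOMPLETE_BLOCK_UNAUTHENTICATED = False


# forms.py (sketch): a hypothetical project autocomplete built on the base class.
from apps.core.forms import BaseAutocomplete  # path assumed from this diff


class ParkAutocomplete(BaseAutocomplete):
    # Tighter limits for a park picker; item/search configuration not shown.
    minimum_search_length = 3
    max_results = 5
```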
0
backend/apps/core/forms/__init__.py
Normal file
168
backend/apps/core/forms/search.py
Normal file
@@ -0,0 +1,168 @@
|
|||||||
|
from django import forms
|
||||||
|
from django.utils.translation import gettext_lazy as _
|
||||||
|
|
||||||
|
|
||||||
|
class LocationSearchForm(forms.Form):
|
||||||
|
"""
|
||||||
|
A comprehensive search form that includes text search, location-based
|
||||||
|
search, and content type filtering for a unified search experience.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Text search query
|
||||||
|
q = forms.CharField(
|
||||||
|
required=False,
|
||||||
|
label=_("Search Query"),
|
||||||
|
widget=forms.TextInput(
|
||||||
|
attrs={
|
||||||
|
"placeholder": _("Search parks, rides, companies..."),
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
|
||||||
|
"focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
|
||||||
|
"dark:border-gray-600 dark:text-white"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Location-based search
|
||||||
|
location = forms.CharField(
|
||||||
|
required=False,
|
||||||
|
label=_("Near Location"),
|
||||||
|
widget=forms.TextInput(
|
||||||
|
attrs={
|
||||||
|
"placeholder": _("City, address, or coordinates..."),
|
||||||
|
"id": "location-input",
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
|
||||||
|
"focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
|
||||||
|
"dark:border-gray-600 dark:text-white"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Hidden fields for coordinates
|
||||||
|
lat = forms.FloatField(
|
||||||
|
required=False, widget=forms.HiddenInput(attrs={"id": "lat-input"})
|
||||||
|
)
|
||||||
|
lng = forms.FloatField(
|
||||||
|
required=False, widget=forms.HiddenInput(attrs={"id": "lng-input"})
|
||||||
|
)
|
||||||
|
|
||||||
|
# Search radius
|
||||||
|
radius_km = forms.ChoiceField(
|
||||||
|
required=False,
|
||||||
|
label=_("Search Radius"),
|
||||||
|
choices=[
|
||||||
|
("", _("Any distance")),
|
||||||
|
("5", _("5 km")),
|
||||||
|
("10", _("10 km")),
|
||||||
|
("25", _("25 km")),
|
||||||
|
("50", _("50 km")),
|
||||||
|
("100", _("100 km")),
|
||||||
|
("200", _("200 km")),
|
||||||
|
],
|
||||||
|
widget=forms.Select(
|
||||||
|
attrs={
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
|
||||||
|
"focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
|
||||||
|
"dark:border-gray-600 dark:text-white"
|
||||||
|
)
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Content type filters
|
||||||
|
search_parks = forms.BooleanField(
|
||||||
|
required=False,
|
||||||
|
initial=True,
|
||||||
|
label=_("Search Parks"),
|
||||||
|
widget=forms.CheckboxInput(
|
||||||
|
attrs={
|
||||||
|
"class": (
|
||||||
|
"rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
|
||||||
|
"dark:border-gray-600 dark:bg-gray-700"
|
||||||
|
)
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
search_rides = forms.BooleanField(
|
||||||
|
required=False,
|
||||||
|
label=_("Search Rides"),
|
||||||
|
widget=forms.CheckboxInput(
|
||||||
|
attrs={
|
||||||
|
"class": (
|
||||||
|
"rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
|
||||||
|
"dark:border-gray-600 dark:bg-gray-700"
|
||||||
|
)
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
search_companies = forms.BooleanField(
|
||||||
|
required=False,
|
||||||
|
label=_("Search Companies"),
|
||||||
|
widget=forms.CheckboxInput(
|
||||||
|
attrs={
|
||||||
|
"class": (
|
||||||
|
"rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
|
||||||
|
"dark:border-gray-600 dark:bg-gray-700"
|
||||||
|
)
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Geographic filters
|
||||||
|
country = forms.CharField(
|
||||||
|
required=False,
|
||||||
|
widget=forms.TextInput(
|
||||||
|
attrs={
|
||||||
|
"placeholder": _("Country"),
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
|
||||||
|
"shadow-sm focus:ring-blue-500 focus:border-blue-500 "
|
||||||
|
"dark:bg-gray-700 dark:border-gray-600 dark:text-white"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
state = forms.CharField(
|
||||||
|
required=False,
|
||||||
|
widget=forms.TextInput(
|
||||||
|
attrs={
|
||||||
|
"placeholder": _("State/Region"),
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
|
||||||
|
"shadow-sm focus:ring-blue-500 focus:border-blue-500 "
|
||||||
|
"dark:bg-gray-700 dark:border-gray-600 dark:text-white"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
city = forms.CharField(
|
||||||
|
required=False,
|
||||||
|
widget=forms.TextInput(
|
||||||
|
attrs={
|
||||||
|
"placeholder": _("City"),
|
||||||
|
"class": (
|
||||||
|
"w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
|
||||||
|
"shadow-sm focus:ring-blue-500 focus:border-blue-500 "
|
||||||
|
"dark:bg-gray-700 dark:border-gray-600 dark:text-white"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
def clean(self):
|
||||||
|
cleaned_data = super().clean()
|
||||||
|
|
||||||
|
# If lat/lng are provided, ensure location field is populated for
|
||||||
|
# display
|
||||||
|
lat = cleaned_data.get("lat")
|
||||||
|
lng = cleaned_data.get("lng")
|
||||||
|
location = cleaned_data.get("location")
|
||||||
|
|
||||||
|
if lat and lng and not location:
|
||||||
|
cleaned_data["location"] = f"{lat}, {lng}"
|
||||||
|
|
||||||
|
return cleaned_data
|
||||||
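A sketch of driving a search view with this form; the selector is a hypothetical stand-in for whatever executes the actual query:

```python
from django.shortcuts import render

from apps.core.forms.search import LocationSearchForm  # path assumed from this diff


def unified_search(request):
    form = LocationSearchForm(request.GET or None)
    results = []
    if form.is_valid():
        # Hypothetical selector that understands q / lat / lng / radius_km and the
        # search_parks / search_rides / search_companies toggles.
        from apps.core.selectors import run_unified_search
        results = run_unified_search(**form.cleaned_data)
    return render(request, "search/results.html", {"form": form, "results": results})
```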
1
backend/apps/core/health_checks/__init__.py
Normal file
@@ -0,0 +1 @@
# Health checks module
325
backend/apps/core/health_checks/custom_checks.py
Normal file
@@ -0,0 +1,325 @@
|
|||||||
|
"""
|
||||||
|
Custom health checks for ThrillWiki application.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
import logging
|
||||||
|
from django.core.cache import cache
|
||||||
|
from django.db import connection
|
||||||
|
from health_check.backends import BaseHealthCheckBackend
|
||||||
|
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class CacheHealthCheck(BaseHealthCheckBackend):
|
||||||
|
"""Check Redis cache connectivity and performance"""
|
||||||
|
|
||||||
|
critical_service = True
|
||||||
|
|
||||||
|
def check_status(self):
|
||||||
|
try:
|
||||||
|
# Test cache write/read performance
|
||||||
|
test_key = "health_check_test"
|
||||||
|
test_value = "test_value_" + str(int(time.time()))
|
||||||
|
|
||||||
|
start_time = time.time()
|
||||||
|
cache.set(test_key, test_value, timeout=30)
|
||||||
|
cached_value = cache.get(test_key)
|
||||||
|
cache_time = time.time() - start_time
|
||||||
|
|
||||||
|
if cached_value != test_value:
|
||||||
|
self.add_error("Cache read/write test failed - values don't match")
|
||||||
|
return
|
||||||
|
|
||||||
|
# Check cache performance
|
||||||
|
if cache_time > 0.1: # Warn if cache operations take more than 100ms
|
||||||
|
self.add_error(
|
||||||
|
f"Cache performance degraded: {
|
||||||
|
cache_time:.3f}s for read/write operation"
|
||||||
|
)
|
||||||
|
return
|
||||||
|
|
||||||
|
# Clean up test key
|
||||||
|
cache.delete(test_key)
|
||||||
|
|
||||||
|
# Additional Redis-specific checks if using django-redis
|
||||||
|
try:
|
||||||
|
from django_redis import get_redis_connection
|
||||||
|
|
||||||
|
redis_client = get_redis_connection("default")
|
||||||
|
info = redis_client.info()
|
||||||
|
|
||||||
|
# Check memory usage
|
||||||
|
used_memory = info.get("used_memory", 0)
|
||||||
|
max_memory = info.get("maxmemory", 0)
|
||||||
|
|
||||||
|
if max_memory > 0:
|
||||||
|
memory_usage_percent = (used_memory / max_memory) * 100
|
||||||
|
if memory_usage_percent > 90:
|
||||||
|
self.add_error(
|
||||||
|
f"Redis memory usage critical: {
|
||||||
|
memory_usage_percent:.1f}%"
|
||||||
|
)
|
||||||
|
elif memory_usage_percent > 80:
|
||||||
|
logger.warning(
|
||||||
|
f"Redis memory usage high: {
|
||||||
|
memory_usage_percent:.1f}%"
|
||||||
|
)
|
||||||
|
|
||||||
|
except ImportError:
|
||||||
|
# django-redis not available, skip additional checks
|
||||||
|
pass
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Could not get Redis info: {e}")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.add_error(f"Cache service unavailable: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
class DatabasePerformanceCheck(BaseHealthCheckBackend):
|
||||||
|
"""Check database performance and connectivity"""
|
||||||
|
|
||||||
|
critical_service = False
|
||||||
|
|
||||||
|
def check_status(self):
|
||||||
|
try:
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
# Test basic connectivity
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT 1")
|
||||||
|
result = cursor.fetchone()
|
||||||
|
|
||||||
|
if result[0] != 1:
|
||||||
|
self.add_error("Database connectivity test failed")
|
||||||
|
return
|
||||||
|
|
||||||
|
basic_query_time = time.time() - start_time
|
||||||
|
|
||||||
|
# Test a more complex query (if it takes too long, there might be
|
||||||
|
# performance issues)
|
||||||
|
start_time = time.time()
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT COUNT(*) FROM django_content_type")
|
||||||
|
cursor.fetchone()
|
||||||
|
|
||||||
|
complex_query_time = time.time() - start_time
|
||||||
|
|
||||||
|
# Performance thresholds
|
||||||
|
if basic_query_time > 1.0:
|
||||||
|
self.add_error(
|
||||||
|
f"Database responding slowly: basic query took {
|
||||||
|
basic_query_time:.2f}s"
|
||||||
|
)
|
||||||
|
elif basic_query_time > 0.5:
|
||||||
|
logger.warning(
|
||||||
|
f"Database performance degraded: basic query took {
|
||||||
|
basic_query_time:.2f}s"
|
||||||
|
)
|
||||||
|
|
||||||
|
if complex_query_time > 2.0:
|
||||||
|
self.add_error(
|
||||||
|
f"Database performance critical: complex query took {
|
||||||
|
complex_query_time:.2f}s"
|
||||||
|
)
|
||||||
|
elif complex_query_time > 1.0:
|
||||||
|
logger.warning(
|
||||||
|
f"Database performance slow: complex query took {
|
||||||
|
complex_query_time:.2f}s"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check database version and settings if possible
|
||||||
|
try:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT version()")
|
||||||
|
version = cursor.fetchone()[0]
|
||||||
|
logger.debug(f"Database version: {version}")
|
||||||
|
except Exception as e:
|
||||||
|
logger.debug(f"Could not get database version: {e}")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.add_error(f"Database performance check failed: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
class ApplicationHealthCheck(BaseHealthCheckBackend):
|
||||||
|
"""Check application-specific health indicators"""
|
||||||
|
|
||||||
|
critical_service = False
|
||||||
|
|
||||||
|
def check_status(self):
|
||||||
|
try:
|
||||||
|
# Check if we can import critical modules
|
||||||
|
critical_modules = [
|
||||||
|
"parks.models",
|
||||||
|
"rides.models",
|
||||||
|
"accounts.models",
|
||||||
|
"core.services",
|
||||||
|
]
|
||||||
|
|
||||||
|
for module_name in critical_modules:
|
||||||
|
try:
|
||||||
|
__import__(module_name)
|
||||||
|
except ImportError as e:
|
||||||
|
self.add_error(
|
||||||
|
f"Critical module import failed: {module_name} - {e}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check if we can access critical models
|
||||||
|
try:
|
||||||
|
from parks.models import Park
|
||||||
|
from apps.rides.models import Ride
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
# Test that we can query these models (just count, don't load
|
||||||
|
# data)
|
||||||
|
park_count = Park.objects.count()
|
||||||
|
ride_count = Ride.objects.count()
|
||||||
|
user_count = User.objects.count()
|
||||||
|
|
||||||
|
logger.debug(
|
||||||
|
f"Model counts - Parks: {park_count}, Rides: {ride_count}, Users: {user_count}"
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.add_error(f"Model access check failed: {e}")
|
||||||
|
|
||||||
|
# Check media and static file configuration
|
||||||
|
from django.conf import settings
|
||||||
|
import os
|
||||||
|
|
||||||
|
if not os.path.exists(settings.MEDIA_ROOT):
|
||||||
|
self.add_error(
|
||||||
|
f"Media directory does not exist: {
|
||||||
|
settings.MEDIA_ROOT}"
|
||||||
|
)
|
||||||
|
|
||||||
|
if not os.path.exists(settings.STATIC_ROOT) and not settings.DEBUG:
|
||||||
|
self.add_error(
|
||||||
|
f"Static directory does not exist: {settings.STATIC_ROOT}"
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.add_error(f"Application health check failed: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
class ExternalServiceHealthCheck(BaseHealthCheckBackend):
|
||||||
|
"""Check external services and dependencies"""
|
||||||
|
|
||||||
|
critical_service = False
|
||||||
|
|
||||||
|
def check_status(self):
|
||||||
|
# Check email service if configured
|
||||||
|
try:
|
||||||
|
from django.core.mail import get_connection
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
if (
|
||||||
|
hasattr(settings, "EMAIL_BACKEND")
|
||||||
|
and "console" not in settings.EMAIL_BACKEND
|
||||||
|
):
|
||||||
|
# Only check if not using console backend
|
||||||
|
connection = get_connection()
|
||||||
|
if hasattr(connection, "open"):
|
||||||
|
try:
|
||||||
|
connection.open()
|
||||||
|
connection.close()
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Email service check failed: {e}")
|
||||||
|
# Don't fail the health check for email issues in
|
||||||
|
# development
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.debug(f"Email service check error: {e}")
|
||||||
|
|
||||||
|
# Check if Sentry is configured and working
|
||||||
|
try:
|
||||||
|
import sentry_sdk
|
||||||
|
|
||||||
|
if sentry_sdk.Hub.current.client:
|
||||||
|
# Sentry is configured
|
||||||
|
try:
|
||||||
|
# Test that we can capture a test message (this won't
|
||||||
|
# actually send to Sentry)
|
||||||
|
with sentry_sdk.push_scope() as scope:
|
||||||
|
scope.set_tag("health_check", True)
|
||||||
|
# Don't actually send a message, just verify the SDK is
|
||||||
|
# working
|
||||||
|
logger.debug("Sentry SDK is operational")
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Sentry SDK check failed: {e}")
|
||||||
|
|
||||||
|
except ImportError:
|
||||||
|
logger.debug("Sentry SDK not installed")
|
||||||
|
except Exception as e:
|
||||||
|
logger.debug(f"Sentry check error: {e}")
|
||||||
|
|
||||||
|
# Check Redis connection if configured
|
||||||
|
try:
|
||||||
|
from django.core.cache import caches
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
cache_config = settings.CACHES.get("default", {})
|
||||||
|
if "redis" in cache_config.get("BACKEND", "").lower():
|
||||||
|
# Redis is configured, test basic connectivity
|
||||||
|
redis_cache = caches["default"]
|
||||||
|
redis_cache.set("health_check_redis", "test", 10)
|
||||||
|
value = redis_cache.get("health_check_redis")
|
||||||
|
if value != "test":
|
||||||
|
self.add_error("Redis cache connectivity test failed")
|
||||||
|
else:
|
||||||
|
redis_cache.delete("health_check_redis")
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Redis connectivity check failed: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
class DiskSpaceHealthCheck(BaseHealthCheckBackend):
|
||||||
|
"""Check available disk space"""
|
||||||
|
|
||||||
|
critical_service = False
|
||||||
|
|
||||||
|
def check_status(self):
|
||||||
|
try:
|
||||||
|
import shutil
from pathlib import Path
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
# Check disk space for media directory
|
||||||
|
media_usage = shutil.disk_usage(settings.MEDIA_ROOT)
|
||||||
|
media_free_percent = (media_usage.free / media_usage.total) * 100
|
||||||
|
|
||||||
|
# Check disk space for logs directory if it exists
|
||||||
|
logs_dir = getattr(settings, "BASE_DIR", "/tmp") / "logs"
|
||||||
|
if logs_dir.exists():
|
||||||
|
logs_usage = shutil.disk_usage(logs_dir)
|
||||||
|
logs_free_percent = (logs_usage.free / logs_usage.total) * 100
|
||||||
|
else:
|
||||||
|
logs_free_percent = media_free_percent # Use same as media
|
||||||
|
|
||||||
|
# Alert thresholds
|
||||||
|
if media_free_percent < 10:
|
||||||
|
self.add_error(
|
||||||
|
f"Critical disk space: {
|
||||||
|
media_free_percent:.1f}% free in media directory"
|
||||||
|
)
|
||||||
|
elif media_free_percent < 20:
|
||||||
|
logger.warning(
|
||||||
|
f"Low disk space: {
|
||||||
|
media_free_percent:.1f}% free in media directory"
|
||||||
|
)
|
||||||
|
|
||||||
|
if logs_free_percent < 10:
|
||||||
|
self.add_error(
|
||||||
|
f"Critical disk space: {
|
||||||
|
logs_free_percent:.1f}% free in logs directory"
|
||||||
|
)
|
||||||
|
elif logs_free_percent < 20:
|
||||||
|
logger.warning(
|
||||||
|
f"Low disk space: {
|
||||||
|
logs_free_percent:.1f}% free in logs directory"
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Disk space check failed: {e}")
|
||||||
|
# Don't fail health check for disk space issues in development
|
||||||
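These backends only take effect once they are registered with django-health-check's plugin registry. A minimal sketch of that wiring, assuming the classes live in a module such as `apps/core/health_checks.py` (the module path and the `CoreConfig` app config are assumptions, not shown in this diff):

```python
# Hypothetical apps/core/apps.py -- registers the custom backends so they are
# reported on the django-health-check endpoint. Module path and config name assumed.
from django.apps import AppConfig


class CoreConfig(AppConfig):
    name = "apps.core"

    def ready(self):
        from health_check.plugins import plugin_dir
        from .health_checks import DiskSpaceHealthCheck, ExternalServiceHealthCheck

        plugin_dir.register(ExternalServiceHealthCheck)
        plugin_dir.register(DiskSpaceHealthCheck)
```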
108
backend/apps/core/history.py
Normal file
@@ -0,0 +1,108 @@
|
|||||||
|
from django.db import models
|
||||||
|
from django.contrib.contenttypes.models import ContentType
|
||||||
|
from django.contrib.contenttypes.fields import GenericForeignKey
|
||||||
|
from django.conf import settings
|
||||||
|
from typing import Any, Dict, Optional
|
||||||
|
from django.db.models import QuerySet
|
||||||
|
|
||||||
|
|
||||||
|
class DiffMixin:
|
||||||
|
"""Mixin to add diffing capabilities to models"""
|
||||||
|
|
||||||
|
def get_prev_record(self) -> Optional[Any]:
|
||||||
|
"""Get the previous record for this instance"""
|
||||||
|
try:
|
||||||
|
return (
|
||||||
|
type(self)
|
||||||
|
.objects.filter(
|
||||||
|
pgh_created_at__lt=self.pgh_created_at,
|
||||||
|
pgh_obj_id=self.pgh_obj_id,
|
||||||
|
)
|
||||||
|
.order_by("-pgh_created_at")
|
||||||
|
.first()
|
||||||
|
)
|
||||||
|
except (AttributeError, TypeError):
|
||||||
|
return None
|
||||||
|
|
||||||
|
def diff_against_previous(self) -> Dict:
|
||||||
|
"""Compare this record against the previous one"""
|
||||||
|
prev_record = self.get_prev_record()
|
||||||
|
if not prev_record:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
skip_fields = {
|
||||||
|
"pgh_id",
|
||||||
|
"pgh_created_at",
|
||||||
|
"pgh_label",
|
||||||
|
"pgh_obj_id",
|
||||||
|
"pgh_context_id",
|
||||||
|
"_state",
|
||||||
|
"created_at",
|
||||||
|
"updated_at",
|
||||||
|
}
|
||||||
|
|
||||||
|
changes = {}
|
||||||
|
for field, value in self.__dict__.items():
|
||||||
|
# Skip internal fields and those we don't want to track
|
||||||
|
if field.startswith("_") or field in skip_fields or field.endswith("_id"):
|
||||||
|
continue
|
||||||
|
|
||||||
|
try:
|
||||||
|
old_value = getattr(prev_record, field)
|
||||||
|
new_value = value
|
||||||
|
if old_value != new_value:
|
||||||
|
changes[field] = {
|
||||||
|
"old": (str(old_value) if old_value is not None else "None"),
|
||||||
|
"new": (str(new_value) if new_value is not None else "None"),
|
||||||
|
}
|
||||||
|
except AttributeError:
|
||||||
|
continue
|
||||||
|
|
||||||
|
return changes
|
||||||
|
|
||||||
|
|
||||||
|
class TrackedModel(models.Model):
|
||||||
|
"""Abstract base class for models that need history tracking"""
|
||||||
|
|
||||||
|
created_at = models.DateTimeField(auto_now_add=True)
|
||||||
|
updated_at = models.DateTimeField(auto_now=True)
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
abstract = True
|
||||||
|
|
||||||
|
def get_history(self) -> QuerySet:
|
||||||
|
"""Get all history records for this instance in chronological order"""
|
||||||
|
event_model = self.events.model # pghistory provides this automatically
|
||||||
|
if event_model:
|
||||||
|
return event_model.objects.filter(pgh_obj_id=self.pk).order_by(
|
||||||
|
"-pgh_created_at"
|
||||||
|
)
|
||||||
|
return self.__class__.objects.none()
|
||||||
|
|
||||||
|
|
||||||
|
class HistoricalSlug(models.Model):
|
||||||
|
"""Track historical slugs for models"""
|
||||||
|
|
||||||
|
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
|
||||||
|
object_id = models.PositiveIntegerField()
|
||||||
|
content_object = GenericForeignKey("content_type", "object_id")
|
||||||
|
slug = models.SlugField(max_length=255)
|
||||||
|
created_at = models.DateTimeField(auto_now_add=True)
|
||||||
|
user = models.ForeignKey(
|
||||||
|
settings.AUTH_USER_MODEL,
|
||||||
|
null=True,
|
||||||
|
blank=True,
|
||||||
|
on_delete=models.SET_NULL,
|
||||||
|
related_name="historical_slugs",
|
||||||
|
)
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
app_label = "core"
|
||||||
|
unique_together = ("content_type", "slug")
|
||||||
|
indexes = [
|
||||||
|
models.Index(fields=["content_type", "object_id"]),
|
||||||
|
models.Index(fields=["slug"]),
|
||||||
|
]
|
||||||
|
|
||||||
|
def __str__(self) -> str:
|
||||||
|
return f"{self.content_type} - {self.object_id} - {self.slug}"
|
||||||
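A hedged usage sketch of the helpers above. `Park` stands in for any model that inherits `TrackedModel`, is tracked by django-pghistory, and whose generated event model mixes in `DiffMixin`; none of those model definitions are part of this diff.

```python
from apps.parks.models import Park  # assumed tracked model, for illustration only

park = Park.objects.get(pk=1)

# TrackedModel.get_history() returns the pghistory event rows, newest first.
for event in park.get_history():
    # DiffMixin.diff_against_previous() compares an event with the prior snapshot.
    changes = event.diff_against_previous()
    if changes:
        print(event.pgh_created_at, changes)
```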
261
backend/apps/core/logging.py
Normal file
@@ -0,0 +1,261 @@
|
|||||||
|
"""
|
||||||
|
Centralized logging configuration for ThrillWiki.
|
||||||
|
Provides structured logging with proper formatting and context.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import logging
|
||||||
|
import sys
|
||||||
|
from typing import Dict, Any, Optional
|
||||||
|
from django.conf import settings
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
|
||||||
|
class ThrillWikiFormatter(logging.Formatter):
|
||||||
|
"""Custom formatter for ThrillWiki logs with structured output."""
|
||||||
|
|
||||||
|
def format(self, record):
|
||||||
|
# Add timestamp if not present
|
||||||
|
if not hasattr(record, "timestamp"):
|
||||||
|
record.timestamp = timezone.now().isoformat()
|
||||||
|
|
||||||
|
# Add request context if available
|
||||||
|
if hasattr(record, "request"):
|
||||||
|
record.request_id = getattr(record.request, "id", "unknown")
|
||||||
|
record.user_id = (
|
||||||
|
getattr(record.request.user, "id", "anonymous")
|
||||||
|
if hasattr(record.request, "user")
|
||||||
|
else "unknown"
|
||||||
|
)
|
||||||
|
record.path = getattr(record.request, "path", "unknown")
|
||||||
|
record.method = getattr(record.request, "method", "unknown")
|
||||||
|
|
||||||
|
# Structure the log message
|
||||||
|
if hasattr(record, "extra_data"):
|
||||||
|
record.structured_data = record.extra_data
|
||||||
|
|
||||||
|
return super().format(record)
|
||||||
|
|
||||||
|
|
||||||
|
def get_logger(name: str) -> logging.Logger:
|
||||||
|
"""
|
||||||
|
Get a configured logger for ThrillWiki components.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
name: Logger name (usually __name__)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Configured logger instance
|
||||||
|
"""
|
||||||
|
logger = logging.getLogger(name)
|
||||||
|
|
||||||
|
# Only configure if not already configured
|
||||||
|
if not logger.handlers:
|
||||||
|
handler = logging.StreamHandler(sys.stdout)
|
||||||
|
formatter = ThrillWikiFormatter(
|
||||||
|
fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
|
||||||
|
)
|
||||||
|
handler.setFormatter(formatter)
|
||||||
|
logger.addHandler(handler)
|
||||||
|
logger.setLevel(logging.INFO if settings.DEBUG else logging.WARNING)
|
||||||
|
|
||||||
|
return logger
|
||||||
|
|
||||||
|
|
||||||
|
def log_exception(
|
||||||
|
logger: logging.Logger,
|
||||||
|
exception: Exception,
|
||||||
|
*,
|
||||||
|
context: Optional[Dict[str, Any]] = None,
|
||||||
|
request=None,
|
||||||
|
level: int = logging.ERROR,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Log an exception with structured context.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
logger: Logger instance
|
||||||
|
exception: Exception to log
|
||||||
|
context: Additional context data
|
||||||
|
request: Django request object
|
||||||
|
level: Log level
|
||||||
|
"""
|
||||||
|
log_data = {
|
||||||
|
"exception_type": exception.__class__.__name__,
|
||||||
|
"exception_message": str(exception),
|
||||||
|
"context": context or {},
|
||||||
|
}
|
||||||
|
|
||||||
|
if request:
|
||||||
|
log_data.update(
|
||||||
|
{
|
||||||
|
"request_path": getattr(request, "path", "unknown"),
|
||||||
|
"request_method": getattr(request, "method", "unknown"),
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", "anonymous")
|
||||||
|
if hasattr(request, "user")
|
||||||
|
else "unknown"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
logger.log(
|
||||||
|
level,
|
||||||
|
f"Exception occurred: {exception}",
|
||||||
|
extra={"extra_data": log_data},
|
||||||
|
exc_info=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def log_business_event(
|
||||||
|
logger: logging.Logger,
|
||||||
|
event_type: str,
|
||||||
|
*,
|
||||||
|
message: str,
|
||||||
|
context: Optional[Dict[str, Any]] = None,
|
||||||
|
request=None,
|
||||||
|
level: int = logging.INFO,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Log a business event with structured context.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
logger: Logger instance
|
||||||
|
event_type: Type of business event
|
||||||
|
message: Event message
|
||||||
|
context: Additional context data
|
||||||
|
request: Django request object
|
||||||
|
level: Log level
|
||||||
|
"""
|
||||||
|
log_data = {"event_type": event_type, "context": context or {}}
|
||||||
|
|
||||||
|
if request:
|
||||||
|
log_data.update(
|
||||||
|
{
|
||||||
|
"request_path": getattr(request, "path", "unknown"),
|
||||||
|
"request_method": getattr(request, "method", "unknown"),
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", "anonymous")
|
||||||
|
if hasattr(request, "user")
|
||||||
|
else "unknown"
|
||||||
|
),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
logger.log(level, message, extra={"extra_data": log_data})
|
||||||
|
|
||||||
|
|
||||||
|
def log_performance_metric(
|
||||||
|
logger: logging.Logger,
|
||||||
|
operation: str,
|
||||||
|
*,
|
||||||
|
duration_ms: float,
|
||||||
|
context: Optional[Dict[str, Any]] = None,
|
||||||
|
level: int = logging.INFO,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Log a performance metric.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
logger: Logger instance
|
||||||
|
operation: Operation name
|
||||||
|
duration_ms: Duration in milliseconds
|
||||||
|
context: Additional context data
|
||||||
|
level: Log level
|
||||||
|
"""
|
||||||
|
log_data = {
|
||||||
|
"metric_type": "performance",
|
||||||
|
"operation": operation,
|
||||||
|
"duration_ms": duration_ms,
|
||||||
|
"context": context or {},
|
||||||
|
}
|
||||||
|
|
||||||
|
message = f"Performance: {operation} took {duration_ms:.2f}ms"
|
||||||
|
logger.log(level, message, extra={"extra_data": log_data})
|
||||||
|
|
||||||
|
|
||||||
|
def log_api_request(
|
||||||
|
logger: logging.Logger,
|
||||||
|
request,
|
||||||
|
*,
|
||||||
|
response_status: Optional[int] = None,
|
||||||
|
duration_ms: Optional[float] = None,
|
||||||
|
level: int = logging.INFO,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Log an API request with context.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
logger: Logger instance
|
||||||
|
request: Django request object
|
||||||
|
response_status: HTTP response status code
|
||||||
|
duration_ms: Request duration in milliseconds
|
||||||
|
level: Log level
|
||||||
|
"""
|
||||||
|
log_data = {
|
||||||
|
"request_type": "api",
|
||||||
|
"path": getattr(request, "path", "unknown"),
|
||||||
|
"method": getattr(request, "method", "unknown"),
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", "anonymous")
|
||||||
|
if hasattr(request, "user")
|
||||||
|
else "unknown"
|
||||||
|
),
|
||||||
|
"response_status": response_status,
|
||||||
|
"duration_ms": duration_ms,
|
||||||
|
}
|
||||||
|
|
||||||
|
message = f"API Request: {request.method} {request.path}"
|
||||||
|
if response_status:
|
||||||
|
message += f" -> {response_status}"
|
||||||
|
if duration_ms:
|
||||||
|
message += f" ({duration_ms:.2f}ms)"
|
||||||
|
|
||||||
|
logger.log(level, message, extra={"extra_data": log_data})
|
||||||
|
|
||||||
|
|
||||||
|
def log_security_event(
|
||||||
|
logger: logging.Logger,
|
||||||
|
event_type: str,
|
||||||
|
*,
|
||||||
|
message: str,
|
||||||
|
severity: str = "medium",
|
||||||
|
context: Optional[Dict[str, Any]] = None,
|
||||||
|
request=None,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Log a security-related event.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
logger: Logger instance
|
||||||
|
event_type: Type of security event
|
||||||
|
message: Event message
|
||||||
|
severity: Event severity (low, medium, high, critical)
|
||||||
|
context: Additional context data
|
||||||
|
request: Django request object
|
||||||
|
"""
|
||||||
|
log_data = {
|
||||||
|
"security_event": True,
|
||||||
|
"event_type": event_type,
|
||||||
|
"severity": severity,
|
||||||
|
"context": context or {},
|
||||||
|
}
|
||||||
|
|
||||||
|
if request:
|
||||||
|
log_data.update(
|
||||||
|
{
|
||||||
|
"request_path": getattr(request, "path", "unknown"),
|
||||||
|
"request_method": getattr(request, "method", "unknown"),
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", "anonymous")
|
||||||
|
if hasattr(request, "user")
|
||||||
|
else "unknown"
|
||||||
|
),
|
||||||
|
"remote_addr": request.META.get("REMOTE_ADDR", "unknown"),
|
||||||
|
"user_agent": request.META.get("HTTP_USER_AGENT", "unknown"),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Use WARNING for medium/high, ERROR for critical
|
||||||
|
level = logging.ERROR if severity in ["high", "critical"] else logging.WARNING
|
||||||
|
|
||||||
|
logger.log(level, f"SECURITY: {message}", extra={"extra_data": log_data})
|
||||||
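A short usage sketch of the helpers defined above; the task name and timing values are illustrative only.

```python
import time

from apps.core.logging import get_logger, log_exception, log_performance_metric

logger = get_logger(__name__)


def rebuild_search_index():
    start = time.time()
    try:
        ...  # the actual work would go here
    except Exception as exc:
        log_exception(logger, exc, context={"task": "rebuild_search_index"})
        raise
    finally:
        log_performance_metric(
            logger,
            "rebuild_search_index",
            duration_ms=(time.time() - start) * 1000,
        )
```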
1
backend/apps/core/management/__init__.py
Normal file
@@ -0,0 +1 @@
# Django management commands
1
backend/apps/core/management/commands/__init__.py
Normal file
@@ -0,0 +1 @@
# Django management commands
101
backend/apps/core/management/commands/rundev.py
Normal file
@@ -0,0 +1,101 @@
|
|||||||
|
"""
|
||||||
|
Django management command to run the development server.
|
||||||
|
|
||||||
|
This command automatically sets up the development environment and starts
|
||||||
|
the server, replacing the need for the dev_server.sh script.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.core.management import execute_from_command_line
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Run the development server with automatic setup"
|
||||||
|
|
||||||
|
def add_arguments(self, parser):
|
||||||
|
parser.add_argument(
|
||||||
|
"--port",
|
||||||
|
type=str,
|
||||||
|
default="8000",
|
||||||
|
help="Port to run the server on (default: 8000)",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--host",
|
||||||
|
type=str,
|
||||||
|
default="0.0.0.0",
|
||||||
|
help="Host to bind the server to (default: 0.0.0.0)",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-setup",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip the development setup and go straight to running the server",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--use-runserver-plus",
|
||||||
|
action="store_true",
|
||||||
|
help="Use runserver_plus if available (from django-extensions)",
|
||||||
|
)
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
"""Run the development setup and start the server."""
|
||||||
|
if not options["skip_setup"]:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
"🚀 Setting up and starting ThrillWiki Development Server..."
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
# Run the setup_dev command first
|
||||||
|
execute_from_command_line(["manage.py", "setup_dev"])
|
||||||
|
|
||||||
|
else:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("🚀 Starting ThrillWiki Development Server...")
|
||||||
|
)
|
||||||
|
|
||||||
|
# Determine which server command to use
|
||||||
|
server_command = self.get_server_command(options)
|
||||||
|
|
||||||
|
# Start the server
|
||||||
|
self.stdout.write("")
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f'🌟 Starting Django development server on http://{options["host"]}:{options["port"]}'
|
||||||
|
)
|
||||||
|
)
|
||||||
|
self.stdout.write("Press Ctrl+C to stop the server")
|
||||||
|
self.stdout.write("")
|
||||||
|
|
||||||
|
try:
|
||||||
|
if options["use_runserver_plus"] or self.has_runserver_plus():
|
||||||
|
execute_from_command_line(
|
||||||
|
[
|
||||||
|
"manage.py",
|
||||||
|
"runserver_plus",
|
||||||
|
f'{options["host"]}:{options["port"]}',
|
||||||
|
]
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
execute_from_command_line(
|
||||||
|
["manage.py", "runserver", f'{options["host"]}:{options["port"]}']
|
||||||
|
)
|
||||||
|
except KeyboardInterrupt:
|
||||||
|
self.stdout.write("")
|
||||||
|
self.stdout.write(self.style.SUCCESS("👋 Development server stopped"))
|
||||||
|
|
||||||
|
def get_server_command(self, options):
|
||||||
|
"""Determine which server command to use."""
|
||||||
|
if options["use_runserver_plus"] or self.has_runserver_plus():
|
||||||
|
return "runserver_plus"
|
||||||
|
return "runserver"
|
||||||
|
|
||||||
|
def has_runserver_plus(self):
|
||||||
|
"""Check if runserver_plus is available (django-extensions)."""
|
||||||
|
try:
|
||||||
|
import django_extensions
|
||||||
|
|
||||||
|
return True
|
||||||
|
except ImportError:
|
||||||
|
return False
|
||||||
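Besides `python manage.py rundev`, the command can be invoked programmatically, e.g. from a script or a test; the option names mirror the `add_arguments` definitions above and the port value is just an example.

```python
from django.core.management import call_command

# Equivalent to: python manage.py rundev --port 8001 --skip-setup
call_command("rundev", port="8001", skip_setup=True)
```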
226
backend/apps/core/management/commands/setup_dev.py
Normal file
@@ -0,0 +1,226 @@
|
|||||||
|
"""
|
||||||
|
Django management command to set up the development environment.
|
||||||
|
|
||||||
|
This command performs all the setup tasks that the dev_server.sh script does,
|
||||||
|
allowing the project to run without requiring the shell script.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.core.management import execute_from_command_line
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Set up the development environment"
|
||||||
|
|
||||||
|
def add_arguments(self, parser):
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-migrations",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip running database migrations",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-static",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip collecting static files",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-tailwind",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip building Tailwind CSS",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-superuser",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip creating development superuser",
|
||||||
|
)
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
"""Run the development setup process."""
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("🚀 Setting up ThrillWiki Development Environment...")
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create necessary directories
|
||||||
|
self.create_directories()
|
||||||
|
|
||||||
|
# Run database migrations if needed
|
||||||
|
if not options["skip_migrations"]:
|
||||||
|
self.run_migrations()
|
||||||
|
|
||||||
|
# Seed sample data
|
||||||
|
self.seed_sample_data()
|
||||||
|
|
||||||
|
# Create superuser if it doesn't exist
|
||||||
|
if not options["skip_superuser"]:
|
||||||
|
self.create_superuser()
|
||||||
|
|
||||||
|
# Collect static files
|
||||||
|
if not options["skip_static"]:
|
||||||
|
self.collect_static()
|
||||||
|
|
||||||
|
# Build Tailwind CSS
|
||||||
|
if not options["skip_tailwind"]:
|
||||||
|
self.build_tailwind()
|
||||||
|
|
||||||
|
# Run system checks
|
||||||
|
self.run_system_checks()
|
||||||
|
|
||||||
|
# Display environment info
|
||||||
|
self.display_environment_info()
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Development environment setup complete!")
|
||||||
|
)
|
||||||
|
|
||||||
|
def create_directories(self):
|
||||||
|
"""Create necessary directories."""
|
||||||
|
self.stdout.write("📁 Creating necessary directories...")
|
||||||
|
directories = ["logs", "profiles", "media", "staticfiles", "static/css"]
|
||||||
|
|
||||||
|
for directory in directories:
|
||||||
|
dir_path = Path(settings.BASE_DIR) / directory
|
||||||
|
dir_path.mkdir(parents=True, exist_ok=True)
|
||||||
|
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ Directories created"))
|
||||||
|
|
||||||
|
def run_migrations(self):
|
||||||
|
"""Run database migrations if needed."""
|
||||||
|
self.stdout.write("🗄️ Checking database migrations...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Check if migrations are up to date
|
||||||
|
result = subprocess.run(
|
||||||
|
[sys.executable, "manage.py", "migrate", "--check"],
|
||||||
|
capture_output=True,
|
||||||
|
text=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
if result.returncode == 0:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Database migrations are up to date")
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.stdout.write("🔄 Running database migrations...")
|
||||||
|
subprocess.run(
|
||||||
|
[sys.executable, "manage.py", "migrate", "--noinput"], check=True
|
||||||
|
)
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Database migrations completed")
|
||||||
|
)
|
||||||
|
|
||||||
|
except subprocess.CalledProcessError as e:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.WARNING(f"⚠️ Migration error (continuing): {e}")
|
||||||
|
)
|
||||||
|
|
||||||
|
def seed_sample_data(self):
|
||||||
|
"""Seed sample data to the database."""
|
||||||
|
self.stdout.write("🌱 Seeding sample data...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
subprocess.run(
|
||||||
|
[sys.executable, "manage.py", "seed_sample_data"], check=True
|
||||||
|
)
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ Sample data seeded"))
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.WARNING("⚠️ Could not seed sample data (continuing)")
|
||||||
|
)
|
||||||
|
|
||||||
|
def create_superuser(self):
|
||||||
|
"""Create development superuser if it doesn't exist."""
|
||||||
|
self.stdout.write("👤 Checking for superuser...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
if User.objects.filter(is_superuser=True).exists():
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ Superuser already exists"))
|
||||||
|
else:
|
||||||
|
self.stdout.write("👤 Creating development superuser (admin/admin)...")
|
||||||
|
if not User.objects.filter(username="admin").exists():
|
||||||
|
User.objects.create_superuser("admin", "admin@example.com", "admin")
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Created superuser: admin/admin")
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Admin user already exists")
|
||||||
|
)
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
self.stdout.write(self.style.WARNING(f"⚠️ Could not create superuser: {e}"))
|
||||||
|
|
||||||
|
def collect_static(self):
|
||||||
|
"""Collect static files for development."""
|
||||||
|
self.stdout.write("📦 Collecting static files...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
subprocess.run(
|
||||||
|
[sys.executable, "manage.py", "collectstatic", "--noinput", "--clear"],
|
||||||
|
check=True,
|
||||||
|
)
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ Static files collected"))
|
||||||
|
except subprocess.CalledProcessError as e:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.WARNING(f"⚠️ Could not collect static files: {e}")
|
||||||
|
)
|
||||||
|
|
||||||
|
def build_tailwind(self):
|
||||||
|
"""Build Tailwind CSS if npm is available."""
|
||||||
|
self.stdout.write("🎨 Building Tailwind CSS...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Check if npm is available
|
||||||
|
subprocess.run(["npm", "--version"], capture_output=True, check=True)
|
||||||
|
|
||||||
|
# Build Tailwind CSS
|
||||||
|
subprocess.run(
|
||||||
|
[sys.executable, "manage.py", "tailwind", "build"], check=True
|
||||||
|
)
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ Tailwind CSS built"))
|
||||||
|
|
||||||
|
except (subprocess.CalledProcessError, FileNotFoundError):
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.WARNING(
|
||||||
|
"⚠️ npm not found or Tailwind build failed, skipping"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
def run_system_checks(self):
|
||||||
|
"""Run Django system checks."""
|
||||||
|
self.stdout.write("🔍 Running system checks...")
|
||||||
|
|
||||||
|
try:
|
||||||
|
subprocess.run([sys.executable, "manage.py", "check"], check=True)
|
||||||
|
self.stdout.write(self.style.SUCCESS("✅ System checks passed"))
|
||||||
|
except subprocess.CalledProcessError:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.WARNING("❌ System checks failed, but continuing...")
|
||||||
|
)
|
||||||
|
|
||||||
|
def display_environment_info(self):
|
||||||
|
"""Display development environment information."""
|
||||||
|
self.stdout.write("")
|
||||||
|
self.stdout.write(self.style.SUCCESS("🌍 Development Environment:"))
|
||||||
|
self.stdout.write(f" - Settings Module: {settings.SETTINGS_MODULE}")
|
||||||
|
self.stdout.write(f" - Debug Mode: {settings.DEBUG}")
|
||||||
|
self.stdout.write(" - Database: PostgreSQL with PostGIS")
|
||||||
|
self.stdout.write(" - Cache: Local memory cache")
|
||||||
|
self.stdout.write(" - Admin URL: http://localhost:8000/admin/")
|
||||||
|
self.stdout.write(" - Admin User: admin / admin")
|
||||||
|
self.stdout.write(" - Silk Profiler: http://localhost:8000/silk/")
|
||||||
|
self.stdout.write(" - Debug Toolbar: Available on debug pages")
|
||||||
|
self.stdout.write(" - API Documentation: http://localhost:8000/api/docs/")
|
||||||
|
self.stdout.write("")
|
||||||
|
self.stdout.write("🌟 Ready to start development server with:")
|
||||||
|
self.stdout.write(" python manage.py runserver")
|
||||||
|
self.stdout.write("")
|
||||||
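The individual setup steps can also be skipped selectively when the command is driven from code; a small sketch:

```python
from django.core.management import call_command

# Skip static collection and the Tailwind build; everything else runs as above.
call_command("setup_dev", skip_static=True, skip_tailwind=True)
```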
35
backend/apps/core/management/commands/update_trending.py
Normal file
@@ -0,0 +1,35 @@
from django.core.management.base import BaseCommand
from django.core.cache import cache
from apps.parks.models import Park
from apps.rides.models import Ride
from apps.core.analytics import PageView


class Command(BaseCommand):
    help = "Updates trending parks and rides cache based on views in the last 24 hours"

    def handle(self, *args, **kwargs):
        """
        Updates the trending parks and rides in the cache.

        This command is designed to be run every hour via cron to keep the trending
        items up to date. It looks at page views from the last 24 hours and caches
        the top 10 most viewed parks and rides.

        The cached data is used by the home page to display trending items without
        having to query the database on every request.
        """
        # Get top 10 trending parks and rides from the last 24 hours
        trending_parks = PageView.get_trending_items(Park, hours=24, limit=10)
        trending_rides = PageView.get_trending_items(Ride, hours=24, limit=10)

        # Cache the results for 1 hour
        cache.set("trending_parks", trending_parks, 3600)  # 3600 seconds = 1 hour
        cache.set("trending_rides", trending_rides, 3600)

        self.stdout.write(
            self.style.SUCCESS(
                "Successfully updated trending parks and rides. "
                "Cached 10 items each for parks and rides based on views in the last 24 hours."
            )
        )
||||||
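The command is meant to be scheduled hourly (for example via cron calling `manage.py update_trending`); consumers then read the warmed cache instead of querying the database. A hedged sketch of that read side, where the view and template name are illustrative:

```python
from django.core.cache import cache
from django.shortcuts import render


def home(request):
    context = {
        # Keys and the 1-hour TTL match what update_trending writes above.
        "trending_parks": cache.get("trending_parks", []),
        "trending_rides": cache.get("trending_rides", []),
    }
    return render(request, "home.html", context)
```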
273
backend/apps/core/managers.py
Normal file
@@ -0,0 +1,273 @@
|
|||||||
|
"""
|
||||||
|
Custom managers and QuerySets for optimized database patterns.
|
||||||
|
Following Django styleguide best practices for database access.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Optional, List, Union
|
||||||
|
from django.db import models
|
||||||
|
from django.db.models import Q, Count, Avg, Max
|
||||||
|
from django.contrib.gis.geos import Point
|
||||||
|
from django.contrib.gis.measure import Distance
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
|
||||||
|
class BaseQuerySet(models.QuerySet):
|
||||||
|
"""Base QuerySet with common optimizations and patterns."""
|
||||||
|
|
||||||
|
def active(self):
|
||||||
|
"""Filter for active/enabled records."""
|
||||||
|
if hasattr(self.model, "is_active"):
|
||||||
|
return self.filter(is_active=True)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def published(self):
|
||||||
|
"""Filter for published records."""
|
||||||
|
if hasattr(self.model, "is_published"):
|
||||||
|
return self.filter(is_published=True)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def recent(self, *, days: int = 30):
|
||||||
|
"""Filter for recently created records."""
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=days)
|
||||||
|
return self.filter(created_at__gte=cutoff_date)
|
||||||
|
|
||||||
|
def search(self, *, query: str, fields: Optional[List[str]] = None):
|
||||||
|
"""
|
||||||
|
Full-text search across specified fields.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
query: Search query string
|
||||||
|
fields: List of field names to search (defaults to name, description)
|
||||||
|
"""
|
||||||
|
if not query:
|
||||||
|
return self
|
||||||
|
|
||||||
|
if fields is None:
|
||||||
|
fields = ["name", "description"] if hasattr(self.model, "name") else []
|
||||||
|
|
||||||
|
q_objects = Q()
|
||||||
|
for field in fields:
|
||||||
|
if hasattr(self.model, field):
|
||||||
|
q_objects |= Q(**{f"{field}__icontains": query})
|
||||||
|
|
||||||
|
return self.filter(q_objects) if q_objects else self
|
||||||
|
|
||||||
|
def with_stats(self):
|
||||||
|
"""Add basic statistics annotations."""
|
||||||
|
return self
|
||||||
|
|
||||||
|
def optimized_for_list(self):
|
||||||
|
"""Optimize queryset for list display."""
|
||||||
|
return self.select_related().prefetch_related()
|
||||||
|
|
||||||
|
def optimized_for_detail(self):
|
||||||
|
"""Optimize queryset for detail display."""
|
||||||
|
return self.select_related().prefetch_related()
|
||||||
|
|
||||||
|
|
||||||
|
class BaseManager(models.Manager):
|
||||||
|
"""Base manager with common patterns."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return BaseQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def active(self):
|
||||||
|
return self.get_queryset().active()
|
||||||
|
|
||||||
|
def published(self):
|
||||||
|
return self.get_queryset().published()
|
||||||
|
|
||||||
|
def recent(self, *, days: int = 30):
|
||||||
|
return self.get_queryset().recent(days=days)
|
||||||
|
|
||||||
|
def search(self, *, query: str, fields: Optional[List[str]] = None):
|
||||||
|
return self.get_queryset().search(query=query, fields=fields)
|
||||||
|
|
||||||
|
|
||||||
|
class LocationQuerySet(BaseQuerySet):
|
||||||
|
"""QuerySet for location-based models with geographic functionality."""
|
||||||
|
|
||||||
|
def near_point(self, *, point: Point, distance_km: float = 50):
|
||||||
|
"""Filter locations near a geographic point."""
|
||||||
|
if hasattr(self.model, "point"):
|
||||||
|
return (
|
||||||
|
self.filter(point__distance_lte=(point, Distance(km=distance_km)))
|
||||||
|
.distance(point)
|
||||||
|
.order_by("distance")
|
||||||
|
)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def within_bounds(self, *, north: float, south: float, east: float, west: float):
|
||||||
|
"""Filter locations within geographic bounds."""
|
||||||
|
if hasattr(self.model, "point"):
|
||||||
|
return self.filter(
|
||||||
|
point__latitude__gte=south,
|
||||||
|
point__latitude__lte=north,
|
||||||
|
point__longitude__gte=west,
|
||||||
|
point__longitude__lte=east,
|
||||||
|
)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def by_country(self, *, country: str):
|
||||||
|
"""Filter by country."""
|
||||||
|
if hasattr(self.model, "country"):
|
||||||
|
return self.filter(country__iexact=country)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def by_region(self, *, state: str):
|
||||||
|
"""Filter by state/region."""
|
||||||
|
if hasattr(self.model, "state"):
|
||||||
|
return self.filter(state__iexact=state)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def by_city(self, *, city: str):
|
||||||
|
"""Filter by city."""
|
||||||
|
if hasattr(self.model, "city"):
|
||||||
|
return self.filter(city__iexact=city)
|
||||||
|
return self
|
||||||
|
|
||||||
|
|
||||||
|
class LocationManager(BaseManager):
|
||||||
|
"""Manager for location-based models."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return LocationQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def near_point(self, *, point: Point, distance_km: float = 50):
|
||||||
|
return self.get_queryset().near_point(point=point, distance_km=distance_km)
|
||||||
|
|
||||||
|
def within_bounds(self, *, north: float, south: float, east: float, west: float):
|
||||||
|
return self.get_queryset().within_bounds(
|
||||||
|
north=north, south=south, east=east, west=west
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class ReviewableQuerySet(BaseQuerySet):
|
||||||
|
"""QuerySet for models that can be reviewed."""
|
||||||
|
|
||||||
|
def with_review_stats(self):
|
||||||
|
"""Add review statistics annotations."""
|
||||||
|
return self.annotate(
|
||||||
|
review_count=Count("reviews", filter=Q(reviews__is_published=True)),
|
||||||
|
average_rating=Avg("reviews__rating", filter=Q(reviews__is_published=True)),
|
||||||
|
latest_review_date=Max(
|
||||||
|
"reviews__created_at", filter=Q(reviews__is_published=True)
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
def highly_rated(self, *, min_rating: float = 8.0):
|
||||||
|
"""Filter for highly rated items."""
|
||||||
|
return self.with_review_stats().filter(average_rating__gte=min_rating)
|
||||||
|
|
||||||
|
def recently_reviewed(self, *, days: int = 30):
|
||||||
|
"""Filter for items with recent reviews."""
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=days)
|
||||||
|
return self.filter(
|
||||||
|
reviews__created_at__gte=cutoff_date, reviews__is_published=True
|
||||||
|
).distinct()
|
||||||
|
|
||||||
|
|
||||||
|
class ReviewableManager(BaseManager):
|
||||||
|
"""Manager for reviewable models."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return ReviewableQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def with_review_stats(self):
|
||||||
|
return self.get_queryset().with_review_stats()
|
||||||
|
|
||||||
|
def highly_rated(self, *, min_rating: float = 8.0):
|
||||||
|
return self.get_queryset().highly_rated(min_rating=min_rating)
|
||||||
|
|
||||||
|
|
||||||
|
class HierarchicalQuerySet(BaseQuerySet):
|
||||||
|
"""QuerySet for hierarchical models (with parent/child relationships)."""
|
||||||
|
|
||||||
|
def root_level(self):
|
||||||
|
"""Filter for root-level items (no parent)."""
|
||||||
|
if hasattr(self.model, "parent"):
|
||||||
|
return self.filter(parent__isnull=True)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def children_of(self, *, parent_id: int):
|
||||||
|
"""Get children of a specific parent."""
|
||||||
|
if hasattr(self.model, "parent"):
|
||||||
|
return self.filter(parent_id=parent_id)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def with_children_count(self):
|
||||||
|
"""Add count of children."""
|
||||||
|
if hasattr(self.model, "children"):
|
||||||
|
return self.annotate(children_count=Count("children"))
|
||||||
|
return self
|
||||||
|
|
||||||
|
|
||||||
|
class HierarchicalManager(BaseManager):
|
||||||
|
"""Manager for hierarchical models."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return HierarchicalQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def root_level(self):
|
||||||
|
return self.get_queryset().root_level()
|
||||||
|
|
||||||
|
|
||||||
|
class TimestampedQuerySet(BaseQuerySet):
|
||||||
|
"""QuerySet for models with created_at/updated_at timestamps."""
|
||||||
|
|
||||||
|
def created_between(self, *, start_date, end_date):
|
||||||
|
"""Filter by creation date range."""
|
||||||
|
return self.filter(created_at__date__range=[start_date, end_date])
|
||||||
|
|
||||||
|
def updated_since(self, *, since_date):
|
||||||
|
"""Filter for records updated since a date."""
|
||||||
|
return self.filter(updated_at__gte=since_date)
|
||||||
|
|
||||||
|
def by_creation_date(self, *, descending: bool = True):
|
||||||
|
"""Order by creation date."""
|
||||||
|
order = "-created_at" if descending else "created_at"
|
||||||
|
return self.order_by(order)
|
||||||
|
|
||||||
|
|
||||||
|
class TimestampedManager(BaseManager):
|
||||||
|
"""Manager for timestamped models."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return TimestampedQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def created_between(self, *, start_date, end_date):
|
||||||
|
return self.get_queryset().created_between(
|
||||||
|
start_date=start_date, end_date=end_date
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class StatusQuerySet(BaseQuerySet):
|
||||||
|
"""QuerySet for models with status fields."""
|
||||||
|
|
||||||
|
def with_status(self, *, status: Union[str, List[str]]):
|
||||||
|
"""Filter by status."""
|
||||||
|
if isinstance(status, list):
|
||||||
|
return self.filter(status__in=status)
|
||||||
|
return self.filter(status=status)
|
||||||
|
|
||||||
|
def operating(self):
|
||||||
|
"""Filter for operating/active status."""
|
||||||
|
return self.filter(status="OPERATING")
|
||||||
|
|
||||||
|
def closed(self):
|
||||||
|
"""Filter for closed status."""
|
||||||
|
return self.filter(status__in=["CLOSED_TEMP", "CLOSED_PERM"])
|
||||||
|
|
||||||
|
|
||||||
|
class StatusManager(BaseManager):
|
||||||
|
"""Manager for status-based models."""
|
||||||
|
|
||||||
|
def get_queryset(self):
|
||||||
|
return StatusQuerySet(self.model, using=self._db)
|
||||||
|
|
||||||
|
def operating(self):
|
||||||
|
return self.get_queryset().operating()
|
||||||
|
|
||||||
|
def closed(self):
|
||||||
|
return self.get_queryset().closed()
|
||||||
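A hedged sketch of how a model might adopt these managers and chain the QuerySet helpers. The `Park` fields and the `reviews` relation assumed by `with_review_stats()` are illustrative, not definitions from this diff.

```python
from django.db import models

from apps.core.managers import ReviewableManager


class Park(models.Model):
    name = models.CharField(max_length=255)
    description = models.TextField(blank=True)
    is_active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    objects = ReviewableManager()


# Manager and QuerySet methods chain freely because both share the same QuerySet class:
top_rated = Park.objects.active().highly_rated(min_rating=8.5)
recent_hits = Park.objects.search(query="coaster").recent(days=30)
```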
22
backend/apps/core/middleware/__init__.py
Normal file
@@ -0,0 +1,22 @@
# Core middleware modules

# Import middleware classes from the analytics module
from .analytics import PageViewMiddleware, PgHistoryContextMiddleware

# Import middleware classes from the performance_middleware.py module
from .performance_middleware import (
    PerformanceMiddleware,
    QueryCountMiddleware,
    DatabaseConnectionMiddleware,
    CachePerformanceMiddleware,
)

# Make all middleware classes available at the package level
__all__ = [
    "PageViewMiddleware",
    "PgHistoryContextMiddleware",
    "PerformanceMiddleware",
    "QueryCountMiddleware",
    "DatabaseConnectionMiddleware",
    "CachePerformanceMiddleware",
]
||||||
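Enabling the middleware is a settings concern; a hedged sketch of the relevant `MIDDLEWARE` entries (the ordering shown is a reasonable guess, not something this diff prescribes):

```python
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # ... Django's standard middleware ...
    "apps.core.middleware.PerformanceMiddleware",
    "apps.core.middleware.QueryCountMiddleware",
    "apps.core.middleware.PgHistoryContextMiddleware",
    "apps.core.middleware.PageViewMiddleware",
]
```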
84
backend/apps/core/middleware/analytics.py
Normal file
@@ -0,0 +1,84 @@
|
|||||||
|
"""
|
||||||
|
Analytics and tracking middleware for Django application.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import pghistory
|
||||||
|
from django.contrib.auth.models import AnonymousUser
|
||||||
|
from django.core.handlers.wsgi import WSGIRequest
|
||||||
|
from django.utils.deprecation import MiddlewareMixin
|
||||||
|
from django.contrib.contenttypes.models import ContentType
|
||||||
|
from django.views.generic.detail import DetailView
|
||||||
|
from apps.core.analytics import PageView
|
||||||
|
|
||||||
|
|
||||||
|
class RequestContextProvider(pghistory.context):
|
||||||
|
"""Custom context provider for pghistory that extracts information from the request."""
|
||||||
|
|
||||||
|
def __call__(self, request: WSGIRequest) -> dict:
|
||||||
|
return {
|
||||||
|
"user": (
|
||||||
|
str(request.user)
|
||||||
|
if request.user and not isinstance(request.user, AnonymousUser)
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
"ip": request.META.get("REMOTE_ADDR"),
|
||||||
|
"user_agent": request.META.get("HTTP_USER_AGENT"),
|
||||||
|
"session_key": (
|
||||||
|
request.session.session_key if hasattr(request, "session") else None
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# Initialize the context provider
|
||||||
|
request_context = RequestContextProvider()
|
||||||
|
|
||||||
|
|
||||||
|
class PgHistoryContextMiddleware:
|
||||||
|
"""
|
||||||
|
Middleware that ensures request object is available to pghistory context.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, get_response):
|
||||||
|
self.get_response = get_response
|
||||||
|
|
||||||
|
def __call__(self, request):
|
||||||
|
response = self.get_response(request)
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
class PageViewMiddleware(MiddlewareMixin):
|
||||||
|
"""Middleware to track page views for DetailView-based pages."""
|
||||||
|
|
||||||
|
def process_view(self, request, view_func, view_args, view_kwargs):
|
||||||
|
# Only track GET requests
|
||||||
|
if request.method != "GET":
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Get view class if it exists
|
||||||
|
view_class = getattr(view_func, "view_class", None)
|
||||||
|
if not view_class or not issubclass(view_class, DetailView):
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Get the object if it's a detail view
|
||||||
|
try:
|
||||||
|
view_instance = view_class()
|
||||||
|
view_instance.request = request
|
||||||
|
view_instance.args = view_args
|
||||||
|
view_instance.kwargs = view_kwargs
|
||||||
|
obj = view_instance.get_object()
|
||||||
|
except (AttributeError, Exception):
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Record the page view
|
||||||
|
try:
|
||||||
|
PageView.objects.create(
|
||||||
|
content_type=ContentType.objects.get_for_model(obj.__class__),
|
||||||
|
object_id=obj.pk,
|
||||||
|
ip_address=request.META.get("REMOTE_ADDR", ""),
|
||||||
|
user_agent=request.META.get("HTTP_USER_AGENT", "")[:512],
|
||||||
|
)
|
||||||
|
except Exception:
|
||||||
|
# Fail silently to not interrupt the request
|
||||||
|
pass
|
||||||
|
|
||||||
|
return None
|
||||||
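Reading the recorded views back is plain ORM work; a small sketch of a counter helper (the helper name is an assumption, not part of this diff):

```python
from django.contrib.contenttypes.models import ContentType

from apps.core.analytics import PageView


def view_count_for(obj) -> int:
    """Total recorded page views for any tracked object."""
    return PageView.objects.filter(
        content_type=ContentType.objects.get_for_model(type(obj)),
        object_id=obj.pk,
    ).count()
```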
317
backend/apps/core/middleware/performance_middleware.py
Normal file
@@ -0,0 +1,317 @@
|
|||||||
|
"""
|
||||||
|
Performance monitoring middleware for tracking request metrics.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
import logging
|
||||||
|
from django.db import connection
|
||||||
|
from django.utils.deprecation import MiddlewareMixin
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
performance_logger = logging.getLogger("performance")
|
||||||
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class PerformanceMiddleware(MiddlewareMixin):
|
||||||
|
"""Middleware to collect performance metrics for each request"""
|
||||||
|
|
||||||
|
def process_request(self, request):
|
||||||
|
"""Initialize performance tracking for the request"""
|
||||||
|
request._performance_start_time = time.time()
|
||||||
|
request._performance_initial_queries = (
|
||||||
|
len(connection.queries) if hasattr(connection, "queries") else 0
|
||||||
|
)
|
||||||
|
return None
|
||||||
|
|
||||||
|
def process_response(self, request, response):
|
||||||
|
"""Log performance metrics after response is ready"""
|
||||||
|
# Skip performance tracking for certain paths
|
||||||
|
skip_paths = [
|
||||||
|
"/health/",
|
||||||
|
"/admin/jsi18n/",
|
||||||
|
"/static/",
|
||||||
|
"/media/",
|
||||||
|
"/__debug__/",
|
||||||
|
]
|
||||||
|
if any(request.path.startswith(path) for path in skip_paths):
|
||||||
|
return response
|
||||||
|
|
||||||
|
# Calculate metrics
|
||||||
|
end_time = time.time()
|
||||||
|
start_time = getattr(request, "_performance_start_time", end_time)
|
||||||
|
duration = end_time - start_time
|
||||||
|
|
||||||
|
initial_queries = getattr(request, "_performance_initial_queries", 0)
|
||||||
|
total_queries = (
|
||||||
|
len(connection.queries) - initial_queries
|
||||||
|
if hasattr(connection, "queries")
|
||||||
|
else 0
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get content length
|
||||||
|
content_length = 0
|
||||||
|
if hasattr(response, "content"):
|
||||||
|
content_length = len(response.content)
|
||||||
|
elif hasattr(response, "streaming_content"):
|
||||||
|
# For streaming responses, we can't easily measure content length
|
||||||
|
content_length = -1
|
||||||
|
|
||||||
|
# Build performance data
|
||||||
|
performance_data = {
|
||||||
|
"path": request.path,
|
||||||
|
"method": request.method,
|
||||||
|
"status_code": response.status_code,
|
||||||
|
"duration_ms": round(duration * 1000, 2),
|
||||||
|
"duration_seconds": round(duration, 3),
|
||||||
|
"query_count": total_queries,
|
||||||
|
"content_length_bytes": content_length,
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", None)
|
||||||
|
if hasattr(request, "user") and request.user.is_authenticated
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
"user_agent": request.META.get("HTTP_USER_AGENT", "")[
|
||||||
|
:100
|
||||||
|
], # Truncate user agent
|
||||||
|
"remote_addr": self._get_client_ip(request),
|
||||||
|
}
|
||||||
|
|
||||||
|
# Add query details in debug mode
|
||||||
|
if settings.DEBUG and hasattr(connection, "queries") and total_queries > 0:
|
||||||
|
recent_queries = connection.queries[-total_queries:]
|
||||||
|
performance_data["queries"] = [
|
||||||
|
{
|
||||||
|
"sql": (
|
||||||
|
query["sql"][:200] + "..."
|
||||||
|
if len(query["sql"]) > 200
|
||||||
|
else query["sql"]
|
||||||
|
),
|
||||||
|
"time": float(query["time"]),
|
||||||
|
}
|
||||||
|
for query in recent_queries[-10:] # Last 10 queries only
|
||||||
|
]
|
||||||
|
|
||||||
|
# Identify slow queries
|
||||||
|
slow_queries = [q for q in recent_queries if float(q["time"]) > 0.1]
|
||||||
|
if slow_queries:
|
||||||
|
performance_data["slow_query_count"] = len(slow_queries)
|
||||||
|
performance_data["slowest_query_time"] = max(
|
||||||
|
float(q["time"]) for q in slow_queries
|
||||||
|
)
|
||||||
|
|
||||||
|
# Determine log level based on performance
|
||||||
|
log_level = self._get_log_level(duration, total_queries, response.status_code)
|
||||||
|
|
||||||
|
# Log the performance data
|
||||||
|
performance_logger.log(
|
||||||
|
log_level,
|
||||||
|
f"Request performance: {request.method} {request.path} - "
|
||||||
|
f"{duration:.3f}s, {total_queries} queries, {response.status_code}",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add performance headers for debugging (only in debug mode)
|
||||||
|
if settings.DEBUG:
|
||||||
|
response["X-Response-Time"] = f"{duration * 1000:.2f}ms"
|
||||||
|
response["X-Query-Count"] = str(total_queries)
|
||||||
|
if total_queries > 0 and hasattr(connection, "queries"):
|
||||||
|
total_query_time = sum(
|
||||||
|
float(q["time"]) for q in connection.queries[-total_queries:]
|
||||||
|
)
|
||||||
|
response["X-Query-Time"] = f"{total_query_time * 1000:.2f}ms"
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
def process_exception(self, request, exception):
|
||||||
|
"""Log performance data even when an exception occurs"""
|
||||||
|
end_time = time.time()
|
||||||
|
start_time = getattr(request, "_performance_start_time", end_time)
|
||||||
|
duration = end_time - start_time
|
||||||
|
|
||||||
|
initial_queries = getattr(request, "_performance_initial_queries", 0)
|
||||||
|
total_queries = (
|
||||||
|
len(connection.queries) - initial_queries
|
||||||
|
if hasattr(connection, "queries")
|
||||||
|
else 0
|
||||||
|
)
|
||||||
|
|
||||||
|
performance_data = {
|
||||||
|
"path": request.path,
|
||||||
|
"method": request.method,
|
||||||
|
"status_code": 500, # Exception occurred
|
||||||
|
"duration_ms": round(duration * 1000, 2),
|
||||||
|
"query_count": total_queries,
|
||||||
|
"exception": str(exception),
|
||||||
|
"exception_type": type(exception).__name__,
|
||||||
|
"user_id": (
|
||||||
|
getattr(request.user, "id", None)
|
||||||
|
if hasattr(request, "user") and request.user.is_authenticated
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
performance_logger.error(
|
||||||
|
f"Request exception: {
|
||||||
|
request.method} {
|
||||||
|
request.path} - "
|
||||||
|
f"{
|
||||||
|
duration:.3f}s, {total_queries} queries, {
|
||||||
|
type(exception).__name__}: {exception}",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
|
||||||
|
return None # Don't handle the exception, just log it
|
||||||
|
|
||||||
|
def _get_client_ip(self, request):
|
||||||
|
"""Extract client IP address from request"""
|
||||||
|
x_forwarded_for = request.META.get("HTTP_X_FORWARDED_FOR")
|
||||||
|
if x_forwarded_for:
|
||||||
|
ip = x_forwarded_for.split(",")[0].strip()
|
||||||
|
else:
|
||||||
|
ip = request.META.get("REMOTE_ADDR", "")
|
||||||
|
return ip
|
||||||
|
|
||||||
|
def _get_log_level(self, duration, query_count, status_code):
|
||||||
|
"""Determine appropriate log level based on performance metrics"""
|
||||||
|
# Error responses
|
||||||
|
if status_code >= 500:
|
||||||
|
return logging.ERROR
|
||||||
|
elif status_code >= 400:
|
||||||
|
return logging.WARNING
|
||||||
|
|
||||||
|
# Performance-based log levels
|
||||||
|
if duration > 5.0: # Very slow requests
|
||||||
|
return logging.ERROR
|
||||||
|
elif duration > 2.0 or query_count > 20: # Slow requests or high query count
|
||||||
|
return logging.WARNING
|
||||||
|
elif duration > 1.0 or query_count > 10: # Moderately slow
|
||||||
|
return logging.INFO
|
||||||
|
else:
|
||||||
|
return logging.DEBUG
|
||||||
|
|
||||||
|
|
||||||
|
class QueryCountMiddleware(MiddlewareMixin):
|
||||||
|
"""Middleware to track and limit query counts per request"""
|
||||||
|
|
||||||
|
def __init__(self, get_response):
|
||||||
|
self.get_response = get_response
|
||||||
|
self.query_limit = getattr(settings, "MAX_QUERIES_PER_REQUEST", 50)
|
||||||
|
super().__init__(get_response)
|
||||||
|
|
||||||
|
def process_request(self, request):
|
||||||
|
"""Initialize query tracking"""
|
||||||
|
request._query_count_start = (
|
||||||
|
len(connection.queries) if hasattr(connection, "queries") else 0
|
||||||
|
)
|
||||||
|
return None
|
||||||
|
|
||||||
|
def process_response(self, request, response):
|
||||||
|
"""Check query count and warn if excessive"""
|
||||||
|
if not hasattr(connection, "queries"):
|
||||||
|
return response
|
||||||
|
|
||||||
|
start_count = getattr(request, "_query_count_start", 0)
|
||||||
|
current_count = len(connection.queries)
|
||||||
|
request_query_count = current_count - start_count
|
||||||
|
|
||||||
|
if request_query_count > self.query_limit:
|
||||||
|
logger.warning(
|
||||||
|
f"Excessive query count: {
|
||||||
|
request.path} executed {request_query_count} queries "
|
||||||
|
f"(limit: {
|
||||||
|
self.query_limit})",
|
||||||
|
extra={
|
||||||
|
"path": request.path,
|
||||||
|
"method": request.method,
|
||||||
|
"query_count": request_query_count,
|
||||||
|
"query_limit": self.query_limit,
|
||||||
|
"excessive_queries": True,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
class DatabaseConnectionMiddleware(MiddlewareMixin):
|
||||||
|
"""Middleware to monitor database connection health"""
|
||||||
|
|
||||||
|
def process_request(self, request):
|
||||||
|
"""Check database connection at start of request"""
|
||||||
|
try:
|
||||||
|
# Simple connection test
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT 1")
|
||||||
|
cursor.fetchone()
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(
|
||||||
|
f"Database connection failed at request start: {e}",
|
||||||
|
extra={
|
||||||
|
"path": request.path,
|
||||||
|
"method": request.method,
|
||||||
|
"database_error": str(e),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
# Don't block the request, let Django handle the database error
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
def process_response(self, request, response):
|
||||||
|
"""Close database connections properly"""
|
||||||
|
try:
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
connection.close()
|
||||||
|
except Exception as e:
|
||||||
|
logger.warning(f"Error closing database connection: {e}")
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
class CachePerformanceMiddleware(MiddlewareMixin):
|
||||||
|
"""Middleware to monitor cache performance"""
|
||||||
|
|
||||||
|
def process_request(self, request):
|
||||||
|
"""Initialize cache performance tracking"""
|
||||||
|
request._cache_hits = 0
|
||||||
|
request._cache_misses = 0
|
||||||
|
request._cache_start_time = time.time()
|
||||||
|
return None
|
||||||
|
|
||||||
|
def process_response(self, request, response):
|
||||||
|
"""Log cache performance metrics"""
|
||||||
|
cache_duration = time.time() - getattr(
|
||||||
|
request, "_cache_start_time", time.time()
|
||||||
|
)
|
||||||
|
cache_hits = getattr(request, "_cache_hits", 0)
|
||||||
|
cache_misses = getattr(request, "_cache_misses", 0)
|
||||||
|
|
||||||
|
if cache_hits + cache_misses > 0:
|
||||||
|
hit_rate = (cache_hits / (cache_hits + cache_misses)) * 100
|
||||||
|
|
||||||
|
cache_data = {
|
||||||
|
"path": request.path,
|
||||||
|
"cache_hits": cache_hits,
|
||||||
|
"cache_misses": cache_misses,
|
||||||
|
"cache_hit_rate": round(hit_rate, 2),
|
||||||
|
"cache_operations": cache_hits + cache_misses,
|
||||||
|
# milliseconds
|
||||||
|
"cache_duration": round(cache_duration * 1000, 2),
|
||||||
|
}
|
||||||
|
|
||||||
|
# Log cache performance
|
||||||
|
if hit_rate < 50 and cache_hits + cache_misses > 5:
|
||||||
|
logger.warning(
|
||||||
|
f"Low cache hit rate for {request.path}: {hit_rate:.1f}%",
|
||||||
|
extra=cache_data,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.debug(
|
||||||
|
f"Cache performance for {
|
||||||
|
request.path}: {
|
||||||
|
hit_rate:.1f}% hit rate",
|
||||||
|
extra=cache_data,
|
||||||
|
)
|
||||||
|
|
||||||
|
return response
|
||||||
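The middleware reads one optional setting and logs through a dedicated logger; a hedged sketch of the supporting configuration (handler and level choices are illustrative):

```python
# Only MAX_QUERIES_PER_REQUEST is read by QueryCountMiddleware above; 50 is its default.
MAX_QUERIES_PER_REQUEST = 50

LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "loggers": {
        # PerformanceMiddleware emits its request metrics via the "performance" logger.
        "performance": {"handlers": ["console"], "level": "INFO"},
    },
}
```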
54
backend/apps/core/migrations/0001_initial.py
Normal file
@@ -0,0 +1,54 @@
|
|||||||
|
# Generated by Django 5.1.4 on 2025-08-13 21:35
|
||||||
|
|
||||||
|
import django.db.models.deletion
|
||||||
|
from django.db import migrations, models
|
||||||
|
|
||||||
|
|
||||||
|
class Migration(migrations.Migration):
|
||||||
|
|
||||||
|
initial = True
|
||||||
|
|
||||||
|
dependencies = [
|
||||||
|
("contenttypes", "0002_remove_content_type_name"),
|
||||||
|
]
|
||||||
|
|
||||||
|
operations = [
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="SlugHistory",
|
||||||
|
fields=[
|
||||||
|
(
|
||||||
|
"id",
|
||||||
|
models.BigAutoField(
|
||||||
|
auto_created=True,
|
||||||
|
primary_key=True,
|
||||||
|
serialize=False,
|
||||||
|
verbose_name="ID",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("object_id", models.CharField(max_length=50)),
|
||||||
|
("old_slug", models.SlugField(max_length=200)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
(
|
||||||
|
"content_type",
|
||||||
|
models.ForeignKey(
|
||||||
|
on_delete=django.db.models.deletion.CASCADE,
|
||||||
|
to="contenttypes.contenttype",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"verbose_name_plural": "Slug histories",
|
||||||
|
"ordering": ["-created_at"],
|
||||||
|
"indexes": [
|
||||||
|
models.Index(
|
||||||
|
fields=["content_type", "object_id"],
|
||||||
|
name="core_slughi_content_8bbf56_idx",
|
||||||
|
),
|
||||||
|
models.Index(
|
||||||
|
fields=["old_slug"],
|
||||||
|
name="core_slughi_old_slu_aaef7f_idx",
|
||||||
|
),
|
||||||
|
],
|
||||||
|
},
|
||||||
|
),
|
||||||
|
]
|
||||||
102
backend/apps/core/migrations/0002_historicalslug_pageview.py
Normal file
@@ -0,0 +1,102 @@
# Generated by Django 5.1.4 on 2025-08-14 14:50

import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ("contenttypes", "0002_remove_content_type_name"),
        ("core", "0001_initial"),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name="HistoricalSlug",
            fields=[
                (
                    "id",
                    models.BigAutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("object_id", models.PositiveIntegerField()),
                ("slug", models.SlugField(max_length=255)),
                ("created_at", models.DateTimeField(auto_now_add=True)),
                (
                    "content_type",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        to="contenttypes.contenttype",
                    ),
                ),
                (
                    "user",
                    models.ForeignKey(
                        blank=True,
                        null=True,
                        on_delete=django.db.models.deletion.SET_NULL,
                        related_name="historical_slugs",
                        to=settings.AUTH_USER_MODEL,
                    ),
                ),
            ],
            options={
                "indexes": [
                    models.Index(
                        fields=["content_type", "object_id"],
                        name="core_histor_content_b4c470_idx",
                    ),
                    models.Index(fields=["slug"], name="core_histor_slug_8fd7b3_idx"),
                ],
                "unique_together": {("content_type", "slug")},
            },
        ),
        migrations.CreateModel(
            name="PageView",
            fields=[
                (
                    "id",
                    models.BigAutoField(
                        auto_created=True,
                        primary_key=True,
                        serialize=False,
                        verbose_name="ID",
                    ),
                ),
                ("object_id", models.PositiveIntegerField()),
                (
                    "timestamp",
                    models.DateTimeField(auto_now_add=True, db_index=True),
                ),
                ("ip_address", models.GenericIPAddressField()),
                ("user_agent", models.CharField(blank=True, max_length=512)),
                (
                    "content_type",
                    models.ForeignKey(
                        on_delete=django.db.models.deletion.CASCADE,
                        related_name="page_views",
                        to="contenttypes.contenttype",
                    ),
                ),
            ],
            options={
                "indexes": [
                    models.Index(
                        fields=["timestamp"],
                        name="core_pagevi_timesta_757ebb_idx",
                    ),
                    models.Index(
                        fields=["content_type", "object_id"],
                        name="core_pagevi_content_eda7ad_idx",
                    ),
                ],
            },
        ),
    ]
0
backend/apps/core/migrations/__init__.py
Normal file
19
backend/apps/core/mixins/__init__.py
Normal file
@@ -0,0 +1,19 @@
from django.views.generic.list import MultipleObjectMixin


class HTMXFilterableMixin(MultipleObjectMixin):
    """
    A mixin that provides filtering capabilities for HTMX requests.
    """

    filter_class = None

    def get_queryset(self):
        queryset = super().get_queryset()
        self.filterset = self.filter_class(self.request.GET, queryset=queryset)
        return self.filterset.qs

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context["filter"] = self.filterset
        return context
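A minimal sketch of a list view wired to this mixin; the `ParkFilter` FilterSet and its module are assumptions for illustration:

```python
# Hedged sketch - not part of the commit.
from django.views.generic import ListView

from apps.core.mixins import HTMXFilterableMixin
from apps.parks.filters import ParkFilter  # assumed django-filter FilterSet
from apps.parks.models import Park


class ParkListView(HTMXFilterableMixin, ListView):
    model = Park
    template_name = "parks/park_list.html"
    filter_class = ParkFilter  # consumed by HTMXFilterableMixin.get_queryset()
```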
113
backend/apps/core/models.py
Normal file
@@ -0,0 +1,113 @@
from django.db import models
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.utils.text import slugify
from apps.core.history import TrackedModel


class SlugHistory(models.Model):
    """
    Model for tracking slug changes across all models that use slugs.
    Uses generic relations to work with any model.
    """

    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.CharField(
        max_length=50
    )  # Using CharField to work with our custom IDs
    content_object = GenericForeignKey("content_type", "object_id")

    old_slug = models.SlugField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        indexes = [
            models.Index(fields=["content_type", "object_id"]),
            models.Index(fields=["old_slug"]),
        ]
        verbose_name_plural = "Slug histories"
        ordering = ["-created_at"]

    def __str__(self):
        return f"Old slug '{self.old_slug}' for {self.content_object}"


class SluggedModel(TrackedModel):
    """
    Abstract base model that provides slug functionality with history tracking.
    """

    name = models.CharField(max_length=200)
    slug = models.SlugField(max_length=200, unique=True)

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        # Get the current instance from DB if it exists
        if self.pk:
            try:
                old_instance = self.__class__.objects.get(pk=self.pk)
                # If slug has changed, save the old one to history
                if old_instance.slug != self.slug:
                    SlugHistory.objects.create(
                        content_type=ContentType.objects.get_for_model(self),
                        object_id=getattr(self, self.get_id_field_name()),
                        old_slug=old_instance.slug,
                    )
            except self.__class__.DoesNotExist:
                pass

        # Generate slug if not set
        if not self.slug:
            self.slug = slugify(self.name)

        super().save(*args, **kwargs)

    def get_id_field_name(self):
        """
        Returns the name of the read-only ID field for this model.
        Should be overridden by subclasses.
        """
        raise NotImplementedError(
            "Subclasses of SluggedModel must implement get_id_field_name()"
        )

    @classmethod
    def get_by_slug(cls, slug):
        """
        Get an object by its current or historical slug.
        Returns (object, is_old_slug) tuple.
        """
        try:
            # Try to get by current slug first
            return cls.objects.get(slug=slug), False
        except cls.DoesNotExist:
            # Check pghistory first
            history_model = cls.get_history_model()
            history_entry = (
                history_model.objects.filter(slug=slug)
                .order_by("-pgh_created_at")
                .first()
            )

            if history_entry:
                return cls.objects.get(id=history_entry.pgh_obj_id), True

            # Try to find in manual slug history as fallback
            history = (
                SlugHistory.objects.filter(
                    content_type=ContentType.objects.get_for_model(cls),
                    old_slug=slug,
                )
                .order_by("-created_at")
                .first()
            )

            if history:
                return (
                    cls.objects.get(**{cls.get_id_field_name(): history.object_id}),
                    True,
                )

            raise cls.DoesNotExist(f"{cls.__name__} with slug '{slug}' does not exist")
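A minimal sketch of how a concrete model might subclass `SluggedModel`; the `Festival` model and its `festival_id` field are hypothetical, not part of the commit:

```python
# Hedged sketch - not part of the commit.
from django.db import models

from apps.core.models import SluggedModel


class Festival(SluggedModel):
    """Illustrative subclass; 'festival_id' stands in for a custom public ID."""

    festival_id = models.CharField(max_length=50, unique=True)

    def get_id_field_name(self):
        return "festival_id"


# Renaming a Festival writes the previous slug to SlugHistory, so old URLs
# can still resolve:
# obj, is_old_slug = Festival.get_by_slug("previous-name")
```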
322
backend/apps/core/selectors.py
Normal file
@@ -0,0 +1,322 @@
"""
Selectors for core functionality including map services and analytics.
Following Django styleguide pattern for separating data access from business logic.
"""

from typing import Optional, Dict, Any, List
from django.db.models import QuerySet, Q, Count
from django.contrib.gis.geos import Point, Polygon
from django.contrib.gis.measure import Distance
from django.utils import timezone
from datetime import timedelta

from .analytics import PageView
from apps.parks.models import Park
from apps.rides.models import Ride


def unified_locations_for_map(
    *,
    bounds: Optional[Polygon] = None,
    location_types: Optional[List[str]] = None,
    filters: Optional[Dict[str, Any]] = None,
) -> Dict[str, QuerySet]:
    """
    Get unified location data for map display across all location types.

    Args:
        bounds: Geographic boundary polygon
        location_types: List of location types to include ('park', 'ride')
        filters: Additional filter parameters

    Returns:
        Dictionary containing querysets for each location type
    """
    results = {}

    # Default to all location types if none specified
    if not location_types:
        location_types = ["park", "ride"]

    # Parks
    if "park" in location_types:
        park_queryset = (
            Park.objects.select_related("operator")
            .prefetch_related("location")
            .annotate(ride_count_calculated=Count("rides"))
        )

        if bounds:
            park_queryset = park_queryset.filter(location__coordinates__within=bounds)

        if filters:
            if "status" in filters:
                park_queryset = park_queryset.filter(status=filters["status"])
            if "operator" in filters:
                park_queryset = park_queryset.filter(operator=filters["operator"])

        results["parks"] = park_queryset.order_by("name")

    # Rides
    if "ride" in location_types:
        ride_queryset = Ride.objects.select_related(
            "park", "manufacturer"
        ).prefetch_related("park__location", "location")

        if bounds:
            ride_queryset = ride_queryset.filter(
                Q(location__coordinates__within=bounds)
                | Q(park__location__coordinates__within=bounds)
            )

        if filters:
            if "category" in filters:
                ride_queryset = ride_queryset.filter(category=filters["category"])
            if "manufacturer" in filters:
                ride_queryset = ride_queryset.filter(
                    manufacturer=filters["manufacturer"]
                )
            if "park" in filters:
                ride_queryset = ride_queryset.filter(park=filters["park"])

        results["rides"] = ride_queryset.order_by("park__name", "name")

    return results


def locations_near_point(
    *,
    point: Point,
    distance_km: float = 50,
    location_types: Optional[List[str]] = None,
    limit: int = 20,
) -> Dict[str, QuerySet]:
    """
    Get locations near a specific geographic point across all types.

    Args:
        point: Geographic point (longitude, latitude)
        distance_km: Maximum distance in kilometers
        location_types: List of location types to include
        limit: Maximum number of results per type

    Returns:
        Dictionary containing nearby locations by type
    """
    results = {}

    if not location_types:
        location_types = ["park", "ride"]

    # Parks near point
    if "park" in location_types:
        results["parks"] = (
            Park.objects.filter(
                location__coordinates__distance_lte=(
                    point,
                    Distance(km=distance_km),
                )
            )
            .select_related("operator")
            .prefetch_related("location")
            .distance(point)
            .order_by("distance")[:limit]
        )

    # Rides near point
    if "ride" in location_types:
        results["rides"] = (
            Ride.objects.filter(
                Q(
                    location__coordinates__distance_lte=(
                        point,
                        Distance(km=distance_km),
                    )
                )
                | Q(
                    park__location__coordinates__distance_lte=(
                        point,
                        Distance(km=distance_km),
                    )
                )
            )
            .select_related("park", "manufacturer")
            .prefetch_related("park__location")
            .distance(point)
            .order_by("distance")[:limit]
        )

    return results


def search_all_locations(*, query: str, limit: int = 20) -> Dict[str, QuerySet]:
    """
    Search across all location types for a query string.

    Args:
        query: Search string
        limit: Maximum results per type

    Returns:
        Dictionary containing search results by type
    """
    results = {}

    # Search parks
    results["parks"] = (
        Park.objects.filter(
            Q(name__icontains=query)
            | Q(description__icontains=query)
            | Q(location__city__icontains=query)
            | Q(location__region__icontains=query)
        )
        .select_related("operator")
        .prefetch_related("location")
        .order_by("name")[:limit]
    )

    # Search rides
    results["rides"] = (
        Ride.objects.filter(
            Q(name__icontains=query)
            | Q(description__icontains=query)
            | Q(park__name__icontains=query)
            | Q(manufacturer__name__icontains=query)
        )
        .select_related("park", "manufacturer")
        .prefetch_related("park__location")
        .order_by("park__name", "name")[:limit]
    )

    return results


def page_views_for_analytics(
    *,
    start_date: Optional[timezone.datetime] = None,
    end_date: Optional[timezone.datetime] = None,
    path_pattern: Optional[str] = None,
) -> QuerySet[PageView]:
    """
    Get page views for analytics with optional filtering.

    Args:
        start_date: Start date for filtering
        end_date: End date for filtering
        path_pattern: URL path pattern to filter by

    Returns:
        QuerySet of page views
    """
    queryset = PageView.objects.all()

    if start_date:
        queryset = queryset.filter(timestamp__gte=start_date)

    if end_date:
        queryset = queryset.filter(timestamp__lte=end_date)

    if path_pattern:
        queryset = queryset.filter(path__icontains=path_pattern)

    return queryset.order_by("-timestamp")


def popular_pages_summary(*, days: int = 30) -> Dict[str, Any]:
    """
    Get summary of most popular pages in the last N days.

    Args:
        days: Number of days to analyze

    Returns:
        Dictionary containing popular pages statistics
    """
    cutoff_date = timezone.now() - timedelta(days=days)

    # Most viewed pages
    popular_pages = (
        PageView.objects.filter(timestamp__gte=cutoff_date)
        .values("path")
        .annotate(view_count=Count("id"))
        .order_by("-view_count")[:10]
    )

    # Total page views
    total_views = PageView.objects.filter(timestamp__gte=cutoff_date).count()

    # Unique visitors (based on IP)
    unique_visitors = (
        PageView.objects.filter(timestamp__gte=cutoff_date)
        .values("ip_address")
        .distinct()
        .count()
    )

    return {
        "popular_pages": list(popular_pages),
        "total_views": total_views,
        "unique_visitors": unique_visitors,
        "period_days": days,
    }


def geographic_distribution_summary() -> Dict[str, Any]:
    """
    Get geographic distribution statistics for all locations.

    Returns:
        Dictionary containing geographic statistics
    """
    # Parks by country
    parks_by_country = (
        Park.objects.filter(location__country__isnull=False)
        .values("location__country")
        .annotate(count=Count("id"))
        .order_by("-count")
    )

    # Rides by country (through park location)
    rides_by_country = (
        Ride.objects.filter(park__location__country__isnull=False)
        .values("park__location__country")
        .annotate(count=Count("id"))
        .order_by("-count")
    )

    return {
        "parks_by_country": list(parks_by_country),
        "rides_by_country": list(rides_by_country),
    }


def system_health_metrics() -> Dict[str, Any]:
    """
    Get system health and activity metrics.

    Returns:
        Dictionary containing system health statistics
    """
    now = timezone.now()
    last_24h = now - timedelta(hours=24)
    last_7d = now - timedelta(days=7)

    return {
        "total_parks": Park.objects.count(),
        "operating_parks": Park.objects.filter(status="OPERATING").count(),
        "total_rides": Ride.objects.count(),
        "page_views_24h": PageView.objects.filter(timestamp__gte=last_24h).count(),
        "page_views_7d": PageView.objects.filter(timestamp__gte=last_7d).count(),
        "data_freshness": {
            "latest_park_update": (
                Park.objects.order_by("-updated_at").first().updated_at
                if Park.objects.exists()
                else None
            ),
            "latest_ride_update": (
                Ride.objects.order_by("-updated_at").first().updated_at
                if Ride.objects.exists()
                else None
            ),
        },
    }
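Note that `locations_near_point` chains `.distance(point)` on the queryset; that legacy `GeoQuerySet.distance()` method was removed in Django 2.0, so on the Django 5.x baseline used here the same "parks near a point" query would normally be expressed with a `Distance` annotation. A minimal sketch under that assumption, using the helper name `parks_near` purely for illustration:

```python
# Hedged sketch - not part of the commit.
from django.contrib.gis.db.models.functions import Distance as DistanceFunc
from django.contrib.gis.geos import Point
from django.contrib.gis.measure import D

from apps.parks.models import Park


def parks_near(point: Point, distance_km: float = 50, limit: int = 20):
    return (
        Park.objects.filter(
            location__coordinates__distance_lte=(point, D(km=distance_km))
        )
        .annotate(distance=DistanceFunc("location__coordinates", point))
        .select_related("operator")
        .order_by("distance")[:limit]
    )
```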
27
backend/apps/core/services/__init__.py
Normal file
@@ -0,0 +1,27 @@
"""
Core services for ThrillWiki unified map functionality.
"""

from .map_service import UnifiedMapService
from .clustering_service import ClusteringService
from .map_cache_service import MapCacheService
from .data_structures import (
    UnifiedLocation,
    LocationType,
    GeoBounds,
    MapFilters,
    MapResponse,
    ClusterData,
)

__all__ = [
    "UnifiedMapService",
    "ClusteringService",
    "MapCacheService",
    "UnifiedLocation",
    "LocationType",
    "GeoBounds",
    "MapFilters",
    "MapResponse",
    "ClusterData",
]
365
backend/apps/core/services/clustering_service.py
Normal file
@@ -0,0 +1,365 @@
"""
Clustering service for map locations to improve performance and user experience.
"""

import math
from typing import List, Tuple, Dict, Any, Optional
from dataclasses import dataclass
from collections import defaultdict

from .data_structures import (
    UnifiedLocation,
    ClusterData,
    GeoBounds,
    LocationType,
)


@dataclass
class ClusterPoint:
    """Internal representation of a point for clustering."""

    location: UnifiedLocation
    x: float  # Projected x coordinate
    y: float  # Projected y coordinate


class ClusteringService:
    """
    Handles location clustering for map display using a simple grid-based approach
    with zoom-level dependent clustering radius.
    """

    # Clustering configuration
    DEFAULT_RADIUS = 40  # pixels
    MIN_POINTS_TO_CLUSTER = 2
    MAX_ZOOM_FOR_CLUSTERING = 15
    MIN_ZOOM_FOR_CLUSTERING = 3

    # Zoom level configurations
    ZOOM_CONFIGS = {
        3: {"radius": 80, "min_points": 5},  # World level
        4: {"radius": 70, "min_points": 4},  # Continent level
        5: {"radius": 60, "min_points": 3},  # Country level
        6: {"radius": 50, "min_points": 3},  # Large region level
        7: {"radius": 45, "min_points": 2},  # Region level
        8: {"radius": 40, "min_points": 2},  # State level
        9: {"radius": 35, "min_points": 2},  # Metro area level
        10: {"radius": 30, "min_points": 2},  # City level
        11: {"radius": 25, "min_points": 2},  # District level
        12: {"radius": 20, "min_points": 2},  # Neighborhood level
        13: {"radius": 15, "min_points": 2},  # Block level
        14: {"radius": 10, "min_points": 2},  # Street level
        15: {"radius": 5, "min_points": 2},  # Building level
    }

    def __init__(self):
        self.cluster_id_counter = 0

    def should_cluster(self, zoom_level: int, point_count: int) -> bool:
        """Determine if clustering should be applied based on zoom level and point count."""
        if zoom_level > self.MAX_ZOOM_FOR_CLUSTERING:
            return False
        if zoom_level < self.MIN_ZOOM_FOR_CLUSTERING:
            return True

        config = self.ZOOM_CONFIGS.get(
            zoom_level, {"min_points": self.MIN_POINTS_TO_CLUSTER}
        )
        return point_count >= config["min_points"]

    def cluster_locations(
        self,
        locations: List[UnifiedLocation],
        zoom_level: int,
        bounds: Optional[GeoBounds] = None,
    ) -> Tuple[List[UnifiedLocation], List[ClusterData]]:
        """
        Cluster locations based on zoom level and density.
        Returns (unclustered_locations, clusters).
        """
        if not locations or not self.should_cluster(zoom_level, len(locations)):
            return locations, []

        # Convert locations to projected coordinates for clustering
        cluster_points = self._project_locations(locations, bounds)

        # Get clustering configuration for zoom level
        config = self.ZOOM_CONFIGS.get(
            zoom_level,
            {
                "radius": self.DEFAULT_RADIUS,
                "min_points": self.MIN_POINTS_TO_CLUSTER,
            },
        )

        # Perform clustering
        clustered_groups = self._cluster_points(
            cluster_points, config["radius"], config["min_points"]
        )

        # Separate individual locations from clusters
        unclustered_locations = []
        clusters = []

        for group in clustered_groups:
            if len(group) < config["min_points"]:
                # Add individual locations
                unclustered_locations.extend([cp.location for cp in group])
            else:
                # Create cluster
                cluster = self._create_cluster(group)
                clusters.append(cluster)

        return unclustered_locations, clusters

    def _project_locations(
        self,
        locations: List[UnifiedLocation],
        bounds: Optional[GeoBounds] = None,
    ) -> List[ClusterPoint]:
        """Convert lat/lng coordinates to projected x/y for clustering calculations."""
        cluster_points = []

        # Use bounds or calculate from locations
        if not bounds:
            lats = [loc.latitude for loc in locations]
            lngs = [loc.longitude for loc in locations]
            bounds = GeoBounds(
                north=max(lats),
                south=min(lats),
                east=max(lngs),
                west=min(lngs),
            )

        # Simple equirectangular projection (good enough for clustering)
        center_lat = (bounds.north + bounds.south) / 2
        lat_scale = 111320  # meters per degree latitude
        lng_scale = 111320 * math.cos(
            math.radians(center_lat)
        )  # meters per degree longitude

        for location in locations:
            # Convert to meters relative to bounds center
            x = (location.longitude - (bounds.west + bounds.east) / 2) * lng_scale
            y = (location.latitude - (bounds.north + bounds.south) / 2) * lat_scale

            cluster_points.append(ClusterPoint(location=location, x=x, y=y))

        return cluster_points

    def _cluster_points(
        self, points: List[ClusterPoint], radius_pixels: int, min_points: int
    ) -> List[List[ClusterPoint]]:
        """
        Cluster points using a simple distance-based approach.
        Radius is in pixels, converted to meters based on zoom level.
        """
        # Convert pixel radius to meters (rough approximation)
        # At zoom level 10, 1 pixel ≈ 150 meters
        radius_meters = radius_pixels * 150

        clustered = [False] * len(points)
        clusters = []

        for i, point in enumerate(points):
            if clustered[i]:
                continue

            # Find all points within radius
            cluster_group = [point]
            clustered[i] = True

            for j, other_point in enumerate(points):
                if i == j or clustered[j]:
                    continue

                distance = self._calculate_distance(point, other_point)
                if distance <= radius_meters:
                    cluster_group.append(other_point)
                    clustered[j] = True

            clusters.append(cluster_group)

        return clusters

    def _calculate_distance(self, point1: ClusterPoint, point2: ClusterPoint) -> float:
        """Calculate Euclidean distance between two projected points in meters."""
        dx = point1.x - point2.x
        dy = point1.y - point2.y
        return math.sqrt(dx * dx + dy * dy)

    def _create_cluster(self, cluster_points: List[ClusterPoint]) -> ClusterData:
        """Create a ClusterData object from a group of points."""
        locations = [cp.location for cp in cluster_points]

        # Calculate cluster center (average position)
        avg_lat = sum(loc.latitude for loc in locations) / len(locations)
        avg_lng = sum(loc.longitude for loc in locations) / len(locations)

        # Calculate cluster bounds
        lats = [loc.latitude for loc in locations]
        lngs = [loc.longitude for loc in locations]
        cluster_bounds = GeoBounds(
            north=max(lats), south=min(lats), east=max(lngs), west=min(lngs)
        )

        # Collect location types in cluster
        types = set(loc.type for loc in locations)

        # Select representative location (highest weight)
        representative = self._select_representative_location(locations)

        # Generate cluster ID
        self.cluster_id_counter += 1
        cluster_id = f"cluster_{self.cluster_id_counter}"

        return ClusterData(
            id=cluster_id,
            coordinates=(avg_lat, avg_lng),
            count=len(locations),
            types=types,
            bounds=cluster_bounds,
            representative_location=representative,
        )

    def _select_representative_location(
        self, locations: List[UnifiedLocation]
    ) -> Optional[UnifiedLocation]:
        """Select the most representative location for a cluster."""
        if not locations:
            return None

        # Prioritize by: 1) Parks over rides/companies, 2) Higher weight, 3)
        # Better rating
        parks = [loc for loc in locations if loc.type == LocationType.PARK]
        if parks:
            return max(
                parks,
                key=lambda x: (
                    x.cluster_weight,
                    x.metadata.get("rating", 0) or 0,
                ),
            )

        rides = [loc for loc in locations if loc.type == LocationType.RIDE]
        if rides:
            return max(
                rides,
                key=lambda x: (
                    x.cluster_weight,
                    x.metadata.get("rating", 0) or 0,
                ),
            )

        companies = [loc for loc in locations if loc.type == LocationType.COMPANY]
        if companies:
            return max(companies, key=lambda x: x.cluster_weight)

        # Fall back to highest weight location
        return max(locations, key=lambda x: x.cluster_weight)

    def get_cluster_breakdown(self, clusters: List[ClusterData]) -> Dict[str, Any]:
        """Get statistics about clustering results."""
        if not clusters:
            return {
                "total_clusters": 0,
                "total_points_clustered": 0,
                "average_cluster_size": 0,
                "type_distribution": {},
                "category_distribution": {},
            }

        total_points = sum(cluster.count for cluster in clusters)
        type_counts = defaultdict(int)
        category_counts = defaultdict(int)

        for cluster in clusters:
            for location_type in cluster.types:
                type_counts[location_type.value] += cluster.count

            if cluster.representative_location:
                category_counts[cluster.representative_location.cluster_category] += 1

        return {
            "total_clusters": len(clusters),
            "total_points_clustered": total_points,
            "average_cluster_size": total_points / len(clusters),
            "largest_cluster_size": max(cluster.count for cluster in clusters),
            "smallest_cluster_size": min(cluster.count for cluster in clusters),
            "type_distribution": dict(type_counts),
            "category_distribution": dict(category_counts),
        }

    def expand_cluster(
        self, cluster: ClusterData, zoom_level: int
    ) -> List[UnifiedLocation]:
        """
        Expand a cluster to show individual locations (for drill-down functionality).
        This would typically require re-querying the database with the cluster bounds.
        """
        # This is a placeholder - in practice, this would re-query the database
        # with the cluster bounds and higher detail level
        return []


class SmartClusteringRules:
    """
    Advanced clustering rules that consider location types and importance.
    """

    @staticmethod
    def should_cluster_together(loc1: UnifiedLocation, loc2: UnifiedLocation) -> bool:
        """Determine if two locations should be clustered together."""

        # Same park rides should cluster together more readily
        if loc1.type == LocationType.RIDE and loc2.type == LocationType.RIDE:
            park1_id = loc1.metadata.get("park_id")
            park2_id = loc2.metadata.get("park_id")
            if park1_id and park2_id and park1_id == park2_id:
                return True

        # Major parks should resist clustering unless very close
        if (
            loc1.cluster_category == "major_park"
            or loc2.cluster_category == "major_park"
        ):
            return False

        # Similar types cluster more readily
        if loc1.type == loc2.type:
            return True

        # Different types can cluster but with higher threshold
        return False

    @staticmethod
    def calculate_cluster_priority(
        locations: List[UnifiedLocation],
    ) -> UnifiedLocation:
        """Select the representative location for a cluster based on priority rules."""
        # Prioritize by: 1) Parks over rides, 2) Higher weight, 3) Better
        # rating
        parks = [loc for loc in locations if loc.type == LocationType.PARK]
        if parks:
            return max(
                parks,
                key=lambda x: (
                    x.cluster_weight,
                    x.metadata.get("rating", 0) or 0,
                    x.metadata.get("ride_count", 0) or 0,
                ),
            )

        rides = [loc for loc in locations if loc.type == LocationType.RIDE]
        if rides:
            return max(
                rides,
                key=lambda x: (
                    x.cluster_weight,
                    x.metadata.get("rating", 0) or 0,
                ),
            )

        # Fall back to highest weight
        return max(locations, key=lambda x: x.cluster_weight)
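The rough shape of a call site for this service, assuming a `locations` list of `UnifiedLocation` objects built by the adapters elsewhere in this commit:

```python
# Hedged sketch - not part of the commit.
from apps.core.services import ClusteringService

service = ClusteringService()
singles, clusters = service.cluster_locations(locations, zoom_level=8)

payload = {
    "locations": [loc.to_dict() for loc in singles],
    "clusters": [cluster.to_dict() for cluster in clusters],
    "stats": service.get_cluster_breakdown(clusters),
}
```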
253
backend/apps/core/services/data_structures.py
Normal file
@@ -0,0 +1,253 @@
"""
Data structures for the unified map service.
"""

from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Optional, Set, Tuple, Any
from django.contrib.gis.geos import Polygon


class LocationType(Enum):
    """Types of locations supported by the map service."""

    PARK = "park"
    RIDE = "ride"
    COMPANY = "company"
    GENERIC = "generic"


@dataclass
class GeoBounds:
    """Geographic boundary box for spatial queries."""

    north: float
    south: float
    east: float
    west: float

    def __post_init__(self):
        """Validate bounds after initialization."""
        if self.north < self.south:
            raise ValueError("North bound must be greater than south bound")
        if self.east < self.west:
            raise ValueError("East bound must be greater than west bound")
        if not (-90 <= self.south <= 90 and -90 <= self.north <= 90):
            raise ValueError("Latitude bounds must be between -90 and 90")
        if not (-180 <= self.west <= 180 and -180 <= self.east <= 180):
            raise ValueError("Longitude bounds must be between -180 and 180")

    def to_polygon(self) -> Polygon:
        """Convert bounds to PostGIS Polygon for database queries."""
        return Polygon.from_bbox((self.west, self.south, self.east, self.north))

    def expand(self, factor: float = 1.1) -> "GeoBounds":
        """Expand bounds by factor for buffer queries."""
        center_lat = (self.north + self.south) / 2
        center_lng = (self.east + self.west) / 2

        lat_range = (self.north - self.south) * factor / 2
        lng_range = (self.east - self.west) * factor / 2

        return GeoBounds(
            north=min(90, center_lat + lat_range),
            south=max(-90, center_lat - lat_range),
            east=min(180, center_lng + lng_range),
            west=max(-180, center_lng - lng_range),
        )

    def contains_point(self, lat: float, lng: float) -> bool:
        """Check if a point is within these bounds."""
        return self.south <= lat <= self.north and self.west <= lng <= self.east

    def to_dict(self) -> Dict[str, float]:
        """Convert to dictionary for JSON serialization."""
        return {
            "north": self.north,
            "south": self.south,
            "east": self.east,
            "west": self.west,
        }


@dataclass
class MapFilters:
    """Filtering options for map queries."""

    location_types: Optional[Set[LocationType]] = None
    park_status: Optional[Set[str]] = None  # OPERATING, CLOSED_TEMP, etc.
    ride_types: Optional[Set[str]] = None
    company_roles: Optional[Set[str]] = None  # OPERATOR, MANUFACTURER, etc.
    search_query: Optional[str] = None
    min_rating: Optional[float] = None
    has_coordinates: bool = True
    country: Optional[str] = None
    state: Optional[str] = None
    city: Optional[str] = None

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for caching and serialization."""
        return {
            "location_types": (
                [t.value for t in self.location_types] if self.location_types else None
            ),
            "park_status": (list(self.park_status) if self.park_status else None),
            "ride_types": list(self.ride_types) if self.ride_types else None,
            "company_roles": (list(self.company_roles) if self.company_roles else None),
            "search_query": self.search_query,
            "min_rating": self.min_rating,
            "has_coordinates": self.has_coordinates,
            "country": self.country,
            "state": self.state,
            "city": self.city,
        }


@dataclass
class UnifiedLocation:
    """Unified location interface for all location types."""

    id: str  # Composite: f"{type}_{id}"
    type: LocationType
    name: str
    coordinates: Tuple[float, float]  # (lat, lng)
    address: Optional[str] = None
    metadata: Dict[str, Any] = field(default_factory=dict)
    type_data: Dict[str, Any] = field(default_factory=dict)
    cluster_weight: int = 1
    cluster_category: str = "default"

    @property
    def latitude(self) -> float:
        """Get latitude from coordinates."""
        return self.coordinates[0]

    @property
    def longitude(self) -> float:
        """Get longitude from coordinates."""
        return self.coordinates[1]

    def to_geojson_feature(self) -> Dict[str, Any]:
        """Convert to GeoJSON feature for mapping libraries."""
        return {
            "type": "Feature",
            "properties": {
                "id": self.id,
                "type": self.type.value,
                "name": self.name,
                "address": self.address,
                "metadata": self.metadata,
                "type_data": self.type_data,
                "cluster_weight": self.cluster_weight,
                "cluster_category": self.cluster_category,
            },
            "geometry": {
                "type": "Point",
                # GeoJSON uses lng, lat
                "coordinates": [self.longitude, self.latitude],
            },
        }

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for JSON responses."""
        return {
            "id": self.id,
            "type": self.type.value,
            "name": self.name,
            "coordinates": list(self.coordinates),
            "address": self.address,
            "metadata": self.metadata,
            "type_data": self.type_data,
            "cluster_weight": self.cluster_weight,
            "cluster_category": self.cluster_category,
        }


@dataclass
class ClusterData:
    """Represents a cluster of locations for map display."""

    id: str
    coordinates: Tuple[float, float]  # (lat, lng)
    count: int
    types: Set[LocationType]
    bounds: GeoBounds
    representative_location: Optional[UnifiedLocation] = None

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for JSON responses."""
        return {
            "id": self.id,
            "coordinates": list(self.coordinates),
            "count": self.count,
            "types": [t.value for t in self.types],
            "bounds": self.bounds.to_dict(),
            "representative": (
                self.representative_location.to_dict()
                if self.representative_location
                else None
            ),
        }


@dataclass
class MapResponse:
    """Response structure for map API calls."""

    locations: List[UnifiedLocation] = field(default_factory=list)
    clusters: List[ClusterData] = field(default_factory=list)
    bounds: Optional[GeoBounds] = None
    total_count: int = 0
    filtered_count: int = 0
    zoom_level: Optional[int] = None
    clustered: bool = False
    cache_hit: bool = False
    query_time_ms: Optional[int] = None
    filters_applied: List[str] = field(default_factory=list)

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for JSON responses."""
        return {
            "status": "success",
            "data": {
                "locations": [loc.to_dict() for loc in self.locations],
                "clusters": [cluster.to_dict() for cluster in self.clusters],
                "bounds": self.bounds.to_dict() if self.bounds else None,
                "total_count": self.total_count,
                "filtered_count": self.filtered_count,
                "zoom_level": self.zoom_level,
                "clustered": self.clustered,
            },
            "meta": {
                "cache_hit": self.cache_hit,
                "query_time_ms": self.query_time_ms,
                "filters_applied": self.filters_applied,
                "pagination": {
                    "has_more": False,  # TODO: Implement pagination
                    "total_pages": 1,
                },
            },
        }


@dataclass
class QueryPerformanceMetrics:
    """Performance metrics for query optimization."""

    query_time_ms: int
    db_query_count: int
    cache_hit: bool
    result_count: int
    bounds_used: bool
    clustering_used: bool

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for logging."""
        return {
            "query_time_ms": self.query_time_ms,
            "db_query_count": self.db_query_count,
            "cache_hit": self.cache_hit,
            "result_count": self.result_count,
            "bounds_used": self.bounds_used,
            "clustering_used": self.clustering_used,
        }
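A minimal sketch of these data structures in use, building a GeoJSON FeatureCollection for a map client; the coordinates and names are made up for illustration:

```python
# Hedged sketch - not part of the commit.
from apps.core.services import GeoBounds, LocationType, UnifiedLocation

loc = UnifiedLocation(
    id="park_42",
    type=LocationType.PARK,
    name="Example Park",
    coordinates=(35.03, -85.21),  # (lat, lng)
    metadata={"rating": 4.5},
)
viewport = GeoBounds(north=36.0, south=34.0, east=-84.0, west=-86.0)

if viewport.contains_point(*loc.coordinates):
    feature_collection = {
        "type": "FeatureCollection",
        "features": [loc.to_geojson_feature()],  # GeoJSON expects (lng, lat)
    }
```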
320
backend/apps/core/services/enhanced_cache_service.py
Normal file
@@ -0,0 +1,320 @@
"""
Enhanced caching service with multiple cache backends and strategies.
"""

from typing import Optional, Any, Dict, Callable
from django.core.cache import caches
import hashlib
import json
import logging
import time
from functools import wraps

logger = logging.getLogger(__name__)


# Define GeoBounds for type hinting
class GeoBounds:
    def __init__(self, min_lat: float, min_lng: float, max_lat: float, max_lng: float):
        self.min_lat = min_lat
        self.min_lng = min_lng
        self.max_lat = max_lat
        self.max_lng = max_lng


class EnhancedCacheService:
    """Comprehensive caching service with multiple cache backends"""

    def __init__(self):
        self.default_cache = caches["default"]
        try:
            self.api_cache = caches["api"]
        except Exception:
            # Fallback to default cache if api cache not configured
            self.api_cache = self.default_cache

    # L1: Query-level caching
    def cache_queryset(
        self,
        cache_key: str,
        queryset_func: Callable,
        timeout: int = 3600,
        **kwargs,
    ) -> Any:
        """Cache expensive querysets"""
        cached_result = self.default_cache.get(cache_key)
        if cached_result is None:
            start_time = time.time()
            result = queryset_func(**kwargs)
            duration = time.time() - start_time

            # Log cache miss and function execution time
            logger.info(
                f"Cache miss for key '{cache_key}', executed in {duration:.3f}s",
                extra={"cache_key": cache_key, "execution_time": duration},
            )

            self.default_cache.set(cache_key, result, timeout)
            return result

        logger.debug(f"Cache hit for key '{cache_key}'")
        return cached_result

    # L2: API response caching
    def cache_api_response(
        self,
        view_name: str,
        params: Dict,
        response_data: Any,
        timeout: int = 1800,
    ):
        """Cache API responses based on view and parameters"""
        cache_key = self._generate_api_cache_key(view_name, params)
        self.api_cache.set(cache_key, response_data, timeout)
        logger.debug(f"Cached API response for view '{view_name}'")

    def get_cached_api_response(self, view_name: str, params: Dict) -> Optional[Any]:
        """Retrieve cached API response"""
        cache_key = self._generate_api_cache_key(view_name, params)
        result = self.api_cache.get(cache_key)

        if result:
            logger.debug(f"Cache hit for API view '{view_name}'")
        else:
            logger.debug(f"Cache miss for API view '{view_name}'")

        return result

    # L3: Geographic caching (building on existing MapCacheService)
    def cache_geographic_data(
        self,
        bounds: "GeoBounds",
        data: Any,
        zoom_level: int,
        timeout: int = 1800,
    ):
        """Cache geographic data with spatial keys"""
        # Generate spatial cache key based on bounds and zoom level
        cache_key = (
            f"geo:{bounds.min_lat}:{bounds.min_lng}:"
            f"{bounds.max_lat}:{bounds.max_lng}:z{zoom_level}"
        )
        self.default_cache.set(cache_key, data, timeout)
        logger.debug(f"Cached geographic data for bounds {bounds}")

    def get_cached_geographic_data(
        self, bounds: "GeoBounds", zoom_level: int
    ) -> Optional[Any]:
        """Retrieve cached geographic data"""
        cache_key = (
            f"geo:{bounds.min_lat}:{bounds.min_lng}:"
            f"{bounds.max_lat}:{bounds.max_lng}:z{zoom_level}"
        )
        return self.default_cache.get(cache_key)

    # Cache invalidation utilities
    def invalidate_pattern(self, pattern: str):
        """Invalidate cache keys matching a pattern (if backend supports it)"""
        try:
            # For Redis cache backends
            if hasattr(self.default_cache, "delete_pattern"):
                deleted_count = self.default_cache.delete_pattern(pattern)
                logger.info(
                    f"Invalidated {deleted_count} cache keys matching pattern '{pattern}'"
                )
                return deleted_count
            else:
                logger.warning(
                    f"Cache backend does not support pattern deletion for pattern '{pattern}'"
                )
        except Exception as e:
            logger.error(f"Error invalidating cache pattern '{pattern}': {e}")

    def invalidate_model_cache(
        self, model_name: str, instance_id: Optional[int] = None
    ):
        """Invalidate cache keys related to a specific model"""
        if instance_id:
            pattern = f"*{model_name}:{instance_id}*"
        else:
            pattern = f"*{model_name}*"

        self.invalidate_pattern(pattern)

    # Cache warming utilities
    def warm_cache(
        self,
        cache_key: str,
        warm_func: Callable,
        timeout: int = 3600,
        **kwargs,
    ):
        """Proactively warm cache with data"""
        try:
            data = warm_func(**kwargs)
            self.default_cache.set(cache_key, data, timeout)
            logger.info(f"Warmed cache for key '{cache_key}'")
        except Exception as e:
            logger.error(f"Error warming cache for key '{cache_key}': {e}")

    def _generate_api_cache_key(self, view_name: str, params: Dict) -> str:
        """Generate consistent cache keys for API responses"""
        # Sort params to ensure consistent key generation
        params_str = json.dumps(params, sort_keys=True, default=str)
        params_hash = hashlib.md5(params_str.encode()).hexdigest()
        return f"api:{view_name}:{params_hash}"


# Cache decorators
def cache_api_response(timeout=1800, vary_on=None, key_prefix=""):
    """Decorator for caching API responses"""

    def decorator(view_func):
        @wraps(view_func)
        def wrapper(self, request, *args, **kwargs):
            if request.method != "GET":
                return view_func(self, request, *args, **kwargs)

            # Generate cache key based on view, user, and parameters
            cache_key_parts = [
                key_prefix or view_func.__name__,
                (
                    str(request.user.id)
                    if request.user.is_authenticated
                    else "anonymous"
                ),
                str(hash(frozenset(request.GET.items()))),
            ]

            if vary_on:
                for field in vary_on:
                    cache_key_parts.append(str(getattr(request, field, "")))

            cache_key = ":".join(cache_key_parts)

            # Try to get from cache
            cache_service = EnhancedCacheService()
            cached_response = cache_service.api_cache.get(cache_key)
            if cached_response:
                logger.debug(f"Cache hit for API view {view_func.__name__}")
                return cached_response

            # Execute view and cache result
            response = view_func(self, request, *args, **kwargs)
            if hasattr(response, "status_code") and response.status_code == 200:
                cache_service.api_cache.set(cache_key, response, timeout)
                logger.debug(f"Cached API response for view {view_func.__name__}")

            return response

        return wrapper

    return decorator


def cache_queryset_result(cache_key_template: str, timeout: int = 3600):
    """Decorator for caching queryset results"""

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            # Generate cache key from template and arguments
            cache_key = cache_key_template.format(*args, **kwargs)

            cache_service = EnhancedCacheService()
            return cache_service.cache_queryset(
                cache_key, func, timeout, *args, **kwargs
            )

        return wrapper

    return decorator


# Context manager for cache warming
class CacheWarmer:
    """Context manager for batch cache warming operations"""

    def __init__(self):
        self.cache_service = EnhancedCacheService()
        self.warm_operations = []

    def add(
        self,
        cache_key: str,
        warm_func: Callable,
        timeout: int = 3600,
        **kwargs,
    ):
        """Add a cache warming operation to the batch"""
        self.warm_operations.append(
            {
                "cache_key": cache_key,
                "warm_func": warm_func,
                "timeout": timeout,
                "kwargs": kwargs,
            }
        )

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        """Execute all cache warming operations"""
        logger.info(f"Warming {len(self.warm_operations)} cache entries")

        for operation in self.warm_operations:
            try:
                self.cache_service.warm_cache(**operation)
            except Exception as e:
                logger.error(
                    f"Error warming cache for {operation['cache_key']}: {e}"
                )


# Cache statistics and monitoring
class CacheMonitor:
    """Monitor cache performance and statistics"""

    def __init__(self):
        self.cache_service = EnhancedCacheService()

    def get_cache_stats(self) -> Dict[str, Any]:
        """Get cache statistics if available"""
        stats = {}

        try:
            # Redis cache stats
            if hasattr(self.cache_service.default_cache, "_cache"):
                redis_client = self.cache_service.default_cache._cache.get_client()
                info = redis_client.info()
                stats["redis"] = {
                    "used_memory": info.get("used_memory_human"),
                    "connected_clients": info.get("connected_clients"),
                    "total_commands_processed": info.get("total_commands_processed"),
                    "keyspace_hits": info.get("keyspace_hits"),
                    "keyspace_misses": info.get("keyspace_misses"),
                }

                # Calculate hit rate
                hits = info.get("keyspace_hits", 0)
                misses = info.get("keyspace_misses", 0)
                if hits + misses > 0:
                    stats["redis"]["hit_rate"] = hits / (hits + misses) * 100
        except Exception as e:
            logger.error(f"Error getting cache stats: {e}")

        return stats

    def log_cache_performance(self):
        """Log cache performance metrics"""
        stats = self.get_cache_stats()
        if stats:
            logger.info("Cache performance statistics", extra=stats)
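One way the queryset-level cache might be used from a selector; the cache key name and the helper function are illustrative only:

```python
# Hedged sketch - not part of the commit.
from apps.core.services.enhanced_cache_service import EnhancedCacheService
from apps.parks.models import Park

cache_service = EnhancedCacheService()


def operating_parks():
    # Materialize the queryset so the cached value is a plain list.
    return list(Park.objects.filter(status="OPERATING").values("id", "name"))


# First call executes the queryset and stores it; later calls hit the cache.
parks = cache_service.cache_queryset("parks:operating:v1", operating_parks, timeout=900)
```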
479
backend/apps/core/services/location_adapters.py
Normal file
@@ -0,0 +1,479 @@
"""
Location adapters for converting between domain-specific models and UnifiedLocation.
"""

from django.db import models
from typing import List, Optional
from django.db.models import QuerySet
from django.urls import reverse

from .data_structures import (
    UnifiedLocation,
    LocationType,
    GeoBounds,
    MapFilters,
)
from apps.parks.models import ParkLocation, CompanyHeadquarters
from apps.rides.models import RideLocation
from apps.location.models import Location


class BaseLocationAdapter:
    """Base adapter class for location conversions."""

    def to_unified_location(self, location_obj) -> Optional[UnifiedLocation]:
        """Convert model instance to UnifiedLocation."""
        raise NotImplementedError

    def get_queryset(
        self,
        bounds: Optional[GeoBounds] = None,
        filters: Optional[MapFilters] = None,
    ) -> QuerySet:
        """Get optimized queryset for this location type."""
        raise NotImplementedError

    def bulk_convert(self, queryset: QuerySet) -> List[UnifiedLocation]:
        """Convert multiple location objects efficiently."""
        unified_locations = []
        for obj in queryset:
            unified_loc = self.to_unified_location(obj)
            if unified_loc:
                unified_locations.append(unified_loc)
        return unified_locations


class ParkLocationAdapter(BaseLocationAdapter):
    """Converts Park/ParkLocation to UnifiedLocation."""

    def to_unified_location(
        self, park_location: ParkLocation
    ) -> Optional[UnifiedLocation]:
        """Convert ParkLocation to UnifiedLocation."""
        if not park_location.point:
            return None

        park = park_location.park

        return UnifiedLocation(
            id=f"park_{park.id}",
            type=LocationType.PARK,
            name=park.name,
            coordinates=(park_location.latitude, park_location.longitude),
            address=park_location.formatted_address,
            metadata={
                "status": getattr(park, "status", "UNKNOWN"),
                "rating": (
                    float(park.average_rating)
                    if hasattr(park, "average_rating") and park.average_rating
                    else None
                ),
                "ride_count": getattr(park, "ride_count", 0),
                "coaster_count": getattr(park, "coaster_count", 0),
                "operator": (
                    park.operator.name
                    if hasattr(park, "operator") and park.operator
                    else None
                ),
                "city": park_location.city,
                "state": park_location.state,
                "country": park_location.country,
            },
            type_data={
                "slug": park.slug,
                "opening_date": (
                    park.opening_date.isoformat()
                    if hasattr(park, "opening_date") and park.opening_date
                    else None
                ),
                "website": getattr(park, "website", ""),
                "operating_season": getattr(park, "operating_season", ""),
                "highway_exit": park_location.highway_exit,
                "parking_notes": park_location.parking_notes,
                "best_arrival_time": (
                    park_location.best_arrival_time.strftime("%H:%M")
                    if park_location.best_arrival_time
                    else None
                ),
                "seasonal_notes": park_location.seasonal_notes,
                "url": self._get_park_url(park),
            },
            cluster_weight=self._calculate_park_weight(park),
            cluster_category=self._get_park_category(park),
        )

    def get_queryset(
        self,
        bounds: Optional[GeoBounds] = None,
        filters: Optional[MapFilters] = None,
    ) -> QuerySet:
        """Get optimized queryset for park locations."""
        queryset = ParkLocation.objects.select_related("park", "park__operator").filter(
|
||||||
|
point__isnull=False
|
||||||
|
)
|
||||||
|
|
||||||
|
# Spatial filtering
|
||||||
|
if bounds:
|
||||||
|
queryset = queryset.filter(point__within=bounds.to_polygon())
|
||||||
|
|
||||||
|
# Park-specific filters
|
||||||
|
if filters:
|
||||||
|
if filters.park_status:
|
||||||
|
queryset = queryset.filter(park__status__in=filters.park_status)
|
||||||
|
if filters.search_query:
|
||||||
|
queryset = queryset.filter(park__name__icontains=filters.search_query)
|
||||||
|
if filters.country:
|
||||||
|
queryset = queryset.filter(country=filters.country)
|
||||||
|
if filters.state:
|
||||||
|
queryset = queryset.filter(state=filters.state)
|
||||||
|
if filters.city:
|
||||||
|
queryset = queryset.filter(city=filters.city)
|
||||||
|
|
||||||
|
return queryset.order_by("park__name")
|
||||||
|
|
||||||
|
def _calculate_park_weight(self, park) -> int:
|
||||||
|
"""Calculate clustering weight based on park importance."""
|
||||||
|
weight = 1
|
||||||
|
if hasattr(park, "ride_count") and park.ride_count and park.ride_count > 20:
|
||||||
|
weight += 2
|
||||||
|
if (
|
||||||
|
hasattr(park, "coaster_count")
|
||||||
|
and park.coaster_count
|
||||||
|
and park.coaster_count > 5
|
||||||
|
):
|
||||||
|
weight += 1
|
||||||
|
if (
|
||||||
|
hasattr(park, "average_rating")
|
||||||
|
and park.average_rating
|
||||||
|
and park.average_rating > 4.0
|
||||||
|
):
|
||||||
|
weight += 1
|
||||||
|
return min(weight, 5) # Cap at 5
|
||||||
|
|
||||||
|
def _get_park_category(self, park) -> str:
|
||||||
|
"""Determine park category for clustering."""
|
||||||
|
coaster_count = getattr(park, "coaster_count", 0) or 0
|
||||||
|
ride_count = getattr(park, "ride_count", 0) or 0
|
||||||
|
|
||||||
|
if coaster_count >= 10:
|
||||||
|
return "major_park"
|
||||||
|
elif ride_count >= 15:
|
||||||
|
return "theme_park"
|
||||||
|
else:
|
||||||
|
return "small_park"
|
||||||
|
|
||||||
|
    def _get_park_url(self, park) -> str:
        """Get URL for park detail page."""
        try:
            return reverse("parks:detail", kwargs={"slug": park.slug})
        except Exception:
            # Fall back to a conventional path if the named URL cannot be reversed
            return f"/parks/{park.slug}/"
|
||||||
|
|
||||||
|
|
||||||
|
class RideLocationAdapter(BaseLocationAdapter):
|
||||||
|
"""Converts Ride/RideLocation to UnifiedLocation."""
|
||||||
|
|
||||||
|
def to_unified_location(
|
||||||
|
self, ride_location: RideLocation
|
||||||
|
) -> Optional[UnifiedLocation]:
|
||||||
|
"""Convert RideLocation to UnifiedLocation."""
|
||||||
|
if not ride_location.point:
|
||||||
|
return None
|
||||||
|
|
||||||
|
ride = ride_location.ride
|
||||||
|
|
||||||
|
return UnifiedLocation(
|
||||||
|
id=f"ride_{ride.id}",
|
||||||
|
type=LocationType.RIDE,
|
||||||
|
name=ride.name,
|
||||||
|
coordinates=(ride_location.latitude, ride_location.longitude),
|
||||||
|
address=(
|
||||||
|
f"{ride_location.park_area}, {ride.park.name}"
|
||||||
|
if ride_location.park_area
|
||||||
|
else ride.park.name
|
||||||
|
),
|
||||||
|
metadata={
|
||||||
|
"park_id": ride.park.id,
|
||||||
|
"park_name": ride.park.name,
|
||||||
|
"park_area": ride_location.park_area,
|
||||||
|
"ride_type": getattr(ride, "ride_type", "Unknown"),
|
||||||
|
"status": getattr(ride, "status", "UNKNOWN"),
|
||||||
|
"rating": (
|
||||||
|
float(ride.average_rating)
|
||||||
|
if hasattr(ride, "average_rating") and ride.average_rating
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
"manufacturer": (
|
||||||
|
getattr(ride, "manufacturer", {}).get("name")
|
||||||
|
if hasattr(ride, "manufacturer")
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
},
|
||||||
|
type_data={
|
||||||
|
"slug": ride.slug,
|
||||||
|
"opening_date": (
|
||||||
|
ride.opening_date.isoformat()
|
||||||
|
if hasattr(ride, "opening_date") and ride.opening_date
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
"height_requirement": getattr(ride, "height_requirement", ""),
|
||||||
|
"duration_minutes": getattr(ride, "duration_minutes", None),
|
||||||
|
"max_speed_mph": getattr(ride, "max_speed_mph", None),
|
||||||
|
"entrance_notes": ride_location.entrance_notes,
|
||||||
|
"accessibility_notes": ride_location.accessibility_notes,
|
||||||
|
"url": self._get_ride_url(ride),
|
||||||
|
},
|
||||||
|
cluster_weight=self._calculate_ride_weight(ride),
|
||||||
|
cluster_category=self._get_ride_category(ride),
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_queryset(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> QuerySet:
|
||||||
|
"""Get optimized queryset for ride locations."""
|
||||||
|
queryset = RideLocation.objects.select_related(
|
||||||
|
"ride", "ride__park", "ride__park__operator"
|
||||||
|
).filter(point__isnull=False)
|
||||||
|
|
||||||
|
# Spatial filtering
|
||||||
|
if bounds:
|
||||||
|
queryset = queryset.filter(point__within=bounds.to_polygon())
|
||||||
|
|
||||||
|
# Ride-specific filters
|
||||||
|
if filters:
|
||||||
|
if filters.ride_types:
|
||||||
|
queryset = queryset.filter(ride__ride_type__in=filters.ride_types)
|
||||||
|
if filters.search_query:
|
||||||
|
queryset = queryset.filter(ride__name__icontains=filters.search_query)
|
||||||
|
|
||||||
|
return queryset.order_by("ride__name")
|
||||||
|
|
||||||
|
def _calculate_ride_weight(self, ride) -> int:
|
||||||
|
"""Calculate clustering weight based on ride importance."""
|
||||||
|
weight = 1
|
||||||
|
ride_type = getattr(ride, "ride_type", "").lower()
|
||||||
|
if "coaster" in ride_type or "roller" in ride_type:
|
||||||
|
weight += 1
|
||||||
|
if (
|
||||||
|
hasattr(ride, "average_rating")
|
||||||
|
and ride.average_rating
|
||||||
|
and ride.average_rating > 4.0
|
||||||
|
):
|
||||||
|
weight += 1
|
||||||
|
return min(weight, 3) # Cap at 3 for rides
|
||||||
|
|
||||||
|
def _get_ride_category(self, ride) -> str:
|
||||||
|
"""Determine ride category for clustering."""
|
||||||
|
ride_type = getattr(ride, "ride_type", "").lower()
|
||||||
|
if "coaster" in ride_type or "roller" in ride_type:
|
||||||
|
return "coaster"
|
||||||
|
elif "water" in ride_type or "splash" in ride_type:
|
||||||
|
return "water_ride"
|
||||||
|
else:
|
||||||
|
return "other_ride"
|
||||||
|
|
||||||
|
    def _get_ride_url(self, ride) -> str:
        """Get URL for ride detail page."""
        try:
            return reverse("rides:detail", kwargs={"slug": ride.slug})
        except Exception:
            # Fall back to a conventional path if the named URL cannot be reversed
            return f"/rides/{ride.slug}/"
|
||||||
|
|
||||||
|
|
||||||
|
class CompanyLocationAdapter(BaseLocationAdapter):
|
||||||
|
"""Converts Company/CompanyHeadquarters to UnifiedLocation."""
|
||||||
|
|
||||||
|
def to_unified_location(
|
||||||
|
self, company_headquarters: CompanyHeadquarters
|
||||||
|
) -> Optional[UnifiedLocation]:
|
||||||
|
"""Convert CompanyHeadquarters to UnifiedLocation."""
|
||||||
|
# Note: CompanyHeadquarters doesn't have coordinates, so we need to geocode
|
||||||
|
# For now, we'll skip companies without coordinates
|
||||||
|
# TODO: Implement geocoding service integration
|
||||||
|
return None
|
||||||
|
|
||||||
|
def get_queryset(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> QuerySet:
|
||||||
|
"""Get optimized queryset for company locations."""
|
||||||
|
queryset = CompanyHeadquarters.objects.select_related("company")
|
||||||
|
|
||||||
|
# Company-specific filters
|
||||||
|
if filters:
|
||||||
|
if filters.company_roles:
|
||||||
|
queryset = queryset.filter(
|
||||||
|
company__roles__overlap=filters.company_roles
|
||||||
|
)
|
||||||
|
if filters.search_query:
|
||||||
|
queryset = queryset.filter(
|
||||||
|
company__name__icontains=filters.search_query
|
||||||
|
)
|
||||||
|
if filters.country:
|
||||||
|
queryset = queryset.filter(country=filters.country)
|
||||||
|
if filters.city:
|
||||||
|
queryset = queryset.filter(city=filters.city)
|
||||||
|
|
||||||
|
return queryset.order_by("company__name")
|
||||||
|
|
||||||
|
|
||||||
|
class GenericLocationAdapter(BaseLocationAdapter):
|
||||||
|
"""Converts generic Location model to UnifiedLocation."""
|
||||||
|
|
||||||
|
def to_unified_location(self, location: Location) -> Optional[UnifiedLocation]:
|
||||||
|
"""Convert generic Location to UnifiedLocation."""
|
||||||
|
if not location.point and not (location.latitude and location.longitude):
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Use point coordinates if available, fall back to lat/lng fields
|
||||||
|
if location.point:
|
||||||
|
coordinates = (location.point.y, location.point.x)
|
||||||
|
else:
|
||||||
|
coordinates = (float(location.latitude), float(location.longitude))
|
||||||
|
|
||||||
|
return UnifiedLocation(
|
||||||
|
id=f"generic_{location.id}",
|
||||||
|
type=LocationType.GENERIC,
|
||||||
|
name=location.name,
|
||||||
|
coordinates=coordinates,
|
||||||
|
address=location.get_formatted_address(),
|
||||||
|
metadata={
|
||||||
|
"location_type": location.location_type,
|
||||||
|
"content_type": (
|
||||||
|
location.content_type.model if location.content_type else None
|
||||||
|
),
|
||||||
|
"object_id": location.object_id,
|
||||||
|
"city": location.city,
|
||||||
|
"state": location.state,
|
||||||
|
"country": location.country,
|
||||||
|
},
|
||||||
|
type_data={
|
||||||
|
"created_at": (
|
||||||
|
location.created_at.isoformat() if location.created_at else None
|
||||||
|
),
|
||||||
|
"updated_at": (
|
||||||
|
location.updated_at.isoformat() if location.updated_at else None
|
||||||
|
),
|
||||||
|
},
|
||||||
|
cluster_weight=1,
|
||||||
|
cluster_category="generic",
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_queryset(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> QuerySet:
|
||||||
|
"""Get optimized queryset for generic locations."""
|
||||||
|
queryset = Location.objects.select_related("content_type").filter(
|
||||||
|
models.Q(point__isnull=False)
|
||||||
|
| models.Q(latitude__isnull=False, longitude__isnull=False)
|
||||||
|
)
|
||||||
|
|
||||||
|
# Spatial filtering
|
||||||
|
if bounds:
|
||||||
|
queryset = queryset.filter(
|
||||||
|
models.Q(point__within=bounds.to_polygon())
|
||||||
|
| models.Q(
|
||||||
|
latitude__gte=bounds.south,
|
||||||
|
latitude__lte=bounds.north,
|
||||||
|
longitude__gte=bounds.west,
|
||||||
|
longitude__lte=bounds.east,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
# Generic filters
|
||||||
|
if filters:
|
||||||
|
if filters.search_query:
|
||||||
|
queryset = queryset.filter(name__icontains=filters.search_query)
|
||||||
|
if filters.country:
|
||||||
|
queryset = queryset.filter(country=filters.country)
|
||||||
|
if filters.city:
|
||||||
|
queryset = queryset.filter(city=filters.city)
|
||||||
|
|
||||||
|
return queryset.order_by("name")
|
||||||
|
|
||||||
|
|
||||||
|
class LocationAbstractionLayer:
|
||||||
|
"""
|
||||||
|
Abstraction layer handling different location model types.
|
||||||
|
Implements the adapter pattern to provide unified access to all location types.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.adapters = {
|
||||||
|
LocationType.PARK: ParkLocationAdapter(),
|
||||||
|
LocationType.RIDE: RideLocationAdapter(),
|
||||||
|
LocationType.COMPANY: CompanyLocationAdapter(),
|
||||||
|
LocationType.GENERIC: GenericLocationAdapter(),
|
||||||
|
}
|
||||||
|
|
||||||
|
def get_all_locations(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""Get locations from all sources within bounds."""
|
||||||
|
all_locations = []
|
||||||
|
|
||||||
|
# Determine which location types to include
|
||||||
|
location_types = (
|
||||||
|
filters.location_types
|
||||||
|
if filters and filters.location_types
|
||||||
|
else set(LocationType)
|
||||||
|
)
|
||||||
|
|
||||||
|
for location_type in location_types:
|
||||||
|
adapter = self.adapters[location_type]
|
||||||
|
queryset = adapter.get_queryset(bounds, filters)
|
||||||
|
locations = adapter.bulk_convert(queryset)
|
||||||
|
all_locations.extend(locations)
|
||||||
|
|
||||||
|
return all_locations
|
||||||
|
|
||||||
|
def get_locations_by_type(
|
||||||
|
self,
|
||||||
|
location_type: LocationType,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""Get locations of specific type."""
|
||||||
|
adapter = self.adapters[location_type]
|
||||||
|
queryset = adapter.get_queryset(bounds, filters)
|
||||||
|
return adapter.bulk_convert(queryset)
|
||||||
|
|
||||||
|
def get_location_by_id(
|
||||||
|
self, location_type: LocationType, location_id: int
|
||||||
|
) -> Optional[UnifiedLocation]:
|
||||||
|
"""Get single location with full details."""
|
||||||
|
adapter = self.adapters[location_type]
|
||||||
|
|
||||||
|
try:
|
||||||
|
if location_type == LocationType.PARK:
|
||||||
|
obj = ParkLocation.objects.select_related("park", "park__operator").get(
|
||||||
|
park_id=location_id
|
||||||
|
)
|
||||||
|
elif location_type == LocationType.RIDE:
|
||||||
|
obj = RideLocation.objects.select_related("ride", "ride__park").get(
|
||||||
|
ride_id=location_id
|
||||||
|
)
|
||||||
|
elif location_type == LocationType.COMPANY:
|
||||||
|
obj = CompanyHeadquarters.objects.select_related("company").get(
|
||||||
|
company_id=location_id
|
||||||
|
)
|
||||||
|
elif location_type == LocationType.GENERIC:
|
||||||
|
obj = Location.objects.select_related("content_type").get(
|
||||||
|
id=location_id
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
return None
|
||||||
|
|
||||||
|
return adapter.to_unified_location(obj)
|
||||||
|
except Exception:
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
# Import models after defining adapters to avoid circular imports
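A usage sketch for the abstraction layer above, assuming `GeoBounds` and `MapFilters` (defined in `data_structures.py`, not shown here) are dataclass-style constructors accepting the field names used in this module; the bounding box values are arbitrary.

```python
# Hypothetical caller, e.g. a view or API endpoint.
from apps.core.services.location_adapters import LocationAbstractionLayer
from apps.core.services.data_structures import GeoBounds, MapFilters, LocationType

layer = LocationAbstractionLayer()

# Roughly the continental US; field names assumed from GeoBounds usage above.
bounds = GeoBounds(north=49.0, south=25.0, east=-66.0, west=-125.0)
filters = MapFilters(search_query="cedar", park_status=["OPERATING"])

# Parks only, already converted to UnifiedLocation objects.
parks = layer.get_locations_by_type(LocationType.PARK, bounds=bounds, filters=filters)

# Everything the adapters know about, across all registered location types.
everything = layer.get_all_locations(bounds=bounds)

for loc in parks:
    print(loc.id, loc.name, loc.coordinates, loc.cluster_category)
```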
backend/apps/core/services/location_search.py (new file, 465 lines)
@@ -0,0 +1,465 @@
"""
|
||||||
|
Location-aware search service for ThrillWiki.
|
||||||
|
|
||||||
|
Integrates PostGIS location data with existing search functionality
|
||||||
|
to provide proximity-based search, location filtering, and geographic
|
||||||
|
search capabilities.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from django.contrib.gis.geos import Point
|
||||||
|
from django.contrib.gis.db.models.functions import Distance  # distance annotations
from django.contrib.gis.measure import D  # radius filters, e.g. D(km=...)
|
||||||
|
from django.db.models import Q
|
||||||
|
from typing import Optional, List, Dict, Any, Set
|
||||||
|
from dataclasses import dataclass
|
||||||
|
|
||||||
|
from apps.parks.models import Park, Company, ParkLocation
|
||||||
|
from apps.rides.models import Ride
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class LocationSearchFilters:
|
||||||
|
"""Filters for location-aware search queries."""
|
||||||
|
|
||||||
|
# Text search
|
||||||
|
search_query: Optional[str] = None
|
||||||
|
|
||||||
|
# Location-based filters
|
||||||
|
location_point: Optional[Point] = None
|
||||||
|
radius_km: Optional[float] = None
|
||||||
|
location_types: Optional[Set[str]] = None # 'park', 'ride', 'company'
|
||||||
|
|
||||||
|
# Geographic filters
|
||||||
|
country: Optional[str] = None
|
||||||
|
state: Optional[str] = None
|
||||||
|
city: Optional[str] = None
|
||||||
|
|
||||||
|
# Content-specific filters
|
||||||
|
park_status: Optional[List[str]] = None
|
||||||
|
ride_types: Optional[List[str]] = None
|
||||||
|
company_roles: Optional[List[str]] = None
|
||||||
|
|
||||||
|
# Result options
|
||||||
|
include_distance: bool = True
|
||||||
|
max_results: int = 100
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class LocationSearchResult:
|
||||||
|
"""Single search result with location data."""
|
||||||
|
|
||||||
|
# Core data
|
||||||
|
content_type: str # 'park', 'ride', 'company'
|
||||||
|
object_id: int
|
||||||
|
name: str
|
||||||
|
description: Optional[str] = None
|
||||||
|
url: Optional[str] = None
|
||||||
|
|
||||||
|
# Location data
|
||||||
|
latitude: Optional[float] = None
|
||||||
|
longitude: Optional[float] = None
|
||||||
|
address: Optional[str] = None
|
||||||
|
city: Optional[str] = None
|
||||||
|
state: Optional[str] = None
|
||||||
|
country: Optional[str] = None
|
||||||
|
|
||||||
|
# Distance data (if proximity search)
|
||||||
|
distance_km: Optional[float] = None
|
||||||
|
|
||||||
|
# Additional metadata
|
||||||
|
status: Optional[str] = None
|
||||||
|
tags: Optional[List[str]] = None
|
||||||
|
rating: Optional[float] = None
|
||||||
|
|
||||||
|
def to_dict(self) -> Dict[str, Any]:
|
||||||
|
"""Convert to dictionary for JSON serialization."""
|
||||||
|
return {
|
||||||
|
"content_type": self.content_type,
|
||||||
|
"object_id": self.object_id,
|
||||||
|
"name": self.name,
|
||||||
|
"description": self.description,
|
||||||
|
"url": self.url,
|
||||||
|
"location": {
|
||||||
|
"latitude": self.latitude,
|
||||||
|
"longitude": self.longitude,
|
||||||
|
"address": self.address,
|
||||||
|
"city": self.city,
|
||||||
|
"state": self.state,
|
||||||
|
"country": self.country,
|
||||||
|
},
|
||||||
|
"distance_km": self.distance_km,
|
||||||
|
"status": self.status,
|
||||||
|
"tags": self.tags or [],
|
||||||
|
"rating": self.rating,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class LocationSearchService:
|
||||||
|
"""Service for performing location-aware searches across ThrillWiki content."""
|
||||||
|
|
||||||
|
def search(self, filters: LocationSearchFilters) -> List[LocationSearchResult]:
|
||||||
|
"""
|
||||||
|
Perform a comprehensive location-aware search.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
filters: Search filters and options
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of search results with location data
|
||||||
|
"""
|
||||||
|
results = []
|
||||||
|
|
||||||
|
# Search each content type based on filters
|
||||||
|
if not filters.location_types or "park" in filters.location_types:
|
||||||
|
results.extend(self._search_parks(filters))
|
||||||
|
|
||||||
|
if not filters.location_types or "ride" in filters.location_types:
|
||||||
|
results.extend(self._search_rides(filters))
|
||||||
|
|
||||||
|
if not filters.location_types or "company" in filters.location_types:
|
||||||
|
results.extend(self._search_companies(filters))
|
||||||
|
|
||||||
|
# Sort by distance if proximity search, otherwise by relevance
|
||||||
|
if filters.location_point and filters.include_distance:
|
||||||
|
results.sort(key=lambda x: x.distance_km or float("inf"))
|
||||||
|
else:
|
||||||
|
results.sort(key=lambda x: x.name.lower())
|
||||||
|
|
||||||
|
# Apply max results limit
|
||||||
|
return results[: filters.max_results]
|
||||||
|
|
||||||
|
def _search_parks(
|
||||||
|
self, filters: LocationSearchFilters
|
||||||
|
) -> List[LocationSearchResult]:
|
||||||
|
"""Search parks with location data."""
|
||||||
|
queryset = Park.objects.select_related("location", "operator").all()
|
||||||
|
|
||||||
|
# Apply location filters
|
||||||
|
queryset = self._apply_location_filters(queryset, filters, "location__point")
|
||||||
|
|
||||||
|
# Apply text search
|
||||||
|
if filters.search_query:
|
||||||
|
query = (
|
||||||
|
Q(name__icontains=filters.search_query)
|
||||||
|
| Q(description__icontains=filters.search_query)
|
||||||
|
| Q(location__city__icontains=filters.search_query)
|
||||||
|
| Q(location__state__icontains=filters.search_query)
|
||||||
|
| Q(location__country__icontains=filters.search_query)
|
||||||
|
)
|
||||||
|
queryset = queryset.filter(query)
|
||||||
|
|
||||||
|
# Apply park-specific filters
|
||||||
|
if filters.park_status:
|
||||||
|
queryset = queryset.filter(status__in=filters.park_status)
|
||||||
|
|
||||||
|
# Add distance annotation if proximity search
|
||||||
|
if filters.location_point and filters.include_distance:
|
||||||
|
queryset = queryset.annotate(
|
||||||
|
distance=Distance("location__point", filters.location_point)
|
||||||
|
).order_by("distance")
|
||||||
|
|
||||||
|
# Convert to search results
|
||||||
|
results = []
|
||||||
|
for park in queryset:
|
||||||
|
result = LocationSearchResult(
|
||||||
|
content_type="park",
|
||||||
|
object_id=park.id,
|
||||||
|
name=park.name,
|
||||||
|
description=park.description,
|
||||||
|
url=(
|
||||||
|
park.get_absolute_url()
|
||||||
|
if hasattr(park, "get_absolute_url")
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
status=park.get_status_display(),
|
||||||
|
rating=(float(park.average_rating) if park.average_rating else None),
|
||||||
|
tags=["park", park.status.lower()],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add location data
|
||||||
|
if hasattr(park, "location") and park.location:
|
||||||
|
location = park.location
|
||||||
|
result.latitude = location.latitude
|
||||||
|
result.longitude = location.longitude
|
||||||
|
result.address = location.formatted_address
|
||||||
|
result.city = location.city
|
||||||
|
result.state = location.state
|
||||||
|
result.country = location.country
|
||||||
|
|
||||||
|
# Add distance if proximity search
|
||||||
|
if (
|
||||||
|
filters.location_point
|
||||||
|
and filters.include_distance
|
||||||
|
and hasattr(park, "distance")
|
||||||
|
):
|
||||||
|
result.distance_km = float(park.distance.km)
|
||||||
|
|
||||||
|
results.append(result)
|
||||||
|
|
||||||
|
return results
|
||||||
|
|
||||||
|
def _search_rides(
|
||||||
|
self, filters: LocationSearchFilters
|
||||||
|
) -> List[LocationSearchResult]:
|
||||||
|
"""Search rides with location data."""
|
||||||
|
queryset = Ride.objects.select_related("park", "location").all()
|
||||||
|
|
||||||
|
# Apply location filters
|
||||||
|
queryset = self._apply_location_filters(queryset, filters, "location__point")
|
||||||
|
|
||||||
|
# Apply text search
|
||||||
|
if filters.search_query:
|
||||||
|
query = (
|
||||||
|
Q(name__icontains=filters.search_query)
|
||||||
|
| Q(description__icontains=filters.search_query)
|
||||||
|
| Q(park__name__icontains=filters.search_query)
|
||||||
|
| Q(location__park_area__icontains=filters.search_query)
|
||||||
|
)
|
||||||
|
queryset = queryset.filter(query)
|
||||||
|
|
||||||
|
# Apply ride-specific filters
|
||||||
|
if filters.ride_types:
|
||||||
|
queryset = queryset.filter(ride_type__in=filters.ride_types)
|
||||||
|
|
||||||
|
# Add distance annotation if proximity search
|
||||||
|
if filters.location_point and filters.include_distance:
|
||||||
|
queryset = queryset.annotate(
|
||||||
|
distance=Distance("location__point", filters.location_point)
|
||||||
|
).order_by("distance")
|
||||||
|
|
||||||
|
# Convert to search results
|
||||||
|
results = []
|
||||||
|
for ride in queryset:
|
||||||
|
result = LocationSearchResult(
|
||||||
|
content_type="ride",
|
||||||
|
object_id=ride.id,
|
||||||
|
name=ride.name,
|
||||||
|
description=ride.description,
|
||||||
|
url=(
|
||||||
|
ride.get_absolute_url()
|
||||||
|
if hasattr(ride, "get_absolute_url")
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
status=ride.status,
|
||||||
|
tags=[
|
||||||
|
"ride",
|
||||||
|
ride.ride_type.lower() if ride.ride_type else "attraction",
|
||||||
|
],
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add location data from ride location or park location
|
||||||
|
location = None
|
||||||
|
if hasattr(ride, "location") and ride.location:
|
||||||
|
location = ride.location
|
||||||
|
result.latitude = location.latitude
|
||||||
|
result.longitude = location.longitude
|
||||||
|
result.address = (
|
||||||
|
f"{ride.park.name} - {location.park_area}"
|
||||||
|
if location.park_area
|
||||||
|
else ride.park.name
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add distance if proximity search
|
||||||
|
if (
|
||||||
|
filters.location_point
|
||||||
|
and filters.include_distance
|
||||||
|
and hasattr(ride, "distance")
|
||||||
|
):
|
||||||
|
result.distance_km = float(ride.distance.km)
|
||||||
|
|
||||||
|
# Fall back to park location if no specific ride location
|
||||||
|
elif ride.park and hasattr(ride.park, "location") and ride.park.location:
|
||||||
|
park_location = ride.park.location
|
||||||
|
result.latitude = park_location.latitude
|
||||||
|
result.longitude = park_location.longitude
|
||||||
|
result.address = park_location.formatted_address
|
||||||
|
result.city = park_location.city
|
||||||
|
result.state = park_location.state
|
||||||
|
result.country = park_location.country
|
||||||
|
|
||||||
|
results.append(result)
|
||||||
|
|
||||||
|
return results
|
||||||
|
|
||||||
|
def _search_companies(
|
||||||
|
self, filters: LocationSearchFilters
|
||||||
|
) -> List[LocationSearchResult]:
|
||||||
|
"""Search companies with headquarters location data."""
|
||||||
|
queryset = Company.objects.select_related("headquarters").all()
|
||||||
|
|
||||||
|
# Apply location filters
|
||||||
|
queryset = self._apply_location_filters(
|
||||||
|
queryset, filters, "headquarters__point"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Apply text search
|
||||||
|
if filters.search_query:
|
||||||
|
query = (
|
||||||
|
Q(name__icontains=filters.search_query)
|
||||||
|
| Q(description__icontains=filters.search_query)
|
||||||
|
| Q(headquarters__city__icontains=filters.search_query)
|
||||||
|
| Q(headquarters__state_province__icontains=filters.search_query)
|
||||||
|
| Q(headquarters__country__icontains=filters.search_query)
|
||||||
|
)
|
||||||
|
queryset = queryset.filter(query)
|
||||||
|
|
||||||
|
# Apply company-specific filters
|
||||||
|
if filters.company_roles:
|
||||||
|
queryset = queryset.filter(roles__overlap=filters.company_roles)
|
||||||
|
|
||||||
|
# Add distance annotation if proximity search
|
||||||
|
if filters.location_point and filters.include_distance:
|
||||||
|
queryset = queryset.annotate(
|
||||||
|
distance=Distance("headquarters__point", filters.location_point)
|
||||||
|
).order_by("distance")
|
||||||
|
|
||||||
|
# Convert to search results
|
||||||
|
results = []
|
||||||
|
for company in queryset:
|
||||||
|
result = LocationSearchResult(
|
||||||
|
content_type="company",
|
||||||
|
object_id=company.id,
|
||||||
|
name=company.name,
|
||||||
|
description=company.description,
|
||||||
|
url=(
|
||||||
|
company.get_absolute_url()
|
||||||
|
if hasattr(company, "get_absolute_url")
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
tags=["company"] + (company.roles or []),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add location data
|
||||||
|
if hasattr(company, "headquarters") and company.headquarters:
|
||||||
|
hq = company.headquarters
|
||||||
|
result.latitude = hq.latitude
|
||||||
|
result.longitude = hq.longitude
|
||||||
|
result.address = hq.formatted_address
|
||||||
|
result.city = hq.city
|
||||||
|
result.state = hq.state_province
|
||||||
|
result.country = hq.country
|
||||||
|
|
||||||
|
# Add distance if proximity search
|
||||||
|
if (
|
||||||
|
filters.location_point
|
||||||
|
and filters.include_distance
|
||||||
|
and hasattr(company, "distance")
|
||||||
|
):
|
||||||
|
result.distance_km = float(company.distance.km)
|
||||||
|
|
||||||
|
results.append(result)
|
||||||
|
|
||||||
|
return results
|
||||||
|
|
||||||
|
def _apply_location_filters(
|
||||||
|
self, queryset, filters: LocationSearchFilters, point_field: str
|
||||||
|
):
|
||||||
|
"""Apply common location filters to a queryset."""
|
||||||
|
|
||||||
|
# Proximity filter
|
||||||
|
if filters.location_point and filters.radius_km:
|
||||||
|
            distance = D(km=filters.radius_km)
|
||||||
|
queryset = queryset.filter(
|
||||||
|
**{
|
||||||
|
f"{point_field}__distance_lte": (
|
||||||
|
filters.location_point,
|
||||||
|
distance,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Geographic filters - adjust field names based on model
|
||||||
|
if filters.country:
|
||||||
|
if "headquarters" in point_field:
|
||||||
|
queryset = queryset.filter(
|
||||||
|
headquarters__country__icontains=filters.country
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
location_field = point_field.split("__")[0]
|
||||||
|
queryset = queryset.filter(
|
||||||
|
**{f"{location_field}__country__icontains": filters.country}
|
||||||
|
)
|
||||||
|
|
||||||
|
if filters.state:
|
||||||
|
if "headquarters" in point_field:
|
||||||
|
queryset = queryset.filter(
|
||||||
|
headquarters__state_province__icontains=filters.state
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
location_field = point_field.split("__")[0]
|
||||||
|
queryset = queryset.filter(
|
||||||
|
**{f"{location_field}__state__icontains": filters.state}
|
||||||
|
)
|
||||||
|
|
||||||
|
if filters.city:
|
||||||
|
location_field = point_field.split("__")[0]
|
||||||
|
queryset = queryset.filter(
|
||||||
|
**{f"{location_field}__city__icontains": filters.city}
|
||||||
|
)
|
||||||
|
|
||||||
|
return queryset
|
||||||
|
|
||||||
|
def suggest_locations(self, query: str, limit: int = 10) -> List[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Get location suggestions for autocomplete.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
query: Search query string
|
||||||
|
limit: Maximum number of suggestions
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of location suggestions
|
||||||
|
"""
|
||||||
|
suggestions = []
|
||||||
|
|
||||||
|
if len(query) < 2:
|
||||||
|
return suggestions
|
||||||
|
|
||||||
|
# Get park location suggestions
|
||||||
|
park_locations = ParkLocation.objects.filter(
|
||||||
|
Q(park__name__icontains=query)
|
||||||
|
| Q(city__icontains=query)
|
||||||
|
| Q(state__icontains=query)
|
||||||
|
).select_related("park")[: limit // 3]
|
||||||
|
|
||||||
|
for location in park_locations:
|
||||||
|
suggestions.append(
|
||||||
|
{
|
||||||
|
"type": "park",
|
||||||
|
"name": location.park.name,
|
||||||
|
"address": location.formatted_address,
|
||||||
|
"coordinates": location.coordinates,
|
||||||
|
"url": (
|
||||||
|
location.park.get_absolute_url()
|
||||||
|
if hasattr(location.park, "get_absolute_url")
|
||||||
|
else None
|
||||||
|
),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get city suggestions
|
||||||
|
cities = (
|
||||||
|
ParkLocation.objects.filter(city__icontains=query)
|
||||||
|
.values("city", "state", "country")
|
||||||
|
.distinct()[: limit // 3]
|
||||||
|
)
|
||||||
|
|
||||||
|
for city_data in cities:
|
||||||
|
suggestions.append(
|
||||||
|
{
|
||||||
|
"type": "city",
|
||||||
|
"name": f"{
|
||||||
|
city_data['city']}, {
|
||||||
|
city_data['state']}",
|
||||||
|
"address": f"{
|
||||||
|
city_data['city']}, {
|
||||||
|
city_data['state']}, {
|
||||||
|
city_data['country']}",
|
||||||
|
"coordinates": None,
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
return suggestions[:limit]
|
||||||
|
|
||||||
|
|
||||||
|
# Global instance
|
||||||
|
location_search_service = LocationSearchService()
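A minimal proximity-search sketch against the module-level `location_search_service` instance. The coordinates are arbitrary, GeoDjango's `Point` takes longitude first, and the field names come from the `LocationSearchFilters` dataclass above.

```python
from django.contrib.gis.geos import Point
from apps.core.services.location_search import (
    LocationSearchFilters,
    location_search_service,
)

# "Parks and rides within ~50 km of Sandusky, OH that mention 'coaster'."
filters = LocationSearchFilters(
    search_query="coaster",
    location_point=Point(-82.7, 41.45, srid=4326),  # lng, lat
    radius_km=50,
    location_types={"park", "ride"},
    max_results=25,
)

for result in location_search_service.search(filters):
    print(f"{result.content_type}: {result.name} ({result.distance_km} km)")

# Autocomplete-style suggestions for a partial place or park name.
suggestions = location_search_service.suggest_locations("cedar", limit=5)
```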
backend/apps/core/services/map_cache_service.py (new file, 438 lines)
@@ -0,0 +1,438 @@
"""
|
||||||
|
Caching service for map data to improve performance and reduce database load.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import hashlib
|
||||||
|
import json
|
||||||
|
import time
|
||||||
|
from typing import Dict, List, Optional, Any
|
||||||
|
|
||||||
|
from django.core.cache import cache
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
from .data_structures import (
|
||||||
|
UnifiedLocation,
|
||||||
|
ClusterData,
|
||||||
|
GeoBounds,
|
||||||
|
MapFilters,
|
||||||
|
MapResponse,
|
||||||
|
QueryPerformanceMetrics,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class MapCacheService:
|
||||||
|
"""
|
||||||
|
Handles caching of map data with geographic partitioning and intelligent invalidation.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Cache configuration
|
||||||
|
DEFAULT_TTL = 3600 # 1 hour
|
||||||
|
CLUSTER_TTL = 7200 # 2 hours (clusters change less frequently)
|
||||||
|
LOCATION_DETAIL_TTL = 1800 # 30 minutes
|
||||||
|
BOUNDS_CACHE_TTL = 1800 # 30 minutes
|
||||||
|
|
||||||
|
# Cache key prefixes
|
||||||
|
CACHE_PREFIX = "thrillwiki_map"
|
||||||
|
LOCATIONS_PREFIX = f"{CACHE_PREFIX}:locations"
|
||||||
|
CLUSTERS_PREFIX = f"{CACHE_PREFIX}:clusters"
|
||||||
|
BOUNDS_PREFIX = f"{CACHE_PREFIX}:bounds"
|
||||||
|
DETAIL_PREFIX = f"{CACHE_PREFIX}:detail"
|
||||||
|
STATS_PREFIX = f"{CACHE_PREFIX}:stats"
|
||||||
|
|
||||||
|
# Geographic partitioning settings
|
||||||
|
GEOHASH_PRECISION = 6 # ~1.2km precision for cache partitioning
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.cache_stats = {
|
||||||
|
"hits": 0,
|
||||||
|
"misses": 0,
|
||||||
|
"invalidations": 0,
|
||||||
|
"geohash_partitions": 0,
|
||||||
|
}
|
||||||
|
|
||||||
|
def get_locations_cache_key(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds],
|
||||||
|
filters: Optional[MapFilters],
|
||||||
|
zoom_level: Optional[int] = None,
|
||||||
|
) -> str:
|
||||||
|
"""Generate cache key for location queries."""
|
||||||
|
key_parts = [self.LOCATIONS_PREFIX]
|
||||||
|
|
||||||
|
if bounds:
|
||||||
|
# Use geohash for spatial locality
|
||||||
|
geohash = self._bounds_to_geohash(bounds)
|
||||||
|
key_parts.append(f"geo:{geohash}")
|
||||||
|
|
||||||
|
if filters:
|
||||||
|
# Create deterministic hash of filters
|
||||||
|
filter_hash = self._hash_filters(filters)
|
||||||
|
key_parts.append(f"filters:{filter_hash}")
|
||||||
|
|
||||||
|
if zoom_level is not None:
|
||||||
|
key_parts.append(f"zoom:{zoom_level}")
|
||||||
|
|
||||||
|
return ":".join(key_parts)
|
||||||
|
|
||||||
|
def get_clusters_cache_key(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds],
|
||||||
|
filters: Optional[MapFilters],
|
||||||
|
zoom_level: int,
|
||||||
|
) -> str:
|
||||||
|
"""Generate cache key for cluster queries."""
|
||||||
|
key_parts = [self.CLUSTERS_PREFIX, f"zoom:{zoom_level}"]
|
||||||
|
|
||||||
|
if bounds:
|
||||||
|
geohash = self._bounds_to_geohash(bounds)
|
||||||
|
key_parts.append(f"geo:{geohash}")
|
||||||
|
|
||||||
|
if filters:
|
||||||
|
filter_hash = self._hash_filters(filters)
|
||||||
|
key_parts.append(f"filters:{filter_hash}")
|
||||||
|
|
||||||
|
return ":".join(key_parts)
|
||||||
|
|
||||||
|
def get_location_detail_cache_key(
|
||||||
|
self, location_type: str, location_id: int
|
||||||
|
) -> str:
|
||||||
|
"""Generate cache key for individual location details."""
|
||||||
|
return f"{self.DETAIL_PREFIX}:{location_type}:{location_id}"
|
||||||
|
|
||||||
|
def cache_locations(
|
||||||
|
self,
|
||||||
|
cache_key: str,
|
||||||
|
locations: List[UnifiedLocation],
|
||||||
|
ttl: Optional[int] = None,
|
||||||
|
) -> None:
|
||||||
|
"""Cache location data."""
|
||||||
|
try:
|
||||||
|
# Convert locations to serializable format
|
||||||
|
cache_data = {
|
||||||
|
"locations": [loc.to_dict() for loc in locations],
|
||||||
|
"cached_at": timezone.now().isoformat(),
|
||||||
|
"count": len(locations),
|
||||||
|
}
|
||||||
|
|
||||||
|
cache.set(cache_key, cache_data, ttl or self.DEFAULT_TTL)
|
||||||
|
except Exception as e:
|
||||||
|
# Log error but don't fail the request
|
||||||
|
print(f"Cache write error for key {cache_key}: {e}")
|
||||||
|
|
||||||
|
def cache_clusters(
|
||||||
|
self,
|
||||||
|
cache_key: str,
|
||||||
|
clusters: List[ClusterData],
|
||||||
|
ttl: Optional[int] = None,
|
||||||
|
) -> None:
|
||||||
|
"""Cache cluster data."""
|
||||||
|
try:
|
||||||
|
cache_data = {
|
||||||
|
"clusters": [cluster.to_dict() for cluster in clusters],
|
||||||
|
"cached_at": timezone.now().isoformat(),
|
||||||
|
"count": len(clusters),
|
||||||
|
}
|
||||||
|
|
||||||
|
cache.set(cache_key, cache_data, ttl or self.CLUSTER_TTL)
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache write error for clusters {cache_key}: {e}")
|
||||||
|
|
||||||
|
def cache_map_response(
|
||||||
|
self, cache_key: str, response: MapResponse, ttl: Optional[int] = None
|
||||||
|
) -> None:
|
||||||
|
"""Cache complete map response."""
|
||||||
|
try:
|
||||||
|
cache_data = response.to_dict()
|
||||||
|
cache_data["cached_at"] = timezone.now().isoformat()
|
||||||
|
|
||||||
|
cache.set(cache_key, cache_data, ttl or self.DEFAULT_TTL)
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache write error for response {cache_key}: {e}")
|
||||||
|
|
||||||
|
def get_cached_locations(self, cache_key: str) -> Optional[List[UnifiedLocation]]:
|
||||||
|
"""Retrieve cached location data."""
|
||||||
|
try:
|
||||||
|
cache_data = cache.get(cache_key)
|
||||||
|
if not cache_data:
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
self.cache_stats["hits"] += 1
|
||||||
|
|
||||||
|
# Convert back to UnifiedLocation objects
|
||||||
|
locations = []
|
||||||
|
for loc_data in cache_data["locations"]:
|
||||||
|
# Reconstruct UnifiedLocation from dictionary
|
||||||
|
locations.append(self._dict_to_unified_location(loc_data))
|
||||||
|
|
||||||
|
return locations
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache read error for key {cache_key}: {e}")
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
def get_cached_clusters(self, cache_key: str) -> Optional[List[ClusterData]]:
|
||||||
|
"""Retrieve cached cluster data."""
|
||||||
|
try:
|
||||||
|
cache_data = cache.get(cache_key)
|
||||||
|
if not cache_data:
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
self.cache_stats["hits"] += 1
|
||||||
|
|
||||||
|
# Convert back to ClusterData objects
|
||||||
|
clusters = []
|
||||||
|
for cluster_data in cache_data["clusters"]:
|
||||||
|
clusters.append(self._dict_to_cluster_data(cluster_data))
|
||||||
|
|
||||||
|
return clusters
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache read error for clusters {cache_key}: {e}")
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
def get_cached_map_response(self, cache_key: str) -> Optional[MapResponse]:
|
||||||
|
"""Retrieve cached map response."""
|
||||||
|
try:
|
||||||
|
cache_data = cache.get(cache_key)
|
||||||
|
if not cache_data:
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
self.cache_stats["hits"] += 1
|
||||||
|
|
||||||
|
# Convert back to MapResponse object
|
||||||
|
return self._dict_to_map_response(cache_data["data"])
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache read error for response {cache_key}: {e}")
|
||||||
|
self.cache_stats["misses"] += 1
|
||||||
|
return None
|
||||||
|
|
||||||
|
def invalidate_location_cache(
|
||||||
|
self, location_type: str, location_id: Optional[int] = None
|
||||||
|
) -> None:
|
||||||
|
"""Invalidate cache for specific location or all locations of a type."""
|
||||||
|
try:
|
||||||
|
if location_id:
|
||||||
|
# Invalidate specific location detail
|
||||||
|
detail_key = self.get_location_detail_cache_key(
|
||||||
|
location_type, location_id
|
||||||
|
)
|
||||||
|
cache.delete(detail_key)
|
||||||
|
|
||||||
|
# Invalidate related location and cluster caches
|
||||||
|
# In a production system, you'd want more sophisticated cache
|
||||||
|
# tagging
|
||||||
|
cache.delete_many(
|
||||||
|
[f"{self.LOCATIONS_PREFIX}:*", f"{self.CLUSTERS_PREFIX}:*"]
|
||||||
|
)
|
||||||
|
|
||||||
|
self.cache_stats["invalidations"] += 1
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache invalidation error: {e}")
|
||||||
|
|
||||||
|
def invalidate_bounds_cache(self, bounds: GeoBounds) -> None:
|
||||||
|
"""Invalidate cache for specific geographic bounds."""
|
||||||
|
try:
|
||||||
|
geohash = self._bounds_to_geohash(bounds)
|
||||||
|
pattern = f"{self.LOCATIONS_PREFIX}:geo:{geohash}*"
|
||||||
|
|
||||||
|
# In production, you'd use cache tagging or Redis SCAN
|
||||||
|
# For now, we'll invalidate broader patterns
|
||||||
|
cache.delete_many([pattern])
|
||||||
|
|
||||||
|
self.cache_stats["invalidations"] += 1
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Bounds cache invalidation error: {e}")
|
||||||
|
|
||||||
|
def clear_all_map_cache(self) -> None:
|
||||||
|
"""Clear all map-related cache data."""
|
||||||
|
try:
|
||||||
|
cache.delete_many(
|
||||||
|
[
|
||||||
|
f"{self.LOCATIONS_PREFIX}:*",
|
||||||
|
f"{self.CLUSTERS_PREFIX}:*",
|
||||||
|
f"{self.BOUNDS_PREFIX}:*",
|
||||||
|
f"{self.DETAIL_PREFIX}:*",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
|
||||||
|
self.cache_stats["invalidations"] += 1
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Cache clear error: {e}")
|
||||||
|
|
||||||
|
def get_cache_stats(self) -> Dict[str, Any]:
|
||||||
|
"""Get cache performance statistics."""
|
||||||
|
total_requests = self.cache_stats["hits"] + self.cache_stats["misses"]
|
||||||
|
hit_rate = (
|
||||||
|
(self.cache_stats["hits"] / total_requests * 100)
|
||||||
|
if total_requests > 0
|
||||||
|
else 0
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"hits": self.cache_stats["hits"],
|
||||||
|
"misses": self.cache_stats["misses"],
|
||||||
|
"hit_rate_percent": round(hit_rate, 2),
|
||||||
|
"invalidations": self.cache_stats["invalidations"],
|
||||||
|
"geohash_partitions": self.cache_stats["geohash_partitions"],
|
||||||
|
}
|
||||||
|
|
||||||
|
def record_performance_metrics(self, metrics: QueryPerformanceMetrics) -> None:
|
||||||
|
"""Record query performance metrics for analysis."""
|
||||||
|
try:
|
||||||
|
# 5-minute buckets
|
||||||
|
stats_key = f"{
|
||||||
|
self.STATS_PREFIX}:performance:{
|
||||||
|
int(
|
||||||
|
time.time() //
|
||||||
|
300)}"
|
||||||
|
|
||||||
|
current_stats = cache.get(
|
||||||
|
stats_key,
|
||||||
|
{
|
||||||
|
"query_count": 0,
|
||||||
|
"total_time_ms": 0,
|
||||||
|
"cache_hits": 0,
|
||||||
|
"db_queries": 0,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
current_stats["query_count"] += 1
|
||||||
|
current_stats["total_time_ms"] += metrics.query_time_ms
|
||||||
|
current_stats["cache_hits"] += 1 if metrics.cache_hit else 0
|
||||||
|
current_stats["db_queries"] += metrics.db_query_count
|
||||||
|
|
||||||
|
cache.set(stats_key, current_stats, 3600) # Keep for 1 hour
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Performance metrics recording error: {e}")
|
||||||
|
|
||||||
|
def _bounds_to_geohash(self, bounds: GeoBounds) -> str:
|
||||||
|
"""Convert geographic bounds to geohash for cache partitioning."""
|
||||||
|
# Use center point of bounds for geohash
|
||||||
|
center_lat = (bounds.north + bounds.south) / 2
|
||||||
|
center_lng = (bounds.east + bounds.west) / 2
|
||||||
|
|
||||||
|
# Simple geohash implementation (in production, use a library)
|
||||||
|
return self._encode_geohash(center_lat, center_lng, self.GEOHASH_PRECISION)
|
||||||
|
|
||||||
|
def _encode_geohash(self, lat: float, lng: float, precision: int) -> str:
|
||||||
|
"""Simple geohash encoding implementation."""
|
||||||
|
# This is a simplified implementation
|
||||||
|
# In production, use the `geohash` library
|
||||||
|
lat_range = [-90.0, 90.0]
|
||||||
|
lng_range = [-180.0, 180.0]
|
||||||
|
|
||||||
|
geohash = ""
|
||||||
|
bits = 0
|
||||||
|
bit_count = 0
|
||||||
|
even_bit = True
|
||||||
|
|
||||||
|
while len(geohash) < precision:
|
||||||
|
if even_bit:
|
||||||
|
# longitude
|
||||||
|
mid = (lng_range[0] + lng_range[1]) / 2
|
||||||
|
if lng >= mid:
|
||||||
|
bits = (bits << 1) + 1
|
||||||
|
lng_range[0] = mid
|
||||||
|
else:
|
||||||
|
bits = bits << 1
|
||||||
|
lng_range[1] = mid
|
||||||
|
else:
|
||||||
|
# latitude
|
||||||
|
mid = (lat_range[0] + lat_range[1]) / 2
|
||||||
|
if lat >= mid:
|
||||||
|
bits = (bits << 1) + 1
|
||||||
|
lat_range[0] = mid
|
||||||
|
else:
|
||||||
|
bits = bits << 1
|
||||||
|
lat_range[1] = mid
|
||||||
|
|
||||||
|
even_bit = not even_bit
|
||||||
|
bit_count += 1
|
||||||
|
|
||||||
|
if bit_count == 5:
|
||||||
|
# Convert 5 bits to base32 character
|
||||||
|
geohash += "0123456789bcdefghjkmnpqrstuvwxyz"[bits]
|
||||||
|
bits = 0
|
||||||
|
bit_count = 0
|
||||||
|
|
||||||
|
return geohash
|
||||||
|
|
||||||
|
def _hash_filters(self, filters: MapFilters) -> str:
|
||||||
|
"""Create deterministic hash of filters for cache keys."""
|
||||||
|
filter_dict = filters.to_dict()
|
||||||
|
# Sort to ensure consistent ordering
|
||||||
|
filter_str = json.dumps(filter_dict, sort_keys=True)
|
||||||
|
return hashlib.md5(filter_str.encode()).hexdigest()[:8]
|
||||||
|
|
||||||
|
def _dict_to_unified_location(self, data: Dict[str, Any]) -> UnifiedLocation:
|
||||||
|
"""Convert dictionary back to UnifiedLocation object."""
|
||||||
|
from .data_structures import LocationType
|
||||||
|
|
||||||
|
return UnifiedLocation(
|
||||||
|
id=data["id"],
|
||||||
|
type=LocationType(data["type"]),
|
||||||
|
name=data["name"],
|
||||||
|
coordinates=tuple(data["coordinates"]),
|
||||||
|
address=data.get("address"),
|
||||||
|
metadata=data.get("metadata", {}),
|
||||||
|
type_data=data.get("type_data", {}),
|
||||||
|
cluster_weight=data.get("cluster_weight", 1),
|
||||||
|
cluster_category=data.get("cluster_category", "default"),
|
||||||
|
)
|
||||||
|
|
||||||
|
def _dict_to_cluster_data(self, data: Dict[str, Any]) -> ClusterData:
|
||||||
|
"""Convert dictionary back to ClusterData object."""
|
||||||
|
from .data_structures import LocationType
|
||||||
|
|
||||||
|
bounds = GeoBounds(**data["bounds"])
|
||||||
|
types = {LocationType(t) for t in data["types"]}
|
||||||
|
|
||||||
|
representative = None
|
||||||
|
if data.get("representative"):
|
||||||
|
representative = self._dict_to_unified_location(data["representative"])
|
||||||
|
|
||||||
|
return ClusterData(
|
||||||
|
id=data["id"],
|
||||||
|
coordinates=tuple(data["coordinates"]),
|
||||||
|
count=data["count"],
|
||||||
|
types=types,
|
||||||
|
bounds=bounds,
|
||||||
|
representative_location=representative,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _dict_to_map_response(self, data: Dict[str, Any]) -> MapResponse:
|
||||||
|
"""Convert dictionary back to MapResponse object."""
|
||||||
|
locations = [
|
||||||
|
self._dict_to_unified_location(loc) for loc in data.get("locations", [])
|
||||||
|
]
|
||||||
|
clusters = [
|
||||||
|
self._dict_to_cluster_data(cluster) for cluster in data.get("clusters", [])
|
||||||
|
]
|
||||||
|
|
||||||
|
bounds = None
|
||||||
|
if data.get("bounds"):
|
||||||
|
bounds = GeoBounds(**data["bounds"])
|
||||||
|
|
||||||
|
return MapResponse(
|
||||||
|
locations=locations,
|
||||||
|
clusters=clusters,
|
||||||
|
bounds=bounds,
|
||||||
|
total_count=data.get("total_count", 0),
|
||||||
|
filtered_count=data.get("filtered_count", 0),
|
||||||
|
zoom_level=data.get("zoom_level"),
|
||||||
|
clustered=data.get("clustered", False),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
# Global cache service instance
|
||||||
|
map_cache = MapCacheService()
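To illustrate the geohash-partitioned keys, a small sketch using the module-level `map_cache` instance. The `GeoBounds`/`MapFilters` constructor fields are assumed from their usage in this file, and the printed key layout follows `get_locations_cache_key()` above.

```python
from apps.core.services.map_cache_service import map_cache
from apps.core.services.data_structures import GeoBounds, MapFilters, LocationType

bounds = GeoBounds(north=41.50, south=41.44, east=-82.64, west=-82.72)
filters = MapFilters(location_types={LocationType.PARK})

# Two requests for the same viewport and filters hash to the same key,
# so the second one can be served from cache.
key = map_cache.get_locations_cache_key(bounds, filters, zoom_level=12)
print(key)  # e.g. "thrillwiki_map:locations:geo:<geohash>:filters:<8-char md5>:zoom:12"

cached = map_cache.get_cached_locations(key)
if cached is None:
    # ... fetch locations from the database, then:
    # map_cache.cache_locations(key, locations)
    pass

print(map_cache.get_cache_stats())  # hits / misses / hit_rate_percent
```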
backend/apps/core/services/map_service.py (new file, 474 lines)
@@ -0,0 +1,474 @@
"""
|
||||||
|
Unified Map Service - Main orchestrating service for all map functionality.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
from typing import List, Optional, Dict, Any, Set
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
from .data_structures import (
|
||||||
|
UnifiedLocation,
|
||||||
|
ClusterData,
|
||||||
|
GeoBounds,
|
||||||
|
MapFilters,
|
||||||
|
MapResponse,
|
||||||
|
LocationType,
|
||||||
|
QueryPerformanceMetrics,
|
||||||
|
)
|
||||||
|
from .location_adapters import LocationAbstractionLayer
|
||||||
|
from .clustering_service import ClusteringService
|
||||||
|
from .map_cache_service import MapCacheService
|
||||||
|
|
||||||
|
|
||||||
|
class UnifiedMapService:
|
||||||
|
"""
|
||||||
|
Main service orchestrating map data retrieval, filtering, clustering, and caching.
|
||||||
|
Provides a unified interface for all location types with performance optimization.
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Performance thresholds
|
||||||
|
MAX_UNCLUSTERED_POINTS = 500
|
||||||
|
MAX_CLUSTERED_POINTS = 2000
|
||||||
|
DEFAULT_ZOOM_LEVEL = 10
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.location_layer = LocationAbstractionLayer()
|
||||||
|
self.clustering_service = ClusteringService()
|
||||||
|
self.cache_service = MapCacheService()
|
||||||
|
|
||||||
|
def get_map_data(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
zoom_level: int = DEFAULT_ZOOM_LEVEL,
|
||||||
|
cluster: bool = True,
|
||||||
|
use_cache: bool = True,
|
||||||
|
) -> MapResponse:
|
||||||
|
"""
|
||||||
|
Primary method for retrieving unified map data.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
bounds: Geographic bounds to query within
|
||||||
|
filters: Filtering criteria for locations
|
||||||
|
zoom_level: Map zoom level for clustering decisions
|
||||||
|
cluster: Whether to apply clustering
|
||||||
|
use_cache: Whether to use cached data
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
MapResponse with locations, clusters, and metadata
|
||||||
|
"""
|
||||||
|
start_time = time.time()
|
||||||
|
initial_query_count = len(connection.queries)
|
||||||
|
cache_hit = False
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Generate cache key
|
||||||
|
cache_key = None
|
||||||
|
if use_cache:
|
||||||
|
cache_key = self._generate_cache_key(
|
||||||
|
bounds, filters, zoom_level, cluster
|
||||||
|
)
|
||||||
|
|
||||||
|
# Try to get from cache first
|
||||||
|
cached_response = self.cache_service.get_cached_map_response(cache_key)
|
||||||
|
if cached_response:
|
||||||
|
cached_response.cache_hit = True
|
||||||
|
cached_response.query_time_ms = int(
|
||||||
|
(time.time() - start_time) * 1000
|
||||||
|
)
|
||||||
|
return cached_response
|
||||||
|
|
||||||
|
# Get locations from database
|
||||||
|
locations = self._get_locations_from_db(bounds, filters)
|
||||||
|
|
||||||
|
# Apply smart limiting based on zoom level and density
|
||||||
|
locations = self._apply_smart_limiting(locations, bounds, zoom_level)
|
||||||
|
|
||||||
|
# Determine if clustering should be applied
|
||||||
|
should_cluster = cluster and self.clustering_service.should_cluster(
|
||||||
|
zoom_level, len(locations)
|
||||||
|
)
|
||||||
|
|
||||||
|
# Apply clustering if needed
|
||||||
|
clusters = []
|
||||||
|
if should_cluster:
|
||||||
|
locations, clusters = self.clustering_service.cluster_locations(
|
||||||
|
locations, zoom_level, bounds
|
||||||
|
)
|
||||||
|
|
||||||
|
# Calculate response bounds
|
||||||
|
response_bounds = self._calculate_response_bounds(
|
||||||
|
locations, clusters, bounds
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create response
|
||||||
|
response = MapResponse(
|
||||||
|
locations=locations,
|
||||||
|
clusters=clusters,
|
||||||
|
bounds=response_bounds,
|
||||||
|
total_count=len(locations) + sum(cluster.count for cluster in clusters),
|
||||||
|
filtered_count=len(locations),
|
||||||
|
zoom_level=zoom_level,
|
||||||
|
clustered=should_cluster,
|
||||||
|
cache_hit=cache_hit,
|
||||||
|
query_time_ms=int((time.time() - start_time) * 1000),
|
||||||
|
filters_applied=self._get_applied_filters_list(filters),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Cache the response
|
||||||
|
if use_cache and cache_key:
|
||||||
|
self.cache_service.cache_map_response(cache_key, response)
|
||||||
|
|
||||||
|
# Record performance metrics
|
||||||
|
self._record_performance_metrics(
|
||||||
|
start_time,
|
||||||
|
initial_query_count,
|
||||||
|
cache_hit,
|
||||||
|
len(locations) + len(clusters),
|
||||||
|
bounds is not None,
|
||||||
|
should_cluster,
|
||||||
|
)
|
||||||
|
|
||||||
|
return response
|
||||||
|
|
||||||
|
except Exception:
|
||||||
|
# Return error response
|
||||||
|
return MapResponse(
|
||||||
|
locations=[],
|
||||||
|
clusters=[],
|
||||||
|
total_count=0,
|
||||||
|
filtered_count=0,
|
||||||
|
query_time_ms=int((time.time() - start_time) * 1000),
|
||||||
|
cache_hit=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_location_details(
|
||||||
|
self, location_type: str, location_id: int
|
||||||
|
) -> Optional[UnifiedLocation]:
|
||||||
|
"""
|
||||||
|
Get detailed information for a specific location.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
location_type: Type of location (park, ride, company, generic)
|
||||||
|
location_id: ID of the location
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
UnifiedLocation with full details or None if not found
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Check cache first
|
||||||
|
cache_key = self.cache_service.get_location_detail_cache_key(
|
||||||
|
location_type, location_id
|
||||||
|
)
|
||||||
|
cached_locations = self.cache_service.get_cached_locations(cache_key)
|
||||||
|
if cached_locations:
|
||||||
|
return cached_locations[0] if cached_locations else None
|
||||||
|
|
||||||
|
# Get from database
|
||||||
|
location_type_enum = LocationType(location_type.lower())
|
||||||
|
location = self.location_layer.get_location_by_id(
|
||||||
|
location_type_enum, location_id
|
||||||
|
)
|
||||||
|
|
||||||
|
# Cache the result
|
||||||
|
if location:
|
||||||
|
self.cache_service.cache_locations(
|
||||||
|
cache_key,
|
||||||
|
[location],
|
||||||
|
self.cache_service.LOCATION_DETAIL_TTL,
|
||||||
|
)
|
||||||
|
|
||||||
|
return location
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error getting location details: {e}")
|
||||||
|
return None
|
||||||
|
|
||||||
|
def search_locations(
|
||||||
|
self,
|
||||||
|
query: str,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
location_types: Optional[Set[LocationType]] = None,
|
||||||
|
limit: int = 50,
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""
|
||||||
|
Search locations with text query.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
query: Search query string
|
||||||
|
bounds: Optional geographic bounds to search within
|
||||||
|
location_types: Optional set of location types to search
|
||||||
|
limit: Maximum number of results
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of matching UnifiedLocation objects
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Create search filters
|
||||||
|
filters = MapFilters(
|
||||||
|
search_query=query,
|
||||||
|
location_types=location_types or {LocationType.PARK, LocationType.RIDE},
|
||||||
|
has_coordinates=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get locations
|
||||||
|
locations = self.location_layer.get_all_locations(bounds, filters)
|
||||||
|
|
||||||
|
# Apply limit
|
||||||
|
return locations[:limit]
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error searching locations: {e}")
|
||||||
|
return []
|
||||||
|
|
||||||
|
def get_locations_by_bounds(
|
||||||
|
self,
|
||||||
|
north: float,
|
||||||
|
south: float,
|
||||||
|
east: float,
|
||||||
|
west: float,
|
||||||
|
location_types: Optional[Set[LocationType]] = None,
|
||||||
|
zoom_level: int = DEFAULT_ZOOM_LEVEL,
|
||||||
|
) -> MapResponse:
|
||||||
|
"""
|
||||||
|
Get locations within specific geographic bounds.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
north, south, east, west: Bounding box coordinates
|
||||||
|
location_types: Optional filter for location types
|
||||||
|
zoom_level: Map zoom level for optimization
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
MapResponse with locations in bounds
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
bounds = GeoBounds(north=north, south=south, east=east, west=west)
|
||||||
|
filters = (
|
||||||
|
MapFilters(location_types=location_types) if location_types else None
|
||||||
|
)
|
||||||
|
|
||||||
|
return self.get_map_data(
|
||||||
|
bounds=bounds, filters=filters, zoom_level=zoom_level
|
||||||
|
)
|
||||||
|
|
||||||
|
except ValueError:
|
||||||
|
# Invalid bounds
|
||||||
|
return MapResponse(
|
||||||
|
locations=[], clusters=[], total_count=0, filtered_count=0
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_clustered_locations(
|
||||||
|
self,
|
||||||
|
zoom_level: int,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
filters: Optional[MapFilters] = None,
|
||||||
|
) -> MapResponse:
|
||||||
|
"""
|
||||||
|
Get clustered location data for map display.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
zoom_level: Map zoom level for clustering configuration
|
||||||
|
bounds: Optional geographic bounds
|
||||||
|
filters: Optional filtering criteria
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
MapResponse with clustered data
|
||||||
|
"""
|
||||||
|
return self.get_map_data(
|
||||||
|
bounds=bounds, filters=filters, zoom_level=zoom_level, cluster=True
|
||||||
|
)
|
||||||
|
|
||||||
|
def get_locations_by_type(
|
||||||
|
self,
|
||||||
|
location_type: LocationType,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
limit: Optional[int] = None,
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""
|
||||||
|
Get locations of a specific type.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
location_type: Type of locations to retrieve
|
||||||
|
bounds: Optional geographic bounds
|
||||||
|
limit: Optional limit on results
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
List of UnifiedLocation objects
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
filters = MapFilters(location_types={location_type})
|
||||||
|
locations = self.location_layer.get_locations_by_type(
|
||||||
|
location_type, bounds, filters
|
||||||
|
)
|
||||||
|
|
||||||
|
if limit:
|
||||||
|
locations = locations[:limit]
|
||||||
|
|
||||||
|
return locations
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error getting locations by type: {e}")
|
||||||
|
return []
|
||||||
|
|
||||||
|
def invalidate_cache(
|
||||||
|
self,
|
||||||
|
location_type: Optional[str] = None,
|
||||||
|
location_id: Optional[int] = None,
|
||||||
|
bounds: Optional[GeoBounds] = None,
|
||||||
|
) -> None:
|
||||||
|
"""
|
||||||
|
Invalidate cached map data.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
location_type: Optional specific location type to invalidate
|
||||||
|
location_id: Optional specific location ID to invalidate
|
||||||
|
bounds: Optional specific bounds to invalidate
|
||||||
|
"""
|
||||||
|
if location_type and location_id:
|
||||||
|
self.cache_service.invalidate_location_cache(location_type, location_id)
|
||||||
|
elif bounds:
|
||||||
|
self.cache_service.invalidate_bounds_cache(bounds)
|
||||||
|
else:
|
||||||
|
self.cache_service.clear_all_map_cache()
|
||||||
|
|
||||||
|
def get_service_stats(self) -> Dict[str, Any]:
|
||||||
|
"""Get service performance and usage statistics."""
|
||||||
|
cache_stats = self.cache_service.get_cache_stats()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"cache_performance": cache_stats,
|
||||||
|
"clustering_available": True,
|
||||||
|
"supported_location_types": [t.value for t in LocationType],
|
||||||
|
"max_unclustered_points": self.MAX_UNCLUSTERED_POINTS,
|
||||||
|
"max_clustered_points": self.MAX_CLUSTERED_POINTS,
|
||||||
|
"service_version": "1.0.0",
|
||||||
|
}
|
||||||
|
|
||||||
|
def _get_locations_from_db(
|
||||||
|
self, bounds: Optional[GeoBounds], filters: Optional[MapFilters]
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""Get locations from database using the abstraction layer."""
|
||||||
|
return self.location_layer.get_all_locations(bounds, filters)
|
||||||
|
|
||||||
|
def _apply_smart_limiting(
|
||||||
|
self,
|
||||||
|
locations: List[UnifiedLocation],
|
||||||
|
bounds: Optional[GeoBounds],
|
||||||
|
zoom_level: int,
|
||||||
|
) -> List[UnifiedLocation]:
|
||||||
|
"""Apply intelligent limiting based on zoom level and density."""
|
||||||
|
if zoom_level < 6: # Very zoomed out - show only major parks
|
||||||
|
major_parks = [
|
||||||
|
loc
|
||||||
|
for loc in locations
|
||||||
|
if (
|
||||||
|
loc.type == LocationType.PARK
|
||||||
|
and loc.cluster_category in ["major_park", "theme_park"]
|
||||||
|
)
|
||||||
|
]
|
||||||
|
return major_parks[:200]
|
||||||
|
elif zoom_level < 10: # Regional level
|
||||||
|
return locations[:1000]
|
||||||
|
else: # City level and closer
|
||||||
|
return locations[: self.MAX_CLUSTERED_POINTS]
|
||||||
|
|
||||||
|
def _calculate_response_bounds(
|
||||||
|
self,
|
||||||
|
locations: List[UnifiedLocation],
|
||||||
|
clusters: List[ClusterData],
|
||||||
|
request_bounds: Optional[GeoBounds],
|
||||||
|
) -> Optional[GeoBounds]:
|
||||||
|
"""Calculate the actual bounds of the response data."""
|
||||||
|
if request_bounds:
|
||||||
|
return request_bounds
|
||||||
|
|
||||||
|
all_coords = []
|
||||||
|
|
||||||
|
# Add location coordinates
|
||||||
|
for loc in locations:
|
||||||
|
all_coords.append((loc.latitude, loc.longitude))
|
||||||
|
|
||||||
|
# Add cluster coordinates
|
||||||
|
for cluster in clusters:
|
||||||
|
all_coords.append(cluster.coordinates)
|
||||||
|
|
||||||
|
if not all_coords:
|
||||||
|
return None
|
||||||
|
|
||||||
|
lats, lngs = zip(*all_coords)
|
||||||
|
return GeoBounds(
|
||||||
|
north=max(lats), south=min(lats), east=max(lngs), west=min(lngs)
|
||||||
|
)
|
||||||
|
|
||||||
|
def _get_applied_filters_list(self, filters: Optional[MapFilters]) -> List[str]:
|
||||||
|
"""Get list of applied filter types for metadata."""
|
||||||
|
if not filters:
|
||||||
|
return []
|
||||||
|
|
||||||
|
applied = []
|
||||||
|
if filters.location_types:
|
||||||
|
applied.append("location_types")
|
||||||
|
if filters.search_query:
|
||||||
|
applied.append("search_query")
|
||||||
|
if filters.park_status:
|
||||||
|
applied.append("park_status")
|
||||||
|
if filters.ride_types:
|
||||||
|
applied.append("ride_types")
|
||||||
|
if filters.company_roles:
|
||||||
|
applied.append("company_roles")
|
||||||
|
if filters.min_rating:
|
||||||
|
applied.append("min_rating")
|
||||||
|
if filters.country:
|
||||||
|
applied.append("country")
|
||||||
|
if filters.state:
|
||||||
|
applied.append("state")
|
||||||
|
if filters.city:
|
||||||
|
applied.append("city")
|
||||||
|
|
||||||
|
return applied
|
||||||
|
|
||||||
|
def _generate_cache_key(
|
||||||
|
self,
|
||||||
|
bounds: Optional[GeoBounds],
|
||||||
|
filters: Optional[MapFilters],
|
||||||
|
zoom_level: int,
|
||||||
|
cluster: bool,
|
||||||
|
) -> str:
|
||||||
|
"""Generate cache key for the request."""
|
||||||
|
if cluster:
|
||||||
|
return self.cache_service.get_clusters_cache_key(
|
||||||
|
bounds, filters, zoom_level
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
return self.cache_service.get_locations_cache_key(
|
||||||
|
bounds, filters, zoom_level
|
||||||
|
)
|
||||||
|
|
||||||
|
    def _record_performance_metrics(
        self,
        start_time: float,
        initial_query_count: int,
        cache_hit: bool,
        result_count: int,
        bounds_used: bool,
        clustering_used: bool,
    ) -> None:
        """Record performance metrics for monitoring."""
        query_time_ms = int((time.time() - start_time) * 1000)
        db_query_count = len(connection.queries) - initial_query_count

        metrics = QueryPerformanceMetrics(
            query_time_ms=query_time_ms,
            db_query_count=db_query_count,
            cache_hit=cache_hit,
            result_count=result_count,
            bounds_used=bounds_used,
            clustering_used=clustering_used,
        )

        self.cache_service.record_performance_metrics(metrics)


# Global service instance
unified_map_service = UnifiedMapService()
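With the service exposed as a module-level singleton, API views can request map data directly. A minimal usage sketch: the import paths match the ones used by `map_views.py` in this diff, the keyword arguments mirror the internal calls shown above, and the bounds, zoom, and filter values are made up for illustration.

```python
from apps.core.services.map_service import unified_map_service
from apps.core.services.data_structures import GeoBounds, MapFilters, LocationType

# Illustrative viewport and filter values (not taken from real data)
bounds = GeoBounds(north=28.6, south=28.2, east=-81.2, west=-81.8)
filters = MapFilters(location_types={LocationType.PARK}, has_coordinates=True)

response = unified_map_service.get_map_data(
    bounds=bounds,
    filters=filters,
    zoom_level=10,
    cluster=True,
)
print(response.total_count, response.clustered, response.query_time_ms)
```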
backend/apps/core/services/performance_monitoring.py (new file, 407 lines)
@@ -0,0 +1,407 @@
|
|||||||
|
"""
|
||||||
|
Performance monitoring utilities and context managers.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
import logging
|
||||||
|
from contextlib import contextmanager
|
||||||
|
from functools import wraps
|
||||||
|
from typing import Optional, Dict, Any, List
|
||||||
|
from django.db import connection
|
||||||
|
from django.conf import settings
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
logger = logging.getLogger("performance")
|
||||||
|
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def monitor_performance(operation_name: str, **tags):
|
||||||
|
"""Context manager for monitoring operation performance"""
|
||||||
|
start_time = time.time()
|
||||||
|
initial_queries = len(connection.queries)
|
||||||
|
|
||||||
|
# Create performance context
|
||||||
|
performance_context = {
|
||||||
|
"operation": operation_name,
|
||||||
|
"start_time": start_time,
|
||||||
|
"timestamp": timezone.now().isoformat(),
|
||||||
|
**tags,
|
||||||
|
}
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield performance_context
|
||||||
|
except Exception as e:
|
||||||
|
performance_context["error"] = str(e)
|
||||||
|
performance_context["status"] = "error"
|
||||||
|
raise
|
||||||
|
else:
|
||||||
|
performance_context["status"] = "success"
|
||||||
|
finally:
|
||||||
|
end_time = time.time()
|
||||||
|
duration = end_time - start_time
|
||||||
|
total_queries = len(connection.queries) - initial_queries
|
||||||
|
|
||||||
|
# Update performance context with final metrics
|
||||||
|
performance_context.update(
|
||||||
|
{
|
||||||
|
"duration_seconds": duration,
|
||||||
|
"duration_ms": round(duration * 1000, 2),
|
||||||
|
"query_count": total_queries,
|
||||||
|
"end_time": end_time,
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Log performance data
|
||||||
|
log_level = (
|
||||||
|
logging.WARNING if duration > 2.0 or total_queries > 10 else logging.INFO
|
||||||
|
)
|
||||||
|
logger.log(
|
||||||
|
log_level,
|
||||||
|
f"Performance: {operation_name} completed in {
|
||||||
|
duration:.3f}s with {total_queries} queries",
|
||||||
|
extra=performance_context,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Log slow operations with additional detail
|
||||||
|
if duration > 2.0:
|
||||||
|
logger.warning(
|
||||||
|
f"Slow operation detected: {operation_name} took {
|
||||||
|
duration:.3f}s",
|
||||||
|
extra={
|
||||||
|
"slow_operation": True,
|
||||||
|
"threshold_exceeded": "duration",
|
||||||
|
**performance_context,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
if total_queries > 10:
|
||||||
|
logger.warning(
|
||||||
|
f"High query count: {operation_name} executed {total_queries} queries",
|
||||||
|
extra={
|
||||||
|
"high_query_count": True,
|
||||||
|
"threshold_exceeded": "query_count",
|
||||||
|
**performance_context,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def track_queries(operation_name: str, warn_threshold: int = 10):
|
||||||
|
"""Context manager to track database queries for specific operations"""
|
||||||
|
if not settings.DEBUG:
|
||||||
|
yield
|
||||||
|
return
|
||||||
|
|
||||||
|
initial_queries = len(connection.queries)
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield
|
||||||
|
finally:
|
||||||
|
end_time = time.time()
|
||||||
|
total_queries = len(connection.queries) - initial_queries
|
||||||
|
execution_time = end_time - start_time
|
||||||
|
|
||||||
|
query_details = []
|
||||||
|
if hasattr(connection, "queries") and total_queries > 0:
|
||||||
|
recent_queries = connection.queries[-total_queries:]
|
||||||
|
query_details = [
|
||||||
|
{
|
||||||
|
"sql": (
|
||||||
|
query["sql"][:200] + "..."
|
||||||
|
if len(query["sql"]) > 200
|
||||||
|
else query["sql"]
|
||||||
|
),
|
||||||
|
"time": float(query["time"]),
|
||||||
|
}
|
||||||
|
for query in recent_queries
|
||||||
|
]
|
||||||
|
|
||||||
|
performance_data = {
|
||||||
|
"operation": operation_name,
|
||||||
|
"query_count": total_queries,
|
||||||
|
"execution_time": execution_time,
|
||||||
|
"queries": query_details if settings.DEBUG else [],
|
||||||
|
}
|
||||||
|
|
||||||
|
if total_queries > warn_threshold or execution_time > 1.0:
|
||||||
|
logger.warning(
|
||||||
|
f"Performance concern in {operation_name}: "
|
||||||
|
f"{total_queries} queries, {execution_time:.2f}s",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.debug(
|
||||||
|
f"Query tracking for {operation_name}: "
|
||||||
|
f"{total_queries} queries, {execution_time:.2f}s",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class PerformanceProfiler:
|
||||||
|
"""Advanced performance profiling with detailed metrics"""
|
||||||
|
|
||||||
|
def __init__(self, name: str):
|
||||||
|
self.name = name
|
||||||
|
self.start_time = None
|
||||||
|
self.end_time = None
|
||||||
|
self.checkpoints = []
|
||||||
|
self.initial_queries = 0
|
||||||
|
self.memory_usage = {}
|
||||||
|
|
||||||
|
def start(self):
|
||||||
|
"""Start profiling"""
|
||||||
|
self.start_time = time.time()
|
||||||
|
self.initial_queries = len(connection.queries)
|
||||||
|
|
||||||
|
# Track memory usage if psutil is available
|
||||||
|
try:
|
||||||
|
import psutil
|
||||||
|
|
||||||
|
process = psutil.Process()
|
||||||
|
self.memory_usage["start"] = process.memory_info().rss
|
||||||
|
except ImportError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
logger.debug(f"Started profiling: {self.name}")
|
||||||
|
|
||||||
|
def checkpoint(self, name: str):
|
||||||
|
"""Add a checkpoint"""
|
||||||
|
if self.start_time is None:
|
||||||
|
logger.warning(f"Checkpoint '{name}' called before profiling started")
|
||||||
|
return
|
||||||
|
|
||||||
|
current_time = time.time()
|
||||||
|
elapsed = current_time - self.start_time
|
||||||
|
queries_since_start = len(connection.queries) - self.initial_queries
|
||||||
|
|
||||||
|
checkpoint = {
|
||||||
|
"name": name,
|
||||||
|
"timestamp": current_time,
|
||||||
|
"elapsed_seconds": elapsed,
|
||||||
|
"queries_since_start": queries_since_start,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Memory usage if available
|
||||||
|
try:
|
||||||
|
import psutil
|
||||||
|
|
||||||
|
process = psutil.Process()
|
||||||
|
checkpoint["memory_rss"] = process.memory_info().rss
|
||||||
|
except ImportError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
self.checkpoints.append(checkpoint)
|
||||||
|
logger.debug(f"Checkpoint '{name}' at {elapsed:.3f}s")
|
||||||
|
|
||||||
|
def stop(self):
|
||||||
|
"""Stop profiling and log results"""
|
||||||
|
if self.start_time is None:
|
||||||
|
logger.warning("Profiling stopped before it was started")
|
||||||
|
return
|
||||||
|
|
||||||
|
self.end_time = time.time()
|
||||||
|
total_duration = self.end_time - self.start_time
|
||||||
|
total_queries = len(connection.queries) - self.initial_queries
|
||||||
|
|
||||||
|
# Final memory usage
|
||||||
|
try:
|
||||||
|
import psutil
|
||||||
|
|
||||||
|
process = psutil.Process()
|
||||||
|
self.memory_usage["end"] = process.memory_info().rss
|
||||||
|
except ImportError:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Create detailed profiling report
|
||||||
|
report = {
|
||||||
|
"profiler_name": self.name,
|
||||||
|
"total_duration": total_duration,
|
||||||
|
"total_queries": total_queries,
|
||||||
|
"checkpoints": self.checkpoints,
|
||||||
|
"memory_usage": self.memory_usage,
|
||||||
|
"queries_per_second": (
|
||||||
|
total_queries / total_duration if total_duration > 0 else 0
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
# Calculate checkpoint intervals
|
||||||
|
if len(self.checkpoints) > 1:
|
||||||
|
intervals = []
|
||||||
|
for i in range(1, len(self.checkpoints)):
|
||||||
|
prev = self.checkpoints[i - 1]
|
||||||
|
curr = self.checkpoints[i]
|
||||||
|
intervals.append(
|
||||||
|
{
|
||||||
|
"from": prev["name"],
|
||||||
|
"to": curr["name"],
|
||||||
|
"duration": curr["elapsed_seconds"] - prev["elapsed_seconds"],
|
||||||
|
"queries": curr["queries_since_start"]
|
||||||
|
- prev["queries_since_start"],
|
||||||
|
}
|
||||||
|
)
|
||||||
|
report["checkpoint_intervals"] = intervals
|
||||||
|
|
||||||
|
# Log the complete report
|
||||||
|
log_level = logging.WARNING if total_duration > 1.0 else logging.INFO
|
||||||
|
logger.log(
|
||||||
|
log_level,
|
||||||
|
f"Profiling complete: {
|
||||||
|
self.name} took {
|
||||||
|
total_duration:.3f}s with {total_queries} queries",
|
||||||
|
extra=report,
|
||||||
|
)
|
||||||
|
|
||||||
|
return report
|
||||||
|
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def profile_operation(name: str):
|
||||||
|
"""Context manager for detailed operation profiling"""
|
||||||
|
profiler = PerformanceProfiler(name)
|
||||||
|
profiler.start()
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield profiler
|
||||||
|
finally:
|
||||||
|
profiler.stop()
|
||||||
|
|
||||||
|
|
||||||
|
class DatabaseQueryAnalyzer:
|
||||||
|
"""Analyze database query patterns and performance"""
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def analyze_queries(queries: List[Dict]) -> Dict[str, Any]:
|
||||||
|
"""Analyze a list of queries for patterns and issues"""
|
||||||
|
if not queries:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
total_time = sum(float(q.get("time", 0)) for q in queries)
|
||||||
|
query_count = len(queries)
|
||||||
|
|
||||||
|
# Group queries by type
|
||||||
|
query_types = {}
|
||||||
|
for query in queries:
|
||||||
|
sql = query.get("sql", "").strip().upper()
|
||||||
|
query_type = sql.split()[0] if sql else "UNKNOWN"
|
||||||
|
query_types[query_type] = query_types.get(query_type, 0) + 1
|
||||||
|
|
||||||
|
# Find slow queries (top 10% by time)
|
||||||
|
sorted_queries = sorted(
|
||||||
|
queries, key=lambda q: float(q.get("time", 0)), reverse=True
|
||||||
|
)
|
||||||
|
slow_query_count = max(1, query_count // 10)
|
||||||
|
slow_queries = sorted_queries[:slow_query_count]
|
||||||
|
|
||||||
|
# Detect duplicate queries
|
||||||
|
query_signatures = {}
|
||||||
|
for query in queries:
|
||||||
|
# Simplified signature - remove literals and normalize whitespace
|
||||||
|
sql = query.get("sql", "")
|
||||||
|
signature = " ".join(sql.split()) # Normalize whitespace
|
||||||
|
query_signatures[signature] = query_signatures.get(signature, 0) + 1
|
||||||
|
|
||||||
|
duplicates = {
|
||||||
|
sig: count for sig, count in query_signatures.items() if count > 1
|
||||||
|
}
|
||||||
|
|
||||||
|
analysis = {
|
||||||
|
"total_queries": query_count,
|
||||||
|
"total_time": total_time,
|
||||||
|
"average_time": total_time / query_count if query_count > 0 else 0,
|
||||||
|
"query_types": query_types,
|
||||||
|
"slow_queries": [
|
||||||
|
{
|
||||||
|
"sql": (
|
||||||
|
q.get("sql", "")[:200] + "..."
|
||||||
|
if len(q.get("sql", "")) > 200
|
||||||
|
else q.get("sql", "")
|
||||||
|
),
|
||||||
|
"time": float(q.get("time", 0)),
|
||||||
|
}
|
||||||
|
for q in slow_queries
|
||||||
|
],
|
||||||
|
"duplicate_query_count": len(duplicates),
|
||||||
|
"duplicate_queries": (
|
||||||
|
duplicates
|
||||||
|
if len(duplicates) <= 10
|
||||||
|
else dict(list(duplicates.items())[:10])
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
return analysis
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def analyze_current_queries(cls) -> Dict[str, Any]:
|
||||||
|
"""Analyze the current request's queries"""
|
||||||
|
if hasattr(connection, "queries"):
|
||||||
|
return cls.analyze_queries(connection.queries)
|
||||||
|
return {}
|
||||||
|
|
||||||
|
|
||||||
|
# Performance monitoring decorators
|
||||||
|
def monitor_function_performance(operation_name: Optional[str] = None):
|
||||||
|
"""Decorator to monitor function performance"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
name = operation_name or f"{func.__module__}.{func.__name__}"
|
||||||
|
with monitor_performance(
|
||||||
|
name, function=func.__name__, module=func.__module__
|
||||||
|
):
|
||||||
|
return func(*args, **kwargs)
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def track_database_queries(warn_threshold: int = 10):
|
||||||
|
"""Decorator to track database queries for a function"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
@wraps(func)
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
operation_name = f"{func.__module__}.{func.__name__}"
|
||||||
|
with track_queries(operation_name, warn_threshold):
|
||||||
|
return func(*args, **kwargs)
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
# Performance metrics collection
|
||||||
|
class PerformanceMetrics:
|
||||||
|
"""Collect and aggregate performance metrics"""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
self.metrics = []
|
||||||
|
|
||||||
|
def record_metric(self, name: str, value: float, tags: Optional[Dict] = None):
|
||||||
|
"""Record a performance metric"""
|
||||||
|
metric = {
|
||||||
|
"name": name,
|
||||||
|
"value": value,
|
||||||
|
"timestamp": timezone.now().isoformat(),
|
||||||
|
"tags": tags or {},
|
||||||
|
}
|
||||||
|
self.metrics.append(metric)
|
||||||
|
|
||||||
|
# Log the metric
|
||||||
|
logger.info(f"Performance metric: {name} = {value}", extra=metric)
|
||||||
|
|
||||||
|
def get_metrics(self, name: Optional[str] = None) -> List[Dict]:
|
||||||
|
"""Get recorded metrics, optionally filtered by name"""
|
||||||
|
if name:
|
||||||
|
return [m for m in self.metrics if m["name"] == name]
|
||||||
|
return self.metrics.copy()
|
||||||
|
|
||||||
|
def clear_metrics(self):
|
||||||
|
"""Clear all recorded metrics"""
|
||||||
|
self.metrics.clear()
|
||||||
|
|
||||||
|
|
# Global performance metrics instance
performance_metrics = PerformanceMetrics()
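A short sketch of how these monitoring utilities are meant to be used. The import path assumes a configured Django project with `apps.core` on the path; the sleeps stand in for real work.

```python
import time

from apps.core.services.performance_monitoring import (
    monitor_performance,
    monitor_function_performance,
    profile_operation,
)

# Context-manager form: extra keyword tags end up on the structured log record
with monitor_performance("demo_operation", component="example"):
    time.sleep(0.1)  # stand-in for real work

# Decorator form: the operation name defaults to module.function
@monitor_function_performance()
def rebuild_example_index():
    time.sleep(0.05)  # stand-in for real work

rebuild_example_index()

# Detailed profiling with named checkpoints
with profile_operation("demo_profile") as profiler:
    time.sleep(0.05)
    profiler.checkpoint("first_phase")
    time.sleep(0.05)
    profiler.checkpoint("second_phase")
```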
backend/apps/core/tests.py (new file, 1 line)
@@ -0,0 +1 @@
# Create your tests here.
backend/apps/core/urls/map_urls.py (new file, 35 lines)
@@ -0,0 +1,35 @@
"""
URL patterns for the unified map service API.
"""

from django.urls import path
from ..views.map_views import (
    MapLocationsView,
    MapLocationDetailView,
    MapSearchView,
    MapBoundsView,
    MapStatsView,
    MapCacheView,
)

app_name = "map_api"

urlpatterns = [
    # Main map data endpoint
    path("locations/", MapLocationsView.as_view(), name="locations"),
    # Location detail endpoint
    path(
        "locations/<str:location_type>/<int:location_id>/",
        MapLocationDetailView.as_view(),
        name="location_detail",
    ),
    # Search endpoint
    path("search/", MapSearchView.as_view(), name="search"),
    # Bounds-based query endpoint
    path("bounds/", MapBoundsView.as_view(), name="bounds"),
    # Service statistics endpoint
    path("stats/", MapStatsView.as_view(), name="stats"),
    # Cache management endpoints
    path("cache/", MapCacheView.as_view(), name="cache"),
    path("cache/invalidate/", MapCacheView.as_view(), name="cache_invalidate"),
]
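These patterns only take effect once the module is included from the project-level URLconf, which is not part of this diff. A minimal sketch, assuming an `api/map/` mount point in `config/urls.py`:

```python
# config/urls.py (sketch; the real prefix may differ)
from django.urls import include, path

urlpatterns = [
    path("api/map/", include("apps.core.urls.map_urls")),
]
```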
backend/apps/core/urls/maps.py (new file, 39 lines)
@@ -0,0 +1,39 @@
"""
URL patterns for map views.
Includes both HTML views and HTMX endpoints.
"""

from django.urls import path
from ..views.maps import (
    UniversalMapView,
    ParkMapView,
    NearbyLocationsView,
    LocationFilterView,
    LocationSearchView,
    MapBoundsUpdateView,
    LocationDetailModalView,
    LocationListView,
)

app_name = "maps"

urlpatterns = [
    # Main map views
    path("", UniversalMapView.as_view(), name="universal_map"),
    path("parks/", ParkMapView.as_view(), name="park_map"),
    path("nearby/", NearbyLocationsView.as_view(), name="nearby_locations"),
    path("list/", LocationListView.as_view(), name="location_list"),
    # HTMX endpoints for dynamic updates
    path("htmx/filter/", LocationFilterView.as_view(), name="htmx_filter"),
    path("htmx/search/", LocationSearchView.as_view(), name="htmx_search"),
    path(
        "htmx/bounds/",
        MapBoundsUpdateView.as_view(),
        name="htmx_bounds_update",
    ),
    path(
        "htmx/location/<str:location_type>/<int:location_id>/",
        LocationDetailModalView.as_view(),
        name="htmx_location_detail",
    ),
]
backend/apps/core/urls/search.py (new file, 24 lines)
@@ -0,0 +1,24 @@
from django.urls import path
from apps.core.views.search import (
    AdaptiveSearchView,
    FilterFormView,
    LocationSearchView,
    LocationSuggestionsView,
)
from apps.rides.views import RideSearchView

app_name = "search"

urlpatterns = [
    path("parks/", AdaptiveSearchView.as_view(), name="search"),
    path("parks/filters/", FilterFormView.as_view(), name="filter_form"),
    path("rides/", RideSearchView.as_view(), name="ride_search"),
    path("rides/results/", RideSearchView.as_view(), name="ride_search_results"),
    # Location-aware search
    path("location/", LocationSearchView.as_view(), name="location_search"),
    path(
        "location/suggestions/",
        LocationSuggestionsView.as_view(),
        name="location_suggestions",
    ),
]
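Because these URLconfs declare an `app_name`, templates and views can resolve routes by namespace. A small sketch, assuming the modules are mounted under `/search/` and `/maps/` respectively (the actual prefixes are defined outside this diff):

```python
from django.urls import reverse

reverse("search:ride_search")  # e.g. "/search/rides/" if mounted at /search/
reverse(
    "maps:htmx_location_detail",
    kwargs={"location_type": "park", "location_id": 1},
)
```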
backend/apps/core/utils/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
# Core utilities
backend/apps/core/utils/query_optimization.py (new file, 432 lines)
@@ -0,0 +1,432 @@
|
|||||||
|
"""
|
||||||
|
Database query optimization utilities and helpers.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
import logging
|
||||||
|
from contextlib import contextmanager
|
||||||
|
from typing import Optional, Dict, Any, List, Type
|
||||||
|
from django.db import connection, models
|
||||||
|
from django.db.models import QuerySet, Prefetch, Count, Avg, Max
|
||||||
|
from django.conf import settings
|
||||||
|
from django.core.cache import cache
|
||||||
|
|
||||||
|
logger = logging.getLogger("query_optimization")
|
||||||
|
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def track_queries(
|
||||||
|
operation_name: str, warn_threshold: int = 10, time_threshold: float = 1.0
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Context manager to track database queries for specific operations
|
||||||
|
|
||||||
|
Args:
|
||||||
|
operation_name: Name of the operation being tracked
|
||||||
|
warn_threshold: Number of queries that triggers a warning
|
||||||
|
time_threshold: Execution time in seconds that triggers a warning
|
||||||
|
"""
|
||||||
|
if not settings.DEBUG:
|
||||||
|
yield
|
||||||
|
return
|
||||||
|
|
||||||
|
initial_queries = len(connection.queries)
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield
|
||||||
|
finally:
|
||||||
|
end_time = time.time()
|
||||||
|
total_queries = len(connection.queries) - initial_queries
|
||||||
|
execution_time = end_time - start_time
|
||||||
|
|
||||||
|
# Collect query details
|
||||||
|
query_details = []
|
||||||
|
if hasattr(connection, "queries") and total_queries > 0:
|
||||||
|
recent_queries = connection.queries[-total_queries:]
|
||||||
|
query_details = [
|
||||||
|
{
|
||||||
|
"sql": (
|
||||||
|
query["sql"][:500] + "..."
|
||||||
|
if len(query["sql"]) > 500
|
||||||
|
else query["sql"]
|
||||||
|
),
|
||||||
|
"time": float(query["time"]),
|
||||||
|
"duplicate_count": sum(
|
||||||
|
1 for q in recent_queries if q["sql"] == query["sql"]
|
||||||
|
),
|
||||||
|
}
|
||||||
|
for query in recent_queries
|
||||||
|
]
|
||||||
|
|
||||||
|
performance_data = {
|
||||||
|
"operation": operation_name,
|
||||||
|
"query_count": total_queries,
|
||||||
|
"execution_time": execution_time,
|
||||||
|
"queries": query_details if settings.DEBUG else [],
|
||||||
|
"slow_queries": [
|
||||||
|
q for q in query_details if q["time"] > 0.1
|
||||||
|
], # Queries slower than 100ms
|
||||||
|
}
|
||||||
|
|
||||||
|
# Log warnings for performance issues
|
||||||
|
if total_queries > warn_threshold or execution_time > time_threshold:
|
||||||
|
logger.warning(
|
||||||
|
f"Performance concern in {operation_name}: "
|
||||||
|
f"{total_queries} queries, {execution_time:.2f}s",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.debug(
|
||||||
|
f"Query tracking for {operation_name}: "
|
||||||
|
f"{total_queries} queries, {execution_time:.2f}s",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class QueryOptimizer:
|
||||||
|
"""Utility class for common query optimization patterns"""
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def optimize_park_queryset(queryset: QuerySet) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Optimize Park queryset with proper select_related and prefetch_related
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
queryset.select_related("location", "operator", "created_by")
|
||||||
|
.prefetch_related("areas", "rides__manufacturer", "reviews__user")
|
||||||
|
.annotate(
|
||||||
|
ride_count=Count("rides"),
|
||||||
|
average_rating=Avg("reviews__rating"),
|
||||||
|
latest_review_date=Max("reviews__created_at"),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def optimize_ride_queryset(queryset: QuerySet) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Optimize Ride queryset with proper relationships
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
queryset.select_related(
|
||||||
|
"park", "park__location", "manufacturer", "created_by"
|
||||||
|
)
|
||||||
|
.prefetch_related("reviews__user", "media_items")
|
||||||
|
.annotate(
|
||||||
|
review_count=Count("reviews"),
|
||||||
|
average_rating=Avg("reviews__rating"),
|
||||||
|
latest_review_date=Max("reviews__created_at"),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def optimize_user_queryset(queryset: QuerySet) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Optimize User queryset for profile views
|
||||||
|
"""
|
||||||
|
return queryset.prefetch_related(
|
||||||
|
Prefetch("park_reviews", to_attr="cached_park_reviews"),
|
||||||
|
Prefetch("ride_reviews", to_attr="cached_ride_reviews"),
|
||||||
|
"authored_parks",
|
||||||
|
"authored_rides",
|
||||||
|
).annotate(
|
||||||
|
total_reviews=Count("park_reviews") + Count("ride_reviews"),
|
||||||
|
parks_authored=Count("authored_parks"),
|
||||||
|
rides_authored=Count("authored_rides"),
|
||||||
|
)
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def create_bulk_queryset(model: Type[models.Model], ids: List[int]) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Create an optimized queryset for bulk operations
|
||||||
|
"""
|
||||||
|
queryset = model.objects.filter(id__in=ids)
|
||||||
|
|
||||||
|
# Apply model-specific optimizations
|
||||||
|
if hasattr(model, "_meta") and model._meta.model_name == "park":
|
||||||
|
return QueryOptimizer.optimize_park_queryset(queryset)
|
||||||
|
elif hasattr(model, "_meta") and model._meta.model_name == "ride":
|
||||||
|
return QueryOptimizer.optimize_ride_queryset(queryset)
|
||||||
|
elif hasattr(model, "_meta") and model._meta.model_name == "user":
|
||||||
|
return QueryOptimizer.optimize_user_queryset(queryset)
|
||||||
|
|
||||||
|
return queryset
|
||||||
|
|
||||||
|
|
||||||
|
class QueryCache:
|
||||||
|
"""Caching utilities for expensive queries"""
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def cache_queryset_result(
|
||||||
|
cache_key: str, queryset_func, timeout: int = 3600, **kwargs
|
||||||
|
):
|
||||||
|
"""
|
||||||
|
Cache the result of an expensive queryset operation
|
||||||
|
|
||||||
|
Args:
|
||||||
|
cache_key: Unique key for caching
|
||||||
|
queryset_func: Function that returns the queryset result
|
||||||
|
timeout: Cache timeout in seconds
|
||||||
|
**kwargs: Arguments to pass to queryset_func
|
||||||
|
"""
|
||||||
|
# Try to get from cache first
|
||||||
|
cached_result = cache.get(cache_key)
|
||||||
|
if cached_result is not None:
|
||||||
|
logger.debug(f"Cache hit for queryset: {cache_key}")
|
||||||
|
return cached_result
|
||||||
|
|
||||||
|
# Execute the expensive operation
|
||||||
|
with track_queries(f"cache_miss_{cache_key}"):
|
||||||
|
result = queryset_func(**kwargs)
|
||||||
|
|
||||||
|
# Cache the result
|
||||||
|
cache.set(cache_key, result, timeout)
|
||||||
|
logger.debug(f"Cached queryset result: {cache_key}")
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def invalidate_model_cache(model_name: str, instance_id: Optional[int] = None):
|
||||||
|
"""
|
||||||
|
Invalidate cache keys related to a specific model
|
||||||
|
|
||||||
|
Args:
|
||||||
|
model_name: Name of the model (e.g., 'park', 'ride')
|
||||||
|
instance_id: Specific instance ID, if applicable
|
||||||
|
"""
|
||||||
|
# Pattern-based cache invalidation (works with Redis)
|
||||||
|
if instance_id:
|
||||||
|
pattern = f"*{model_name}_{instance_id}*"
|
||||||
|
else:
|
||||||
|
pattern = f"*{model_name}*"
|
||||||
|
|
||||||
|
try:
|
||||||
|
# For Redis cache backends that support pattern deletion
|
||||||
|
if hasattr(cache, "delete_pattern"):
|
||||||
|
deleted_count = cache.delete_pattern(pattern)
|
||||||
|
logger.info(
|
||||||
|
f"Invalidated {deleted_count} cache keys for pattern: {pattern}"
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.warning(
|
||||||
|
f"Cache backend does not support pattern deletion: {pattern}"
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
logger.error(f"Error invalidating cache pattern {pattern}: {e}")
|
||||||
|
|
||||||
|
|
||||||
|
class IndexAnalyzer:
|
||||||
|
"""Analyze and suggest database indexes"""
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def analyze_slow_queries(min_time: float = 0.1) -> List[Dict[str, Any]]:
|
||||||
|
"""
|
||||||
|
Analyze slow queries from the current request
|
||||||
|
|
||||||
|
Args:
|
||||||
|
min_time: Minimum query time in seconds to consider "slow"
|
||||||
|
"""
|
||||||
|
if not hasattr(connection, "queries"):
|
||||||
|
return []
|
||||||
|
|
||||||
|
slow_queries = []
|
||||||
|
for query in connection.queries:
|
||||||
|
query_time = float(query.get("time", 0))
|
||||||
|
if query_time >= min_time:
|
||||||
|
slow_queries.append(
|
||||||
|
{
|
||||||
|
"sql": query["sql"],
|
||||||
|
"time": query_time,
|
||||||
|
"analysis": IndexAnalyzer._analyze_query_sql(query["sql"]),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
return slow_queries
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def _analyze_query_sql(sql: str) -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Analyze SQL to suggest potential optimizations
|
||||||
|
"""
|
||||||
|
sql_upper = sql.upper()
|
||||||
|
analysis = {
|
||||||
|
"has_where_clause": "WHERE" in sql_upper,
|
||||||
|
"has_join": any(
|
||||||
|
join in sql_upper
|
||||||
|
for join in ["JOIN", "INNER JOIN", "LEFT JOIN", "RIGHT JOIN"]
|
||||||
|
),
|
||||||
|
"has_order_by": "ORDER BY" in sql_upper,
|
||||||
|
"has_group_by": "GROUP BY" in sql_upper,
|
||||||
|
"has_like": "LIKE" in sql_upper,
|
||||||
|
"table_scans": [],
|
||||||
|
"suggestions": [],
|
||||||
|
}
|
||||||
|
|
||||||
|
# Detect potential table scans
|
||||||
|
if "WHERE" not in sql_upper and "SELECT COUNT(*) FROM" not in sql_upper:
|
||||||
|
analysis["table_scans"].append("Query may be doing a full table scan")
|
||||||
|
|
||||||
|
# Suggest indexes based on patterns
|
||||||
|
if analysis["has_where_clause"] and not analysis["has_join"]:
|
||||||
|
analysis["suggestions"].append(
|
||||||
|
"Consider adding indexes on WHERE clause columns"
|
||||||
|
)
|
||||||
|
|
||||||
|
if analysis["has_order_by"]:
|
||||||
|
analysis["suggestions"].append(
|
||||||
|
"Consider adding indexes on ORDER BY columns"
|
||||||
|
)
|
||||||
|
|
||||||
|
if analysis["has_like"] and "%" not in sql[: sql.find("LIKE") + 10]:
|
||||||
|
analysis["suggestions"].append(
|
||||||
|
"LIKE queries with leading wildcards cannot use indexes efficiently"
|
||||||
|
)
|
||||||
|
|
||||||
|
return analysis
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def suggest_model_indexes(model: Type[models.Model]) -> List[str]:
|
||||||
|
"""
|
||||||
|
Suggest database indexes for a Django model based on its fields
|
||||||
|
"""
|
||||||
|
suggestions = []
|
||||||
|
opts = model._meta
|
||||||
|
|
||||||
|
# Foreign key fields should have indexes (Django adds these
|
||||||
|
# automatically)
|
||||||
|
for field in opts.fields:
|
||||||
|
if isinstance(field, models.ForeignKey):
|
||||||
|
suggestions.append(
|
||||||
|
f"Index on {field.name} (automatically created by Django)"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Suggest composite indexes for common query patterns
|
||||||
|
date_fields = [
|
||||||
|
f.name
|
||||||
|
for f in opts.fields
|
||||||
|
if isinstance(f, (models.DateField, models.DateTimeField))
|
||||||
|
]
|
||||||
|
status_fields = [
|
||||||
|
f.name
|
||||||
|
for f in opts.fields
|
||||||
|
if f.name in ["status", "is_active", "is_published"]
|
||||||
|
]
|
||||||
|
|
||||||
|
if date_fields and status_fields:
|
||||||
|
for date_field in date_fields:
|
||||||
|
for status_field in status_fields:
|
||||||
|
suggestions.append(
|
||||||
|
f"Composite index on ({status_field}, {date_field}) for filtered date queries"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Suggest indexes for fields commonly used in WHERE clauses
|
||||||
|
common_filter_fields = ["slug", "name", "created_at", "updated_at"]
|
||||||
|
for field in opts.fields:
|
||||||
|
if field.name in common_filter_fields and not field.db_index:
|
||||||
|
suggestions.append(
|
||||||
|
f"Consider adding db_index=True to {
|
||||||
|
field.name}"
|
||||||
|
)
|
||||||
|
|
||||||
|
return suggestions
|
||||||
|
|
||||||
|
|
||||||
|
def log_query_performance():
|
||||||
|
"""Decorator to log query performance for a function"""
|
||||||
|
|
||||||
|
def decorator(func):
|
||||||
|
def wrapper(*args, **kwargs):
|
||||||
|
operation_name = f"{func.__module__}.{func.__name__}"
|
||||||
|
with track_queries(operation_name):
|
||||||
|
return func(*args, **kwargs)
|
||||||
|
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
return decorator
|
||||||
|
|
||||||
|
|
||||||
|
def optimize_queryset_for_serialization(
|
||||||
|
queryset: QuerySet, fields: List[str]
|
||||||
|
) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Optimize a queryset for API serialization by only selecting needed fields
|
||||||
|
|
||||||
|
Args:
|
||||||
|
queryset: The queryset to optimize
|
||||||
|
fields: List of field names that will be serialized
|
||||||
|
"""
|
||||||
|
# Extract foreign key fields that need select_related
|
||||||
|
model = queryset.model
|
||||||
|
opts = model._meta
|
||||||
|
|
||||||
|
select_related_fields = []
|
||||||
|
prefetch_related_fields = []
|
||||||
|
|
||||||
|
for field_name in fields:
|
||||||
|
try:
|
||||||
|
field = opts.get_field(field_name)
|
||||||
|
if isinstance(field, models.ForeignKey):
|
||||||
|
select_related_fields.append(field_name)
|
||||||
|
elif isinstance(
|
||||||
|
field, (models.ManyToManyField, models.reverse.ManyToManyRel)
|
||||||
|
):
|
||||||
|
prefetch_related_fields.append(field_name)
|
||||||
|
except models.FieldDoesNotExist:
|
||||||
|
# Field might be a property or method, skip optimization
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Apply optimizations
|
||||||
|
if select_related_fields:
|
||||||
|
queryset = queryset.select_related(*select_related_fields)
|
||||||
|
|
||||||
|
if prefetch_related_fields:
|
||||||
|
queryset = queryset.prefetch_related(*prefetch_related_fields)
|
||||||
|
|
||||||
|
return queryset
|
||||||
|
|
||||||
|
|
||||||
|
# Query performance monitoring context manager
|
||||||
|
@contextmanager
|
||||||
|
def monitor_db_performance(operation_name: str):
|
||||||
|
"""
|
||||||
|
Context manager that monitors database performance for an operation
|
||||||
|
"""
|
||||||
|
initial_queries = len(connection.queries) if hasattr(connection, "queries") else 0
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
try:
|
||||||
|
yield
|
||||||
|
finally:
|
||||||
|
end_time = time.time()
|
||||||
|
duration = end_time - start_time
|
||||||
|
|
||||||
|
if hasattr(connection, "queries"):
|
||||||
|
total_queries = len(connection.queries) - initial_queries
|
||||||
|
|
||||||
|
# Analyze queries for performance issues
|
||||||
|
slow_queries = IndexAnalyzer.analyze_slow_queries(0.05) # 50ms threshold
|
||||||
|
|
||||||
|
performance_data = {
|
||||||
|
"operation": operation_name,
|
||||||
|
"duration": duration,
|
||||||
|
"query_count": total_queries,
|
||||||
|
"slow_query_count": len(slow_queries),
|
||||||
|
# Limit to top 5 slow queries
|
||||||
|
"slow_queries": slow_queries[:5],
|
||||||
|
}
|
||||||
|
|
||||||
|
# Log performance data
|
||||||
|
if duration > 1.0 or total_queries > 15 or slow_queries:
|
||||||
|
logger.warning(
|
||||||
|
f"Performance issue in {operation_name}: "
|
||||||
|
f"{
|
||||||
|
duration:.3f}s, {total_queries} queries, {
|
||||||
|
len(slow_queries)} slow",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
logger.debug(
|
||||||
|
f"DB performance for {operation_name}: "
|
||||||
|
f"{duration:.3f}s, {total_queries} queries",
|
||||||
|
extra=performance_data,
|
||||||
|
)
|
||||||
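A sketch of how these optimization helpers are intended to be combined in a view or service. The helper names come from the module above; the `Park` import location and the `slug` lookup are assumptions about the surrounding project.

```python
from apps.core.utils.query_optimization import (
    QueryOptimizer,
    monitor_db_performance,
    track_queries,
)
from apps.parks.models import Park  # assumed model location

# Monitor a whole listing operation, including slow-query analysis
with monitor_db_performance("park_listing"):
    parks = QueryOptimizer.optimize_park_queryset(Park.objects.all())
    visible = list(parks[:25])  # evaluate inside the monitored block

# Track queries for a narrower block while DEBUG is on
with track_queries("park_detail", warn_threshold=5):
    park = QueryOptimizer.optimize_park_queryset(
        Park.objects.filter(slug="example")  # slug lookup assumed
    ).first()
```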
backend/apps/core/views/__init__.py (new file, 1 line)
@@ -0,0 +1 @@
# Core views
backend/apps/core/views/health_views.py (new file, 273 lines)
@@ -0,0 +1,273 @@
|
|||||||
|
"""
|
||||||
|
Enhanced health check views for API monitoring.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import time
|
||||||
|
from django.http import JsonResponse
|
||||||
|
from django.utils import timezone
|
||||||
|
from django.views import View
|
||||||
|
from django.conf import settings
|
||||||
|
from rest_framework.views import APIView
|
||||||
|
from rest_framework.response import Response
|
||||||
|
from rest_framework.permissions import AllowAny
|
||||||
|
from health_check.views import MainView
|
||||||
|
from apps.core.services.enhanced_cache_service import CacheMonitor
|
||||||
|
from apps.core.utils.query_optimization import IndexAnalyzer
|
||||||
|
|
||||||
|
|
||||||
|
class HealthCheckAPIView(APIView):
|
||||||
|
"""
|
||||||
|
Enhanced API endpoint for health checks with detailed JSON response
|
||||||
|
"""
|
||||||
|
|
||||||
|
permission_classes = [AllowAny] # Public endpoint
|
||||||
|
|
||||||
|
def get(self, request):
|
||||||
|
"""Return comprehensive health check information"""
|
||||||
|
start_time = time.time()
|
||||||
|
|
||||||
|
# Get basic health check results
|
||||||
|
main_view = MainView()
|
||||||
|
main_view.request = request
|
||||||
|
|
||||||
|
plugins = main_view.plugins
|
||||||
|
errors = main_view.errors
|
||||||
|
|
||||||
|
# Collect additional performance metrics
|
||||||
|
cache_monitor = CacheMonitor()
|
||||||
|
cache_stats = cache_monitor.get_cache_stats()
|
||||||
|
|
||||||
|
# Build comprehensive health data
|
||||||
|
health_data = {
|
||||||
|
"status": "healthy" if not errors else "unhealthy",
|
||||||
|
"timestamp": timezone.now().isoformat(),
|
||||||
|
"version": getattr(settings, "VERSION", "1.0.0"),
|
||||||
|
"environment": getattr(settings, "ENVIRONMENT", "development"),
|
||||||
|
"response_time_ms": 0, # Will be calculated at the end
|
||||||
|
"checks": {},
|
||||||
|
"metrics": {
|
||||||
|
"cache": cache_stats,
|
||||||
|
"database": self._get_database_metrics(),
|
||||||
|
"system": self._get_system_metrics(),
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
# Process individual health checks
|
||||||
|
for plugin in plugins:
|
||||||
|
plugin_name = plugin.identifier()
|
||||||
|
plugin_errors = errors.get(plugin.__class__.__name__, [])
|
||||||
|
|
||||||
|
health_data["checks"][plugin_name] = {
|
||||||
|
"status": "healthy" if not plugin_errors else "unhealthy",
|
||||||
|
"critical": getattr(plugin, "critical_service", False),
|
||||||
|
"errors": [str(error) for error in plugin_errors],
|
||||||
|
"response_time_ms": getattr(plugin, "_response_time", None),
|
||||||
|
}
|
||||||
|
|
||||||
|
# Calculate total response time
|
||||||
|
health_data["response_time_ms"] = round((time.time() - start_time) * 1000, 2)
|
||||||
|
|
||||||
|
# Determine HTTP status code
|
||||||
|
status_code = 200
|
||||||
|
if errors:
|
||||||
|
# Check if any critical services are failing
|
||||||
|
critical_errors = any(
|
||||||
|
getattr(plugin, "critical_service", False)
|
||||||
|
for plugin in plugins
|
||||||
|
if errors.get(plugin.__class__.__name__)
|
||||||
|
)
|
||||||
|
status_code = 503 if critical_errors else 200
|
||||||
|
|
||||||
|
return Response(health_data, status=status_code)
|
||||||
|
|
||||||
|
def _get_database_metrics(self):
|
||||||
|
"""Get database performance metrics"""
|
||||||
|
try:
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
# Get basic connection info
|
||||||
|
metrics = {
|
||||||
|
"vendor": connection.vendor,
|
||||||
|
"connection_status": "connected",
|
||||||
|
}
|
||||||
|
|
||||||
|
# Test query performance
|
||||||
|
start_time = time.time()
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT 1")
|
||||||
|
cursor.fetchone()
|
||||||
|
query_time = (time.time() - start_time) * 1000
|
||||||
|
|
||||||
|
metrics["test_query_time_ms"] = round(query_time, 2)
|
||||||
|
|
||||||
|
# PostgreSQL specific metrics
|
||||||
|
if connection.vendor == "postgresql":
|
||||||
|
try:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT
|
||||||
|
numbackends as active_connections,
|
||||||
|
xact_commit as transactions_committed,
|
||||||
|
xact_rollback as transactions_rolled_back,
|
||||||
|
blks_read as blocks_read,
|
||||||
|
blks_hit as blocks_hit
|
||||||
|
FROM pg_stat_database
|
||||||
|
WHERE datname = current_database()
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
row = cursor.fetchone()
|
||||||
|
if row:
|
||||||
|
metrics.update(
|
||||||
|
{
|
||||||
|
"active_connections": row[0],
|
||||||
|
"transactions_committed": row[1],
|
||||||
|
"transactions_rolled_back": row[2],
|
||||||
|
"cache_hit_ratio": (
|
||||||
|
round(
|
||||||
|
(row[4] / (row[3] + row[4])) * 100,
|
||||||
|
2,
|
||||||
|
)
|
||||||
|
if (row[3] + row[4]) > 0
|
||||||
|
else 0
|
||||||
|
),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
except Exception:
|
||||||
|
pass # Skip advanced metrics if not available
|
||||||
|
|
||||||
|
return metrics
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return {"connection_status": "error", "error": str(e)}
|
||||||
|
|
||||||
|
def _get_system_metrics(self):
|
||||||
|
"""Get system performance metrics"""
|
||||||
|
metrics = {
|
||||||
|
"debug_mode": settings.DEBUG,
|
||||||
|
"allowed_hosts": (settings.ALLOWED_HOSTS if settings.DEBUG else ["hidden"]),
|
||||||
|
}
|
||||||
|
|
||||||
|
try:
|
||||||
|
import psutil
|
||||||
|
|
||||||
|
# Memory metrics
|
||||||
|
memory = psutil.virtual_memory()
|
||||||
|
metrics["memory"] = {
|
||||||
|
"total_mb": round(memory.total / 1024 / 1024, 2),
|
||||||
|
"available_mb": round(memory.available / 1024 / 1024, 2),
|
||||||
|
"percent_used": memory.percent,
|
||||||
|
}
|
||||||
|
|
||||||
|
# CPU metrics
|
||||||
|
metrics["cpu"] = {
|
||||||
|
"percent_used": psutil.cpu_percent(interval=0.1),
|
||||||
|
"core_count": psutil.cpu_count(),
|
||||||
|
}
|
||||||
|
|
||||||
|
# Disk metrics
|
||||||
|
disk = psutil.disk_usage("/")
|
||||||
|
metrics["disk"] = {
|
||||||
|
"total_gb": round(disk.total / 1024 / 1024 / 1024, 2),
|
||||||
|
"free_gb": round(disk.free / 1024 / 1024 / 1024, 2),
|
||||||
|
"percent_used": round((disk.used / disk.total) * 100, 2),
|
||||||
|
}
|
||||||
|
|
||||||
|
except ImportError:
|
||||||
|
metrics["system_monitoring"] = "psutil not available"
|
||||||
|
except Exception as e:
|
||||||
|
metrics["system_error"] = str(e)
|
||||||
|
|
||||||
|
return metrics
|
||||||
|
|
||||||
|
|
||||||
|
class PerformanceMetricsView(APIView):
|
||||||
|
"""
|
||||||
|
API view for performance metrics and database analysis
|
||||||
|
"""
|
||||||
|
|
||||||
|
permission_classes = [AllowAny] if settings.DEBUG else []
|
||||||
|
|
||||||
|
def get(self, request):
|
||||||
|
"""Return performance metrics and analysis"""
|
||||||
|
if not settings.DEBUG:
|
||||||
|
return Response({"error": "Only available in debug mode"}, status=403)
|
||||||
|
|
||||||
|
metrics = {
|
||||||
|
"timestamp": timezone.now().isoformat(),
|
||||||
|
"database_analysis": self._get_database_analysis(),
|
||||||
|
"cache_performance": self._get_cache_performance(),
|
||||||
|
"recent_slow_queries": self._get_slow_queries(),
|
||||||
|
}
|
||||||
|
|
||||||
|
return Response(metrics)
|
||||||
|
|
||||||
|
def _get_database_analysis(self):
|
||||||
|
"""Analyze database performance"""
|
||||||
|
try:
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
analysis = {
|
||||||
|
"total_queries": len(connection.queries),
|
||||||
|
"query_analysis": IndexAnalyzer.analyze_slow_queries(0.05),
|
||||||
|
}
|
||||||
|
|
||||||
|
if connection.queries:
|
||||||
|
query_times = [float(q.get("time", 0)) for q in connection.queries]
|
||||||
|
analysis.update(
|
||||||
|
{
|
||||||
|
"total_query_time": sum(query_times),
|
||||||
|
"average_query_time": sum(query_times) / len(query_times),
|
||||||
|
"slowest_query_time": max(query_times),
|
||||||
|
"fastest_query_time": min(query_times),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
return analysis
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
return {"error": str(e)}
|
||||||
|
|
||||||
|
def _get_cache_performance(self):
|
||||||
|
"""Get cache performance metrics"""
|
||||||
|
try:
|
||||||
|
cache_monitor = CacheMonitor()
|
||||||
|
return cache_monitor.get_cache_stats()
|
||||||
|
except Exception as e:
|
||||||
|
return {"error": str(e)}
|
||||||
|
|
||||||
|
def _get_slow_queries(self):
|
||||||
|
"""Get recent slow queries"""
|
||||||
|
try:
|
||||||
|
return IndexAnalyzer.analyze_slow_queries(0.1) # 100ms threshold
|
||||||
|
except Exception as e:
|
||||||
|
return {"error": str(e)}
|
||||||
|
|
||||||
|
|
||||||
|
class SimpleHealthView(View):
|
||||||
|
"""
|
||||||
|
Simple health check endpoint for load balancers
|
||||||
|
"""
|
||||||
|
|
||||||
|
def get(self, request):
|
||||||
|
"""Return simple OK status"""
|
||||||
|
try:
|
||||||
|
# Basic database connectivity test
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("SELECT 1")
|
||||||
|
cursor.fetchone()
|
||||||
|
|
||||||
|
return JsonResponse(
|
||||||
|
{"status": "ok", "timestamp": timezone.now().isoformat()}
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
return JsonResponse(
|
||||||
|
{
|
||||||
|
"status": "error",
|
||||||
|
"error": str(e),
|
||||||
|
"timestamp": timezone.now().isoformat(),
|
||||||
|
},
|
||||||
|
status=503,
|
||||||
|
)
|
||||||
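The health views still need routes; that wiring is not part of this diff. A minimal sketch of how they might be exposed, with paths and names chosen for the example only:

```python
# config/urls.py (sketch; actual paths and names are assumptions)
from django.urls import path
from apps.core.views.health_views import (
    HealthCheckAPIView,
    PerformanceMetricsView,
    SimpleHealthView,
)

urlpatterns = [
    path("healthz/", SimpleHealthView.as_view(), name="healthz"),  # load balancer probe
    path("api/health/", HealthCheckAPIView.as_view(), name="health_api"),  # detailed JSON report
    path("api/health/metrics/", PerformanceMetricsView.as_view(), name="health_metrics"),  # DEBUG only
]
```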
backend/apps/core/views/map_views.py (new file, 699 lines)
@@ -0,0 +1,699 @@
"""
API views for the unified map service.
Enhanced with proper error handling, pagination, and performance optimizations.
"""

import json
import logging
from typing import Dict, Any, Optional
from django.http import JsonResponse, HttpRequest
from django.views.decorators.cache import cache_page
from django.views.decorators.gzip import gzip_page
from django.utils.decorators import method_decorator
from django.views import View
from django.core.exceptions import ValidationError
from django.conf import settings
import time

from ..services.map_service import unified_map_service
from ..services.data_structures import GeoBounds, MapFilters, LocationType

logger = logging.getLogger(__name__)


class MapAPIView(View):
    """Base view for map API endpoints with common functionality."""

    # Pagination settings
    DEFAULT_PAGE_SIZE = 50
    MAX_PAGE_SIZE = 200

    def dispatch(self, request, *args, **kwargs):
        """Add CORS headers, compression, and handle preflight requests."""
        start_time = time.time()

        try:
            response = super().dispatch(request, *args, **kwargs)

            # Add CORS headers for API access
            response["Access-Control-Allow-Origin"] = "*"
            response["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
            response["Access-Control-Allow-Headers"] = "Content-Type, Authorization"

            # Add performance headers
            response["X-Response-Time"] = f"{(time.time() - start_time) * 1000:.2f}ms"

            # Add compression hint for large responses
            if hasattr(response, "content") and len(response.content) > 1024:
                response["Content-Encoding"] = "gzip"

            return response

        except Exception as e:
            logger.error(f"API error in {request.path}: {str(e)}", exc_info=True)
            return self._error_response("An internal server error occurred", status=500)

    def options(self, request, *args, **kwargs):
        """Handle preflight CORS requests."""
        return JsonResponse({}, status=200)

    def _parse_bounds(self, request: HttpRequest) -> Optional[GeoBounds]:
        """Parse geographic bounds from request parameters."""
        try:
            north = request.GET.get("north")
            south = request.GET.get("south")
            east = request.GET.get("east")
            west = request.GET.get("west")

            if all(param is not None for param in [north, south, east, west]):
                bounds = GeoBounds(
                    north=float(north),
                    south=float(south),
                    east=float(east),
                    west=float(west),
                )

                # Validate bounds
                if not (-90 <= bounds.south <= bounds.north <= 90):
                    raise ValidationError("Invalid latitude bounds")
                if not (-180 <= bounds.west <= bounds.east <= 180):
                    raise ValidationError("Invalid longitude bounds")

                return bounds
            return None
        except (ValueError, TypeError) as e:
            raise ValidationError(f"Invalid bounds parameters: {e}")

    def _parse_pagination(self, request: HttpRequest) -> Dict[str, int]:
        """Parse pagination parameters from request."""
        try:
            page = max(1, int(request.GET.get("page", 1)))
            page_size = min(
                self.MAX_PAGE_SIZE,
                max(1, int(request.GET.get("page_size", self.DEFAULT_PAGE_SIZE))),
            )
            offset = (page - 1) * page_size

            return {
                "page": page,
                "page_size": page_size,
                "offset": offset,
                "limit": page_size,
            }
        except (ValueError, TypeError):
            return {
                "page": 1,
                "page_size": self.DEFAULT_PAGE_SIZE,
                "offset": 0,
                "limit": self.DEFAULT_PAGE_SIZE,
            }

    def _parse_filters(self, request: HttpRequest) -> Optional[MapFilters]:
        """Parse filtering parameters from request."""
        try:
            filters = MapFilters()

            # Location types
            location_types_param = request.GET.get("types")
            if location_types_param:
                type_strings = location_types_param.split(",")
                valid_types = {lt.value for lt in LocationType}
                filters.location_types = {
                    LocationType(t.strip())
                    for t in type_strings
                    if t.strip() in valid_types
                }

            # Park status
            park_status_param = request.GET.get("park_status")
            if park_status_param:
                filters.park_status = set(park_status_param.split(","))

            # Ride types
            ride_types_param = request.GET.get("ride_types")
            if ride_types_param:
                filters.ride_types = set(ride_types_param.split(","))

            # Company roles
            company_roles_param = request.GET.get("company_roles")
            if company_roles_param:
                filters.company_roles = set(company_roles_param.split(","))

            # Search query with length validation
            search_query = request.GET.get("q") or request.GET.get("search")
            if search_query and len(search_query.strip()) >= 2:
                filters.search_query = search_query.strip()

            # Rating filter with validation
            min_rating_param = request.GET.get("min_rating")
            if min_rating_param:
                min_rating = float(min_rating_param)
                if 0 <= min_rating <= 10:
                    filters.min_rating = min_rating

            # Geographic filters with validation
            country = request.GET.get("country", "").strip()
            if country and len(country) >= 2:
                filters.country = country

            state = request.GET.get("state", "").strip()
            if state and len(state) >= 2:
                filters.state = state

            city = request.GET.get("city", "").strip()
            if city and len(city) >= 2:
                filters.city = city

            # Coordinates requirement
            has_coordinates_param = request.GET.get("has_coordinates")
            if has_coordinates_param is not None:
                filters.has_coordinates = has_coordinates_param.lower() in [
                    "true",
                    "1",
                    "yes",
                ]

            return (
                filters
                if any(
                    [
                        filters.location_types,
                        filters.park_status,
                        filters.ride_types,
                        filters.company_roles,
                        filters.search_query,
                        filters.min_rating,
                        filters.country,
                        filters.state,
                        filters.city,
                    ]
                )
                else None
            )

        except (ValueError, TypeError) as e:
            raise ValidationError(f"Invalid filter parameters: {e}")

    def _parse_zoom_level(self, request: HttpRequest) -> int:
        """Parse zoom level from request with default."""
        try:
            zoom_param = request.GET.get("zoom", "10")
            zoom_level = int(zoom_param)
            return max(1, min(20, zoom_level))  # Clamp between 1 and 20
        except (ValueError, TypeError):
            return 10  # Default zoom level

    def _create_paginated_response(
        self,
        data: list,
        total_count: int,
        pagination: Dict[str, int],
        request: HttpRequest,
    ) -> Dict[str, Any]:
        """Create paginated response with metadata."""
        total_pages = (total_count + pagination["page_size"] - 1) // pagination["page_size"]

        # Build pagination URLs
        base_url = request.build_absolute_uri(request.path)
        query_params = request.GET.copy()

        next_url = None
        if pagination["page"] < total_pages:
            query_params["page"] = pagination["page"] + 1
            next_url = f"{base_url}?{query_params.urlencode()}"

        prev_url = None
        if pagination["page"] > 1:
            query_params["page"] = pagination["page"] - 1
            prev_url = f"{base_url}?{query_params.urlencode()}"

        return {
            "status": "success",
            "data": data,
            "pagination": {
                "page": pagination["page"],
                "page_size": pagination["page_size"],
                "total_pages": total_pages,
                "total_count": total_count,
                "has_next": pagination["page"] < total_pages,
                "has_previous": pagination["page"] > 1,
                "next_url": next_url,
                "previous_url": prev_url,
            },
        }

    def _error_response(
        self,
        message: str,
        status: int = 400,
        error_code: str = None,
        details: Dict[str, Any] = None,
    ) -> JsonResponse:
        """Return standardized error response with enhanced information."""
        response_data = {
            "status": "error",
            "message": message,
            "timestamp": time.time(),
            "data": None,
        }

        if error_code:
            response_data["error_code"] = error_code

        if details:
            response_data["details"] = details

        # Add request ID for debugging in production
        if hasattr(settings, "DEBUG") and not settings.DEBUG:
            response_data["request_id"] = getattr(self.request, "id", None)

        return JsonResponse(response_data, status=status)

    def _success_response(
        self, data: Any, message: str = None, metadata: Dict[str, Any] = None
    ) -> JsonResponse:
        """Return standardized success response."""
        response_data = {
            "status": "success",
            "data": data,
            "timestamp": time.time(),
        }

        if message:
            response_data["message"] = message

        if metadata:
            response_data["metadata"] = metadata

        return JsonResponse(response_data)

class MapLocationsView(MapAPIView):
    """
    API endpoint for getting map locations with optional clustering.

    GET /api/map/locations/
    Parameters:
    - north, south, east, west: Bounding box coordinates
    - zoom: Zoom level (1-20)
    - types: Comma-separated location types (park,ride,company,generic)
    - cluster: Whether to enable clustering (true/false)
    - q: Search query
    - park_status: Park status filter
    - ride_types: Ride type filter
    - min_rating: Minimum rating filter
    - country, state, city: Geographic filters
    """

    @method_decorator(cache_page(300))  # Cache for 5 minutes
    @method_decorator(gzip_page)  # Compress large responses
    def get(self, request: HttpRequest) -> JsonResponse:
        """Get map locations with optional clustering and filtering."""
        try:
            # Parse parameters
            bounds = self._parse_bounds(request)
            filters = self._parse_filters(request)
            zoom_level = self._parse_zoom_level(request)
            pagination = self._parse_pagination(request)

            # Clustering preference
            cluster_param = request.GET.get("cluster", "true")
            enable_clustering = cluster_param.lower() in ["true", "1", "yes"]

            # Cache preference
            use_cache_param = request.GET.get("cache", "true")
            use_cache = use_cache_param.lower() in ["true", "1", "yes"]

            # Validate request
            if not enable_clustering and not bounds and not filters:
                return self._error_response(
                    "Either bounds, filters, or clustering must be specified for non-clustered requests",
                    error_code="MISSING_PARAMETERS",
                )

            # Get map data
            response = unified_map_service.get_map_data(
                bounds=bounds,
                filters=filters,
                zoom_level=zoom_level,
                cluster=enable_clustering,
                use_cache=use_cache,
            )

            # Handle pagination for non-clustered results
            if not enable_clustering and response.locations:
                start_idx = pagination["offset"]
                end_idx = start_idx + pagination["limit"]
                paginated_locations = response.locations[start_idx:end_idx]

                return JsonResponse(
                    self._create_paginated_response(
                        [loc.to_dict() for loc in paginated_locations],
                        len(response.locations),
                        pagination,
                        request,
                    )
                )

            # For clustered results, return as-is with metadata
            response_dict = response.to_dict()

            return self._success_response(
                response_dict,
                metadata={
                    "clustered": response.clustered,
                    "cache_hit": response.cache_hit,
                    "query_time_ms": response.query_time_ms,
                    "filters_applied": response.filters_applied,
                },
            )

        except ValidationError as e:
            logger.warning(f"Validation error in MapLocationsView: {str(e)}")
            return self._error_response(str(e), 400, error_code="VALIDATION_ERROR")
        except Exception as e:
            logger.error(f"Error in MapLocationsView: {str(e)}", exc_info=True)
            return self._error_response(
                "Failed to retrieve map locations",
                500,
                error_code="INTERNAL_ERROR",
            )


class MapLocationDetailView(MapAPIView):
    """
    API endpoint for getting detailed information about a specific location.

    GET /api/map/locations/<type>/<id>/
    """

    @method_decorator(cache_page(600))  # Cache for 10 minutes
    def get(
        self, request: HttpRequest, location_type: str, location_id: int
    ) -> JsonResponse:
        """Get detailed information for a specific location."""
        try:
            # Validate location type
            valid_types = [lt.value for lt in LocationType]
            if location_type not in valid_types:
                return self._error_response(
                    f"Invalid location type: {location_type}. Valid types: {', '.join(valid_types)}",
                    400,
                    error_code="INVALID_LOCATION_TYPE",
                )

            # Validate location ID
            if location_id <= 0:
                return self._error_response(
                    "Location ID must be a positive integer",
                    400,
                    error_code="INVALID_LOCATION_ID",
                )

            # Get location details
            location = unified_map_service.get_location_details(
                location_type, location_id
            )

            if not location:
                return self._error_response(
                    f"Location not found: {location_type}/{location_id}",
                    404,
                    error_code="LOCATION_NOT_FOUND",
                )

            return self._success_response(
                location.to_dict(),
                metadata={
                    "location_type": location_type,
                    "location_id": location_id,
                },
            )

        except ValueError as e:
            logger.warning(f"Value error in MapLocationDetailView: {str(e)}")
            return self._error_response(str(e), 400, error_code="INVALID_PARAMETER")
        except Exception as e:
            logger.error(f"Error in MapLocationDetailView: {str(e)}", exc_info=True)
            return self._error_response(
                "Failed to retrieve location details",
                500,
                error_code="INTERNAL_ERROR",
            )


class MapSearchView(MapAPIView):
    """
    API endpoint for searching locations by text query.

    GET /api/map/search/
    Parameters:
    - q: Search query (required)
    - north, south, east, west: Optional bounding box
    - types: Comma-separated location types
    - limit: Maximum results (default 50)
    """

    @method_decorator(gzip_page)  # Compress responses
    def get(self, request: HttpRequest) -> JsonResponse:
        """Search locations by text query with pagination."""
        try:
            # Get and validate search query
            query = request.GET.get("q", "").strip()
            if not query:
                return self._error_response(
                    "Search query 'q' parameter is required",
                    400,
                    error_code="MISSING_QUERY",
                )

            if len(query) < 2:
                return self._error_response(
                    "Search query must be at least 2 characters long",
                    400,
                    error_code="QUERY_TOO_SHORT",
                )

            # Parse parameters
            bounds = self._parse_bounds(request)
            pagination = self._parse_pagination(request)

            # Parse location types
            location_types = None
            types_param = request.GET.get("types")
            if types_param:
                try:
                    valid_types = {lt.value for lt in LocationType}
                    location_types = {
                        LocationType(t.strip())
                        for t in types_param.split(",")
                        if t.strip() in valid_types
                    }
                except ValueError:
                    return self._error_response(
                        "Invalid location types",
                        400,
                        error_code="INVALID_TYPES",
                    )

            # Set reasonable search limit (higher for search than general listings)
            search_limit = min(500, pagination["page"] * pagination["page_size"])

            # Perform search
            locations = unified_map_service.search_locations(
                query=query,
                bounds=bounds,
                location_types=location_types,
                limit=search_limit,
            )

            # Apply pagination
            start_idx = pagination["offset"]
            end_idx = start_idx + pagination["limit"]
            paginated_locations = locations[start_idx:end_idx]

            return JsonResponse(
                self._create_paginated_response(
                    [loc.to_dict() for loc in paginated_locations],
                    len(locations),
                    pagination,
                    request,
                )
            )

        except ValidationError as e:
            logger.warning(f"Validation error in MapSearchView: {str(e)}")
            return self._error_response(str(e), 400, error_code="VALIDATION_ERROR")
        except ValueError as e:
            logger.warning(f"Value error in MapSearchView: {str(e)}")
            return self._error_response(str(e), 400, error_code="INVALID_PARAMETER")
        except Exception as e:
            logger.error(f"Error in MapSearchView: {str(e)}", exc_info=True)
            return self._error_response(
                "Search failed due to internal error",
                500,
                error_code="SEARCH_FAILED",
            )


class MapBoundsView(MapAPIView):
    """
    API endpoint for getting locations within specific bounds.

    GET /api/map/bounds/
    Parameters:
    - north, south, east, west: Bounding box coordinates (required)
    - types: Comma-separated location types
    - zoom: Zoom level
    """

    @method_decorator(cache_page(300))  # Cache for 5 minutes
    def get(self, request: HttpRequest) -> JsonResponse:
        """Get locations within specific geographic bounds."""
        try:
            # Parse required bounds
            bounds = self._parse_bounds(request)
            if not bounds:
                return self._error_response(
                    "Bounds parameters required: north, south, east, west", 400
                )

            # Parse optional filters
            location_types = None
            types_param = request.GET.get("types")
            if types_param:
                location_types = {
                    LocationType(t.strip())
                    for t in types_param.split(",")
                    if t.strip() in [lt.value for lt in LocationType]
                }

            zoom_level = self._parse_zoom_level(request)

            # Get locations within bounds
            response = unified_map_service.get_locations_by_bounds(
                north=bounds.north,
                south=bounds.south,
                east=bounds.east,
                west=bounds.west,
                location_types=location_types,
                zoom_level=zoom_level,
            )

            return JsonResponse(response.to_dict())

        except ValidationError as e:
            return self._error_response(str(e), 400)
        except Exception as e:
            return self._error_response(f"Internal server error: {str(e)}", 500)


class MapStatsView(MapAPIView):
    """
    API endpoint for getting map service statistics and health information.

    GET /api/map/stats/
    """

    def get(self, request: HttpRequest) -> JsonResponse:
        """Get map service statistics and performance metrics."""
        try:
            stats = unified_map_service.get_service_stats()

            return JsonResponse({"status": "success", "data": stats})

        except Exception as e:
            return self._error_response(f"Internal server error: {str(e)}", 500)


class MapCacheView(MapAPIView):
    """
    API endpoint for cache management (admin only).

    DELETE /api/map/cache/
    POST /api/map/cache/invalidate/
    """

    def delete(self, request: HttpRequest) -> JsonResponse:
        """Clear all map cache (admin only)."""
        # TODO: Add admin permission check
        try:
            unified_map_service.invalidate_cache()

            return JsonResponse(
                {
                    "status": "success",
                    "message": "Map cache cleared successfully",
                }
            )

        except Exception as e:
            return self._error_response(f"Internal server error: {str(e)}", 500)

    def post(self, request: HttpRequest) -> JsonResponse:
        """Invalidate specific cache entries."""
        # TODO: Add admin permission check
        try:
            data = json.loads(request.body)

            location_type = data.get("location_type")
            location_id = data.get("location_id")
            bounds_data = data.get("bounds")

            bounds = None
            if bounds_data:
                bounds = GeoBounds(**bounds_data)

            unified_map_service.invalidate_cache(
                location_type=location_type,
                location_id=location_id,
                bounds=bounds,
            )

            return JsonResponse(
                {
                    "status": "success",
                    "message": "Cache invalidated successfully",
                }
            )

        except (json.JSONDecodeError, TypeError, ValueError) as e:
            return self._error_response(f"Invalid request data: {str(e)}", 400)
        except Exception as e:
            return self._error_response(f"Internal server error: {str(e)}", 500)
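The docstrings above define the query parameters each endpoint accepts. As a rough illustration of a client round-trip, the endpoints can be exercised with Django's test client; the paths are taken from the view docstrings, but the actual URL wiring is not shown in this diff, so treat the routes as an assumption.

```python
# Sketch only: calling the map API endpoints documented above with Django's
# test client. Paths come from the view docstrings; the project's urls.py is
# not part of this diff.
from django.test import Client

client = Client()

# Clustered locations for a bounding box around Orlando at zoom 10.
locations = client.get(
    "/api/map/locations/",
    {
        "north": 28.6, "south": 28.2, "east": -81.2, "west": -81.7,
        "zoom": 10,
        "types": "park,ride",
        "cluster": "true",
    },
)

# Text search, 25 results per page; non-clustered responses use the
# standardized envelope built by _create_paginated_response().
results = client.get(
    "/api/map/search/",
    {"q": "steel coaster", "types": "ride", "page": 1, "page_size": 25},
)

for payload in (locations.json(), results.json()):
    assert payload["status"] in {"success", "error"}
```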
421
backend/apps/core/views/maps.py
Normal file
@@ -0,0 +1,421 @@
"""
HTML views for the unified map service.
Provides web interfaces for map functionality with HTMX integration.
"""

import json
from typing import Dict, Any, Optional, Set
from django.shortcuts import render
from django.http import JsonResponse, HttpRequest, HttpResponse
from django.views.generic import TemplateView, View
from django.core.paginator import Paginator

from ..services.map_service import unified_map_service
from ..services.data_structures import GeoBounds, MapFilters, LocationType


class MapViewMixin:
    """Mixin providing common functionality for map views."""

    def get_map_context(self, request: HttpRequest) -> Dict[str, Any]:
        """Get common context data for map views."""
        return {
            "map_api_urls": {
                "locations": "/api/map/locations/",
                "search": "/api/map/search/",
                "bounds": "/api/map/bounds/",
                "location_detail": "/api/map/locations/",
            },
            "location_types": [lt.value for lt in LocationType],
            "default_zoom": 10,
            "enable_clustering": True,
            "enable_search": True,
        }

    def parse_location_types(self, request: HttpRequest) -> Optional[Set[LocationType]]:
        """Parse location types from request parameters."""
        types_param = request.GET.get("types")
        if types_param:
            try:
                return {
                    LocationType(t.strip())
                    for t in types_param.split(",")
                    if t.strip() in [lt.value for lt in LocationType]
                }
            except ValueError:
                return None
        return None


class UniversalMapView(MapViewMixin, TemplateView):
    """
    Main universal map view showing all location types.

    URL: /maps/
    """

    template_name = "maps/universal_map.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context.update(self.get_map_context(self.request))

        # Additional context for universal map
        context.update(
            {
                "page_title": "Interactive Map - All Locations",
                "map_type": "universal",
                "show_all_types": True,
                "initial_location_types": [lt.value for lt in LocationType],
                "filters_enabled": True,
            }
        )

        # Handle initial bounds from query parameters
        if all(
            param in self.request.GET for param in ["north", "south", "east", "west"]
        ):
            try:
                context["initial_bounds"] = {
                    "north": float(self.request.GET["north"]),
                    "south": float(self.request.GET["south"]),
                    "east": float(self.request.GET["east"]),
                    "west": float(self.request.GET["west"]),
                }
            except (ValueError, TypeError):
                pass

        return context


class ParkMapView(MapViewMixin, TemplateView):
    """
    Map view focused specifically on parks.

    URL: /maps/parks/
    """

    template_name = "maps/park_map.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context.update(self.get_map_context(self.request))

        # Park-specific context
        context.update(
            {
                "page_title": "Theme Parks Map",
                "map_type": "parks",
                "show_all_types": False,
                "initial_location_types": [LocationType.PARK.value],
                "filters_enabled": True,
                "park_specific_filters": True,
            }
        )

        return context


class NearbyLocationsView(MapViewMixin, TemplateView):
    """
    View for showing locations near a specific point.

    URL: /maps/nearby/
    """

    template_name = "maps/nearby_locations.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context.update(self.get_map_context(self.request))

        # Parse coordinates from query parameters
        lat = self.request.GET.get("lat")
        lng = self.request.GET.get("lng")
        radius = self.request.GET.get("radius", "50")  # Default 50km radius

        if lat and lng:
            try:
                center_lat = float(lat)
                center_lng = float(lng)
                # Clamp between 1-200km
                search_radius = min(200, max(1, float(radius)))

                context.update(
                    {
                        "page_title": f"Locations Near {center_lat:.4f}, {center_lng:.4f}",
                        "map_type": "nearby",
                        "center_coordinates": {
                            "lat": center_lat,
                            "lng": center_lng,
                        },
                        "search_radius": search_radius,
                        "show_radius_circle": True,
                    }
                )
            except (ValueError, TypeError):
                context["error"] = "Invalid coordinates provided"
        else:
            context.update(
                {
                    "page_title": "Nearby Locations",
                    "map_type": "nearby",
                    "prompt_for_location": True,
                }
            )

        return context


class LocationFilterView(MapViewMixin, View):
    """
    HTMX endpoint for updating map when filters change.

    URL: /maps/htmx/filter/
    """

    def get(self, request: HttpRequest) -> HttpResponse:
        """Return filtered location data for HTMX updates."""
        try:
            # Parse filter parameters
            location_types = self.parse_location_types(request)
            search_query = request.GET.get("q", "").strip()
            country = request.GET.get("country", "").strip()
            state = request.GET.get("state", "").strip()

            # Create filters
            filters = None
            if any([location_types, search_query, country, state]):
                filters = MapFilters(
                    location_types=location_types,
                    search_query=search_query or None,
                    country=country or None,
                    state=state or None,
                    has_coordinates=True,
                )

            # Get filtered locations
            map_response = unified_map_service.get_map_data(
                filters=filters,
                zoom_level=int(request.GET.get("zoom", "10")),
                cluster=request.GET.get("cluster", "true").lower() == "true",
            )

            # Return JSON response for HTMX
            return JsonResponse(
                {
                    "status": "success",
                    "data": map_response.to_dict(),
                    "filters_applied": map_response.filters_applied,
                }
            )

        except Exception as e:
            return JsonResponse({"status": "error", "message": str(e)}, status=400)

class LocationSearchView(MapViewMixin, View):
    """
    HTMX endpoint for real-time location search.

    URL: /maps/htmx/search/
    """

    def get(self, request: HttpRequest) -> HttpResponse:
        """Return search results for HTMX updates."""
        query = request.GET.get("q", "").strip()

        if not query or len(query) < 3:
            return render(
                request,
                "maps/partials/search_results.html",
                {
                    "results": [],
                    "query": query,
                    "message": "Enter at least 3 characters to search",
                },
            )

        try:
            # Parse optional location types
            location_types = self.parse_location_types(request)
            limit = min(20, max(5, int(request.GET.get("limit", "10"))))

            # Perform search
            results = unified_map_service.search_locations(
                query=query, location_types=location_types, limit=limit
            )

            return render(
                request,
                "maps/partials/search_results.html",
                {"results": results, "query": query, "count": len(results)},
            )

        except Exception as e:
            return render(
                request,
                "maps/partials/search_results.html",
                {"results": [], "query": query, "error": str(e)},
            )


class MapBoundsUpdateView(MapViewMixin, View):
    """
    HTMX endpoint for updating locations when map bounds change.

    URL: /maps/htmx/bounds/
    """

    def post(self, request: HttpRequest) -> HttpResponse:
        """Update map data when bounds change."""
        try:
            data = json.loads(request.body)

            # Parse bounds
            bounds = GeoBounds(
                north=float(data["north"]),
                south=float(data["south"]),
                east=float(data["east"]),
                west=float(data["west"]),
            )

            # Parse additional parameters
            zoom_level = int(data.get("zoom", 10))
            location_types = None
            if "types" in data:
                location_types = {
                    LocationType(t)
                    for t in data["types"]
                    if t in [lt.value for lt in LocationType]
                }

            # Location types are used directly in the service call

            # Get updated map data
            map_response = unified_map_service.get_locations_by_bounds(
                north=bounds.north,
                south=bounds.south,
                east=bounds.east,
                west=bounds.west,
                location_types=location_types,
                zoom_level=zoom_level,
            )

            return JsonResponse({"status": "success", "data": map_response.to_dict()})

        except (json.JSONDecodeError, ValueError, KeyError) as e:
            return JsonResponse(
                {
                    "status": "error",
                    "message": f"Invalid request data: {str(e)}",
                },
                status=400,
            )
        except Exception as e:
            return JsonResponse({"status": "error", "message": str(e)}, status=500)


class LocationDetailModalView(MapViewMixin, View):
    """
    HTMX endpoint for showing location details in modal.

    URL: /maps/htmx/location/<type>/<id>/
    """

    def get(
        self, request: HttpRequest, location_type: str, location_id: int
    ) -> HttpResponse:
        """Return location detail modal content."""
        try:
            # Validate location type
            if location_type not in [lt.value for lt in LocationType]:
                return render(
                    request,
                    "maps/partials/location_modal.html",
                    {"error": f"Invalid location type: {location_type}"},
                )

            # Get location details
            location = unified_map_service.get_location_details(
                location_type, location_id
            )

            if not location:
                return render(
                    request,
                    "maps/partials/location_modal.html",
                    {"error": "Location not found"},
                )

            return render(
                request,
                "maps/partials/location_modal.html",
                {"location": location, "location_type": location_type},
            )

        except Exception as e:
            return render(
                request, "maps/partials/location_modal.html", {"error": str(e)}
            )


class LocationListView(MapViewMixin, TemplateView):
    """
    View for listing locations with pagination (non-map view).

    URL: /maps/list/
    """

    template_name = "maps/location_list.html"
    paginate_by = 20

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)

        # Parse filters
        location_types = self.parse_location_types(self.request)
        search_query = self.request.GET.get("q", "").strip()
        country = self.request.GET.get("country", "").strip()
        state = self.request.GET.get("state", "").strip()

        # Create filters
        filters = None
        if any([location_types, search_query, country, state]):
            filters = MapFilters(
                location_types=location_types,
                search_query=search_query or None,
                country=country or None,
                state=state or None,
                has_coordinates=True,
            )

        # Get locations without clustering
        map_response = unified_map_service.get_map_data(
            filters=filters, cluster=False, use_cache=True
        )

        # Paginate results
        paginator = Paginator(map_response.locations, self.paginate_by)
        page_number = self.request.GET.get("page")
        page_obj = paginator.get_page(page_number)

        context.update(
            {
                "page_title": "All Locations",
                "locations": page_obj,
                "total_count": map_response.total_count,
                "applied_filters": filters,
                "location_types": [lt.value for lt in LocationType],
                "current_filters": {
                    "types": self.request.GET.getlist("types"),
                    "q": search_query,
                    "country": country,
                    "state": state,
                },
            }
        )

        return context
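Unlike the filter and search endpoints, MapBoundsUpdateView reads a JSON body rather than query parameters. A hedged sketch of the payload it parses follows; the field names mirror what `post()` reads from the request body, and the `/maps/htmx/bounds/` path comes from its docstring, but the actual routing is not part of this diff.

```python
# Sketch: posting a bounds update to the HTMX endpoint defined above.
# Payload keys (north/south/east/west, zoom, types) mirror MapBoundsUpdateView.post().
import json
from django.test import Client

client = Client()
payload = {
    "north": 51.6, "south": 51.4, "east": 0.2, "west": -0.5,
    "zoom": 12,
    "types": ["park"],  # values must match LocationType members to be applied
}
response = client.post(
    "/maps/htmx/bounds/",
    data=json.dumps(payload),
    content_type="application/json",
)
print(response.json()["status"])
```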
178
backend/apps/core/views/search.py
Normal file
@@ -0,0 +1,178 @@
from django.views.generic import TemplateView
from django.http import JsonResponse
from django.contrib.gis.geos import Point
from apps.parks.models import Park
from apps.parks.filters import ParkFilter
from apps.core.services.location_search import (
    location_search_service,
    LocationSearchFilters,
)
from apps.core.forms.search import LocationSearchForm


class AdaptiveSearchView(TemplateView):
    template_name = "core/search/results.html"

    def get_queryset(self):
        """
        Get the base queryset, optimized with select_related and prefetch_related
        """
        return (
            Park.objects.select_related("operator", "property_owner")
            .prefetch_related("location", "photos")
            .all()
        )

    def get_filterset(self):
        """
        Get the filterset instance
        """
        return ParkFilter(self.request.GET, queryset=self.get_queryset())

    def get_context_data(self, **kwargs):
        """
        Add filtered results and filter form to context
        """
        context = super().get_context_data(**kwargs)
        filterset = self.get_filterset()

        # Check if location-based search is being used
        location_search = self.request.GET.get("location_search", "").strip()
        near_location = self.request.GET.get("near_location", "").strip()

        # Add location search context
        context.update(
            {
                "results": filterset.qs,
                "filters": filterset,
                "applied_filters": bool(self.request.GET),  # Check if any filters are applied
                "is_location_search": bool(location_search or near_location),
                "location_search_query": location_search or near_location,
            }
        )

        return context


class FilterFormView(TemplateView):
    """
    View for rendering just the filter form for HTMX updates
    """

    template_name = "core/search/filters.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        filterset = ParkFilter(self.request.GET, queryset=Park.objects.all())
        context["filters"] = filterset
        return context


class LocationSearchView(TemplateView):
    """
    Enhanced search view with comprehensive location search capabilities.
    """

    template_name = "core/search/location_results.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)

        # Build search filters from request parameters
        filters = self._build_search_filters()

        # Perform search
        results = location_search_service.search(filters)

        # Group results by type for better presentation
        grouped_results = {
            "parks": [r for r in results if r.content_type == "park"],
            "rides": [r for r in results if r.content_type == "ride"],
            "companies": [r for r in results if r.content_type == "company"],
        }

        context.update(
            {
                "results": results,
                "grouped_results": grouped_results,
                "total_results": len(results),
                "search_filters": filters,
                "has_location_filter": bool(filters.location_point),
                "search_form": LocationSearchForm(self.request.GET),
            }
        )

        return context

    def _build_search_filters(self) -> LocationSearchFilters:
        """Build LocationSearchFilters from request parameters."""
        form = LocationSearchForm(self.request.GET)
        form.is_valid()  # Populate cleaned_data

        # Parse location coordinates if provided
        location_point = None
        lat = form.cleaned_data.get("lat")
        lng = form.cleaned_data.get("lng")
        if lat and lng:
            try:
                location_point = Point(float(lng), float(lat), srid=4326)
            except (ValueError, TypeError):
                location_point = None

        # Parse location types
        location_types = set()
        if form.cleaned_data.get("search_parks"):
            location_types.add("park")
        if form.cleaned_data.get("search_rides"):
            location_types.add("ride")
        if form.cleaned_data.get("search_companies"):
            location_types.add("company")

        # If no specific types selected, search all
        if not location_types:
            location_types = {"park", "ride", "company"}

        # Parse radius
        radius_km = None
        radius_str = form.cleaned_data.get("radius_km", "").strip()
        if radius_str:
            try:
                radius_km = float(radius_str)
                # Clamp between 1-500km
                radius_km = max(1, min(500, radius_km))
            except (ValueError, TypeError):
                radius_km = None

        return LocationSearchFilters(
            search_query=form.cleaned_data.get("q", "").strip() or None,
            location_point=location_point,
            radius_km=radius_km,
            location_types=location_types if location_types else None,
            country=form.cleaned_data.get("country", "").strip() or None,
            state=form.cleaned_data.get("state", "").strip() or None,
            city=form.cleaned_data.get("city", "").strip() or None,
            park_status=self.request.GET.getlist("park_status") or None,
            include_distance=True,
            max_results=int(self.request.GET.get("limit", 100)),
        )


class LocationSuggestionsView(TemplateView):
    """
    AJAX endpoint for location search suggestions.
    """

    def get(self, request, *args, **kwargs):
        query = request.GET.get("q", "").strip()
        limit = int(request.GET.get("limit", 10))

        if len(query) < 2:
            return JsonResponse({"suggestions": []})

        try:
            suggestions = location_search_service.suggest_locations(query, limit)
            return JsonResponse({"suggestions": suggestions})
        except Exception as e:
            return JsonResponse({"error": str(e)}, status=500)
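None of the files in this diff show how these views are routed. A hypothetical urls.py sketch could look like the following; paths for the map API and page views follow the view docstrings, while the module layout, the search paths, and the `name=` values are illustrative assumptions.

```python
# Hypothetical wiring sketch -- the real URL configuration is not in this diff.
from django.urls import path

from apps.core.views import map_views, maps, search

urlpatterns = [
    path("api/map/locations/", map_views.MapLocationsView.as_view(), name="map-locations"),
    path(
        "api/map/locations/<str:location_type>/<int:location_id>/",
        map_views.MapLocationDetailView.as_view(),
        name="map-location-detail",
    ),
    path("api/map/search/", map_views.MapSearchView.as_view(), name="map-search"),
    path("api/map/bounds/", map_views.MapBoundsView.as_view(), name="map-bounds"),
    path("maps/", maps.UniversalMapView.as_view(), name="universal-map"),
    path("maps/nearby/", maps.NearbyLocationsView.as_view(), name="nearby-locations"),
    path("search/", search.AdaptiveSearchView.as_view(), name="adaptive-search"),
]
```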
Some files were not shown because too many files have changed in this diff.