Mirror of https://github.com/pacnpal/thrillwiki_django_no_react.git
Synced 2025-12-30 07:47:00 -05:00

Compare commits: clean-hist ... 937eee19e4 — 12 commits

937eee19e4, e62646bcf9, 92f4104d7a, 02c7cbd1cd, d504d41de2, b0e0678590,
652ea149bd, 66ed4347a9, 69c07d1381, bead0654df, 37a20f83ba, 2304085c32
`.clinerules` — 42 lines changed

````diff
@@ -3,11 +3,13 @@
 ## Development Server
 
 IMPORTANT: Always follow these instructions exactly when starting the development server:
 
-```bash
-lsof -ti :8000 | xargs kill -9; find . -type d -name "__pycache__" -exec rm -r {} +; uv run manage.py tailwind runserver
-```
-
-Note: These steps must be executed in this exact order as a single command to ensure consistent behavior.
+FIRST, assume the server is running. Always. Assume the changes have taken effect.
+
+IF THERE IS AN ISSUE WITH THE SERVER, run the following command exactly:
+
+```bash
+lsof -ti :8000 | xargs kill -9; find . -type d -name "__pycache__" -exec rm -r {} +; cd backend && uv run manage.py runserver_plus && cd ../frontend && pnpm run dev
+```
+
+Note: These steps must be executed in this exact order to ensure consistent behavior. If the server does not start correctly, fix the error in accordance with the error details as best you can.
 
 ## Package Management
 
 IMPORTANT: When a Python package is needed, only use UV to add it:
````
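The replacement command relies on `runserver_plus`, which Django itself does not ship; it comes from django-extensions and needs Werkzeug at runtime. A sketch of adding both the project's own way (an assumption — the repo may already carry them):

```bash
uv add --dev django-extensions werkzeug
```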
````diff
@@ -24,8 +26,8 @@ uv run manage.py <command>
 This applies to all management commands including but not limited to:
 
 - Making migrations: `uv run manage.py makemigrations`
 - Applying migrations: `uv run manage.py migrate`
-- Creating superuser: `uv run manage.py createsuperuser`
-- Starting shell: `uv run manage.py shell`
+- Creating superuser: `uv run manage.py createsuperuser`, optionally preceded by `echo` commands to pipe in the required input.
+- Starting shell: `uv run manage.py shell`, optionally preceded by `echo` commands to pipe in the required input.
 
 NEVER use `python manage.py` or `uv run python manage.py`. Always use `uv run manage.py` directly.
````
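A sketch of what "preceded by `echo` commands" looks like in practice; the credentials below are placeholders, and the environment-variable path uses Django's stock `--noinput` behavior:

```bash
# Non-interactive superuser creation (Django reads DJANGO_SUPERUSER_* with --noinput)
DJANGO_SUPERUSER_USERNAME=admin \
DJANGO_SUPERUSER_EMAIL=admin@example.com \
DJANGO_SUPERUSER_PASSWORD=change-me \
uv run manage.py createsuperuser --noinput

# Piping a one-off statement into the shell instead of typing it interactively
echo "from django.contrib.auth import get_user_model; print(get_user_model().objects.count())" | uv run manage.py shell
```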
````diff
@@ -48,8 +50,34 @@ IMPORTANT: Follow these entity relationship patterns consistently:
 - PropertyOwners: Companies that own park property (new concept, optional)
 - Manufacturers: Companies that manufacture rides (replaces Company for rides)
 - Designers: Companies/individuals that design rides (existing concept)
+- IMPORTANT: All entities can have locations.
 
 # Relationship Constraints
 - Operator and PropertyOwner are usually the same entity but CAN be different
 - Manufacturers and Designers are distinct concepts and should not be conflated
 - All entity relationships should use proper foreign keys with appropriate null/blank settings
 
+- You are to NEVER assume that blank output means your fixes were correct. That assumption can lead to further issues down the line.
+- ALWAYS verify your changes by testing the affected functionality thoroughly.
+- ALWAYS use context7 to check documentation when troubleshooting. It contains VITAL documentation for any and all frameworks, modules, and packages.
+- ALWAYS document your code changes with conport and the reasoning behind them.
+- ALWAYS include relevant context and information when making changes to the codebase.
+- ALWAYS ensure that your code changes are properly tested and validated before deployment.
+- ALWAYS communicate clearly and effectively with your team about any changes you make.
+- ALWAYS be open to feedback and willing to make adjustments as necessary.
+- ALWAYS strive for continuous improvement in your work and processes.
+- ALWAYS prioritize code readability and maintainability.
+- ALWAYS keep security best practices in mind when developing and reviewing code.
+- ALWAYS consider performance implications when making changes to the codebase.
+- ALWAYS be mindful of the impact of your changes on the overall system architecture.
+- ALWAYS keep scalability in mind when designing new features or modifying existing ones.
+- ALWAYS consider the potential for code reuse and modularity in your designs.
+- ALWAYS document your code with clear and concise comments.
+- ALWAYS keep your code DRY (Don't Repeat Yourself) by abstracting common functionality into reusable components.
+- ALWAYS use meaningful variable and function names to improve code readability.
+- ALWAYS handle errors and exceptions gracefully to improve the user experience.
+- ALWAYS log important events and errors for troubleshooting purposes.
+- ALWAYS consider if there may be an existing module or package that can be leveraged before creating new functionality from scratch.
+- ALWAYS keep documentation up to date with any code changes.
+- ALWAYS consider if there are any potential security vulnerabilities in your code.
+- ALWAYS consider if there are any potential performance bottlenecks in your code.
````
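A minimal sketch of the foreign-key pattern these constraints describe (model and field names here are illustrative, not taken from the repo): Operator and PropertyOwner can point at the same company while remaining separate, optional relations.

```python
from django.db import models


class Company(models.Model):
    name = models.CharField(max_length=255)


class Park(models.Model):
    name = models.CharField(max_length=255)
    # Usually the same company as property_owner, but kept as a distinct, optional relation.
    operator = models.ForeignKey(
        Company, null=True, blank=True, on_delete=models.SET_NULL,
        related_name="operated_parks",
    )
    property_owner = models.ForeignKey(
        Company, null=True, blank=True, on_delete=models.SET_NULL,
        related_name="owned_parks",
    )
```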
`.flake8` — new file, 29 lines

```ini
[flake8]
# Maximum line length (matches Black formatter)
max-line-length = 88

# Exclude common directories that shouldn't be linted
exclude =
    .git,
    __pycache__,
    .venv,
    venv,
    env,
    .env,
    migrations,
    node_modules,
    .tox,
    .mypy_cache,
    .pytest_cache,
    build,
    dist,
    *.egg-info

# Ignore line break style warnings which are style preferences
# W503: line break before binary operator (conflicts with PEP8 W504)
# W504: line break after binary operator (conflicts with PEP8 W503)
# These warnings contradict each other, so it's best to ignore one or both
ignore = W503,W504

# Maximum complexity for McCabe complexity checker
max-complexity = 10
```
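Linting then follows the same UV-first convention as the rest of the project (a sketch; assumes flake8 is available as a dev dependency, as the README's `uv run flake8 .` example suggests):

```bash
# Run from the directory containing .flake8 so the configuration is picked up
uv run flake8 .
```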
`.gitignore` (vendored) — 440 lines changed

@@ -1,198 +1,8 @@

Removed (the old ad-hoc entries): `/.vscode`, `/dev.sh`, `/flake.nix`, `venv` (plus `/venv`, `./venv`, `venv/sour`, and a second stray `venv`), three `.DS_Store` entries, per-app `__pycache__` directories (accounts, thrillwiki, reviews, parks, media, email_service, core, companies, moderation, rides, location, analytics, designers, history_tracking, companies/migrations), `ssh_tools.jsonc`, `.pytest_cache.github`, `static/css/tailwind.css` (twice), `.venv` and `.venv/lib/python3.12/site-packages`, a handful of individual `*.cpython-311.pyc` / `*.cpython-312.pyc` files, and a long run of per-file compiled-bytecode entries under `<app>/__pycache__/` and `<app>/migrations/__pycache__/` for the accounts, companies, core, email_service, media, parks, reviews, rides, and thrillwiki apps — listed twice — along with the stock "# Byte-compiled / optimized / DLL files", "# C extensions", and "# Distribution / packaging" comment headers.

The top of the file now reads:

```
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
```
@@ -212,186 +22,96 @@ share/python-wheels/

Removed: the remaining stock-gitignore sections that no longer apply — PyInstaller (`*.manifest`, `*.spec`), installer logs (`pip-log.txt`, `pip-delete-this-directory.txt`), unit test / coverage reports (`htmlcov/`, `.tox/`, `.nox/`, `.coverage`, `.coverage.*`, `.cache`, `nosetests.xml`, `coverage.xml`, `*.cover`, `*.py,cover`, `.hypothesis/`, `.pytest_cache/`, `cover/`), translations (`*.mo`, `*.pot`), the "# Django stuff:" header, Flask (`instance/`, `.webassets-cache`), Scrapy (`.scrapy`), Sphinx (`docs/_build/`), PyBuilder (`.pybuilder/`, `target/`), Jupyter (`.ipynb_checkpoints`), IPython (`profile_default/`, `ipython_config.py`), the pyenv/pipenv/poetry/pdm comment blocks (`.pdm.toml`, `.pdm-python`, `.pdm-build/`), PEP 582 (`__pypackages__/`), Celery (`celerybeat-schedule`, `celerybeat.pid`), SageMath (`*.sage.py`), environments (`***REMOVED***`, `.venv`, `env/`, `venv/`, `ENV/`, `env.bak/`, `venv.bak/`), Spyder (`.spyderproject`, `.spyproject`), Rope (`.ropeproject`), mkdocs (`/site`), mypy (`.mypy_cache/`, `.dmypy.json`, `dmypy.json`), Pyre (`.pyre/`), pytype (`.pytype/`), Cython debug symbols (`cython_debug/`), the PyCharm/JetBrains note (`#.idea/`), Pixi (`.pixi/`), Django Tailwind CLI (`.django_tailwind_cli/`), most of the macOS metadata entries (`Icon`, `._*`, `.DocumentRevisions-V100`, `.fseventsd`, `.Spotlight-V100`, `.TemporaryItems`, `.Trashes`, `.VolumeIcon.icns`, `.com.apple.timemachine.donotpresent`, `.AppleDB`, `.AppleDesktop`, `Network Trash Folder`, `Temporary Items`, `.apdisk`), and the ThrillWiki CI/CD configuration entries (`.thrillwiki-config`, `***REMOVED***.unraid`, `***REMOVED***.webhook`, `.github-token`, `.thrillwiki-github-token`, `.thrillwiki-template-config`, `profiles`, and the `scripts/systemd/` env-file and backup paths).

The file now continues with the kept and new entries:

```
*.egg
MANIFEST

# Django
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
/backend/staticfiles/
/backend/media/

# UV
.uv/
backend/.uv/

# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*
.pnpm-store/

# Vue.js / Vite
/frontend/dist/
/frontend/dist-ssr/
*.local

# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
backend/.env
frontend/.env

# IDEs
.vscode/
.idea/
*.swp
*.swo
*.sublime-project
*.sublime-workspace

# OS
.DS_Store
Thumbs.db
Desktop.ini

# Logs
logs/
*.log

# Coverage
coverage/
*.lcov
.nyc_output
htmlcov/
.coverage
.coverage.*

# Testing
.pytest_cache/
.cache

# Temporary files
tmp/
temp/
*.tmp
*.temp

# Build outputs
/dist/
/build/

# Backup files
*.bak
*.orig
*.swp

# Archive files
*.tar.gz
*.zip
*.rar

# Security
*.pem
*.key
*.cert

# Local development
/uploads/
/backups/
.django_tailwind_cli/
```
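A practical follow-up when a `.gitignore` is tightened like this (a generic sketch, not a step from the repo): anything committed before it became ignored stays tracked until it is removed from the index once.

```bash
# Untrack files that are now ignored; working-tree copies stay on disk
git rm -r --cached .
git add .
git commit -m "Apply updated .gitignore"
```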
`README.md` — 599 lines changed

@@ -1,391 +1,344 @@

# ThrillWiki Django + Vue.js Monorepo

A comprehensive theme park and roller coaster information system built with a modern monorepo architecture combining a Django REST API backend with a Vue.js frontend.

## 🏗️ Architecture Overview

This project uses a monorepo structure that cleanly separates backend and frontend concerns while maintaining shared resources and documentation:

```
thrillwiki-monorepo/
├── backend/          # Django REST API (Port 8000)
│   ├── apps/         # Modular Django applications
│   ├── config/       # Django settings and configuration
│   ├── templates/    # Django templates
│   └── static/       # Static assets
├── frontend/         # Vue.js SPA (Port 5174)
│   ├── src/          # Vue.js source code
│   ├── public/       # Static assets
│   └── dist/         # Build output
├── shared/           # Shared resources and documentation
│   ├── docs/         # Comprehensive documentation
│   ├── scripts/      # Development and deployment scripts
│   ├── config/       # Shared configuration
│   └── media/        # Shared media files
├── architecture/     # Architecture documentation
└── profiles/         # Development profiles
```

**Removed (old README):** the "ThrillWiki Development Environment Setup" introduction (a Django web application for theme park and roller coaster enthusiasts with a dark purple-to-blue theme and HTMX interactivity), its Technology Stack list (Django 5.0+ with GeoDjango/PostGIS, HTMX + Alpine.js + Tailwind CSS, PostgreSQL with PostGIS, UV package management, Django Allauth with Google/Discord OAuth, custom dark-theme styling, django-pghistory audit trails, Pytest + Playwright testing), and the Prerequisites section (Python 3.11+, installing UV via `curl -LsSf https://astral.sh/uv/install.sh | sh` or `pip install uv`, PostgreSQL with PostGIS via Homebrew or apt plus service-start commands, GDAL/GEOS via `brew install gdal geos` or `sudo apt-get install gdal-bin libgdal-dev libgeos-dev`, and Node.js 18+ for Tailwind CSS compilation).

## 🚀 Quick Start
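Of the tooling above, pnpm is the piece most often missing on a fresh machine; one way to get it on Node.js 18+ (an assumption — any supported pnpm install method works) is through corepack:

```bash
corepack enable
corepack prepare pnpm@latest --activate
```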
### Prerequisites

- **Python 3.11+** with [uv](https://docs.astral.sh/uv/) for backend dependencies
- **Node.js 18+** with [pnpm](https://pnpm.io/) for frontend dependencies
- **PostgreSQL 14+** (optional, defaults to SQLite for development)
- **Redis 6+** (optional, for caching and sessions)

### Development Setup

1. **Clone the repository**
   ```bash
   git clone <repository-url>
   cd thrillwiki-monorepo
   ```

2. **Install dependencies**
   ```bash
   # Install frontend dependencies
   pnpm install

   # Install backend dependencies
   cd backend && uv sync && cd ..
   ```

3. **Environment configuration**
   ```bash
   # Copy environment files
   cp .env.example .env
   cp backend/.env.example backend/.env
   cp frontend/.env.development frontend/.env.local

   # Edit .env files with your settings
   ```

4. **Database setup**
   ```bash
   cd backend
   uv run manage.py migrate
   uv run manage.py createsuperuser
   cd ..
   ```

5. **Start development servers**
   ```bash
   # Start both servers concurrently
   pnpm run dev

   # Or start individually
   pnpm run dev:frontend   # Vue.js on :5174
   pnpm run dev:backend    # Django on :8000
   ```

## 📁 Project Structure Details

### Backend (`/backend`)

- **Django 5.0+** with REST Framework for API development
- **Modular app architecture** with separate apps for parks, rides, accounts, etc.
- **UV package management** for fast, reliable Python dependency management
- **PostgreSQL/SQLite** database with comprehensive entity relationships
- **Redis** for caching, sessions, and background tasks
- **Comprehensive API** with frontend serializers for camelCase conversion

### Frontend (`/frontend`)

- **Vue 3** with Composition API and `<script setup>` syntax
- **TypeScript** for type safety and better developer experience
- **Vite** for lightning-fast development and optimized production builds
- **Tailwind CSS** with custom design system and dark mode support
- **Pinia** for state management with modular stores
- **Vue Router** for client-side routing
- **Comprehensive UI component library** with shadcn-vue components

### Shared Resources (`/shared`)

- **Documentation** - Comprehensive guides and API documentation
- **Development scripts** - Automated setup, build, and deployment scripts
- **Configuration** - Shared Docker, CI/CD, and infrastructure configs
- **Media management** - Centralized media file handling and optimization

**Removed (old README):** the single-app setup walkthrough — cloning into `thrillwiki_django_no_react` and running `uv sync`; creating the PostgreSQL database (`createdb thrillwiki`, `createuser wiki`, then in `psql`: `ALTER USER wiki WITH PASSWORD 'thrillwiki';`, `GRANT ALL PRIVILEGES ON DATABASE thrillwiki TO wiki;`, `\c thrillwiki`, `CREATE EXTENSION postgis;`); the `DATABASES` block from `thrillwiki/settings.py` (PostGIS engine, name `thrillwiki`, user `wiki`, password `thrillwiki`, host `192.168.86.3`, port `5432`) with the reminder to change `HOST` to `localhost` for local development before running migrations; the migration and superuser steps; and the old "Start Development Server" section built around `lsof -ti :8000 | xargs kill -9; find . -type d -name "__pycache__" -exec rm -r {} +; uv run manage.py tailwind runserver` (kill port 8000, clean caches, compile Tailwind, serve at http://localhost:8000).
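For the PostgreSQL-backed path, the interactive `psql` steps from the old README collapse into a few shell commands (a sketch; the role and database names follow those old instructions, and local superuser access is assumed):

```bash
createuser wiki
createdb -O wiki thrillwiki
psql -d thrillwiki -c "ALTER USER wiki WITH PASSWORD 'thrillwiki';"
psql -d thrillwiki -c "CREATE EXTENSION IF NOT EXISTS postgis;"
```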
## 🛠️ Development Workflow

### Available Scripts

```bash
# Development
pnpm run dev                 # Start both servers concurrently
pnpm run dev:frontend        # Frontend only (:5174)
pnpm run dev:backend         # Backend only (:8000)

# Building
pnpm run build               # Build frontend for production
pnpm run build:staging       # Build for staging environment
pnpm run build:production    # Build for production environment

# Testing
pnpm run test                # Run all tests
pnpm run test:frontend       # Frontend unit and E2E tests
pnpm run test:backend        # Backend unit and integration tests

# Code Quality
pnpm run lint                # Lint all code
pnpm run type-check          # TypeScript type checking

# Setup and Maintenance
pnpm run install:all                  # Install all dependencies
./shared/scripts/dev/setup-dev.sh     # Full development setup
./shared/scripts/dev/start-all.sh     # Start all services
```

### Backend Development

```bash
cd backend

# Django management commands
uv run manage.py migrate
uv run manage.py makemigrations
uv run manage.py createsuperuser
uv run manage.py collectstatic

# Testing and quality
uv run manage.py test
uv run black .      # Format code
uv run flake8 .     # Lint code
uv run isort .      # Sort imports
```

### Frontend Development

```bash
cd frontend

# Vue.js development
pnpm run dev          # Start dev server
pnpm run build        # Production build
pnpm run preview      # Preview production build
pnpm run test:unit    # Vitest unit tests
pnpm run test:e2e     # Playwright E2E tests
pnpm run lint         # ESLint
pnpm run type-check   # TypeScript checking
```

**Removed (old README):** the UV-only package-management rules (`uv add <package-name>`, `uv add --dev <package-name>`, never `pip install`); the Django command rules (`uv run manage.py <command>`, never `python manage.py` or `uv run python manage.py`); the CSS Development section on Tailwind CSS v4 (source in `static/css/src/input.css`, compiled into `static/css/`, auto-compiled by `tailwind runserver`), the pointers to `TAILWIND_V4_MIGRATION.md` and `TAILWIND_V4_QUICK_REFERENCE.md`, the key v4 changes (CSS-first `@theme` blocks, `outline-none` → `outline-hidden`, `bg-blue-500/50` opacity syntax, smaller bundles), and the custom theme variables (`--color-primary` #4f46e5, `--color-secondary` #e11d48, `--color-accent` #8b5cf6, `--font-family-sans` Poppins); the old single-app project-structure tree (accounts, analytics, companies, core, designers, history, location, media, moderation, parks, reviews, rides, search, static, templates, thrillwiki, memory-bank, `.clinerules`); the Key Features section (Allauth with Google/Discord OAuth and custom profiles, PostGIS maps and location search, park/ride content management with photo galleries, reviews and moderation, HTMX + Alpine.js + Tailwind dark-theme frontend); and the old test commands (`uv run pytest`, `uv run coverage run -m pytest`, `uv run coverage report`, `uv run pytest tests/e2e/`).
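The three quality tools in the backend block above are commonly chained before committing; a convenience one-liner (not a script that exists in the repo):

```bash
cd backend && uv run black . && uv run isort . && uv run flake8 .
```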
## 🔧 Configuration

### Environment Variables

#### Root `.env`

```bash
# Database
DATABASE_URL=postgresql://user:pass@localhost/thrillwiki
REDIS_URL=redis://localhost:6379

# Security
SECRET_KEY=your-secret-key
DEBUG=True

# API Configuration
API_BASE_URL=http://localhost:8000/api
```

#### Backend `.env`

```bash
# Django Settings
DJANGO_SETTINGS_MODULE=config.django.local
DEBUG=True
ALLOWED_HOSTS=localhost,127.0.0.1

# Database
DATABASE_URL=postgresql://user:pass@localhost/thrillwiki

# Redis
REDIS_URL=redis://localhost:6379

# Email (optional)
EMAIL_HOST=smtp.gmail.com
EMAIL_PORT=587
EMAIL_USE_TLS=True
```

#### Frontend `.env.local`

```bash
# API Configuration
VITE_API_BASE_URL=http://localhost:8000/api

# Development
VITE_APP_TITLE=ThrillWiki (Development)

# Feature Flags
VITE_ENABLE_DEBUG=true
```

## 📊 Key Features

### Backend Features

- **Comprehensive Park Database** - Detailed information about theme parks worldwide
- **Extensive Ride Database** - Complete roller coaster and ride information
- **User Management** - Authentication, profiles, and permissions
- **Content Moderation** - Review and approval workflows
- **API Documentation** - Auto-generated OpenAPI/Swagger docs
- **Background Tasks** - Celery integration for long-running processes
- **Caching Strategy** - Redis-based caching for performance
- **Search Functionality** - Full-text search across all content

### Frontend Features

- **Responsive Design** - Mobile-first approach with Tailwind CSS
- **Dark Mode Support** - Complete dark/light theme system
- **Real-time Search** - Instant search with debouncing and highlighting
- **Interactive Maps** - Park and ride location visualization
- **Photo Galleries** - High-quality image management
- **User Dashboard** - Personalized content and contributions
- **Progressive Web App** - PWA capabilities for mobile experience
- **Accessibility** - WCAG 2.1 AA compliance

## 📖 Documentation

### Core Documentation

- **[Backend Documentation](./backend/README.md)** - Django setup and API details
- **[Frontend Documentation](./frontend/README.md)** - Vue.js setup and development
- **[API Documentation](./shared/docs/api/README.md)** - Complete API reference
- **[Development Workflow](./shared/docs/development/workflow.md)** - Daily development processes

### Architecture & Deployment

- **[Architecture Overview](./architecture/)** - System design and decisions
- **[Deployment Guide](./shared/docs/deployment/)** - Production deployment instructions
- **[Development Scripts](./shared/scripts/)** - Automation and tooling

### Additional Resources

- **[Contributing Guide](./CONTRIBUTING.md)** - How to contribute to the project
- **[Code of Conduct](./CODE_OF_CONDUCT.md)** - Community guidelines
- **[Security Policy](./SECURITY.md)** - Security reporting and policies

**Removed (old README):** the Test Structure notes (unit tests inside each app's `tests/`, E2E tests in `tests/e2e/`, fixtures in `tests/fixtures/`); the memory-bank documentation system (`memory-bank/activeContext.md`, `memory-bank/documentation/design-system.md`, `memory-bank/features/`, `memory-bank/testing/`, plus the design-system, authentication, and layout-optimization pointers); the "Important Development Rules" recap (the full server-startup command, UV-only package management, `uv run manage.py` prefixes, database/PostGIS requirements, and the GeoDjango library paths `/opt/homebrew/lib/libgdal.dylib` and `/opt/homebrew/lib/libgeos_c.dylib`, with Linux alternatives such as `/usr/lib/x86_64-linux-gnu/libgdal.so`); and the Troubleshooting section (enable PostGIS with `CREATE EXTENSION postgis;`, locate GDAL/GEOS with `find /usr -name "libgdal*"` and similar, free port 8000 with `lsof -ti :8000 | xargs kill -9`, check `node --version` when Tailwind does not compile, and the "Getting Help" checklist pointing back at `memory-bank/`).
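A quick way to confirm the settings module, environment variables, and database wiring above before starting work (a sketch; `check` is a stock Django command and `--database` runs the database-level checks):

```bash
cd backend
uv run manage.py check --database default
```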
## 🚀 Deployment

### Development Environment

```bash
# Quick start with all services
./shared/scripts/dev/start-all.sh

# Full development setup
./shared/scripts/dev/setup-dev.sh
```

### Production Deployment

```bash
# Build all components
./shared/scripts/build/build-all.sh

# Deploy to production
./shared/scripts/deploy/deploy.sh
```

See the [Deployment Guide](./shared/docs/deployment/) for detailed production setup instructions.

## 🧪 Testing Strategy

### Backend Testing

- **Unit Tests** - Individual function and method testing
- **Integration Tests** - API endpoint and database interaction testing
- **E2E Tests** - Full user journey testing with Selenium

### Frontend Testing

- **Unit Tests** - Component and utility function testing with Vitest
- **Integration Tests** - Component interaction testing
- **E2E Tests** - User journey testing with Playwright

### Code Quality

- **Linting** - ESLint for JavaScript/TypeScript, Flake8 for Python
- **Type Checking** - TypeScript for frontend, mypy for Python
- **Code Formatting** - Prettier for frontend, Black for Python

## 🤝 Contributing

We welcome contributions! Please see our [Contributing Guide](./CONTRIBUTING.md) for details on:

1. **Development Setup** - Getting your development environment ready
2. **Code Standards** - Coding conventions and best practices
3. **Pull Request Process** - How to submit your changes
4. **Issue Reporting** - How to report bugs and request features

### Quick Contribution Start

```bash
# Fork and clone the repository
git clone https://github.com/your-username/thrillwiki-monorepo.git
cd thrillwiki-monorepo

# Set up development environment
./shared/scripts/dev/setup-dev.sh

# Create a feature branch
git checkout -b feature/your-feature-name

# Make your changes and test
pnpm run test

# Submit a pull request
```

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](./LICENSE) file for details.

## 🙏 Acknowledgments

- **Theme Park Community** - For providing data and inspiration
- **Open Source Contributors** - For the amazing tools and libraries
- **Vue.js and Django Communities** - For excellent documentation and support

## 📞 Support

- **Issues** - [GitHub Issues](https://github.com/your-repo/thrillwiki-monorepo/issues)
- **Discussions** - [GitHub Discussions](https://github.com/your-repo/thrillwiki-monorepo/discussions)
- **Documentation** - [Project Wiki](https://github.com/your-repo/thrillwiki-monorepo/wiki)

---

**Built with ❤️ for the theme park and roller coaster community**

**Removed (old README):** the "Next Steps" checklist (explore the admin at http://localhost:8000/admin/, browse http://localhost:8000/, read the `memory-bank/` docs, run `uv run pytest`, then follow the workflow guidelines), the "Happy Coding! 🎢✨" sign-off, and the closing pointer to the `memory-bank/` directory.
@@ -1,207 +0,0 @@ — deleted file, 207 lines

```python
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin
from django.utils.html import format_html
from django.urls import reverse
from django.contrib.auth.models import Group
from .models import User, UserProfile, EmailVerification, TopList, TopListItem


class UserProfileInline(admin.StackedInline):
    model = UserProfile
    can_delete = False
    verbose_name_plural = 'Profile'
    fieldsets = (
        ('Personal Info', {
            'fields': ('display_name', 'avatar', 'pronouns', 'bio')
        }),
        ('Social Media', {
            'fields': ('twitter', 'instagram', 'youtube', 'discord')
        }),
        ('Ride Credits', {
            'fields': (
                'coaster_credits',
                'dark_ride_credits',
                'flat_ride_credits',
                'water_ride_credits'
            )
        }),
    )


class TopListItemInline(admin.TabularInline):
    model = TopListItem
    extra = 1
    fields = ('content_type', 'object_id', 'rank', 'notes')
    ordering = ('rank',)


@admin.register(User)
class CustomUserAdmin(UserAdmin):
    list_display = ('username', 'email', 'get_avatar', 'get_status', 'role', 'date_joined', 'last_login', 'get_credits')
    list_filter = ('is_active', 'is_staff', 'role', 'is_banned', 'groups', 'date_joined')
    search_fields = ('username', 'email')
    ordering = ('-date_joined',)
    actions = ['activate_users', 'deactivate_users', 'ban_users', 'unban_users']
    inlines = [UserProfileInline]

    fieldsets = (
        (None, {'fields': ('username', 'password')}),
        ('Personal info', {'fields': ('email', 'pending_email')}),
        ('Roles and Permissions', {
            'fields': ('role', 'groups', 'user_permissions'),
            'description': 'Role determines group membership. Groups determine permissions.',
        }),
        ('Status', {
            'fields': ('is_active', 'is_staff', 'is_superuser'),
            'description': 'These are automatically managed based on role.',
        }),
        ('Ban Status', {
            'fields': ('is_banned', 'ban_reason', 'ban_date'),
        }),
        ('Preferences', {
            'fields': ('theme_preference',),
        }),
        ('Important dates', {'fields': ('last_login', 'date_joined')}),
    )
    add_fieldsets = (
        (None, {
            'classes': ('wide',),
            'fields': ('username', 'email', 'password1', 'password2', 'role'),
        }),
    )

    def get_avatar(self, obj):
        if obj.profile.avatar:
            return format_html('<img src="{}" width="30" height="30" style="border-radius:50%;" />', obj.profile.avatar.url)
        return format_html('<div style="width:30px; height:30px; border-radius:50%; background-color:#007bff; color:white; display:flex; align-items:center; justify-content:center;">{}</div>', obj.username[0].upper())
    get_avatar.short_description = 'Avatar'

    def get_status(self, obj):
        if obj.is_banned:
            return format_html('<span style="color: red;">Banned</span>')
        if not obj.is_active:
            return format_html('<span style="color: orange;">Inactive</span>')
        if obj.is_superuser:
            return format_html('<span style="color: purple;">Superuser</span>')
        if obj.is_staff:
            return format_html('<span style="color: blue;">Staff</span>')
        return format_html('<span style="color: green;">Active</span>')
    get_status.short_description = 'Status'

    def get_credits(self, obj):
        try:
            profile = obj.profile
            return format_html(
                'RC: {}<br>DR: {}<br>FR: {}<br>WR: {}',
                profile.coaster_credits,
                profile.dark_ride_credits,
                profile.flat_ride_credits,
                profile.water_ride_credits
            )
        except UserProfile.DoesNotExist:
            return '-'
    get_credits.short_description = 'Ride Credits'

    def activate_users(self, request, queryset):
        queryset.update(is_active=True)
    activate_users.short_description = "Activate selected users"

    def deactivate_users(self, request, queryset):
        queryset.update(is_active=False)
    deactivate_users.short_description = "Deactivate selected users"

    def ban_users(self, request, queryset):
        from django.utils import timezone
        queryset.update(is_banned=True, ban_date=timezone.now())
    ban_users.short_description = "Ban selected users"

    def unban_users(self, request, queryset):
        queryset.update(is_banned=False, ban_date=None, ban_reason='')
    unban_users.short_description = "Unban selected users"

    def save_model(self, request, obj, form, change):
        creating = not obj.pk
        super().save_model(request, obj, form, change)
        if creating and obj.role != User.Roles.USER:
            # Ensure new user with role gets added to appropriate group
            group = Group.objects.filter(name=obj.role).first()
            if group:
                obj.groups.add(group)


@admin.register(UserProfile)
class UserProfileAdmin(admin.ModelAdmin):
    list_display = ('user', 'display_name', 'coaster_credits', 'dark_ride_credits', 'flat_ride_credits', 'water_ride_credits')
    list_filter = ('coaster_credits', 'dark_ride_credits', 'flat_ride_credits', 'water_ride_credits')
    search_fields = ('user__username', 'user__email', 'display_name', 'bio')

    fieldsets = (
        ('User Information', {
            'fields': ('user', 'display_name', 'avatar', 'pronouns', 'bio')
        }),
        ('Social Media', {
            'fields': ('twitter', 'instagram', 'youtube', 'discord')
        }),
        ('Ride Credits', {
            'fields': (
                'coaster_credits',
                'dark_ride_credits',
                'flat_ride_credits',
                'water_ride_credits'
            )
        }),
    )


@admin.register(EmailVerification)
class EmailVerificationAdmin(admin.ModelAdmin):
    list_display = ('user', 'created_at', 'last_sent', 'is_expired')
    list_filter = ('created_at', 'last_sent')
    search_fields = ('user__username', 'user__email', 'token')
    readonly_fields = ('created_at', 'last_sent')

    fieldsets = (
        ('Verification Details', {
            'fields': ('user', 'token')
        }),
        ('Timing', {
            'fields': ('created_at', 'last_sent')
        }),
    )

    def is_expired(self, obj):
        from django.utils import timezone
        from datetime import timedelta
        if timezone.now() - obj.last_sent > timedelta(days=1):
            return format_html('<span style="color: red;">Expired</span>')
        return format_html('<span style="color: green;">Valid</span>')
    is_expired.short_description = 'Status'


@admin.register(TopList)
class TopListAdmin(admin.ModelAdmin):
    list_display = ('title', 'user', 'category', 'created_at', 'updated_at')
    list_filter = ('category', 'created_at', 'updated_at')
    search_fields = ('title', 'user__username', 'description')
    inlines = [TopListItemInline]

    fieldsets = (
        ('Basic Information', {
            'fields': ('user', 'title', 'category', 'description')
        }),
        ('Timestamps', {
            'fields': ('created_at', 'updated_at'),
            'classes': ('collapse',)
        }),
    )
    readonly_fields = ('created_at', 'updated_at')


@admin.register(TopListItem)
class TopListItemAdmin(admin.ModelAdmin):
    list_display = ('top_list', 'content_type', 'object_id', 'rank')
    list_filter = ('top_list__category', 'rank')
    search_fields = ('top_list__title', 'notes')
    ordering = ('top_list', 'rank')

    fieldsets = (
        ('List Information', {
            'fields': ('top_list', 'rank')
        }),
        ('Item Details', {
            'fields': ('content_type', 'object_id', 'notes')
```
|
|
||||||
}),
|
|
||||||
)
|
|
||||||
@@ -1,30 +0,0 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp, SocialAccount, SocialToken
from django.contrib.sites.models import Site

class Command(BaseCommand):
    help = 'Check all social auth related tables'

    def handle(self, *args, **options):
        # Check SocialApp
        self.stdout.write('\nChecking SocialApp table:')
        for app in SocialApp.objects.all():
            self.stdout.write(f'ID: {app.id}, Provider: {app.provider}, Name: {app.name}, Client ID: {app.client_id}')
            self.stdout.write('Sites:')
            for site in app.sites.all():
                self.stdout.write(f' - {site.domain}')

        # Check SocialAccount
        self.stdout.write('\nChecking SocialAccount table:')
        for account in SocialAccount.objects.all():
            self.stdout.write(f'ID: {account.id}, Provider: {account.provider}, UID: {account.uid}')

        # Check SocialToken
        self.stdout.write('\nChecking SocialToken table:')
        for token in SocialToken.objects.all():
            self.stdout.write(f'ID: {token.id}, Account: {token.account}, App: {token.app}')

        # Check Site
        self.stdout.write('\nChecking Site table:')
        for site in Site.objects.all():
            self.stdout.write(f'ID: {site.id}, Domain: {site.domain}, Name: {site.name}')
@@ -1,19 +0,0 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp

class Command(BaseCommand):
    help = 'Check social app configurations'

    def handle(self, *args, **options):
        social_apps = SocialApp.objects.all()

        if not social_apps:
            self.stdout.write(self.style.ERROR('No social apps found'))
            return

        for app in social_apps:
            self.stdout.write(self.style.SUCCESS(f'\nProvider: {app.provider}'))
            self.stdout.write(f'Name: {app.name}')
            self.stdout.write(f'Client ID: {app.client_id}')
            self.stdout.write(f'Secret: {app.secret}')
            self.stdout.write(f'Sites: {", ".join(str(site.domain) for site in app.sites.all())}')
@@ -1,48 +0,0 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site
from allauth.socialaccount.models import SocialApp

class Command(BaseCommand):
    help = 'Create social apps for authentication'

    def handle(self, *args, **options):
        # Get the default site
        site = Site.objects.get_or_create(
            id=1,
            defaults={
                'domain': 'localhost:8000',
                'name': 'ThrillWiki Development'
            }
        )[0]

        # Create Discord app
        discord_app, created = SocialApp.objects.get_or_create(
            provider='discord',
            defaults={
                'name': 'Discord',
                'client_id': '1299112802274902047',
                'secret': 'ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11',
            }
        )
        if not created:
            discord_app.client_id = '1299112802274902047'
            discord_app.secret = 'ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11'
            discord_app.save()
        discord_app.sites.add(site)
        self.stdout.write(f'{"Created" if created else "Updated"} Discord app')

        # Create Google app
        google_app, created = SocialApp.objects.get_or_create(
            provider='google',
            defaults={
                'name': 'Google',
                'client_id': '135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com',
                'secret': 'GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue',
            }
        )
        if not created:
            google_app.client_id = '135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com'
            google_app.secret = 'GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue'
            google_app.save()
        google_app.sites.add(site)
        self.stdout.write(f'{"Created" if created else "Updated"} Google app')
@@ -1,10 +0,0 @@
from django.core.management.base import BaseCommand
from django.db import connection

class Command(BaseCommand):
    help = 'Fix migration history by removing rides.0001_initial'

    def handle(self, *args, **kwargs):
        with connection.cursor() as cursor:
            cursor.execute("DELETE FROM django_migrations WHERE app='rides' AND name='0001_initial';")
        self.stdout.write(self.style.SUCCESS('Successfully removed rides.0001_initial from migration history'))
@@ -1,35 +0,0 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.contrib.sites.models import Site
import os

class Command(BaseCommand):
    help = 'Fix social app configurations'

    def handle(self, *args, **options):
        # Delete all existing social apps
        SocialApp.objects.all().delete()
        self.stdout.write('Deleted all existing social apps')

        # Get the default site
        site = Site.objects.get(id=1)

        # Create Google provider
        google_app = SocialApp.objects.create(
            provider='google',
            name='Google',
            client_id=os.getenv('GOOGLE_CLIENT_ID'),
            secret=os.getenv('GOOGLE_CLIENT_SECRET'),
        )
        google_app.sites.add(site)
        self.stdout.write(f'Created Google app with client_id: {google_app.client_id}')

        # Create Discord provider
        discord_app = SocialApp.objects.create(
            provider='discord',
            name='Discord',
            client_id=os.getenv('DISCORD_CLIENT_ID'),
            secret=os.getenv('DISCORD_CLIENT_SECRET'),
        )
        discord_app.sites.add(site)
        self.stdout.write(f'Created Discord app with client_id: {discord_app.client_id}')
@@ -1,11 +0,0 @@
from django.core.management.base import BaseCommand
from accounts.models import UserProfile

class Command(BaseCommand):
    help = 'Regenerate default avatars for users without an uploaded avatar'

    def handle(self, *args, **kwargs):
        profiles = UserProfile.objects.filter(avatar='')
        for profile in profiles:
            profile.save()  # This will trigger the avatar generation logic in the save method
            self.stdout.write(self.style.SUCCESS(f"Regenerated avatar for {profile.user.username}"))
@@ -1,17 +0,0 @@
from django.core.management.base import BaseCommand
from django.db import connection

class Command(BaseCommand):
    help = 'Reset social auth configuration'

    def handle(self, *args, **options):
        with connection.cursor() as cursor:
            # Delete all social apps
            cursor.execute("DELETE FROM socialaccount_socialapp")
            cursor.execute("DELETE FROM socialaccount_socialapp_sites")

            # Reset sequences
            cursor.execute("DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp'")
            cursor.execute("DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp_sites'")

        self.stdout.write(self.style.SUCCESS('Successfully reset social auth configuration'))
@@ -1,63 +0,0 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site
from allauth.socialaccount.models import SocialApp
from dotenv import load_dotenv
import os

class Command(BaseCommand):
    help = 'Sets up social authentication apps'

    def handle(self, *args, **kwargs):
        # Load environment variables
        load_dotenv()

        # Get environment variables
        google_client_id = os.getenv('GOOGLE_CLIENT_ID')
        google_client_secret = os.getenv('GOOGLE_CLIENT_SECRET')
        discord_client_id = os.getenv('DISCORD_CLIENT_ID')
        discord_client_secret = os.getenv('DISCORD_CLIENT_SECRET')

        if not all([google_client_id, google_client_secret, discord_client_id, discord_client_secret]):
            self.stdout.write(self.style.ERROR('Missing required environment variables'))
            return

        # Get or create the default site
        site, _ = Site.objects.get_or_create(
            id=1,
            defaults={
                'domain': 'localhost:8000',
                'name': 'localhost'
            }
        )

        # Set up Google
        google_app, created = SocialApp.objects.get_or_create(
            provider='google',
            defaults={
                'name': 'Google',
                'client_id': google_client_id,
                'secret': google_client_secret,
            }
        )
        if not created:
            google_app.client_id = google_client_id
            google_app.secret = google_client_secret
            google_app.save()
        google_app.sites.add(site)

        # Set up Discord
        discord_app, created = SocialApp.objects.get_or_create(
            provider='discord',
            defaults={
                'name': 'Discord',
                'client_id': discord_client_id,
                'secret': discord_client_secret,
            }
        )
        if not created:
            discord_app.client_id = discord_client_id
            discord_app.secret = discord_client_secret
            discord_app.save()
        discord_app.sites.add(site)

        self.stdout.write(self.style.SUCCESS('Successfully set up social auth apps'))
@@ -1,60 +0,0 @@
from django.core.management.base import BaseCommand
from django.urls import reverse
from django.test import Client
from allauth.socialaccount.models import SocialApp
from urllib.parse import urljoin

class Command(BaseCommand):
    help = 'Test Discord OAuth2 authentication flow'

    def handle(self, *args, **options):
        client = Client(HTTP_HOST='localhost:8000')

        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider='discord')
            self.stdout.write('Found Discord app configuration:')
            self.stdout.write(f'Client ID: {discord_app.client_id}')

            # Test login URL
            login_url = '/accounts/discord/login/'
            response = client.get(login_url, HTTP_HOST='localhost:8000')
            self.stdout.write(f'\nTesting login URL: {login_url}')
            self.stdout.write(f'Status code: {response.status_code}')

            if response.status_code == 302:
                redirect_url = response['Location']
                self.stdout.write(f'Redirects to: {redirect_url}')

                # Parse OAuth2 parameters
                self.stdout.write('\nOAuth2 Parameters:')
                if 'client_id=' in redirect_url:
                    self.stdout.write('✓ client_id parameter present')
                if 'redirect_uri=' in redirect_url:
                    self.stdout.write('✓ redirect_uri parameter present')
                if 'scope=' in redirect_url:
                    self.stdout.write('✓ scope parameter present')
                if 'response_type=' in redirect_url:
                    self.stdout.write('✓ response_type parameter present')
                if 'code_challenge=' in redirect_url:
                    self.stdout.write('✓ PKCE enabled (code_challenge present)')

            # Show callback URL
            callback_url = 'http://localhost:8000/accounts/discord/login/callback/'
            self.stdout.write('\nCallback URL to configure in Discord Developer Portal:')
            self.stdout.write(callback_url)

            # Show frontend login URL
            frontend_url = 'http://localhost:5173'
            self.stdout.write('\nFrontend configuration:')
            self.stdout.write(f'Frontend URL: {frontend_url}')
            self.stdout.write('Discord login button should use:')
            self.stdout.write('/accounts/discord/login/?process=login')

            # Show allauth URLs
            self.stdout.write('\nAllauth URLs:')
            self.stdout.write('Login URL: /accounts/discord/login/?process=login')
            self.stdout.write('Callback URL: /accounts/discord/login/callback/')

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR('Discord app not found'))
@@ -1,36 +0,0 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.contrib.sites.models import Site
from django.urls import reverse
from django.conf import settings

class Command(BaseCommand):
    help = 'Verify Discord OAuth2 settings'

    def handle(self, *args, **options):
        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider='discord')
            self.stdout.write('Found Discord app configuration:')
            self.stdout.write(f'Client ID: {discord_app.client_id}')
            self.stdout.write(f'Secret: {discord_app.secret}')

            # Get sites
            sites = discord_app.sites.all()
            self.stdout.write('\nAssociated sites:')
            for site in sites:
                self.stdout.write(f'- {site.domain} ({site.name})')

            # Show callback URL
            callback_url = 'http://localhost:8000/accounts/discord/login/callback/'
            self.stdout.write('\nCallback URL to configure in Discord Developer Portal:')
            self.stdout.write(callback_url)

            # Show OAuth2 settings
            self.stdout.write('\nOAuth2 settings in settings.py:')
            discord_settings = settings.SOCIALACCOUNT_PROVIDERS.get('discord', {})
            self.stdout.write(f'PKCE Enabled: {discord_settings.get("OAUTH_PKCE_ENABLED", False)}')
            self.stdout.write(f'Scopes: {discord_settings.get("SCOPE", [])}')

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR('Discord app not found'))
@@ -1,226 +0,0 @@
"""
Selectors for user and account-related data retrieval.
Following Django styleguide pattern for separating data access from business logic.
"""

from typing import Optional, Dict, Any, List
from django.db.models import QuerySet, Q, F, Count, Avg, Prefetch
from django.contrib.auth import get_user_model
from django.utils import timezone
from datetime import timedelta

User = get_user_model()


def user_profile_optimized(*, user_id: int) -> Any:
    """
    Get a user with optimized queries for profile display.

    Args:
        user_id: User ID

    Returns:
        User instance with prefetched related data

    Raises:
        User.DoesNotExist: If user doesn't exist
    """
    return User.objects.prefetch_related(
        'park_reviews',
        'ride_reviews',
        'socialaccount_set'
    ).annotate(
        park_review_count=Count('park_reviews', filter=Q(park_reviews__is_published=True)),
        ride_review_count=Count('ride_reviews', filter=Q(ride_reviews__is_published=True)),
        total_review_count=F('park_review_count') + F('ride_review_count')
    ).get(id=user_id)


def active_users_with_stats() -> QuerySet:
    """
    Get active users with review statistics.

    Returns:
        QuerySet of active users with review counts
    """
    return User.objects.filter(
        is_active=True
    ).annotate(
        park_review_count=Count('park_reviews', filter=Q(park_reviews__is_published=True)),
        ride_review_count=Count('ride_reviews', filter=Q(ride_reviews__is_published=True)),
        total_review_count=F('park_review_count') + F('ride_review_count')
    ).order_by('-total_review_count')


def users_with_recent_activity(*, days: int = 30) -> QuerySet:
    """
    Get users who have been active in the last N days.

    Args:
        days: Number of days to look back for activity

    Returns:
        QuerySet of recently active users
    """
    cutoff_date = timezone.now() - timedelta(days=days)

    return User.objects.filter(
        Q(last_login__gte=cutoff_date) |
        Q(park_reviews__created_at__gte=cutoff_date) |
        Q(ride_reviews__created_at__gte=cutoff_date)
    ).annotate(
        recent_park_reviews=Count('park_reviews', filter=Q(park_reviews__created_at__gte=cutoff_date)),
        recent_ride_reviews=Count('ride_reviews', filter=Q(ride_reviews__created_at__gte=cutoff_date)),
        recent_total_reviews=F('recent_park_reviews') + F('recent_ride_reviews')
    ).order_by('-last_login').distinct()


def top_reviewers(*, limit: int = 10) -> QuerySet:
    """
    Get top users by review count.

    Args:
        limit: Maximum number of users to return

    Returns:
        QuerySet of top reviewers
    """
    return User.objects.filter(
        is_active=True
    ).annotate(
        park_review_count=Count('park_reviews', filter=Q(park_reviews__is_published=True)),
        ride_review_count=Count('ride_reviews', filter=Q(ride_reviews__is_published=True)),
        total_review_count=F('park_review_count') + F('ride_review_count')
    ).filter(
        total_review_count__gt=0
    ).order_by('-total_review_count')[:limit]


def moderator_users() -> QuerySet:
    """
    Get users with moderation permissions.

    Returns:
        QuerySet of users who can moderate content
    """
    return User.objects.filter(
        Q(is_staff=True) |
        Q(groups__name='Moderators') |
        Q(user_permissions__codename__in=['change_parkreview', 'change_ridereview'])
    ).distinct().order_by('username')


def users_by_registration_date(*, start_date, end_date) -> QuerySet:
    """
    Get users who registered within a date range.

    Args:
        start_date: Start of date range
        end_date: End of date range

    Returns:
        QuerySet of users registered in the date range
    """
    return User.objects.filter(
        date_joined__date__gte=start_date,
        date_joined__date__lte=end_date
    ).order_by('-date_joined')


def user_search_autocomplete(*, query: str, limit: int = 10) -> QuerySet:
    """
    Get users matching a search query for autocomplete functionality.

    Args:
        query: Search string
        limit: Maximum number of results

    Returns:
        QuerySet of matching users for autocomplete
    """
    return User.objects.filter(
        Q(username__icontains=query) |
        Q(first_name__icontains=query) |
        Q(last_name__icontains=query),
        is_active=True
    ).order_by('username')[:limit]


def users_with_social_accounts() -> QuerySet:
    """
    Get users who have connected social accounts.

    Returns:
        QuerySet of users with social account connections
    """
    return User.objects.filter(
        socialaccount__isnull=False
    ).prefetch_related(
        'socialaccount_set'
    ).distinct().order_by('username')


def user_statistics_summary() -> Dict[str, Any]:
    """
    Get overall user statistics for dashboard/analytics.

    Returns:
        Dictionary containing user statistics
    """
    total_users = User.objects.count()
    active_users = User.objects.filter(is_active=True).count()
    staff_users = User.objects.filter(is_staff=True).count()

    # Users with reviews
    users_with_reviews = User.objects.filter(
        Q(park_reviews__isnull=False) |
        Q(ride_reviews__isnull=False)
    ).distinct().count()

    # Recent registrations (last 30 days)
    cutoff_date = timezone.now() - timedelta(days=30)
    recent_registrations = User.objects.filter(
        date_joined__gte=cutoff_date
    ).count()

    return {
        'total_users': total_users,
        'active_users': active_users,
        'inactive_users': total_users - active_users,
        'staff_users': staff_users,
        'users_with_reviews': users_with_reviews,
        'recent_registrations': recent_registrations,
        'review_participation_rate': (users_with_reviews / total_users * 100) if total_users > 0 else 0
    }


def users_needing_email_verification() -> QuerySet:
    """
    Get users who haven't verified their email addresses.

    Returns:
        QuerySet of users with unverified emails
    """
    return User.objects.filter(
        is_active=True,
        emailaddress__verified=False
    ).distinct().order_by('date_joined')


def users_by_review_activity(*, min_reviews: int = 1) -> QuerySet:
    """
    Get users who have written at least a minimum number of reviews.

    Args:
        min_reviews: Minimum number of reviews required

    Returns:
        QuerySet of users with sufficient review activity
    """
    return User.objects.annotate(
        park_review_count=Count('park_reviews', filter=Q(park_reviews__is_published=True)),
        ride_review_count=Count('ride_reviews', filter=Q(ride_reviews__is_published=True)),
        total_review_count=F('park_review_count') + F('ride_review_count')
    ).filter(
        total_review_count__gte=min_reviews
    ).order_by('-total_review_count')
@@ -1,91 +0,0 @@
from django.test import TestCase
from django.contrib.auth.models import Group, Permission
from django.contrib.contenttypes.models import ContentType
from unittest.mock import patch, MagicMock
from .models import User, UserProfile
from .signals import create_default_groups

class SignalsTestCase(TestCase):
    def setUp(self):
        self.user = User.objects.create_user(
            username='testuser',
            email='testuser@example.com',
            password='password'
        )

    def test_create_user_profile(self):
        self.assertTrue(hasattr(self.user, 'profile'))
        self.assertIsInstance(self.user.profile, UserProfile)

    @patch('accounts.signals.requests.get')
    def test_create_user_profile_with_social_avatar(self, mock_get):
        # Mock the response from requests.get
        mock_response = MagicMock()
        mock_response.status_code = 200
        mock_response.content = b'fake-image-content'
        mock_get.return_value = mock_response

        # Create a social account for the user
        social_account = self.user.socialaccount_set.create(
            provider='google',
            extra_data={'picture': 'http://example.com/avatar.png'}
        )

        # The signal should have been triggered when the user was created,
        # but we can trigger it again to test the avatar download
        from .signals import create_user_profile
        create_user_profile(sender=User, instance=self.user, created=True)

        self.user.profile.refresh_from_db()
        self.assertTrue(self.user.profile.avatar.name.startswith('avatars/avatar_testuser'))

    def test_save_user_profile(self):
        self.user.profile.delete()
        self.assertFalse(hasattr(self.user, 'profile'))
        self.user.save()
        self.assertTrue(hasattr(self.user, 'profile'))
        self.assertIsInstance(self.user.profile, UserProfile)

    def test_sync_user_role_with_groups(self):
        self.user.role = User.Roles.MODERATOR
        self.user.save()
        self.assertTrue(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
        self.assertTrue(self.user.is_staff)

        self.user.role = User.Roles.ADMIN
        self.user.save()
        self.assertFalse(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
        self.assertTrue(self.user.groups.filter(name=User.Roles.ADMIN).exists())
        self.assertTrue(self.user.is_staff)

        self.user.role = User.Roles.SUPERUSER
        self.user.save()
        self.assertFalse(self.user.groups.filter(name=User.Roles.ADMIN).exists())
        self.assertTrue(self.user.groups.filter(name=User.Roles.SUPERUSER).exists())
        self.assertTrue(self.user.is_superuser)
        self.assertTrue(self.user.is_staff)

        self.user.role = User.Roles.USER
        self.user.save()
        self.assertFalse(self.user.groups.exists())
        self.assertFalse(self.user.is_superuser)
        self.assertFalse(self.user.is_staff)

    def test_create_default_groups(self):
        # Create some permissions for testing
        content_type = ContentType.objects.get_for_model(User)
        Permission.objects.create(codename='change_review', name='Can change review', content_type=content_type)
        Permission.objects.create(codename='delete_review', name='Can delete review', content_type=content_type)
        Permission.objects.create(codename='change_user', name='Can change user', content_type=content_type)

        create_default_groups()

        moderator_group = Group.objects.get(name=User.Roles.MODERATOR)
        self.assertIsNotNone(moderator_group)
        self.assertTrue(moderator_group.permissions.filter(codename='change_review').exists())
        self.assertFalse(moderator_group.permissions.filter(codename='change_user').exists())

        admin_group = Group.objects.get(name=User.Roles.ADMIN)
        self.assertIsNotNone(admin_group)
        self.assertTrue(admin_group.permissions.filter(codename='change_review').exists())
        self.assertTrue(admin_group.permissions.filter(codename='change_user').exists())
@@ -1,25 +0,0 @@
from django.urls import path
from django.contrib.auth import views as auth_views
from allauth.account.views import LogoutView
from . import views

app_name = 'accounts'

urlpatterns = [
    # Override allauth's login and signup views with our Turnstile-enabled versions
    path('login/', views.CustomLoginView.as_view(), name='account_login'),
    path('signup/', views.CustomSignupView.as_view(), name='account_signup'),

    # Authentication views
    path('logout/', LogoutView.as_view(), name='logout'),
    path('password_change/', auth_views.PasswordChangeView.as_view(), name='password_change'),
    path('password_change/done/', auth_views.PasswordChangeDoneView.as_view(), name='password_change_done'),
    path('password_reset/', auth_views.PasswordResetView.as_view(), name='password_reset'),
    path('password_reset/done/', auth_views.PasswordResetDoneView.as_view(), name='password_reset_done'),
    path('reset/<uidb64>/<token>/', auth_views.PasswordResetConfirmView.as_view(), name='password_reset_confirm'),
    path('reset/done/', auth_views.PasswordResetCompleteView.as_view(), name='password_reset_complete'),

    # Profile views
    path('profile/', views.user_redirect_view, name='profile_redirect'),
    path('settings/', views.SettingsView.as_view(), name='settings'),
]
@@ -1,391 +0,0 @@
from django.views.generic import DetailView, TemplateView
from django.contrib.auth import get_user_model
from django.shortcuts import get_object_or_404, redirect, render
from django.contrib.auth.decorators import login_required
from django.contrib.auth.mixins import LoginRequiredMixin
from django.contrib import messages
from django.core.exceptions import ValidationError
from allauth.socialaccount.providers.google.views import GoogleOAuth2Adapter
from allauth.socialaccount.providers.discord.views import DiscordOAuth2Adapter
from allauth.socialaccount.providers.oauth2.client import OAuth2Client
from django.conf import settings
from django.core.mail import send_mail
from django.template.loader import render_to_string
from django.utils.crypto import get_random_string
from django.utils import timezone
from datetime import timedelta
from django.contrib.sites.shortcuts import get_current_site
from django.db.models import Prefetch, QuerySet
from django.http import HttpResponseRedirect, HttpResponse, HttpRequest
from django.urls import reverse
from django.contrib.auth import login
from django.core.files.uploadedfile import UploadedFile
from accounts.models import User, PasswordReset, TopList, EmailVerification, UserProfile
from email_service.services import EmailService
from parks.models import ParkReview
from rides.models import RideReview
from allauth.account.views import LoginView, SignupView
from .mixins import TurnstileMixin
from typing import Dict, Any, Optional, Union, cast, TYPE_CHECKING
from django_htmx.http import HttpResponseClientRefresh
from django.contrib.sites.models import Site
from django.contrib.sites.requests import RequestSite
from contextlib import suppress
import re

if TYPE_CHECKING:
    from django.contrib.sites.models import Site
    from django.contrib.sites.requests import RequestSite

UserModel = get_user_model()

class CustomLoginView(TurnstileMixin, LoginView):
    def form_valid(self, form):
        try:
            self.validate_turnstile(self.request)
        except ValidationError as e:
            form.add_error(None, str(e))
            return self.form_invalid(form)

        response = super().form_valid(form)
        return HttpResponseClientRefresh() if getattr(self.request, 'htmx', False) else response

    def form_invalid(self, form):
        if getattr(self.request, 'htmx', False):
            return render(
                self.request,
                'account/partials/login_form.html',
                self.get_context_data(form=form)
            )
        return super().form_invalid(form)

    def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
        if getattr(request, 'htmx', False):
            return render(
                request,
                'account/partials/login_modal.html',
                self.get_context_data()
            )
        return super().get(request, *args, **kwargs)

class CustomSignupView(TurnstileMixin, SignupView):
    def form_valid(self, form):
        try:
            self.validate_turnstile(self.request)
        except ValidationError as e:
            form.add_error(None, str(e))
            return self.form_invalid(form)

        response = super().form_valid(form)
        return HttpResponseClientRefresh() if getattr(self.request, 'htmx', False) else response

    def form_invalid(self, form):
        if getattr(self.request, 'htmx', False):
            return render(
                self.request,
                'account/partials/signup_modal.html',
                self.get_context_data(form=form)
            )
        return super().form_invalid(form)

    def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
        if getattr(request, 'htmx', False):
            return render(
                request,
                'account/partials/signup_modal.html',
                self.get_context_data()
            )
        return super().get(request, *args, **kwargs)

@login_required
def user_redirect_view(request: HttpRequest) -> HttpResponse:
    user = cast(User, request.user)
    return redirect('profile', username=user.username)

def handle_social_login(request: HttpRequest, email: str) -> HttpResponse:
    if sociallogin := request.session.get('socialaccount_sociallogin'):
        sociallogin.user.email = email
        sociallogin.save()
        login(request, sociallogin.user)
        del request.session['socialaccount_sociallogin']
        messages.success(request, 'Successfully logged in')
    return redirect('/')

def email_required(request: HttpRequest) -> HttpResponse:
    if not request.session.get('socialaccount_sociallogin'):
        messages.error(request, 'No social login in progress')
        return redirect('/')

    if request.method == 'POST':
        if email := request.POST.get('email'):
            return handle_social_login(request, email)
        messages.error(request, 'Email is required')
        return render(request, 'accounts/email_required.html', {'error': 'Email is required'})

    return render(request, 'accounts/email_required.html')

class ProfileView(DetailView):
    model = User
    template_name = 'accounts/profile.html'
    context_object_name = 'profile_user'
    slug_field = 'username'
    slug_url_kwarg = 'username'

    def get_queryset(self) -> QuerySet[User]:
        return User.objects.select_related('profile')

    def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
        context = super().get_context_data(**kwargs)
        user = cast(User, self.get_object())

        context['park_reviews'] = self._get_user_park_reviews(user)
        context['ride_reviews'] = self._get_user_ride_reviews(user)
        context['top_lists'] = self._get_user_top_lists(user)

        return context

    def _get_user_park_reviews(self, user: User) -> QuerySet[ParkReview]:
        return ParkReview.objects.filter(
            user=user,
            is_published=True
        ).select_related(
            'user',
            'user__profile',
            'park'
        ).order_by('-created_at')[:5]

    def _get_user_ride_reviews(self, user: User) -> QuerySet[RideReview]:
        return RideReview.objects.filter(
            user=user,
            is_published=True
        ).select_related(
            'user',
            'user__profile',
            'ride'
        ).order_by('-created_at')[:5]

    def _get_user_top_lists(self, user: User) -> QuerySet[TopList]:
        return TopList.objects.filter(
            user=user
        ).select_related(
            'user',
            'user__profile'
        ).prefetch_related(
            'items'
        ).order_by('-created_at')[:5]

class SettingsView(LoginRequiredMixin, TemplateView):
    template_name = 'accounts/settings.html'

    def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
        context = super().get_context_data(**kwargs)
        context['user'] = self.request.user
        return context

    def _handle_profile_update(self, request: HttpRequest) -> None:
        user = cast(User, request.user)
        profile = get_object_or_404(UserProfile, user=user)

        if display_name := request.POST.get('display_name'):
            profile.display_name = display_name

        if 'avatar' in request.FILES:
            avatar_file = cast(UploadedFile, request.FILES['avatar'])
            profile.avatar.save(avatar_file.name, avatar_file, save=False)
        profile.save()

        user.save()
        messages.success(request, 'Profile updated successfully')

    def _validate_password(self, password: str) -> bool:
        """Validate password meets requirements."""
        return (
            len(password) >= 8 and
            bool(re.search(r'[A-Z]', password)) and
            bool(re.search(r'[a-z]', password)) and
            bool(re.search(r'[0-9]', password))
        )

    def _send_password_change_confirmation(self, request: HttpRequest, user: User) -> None:
        """Send password change confirmation email."""
        site = get_current_site(request)
        context = {
            'user': user,
            'site_name': site.name,
        }

        email_html = render_to_string('accounts/email/password_change_confirmation.html', context)

        EmailService.send_email(
            to=user.email,
            subject='Password Changed Successfully',
            text='Your password has been changed successfully.',
            site=site,
            html=email_html
        )

    def _handle_password_change(self, request: HttpRequest) -> Optional[HttpResponseRedirect]:
        user = cast(User, request.user)
        old_password = request.POST.get('old_password', '')
        new_password = request.POST.get('new_password', '')
        confirm_password = request.POST.get('confirm_password', '')

        if not user.check_password(old_password):
            messages.error(request, 'Current password is incorrect')
            return None

        if new_password != confirm_password:
            messages.error(request, 'New passwords do not match')
            return None

        if not self._validate_password(new_password):
            messages.error(request, 'Password must be at least 8 characters and contain uppercase, lowercase, and numbers')
            return None

        user.set_password(new_password)
        user.save()

        self._send_password_change_confirmation(request, user)
        messages.success(request, 'Password changed successfully. Please check your email for confirmation.')
        return HttpResponseRedirect(reverse('account_login'))

    def _handle_email_change(self, request: HttpRequest) -> None:
        if new_email := request.POST.get('new_email'):
            self._send_email_verification(request, new_email)
            messages.success(request, 'Verification email sent to your new email address')
        else:
            messages.error(request, 'New email is required')

    def _send_email_verification(self, request: HttpRequest, new_email: str) -> None:
        user = cast(User, request.user)
        token = get_random_string(64)
        EmailVerification.objects.update_or_create(
            user=user,
            defaults={'token': token}
        )

        site = cast(Site, get_current_site(request))
        verification_url = reverse('verify_email', kwargs={'token': token})

        context = {
            'user': user,
            'verification_url': verification_url,
            'site_name': site.name,
        }

        email_html = render_to_string('accounts/email/verify_email.html', context)
        EmailService.send_email(
            to=new_email,
            subject='Verify your new email address',
            text='Click the link to verify your new email address',
            site=site,
            html=email_html
        )

        user.pending_email = new_email
        user.save()

    def post(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
        action = request.POST.get('action')

        if action == 'update_profile':
            self._handle_profile_update(request)
        elif action == 'change_password':
            if response := self._handle_password_change(request):
                return response
        elif action == 'change_email':
            self._handle_email_change(request)

        return self.get(request, *args, **kwargs)

def create_password_reset_token(user: User) -> str:
    token = get_random_string(64)
    PasswordReset.objects.update_or_create(
        user=user,
        defaults={
            'token': token,
            'expires_at': timezone.now() + timedelta(hours=24)
        }
    )
    return token

def send_password_reset_email(user: User, site: Union[Site, RequestSite], token: str) -> None:
    reset_url = reverse('password_reset_confirm', kwargs={'token': token})
    context = {
        'user': user,
        'reset_url': reset_url,
        'site_name': site.name,
    }
    email_html = render_to_string('accounts/email/password_reset.html', context)

    EmailService.send_email(
        to=user.email,
        subject='Reset your password',
        text='Click the link to reset your password',
        site=site,
        html=email_html
    )

def request_password_reset(request: HttpRequest) -> HttpResponse:
    if request.method != 'POST':
        return render(request, 'accounts/password_reset.html')

    if not (email := request.POST.get('email')):
        messages.error(request, 'Email is required')
        return redirect('account_reset_password')

    with suppress(User.DoesNotExist):
        user = User.objects.get(email=email)
        token = create_password_reset_token(user)
        site = get_current_site(request)
        send_password_reset_email(user, site, token)

    messages.success(request, 'Password reset email sent')
    return redirect('account_login')

def handle_password_reset(request: HttpRequest, user: User, new_password: str, reset: PasswordReset, site: Union[Site, RequestSite]) -> None:
    user.set_password(new_password)
    user.save()

    reset.used = True
    reset.save()

    send_password_reset_confirmation(user, site)
    messages.success(request, 'Password reset successfully')

def send_password_reset_confirmation(user: User, site: Union[Site, RequestSite]) -> None:
    context = {
        'user': user,
        'site_name': site.name,
    }
    email_html = render_to_string('accounts/email/password_reset_complete.html', context)

    EmailService.send_email(
        to=user.email,
        subject='Password Reset Complete',
        text='Your password has been reset successfully.',
        site=site,
        html=email_html
    )

def reset_password(request: HttpRequest, token: str) -> HttpResponse:
    try:
        reset = PasswordReset.objects.select_related('user').get(
            token=token,
            expires_at__gt=timezone.now(),
            used=False
        )

        if request.method == 'POST':
            if new_password := request.POST.get('new_password'):
                site = get_current_site(request)
                handle_password_reset(request, reset.user, new_password, reset, site)
                return redirect('account_login')

            messages.error(request, 'New password is required')

        return render(request, 'accounts/password_reset_confirm.html', {'token': token})

    except PasswordReset.DoesNotExist:
        messages.error(request, 'Invalid or expired reset token')
        return redirect('account_reset_password')
architecture/architecture-validation.md (new file, 372 lines)
@@ -0,0 +1,372 @@
# ThrillWiki Monorepo Architecture Validation

This document provides a comprehensive review and validation of the proposed monorepo architecture for migrating ThrillWiki from Django-only to Django + Vue.js.

## Architecture Overview Validation

### ✅ Core Requirements Met

1. **Clean Separation of Concerns**
   - Backend: Django API, business logic, database management
   - Frontend: Vue.js SPA with modern tooling
   - Shared: Common resources and media files

2. **Development Workflow Preservation**
   - UV package management for Python maintained
   - pnpm for Node.js package management
   - Existing development scripts adapted
   - Hot reloading for both backend and frontend

3. **Project Structure Compatibility**
   - Django apps preserved under `backend/apps/`
   - Configuration maintained under `backend/config/`
   - Static files strategy clearly defined
   - Media files centralized in `shared/media/`

## Technical Architecture Validation

### Backend Architecture ✅

```mermaid
graph TB
    A[Django Backend] --> B[Apps Directory]
    A --> C[Config Directory]
    A --> D[Static Files]

    B --> E[accounts]
    B --> F[parks]
    B --> G[rides]
    B --> H[moderation]
    B --> I[location]
    B --> J[media]
    B --> K[email_service]
    B --> L[core]

    C --> M[Django Settings]
    C --> N[URL Configuration]
    C --> O[WSGI/ASGI]

    D --> P[Admin Assets]
    D --> Q[Backend Static]
```

**Validation Points:**
- ✅ All 8 Django apps properly mapped to new structure
- ✅ Configuration files maintain their organization
- ✅ Static file handling preserves Django admin functionality
- ✅ UV package management integration maintained

### Frontend Architecture ✅

```mermaid
graph TB
    A[Vue.js Frontend] --> B[Source Code]
    A --> C[Build System]
    A --> D[Development Tools]

    B --> E[Components]
    B --> F[Views/Pages]
    B --> G[Router]
    B --> H[State Management]
    B --> I[API Layer]

    C --> J[Vite]
    C --> K[TypeScript]
    C --> L[Tailwind CSS]

    D --> M[Hot Reload]
    D --> N[Dev Server]
    D --> O[Build Tools]
```

**Validation Points:**
- ✅ Modern Vue.js 3 + Composition API
- ✅ TypeScript for type safety
- ✅ Vite for fast development and builds
- ✅ Tailwind CSS for styling (matching current setup)
- ✅ Pinia for state management
- ✅ Vue Router for SPA navigation

### Integration Architecture ✅

```mermaid
graph LR
    A[Vue.js Frontend] --> B[HTTP API Calls]
    B --> C[Django REST API]
    C --> D[Database]
    C --> E[Media Files]
    E --> F[Shared Media Directory]
    F --> G[Frontend Access]
```

**Validation Points:**
- ✅ RESTful API integration between frontend and backend
- ✅ Media files accessible to both systems
- ✅ Authentication handling via API tokens
- ✅ CORS configuration for cross-origin requests
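The CORS and token-authentication points above can be expressed as a small settings sketch. This is illustrative only: it assumes `django-cors-headers` and Django REST Framework are added to the backend, and the allowed origin mirrors the Vite dev server address used elsewhere in this repo; the final origins, app labels, and auth classes would come from the real configuration.

```python
# backend/config/settings.py -- illustrative sketch, not the final config
INSTALLED_APPS += [
    "corsheaders",              # assumes django-cors-headers is installed
    "rest_framework",
    "rest_framework.authtoken",
]

# CORS middleware must run before CommonMiddleware
MIDDLEWARE = ["corsheaders.middleware.CorsMiddleware"] + MIDDLEWARE

# Allow the Vite dev server to call the Django API during development
CORS_ALLOWED_ORIGINS = ["http://localhost:5173"]

REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": [
        "rest_framework.authentication.TokenAuthentication",
        "rest_framework.authentication.SessionAuthentication",
    ],
}
```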
## File Migration Validation

### Critical File Mappings ✅

| Component | Current | New Location | Status |
|-----------|---------|--------------|--------|
| Django Apps | `/apps/` | `/backend/apps/` | ✅ Mapped |
| Configuration | `/config/` | `/backend/config/` | ✅ Mapped |
| Static Files | `/static/` | `/backend/static/` | ✅ Mapped |
| Media Files | `/media/` | `/shared/media/` | ✅ Mapped |
| Scripts | `/scripts/` | `/scripts/` | ✅ Preserved |
| Dependencies | `/pyproject.toml` | `/backend/pyproject.toml` | ✅ Mapped |

### Import Path Updates Required ✅

**Django Settings Updates:**
```python
# OLD
INSTALLED_APPS = [
    'accounts',
    'parks',
    'rides',
    # ...
]

# NEW
INSTALLED_APPS = [
    'apps.accounts',
    'apps.parks',
    'apps.rides',
    # ...
]
```

**Media Path Updates:**
```python
# NEW
MEDIA_ROOT = BASE_DIR.parent / 'shared' / 'media'
```
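As a sanity check for the relocated `shared/media` directory, a development-only URL pattern can expose those files through Django while the Vue dev server proxies to it. A minimal sketch, assuming the standard `static()` helper and the `MEDIA_ROOT` shown above:

```python
# backend/config/urls.py -- development-only sketch
from django.conf import settings
from django.conf.urls.static import static

urlpatterns = [
    # ... existing routes ...
]

if settings.DEBUG:
    # Serve files from shared/media so both Django templates and the SPA can reach them
    urlpatterns += static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
```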
## Development Workflow Validation
|
||||||
|
|
||||||
|
### Package Management ✅
|
||||||
|
|
||||||
|
**Backend (UV):**
|
||||||
|
- ✅ `uv add <package>` for new dependencies
|
||||||
|
- ✅ `uv run manage.py <command>` for Django commands
|
||||||
|
- ✅ `uv sync` for dependency installation
|
||||||
|
|
||||||
|
**Frontend (pnpm):**
|
||||||
|
- ✅ `pnpm add <package>` for new dependencies
|
||||||
|
- ✅ `pnpm install` for dependency installation
|
||||||
|
- ✅ `pnpm run dev` for development server
|
||||||
|
|
||||||
|
**Root Workspace:**
|
||||||
|
- ✅ `pnpm run dev` starts both servers concurrently
|
||||||
|
- ✅ Individual server commands available
|
||||||
|
- ✅ Build and test scripts coordinated
|
||||||
|
|
||||||
|
### Development Scripts ✅
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Root level coordination
|
||||||
|
pnpm run dev # Both servers
|
||||||
|
pnpm run backend:dev # Django only
|
||||||
|
pnpm run frontend:dev # Vue.js only
|
||||||
|
pnpm run build # Production build
|
||||||
|
pnpm run test # All tests
|
||||||
|
pnpm run lint # All linting
|
||||||
|
pnpm run format # Code formatting
|
||||||
|
```
|
||||||
|
|
||||||
|
## Deployment Strategy Validation
|
||||||
|
|
||||||
|
### Container Strategy ✅
|
||||||
|
|
||||||
|
**Multi-container Approach:**
|
||||||
|
- ✅ Separate containers for backend and frontend
|
||||||
|
- ✅ Shared volumes for media files
|
||||||
|
- ✅ Database and Redis containers
|
||||||
|
- ✅ Nginx reverse proxy configuration
|
||||||
|
|
||||||
|
**Build Process:**
|
||||||
|
- ✅ Backend: Django static collection + uv dependencies
|
||||||
|
- ✅ Frontend: Vite production build + asset optimization
|
||||||
|
- ✅ Shared: Media file persistence across deployments
|
||||||
|
|
||||||
|
### Platform Compatibility ✅
|
||||||
|
|
||||||
|
**Supported Deployment Platforms:**
|
||||||
|
- ✅ Docker Compose (local and production)
|
||||||
|
- ✅ Vercel (frontend + serverless backend)
|
||||||
|
- ✅ Railway (container deployment)
|
||||||
|
- ✅ DigitalOcean App Platform
|
||||||
|
- ✅ AWS ECS/Fargate
|
||||||
|
- ✅ Google Cloud Run
|
||||||
|
|
||||||
|
## Performance Considerations ✅
|
||||||
|
|
||||||
|
### Backend Optimization
|
||||||
|
- ✅ Database connection pooling
|
||||||
|
- ✅ Redis caching strategy
|
||||||
|
- ✅ Static file CDN integration
|
||||||
|
- ✅ API response optimization
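
The last item is mostly a query-discipline concern. A minimal sketch of what it tends to mean in practice with DRF, pagination plus eager loading; the model and relation names (`operator`, `rides`) are assumptions for illustration, not the project's confirmed schema:

```python
# Hypothetical apps/parks/views.py excerpt
from rest_framework import viewsets
from rest_framework.pagination import PageNumberPagination

from .models import Park
from .serializers import ParkSerializer


class DefaultPagination(PageNumberPagination):
    page_size = 25  # keep payloads small and response times predictable


class ParkViewSet(viewsets.ReadOnlyModelViewSet):
    serializer_class = ParkSerializer
    pagination_class = DefaultPagination

    def get_queryset(self):
        # select_related/prefetch_related avoid N+1 queries when the
        # serializer includes operator and ride data.
        return (
            Park.objects.select_related('operator')
            .prefetch_related('rides')
        )
```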
|
||||||
|
|
||||||
|
### Frontend Optimization
|
||||||
|
- ✅ Code splitting and lazy loading
|
||||||
|
- ✅ Asset optimization with Vite
|
||||||
|
- ✅ Tree shaking for minimal bundle size
|
||||||
|
- ✅ Modern build targets
|
||||||
|
|
||||||
|
### Development Performance
|
||||||
|
- ✅ Hot module replacement for Vue.js
|
||||||
|
- ✅ Django auto-reload for backend changes
|
||||||
|
- ✅ Fast dependency installation with UV and pnpm
|
||||||
|
- ✅ Concurrent development servers
|
||||||
|
|
||||||
|
## Security Validation ✅
|
||||||
|
|
||||||
|
### Backend Security
|
||||||
|
- ✅ Django security middleware maintained
|
||||||
|
- ✅ CORS configuration for API access
|
||||||
|
- ✅ Authentication token management
|
||||||
|
- ✅ Input validation and sanitization
|
||||||
|
|
||||||
|
### Frontend Security
|
||||||
|
- ✅ Content Security Policy headers
|
||||||
|
- ✅ XSS protection mechanisms
|
||||||
|
- ✅ Secure API communication (HTTPS)
|
||||||
|
- ✅ Environment variable protection
|
||||||
|
|
||||||
|
### Deployment Security
|
||||||
|
- ✅ SSL/TLS termination
|
||||||
|
- ✅ Security headers configuration
|
||||||
|
- ✅ Secret management strategy
|
||||||
|
- ✅ Container security best practices
|
||||||
|
|
||||||
|
## Risk Assessment and Mitigation
|
||||||
|
|
||||||
|
### Low Risk Items ✅
|
||||||
|
- **File organization**: Clear mapping and systematic approach
|
||||||
|
- **Package management**: Both UV and pnpm are stable and well-supported
|
||||||
|
- **Development workflow**: Incremental changes to existing process
|
||||||
|
|
||||||
|
### Medium Risk Items ⚠️
|
||||||
|
- **Import path updates**: Requires careful testing of all Django apps
|
||||||
|
- **Static file handling**: Need to verify Django admin continues working
|
||||||
|
- **API integration**: New frontend-backend communication layer
|
||||||
|
|
||||||
|
**Mitigation Strategies:**
|
||||||
|
- Comprehensive testing suite for Django apps after migration
|
||||||
|
- Static file serving verification in development and production
|
||||||
|
- API endpoint testing and documentation
|
||||||
|
- Gradual migration approach with rollback capabilities
|
||||||
|
|
||||||
|
### High Risk Items 🔴
|
||||||
|
- **Data migration**: Database changes during restructuring
|
||||||
|
- **Production deployment**: New deployment process requires validation
|
||||||
|
|
||||||
|
**Mitigation Strategies:**
|
||||||
|
- Database backup before any structural changes
|
||||||
|
- Staging environment testing before production deployment
|
||||||
|
- Blue-green deployment strategy for zero-downtime migration
|
||||||
|
- Monitoring and alerting for post-migration issues
|
||||||
|
|
||||||
|
## Testing Strategy Validation
|
||||||
|
|
||||||
|
### Backend Testing ✅
|
||||||
|
```bash
|
||||||
|
# Django tests
|
||||||
|
cd backend
|
||||||
|
uv run manage.py test
|
||||||
|
|
||||||
|
# Code quality
|
||||||
|
uv run flake8 .
|
||||||
|
uv run black --check .
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend Testing ✅
|
||||||
|
```bash
|
||||||
|
# Vue.js tests
|
||||||
|
cd frontend
|
||||||
|
pnpm run test
|
||||||
|
pnpm run test:unit
|
||||||
|
pnpm run test:e2e
|
||||||
|
|
||||||
|
# Code quality
|
||||||
|
pnpm run lint
|
||||||
|
pnpm run type-check
|
||||||
|
```
|
||||||
|
|
||||||
|
### Integration Testing ✅
|
||||||
|
- API endpoint testing
|
||||||
|
- Frontend-backend communication testing
|
||||||
|
- Media file access testing
|
||||||
|
- Authentication flow testing
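
A minimal sketch of what the API and authentication checks above can look like on the Django side, using DRF's test client; the endpoint path and user fields are assumptions rather than the project's confirmed API:

```python
# Hypothetical backend/tests/test_api_integration.py
from django.contrib.auth import get_user_model
from rest_framework import status
from rest_framework.test import APITestCase


class ParkApiIntegrationTests(APITestCase):
    def setUp(self):
        self.user = get_user_model().objects.create_user(
            username='tester', password='not-a-real-password'
        )

    def test_parks_endpoint_is_reachable(self):
        response = self.client.get('/api/parks/')
        # Anonymous access may be allowed or rejected depending on settings.
        self.assertIn(
            response.status_code,
            (status.HTTP_200_OK, status.HTTP_401_UNAUTHORIZED),
        )

    def test_authenticated_request_succeeds(self):
        self.client.force_authenticate(user=self.user)
        response = self.client.get('/api/parks/')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
```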
|
||||||
|
|
||||||
|
## Documentation Validation ✅
|
||||||
|
|
||||||
|
### Created Documentation
|
||||||
|
- ✅ **Monorepo Structure Plan**: Complete directory organization
|
||||||
|
- ✅ **Migration Mapping**: File-by-file migration guide
|
||||||
|
- ✅ **Deployment Guide**: Comprehensive deployment strategies
|
||||||
|
- ✅ **Architecture Validation**: This validation document
|
||||||
|
|
||||||
|
### Required Updates
|
||||||
|
- ✅ Root README.md update for monorepo structure
|
||||||
|
- ✅ Development setup instructions
|
||||||
|
- ✅ API documentation for frontend integration
|
||||||
|
- ✅ Deployment runbooks
|
||||||
|
|
||||||
|
## Implementation Readiness Assessment
|
||||||
|
|
||||||
|
### Prerequisites Met ✅
|
||||||
|
- [x] Current Django project analysis complete
|
||||||
|
- [x] Monorepo structure designed
|
||||||
|
- [x] File migration strategy defined
|
||||||
|
- [x] Development workflow planned
|
||||||
|
- [x] Deployment strategy documented
|
||||||
|
- [x] Risk assessment completed
|
||||||
|
|
||||||
|
### Ready for Implementation ✅
|
||||||
|
- [x] Clear step-by-step migration plan
|
||||||
|
- [x] File mapping completeness verified
|
||||||
|
- [x] Package management strategy confirmed
|
||||||
|
- [x] Testing approach defined
|
||||||
|
- [x] Rollback strategy available
|
||||||
|
|
||||||
|
### Success Criteria Defined ✅
|
||||||
|
1. **Functional Requirements**
|
||||||
|
- All existing Django functionality preserved
|
||||||
|
- Modern Vue.js frontend operational
|
||||||
|
- API integration working correctly
|
||||||
|
- Media file handling functional
|
||||||
|
|
||||||
|
2. **Performance Requirements**
|
||||||
|
- Development servers start within reasonable time
|
||||||
|
- Build process completes successfully
|
||||||
|
- Production deployment successful
|
||||||
|
|
||||||
|
3. **Quality Requirements**
|
||||||
|
- All tests passing after migration
|
||||||
|
- Code quality standards maintained
|
||||||
|
- Documentation updated and complete
|
||||||
|
|
||||||
|
## Final Recommendation ✅
|
||||||
|
|
||||||
|
**Approval Status: APPROVED FOR IMPLEMENTATION**
|
||||||
|
|
||||||
|
The proposed monorepo architecture for ThrillWiki is comprehensive, well-planned, and ready for implementation. The plan demonstrates:
|
||||||
|
|
||||||
|
1. **Technical Soundness**: Architecture follows modern best practices
|
||||||
|
2. **Risk Management**: Potential issues identified with mitigation strategies
|
||||||
|
3. **Implementation Clarity**: Clear step-by-step migration process
|
||||||
|
4. **Operational Readiness**: Deployment and maintenance procedures defined
|
||||||
|
|
||||||
|
**Next Steps:**
|
||||||
|
1. Switch to **Code Mode** for implementation
|
||||||
|
2. Begin with directory structure creation
|
||||||
|
3. Migrate backend files systematically
|
||||||
|
4. Create Vue.js frontend application
|
||||||
|
5. Test integration between systems
|
||||||
|
6. Update deployment configurations
|
||||||
|
|
||||||
|
The architecture provides a solid foundation for scaling ThrillWiki with modern frontend technologies while preserving the robust Django backend functionality.
|
||||||
628 architecture/deployment-guide.md (Normal file)
@@ -0,0 +1,628 @@
|
|||||||
|
# ThrillWiki Monorepo Deployment Guide
|
||||||
|
|
||||||
|
This document outlines deployment strategies, build processes, and infrastructure considerations for the ThrillWiki Django + Vue.js monorepo.
|
||||||
|
|
||||||
|
## Build Process Overview
|
||||||
|
|
||||||
|
```mermaid
|
||||||
|
graph TB
|
||||||
|
A[Source Code] --> B[Backend Build]
|
||||||
|
A --> C[Frontend Build]
|
||||||
|
B --> D[Django Static Collection]
|
||||||
|
C --> E[Vue.js Production Build]
|
||||||
|
D --> F[Backend Container]
|
||||||
|
E --> G[Frontend Assets]
|
||||||
|
F --> H[Production Deployment]
|
||||||
|
G --> H
|
||||||
|
```
|
||||||
|
|
||||||
|
## Development Environment
|
||||||
|
|
||||||
|
### Prerequisites
|
||||||
|
- Python 3.11+ with UV package manager
|
||||||
|
- Node.js 18+ with pnpm
|
||||||
|
- PostgreSQL (production) / SQLite (development)
|
||||||
|
- Redis (for caching and sessions)
|
||||||
|
|
||||||
|
### Local Development Setup
|
||||||
|
```bash
|
||||||
|
# Clone repository
|
||||||
|
git clone <repository-url>
|
||||||
|
cd thrillwiki-monorepo
|
||||||
|
|
||||||
|
# Install root dependencies
|
||||||
|
pnpm install
|
||||||
|
|
||||||
|
# Backend setup
|
||||||
|
cd backend
|
||||||
|
uv sync
|
||||||
|
uv run manage.py migrate
|
||||||
|
uv run manage.py collectstatic
|
||||||
|
|
||||||
|
# Frontend setup
|
||||||
|
cd ../frontend
|
||||||
|
pnpm install
|
||||||
|
|
||||||
|
# Start development servers
|
||||||
|
cd ..
|
||||||
|
pnpm run dev # Starts both backend and frontend
|
||||||
|
```
|
||||||
|
|
||||||
|
## Build Strategies
|
||||||
|
|
||||||
|
### 1. Containerized Deployment (Recommended)
|
||||||
|
|
||||||
|
#### Multi-stage Dockerfile for Backend
|
||||||
|
```dockerfile
|
||||||
|
# backend/Dockerfile
|
||||||
|
FROM python:3.11-slim as builder
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY pyproject.toml uv.lock ./
|
||||||
|
RUN pip install uv
|
||||||
|
RUN uv sync --no-dev
|
||||||
|
|
||||||
|
FROM python:3.11-slim as runtime
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY --from=builder /app/.venv /app/.venv
|
||||||
|
ENV PATH="/app/.venv/bin:$PATH"
|
||||||
|
|
||||||
|
COPY . .
|
||||||
|
RUN python manage.py collectstatic --noinput
|
||||||
|
|
||||||
|
EXPOSE 8000
|
||||||
|
CMD ["gunicorn", "config.wsgi:application", "--bind", "0.0.0.0:8000"]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Dockerfile for Frontend
|
||||||
|
```dockerfile
|
||||||
|
# frontend/Dockerfile
|
||||||
|
FROM node:18-alpine as builder
|
||||||
|
|
||||||
|
WORKDIR /app
|
||||||
|
COPY package.json pnpm-lock.yaml ./
|
||||||
|
RUN npm install -g pnpm
|
||||||
|
RUN pnpm install --frozen-lockfile
|
||||||
|
|
||||||
|
COPY . .
|
||||||
|
RUN pnpm run build
|
||||||
|
|
||||||
|
FROM nginx:alpine as runtime
|
||||||
|
COPY --from=builder /app/dist /usr/share/nginx/html
|
||||||
|
COPY nginx.conf /etc/nginx/nginx.conf
|
||||||
|
EXPOSE 80
|
||||||
|
CMD ["nginx", "-g", "daemon off;"]
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Docker Compose for Development
|
||||||
|
```yaml
|
||||||
|
# docker-compose.dev.yml
|
||||||
|
version: '3.8'
|
||||||
|
|
||||||
|
services:
|
||||||
|
db:
|
||||||
|
image: postgres:15
|
||||||
|
environment:
|
||||||
|
POSTGRES_DB: thrillwiki
|
||||||
|
POSTGRES_USER: thrillwiki
|
||||||
|
POSTGRES_PASSWORD: password
|
||||||
|
volumes:
|
||||||
|
- postgres_data:/var/lib/postgresql/data
|
||||||
|
ports:
|
||||||
|
- "5432:5432"
|
||||||
|
|
||||||
|
redis:
|
||||||
|
image: redis:7-alpine
|
||||||
|
ports:
|
||||||
|
- "6379:6379"
|
||||||
|
|
||||||
|
backend:
|
||||||
|
build:
|
||||||
|
context: ./backend
|
||||||
|
dockerfile: Dockerfile.dev
|
||||||
|
ports:
|
||||||
|
- "8000:8000"
|
||||||
|
volumes:
|
||||||
|
- ./backend:/app
|
||||||
|
- ./shared/media:/app/media
|
||||||
|
environment:
|
||||||
|
- DEBUG=1
|
||||||
|
- DATABASE_URL=postgresql://thrillwiki:password@db:5432/thrillwiki
|
||||||
|
- REDIS_URL=redis://redis:6379/0
|
||||||
|
depends_on:
|
||||||
|
- db
|
||||||
|
- redis
|
||||||
|
|
||||||
|
frontend:
|
||||||
|
build:
|
||||||
|
context: ./frontend
|
||||||
|
dockerfile: Dockerfile.dev
|
||||||
|
ports:
|
||||||
|
- "3000:3000"
|
||||||
|
volumes:
|
||||||
|
- ./frontend:/app
|
||||||
|
- /app/node_modules
|
||||||
|
environment:
|
||||||
|
- VITE_API_URL=http://localhost:8000
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
postgres_data:
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Docker Compose for Production
|
||||||
|
```yaml
|
||||||
|
# docker-compose.prod.yml
|
||||||
|
version: '3.8'
|
||||||
|
|
||||||
|
services:
|
||||||
|
db:
|
||||||
|
image: postgres:15
|
||||||
|
environment:
|
||||||
|
POSTGRES_DB: ${POSTGRES_DB}
|
||||||
|
POSTGRES_USER: ${POSTGRES_USER}
|
||||||
|
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
|
||||||
|
volumes:
|
||||||
|
- postgres_data:/var/lib/postgresql/data
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
redis:
|
||||||
|
image: redis:7-alpine
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
backend:
|
||||||
|
build:
|
||||||
|
context: ./backend
|
||||||
|
dockerfile: Dockerfile
|
||||||
|
environment:
|
||||||
|
- DEBUG=0
|
||||||
|
- DATABASE_URL=${DATABASE_URL}
|
||||||
|
- REDIS_URL=${REDIS_URL}
|
||||||
|
- SECRET_KEY=${SECRET_KEY}
|
||||||
|
- ALLOWED_HOSTS=${ALLOWED_HOSTS}
|
||||||
|
volumes:
|
||||||
|
- ./shared/media:/app/media
|
||||||
|
- static_files:/app/staticfiles
|
||||||
|
depends_on:
|
||||||
|
- db
|
||||||
|
- redis
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
frontend:
|
||||||
|
build:
|
||||||
|
context: ./frontend
|
||||||
|
dockerfile: Dockerfile
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
nginx:
|
||||||
|
image: nginx:alpine
|
||||||
|
ports:
|
||||||
|
- "80:80"
|
||||||
|
- "443:443"
|
||||||
|
volumes:
|
||||||
|
- ./nginx/nginx.conf:/etc/nginx/nginx.conf
|
||||||
|
- ./nginx/ssl:/etc/nginx/ssl
|
||||||
|
- static_files:/usr/share/nginx/html/static
|
||||||
|
- ./shared/media:/usr/share/nginx/html/media
|
||||||
|
depends_on:
|
||||||
|
- backend
|
||||||
|
- frontend
|
||||||
|
restart: unless-stopped
|
||||||
|
|
||||||
|
volumes:
|
||||||
|
postgres_data:
|
||||||
|
static_files:
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Static Site Generation (Alternative)
|
||||||
|
|
||||||
|
For sites with mostly static content, consider pre-rendering:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Frontend build with pre-rendering
|
||||||
|
cd frontend
|
||||||
|
pnpm run build:prerender
|
||||||
|
|
||||||
|
# Serve static files with minimal backend
|
||||||
|
```
|
||||||
|
|
||||||
|
## CI/CD Pipeline
|
||||||
|
|
||||||
|
### GitHub Actions Workflow
|
||||||
|
```yaml
|
||||||
|
# .github/workflows/deploy.yml
|
||||||
|
name: Deploy ThrillWiki
|
||||||
|
|
||||||
|
on:
|
||||||
|
push:
|
||||||
|
branches: [main]
|
||||||
|
pull_request:
|
||||||
|
branches: [main]
|
||||||
|
|
||||||
|
jobs:
|
||||||
|
test:
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
|
||||||
|
services:
|
||||||
|
postgres:
|
||||||
|
image: postgres:15
|
||||||
|
env:
|
||||||
|
POSTGRES_PASSWORD: postgres
|
||||||
|
options: >-
|
||||||
|
--health-cmd pg_isready
|
||||||
|
--health-interval 10s
|
||||||
|
--health-timeout 5s
|
||||||
|
--health-retries 5
|
||||||
|
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
|
- name: Set up Python
|
||||||
|
uses: actions/setup-python@v4
|
||||||
|
with:
|
||||||
|
python-version: '3.11'
|
||||||
|
|
||||||
|
- name: Install UV
|
||||||
|
run: pip install uv
|
||||||
|
|
||||||
|
- name: Backend Tests
|
||||||
|
run: |
|
||||||
|
cd backend
|
||||||
|
uv sync
|
||||||
|
uv run manage.py test
|
||||||
|
uv run flake8 .
|
||||||
|
uv run black --check .
|
||||||
|
|
||||||
|
- name: Set up Node.js
|
||||||
|
uses: actions/setup-node@v4
|
||||||
|
with:
|
||||||
|
node-version: '18'
|
||||||
|
|
||||||
|
- name: Install pnpm
|
||||||
|
run: npm install -g pnpm
|
||||||
|
|
||||||
|
- name: Frontend Tests
|
||||||
|
run: |
|
||||||
|
cd frontend
|
||||||
|
pnpm install --frozen-lockfile
|
||||||
|
pnpm run test
|
||||||
|
pnpm run lint
|
||||||
|
pnpm run type-check
|
||||||
|
|
||||||
|
build:
|
||||||
|
needs: test
|
||||||
|
runs-on: ubuntu-latest
|
||||||
|
if: github.ref == 'refs/heads/main'
|
||||||
|
|
||||||
|
steps:
|
||||||
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
|
- name: Build and push Docker images
|
||||||
|
run: |
|
||||||
|
docker build -t thrillwiki-backend ./backend
|
||||||
|
docker build -t thrillwiki-frontend ./frontend
|
||||||
|
# Push to registry
|
||||||
|
|
||||||
|
- name: Deploy to production
|
||||||
|
run: |
|
||||||
|
# Deploy using your preferred method
|
||||||
|
# (AWS ECS, GCP Cloud Run, Azure Container Instances, etc.)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Platform-Specific Deployments
|
||||||
|
|
||||||
|
### 1. Vercel Deployment (Frontend + API)
|
||||||
|
|
||||||
|
```json
|
||||||
|
// vercel.json
|
||||||
|
{
|
||||||
|
"version": 2,
|
||||||
|
"builds": [
|
||||||
|
{
|
||||||
|
"src": "frontend/package.json",
|
||||||
|
"use": "@vercel/static-build",
|
||||||
|
"config": {
|
||||||
|
"distDir": "dist"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"src": "backend/config/wsgi.py",
|
||||||
|
"use": "@vercel/python"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"routes": [
|
||||||
|
{
|
||||||
|
"src": "/api/(.*)",
|
||||||
|
"dest": "backend/config/wsgi.py"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"src": "/(.*)",
|
||||||
|
"dest": "frontend/dist/$1"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Railway Deployment
|
||||||
|
|
||||||
|
```toml
|
||||||
|
# railway.toml
|
||||||
|
[environments.production]
|
||||||
|
|
||||||
|
[environments.production.services.backend]
|
||||||
|
dockerfile = "backend/Dockerfile"
|
||||||
|
variables = { DEBUG = "0" }
|
||||||
|
|
||||||
|
[environments.production.services.frontend]
|
||||||
|
dockerfile = "frontend/Dockerfile"
|
||||||
|
|
||||||
|
[environments.production.services.postgres]
|
||||||
|
image = "postgres:15"
|
||||||
|
variables = { POSTGRES_DB = "thrillwiki" }
|
||||||
|
```
|
||||||
|
|
||||||
|
### 3. DigitalOcean App Platform
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
# .do/app.yaml
|
||||||
|
name: thrillwiki
|
||||||
|
services:
|
||||||
|
- name: backend
|
||||||
|
source_dir: backend
|
||||||
|
github:
|
||||||
|
repo: your-username/thrillwiki-monorepo
|
||||||
|
branch: main
|
||||||
|
run_command: gunicorn config.wsgi:application
|
||||||
|
environment_slug: python
|
||||||
|
instance_count: 1
|
||||||
|
instance_size_slug: basic-xxs
|
||||||
|
envs:
|
||||||
|
- key: DEBUG
|
||||||
|
value: "0"
|
||||||
|
|
||||||
|
- name: frontend
|
||||||
|
source_dir: frontend
|
||||||
|
github:
|
||||||
|
repo: your-username/thrillwiki-monorepo
|
||||||
|
branch: main
|
||||||
|
build_command: pnpm run build
|
||||||
|
run_command: pnpm run preview
|
||||||
|
environment_slug: node-js
|
||||||
|
instance_count: 1
|
||||||
|
instance_size_slug: basic-xxs
|
||||||
|
|
||||||
|
databases:
|
||||||
|
- name: thrillwiki-db
|
||||||
|
engine: PG
|
||||||
|
version: "15"
|
||||||
|
```
|
||||||
|
|
||||||
|
## Environment Configuration
|
||||||
|
|
||||||
|
### Environment Variables
|
||||||
|
|
||||||
|
#### Backend (.env)
|
||||||
|
```bash
|
||||||
|
# Django Settings
|
||||||
|
DEBUG=0
|
||||||
|
SECRET_KEY=your-secret-key-here
|
||||||
|
ALLOWED_HOSTS=yourdomain.com,www.yourdomain.com
|
||||||
|
|
||||||
|
# Database
|
||||||
|
DATABASE_URL=postgresql://user:password@host:port/database
|
||||||
|
|
||||||
|
# Redis
|
||||||
|
REDIS_URL=redis://host:port/0
|
||||||
|
|
||||||
|
# File Storage
|
||||||
|
MEDIA_ROOT=/app/media
|
||||||
|
STATIC_ROOT=/app/staticfiles
|
||||||
|
|
||||||
|
# Email
|
||||||
|
EMAIL_BACKEND=django.core.mail.backends.smtp.EmailBackend
|
||||||
|
EMAIL_HOST=smtp.yourmailprovider.com
|
||||||
|
EMAIL_PORT=587
|
||||||
|
EMAIL_USE_TLS=True
|
||||||
|
EMAIL_HOST_USER=your-email@yourdomain.com
|
||||||
|
EMAIL_HOST_PASSWORD=your-email-password
|
||||||
|
|
||||||
|
# Third-party Services
|
||||||
|
SENTRY_DSN=your-sentry-dsn
|
||||||
|
AWS_ACCESS_KEY_ID=your-aws-key
|
||||||
|
AWS_SECRET_ACCESS_KEY=your-aws-secret
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Frontend (.env.production)
|
||||||
|
```bash
|
||||||
|
VITE_API_URL=https://api.yourdomain.com
|
||||||
|
VITE_APP_TITLE=ThrillWiki
|
||||||
|
VITE_SENTRY_DSN=your-frontend-sentry-dsn
|
||||||
|
VITE_GOOGLE_ANALYTICS_ID=your-ga-id
|
||||||
|
```
|
||||||
|
|
||||||
|
## Performance Optimization
|
||||||
|
|
||||||
|
### Backend Optimizations
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
|
||||||
|
# Database optimization
|
||||||
|
DATABASES = {
|
||||||
|
'default': {
|
||||||
|
'ENGINE': 'django.db.backends.postgresql',
|
||||||
|
'CONN_MAX_AGE': 60,
|
||||||
|
        # Note: pool sizing such as MAX_CONNS is not a valid OPTIONS key for
        # django.db.backends.postgresql; use an external pooler (e.g. pgbouncer)
        # or a pooling backend if connection limits are needed. CONN_MAX_AGE
        # above enables persistent connections.
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
# Caching
|
||||||
|
CACHES = {
|
||||||
|
'default': {
|
||||||
|
'BACKEND': 'django.core.cache.backends.redis.RedisCache',
|
||||||
|
'LOCATION': 'redis://127.0.0.1:6379/1',
|
||||||
|
'OPTIONS': {
|
||||||
|
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
|
||||||
|
},
|
||||||
|
'KEY_PREFIX': 'thrillwiki'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
# Static files with CDN
|
||||||
|
AWS_S3_CUSTOM_DOMAIN = 'cdn.yourdomain.com'
|
||||||
|
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3StaticStorage'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend Optimizations
|
||||||
|
```typescript
|
||||||
|
// frontend/vite.config.ts
|
||||||
|
export default defineConfig({
|
||||||
|
build: {
|
||||||
|
rollupOptions: {
|
||||||
|
output: {
|
||||||
|
manualChunks: {
|
||||||
|
vendor: ['vue', 'vue-router', 'pinia'],
|
||||||
|
ui: ['@headlessui/vue', '@heroicons/vue']
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
sourcemap: false,
|
||||||
|
minify: 'terser',
|
||||||
|
terserOptions: {
|
||||||
|
compress: {
|
||||||
|
drop_console: true,
|
||||||
|
drop_debugger: true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})
|
||||||
|
```
|
||||||
|
|
||||||
|
## Monitoring and Logging
|
||||||
|
|
||||||
|
### Application Monitoring
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
import sentry_sdk
|
||||||
|
from sentry_sdk.integrations.django import DjangoIntegration
|
||||||
|
|
||||||
|
sentry_sdk.init(
|
||||||
|
dsn="your-sentry-dsn",
|
||||||
|
integrations=[DjangoIntegration()],
|
||||||
|
traces_sample_rate=0.1,
|
||||||
|
send_default_pii=True
|
||||||
|
)
|
||||||
|
|
||||||
|
# Logging configuration
|
||||||
|
LOGGING = {
|
||||||
|
'version': 1,
|
||||||
|
'disable_existing_loggers': False,
|
||||||
|
'handlers': {
|
||||||
|
'file': {
|
||||||
|
'level': 'INFO',
|
||||||
|
'class': 'logging.FileHandler',
|
||||||
|
'filename': '/var/log/django/thrillwiki.log',
|
||||||
|
},
|
||||||
|
},
|
||||||
|
'root': {
|
||||||
|
'handlers': ['file'],
|
||||||
|
},
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### Infrastructure Monitoring
|
||||||
|
- Use Prometheus + Grafana for metrics
|
||||||
|
- Implement health check endpoints (a minimal sketch follows this list)
|
||||||
|
- Set up log aggregation (ELK stack or similar)
|
||||||
|
- Monitor database performance
|
||||||
|
- Track API response times
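
For the health-check item above, a minimal sketch of an endpoint that a load balancer or uptime monitor can poll; the `apps.core` location and URL name are assumptions:

```python
# Hypothetical apps/core/views.py health check
from django.core.cache import cache
from django.db import connection
from django.http import JsonResponse


def health_check(request):
    status = {'database': 'ok', 'cache': 'ok'}
    http_status = 200

    try:
        # Cheap round trip to verify the database connection is usable.
        with connection.cursor() as cursor:
            cursor.execute('SELECT 1')
    except Exception:
        status['database'] = 'error'
        http_status = 503

    try:
        # Write/read a throwaway key to verify Redis (or the configured cache).
        cache.set('health_check', 'ok', timeout=5)
        if cache.get('health_check') != 'ok':
            raise RuntimeError('cache readback failed')
    except Exception:
        status['cache'] = 'error'
        http_status = 503

    return JsonResponse(status, status=http_status)
```

Wired into the URL config as, for example, `path('healthz/', health_check)` so that Nginx, Docker health checks, and external monitors share a single endpoint.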
|
||||||
|
|
||||||
|
## Security Considerations
|
||||||
|
|
||||||
|
### Production Security Checklist
|
||||||
|
- [ ] HTTPS enforced with SSL certificates
|
||||||
|
- [ ] Security headers configured (HSTS, CSP, etc.)
|
||||||
|
- [ ] Database credentials secured
|
||||||
|
- [ ] Secret keys rotated regularly
|
||||||
|
- [ ] CORS properly configured
|
||||||
|
- [ ] Rate limiting implemented (see the throttling sketch after this checklist)
|
||||||
|
- [ ] File upload validation
|
||||||
|
- [ ] SQL injection protection
|
||||||
|
- [ ] XSS protection enabled
|
||||||
|
- [ ] CSRF protection active
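
For the rate-limiting item on the checklist, DRF's built-in throttling is usually the lightest way to start; a sketch of the relevant settings, merged into the existing `REST_FRAMEWORK` configuration, with the rates as placeholder values rather than recommendations:

```python
# backend/config/settings/... (illustrative values)
REST_FRAMEWORK = {
    # ...existing renderer and authentication settings...
    'DEFAULT_THROTTLE_CLASSES': [
        'rest_framework.throttling.AnonRateThrottle',
        'rest_framework.throttling.UserRateThrottle',
    ],
    'DEFAULT_THROTTLE_RATES': {
        'anon': '100/hour',
        'user': '1000/hour',
    },
}
```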
|
||||||
|
|
||||||
|
### Security Headers
|
||||||
|
```python
|
||||||
|
# backend/config/settings/production.py
|
||||||
|
SECURE_SSL_REDIRECT = True
|
||||||
|
SECURE_HSTS_SECONDS = 31536000
|
||||||
|
SECURE_HSTS_INCLUDE_SUBDOMAINS = True
|
||||||
|
SECURE_HSTS_PRELOAD = True
|
||||||
|
SECURE_CONTENT_TYPE_NOSNIFF = True
|
||||||
|
SECURE_BROWSER_XSS_FILTER = True
|
||||||
|
X_FRAME_OPTIONS = 'DENY'
|
||||||
|
|
||||||
|
# CORS for API
|
||||||
|
CORS_ALLOWED_ORIGINS = [
|
||||||
|
"https://yourdomain.com",
|
||||||
|
"https://www.yourdomain.com",
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
## Backup and Recovery
|
||||||
|
|
||||||
|
### Database Backup Strategy
|
||||||
|
```bash
|
||||||
|
# Automated backup script
|
||||||
|
#!/bin/bash
|
||||||
|
pg_dump $DATABASE_URL | gzip > backup_$(date +%Y%m%d_%H%M%S).sql.gz
|
||||||
|
aws s3 cp backup_*.sql.gz s3://your-backup-bucket/database/
|
||||||
|
```
|
||||||
|
|
||||||
|
### Media Files Backup
|
||||||
|
```bash
|
||||||
|
# Sync media files to S3
|
||||||
|
aws s3 sync ./shared/media/ s3://your-media-bucket/media/ --delete
|
||||||
|
```
|
||||||
|
|
||||||
|
## Scaling Strategies
|
||||||
|
|
||||||
|
### Horizontal Scaling
|
||||||
|
- Load balancer configuration
|
||||||
|
- Database read replicas
|
||||||
|
- CDN for static assets
|
||||||
|
- Redis clustering
|
||||||
|
- Auto-scaling groups
|
||||||
|
|
||||||
|
### Vertical Scaling
|
||||||
|
- Database connection pooling
|
||||||
|
- Application server optimization
|
||||||
|
- Memory usage optimization
|
||||||
|
- CPU-intensive task optimization
|
||||||
|
|
||||||
|
## Troubleshooting Guide
|
||||||
|
|
||||||
|
### Common Issues
|
||||||
|
1. **Build failures**: Check dependencies and environment variables
|
||||||
|
2. **Database connection errors**: Verify connection strings and firewall rules
|
||||||
|
3. **Static file 404s**: Ensure collectstatic runs and paths are correct
|
||||||
|
4. **CORS errors**: Check CORS configuration and allowed origins
|
||||||
|
5. **Memory issues**: Monitor application memory usage and optimize queries
|
||||||
|
|
||||||
|
### Debug Commands
|
||||||
|
```bash
|
||||||
|
# Backend debugging
|
||||||
|
cd backend
|
||||||
|
uv run manage.py check --deploy
|
||||||
|
uv run manage.py shell
|
||||||
|
uv run manage.py dbshell
|
||||||
|
|
||||||
|
# Frontend debugging
|
||||||
|
cd frontend
|
||||||
|
pnpm run build --debug
|
||||||
|
pnpm run preview
|
||||||
|
```
|
||||||
|
|
||||||
|
This deployment guide provides a comprehensive approach to deploying the ThrillWiki monorepo across various platforms while maintaining security, performance, and scalability.
|
||||||
353 architecture/migration-mapping.md (Normal file)
@@ -0,0 +1,353 @@
|
|||||||
|
# ThrillWiki Migration Mapping Document
|
||||||
|
|
||||||
|
This document provides a comprehensive mapping of files from the current Django project to the new monorepo structure.
|
||||||
|
|
||||||
|
## Root Level Files
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `manage.py` | `backend/manage.py` | Core Django management |
|
||||||
|
| `pyproject.toml` | `backend/pyproject.toml` | Python dependencies |
|
||||||
|
| `uv.lock` | `backend/uv.lock` | UV lock file |
|
||||||
|
| `.gitignore` | `.gitignore` (update) | Merge with monorepo patterns |
|
||||||
|
| `README.md` | `README.md` (update) | Update for monorepo |
|
||||||
|
| `.pre-commit-config.yaml` | `.pre-commit-config.yaml` | Root level |
|
||||||
|
|
||||||
|
## Configuration Directory
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `config/django/` | `backend/config/django/` | Django settings |
|
||||||
|
| `config/settings/` | `backend/config/settings/` | Environment settings |
|
||||||
|
| `config/urls.py` | `backend/config/urls.py` | URL configuration |
|
||||||
|
| `config/wsgi.py` | `backend/config/wsgi.py` | WSGI configuration |
|
||||||
|
| `config/asgi.py` | `backend/config/asgi.py` | ASGI configuration |
|
||||||
|
|
||||||
|
## Django Apps
|
||||||
|
|
||||||
|
### Accounts App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `accounts/` | `backend/apps/accounts/` |
|
||||||
|
| `accounts/__init__.py` | `backend/apps/accounts/__init__.py` |
|
||||||
|
| `accounts/models.py` | `backend/apps/accounts/models.py` |
|
||||||
|
| `accounts/views.py` | `backend/apps/accounts/views.py` |
|
||||||
|
| `accounts/admin.py` | `backend/apps/accounts/admin.py` |
|
||||||
|
| `accounts/apps.py` | `backend/apps/accounts/apps.py` |
|
||||||
|
| `accounts/migrations/` | `backend/apps/accounts/migrations/` |
|
||||||
|
| `accounts/tests/` | `backend/apps/accounts/tests/` |
|
||||||
|
|
||||||
|
### Parks App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `parks/` | `backend/apps/parks/` |
|
||||||
|
| `parks/__init__.py` | `backend/apps/parks/__init__.py` |
|
||||||
|
| `parks/models.py` | `backend/apps/parks/models.py` |
|
||||||
|
| `parks/views.py` | `backend/apps/parks/views.py` |
|
||||||
|
| `parks/admin.py` | `backend/apps/parks/admin.py` |
|
||||||
|
| `parks/apps.py` | `backend/apps/parks/apps.py` |
|
||||||
|
| `parks/migrations/` | `backend/apps/parks/migrations/` |
|
||||||
|
| `parks/tests/` | `backend/apps/parks/tests/` |
|
||||||
|
|
||||||
|
### Rides App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `rides/` | `backend/apps/rides/` |
|
||||||
|
| `rides/__init__.py` | `backend/apps/rides/__init__.py` |
|
||||||
|
| `rides/models.py` | `backend/apps/rides/models.py` |
|
||||||
|
| `rides/views.py` | `backend/apps/rides/views.py` |
|
||||||
|
| `rides/admin.py` | `backend/apps/rides/admin.py` |
|
||||||
|
| `rides/apps.py` | `backend/apps/rides/apps.py` |
|
||||||
|
| `rides/migrations/` | `backend/apps/rides/migrations/` |
|
||||||
|
| `rides/tests/` | `backend/apps/rides/tests/` |
|
||||||
|
|
||||||
|
### Moderation App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `moderation/` | `backend/apps/moderation/` |
|
||||||
|
| `moderation/__init__.py` | `backend/apps/moderation/__init__.py` |
|
||||||
|
| `moderation/models.py` | `backend/apps/moderation/models.py` |
|
||||||
|
| `moderation/views.py` | `backend/apps/moderation/views.py` |
|
||||||
|
| `moderation/admin.py` | `backend/apps/moderation/admin.py` |
|
||||||
|
| `moderation/apps.py` | `backend/apps/moderation/apps.py` |
|
||||||
|
| `moderation/migrations/` | `backend/apps/moderation/migrations/` |
|
||||||
|
| `moderation/tests/` | `backend/apps/moderation/tests/` |
|
||||||
|
|
||||||
|
### Location App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `location/` | `backend/apps/location/` |
|
||||||
|
| `location/__init__.py` | `backend/apps/location/__init__.py` |
|
||||||
|
| `location/models.py` | `backend/apps/location/models.py` |
|
||||||
|
| `location/views.py` | `backend/apps/location/views.py` |
|
||||||
|
| `location/admin.py` | `backend/apps/location/admin.py` |
|
||||||
|
| `location/apps.py` | `backend/apps/location/apps.py` |
|
||||||
|
| `location/migrations/` | `backend/apps/location/migrations/` |
|
||||||
|
| `location/tests/` | `backend/apps/location/tests/` |
|
||||||
|
|
||||||
|
### Media App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `media/` | `backend/apps/media/` |
|
||||||
|
| `media/__init__.py` | `backend/apps/media/__init__.py` |
|
||||||
|
| `media/models.py` | `backend/apps/media/models.py` |
|
||||||
|
| `media/views.py` | `backend/apps/media/views.py` |
|
||||||
|
| `media/admin.py` | `backend/apps/media/admin.py` |
|
||||||
|
| `media/apps.py` | `backend/apps/media/apps.py` |
|
||||||
|
| `media/migrations/` | `backend/apps/media/migrations/` |
|
||||||
|
| `media/tests/` | `backend/apps/media/tests/` |
|
||||||
|
|
||||||
|
### Email Service App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `email_service/` | `backend/apps/email_service/` |
|
||||||
|
| `email_service/__init__.py` | `backend/apps/email_service/__init__.py` |
|
||||||
|
| `email_service/models.py` | `backend/apps/email_service/models.py` |
|
||||||
|
| `email_service/views.py` | `backend/apps/email_service/views.py` |
|
||||||
|
| `email_service/admin.py` | `backend/apps/email_service/admin.py` |
|
||||||
|
| `email_service/apps.py` | `backend/apps/email_service/apps.py` |
|
||||||
|
| `email_service/migrations/` | `backend/apps/email_service/migrations/` |
|
||||||
|
| `email_service/tests/` | `backend/apps/email_service/tests/` |
|
||||||
|
|
||||||
|
### Core App
|
||||||
|
| Current Location | New Location |
|
||||||
|
|------------------|--------------|
|
||||||
|
| `core/` | `backend/apps/core/` |
|
||||||
|
| `core/__init__.py` | `backend/apps/core/__init__.py` |
|
||||||
|
| `core/models.py` | `backend/apps/core/models.py` |
|
||||||
|
| `core/views.py` | `backend/apps/core/views.py` |
|
||||||
|
| `core/admin.py` | `backend/apps/core/admin.py` |
|
||||||
|
| `core/apps.py` | `backend/apps/core/apps.py` |
|
||||||
|
| `core/migrations/` | `backend/apps/core/migrations/` |
|
||||||
|
| `core/tests/` | `backend/apps/core/tests/` |
|
||||||
|
|
||||||
|
## Static Files and Templates
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `static/` | `backend/static/` | Django admin and backend assets |
|
||||||
|
| `staticfiles/` | `backend/staticfiles/` | Collected static files |
|
||||||
|
| `templates/` | `backend/templates/` | Django templates (if any) |
|
||||||
|
|
||||||
|
## Media Files
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `media/` | `shared/media/` | User-uploaded content; distinct from the `media` Django app, which moves to `backend/apps/media/` |
|
||||||
|
|
||||||
|
## Scripts and Development Tools
|
||||||
|
|
||||||
|
| Current Location | New Location | Notes |
|
||||||
|
|------------------|--------------|-------|
|
||||||
|
| `scripts/` | `scripts/` | Root level scripts |
|
||||||
|
| `scripts/dev_server.sh` | `scripts/backend_dev.sh` | Rename for clarity |
|
||||||
|
|
||||||
|
## New Frontend Structure (Created)
|
||||||
|
|
||||||
|
| New Location | Purpose |
|
||||||
|
|--------------|---------|
|
||||||
|
| `frontend/` | Vue.js application root |
|
||||||
|
| `frontend/package.json` | Node.js dependencies |
|
||||||
|
| `frontend/pnpm-lock.yaml` | pnpm lock file |
|
||||||
|
| `frontend/vite.config.ts` | Vite configuration |
|
||||||
|
| `frontend/tsconfig.json` | TypeScript configuration |
|
||||||
|
| `frontend/tailwind.config.js` | Tailwind CSS configuration |
|
||||||
|
| `frontend/src/` | Vue.js source code |
|
||||||
|
| `frontend/src/main.ts` | Application entry point |
|
||||||
|
| `frontend/src/App.vue` | Root component |
|
||||||
|
| `frontend/src/components/` | Vue components |
|
||||||
|
| `frontend/src/views/` | Page components |
|
||||||
|
| `frontend/src/router/` | Vue Router configuration |
|
||||||
|
| `frontend/src/stores/` | Pinia stores |
|
||||||
|
| `frontend/src/composables/` | Vue composables |
|
||||||
|
| `frontend/src/utils/` | Utility functions |
|
||||||
|
| `frontend/src/types/` | TypeScript type definitions |
|
||||||
|
| `frontend/src/assets/` | Static assets |
|
||||||
|
| `frontend/public/` | Public assets |
|
||||||
|
| `frontend/dist/` | Build output |
|
||||||
|
|
||||||
|
## New Shared Resources (Created)
|
||||||
|
|
||||||
|
| New Location | Purpose |
|
||||||
|
|--------------|---------|
|
||||||
|
| `shared/` | Cross-platform resources |
|
||||||
|
| `shared/media/` | User uploaded files |
|
||||||
|
| `shared/docs/` | Documentation |
|
||||||
|
| `shared/types/` | Shared TypeScript types |
|
||||||
|
| `shared/constants/` | Shared constants |
|
||||||
|
|
||||||
|
## Updated Root Files
|
||||||
|
|
||||||
|
### package.json (Root)
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-monorepo",
|
||||||
|
"private": true,
|
||||||
|
"workspaces": [
|
||||||
|
"frontend"
|
||||||
|
],
|
||||||
|
"scripts": {
|
||||||
|
"dev": "concurrently \"pnpm --filter frontend dev\" \"./scripts/backend_dev.sh\"",
|
||||||
|
"build": "pnpm --filter frontend build",
|
||||||
|
"backend:dev": "./scripts/backend_dev.sh",
|
||||||
|
"frontend:dev": "pnpm --filter frontend dev",
|
||||||
|
"test": "pnpm --filter frontend test && cd backend && uv run manage.py test",
|
||||||
|
"lint": "pnpm --filter frontend lint && cd backend && uv run flake8 .",
|
||||||
|
"format": "pnpm --filter frontend format && cd backend && uv run black ."
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"concurrently": "^8.2.2"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### .gitignore (Updated)
|
||||||
|
```gitignore
|
||||||
|
# Python
|
||||||
|
__pycache__/
|
||||||
|
*.py[cod]
|
||||||
|
*$py.class
|
||||||
|
*.so
|
||||||
|
.Python
|
||||||
|
build/
|
||||||
|
develop-eggs/
|
||||||
|
dist/
|
||||||
|
downloads/
|
||||||
|
eggs/
|
||||||
|
.eggs/
|
||||||
|
lib/
|
||||||
|
lib64/
|
||||||
|
parts/
|
||||||
|
sdist/
|
||||||
|
var/
|
||||||
|
wheels/
|
||||||
|
share/python-wheels/
|
||||||
|
*.egg-info/
|
||||||
|
.installed.cfg
|
||||||
|
*.egg
|
||||||
|
MANIFEST
|
||||||
|
|
||||||
|
# Django
|
||||||
|
*.log
|
||||||
|
local_settings.py
|
||||||
|
db.sqlite3
|
||||||
|
db.sqlite3-journal
|
||||||
|
/backend/static/
|
||||||
|
/backend/media/
|
||||||
|
|
||||||
|
# UV
|
||||||
|
.uv/
|
||||||
|
|
||||||
|
# Node.js
|
||||||
|
node_modules/
|
||||||
|
npm-debug.log*
|
||||||
|
yarn-debug.log*
|
||||||
|
yarn-error.log*
|
||||||
|
pnpm-debug.log*
|
||||||
|
lerna-debug.log*
|
||||||
|
.pnpm-store/
|
||||||
|
|
||||||
|
# Vue.js / Vite
|
||||||
|
/frontend/dist/
|
||||||
|
/frontend/dist-ssr/
|
||||||
|
*.local
|
||||||
|
|
||||||
|
# Environment variables
|
||||||
|
.env
|
||||||
|
.env.local
|
||||||
|
.env.development.local
|
||||||
|
.env.test.local
|
||||||
|
.env.production.local
|
||||||
|
|
||||||
|
# IDEs
|
||||||
|
.vscode/
|
||||||
|
.idea/
|
||||||
|
*.swp
|
||||||
|
*.swo
|
||||||
|
|
||||||
|
# OS
|
||||||
|
.DS_Store
|
||||||
|
Thumbs.db
|
||||||
|
|
||||||
|
# Logs
|
||||||
|
logs/
|
||||||
|
*.log
|
||||||
|
|
||||||
|
# Coverage
|
||||||
|
coverage/
|
||||||
|
*.lcov
|
||||||
|
.nyc_output
|
||||||
|
```
|
||||||
|
|
||||||
|
## Configuration Updates Required
|
||||||
|
|
||||||
|
### Backend Django Settings
|
||||||
|
Update `INSTALLED_APPS` paths:
|
||||||
|
```python
|
||||||
|
INSTALLED_APPS = [
|
||||||
|
'django.contrib.admin',
|
||||||
|
'django.contrib.auth',
|
||||||
|
'django.contrib.contenttypes',
|
||||||
|
'django.contrib.sessions',
|
||||||
|
'django.contrib.messages',
|
||||||
|
'django.contrib.staticfiles',
|
||||||
|
|
||||||
|
# Local apps
|
||||||
|
'apps.accounts',
|
||||||
|
'apps.parks',
|
||||||
|
'apps.rides',
|
||||||
|
'apps.moderation',
|
||||||
|
'apps.location',
|
||||||
|
'apps.media',
|
||||||
|
'apps.email_service',
|
||||||
|
'apps.core',
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
Update media and static files paths:
|
||||||
|
```python
|
||||||
|
STATIC_URL = '/static/'
|
||||||
|
STATIC_ROOT = BASE_DIR / 'staticfiles'
|
||||||
|
STATICFILES_DIRS = [
|
||||||
|
BASE_DIR / 'static',
|
||||||
|
]
|
||||||
|
|
||||||
|
MEDIA_URL = '/media/'
|
||||||
|
MEDIA_ROOT = BASE_DIR.parent / 'shared' / 'media'
|
||||||
|
```
|
||||||
|
|
||||||
|
### Script Updates
|
||||||
|
Update `scripts/backend_dev.sh`:
|
||||||
|
```bash
|
||||||
|
#!/bin/bash
|
||||||
|
cd backend
|
||||||
|
lsof -ti :8000 | xargs kill -9 2>/dev/null || true
|
||||||
|
find . -type d -name "__pycache__" -exec rm -r {} + 2>/dev/null || true
|
||||||
|
uv run manage.py runserver 0.0.0.0:8000
|
||||||
|
```
|
||||||
|
|
||||||
|
## Migration Steps Summary
|
||||||
|
|
||||||
|
1. **Create new directory structure**
|
||||||
|
2. **Move backend files** to `backend/` directory
|
||||||
|
3. **Update import paths** in Django settings and apps
|
||||||
|
4. **Create frontend** Vue.js application
|
||||||
|
5. **Update scripts** and configuration files
|
||||||
|
6. **Test both backend and frontend** independently
|
||||||
|
7. **Configure API integration** between Django and Vue.js
|
||||||
|
8. **Update deployment** configurations
|
||||||
|
|
||||||
|
## Validation Checklist
|
||||||
|
|
||||||
|
- [ ] All Django apps moved to `backend/apps/`
|
||||||
|
- [ ] Configuration files updated with new paths
|
||||||
|
- [ ] Static and media file paths configured correctly
|
||||||
|
- [ ] Frontend Vue.js application created and configured
|
||||||
|
- [ ] Root package.json with workspace configuration
|
||||||
|
- [ ] Development scripts updated and tested
|
||||||
|
- [ ] Git configuration updated
|
||||||
|
- [ ] Documentation updated
|
||||||
|
- [ ] CI/CD pipelines updated (if applicable)
|
||||||
|
- [ ] Database migrations work correctly
|
||||||
|
- [ ] Both development servers start successfully
|
||||||
|
- [ ] API endpoints accessible from frontend
|
||||||
525 architecture/monorepo-structure-plan.md (Normal file)
@@ -0,0 +1,525 @@
|
|||||||
|
# ThrillWiki Django + Vue.js Monorepo Architecture Plan
|
||||||
|
|
||||||
|
## Executive Summary
|
||||||
|
|
||||||
|
This document outlines the optimal monorepo directory structure for migrating the ThrillWiki Django project to a Django + Vue.js architecture. The design separates backend and frontend concerns while maintaining existing Django app organization and supporting modern development workflows.
|
||||||
|
|
||||||
|
## Current Project Analysis
|
||||||
|
|
||||||
|
### Django Apps Structure
|
||||||
|
- **accounts**: User management and authentication
|
||||||
|
- **parks**: Theme park data and operations
|
||||||
|
- **rides**: Ride information and management
|
||||||
|
- **moderation**: Content moderation system
|
||||||
|
- **location**: Geographic data handling
|
||||||
|
- **media**: File and image management
|
||||||
|
- **email_service**: Email functionality
|
||||||
|
- **core**: Core utilities and services
|
||||||
|
|
||||||
|
### Key Infrastructure
|
||||||
|
- **Package Management**: UV-based Python setup
|
||||||
|
- **Configuration**: `config/django/` for settings, `config/settings/` for modular settings
|
||||||
|
- **Development**: `scripts/dev_server.sh` with comprehensive setup
|
||||||
|
- **Static Assets**: Tailwind CSS integration, `static/` and `staticfiles/`
|
||||||
|
- **Media Handling**: Organized `media/` directory with park/ride subdirectories
|
||||||
|
|
||||||
|
## Proposed Monorepo Structure
|
||||||
|
|
||||||
|
```
|
||||||
|
thrillwiki-monorepo/
|
||||||
|
├── README.md
|
||||||
|
├── pyproject.toml # Python dependencies (backend only)
|
||||||
|
├── package.json # Node.js dependencies (monorepo coordination)
|
||||||
|
├── pnpm-workspace.yaml # pnpm workspace configuration
|
||||||
|
├── .env.example
|
||||||
|
├── .gitignore
|
||||||
|
│
|
||||||
|
├── backend/ # Django Backend
|
||||||
|
│ ├── manage.py
|
||||||
|
│ ├── pyproject.toml # Backend-specific dependencies
|
||||||
|
│ ├── config/
|
||||||
|
│ │ ├── django/
|
||||||
|
│ │ │ ├── base.py
|
||||||
|
│ │ │ ├── local.py
|
||||||
|
│ │ │ ├── production.py
|
||||||
|
│ │ │ └── test.py
|
||||||
|
│ │ └── settings/
|
||||||
|
│ │ ├── database.py
|
||||||
|
│ │ ├── email.py
|
||||||
|
│ │ └── security.py
|
||||||
|
│ ├── thrillwiki/
|
||||||
|
│ │ ├── __init__.py
|
||||||
|
│ │ ├── urls.py
|
||||||
|
│ │ ├── wsgi.py
|
||||||
|
│ │ ├── asgi.py
|
||||||
|
│ │ └── views.py
|
||||||
|
│ ├── apps/ # Django apps
|
||||||
|
│ │ ├── accounts/
|
||||||
|
│ │ ├── parks/
|
||||||
|
│ │ ├── rides/
|
||||||
|
│ │ ├── moderation/
|
||||||
|
│ │ ├── location/
|
||||||
|
│ │ ├── media/
|
||||||
|
│ │ ├── email_service/
|
||||||
|
│ │ └── core/
|
||||||
|
│ ├── templates/ # Django templates (API responses, admin)
|
||||||
|
│ ├── static/ # Backend static files
|
||||||
|
│ │ └── admin/ # Django admin assets
|
||||||
|
│ ├── media/ # User uploads
|
||||||
|
│ │ ├── avatars/
|
||||||
|
│ │ ├── park/
|
||||||
|
│ │ └── submissions/
|
||||||
|
│ └── tests/ # Backend tests
|
||||||
|
│
|
||||||
|
├── frontend/ # Vue.js Frontend
|
||||||
|
│ ├── package.json
|
||||||
|
│ ├── pnpm-lock.yaml
|
||||||
|
│ ├── vite.config.js
|
||||||
|
│ ├── tailwind.config.js
|
||||||
|
│ ├── index.html
|
||||||
|
│ ├── src/
|
||||||
|
│ │ ├── main.js
|
||||||
|
│ │ ├── App.vue
|
||||||
|
│ │ ├── router/
|
||||||
|
│ │ │ └── index.js
|
||||||
|
│ │ ├── stores/ # Pinia/Vuex stores
|
||||||
|
│ │ │ ├── auth.js
|
||||||
|
│ │ │ ├── parks.js
|
||||||
|
│ │ │ └── rides.js
|
||||||
|
│ │ ├── components/
|
||||||
|
│ │ │ ├── common/ # Shared components
|
||||||
|
│ │ │ ├── parks/ # Park-specific components
|
||||||
|
│ │ │ ├── rides/ # Ride-specific components
|
||||||
|
│ │ │ └── moderation/ # Moderation components
|
||||||
|
│ │ ├── views/ # Page components
|
||||||
|
│ │ │ ├── Home.vue
|
||||||
|
│ │ │ ├── parks/
|
||||||
|
│ │ │ ├── rides/
|
||||||
|
│ │ │ └── auth/
|
||||||
|
│ │ ├── composables/ # Vue 3 composables
|
||||||
|
│ │ │ ├── useAuth.js
|
||||||
|
│ │ │ ├── useApi.js
|
||||||
|
│ │ │ └── useTheme.js
|
||||||
|
│ │ ├── services/ # API service layer
|
||||||
|
│ │ │ ├── api.js
|
||||||
|
│ │ │ ├── auth.js
|
||||||
|
│ │ │ ├── parks.js
|
||||||
|
│ │ │ └── rides.js
|
||||||
|
│ │ ├── assets/
|
||||||
|
│ │ │ ├── images/
|
||||||
|
│ │ │ └── styles/
|
||||||
|
│ │ │ ├── globals.css
|
||||||
|
│ │ │ └── components/
|
||||||
|
│ │ └── utils/
|
||||||
|
│ ├── public/
|
||||||
|
│ │ ├── favicon.ico
|
||||||
|
│ │ └── images/
|
||||||
|
│ ├── dist/ # Build output
|
||||||
|
│ └── tests/ # Frontend tests
|
||||||
|
│ ├── unit/
|
||||||
|
│ └── e2e/
|
||||||
|
│
|
||||||
|
├── shared/ # Shared Resources
|
||||||
|
│ ├── docs/ # Documentation
|
||||||
|
│ │ ├── api/ # API documentation
|
||||||
|
│ │ ├── deployment/ # Deployment guides
|
||||||
|
│ │ └── development/ # Development setup
|
||||||
|
│ ├── scripts/ # Build and deployment scripts
|
||||||
|
│ │ ├── dev/
|
||||||
|
│ │ │ ├── start-backend.sh
|
||||||
|
│ │ │ ├── start-frontend.sh
|
||||||
|
│ │ │ └── start-full-stack.sh
|
||||||
|
│ │ ├── build/
|
||||||
|
│ │ │ ├── build-frontend.sh
|
||||||
|
│ │ │ └── build-production.sh
|
||||||
|
│ │ ├── deploy/
|
||||||
|
│ │ └── utils/
|
||||||
|
│ ├── config/ # Shared configuration
|
||||||
|
│ │ ├── docker/
|
||||||
|
│ │ │ ├── Dockerfile.backend
|
||||||
|
│ │ │ ├── Dockerfile.frontend
|
||||||
|
│ │ │ └── docker-compose.yml
|
||||||
|
│ │ ├── nginx/
|
||||||
|
│ │ └── ci/ # CI/CD configuration
|
||||||
|
│ │ └── github-actions/
|
||||||
|
│ └── types/ # Shared TypeScript types
|
||||||
|
│ ├── api.ts
|
||||||
|
│ ├── parks.ts
|
||||||
|
│ └── rides.ts
|
||||||
|
│
|
||||||
|
├── logs/ # Application logs
|
||||||
|
├── backups/ # Database backups
|
||||||
|
├── uploads/ # Temporary upload directory
|
||||||
|
└── dist/ # Production build output
|
||||||
|
├── backend/ # Django static files
|
||||||
|
└── frontend/ # Vue.js build
|
||||||
|
```
|
||||||
|
|
||||||
|
## Directory Organization Rationale
|
||||||
|
|
||||||
|
### 1. Clear Separation of Concerns
|
||||||
|
- **backend/**: Contains all Django-related code, maintaining existing app structure
|
||||||
|
- **frontend/**: Vue.js application with modern structure (Vite + Vue 3)
|
||||||
|
- **shared/**: Common resources, documentation, and configuration
|
||||||
|
|
||||||
|
### 2. Backend Structure (`backend/`)
|
||||||
|
- Preserves existing Django app organization under `apps/`
|
||||||
|
- Maintains UV-based Python dependency management
|
||||||
|
- Keeps configuration structure with `config/django/` and `config/settings/`
|
||||||
|
- Separates templates for API responses vs. frontend UI
|
||||||
|
|
||||||
|
### 3. Frontend Structure (`frontend/`)
|
||||||
|
- Modern Vue 3 + Vite setup with TypeScript support
|
||||||
|
- Organized by feature areas (parks, rides, auth)
|
||||||
|
- Composables for Vue 3 Composition API patterns
|
||||||
|
- Service layer for API communication with Django backend
|
||||||
|
- Tailwind CSS integration with shared design system
|
||||||
|
|
||||||
|
### 4. Shared Resources (`shared/`)
|
||||||
|
- Centralized documentation and deployment scripts
|
||||||
|
- Docker configuration for containerized deployment
|
||||||
|
- TypeScript type definitions shared between frontend and API
|
||||||
|
- CI/CD pipeline configuration
|
||||||
|
|
||||||
|
## Static File Strategy
|
||||||
|
|
||||||
|
### Development
|
||||||
|
```mermaid
|
||||||
|
graph LR
|
||||||
|
A[Vue Dev Server :3000] --> B[Vite HMR]
|
||||||
|
C[Django Dev Server :8000] --> D[Django Static Files]
|
||||||
|
E[Tailwind CSS] --> F[Both Frontend & Backend]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Production
|
||||||
|
```mermaid
|
||||||
|
graph LR
|
||||||
|
A[Vue Build] --> B[dist/frontend/]
|
||||||
|
C[Django Collectstatic] --> D[dist/backend/]
|
||||||
|
E[Nginx] --> F[Serves Both]
|
||||||
|
F --> G[Frontend Assets]
|
||||||
|
F --> H[API Endpoints]
|
||||||
|
F --> I[Media Files]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Implementation Details
|
||||||
|
|
||||||
|
1. **Development Mode**:
|
||||||
|
- Frontend: Vite dev server on port 3000 with HMR
|
||||||
|
- Backend: Django dev server on port 8000
|
||||||
|
- Proxy API calls from frontend to backend
|
||||||
|
|
||||||
|
2. **Production Mode**:
|
||||||
|
- Frontend built to `dist/frontend/`
|
||||||
|
- Django static files collected to `dist/backend/`
|
||||||
|
- Nginx serves static files and proxies API calls
|
||||||
|
|
||||||
|
## Media File Management
|
||||||
|
|
||||||
|
### Current Structure Preservation
|
||||||
|
```
|
||||||
|
media/
|
||||||
|
├── avatars/ # User profile images
|
||||||
|
├── park/ # Park-specific media
|
||||||
|
│ ├── {park-slug}/
|
||||||
|
│ │ └── {ride-slug}/
|
||||||
|
└── submissions/ # User-submitted content
|
||||||
|
└── photos/
|
||||||
|
```
|
||||||
|
|
||||||
|
### Strategy
|
||||||
|
- **Development**: Django serves media files directly
|
||||||
|
- **Production**: CDN or object storage (S3/CloudFlare) integration
|
||||||
|
- **Frontend Access**: Media URLs provided via API responses
|
||||||
|
- **Upload Handling**: Django handles all file uploads, Vue.js provides UI
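
A minimal sketch of that upload-and-expose flow: Django stores the file and the API returns an absolute URL the Vue client can render; the model and field names are assumptions for illustration:

```python
# Hypothetical serializer in apps/media/serializers.py
from rest_framework import serializers

from .models import Photo  # assumed model with an ImageField named `image`


class PhotoSerializer(serializers.ModelSerializer):
    image_url = serializers.SerializerMethodField()

    class Meta:
        model = Photo
        fields = ['id', 'caption', 'image', 'image_url']
        # Accept the file on upload, but expose only the resolved URL on read.
        extra_kwargs = {'image': {'write_only': True}}

    def get_image_url(self, obj):
        # Build an absolute URL so the Vue frontend (served from another
        # origin in development) can load the file directly.
        request = self.context.get('request')
        if obj.image and request is not None:
            return request.build_absolute_uri(obj.image.url)
        return None
```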
|
||||||
|
|
||||||
|
## Development Workflow Integration
|
||||||
|
|
||||||
|
### Package Management
|
||||||
|
- **Root**: Node.js dependencies for frontend and tooling (using pnpm)
|
||||||
|
- **Backend**: UV for Python dependencies (existing approach)
|
||||||
|
- **Frontend**: pnpm for Vue.js dependencies
|
||||||
|
|
||||||
|
### Development Scripts
|
||||||
|
```bash
|
||||||
|
# Root level scripts
|
||||||
|
pnpm run dev # Start both backend and frontend
|
||||||
|
pnpm run dev:backend # Start only Django
|
||||||
|
pnpm run dev:frontend # Start only Vue.js
|
||||||
|
pnpm run build # Build for production
|
||||||
|
pnpm run test # Run all tests
|
||||||
|
|
||||||
|
# Backend specific (using UV)
|
||||||
|
cd backend && uv run manage.py runserver
|
||||||
|
cd backend && uv run manage.py test
|
||||||
|
|
||||||
|
# Frontend specific
|
||||||
|
cd frontend && pnpm run dev
|
||||||
|
cd frontend && pnpm run build
|
||||||
|
cd frontend && pnpm run test
|
||||||
|
```
|
||||||
|
|
||||||
|
### Environment Configuration
|
||||||
|
```bash
|
||||||
|
# Root .env (shared settings)
|
||||||
|
DATABASE_URL=
|
||||||
|
REDIS_URL=
|
||||||
|
SECRET_KEY=
|
||||||
|
|
||||||
|
# Backend .env (Django specific)
|
||||||
|
DJANGO_SETTINGS_MODULE=config.django.local
|
||||||
|
DEBUG=True
|
||||||
|
|
||||||
|
# Frontend .env (Vue specific)
|
||||||
|
VITE_API_BASE_URL=http://localhost:8000/api
|
||||||
|
VITE_APP_TITLE=ThrillWiki
|
||||||
|
```
|
||||||
|
|
||||||
|
### Package Manager Configuration
|
||||||
|
|
||||||
|
#### Root pnpm-workspace.yaml
|
||||||
|
```yaml
|
||||||
|
packages:
|
||||||
|
- 'frontend'
|
||||||
|
# Backend is managed separately with uv
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Root package.json
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-monorepo",
|
||||||
|
"private": true,
|
||||||
|
"packageManager": "pnpm@9.0.0",
|
||||||
|
"scripts": {
|
||||||
|
"dev": "concurrently \"pnpm run dev:backend\" \"pnpm run dev:frontend\"",
|
||||||
|
"dev:backend": "cd backend && uv run manage.py runserver",
|
||||||
|
"dev:frontend": "cd frontend && pnpm run dev",
|
||||||
|
"build": "pnpm run build:frontend && cd backend && uv run manage.py collectstatic --noinput",
|
||||||
|
"build:frontend": "cd frontend && pnpm run build",
|
||||||
|
"test": "pnpm run test:backend && pnpm run test:frontend",
|
||||||
|
"test:backend": "cd backend && uv run manage.py test",
|
||||||
|
"test:frontend": "cd frontend && pnpm run test",
|
||||||
|
"lint": "cd frontend && pnpm run lint && cd ../backend && uv run flake8 .",
|
||||||
|
"format": "cd frontend && pnpm run format && cd ../backend && uv run black ."
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"concurrently": "^8.2.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Frontend package.json
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"name": "thrillwiki-frontend",
|
||||||
|
"private": true,
|
||||||
|
"version": "0.1.0",
|
||||||
|
"type": "module",
|
||||||
|
"scripts": {
|
||||||
|
"dev": "vite",
|
||||||
|
"build": "vite build",
|
||||||
|
"preview": "vite preview",
|
||||||
|
"test": "vitest",
|
||||||
|
"test:e2e": "playwright test",
|
||||||
|
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix",
|
||||||
|
"format": "prettier --write src/",
|
||||||
|
"type-check": "vue-tsc --noEmit"
|
||||||
|
},
|
||||||
|
"dependencies": {
|
||||||
|
"vue": "^3.4.0",
|
||||||
|
"vue-router": "^4.3.0",
|
||||||
|
"pinia": "^2.1.0",
|
||||||
|
"axios": "^1.6.0"
|
||||||
|
},
|
||||||
|
"devDependencies": {
|
||||||
|
"@vitejs/plugin-vue": "^5.0.0",
|
||||||
|
"vite": "^5.0.0",
|
||||||
|
"vue-tsc": "^2.0.0",
|
||||||
|
"typescript": "^5.3.0",
|
||||||
|
"tailwindcss": "^3.4.0",
|
||||||
|
"autoprefixer": "^10.4.0",
|
||||||
|
"postcss": "^8.4.0",
|
||||||
|
"eslint": "^8.57.0",
|
||||||
|
"prettier": "^3.2.0",
|
||||||
|
"vitest": "^1.3.0",
|
||||||
|
"@playwright/test": "^1.42.0"
|
||||||
|
}
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## File Migration Mapping
|
||||||
|
|
||||||
|
### High-Level Moves
|
||||||
|
```
|
||||||
|
Current → New Location
|
||||||
|
├── manage.py → backend/manage.py
|
||||||
|
├── pyproject.toml → backend/pyproject.toml (+ root package.json)
|
||||||
|
├── config/ → backend/config/
|
||||||
|
├── thrillwiki/ → backend/thrillwiki/
|
||||||
|
├── accounts/ → backend/apps/accounts/
|
||||||
|
├── parks/ → backend/apps/parks/
|
||||||
|
├── rides/ → backend/apps/rides/
|
||||||
|
├── moderation/ → backend/apps/moderation/
|
||||||
|
├── location/ → backend/apps/location/
|
||||||
|
├── media/ → backend/apps/media/
|
||||||
|
├── email_service/ → backend/apps/email_service/
|
||||||
|
├── core/ → backend/apps/core/
|
||||||
|
├── templates/ → backend/templates/ (API) + frontend/src/views/ (UI)
|
||||||
|
├── static/ → backend/static/ (admin) + frontend/src/assets/
|
||||||
|
├── media/ → media/ (shared, accessible to both)
|
||||||
|
├── scripts/ → shared/scripts/
|
||||||
|
├── docs/ → shared/docs/
|
||||||
|
├── tests/ → backend/tests/ + frontend/tests/
|
||||||
|
└── staticfiles/ → dist/backend/ (generated)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Detailed Backend App Moves
|
||||||
|
Each Django app moves to `backend/apps/{app_name}/` with structure preserved:
|
||||||
|
- Models, views, serializers stay the same
|
||||||
|
- Templates for API responses remain in app directories
|
||||||
|
- Static files move to frontend if UI-related
|
||||||
|
- Tests remain with respective apps
|
||||||
|
|
||||||
|
## Build and Deployment Strategy
|
||||||
|
|
||||||
|
### Development Build Process
|
||||||
|
1. **Backend**: No build step, runs directly with Django dev server
|
||||||
|
2. **Frontend**: Vite development server with HMR
|
||||||
|
3. **Shared**: Scripts orchestrate starting both services
|
||||||
|
|
||||||
|
### Production Build Process
|
||||||
|
```mermaid
|
||||||
|
graph TD
|
||||||
|
A[CI/CD Trigger] --> B[Install Dependencies]
|
||||||
|
B --> C[Build Frontend]
|
||||||
|
B --> D[Collect Django Static]
|
||||||
|
C --> E[Generate Frontend Bundle]
|
||||||
|
D --> F[Collect Backend Assets]
|
||||||
|
E --> G[Create Docker Images]
|
||||||
|
F --> G
|
||||||
|
G --> H[Deploy to Production]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Container Strategy
|
||||||
|
- **Multi-stage Docker builds**: Separate backend and frontend images
|
||||||
|
- **Nginx**: Reverse proxy and static file serving
|
||||||
|
- **Volume mounts**: For media files and logs
|
||||||
|
- **Environment-based configuration**: Development vs. production
|
||||||
|
|
||||||
|
## API Integration Strategy
|
||||||
|
|
||||||
|
### Backend API Structure
|
||||||
|
```python
|
||||||
|
# Enhanced DRF setup for SPA
|
||||||
|
REST_FRAMEWORK = {
|
||||||
|
'DEFAULT_RENDERER_CLASSES': [
|
||||||
|
'rest_framework.renderers.JSONRenderer',
|
||||||
|
],
|
||||||
|
'DEFAULT_AUTHENTICATION_CLASSES': [
|
||||||
|
'rest_framework.authentication.SessionAuthentication',
|
||||||
|
'rest_framework.authentication.TokenAuthentication',
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
# CORS for development
|
||||||
|
CORS_ALLOWED_ORIGINS = [
|
||||||
|
"http://localhost:3000", # Vue dev server
|
||||||
|
]
|
||||||
|
```
|
||||||
|
|
||||||
|
### Frontend API Service

```javascript
// API service with auth integration
import axios from 'axios';

class ApiService {
  constructor() {
    this.client = axios.create({
      baseURL: import.meta.env.VITE_API_BASE_URL,
      withCredentials: true,
    });
  }

  // Park operations
  getParks(params = {}) {
    return this.client.get('/parks/', { params });
  }

  // Ride operations
  getRides(parkId, params = {}) {
    return this.client.get(`/parks/${parkId}/rides/`, { params });
  }
}
```

## Configuration Management

### Shared Environment Variables

- Database connections
- Redis/Cache settings
- Secret keys and API keys
- Feature flags

### Application-Specific Settings

- **Django**: `backend/config/django/` (see the sketch below)
- **Vue.js**: `frontend/.env` files
- **Docker**: `shared/config/docker/`

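As a hedged illustration, the base Django settings could pull these shared values straight from the environment. The module path follows the layout above; the defaults and parsing below are assumptions, not the repository's actual settings code.

```python
# config/django/base.py -- illustrative sketch of reading shared environment variables.
# Variable names mirror backend/.env.example; the defaults here are assumptions.
import os

SECRET_KEY = os.environ["SECRET_KEY"]
DEBUG = os.environ.get("DEBUG", "False") == "True"
ALLOWED_HOSTS = os.environ.get("ALLOWED_HOSTS", "localhost,127.0.0.1").split(",")

# Connection URLs shared between Django, Docker, and the scripts.
DATABASE_URL = os.environ.get("DATABASE_URL", "postgresql://localhost:5432/thrillwiki")
REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379")

# Feature flags toggled per environment.
ENABLE_DEBUG_TOOLBAR = os.environ.get("ENABLE_DEBUG_TOOLBAR", "False") == "True"

# Origins allowed to call the API (the Vue dev server in development).
CORS_ALLOWED_ORIGINS = os.environ.get(
    "CORS_ALLOWED_ORIGINS", "http://localhost:3000"
).split(",")
```

In practice the `DATABASE_URL` string is typically parsed into Django's `DATABASES` setting with a helper such as dj-database-url.
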
### Development vs. Production

- Development: Multiple local servers, hot reloading
- Production: Containerized deployment, CDN integration

## Benefits of This Structure

1. **Clear Separation**: Backend and frontend concerns are clearly separated
2. **Scalability**: Each part can be developed, tested, and deployed independently
3. **Modern Workflow**: Supports latest Vue 3, Vite, and Django patterns
4. **Backward Compatibility**: Preserves existing Django app structure
5. **Developer Experience**: Hot reloading, TypeScript support, modern tooling
6. **Deployment Flexibility**: Can deploy as SPA + API or traditional Django

## Implementation Phases

### Phase 1: Structure Setup

1. Create new directory structure
2. Move Django code to `backend/`
3. Initialize Vue.js frontend
4. Set up basic API integration

### Phase 2: Frontend Development

1. Create Vue.js components for existing Django templates
2. Implement routing and state management
3. Integrate with Django API endpoints
4. Add authentication flow

### Phase 3: Build & Deploy

1. Set up build processes
2. Configure CI/CD pipelines
3. Implement production deployment
4. Performance optimization

## Considerations and Trade-offs

### Advantages

- Modern development experience
- Better code organization
- Independent scaling
- Rich frontend interactions
- API-first architecture

### Challenges

- Increased complexity
- Build process coordination
- Authentication across services
- SEO considerations (if needed)
- Development environment setup

## Next Steps

1. **Validate Architecture**: Review with development team
2. **Prototype Setup**: Create basic structure with sample components
3. **Migration Planning**: Detailed plan for moving existing code
4. **Tool Selection**: Finalize Vue.js ecosystem choices (Pinia vs. Vuex, etc.)
5. **Implementation**: Begin phase-by-phase migration

---

This architecture provides a solid foundation for migrating ThrillWiki to a modern Django + Vue.js monorepo while preserving existing functionality and enabling future growth.

31  backend/.env.example  Normal file
@@ -0,0 +1,31 @@
# Django Configuration
SECRET_KEY=your-secret-key-here
DEBUG=True
DJANGO_SETTINGS_MODULE=config.django.local

# Database
DATABASE_URL=postgresql://user:password@localhost:5432/thrillwiki

# Redis
REDIS_URL=redis://localhost:6379

# Email Configuration (Optional)
EMAIL_HOST=smtp.gmail.com
EMAIL_PORT=587
EMAIL_USE_TLS=True
EMAIL_HOST_USER=your-email@gmail.com
EMAIL_HOST_PASSWORD=your-app-password

# Media and Static Files
MEDIA_URL=/media/
STATIC_URL=/static/

# Security
ALLOWED_HOSTS=localhost,127.0.0.1

# API Configuration
CORS_ALLOWED_ORIGINS=http://localhost:3000

# Feature Flags
ENABLE_DEBUG_TOOLBAR=True
ENABLE_SILK_PROFILER=False

229  backend/README.md  Normal file
@@ -0,0 +1,229 @@
# ThrillWiki Backend

Django REST API backend for the ThrillWiki monorepo.

## 🏗️ Architecture

This backend follows Django best practices with a modular app structure:

```
backend/
├── apps/              # Django applications
│   ├── accounts/      # User management
│   ├── parks/         # Theme park data
│   ├── rides/         # Ride information
│   ├── moderation/    # Content moderation
│   ├── location/      # Geographic data
│   ├── media/         # File management
│   ├── email_service/ # Email functionality
│   └── core/          # Core utilities
├── config/            # Django configuration
│   ├── django/        # Settings files
│   └── settings/      # Modular settings
├── templates/         # Django templates
├── static/            # Static files
└── tests/             # Test files
```

## 🛠️ Technology Stack

- **Django 5.0+** - Web framework
- **Django REST Framework** - API framework
- **PostgreSQL** - Primary database
- **Redis** - Caching and sessions
- **UV** - Python package management
- **Celery** - Background task processing

## 🚀 Quick Start

### Prerequisites

- Python 3.11+
- [uv](https://docs.astral.sh/uv/) package manager
- PostgreSQL 14+
- Redis 6+

### Setup

1. **Install dependencies**

   ```bash
   cd backend
   uv sync
   ```

2. **Environment configuration**

   ```bash
   cp .env.example .env
   # Edit .env with your settings
   ```

3. **Database setup**

   ```bash
   uv run manage.py migrate
   uv run manage.py createsuperuser
   ```

4. **Start development server**

   ```bash
   uv run manage.py runserver
   ```

## 🔧 Configuration

### Environment Variables

Required environment variables:

```bash
# Database
DATABASE_URL=postgresql://user:pass@localhost/thrillwiki

# Django
SECRET_KEY=your-secret-key
DEBUG=True
DJANGO_SETTINGS_MODULE=config.django.local

# Redis
REDIS_URL=redis://localhost:6379

# Email (optional)
EMAIL_HOST=smtp.gmail.com
EMAIL_PORT=587
EMAIL_USE_TLS=True
EMAIL_HOST_USER=your-email@gmail.com
EMAIL_HOST_PASSWORD=your-app-password
```

### Settings Structure

- `config/django/base.py` - Base settings
- `config/django/local.py` - Development settings (see the sketch below)
- `config/django/production.py` - Production settings
- `config/django/test.py` - Test settings

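To make the split concrete, here is a minimal, assumed sketch of a development module extending the base module; the file names follow the list above, but the overrides shown are illustrative.

```python
# config/django/local.py -- illustrative development settings; not the actual file.
from .base import *  # noqa: F401,F403

DEBUG = True
ALLOWED_HOSTS = ["localhost", "127.0.0.1"]

# Development-only conveniences, kept out of production.py.
INSTALLED_APPS += ["django_extensions"]  # noqa: F405
EMAIL_BACKEND = "django.core.mail.backends.console.EmailBackend"
```
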
## 📁 Apps Overview

### Core Apps

- **accounts** - User authentication and profile management
- **parks** - Theme park models and operations
- **rides** - Ride information and relationships
- **core** - Shared utilities and base classes

### Support Apps

- **moderation** - Content moderation workflows
- **location** - Geographic data and services
- **media** - File upload and management
- **email_service** - Email sending and templates

## 🔌 API Endpoints

Base URL: `http://localhost:8000/api/`

### Authentication

- `POST /auth/login/` - User login
- `POST /auth/logout/` - User logout
- `POST /auth/register/` - User registration

### Parks

- `GET /parks/` - List parks
- `GET /parks/{id}/` - Park details
- `POST /parks/` - Create park (admin)

### Rides

- `GET /rides/` - List rides
- `GET /rides/{id}/` - Ride details
- `GET /parks/{park_id}/rides/` - Rides by park

A sketch of how these routes can be wired up with DRF routers follows.

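The routing sketch below is an assumption: the viewset names and module paths are illustrative and not taken from the project's actual URL configuration.

```python
# Illustrative API routing for the endpoints above; names are assumptions.
from django.urls import include, path
from rest_framework.routers import DefaultRouter

from apps.parks.views import ParkViewSet  # assumed viewset name
from apps.rides.views import RideViewSet  # assumed viewset name

router = DefaultRouter()
router.register(r"parks", ParkViewSet, basename="park")
router.register(r"rides", RideViewSet, basename="ride")

urlpatterns = [
    path("", include(router.urls)),
    # "Rides by park" as a plain nested path, keeping the sketch dependency-free.
    path(
        "parks/<int:park_id>/rides/",
        RideViewSet.as_view({"get": "list"}),
        name="park-rides",
    ),
]
```
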
## 🧪 Testing

```bash
# Run all tests
uv run manage.py test

# Run specific app tests
uv run manage.py test apps.parks

# Run with coverage
uv run coverage run manage.py test
uv run coverage report
```

## 🔧 Management Commands

Custom management commands:

```bash
# Import park data
uv run manage.py import_parks data/parks.json

# Generate test data
uv run manage.py generate_test_data

# Clean up expired sessions
uv run manage.py clearsessions
```

## 📊 Database

### Entity Relationships

- **Parks** have Operators (required) and PropertyOwners (optional)
- **Rides** belong to Parks and may have Manufacturers/Designers
- **Users** can create submissions and moderate content

A hedged model sketch of these relationships follows.

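As a rough, assumed sketch only: the bullets above translate into foreign keys along the following lines. Model and field names here are illustrative and not the project's actual schema.

```python
# Illustrative relationship sketch; model and field names are assumptions.
from django.db import models


class Park(models.Model):
    name = models.CharField(max_length=255)
    operator = models.ForeignKey(        # required
        "parks.Operator", on_delete=models.PROTECT, related_name="parks"
    )
    property_owner = models.ForeignKey(  # optional
        "parks.PropertyOwner",
        on_delete=models.SET_NULL,
        null=True,
        blank=True,
        related_name="owned_parks",
    )


class Ride(models.Model):
    name = models.CharField(max_length=255)
    park = models.ForeignKey(Park, on_delete=models.CASCADE, related_name="rides")
    manufacturer = models.ForeignKey(    # optional
        "rides.Manufacturer", on_delete=models.SET_NULL, null=True, blank=True
    )
    designer = models.ForeignKey(        # optional
        "rides.Designer", on_delete=models.SET_NULL, null=True, blank=True
    )
```
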
### Migrations

```bash
# Create migrations
uv run manage.py makemigrations

# Apply migrations
uv run manage.py migrate

# Show migration status
uv run manage.py showmigrations
```

## 🔐 Security

- CORS configured for frontend integration
- CSRF protection enabled
- JWT token authentication
- Rate limiting on API endpoints
- Input validation and sanitization

## 📈 Performance

- Database query optimization
- Redis caching for frequent queries
- Background task processing with Celery
- Database connection pooling

## 🚀 Deployment

See the [Deployment Guide](../shared/docs/deployment/) for production setup.

## 🐛 Debugging

### Development Tools

- Django Debug Toolbar
- Django Extensions
- Silk profiler for performance analysis

### Logging

Logs are written to:

- Console (development)
- Files in `logs/` directory (production)
- External logging service (production)

## 🤝 Contributing

1. Follow Django coding standards
2. Write tests for new features
3. Update documentation
4. Run linting: `uv run flake8 .`
5. Format code: `uv run black .`

6  backend/apps/__init__.py  Normal file
@@ -0,0 +1,6 @@
"""
Django apps package.

This directory contains all Django applications for the ThrillWiki backend.
Each app is self-contained and follows Django best practices.
"""

@@ -6,18 +6,19 @@ from django.contrib.sites.shortcuts import get_current_site
 
 User = get_user_model()
 
 
 class CustomAccountAdapter(DefaultAccountAdapter):
     def is_open_for_signup(self, request):
         """
         Whether to allow sign ups.
         """
-        return getattr(settings, 'ACCOUNT_ALLOW_SIGNUPS', True)
+        return True
 
     def get_email_confirmation_url(self, request, emailconfirmation):
         """
         Constructs the email confirmation (activation) url.
         """
-        site = get_current_site(request)
+        get_current_site(request)
         return f"{settings.LOGIN_REDIRECT_URL}verify-email?key={emailconfirmation.key}"
 
     def send_confirmation_mail(self, request, emailconfirmation, signup):
@@ -27,30 +28,31 @@ class CustomAccountAdapter(DefaultAccountAdapter):
         current_site = get_current_site(request)
         activate_url = self.get_email_confirmation_url(request, emailconfirmation)
         ctx = {
-            'user': emailconfirmation.email_address.user,
-            'activate_url': activate_url,
-            'current_site': current_site,
-            'key': emailconfirmation.key,
+            "user": emailconfirmation.email_address.user,
+            "activate_url": activate_url,
+            "current_site": current_site,
+            "key": emailconfirmation.key,
         }
         if signup:
-            email_template = 'account/email/email_confirmation_signup'
+            email_template = "account/email/email_confirmation_signup"
         else:
-            email_template = 'account/email/email_confirmation'
+            email_template = "account/email/email_confirmation"
         self.send_mail(email_template, emailconfirmation.email_address.email, ctx)
 
 
 class CustomSocialAccountAdapter(DefaultSocialAccountAdapter):
     def is_open_for_signup(self, request, sociallogin):
         """
         Whether to allow social account sign ups.
         """
-        return getattr(settings, 'SOCIALACCOUNT_ALLOW_SIGNUPS', True)
+        return True
 
     def populate_user(self, request, sociallogin, data):
         """
         Hook that can be used to further populate the user instance.
         """
         user = super().populate_user(request, sociallogin, data)
-        if sociallogin.account.provider == 'discord':
+        if sociallogin.account.provider == "discord":
             user.discord_id = sociallogin.account.uid
         return user
360
backend/apps/accounts/admin.py
Normal file
360
backend/apps/accounts/admin.py
Normal file
@@ -0,0 +1,360 @@
|
|||||||
|
from django.contrib import admin
|
||||||
|
from django.contrib.auth.admin import UserAdmin
|
||||||
|
from django.utils.html import format_html
|
||||||
|
from django.contrib.auth.models import Group
|
||||||
|
from .models import (
|
||||||
|
User,
|
||||||
|
UserProfile,
|
||||||
|
EmailVerification,
|
||||||
|
PasswordReset,
|
||||||
|
TopList,
|
||||||
|
TopListItem,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class UserProfileInline(admin.StackedInline):
|
||||||
|
model = UserProfile
|
||||||
|
can_delete = False
|
||||||
|
verbose_name_plural = "Profile"
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"Personal Info",
|
||||||
|
{"fields": ("display_name", "avatar", "pronouns", "bio")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Social Media",
|
||||||
|
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ride Credits",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class TopListItemInline(admin.TabularInline):
|
||||||
|
model = TopListItem
|
||||||
|
extra = 1
|
||||||
|
fields = ("content_type", "object_id", "rank", "notes")
|
||||||
|
ordering = ("rank",)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(User)
|
||||||
|
class CustomUserAdmin(UserAdmin):
|
||||||
|
list_display = (
|
||||||
|
"username",
|
||||||
|
"email",
|
||||||
|
"get_avatar",
|
||||||
|
"get_status",
|
||||||
|
"role",
|
||||||
|
"date_joined",
|
||||||
|
"last_login",
|
||||||
|
"get_credits",
|
||||||
|
)
|
||||||
|
list_filter = (
|
||||||
|
"is_active",
|
||||||
|
"is_staff",
|
||||||
|
"role",
|
||||||
|
"is_banned",
|
||||||
|
"groups",
|
||||||
|
"date_joined",
|
||||||
|
)
|
||||||
|
search_fields = ("username", "email")
|
||||||
|
ordering = ("-date_joined",)
|
||||||
|
actions = [
|
||||||
|
"activate_users",
|
||||||
|
"deactivate_users",
|
||||||
|
"ban_users",
|
||||||
|
"unban_users",
|
||||||
|
]
|
||||||
|
inlines = [UserProfileInline]
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(None, {"fields": ("username", "password")}),
|
||||||
|
("Personal info", {"fields": ("email", "pending_email")}),
|
||||||
|
(
|
||||||
|
"Roles and Permissions",
|
||||||
|
{
|
||||||
|
"fields": ("role", "groups", "user_permissions"),
|
||||||
|
"description": (
|
||||||
|
"Role determines group membership. Groups determine permissions."
|
||||||
|
),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Status",
|
||||||
|
{
|
||||||
|
"fields": ("is_active", "is_staff", "is_superuser"),
|
||||||
|
"description": "These are automatically managed based on role.",
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ban Status",
|
||||||
|
{
|
||||||
|
"fields": ("is_banned", "ban_reason", "ban_date"),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Preferences",
|
||||||
|
{
|
||||||
|
"fields": ("theme_preference",),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
("Important dates", {"fields": ("last_login", "date_joined")}),
|
||||||
|
)
|
||||||
|
add_fieldsets = (
|
||||||
|
(
|
||||||
|
None,
|
||||||
|
{
|
||||||
|
"classes": ("wide",),
|
||||||
|
"fields": (
|
||||||
|
"username",
|
||||||
|
"email",
|
||||||
|
"password1",
|
||||||
|
"password2",
|
||||||
|
"role",
|
||||||
|
),
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Avatar")
|
||||||
|
def get_avatar(self, obj):
|
||||||
|
if obj.profile.avatar:
|
||||||
|
return format_html(
|
||||||
|
'<img src="{}" width="30" height="30" style="border-radius:50%;" />',
|
||||||
|
obj.profile.avatar.url,
|
||||||
|
)
|
||||||
|
return format_html(
|
||||||
|
'<div style="width:30px; height:30px; border-radius:50%; '
|
||||||
|
"background-color:#007bff; color:white; display:flex; "
|
||||||
|
'align-items:center; justify-content:center;">{}</div>',
|
||||||
|
obj.username[0].upper(),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Status")
|
||||||
|
def get_status(self, obj):
|
||||||
|
if obj.is_banned:
|
||||||
|
return format_html('<span style="color: red;">Banned</span>')
|
||||||
|
if not obj.is_active:
|
||||||
|
return format_html('<span style="color: orange;">Inactive</span>')
|
||||||
|
if obj.is_superuser:
|
||||||
|
return format_html('<span style="color: purple;">Superuser</span>')
|
||||||
|
if obj.is_staff:
|
||||||
|
return format_html('<span style="color: blue;">Staff</span>')
|
||||||
|
return format_html('<span style="color: green;">Active</span>')
|
||||||
|
|
||||||
|
@admin.display(description="Ride Credits")
|
||||||
|
def get_credits(self, obj):
|
||||||
|
try:
|
||||||
|
profile = obj.profile
|
||||||
|
return format_html(
|
||||||
|
"RC: {}<br>DR: {}<br>FR: {}<br>WR: {}",
|
||||||
|
profile.coaster_credits,
|
||||||
|
profile.dark_ride_credits,
|
||||||
|
profile.flat_ride_credits,
|
||||||
|
profile.water_ride_credits,
|
||||||
|
)
|
||||||
|
except UserProfile.DoesNotExist:
|
||||||
|
return "-"
|
||||||
|
|
||||||
|
@admin.action(description="Activate selected users")
|
||||||
|
def activate_users(self, request, queryset):
|
||||||
|
queryset.update(is_active=True)
|
||||||
|
|
||||||
|
@admin.action(description="Deactivate selected users")
|
||||||
|
def deactivate_users(self, request, queryset):
|
||||||
|
queryset.update(is_active=False)
|
||||||
|
|
||||||
|
@admin.action(description="Ban selected users")
|
||||||
|
def ban_users(self, request, queryset):
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
queryset.update(is_banned=True, ban_date=timezone.now())
|
||||||
|
|
||||||
|
@admin.action(description="Unban selected users")
|
||||||
|
def unban_users(self, request, queryset):
|
||||||
|
queryset.update(is_banned=False, ban_date=None, ban_reason="")
|
||||||
|
|
||||||
|
def save_model(self, request, obj, form, change):
|
||||||
|
creating = not obj.pk
|
||||||
|
super().save_model(request, obj, form, change)
|
||||||
|
if creating and obj.role != User.Roles.USER:
|
||||||
|
# Ensure new user with role gets added to appropriate group
|
||||||
|
group = Group.objects.filter(name=obj.role).first()
|
||||||
|
if group:
|
||||||
|
obj.groups.add(group)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(UserProfile)
|
||||||
|
class UserProfileAdmin(admin.ModelAdmin):
|
||||||
|
list_display = (
|
||||||
|
"user",
|
||||||
|
"display_name",
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
list_filter = (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
search_fields = ("user__username", "user__email", "display_name", "bio")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"User Information",
|
||||||
|
{"fields": ("user", "display_name", "avatar", "pronouns", "bio")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Social Media",
|
||||||
|
{"fields": ("twitter", "instagram", "youtube", "discord")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Ride Credits",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"coaster_credits",
|
||||||
|
"dark_ride_credits",
|
||||||
|
"flat_ride_credits",
|
||||||
|
"water_ride_credits",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(EmailVerification)
|
||||||
|
class EmailVerificationAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("user", "created_at", "last_sent", "is_expired")
|
||||||
|
list_filter = ("created_at", "last_sent")
|
||||||
|
search_fields = ("user__username", "user__email", "token")
|
||||||
|
readonly_fields = ("created_at", "last_sent")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
("Verification Details", {"fields": ("user", "token")}),
|
||||||
|
("Timing", {"fields": ("created_at", "last_sent")}),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Status")
|
||||||
|
def is_expired(self, obj):
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
if timezone.now() - obj.last_sent > timedelta(days=1):
|
||||||
|
return format_html('<span style="color: red;">Expired</span>')
|
||||||
|
return format_html('<span style="color: green;">Valid</span>')
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(TopList)
|
||||||
|
class TopListAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("title", "user", "category", "created_at", "updated_at")
|
||||||
|
list_filter = ("category", "created_at", "updated_at")
|
||||||
|
search_fields = ("title", "user__username", "description")
|
||||||
|
inlines = [TopListItemInline]
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"Basic Information",
|
||||||
|
{"fields": ("user", "title", "category", "description")},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Timestamps",
|
||||||
|
{"fields": ("created_at", "updated_at"), "classes": ("collapse",)},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
readonly_fields = ("created_at", "updated_at")
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(TopListItem)
|
||||||
|
class TopListItemAdmin(admin.ModelAdmin):
|
||||||
|
list_display = ("top_list", "content_type", "object_id", "rank")
|
||||||
|
list_filter = ("top_list__category", "rank")
|
||||||
|
search_fields = ("top_list__title", "notes")
|
||||||
|
ordering = ("top_list", "rank")
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
("List Information", {"fields": ("top_list", "rank")}),
|
||||||
|
("Item Details", {"fields": ("content_type", "object_id", "notes")}),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@admin.register(PasswordReset)
|
||||||
|
class PasswordResetAdmin(admin.ModelAdmin):
|
||||||
|
"""Admin interface for password reset tokens"""
|
||||||
|
|
||||||
|
list_display = (
|
||||||
|
"user",
|
||||||
|
"created_at",
|
||||||
|
"expires_at",
|
||||||
|
"is_expired",
|
||||||
|
"used",
|
||||||
|
)
|
||||||
|
list_filter = (
|
||||||
|
"used",
|
||||||
|
"created_at",
|
||||||
|
"expires_at",
|
||||||
|
)
|
||||||
|
search_fields = (
|
||||||
|
"user__username",
|
||||||
|
"user__email",
|
||||||
|
"token",
|
||||||
|
)
|
||||||
|
readonly_fields = (
|
||||||
|
"token",
|
||||||
|
"created_at",
|
||||||
|
"expires_at",
|
||||||
|
)
|
||||||
|
date_hierarchy = "created_at"
|
||||||
|
ordering = ("-created_at",)
|
||||||
|
|
||||||
|
fieldsets = (
|
||||||
|
(
|
||||||
|
"Reset Details",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"user",
|
||||||
|
"token",
|
||||||
|
"used",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"Timing",
|
||||||
|
{
|
||||||
|
"fields": (
|
||||||
|
"created_at",
|
||||||
|
"expires_at",
|
||||||
|
)
|
||||||
|
},
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
@admin.display(description="Status", boolean=True)
|
||||||
|
def is_expired(self, obj):
|
||||||
|
"""Display expiration status with color coding"""
|
||||||
|
from django.utils import timezone
|
||||||
|
|
||||||
|
if obj.used:
|
||||||
|
return format_html('<span style="color: blue;">Used</span>')
|
||||||
|
elif timezone.now() > obj.expires_at:
|
||||||
|
return format_html('<span style="color: red;">Expired</span>')
|
||||||
|
return format_html('<span style="color: green;">Valid</span>')
|
||||||
|
|
||||||
|
def has_add_permission(self, request):
|
||||||
|
"""Disable manual creation of password reset tokens"""
|
||||||
|
return False
|
||||||
|
|
||||||
|
def has_change_permission(self, request, obj=None):
|
||||||
|
"""Allow viewing but restrict editing of password reset tokens"""
|
||||||
|
return getattr(request.user, "is_superuser", False)
|
||||||
@@ -3,7 +3,7 @@ from django.apps import AppConfig
 
 class AccountsConfig(AppConfig):
     default_auto_field = "django.db.models.BigAutoField"
-    name = "accounts"
+    name = "apps.accounts"
 
     def ready(self):
-        import accounts.signals  # noqa
+        import apps.accounts.signals  # noqa
@@ -0,0 +1,46 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp, SocialAccount, SocialToken
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Check all social auth related tables"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Check SocialApp
|
||||||
|
self.stdout.write("\nChecking SocialApp table:")
|
||||||
|
for app in SocialApp.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {
|
||||||
|
app.pk}, Provider: {
|
||||||
|
app.provider}, Name: {
|
||||||
|
app.name}, Client ID: {
|
||||||
|
app.client_id}"
|
||||||
|
)
|
||||||
|
self.stdout.write("Sites:")
|
||||||
|
for site in app.sites.all():
|
||||||
|
self.stdout.write(f" - {site.domain}")
|
||||||
|
|
||||||
|
# Check SocialAccount
|
||||||
|
self.stdout.write("\nChecking SocialAccount table:")
|
||||||
|
for account in SocialAccount.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {
|
||||||
|
account.pk}, Provider: {
|
||||||
|
account.provider}, UID: {
|
||||||
|
account.uid}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check SocialToken
|
||||||
|
self.stdout.write("\nChecking SocialToken table:")
|
||||||
|
for token in SocialToken.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {token.pk}, Account: {token.account}, App: {token.app}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check Site
|
||||||
|
self.stdout.write("\nChecking Site table:")
|
||||||
|
for site in Site.objects.all():
|
||||||
|
self.stdout.write(
|
||||||
|
f"ID: {site.pk}, Domain: {site.domain}, Name: {site.name}"
|
||||||
|
)
|
||||||
@@ -0,0 +1,27 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Check social app configurations"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
social_apps = SocialApp.objects.all()
|
||||||
|
|
||||||
|
if not social_apps:
|
||||||
|
self.stdout.write(self.style.ERROR("No social apps found"))
|
||||||
|
return
|
||||||
|
|
||||||
|
for app in social_apps:
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f"\nProvider: {
|
||||||
|
app.provider}"
|
||||||
|
)
|
||||||
|
)
|
||||||
|
self.stdout.write(f"Name: {app.name}")
|
||||||
|
self.stdout.write(f"Client ID: {app.client_id}")
|
||||||
|
self.stdout.write(f"Secret: {app.secret}")
|
||||||
|
self.stdout.write(
|
||||||
|
f'Sites: {", ".join(str(site.domain) for site in app.sites.all())}'
|
||||||
|
)
|
||||||
@@ -1,8 +1,9 @@
|
|||||||
from django.core.management.base import BaseCommand
|
from django.core.management.base import BaseCommand
|
||||||
from django.db import connection
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
class Command(BaseCommand):
|
class Command(BaseCommand):
|
||||||
help = 'Clean up social auth tables and migrations'
|
help = "Clean up social auth tables and migrations"
|
||||||
|
|
||||||
def handle(self, *args, **options):
|
def handle(self, *args, **options):
|
||||||
with connection.cursor() as cursor:
|
with connection.cursor() as cursor:
|
||||||
@@ -11,12 +12,17 @@ class Command(BaseCommand):
|
|||||||
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialapp_sites")
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialapp_sites")
|
||||||
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialaccount")
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialaccount")
|
||||||
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialtoken")
|
cursor.execute("DROP TABLE IF EXISTS socialaccount_socialtoken")
|
||||||
|
|
||||||
# Remove migration records
|
# Remove migration records
|
||||||
cursor.execute("DELETE FROM django_migrations WHERE app='socialaccount'")
|
cursor.execute("DELETE FROM django_migrations WHERE app='socialaccount'")
|
||||||
cursor.execute("DELETE FROM django_migrations WHERE app='accounts' AND name LIKE '%social%'")
|
cursor.execute(
|
||||||
|
"DELETE FROM django_migrations WHERE app='accounts' "
|
||||||
|
"AND name LIKE '%social%'"
|
||||||
|
)
|
||||||
|
|
||||||
# Reset sequences
|
# Reset sequences
|
||||||
cursor.execute("DELETE FROM sqlite_sequence WHERE name LIKE '%social%'")
|
cursor.execute("DELETE FROM sqlite_sequence WHERE name LIKE '%social%'")
|
||||||
|
|
||||||
self.stdout.write(self.style.SUCCESS('Successfully cleaned up social auth configuration'))
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("Successfully cleaned up social auth configuration")
|
||||||
|
)
|
||||||
@@ -1,9 +1,8 @@
|
|||||||
from django.core.management.base import BaseCommand
|
from django.core.management.base import BaseCommand
|
||||||
from django.contrib.auth import get_user_model
|
from django.contrib.auth import get_user_model
|
||||||
from django.contrib.auth.models import Group
|
from apps.parks.models import ParkReview, Park
|
||||||
from parks.models import Park, ParkReview as Review
|
from apps.rides.models import Ride
|
||||||
from rides.models import Ride
|
from apps.media.models import Photo
|
||||||
from media.models import Photo
|
|
||||||
|
|
||||||
User = get_user_model()
|
User = get_user_model()
|
||||||
|
|
||||||
@@ -13,22 +12,21 @@ class Command(BaseCommand):
|
|||||||
|
|
||||||
def handle(self, *args, **kwargs):
|
def handle(self, *args, **kwargs):
|
||||||
# Delete test users
|
# Delete test users
|
||||||
test_users = User.objects.filter(
|
test_users = User.objects.filter(username__in=["testuser", "moderator"])
|
||||||
username__in=["testuser", "moderator"])
|
|
||||||
count = test_users.count()
|
count = test_users.count()
|
||||||
test_users.delete()
|
test_users.delete()
|
||||||
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test users"))
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test users"))
|
||||||
|
|
||||||
# Delete test reviews
|
# Delete test reviews
|
||||||
reviews = Review.objects.filter(
|
reviews = ParkReview.objects.filter(
|
||||||
user__username__in=["testuser", "moderator"])
|
user__username__in=["testuser", "moderator"]
|
||||||
|
)
|
||||||
count = reviews.count()
|
count = reviews.count()
|
||||||
reviews.delete()
|
reviews.delete()
|
||||||
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test reviews"))
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test reviews"))
|
||||||
|
|
||||||
# Delete test photos
|
# Delete test photos
|
||||||
photos = Photo.objects.filter(uploader__username__in=[
|
photos = Photo.objects.filter(uploader__username__in=["testuser", "moderator"])
|
||||||
"testuser", "moderator"])
|
|
||||||
count = photos.count()
|
count = photos.count()
|
||||||
photos.delete()
|
photos.delete()
|
||||||
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test photos"))
|
self.stdout.write(self.style.SUCCESS(f"Deleted {count} test photos"))
|
||||||
@@ -64,7 +62,6 @@ class Command(BaseCommand):
|
|||||||
os.remove(f)
|
os.remove(f)
|
||||||
self.stdout.write(self.style.SUCCESS(f"Deleted {f}"))
|
self.stdout.write(self.style.SUCCESS(f"Deleted {f}"))
|
||||||
except OSError as e:
|
except OSError as e:
|
||||||
self.stdout.write(self.style.WARNING(
|
self.stdout.write(self.style.WARNING(f"Error deleting {f}: {e}"))
|
||||||
f"Error deleting {f}: {e}"))
|
|
||||||
|
|
||||||
self.stdout.write(self.style.SUCCESS("Test data cleanup complete"))
|
self.stdout.write(self.style.SUCCESS("Test data cleanup complete"))
|
||||||
@@ -0,0 +1,55 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Create social apps for authentication"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Get the default site
|
||||||
|
site = Site.objects.get_or_create(
|
||||||
|
id=1,
|
||||||
|
defaults={
|
||||||
|
"domain": "localhost:8000",
|
||||||
|
"name": "ThrillWiki Development",
|
||||||
|
},
|
||||||
|
)[0]
|
||||||
|
|
||||||
|
# Create Discord app
|
||||||
|
discord_app, created = SocialApp.objects.get_or_create(
|
||||||
|
provider="discord",
|
||||||
|
defaults={
|
||||||
|
"name": "Discord",
|
||||||
|
"client_id": "1299112802274902047",
|
||||||
|
"secret": "ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if not created:
|
||||||
|
discord_app.client_id = "1299112802274902047"
|
||||||
|
discord_app.secret = "ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11"
|
||||||
|
discord_app.save()
|
||||||
|
discord_app.sites.add(site)
|
||||||
|
self.stdout.write(f'{"Created" if created else "Updated"} Discord app')
|
||||||
|
|
||||||
|
# Create Google app
|
||||||
|
google_app, created = SocialApp.objects.get_or_create(
|
||||||
|
provider="google",
|
||||||
|
defaults={
|
||||||
|
"name": "Google",
|
||||||
|
"client_id": (
|
||||||
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2."
|
||||||
|
"apps.googleusercontent.com"
|
||||||
|
),
|
||||||
|
"secret": "GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
if not created:
|
||||||
|
google_app.client_id = (
|
||||||
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2."
|
||||||
|
"apps.googleusercontent.com"
|
||||||
|
)
|
||||||
|
google_app.secret = "GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue"
|
||||||
|
google_app.save()
|
||||||
|
google_app.sites.add(site)
|
||||||
|
self.stdout.write(f'{"Created" if created else "Updated"} Google app')
|
||||||
@@ -1,8 +1,5 @@
|
|||||||
from django.core.management.base import BaseCommand
|
from django.core.management.base import BaseCommand
|
||||||
from django.contrib.auth import get_user_model
|
from django.contrib.auth.models import Group, Permission, User
|
||||||
from django.contrib.auth.models import Group, Permission
|
|
||||||
|
|
||||||
User = get_user_model()
|
|
||||||
|
|
||||||
|
|
||||||
class Command(BaseCommand):
|
class Command(BaseCommand):
|
||||||
@@ -11,22 +8,25 @@ class Command(BaseCommand):
|
|||||||
def handle(self, *args, **kwargs):
|
def handle(self, *args, **kwargs):
|
||||||
# Create regular test user
|
# Create regular test user
|
||||||
if not User.objects.filter(username="testuser").exists():
|
if not User.objects.filter(username="testuser").exists():
|
||||||
user = User.objects.create_user(
|
user = User.objects.create(
|
||||||
username="testuser",
|
username="testuser",
|
||||||
email="testuser@example.com",
|
email="testuser@example.com",
|
||||||
[PASSWORD-REMOVED]",
|
|
||||||
)
|
)
|
||||||
self.stdout.write(self.style.SUCCESS(f"Created test user: {user.username}"))
|
user.set_password("testpass123")
|
||||||
|
user.save()
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(f"Created test user: {user.get_username()}")
|
||||||
|
)
|
||||||
else:
|
else:
|
||||||
self.stdout.write(self.style.WARNING("Test user already exists"))
|
self.stdout.write(self.style.WARNING("Test user already exists"))
|
||||||
|
|
||||||
# Create moderator user
|
|
||||||
if not User.objects.filter(username="moderator").exists():
|
if not User.objects.filter(username="moderator").exists():
|
||||||
moderator = User.objects.create_user(
|
moderator = User.objects.create(
|
||||||
username="moderator",
|
username="moderator",
|
||||||
email="moderator@example.com",
|
email="moderator@example.com",
|
||||||
[PASSWORD-REMOVED]",
|
|
||||||
)
|
)
|
||||||
|
moderator.set_password("modpass123")
|
||||||
|
moderator.save()
|
||||||
|
|
||||||
# Create moderator group if it doesn't exist
|
# Create moderator group if it doesn't exist
|
||||||
moderator_group, created = Group.objects.get_or_create(name="Moderators")
|
moderator_group, created = Group.objects.get_or_create(name="Moderators")
|
||||||
@@ -48,7 +48,9 @@ class Command(BaseCommand):
|
|||||||
moderator.groups.add(moderator_group)
|
moderator.groups.add(moderator_group)
|
||||||
|
|
||||||
self.stdout.write(
|
self.stdout.write(
|
||||||
self.style.SUCCESS(f"Created moderator user: {moderator.username}")
|
self.style.SUCCESS(
|
||||||
|
f"Created moderator user: {moderator.get_username()}"
|
||||||
|
)
|
||||||
)
|
)
|
||||||
else:
|
else:
|
||||||
self.stdout.write(self.style.WARNING("Moderator user already exists"))
|
self.stdout.write(self.style.WARNING("Moderator user already exists"))
|
||||||
@@ -0,0 +1,18 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Fix migration history by removing rides.0001_initial"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM django_migrations WHERE app='rides' "
|
||||||
|
"AND name='0001_initial';"
|
||||||
|
)
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
"Successfully removed rides.0001_initial from migration history"
|
||||||
|
)
|
||||||
|
)
|
||||||
41
backend/apps/accounts/management/commands/fix_social_apps.py
Normal file
41
backend/apps/accounts/management/commands/fix_social_apps.py
Normal file
@@ -0,0 +1,41 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from allauth.socialaccount.models import SocialApp
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
import os
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Fix social app configurations"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
# Delete all existing social apps
|
||||||
|
SocialApp.objects.all().delete()
|
||||||
|
self.stdout.write("Deleted all existing social apps")
|
||||||
|
|
||||||
|
# Get the default site
|
||||||
|
site = Site.objects.get(id=1)
|
||||||
|
|
||||||
|
# Create Google provider
|
||||||
|
google_app = SocialApp.objects.create(
|
||||||
|
provider="google",
|
||||||
|
name="Google",
|
||||||
|
client_id=os.getenv("GOOGLE_CLIENT_ID"),
|
||||||
|
secret=os.getenv("GOOGLE_CLIENT_SECRET"),
|
||||||
|
)
|
||||||
|
google_app.sites.add(site)
|
||||||
|
self.stdout.write(
|
||||||
|
f"Created Google app with client_id: {
|
||||||
|
google_app.client_id}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create Discord provider
|
||||||
|
discord_app = SocialApp.objects.create(
|
||||||
|
provider="discord",
|
||||||
|
name="Discord",
|
||||||
|
client_id=os.getenv("DISCORD_CLIENT_ID"),
|
||||||
|
secret=os.getenv("DISCORD_CLIENT_SECRET"),
|
||||||
|
)
|
||||||
|
discord_app.sites.add(site)
|
||||||
|
self.stdout.write(
|
||||||
|
f"Created Discord app with client_id: {discord_app.client_id}"
|
||||||
|
)
|
||||||
@@ -2,6 +2,7 @@ from django.core.management.base import BaseCommand
|
|||||||
from PIL import Image, ImageDraw, ImageFont
|
from PIL import Image, ImageDraw, ImageFont
|
||||||
import os
|
import os
|
||||||
|
|
||||||
|
|
||||||
def generate_avatar(letter):
|
def generate_avatar(letter):
|
||||||
"""Generate an avatar for a given letter or number"""
|
"""Generate an avatar for a given letter or number"""
|
||||||
avatar_size = (100, 100)
|
avatar_size = (100, 100)
|
||||||
@@ -10,7 +11,7 @@ def generate_avatar(letter):
|
|||||||
font_size = 100
|
font_size = 100
|
||||||
|
|
||||||
# Create a blank image with background color
|
# Create a blank image with background color
|
||||||
image = Image.new('RGB', avatar_size, background_color)
|
image = Image.new("RGB", avatar_size, background_color)
|
||||||
draw = ImageDraw.Draw(image)
|
draw = ImageDraw.Draw(image)
|
||||||
|
|
||||||
# Load a font
|
# Load a font
|
||||||
@@ -19,8 +20,14 @@ def generate_avatar(letter):
|
|||||||
|
|
||||||
# Calculate text size and position using textbbox
|
# Calculate text size and position using textbbox
|
||||||
text_bbox = draw.textbbox((0, 0), letter, font=font)
|
text_bbox = draw.textbbox((0, 0), letter, font=font)
|
||||||
text_width, text_height = text_bbox[2] - text_bbox[0], text_bbox[3] - text_bbox[1]
|
text_width, text_height = (
|
||||||
text_position = ((avatar_size[0] - text_width) / 2, (avatar_size[1] - text_height) / 2)
|
text_bbox[2] - text_bbox[0],
|
||||||
|
text_bbox[3] - text_bbox[1],
|
||||||
|
)
|
||||||
|
text_position = (
|
||||||
|
(avatar_size[0] - text_width) / 2,
|
||||||
|
(avatar_size[1] - text_height) / 2,
|
||||||
|
)
|
||||||
|
|
||||||
# Draw the text on the image
|
# Draw the text on the image
|
||||||
draw.text(text_position, letter, font=font, fill=text_color)
|
draw.text(text_position, letter, font=font, fill=text_color)
|
||||||
@@ -34,11 +41,14 @@ def generate_avatar(letter):
|
|||||||
avatar_path = os.path.join(avatar_dir, f"{letter}_avatar.png")
|
avatar_path = os.path.join(avatar_dir, f"{letter}_avatar.png")
|
||||||
image.save(avatar_path)
|
image.save(avatar_path)
|
||||||
|
|
||||||
|
|
||||||
class Command(BaseCommand):
|
class Command(BaseCommand):
|
||||||
help = 'Generate avatars for letters A-Z and numbers 0-9'
|
help = "Generate avatars for letters A-Z and numbers 0-9"
|
||||||
|
|
||||||
def handle(self, *args, **kwargs):
|
def handle(self, *args, **kwargs):
|
||||||
characters = [chr(i) for i in range(65, 91)] + [str(i) for i in range(10)] # A-Z and 0-9
|
characters = [chr(i) for i in range(65, 91)] + [
|
||||||
|
str(i) for i in range(10)
|
||||||
|
] # A-Z and 0-9
|
||||||
for char in characters:
|
for char in characters:
|
||||||
generate_avatar(char)
|
generate_avatar(char)
|
||||||
self.stdout.write(self.style.SUCCESS(f"Generated avatar for {char}"))
|
self.stdout.write(self.style.SUCCESS(f"Generated avatar for {char}"))
|
||||||
@@ -0,0 +1,18 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from apps.accounts.models import UserProfile
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Regenerate default avatars for users without an uploaded avatar"
|
||||||
|
|
||||||
|
def handle(self, *args, **kwargs):
|
||||||
|
profiles = UserProfile.objects.filter(avatar="")
|
||||||
|
for profile in profiles:
|
||||||
|
# This will trigger the avatar generation logic in the save method
|
||||||
|
profile.save()
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS(
|
||||||
|
f"Regenerated avatar for {
|
||||||
|
profile.user.username}"
|
||||||
|
)
|
||||||
|
)
|
||||||
@@ -3,66 +3,87 @@ from django.db import connection
|
|||||||
from django.contrib.auth.hashers import make_password
|
from django.contrib.auth.hashers import make_password
|
||||||
import uuid
|
import uuid
|
||||||
|
|
||||||
|
|
||||||
class Command(BaseCommand):
|
class Command(BaseCommand):
|
||||||
help = 'Reset database and create admin user'
|
help = "Reset database and create admin user"
|
||||||
|
|
||||||
def handle(self, *args, **options):
|
def handle(self, *args, **options):
|
||||||
self.stdout.write('Resetting database...')
|
self.stdout.write("Resetting database...")
|
||||||
|
|
||||||
# Drop all tables
|
# Drop all tables
|
||||||
with connection.cursor() as cursor:
|
with connection.cursor() as cursor:
|
||||||
cursor.execute("""
|
cursor.execute(
|
||||||
|
"""
|
||||||
DO $$ DECLARE
|
DO $$ DECLARE
|
||||||
r RECORD;
|
r RECORD;
|
||||||
BEGIN
|
BEGIN
|
||||||
FOR r IN (SELECT tablename FROM pg_tables WHERE schemaname = current_schema()) LOOP
|
FOR r IN (
|
||||||
EXECUTE 'DROP TABLE IF EXISTS ' || quote_ident(r.tablename) || ' CASCADE';
|
SELECT tablename FROM pg_tables
|
||||||
|
WHERE schemaname = current_schema()
|
||||||
|
) LOOP
|
||||||
|
EXECUTE 'DROP TABLE IF EXISTS ' || \
|
||||||
|
quote_ident(r.tablename) || ' CASCADE';
|
||||||
END LOOP;
|
END LOOP;
|
||||||
END $$;
|
END $$;
|
||||||
""")
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
# Reset sequences
|
# Reset sequences
|
||||||
cursor.execute("""
|
cursor.execute(
|
||||||
|
"""
|
||||||
DO $$ DECLARE
|
DO $$ DECLARE
|
||||||
r RECORD;
|
r RECORD;
|
||||||
BEGIN
|
BEGIN
|
||||||
FOR r IN (SELECT sequencename FROM pg_sequences WHERE schemaname = current_schema()) LOOP
|
FOR r IN (
|
||||||
EXECUTE 'ALTER SEQUENCE ' || quote_ident(r.sequencename) || ' RESTART WITH 1';
|
SELECT sequencename FROM pg_sequences
|
||||||
|
WHERE schemaname = current_schema()
|
||||||
|
) LOOP
|
||||||
|
EXECUTE 'ALTER SEQUENCE ' || \
|
||||||
|
quote_ident(r.sequencename) || ' RESTART WITH 1';
|
||||||
END LOOP;
|
END LOOP;
|
||||||
END $$;
|
END $$;
|
||||||
""")
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
self.stdout.write('All tables dropped and sequences reset.')
|
self.stdout.write("All tables dropped and sequences reset.")
|
||||||
|
|
||||||
# Run migrations
|
# Run migrations
|
||||||
from django.core.management import call_command
|
from django.core.management import call_command
|
||||||
call_command('migrate')
|
|
||||||
|
|
||||||
self.stdout.write('Migrations applied.')
|
call_command("migrate")
|
||||||
|
|
||||||
|
self.stdout.write("Migrations applied.")
|
||||||
|
|
||||||
# Create superuser using raw SQL
|
# Create superuser using raw SQL
|
||||||
try:
|
try:
|
||||||
with connection.cursor() as cursor:
|
with connection.cursor() as cursor:
|
||||||
# Create user
|
# Create user
|
||||||
user_id = str(uuid.uuid4())[:10]
|
user_id = str(uuid.uuid4())[:10]
|
||||||
cursor.execute("""
|
cursor.execute(
|
||||||
|
"""
|
||||||
INSERT INTO accounts_user (
|
INSERT INTO accounts_user (
|
||||||
username, password, email, is_superuser, is_staff,
|
username, password, email, is_superuser, is_staff,
|
||||||
is_active, date_joined, user_id, first_name,
|
is_active, date_joined, user_id, first_name,
|
||||||
last_name, role, is_banned, ban_reason,
|
last_name, role, is_banned, ban_reason,
|
||||||
theme_preference
|
theme_preference
|
||||||
) VALUES (
|
) VALUES (
|
||||||
'admin', %s, 'admin@thrillwiki.com', true, true,
|
'admin', %s, 'admin@thrillwiki.com', true, true,
|
||||||
true, NOW(), %s, '', '', 'SUPERUSER', false, '',
|
true, NOW(), %s, '', '', 'SUPERUSER', false, '',
|
||||||
'light'
|
'light'
|
||||||
) RETURNING id;
|
) RETURNING id;
|
||||||
""", [make_password('admin'), user_id])
|
""",
|
||||||
|
[make_password("admin"), user_id],
|
||||||
user_db_id = cursor.fetchone()[0]
|
)
|
||||||
|
|
||||||
|
result = cursor.fetchone()
|
||||||
|
if result is None:
|
||||||
|
raise Exception("Failed to create user - no ID returned")
|
||||||
|
user_db_id = result[0]
|
||||||
|
|
||||||
# Create profile
|
# Create profile
|
||||||
profile_id = str(uuid.uuid4())[:10]
|
profile_id = str(uuid.uuid4())[:10]
|
||||||
cursor.execute("""
|
cursor.execute(
|
||||||
|
"""
|
||||||
INSERT INTO accounts_userprofile (
|
INSERT INTO accounts_userprofile (
|
||||||
profile_id, display_name, pronouns, bio,
|
profile_id, display_name, pronouns, bio,
|
||||||
twitter, instagram, youtube, discord,
|
twitter, instagram, youtube, discord,
|
||||||
@@ -75,11 +96,18 @@ class Command(BaseCommand):
|
|||||||
0, 0, 0, 0,
|
0, 0, 0, 0,
|
||||||
%s, ''
|
%s, ''
|
||||||
);
|
);
|
||||||
""", [profile_id, user_db_id])
|
""",
|
||||||
|
[profile_id, user_db_id],
|
||||||
|
)
|
||||||
|
|
||||||
self.stdout.write('Superuser created.')
|
self.stdout.write("Superuser created.")
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
self.stdout.write(self.style.ERROR(f'Error creating superuser: {str(e)}'))
|
self.stdout.write(
|
||||||
|
self.style.ERROR(
|
||||||
|
f"Error creating superuser: {
|
||||||
|
str(e)}"
|
||||||
|
)
|
||||||
|
)
|
||||||
raise
|
raise
|
||||||
|
|
||||||
self.stdout.write(self.style.SUCCESS('Database reset complete.'))
|
self.stdout.write(self.style.SUCCESS("Database reset complete."))
|
||||||
@@ -3,34 +3,37 @@ from allauth.socialaccount.models import SocialApp
|
|||||||
from django.contrib.sites.models import Site
|
from django.contrib.sites.models import Site
|
||||||
from django.db import connection
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
class Command(BaseCommand):
|
class Command(BaseCommand):
|
||||||
help = 'Reset social apps configuration'
|
help = "Reset social apps configuration"
|
||||||
|
|
||||||
def handle(self, *args, **options):
|
def handle(self, *args, **options):
|
||||||
# Delete all social apps using raw SQL to bypass Django's ORM
|
# Delete all social apps using raw SQL to bypass Django's ORM
|
||||||
with connection.cursor() as cursor:
|
with connection.cursor() as cursor:
|
||||||
cursor.execute("DELETE FROM socialaccount_socialapp_sites")
|
cursor.execute("DELETE FROM socialaccount_socialapp_sites")
|
||||||
cursor.execute("DELETE FROM socialaccount_socialapp")
|
cursor.execute("DELETE FROM socialaccount_socialapp")
|
||||||
|
|
||||||
# Get the default site
|
# Get the default site
|
||||||
site = Site.objects.get(id=1)
|
site = Site.objects.get(id=1)
|
||||||
|
|
||||||
# Create Discord app
|
# Create Discord app
|
||||||
discord_app = SocialApp.objects.create(
|
discord_app = SocialApp.objects.create(
|
||||||
provider='discord',
|
provider="discord",
|
||||||
name='Discord',
|
name="Discord",
|
||||||
client_id='1299112802274902047',
|
client_id="1299112802274902047",
|
||||||
secret='ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11',
|
secret="ece7Pe_M4mD4mYzAgcINjTEKL_3ftL11",
|
||||||
)
|
)
|
||||||
discord_app.sites.add(site)
|
discord_app.sites.add(site)
|
||||||
self.stdout.write(f'Created Discord app with ID: {discord_app.id}')
|
self.stdout.write(f"Created Discord app with ID: {discord_app.pk}")
|
||||||
|
|
||||||
# Create Google app
|
# Create Google app
|
||||||
google_app = SocialApp.objects.create(
|
google_app = SocialApp.objects.create(
|
||||||
provider='google',
|
provider="google",
|
||||||
name='Google',
|
name="Google",
|
||||||
client_id='135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com',
|
client_id=(
|
||||||
secret='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',
|
"135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com"
|
||||||
|
),
|
||||||
|
secret="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",
|
||||||
)
|
)
|
||||||
google_app.sites.add(site)
|
google_app.sites.add(site)
|
||||||
self.stdout.write(f'Created Google app with ID: {google_app.id}')
|
self.stdout.write(f"Created Google app with ID: {google_app.pk}")
|
||||||
@@ -0,0 +1,24 @@
|
|||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.db import connection
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Reset social auth configuration"
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
# Delete all social apps
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp")
|
||||||
|
cursor.execute("DELETE FROM socialaccount_socialapp_sites")
|
||||||
|
|
||||||
|
# Reset sequences
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp'"
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"DELETE FROM sqlite_sequence WHERE name='socialaccount_socialapp_sites'"
|
||||||
|
)
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("Successfully reset social auth configuration")
|
||||||
|
)
|
||||||
@@ -1,26 +1,26 @@
 from django.core.management.base import BaseCommand
-from django.contrib.auth.models import Group, Permission
-from django.contrib.contenttypes.models import ContentType
-from accounts.models import User
-from accounts.signals import create_default_groups
+from django.contrib.auth.models import Group
+from apps.accounts.models import User
+from apps.accounts.signals import create_default_groups


 class Command(BaseCommand):
-    help = 'Set up default groups and permissions for user roles'
+    help = "Set up default groups and permissions for user roles"

     def handle(self, *args, **options):
-        self.stdout.write('Creating default groups and permissions...')
+        self.stdout.write("Creating default groups and permissions...")

         try:
             # Create default groups with permissions
             create_default_groups()

             # Sync existing users with groups based on their roles
             users = User.objects.exclude(role=User.Roles.USER)
             for user in users:
                 group = Group.objects.filter(name=user.role).first()
                 if group:
                     user.groups.add(group)

                 # Update staff/superuser status based on role
                 if user.role == User.Roles.SUPERUSER:
                     user.is_superuser = True
@@ -28,15 +28,22 @@ class Command(BaseCommand):
                 elif user.role in [User.Roles.ADMIN, User.Roles.MODERATOR]:
                     user.is_staff = True
                 user.save()

-            self.stdout.write(self.style.SUCCESS('Successfully set up groups and permissions'))
+            self.stdout.write(
+                self.style.SUCCESS("Successfully set up groups and permissions")
+            )

             # Print summary
             for group in Group.objects.all():
-                self.stdout.write(f'\nGroup: {group.name}')
-                self.stdout.write('Permissions:')
+                self.stdout.write(f"\nGroup: {group.name}")
+                self.stdout.write("Permissions:")
                 for perm in group.permissions.all():
-                    self.stdout.write(f' - {perm.codename}')
+                    self.stdout.write(f" - {perm.codename}")

         except Exception as e:
-            self.stdout.write(self.style.ERROR(f'Error setting up groups: {str(e)}'))
+            self.stdout.write(
+                self.style.ERROR(
+                    f"Error setting up groups: {str(e)}"
+                )
+            )
@@ -1,17 +1,16 @@
 from django.core.management.base import BaseCommand
 from django.contrib.sites.models import Site


 class Command(BaseCommand):
-    help = 'Set up default site'
+    help = "Set up default site"

     def handle(self, *args, **options):
         # Delete any existing sites
         Site.objects.all().delete()

         # Create default site
         site = Site.objects.create(
-            id=1,
-            domain='localhost:8000',
-            name='ThrillWiki Development'
+            id=1, domain="localhost:8000", name="ThrillWiki Development"
         )
-        self.stdout.write(self.style.SUCCESS(f'Created site: {site.domain}'))
+        self.stdout.write(self.style.SUCCESS(f"Created site: {site.domain}"))
backend/apps/accounts/management/commands/setup_social_auth.py (new file, 126 lines)
@@ -0,0 +1,126 @@
from django.core.management.base import BaseCommand
from django.contrib.sites.models import Site
from allauth.socialaccount.models import SocialApp
from dotenv import load_dotenv
import os


class Command(BaseCommand):
    help = "Sets up social authentication apps"

    def handle(self, *args, **kwargs):
        # Load environment variables
        load_dotenv()

        # Get environment variables
        google_client_id = os.getenv("GOOGLE_CLIENT_ID")
        google_client_secret = os.getenv("GOOGLE_CLIENT_SECRET")
        discord_client_id = os.getenv("DISCORD_CLIENT_ID")
        discord_client_secret = os.getenv("DISCORD_CLIENT_SECRET")

        # DEBUG: Log environment variable values
        self.stdout.write(
            f"DEBUG: google_client_id type: {type(google_client_id)}, value: {google_client_id}"
        )
        self.stdout.write(
            f"DEBUG: google_client_secret type: {type(google_client_secret)}, value: {google_client_secret}"
        )
        self.stdout.write(
            f"DEBUG: discord_client_id type: {type(discord_client_id)}, value: {discord_client_id}"
        )
        self.stdout.write(
            f"DEBUG: discord_client_secret type: {type(discord_client_secret)}, value: {discord_client_secret}"
        )

        if not all(
            [
                google_client_id,
                google_client_secret,
                discord_client_id,
                discord_client_secret,
            ]
        ):
            self.stdout.write(
                self.style.ERROR("Missing required environment variables")
            )
            self.stdout.write(
                f"DEBUG: google_client_id is None: {google_client_id is None}"
            )
            self.stdout.write(
                f"DEBUG: google_client_secret is None: {google_client_secret is None}"
            )
            self.stdout.write(
                f"DEBUG: discord_client_id is None: {discord_client_id is None}"
            )
            self.stdout.write(
                f"DEBUG: discord_client_secret is None: {discord_client_secret is None}"
            )
            return

        # Get or create the default site
        site, _ = Site.objects.get_or_create(
            id=1, defaults={"domain": "localhost:8000", "name": "localhost"}
        )

        # Set up Google
        google_app, created = SocialApp.objects.get_or_create(
            provider="google",
            defaults={
                "name": "Google",
                "client_id": google_client_id,
                "secret": google_client_secret,
            },
        )
        if not created:
            self.stdout.write(
                f"DEBUG: About to assign google_client_id: {google_client_id} (type: {type(google_client_id)})"
            )
            if google_client_id is not None and google_client_secret is not None:
                google_app.client_id = google_client_id
                google_app.secret = google_client_secret
                google_app.save()
                self.stdout.write("DEBUG: Successfully updated Google app")
            else:
                self.stdout.write(
                    self.style.ERROR(
                        "Google client_id or secret is None, skipping update."
                    )
                )
        google_app.sites.add(site)

        # Set up Discord
        discord_app, created = SocialApp.objects.get_or_create(
            provider="discord",
            defaults={
                "name": "Discord",
                "client_id": discord_client_id,
                "secret": discord_client_secret,
            },
        )
        if not created:
            self.stdout.write(
                f"DEBUG: About to assign discord_client_id: {discord_client_id} (type: {type(discord_client_id)})"
            )
            if discord_client_id is not None and discord_client_secret is not None:
                discord_app.client_id = discord_client_id
                discord_app.secret = discord_client_secret
                discord_app.save()
                self.stdout.write("DEBUG: Successfully updated Discord app")
            else:
                self.stdout.write(
                    self.style.ERROR(
                        "Discord client_id or secret is None, skipping update."
                    )
                )
        discord_app.sites.add(site)

        self.stdout.write(self.style.SUCCESS("Successfully set up social auth apps"))
@@ -1,35 +1,43 @@
 from django.core.management.base import BaseCommand
 from django.contrib.sites.models import Site
 from django.contrib.auth import get_user_model
-from django.contrib.auth.models import Permission
-from allauth.socialaccount.models import SocialApp

 User = get_user_model()


 class Command(BaseCommand):
-    help = 'Set up social authentication through admin interface'
+    help = "Set up social authentication through admin interface"

     def handle(self, *args, **options):
         # Get or create the default site
         site, _ = Site.objects.get_or_create(
             id=1,
             defaults={
-                'domain': 'localhost:8000',
-                'name': 'ThrillWiki Development'
-            }
+                "domain": "localhost:8000",
+                "name": "ThrillWiki Development",
+            },
         )
         if not _:
-            site.domain = 'localhost:8000'
-            site.name = 'ThrillWiki Development'
+            site.domain = "localhost:8000"
+            site.name = "ThrillWiki Development"
             site.save()
         self.stdout.write(f'{"Created" if _ else "Updated"} site: {site.domain}')

         # Create superuser if it doesn't exist
-        if not User.objects.filter(username='admin').exists():
-            User.objects.create_superuser('admin', 'admin@example.com', 'admin')
-            self.stdout.write('Created superuser: admin/admin')
+        if not User.objects.filter(username="admin").exists():
+            admin_user = User.objects.create(
+                username="admin",
+                email="admin@example.com",
+                is_staff=True,
+                is_superuser=True,
+            )
+            admin_user.set_password("admin")
+            admin_user.save()
+            self.stdout.write("Created superuser: admin/admin")

-        self.stdout.write(self.style.SUCCESS('''
+        self.stdout.write(
+            self.style.SUCCESS(
+                """
 Social auth setup instructions:

 1. Run the development server:
@@ -57,4 +65,6 @@ Social auth setup instructions:
 Client id: 135166769591-nopcgmo0fkqfqfs9qe783a137mtmcrt2.apps.googleusercontent.com
 Secret key: GOCSPX-Wd_0Ue0Ue0Ue0Ue0Ue0Ue0Ue0Ue
 Sites: Add "localhost:8000"
-'''))
+"""
+            )
+        )
@@ -0,0 +1,51 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.contrib.sites.models import Site


class Command(BaseCommand):
    help = 'Set up social authentication providers for development'

    def handle(self, *args, **options):
        # Get the current site
        site = Site.objects.get_current()
        self.stdout.write(f'Setting up social providers for site: {site}')

        # Clear existing social apps to avoid duplicates
        deleted_count = SocialApp.objects.all().delete()[0]
        self.stdout.write(f'Cleared {deleted_count} existing social apps')

        # Create Google social app
        google_app = SocialApp.objects.create(
            provider='google',
            name='Google',
            client_id='demo-google-client-id.apps.googleusercontent.com',
            secret='demo-google-client-secret',
            key='',
        )
        google_app.sites.add(site)
        self.stdout.write(
            self.style.SUCCESS('✅ Created Google social app')
        )

        # Create Discord social app
        discord_app = SocialApp.objects.create(
            provider='discord',
            name='Discord',
            client_id='demo-discord-client-id',
            secret='demo-discord-client-secret',
            key='',
        )
        discord_app.sites.add(site)
        self.stdout.write(
            self.style.SUCCESS('✅ Created Discord social app')
        )

        # List all social apps
        self.stdout.write('\nConfigured social apps:')
        for app in SocialApp.objects.all():
            self.stdout.write(f'- {app.name} ({app.provider}): {app.client_id}')

        self.stdout.write(
            self.style.SUCCESS(f'\nTotal social apps: {SocialApp.objects.count()}')
        )
@@ -0,0 +1,61 @@
from django.core.management.base import BaseCommand
from django.test import Client
from allauth.socialaccount.models import SocialApp


class Command(BaseCommand):
    help = "Test Discord OAuth2 authentication flow"

    def handle(self, *args, **options):
        client = Client(HTTP_HOST="localhost:8000")

        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider="discord")
            self.stdout.write("Found Discord app configuration:")
            self.stdout.write(f"Client ID: {discord_app.client_id}")

            # Test login URL
            login_url = "/accounts/discord/login/"
            response = client.get(login_url, HTTP_HOST="localhost:8000")
            self.stdout.write(f"\nTesting login URL: {login_url}")
            self.stdout.write(f"Status code: {response.status_code}")

            if response.status_code == 302:
                redirect_url = response["Location"]
                self.stdout.write(f"Redirects to: {redirect_url}")

                # Parse OAuth2 parameters
                self.stdout.write("\nOAuth2 Parameters:")
                if "client_id=" in redirect_url:
                    self.stdout.write("✓ client_id parameter present")
                if "redirect_uri=" in redirect_url:
                    self.stdout.write("✓ redirect_uri parameter present")
                if "scope=" in redirect_url:
                    self.stdout.write("✓ scope parameter present")
                if "response_type=" in redirect_url:
                    self.stdout.write("✓ response_type parameter present")
                if "code_challenge=" in redirect_url:
                    self.stdout.write("✓ PKCE enabled (code_challenge present)")

            # Show callback URL
            callback_url = "http://localhost:8000/accounts/discord/login/callback/"
            self.stdout.write(
                "\nCallback URL to configure in Discord Developer Portal:"
            )
            self.stdout.write(callback_url)

            # Show frontend login URL
            frontend_url = "http://localhost:5173"
            self.stdout.write("\nFrontend configuration:")
            self.stdout.write(f"Frontend URL: {frontend_url}")
            self.stdout.write("Discord login button should use:")
            self.stdout.write("/accounts/discord/login/?process=login")

            # Show allauth URLs
            self.stdout.write("\nAllauth URLs:")
            self.stdout.write("Login URL: /accounts/discord/login/?process=login")
            self.stdout.write("Callback URL: /accounts/discord/login/callback/")

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR("Discord app not found"))
@@ -2,19 +2,22 @@ from django.core.management.base import BaseCommand
 from allauth.socialaccount.models import SocialApp
 from django.contrib.sites.models import Site


 class Command(BaseCommand):
-    help = 'Update social apps to be associated with all sites'
+    help = "Update social apps to be associated with all sites"

     def handle(self, *args, **options):
         # Get all sites
         sites = Site.objects.all()

         # Update each social app
         for app in SocialApp.objects.all():
-            self.stdout.write(f'Updating {app.provider} app...')
+            self.stdout.write(f"Updating {app.provider} app...")
             # Clear existing sites
             app.sites.clear()
             # Add all sites
             for site in sites:
                 app.sites.add(site)
-            self.stdout.write(f'Added sites: {", ".join(site.domain for site in sites)}')
+            self.stdout.write(
+                f'Added sites: {", ".join(site.domain for site in sites)}'
+            )
@@ -0,0 +1,42 @@
from django.core.management.base import BaseCommand
from allauth.socialaccount.models import SocialApp
from django.conf import settings


class Command(BaseCommand):
    help = "Verify Discord OAuth2 settings"

    def handle(self, *args, **options):
        # Get Discord app
        try:
            discord_app = SocialApp.objects.get(provider="discord")
            self.stdout.write("Found Discord app configuration:")
            self.stdout.write(f"Client ID: {discord_app.client_id}")
            self.stdout.write(f"Secret: {discord_app.secret}")

            # Get sites
            sites = discord_app.sites.all()
            self.stdout.write("\nAssociated sites:")
            for site in sites:
                self.stdout.write(f"- {site.domain} ({site.name})")

            # Show callback URL
            callback_url = "http://localhost:8000/accounts/discord/login/callback/"
            self.stdout.write(
                "\nCallback URL to configure in Discord Developer Portal:"
            )
            self.stdout.write(callback_url)

            # Show OAuth2 settings
            self.stdout.write("\nOAuth2 settings in settings.py:")
            discord_settings = settings.SOCIALACCOUNT_PROVIDERS.get("discord", {})
            self.stdout.write(
                f'PKCE Enabled: {discord_settings.get("OAUTH_PKCE_ENABLED", False)}'
            )
            self.stdout.write(f'Scopes: {discord_settings.get("SCOPE", [])}')

        except SocialApp.DoesNotExist:
            self.stdout.write(self.style.ERROR("Discord app not found"))
@@ -33,7 +33,10 @@ class Migration(migrations.Migration):
                         verbose_name="ID",
                     ),
                 ),
-                ("password", models.CharField(max_length=128, verbose_name="password")),
+                (
+                    "password",
+                    models.CharField(max_length=128, verbose_name="password"),
+                ),
                 (
                     "last_login",
                     models.DateTimeField(
@@ -78,7 +81,9 @@ class Migration(migrations.Migration):
                 (
                     "email",
                     models.EmailField(
-                        blank=True, max_length=254, verbose_name="email address"
+                        blank=True,
+                        max_length=254,
+                        verbose_name="email address",
                     ),
                 ),
                 (
@@ -100,7 +105,8 @@ class Migration(migrations.Migration):
                 (
                     "date_joined",
                     models.DateTimeField(
-                        default=django.utils.timezone.now, verbose_name="date joined"
+                        default=django.utils.timezone.now,
+                        verbose_name="date joined",
                     ),
                 ),
                 (
@@ -274,7 +280,10 @@ class Migration(migrations.Migration):
         migrations.CreateModel(
             name="TopListEvent",
             fields=[
-                ("pgh_id", models.AutoField(primary_key=True, serialize=False)),
+                (
+                    "pgh_id",
+                    models.AutoField(primary_key=True, serialize=False),
+                ),
                 ("pgh_created_at", models.DateTimeField(auto_now_add=True)),
                 ("pgh_label", models.TextField(help_text="The event label.")),
                 ("id", models.BigIntegerField()),
@@ -369,7 +378,10 @@ class Migration(migrations.Migration):
         migrations.CreateModel(
             name="TopListItemEvent",
             fields=[
-                ("pgh_id", models.AutoField(primary_key=True, serialize=False)),
+                (
+                    "pgh_id",
+                    models.AutoField(primary_key=True, serialize=False),
+                ),
                 ("pgh_created_at", models.DateTimeField(auto_now_add=True)),
                 ("pgh_label", models.TextField(help_text="The event label.")),
                 ("id", models.BigIntegerField()),
@@ -451,7 +463,10 @@ class Migration(migrations.Migration):
                         unique=True,
                     ),
                 ),
-                ("avatar", models.ImageField(blank=True, upload_to="avatars/")),
+                (
+                    "avatar",
+                    models.ImageField(blank=True, upload_to="avatars/"),
+                ),
                 ("pronouns", models.CharField(blank=True, max_length=50)),
                 ("bio", models.TextField(blank=True, max_length=500)),
                 ("twitter", models.URLField(blank=True)),
@@ -0,0 +1,64 @@
# Generated by Django 5.2.5 on 2025-08-24 18:23

import pgtrigger.migrations
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ("accounts", "0001_initial"),
    ]

    operations = [
        migrations.RemoveField(
            model_name="toplistevent",
            name="pgh_context",
        ),
        migrations.RemoveField(
            model_name="toplistevent",
            name="pgh_obj",
        ),
        migrations.RemoveField(
            model_name="toplistevent",
            name="user",
        ),
        migrations.RemoveField(
            model_name="toplistitemevent",
            name="content_type",
        ),
        migrations.RemoveField(
            model_name="toplistitemevent",
            name="pgh_context",
        ),
        migrations.RemoveField(
            model_name="toplistitemevent",
            name="pgh_obj",
        ),
        migrations.RemoveField(
            model_name="toplistitemevent",
            name="top_list",
        ),
        pgtrigger.migrations.RemoveTrigger(
            model_name="toplist",
            name="insert_insert",
        ),
        pgtrigger.migrations.RemoveTrigger(
            model_name="toplist",
            name="update_update",
        ),
        pgtrigger.migrations.RemoveTrigger(
            model_name="toplistitem",
            name="insert_insert",
        ),
        pgtrigger.migrations.RemoveTrigger(
            model_name="toplistitem",
            name="update_update",
        ),
        migrations.DeleteModel(
            name="TopListEvent",
        ),
        migrations.DeleteModel(
            name="TopListItemEvent",
        ),
    ]
@@ -0,0 +1,439 @@
|
|||||||
|
# Generated by Django 5.2.5 on 2025-08-24 19:11
|
||||||
|
|
||||||
|
import django.contrib.auth.validators
|
||||||
|
import django.db.models.deletion
|
||||||
|
import django.utils.timezone
|
||||||
|
import pgtrigger.compiler
|
||||||
|
import pgtrigger.migrations
|
||||||
|
from django.conf import settings
|
||||||
|
from django.db import migrations, models
|
||||||
|
|
||||||
|
|
||||||
|
class Migration(migrations.Migration):
|
||||||
|
|
||||||
|
dependencies = [
|
||||||
|
("accounts", "0002_remove_toplistevent_pgh_context_and_more"),
|
||||||
|
("pghistory", "0007_auto_20250421_0444"),
|
||||||
|
]
|
||||||
|
|
||||||
|
operations = [
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="EmailVerificationEvent",
|
||||||
|
fields=[
|
||||||
|
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
("token", models.CharField(max_length=64)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("last_sent", models.DateTimeField(auto_now_add=True)),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="PasswordResetEvent",
|
||||||
|
fields=[
|
||||||
|
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
("token", models.CharField(max_length=64)),
|
||||||
|
("created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("expires_at", models.DateTimeField()),
|
||||||
|
("used", models.BooleanField(default=False)),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="UserEvent",
|
||||||
|
fields=[
|
||||||
|
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
("password", models.CharField(max_length=128, verbose_name="password")),
|
||||||
|
(
|
||||||
|
"last_login",
|
||||||
|
models.DateTimeField(
|
||||||
|
blank=True, null=True, verbose_name="last login"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_superuser",
|
||||||
|
models.BooleanField(
|
||||||
|
default=False,
|
||||||
|
help_text="Designates that this user has all permissions without explicitly assigning them.",
|
||||||
|
verbose_name="superuser status",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"username",
|
||||||
|
models.CharField(
|
||||||
|
error_messages={
|
||||||
|
"unique": "A user with that username already exists."
|
||||||
|
},
|
||||||
|
help_text="Required. 150 characters or fewer. Letters, digits and @/./+/-/_ only.",
|
||||||
|
max_length=150,
|
||||||
|
validators=[
|
||||||
|
django.contrib.auth.validators.UnicodeUsernameValidator()
|
||||||
|
],
|
||||||
|
verbose_name="username",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"first_name",
|
||||||
|
models.CharField(
|
||||||
|
blank=True, max_length=150, verbose_name="first name"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"last_name",
|
||||||
|
models.CharField(
|
||||||
|
blank=True, max_length=150, verbose_name="last name"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"email",
|
||||||
|
models.EmailField(
|
||||||
|
blank=True, max_length=254, verbose_name="email address"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_staff",
|
||||||
|
models.BooleanField(
|
||||||
|
default=False,
|
||||||
|
help_text="Designates whether the user can log into this admin site.",
|
||||||
|
verbose_name="staff status",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"is_active",
|
||||||
|
models.BooleanField(
|
||||||
|
default=True,
|
||||||
|
help_text="Designates whether this user should be treated as active. Unselect this instead of deleting accounts.",
|
||||||
|
verbose_name="active",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"date_joined",
|
||||||
|
models.DateTimeField(
|
||||||
|
default=django.utils.timezone.now, verbose_name="date joined"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"user_id",
|
||||||
|
models.CharField(
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this user that remains constant even if the username changes",
|
||||||
|
max_length=10,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"role",
|
||||||
|
models.CharField(
|
||||||
|
choices=[
|
||||||
|
("USER", "User"),
|
||||||
|
("MODERATOR", "Moderator"),
|
||||||
|
("ADMIN", "Admin"),
|
||||||
|
("SUPERUSER", "Superuser"),
|
||||||
|
],
|
||||||
|
default="USER",
|
||||||
|
max_length=10,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("is_banned", models.BooleanField(default=False)),
|
||||||
|
("ban_reason", models.TextField(blank=True)),
|
||||||
|
("ban_date", models.DateTimeField(blank=True, null=True)),
|
||||||
|
(
|
||||||
|
"pending_email",
|
||||||
|
models.EmailField(blank=True, max_length=254, null=True),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"theme_preference",
|
||||||
|
models.CharField(
|
||||||
|
choices=[("light", "Light"), ("dark", "Dark")],
|
||||||
|
default="light",
|
||||||
|
max_length=5,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
migrations.CreateModel(
|
||||||
|
name="UserProfileEvent",
|
||||||
|
fields=[
|
||||||
|
("pgh_id", models.AutoField(primary_key=True, serialize=False)),
|
||||||
|
("pgh_created_at", models.DateTimeField(auto_now_add=True)),
|
||||||
|
("pgh_label", models.TextField(help_text="The event label.")),
|
||||||
|
("id", models.BigIntegerField()),
|
||||||
|
(
|
||||||
|
"profile_id",
|
||||||
|
models.CharField(
|
||||||
|
editable=False,
|
||||||
|
help_text="Unique identifier for this profile that remains constant",
|
||||||
|
max_length=10,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
(
|
||||||
|
"display_name",
|
||||||
|
models.CharField(
|
||||||
|
help_text="This is the name that will be displayed on the site",
|
||||||
|
max_length=50,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
("avatar", models.ImageField(blank=True, upload_to="avatars/")),
|
||||||
|
("pronouns", models.CharField(blank=True, max_length=50)),
|
||||||
|
("bio", models.TextField(blank=True, max_length=500)),
|
||||||
|
("twitter", models.URLField(blank=True)),
|
||||||
|
("instagram", models.URLField(blank=True)),
|
||||||
|
("youtube", models.URLField(blank=True)),
|
||||||
|
("discord", models.CharField(blank=True, max_length=100)),
|
||||||
|
("coaster_credits", models.IntegerField(default=0)),
|
||||||
|
("dark_ride_credits", models.IntegerField(default=0)),
|
||||||
|
("flat_ride_credits", models.IntegerField(default=0)),
|
||||||
|
("water_ride_credits", models.IntegerField(default=0)),
|
||||||
|
],
|
||||||
|
options={
|
||||||
|
"abstract": False,
|
||||||
|
},
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="emailverification",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_emailverificationevent" ("created_at", "id", "last_sent", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "token", "user_id") VALUES (NEW."created_at", NEW."id", NEW."last_sent", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."token", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="c485bf0cd5bea8a05ef2d4ae309b60eff42abd84",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_53748",
|
||||||
|
table="accounts_emailverification",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="emailverification",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_emailverificationevent" ("created_at", "id", "last_sent", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "token", "user_id") VALUES (NEW."created_at", NEW."id", NEW."last_sent", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."token", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="c20942bdc0713db74310da8da8c3138ca4c3bba9",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_7a2a8",
|
||||||
|
table="accounts_emailverification",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="passwordreset",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_passwordresetevent" ("created_at", "expires_at", "id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "token", "used", "user_id") VALUES (NEW."created_at", NEW."expires_at", NEW."id", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."token", NEW."used", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="496ac059671b25460cdf2ca20d0e43b14d417a26",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_d2b72",
|
||||||
|
table="accounts_passwordreset",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="passwordreset",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_passwordresetevent" ("created_at", "expires_at", "id", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "token", "used", "user_id") VALUES (NEW."created_at", NEW."expires_at", NEW."id", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."token", NEW."used", NEW."user_id"); RETURN NULL;',
|
||||||
|
hash="c40acc416f85287b4a6fcc06724626707df90016",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_526d2",
|
||||||
|
table="accounts_passwordreset",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="user",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_userevent" ("ban_date", "ban_reason", "date_joined", "email", "first_name", "id", "is_active", "is_banned", "is_staff", "is_superuser", "last_login", "last_name", "password", "pending_email", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "role", "theme_preference", "user_id", "username") VALUES (NEW."ban_date", NEW."ban_reason", NEW."date_joined", NEW."email", NEW."first_name", NEW."id", NEW."is_active", NEW."is_banned", NEW."is_staff", NEW."is_superuser", NEW."last_login", NEW."last_name", NEW."password", NEW."pending_email", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."role", NEW."theme_preference", NEW."user_id", NEW."username"); RETURN NULL;',
|
||||||
|
hash="b6992f02a4c1135fef9527e3f1ed330e2e626267",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_3867c",
|
||||||
|
table="accounts_user",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="user",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_userevent" ("ban_date", "ban_reason", "date_joined", "email", "first_name", "id", "is_active", "is_banned", "is_staff", "is_superuser", "last_login", "last_name", "password", "pending_email", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "role", "theme_preference", "user_id", "username") VALUES (NEW."ban_date", NEW."ban_reason", NEW."date_joined", NEW."email", NEW."first_name", NEW."id", NEW."is_active", NEW."is_banned", NEW."is_staff", NEW."is_superuser", NEW."last_login", NEW."last_name", NEW."password", NEW."pending_email", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."role", NEW."theme_preference", NEW."user_id", NEW."username"); RETURN NULL;',
|
||||||
|
hash="6c3271b9f184dc137da7b9e42b0ae9f72d47c9c2",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_0e890",
|
||||||
|
table="accounts_user",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="userprofile",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="insert_insert",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
func='INSERT INTO "accounts_userprofileevent" ("avatar", "bio", "coaster_credits", "dark_ride_credits", "discord", "display_name", "flat_ride_credits", "id", "instagram", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "profile_id", "pronouns", "twitter", "user_id", "water_ride_credits", "youtube") VALUES (NEW."avatar", NEW."bio", NEW."coaster_credits", NEW."dark_ride_credits", NEW."discord", NEW."display_name", NEW."flat_ride_credits", NEW."id", NEW."instagram", _pgh_attach_context(), NOW(), \'insert\', NEW."id", NEW."profile_id", NEW."pronouns", NEW."twitter", NEW."user_id", NEW."water_ride_credits", NEW."youtube"); RETURN NULL;',
|
||||||
|
hash="af6a89f13ff879d978a1154bbcf4664de0fcf913",
|
||||||
|
operation="INSERT",
|
||||||
|
pgid="pgtrigger_insert_insert_c09d7",
|
||||||
|
table="accounts_userprofile",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
pgtrigger.migrations.AddTrigger(
|
||||||
|
model_name="userprofile",
|
||||||
|
trigger=pgtrigger.compiler.Trigger(
|
||||||
|
name="update_update",
|
||||||
|
sql=pgtrigger.compiler.UpsertTriggerSql(
|
||||||
|
condition="WHEN (OLD.* IS DISTINCT FROM NEW.*)",
|
||||||
|
func='INSERT INTO "accounts_userprofileevent" ("avatar", "bio", "coaster_credits", "dark_ride_credits", "discord", "display_name", "flat_ride_credits", "id", "instagram", "pgh_context_id", "pgh_created_at", "pgh_label", "pgh_obj_id", "profile_id", "pronouns", "twitter", "user_id", "water_ride_credits", "youtube") VALUES (NEW."avatar", NEW."bio", NEW."coaster_credits", NEW."dark_ride_credits", NEW."discord", NEW."display_name", NEW."flat_ride_credits", NEW."id", NEW."instagram", _pgh_attach_context(), NOW(), \'update\', NEW."id", NEW."profile_id", NEW."pronouns", NEW."twitter", NEW."user_id", NEW."water_ride_credits", NEW."youtube"); RETURN NULL;',
|
||||||
|
hash="37e99b5cc374ec0a3fc44d2482b411cba63fa84d",
|
||||||
|
operation="UPDATE",
|
||||||
|
pgid="pgtrigger_update_update_87ef6",
|
||||||
|
table="accounts_userprofile",
|
||||||
|
when="AFTER",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="emailverificationevent",
|
||||||
|
name="pgh_context",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="emailverificationevent",
|
||||||
|
name="pgh_obj",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to="accounts.emailverification",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="emailverificationevent",
|
||||||
|
name="user",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="passwordresetevent",
|
||||||
|
name="pgh_context",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="passwordresetevent",
|
||||||
|
name="pgh_obj",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to="accounts.passwordreset",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="passwordresetevent",
|
||||||
|
name="user",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="userevent",
|
||||||
|
name="pgh_context",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="userevent",
|
||||||
|
name="pgh_obj",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="userprofileevent",
|
||||||
|
name="pgh_context",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
null=True,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
to="pghistory.context",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="userprofileevent",
|
||||||
|
name="pgh_obj",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="events",
|
||||||
|
to="accounts.userprofile",
|
||||||
|
),
|
||||||
|
),
|
||||||
|
migrations.AddField(
|
||||||
|
model_name="userprofileevent",
|
||||||
|
name="user",
|
||||||
|
field=models.ForeignKey(
|
||||||
|
db_constraint=False,
|
||||||
|
on_delete=django.db.models.deletion.DO_NOTHING,
|
||||||
|
related_name="+",
|
||||||
|
related_query_name="+",
|
||||||
|
to=settings.AUTH_USER_MODEL,
|
||||||
|
),
|
||||||
|
),
|
||||||
|
]
|
||||||
@@ -2,11 +2,13 @@ import requests
 from django.conf import settings
 from django.core.exceptions import ValidationError


 class TurnstileMixin:
     """
     Mixin to handle Cloudflare Turnstile validation.
     Bypasses validation when DEBUG is True.
     """

     def validate_turnstile(self, request):
         """
         Validate the Turnstile response token.
@@ -14,20 +16,20 @@ class TurnstileMixin:
         """
         if settings.DEBUG:
             return

-        token = request.POST.get('cf-turnstile-response')
+        token = request.POST.get("cf-turnstile-response")
         if not token:
-            raise ValidationError('Please complete the Turnstile challenge.')
+            raise ValidationError("Please complete the Turnstile challenge.")

         # Verify the token with Cloudflare
         data = {
-            'secret': settings.TURNSTILE_SECRET_KEY,
-            'response': token,
-            'remoteip': request.META.get('REMOTE_ADDR'),
+            "secret": settings.TURNSTILE_SECRET_KEY,
+            "response": token,
+            "remoteip": request.META.get("REMOTE_ADDR"),
         }

         response = requests.post(settings.TURNSTILE_VERIFY_URL, data=data, timeout=60)
         result = response.json()

-        if not result.get('success'):
-            raise ValidationError('Turnstile validation failed. Please try again.')
+        if not result.get("success"):
+            raise ValidationError("Turnstile validation failed. Please try again.")
@@ -2,13 +2,11 @@ from django.contrib.auth.models import AbstractUser
 from django.db import models
 from django.urls import reverse
 from django.utils.translation import gettext_lazy as _
-from PIL import Image, ImageDraw, ImageFont
-from io import BytesIO
-import base64
 import os
 import secrets
-from core.history import TrackedModel
-# import pghistory
+from apps.core.history import TrackedModel
+import pghistory


 def generate_random_id(model_class, id_field):
     """Generate a random ID starting at 4 digits, expanding to 5 if needed"""
@@ -17,29 +15,34 @@ def generate_random_id(model_class, id_field):
         new_id = str(secrets.SystemRandom().randint(1000, 9999))
         if not model_class.objects.filter(**{id_field: new_id}).exists():
             return new_id

         # If all 4-digit numbers are taken, try 5 digits
         new_id = str(secrets.SystemRandom().randint(10000, 99999))
         if not model_class.objects.filter(**{id_field: new_id}).exists():
             return new_id


+@pghistory.track()
 class User(AbstractUser):
     class Roles(models.TextChoices):
-        USER = 'USER', _('User')
-        MODERATOR = 'MODERATOR', _('Moderator')
-        ADMIN = 'ADMIN', _('Admin')
-        SUPERUSER = 'SUPERUSER', _('Superuser')
+        USER = "USER", _("User")
+        MODERATOR = "MODERATOR", _("Moderator")
+        ADMIN = "ADMIN", _("Admin")
+        SUPERUSER = "SUPERUSER", _("Superuser")

     class ThemePreference(models.TextChoices):
-        LIGHT = 'light', _('Light')
-        DARK = 'dark', _('Dark')
+        LIGHT = "light", _("Light")
+        DARK = "dark", _("Dark")

     # Read-only ID
     user_id = models.CharField(
         max_length=10,
         unique=True,
         editable=False,
-        help_text='Unique identifier for this user that remains constant even if the username changes'
+        help_text=(
+            "Unique identifier for this user that remains constant even if the "
+            "username changes"
+        ),
     )

     role = models.CharField(
@@ -61,50 +64,48 @@ class User(AbstractUser):
         return self.get_display_name()

     def get_absolute_url(self):
-        return reverse('profile', kwargs={'username': self.username})
+        return reverse("profile", kwargs={"username": self.username})

     def get_display_name(self):
         """Get the user's display name, falling back to username if not set"""
-        profile = getattr(self, 'profile', None)
+        profile = getattr(self, "profile", None)
         if profile and profile.display_name:
             return profile.display_name
         return self.username

     def save(self, *args, **kwargs):
         if not self.user_id:
-            self.user_id = generate_random_id(User, 'user_id')
+            self.user_id = generate_random_id(User, "user_id")
         super().save(*args, **kwargs)


+@pghistory.track()
 class UserProfile(models.Model):
     # Read-only ID
     profile_id = models.CharField(
         max_length=10,
         unique=True,
         editable=False,
-        help_text='Unique identifier for this profile that remains constant'
+        help_text="Unique identifier for this profile that remains constant",
     )

-    user = models.OneToOneField(
-        User,
-        on_delete=models.CASCADE,
-        related_name='profile'
-    )
+    user = models.OneToOneField(User, on_delete=models.CASCADE, related_name="profile")
     display_name = models.CharField(
         max_length=50,
         unique=True,
-        help_text="This is the name that will be displayed on the site"
+        help_text="This is the name that will be displayed on the site",
     )
-    avatar = models.ImageField(upload_to='avatars/', blank=True)
+    avatar = models.ImageField(upload_to="avatars/", blank=True)
     pronouns = models.CharField(max_length=50, blank=True)

     bio = models.TextField(max_length=500, blank=True)

     # Social media links
     twitter = models.URLField(blank=True)
     instagram = models.URLField(blank=True)
     youtube = models.URLField(blank=True)
     discord = models.CharField(max_length=100, blank=True)

     # Ride statistics
     coaster_credits = models.IntegerField(default=0)
     dark_ride_credits = models.IntegerField(default=0)
@@ -112,7 +113,10 @@ class UserProfile(models.Model):
     water_ride_credits = models.IntegerField(default=0)

     def get_avatar(self):
-        """Return the avatar URL or serve a pre-generated avatar based on the first letter of the username"""
+        """
+        Return the avatar URL or serve a pre-generated avatar based on the
+        first letter of the username
+        """
         if self.avatar:
             return self.avatar.url
         first_letter = self.user.username.upper()
@@ -127,12 +131,14 @@ class UserProfile(models.Model):
             self.display_name = self.user.username

         if not self.profile_id:
-            self.profile_id = generate_random_id(UserProfile, 'profile_id')
+            self.profile_id = generate_random_id(UserProfile, "profile_id")
         super().save(*args, **kwargs)

     def __str__(self):
         return self.display_name


+@pghistory.track()
 class EmailVerification(models.Model):
     user = models.OneToOneField(User, on_delete=models.CASCADE)
     token = models.CharField(max_length=64, unique=True)
@@ -146,6 +152,8 @@ class EmailVerification(models.Model):
         verbose_name = "Email Verification"
         verbose_name_plural = "Email Verifications"


+@pghistory.track()
 class PasswordReset(models.Model):
     user = models.ForeignKey(User, on_delete=models.CASCADE)
     token = models.CharField(max_length=64)
@@ -160,53 +168,55 @@ class PasswordReset(models.Model):
         verbose_name = "Password Reset"
         verbose_name_plural = "Password Resets"


 # @pghistory.track()


 class TopList(TrackedModel):
     class Categories(models.TextChoices):
-        ROLLER_COASTER = 'RC', _('Roller Coaster')
-        DARK_RIDE = 'DR', _('Dark Ride')
-        FLAT_RIDE = 'FR', _('Flat Ride')
-        WATER_RIDE = 'WR', _('Water Ride')
-        PARK = 'PK', _('Park')
+        ROLLER_COASTER = "RC", _("Roller Coaster")
+        DARK_RIDE = "DR", _("Dark Ride")
+        FLAT_RIDE = "FR", _("Flat Ride")
+        WATER_RIDE = "WR", _("Water Ride")
+        PARK = "PK", _("Park")

     user = models.ForeignKey(
         User,
         on_delete=models.CASCADE,
-        related_name='top_lists'  # Added related_name for User model access
+        related_name="top_lists",  # Added related_name for User model access
     )
     title = models.CharField(max_length=100)
-    category = models.CharField(
-        max_length=2,
-        choices=Categories.choices
-    )
+    category = models.CharField(max_length=2, choices=Categories.choices)
     description = models.TextField(blank=True)
     created_at = models.DateTimeField(auto_now_add=True)
     updated_at = models.DateTimeField(auto_now=True)

-    class Meta:
-        ordering = ['-updated_at']
+    class Meta(TrackedModel.Meta):
+        ordering = ["-updated_at"]

     def __str__(self):
-        return f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
+        return (
+            f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
+        )


 # @pghistory.track()


 class TopListItem(TrackedModel):
-    top_list = models.ForeignKey(
-        TopList,
-        on_delete=models.CASCADE,
-        related_name='items'
-    )
+    top_list = models.ForeignKey(
+        TopList, on_delete=models.CASCADE, related_name="items"
+    )
-    content_type = models.ForeignKey(
-        'contenttypes.ContentType',
-        on_delete=models.CASCADE
-    )
+    content_type = models.ForeignKey(
+        "contenttypes.ContentType", on_delete=models.CASCADE
+    )
     object_id = models.PositiveIntegerField()
     rank = models.PositiveIntegerField()
     notes = models.TextField(blank=True)

-    class Meta:
-        ordering = ['rank']
-        unique_together = [['top_list', 'rank']]
+    class Meta(TrackedModel.Meta):
+        ordering = ["rank"]
+        unique_together = [["top_list", "rank"]]

     def __str__(self):
         return f"#{self.rank} in {self.top_list.title}"
@@ -2,14 +2,12 @@ from django.contrib.auth.models import AbstractUser
from django.db import models
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
-from PIL import Image, ImageDraw, ImageFont
-from io import BytesIO
-import base64
import os
import secrets
-from core.history import TrackedModel
+from apps.core.history import TrackedModel
import pghistory


def generate_random_id(model_class, id_field):
    """Generate a random ID starting at 4 digits, expanding to 5 if needed"""
    while True:
@@ -17,29 +15,30 @@ def generate_random_id(model_class, id_field):
        new_id = str(secrets.SystemRandom().randint(1000, 9999))
        if not model_class.objects.filter(**{id_field: new_id}).exists():
            return new_id

        # If all 4-digit numbers are taken, try 5 digits
        new_id = str(secrets.SystemRandom().randint(10000, 99999))
        if not model_class.objects.filter(**{id_field: new_id}).exists():
            return new_id


class User(AbstractUser):
    class Roles(models.TextChoices):
-        USER = 'USER', _('User')
-        MODERATOR = 'MODERATOR', _('Moderator')
-        ADMIN = 'ADMIN', _('Admin')
-        SUPERUSER = 'SUPERUSER', _('Superuser')
+        USER = "USER", _("User")
+        MODERATOR = "MODERATOR", _("Moderator")
+        ADMIN = "ADMIN", _("Admin")
+        SUPERUSER = "SUPERUSER", _("Superuser")

    class ThemePreference(models.TextChoices):
-        LIGHT = 'light', _('Light')
-        DARK = 'dark', _('Dark')
+        LIGHT = "light", _("Light")
+        DARK = "dark", _("Dark")

    # Read-only ID
    user_id = models.CharField(
        max_length=10,
        unique=True,
        editable=False,
-        help_text='Unique identifier for this user that remains constant even if the username changes'
+        help_text="Unique identifier for this user that remains constant even if the username changes",
    )

    role = models.CharField(
@@ -61,50 +60,47 @@ class User(AbstractUser):
        return self.get_display_name()

    def get_absolute_url(self):
-        return reverse('profile', kwargs={'username': self.username})
+        return reverse("profile", kwargs={"username": self.username})

    def get_display_name(self):
        """Get the user's display name, falling back to username if not set"""
-        profile = getattr(self, 'profile', None)
+        profile = getattr(self, "profile", None)
        if profile and profile.display_name:
            return profile.display_name
        return self.username

    def save(self, *args, **kwargs):
        if not self.user_id:
-            self.user_id = generate_random_id(User, 'user_id')
+            self.user_id = generate_random_id(User, "user_id")
        super().save(*args, **kwargs)


class UserProfile(models.Model):
    # Read-only ID
    profile_id = models.CharField(
        max_length=10,
        unique=True,
        editable=False,
-        help_text='Unique identifier for this profile that remains constant'
+        help_text="Unique identifier for this profile that remains constant",
    )

-    user = models.OneToOneField(
-        User,
-        on_delete=models.CASCADE,
-        related_name='profile'
-    )
+    user = models.OneToOneField(User, on_delete=models.CASCADE, related_name="profile")
    display_name = models.CharField(
        max_length=50,
        unique=True,
-        help_text="This is the name that will be displayed on the site"
+        help_text="This is the name that will be displayed on the site",
    )
-    avatar = models.ImageField(upload_to='avatars/', blank=True)
+    avatar = models.ImageField(upload_to="avatars/", blank=True)
    pronouns = models.CharField(max_length=50, blank=True)

    bio = models.TextField(max_length=500, blank=True)

    # Social media links
    twitter = models.URLField(blank=True)
    instagram = models.URLField(blank=True)
    youtube = models.URLField(blank=True)
    discord = models.CharField(max_length=100, blank=True)

    # Ride statistics
    coaster_credits = models.IntegerField(default=0)
    dark_ride_credits = models.IntegerField(default=0)
@@ -127,12 +123,13 @@ class UserProfile(models.Model):
            self.display_name = self.user.username

        if not self.profile_id:
-            self.profile_id = generate_random_id(UserProfile, 'profile_id')
+            self.profile_id = generate_random_id(UserProfile, "profile_id")
        super().save(*args, **kwargs)

    def __str__(self):
        return self.display_name


class EmailVerification(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    token = models.CharField(max_length=64, unique=True)
@@ -146,6 +143,7 @@ class EmailVerification(models.Model):
        verbose_name = "Email Verification"
        verbose_name_plural = "Email Verifications"


class PasswordReset(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    token = models.CharField(max_length=64)
@@ -160,53 +158,51 @@ class PasswordReset(models.Model):
        verbose_name = "Password Reset"
        verbose_name_plural = "Password Resets"


@pghistory.track()
class TopList(TrackedModel):
    class Categories(models.TextChoices):
-        ROLLER_COASTER = 'RC', _('Roller Coaster')
-        DARK_RIDE = 'DR', _('Dark Ride')
-        FLAT_RIDE = 'FR', _('Flat Ride')
-        WATER_RIDE = 'WR', _('Water Ride')
-        PARK = 'PK', _('Park')
+        ROLLER_COASTER = "RC", _("Roller Coaster")
+        DARK_RIDE = "DR", _("Dark Ride")
+        FLAT_RIDE = "FR", _("Flat Ride")
+        WATER_RIDE = "WR", _("Water Ride")
+        PARK = "PK", _("Park")

    user = models.ForeignKey(
        User,
        on_delete=models.CASCADE,
-        related_name='top_lists' # Added related_name for User model access
+        related_name="top_lists", # Added related_name for User model access
    )
    title = models.CharField(max_length=100)
-    category = models.CharField(
-        max_length=2,
-        choices=Categories.choices
-    )
+    category = models.CharField(max_length=2, choices=Categories.choices)
    description = models.TextField(blank=True)
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

-    class Meta:
-        ordering = ['-updated_at']
+    class Meta(TrackedModel.Meta):
+        ordering = ["-updated_at"]

    def __str__(self):
-        return f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
+        return (
+            f"{self.user.get_display_name()}'s {self.category} Top List: {self.title}"
+        )


@pghistory.track()
class TopListItem(TrackedModel):
    top_list = models.ForeignKey(
-        TopList,
-        on_delete=models.CASCADE,
-        related_name='items'
+        TopList, on_delete=models.CASCADE, related_name="items"
    )
    content_type = models.ForeignKey(
-        'contenttypes.ContentType',
-        on_delete=models.CASCADE
+        "contenttypes.ContentType", on_delete=models.CASCADE
    )
    object_id = models.PositiveIntegerField()
    rank = models.PositiveIntegerField()
    notes = models.TextField(blank=True)

-    class Meta:
-        ordering = ['rank']
+    class Meta(TrackedModel.Meta):
+        ordering = ["rank"]
-        unique_together = [['top_list', 'rank']]
+        unique_together = [["top_list", "rank"]]

    def __str__(self):
        return f"#{self.rank} in {self.top_list.title}"
273
backend/apps/accounts/selectors.py
Normal file
@@ -0,0 +1,273 @@
|
|||||||
|
"""
|
||||||
|
Selectors for user and account-related data retrieval.
|
||||||
|
Following Django styleguide pattern for separating data access from business logic.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from typing import Dict, Any
|
||||||
|
from django.db.models import QuerySet, Q, F, Count
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
User = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
def user_profile_optimized(*, user_id: int) -> Any:
|
||||||
|
"""
|
||||||
|
Get a user with optimized queries for profile display.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
user_id: User ID
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
User instance with prefetched related data
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
User.DoesNotExist: If user doesn't exist
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.prefetch_related(
|
||||||
|
"park_reviews", "ride_reviews", "socialaccount_set"
|
||||||
|
)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.get(id=user_id)
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def active_users_with_stats() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get active users with review statistics.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of active users with review counts
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.order_by("-total_review_count")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_with_recent_activity(*, days: int = 30) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have been active in the last N days.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
days: Number of days to look back for activity
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of recently active users
|
||||||
|
"""
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=days)
|
||||||
|
|
||||||
|
return (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(last_login__gte=cutoff_date)
|
||||||
|
| Q(park_reviews__created_at__gte=cutoff_date)
|
||||||
|
| Q(ride_reviews__created_at__gte=cutoff_date)
|
||||||
|
)
|
||||||
|
.annotate(
|
||||||
|
recent_park_reviews=Count(
|
||||||
|
"park_reviews",
|
||||||
|
filter=Q(park_reviews__created_at__gte=cutoff_date),
|
||||||
|
),
|
||||||
|
recent_ride_reviews=Count(
|
||||||
|
"ride_reviews",
|
||||||
|
filter=Q(ride_reviews__created_at__gte=cutoff_date),
|
||||||
|
),
|
||||||
|
recent_total_reviews=F("recent_park_reviews") + F("recent_ride_reviews"),
|
||||||
|
)
|
||||||
|
.order_by("-last_login")
|
||||||
|
.distinct()
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def top_reviewers(*, limit: int = 10) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get top users by review count.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
limit: Maximum number of users to return
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of top reviewers
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True)
|
||||||
|
.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.filter(total_review_count__gt=0)
|
||||||
|
.order_by("-total_review_count")[:limit]
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def moderator_users() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users with moderation permissions.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users who can moderate content
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(is_staff=True)
|
||||||
|
| Q(groups__name="Moderators")
|
||||||
|
| Q(
|
||||||
|
user_permissions__codename__in=[
|
||||||
|
"change_parkreview",
|
||||||
|
"change_ridereview",
|
||||||
|
]
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.distinct()
|
||||||
|
.order_by("username")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_by_registration_date(*, start_date, end_date) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who registered within a date range.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
start_date: Start of date range
|
||||||
|
end_date: End of date range
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users registered in the date range
|
||||||
|
"""
|
||||||
|
return User.objects.filter(
|
||||||
|
date_joined__date__gte=start_date, date_joined__date__lte=end_date
|
||||||
|
).order_by("-date_joined")
|
||||||
|
|
||||||
|
|
||||||
|
def user_search_autocomplete(*, query: str, limit: int = 10) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users matching a search query for autocomplete functionality.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
query: Search string
|
||||||
|
limit: Maximum number of results
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of matching users for autocomplete
|
||||||
|
"""
|
||||||
|
return User.objects.filter(
|
||||||
|
Q(username__icontains=query)
|
||||||
|
| Q(first_name__icontains=query)
|
||||||
|
| Q(last_name__icontains=query),
|
||||||
|
is_active=True,
|
||||||
|
).order_by("username")[:limit]
|
||||||
|
|
||||||
|
|
||||||
|
def users_with_social_accounts() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have connected social accounts.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with social account connections
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(socialaccount__isnull=False)
|
||||||
|
.prefetch_related("socialaccount_set")
|
||||||
|
.distinct()
|
||||||
|
.order_by("username")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def user_statistics_summary() -> Dict[str, Any]:
|
||||||
|
"""
|
||||||
|
Get overall user statistics for dashboard/analytics.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Dictionary containing user statistics
|
||||||
|
"""
|
||||||
|
total_users = User.objects.count()
|
||||||
|
active_users = User.objects.filter(is_active=True).count()
|
||||||
|
staff_users = User.objects.filter(is_staff=True).count()
|
||||||
|
|
||||||
|
# Users with reviews
|
||||||
|
users_with_reviews = (
|
||||||
|
User.objects.filter(
|
||||||
|
Q(park_reviews__isnull=False) | Q(ride_reviews__isnull=False)
|
||||||
|
)
|
||||||
|
.distinct()
|
||||||
|
.count()
|
||||||
|
)
|
||||||
|
|
||||||
|
# Recent registrations (last 30 days)
|
||||||
|
cutoff_date = timezone.now() - timedelta(days=30)
|
||||||
|
recent_registrations = User.objects.filter(date_joined__gte=cutoff_date).count()
|
||||||
|
|
||||||
|
return {
|
||||||
|
"total_users": total_users,
|
||||||
|
"active_users": active_users,
|
||||||
|
"inactive_users": total_users - active_users,
|
||||||
|
"staff_users": staff_users,
|
||||||
|
"users_with_reviews": users_with_reviews,
|
||||||
|
"recent_registrations": recent_registrations,
|
||||||
|
"review_participation_rate": (
|
||||||
|
(users_with_reviews / total_users * 100) if total_users > 0 else 0
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def users_needing_email_verification() -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who haven't verified their email addresses.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with unverified emails
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.filter(is_active=True, emailaddress__verified=False)
|
||||||
|
.distinct()
|
||||||
|
.order_by("date_joined")
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def users_by_review_activity(*, min_reviews: int = 1) -> QuerySet:
|
||||||
|
"""
|
||||||
|
Get users who have written at least a minimum number of reviews.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
min_reviews: Minimum number of reviews required
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
QuerySet of users with sufficient review activity
|
||||||
|
"""
|
||||||
|
return (
|
||||||
|
User.objects.annotate(
|
||||||
|
park_review_count=Count(
|
||||||
|
"park_reviews", filter=Q(park_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
ride_review_count=Count(
|
||||||
|
"ride_reviews", filter=Q(ride_reviews__is_published=True)
|
||||||
|
),
|
||||||
|
total_review_count=F("park_review_count") + F("ride_review_count"),
|
||||||
|
)
|
||||||
|
.filter(total_review_count__gte=min_reviews)
|
||||||
|
.order_by("-total_review_count")
|
||||||
|
)
|
||||||
246
backend/apps/accounts/serializers.py
Normal file
@@ -0,0 +1,246 @@
|
|||||||
|
from rest_framework import serializers
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.contrib.auth.password_validation import validate_password
|
||||||
|
from django.core.exceptions import ValidationError
|
||||||
|
from django.utils.crypto import get_random_string
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
from django.contrib.sites.shortcuts import get_current_site
|
||||||
|
from .models import User, PasswordReset
|
||||||
|
from apps.email_service.services import EmailService
|
||||||
|
from django.template.loader import render_to_string
|
||||||
|
|
||||||
|
UserModel = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
class UserSerializer(serializers.ModelSerializer):
|
||||||
|
"""
|
||||||
|
User serializer for API responses
|
||||||
|
"""
|
||||||
|
avatar_url = serializers.SerializerMethodField()
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
model = User
|
||||||
|
fields = [
|
||||||
|
'id', 'username', 'email', 'first_name', 'last_name',
|
||||||
|
'date_joined', 'is_active', 'avatar_url'
|
||||||
|
]
|
||||||
|
read_only_fields = ['id', 'date_joined', 'is_active']
|
||||||
|
|
||||||
|
def get_avatar_url(self, obj):
|
||||||
|
"""Get user avatar URL"""
|
||||||
|
if hasattr(obj, 'profile') and obj.profile.avatar:
|
||||||
|
return obj.profile.avatar.url
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
class LoginSerializer(serializers.Serializer):
|
||||||
|
"""
|
||||||
|
Serializer for user login
|
||||||
|
"""
|
||||||
|
username = serializers.CharField(
|
||||||
|
max_length=254,
|
||||||
|
help_text="Username or email address"
|
||||||
|
)
|
||||||
|
password = serializers.CharField(
|
||||||
|
max_length=128,
|
||||||
|
style={'input_type': 'password'},
|
||||||
|
trim_whitespace=False
|
||||||
|
)
|
||||||
|
|
||||||
|
def validate(self, attrs):
|
||||||
|
username = attrs.get('username')
|
||||||
|
password = attrs.get('password')
|
||||||
|
|
||||||
|
if username and password:
|
||||||
|
return attrs
|
||||||
|
|
||||||
|
raise serializers.ValidationError(
|
||||||
|
'Must include username/email and password.'
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class SignupSerializer(serializers.ModelSerializer):
|
||||||
|
"""
|
||||||
|
Serializer for user registration
|
||||||
|
"""
|
||||||
|
password = serializers.CharField(
|
||||||
|
write_only=True,
|
||||||
|
validators=[validate_password],
|
||||||
|
style={'input_type': 'password'}
|
||||||
|
)
|
||||||
|
password_confirm = serializers.CharField(
|
||||||
|
write_only=True,
|
||||||
|
style={'input_type': 'password'}
|
||||||
|
)
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
model = User
|
||||||
|
fields = [
|
||||||
|
'username', 'email', 'first_name', 'last_name',
|
||||||
|
'password', 'password_confirm'
|
||||||
|
]
|
||||||
|
extra_kwargs = {
|
||||||
|
'password': {'write_only': True},
|
||||||
|
'email': {'required': True},
|
||||||
|
}
|
||||||
|
|
||||||
|
def validate_email(self, value):
|
||||||
|
"""Validate email is unique"""
|
||||||
|
if UserModel.objects.filter(email=value).exists():
|
||||||
|
raise serializers.ValidationError(
|
||||||
|
"A user with this email already exists."
|
||||||
|
)
|
||||||
|
return value
|
||||||
|
|
||||||
|
def validate_username(self, value):
|
||||||
|
"""Validate username is unique"""
|
||||||
|
if UserModel.objects.filter(username=value).exists():
|
||||||
|
raise serializers.ValidationError(
|
||||||
|
"A user with this username already exists."
|
||||||
|
)
|
||||||
|
return value
|
||||||
|
|
||||||
|
def validate(self, attrs):
|
||||||
|
"""Validate passwords match"""
|
||||||
|
password = attrs.get('password')
|
||||||
|
password_confirm = attrs.get('password_confirm')
|
||||||
|
|
||||||
|
if password != password_confirm:
|
||||||
|
raise serializers.ValidationError({
|
||||||
|
'password_confirm': 'Passwords do not match.'
|
||||||
|
})
|
||||||
|
|
||||||
|
return attrs
|
||||||
|
|
||||||
|
def create(self, validated_data):
|
||||||
|
"""Create user with validated data"""
|
||||||
|
validated_data.pop('password_confirm', None)
|
||||||
|
password = validated_data.pop('password')
|
||||||
|
|
||||||
|
user = UserModel.objects.create(
|
||||||
|
**validated_data
|
||||||
|
)
|
||||||
|
user.set_password(password)
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
return user
|
||||||
|
|
||||||
|
|
||||||
|
class PasswordResetSerializer(serializers.Serializer):
|
||||||
|
"""
|
||||||
|
Serializer for password reset request
|
||||||
|
"""
|
||||||
|
email = serializers.EmailField()
|
||||||
|
|
||||||
|
def validate_email(self, value):
|
||||||
|
"""Validate email exists"""
|
||||||
|
try:
|
||||||
|
user = UserModel.objects.get(email=value)
|
||||||
|
self.user = user
|
||||||
|
return value
|
||||||
|
except UserModel.DoesNotExist:
|
||||||
|
# Don't reveal if email exists or not for security
|
||||||
|
return value
|
||||||
|
|
||||||
|
def save(self, **kwargs):
|
||||||
|
"""Send password reset email if user exists"""
|
||||||
|
if hasattr(self, 'user'):
|
||||||
|
# Create password reset token
|
||||||
|
token = get_random_string(64)
|
||||||
|
PasswordReset.objects.update_or_create(
|
||||||
|
user=self.user,
|
||||||
|
defaults={
|
||||||
|
'token': token,
|
||||||
|
'expires_at': timezone.now() + timedelta(hours=24),
|
||||||
|
'used': False
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
# Send reset email
|
||||||
|
request = self.context.get('request')
|
||||||
|
if request:
|
||||||
|
site = get_current_site(request)
|
||||||
|
reset_url = f"{request.scheme}://{site.domain}/reset-password/{token}/"
|
||||||
|
|
||||||
|
context = {
|
||||||
|
'user': self.user,
|
||||||
|
'reset_url': reset_url,
|
||||||
|
'site_name': site.name,
|
||||||
|
}
|
||||||
|
|
||||||
|
email_html = render_to_string(
|
||||||
|
'accounts/email/password_reset.html',
|
||||||
|
context
|
||||||
|
)
|
||||||
|
|
||||||
|
EmailService.send_email(
|
||||||
|
to=getattr(self.user, 'email', None),
|
||||||
|
subject="Reset your password",
|
||||||
|
text=f"Click the link to reset your password: {reset_url}",
|
||||||
|
site=site,
|
||||||
|
html=email_html,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class PasswordChangeSerializer(serializers.Serializer):
|
||||||
|
"""
|
||||||
|
Serializer for password change
|
||||||
|
"""
|
||||||
|
old_password = serializers.CharField(
|
||||||
|
max_length=128,
|
||||||
|
style={'input_type': 'password'}
|
||||||
|
)
|
||||||
|
new_password = serializers.CharField(
|
||||||
|
max_length=128,
|
||||||
|
validators=[validate_password],
|
||||||
|
style={'input_type': 'password'}
|
||||||
|
)
|
||||||
|
new_password_confirm = serializers.CharField(
|
||||||
|
max_length=128,
|
||||||
|
style={'input_type': 'password'}
|
||||||
|
)
|
||||||
|
|
||||||
|
def validate_old_password(self, value):
|
||||||
|
"""Validate old password is correct"""
|
||||||
|
user = self.context['request'].user
|
||||||
|
if not user.check_password(value):
|
||||||
|
raise serializers.ValidationError(
|
||||||
|
'Old password is incorrect.'
|
||||||
|
)
|
||||||
|
return value
|
||||||
|
|
||||||
|
def validate(self, attrs):
|
||||||
|
"""Validate new passwords match"""
|
||||||
|
new_password = attrs.get('new_password')
|
||||||
|
new_password_confirm = attrs.get('new_password_confirm')
|
||||||
|
|
||||||
|
if new_password != new_password_confirm:
|
||||||
|
raise serializers.ValidationError({
|
||||||
|
'new_password_confirm': 'New passwords do not match.'
|
||||||
|
})
|
||||||
|
|
||||||
|
return attrs
|
||||||
|
|
||||||
|
def save(self, **kwargs):
|
||||||
|
"""Change user password"""
|
||||||
|
user = self.context['request'].user
|
||||||
|
new_password = self.initial_data.get(
|
||||||
|
'new_password') if self.initial_data else None
|
||||||
|
|
||||||
|
if new_password is None:
|
||||||
|
raise serializers.ValidationError('New password is required.')
|
||||||
|
|
||||||
|
user.set_password(new_password)
|
||||||
|
user.save()
|
||||||
|
|
||||||
|
return user
|
||||||
|
|
||||||
|
|
||||||
|
class SocialProviderSerializer(serializers.Serializer):
|
||||||
|
"""
|
||||||
|
Serializer for social authentication providers
|
||||||
|
"""
|
||||||
|
id = serializers.CharField()
|
||||||
|
name = serializers.CharField()
|
||||||
|
login_url = serializers.URLField()
|
||||||
@@ -5,7 +5,8 @@ from django.db import transaction
from django.core.files import File
from django.core.files.temp import NamedTemporaryFile
import requests
-from .models import User, UserProfile, EmailVerification
+from .models import User, UserProfile


@receiver(post_save, sender=User)
def create_user_profile(sender, instance, created, **kwargs):
@@ -14,21 +15,21 @@ def create_user_profile(sender, instance, created, **kwargs):
if created:
# Create profile
profile = UserProfile.objects.create(user=instance)

# If user has a social account with avatar, download it
social_account = instance.socialaccount_set.first()
if social_account:
extra_data = social_account.extra_data
avatar_url = None

-if social_account.provider == 'google':
+if social_account.provider == "google":
-avatar_url = extra_data.get('picture')
+avatar_url = extra_data.get("picture")
-elif social_account.provider == 'discord':
+elif social_account.provider == "discord":
-avatar = extra_data.get('avatar')
+avatar = extra_data.get("avatar")
-discord_id = extra_data.get('id')
+discord_id = extra_data.get("id")
if avatar:
-avatar_url = f'https://cdn.discordapp.com/avatars/{discord_id}/{avatar}.png'
+avatar_url = f"https://cdn.discordapp.com/avatars/{discord_id}/{avatar}.png"

if avatar_url:
try:
response = requests.get(avatar_url, timeout=60)
@@ -36,28 +37,34 @@ def create_user_profile(sender, instance, created, **kwargs):
img_temp = NamedTemporaryFile(delete=True)
img_temp.write(response.content)
img_temp.flush()

file_name = f"avatar_{instance.username}.png"
-profile.avatar.save(
-file_name,
-File(img_temp),
-save=True
-)
+profile.avatar.save(file_name, File(img_temp), save=True)
except Exception as e:
-print(f"Error downloading avatar for user {instance.username}: {str(e)}")
+print(
+f"Error downloading avatar for user {
+instance.username}: {
+str(e)}"
+)
except Exception as e:
print(f"Error creating profile for user {instance.username}: {str(e)}")


@receiver(post_save, sender=User)
def save_user_profile(sender, instance, **kwargs):
"""Ensure UserProfile exists and is saved"""
try:
-if not hasattr(instance, 'profile'):
+# Try to get existing profile first
+try:
+profile = instance.profile
+profile.save()
+except UserProfile.DoesNotExist:
+# Profile doesn't exist, create it
UserProfile.objects.create(user=instance)
-instance.profile.save()
except Exception as e:
print(f"Error saving profile for user {instance.username}: {str(e)}")


@receiver(pre_save, sender=User)
def sync_user_role_with_groups(sender, instance, **kwargs):
"""Sync user role with Django groups"""
@@ -72,33 +79,49 @@ def sync_user_role_with_groups(sender, instance, **kwargs):
old_group = Group.objects.filter(name=old_instance.role).first()
if old_group:
instance.groups.remove(old_group)

# Add to new role group
if instance.role != User.Roles.USER:
new_group, _ = Group.objects.get_or_create(name=instance.role)
instance.groups.add(new_group)

# Special handling for superuser role
if instance.role == User.Roles.SUPERUSER:
instance.is_superuser = True
instance.is_staff = True
elif old_instance.role == User.Roles.SUPERUSER:
-# If removing superuser role, remove superuser status
+# If removing superuser role, remove superuser
+# status
instance.is_superuser = False
-if instance.role not in [User.Roles.ADMIN, User.Roles.MODERATOR]:
+if instance.role not in [
+User.Roles.ADMIN,
+User.Roles.MODERATOR,
+]:
instance.is_staff = False

# Handle staff status for admin and moderator roles
-if instance.role in [User.Roles.ADMIN, User.Roles.MODERATOR]:
+if instance.role in [
+User.Roles.ADMIN,
+User.Roles.MODERATOR,
+]:
instance.is_staff = True
-elif old_instance.role in [User.Roles.ADMIN, User.Roles.MODERATOR]:
-# If removing admin/moderator role, remove staff status
+elif old_instance.role in [
+User.Roles.ADMIN,
+User.Roles.MODERATOR,
+]:
+# If removing admin/moderator role, remove staff
+# status
if instance.role not in [User.Roles.SUPERUSER]:
instance.is_staff = False
except User.DoesNotExist:
pass
except Exception as e:
-print(f"Error syncing role with groups for user {instance.username}: {str(e)}")
+print(
+f"Error syncing role with groups for user {
+instance.username}: {
+str(e)}"
+)


def create_default_groups():
"""
@@ -107,33 +130,47 @@ def create_default_groups():
"""
try:
from django.contrib.auth.models import Permission
-from django.contrib.contenttypes.models import ContentType

# Create Moderator group
moderator_group, _ = Group.objects.get_or_create(name=User.Roles.MODERATOR)
moderator_permissions = [
# Review moderation permissions
-'change_review', 'delete_review',
-'change_reviewreport', 'delete_reviewreport',
+"change_review",
+"delete_review",
+"change_reviewreport",
+"delete_reviewreport",
# Edit moderation permissions
-'change_parkedit', 'delete_parkedit',
-'change_rideedit', 'delete_rideedit',
-'change_companyedit', 'delete_companyedit',
-'change_manufactureredit', 'delete_manufactureredit',
+"change_parkedit",
+"delete_parkedit",
+"change_rideedit",
+"delete_rideedit",
+"change_companyedit",
+"delete_companyedit",
+"change_manufactureredit",
+"delete_manufactureredit",
]

# Create Admin group
admin_group, _ = Group.objects.get_or_create(name=User.Roles.ADMIN)
admin_permissions = moderator_permissions + [
# User management permissions
-'change_user', 'delete_user',
+"change_user",
+"delete_user",
# Content management permissions
-'add_park', 'change_park', 'delete_park',
-'add_ride', 'change_ride', 'delete_ride',
-'add_company', 'change_company', 'delete_company',
-'add_manufacturer', 'change_manufacturer', 'delete_manufacturer',
+"add_park",
+"change_park",
+"delete_park",
+"add_ride",
+"change_ride",
+"delete_ride",
+"add_company",
+"change_company",
+"delete_company",
+"add_manufacturer",
+"change_manufacturer",
+"delete_manufacturer",
]

# Assign permissions to groups
for codename in moderator_permissions:
try:
@@ -141,7 +178,7 @@ def create_default_groups():
moderator_group.permissions.add(perm)
except Permission.DoesNotExist:
print(f"Permission not found: {codename}")

for codename in admin_permissions:
try:
perm = Permission.objects.get(codename=codename)
@@ -4,6 +4,7 @@ from django.template.loader import render_to_string

register = template.Library()


@register.simple_tag
def turnstile_widget():
    """
@@ -13,12 +14,10 @@ def turnstile_widget():
    Usage: {% load turnstile_tags %}{% turnstile_widget %}
    """
    if settings.DEBUG:
-        template_name = 'accounts/turnstile_widget_empty.html'
+        template_name = "accounts/turnstile_widget_empty.html"
        context = {}
    else:
-        template_name = 'accounts/turnstile_widget.html'
-        context = {
-            'site_key': settings.TURNSTILE_SITE_KEY
-        }
+        template_name = "accounts/turnstile_widget.html"
+        context = {"site_key": settings.TURNSTILE_SITE_KEY}

    return render_to_string(template_name, context)
126
backend/apps/accounts/tests.py
Normal file
@@ -0,0 +1,126 @@
|
|||||||
|
from django.test import TestCase
|
||||||
|
from django.contrib.auth.models import Group, Permission
|
||||||
|
from django.contrib.contenttypes.models import ContentType
|
||||||
|
from unittest.mock import patch, MagicMock
|
||||||
|
from .models import User, UserProfile
|
||||||
|
from .signals import create_default_groups
|
||||||
|
|
||||||
|
|
||||||
|
class SignalsTestCase(TestCase):
|
||||||
|
def setUp(self):
|
||||||
|
self.user = User.objects.create_user(
|
||||||
|
username="testuser",
|
||||||
|
email="testuser@example.com",
|
||||||
|
password="password",
|
||||||
|
)
|
||||||
|
|
||||||
|
def test_create_user_profile(self):
|
||||||
|
# Refresh user from database to ensure signals have been processed
|
||||||
|
self.user.refresh_from_db()
|
||||||
|
|
||||||
|
# Check if profile exists in database first
|
||||||
|
profile_exists = UserProfile.objects.filter(user=self.user).exists()
|
||||||
|
self.assertTrue(profile_exists, "UserProfile should be created by signals")
|
||||||
|
|
||||||
|
# Now safely access the profile
|
||||||
|
profile = UserProfile.objects.get(user=self.user)
|
||||||
|
self.assertIsInstance(profile, UserProfile)
|
||||||
|
|
||||||
|
# Test the reverse relationship
|
||||||
|
self.assertTrue(hasattr(self.user, "profile"))
|
||||||
|
# Test that we can access the profile through the user relationship
|
||||||
|
user_profile = getattr(self.user, "profile", None)
|
||||||
|
self.assertEqual(user_profile, profile)
|
||||||
|
|
||||||
|
@patch("accounts.signals.requests.get")
|
||||||
|
def test_create_user_profile_with_social_avatar(self, mock_get):
|
||||||
|
# Mock the response from requests.get
|
||||||
|
mock_response = MagicMock()
|
||||||
|
mock_response.status_code = 200
|
||||||
|
mock_response.content = b"fake-image-content"
|
||||||
|
mock_get.return_value = mock_response
|
||||||
|
|
||||||
|
# Create a social account for the user (we'll skip this test since socialaccount_set requires allauth setup)
|
||||||
|
# This test would need proper allauth configuration to work
|
||||||
|
self.skipTest("Requires proper allauth socialaccount setup")
|
||||||
|
|
||||||
|
def test_save_user_profile(self):
|
||||||
|
# Get the profile safely first
|
||||||
|
profile = UserProfile.objects.get(user=self.user)
|
||||||
|
profile.delete()
|
||||||
|
|
||||||
|
# Refresh user to clear cached profile relationship
|
||||||
|
self.user.refresh_from_db()
|
||||||
|
|
||||||
|
# Check that profile no longer exists
|
||||||
|
self.assertFalse(UserProfile.objects.filter(user=self.user).exists())
|
||||||
|
|
||||||
|
# Trigger save to recreate profile via signal
|
||||||
|
self.user.save()
|
||||||
|
|
||||||
|
# Verify profile was recreated
|
||||||
|
self.assertTrue(UserProfile.objects.filter(user=self.user).exists())
|
||||||
|
new_profile = UserProfile.objects.get(user=self.user)
|
||||||
|
self.assertIsInstance(new_profile, UserProfile)
|
||||||
|
|
||||||
|
def test_sync_user_role_with_groups(self):
|
||||||
|
self.user.role = User.Roles.MODERATOR
|
||||||
|
self.user.save()
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.ADMIN
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.filter(name=User.Roles.MODERATOR).exists())
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.ADMIN).exists())
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.SUPERUSER
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.filter(name=User.Roles.ADMIN).exists())
|
||||||
|
self.assertTrue(self.user.groups.filter(name=User.Roles.SUPERUSER).exists())
|
||||||
|
self.assertTrue(self.user.is_superuser)
|
||||||
|
self.assertTrue(self.user.is_staff)
|
||||||
|
|
||||||
|
self.user.role = User.Roles.USER
|
||||||
|
self.user.save()
|
||||||
|
self.assertFalse(self.user.groups.exists())
|
||||||
|
self.assertFalse(self.user.is_superuser)
|
||||||
|
self.assertFalse(self.user.is_staff)
|
||||||
|
|
||||||
|
def test_create_default_groups(self):
|
||||||
|
# Create some permissions for testing
|
||||||
|
content_type = ContentType.objects.get_for_model(User)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="change_review",
|
||||||
|
name="Can change review",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="delete_review",
|
||||||
|
name="Can delete review",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
Permission.objects.create(
|
||||||
|
codename="change_user",
|
||||||
|
name="Can change user",
|
||||||
|
content_type=content_type,
|
||||||
|
)
|
||||||
|
|
||||||
|
create_default_groups()
|
||||||
|
|
||||||
|
moderator_group = Group.objects.get(name=User.Roles.MODERATOR)
|
||||||
|
self.assertIsNotNone(moderator_group)
|
||||||
|
self.assertTrue(
|
||||||
|
moderator_group.permissions.filter(codename="change_review").exists()
|
||||||
|
)
|
||||||
|
self.assertFalse(
|
||||||
|
moderator_group.permissions.filter(codename="change_user").exists()
|
||||||
|
)
|
||||||
|
|
||||||
|
admin_group = Group.objects.get(name=User.Roles.ADMIN)
|
||||||
|
self.assertIsNotNone(admin_group)
|
||||||
|
self.assertTrue(
|
||||||
|
admin_group.permissions.filter(codename="change_review").exists()
|
||||||
|
)
|
||||||
|
self.assertTrue(admin_group.permissions.filter(codename="change_user").exists())
|
||||||
48
backend/apps/accounts/urls.py
Normal file
@@ -0,0 +1,48 @@
|
|||||||
|
from django.urls import path
|
||||||
|
from django.contrib.auth import views as auth_views
|
||||||
|
from allauth.account.views import LogoutView
|
||||||
|
from . import views
|
||||||
|
|
||||||
|
app_name = "accounts"
|
||||||
|
|
||||||
|
urlpatterns = [
|
||||||
|
# Override allauth's login and signup views with our Turnstile-enabled
|
||||||
|
# versions
|
||||||
|
path("login/", views.CustomLoginView.as_view(), name="account_login"),
|
||||||
|
path("signup/", views.CustomSignupView.as_view(), name="account_signup"),
|
||||||
|
# Authentication views
|
||||||
|
path("logout/", LogoutView.as_view(), name="logout"),
|
||||||
|
path(
|
||||||
|
"password_change/",
|
||||||
|
auth_views.PasswordChangeView.as_view(),
|
||||||
|
name="password_change",
|
||||||
|
),
|
||||||
|
path(
|
||||||
|
"password_change/done/",
|
||||||
|
auth_views.PasswordChangeDoneView.as_view(),
|
||||||
|
name="password_change_done",
|
||||||
|
),
|
||||||
|
path(
|
||||||
|
"password_reset/",
|
||||||
|
auth_views.PasswordResetView.as_view(),
|
||||||
|
name="password_reset",
|
||||||
|
),
|
||||||
|
path(
|
||||||
|
"password_reset/done/",
|
||||||
|
auth_views.PasswordResetDoneView.as_view(),
|
||||||
|
name="password_reset_done",
|
||||||
|
),
|
||||||
|
path(
|
||||||
|
"reset/<uidb64>/<token>/",
|
||||||
|
auth_views.PasswordResetConfirmView.as_view(),
|
||||||
|
name="password_reset_confirm",
|
||||||
|
),
|
||||||
|
path(
|
||||||
|
"reset/done/",
|
||||||
|
auth_views.PasswordResetCompleteView.as_view(),
|
||||||
|
name="password_reset_complete",
|
||||||
|
),
|
||||||
|
# Profile views
|
||||||
|
path("profile/", views.user_redirect_view, name="profile_redirect"),
|
||||||
|
path("settings/", views.SettingsView.as_view(), name="settings"),
|
||||||
|
]
|
||||||
426
backend/apps/accounts/views.py
Normal file
@@ -0,0 +1,426 @@
|
|||||||
|
from django.views.generic import DetailView, TemplateView
|
||||||
|
from django.contrib.auth import get_user_model
|
||||||
|
from django.shortcuts import get_object_or_404, redirect, render
|
||||||
|
from django.contrib.auth.decorators import login_required
|
||||||
|
from django.contrib.auth.mixins import LoginRequiredMixin
|
||||||
|
from django.contrib import messages
|
||||||
|
from django.core.exceptions import ValidationError
|
||||||
|
from django.template.loader import render_to_string
|
||||||
|
from django.utils.crypto import get_random_string
|
||||||
|
from django.utils import timezone
|
||||||
|
from datetime import timedelta
|
||||||
|
from django.contrib.sites.shortcuts import get_current_site
|
||||||
|
from django.contrib.sites.models import Site
|
||||||
|
from django.contrib.sites.requests import RequestSite
|
||||||
|
from django.db.models import QuerySet
|
||||||
|
from django.http import HttpResponseRedirect, HttpResponse, HttpRequest
|
||||||
|
from django.urls import reverse
|
||||||
|
from django.contrib.auth import login
|
||||||
|
from django.core.files.uploadedfile import UploadedFile
|
||||||
|
from apps.accounts.models import (
|
||||||
|
User,
|
||||||
|
PasswordReset,
|
||||||
|
TopList,
|
||||||
|
EmailVerification,
|
||||||
|
UserProfile,
|
||||||
|
)
|
||||||
|
from apps.email_service.services import EmailService
|
||||||
|
from apps.parks.models import ParkReview
|
||||||
|
from apps.rides.models import RideReview
|
||||||
|
from allauth.account.views import LoginView, SignupView
|
||||||
|
from .mixins import TurnstileMixin
|
||||||
|
from typing import Dict, Any, Optional, Union, cast
|
||||||
|
from django_htmx.http import HttpResponseClientRefresh
|
||||||
|
from contextlib import suppress
|
||||||
|
import re
|
||||||
|
|
||||||
|
UserModel = get_user_model()
|
||||||
|
|
||||||
|
|
||||||
|
class CustomLoginView(TurnstileMixin, LoginView):
|
||||||
|
def form_valid(self, form):
|
||||||
|
try:
|
||||||
|
self.validate_turnstile(self.request)
|
||||||
|
except ValidationError as e:
|
||||||
|
form.add_error(None, str(e))
|
||||||
|
return self.form_invalid(form)
|
||||||
|
|
||||||
|
response = super().form_valid(form)
|
||||||
|
return (
|
||||||
|
HttpResponseClientRefresh()
|
||||||
|
if getattr(self.request, "htmx", False)
|
||||||
|
else response
|
||||||
|
)
|
||||||
|
|
||||||
|
def form_invalid(self, form):
|
||||||
|
if getattr(self.request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
self.request,
|
||||||
|
"account/partials/login_form.html",
|
||||||
|
self.get_context_data(form=form),
|
||||||
|
)
|
||||||
|
return super().form_invalid(form)
|
||||||
|
|
||||||
|
def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||||
|
if getattr(request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
request,
|
||||||
|
"account/partials/login_modal.html",
|
||||||
|
self.get_context_data(),
|
||||||
|
)
|
||||||
|
return super().get(request, *args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class CustomSignupView(TurnstileMixin, SignupView):
|
||||||
|
def form_valid(self, form):
|
||||||
|
try:
|
||||||
|
self.validate_turnstile(self.request)
|
||||||
|
except ValidationError as e:
|
||||||
|
form.add_error(None, str(e))
|
||||||
|
return self.form_invalid(form)
|
||||||
|
|
||||||
|
response = super().form_valid(form)
|
||||||
|
return (
|
||||||
|
HttpResponseClientRefresh()
|
||||||
|
if getattr(self.request, "htmx", False)
|
||||||
|
else response
|
||||||
|
)
|
||||||
|
|
||||||
|
def form_invalid(self, form):
|
||||||
|
if getattr(self.request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
self.request,
|
||||||
|
"account/partials/signup_modal.html",
|
||||||
|
self.get_context_data(form=form),
|
||||||
|
)
|
||||||
|
return super().form_invalid(form)
|
||||||
|
|
||||||
|
def get(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
|
||||||
|
if getattr(request, "htmx", False):
|
||||||
|
return render(
|
||||||
|
request,
|
||||||
|
"account/partials/signup_modal.html",
|
||||||
|
self.get_context_data(),
|
||||||
|
)
|
||||||
|
return super().get(request, *args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
@login_required
|
||||||
|
def user_redirect_view(request: HttpRequest) -> HttpResponse:
|
||||||
|
user = cast(User, request.user)
|
||||||
|
return redirect("profile", username=user.username)
|
||||||
|
|
||||||
|
|
||||||
|
def handle_social_login(request: HttpRequest, email: str) -> HttpResponse:
|
||||||
|
if sociallogin := request.session.get("socialaccount_sociallogin"):
|
||||||
|
sociallogin.user.email = email
|
||||||
|
sociallogin.save()
|
||||||
|
login(request, sociallogin.user)
|
||||||
|
    del request.session["socialaccount_sociallogin"]
    messages.success(request, "Successfully logged in")
    return redirect("/")


def email_required(request: HttpRequest) -> HttpResponse:
    if not request.session.get("socialaccount_sociallogin"):
        messages.error(request, "No social login in progress")
        return redirect("/")

    if request.method == "POST":
        if email := request.POST.get("email"):
            return handle_social_login(request, email)
        messages.error(request, "Email is required")
        return render(
            request,
            "accounts/email_required.html",
            {"error": "Email is required"},
        )

    return render(request, "accounts/email_required.html")


class ProfileView(DetailView):
    model = User
    template_name = "accounts/profile.html"
    context_object_name = "profile_user"
    slug_field = "username"
    slug_url_kwarg = "username"

    def get_queryset(self) -> QuerySet[User]:
        return User.objects.select_related("profile")

    def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
        context = super().get_context_data(**kwargs)
        user = cast(User, self.get_object())

        context["park_reviews"] = self._get_user_park_reviews(user)
        context["ride_reviews"] = self._get_user_ride_reviews(user)
        context["top_lists"] = self._get_user_top_lists(user)

        return context

    def _get_user_park_reviews(self, user: User) -> QuerySet[ParkReview]:
        return (
            ParkReview.objects.filter(user=user, is_published=True)
            .select_related("user", "user__profile", "park")
            .order_by("-created_at")[:5]
        )

    def _get_user_ride_reviews(self, user: User) -> QuerySet[RideReview]:
        return (
            RideReview.objects.filter(user=user, is_published=True)
            .select_related("user", "user__profile", "ride")
            .order_by("-created_at")[:5]
        )

    def _get_user_top_lists(self, user: User) -> QuerySet[TopList]:
        return (
            TopList.objects.filter(user=user)
            .select_related("user", "user__profile")
            .prefetch_related("items")
            .order_by("-created_at")[:5]
        )


class SettingsView(LoginRequiredMixin, TemplateView):
    template_name = "accounts/settings.html"

    def get_context_data(self, **kwargs: Any) -> Dict[str, Any]:
        context = super().get_context_data(**kwargs)
        context["user"] = self.request.user
        return context

    def _handle_profile_update(self, request: HttpRequest) -> None:
        user = cast(User, request.user)
        profile = get_object_or_404(UserProfile, user=user)

        if display_name := request.POST.get("display_name"):
            profile.display_name = display_name

        if "avatar" in request.FILES:
            avatar_file = cast(UploadedFile, request.FILES["avatar"])
            profile.avatar.save(avatar_file.name, avatar_file, save=False)
        profile.save()

        user.save()
        messages.success(request, "Profile updated successfully")

    def _validate_password(self, password: str) -> bool:
        """Validate password meets requirements."""
        return (
            len(password) >= 8
            and bool(re.search(r"[A-Z]", password))
            and bool(re.search(r"[a-z]", password))
            and bool(re.search(r"[0-9]", password))
        )

    def _send_password_change_confirmation(
        self, request: HttpRequest, user: User
    ) -> None:
        """Send password change confirmation email."""
        site = get_current_site(request)
        context = {
            "user": user,
            "site_name": site.name,
        }

        email_html = render_to_string(
            "accounts/email/password_change_confirmation.html", context
        )

        EmailService.send_email(
            to=user.email,
            subject="Password Changed Successfully",
            text="Your password has been changed successfully.",
            site=site,
            html=email_html,
        )

    def _handle_password_change(
        self, request: HttpRequest
    ) -> Optional[HttpResponseRedirect]:
        user = cast(User, request.user)
        old_password = request.POST.get("old_password", "")
        new_password = request.POST.get("new_password", "")
        confirm_password = request.POST.get("confirm_password", "")

        if not user.check_password(old_password):
            messages.error(request, "Current password is incorrect")
            return None

        if new_password != confirm_password:
            messages.error(request, "New passwords do not match")
            return None

        if not self._validate_password(new_password):
            messages.error(
                request,
                "Password must be at least 8 characters and contain uppercase, lowercase, and numbers",
            )
            return None

        user.set_password(new_password)
        user.save()

        self._send_password_change_confirmation(request, user)
        messages.success(
            request,
            "Password changed successfully. Please check your email for confirmation.",
        )
        return HttpResponseRedirect(reverse("account_login"))

    def _handle_email_change(self, request: HttpRequest) -> None:
        if new_email := request.POST.get("new_email"):
            self._send_email_verification(request, new_email)
            messages.success(
                request, "Verification email sent to your new email address"
            )
        else:
            messages.error(request, "New email is required")

    def _send_email_verification(self, request: HttpRequest, new_email: str) -> None:
        user = cast(User, request.user)
        token = get_random_string(64)
        EmailVerification.objects.update_or_create(user=user, defaults={"token": token})

        site = cast(Site, get_current_site(request))
        verification_url = reverse("verify_email", kwargs={"token": token})

        context = {
            "user": user,
            "verification_url": verification_url,
            "site_name": site.name,
        }

        email_html = render_to_string("accounts/email/verify_email.html", context)
        EmailService.send_email(
            to=new_email,
            subject="Verify your new email address",
            text="Click the link to verify your new email address",
            site=site,
            html=email_html,
        )

        user.pending_email = new_email
        user.save()

    def post(self, request: HttpRequest, *args: Any, **kwargs: Any) -> HttpResponse:
        action = request.POST.get("action")

        if action == "update_profile":
            self._handle_profile_update(request)
        elif action == "change_password":
            if response := self._handle_password_change(request):
                return response
        elif action == "change_email":
            self._handle_email_change(request)

        return self.get(request, *args, **kwargs)


def create_password_reset_token(user: User) -> str:
    token = get_random_string(64)
    PasswordReset.objects.update_or_create(
        user=user,
        defaults={
            "token": token,
            "expires_at": timezone.now() + timedelta(hours=24),
        },
    )
    return token


def send_password_reset_email(
    user: User, site: Union[Site, RequestSite], token: str
) -> None:
    reset_url = reverse("password_reset_confirm", kwargs={"token": token})
    context = {
        "user": user,
        "reset_url": reset_url,
        "site_name": site.name,
    }
    email_html = render_to_string("accounts/email/password_reset.html", context)

    EmailService.send_email(
        to=user.email,
        subject="Reset your password",
        text="Click the link to reset your password",
        site=site,
        html=email_html,
    )


def request_password_reset(request: HttpRequest) -> HttpResponse:
    if request.method != "POST":
        return render(request, "accounts/password_reset.html")

    if not (email := request.POST.get("email")):
        messages.error(request, "Email is required")
        return redirect("account_reset_password")

    with suppress(User.DoesNotExist):
        user = User.objects.get(email=email)
        token = create_password_reset_token(user)
        site = get_current_site(request)
        send_password_reset_email(user, site, token)

    messages.success(request, "Password reset email sent")
    return redirect("account_login")


def handle_password_reset(
    request: HttpRequest,
    user: User,
    new_password: str,
    reset: PasswordReset,
    site: Union[Site, RequestSite],
) -> None:
    user.set_password(new_password)
    user.save()

    reset.used = True
    reset.save()

    send_password_reset_confirmation(user, site)
    messages.success(request, "Password reset successfully")


def send_password_reset_confirmation(
    user: User, site: Union[Site, RequestSite]
) -> None:
    context = {
        "user": user,
        "site_name": site.name,
    }
    email_html = render_to_string(
        "accounts/email/password_reset_complete.html", context
    )

    EmailService.send_email(
        to=user.email,
        subject="Password Reset Complete",
        text="Your password has been reset successfully.",
        site=site,
        html=email_html,
    )


def reset_password(request: HttpRequest, token: str) -> HttpResponse:
    try:
        reset = PasswordReset.objects.select_related("user").get(
            token=token, expires_at__gt=timezone.now(), used=False
        )

        if request.method == "POST":
            if new_password := request.POST.get("new_password"):
                site = get_current_site(request)
                handle_password_reset(request, reset.user, new_password, reset, site)
                return redirect("account_login")

            messages.error(request, "New password is required")

        return render(request, "accounts/password_reset_confirm.html", {"token": token})

    except PasswordReset.DoesNotExist:
        messages.error(request, "Invalid or expired reset token")
        return redirect("account_reset_password")
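# Illustrative sketch (not part of this diff): the accounts urls.py is not shown here, so
# the paths below are assumptions inferred from the reverse() calls and redirects above
# ("verify_email", "password_reset_confirm", "account_reset_password", "account_login").
#
# from django.urls import path
# from . import views
#
# urlpatterns = [
#     path("email-required/", views.email_required, name="email_required"),
#     path("password-reset/", views.request_password_reset, name="account_reset_password"),
#     path("password-reset/<str:token>/", views.reset_password, name="password_reset_confirm"),
#     path("settings/", views.SettingsView.as_view(), name="account_settings"),
#     path("profile/<str:username>/", views.ProfileView.as_view(), name="profile"),
# ]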
5  backend/apps/api/__init__.py  Normal file
@@ -0,0 +1,5 @@
"""
Consolidated API app for ThrillWiki.

This app provides a unified, versioned API interface for all ThrillWiki resources.
"""
17  backend/apps/api/apps.py  Normal file
@@ -0,0 +1,17 @@
"""Django app configuration for the consolidated API."""

from django.apps import AppConfig


class ApiConfig(AppConfig):
    """Configuration for the consolidated API app."""

    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.api"

    def ready(self):
        """Import schema extensions when app is ready."""
        try:
            import apps.api.v1.schema  # noqa: F401
        except ImportError:
            pass
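# Illustrative sketch (not part of this diff): for ApiConfig.ready() to run and the v1
# schema extensions to be imported, the app must be registered in the project settings.
# The settings module itself is not shown here, so treat this as an assumption.
#
# INSTALLED_APPS = [
#     # ...
#     "apps.api",
# ]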
6  backend/apps/api/v1/__init__.py  Normal file
@@ -0,0 +1,6 @@
"""
ThrillWiki API v1.

This module provides the version 1 REST API for ThrillWiki, consolidating
all endpoints under a unified, well-documented API structure.
"""
334  backend/apps/api/v1/schema.py  Normal file
@@ -0,0 +1,334 @@
"""
Schema extensions and customizations for drf-spectacular.

This module provides custom extensions to improve OpenAPI schema generation
for the ThrillWiki API, including better documentation and examples.
"""

from drf_spectacular.openapi import AutoSchema
from drf_spectacular.utils import OpenApiExample
from drf_spectacular.types import OpenApiTypes
|
|
||||||
|
|
||||||
|
# Custom examples for common serializers
|
||||||
|
|
||||||
|
PARK_EXAMPLE = {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Cedar Point",
|
||||||
|
"slug": "cedar-point",
|
||||||
|
"description": "The Roller Coaster Capital of the World",
|
||||||
|
"status": "OPERATING",
|
||||||
|
"opening_date": "1870-07-04",
|
||||||
|
"closing_date": None,
|
||||||
|
"location": {
|
||||||
|
"latitude": 41.4793,
|
||||||
|
"longitude": -82.6833,
|
||||||
|
"city": "Sandusky",
|
||||||
|
"state": "Ohio",
|
||||||
|
"country": "United States",
|
||||||
|
"formatted_address": "Sandusky, OH, United States",
|
||||||
|
},
|
||||||
|
"operator": {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Cedar Fair",
|
||||||
|
"slug": "cedar-fair",
|
||||||
|
"roles": ["OPERATOR", "PROPERTY_OWNER"],
|
||||||
|
},
|
||||||
|
"property_owner": {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Cedar Fair",
|
||||||
|
"slug": "cedar-fair",
|
||||||
|
"roles": ["OPERATOR", "PROPERTY_OWNER"],
|
||||||
|
},
|
||||||
|
"area_count": 15,
|
||||||
|
"ride_count": 70,
|
||||||
|
"operating_rides_count": 68,
|
||||||
|
"roller_coaster_count": 17,
|
||||||
|
}
|
||||||
|
|
||||||
|
RIDE_EXAMPLE = {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Steel Vengeance",
|
||||||
|
"slug": "steel-vengeance",
|
||||||
|
"description": "A hybrid wooden/steel roller coaster",
|
||||||
|
"category": "ROLLER_COASTER",
|
||||||
|
"status": "OPERATING",
|
||||||
|
"opening_date": "2018-05-05",
|
||||||
|
"closing_date": None,
|
||||||
|
"park": {"id": 1, "name": "Cedar Point", "slug": "cedar-point"},
|
||||||
|
"manufacturer": {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Rocky Mountain Construction",
|
||||||
|
"slug": "rmc",
|
||||||
|
"roles": ["MANUFACTURER"],
|
||||||
|
},
|
||||||
|
"designer": {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Rocky Mountain Construction",
|
||||||
|
"slug": "rmc",
|
||||||
|
"roles": ["DESIGNER"],
|
||||||
|
},
|
||||||
|
"height_feet": 205,
|
||||||
|
"length_feet": 5740,
|
||||||
|
"speed_mph": 74,
|
||||||
|
"inversions": 4,
|
||||||
|
"duration_seconds": 150,
|
||||||
|
"capacity_per_hour": 1200,
|
||||||
|
"minimum_height_inches": 48,
|
||||||
|
"maximum_height_inches": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
COMPANY_EXAMPLE = {
|
||||||
|
"id": 1,
|
||||||
|
"name": "Cedar Fair",
|
||||||
|
"slug": "cedar-fair",
|
||||||
|
"roles": ["OPERATOR", "PROPERTY_OWNER"],
|
||||||
|
}
|
||||||
|
|
||||||
|
LOCATION_EXAMPLE = {
|
||||||
|
"latitude": 41.4793,
|
||||||
|
"longitude": -82.6833,
|
||||||
|
"city": "Sandusky",
|
||||||
|
"state": "Ohio",
|
||||||
|
"country": "United States",
|
||||||
|
"formatted_address": "Sandusky, OH, United States",
|
||||||
|
}
|
||||||
|
|
||||||
|
HISTORY_EVENT_EXAMPLE = {
|
||||||
|
"id": "12345678-1234-5678-9012-123456789012",
|
||||||
|
"pgh_created_at": "2024-01-15T14:30:00Z",
|
||||||
|
"pgh_label": "updated",
|
||||||
|
"pgh_model": "parks.park",
|
||||||
|
"pgh_obj_id": 1,
|
||||||
|
"pgh_context": {
|
||||||
|
"user_id": 42,
|
||||||
|
"request_id": "req_abc123",
|
||||||
|
"ip_address": "192.168.1.100",
|
||||||
|
},
|
||||||
|
"changed_fields": ["name", "description"],
|
||||||
|
"field_changes": {
|
||||||
|
"name": {"old_value": "Cedar Point Amusement Park", "new_value": "Cedar Point"},
|
||||||
|
"description": {
|
||||||
|
"old_value": "America's Roller Coast",
|
||||||
|
"new_value": "The Roller Coaster Capital of the World",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
PARK_HISTORY_EXAMPLE = {
|
||||||
|
"park": PARK_EXAMPLE,
|
||||||
|
"current_state": PARK_EXAMPLE,
|
||||||
|
"summary": {
|
||||||
|
"total_events": 25,
|
||||||
|
"first_recorded": "2023-01-01T00:00:00Z",
|
||||||
|
"last_modified": "2024-01-15T14:30:00Z",
|
||||||
|
"significant_changes": [
|
||||||
|
{
|
||||||
|
"date": "2024-01-15T14:30:00Z",
|
||||||
|
"event_type": "updated",
|
||||||
|
"description": "Name and description updated",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"date": "2023-06-01T10:00:00Z",
|
||||||
|
"event_type": "updated",
|
||||||
|
"description": "Operating status changed",
|
||||||
|
},
|
||||||
|
],
|
||||||
|
},
|
||||||
|
"events": [HISTORY_EVENT_EXAMPLE],
|
||||||
|
}
|
||||||
|
|
||||||
|
UNIFIED_HISTORY_TIMELINE_EXAMPLE = {
|
||||||
|
"summary": {
|
||||||
|
"total_events": 1250,
|
||||||
|
"events_returned": 100,
|
||||||
|
"event_type_breakdown": {"created": 45, "updated": 180, "deleted": 5},
|
||||||
|
"model_type_breakdown": {
|
||||||
|
"parks.park": 75,
|
||||||
|
"rides.ride": 120,
|
||||||
|
"companies.operator": 15,
|
||||||
|
"companies.manufacturer": 25,
|
||||||
|
"accounts.user": 30,
|
||||||
|
},
|
||||||
|
"time_range": {
|
||||||
|
"earliest": "2023-01-01T00:00:00Z",
|
||||||
|
"latest": "2024-01-15T14:30:00Z",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
"events": [
|
||||||
|
{
|
||||||
|
"id": "event_001",
|
||||||
|
"pgh_created_at": "2024-01-15T14:30:00Z",
|
||||||
|
"pgh_label": "updated",
|
||||||
|
"pgh_model": "parks.park",
|
||||||
|
"pgh_obj_id": 1,
|
||||||
|
"entity_name": "Cedar Point",
|
||||||
|
"entity_slug": "cedar-point",
|
||||||
|
"change_significance": "minor",
|
||||||
|
"change_summary": "Park description updated",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"id": "event_002",
|
||||||
|
"pgh_created_at": "2024-01-15T12:00:00Z",
|
||||||
|
"pgh_label": "created",
|
||||||
|
"pgh_model": "rides.ride",
|
||||||
|
"pgh_obj_id": 100,
|
||||||
|
"entity_name": "New Roller Coaster",
|
||||||
|
"entity_slug": "new-roller-coaster",
|
||||||
|
"change_significance": "major",
|
||||||
|
"change_summary": "New ride added to park",
|
||||||
|
},
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
# OpenAPI schema customizations
|
||||||
|
|
||||||
|
|
||||||
|
def custom_preprocessing_hook(endpoints):
|
||||||
|
"""
|
||||||
|
Custom preprocessing hook to modify endpoints before schema generation.
|
||||||
|
|
||||||
|
This can be used to filter out certain endpoints, modify their metadata,
|
||||||
|
or add custom documentation.
|
||||||
|
"""
|
||||||
|
# Filter out any endpoints we don't want in the public API
|
||||||
|
filtered = []
|
||||||
|
for path, path_regex, method, callback in endpoints:
|
||||||
|
# Skip internal or debug endpoints
|
||||||
|
if "/debug/" not in path and "/internal/" not in path:
|
||||||
|
filtered.append((path, path_regex, method, callback))
|
||||||
|
|
||||||
|
return filtered
|
||||||
|
|
||||||
|
|
||||||
|
def custom_postprocessing_hook(result, generator, request, public):
|
||||||
|
"""
|
||||||
|
Custom postprocessing hook to modify the generated schema.
|
||||||
|
|
||||||
|
This can be used to add custom metadata, modify response schemas,
|
||||||
|
or enhance the overall API documentation.
|
||||||
|
"""
|
||||||
|
# Add custom info to the schema
|
||||||
|
if "info" in result:
|
||||||
|
result["info"]["contact"] = {
|
||||||
|
"name": "ThrillWiki API Support",
|
||||||
|
"email": "api@thrillwiki.com",
|
||||||
|
"url": "https://thrillwiki.com/support",
|
||||||
|
}
|
||||||
|
|
||||||
|
result["info"]["license"] = {
|
||||||
|
"name": "MIT",
|
||||||
|
"url": "https://opensource.org/licenses/MIT",
|
||||||
|
}
|
||||||
|
|
||||||
|
# Add custom tags with descriptions
|
||||||
|
if "tags" not in result:
|
||||||
|
result["tags"] = []
|
||||||
|
|
||||||
|
result["tags"].extend(
|
||||||
|
[
|
||||||
|
{
|
||||||
|
"name": "Parks",
|
||||||
|
"description": "Operations related to theme parks, including CRUD operations and statistics",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Rides",
|
||||||
|
"description": "Operations related to rides and attractions within theme parks",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "History",
|
||||||
|
"description": "Historical change tracking for all entities, providing complete audit trails and version history",
|
||||||
|
"externalDocs": {
|
||||||
|
"description": "Learn more about pghistory",
|
||||||
|
"url": "https://django-pghistory.readthedocs.io/",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Statistics",
|
||||||
|
"description": "Statistical endpoints providing aggregated data and insights",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Reviews",
|
||||||
|
"description": "User reviews and ratings for parks and rides",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Authentication",
|
||||||
|
"description": "User authentication and account management endpoints",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Health",
|
||||||
|
"description": "System health checks and monitoring endpoints",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"name": "Recent Changes",
|
||||||
|
"description": "Endpoints for accessing recently changed entities by type and change category",
|
||||||
|
},
|
||||||
|
]
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add custom servers if not present
|
||||||
|
if "servers" not in result:
|
||||||
|
result["servers"] = [
|
||||||
|
{
|
||||||
|
"url": "https://api.thrillwiki.com/v1",
|
||||||
|
"description": "Production server",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "https://staging-api.thrillwiki.com/v1",
|
||||||
|
"description": "Staging server",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"url": "http://localhost:8000/api/v1",
|
||||||
|
"description": "Development server",
|
||||||
|
},
|
||||||
|
]
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
# Custom AutoSchema class for enhanced documentation
|
||||||
|
class ThrillWikiAutoSchema(AutoSchema):
|
||||||
|
"""
|
||||||
|
Custom AutoSchema class that provides enhanced documentation
|
||||||
|
for ThrillWiki API endpoints.
|
||||||
|
"""
|
||||||
|
|
||||||
|
def get_operation_id(self):
|
||||||
|
"""Generate meaningful operation IDs."""
|
||||||
|
if hasattr(self.view, "basename"):
|
||||||
|
basename = self.view.basename
|
||||||
|
else:
|
||||||
|
basename = getattr(self.view, "__class__", self.view).__name__.lower()
|
||||||
|
if basename.endswith("viewset"):
|
||||||
|
basename = basename[:-7] # Remove 'viewset' suffix
|
||||||
|
|
||||||
|
action = self.method_mapping.get(self.method.lower(), self.method.lower())
|
||||||
|
return f"{basename}_{action}"
|
||||||
|
|
||||||
|
def get_tags(self):
|
||||||
|
"""Generate tags based on the viewset."""
|
||||||
|
if hasattr(self.view, "basename"):
|
||||||
|
return [self.view.basename.title()]
|
||||||
|
return super().get_tags()
|
||||||
|
|
||||||
|
def get_summary(self):
|
||||||
|
"""Generate summary from docstring or method name."""
|
||||||
|
summary = super().get_summary()
|
||||||
|
if summary:
|
||||||
|
return summary
|
||||||
|
|
||||||
|
# Generate from method and model
|
||||||
|
action = self.method_mapping.get(self.method.lower(), self.method.lower())
|
||||||
|
model_name = getattr(self.view, "basename", "resource")
|
||||||
|
|
||||||
|
action_map = {
|
||||||
|
"list": f"List {model_name}",
|
||||||
|
"create": f"Create {model_name}",
|
||||||
|
"retrieve": f"Get {model_name} details",
|
||||||
|
"update": f"Update {model_name}",
|
||||||
|
"partial_update": f"Partially update {model_name}",
|
||||||
|
"destroy": f"Delete {model_name}",
|
||||||
|
}
|
||||||
|
|
||||||
|
return action_map.get(action, f"{action.title()} {model_name}")
|
||||||
2179  backend/apps/api/v1/serializers.py  Normal file
File diff suppressed because it is too large
142  backend/apps/api/v1/urls.py  Normal file
@@ -0,0 +1,142 @@
"""
URL configuration for ThrillWiki API v1.

This module provides unified API routing following RESTful conventions
and DRF Router patterns for automatic URL generation.
"""

from django.urls import path, include
from rest_framework.routers import DefaultRouter
from drf_spectacular.views import (
    SpectacularAPIView,
    SpectacularSwaggerView,
    SpectacularRedocView,
)

from .viewsets import (
    ParkViewSet,
    RideViewSet,
    ParkReadOnlyViewSet,
    RideReadOnlyViewSet,
    LoginAPIView,
    SignupAPIView,
    LogoutAPIView,
    CurrentUserAPIView,
    PasswordResetAPIView,
    PasswordChangeAPIView,
    SocialProvidersAPIView,
    AuthStatusAPIView,
    HealthCheckAPIView,
    PerformanceMetricsAPIView,
    SimpleHealthAPIView,
    # History viewsets
    ParkHistoryViewSet,
    RideHistoryViewSet,
    UnifiedHistoryViewSet,
    # New comprehensive viewsets
    ParkAreaViewSet,
    ParkLocationViewSet,
    CompanyViewSet,
    RideModelViewSet,
    RollerCoasterStatsViewSet,
    RideLocationViewSet,
    RideReviewViewSet,
    UserProfileViewSet,
    TopListViewSet,
    TopListItemViewSet,
)

# Create the main API router
router = DefaultRouter()

# Register ViewSets with descriptive prefixes

# Core models
router.register(r"parks", ParkViewSet, basename="park")
router.register(r"rides", RideViewSet, basename="ride")

# Park-related models
router.register(r"park-areas", ParkAreaViewSet, basename="park-area")
router.register(r"park-locations", ParkLocationViewSet, basename="park-location")

# Company models
router.register(r"companies", CompanyViewSet, basename="company")

# Ride-related models
router.register(r"ride-models", RideModelViewSet, basename="ride-model")
router.register(
    r"roller-coaster-stats", RollerCoasterStatsViewSet, basename="roller-coaster-stats"
)
router.register(r"ride-locations", RideLocationViewSet, basename="ride-location")
router.register(r"ride-reviews", RideReviewViewSet, basename="ride-review")

# User-related models
router.register(r"user-profiles", UserProfileViewSet, basename="user-profile")
router.register(r"top-lists", TopListViewSet, basename="top-list")
router.register(r"top-list-items", TopListItemViewSet, basename="top-list-item")

# Register read-only endpoints for reference data
router.register(r"ref/parks", ParkReadOnlyViewSet, basename="park-ref")
router.register(r"ref/rides", RideReadOnlyViewSet, basename="ride-ref")

app_name = "api_v1"

urlpatterns = [
    # API Documentation endpoints
    path("schema/", SpectacularAPIView.as_view(), name="schema"),
    path(
        "docs/",
        SpectacularSwaggerView.as_view(url_name="api_v1:schema"),
        name="swagger-ui",
    ),
    path(
        "redoc/", SpectacularRedocView.as_view(url_name="api_v1:schema"), name="redoc"
    ),
    # Authentication endpoints
    path("auth/login/", LoginAPIView.as_view(), name="login"),
    path("auth/signup/", SignupAPIView.as_view(), name="signup"),
    path("auth/logout/", LogoutAPIView.as_view(), name="logout"),
    path("auth/user/", CurrentUserAPIView.as_view(), name="current-user"),
    path("auth/password/reset/", PasswordResetAPIView.as_view(), name="password-reset"),
    path(
        "auth/password/change/", PasswordChangeAPIView.as_view(), name="password-change"
    ),
    path("auth/providers/", SocialProvidersAPIView.as_view(), name="social-providers"),
    path("auth/status/", AuthStatusAPIView.as_view(), name="auth-status"),
    # Health check endpoints
    path("health/", HealthCheckAPIView.as_view(), name="health-check"),
    path("health/simple/", SimpleHealthAPIView.as_view(), name="simple-health"),
    path(
        "health/performance/",
        PerformanceMetricsAPIView.as_view(),
        name="performance-metrics",
    ),
    # History endpoints
    path(
        "history/timeline/",
        UnifiedHistoryViewSet.as_view({"get": "list"}),
        name="unified-history-timeline",
    ),
    path(
        "parks/<str:park_slug>/history/",
        ParkHistoryViewSet.as_view({"get": "list"}),
        name="park-history-list",
    ),
    path(
        "parks/<str:park_slug>/history/detail/",
        ParkHistoryViewSet.as_view({"get": "retrieve"}),
        name="park-history-detail",
    ),
    path(
        "parks/<str:park_slug>/rides/<str:ride_slug>/history/",
        RideHistoryViewSet.as_view({"get": "list"}),
        name="ride-history-list",
    ),
    path(
        "parks/<str:park_slug>/rides/<str:ride_slug>/history/detail/",
        RideHistoryViewSet.as_view({"get": "retrieve"}),
        name="ride-history-detail",
    ),
    # Include all router-generated URLs
    path("", include(router.urls)),
]
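# Illustrative sketch (not part of this diff): the project-level urls.py is not included in
# this changeset, so the mount point below is an assumption based on the
# "http://localhost:8000/api/v1" development server listed in the schema module.
#
# from django.urls import include, path
#
# urlpatterns = [
#     path("api/v1/", include("apps.api.v1.urls", namespace="api_v1")),
# ]
#
# With that prefix, DefaultRouter would expose routes such as /api/v1/parks/,
# /api/v1/ride-reviews/ and /api/v1/companies/, alongside the explicit history paths like
# /api/v1/parks/<park_slug>/history/ declared above.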
2910  backend/apps/api/v1/viewsets.py  Normal file
File diff suppressed because it is too large
43  backend/apps/context_portal/alembic.ini  Normal file
@@ -0,0 +1,43 @@

# A generic Alembic configuration file.

[alembic]
# path to migration scripts
script_location = alembic

# The database URL is now set dynamically by ConPort's run_migrations function.
# sqlalchemy.url = sqlite:///your_database.db
# ... other Alembic settings ...
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
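# Illustrative sketch (not part of this diff): the commented-out sqlalchemy.url above is
# filled in at runtime. ConPort's run_migrations function is not shown in this changeset,
# so the code below is only an assumption about how the URL could be injected.
#
# from alembic import command
# from alembic.config import Config
#
# def run_migrations(db_path: str) -> None:
#     cfg = Config("alembic.ini")
#     cfg.set_main_option("sqlalchemy.url", f"sqlite:///{db_path}")  # set dynamically
#     command.upgrade(cfg, "head")  # apply all migrations up to the latest revision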
76  backend/apps/context_portal/alembic/env.py  Normal file
@@ -0,0 +1,76 @@
from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line prevents the need to have a separate logging config file.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata)

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
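# Illustrative usage (not part of this diff): env.py is driven by the Alembic CLI, so a
# typical invocation from the directory containing alembic.ini (assumed to be
# backend/apps/context_portal) would look like:
#
#   alembic upgrade head        # online mode: builds an Engine and applies migrations
#   alembic upgrade head --sql  # offline mode: emits the SQL script without a DBAPI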
@@ -0,0 +1,247 @@
|
|||||||
|
"""Initial schema
|
||||||
|
|
||||||
|
Revision ID: 20250617
|
||||||
|
Revises:
|
||||||
|
Create Date: 2025-06-17 15:00:00.000000
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
import json
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision = "20250617"
|
||||||
|
down_revision = None
|
||||||
|
branch_labels = None
|
||||||
|
depends_on = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
# ### commands auto-generated by Alembic - please adjust! ###
|
||||||
|
op.create_table(
|
||||||
|
"active_context",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"active_context_history",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("version", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.Column("change_source", sa.String(length=255), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"context_links",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("workspace_id", sa.String(length=1024), nullable=False),
|
||||||
|
sa.Column("source_item_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("source_item_id", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("target_item_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("target_item_id", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("relationship_type", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=True),
|
||||||
|
sa.Column(
|
||||||
|
"timestamp",
|
||||||
|
sa.DateTime(),
|
||||||
|
server_default=sa.text("(CURRENT_TIMESTAMP)"),
|
||||||
|
nullable=False,
|
||||||
|
),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_source_item_id"),
|
||||||
|
"context_links",
|
||||||
|
["source_item_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_source_item_type"),
|
||||||
|
"context_links",
|
||||||
|
["source_item_type"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_target_item_id"),
|
||||||
|
"context_links",
|
||||||
|
["target_item_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_context_links_target_item_type"),
|
||||||
|
"context_links",
|
||||||
|
["target_item_type"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"custom_data",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("category", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("key", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("value", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("category", "key"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"decisions",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("summary", sa.Text(), nullable=False),
|
||||||
|
sa.Column("rationale", sa.Text(), nullable=True),
|
||||||
|
sa.Column("implementation_details", sa.Text(), nullable=True),
|
||||||
|
sa.Column("tags", sa.Text(), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"product_context",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"product_context_history",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("version", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("content", sa.Text(), nullable=False),
|
||||||
|
sa.Column("change_source", sa.String(length=255), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"progress_entries",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("status", sa.String(length=50), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=False),
|
||||||
|
sa.Column("parent_id", sa.Integer(), nullable=True),
|
||||||
|
sa.ForeignKeyConstraint(
|
||||||
|
["parent_id"], ["progress_entries.id"], ondelete="SET NULL"
|
||||||
|
),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_table(
|
||||||
|
"system_patterns",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("timestamp", sa.DateTime(), nullable=False),
|
||||||
|
sa.Column("name", sa.String(length=255), nullable=False),
|
||||||
|
sa.Column("description", sa.Text(), nullable=True),
|
||||||
|
sa.Column("tags", sa.Text(), nullable=True),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("name"),
|
||||||
|
)
|
||||||
|
|
||||||
|
# Seed initial data
|
||||||
|
op.execute("INSERT INTO product_context (id, content) VALUES (1, '{}')")
|
||||||
|
op.execute("INSERT INTO active_context (id, content) VALUES (1, '{}')")
|
||||||
|
|
||||||
|
# Create FTS5 virtual table for decisions
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE VIRTUAL TABLE decisions_fts USING fts5(
|
||||||
|
summary,
|
||||||
|
rationale,
|
||||||
|
implementation_details,
|
||||||
|
tags,
|
||||||
|
content="decisions",
|
||||||
|
content_rowid="id"
|
||||||
|
);
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create triggers to keep the FTS table in sync with the decisions table
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_insert AFTER INSERT ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES (new.id, new.summary, new.rationale, new.implementation_details, new.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_delete AFTER DELETE ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (decisions_fts, rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES ('delete', old.id, old.summary, old.rationale, old.implementation_details, old.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER decisions_after_update AFTER UPDATE ON decisions
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO decisions_fts (decisions_fts, rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES ('delete', old.id, old.summary, old.rationale, old.implementation_details, old.tags);
|
||||||
|
INSERT INTO decisions_fts (rowid, summary, rationale, implementation_details, tags)
|
||||||
|
VALUES (new.id, new.summary, new.rationale, new.implementation_details, new.tags);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create FTS5 virtual table for custom_data
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE VIRTUAL TABLE custom_data_fts USING fts5(
|
||||||
|
category,
|
||||||
|
key,
|
||||||
|
value_text,
|
||||||
|
content="custom_data",
|
||||||
|
content_rowid="id"
|
||||||
|
);
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create triggers for custom_data_fts
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_insert AFTER INSERT ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (rowid, category, key, value_text)
|
||||||
|
VALUES (new.id, new.category, new.key, new.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_delete AFTER DELETE ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (custom_data_fts, rowid, category, key, value_text)
|
||||||
|
VALUES ('delete', old.id, old.category, old.key, old.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
op.execute(
|
||||||
|
"""
|
||||||
|
CREATE TRIGGER custom_data_after_update AFTER UPDATE ON custom_data
|
||||||
|
BEGIN
|
||||||
|
INSERT INTO custom_data_fts (custom_data_fts, rowid, category, key, value_text)
|
||||||
|
VALUES ('delete', old.id, old.category, old.key, old.value);
|
||||||
|
INSERT INTO custom_data_fts (rowid, category, key, value_text)
|
||||||
|
VALUES (new.id, new.category, new.key, new.value);
|
||||||
|
END;
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
# ### end Alembic commands ###
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
# ### commands auto-generated by Alembic - please adjust! ###
|
||||||
|
op.drop_table("system_patterns")
|
||||||
|
op.drop_table("progress_entries")
|
||||||
|
op.drop_table("product_context_history")
|
||||||
|
op.drop_table("product_context")
|
||||||
|
op.drop_table("decisions")
|
||||||
|
op.drop_table("custom_data")
|
||||||
|
op.drop_index(op.f("ix_context_links_target_item_type"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_target_item_id"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_source_item_type"), table_name="context_links")
|
||||||
|
op.drop_index(op.f("ix_context_links_source_item_id"), table_name="context_links")
|
||||||
|
op.drop_table("context_links")
|
||||||
|
op.drop_table("active_context_history")
|
||||||
|
op.drop_table("active_context")
|
||||||
|
# ### end Alembic commands ###
|
||||||
BIN  backend/apps/context_portal/context.db  Normal file
Binary file not shown.
@@ -1,29 +1,25 @@
 from django.contrib import admin
-from django.contrib.contenttypes.models import ContentType
 from django.utils.html import format_html
 from .models import SlugHistory


 @admin.register(SlugHistory)
 class SlugHistoryAdmin(admin.ModelAdmin):
-    list_display = ['content_object_link', 'old_slug', 'created_at']
-    list_filter = ['content_type', 'created_at']
-    search_fields = ['old_slug', 'object_id']
-    readonly_fields = ['content_type', 'object_id', 'old_slug', 'created_at']
-    date_hierarchy = 'created_at'
-    ordering = ['-created_at']
+    list_display = ["content_object_link", "old_slug", "created_at"]
+    list_filter = ["content_type", "created_at"]
+    search_fields = ["old_slug", "object_id"]
+    readonly_fields = ["content_type", "object_id", "old_slug", "created_at"]
+    date_hierarchy = "created_at"
+    ordering = ["-created_at"]

+    @admin.display(description="Object")
     def content_object_link(self, obj):
         """Create a link to the related object's admin page"""
         try:
             url = obj.content_object.get_absolute_url()
-            return format_html(
-                '<a href="{}">{}</a>',
-                url,
-                str(obj.content_object)
-            )
+            return format_html('<a href="{}">{}</a>', url, str(obj.content_object))
         except (AttributeError, ValueError):
             return str(obj.content_object)
-    content_object_link.short_description = 'Object'

     def has_add_permission(self, request):
         """Disable manual creation of slug history records"""
@@ -3,47 +3,52 @@ from django.contrib.contenttypes.fields import GenericForeignKey
|
|||||||
from django.contrib.contenttypes.models import ContentType
|
from django.contrib.contenttypes.models import ContentType
|
||||||
from django.utils import timezone
|
from django.utils import timezone
|
||||||
from django.db.models import Count
|
from django.db.models import Count
|
||||||
from django.conf import settings
|
from datetime import timedelta
|
||||||
|
import pghistory
|
||||||
|
|
||||||
|
|
||||||
|
@pghistory.track()
|
||||||
class PageView(models.Model):
|
class PageView(models.Model):
|
||||||
content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE, related_name='page_views')
|
content_type = models.ForeignKey(
|
||||||
|
ContentType, on_delete=models.CASCADE, related_name="page_views"
|
||||||
|
)
|
||||||
object_id = models.PositiveIntegerField()
|
object_id = models.PositiveIntegerField()
|
||||||
content_object = GenericForeignKey('content_type', 'object_id')
|
content_object = GenericForeignKey("content_type", "object_id")
|
||||||
|
|
||||||
timestamp = models.DateTimeField(auto_now_add=True, db_index=True)
|
timestamp = models.DateTimeField(auto_now_add=True, db_index=True)
|
||||||
ip_address = models.GenericIPAddressField()
|
ip_address = models.GenericIPAddressField()
|
||||||
user_agent = models.CharField(max_length=512, blank=True)
|
user_agent = models.CharField(max_length=512, blank=True)
|
||||||
|
|
||||||
class Meta:
|
class Meta:
|
||||||
indexes = [
|
indexes = [
|
||||||
models.Index(fields=['timestamp']),
|
models.Index(fields=["timestamp"]),
|
||||||
models.Index(fields=['content_type', 'object_id']),
|
models.Index(fields=["content_type", "object_id"]),
|
||||||
]
|
]
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def get_trending_items(cls, model_class, hours=24, limit=10):
|
def get_trending_items(cls, model_class, hours=24, limit=10):
|
||||||
"""Get trending items of a specific model class based on views in last X hours.
|
"""Get trending items of a specific model class based on views in last X hours.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
model_class: The model class to get trending items for (e.g., Park, Ride)
|
model_class: The model class to get trending items for (e.g., Park, Ride)
|
||||||
hours (int): Number of hours to look back for views (default: 24)
|
hours (int): Number of hours to look back for views (default: 24)
|
||||||
limit (int): Maximum number of items to return (default: 10)
|
limit (int): Maximum number of items to return (default: 10)
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
QuerySet: The trending items ordered by view count
|
QuerySet: The trending items ordered by view count
|
||||||
"""
|
"""
|
||||||
content_type = ContentType.objects.get_for_model(model_class)
|
content_type = ContentType.objects.get_for_model(model_class)
|
||||||
cutoff = timezone.now() - timezone.timedelta(hours=hours)
|
cutoff = timezone.now() - timedelta(hours=hours)
|
||||||
|
|
||||||
# Query through the ContentType relationship
|
# Query through the ContentType relationship
|
||||||
item_ids = cls.objects.filter(
|
item_ids = (
|
||||||
content_type=content_type,
|
cls.objects.filter(content_type=content_type, timestamp__gte=cutoff)
|
||||||
timestamp__gte=cutoff
|
.values("object_id")
|
||||||
).values('object_id').annotate(
|
.annotate(view_count=Count("id"))
|
||||||
view_count=Count('id')
|
.filter(view_count__gt=0)
|
||||||
).filter(
|
.order_by("-view_count")
|
||||||
view_count__gt=0
|
.values_list("object_id", flat=True)[:limit]
|
||||||
).order_by('-view_count').values_list('object_id', flat=True)[:limit]
|
)
|
||||||
|
|
||||||
# Get the actual items in the correct order
|
# Get the actual items in the correct order
|
||||||
if item_ids:
|
if item_ids:
|
||||||
@@ -51,7 +56,8 @@ class PageView(models.Model):
|
|||||||
id_list = list(item_ids)
|
id_list = list(item_ids)
|
||||||
# Use Case/When to preserve the ordering
|
# Use Case/When to preserve the ordering
|
||||||
from django.db.models import Case, When
|
from django.db.models import Case, When
|
||||||
|
|
||||||
preserved = Case(*[When(pk=pk, then=pos) for pos, pk in enumerate(id_list)])
|
preserved = Case(*[When(pk=pk, then=pos) for pos, pk in enumerate(id_list)])
|
||||||
return model_class.objects.filter(pk__in=id_list).order_by(preserved)
|
return model_class.objects.filter(pk__in=id_list).order_by(preserved)
|
||||||
|
|
||||||
return model_class.objects.none()
|
return model_class.objects.none()
|
||||||
@@ -3,15 +3,21 @@ Custom exception handling for ThrillWiki API.
|
|||||||
Provides standardized error responses following Django styleguide patterns.
|
Provides standardized error responses following Django styleguide patterns.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
import logging
|
|
||||||
from typing import Any, Dict, Optional
|
from typing import Any, Dict, Optional
|
||||||
|
|
||||||
from django.http import Http404
|
from django.http import Http404
|
||||||
from django.core.exceptions import PermissionDenied, ValidationError as DjangoValidationError
|
from django.core.exceptions import (
|
||||||
|
PermissionDenied,
|
||||||
|
ValidationError as DjangoValidationError,
|
||||||
|
)
|
||||||
from rest_framework import status
|
from rest_framework import status
|
||||||
from rest_framework.response import Response
|
from rest_framework.response import Response
|
||||||
from rest_framework.views import exception_handler
|
from rest_framework.views import exception_handler
|
||||||
from rest_framework.exceptions import ValidationError as DRFValidationError, NotFound, PermissionDenied as DRFPermissionDenied
|
from rest_framework.exceptions import (
|
||||||
|
ValidationError as DRFValidationError,
|
||||||
|
NotFound,
|
||||||
|
PermissionDenied as DRFPermissionDenied,
|
||||||
|
)
|
||||||
|
|
||||||
from ..exceptions import ThrillWikiException
|
from ..exceptions import ThrillWikiException
|
||||||
from ..logging import get_logger, log_exception
|
from ..logging import get_logger, log_exception
|
||||||
@@ -19,106 +25,133 @@ from ..logging import get_logger, log_exception
|
|||||||
logger = get_logger(__name__)
|
logger = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
def custom_exception_handler(exc: Exception, context: Dict[str, Any]) -> Optional[Response]:
|
def custom_exception_handler(
|
||||||
|
exc: Exception, context: Dict[str, Any]
|
||||||
|
) -> Optional[Response]:
|
||||||
"""
|
"""
|
||||||
Custom exception handler for DRF that provides standardized error responses.
|
Custom exception handler for DRF that provides standardized error responses.
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Response with standardized error format or None to fallback to default handler
|
Response with standardized error format or None to fallback to default handler
|
||||||
"""
|
"""
|
||||||
# Call REST framework's default exception handler first
|
# Call REST framework's default exception handler first
|
||||||
response = exception_handler(exc, context)
|
response = exception_handler(exc, context)
|
||||||
|
|
||||||
if response is not None:
|
if response is not None:
|
||||||
# Standardize the error response format
|
# Standardize the error response format
|
||||||
custom_response_data = {
|
custom_response_data = {
|
||||||
'status': 'error',
|
"status": "error",
|
||||||
'error': {
|
"error": {
|
||||||
'code': _get_error_code(exc),
|
"code": _get_error_code(exc),
|
||||||
'message': _get_error_message(exc, response.data),
|
"message": _get_error_message(exc, response.data),
|
||||||
'details': _get_error_details(exc, response.data),
|
"details": _get_error_details(exc, response.data),
|
||||||
},
|
},
|
||||||
'data': None,
|
"data": None,
|
||||||
}
|
}
|
||||||
|
|
||||||
# Add request context for debugging
|
# Add request context for debugging
|
||||||
if hasattr(context.get('request'), 'user'):
|
if hasattr(context.get("request"), "user"):
|
||||||
custom_response_data['error']['request_user'] = str(context['request'].user)
|
custom_response_data["error"]["request_user"] = str(context["request"].user)
|
||||||
|
|
||||||
# Log the error for monitoring
|
# Log the error for monitoring
|
||||||
log_exception(logger, exc, context={'response_status': response.status_code}, request=context.get('request'))
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": response.status_code},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
|
|
||||||
response.data = custom_response_data
|
response.data = custom_response_data
|
||||||
|
|
||||||
# Handle ThrillWiki custom exceptions
|
# Handle ThrillWiki custom exceptions
|
||||||
elif isinstance(exc, ThrillWikiException):
|
elif isinstance(exc, ThrillWikiException):
|
||||||
custom_response_data = {
|
custom_response_data = {
|
||||||
'status': 'error',
|
"status": "error",
|
||||||
'error': exc.to_dict(),
|
"error": exc.to_dict(),
|
||||||
'data': None,
|
"data": None,
|
||||||
}
|
}
|
||||||
|
|
||||||
log_exception(logger, exc, context={'response_status': exc.status_code}, request=context.get('request'))
|
log_exception(
|
||||||
|
logger,
|
||||||
|
exc,
|
||||||
|
context={"response_status": exc.status_code},
|
||||||
|
request=context.get("request"),
|
||||||
|
)
|
||||||
response = Response(custom_response_data, status=exc.status_code)
|
            response = Response(custom_response_data, status=exc.status_code)

    # Handle specific Django exceptions that DRF doesn't catch
    elif isinstance(exc, DjangoValidationError):
        custom_response_data = {
            "status": "error",
            "error": {
                "code": "VALIDATION_ERROR",
                "message": "Validation failed",
                "details": _format_django_validation_errors(exc),
            },
            "data": None,
        }

        log_exception(
            logger,
            exc,
            context={"response_status": status.HTTP_400_BAD_REQUEST},
            request=context.get("request"),
        )
        response = Response(custom_response_data, status=status.HTTP_400_BAD_REQUEST)

    elif isinstance(exc, Http404):
        custom_response_data = {
            "status": "error",
            "error": {
                "code": "NOT_FOUND",
                "message": "Resource not found",
                "details": str(exc) if str(exc) else None,
            },
            "data": None,
        }

        log_exception(
            logger,
            exc,
            context={"response_status": status.HTTP_404_NOT_FOUND},
            request=context.get("request"),
        )
        response = Response(custom_response_data, status=status.HTTP_404_NOT_FOUND)

    elif isinstance(exc, PermissionDenied):
        custom_response_data = {
            "status": "error",
            "error": {
                "code": "PERMISSION_DENIED",
                "message": "Permission denied",
                "details": str(exc) if str(exc) else None,
            },
            "data": None,
        }

        log_exception(
            logger,
            exc,
            context={"response_status": status.HTTP_403_FORBIDDEN},
            request=context.get("request"),
        )
        response = Response(custom_response_data, status=status.HTTP_403_FORBIDDEN)

    return response


def _get_error_code(exc: Exception) -> str:
    """Extract or determine error code from exception."""
    if hasattr(exc, "default_code"):
        return exc.default_code.upper()

    if isinstance(exc, DRFValidationError):
        return "VALIDATION_ERROR"
    elif isinstance(exc, NotFound):
        return "NOT_FOUND"
    elif isinstance(exc, DRFPermissionDenied):
        return "PERMISSION_DENIED"

    return exc.__class__.__name__.upper()


@@ -126,47 +159,47 @@ def _get_error_message(exc: Exception, response_data: Any) -> str:
    """Extract user-friendly error message."""
    if isinstance(response_data, dict):
        # Handle DRF validation errors
        if "detail" in response_data:
            return str(response_data["detail"])
        elif "non_field_errors" in response_data:
            errors = response_data["non_field_errors"]
            return errors[0] if isinstance(errors, list) and errors else str(errors)
        elif isinstance(response_data, dict) and len(response_data) == 1:
            key, value = next(iter(response_data.items()))
            if isinstance(value, list) and value:
                return f"{key}: {value[0]}"
            return f"{key}: {value}"

    # Fallback to exception message
    return str(exc) if str(exc) else "An error occurred"


def _get_error_details(exc: Exception, response_data: Any) -> Optional[Dict[str, Any]]:
    """Extract detailed error information for debugging."""
    if isinstance(response_data, dict) and len(response_data) > 1:
        return response_data

    if hasattr(exc, "detail") and isinstance(exc.detail, dict):
        return exc.detail

    return None


def _format_django_validation_errors(
    exc: DjangoValidationError,
) -> Dict[str, Any]:
    """Format Django ValidationError for API response."""
    if hasattr(exc, "error_dict"):
        # Field-specific errors
        return {
            field: [str(error) for error in errors]
            for field, errors in exc.error_dict.items()
        }
    elif hasattr(exc, "error_list"):
        # Non-field errors
        return {"non_field_errors": [str(error) for error in exc.error_list]}

    return {"non_field_errors": [str(exc)]}


# Removed _log_api_error - using centralized logging instead
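For orientation, a minimal sketch of how a handler like this is typically wired into DRF settings; the dotted path below is an assumption, not something this changeset defines:

```python
# settings.py (sketch) -- the module path is hypothetical.
REST_FRAMEWORK = {
    "EXCEPTION_HANDLER": "apps.core.api.exception_handlers.custom_exception_handler",
}

# With that in place, an Http404 handled above serializes to the envelope built in the branch:
# {"status": "error",
#  "error": {"code": "NOT_FOUND", "message": "Resource not found", "details": null},
#  "data": null}
```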
@@ -12,79 +12,79 @@ class ApiMixin:
    """
    Base mixin for API views providing standardized response formatting.
    """

    def create_response(
        self,
        *,
        data: Any = None,
        message: Optional[str] = None,
        status_code: int = status.HTTP_200_OK,
        pagination: Optional[Dict[str, Any]] = None,
        metadata: Optional[Dict[str, Any]] = None,
    ) -> Response:
        """
        Create standardized API response.

        Args:
            data: Response data
            message: Optional success message
            status_code: HTTP status code
            pagination: Pagination information
            metadata: Additional metadata

        Returns:
            Standardized Response object
        """
        response_data = {
            "status": "success" if status_code < 400 else "error",
            "data": data,
        }

        if message:
            response_data["message"] = message

        if pagination:
            response_data["pagination"] = pagination

        if metadata:
            response_data["metadata"] = metadata

        return Response(response_data, status=status_code)

    def create_error_response(
        self,
        *,
        message: str,
        status_code: int = status.HTTP_400_BAD_REQUEST,
        error_code: Optional[str] = None,
        details: Optional[Dict[str, Any]] = None,
    ) -> Response:
        """
        Create standardized error response.

        Args:
            message: Error message
            status_code: HTTP status code
            error_code: Optional error code
            details: Additional error details

        Returns:
            Standardized error Response object
        """
        error_data = {
            "code": error_code or "GENERIC_ERROR",
            "message": message,
        }

        if details:
            error_data["details"] = details

        response_data = {
            "status": "error",
            "error": error_data,
            "data": None,
        }

        return Response(response_data, status=status_code)

@@ -92,37 +92,37 @@ class CreateApiMixin(ApiMixin):
    """
    Mixin for create API endpoints with standardized input/output handling.
    """

    def create(self, request: Request, *args, **kwargs) -> Response:
        """Handle POST requests for creating resources."""
        serializer = self.get_input_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)

        # Create the object using the service layer
        obj = self.perform_create(**serializer.validated_data)

        # Serialize the output
        output_serializer = self.get_output_serializer(obj)

        return self.create_response(
            data=output_serializer.data,
            status_code=status.HTTP_201_CREATED,
            message="Resource created successfully",
        )

    def perform_create(self, **validated_data):
        """
        Override this method to implement object creation logic.
        Should use service layer methods.
        """
        raise NotImplementedError("Subclasses must implement perform_create")

    def get_input_serializer(self, *args, **kwargs):
        """Get the input serializer for validation."""
        return self.InputSerializer(*args, **kwargs)

    def get_output_serializer(self, *args, **kwargs):
        """Get the output serializer for response."""
        return self.OutputSerializer(*args, **kwargs)

@@ -130,35 +130,37 @@ class UpdateApiMixin(ApiMixin):
    """
    Mixin for update API endpoints with standardized input/output handling.
    """

    def update(self, request: Request, *args, **kwargs) -> Response:
        """Handle PUT/PATCH requests for updating resources."""
        instance = self.get_object()
        serializer = self.get_input_serializer(
            data=request.data, partial=kwargs.get("partial", False)
        )
        serializer.is_valid(raise_exception=True)

        # Update the object using the service layer
        updated_obj = self.perform_update(instance, **serializer.validated_data)

        # Serialize the output
        output_serializer = self.get_output_serializer(updated_obj)

        return self.create_response(
            data=output_serializer.data,
            message="Resource updated successfully",
        )

    def perform_update(self, instance, **validated_data):
        """
        Override this method to implement object update logic.
        Should use service layer methods.
        """
        raise NotImplementedError("Subclasses must implement perform_update")

    def get_input_serializer(self, *args, **kwargs):
        """Get the input serializer for validation."""
        return self.InputSerializer(*args, **kwargs)

    def get_output_serializer(self, *args, **kwargs):
        """Get the output serializer for response."""
        return self.OutputSerializer(*args, **kwargs)

@@ -168,29 +170,31 @@ class ListApiMixin(ApiMixin):
    """
    Mixin for list API endpoints with pagination and filtering.
    """

    def list(self, request: Request, *args, **kwargs) -> Response:
        """Handle GET requests for listing resources."""
        # Use selector to get filtered queryset
        queryset = self.get_queryset()

        # Apply pagination
        page = self.paginate_queryset(queryset)
        if page is not None:
            serializer = self.get_output_serializer(page, many=True)
            return self.get_paginated_response(serializer.data)

        # No pagination
        serializer = self.get_output_serializer(queryset, many=True)
        return self.create_response(data=serializer.data)

    def get_queryset(self):
        """
        Override this method to use selector patterns.
        Should call selector functions, not access model managers directly.
        """
        raise NotImplementedError(
            "Subclasses must implement get_queryset using selectors"
        )

    def get_output_serializer(self, *args, **kwargs):
        """Get the output serializer for response."""
        return self.OutputSerializer(*args, **kwargs)

@@ -200,21 +204,23 @@ class RetrieveApiMixin(ApiMixin):
    """
    Mixin for retrieve API endpoints.
    """

    def retrieve(self, request: Request, *args, **kwargs) -> Response:
        """Handle GET requests for retrieving a single resource."""
        instance = self.get_object()
        serializer = self.get_output_serializer(instance)

        return self.create_response(data=serializer.data)

    def get_object(self):
        """
        Override this method to use selector patterns.
        Should call selector functions for optimized queries.
        """
        raise NotImplementedError(
            "Subclasses must implement get_object using selectors"
        )

    def get_output_serializer(self, *args, **kwargs):
        """Get the output serializer for response."""
        return self.OutputSerializer(*args, **kwargs)

@@ -224,29 +230,31 @@ class DestroyApiMixin(ApiMixin):
    """
    Mixin for delete API endpoints.
    """

    def destroy(self, request: Request, *args, **kwargs) -> Response:
        """Handle DELETE requests for destroying resources."""
        instance = self.get_object()

        # Delete using service layer
        self.perform_destroy(instance)

        return self.create_response(
            status_code=status.HTTP_204_NO_CONTENT,
            message="Resource deleted successfully",
        )

    def perform_destroy(self, instance):
        """
        Override this method to implement object deletion logic.
        Should use service layer methods.
        """
        raise NotImplementedError("Subclasses must implement perform_destroy")

    def get_object(self):
        """
        Override this method to use selector patterns.
        Should call selector functions for optimized queries.
        """
        raise NotImplementedError(
            "Subclasses must implement get_object using selectors"
        )
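A hypothetical endpoint built on these mixins might look like the sketch below; the service function, app paths, and serializer fields are placeholders rather than part of this changeset:

```python
from rest_framework import serializers
from rest_framework.views import APIView

from apps.core.api.mixins import CreateApiMixin  # assumed module path
from apps.parks.services import park_create  # hypothetical service-layer function


class ParkCreateApi(CreateApiMixin, APIView):
    class InputSerializer(serializers.Serializer):
        name = serializers.CharField()
        slug = serializers.SlugField()

    class OutputSerializer(serializers.Serializer):
        id = serializers.IntegerField()
        name = serializers.CharField()
        slug = serializers.SlugField()

    def post(self, request, *args, **kwargs):
        # Delegates to CreateApiMixin.create, which validates input, calls
        # perform_create, and wraps the output in the standard envelope.
        return self.create(request, *args, **kwargs)

    def perform_create(self, **validated_data):
        return park_create(**validated_data)
```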
6  backend/apps/core/apps.py  Normal file

@@ -0,0 +1,6 @@
from django.apps import AppConfig


class CoreConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "apps.core"
@@ -6,102 +6,127 @@ import hashlib
import json
import time
from functools import wraps
from typing import Optional, List, Callable
from django.utils.decorators import method_decorator
from django.views.decorators.vary import vary_on_headers
from apps.core.services.enhanced_cache_service import EnhancedCacheService

import logging

logger = logging.getLogger(__name__)


def cache_api_response(
    timeout=1800, vary_on=None, key_prefix="api", cache_backend="api"
):
    """
    Advanced decorator for caching API responses with flexible configuration

    Args:
        timeout: Cache timeout in seconds
        vary_on: List of request attributes to vary cache on
        key_prefix: Prefix for cache keys
        cache_backend: Cache backend to use
    """

    def decorator(view_func):
        @wraps(view_func)
        def wrapper(self, request, *args, **kwargs):
            # Only cache GET requests
            if request.method != "GET":
                return view_func(self, request, *args, **kwargs)

            # Generate cache key based on view, user, and parameters
            cache_key_parts = [
                key_prefix,
                view_func.__name__,
                (
                    str(request.user.id)
                    if request.user.is_authenticated
                    else "anonymous"
                ),
                str(hash(frozenset(request.GET.items()))),
            ]

            # Add URL parameters to cache key
            if args:
                cache_key_parts.append(str(hash(args)))
            if kwargs:
                cache_key_parts.append(str(hash(frozenset(kwargs.items()))))

            # Add custom vary_on fields
            if vary_on:
                for field in vary_on:
                    value = getattr(request, field, "")
                    cache_key_parts.append(str(value))

            cache_key = ":".join(cache_key_parts)

            # Try to get from cache
            cache_service = EnhancedCacheService()
            cached_response = getattr(cache_service, cache_backend + "_cache").get(
                cache_key
            )

            if cached_response:
                logger.debug(
                    f"Cache hit for API view {view_func.__name__}",
                    extra={
                        "cache_key": cache_key,
                        "view": view_func.__name__,
                        "cache_hit": True,
                    },
                )
                return cached_response

            # Execute view and cache result
            start_time = time.time()
            response = view_func(self, request, *args, **kwargs)
            execution_time = time.time() - start_time

            # Only cache successful responses
            if hasattr(response, "status_code") and response.status_code == 200:
                getattr(cache_service, cache_backend + "_cache").set(
                    cache_key, response, timeout
                )
                logger.debug(
                    f"Cached API response for view {view_func.__name__}",
                    extra={
                        "cache_key": cache_key,
                        "view": view_func.__name__,
                        "execution_time": execution_time,
                        "cache_timeout": timeout,
                        "cache_miss": True,
                    },
                )
            else:
                logger.debug(
                    f"Not caching response for view {view_func.__name__} "
                    f"(status: {getattr(response, 'status_code', 'unknown')})"
                )

            return response

        return wrapper

    return decorator


def cache_queryset_result(
    cache_key_template: str, timeout: int = 3600, cache_backend="default"
):
    """
    Decorator for caching expensive queryset operations

    Args:
        cache_key_template: Template for cache key (can use format placeholders)
        timeout: Cache timeout in seconds
        cache_backend: Cache backend to use
    """

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):

@@ -110,147 +135,171 @@ def cache_queryset_result(cache_key_template: str, timeout: int = 3600, cache_ba
                cache_key = cache_key_template.format(*args, **kwargs)
            except (KeyError, IndexError):
                # Fallback to simpler key generation
                cache_key = f"{cache_key_template}:{hash(str(args) + str(kwargs))}"

            cache_service = EnhancedCacheService()
            cached_result = getattr(cache_service, cache_backend + "_cache").get(
                cache_key
            )

            if cached_result is not None:
                logger.debug(f"Cache hit for queryset operation: {func.__name__}")
                return cached_result

            # Execute function and cache result
            start_time = time.time()
            result = func(*args, **kwargs)
            execution_time = time.time() - start_time

            getattr(cache_service, cache_backend + "_cache").set(
                cache_key, result, timeout
            )
            logger.debug(
                f"Cached queryset result for {func.__name__}",
                extra={
                    "cache_key": cache_key,
                    "function": func.__name__,
                    "execution_time": execution_time,
                    "cache_timeout": timeout,
                },
            )

            return result

        return wrapper

    return decorator


def invalidate_cache_on_save(model_name: str, cache_patterns: List[str] = None):
    """
    Decorator to invalidate cache when model instances are saved

    Args:
        model_name: Name of the model
        cache_patterns: List of cache key patterns to invalidate
    """

    def decorator(func):
        @wraps(func)
        def wrapper(self, *args, **kwargs):
            result = func(self, *args, **kwargs)

            # Invalidate related cache entries
            cache_service = EnhancedCacheService()

            # Standard model cache invalidation
            instance_id = getattr(self, "id", None)
            cache_service.invalidate_model_cache(model_name, instance_id)

            # Custom pattern invalidation
            if cache_patterns:
                for pattern in cache_patterns:
                    if instance_id:
                        pattern = pattern.format(model=model_name, id=instance_id)
                    cache_service.invalidate_pattern(pattern)

            logger.info(
                f"Invalidated cache for {model_name} after save",
                extra={
                    "model": model_name,
                    "instance_id": instance_id,
                    "patterns": cache_patterns,
                },
            )

            return result

        return wrapper

    return decorator


class CachedAPIViewMixin:
    """Mixin to add caching capabilities to API views"""

    cache_timeout = 1800  # 30 minutes default
    cache_vary_on = ["version"]
    cache_key_prefix = "api"
    cache_backend = "api"

    @method_decorator(vary_on_headers("User-Agent", "Accept-Language"))
    def dispatch(self, request, *args, **kwargs):
        """Add caching to the dispatch method"""
        if request.method == "GET" and getattr(self, "enable_caching", True):
            return self._cached_dispatch(request, *args, **kwargs)
        return super().dispatch(request, *args, **kwargs)

    def _cached_dispatch(self, request, *args, **kwargs):
        """Handle cached dispatch for GET requests"""
        cache_key = self._generate_cache_key(request, *args, **kwargs)

        cache_service = EnhancedCacheService()
        cached_response = getattr(cache_service, self.cache_backend + "_cache").get(
            cache_key
        )

        if cached_response:
            logger.debug(f"Cache hit for view {self.__class__.__name__}")
            return cached_response

        # Execute view
        response = super().dispatch(request, *args, **kwargs)

        # Cache successful responses
        if hasattr(response, "status_code") and response.status_code == 200:
            getattr(cache_service, self.cache_backend + "_cache").set(
                cache_key, response, self.cache_timeout
            )
            logger.debug(f"Cached response for view {self.__class__.__name__}")

        return response

    def _generate_cache_key(self, request, *args, **kwargs):
        """Generate cache key for the request"""
        key_parts = [
            self.cache_key_prefix,
            self.__class__.__name__,
            request.method,
            (str(request.user.id) if request.user.is_authenticated else "anonymous"),
            str(hash(frozenset(request.GET.items()))),
        ]

        if args:
            key_parts.append(str(hash(args)))
        if kwargs:
            key_parts.append(str(hash(frozenset(kwargs.items()))))

        # Add vary_on fields
        for field in self.cache_vary_on:
            value = getattr(request, field, "")
            key_parts.append(str(value))

        return ":".join(key_parts)


def smart_cache(
    timeout: int = 3600,
    key_func: Optional[Callable] = None,
    invalidate_on: Optional[List[str]] = None,
    cache_backend: str = "default",
):
    """
    Smart caching decorator that adapts to function arguments

    Args:
        timeout: Cache timeout in seconds
        key_func: Custom function to generate cache key
        invalidate_on: List of signals to invalidate cache on
        cache_backend: Cache backend to use
    """

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):

@@ -260,79 +309,96 @@ def smart_cache(
            else:
                # Default key generation
                key_data = {
                    "func": f"{func.__module__}.{func.__name__}",
                    "args": str(args),
                    "kwargs": json.dumps(kwargs, sort_keys=True, default=str),
                }
                key_string = json.dumps(key_data, sort_keys=True)
                cache_key = f"smart_cache:{hashlib.md5(key_string.encode()).hexdigest()}"

            # Try to get from cache
            cache_service = EnhancedCacheService()
            cached_result = getattr(cache_service, cache_backend + "_cache").get(
                cache_key
            )

            if cached_result is not None:
                logger.debug(f"Smart cache hit for {func.__name__}")
                return cached_result

            # Execute function
            start_time = time.time()
            result = func(*args, **kwargs)
            execution_time = time.time() - start_time

            # Cache result
            getattr(cache_service, cache_backend + "_cache").set(
                cache_key, result, timeout
            )

            logger.debug(
                f"Smart cached result for {func.__name__}",
                extra={
                    "cache_key": cache_key,
                    "execution_time": execution_time,
                    "function": func.__name__,
                },
            )

            return result

        # Add cache invalidation if specified
        if invalidate_on:
            wrapper._cache_invalidate_on = invalidate_on
            wrapper._cache_backend = cache_backend

        return wrapper

    return decorator


def conditional_cache(condition_func: Callable, **cache_kwargs):
    """
    Cache decorator that only caches when condition is met

    Args:
        condition_func: Function that returns True if caching should be applied
        **cache_kwargs: Arguments passed to smart_cache
    """

    def decorator(func):
        cached_func = smart_cache(**cache_kwargs)(func)

        @wraps(func)
        def wrapper(*args, **kwargs):
            if condition_func(*args, **kwargs):
                return cached_func(*args, **kwargs)
            else:
                return func(*args, **kwargs)

        return wrapper

    return decorator


# Utility functions for cache key generation
def generate_user_cache_key(user, suffix: str = ""):
    """Generate cache key based on user"""
    user_id = user.id if user.is_authenticated else "anonymous"
    return f"user:{user_id}:{suffix}" if suffix else f"user:{user_id}"


def generate_model_cache_key(model_instance, suffix: str = ""):
    """Generate cache key based on model instance"""
    model_name = model_instance._meta.model_name
    instance_id = model_instance.id
    return (
        f"{model_name}:{instance_id}:{suffix}"
        if suffix
        else f"{model_name}:{instance_id}"
    )


def generate_queryset_cache_key(queryset, params: dict = None):
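A brief usage sketch for these decorators, assuming module paths, a Park model, and a `featured` field that are not part of this diff:

```python
from rest_framework.response import Response
from rest_framework.views import APIView

from apps.core.caching import cache_api_response, smart_cache  # assumed module path
from apps.parks.models import Park  # hypothetical model import


@smart_cache(timeout=600, cache_backend="default")
def get_featured_parks(limit: int = 10):
    # Expensive selector work; the result is cached for 10 minutes.
    return list(Park.objects.filter(featured=True)[:limit])


class ParkStatsApi(APIView):
    @cache_api_response(timeout=300, key_prefix="park-stats")
    def get(self, request, *args, **kwargs):
        # Only successful GET responses are cached by the decorator.
        return Response({"featured": len(get_featured_parks())})
```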
@@ -8,34 +8,34 @@ from typing import Optional, Dict, Any

class ThrillWikiException(Exception):
    """Base exception for all ThrillWiki-specific errors."""

    default_message = "An error occurred"
    error_code = "THRILLWIKI_ERROR"
    status_code = 500

    def __init__(
        self,
        message: Optional[str] = None,
        error_code: Optional[str] = None,
        details: Optional[Dict[str, Any]] = None,
    ):
        self.message = message or self.default_message
        self.error_code = error_code or self.error_code
        self.details = details or {}
        super().__init__(self.message)

    def to_dict(self) -> Dict[str, Any]:
        """Convert exception to dictionary for API responses."""
        return {
            "error_code": self.error_code,
            "message": self.message,
            "details": self.details,
        }


class ValidationException(ThrillWikiException):
    """Raised when data validation fails."""

    default_message = "Validation failed"
    error_code = "VALIDATION_ERROR"
    status_code = 400

@@ -43,7 +43,7 @@ class ValidationException(ThrillWikiException):

class NotFoundError(ThrillWikiException):
    """Raised when a requested resource is not found."""

    default_message = "Resource not found"
    error_code = "NOT_FOUND"
    status_code = 404

@@ -51,7 +51,7 @@ class NotFoundError(ThrillWikiException):

class PermissionDeniedError(ThrillWikiException):
    """Raised when user lacks permission for an operation."""

    default_message = "Permission denied"
    error_code = "PERMISSION_DENIED"
    status_code = 403

@@ -59,7 +59,7 @@ class PermissionDeniedError(ThrillWikiException):

class BusinessLogicError(ThrillWikiException):
    """Raised when business logic constraints are violated."""

    default_message = "Business logic violation"
    error_code = "BUSINESS_LOGIC_ERROR"
    status_code = 400

@@ -67,7 +67,7 @@ class BusinessLogicError(ThrillWikiException):

class ExternalServiceError(ThrillWikiException):
    """Raised when external service calls fail."""

    default_message = "External service error"
    error_code = "EXTERNAL_SERVICE_ERROR"
    status_code = 502

@@ -75,127 +75,138 @@ class ExternalServiceError(ThrillWikiException):

# Domain-specific exceptions


class ParkError(ThrillWikiException):
    """Base exception for park-related errors."""

    error_code = "PARK_ERROR"


class ParkNotFoundError(NotFoundError):
    """Raised when a park is not found."""

    default_message = "Park not found"
    error_code = "PARK_NOT_FOUND"

    def __init__(self, park_slug: Optional[str] = None, **kwargs):
        if park_slug:
            kwargs["details"] = {"park_slug": park_slug}
            kwargs["message"] = f"Park with slug '{park_slug}' not found"
        super().__init__(**kwargs)


class ParkOperationError(BusinessLogicError):
    """Raised when park operation constraints are violated."""

    default_message = "Invalid park operation"
    error_code = "PARK_OPERATION_ERROR"


class RideError(ThrillWikiException):
    """Base exception for ride-related errors."""

    error_code = "RIDE_ERROR"


class RideNotFoundError(NotFoundError):
    """Raised when a ride is not found."""

    default_message = "Ride not found"
    error_code = "RIDE_NOT_FOUND"

    def __init__(self, ride_slug: Optional[str] = None, **kwargs):
        if ride_slug:
            kwargs["details"] = {"ride_slug": ride_slug}
            kwargs["message"] = f"Ride with slug '{ride_slug}' not found"
        super().__init__(**kwargs)


class RideOperationError(BusinessLogicError):
    """Raised when ride operation constraints are violated."""

    default_message = "Invalid ride operation"
    error_code = "RIDE_OPERATION_ERROR"


class LocationError(ThrillWikiException):
    """Base exception for location-related errors."""

    error_code = "LOCATION_ERROR"


class InvalidCoordinatesError(ValidationException):
    """Raised when geographic coordinates are invalid."""

    default_message = "Invalid geographic coordinates"
    error_code = "INVALID_COORDINATES"

    def __init__(
        self,
        latitude: Optional[float] = None,
        longitude: Optional[float] = None,
        **kwargs,
    ):
        if latitude is not None or longitude is not None:
            kwargs["details"] = {"latitude": latitude, "longitude": longitude}
        super().__init__(**kwargs)


class GeolocationError(ExternalServiceError):
    """Raised when geolocation services fail."""

    default_message = "Geolocation service unavailable"
    error_code = "GEOLOCATION_ERROR"


class ReviewError(ThrillWikiException):
    """Base exception for review-related errors."""

    error_code = "REVIEW_ERROR"


class ReviewModerationError(BusinessLogicError):
    """Raised when review moderation constraints are violated."""

    default_message = "Review moderation error"
    error_code = "REVIEW_MODERATION_ERROR"


class DuplicateReviewError(BusinessLogicError):
    """Raised when user tries to create duplicate reviews."""

    default_message = "User has already reviewed this item"
    error_code = "DUPLICATE_REVIEW"


class AccountError(ThrillWikiException):
    """Base exception for account-related errors."""

    error_code = "ACCOUNT_ERROR"


class InsufficientPermissionsError(PermissionDeniedError):
    """Raised when user lacks required permissions."""

    default_message = "Insufficient permissions"
    error_code = "INSUFFICIENT_PERMISSIONS"

    def __init__(self, required_permission: Optional[str] = None, **kwargs):
        if required_permission:
            kwargs["details"] = {"required_permission": required_permission}
            kwargs["message"] = f"Permission '{required_permission}' required"
        super().__init__(**kwargs)


class EmailError(ExternalServiceError):
    """Raised when email operations fail."""

    default_message = "Email service error"
    error_code = "EMAIL_ERROR"


class CacheError(ThrillWikiException):
    """Raised when cache operations fail."""

    default_message = "Cache operation failed"
    error_code = "CACHE_ERROR"
    status_code = 500

@@ -203,11 +214,11 @@ class CacheError(ThrillWikiException):

class RoadTripError(ExternalServiceError):
    """Raised when road trip planning fails."""

    default_message = "Road trip planning error"
    error_code = "ROADTRIP_ERROR"

    def __init__(self, service_name: Optional[str] = None, **kwargs):
        if service_name:
            kwargs["details"] = {"service": service_name}
        super().__init__(**kwargs)
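A hedged sketch of raising one of these domain exceptions from a selector; the model and module paths are assumptions:

```python
from apps.core.exceptions import ParkNotFoundError  # assumed module path
from apps.parks.models import Park  # hypothetical model import


def park_get_by_slug(slug: str) -> Park:
    """Map a missing row onto the domain exception defined above."""
    try:
        return Park.objects.get(slug=slug)
    except Park.DoesNotExist:
        raise ParkNotFoundError(park_slug=slug)


# ParkNotFoundError(park_slug="x").to_dict() would yield:
# {"error_code": "PARK_NOT_FOUND",
#  "message": "Park with slug 'x' not found",
#  "details": {"park_slug": "x"}}
```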
@@ -1,4 +1,5 @@
"""Core forms and form components."""

from django.conf import settings
from django.core.exceptions import PermissionDenied
from django.utils.translation import gettext_lazy as _
@@ -8,20 +9,23 @@ from autocomplete import Autocomplete

class BaseAutocomplete(Autocomplete):
    """Base autocomplete class for consistent autocomplete behavior across the project.

    This class extends django-htmx-autocomplete's base Autocomplete class to provide:
    - Project-wide defaults for autocomplete behavior
    - Translation strings
    - Authentication enforcement
    - Sensible search configuration
    """

    # Search configuration
    minimum_search_length = 2  # More responsive than default 3
    max_results = 10  # Reasonable limit for performance

    # UI text configuration using gettext for i18n
    no_result_text = _("No matches found")
    narrow_search_text = _(
        "Showing %(page_size)s of %(total)s matches. Please refine your search."
    )
    type_at_least_n_characters = _("Type at least %(n)s characters...")

    # Project-wide component settings
@@ -30,10 +34,10 @@ class BaseAutocomplete(Autocomplete):
    @staticmethod
    def auth_check(request):
        """Enforce authentication by default.

        This can be overridden in subclasses if public access is needed.
        Configure AUTOCOMPLETE_BLOCK_UNAUTHENTICATED in settings to disable.
        """
        block_unauth = getattr(settings, "AUTOCOMPLETE_BLOCK_UNAUTHENTICATED", True)
        if block_unauth and not request.user.is_authenticated:
            raise PermissionDenied(_("Authentication required"))
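
A hedged sketch of how a public autocomplete could relax the default `auth_check` shown above; the class name is illustrative and the model-specific configuration is omitted because it is not part of this hunk.

```python
class PublicParkAutocomplete(BaseAutocomplete):
    """Allow anonymous access for a public-facing widget."""

    @staticmethod
    def auth_check(request):
        # Intentionally no PermissionDenied: this endpoint is public.
        return None
```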
168 backend/apps/core/forms/search.py Normal file
@@ -0,0 +1,168 @@
from django import forms
from django.utils.translation import gettext_lazy as _


class LocationSearchForm(forms.Form):
    """
    A comprehensive search form that includes text search, location-based
    search, and content type filtering for a unified search experience.
    """

    # Text search query
    q = forms.CharField(
        required=False,
        label=_("Search Query"),
        widget=forms.TextInput(
            attrs={
                "placeholder": _("Search parks, rides, companies..."),
                "class": (
                    "w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
                    "focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
                    "dark:border-gray-600 dark:text-white"
                ),
            }
        ),
    )

    # Location-based search
    location = forms.CharField(
        required=False,
        label=_("Near Location"),
        widget=forms.TextInput(
            attrs={
                "placeholder": _("City, address, or coordinates..."),
                "id": "location-input",
                "class": (
                    "w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
                    "focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
                    "dark:border-gray-600 dark:text-white"
                ),
            }
        ),
    )

    # Hidden fields for coordinates
    lat = forms.FloatField(
        required=False, widget=forms.HiddenInput(attrs={"id": "lat-input"})
    )
    lng = forms.FloatField(
        required=False, widget=forms.HiddenInput(attrs={"id": "lng-input"})
    )

    # Search radius
    radius_km = forms.ChoiceField(
        required=False,
        label=_("Search Radius"),
        choices=[
            ("", _("Any distance")),
            ("5", _("5 km")),
            ("10", _("10 km")),
            ("25", _("25 km")),
            ("50", _("50 km")),
            ("100", _("100 km")),
            ("200", _("200 km")),
        ],
        widget=forms.Select(
            attrs={
                "class": (
                    "w-full px-3 py-2 border border-gray-300 rounded-md shadow-sm "
                    "focus:ring-blue-500 focus:border-blue-500 dark:bg-gray-700 "
                    "dark:border-gray-600 dark:text-white"
                )
            }
        ),
    )

    # Content type filters
    search_parks = forms.BooleanField(
        required=False,
        initial=True,
        label=_("Search Parks"),
        widget=forms.CheckboxInput(
            attrs={
                "class": (
                    "rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
                    "dark:border-gray-600 dark:bg-gray-700"
                )
            }
        ),
    )
    search_rides = forms.BooleanField(
        required=False,
        label=_("Search Rides"),
        widget=forms.CheckboxInput(
            attrs={
                "class": (
                    "rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
                    "dark:border-gray-600 dark:bg-gray-700"
                )
            }
        ),
    )
    search_companies = forms.BooleanField(
        required=False,
        label=_("Search Companies"),
        widget=forms.CheckboxInput(
            attrs={
                "class": (
                    "rounded border-gray-300 text-blue-600 focus:ring-blue-500 "
                    "dark:border-gray-600 dark:bg-gray-700"
                )
            }
        ),
    )

    # Geographic filters
    country = forms.CharField(
        required=False,
        widget=forms.TextInput(
            attrs={
                "placeholder": _("Country"),
                "class": (
                    "w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
                    "shadow-sm focus:ring-blue-500 focus:border-blue-500 "
                    "dark:bg-gray-700 dark:border-gray-600 dark:text-white"
                ),
            }
        ),
    )
    state = forms.CharField(
        required=False,
        widget=forms.TextInput(
            attrs={
                "placeholder": _("State/Region"),
                "class": (
                    "w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
                    "shadow-sm focus:ring-blue-500 focus:border-blue-500 "
                    "dark:bg-gray-700 dark:border-gray-600 dark:text-white"
                ),
            }
        ),
    )
    city = forms.CharField(
        required=False,
        widget=forms.TextInput(
            attrs={
                "placeholder": _("City"),
                "class": (
                    "w-full px-3 py-2 text-sm border border-gray-300 rounded-md "
                    "shadow-sm focus:ring-blue-500 focus:border-blue-500 "
                    "dark:bg-gray-700 dark:border-gray-600 dark:text-white"
                ),
            }
        ),
    )

    def clean(self):
        cleaned_data = super().clean()

        # If lat/lng are provided, ensure location field is populated for
        # display
        lat = cleaned_data.get("lat")
        lng = cleaned_data.get("lng")
        location = cleaned_data.get("location")

        if lat and lng and not location:
            cleaned_data["location"] = f"{lat}, {lng}"

        return cleaned_data
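
A minimal sketch of consuming `LocationSearchForm` in a view; only fields defined above are used, and `run_search` is a placeholder for whatever service layer actually performs the lookup.

```python
from django.shortcuts import render


def unified_search(request):
    form = LocationSearchForm(request.GET or None)
    results = {}
    if form.is_valid():
        query = form.cleaned_data["q"]
        lat = form.cleaned_data.get("lat")
        lng = form.cleaned_data.get("lng")
        radius = form.cleaned_data.get("radius_km") or None
        # Placeholder: delegate to the project's search service of choice.
        results = run_search(query, lat=lat, lng=lng, radius_km=radius)
    return render(request, "search/results.html", {"form": form, "results": results})
```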
@@ -7,105 +7,127 @@ import logging
from django.core.cache import cache
from django.db import connection
from health_check.backends import BaseHealthCheckBackend
-from health_check.exceptions import ServiceUnavailable, ServiceReturnedUnexpectedResult

logger = logging.getLogger(__name__)


class CacheHealthCheck(BaseHealthCheckBackend):
    """Check Redis cache connectivity and performance"""

    critical_service = True

    def check_status(self):
        try:
            # Test cache write/read performance
            test_key = "health_check_test"
            test_value = "test_value_" + str(int(time.time()))

            start_time = time.time()
            cache.set(test_key, test_value, timeout=30)
            cached_value = cache.get(test_key)
            cache_time = time.time() - start_time

            if cached_value != test_value:
                self.add_error("Cache read/write test failed - values don't match")
                return

            # Check cache performance
            if cache_time > 0.1:  # Warn if cache operations take more than 100ms
                self.add_error(
                    f"Cache performance degraded: {
                        cache_time:.3f}s for read/write operation"
                )
                return

            # Clean up test key
            cache.delete(test_key)

            # Additional Redis-specific checks if using django-redis
            try:
                from django_redis import get_redis_connection

                redis_client = get_redis_connection("default")
                info = redis_client.info()

                # Check memory usage
                used_memory = info.get("used_memory", 0)
                max_memory = info.get("maxmemory", 0)

                if max_memory > 0:
                    memory_usage_percent = (used_memory / max_memory) * 100
                    if memory_usage_percent > 90:
                        self.add_error(
                            f"Redis memory usage critical: {
                                memory_usage_percent:.1f}%"
                        )
                    elif memory_usage_percent > 80:
                        logger.warning(
                            f"Redis memory usage high: {
                                memory_usage_percent:.1f}%"
                        )

            except ImportError:
                # django-redis not available, skip additional checks
                pass
            except Exception as e:
                logger.warning(f"Could not get Redis info: {e}")

        except Exception as e:
            self.add_error(f"Cache service unavailable: {e}")


class DatabasePerformanceCheck(BaseHealthCheckBackend):
    """Check database performance and connectivity"""

    critical_service = False

    def check_status(self):
        try:
            start_time = time.time()

            # Test basic connectivity
            with connection.cursor() as cursor:
                cursor.execute("SELECT 1")
                result = cursor.fetchone()

            if result[0] != 1:
                self.add_error("Database connectivity test failed")
                return

            basic_query_time = time.time() - start_time

            # Test a more complex query (if it takes too long, there might be
            # performance issues)
            start_time = time.time()
            with connection.cursor() as cursor:
                cursor.execute("SELECT COUNT(*) FROM django_content_type")
                cursor.fetchone()

            complex_query_time = time.time() - start_time

            # Performance thresholds
            if basic_query_time > 1.0:
                self.add_error(
                    f"Database responding slowly: basic query took {
                        basic_query_time:.2f}s"
                )
            elif basic_query_time > 0.5:
                logger.warning(
                    f"Database performance degraded: basic query took {
                        basic_query_time:.2f}s"
                )

            if complex_query_time > 2.0:
                self.add_error(
                    f"Database performance critical: complex query took {
                        complex_query_time:.2f}s"
                )
            elif complex_query_time > 1.0:
                logger.warning(
                    f"Database performance slow: complex query took {
                        complex_query_time:.2f}s"
                )

            # Check database version and settings if possible
            try:
                with connection.cursor() as cursor:
@@ -114,162 +136,190 @@ class DatabasePerformanceCheck(BaseHealthCheckBackend):
                    logger.debug(f"Database version: {version}")
            except Exception as e:
                logger.debug(f"Could not get database version: {e}")

        except Exception as e:
            self.add_error(f"Database performance check failed: {e}")


class ApplicationHealthCheck(BaseHealthCheckBackend):
    """Check application-specific health indicators"""

    critical_service = False

    def check_status(self):
        try:
            # Check if we can import critical modules
            critical_modules = [
                "parks.models",
                "rides.models",
                "accounts.models",
                "core.services",
            ]

            for module_name in critical_modules:
                try:
                    __import__(module_name)
                except ImportError as e:
                    self.add_error(
                        f"Critical module import failed: {module_name} - {e}"
                    )

            # Check if we can access critical models
            try:
                from parks.models import Park
                from apps.rides.models import Ride
                from django.contrib.auth import get_user_model

                User = get_user_model()

                # Test that we can query these models (just count, don't load
                # data)
                park_count = Park.objects.count()
                ride_count = Ride.objects.count()
                user_count = User.objects.count()

                logger.debug(
                    f"Model counts - Parks: {park_count}, Rides: {ride_count}, Users: {user_count}"
                )

            except Exception as e:
                self.add_error(f"Model access check failed: {e}")

            # Check media and static file configuration
            from django.conf import settings
            import os

            if not os.path.exists(settings.MEDIA_ROOT):
                self.add_error(
                    f"Media directory does not exist: {
                        settings.MEDIA_ROOT}"
                )

            if not os.path.exists(settings.STATIC_ROOT) and not settings.DEBUG:
                self.add_error(
                    f"Static directory does not exist: {settings.STATIC_ROOT}"
                )

        except Exception as e:
            self.add_error(f"Application health check failed: {e}")


class ExternalServiceHealthCheck(BaseHealthCheckBackend):
    """Check external services and dependencies"""

    critical_service = False

    def check_status(self):
        # Check email service if configured
        try:
            from django.core.mail import get_connection
            from django.conf import settings

            if (
                hasattr(settings, "EMAIL_BACKEND")
                and "console" not in settings.EMAIL_BACKEND
            ):
                # Only check if not using console backend
                connection = get_connection()
                if hasattr(connection, "open"):
                    try:
                        connection.open()
                        connection.close()
                    except Exception as e:
                        logger.warning(f"Email service check failed: {e}")
                        # Don't fail the health check for email issues in
                        # development

        except Exception as e:
            logger.debug(f"Email service check error: {e}")

        # Check if Sentry is configured and working
        try:
            import sentry_sdk

            if sentry_sdk.Hub.current.client:
                # Sentry is configured
                try:
                    # Test that we can capture a test message (this won't
                    # actually send to Sentry)
                    with sentry_sdk.push_scope() as scope:
                        scope.set_tag("health_check", True)
                        # Don't actually send a message, just verify the SDK is
                        # working
                        logger.debug("Sentry SDK is operational")
                except Exception as e:
                    logger.warning(f"Sentry SDK check failed: {e}")

        except ImportError:
            logger.debug("Sentry SDK not installed")
        except Exception as e:
            logger.debug(f"Sentry check error: {e}")

        # Check Redis connection if configured
        try:
            from django.core.cache import caches
            from django.conf import settings

            cache_config = settings.CACHES.get("default", {})
            if "redis" in cache_config.get("BACKEND", "").lower():
                # Redis is configured, test basic connectivity
                redis_cache = caches["default"]
                redis_cache.set("health_check_redis", "test", 10)
                value = redis_cache.get("health_check_redis")
                if value != "test":
                    self.add_error("Redis cache connectivity test failed")
                else:
                    redis_cache.delete("health_check_redis")

        except Exception as e:
            logger.warning(f"Redis connectivity check failed: {e}")


class DiskSpaceHealthCheck(BaseHealthCheckBackend):
    """Check available disk space"""

    critical_service = False

    def check_status(self):
        try:
            import shutil
            from django.conf import settings

            # Check disk space for media directory
            media_usage = shutil.disk_usage(settings.MEDIA_ROOT)
            media_free_percent = (media_usage.free / media_usage.total) * 100

            # Check disk space for logs directory if it exists
            logs_dir = getattr(settings, "BASE_DIR", "/tmp") / "logs"
            if logs_dir.exists():
                logs_usage = shutil.disk_usage(logs_dir)
                logs_free_percent = (logs_usage.free / logs_usage.total) * 100
            else:
                logs_free_percent = media_free_percent  # Use same as media

            # Alert thresholds
            if media_free_percent < 10:
                self.add_error(
                    f"Critical disk space: {
                        media_free_percent:.1f}% free in media directory"
                )
            elif media_free_percent < 20:
                logger.warning(
                    f"Low disk space: {
                        media_free_percent:.1f}% free in media directory"
                )

            if logs_free_percent < 10:
                self.add_error(
                    f"Critical disk space: {
                        logs_free_percent:.1f}% free in logs directory"
                )
            elif logs_free_percent < 20:
                logger.warning(
                    f"Low disk space: {
                        logs_free_percent:.1f}% free in logs directory"
                )

        except Exception as e:
            logger.warning(f"Disk space check failed: {e}")
            # Don't fail health check for disk space issues in development
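
A sketch of registering the backends above with django-health-check, assuming the usual `plugin_dir` pattern; the app path and module name for the checks are assumptions, not part of this diff.

```python
from django.apps import AppConfig


class CoreConfig(AppConfig):
    name = "apps.core"  # assumed app path

    def ready(self):
        from health_check.plugins import plugin_dir
        from .health_checks import (  # assumed module name for the classes above
            ApplicationHealthCheck,
            CacheHealthCheck,
            DatabasePerformanceCheck,
            DiskSpaceHealthCheck,
            ExternalServiceHealthCheck,
        )

        # Each backend then appears on the /ht/ status page provided by django-health-check.
        for backend in (
            CacheHealthCheck,
            DatabasePerformanceCheck,
            ApplicationHealthCheck,
            ExternalServiceHealthCheck,
            DiskSpaceHealthCheck,
        ):
            plugin_dir.register(backend)
```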
@@ -5,16 +5,22 @@ from django.conf import settings
from typing import Any, Dict, Optional
from django.db.models import QuerySet


class DiffMixin:
    """Mixin to add diffing capabilities to models"""

    def get_prev_record(self) -> Optional[Any]:
        """Get the previous record for this instance"""
        try:
            return (
                type(self)
                .objects.filter(
                    pgh_created_at__lt=self.pgh_created_at,
                    pgh_obj_id=self.pgh_obj_id,
                )
                .order_by("-pgh_created_at")
                .first()
            )
        except (AttributeError, TypeError):
            return None

@@ -25,15 +31,20 @@ class DiffMixin:
            return {}

        skip_fields = {
            "pgh_id",
            "pgh_created_at",
            "pgh_label",
            "pgh_obj_id",
            "pgh_context_id",
            "_state",
            "created_at",
            "updated_at",
        }

        changes = {}
        for field, value in self.__dict__.items():
            # Skip internal fields and those we don't want to track
            if field.startswith("_") or field in skip_fields or field.endswith("_id"):
                continue

            try:
@@ -41,16 +52,18 @@ class DiffMixin:
                new_value = value
                if old_value != new_value:
                    changes[field] = {
                        "old": (str(old_value) if old_value is not None else "None"),
                        "new": (str(new_value) if new_value is not None else "None"),
                    }
            except AttributeError:
                continue

        return changes


class TrackedModel(models.Model):
    """Abstract base class for models that need history tracking"""

    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

@@ -61,16 +74,18 @@ class TrackedModel(models.Model):
        """Get all history records for this instance in chronological order"""
        event_model = self.events.model  # pghistory provides this automatically
        if event_model:
            return event_model.objects.filter(pgh_obj_id=self.pk).order_by(
                "-pgh_created_at"
            )
        return self.__class__.objects.none()


class HistoricalSlug(models.Model):
    """Track historical slugs for models"""

    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveIntegerField()
    content_object = GenericForeignKey("content_type", "object_id")
    slug = models.SlugField(max_length=255)
    created_at = models.DateTimeField(auto_now_add=True)
    user = models.ForeignKey(
@@ -78,14 +93,15 @@ class HistoricalSlug(models.Model):
        null=True,
        blank=True,
        on_delete=models.SET_NULL,
        related_name="historical_slugs",
    )

    class Meta:
        app_label = "core"
        unique_together = ("content_type", "slug")
        indexes = [
            models.Index(fields=["content_type", "object_id"]),
            models.Index(fields=["slug"]),
        ]

    def __str__(self) -> str:
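
A hedged helper showing how the `HistoricalSlug` table above could back a slug-change redirect; the function itself is illustrative and only uses the fields defined in this hunk.

```python
from django.contrib.contenttypes.models import ContentType


def resolve_old_slug(model, slug):
    """Return the object that previously used `slug`, or None."""
    ct = ContentType.objects.get_for_model(model)
    match = HistoricalSlug.objects.filter(content_type=ct, slug=slug).first()
    return model.objects.filter(pk=match.object_id).first() if match else None
```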
@@ -12,48 +12,52 @@ from django.utils import timezone

class ThrillWikiFormatter(logging.Formatter):
    """Custom formatter for ThrillWiki logs with structured output."""

    def format(self, record):
        # Add timestamp if not present
        if not hasattr(record, "timestamp"):
            record.timestamp = timezone.now().isoformat()

        # Add request context if available
        if hasattr(record, "request"):
            record.request_id = getattr(record.request, "id", "unknown")
            record.user_id = (
                getattr(record.request.user, "id", "anonymous")
                if hasattr(record.request, "user")
                else "unknown"
            )
            record.path = getattr(record.request, "path", "unknown")
            record.method = getattr(record.request, "method", "unknown")

        # Structure the log message
        if hasattr(record, "extra_data"):
            record.structured_data = record.extra_data

        return super().format(record)


def get_logger(name: str) -> logging.Logger:
    """
    Get a configured logger for ThrillWiki components.

    Args:
        name: Logger name (usually __name__)

    Returns:
        Configured logger instance
    """
    logger = logging.getLogger(name)

    # Only configure if not already configured
    if not logger.handlers:
        handler = logging.StreamHandler(sys.stdout)
        formatter = ThrillWikiFormatter(
            fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
        )
        handler.setFormatter(formatter)
        logger.addHandler(handler)
        logger.setLevel(logging.INFO if settings.DEBUG else logging.WARNING)

    return logger

@@ -63,11 +67,11 @@ def log_exception(
    *,
    context: Optional[Dict[str, Any]] = None,
    request=None,
    level: int = logging.ERROR,
) -> None:
    """
    Log an exception with structured context.

    Args:
        logger: Logger instance
        exception: Exception to log
@@ -76,19 +80,30 @@ def log_exception(
        level: Log level
    """
    log_data = {
        "exception_type": exception.__class__.__name__,
        "exception_message": str(exception),
        "context": context or {},
    }

    if request:
        log_data.update(
            {
                "request_path": getattr(request, "path", "unknown"),
                "request_method": getattr(request, "method", "unknown"),
                "user_id": (
                    getattr(request.user, "id", "anonymous")
                    if hasattr(request, "user")
                    else "unknown"
                ),
            }
        )

    logger.log(
        level,
        f"Exception occurred: {exception}",
        extra={"extra_data": log_data},
        exc_info=True,
    )


def log_business_event(
@@ -98,11 +113,11 @@ def log_business_event(
    message: str,
    context: Optional[Dict[str, Any]] = None,
    request=None,
    level: int = logging.INFO,
) -> None:
    """
    Log a business event with structured context.

    Args:
        logger: Logger instance
        event_type: Type of business event
@@ -111,19 +126,22 @@ def log_business_event(
        request: Django request object
        level: Log level
    """
    log_data = {"event_type": event_type, "context": context or {}}

    if request:
        log_data.update(
            {
                "request_path": getattr(request, "path", "unknown"),
                "request_method": getattr(request, "method", "unknown"),
                "user_id": (
                    getattr(request.user, "id", "anonymous")
                    if hasattr(request, "user")
                    else "unknown"
                ),
            }
        )

    logger.log(level, message, extra={"extra_data": log_data})


def log_performance_metric(
@@ -132,11 +150,11 @@ def log_performance_metric(
    *,
    duration_ms: float,
    context: Optional[Dict[str, Any]] = None,
    level: int = logging.INFO,
) -> None:
    """
    Log a performance metric.

    Args:
        logger: Logger instance
        operation: Operation name
@@ -145,14 +163,14 @@ def log_performance_metric(
        level: Log level
    """
    log_data = {
        "metric_type": "performance",
        "operation": operation,
        "duration_ms": duration_ms,
        "context": context or {},
    }

    message = f"Performance: {operation} took {duration_ms:.2f}ms"
    logger.log(level, message, extra={"extra_data": log_data})


def log_api_request(
@@ -161,11 +179,11 @@ def log_api_request(
    *,
    response_status: Optional[int] = None,
    duration_ms: Optional[float] = None,
    level: int = logging.INFO,
) -> None:
    """
    Log an API request with context.

    Args:
        logger: Logger instance
        request: Django request object
@@ -174,21 +192,25 @@ def log_api_request(
        level: Log level
    """
    log_data = {
        "request_type": "api",
        "path": getattr(request, "path", "unknown"),
        "method": getattr(request, "method", "unknown"),
        "user_id": (
            getattr(request.user, "id", "anonymous")
            if hasattr(request, "user")
            else "unknown"
        ),
        "response_status": response_status,
        "duration_ms": duration_ms,
    }

    message = f"API Request: {request.method} {request.path}"
    if response_status:
        message += f" -> {response_status}"
    if duration_ms:
        message += f" ({duration_ms:.2f}ms)"

    logger.log(level, message, extra={"extra_data": log_data})


def log_security_event(
@@ -196,13 +218,13 @@ def log_security_event(
    event_type: str,
    *,
    message: str,
    severity: str = "medium",
    context: Optional[Dict[str, Any]] = None,
    request=None,
) -> None:
    """
    Log a security-related event.

    Args:
        logger: Logger instance
        event_type: Type of security event
@@ -212,22 +234,28 @@ def log_security_event(
        request: Django request object
    """
    log_data = {
        "security_event": True,
        "event_type": event_type,
        "severity": severity,
        "context": context or {},
    }

    if request:
        log_data.update(
            {
                "request_path": getattr(request, "path", "unknown"),
                "request_method": getattr(request, "method", "unknown"),
                "user_id": (
                    getattr(request.user, "id", "anonymous")
                    if hasattr(request, "user")
                    else "unknown"
                ),
                "remote_addr": request.META.get("REMOTE_ADDR", "unknown"),
                "user_agent": request.META.get("HTTP_USER_AGENT", "unknown"),
            }
        )

    # Use WARNING for medium/high, ERROR for critical
    level = logging.ERROR if severity in ["high", "critical"] else logging.WARNING

    logger.log(level, f"SECURITY: {message}", extra={"extra_data": log_data})
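
A short usage sketch of the helpers above; the event name and context keys are arbitrary examples, and the assumption that `event_type` is the first positional argument after `logger` follows the pattern of `log_security_event`.

```python
logger = get_logger(__name__)


def submit_review(request, ride, rating):
    log_business_event(
        logger,
        "review_submitted",  # assumed positional event_type
        message=f"Review submitted for {ride}",
        context={"rating": rating},
        request=request,
    )
```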
1 backend/apps/core/management/__init__.py Normal file
@@ -0,0 +1 @@
# Django management commands

1 backend/apps/core/management/commands/__init__.py Normal file
@@ -0,0 +1 @@
# Django management commands
101 backend/apps/core/management/commands/rundev.py Normal file
@@ -0,0 +1,101 @@
"""
Django management command to run the development server.

This command automatically sets up the development environment and starts
the server, replacing the need for the dev_server.sh script.
"""

import subprocess
import sys
from django.core.management.base import BaseCommand
from django.core.management import execute_from_command_line


class Command(BaseCommand):
    help = "Run the development server with automatic setup"

    def add_arguments(self, parser):
        parser.add_argument(
            "--port",
            type=str,
            default="8000",
            help="Port to run the server on (default: 8000)",
        )
        parser.add_argument(
            "--host",
            type=str,
            default="0.0.0.0",
            help="Host to bind the server to (default: 0.0.0.0)",
        )
        parser.add_argument(
            "--skip-setup",
            action="store_true",
            help="Skip the development setup and go straight to running the server",
        )
        parser.add_argument(
            "--use-runserver-plus",
            action="store_true",
            help="Use runserver_plus if available (from django-extensions)",
        )

    def handle(self, *args, **options):
        """Run the development setup and start the server."""
        if not options["skip_setup"]:
            self.stdout.write(
                self.style.SUCCESS(
                    "🚀 Setting up and starting ThrillWiki Development Server..."
                )
            )

            # Run the setup_dev command first
            execute_from_command_line(["manage.py", "setup_dev"])

        else:
            self.stdout.write(
                self.style.SUCCESS("🚀 Starting ThrillWiki Development Server...")
            )

        # Determine which server command to use
        server_command = self.get_server_command(options)

        # Start the server
        self.stdout.write("")
        self.stdout.write(
            self.style.SUCCESS(
                f'🌟 Starting Django development server on http://{options["host"]}:{options["port"]}'
            )
        )
        self.stdout.write("Press Ctrl+C to stop the server")
        self.stdout.write("")

        try:
            if options["use_runserver_plus"] or self.has_runserver_plus():
                execute_from_command_line(
                    [
                        "manage.py",
                        "runserver_plus",
                        f'{options["host"]}:{options["port"]}',
                    ]
                )
            else:
                execute_from_command_line(
                    ["manage.py", "runserver", f'{options["host"]}:{options["port"]}']
                )
        except KeyboardInterrupt:
            self.stdout.write("")
            self.stdout.write(self.style.SUCCESS("👋 Development server stopped"))

    def get_server_command(self, options):
        """Determine which server command to use."""
        if options["use_runserver_plus"] or self.has_runserver_plus():
            return "runserver_plus"
        return "runserver"

    def has_runserver_plus(self):
        """Check if runserver_plus is available (django-extensions)."""
        try:
            import django_extensions

            return True
        except ImportError:
            return False
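
An illustrative invocation of the new `rundev` command from Python; `call_command` maps the CLI flags defined above onto keyword arguments.

```python
from django.core.management import call_command

# Equivalent to: uv run manage.py rundev --port 8001 --skip-setup
call_command("rundev", port="8001", skip_setup=True)
```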
226
backend/apps/core/management/commands/setup_dev.py
Normal file
226
backend/apps/core/management/commands/setup_dev.py
Normal file
@@ -0,0 +1,226 @@
|
|||||||
|
"""
|
||||||
|
Django management command to set up the development environment.
|
||||||
|
|
||||||
|
This command performs all the setup tasks that the dev_server.sh script does,
|
||||||
|
allowing the project to run without requiring the shell script.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
from django.core.management.base import BaseCommand
|
||||||
|
from django.core.management import execute_from_command_line
|
||||||
|
from django.conf import settings
|
||||||
|
|
||||||
|
|
||||||
|
class Command(BaseCommand):
|
||||||
|
help = "Set up the development environment"
|
||||||
|
|
||||||
|
def add_arguments(self, parser):
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-migrations",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip running database migrations",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-static",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip collecting static files",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-tailwind",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip building Tailwind CSS",
|
||||||
|
)
|
||||||
|
parser.add_argument(
|
||||||
|
"--skip-superuser",
|
||||||
|
action="store_true",
|
||||||
|
help="Skip creating development superuser",
|
||||||
|
)
|
||||||
|
|
||||||
|
def handle(self, *args, **options):
|
||||||
|
"""Run the development setup process."""
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("🚀 Setting up ThrillWiki Development Environment...")
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create necessary directories
|
||||||
|
self.create_directories()
|
||||||
|
|
||||||
|
# Run database migrations if needed
|
||||||
|
if not options["skip_migrations"]:
|
||||||
|
self.run_migrations()
|
||||||
|
|
||||||
|
# Seed sample data
|
||||||
|
self.seed_sample_data()
|
||||||
|
|
||||||
|
# Create superuser if it doesn't exist
|
||||||
|
if not options["skip_superuser"]:
|
||||||
|
self.create_superuser()
|
||||||
|
|
||||||
|
# Collect static files
|
||||||
|
if not options["skip_static"]:
|
||||||
|
self.collect_static()
|
||||||
|
|
||||||
|
# Build Tailwind CSS
|
||||||
|
if not options["skip_tailwind"]:
|
||||||
|
self.build_tailwind()
|
||||||
|
|
||||||
|
# Run system checks
|
||||||
|
self.run_system_checks()
|
||||||
|
|
||||||
|
# Display environment info
|
||||||
|
self.display_environment_info()
|
||||||
|
|
||||||
|
self.stdout.write(
|
||||||
|
self.style.SUCCESS("✅ Development environment setup complete!")
|
||||||
|
)
|
||||||
|
|
||||||
|
def create_directories(self):
|
||||||
|
"""Create necessary directories."""
|
||||||
|
self.stdout.write("📁 Creating necessary directories...")
|
||||||
|
directories = ["logs", "profiles", "media", "staticfiles", "static/css"]
|
||||||
|
|
||||||
|
        for directory in directories:
            dir_path = Path(settings.BASE_DIR) / directory
            dir_path.mkdir(parents=True, exist_ok=True)

        self.stdout.write(self.style.SUCCESS("✅ Directories created"))

    def run_migrations(self):
        """Run database migrations if needed."""
        self.stdout.write("🗄️ Checking database migrations...")

        try:
            # Check if migrations are up to date
            result = subprocess.run(
                [sys.executable, "manage.py", "migrate", "--check"],
                capture_output=True,
                text=True,
            )

            if result.returncode == 0:
                self.stdout.write(
                    self.style.SUCCESS("✅ Database migrations are up to date")
                )
            else:
                self.stdout.write("🔄 Running database migrations...")
                subprocess.run(
                    [sys.executable, "manage.py", "migrate", "--noinput"], check=True
                )
                self.stdout.write(
                    self.style.SUCCESS("✅ Database migrations completed")
                )

        except subprocess.CalledProcessError as e:
            self.stdout.write(
                self.style.WARNING(f"⚠️ Migration error (continuing): {e}")
            )

    def seed_sample_data(self):
        """Seed sample data to the database."""
        self.stdout.write("🌱 Seeding sample data...")

        try:
            subprocess.run(
                [sys.executable, "manage.py", "seed_sample_data"], check=True
            )
            self.stdout.write(self.style.SUCCESS("✅ Sample data seeded"))
        except subprocess.CalledProcessError:
            self.stdout.write(
                self.style.WARNING("⚠️ Could not seed sample data (continuing)")
            )

    def create_superuser(self):
        """Create development superuser if it doesn't exist."""
        self.stdout.write("👤 Checking for superuser...")

        try:
            from django.contrib.auth import get_user_model

            User = get_user_model()

            if User.objects.filter(is_superuser=True).exists():
                self.stdout.write(self.style.SUCCESS("✅ Superuser already exists"))
            else:
                self.stdout.write("👤 Creating development superuser (admin/admin)...")
                if not User.objects.filter(username="admin").exists():
                    User.objects.create_superuser("admin", "admin@example.com", "admin")
                    self.stdout.write(
                        self.style.SUCCESS("✅ Created superuser: admin/admin")
                    )
                else:
                    self.stdout.write(
                        self.style.SUCCESS("✅ Admin user already exists")
                    )

        except Exception as e:
            self.stdout.write(self.style.WARNING(f"⚠️ Could not create superuser: {e}"))

    def collect_static(self):
        """Collect static files for development."""
        self.stdout.write("📦 Collecting static files...")

        try:
            subprocess.run(
                [sys.executable, "manage.py", "collectstatic", "--noinput", "--clear"],
                check=True,
            )
            self.stdout.write(self.style.SUCCESS("✅ Static files collected"))
        except subprocess.CalledProcessError as e:
            self.stdout.write(
                self.style.WARNING(f"⚠️ Could not collect static files: {e}")
            )

    def build_tailwind(self):
        """Build Tailwind CSS if npm is available."""
        self.stdout.write("🎨 Building Tailwind CSS...")

        try:
            # Check if npm is available
            subprocess.run(["npm", "--version"], capture_output=True, check=True)

            # Build Tailwind CSS
            subprocess.run(
                [sys.executable, "manage.py", "tailwind", "build"], check=True
            )
            self.stdout.write(self.style.SUCCESS("✅ Tailwind CSS built"))

        except (subprocess.CalledProcessError, FileNotFoundError):
            self.stdout.write(
                self.style.WARNING(
                    "⚠️ npm not found or Tailwind build failed, skipping"
                )
            )

    def run_system_checks(self):
        """Run Django system checks."""
        self.stdout.write("🔍 Running system checks...")

        try:
            subprocess.run([sys.executable, "manage.py", "check"], check=True)
            self.stdout.write(self.style.SUCCESS("✅ System checks passed"))
        except subprocess.CalledProcessError:
            self.stdout.write(
                self.style.WARNING("❌ System checks failed, but continuing...")
            )

    def display_environment_info(self):
        """Display development environment information."""
        self.stdout.write("")
        self.stdout.write(self.style.SUCCESS("🌍 Development Environment:"))
        self.stdout.write(f" - Settings Module: {settings.SETTINGS_MODULE}")
        self.stdout.write(f" - Debug Mode: {settings.DEBUG}")
        self.stdout.write(" - Database: PostgreSQL with PostGIS")
        self.stdout.write(" - Cache: Local memory cache")
        self.stdout.write(" - Admin URL: http://localhost:8000/admin/")
        self.stdout.write(" - Admin User: admin / admin")
        self.stdout.write(" - Silk Profiler: http://localhost:8000/silk/")
        self.stdout.write(" - Debug Toolbar: Available on debug pages")
        self.stdout.write(" - API Documentation: http://localhost:8000/api/docs/")
        self.stdout.write("")
        self.stdout.write("🌟 Ready to start development server with:")
        self.stdout.write(" python manage.py runserver")
        self.stdout.write("")
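The hunk above enters partway through the setup command's directory-creation step and then defines its remaining helper methods; the class header, module imports, and entry point fall outside the visible range. For orientation only, a minimal sketch of how a `handle()` method could chain these helpers is shown below. The helper method names are taken from the excerpt, but the ordering, the `help` text, and the import list are assumptions, not the repository's actual code.

```python
# Sketch only -- the real handle() and module imports are not part of this excerpt.
# The imports listed are the names the helper methods above appear to rely on
# (subprocess, sys, Path, settings); everything else here is assumed.
import subprocess
import sys
from pathlib import Path

from django.conf import settings
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Bootstrap the local development environment (assumed description)"

    def handle(self, *args, **options):
        # Each helper logs its own status and continues on failure,
        # matching the try/except pattern in the excerpt. These methods are
        # the ones defined in the hunk above, not redefined here.
        self.run_migrations()
        self.seed_sample_data()
        self.create_superuser()
        self.collect_static()
        self.build_tailwind()
        self.run_system_checks()
        self.display_environment_info()
```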
@@ -1,20 +1,21 @@
 from django.core.management.base import BaseCommand
 from django.core.cache import cache
-from parks.models import Park
+from apps.parks.models import Park
-from rides.models import Ride
+from apps.rides.models import Ride
-from core.analytics import PageView
+from apps.core.analytics import PageView


 class Command(BaseCommand):
-    help = 'Updates trending parks and rides cache based on views in the last 24 hours'
+    help = "Updates trending parks and rides cache based on views in the last 24 hours"

     def handle(self, *args, **kwargs):
         """
         Updates the trending parks and rides in the cache.

         This command is designed to be run every hour via cron to keep the trending
         items up to date. It looks at page views from the last 24 hours and caches
         the top 10 most viewed parks and rides.

         The cached data is used by the home page to display trending items without
         having to query the database on every request.
         """
@@ -23,12 +24,12 @@ class Command(BaseCommand):
         trending_rides = PageView.get_trending_items(Ride, hours=24, limit=10)

         # Cache the results for 1 hour
-        cache.set('trending_parks', trending_parks, 3600)  # 3600 seconds = 1 hour
+        cache.set("trending_parks", trending_parks, 3600)  # 3600 seconds = 1 hour
-        cache.set('trending_rides', trending_rides, 3600)
+        cache.set("trending_rides", trending_rides, 3600)

         self.stdout.write(
             self.style.SUCCESS(
-                'Successfully updated trending parks and rides. '
+                "Successfully updated trending parks and rides. "
-                'Cached 10 items each for parks and rides based on views in the last 24 hours.'
+                "Cached 10 items each for parks and rides based on views in the last 24 hours."
             )
         )
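As the docstring explains, this command is meant to run hourly (for example from cron) so that the home page can read the cached lists instead of querying the database on every request. A minimal sketch of that consuming side is shown below, assuming the cache keys the command sets ("trending_parks", "trending_rides"); the view function and template name are illustrative assumptions, not code from this repository.

```python
# Sketch of a consumer of the trending cache -- the view and template names are
# assumptions; only the cache keys and the one-hour TTL come from the command above.
from django.core.cache import cache
from django.shortcuts import render


def home(request):
    # Fall back to empty lists if the hourly job has not populated the cache yet.
    trending_parks = cache.get("trending_parks", [])
    trending_rides = cache.get("trending_rides", [])
    return render(
        request,
        "home.html",
        {"trending_parks": trending_parks, "trending_rides": trending_rides},
    )
```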