mirror of https://github.com/pacnpal/thrilltrack-explorer.git
synced 2025-12-23 16:51:14 -05:00

Add logging for submission data

docs/logging/SUBMISSION_FLOW_LOGGING.md (new file, 177 lines)
@@ -0,0 +1,177 @@
# Submission Flow Logging

This document describes the structured logging implemented for tracking submission data through the moderation pipeline.

## Overview

The submission flow has structured logging at each critical stage to enable debugging and auditing of data transformations.

## Logging Stages

### 1. Edit Stage

**Location**: `src/lib/submissionItemsService.ts` → `updateSubmissionItem()`

**Log Points**:

- Update item start (when moderator edits)
- Saving park data (before database write)
- Park data saved successfully (after database write)

**Log Format**:

```typescript
console.info('[Submission Flow] Update item start', {
  itemId: string,
  hasItemData: boolean,
  statusUpdate: string | undefined,
  timestamp: ISO string
});

console.info('[Submission Flow] Saving park data', {
  itemId: string,
  parkSubmissionId: string,
  hasLocation: boolean,
  locationData: object | null,
  fields: string[],
  timestamp: ISO string
});
```

### 2. Validation Stage

**Location**: `src/hooks/moderation/useModerationActions.ts` → `handleApproveSubmission()`

**Log Points**:

- Preparing items for validation (after fetching from DB)
- Transformed park data (after temp_location_data → location transform)
- Starting validation (before schema validation)
- Validation completed (after schema validation)
- Validation found blocking errors (if errors exist)

**Log Format**:

```typescript
console.info('[Submission Flow] Transformed park data for validation', {
  itemId: string,
  hasLocation: boolean,
  locationData: object | null,
  transformedHasLocation: boolean,
  timestamp: ISO string
});

console.warn('[Submission Flow] Validation found blocking errors', {
  submissionId: string,
  itemsWithErrors: Array<{
    itemId: string,
    itemType: string,
    errors: string[]
  }>,
  timestamp: ISO string
});
```

### 3. Approval Stage

**Location**: `src/lib/submissionItemsService.ts` → `approveSubmissionItems()`

**Log Points**:

- Approval process started (beginning of batch approval)
- Processing item for approval (for each item)
- Entity created successfully (after entity creation)

**Log Format**:

```typescript
console.info('[Submission Flow] Approval process started', {
  itemCount: number,
  itemIds: string[],
  itemTypes: string[],
  userId: string,
  timestamp: ISO string
});

console.info('[Submission Flow] Processing item for approval', {
  itemId: string,
  itemType: string,
  isEdit: boolean,
  hasLocation: boolean,
  locationData: object | null,
  timestamp: ISO string
});
```

## Key Data Transformations Logged

### Park Location Data

The most critical transformation logged is the park location data flow:

1. **Database Storage**: `temp_location_data` (JSONB in `park_submissions`)
2. **Display/Edit**: `location` (transformed for form compatibility)
3. **Validation**: `location` (transformed from `temp_location_data`)
4. **Save**: `temp_location_data` (transformed back for storage)
5. **Approval**: `location` (transformed from `temp_location_data`)

**Why this matters**: Location validation errors typically indicate a break in this transformation chain.
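For reference, the two directions of this transform look roughly like the following. This is a minimal sketch based only on the field names visible in the diffs below (`temp_location_data`, `location`); the helper functions themselves are illustrative, not the project's actual code, where the transforms live inline in the services.

```typescript
// Illustrative helpers only — the real transforms are inlined in
// useModerationActions.ts and submissionItemsService.ts.
type LocationData = Record<string, unknown>;

interface ParkSubmissionRow {
  temp_location_data: LocationData | null; // JSONB column in park_submissions
  [key: string]: unknown;
}

// Database row -> form/validation shape: expose `location`, hide the storage column.
export function toFormShape(parkSub: ParkSubmissionRow) {
  return {
    ...parkSub,
    location: parkSub.temp_location_data ?? undefined,
    temp_location_data: undefined,
  };
}

// Form shape -> database row: move `location` back into `temp_location_data`.
export function toStorageShape(formData: { location?: LocationData | null; [key: string]: unknown }) {
  const { location, ...rest } = formData;
  return {
    ...rest,
    temp_location_data: location ?? null,
  };
}
```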
## Debugging Workflow

### To debug location validation errors:

1. **Check browser console** for `[Submission Flow]` logs
2. **Verify data at each stage**:

   ```javascript
   // Edit stage - should show temp_location_data being saved
   [Submission Flow] Saving park data { hasLocation: true, locationData: {...} }

   // Validation stage - should show location after transformation
   [Submission Flow] Transformed park data { hasLocation: true, transformedHasLocation: true }

   // Approval stage - should show location present
   [Submission Flow] Processing item { hasLocation: true, locationData: {...} }
   ```

3. **Look for missing data**:
   - If `hasLocation: false` in "Saving park data" → Edit form didn't capture location
   - If `hasLocation: true` but `transformedHasLocation: false` → Transformation failed
   - If validation logs missing → Check database query/fetch
## Error Logging Integration

Structured errors use the `handleError()` utility from `@/lib/errorHandler`:

```typescript
handleError(error, {
  action: 'Update Park Submission Data',
  metadata: {
    itemId,
    parkSubmissionId,
    updateFields: Object.keys(updateData)
  }
});
```

Errors are logged to:

- **Database**: `request_metadata` table
- **Admin Panel**: `/admin/error-monitoring`
- **Console**: Browser developer tools (with reference ID)
## Log Filtering

To filter logs in the browser console:

```javascript
// All submission flow logs
localStorage.setItem('logFilter', 'Submission Flow');

// Specific stages
localStorage.setItem('logFilter', 'Validation');
localStorage.setItem('logFilter', 'Saving park data');
```
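The `logFilter` key is only meaningful if log calls are routed through a helper that reads it; no such helper appears in this commit, so the sketch below is purely illustrative of how one could work (the `submissionFlowLog` name and its behavior are assumptions, not the project's actual API):

```typescript
// Hypothetical wrapper (not in this commit): only emits logs whose message
// contains the substring stored under the 'logFilter' localStorage key.
type LogPayload = Record<string, unknown>;

export function submissionFlowLog(message: string, payload: LogPayload): void {
  const filter =
    typeof localStorage !== 'undefined' ? localStorage.getItem('logFilter') : null;

  // No filter set → log everything; otherwise require a substring match.
  if (filter && !message.includes(filter)) return;

  console.info(`[Submission Flow] ${message}`, {
    ...payload,
    timestamp: new Date().toISOString(),
  });
}
```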
## Performance Considerations

- Logs use `console.info()` and `console.warn()`, which are stripped in production builds
- Sensitive data (passwords, tokens) is never logged
- Object logging uses shallow copies to avoid memory leaks
- Timestamps use ISO format for timezone-aware debugging
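Stripping these calls in production depends on the build configuration rather than on the calls themselves. As a hedged illustration only, assuming a Vite + esbuild toolchain (the project's actual build setup is not part of this commit), the calls can be marked as pure so minification removes them:

```typescript
// vite.config.ts — illustrative sketch, assuming Vite + esbuild.
import { defineConfig } from 'vite';

export default defineConfig(({ mode }) => ({
  esbuild: {
    // In production, mark these calls as side-effect free so the minifier
    // drops them; console.error is left intact for real failures.
    pure: mode === 'production' ? ['console.info', 'console.warn'] : [],
  },
}));
```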
## Future Enhancements

- [ ] Add edge function logging for backend approval process
- [ ] Add real-time log streaming to admin dashboard
- [ ] Add log retention policies (30-day automatic cleanup)
- [ ] Add performance metrics (time between stages)
- [ ] Add user action correlation (who edited what when)

src/hooks/moderation/useModerationActions.ts

@@ -154,6 +154,13 @@ export function useModerationActions(config: ModerationActionsConfig): Moderatio
     }
 
     if (fullItems && fullItems.length > 0) {
+      console.info('[Submission Flow] Preparing items for validation', {
+        submissionId: item.id,
+        itemCount: fullItems.length,
+        itemTypes: fullItems.map(i => i.item_type),
+        timestamp: new Date().toISOString()
+      });
+
       // Transform to include item_data
       const itemsWithData = fullItems.map(item => {
         let itemData = {};
@@ -166,6 +173,14 @@ export function useModerationActions(config: ModerationActionsConfig): Moderatio
             location: parkSub.temp_location_data || undefined,
             temp_location_data: undefined
           };
+
+          console.info('[Submission Flow] Transformed park data for validation', {
+            itemId: item.id,
+            hasLocation: !!parkSub.temp_location_data,
+            locationData: parkSub.temp_location_data,
+            transformedHasLocation: !!(itemData as any).location,
+            timestamp: new Date().toISOString()
+          });
           break;
         }
         case 'ride':
@@ -202,8 +217,21 @@ export function useModerationActions(config: ModerationActionsConfig): Moderatio
 
       // Run validation on all items
       try {
+        console.info('[Submission Flow] Starting validation', {
+          submissionId: item.id,
+          itemCount: itemsWithData.length,
+          itemTypes: itemsWithData.map(i => i.item_type),
+          timestamp: new Date().toISOString()
+        });
+
         const validationResults = await validateMultipleItems(itemsWithData);
 
+        console.info('[Submission Flow] Validation completed', {
+          submissionId: item.id,
+          resultsCount: validationResults.size,
+          timestamp: new Date().toISOString()
+        });
+
         // Check for blocking errors
         const itemsWithBlockingErrors = itemsWithData.filter(item => {
           const result = validationResults.get(item.id);
@@ -212,6 +240,16 @@ export function useModerationActions(config: ModerationActionsConfig): Moderatio
 
         // CRITICAL: Block approval if any item has blocking errors
         if (itemsWithBlockingErrors.length > 0) {
+          console.warn('[Submission Flow] Validation found blocking errors', {
+            submissionId: item.id,
+            itemsWithErrors: itemsWithBlockingErrors.map(i => ({
+              itemId: i.id,
+              itemType: i.item_type,
+              errors: validationResults.get(i.id)?.blockingErrors
+            })),
+            timestamp: new Date().toISOString()
+          });
+
           // Log detailed blocking errors
           itemsWithBlockingErrors.forEach(item => {
             const result = validationResults.get(item.id);

src/lib/submissionItemsService.ts

@@ -232,13 +232,27 @@ export async function updateSubmissionItem(
 ): Promise<void> {
   const { item_data, original_data, ...cleanUpdates } = updates;
 
+  // Log submission item update start
+  console.info('[Submission Flow] Update item start', {
+    itemId,
+    hasItemData: !!item_data,
+    statusUpdate: cleanUpdates.status,
+    timestamp: new Date().toISOString()
+  });
+
   // Update submission_items table
   const { error } = await supabase
     .from('submission_items')
     .update(cleanUpdates)
     .eq('id', itemId);
 
-  if (error) throw error;
+  if (error) {
+    handleError(error, {
+      action: 'Update Submission Item',
+      metadata: { itemId, updates: cleanUpdates }
+    });
+    throw error;
+  }
 
   // If item_data is provided, update the relational table
   if (item_data !== undefined) {
@@ -272,12 +286,37 @@ export async function updateSubmissionItem(
         if (updateData[key] === undefined) delete updateData[key];
       });
 
+      console.info('[Submission Flow] Saving park data', {
+        itemId,
+        parkSubmissionId: item.park_submission_id,
+        hasLocation: !!updateData.temp_location_data,
+        locationData: updateData.temp_location_data,
+        fields: Object.keys(updateData),
+        timestamp: new Date().toISOString()
+      });
+
       const { error: updateError } = await supabase
         .from('park_submissions')
         .update(updateData)
         .eq('id', item.park_submission_id);
 
-      if (updateError) throw updateError;
+      if (updateError) {
+        handleError(updateError, {
+          action: 'Update Park Submission Data',
+          metadata: {
+            itemId,
+            parkSubmissionId: item.park_submission_id,
+            updateFields: Object.keys(updateData)
+          }
+        });
+        throw updateError;
+      }
+
+      console.info('[Submission Flow] Park data saved successfully', {
+        itemId,
+        parkSubmissionId: item.park_submission_id,
+        timestamp: new Date().toISOString()
+      });
       break;
     }
     case 'ride': {
@@ -340,6 +379,14 @@ export async function approveSubmissionItems(
     throw new Error('User authentication required to approve items');
   }
 
+  console.info('[Submission Flow] Approval process started', {
+    itemCount: items.length,
+    itemIds: items.map(i => i.id),
+    itemTypes: items.map(i => i.item_type),
+    userId,
+    timestamp: new Date().toISOString()
+  });
+
   // Sort by dependency order (parents first)
   const sortedItems = topologicalSort(items);
 
@@ -363,6 +410,15 @@ export async function approveSubmissionItems(
       ('ride_model_id' in itemData && itemData.ride_model_id)
     );
 
+    console.info('[Submission Flow] Processing item for approval', {
+      itemId: item.id,
+      itemType: item.item_type,
+      isEdit,
+      hasLocation: !!(itemData as any).location,
+      locationData: (itemData as any).location,
+      timestamp: new Date().toISOString()
+    });
+
     // Create the entity based on type with dependency resolution
     // PASS sortedItems to enable correct index-based resolution
     switch (item.item_type) {
@@ -390,6 +446,14 @@ export async function approveSubmissionItems(
       throw new Error(`Failed to create ${item.item_type}: no entity ID returned`);
     }
 
+    console.info('[Submission Flow] Entity created successfully', {
+      itemId: item.id,
+      itemType: item.item_type,
+      entityId,
+      isEdit,
+      timestamp: new Date().toISOString()
+    });
+
     // Update item status
     await updateSubmissionItem(item.id, {
       status: 'approved' as const,