# Compare commits — `sprint/7-c` (64 commits, head `38b45e8c79`)

## CHANGELOG.md

Format follows [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

---
## [1.14.2] — 2026-04-22

### Fixed

- **Admin "Login as Partner" impersonation now completes into `/dashboard`** instead of bouncing back to `/login?error=ImpersonationFailed`. Two linked issues:
  - **`ALLOWED_HOSTS`** (`eventify/settings.py`) — the partner portal's server-side `authorize()` (Next.js) calls `${BACKEND_API_URL}/api/v1/auth/me/` with `BACKEND_API_URL=http://eventify-backend:8000`, so the HTTP `Host` header was `eventify-backend`, which was not in the Django allowlist. `SecurityMiddleware` rejected the request with HTTP 400 (`DisallowedHost`), `authorize()` returned null, `signIn()` failed, and the `/impersonate` page redirected to the login error. Added `partner.eventifyplus.com`, `eventify-backend`, and `eventify-django` to `ALLOWED_HOSTS`. The same Host issue was silently breaking regular partner password login too — fixed as a side effect.
  - **`UserSerializer` missing `partner` field** (`admin_api/serializers.py`) — `MeView` returned the `/api/v1/auth/me/` payload with no `partner` key, so the partner portal's `auth.ts` set `partnerId: ""` on the NextAuth session. Downstream dashboard queries that filter by `partnerId` would then return empty/403. Added `partner = PrimaryKeyRelatedField(read_only=True)` to the serializer's `fields` list. The payload now includes `"partner": <id>`.
- Deploy: `docker cp` both files into **both** `eventify-backend` and `eventify-django` containers + `kill -HUP 1` on each (per the shared admin_api rule).
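The Host rejection in the first bullet can be modelled with a small sketch of hostname validation. This is a simplification for illustration, not Django's actual `validate_host` implementation:

```python
def host_allowed(host: str, allowed_hosts: list[str]) -> bool:
    """Simplified model of an ALLOWED_HOSTS check (illustration only)."""
    hostname = host.split(':')[0].lower()  # strip port, normalise case
    for pattern in allowed_hosts:
        pattern = pattern.lower()
        if pattern == '*' or hostname == pattern:
            return True
        # leading-dot pattern matches the domain and any subdomain
        if pattern.startswith('.') and (hostname == pattern[1:] or hostname.endswith(pattern)):
            return True
    return False

# Before the fix: container-to-container calls carried Host: eventify-backend:8000
host_allowed('eventify-backend:8000', ['partner.eventifyplus.com'])  # False → HTTP 400
host_allowed('eventify-backend:8000',
             ['partner.eventifyplus.com', 'eventify-backend', 'eventify-django'])  # True
```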
---
## [1.14.1] — 2026-04-22

### Fixed

- **`_serialize_review()` now returns `profile_photo`** (`mobile_api/views/reviews.py`) — the `/api/reviews/list` payload was missing the reviewer's photo URL, so the consumer app had no choice but to render DiceBear placeholders for every reviewer, regardless of whether they had uploaded a real profile picture
  - Resolves `r.reviewer.profile_picture.url` when the field is non-empty and the file name is not `default.png` (the model's placeholder default); returns an empty string otherwise so the frontend can fall back cleanly to DiceBear
  - Mirrors the existing pattern in `mobile_api/views/user.py` (`LoginView`, `StatusView`, `UpdateProfileView`) — same defensive try/except around the FK dereference
  - Pure serializer change — no migration, no URL change, no permission change; `gunicorn kill -HUP 1` picks it up
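The resolution logic described above amounts to the following sketch. The helper name is hypothetical; the field names (`profile_picture`, `default.png`) come from the entry:

```python
def resolve_profile_photo(reviewer) -> str:
    """Return the reviewer's real photo URL, or '' to trigger the
    DiceBear fallback (sketch of the serializer logic, not the real code)."""
    try:
        pic = reviewer.profile_picture
        if pic and pic.name and not pic.name.endswith('default.png'):
            return pic.url
    except (AttributeError, ValueError):
        # FK dereference can fail on missing rows/files; fail soft
        pass
    return ''  # empty string tells the frontend to fall back to DiceBear
```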
---
## [1.14.0] — 2026-04-21

### Added

- **Module-level RBAC scopes for Reviews, Contributions, Leads, Audit Log** — `SCOPE_DEFINITIONS` in `admin_api/views.py` extended with 13 new entries so the admin dashboard's Roles & Permissions grid and the new Base Permissions tab can grant/revoke access at module granularity:
  - Reviews: `reviews.read`, `reviews.moderate`, `reviews.delete`
  - Contributions: `contributions.read`, `contributions.approve`, `contributions.reject`, `contributions.award`
  - Leads: `leads.read`, `leads.write`, `leads.assign`, `leads.convert`
  - Audit Log: `audit.read`, `audit.export`
- **`NotificationSchedule` audit emissions** in `admin_api/views.py` — `NotificationScheduleListView.post` and `NotificationScheduleDetailView.patch` / `.delete` now write `notification.schedule.created` / `.updated` / `.deleted` `AuditLog` rows. Update emits only when at least one field actually changed. Delete captures `name`/`notification_type`/`cron_expression` before the row is deleted, so the audit trail survives the deletion

### Fixed

- **`StaffProfile.get_allowed_modules()`** in `admin_api/models.py` — `SCOPE_TO_MODULE` was missing the `'reviews': 'reviews'` entry, so staff granted `reviews.*` scopes could not see the Reviews module in their sidebar. Added
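The "emits only when at least one field actually changed" behaviour boils down to a field diff. A minimal sketch (the helper name is an assumption, not the actual code):

```python
def changed_fields(before: dict, after: dict) -> list[str]:
    """Names of keys whose value differs between two snapshots of a row."""
    return [k for k in after if before.get(k) != after[k]]

# Only write an audit row when the PATCH actually changed something:
diff = changed_fields(
    {'name': 'Daily digest', 'cron_expression': '0 9 * * *'},
    {'name': 'Daily digest', 'cron_expression': '0 8 * * *'},
)
if diff:  # here diff == ['cron_expression']
    pass  # e.g. emit 'notification.schedule.updated' with {'changed_fields': diff}
```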
---
## [1.13.0] — 2026-04-21

### Added

- **Full admin interaction audit coverage** — `_audit_log()` calls added to 12 views; every meaningful admin state change now writes an `AuditLog` row:

| View | Action slug(s) | Notes |
|---|---|---|
| `AdminLoginView` | `auth.admin_login`, `auth.admin_login_failed` | Uses new `user=` kwarg (anonymous at login time) |
| `PartnerStatusView` | `partner.status_changed` | Wrapped in `transaction.atomic()` |
| `PartnerOnboardView` | `partner.onboarded` | Inside existing `transaction.atomic()` block |
| `PartnerStaffCreateView` | `partner.staff.created` | Logged after `staff_user.save()` |
| `EventCreateView` | `event.created` | `title`, `partner_id`, `source` in details |
| `EventUpdateView` | `event.updated` | `changed_fields` list in details, wrapped in `transaction.atomic()` |
| `EventDeleteView` | `event.deleted` | `title` + `partner_id` captured BEFORE delete, wrapped in `transaction.atomic()` |
| `SettlementReleaseView` | `settlement.released` | prev/new status in details, `transaction.atomic()` |
| `ReviewDeleteView` | `review.deleted` | `reviewer_user_id` + `event_id` + `rating` captured BEFORE delete |
| `PaymentGatewaySettingsView` | `gateway.created`, `gateway.updated`, `gateway.deleted` | `changed_fields` on update |
| `EventPrimaryImageView` | `event.primary_image_changed` | prev + new primary image id in details |
| `LeadUpdateView` | `lead.updated` | `changed_fields` list; only emits if any field was changed |

- **`_audit_log` helper** — optional `user=None` kwarg so `AdminLoginView` can supply the authenticated user explicitly (`request.user` is still anonymous at that point in the login flow). All 20+ existing callers are unaffected (no kwarg = falls through to `request.user`).
- **`admin_api/tests.py`** — `AuthAuditEmissionTests` (login success + failed login) and `EventCrudAuditTests` (create/update/delete) bring the total test count to 16, all green
---
## [1.12.0] — 2026-04-21

### Added

- **Audit coverage for four moderation endpoints** — every admin state change now leaves a matching row in `AuditLog`, written in the same `transaction.atomic()` block as the state change so the log can never disagree with the database:
  - `UserStatusView` (`PATCH /api/v1/users/<id>/status/`) — `user.suspended`, `user.banned`, `user.reinstated`, `user.flagged`; details capture `reason`, `previous_status`, `new_status`
  - `EventModerationView` (`PATCH /api/v1/events/<id>/moderate/`) — `event.approved`, `event.rejected`, `event.flagged`, `event.featured`, `event.unfeatured`; details include `reason`, `partner_id`, `previous_status`/`new_status`, `previous_is_featured`/`new_is_featured`
  - `ReviewModerationView` (`PATCH /api/v1/reviews/<id>/moderate/`) — `review.approved`, `review.rejected`, `review.edited`; details include `reject_reason`, `edited_text` flag, `original_text` on edits
  - `PartnerKYCReviewView` (`POST /api/v1/partners/<id>/kyc/review/`) — `partner.kyc.approved`, `partner.kyc.rejected`, `partner.kyc.requested_info` (the new `requested_info` decision leaves compliance state intact and only records the info request)
- **`GET /api/v1/rbac/audit-log/metrics/`** — `AuditLogMetricsView` returns `total`, `today`, `week`, `distinct_users`, and a `by_action_group` breakdown (`create`/`update`/`delete`/`moderate`/`auth`/`other`). Cached 60 s under key `admin_api:audit_log:metrics:v1`; pass `?nocache=1` to bypass (useful from the Django shell during incident response)
- **`GET /api/v1/rbac/audit-log/`** — free-text `search` parameter (Q-filter over `action`, `target_type`, `target_id`, `user__username`, `user__email`); `page_size` now bounded to `[1, 200]` with defensive fallback to defaults on non-integer input
- **`accounts.User.ALL_MODULES`** — appended `audit-log`; `StaffProfile.get_allowed_modules()` adds `'audit'` → `'audit-log'` to `SCOPE_TO_MODULE` so scope-based staff resolve the module correctly
- **`admin_api/migrations/0005_auditlog_indexes.py`** — composite indexes `(action, -created_at)` and `(target_type, target_id)` on `AuditLog` to keep the /audit-log page fast past ~10k rows; reversible via Django's default `RemoveIndex` reverse op
- **`admin_api/tests.py`** — `AuditLogListViewTests`, `AuditLogMetricsViewTests`, `UserStatusAuditEmissionTests` covering list shape, search, pagination bounds, metrics shape + `nocache`, and audit emission on suspend / ban / reinstate

### Deploy notes

Admin users created before this release won't have `audit-log` in their `allowed_modules` TextField. Backfill with:

```python
# Django shell
from accounts.models import User

for u in User.objects.filter(role__in=['admin', 'manager']):
    mods = [m.strip() for m in (u.allowed_modules or '').split(',') if m.strip()]
    if 'audit-log' not in mods and mods:  # only touch users with explicit lists
        u.allowed_modules = ','.join(mods + ['audit-log'])
        u.save(update_fields=['allowed_modules'])
```

Users on the implicit full-access list (empty `allowed_modules` + admin role) pick up the new module automatically via `get_allowed_modules()`.
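The `page_size` bounding described above amounts to a clamp with a parse fallback. A sketch (the helper name and the default of 25 are assumptions; the `[1, 200]` bound is from the entry):

```python
def clamp_page_size(raw, default=25, lo=1, hi=200):
    """Defensive parse: non-integer input falls back to the default,
    integers are clamped into [lo, hi]."""
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return default
    return max(lo, min(hi, value))

clamp_page_size('50')      # → 50
clamp_page_size('9999')    # → 200
clamp_page_size('banana')  # → 25
```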
---
## [1.11.0] — 2026-04-12

### Added

- **Worldline Connect payment integration** (`banking_operations/worldline/`)
  - `client.py` — `WorldlineClient`: HMAC-SHA256 signed requests, `create_hosted_checkout()`, `get_hosted_checkout_status()`, `verify_webhook_signature()`
  - `views.py` — `POST /api/payments/webhook/` (CSRF-exempt, signature-verified Worldline server callback) + `POST /api/payments/verify/` (frontend polls on return URL)
  - `emails.py` — HTML ticket confirmation email with per-ticket QR codes embedded as base64 inline images
  - `WorldlineOrder` model in `banking_operations/models.py` — tracks each hosted-checkout session (`hosted_checkout_id`, `reference_id`, `status`, `raw_response`, `webhook_payload`)
- **`Booking.payment_status`** field — `pending / paid / failed / cancelled` (default `pending`); migration `bookings/0002_booking_payment_status`
- **`banking_operations/services.py::transaction_initiate`** — implemented (was a stub); calls the Worldline API, creates a `WorldlineOrder`, and returns `payment_url` back to `CheckoutAPI`
- **Settings**: `WORLDLINE_MERCHANT_ID`, `WORLDLINE_API_KEY_ID`, `WORLDLINE_API_SECRET_KEY`, `WORLDLINE_WEBHOOK_SECRET_KEY`, `WORLDLINE_API_ENDPOINT` (default: sandbox), `WORLDLINE_RETURN_URL`
- **Requirements**: `requests>=2.31.0`, `qrcode[pil]>=7.4.2`

### Flow

1. User adds tickets to cart → `POST /api/bookings/checkout/` creates Bookings + calls `transaction_initiate`
2. `transaction_initiate` creates a `WorldlineOrder` + calls Worldline → returns the redirect URL
3. Frontend redirects the user to the Worldline hosted checkout page
4. After payment, Worldline redirects to `WORLDLINE_RETURN_URL` (`app.eventifyplus.com/booking/confirm?hostedCheckoutId=...`)
5. SPA calls `POST /api/payments/verify/` — checks local status; if still pending, polls the Worldline API directly
6. Worldline webhook fires `POST /api/payments/webhook/` → generates Tickets (one per quantity), marks the Booking `paid`, sends the confirmation email with QR codes
7. Partner scans the QR code at the event → existing `POST /api/bookings/check-in/` marks `Ticket.is_checked_in=True`
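The general shape of `verify_webhook_signature()` in step 6 is a constant-time HMAC-SHA256 check over the raw request body. A sketch under the assumption of a base64-encoded digest; the exact header name and encoding Worldline uses may differ:

```python
import base64
import hashlib
import hmac

def verify_webhook_signature(payload: bytes, signature: str, secret: str) -> bool:
    # Recompute the HMAC-SHA256 of the raw body and compare in constant time
    # so timing differences leak nothing about the expected value.
    expected = base64.b64encode(
        hmac.new(secret.encode(), payload, hashlib.sha256).digest()
    ).decode()
    return hmac.compare_digest(expected, signature)

body = b'{"type": "payment.paid", "hostedCheckoutId": "abc123"}'
sig = base64.b64encode(hmac.new(b'webhook-secret', body, hashlib.sha256).digest()).decode()
verify_webhook_signature(body, sig, 'webhook-secret')  # True
verify_webhook_signature(body, sig, 'wrong-secret')    # False
```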
### Deploy requirement

Set in the Django container `.env`:

```
WORLDLINE_MERCHANT_ID=...
WORLDLINE_API_KEY_ID=...
WORLDLINE_API_SECRET_KEY=...
WORLDLINE_WEBHOOK_SECRET_KEY=...
```

`WORLDLINE_API_ENDPOINT` defaults to sandbox — set it to the production URL when going live.
---
## [1.10.0] — 2026-04-10

### Security

- **`GoogleLoginView` audience-check fix** (`POST /api/user/google-login/`) — **CRITICAL security patch**
  - `verify_oauth2_token(token, google_requests.Request())` was called **without** the third `audience` argument, meaning any valid Google-signed ID token from *any* OAuth client was accepted — token spoofing from external apps was trivially possible
  - Fixed to `verify_oauth2_token(token, google_requests.Request(), settings.GOOGLE_CLIENT_ID)` — only tokens whose `aud` claim matches our registered Client ID are now accepted
  - Added a fail-closed guard: if `settings.GOOGLE_CLIENT_ID` is empty the view returns HTTP 503 instead of silently accepting all tokens
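What the third argument adds, in isolation, is a check of the token's `aud` claim against our client ID. A sketch of that check plus the fail-closed guard (the helper name is hypothetical; the real check lives inside `google-auth`):

```python
def check_audience(claims: dict, expected_client_id: str) -> bool:
    """Reject tokens minted for a different OAuth client, even when
    Google's signature on them is valid (sketch, not google-auth itself)."""
    if not expected_client_id:
        # fail closed: misconfiguration must not mean "accept everything"
        raise RuntimeError('GOOGLE_CLIENT_ID unset — refuse all tokens (HTTP 503)')
    return claims.get('aud') == expected_client_id

claims = {'iss': 'https://accounts.google.com',
          'aud': 'attacker-app.apps.googleusercontent.com'}  # hypothetical values
check_audience(claims, 'eventify.apps.googleusercontent.com')  # False → reject with 401
```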
### Changed

- **Removed Clerk scaffolding** — the `@clerk/react` broker approach added in a prior iteration has been replaced with a direct Google Identity Services (GIS) ID-token flow on the frontend. Simpler architecture: one trust boundary instead of three.
  - Removed `ClerkLoginView`, `_clerk_jwks_client`, `_get_clerk_jwks_client()` from `mobile_api/views/user.py`
  - Removed `path('user/clerk-login/', ...)` from `mobile_api/urls.py`
  - Removed `CLERK_JWKS_URL` / `CLERK_ISSUER` / `CLERK_SECRET_KEY` from `eventify/settings.py`; replaced with `GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')`
  - Removed `PyJWT[crypto]>=2.8.0` and `requests>=2.31.0` from `requirements.txt` + `requirements-docker.txt` (no longer needed; `google-auth>=2.0.0` handles verification)

### Added

- **Settings**: `GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')` in `eventify/settings.py`
- **Tests**: `mobile_api/tests.py::GoogleLoginViewTests` — 4 cases: valid token creates a user (audience arg verified), missing `id_token` → 400, `ValueError` (wrong sig / wrong aud) → 401, existing user reuses the DRF token

### Context

- The consumer SPA (`app.eventifyplus.com`) now loads the Google Identity Services script dynamically and POSTs a Google ID token to the existing `/api/user/google-login/` endpoint. Django is the sole session authority. `localStorage.event_token` / `event_user` are unchanged.
- Deploy requirement: set `GOOGLE_CLIENT_ID` in the Django container `.env` **before** deploying — without it the view returns 503 (fail-closed by design).
---
## [1.9.0] — 2026-04-07

### Added

- **Lead Manager** — new `Lead` model in `admin_api` for tracking Schedule-a-Call form submissions and sales inquiries
  - Fields: name, email, phone, event_type, message, status (new/contacted/qualified/converted/closed), source (schedule_call/website/manual), priority (low/medium/high), assigned_to (FK User), notes
  - Migration `admin_api/0003_lead` with indexes on status, priority, created_at, email
- **Consumer endpoint** `POST /api/leads/schedule-call/` — public (AllowAny, CSRF-exempt) endpoint for the Schedule a Call modal; creates a Lead with status=new, source=schedule_call
- **Admin API endpoints** (all IsAuthenticated):
  - `GET /api/v1/leads/metrics/` — total, new today, counts per status
  - `GET /api/v1/leads/` — paginated list with filters (status, priority, source, search, date_from, date_to)
  - `GET /api/v1/leads/<id>/` — single lead detail
  - `PATCH /api/v1/leads/<id>/update/` — update status, priority, assigned_to, notes
- **RBAC**: `leads` added to `ALL_MODULES`, `get_allowed_modules()`, and `StaffProfile.SCOPE_TO_MODULE`
---
## [1.8.3] — 2026-04-06

### Fixed

- **`TopEventsAPI` now works without authentication** — `POST /api/events/top-events/` had `AllowAny` permission but still called `validate_token_and_get_user()`, returning `{"status":"error","message":"token and username required"}` for unauthenticated requests
  - Removed the `validate_token_and_get_user()` call entirely
  - Added an `event_status='published'` filter (was `is_top_event=True` only)
  - Added `event_type_name` field resolution: `e.event_type.event_type if e.event_type else ''` — `model_to_dict()` only returns the FK integer
---
## [1.8.2] — 2026-04-06

### Fixed

- **`FeaturedEventsAPI` now returns an `event_type_name` string** — `model_to_dict()` serialises the `event_type` FK as an integer ID; the hero slider frontend reads `ev.event_type_name` to display the category badge, which was always `null`
  - Added `data_dict['event_type_name'] = e.event_type.event_type if e.event_type else ''` after `model_to_dict(e)` to resolve the FK to its human-readable name (e.g. `"Festivals"`)
  - No frontend changes required — `fetchHeroSlides()` already falls back to `ev.event_type_name`
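The one-line fix above can be exercised in isolation with stand-in objects (the wrapper function is purely illustrative; the view inlines the expression):

```python
from types import SimpleNamespace

def with_event_type_name(data_dict: dict, e) -> dict:
    # model_to_dict() serialises the FK as an integer id; add the readable name.
    data_dict['event_type_name'] = e.event_type.event_type if e.event_type else ''
    return data_dict

# Stand-in for an Event row whose event_type FK resolves to "Festivals"
e = SimpleNamespace(event_type=SimpleNamespace(event_type='Festivals'))
with_event_type_name({'event_type': 3}, e)  # {'event_type': 3, 'event_type_name': 'Festivals'}
```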
---
## [1.8.1] — 2026-04-06

### Fixed

- **`FeaturedEventsAPI` now works without authentication** — `POST /api/events/featured-events/` had `AllowAny` permission but still called `validate_token_and_get_user()`, causing the endpoint to return HTTP 200 + `{"status":"error","message":"token and username required"}` for unauthenticated requests (e.g. the desktop hero slider)
  - Removed the `validate_token_and_get_user()` call entirely — the endpoint is public by design and requires no token
  - Also tightened the queryset to `event_status='published'` (was `is_featured=True` only) to match `ConsumerFeaturedEventsView` behaviour and avoid returning draft/cancelled events
  - Root cause: host Nginx routes `/api/` → the `eventify-backend` container (port 3001), not `eventify-django` (port 8085); the `validate_token_and_get_user` gate in this container was silently blocking all hero slider requests
---
## [1.8.0] — 2026-04-04

### Added

- **`BulkUserPublicInfoView`** (`POST /api/user/bulk-public-info/`)
  - Internal endpoint for the Node.js gamification server to resolve user details
  - Accepts `{ emails: [...] }` (max 500), returns `{ users: { email: { display_name, district, eventify_id } } }`
  - Used for the leaderboard data bridge (syncing user names/districts into the gamification DB)
  - CSRF-exempt, returns only public-safe fields (no passwords, tokens, or sensitive PII)
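The request shape implied above (a non-empty `emails` list, capped at 500) can be sketched as an input check. The helper name and the normalisation step are assumptions; the 500 cap is from the entry:

```python
def validate_bulk_request(body: dict, cap: int = 500) -> list[str]:
    """Validate a {'emails': [...]} payload and return normalised addresses."""
    emails = body.get('emails')
    if not isinstance(emails, list) or not emails:
        raise ValueError('emails must be a non-empty list')
    if len(emails) > cap:
        raise ValueError(f'at most {cap} emails per request')
    return [e.strip().lower() for e in emails]

validate_bulk_request({'emails': [' Jo@Example.com ']})  # ['jo@example.com']
```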
---
## [1.7.0] — 2026-04-04

### Added

---
## accounts/migrations/0013_merge_eventify_id.py (new file, 13 lines)

```python
from django.db import migrations


class Migration(migrations.Migration):
    """Merge migration to resolve conflicting eventify_id migrations."""

    dependencies = [
        ('accounts', '0011_user_eventify_id'),
        ('accounts', '0012_user_eventify_id'),
    ]

    operations = [
    ]
```
## accounts/migrations/0014_merge_0013.py (new file, 13 lines)

```python
from django.db import migrations


class Migration(migrations.Migration):
    """Merge migration to resolve conflicting 0013 migrations."""

    dependencies = [
        ('accounts', '0013_merge_eventify_id'),
        ('accounts', '0013_user_district_changed_at'),
    ]

    operations = [
    ]
```
## accounts/models.py (modified)

```diff
@@ -68,10 +68,10 @@ class User(AbstractUser):
         help_text='Comma-separated module slugs this user can access',
     )

-    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "financials", "settings"]
+    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]

     def get_allowed_modules(self):
-        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "financials", "settings"]
+        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]
         if self.is_superuser or self.role == "admin":
             return ALL
         if self.allowed_modules:
```
## ad_control/__init__.py (new file, empty)

## ad_control/admin.py (new file, 24 lines)
```python
from django.contrib import admin
from .models import AdSurface, AdPlacement


@admin.register(AdSurface)
class AdSurfaceAdmin(admin.ModelAdmin):
    list_display = ('key', 'name', 'max_slots', 'layout_type', 'sort_behavior', 'is_active', 'active_count')
    list_filter = ('is_active', 'layout_type')
    search_fields = ('key', 'name')
    readonly_fields = ('created_at',)


@admin.register(AdPlacement)
class AdPlacementAdmin(admin.ModelAdmin):
    list_display = (
        'id', 'event', 'surface', 'status', 'scope', 'priority',
        'rank', 'boost_label', 'start_at', 'end_at', 'created_at',
    )
    list_filter = ('status', 'scope', 'priority', 'surface')
    list_editable = ('status', 'scope', 'rank')
    search_fields = ('event__name', 'event__title', 'boost_label')
    raw_id_fields = ('event', 'created_by', 'updated_by')
    readonly_fields = ('created_at', 'updated_at')
    ordering = ('surface', 'rank')
```
## ad_control/apps.py (new file, 7 lines)

```python
from django.apps import AppConfig


class AdControlConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'ad_control'
    verbose_name = 'Ad Control'
```
## ad_control/management/__init__.py, ad_control/management/commands/__init__.py (new files, empty)

## ad_control/management/commands/seed_surfaces.py (new file, 100 lines)
```python
"""
Seed the default AdSurface records and migrate existing boolean flags to placements.

Usage:
    python manage.py seed_surfaces            # seed surfaces only
    python manage.py seed_surfaces --migrate  # also migrate is_featured / is_top_event to placements
"""
from django.core.management.base import BaseCommand
from ad_control.models import AdSurface, AdPlacement
from events.models import Event


SURFACES = [
    {
        'key': 'HOME_FEATURED_CAROUSEL',
        'name': 'Featured Carousel',
        'description': 'Homepage hero carousel — high-impact banner-style placement.',
        'max_slots': 8,
        'layout_type': 'carousel',
        'sort_behavior': 'rank',
    },
    {
        'key': 'HOME_TOP_EVENTS',
        'name': 'Top Events',
        'description': 'Homepage "Top Events" grid section below the hero.',
        'max_slots': 10,
        'layout_type': 'grid',
        'sort_behavior': 'rank',
    },
]


class Command(BaseCommand):
    help = 'Seed default ad surfaces and optionally migrate boolean flags to placements.'

    def add_arguments(self, parser):
        parser.add_argument(
            '--migrate',
            action='store_true',
            help='Also migrate existing is_featured / is_top_event flags to AdPlacement rows.',
        )

    def handle(self, *args, **options):
        # --- Seed surfaces ---
        for s in SURFACES:
            obj, created = AdSurface.objects.update_or_create(
                key=s['key'],
                defaults=s,
            )
            status = 'CREATED' if created else 'EXISTS'
            self.stdout.write(f" [{status}] {obj.key} — {obj.name}")

        # --- Migrate boolean flags ---
        if options['migrate']:
            self.stdout.write('\nMigrating boolean flags to placements...')

            featured_surface = AdSurface.objects.get(key='HOME_FEATURED_CAROUSEL')
            top_surface = AdSurface.objects.get(key='HOME_TOP_EVENTS')

            featured_events = Event.objects.filter(is_featured=True)
            top_events = Event.objects.filter(is_top_event=True)

            created_count = 0

            for rank, event in enumerate(featured_events, start=1):
                _, created = AdPlacement.objects.get_or_create(
                    surface=featured_surface,
                    event=event,
                    defaults={
                        'status': 'ACTIVE',
                        'priority': 'MANUAL',
                        'scope': 'GLOBAL',
                        'rank': rank,
                        'boost_label': 'Featured',
                    },
                )
                if created:
                    created_count += 1

            for rank, event in enumerate(top_events, start=1):
                _, created = AdPlacement.objects.get_or_create(
                    surface=top_surface,
                    event=event,
                    defaults={
                        'status': 'ACTIVE',
                        'priority': 'MANUAL',
                        'scope': 'GLOBAL',
                        'rank': rank,
                        'boost_label': 'Top Event',
                    },
                )
                if created:
                    created_count += 1

            self.stdout.write(self.style.SUCCESS(
                f' Migrated {created_count} placements '
                f'({featured_events.count()} featured, {top_events.count()} top events).'
            ))

        self.stdout.write(self.style.SUCCESS('\nDone.'))
```
## ad_control/migrations/0001_initial.py (new file, 104 lines)
```python
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('events', '0007_add_is_featured_is_top_event'),
    ]

    operations = [
        migrations.CreateModel(
            name='AdSurface',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('key', models.CharField(db_index=True, max_length=50, unique=True)),
                ('name', models.CharField(max_length=100)),
                ('description', models.TextField(blank=True, default='')),
                ('max_slots', models.IntegerField(default=8)),
                ('layout_type', models.CharField(
                    choices=[('carousel', 'Carousel'), ('grid', 'Grid'), ('list', 'List')],
                    default='carousel', max_length=20,
                )),
                ('sort_behavior', models.CharField(
                    choices=[('rank', 'By Rank'), ('date', 'By Date'), ('popularity', 'By Popularity')],
                    default='rank', max_length=20,
                )),
                ('is_active', models.BooleanField(default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
            ],
            options={
                'verbose_name': 'Ad Surface',
                'verbose_name_plural': 'Ad Surfaces',
                'ordering': ['name'],
            },
        ),
        migrations.CreateModel(
            name='AdPlacement',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('status', models.CharField(
                    choices=[
                        ('DRAFT', 'Draft'), ('ACTIVE', 'Active'), ('SCHEDULED', 'Scheduled'),
                        ('EXPIRED', 'Expired'), ('DISABLED', 'Disabled'),
                    ],
                    db_index=True, default='DRAFT', max_length=20,
                )),
                ('priority', models.CharField(
                    choices=[('SPONSORED', 'Sponsored'), ('MANUAL', 'Manual'), ('ALGO', 'Algorithm')],
                    default='MANUAL', max_length=20,
                )),
                ('scope', models.CharField(
                    choices=[
                        ('GLOBAL', 'Global — shown to all users'),
                        ('LOCAL', 'Local — shown to nearby users (50 km radius)'),
                    ],
                    db_index=True, default='GLOBAL', max_length=10,
                )),
                ('rank', models.IntegerField(default=0, help_text='Lower rank = higher position')),
                ('start_at', models.DateTimeField(blank=True, help_text='When this placement becomes active', null=True)),
                ('end_at', models.DateTimeField(blank=True, help_text='When this placement expires', null=True)),
                ('boost_label', models.CharField(
                    blank=True, default='', help_text='Display label e.g. "Featured", "Top Pick", "Sponsored"',
                    max_length=50,
                )),
                ('notes', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
                ('created_by', models.ForeignKey(
                    blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL,
                    related_name='created_placements', to=settings.AUTH_USER_MODEL,
                )),
                ('event', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE, related_name='ad_placements',
                    to='events.event',
                )),
                ('surface', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE, related_name='placements',
                    to='ad_control.adsurface',
                )),
                ('updated_by', models.ForeignKey(
                    blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL,
                    related_name='updated_placements', to=settings.AUTH_USER_MODEL,
                )),
            ],
            options={
                'verbose_name': 'Ad Placement',
                'verbose_name_plural': 'Ad Placements',
                'ordering': ['rank', '-created_at'],
            },
        ),
        migrations.AddConstraint(
            model_name='adplacement',
            constraint=models.UniqueConstraint(
                condition=models.Q(('status__in', ['DRAFT', 'ACTIVE', 'SCHEDULED'])),
                fields=('surface', 'event'),
                name='unique_active_placement_per_surface',
            ),
        ),
    ]
```
## ad_control/migrations/__init__.py (new file, empty)

## ad_control/models.py (new file, 107 lines)
@@ -0,0 +1,107 @@
|
||||
from django.db import models
from django.conf import settings
from events.models import Event


class AdSurface(models.Model):
    """
    A display surface where ad placements can appear.
    e.g. HOME_FEATURED_CAROUSEL, HOME_TOP_EVENTS, CATEGORY_FEATURED
    """
    LAYOUT_CHOICES = [
        ('carousel', 'Carousel'),
        ('grid', 'Grid'),
        ('list', 'List'),
    ]
    SORT_CHOICES = [
        ('rank', 'By Rank'),
        ('date', 'By Date'),
        ('popularity', 'By Popularity'),
    ]

    key = models.CharField(max_length=50, unique=True, db_index=True)
    name = models.CharField(max_length=100)
    description = models.TextField(blank=True, default='')
    max_slots = models.IntegerField(default=8)
    layout_type = models.CharField(max_length=20, choices=LAYOUT_CHOICES, default='carousel')
    sort_behavior = models.CharField(max_length=20, choices=SORT_CHOICES, default='rank')
    is_active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        ordering = ['name']
        verbose_name = 'Ad Surface'
        verbose_name_plural = 'Ad Surfaces'

    def __str__(self):
        return f"{self.name} ({self.key})"

    @property
    def active_count(self):
        return self.placements.filter(status__in=['ACTIVE', 'SCHEDULED']).count()


class AdPlacement(models.Model):
    """
    A single placement of an event on a surface.
    Supports scheduling, ranking, scope-based targeting, and lifecycle status.
    """
    STATUS_CHOICES = [
        ('DRAFT', 'Draft'),
        ('ACTIVE', 'Active'),
        ('SCHEDULED', 'Scheduled'),
        ('EXPIRED', 'Expired'),
        ('DISABLED', 'Disabled'),
    ]
    PRIORITY_CHOICES = [
        ('SPONSORED', 'Sponsored'),
        ('MANUAL', 'Manual'),
        ('ALGO', 'Algorithm'),
    ]
    SCOPE_CHOICES = [
        ('GLOBAL', 'Global — shown to all users'),
        ('LOCAL', 'Local — shown to nearby users (50 km radius)'),
    ]

    surface = models.ForeignKey(
        AdSurface, on_delete=models.CASCADE, related_name='placements',
    )
    event = models.ForeignKey(
        Event, on_delete=models.CASCADE, related_name='ad_placements',
    )
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='DRAFT', db_index=True)
    priority = models.CharField(max_length=20, choices=PRIORITY_CHOICES, default='MANUAL')
    scope = models.CharField(max_length=10, choices=SCOPE_CHOICES, default='GLOBAL', db_index=True)
    rank = models.IntegerField(default=0, help_text='Lower rank = higher position')
    start_at = models.DateTimeField(null=True, blank=True, help_text='When this placement becomes active')
    end_at = models.DateTimeField(null=True, blank=True, help_text='When this placement expires')
    boost_label = models.CharField(
        max_length=50, blank=True, default='',
        help_text='Display label e.g. "Featured", "Top Pick", "Sponsored"',
    )
    notes = models.TextField(blank=True, default='')
    created_by = models.ForeignKey(
        settings.AUTH_USER_MODEL, on_delete=models.SET_NULL,
        null=True, blank=True, related_name='created_placements',
    )
    updated_by = models.ForeignKey(
        settings.AUTH_USER_MODEL, on_delete=models.SET_NULL,
        null=True, blank=True, related_name='updated_placements',
    )
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ['rank', '-created_at']
        verbose_name = 'Ad Placement'
        verbose_name_plural = 'Ad Placements'
        constraints = [
            models.UniqueConstraint(
                fields=['surface', 'event'],
                condition=models.Q(status__in=['DRAFT', 'ACTIVE', 'SCHEDULED']),
                name='unique_active_placement_per_surface',
            ),
        ]

    def __str__(self):
        return f"{self.event.name} on {self.surface.key} [{self.status}]"
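The partial `UniqueConstraint` above only counts rows whose status is DRAFT, ACTIVE, or SCHEDULED, so an event can be placed on the same surface again once its previous placement is EXPIRED or DISABLED. A standalone sketch of that semantics (the status names come from the diff; the helper itself is illustrative, not part of the codebase):

```python
# Statuses that the partial unique constraint treats as "live" —
# a second placement of the same (surface, event) pair is blocked
# only while one of these exists.
BLOCKING_STATUSES = {'DRAFT', 'ACTIVE', 'SCHEDULED'}

def placement_allowed(existing_statuses):
    """True if a new placement of the same (surface, event) pair
    would not violate unique_active_placement_per_surface."""
    return not any(s in BLOCKING_STATUSES for s in existing_statuses)

print(placement_allowed(['EXPIRED', 'DISABLED']))  # True — old run finished
print(placement_allowed(['ACTIVE']))               # False — still live
```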
18	ad_control/urls.py	Normal file
@@ -0,0 +1,18 @@
from django.urls import path
from . import views

# Admin CRUD endpoints — mounted at /api/v1/ad-control/
urlpatterns = [
    # Surfaces
    path('surfaces/', views.SurfaceListView.as_view(), name='ad-surfaces'),

    # Placements CRUD
    path('placements/', views.PlacementListCreateView.as_view(), name='ad-placements'),
    path('placements/<int:pk>/', views.PlacementDetailView.as_view(), name='ad-placement-detail'),
    path('placements/<int:pk>/publish/', views.PlacementPublishView.as_view(), name='ad-placement-publish'),
    path('placements/<int:pk>/unpublish/', views.PlacementUnpublishView.as_view(), name='ad-placement-unpublish'),
    path('placements/reorder/', views.PlacementReorderView.as_view(), name='ad-placements-reorder'),

    # Events picker
    path('events/', views.PickerEventsView.as_view(), name='ad-picker-events'),
]
566	ad_control/views.py	Normal file
@@ -0,0 +1,566 @@
"""
Ad Control — Admin CRUD API views.

All endpoints require JWT authentication (IsAuthenticated).
Mounted at /api/v1/ad-control/ via admin_api/urls.py.
"""
import json
import math
from datetime import datetime

from django.utils import timezone
from django.utils.dateparse import parse_datetime
from django.db import models as db_models
from django.db.models import Q, Count, Max
from rest_framework.views import APIView
from rest_framework.permissions import IsAuthenticated
from django.http import JsonResponse

from .models import AdSurface, AdPlacement
from events.models import Event, EventImages


# ---------------------------------------------------------------------------
# Serialisation helpers
# ---------------------------------------------------------------------------

def _serialize_surface(s):
    """Serialize an AdSurface to camelCase dict matching admin panel types."""
    active = s.placements.filter(status__in=['ACTIVE', 'SCHEDULED']).count()
    return {
        'id': str(s.id),
        'key': s.key,
        'name': s.name,
        'description': s.description,
        'maxSlots': s.max_slots,
        'layoutType': s.layout_type,
        'sortBehavior': s.sort_behavior,
        'isActive': s.is_active,
        'activeCount': active,
        'createdAt': s.created_at.isoformat() if s.created_at else None,
    }


def _serialize_picker_event(e):
    """Serialize an Event for the picker modal (lightweight)."""
    try:
        thumb = EventImages.objects.get(event=e.id, is_primary=True)
        cover = thumb.event_image.url
    except EventImages.DoesNotExist:
        cover = None

    return {
        'id': str(e.id),
        'title': e.title or e.name,
        'city': e.district,
        'state': e.state,
        'country': 'IN',
        'date': str(e.start_date) if e.start_date else '',
        'endDate': str(e.end_date) if e.end_date else '',
        'organizer': e.partner.name if e.partner else 'Eventify',
        'organizerLogo': '',
        'category': e.event_type.event_type if e.event_type else '',
        'coverImage': cover,
        'approvalStatus': 'APPROVED' if e.event_status == 'published' else (
            'REJECTED' if e.event_status == 'cancelled' else 'PENDING'
        ),
        'ticketsSold': 0,
        'capacity': 0,
    }


def _serialize_placement(p, include_event=True):
    """Serialize an AdPlacement to camelCase dict matching admin panel types."""
    result = {
        'id': str(p.id),
        'surfaceId': str(p.surface_id),
        'itemType': 'EVENT',
        'eventId': str(p.event_id),
        'status': p.status,
        'priority': p.priority,
        'scope': p.scope,
        'rank': p.rank,
        'startAt': p.start_at.isoformat() if p.start_at else None,
        'endAt': p.end_at.isoformat() if p.end_at else None,
        'targeting': {
            'cityIds': [],
            'categoryIds': [],
            'countryCodes': ['IN'],
        },
        'boostLabel': p.boost_label or None,
        'notes': p.notes or None,
        'createdBy': str(p.created_by_id) if p.created_by_id else 'system',
        'updatedBy': str(p.updated_by_id) if p.updated_by_id else 'system',
        'createdAt': p.created_at.isoformat(),
        'updatedAt': p.updated_at.isoformat(),
    }
    if include_event:
        result['event'] = _serialize_picker_event(p.event)
    return result
# ---------------------------------------------------------------------------
# Admin API — Surfaces
# ---------------------------------------------------------------------------

class SurfaceListView(APIView):
    permission_classes = [IsAuthenticated]

    def get(self, request):
        surfaces = AdSurface.objects.filter(is_active=True)
        return JsonResponse({
            'success': True,
            'data': [_serialize_surface(s) for s in surfaces],
        })


# ---------------------------------------------------------------------------
# Admin API — Placements CRUD
# ---------------------------------------------------------------------------

class PlacementListCreateView(APIView):
    permission_classes = [IsAuthenticated]

    def get(self, request):
        """List placements, optionally filtered by surface_id and status."""
        qs = AdPlacement.objects.select_related('event', 'event__event_type', 'event__partner', 'surface')

        surface_id = request.GET.get('surface_id')
        status = request.GET.get('status')

        if surface_id:
            qs = qs.filter(surface_id=surface_id)
        if status and status != 'ALL':
            qs = qs.filter(status=status)

        # Auto-expire: mark past-endAt placements as EXPIRED
        now = timezone.now()
        expired = qs.filter(
            status__in=['ACTIVE', 'SCHEDULED'],
            end_at__isnull=False,
            end_at__lt=now,
        )
        if expired.exists():
            expired.update(status='EXPIRED', updated_at=now)
            # Re-fetch after expiry update
            qs = AdPlacement.objects.select_related(
                'event', 'event__event_type', 'event__partner', 'surface',
            )
            if surface_id:
                qs = qs.filter(surface_id=surface_id)
            if status and status != 'ALL':
                qs = qs.filter(status=status)

        qs = qs.order_by('rank', '-created_at')
        return JsonResponse({
            'success': True,
            'data': [_serialize_placement(p) for p in qs],
        })

    def post(self, request):
        """Create a new placement."""
        try:
            data = json.loads(request.body)
        except json.JSONDecodeError:
            return JsonResponse({'success': False, 'message': 'Invalid JSON'}, status=400)

        surface_id = data.get('surfaceId')
        event_id = data.get('eventId')
        scope = data.get('scope', 'GLOBAL')
        priority = data.get('priority', 'MANUAL')
        start_at = data.get('startAt')
        end_at = data.get('endAt')
        boost_label = data.get('boostLabel', '')
        notes = data.get('notes', '')

        if not surface_id or not event_id:
            return JsonResponse({'success': False, 'message': 'surfaceId and eventId are required'}, status=400)

        try:
            surface = AdSurface.objects.get(id=surface_id)
        except AdSurface.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Surface not found'}, status=404)

        try:
            event = Event.objects.get(id=event_id)
        except Event.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Event not found'}, status=404)

        # Check max slots
        active_count = surface.placements.filter(status__in=['ACTIVE', 'SCHEDULED']).count()
        if active_count >= surface.max_slots:
            return JsonResponse({
                'success': False,
                'message': f'Surface "{surface.name}" is full ({surface.max_slots} max slots)',
            }, status=400)

        # Check duplicate
        if AdPlacement.objects.filter(
            surface=surface, event=event, status__in=['DRAFT', 'ACTIVE', 'SCHEDULED'],
        ).exists():
            return JsonResponse({
                'success': False,
                'message': 'This event is already placed on this surface',
            }, status=400)

        # Calculate next rank
        max_rank = surface.placements.aggregate(max_rank=Max('rank'))['max_rank'] or 0

        placement = AdPlacement.objects.create(
            surface=surface,
            event=event,
            status='DRAFT',
            priority=priority,
            scope=scope,
            rank=max_rank + 1,
            start_at=parse_datetime(start_at) if start_at else None,
            end_at=parse_datetime(end_at) if end_at else None,
            boost_label=boost_label,
            notes=notes,
            created_by=request.user,
            updated_by=request.user,
        )

        return JsonResponse({
            'success': True,
            'message': 'Placement created as draft',
            'data': _serialize_placement(placement),
        }, status=201)
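The create path above appends new placements at the end of the surface by taking `aggregate(Max('rank'))['max_rank'] or 0` plus one. A pure-Python sketch of that rank math (illustrative helper, not part of the diff):

```python
def next_rank(existing_ranks):
    """Mirror of `aggregate(Max('rank'))['max_rank'] or 0` + 1:
    an empty surface yields rank 1, otherwise one past the max."""
    max_rank = max(existing_ranks) if existing_ranks else 0
    return max_rank + 1

print(next_rank([]))         # 1 — first placement on a fresh surface
print(next_rank([1, 2, 5]))  # 6 — gaps in rank are preserved, not filled
```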
class PlacementDetailView(APIView):
    permission_classes = [IsAuthenticated]

    def patch(self, request, pk):
        """Update a placement's config."""
        try:
            data = json.loads(request.body)
        except json.JSONDecodeError:
            return JsonResponse({'success': False, 'message': 'Invalid JSON'}, status=400)

        try:
            placement = AdPlacement.objects.select_related('event', 'surface').get(id=pk)
        except AdPlacement.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Placement not found'}, status=404)

        if 'startAt' in data:
            placement.start_at = parse_datetime(data['startAt']) if data['startAt'] else None
        if 'endAt' in data:
            placement.end_at = parse_datetime(data['endAt']) if data['endAt'] else None
        if 'scope' in data:
            placement.scope = data['scope']
        if 'priority' in data:
            placement.priority = data['priority']
        if 'boostLabel' in data:
            placement.boost_label = data['boostLabel'] or ''
        if 'notes' in data:
            placement.notes = data['notes'] or ''

        placement.updated_by = request.user
        placement.save()

        return JsonResponse({'success': True, 'message': 'Placement updated'})

    def delete(self, request, pk):
        """Delete a placement."""
        try:
            placement = AdPlacement.objects.get(id=pk)
        except AdPlacement.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Placement not found'}, status=404)

        placement.delete()
        return JsonResponse({'success': True, 'message': 'Placement deleted'})


class PlacementPublishView(APIView):
    permission_classes = [IsAuthenticated]

    def post(self, request, pk):
        """Publish a placement (DRAFT → ACTIVE or SCHEDULED)."""
        try:
            placement = AdPlacement.objects.get(id=pk)
        except AdPlacement.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Placement not found'}, status=404)

        now = timezone.now()
        if placement.start_at and placement.start_at > now:
            placement.status = 'SCHEDULED'
        else:
            placement.status = 'ACTIVE'

        placement.updated_by = request.user
        placement.save()

        return JsonResponse({
            'success': True,
            'message': f'Placement {"scheduled" if placement.status == "SCHEDULED" else "published"}',
        })


class PlacementUnpublishView(APIView):
    permission_classes = [IsAuthenticated]

    def post(self, request, pk):
        """Unpublish a placement (→ DISABLED)."""
        try:
            placement = AdPlacement.objects.get(id=pk)
        except AdPlacement.DoesNotExist:
            return JsonResponse({'success': False, 'message': 'Placement not found'}, status=404)

        placement.status = 'DISABLED'
        placement.updated_by = request.user
        placement.save()

        return JsonResponse({'success': True, 'message': 'Placement unpublished'})


class PlacementReorderView(APIView):
    permission_classes = [IsAuthenticated]

    def post(self, request):
        """Bulk-update ranks for a surface's placements."""
        try:
            data = json.loads(request.body)
        except json.JSONDecodeError:
            return JsonResponse({'success': False, 'message': 'Invalid JSON'}, status=400)

        surface_id = data.get('surfaceId')
        ordered_ids = data.get('orderedIds', [])

        if not surface_id or not ordered_ids:
            return JsonResponse({'success': False, 'message': 'surfaceId and orderedIds required'}, status=400)

        now = timezone.now()
        for index, pid in enumerate(ordered_ids):
            AdPlacement.objects.filter(id=pid, surface_id=surface_id).update(
                rank=index + 1, updated_at=now,
            )

        return JsonResponse({
            'success': True,
            'message': f'Reordered {len(ordered_ids)} placements',
        })
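The publish endpoint above branches on `start_at`: a future start date parks the placement as SCHEDULED, anything else goes ACTIVE immediately. The decision isolated as a standalone sketch (the helper is illustrative; status names are from the diff):

```python
from datetime import datetime, timedelta, timezone

def publish_status(start_at, now):
    """Mirror of PlacementPublishView's branch: a future start_at
    schedules the placement, otherwise it goes live immediately."""
    if start_at is not None and start_at > now:
        return 'SCHEDULED'
    return 'ACTIVE'

now = datetime(2026, 4, 22, tzinfo=timezone.utc)
print(publish_status(now + timedelta(days=1), now))  # SCHEDULED
print(publish_status(now - timedelta(days=1), now))  # ACTIVE
print(publish_status(None, now))                     # ACTIVE
```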
# ---------------------------------------------------------------------------
# Admin API — Events picker
# ---------------------------------------------------------------------------

class PickerEventsView(APIView):
    permission_classes = [IsAuthenticated]

    def get(self, request):
        """List events for the event picker modal (search, paginated)."""
        search = request.GET.get('search', '').strip()
        page = int(request.GET.get('page', 1))
        page_size = int(request.GET.get('page_size', 20))

        qs = Event.objects.select_related('event_type', 'partner').filter(
            event_status__in=['published', 'live'],
        ).order_by('-start_date')

        if search:
            qs = qs.filter(
                Q(name__icontains=search) |
                Q(title__icontains=search) |
                Q(district__icontains=search) |
                Q(place__icontains=search)
            )

        total = qs.count()
        start = (page - 1) * page_size
        events = qs[start:start + page_size]

        return JsonResponse({
            'success': True,
            'data': [_serialize_picker_event(e) for e in events],
            'total': total,
            'page': page,
            'totalPages': math.ceil(total / page_size) if total > 0 else 1,
        })


# ---------------------------------------------------------------------------
# Consumer API — Featured & Top Events (replaces boolean-based queries)
# ---------------------------------------------------------------------------

def _haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance in km between two lat/lng points."""
    R = 6371
    d_lat = math.radians(float(lat2) - float(lat1))
    d_lng = math.radians(float(lng2) - float(lng1))
    a = (
        math.sin(d_lat / 2) ** 2
        + math.cos(math.radians(float(lat1)))
        * math.cos(math.radians(float(lat2)))
        * math.sin(d_lng / 2) ** 2
    )
    return R * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))


def _get_placement_events(surface_key, user_lat=None, user_lng=None):
    """
    Core placement resolution logic.

    1. Fetch ACTIVE placements on the given surface
    2. Filter by schedule window (start_at / end_at)
    3. GLOBAL placements → always included
    4. LOCAL placements → included only if user is within 50 km of event
    5. Sort by priority (SPONSORED > MANUAL > ALGO) then rank
    6. Limit to surface.max_slots
    """
    LOCAL_RADIUS_KM = 50

    try:
        surface = AdSurface.objects.get(key=surface_key, is_active=True)
    except AdSurface.DoesNotExist:
        return []

    now = timezone.now()

    qs = AdPlacement.objects.select_related(
        'event', 'event__event_type', 'event__partner',
    ).filter(
        surface=surface,
        status='ACTIVE',
    ).filter(
        Q(start_at__isnull=True) | Q(start_at__lte=now),
    ).filter(
        Q(end_at__isnull=True) | Q(end_at__gt=now),
    ).order_by('rank')

    result = []
    priority_order = {'SPONSORED': 0, 'MANUAL': 1, 'ALGO': 2}

    for p in qs:
        if p.scope == 'GLOBAL':
            result.append(p)
        elif p.scope == 'LOCAL':
            # Only include if user sent location AND is within 50 km
            if user_lat is not None and user_lng is not None:
                dist = _haversine_km(user_lat, user_lng, p.event.latitude, p.event.longitude)
                if dist <= LOCAL_RADIUS_KM:
                    result.append(p)

    # Sort: priority first, then rank
    result.sort(key=lambda p: (priority_order.get(p.priority, 9), p.rank))

    # Limit to max_slots
    return result[:surface.max_slots]
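The 50 km LOCAL filter above rests on `_haversine_km`. A self-contained version of the same formula, with an illustrative pair of coordinates (the Kochi/Thrissur values are assumptions for the sake of the example, not from the diff):

```python
import math

def haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance in km, same formula as _haversine_km."""
    R = 6371  # mean Earth radius in km
    d_lat = math.radians(lat2 - lat1)
    d_lng = math.radians(lng2 - lng1)
    a = (math.sin(d_lat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
         * math.sin(d_lng / 2) ** 2)
    return R * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# Roughly Kochi → Thrissur, ~66 km apart: a LOCAL placement for an
# event there would be filtered out for this user (> 50 km radius).
dist = haversine_km(9.93, 76.27, 10.52, 76.21)
print(dist > 50)  # True
```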
def _serialize_event_for_consumer(e):
    """Serialize an Event for the consumer mobile/web API (matches existing format)."""
    try:
        thumb = EventImages.objects.get(event=e.id, is_primary=True)
        thumb_url = thumb.event_image.url
    except EventImages.DoesNotExist:
        thumb_url = ''

    return {
        'id': e.id,
        'name': e.name,
        'title': e.title or e.name,
        'description': (e.description or '')[:200],
        'start_date': str(e.start_date) if e.start_date else '',
        'end_date': str(e.end_date) if e.end_date else '',
        'start_time': str(e.start_time) if e.start_time else '',
        'end_time': str(e.end_time) if e.end_time else '',
        'pincode': e.pincode,
        'place': e.place,
        'district': e.district,
        'state': e.state,
        'is_bookable': e.is_bookable,
        'event_type': e.event_type_id,
        'event_status': e.event_status,
        'venue_name': e.venue_name,
        'latitude': float(e.latitude),
        'longitude': float(e.longitude),
        'location_name': e.place,
        'thumb_img': thumb_url,
        'is_eventify_event': e.is_eventify_event,
        'source': e.source,
    }


class ConsumerFeaturedEventsView(APIView):
    """
    Public API — returns featured events from the HOME_FEATURED_CAROUSEL surface.
    POST /api/events/featured-events/
    Optional body: { "latitude": float, "longitude": float }
    """
    authentication_classes = []
    permission_classes = []

    def post(self, request):
        try:
            try:
                data = json.loads(request.body) if request.body else {}
            except json.JSONDecodeError:
                data = {}

            user_lat = data.get('latitude')
            user_lng = data.get('longitude')

            placements = _get_placement_events(
                'HOME_FEATURED_CAROUSEL',
                user_lat=user_lat,
                user_lng=user_lng,
            )

            # IDs already covered by ad placements (used for dedup)
            placement_ids = {p.event_id for p in placements}

            # Start with placement events (they take priority)
            events = [_serialize_event_for_consumer(p.event) for p in placements]

            # Append is_featured events that aren't already in the placement set
            featured_qs = (
                Event.objects
                .filter(is_featured=True, event_status='published')
                .exclude(id__in=placement_ids)
                .order_by('-start_date', '-created_date')
            )
            for evt in featured_qs:
                if len(events) >= 10:
                    break
                events.append(_serialize_event_for_consumer(evt))

            # Cap at 10 total
            events = events[:10]

            return JsonResponse({'status': 'success', 'events': events})
        except Exception as e:
            return JsonResponse({'status': 'error', 'message': str(e)}, status=500)


class ConsumerTopEventsView(APIView):
    """
    Public API — returns top events from the HOME_TOP_EVENTS surface.
    POST /api/events/top-events/
    Optional body: { "latitude": float, "longitude": float }
    """
    authentication_classes = []
    permission_classes = []

    def post(self, request):
        try:
            try:
                data = json.loads(request.body) if request.body else {}
            except json.JSONDecodeError:
                data = {}

            user_lat = data.get('latitude')
            user_lng = data.get('longitude')

            placements = _get_placement_events(
                'HOME_TOP_EVENTS',
                user_lat=user_lat,
                user_lng=user_lng,
            )

            events = [_serialize_event_for_consumer(p.event) for p in placements]

            return JsonResponse({'status': 'success', 'events': events})
        except Exception as e:
            return JsonResponse({'status': 'error', 'message': str(e)}, status=500)
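The placement ordering used by the consumer views sorts by priority bucket first (SPONSORED before MANUAL before ALGO), then by rank within the bucket. The sort key in isolation, on illustrative dict data rather than model instances:

```python
# Same key as in _get_placement_events: unknown priorities sink to the end (9).
priority_order = {'SPONSORED': 0, 'MANUAL': 1, 'ALGO': 2}

placements = [
    {'priority': 'MANUAL', 'rank': 1},
    {'priority': 'SPONSORED', 'rank': 5},  # high rank, but sponsored wins
    {'priority': 'ALGO', 'rank': 0},
]
placements.sort(key=lambda p: (priority_order.get(p['priority'], 9), p['rank']))

print([p['priority'] for p in placements])  # ['SPONSORED', 'MANUAL', 'ALGO']
```

Note the design choice this illustrates: a sponsored placement outranks every manual one regardless of its rank number; rank only breaks ties inside a priority bucket.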
53	admin_api/migrations/0003_lead.py	Normal file
@@ -0,0 +1,53 @@
# Generated by Django 4.2.21 on 2026-04-07

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('admin_api', '0002_rbac_models'),
    ]

    operations = [
        migrations.CreateModel(
            name='Lead',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('email', models.EmailField(max_length=254)),
                ('phone', models.CharField(max_length=20)),
                ('event_type', models.CharField(choices=[('private', 'Private Event'), ('ticketed', 'Ticketed Event'), ('corporate', 'Corporate Event'), ('wedding', 'Wedding'), ('other', 'Other')], default='private', max_length=20)),
                ('message', models.TextField(blank=True, default='')),
                ('status', models.CharField(choices=[('new', 'New'), ('contacted', 'Contacted'), ('qualified', 'Qualified'), ('converted', 'Converted'), ('closed', 'Closed')], default='new', max_length=20)),
                ('source', models.CharField(choices=[('schedule_call', 'Schedule a Call'), ('website', 'Website'), ('manual', 'Manual')], default='schedule_call', max_length=20)),
                ('priority', models.CharField(choices=[('low', 'Low'), ('medium', 'Medium'), ('high', 'High')], default='medium', max_length=10)),
                ('assigned_to', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='assigned_leads', to=settings.AUTH_USER_MODEL)),
                ('notes', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={
                'ordering': ['-created_at'],
            },
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['status'], name='admin_api_lead_status_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['priority'], name='admin_api_lead_priority_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['created_at'], name='admin_api_lead_created_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['email'], name='admin_api_lead_email_idx'),
        ),
    ]
28	admin_api/migrations/0004_lead_user_account.py	Normal file
@@ -0,0 +1,28 @@
# Generated by Django 4.2.21 on 2026-04-07

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('admin_api', '0003_lead'),
    ]

    operations = [
        migrations.AddField(
            model_name='lead',
            name='user_account',
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.SET_NULL,
                related_name='submitted_leads',
                to=settings.AUTH_USER_MODEL,
                help_text='Consumer platform account that submitted this lead (auto-matched by email)',
            ),
        ),
    ]
31	admin_api/migrations/0005_auditlog_indexes.py	Normal file
@@ -0,0 +1,31 @@
# Generated by Django 4.2.21 for the Audit Log module (admin_api v1.12.0).
#
# Adds two composite indexes to `AuditLog` so the new /audit-log admin page
# can filter by action and resolve "related entries" lookups without a full
# table scan once the log grows past a few thousand rows.

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('admin_api', '0004_lead_user_account'),
    ]

    operations = [
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['action', '-created_at'],
                name='auditlog_action_time_idx',
            ),
        ),
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['target_type', 'target_id'],
                name='auditlog_target_idx',
            ),
        ),
    ]
@@ -130,7 +130,7 @@ class StaffProfile(models.Model):
     def get_allowed_modules(self):
         scopes = self.get_effective_scopes()
         if '*' in scopes:
-            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'financials', 'settings']
+            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'leads', 'financials', 'audit-log', 'settings']
         SCOPE_TO_MODULE = {
             'users': 'users',
             'events': 'events',
@@ -140,6 +140,9 @@ class StaffProfile(models.Model):
             'settings': 'settings',
             'ads': 'ad-control',
             'contributions': 'contributions',
+            'leads': 'leads',
+            'audit': 'audit-log',
+            'reviews': 'reviews',
         }
         modules = {'dashboard'}
         for scope in scopes:
@@ -178,6 +181,74 @@ class AuditLog(models.Model):
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
indexes = [
|
||||
# Fast filter-by-action ordered by time (audit log page default view)
|
||||
models.Index(fields=['action', '-created_at'], name='auditlog_action_time_idx'),
|
||||
# Fast "related entries for this target" lookups in the detail panel
|
||||
models.Index(fields=['target_type', 'target_id'], name='auditlog_target_idx'),
|
||||
]
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.action} by {self.user} at {self.created_at}"
|
||||

# ---------------------------------------------------------------------------
# Lead Manager
# ---------------------------------------------------------------------------

class Lead(models.Model):
    EVENT_TYPE_CHOICES = [
        ('private', 'Private Event'),
        ('ticketed', 'Ticketed Event'),
        ('corporate', 'Corporate Event'),
        ('wedding', 'Wedding'),
        ('other', 'Other'),
    ]
    STATUS_CHOICES = [
        ('new', 'New'),
        ('contacted', 'Contacted'),
        ('qualified', 'Qualified'),
        ('converted', 'Converted'),
        ('closed', 'Closed'),
    ]
    SOURCE_CHOICES = [
        ('schedule_call', 'Schedule a Call'),
        ('website', 'Website'),
        ('manual', 'Manual'),
    ]
    PRIORITY_CHOICES = [
        ('low', 'Low'),
        ('medium', 'Medium'),
        ('high', 'High'),
    ]

    name = models.CharField(max_length=200)
    email = models.EmailField()
    phone = models.CharField(max_length=20)
    event_type = models.CharField(max_length=20, choices=EVENT_TYPE_CHOICES, default='private')
    message = models.TextField(blank=True, default='')
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='new')
    source = models.CharField(max_length=20, choices=SOURCE_CHOICES, default='schedule_call')
    priority = models.CharField(max_length=10, choices=PRIORITY_CHOICES, default='medium')
    assigned_to = models.ForeignKey(
        User, on_delete=models.SET_NULL, null=True, blank=True, related_name='assigned_leads'
    )
    user_account = models.ForeignKey(
        User, on_delete=models.SET_NULL, null=True, blank=True, related_name='submitted_leads',
        help_text='Consumer platform account that submitted this lead (auto-matched by email)'
    )
    notes = models.TextField(blank=True, default='')
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ['-created_at']
        indexes = [
            models.Index(fields=['status']),
            models.Index(fields=['priority']),
            models.Index(fields=['created_at']),
            models.Index(fields=['email']),
        ]

    def __str__(self):
        return f'Lead #{self.pk} — {self.name} ({self.status})'
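The "auto-matched by email" behaviour described on `Lead.user_account` boils down to a case-insensitive email lookup. A minimal pure-Python sketch, with a plain dict standing in for the real ORM query (the function name and dict shape are illustrative, not the project's API):

```python
# Hypothetical sketch of matching a submitted lead to an existing consumer
# account by email, mirroring the help_text on Lead.user_account. A dict of
# account email -> user id stands in for a real User queryset.
def match_lead_to_account(lead_email, accounts_by_email):
    """Return the matching account id, comparing emails case-insensitively."""
    return accounts_by_email.get(lead_email.strip().lower())

accounts = {'jane@example.com': 7}
print(match_lead_to_account('  Jane@Example.com ', accounts))  # -> 7
```

In the real view this would be a `User.objects.filter(email__iexact=...)` lookup run when the lead is created.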
@@ -5,9 +5,10 @@ User = get_user_model()

 class UserSerializer(serializers.ModelSerializer):
     name = serializers.SerializerMethodField()
     role = serializers.SerializerMethodField()
     partner = serializers.PrimaryKeyRelatedField(read_only=True)

     class Meta:
         model = User
-        fields = ['id', 'email', 'username', 'name', 'role']
+        fields = ['id', 'email', 'username', 'name', 'role', 'partner']

     def get_name(self, obj):
         return f"{obj.first_name} {obj.last_name}".strip() or obj.username

     def get_role(self, obj):
admin_api/tests.py — 325 lines (new file)
@@ -0,0 +1,325 @@
"""Tests for the Audit Log module (admin_api v1.12.0).

Covers:

* `AuditLogListView` — list + search + filter shape
* `AuditLogMetricsView` — shape + `?nocache=1` bypass
* `UserStatusView` — emits a row into `AuditLog` for every status change,
  inside the same transaction as the state change

Run with:

    python manage.py test admin_api.tests
"""
from __future__ import annotations

from django.core.cache import cache
from django.test import TestCase
from rest_framework_simplejwt.tokens import RefreshToken

from accounts.models import User
from admin_api.models import AuditLog


# ---------------------------------------------------------------------------
# Base — auth helper shared across cases
# ---------------------------------------------------------------------------

class _AuditTestBase(TestCase):
    """Gives each subclass an admin user + pre-issued JWT."""

    @classmethod
    def setUpTestData(cls):
        cls.admin = User.objects.create_user(
            username='audit.admin@eventifyplus.com',
            email='audit.admin@eventifyplus.com',
            password='irrelevant',
            role='admin',
        )
        cls.admin.is_superuser = True
        cls.admin.save()

    def setUp(self):
        # Metrics view caches by key; reset to keep cases independent.
        cache.delete('admin_api:audit_log:metrics:v1')
        access = str(RefreshToken.for_user(self.admin).access_token)
        self.auth = {'HTTP_AUTHORIZATION': f'Bearer {access}'}

# ---------------------------------------------------------------------------
# AuditLogListView
# ---------------------------------------------------------------------------


class AuditLogListViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_authenticated_returns_paginated_shape(self):
        AuditLog.objects.create(
            user=self.admin,
            action='user.suspended',
            target_type='user',
            target_id='42',
            details={'reason': 'spam'},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('results', 'total', 'page', 'page_size', 'total_pages'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')
        self.assertEqual(body['results'][0]['user']['email'], self.admin.email)

    def test_search_narrows_results(self):
        AuditLog.objects.create(
            user=self.admin, action='user.suspended',
            target_type='user', target_id='1', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='1', details={},
        )
        resp = self.client.get(self.url, {'search': 'suspend'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')

    def test_page_size_is_bounded(self):
        # page_size=999 must be clamped to the 200-row upper bound.
        resp = self.client.get(self.url, {'page_size': '999'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.json()['page_size'], 200)

    def test_invalid_pagination_falls_back_to_defaults(self):
        resp = self.client.get(self.url, {'page': 'x', 'page_size': 'y'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['page'], 1)
        self.assertEqual(body['page_size'], 50)

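The pagination contract these tests pin down (defaults page 1 / size 50, a hard 200-row cap, non-numeric input falling back to defaults) can be sketched as a small pure helper. The name `clean_pagination` is hypothetical, not the view's actual code:

```python
# Hypothetical helper mirroring the pagination rules the tests assert:
# defaults page=1 / page_size=50, page_size clamped to 200, and any
# non-numeric query value falling back to its default.
def clean_pagination(raw_page, raw_page_size):
    try:
        page = max(1, int(raw_page))
    except (TypeError, ValueError):
        page = 1
    try:
        page_size = min(200, max(1, int(raw_page_size)))
    except (TypeError, ValueError):
        page_size = 50
    return page, page_size

print(clean_pagination('x', 'y'))    # -> (1, 50), invalid input falls back
print(clean_pagination('2', '999'))  # -> (2, 200), page_size clamped
```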
# ---------------------------------------------------------------------------
# AuditLogMetricsView
# ---------------------------------------------------------------------------


class AuditLogMetricsViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/metrics/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_returns_expected_shape(self):
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='7', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='department.created',
            target_type='department', target_id='3', details={},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('total', 'today', 'week', 'distinct_users', 'by_action_group'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 2)
        self.assertEqual(body['distinct_users'], 1)
        # Each group present so frontend tooltip can render all 6 rows.
        for group in ('create', 'update', 'delete', 'moderate', 'auth', 'other'):
            self.assertIn(group, body['by_action_group'])
        self.assertEqual(body['by_action_group']['moderate'], 1)
        self.assertEqual(body['by_action_group']['create'], 1)

    def test_nocache_bypasses_stale_cache(self):
        # Prime cache with a fake payload.
        cache.set(
            'admin_api:audit_log:metrics:v1',
            {
                'total': 999,
                'today': 0, 'week': 0, 'distinct_users': 0,
                'by_action_group': {
                    'create': 0, 'update': 0, 'delete': 0,
                    'moderate': 0, 'auth': 0, 'other': 0,
                },
            },
            60,
        )

        resp_cached = self.client.get(self.url, **self.auth)
        self.assertEqual(resp_cached.json()['total'], 999)

        resp_fresh = self.client.get(self.url, {'nocache': '1'}, **self.auth)
        self.assertEqual(resp_fresh.json()['total'], 0)  # real DB state

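The tests imply a bucketing rule where `event.approved` lands in `moderate` and `department.created` in `create`. One plausible implementation keys off the verb suffix of the dotted action name; the suffix table below is an assumption, not the view's actual mapping:

```python
# One plausible action -> group bucketing consistent with the metrics tests
# ('event.approved' -> moderate, 'department.created' -> create). The suffix
# table is an assumption, not the view's real mapping.
_GROUP_BY_SUFFIX = {
    'created': 'create',
    'updated': 'update',
    'deleted': 'delete',
    'approved': 'moderate',
    'suspended': 'moderate',
    'banned': 'moderate',
    'reinstated': 'moderate',
}

def action_group(action):
    suffix = action.rsplit('.', 1)[-1]
    if suffix.startswith('admin_login'):
        return 'auth'
    return _GROUP_BY_SUFFIX.get(suffix, 'other')

print(action_group('event.approved'))      # -> moderate
print(action_group('department.created'))  # -> create
print(action_group('auth.admin_login'))    # -> auth
```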
# ---------------------------------------------------------------------------
# UserStatusView — audit emission
# ---------------------------------------------------------------------------


class UserStatusAuditEmissionTests(_AuditTestBase):
    """Each status transition must leave a matching row in `AuditLog`.

    The endpoint wraps the state change + audit log in `transaction.atomic()`
    so the two can never disagree. These assertions catch regressions where a
    new branch forgets the audit call.
    """

    def _url(self, user_id: int) -> str:
        return f'/api/v1/users/{user_id}/status/'

    def setUp(self):
        super().setUp()
        self.target = User.objects.create_user(
            username='target@example.com',
            email='target@example.com',
            password='irrelevant',
            role='customer',
        )

    def test_suspend_emits_audit_row(self):
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'suspend', 'reason': 'spam flood'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        log = AuditLog.objects.filter(
            action='user.suspended', target_id=str(self.target.id),
        ).first()
        self.assertIsNotNone(log, 'suspend did not emit audit log')
        self.assertEqual(log.details.get('reason'), 'spam flood')

    def test_ban_emits_audit_row(self):
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'ban'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        self.assertTrue(
            AuditLog.objects.filter(
                action='user.banned', target_id=str(self.target.id),
            ).exists(),
            'ban did not emit audit log',
        )

    def test_reinstate_emits_audit_row(self):
        self.target.is_active = False
        self.target.save()
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'reinstate'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        self.assertTrue(
            AuditLog.objects.filter(
                action='user.reinstated', target_id=str(self.target.id),
            ).exists(),
            'reinstate did not emit audit log',
        )

# ---------------------------------------------------------------------------
# AdminLoginView — audit emission
# ---------------------------------------------------------------------------


class AuthAuditEmissionTests(_AuditTestBase):
    """Successful and failed logins must leave matching rows in AuditLog."""

    url = '/api/v1/admin/auth/login/'

    def test_successful_login_emits_audit_row(self):
        resp = self.client.post(
            self.url,
            data={'username': self.admin.username, 'password': 'irrelevant'},
            content_type='application/json',
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        log = AuditLog.objects.filter(
            action='auth.admin_login', target_id=str(self.admin.id),
        ).first()
        self.assertIsNotNone(log, 'successful login did not emit audit log')
        self.assertEqual(log.details.get('username'), self.admin.username)

    def test_failed_login_emits_audit_row(self):
        resp = self.client.post(
            self.url,
            data={'username': self.admin.username, 'password': 'wrong-password'},
            content_type='application/json',
        )
        self.assertEqual(resp.status_code, 401, resp.content)
        self.assertTrue(
            AuditLog.objects.filter(action='auth.admin_login_failed').exists(),
            'failed login did not emit audit log',
        )

# ---------------------------------------------------------------------------
# EventCreateView / EventUpdateView / EventDeleteView — audit emission
# ---------------------------------------------------------------------------


class EventCrudAuditTests(_AuditTestBase):
    """Event CRUD operations must emit matching audit rows."""

    def setUp(self):
        super().setUp()
        from events.models import EventType
        self.event_type = EventType.objects.create(event_type='Test Category')

    def _create_event_id(self):
        resp = self.client.post(
            '/api/v1/events/create/',
            data={'title': 'Test Event', 'eventType': self.event_type.id},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 201, resp.content)
        return resp.json()['id']

    def test_create_event_emits_audit_row(self):
        event_id = self._create_event_id()
        self.assertTrue(
            AuditLog.objects.filter(action='event.created', target_id=str(event_id)).exists(),
            'event create did not emit audit log',
        )

    def test_update_event_emits_audit_row(self):
        event_id = self._create_event_id()
        AuditLog.objects.all().delete()
        resp = self.client.patch(
            f'/api/v1/events/{event_id}/update/',
            data={'title': 'Updated Title'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        log = AuditLog.objects.filter(action='event.updated', target_id=str(event_id)).first()
        self.assertIsNotNone(log, 'event update did not emit audit log')
        self.assertIn('title', log.details.get('changed_fields', []))

    def test_delete_event_emits_audit_row(self):
        event_id = self._create_event_id()
        AuditLog.objects.all().delete()
        resp = self.client.delete(
            f'/api/v1/events/{event_id}/delete/',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 204)
        self.assertTrue(
            AuditLog.objects.filter(action='event.deleted', target_id=str(event_id)).exists(),
            'event delete did not emit audit log',
        )
admin_api/urls.py
@@ -1,4 +1,4 @@
-from django.urls import path
+from django.urls import path, include
 from rest_framework_simplejwt.views import TokenRefreshView
 from . import views

@@ -18,8 +18,30 @@ urlpatterns = [
     path('partners/<int:pk>/', views.PartnerDetailView.as_view(), name='partner-detail'),
     path('partners/<int:pk>/status/', views.PartnerStatusView.as_view(), name='partner-status'),
     path('partners/<int:pk>/kyc/review/', views.PartnerKYCReviewView.as_view(), name='partner-kyc-review'),
     path('partners/<int:pk>/impersonate/', views.PartnerImpersonateView.as_view(), name='partner-impersonate'),
     path('partners/onboard/', views.PartnerOnboardView.as_view(), name='partner-onboard'),
     path('partners/<int:partner_id>/staff/', views.PartnerStaffCreateView.as_view(), name='partner-staff-create'),
     # Partner-Me: partner portal self-service (Sprint 1)
     path('partners/me/profile/', views.PartnerMeProfileView.as_view(), name='partner-me-profile'),
     path('partners/me/notifications/', views.PartnerMeNotificationsView.as_view(), name='partner-me-notifications'),
     path('partners/me/payout/', views.PartnerMePayoutView.as_view(), name='partner-me-payout'),
     path('partners/me/change-password/', views.PartnerMeChangePasswordView.as_view(), name='partner-me-change-password'),
     # Partner-Me: events (Sprint 2)
     path('partners/me/events/', views.PartnerMeEventsView.as_view(), name='partner-me-events'),
     path('partners/me/events/<int:pk>/', views.PartnerMeEventDetailView.as_view(), name='partner-me-event-detail'),
     path('partners/me/events/<int:pk>/duplicate/', views.PartnerMeEventDuplicateView.as_view(), name='partner-me-event-duplicate'),
     # Partner-Me: ticket tiers (Sprint 3)
     path('partners/me/events/<int:event_pk>/tiers/', views.PartnerMeEventTiersView.as_view(), name='partner-me-event-tiers'),
     path('partners/me/events/<int:event_pk>/tiers/<int:tier_pk>/', views.PartnerMeEventTierDetailView.as_view(), name='partner-me-event-tier-detail'),
     # Partner-Me: bookings (Sprint 4)
     path('partners/me/bookings/', views.PartnerBookingListView.as_view(), name='partner-me-bookings'),
     # Partner-Me: customers (Sprint 5)
     path('partners/me/customers/', views.PartnerCustomerListView.as_view(), name='partner-me-customers'),
     # Partner-Me: staff CRUD (Sprint 6)
     path('partners/me/staff/', views.PartnerMeStaffListView.as_view(), name='partner-me-staff-list'),
     path('partners/me/staff/<int:pk>/', views.PartnerMeStaffDetailView.as_view(), name='partner-me-staff-detail'),
     # Partner-Me: check-in (Sprint 7)
     path('partners/me/check-in/', views.PartnerMeCheckInView.as_view(), name='partner-me-check-in'),
     path('users/metrics/', views.UserMetricsView.as_view(), name='user-metrics'),
     path('users/', views.UserListView.as_view(), name='user-list'),
     path('users/<int:pk>/', views.UserDetailView.as_view(), name='user-detail'),

@@ -44,6 +66,12 @@ urlpatterns = [
     path('reviews/<int:pk>/moderate/', views.ReviewModerationView.as_view(), name='review-moderate'),
     path('reviews/<int:pk>/', views.ReviewDeleteView.as_view(), name='review-delete'),

     # Lead Manager
     path('leads/metrics/', views.LeadMetricsView.as_view(), name='lead-metrics'),
     path('leads/', views.LeadListView.as_view(), name='lead-list'),
     path('leads/<int:pk>/', views.LeadDetailView.as_view(), name='lead-detail'),
     path('leads/<int:pk>/update/', views.LeadUpdateView.as_view(), name='lead-update'),

     path('gamification/submit-event/', views.GamificationSubmitEventView.as_view(), name='gamification-submit-event'),
     path('gamification/submit-event', views.GamificationSubmitEventView.as_view()),
     path('shop/items/', views.ShopItemsView.as_view(), name='shop-items'),

@@ -74,4 +102,17 @@ urlpatterns = [
     path('rbac/scopes/', views.ScopeListView.as_view(), name='rbac-scope-list'),
     path('rbac/org-tree/', views.OrgTreeView.as_view(), name='rbac-org-tree'),
     path('rbac/audit-log/', views.AuditLogListView.as_view(), name='rbac-audit-log'),
     path('rbac/audit-log/metrics/', views.AuditLogMetricsView.as_view(), name='rbac-audit-log-metrics'),

     # Notifications (admin-side recurring email jobs)
     path('notifications/types/', views.NotificationTypesView.as_view(), name='notification-types'),
     path('notifications/schedules/', views.NotificationScheduleListView.as_view(), name='notification-schedule-list'),
     path('notifications/schedules/<int:pk>/', views.NotificationScheduleDetailView.as_view(), name='notification-schedule-detail'),
     path('notifications/schedules/<int:pk>/recipients/', views.NotificationRecipientView.as_view(), name='notification-recipient-create'),
     path('notifications/schedules/<int:pk>/recipients/<int:rid>/', views.NotificationRecipientDetailView.as_view(), name='notification-recipient-detail'),
     path('notifications/schedules/<int:pk>/send-now/', views.NotificationScheduleSendNowView.as_view(), name='notification-schedule-send-now'),
     path('notifications/schedules/<int:pk>/test-send/', views.NotificationScheduleTestSendView.as_view(), name='notification-schedule-test-send'),

     # Ad Control
     path('ad-control/', include('ad_control.urls')),
 ]
admin_api/views.py — 2051 lines (diff suppressed because it is too large)
eventify/settings.py
@@ -18,6 +18,9 @@ ALLOWED_HOSTS = [
     'backend.eventifyplus.com',
     'admin.eventifyplus.com',
     'app.eventifyplus.com',
+    'partner.eventifyplus.com',
+    'eventify-backend',
+    'eventify-django',
     'localhost',
     '127.0.0.1',
 ]
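The DisallowedHost failure described in the changelog comes down to a membership check on the hostname in the `Host` header: the portal's server-side request arrived as `Host: eventify-backend:8000`, and `eventify-backend` was not in the allowlist. A minimal sketch of that check (Django's real validator also handles wildcard patterns like `.example.com`, which this ignores):

```python
# Minimal sketch of why the impersonation flow returned HTTP 400: Django
# rejects any request whose Host header hostname is not in ALLOWED_HOSTS.
# This ignores the wildcard patterns Django's real validator supports.
ALLOWED_HOSTS = [
    'backend.eventifyplus.com',
    'partner.eventifyplus.com',
    'eventify-backend',  # docker service name used for server-to-server calls
    'localhost',
]

def host_is_allowed(host_header):
    hostname = host_header.split(':')[0].lower()  # strip any :port suffix
    return hostname in ALLOWED_HOSTS

print(host_is_allowed('eventify-backend:8000'))  # -> True (after the fix)
print(host_is_allowed('evil.example.com'))       # -> False
```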
@@ -46,6 +49,7 @@ INSTALLED_APPS = [
     'django_summernote',
     'ledger',
     'notifications',
+    'ad_control',
 ]

 INSTALLED_APPS += [

@@ -195,3 +199,9 @@ SIMPLE_JWT = {
     'USER_ID_FIELD': 'id',
     'USER_ID_CLAIM': 'user_id',
 }

+# --- Google OAuth (Sign in with Google via GIS ID-token flow) -----------
+# The Client ID is public (safe in VITE_* env vars and the SPA bundle).
+# There is NO client secret — we use the ID-token flow, not auth-code flow.
+# Set the SAME value in the Django container .env and in SPA .env.local.
+GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')
@@ -36,6 +36,7 @@ urlpatterns = [
     path('banking/', include('banking_operations.urls')),
     path('api/', include('mobile_api.urls')),
     path('api/v1/', include('admin_api.urls')),
     path('api/notifications/', include('notifications.urls')),
     # path('web-api/', include('web_api.urls')),

     path('summernote/', include('django_summernote.urls')),
events/migrations/0011_event_contributed_by.py — 73 lines (new file)
@@ -0,0 +1,73 @@
"""
Add contributed_by field to Event and backfill from overloaded source field.

The admin dashboard stores community contributor identifiers (EVT-XXXXXXXX or email)
in the source field. This migration:
1. Adds a dedicated contributed_by CharField
2. Copies user identifiers from source → contributed_by
3. Normalizes source back to its intended choices ('eventify', 'community', 'partner')
"""

from django.db import migrations, models


def backfill_contributed_by(apps, schema_editor):
    """Move user identifiers from source to contributed_by."""
    Event = apps.get_model('events', 'Event')

    STANDARD_SOURCES = {'eventify', 'community', 'partner', 'eventify_team', 'official', ''}

    for event in Event.objects.all().iterator():
        source_val = (event.source or '').strip()
        changed = False

        # User identifier: contains @ (email) or starts with EVT- (eventifyId)
        if source_val and source_val not in STANDARD_SOURCES and not source_val.startswith('partner:'):
            event.contributed_by = source_val
            event.source = 'community'
            changed = True

        # Normalize eventify_team → eventify
        elif source_val == 'eventify_team':
            event.source = 'eventify'
            changed = True

        # Normalize official → eventify
        elif source_val == 'official':
            event.source = 'eventify'
            changed = True

        if changed:
            event.save(update_fields=['source', 'contributed_by'])


def reverse_backfill(apps, schema_editor):
    """Reverse: move contributed_by back to source."""
    Event = apps.get_model('events', 'Event')
    for event in Event.objects.exclude(contributed_by__isnull=True).exclude(contributed_by='').iterator():
        event.source = event.contributed_by
        event.contributed_by = None
        event.save(update_fields=['source', 'contributed_by'])


class Migration(migrations.Migration):

    dependencies = [
        ('events', '0010_merge_20260324_1443'),
    ]

    operations = [
        # Step 1: Add the field
        migrations.AddField(
            model_name='event',
            name='contributed_by',
            field=models.CharField(
                blank=True,
                help_text='Eventify ID (EVT-XXXXXXXX) or email of the community contributor',
                max_length=100,
                null=True,
            ),
        ),
        # Step 2: Backfill data
        migrations.RunPython(backfill_contributed_by, reverse_backfill),
    ]
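The branching in `backfill_contributed_by` can be restated as a pure function, which makes the classification easy to sanity-check without a database. This is a test-oriented restatement, not code from the repo:

```python
# Pure-Python restatement of the migration's backfill branching, returning
# the (source, contributed_by) pair the migration would leave behind.
# Useful for unit-testing the classification without touching the ORM.
STANDARD_SOURCES = {'eventify', 'community', 'partner', 'eventify_team', 'official', ''}

def classify_source(source):
    source_val = (source or '').strip()
    if source_val and source_val not in STANDARD_SOURCES and not source_val.startswith('partner:'):
        return 'community', source_val   # user identifier moved aside
    if source_val in ('eventify_team', 'official'):
        return 'eventify', None          # normalized legacy values
    return source_val, None              # already a standard choice, unchanged

print(classify_source('EVT-1A2B3C4D'))  # -> ('community', 'EVT-1A2B3C4D')
print(classify_source('official'))      # -> ('eventify', None)
print(classify_source('partner'))       # -> ('partner', None)
```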
events/migrations/0012_eventlike.py — 38 lines (new file)
@@ -0,0 +1,38 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('events', '0011_event_contributed_by'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='EventLike',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('event', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='likes',
                    to='events.event',
                )),
                ('user', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='event_likes',
                    to=settings.AUTH_USER_MODEL,
                )),
            ],
            options={
                'unique_together': {('user', 'event')},
            },
        ),
        migrations.AddIndex(
            model_name='eventlike',
            index=models.Index(fields=['user', '-created_at'], name='events_even_user_id_created_idx'),
        ),
    ]
events/models.py
@@ -58,6 +58,11 @@ class Event(models.Model):
     is_featured = models.BooleanField(default=False, help_text='Show this event in the featured section')
     is_top_event = models.BooleanField(default=False, help_text='Show this event in the Top Events section')

+    contributed_by = models.CharField(
+        max_length=100, blank=True, null=True,
+        help_text='Eventify ID (EVT-XXXXXXXX) or email of the community contributor',
+    )

     def __str__(self):
         return f"{self.name} ({self.start_date})"

@@ -71,3 +76,26 @@ class EventImages(models.Model):
         return f"{self.event_image}"


+class EventLike(models.Model):
+    user = models.ForeignKey(
+        'accounts.User',
+        on_delete=models.CASCADE,
+        related_name='event_likes'
+    )
+    event = models.ForeignKey(
+        Event,
+        on_delete=models.CASCADE,
+        related_name='likes'
+    )
+    created_at = models.DateTimeField(auto_now_add=True)
+
+    class Meta:
+        unique_together = ('user', 'event')
+        indexes = [
+            models.Index(fields=['user', '-created_at']),
+        ]
+
+    def __str__(self):
+        return f"{self.user.email} likes {self.event.name}"
mobile_api/tests.py
@@ -1,3 +1,89 @@
-from django.test import TestCase
-
-# Create your tests here.
+"""Unit tests for GoogleLoginView.
+
+Run with:
+    python manage.py test mobile_api.tests
+"""
import json
from unittest.mock import patch

from django.test import TestCase, override_settings
from rest_framework.authtoken.models import Token

from accounts.models import User


@override_settings(GOOGLE_CLIENT_ID='test-client-id.apps.googleusercontent.com')
class GoogleLoginViewTests(TestCase):
    url = '/api/user/google-login/'

    def _valid_idinfo(self, email='new.user@example.com'):
        return {
            'email': email,
            'given_name': 'New',
            'family_name': 'User',
            'aud': 'test-client-id.apps.googleusercontent.com',
        }

    @patch('google.oauth2.id_token.verify_oauth2_token')
    def test_valid_token_creates_user(self, mock_verify):
        mock_verify.return_value = self._valid_idinfo('fresh@example.com')

        resp = self.client.post(
            self.url,
            data=json.dumps({'id_token': 'fake.google.jwt'}),
            content_type='application/json',
        )

        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        self.assertEqual(body['email'], 'fresh@example.com')
        self.assertEqual(body['role'], 'customer')
        self.assertTrue(body['token'])

        user = User.objects.get(email='fresh@example.com')
        self.assertTrue(Token.objects.filter(user=user).exists())
        # Confirm the expected audience was passed to verify_oauth2_token
        self.assertEqual(mock_verify.call_args[0][2], 'test-client-id.apps.googleusercontent.com')

    def test_missing_id_token_returns_400(self):
        resp = self.client.post(
            self.url,
            data=json.dumps({}),
            content_type='application/json',
        )
        self.assertEqual(resp.status_code, 400)
        self.assertEqual(resp.json()['error'], 'id_token is required')

    @patch('google.oauth2.id_token.verify_oauth2_token')
    def test_invalid_token_returns_401(self, mock_verify):
        mock_verify.side_effect = ValueError('Token audience mismatch')

        resp = self.client.post(
            self.url,
            data=json.dumps({'id_token': 'tampered.or.wrong-aud.jwt'}),
            content_type='application/json',
        )
        self.assertEqual(resp.status_code, 401)
        self.assertEqual(resp.json()['error'], 'Invalid Google token')

    @patch('google.oauth2.id_token.verify_oauth2_token')
    def test_existing_user_reuses_token(self, mock_verify):
        existing = User.objects.create_user(
            username='returning@example.com',
            email='returning@example.com',
            password='irrelevant',
            role='customer',
        )
        existing_auth_token = Token.objects.create(user=existing)
        mock_verify.return_value = self._valid_idinfo('returning@example.com')

        resp = self.client.post(
            self.url,
            data=json.dumps({'id_token': 'returning.user.jwt'}),
            content_type='application/json',
        )
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.json()['token'], existing_auth_token.key)
        # No duplicate user created
        self.assertEqual(User.objects.filter(email='returning@example.com').count(), 1)
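The flow these tests exercise (verify the ID token, then find-or-create the account keyed by email) can be sketched with the verifier injected as a parameter, so the logic runs offline. All names here are illustrative, not the view's actual code:

```python
# Illustrative sketch of the google-login flow exercised above: verify the
# ID token (verifier injected so this runs without network or Django), then
# find-or-create the account keyed by email. Returns (status, payload).
def google_login(raw_id_token, verify, client_id, users):
    """`users` is a dict email -> profile, standing in for the User table."""
    if not raw_id_token:
        return 400, {'error': 'id_token is required'}
    try:
        idinfo = verify(raw_id_token, client_id)
    except ValueError:
        return 401, {'error': 'Invalid Google token'}
    email = idinfo['email']
    # setdefault mirrors get-or-create: returning users keep their profile.
    profile = users.setdefault(email, {'email': email, 'role': 'customer'})
    return 200, profile

def fake_verify(token, client_id):
    return {'email': 'fresh@example.com', 'aud': client_id}

print(google_login('fake.jwt', fake_verify, 'cid', {}))
# -> (200, {'email': 'fresh@example.com', 'role': 'customer'})
```

The real view delegates the `verify` step to `google.oauth2.id_token.verify_oauth2_token`, which is exactly what the tests patch.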
mobile_api/urls.py
@@ -1,6 +1,9 @@
 from django.urls import path
 from .views import *
 from mobile_api.views.user import ScheduleCallView
 from mobile_api.views.reviews import ReviewSubmitView, MobileReviewListView, ReviewHelpfulView, ReviewFlagView
+from mobile_api.views.favorites import ToggleLikeView, MyLikedIdsView, MyLikedEventsView
+from ad_control.views import ConsumerFeaturedEventsView, ConsumerTopEventsView


 # Customer URLS
@@ -10,6 +13,9 @@ urlpatterns = [
     path('user/status/', StatusView.as_view(), name='user_status'),
     path('user/logout/', LogoutView.as_view(), name='user_logout'),
     path('user/update-profile/', UpdateProfileView.as_view(), name='update_profile'),
     path('user/bulk-public-info/', BulkUserPublicInfoView.as_view(), name='bulk_public_info'),
     path('user/google-login/', GoogleLoginView.as_view(), name='google_login'),
     path('leads/schedule-call/', ScheduleCallView.as_view(), name='schedule_call'),
 ]

 # Event URLS
@@ -22,8 +28,9 @@ urlpatterns += [
     path('events/events-by-category/', EventsByCategoryAPI.as_view(), name='api_events_by_category'),
     path('events/events-by-month-year/', EventsByMonthYearAPI.as_view(), name='events_by_month_year'),
     path('events/events-by-date/', EventsByDateAPI.as_view(), name='events_by_date'),
-    path('events/featured-events/', FeaturedEventsAPI.as_view(), name='featured_events'),
-    path('events/top-events/', TopEventsAPI.as_view(), name='top_events'),
+    path('events/featured-events/', ConsumerFeaturedEventsView.as_view(), name='featured_events'),
+    path('events/top-events/', ConsumerTopEventsView.as_view(), name='top_events'),
     path('events/contributor-profile/', ContributorProfileAPI.as_view(), name='contributor_profile'),
 ]

 # Review URLs
@@ -33,3 +40,10 @@ urlpatterns += [
     path('reviews/helpful', ReviewHelpfulView.as_view()),
     path('reviews/flag', ReviewFlagView.as_view()),
 ]

+# Favorites URLs
+urlpatterns += [
+    path('events/like/', ToggleLikeView.as_view()),
+    path('events/my-likes/', MyLikedIdsView.as_view()),
+    path('events/my-liked-events/', MyLikedEventsView.as_view()),
+]
@@ -244,9 +244,9 @@ class EventListAPI(APIView):
        if pincode_qs.count() >= MIN_EVENTS_THRESHOLD:
            qs = pincode_qs

-        # Priority 3: Full-text search on title / description
+        # Priority 3: Full-text search on title / name / description
        if q:
-            qs = qs.filter(Q(title__icontains=q) | Q(description__icontains=q))
+            qs = qs.filter(Q(title__icontains=q) | Q(name__icontains=q) | Q(description__icontains=q))

        if per_type > 0 and page == 1:
            type_ids = list(qs.values_list('event_type_id', flat=True).distinct())
@@ -581,14 +581,11 @@ class FeaturedEventsAPI(APIView):

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

-            events = Event.objects.filter(is_featured=True).order_by('-created_date')
+            events = Event.objects.filter(is_featured=True, event_status='published').order_by('-created_date')
            event_list = []
            for e in events:
                data_dict = model_to_dict(e)
                data_dict['event_type_name'] = e.event_type.event_type if e.event_type else ''
                try:
                    thumb = EventImages.objects.get(event=e.id, is_primary=True)
                    data_dict['thumb_img'] = thumb.event_image.url
@@ -610,14 +607,11 @@ class TopEventsAPI(APIView):

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

-            events = Event.objects.filter(is_top_event=True).order_by('-created_date')
+            events = Event.objects.filter(is_top_event=True, event_status='published').order_by('-created_date')
            event_list = []
            for e in events:
                data_dict = model_to_dict(e)
                data_dict['event_type_name'] = e.event_type.event_type if e.event_type else ''
                try:
                    thumb = EventImages.objects.get(event=e.id, is_primary=True)
                    data_dict['thumb_img'] = thumb.event_image.url
mobile_api/views/favorites.py (new file, 146 lines)
@@ -0,0 +1,146 @@
from django.views import View
from django.utils.decorators import method_decorator
from django.views.decorators.csrf import csrf_exempt
from django.http import JsonResponse
from django.core.paginator import Paginator

from events.models import Event, EventLike, EventImages
from mobile_api.utils import validate_token_and_get_user
from eventify_logger.services import log


def _serialize_liked_event(event):
    """Serialize an Event for the liked-events list."""
    primary_img = EventImages.objects.filter(
        event=event, is_primary=True
    ).first()
    if not primary_img:
        primary_img = EventImages.objects.filter(event=event).first()

    return {
        'id': event.id,
        'title': event.title or event.name,
        'image': primary_img.event_image.url if primary_img else '',
        'date': str(event.start_date) if event.start_date else None,
        'location': event.place or '',
        'venue': event.venue_name or '',
        'event_type': event.event_type.event_type if event.event_type else '',
        'event_status': event.event_status,
    }


@method_decorator(csrf_exempt, name='dispatch')
class ToggleLikeView(View):
    """POST /api/events/like/ — toggle like on/off for an event."""

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

            event_id = data.get('event_id')
            if not event_id:
                return JsonResponse(
                    {'status': 'error', 'message': 'event_id is required'},
                    status=400
                )

            try:
                event = Event.objects.get(pk=event_id)
            except Event.DoesNotExist:
                return JsonResponse(
                    {'status': 'error', 'message': 'Event not found'},
                    status=404
                )

            like, created = EventLike.objects.get_or_create(user=user, event=event)
            if not created:
                like.delete()
                return JsonResponse({'status': 'success', 'liked': False})

            return JsonResponse({'status': 'success', 'liked': True})

        except Exception as e:
            log("error", "ToggleLikeView exception", request=request,
                logger_data={"error": str(e)})
            return JsonResponse(
                {'status': 'error', 'message': 'An unexpected server error occurred.'},
                status=500
            )


@method_decorator(csrf_exempt, name='dispatch')
class MyLikedIdsView(View):
    """POST /api/events/my-likes/ — return all liked event IDs for the user."""

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

            liked_ids = list(
                EventLike.objects.filter(user=user)
                .values_list('event_id', flat=True)
            )
            return JsonResponse({'status': 'success', 'liked_event_ids': liked_ids})

        except Exception as e:
            log("error", "MyLikedIdsView exception", request=request,
                logger_data={"error": str(e)})
            return JsonResponse(
                {'status': 'error', 'message': 'An unexpected server error occurred.'},
                status=500
            )


@method_decorator(csrf_exempt, name='dispatch')
class MyLikedEventsView(View):
    """POST /api/events/my-liked-events/ — paginated liked events with full data."""

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

            page = int(data.get('page', 1))
            page_size = min(int(data.get('page_size', 20)), 50)

            # Event IDs liked by this user, newest first
            liked_event_ids = list(
                EventLike.objects.filter(user=user)
                .order_by('-created_at')
                .values_list('event_id', flat=True)
            )

            # Preserve ordering from liked_event_ids
            from django.db.models import Case, When, IntegerField
            ordering = Case(
                *[When(pk=pk, then=pos) for pos, pk in enumerate(liked_event_ids)],
                output_field=IntegerField()
            )
            events_qs = Event.objects.filter(id__in=liked_event_ids).order_by(ordering)

            paginator = Paginator(events_qs, page_size)
            page_obj = paginator.get_page(page)

            events_data = [_serialize_liked_event(e) for e in page_obj]

            return JsonResponse({
                'status': 'success',
                'events': events_data,
                'total': paginator.count,
                'page': page,
                'page_size': page_size,
                'has_next': page_obj.has_next(),
            })

        except Exception as e:
            log("error", "MyLikedEventsView exception", request=request,
                logger_data={"error": str(e)})
            return JsonResponse(
                {'status': 'error', 'message': 'An unexpected server error occurred.'},
                status=500
            )
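The `Case`/`When` annotation in `MyLikedEventsView` is there because `filter(id__in=liked_event_ids)` alone returns rows in arbitrary database order. A minimal plain-Python sketch of the same position-map idea (the IDs and dict rows below are invented for illustration; the real view does this in SQL through the ORM):

```python
# Liked event IDs, newest like first (what the EventLike query returns).
liked_ids = [42, 7, 19]

# Rows come back from the database in arbitrary order.
rows = [{"id": 7}, {"id": 19}, {"id": 42}]

# Position map: the equivalent of When(pk=pk, then=pos) per ID.
pos = {pk: i for i, pk in enumerate(liked_ids)}

# Re-sort rows by their position in liked_ids.
ordered = sorted(rows, key=lambda r: pos[r["id"]])
print([r["id"] for r in ordered])  # → [42, 7, 19]
```

The ORM version pushes the same mapping into the SQL `ORDER BY` as a `CASE` expression, so pagination stays correct without loading every row first.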
@@ -29,11 +29,20 @@ def _serialize_review(r, user_interactions=None):
        uname = r.reviewer.username
    except Exception:
        uname = ''
+    try:
+        pic = r.reviewer.profile_picture
+        if pic and pic.name and 'default.png' not in pic.name:
+            profile_photo = pic.url
+        else:
+            profile_photo = ''
+    except Exception:
+        profile_photo = ''
    return {
        'id': r.id,
        'event_id': r.event_id,
        'username': uname,
        'display_name': display,
+        'profile_photo': profile_photo,
        'rating': r.rating,
        'comment': r.review_text,
        'status': _STATUS_TO_JSON.get(r.status, r.status),
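The avatar rule added to `_serialize_review` (suppress the shared `default.png` placeholder, expose real uploads) is easy to isolate. A hypothetical standalone helper, not part of the diff, capturing the same condition:

```python
def public_photo_url(name, url):
    """Return the avatar URL only for real uploads; hide the default placeholder.

    Mirrors the diff's check: `pic and pic.name and 'default.png' not in pic.name`.
    `name` is the stored file name, `url` its public URL (both illustrative).
    """
    if name and 'default.png' not in name:
        return url
    return ''

print(public_photo_url('avatars/alice.jpg', '/media/avatars/alice.jpg'))   # → /media/avatars/alice.jpg
print(public_photo_url('defaults/default.png', '/media/defaults/default.png'))  # → (empty string)
```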
@@ -1,19 +1,41 @@
# accounts/views.py
import json
import secrets
from django.views.decorators.csrf import csrf_exempt
from django.http import JsonResponse
from django.utils.decorators import method_decorator
from django.views import View
from rest_framework.views import APIView
from rest_framework.authtoken.models import Token
from mobile_api.forms import RegisterForm, LoginForm, WebRegisterForm
from rest_framework.authentication import TokenAuthentication
from django.contrib.auth import logout
from django.db import connection
from mobile_api.utils import validate_token_and_get_user
from utils.errors_json_convertor import simplify_form_errors
from accounts.models import User
from eventify_logger.services import log


def _seed_gamification_profile(user):
    """Insert a gamification profile row for a newly registered user.

    Non-fatal: if the insert fails for any reason, registration still succeeds."""
    try:
        with connection.cursor() as cursor:
            cursor.execute("""
                INSERT INTO user_gamification_profiles (user_id, eventify_id)
                VALUES (%s, %s)
                ON CONFLICT (user_id) DO UPDATE
                SET eventify_id = COALESCE(
                    user_gamification_profiles.eventify_id,
                    EXCLUDED.eventify_id
                )
            """, [user.email, user.eventify_id])
    except Exception as e:
        log("warning", "Failed to seed gamification profile on registration",
            logger_data={"user": user.email, "error": str(e)})


@method_decorator(csrf_exempt, name='dispatch')
class RegisterView(View):
    def post(self, request):
@@ -22,6 +44,7 @@ class RegisterView(View):
        form = RegisterForm(data)
        if form.is_valid():
            user = form.save()
            _seed_gamification_profile(user)
            token, _ = Token.objects.get_or_create(user=user)
            log("info", "API user registration", request=request, user=user)
            return JsonResponse({'message': 'User registered successfully', 'token': token.key}, status=201)
@@ -49,6 +72,7 @@ class WebRegisterView(View):
        if form.is_valid():
            print('2')
            user = form.save()
            _seed_gamification_profile(user)
            token, _ = Token.objects.get_or_create(user=user)
            print('3')
            log("info", "Web user registration", request=request, user=user)
@@ -132,6 +156,7 @@ class StatusView(View):
                "eventify_id": user.eventify_id or '',
                "district": user.district or '',
                "district_changed_at": user.district_changed_at.isoformat() if user.district_changed_at else None,
                "profile_photo": user.profile_picture.url if user.profile_picture else '',
            })

        except Exception as e:
@@ -363,3 +388,169 @@ class UpdateProfileView(View):
                'success': False,
                'error': 'An unexpected server error occurred. Please try again.'
            }, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class BulkUserPublicInfoView(APIView):
    """Internal endpoint for Node.js gamification server to resolve user details.

    Accepts POST with { emails: [...] } (max 500).
    Returns { users: { email: { district, display_name, eventify_id } } }
    """
    authentication_classes = []
    permission_classes = []

    def post(self, request):
        try:
            json_data = json.loads(request.body)
            emails = json_data.get('emails', [])
            if not emails or not isinstance(emails, list) or len(emails) > 500:
                return JsonResponse({'error': 'Provide 1-500 emails'}, status=400)

            users_qs = User.objects.filter(email__in=emails).values_list(
                'email', 'first_name', 'last_name', 'district', 'eventify_id'
            )
            result = {}
            for email, first, last, district, eid in users_qs:
                name = f"{first} {last}".strip() or email.split('@')[0]
                result[email] = {
                    'display_name': name,
                    'district': district or '',
                    'eventify_id': eid or '',
                }
            return JsonResponse({'users': result})
        except Exception as e:
            log("error", "BulkUserPublicInfoView error", logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class GoogleLoginView(View):
    """Verify a Google ID token, find or create the user, return the same response shape as LoginView."""
    def post(self, request):
        try:
            from google.oauth2 import id_token as google_id_token
            from google.auth.transport import requests as google_requests

            from django.conf import settings

            data = json.loads(request.body)
            token = data.get('id_token')
            if not token:
                return JsonResponse({'error': 'id_token is required'}, status=400)

            if not settings.GOOGLE_CLIENT_ID:
                log("error", "GOOGLE_CLIENT_ID not configured", request=request)
                return JsonResponse({'error': 'Google login temporarily unavailable'}, status=503)

            idinfo = google_id_token.verify_oauth2_token(
                token,
                google_requests.Request(),
                settings.GOOGLE_CLIENT_ID,
            )
            email = idinfo.get('email')
            if not email:
                return JsonResponse({'error': 'Email not found in Google token'}, status=400)

            user, created = User.objects.get_or_create(
                email=email,
                defaults={
                    'username': email,
                    'first_name': idinfo.get('given_name', ''),
                    'last_name': idinfo.get('family_name', ''),
                    'role': 'customer',
                },
            )
            if created:
                user.set_password(secrets.token_urlsafe(32))
                user.save()
                log("info", "Google OAuth new user created", request=request, user=user)

            auth_token, _ = Token.objects.get_or_create(user=user)
            log("info", "Google OAuth login", request=request, user=user)

            return JsonResponse({
                'message': 'Login successful',
                'token': auth_token.key,
                'eventify_id': user.eventify_id or '',
                'username': user.username,
                'email': user.email,
                'phone_number': user.phone_number or '',
                'first_name': user.first_name,
                'last_name': user.last_name,
                'role': user.role,
                'pincode': user.pincode or '',
                'district': user.district or '',
                'district_changed_at': user.district_changed_at.isoformat() if user.district_changed_at else None,
                'state': user.state or '',
                'country': user.country or '',
                'place': user.place or '',
                'latitude': user.latitude or '',
                'longitude': user.longitude or '',
                'profile_photo': user.profile_picture.url if user.profile_picture else '',
            }, status=200)
        except ValueError as e:
            log("warning", "Google OAuth invalid token", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'Invalid Google token'}, status=401)
        except Exception as e:
            log("error", "Google OAuth exception", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class ScheduleCallView(View):
    """Public endpoint for the 'Schedule a Call' form on the consumer app."""

    def post(self, request):
        from admin_api.models import Lead
        try:
            data = json.loads(request.body)
            name = (data.get('name') or '').strip()
            email = (data.get('email') or '').strip()
            phone = (data.get('phone') or '').strip()
            event_type = (data.get('eventType') or '').strip()
            message = (data.get('message') or '').strip()

            errors = {}
            if not name:
                errors['name'] = ['This field is required.']
            if not email:
                errors['email'] = ['This field is required.']
            if not phone:
                errors['phone'] = ['This field is required.']
            valid_event_types = [c[0] for c in Lead.EVENT_TYPE_CHOICES]
            if not event_type or event_type not in valid_event_types:
                errors['eventType'] = [f'Must be one of: {", ".join(valid_event_types)}']

            if errors:
                return JsonResponse({'errors': errors}, status=400)

            # Auto-link to a consumer account if one exists with this email
            from django.contrib.auth import get_user_model
            _User = get_user_model()
            try:
                consumer_account = _User.objects.get(email=email)
            except _User.DoesNotExist:
                consumer_account = None

            lead = Lead.objects.create(
                name=name,
                email=email,
                phone=phone,
                event_type=event_type,
                message=message,
                status='new',
                source='schedule_call',
                priority='medium',
                user_account=consumer_account,
            )
            log("info", f"New schedule-call lead #{lead.pk} from {email}", request=request)
            return JsonResponse({
                'status': 'success',
                'message': 'Your request has been submitted. Our team will get back to you soon.',
                'lead_id': lead.pk,
            }, status=201)
        except json.JSONDecodeError:
            return JsonResponse({'error': 'Invalid JSON body.'}, status=400)
        except Exception as e:
            log("error", "Schedule call exception", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)
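`_seed_gamification_profile` relies on PostgreSQL-style `ON CONFLICT ... DO UPDATE` so that re-registering never clobbers an existing `eventify_id` (the `COALESCE` keeps the stored value whenever it is non-null). A minimal sketch of the same upsert semantics using stdlib `sqlite3`; the one-column schema here is assumed for illustration, not taken from the real gamification table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    # Assumed minimal schema; the conflict target must be unique for upserts.
    "CREATE TABLE user_gamification_profiles (user_id TEXT PRIMARY KEY, eventify_id TEXT)"
)

UPSERT = """
    INSERT INTO user_gamification_profiles (user_id, eventify_id)
    VALUES (?, ?)
    ON CONFLICT (user_id) DO UPDATE
    SET eventify_id = COALESCE(eventify_id, excluded.eventify_id)
"""

conn.execute(UPSERT, ("alice@example.com", "EV-001"))  # fresh insert
conn.execute(UPSERT, ("alice@example.com", "EV-999"))  # conflict: stored id wins

row = conn.execute(
    "SELECT eventify_id FROM user_gamification_profiles WHERE user_id = ?",
    ("alice@example.com",),
).fetchone()
print(row[0])  # → EV-001
```

Inside `DO UPDATE SET`, a bare column name refers to the existing row and `excluded.` to the incoming values, which is exactly why the second call is a no-op here.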
notifications/__init__.py (new file, empty)

notifications/admin.py (new file, 10 lines)
@@ -0,0 +1,10 @@
from django.contrib import admin
from .models import Notification


@admin.register(Notification)
class NotificationAdmin(admin.ModelAdmin):
    list_display = ('title', 'user', 'notification_type', 'is_read', 'created_at')
    list_filter = ('notification_type', 'is_read', 'created_at')
    search_fields = ('title', 'message', 'user__email')
    readonly_fields = ('created_at',)
notifications/apps.py (new file, 6 lines)
@@ -0,0 +1,6 @@
from django.apps import AppConfig


class NotificationsConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'notifications'
notifications/emails.py (new file, 181 lines)
@@ -0,0 +1,181 @@
"""HTML email builders for scheduled admin notifications.

Each builder is registered in ``BUILDERS`` keyed by ``NotificationSchedule.notification_type``
and returns ``(subject, html_body)``. Add new types by appending to the registry
and extending ``NotificationSchedule.TYPE_CHOICES``.

Week bounds for ``events_expiring_this_week`` are computed in Asia/Kolkata so the
"this week" semantics match the operations team's wall-clock week regardless of
``settings.TIME_ZONE`` (currently UTC).
"""
from datetime import date, datetime, timedelta
from html import escape

try:
    from zoneinfo import ZoneInfo
except ImportError:  # pragma: no cover — fallback for py<3.9
    from backports.zoneinfo import ZoneInfo  # type: ignore

from django.conf import settings
from django.core.mail import EmailMessage
from django.db.models import Q

from eventify_logger.services import log


IST = ZoneInfo('Asia/Kolkata')


def _today_in_ist() -> date:
    return datetime.now(IST).date()


def _upcoming_week_bounds(today: date) -> tuple[date, date]:
    """Return (next Monday, next Sunday) inclusive.

    If today is Monday the result is *this week* (today..Sunday).
    If today is any other weekday the result is *next week* (next Monday..next Sunday).
    Mon=0 per Python ``weekday()``.
    """
    days_until_monday = (7 - today.weekday()) % 7
    monday = today + timedelta(days=days_until_monday)
    sunday = monday + timedelta(days=6)
    return monday, sunday


def _build_events_expiring_this_week(schedule) -> tuple[str, str]:
    from events.models import Event

    today = _today_in_ist()
    monday, sunday = _upcoming_week_bounds(today)

    qs = (
        Event.objects
        .select_related('partner', 'event_type')
        .filter(event_status='published')
        .filter(
            Q(end_date__isnull=False, end_date__gte=monday, end_date__lte=sunday)
            | Q(end_date__isnull=True, start_date__gte=monday, start_date__lte=sunday)
        )
        .order_by('end_date', 'start_date', 'name')
    )

    events = list(qs)
    rows_html = ''
    for e in events:
        end = e.end_date or e.start_date
        title = e.title or e.name or '(untitled)'
        partner_name = ''
        if e.partner_id:
            try:
                partner_name = e.partner.name or ''
            except Exception:
                partner_name = ''
        category = ''
        if e.event_type_id and e.event_type:
            category = getattr(e.event_type, 'event_type', '') or ''
        rows_html += (
            '<tr>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(title)}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(partner_name or "—")}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(category or "—")}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">'
            f'{end.strftime("%a %d %b %Y") if end else "—"}</td>'
            '</tr>'
        )

    if not events:
        rows_html = (
            '<tr><td colspan="4" style="padding:24px;text-align:center;color:#888;">'
            'No published events are expiring next week.'
            '</td></tr>'
        )

    subject = (
        f'[Eventify] {len(events)} event(s) expiring '
        f'{monday.strftime("%d %b")}–{sunday.strftime("%d %b")}'
    )

    html = f"""<!doctype html>
<html><body style="margin:0;padding:0;background:#f5f5f5;font-family:Arial,Helvetica,sans-serif;color:#1a1a1a;">
<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="background:#f5f5f5;">
<tr><td align="center" style="padding:24px 12px;">
<table role="presentation" width="640" cellspacing="0" cellpadding="0" style="max-width:640px;background:#ffffff;border-radius:10px;overflow:hidden;box-shadow:0 2px 6px rgba(15,69,207,0.08);">
<tr><td style="background:#0F45CF;color:#ffffff;padding:24px 28px;">
<h2 style="margin:0;font-size:20px;">Events expiring next week</h2>
<p style="margin:6px 0 0;color:#d2dcff;font-size:14px;">
{monday.strftime("%A %d %b %Y")} → {sunday.strftime("%A %d %b %Y")}
· {len(events)} event(s)
</p>
</td></tr>
<tr><td style="padding:20px 24px;">
<p style="margin:0 0 12px;font-size:14px;color:#444;">
Scheduled notification: <strong>{escape(schedule.name)}</strong>
</p>
<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="border-collapse:collapse;font-size:14px;">
<thead>
<tr style="background:#f0f4ff;color:#0F45CF;">
<th align="left" style="padding:10px 12px;">Title</th>
<th align="left" style="padding:10px 12px;">Partner</th>
<th align="left" style="padding:10px 12px;">Category</th>
<th align="left" style="padding:10px 12px;">End date</th>
</tr>
</thead>
<tbody>{rows_html}</tbody>
</table>
</td></tr>
<tr><td style="padding:16px 24px 24px;color:#888;font-size:12px;">
Sent automatically by Eventify Command Center.
To change recipients or the schedule, open
<a href="https://admin.eventifyplus.com/settings" style="color:#0F45CF;">admin.eventifyplus.com › Settings › Notifications</a>.
</td></tr>
</table>
</td></tr>
</table>
</body></html>"""

    return subject, html


BUILDERS: dict = {
    'events_expiring_this_week': _build_events_expiring_this_week,
}


def render_and_send(schedule) -> int:
    """Render the email for ``schedule`` and deliver it to active recipients.

    Returns the number of recipients the message was sent to. Raises on SMTP
    failure so the management command can mark the schedule as errored.
    """
    builder = BUILDERS.get(schedule.notification_type)
    if builder is None:
        raise ValueError(f'No builder for notification type: {schedule.notification_type}')

    subject, html = builder(schedule)
    recipients = list(
        schedule.recipients.filter(is_active=True).values_list('email', flat=True)
    )
    if not recipients:
        log('warning', 'notification schedule has no active recipients', logger_data={
            'schedule_id': schedule.id,
            'schedule_name': schedule.name,
        })
        return 0

    msg = EmailMessage(
        subject=subject,
        body=html,
        from_email=settings.DEFAULT_FROM_EMAIL,
        to=recipients,
    )
    msg.content_subtype = 'html'
    msg.send(fail_silently=False)

    log('info', 'notification email sent', logger_data={
        'schedule_id': schedule.id,
        'schedule_name': schedule.name,
        'type': schedule.notification_type,
        'recipient_count': len(recipients),
    })
    return len(recipients)
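The module docstring above pins the timezone to Asia/Kolkata, but the `_upcoming_week_bounds` arithmetic itself is timezone-free and easy to spot-check. The same logic, reimplemented inline so it runs standalone, with concrete 2026 dates:

```python
from datetime import date, timedelta

def upcoming_week_bounds(today):
    # Same arithmetic as notifications/emails.py; Mon=0 per weekday().
    days_until_monday = (7 - today.weekday()) % 7
    monday = today + timedelta(days=days_until_monday)
    return monday, monday + timedelta(days=6)

# A Wednesday maps to next week's Monday..Sunday window.
assert upcoming_week_bounds(date(2026, 4, 22)) == (date(2026, 4, 27), date(2026, 5, 3))

# A Monday maps to the week starting that same day.
assert upcoming_week_bounds(date(2026, 4, 27)) == (date(2026, 4, 27), date(2026, 5, 3))
```

The `% 7` is what produces the documented Monday edge case: on Monday `days_until_monday` is 0, every other weekday rolls forward to the following Monday.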
notifications/management/__init__.py (new file, empty)

notifications/management/commands/__init__.py (new file, empty)

@@ -0,0 +1,153 @@
"""Dispatch due ``NotificationSchedule`` jobs.

Host cron invokes this every ~15 minutes via ``docker exec``. The command
walks all active schedules, evaluates their cron expression against
``last_run_at`` using ``croniter``, and fires any that are due. A row-level
``select_for_update(skip_locked=True)`` prevents duplicate sends if two cron
ticks race or the container is restarted mid-run.

Evaluation timezone is **Asia/Kolkata** to match
``notifications/emails.py::_upcoming_week_bounds`` — the same wall-clock week
used in the outgoing email body.

Flags:
    --schedule-id <id>   Fire exactly one schedule, ignoring cron check.
    --dry-run            Resolve due schedules + render emails, send nothing.
"""
from datetime import datetime, timedelta

try:
    from zoneinfo import ZoneInfo
except ImportError:  # pragma: no cover — py<3.9
    from backports.zoneinfo import ZoneInfo  # type: ignore

from croniter import croniter
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from django.utils import timezone

from eventify_logger.services import log
from notifications.emails import BUILDERS, render_and_send
from notifications.models import NotificationSchedule


IST = ZoneInfo('Asia/Kolkata')


def _is_due(schedule: NotificationSchedule, now_ist: datetime) -> bool:
    """Return True if ``schedule`` should fire at ``now_ist``.

    ``croniter`` is seeded with ``last_run_at`` (or one year ago for a fresh
    schedule) and asked for the next fire time. If that time has already
    passed relative to ``now_ist`` the schedule is due.
    """
    if not croniter.is_valid(schedule.cron_expression):
        return False

    if schedule.last_run_at is not None:
        seed = schedule.last_run_at.astimezone(IST)
    else:
        seed = now_ist - timedelta(days=365)

    itr = croniter(schedule.cron_expression, seed)
    next_fire = itr.get_next(datetime)
    return next_fire <= now_ist


class Command(BaseCommand):
    help = 'Dispatch due NotificationSchedule email jobs.'

    def add_arguments(self, parser):
        parser.add_argument(
            '--schedule-id', type=int, default=None,
            help='Force-run a single schedule by ID, ignoring cron check.',
        )
        parser.add_argument(
            '--dry-run', action='store_true',
            help='Render and log but do not send or persist last_run_at.',
        )

    def handle(self, *args, **opts):
        schedule_id = opts.get('schedule_id')
        dry_run = opts.get('dry_run', False)

        now_ist = datetime.now(IST)
        qs = NotificationSchedule.objects.filter(is_active=True)
        if schedule_id is not None:
            qs = qs.filter(id=schedule_id)

        candidate_ids = list(qs.values_list('id', flat=True))
        if not candidate_ids:
            self.stdout.write('No active schedules to evaluate.')
            return

        fired = 0
        skipped = 0
        errored = 0

        for sid in candidate_ids:
            with transaction.atomic():
                locked_qs = (
                    NotificationSchedule.objects
                    .select_for_update(skip_locked=True)
                    .filter(id=sid, is_active=True)
                )
                schedule = locked_qs.first()
                if schedule is None:
                    skipped += 1
                    continue

                forced = schedule_id is not None
                if not forced and not _is_due(schedule, now_ist):
                    skipped += 1
                    continue

                if schedule.notification_type not in BUILDERS:
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = (
                        f'No builder registered for {schedule.notification_type!r}'
                    )
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                if dry_run:
                    self.stdout.write(
                        f'[dry-run] would fire schedule {schedule.id} '
                        f'({schedule.name}) type={schedule.notification_type}'
                    )
                    fired += 1
                    continue

                try:
                    recipient_count = render_and_send(schedule)
                except Exception as exc:  # noqa: BLE001 — wide catch, store msg
                    log('error', 'notification dispatch failed', logger_data={
                        'schedule_id': schedule.id,
                        'schedule_name': schedule.name,
                        'error': str(exc),
                    })
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = str(exc)[:2000]
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                schedule.last_run_at = timezone.now()
                schedule.last_status = NotificationSchedule.STATUS_SUCCESS
                schedule.last_error = ''
                schedule.save(update_fields=[
                    'last_run_at', 'last_status', 'last_error', 'updated_at',
                ])
                fired += 1
                self.stdout.write(
                    f'Fired schedule {schedule.id} ({schedule.name}) '
                    f'→ {recipient_count} recipient(s)'
                )

        summary = f'Done. fired={fired} skipped={skipped} errored={errored}'
        self.stdout.write(summary)
        log('info', 'send_scheduled_notifications complete', logger_data={
            'fired': fired, 'skipped': skipped, 'errored': errored,
            'dry_run': dry_run, 'forced_id': schedule_id,
        })
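`_is_due` defers to `croniter` for arbitrary expressions. For the model's default `0 0 * * 1` (Monday 00:00, weekly), the same seed-and-compare check can be sketched without the library; this is an illustration of the due logic only, not a croniter replacement:

```python
from datetime import datetime, timedelta

def next_monday_midnight(after):
    """Next Monday 00:00 strictly after `after`, i.e. get_next() for '0 0 * * 1'."""
    day = after.date() + timedelta(days=1)  # "strictly after", like croniter
    while day.weekday() != 0:               # walk forward to the next Monday
        day += timedelta(days=1)
    return datetime(day.year, day.month, day.day)

def is_due(last_run_at, now):
    # Due when the next fire time after the last run has already passed.
    return next_monday_midnight(last_run_at) <= now

# Last fired Mon 2026-04-20 00:00; next fire is Mon 2026-04-27 00:00.
assert not is_due(datetime(2026, 4, 20), datetime(2026, 4, 24, 9, 0))
assert is_due(datetime(2026, 4, 20), datetime(2026, 4, 27, 0, 15))
```

Seeding from `last_run_at` rather than "now minus one tick" is what makes the command catch-up-safe: a schedule missed during downtime is still due on the next cron tick.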
notifications/migrations/0001_initial.py (new file, 93 lines)
@@ -0,0 +1,93 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Notification',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=255)),
                ('message', models.TextField()),
                ('notification_type', models.CharField(
                    choices=[
                        ('event', 'Event'),
                        ('promo', 'Promotion'),
                        ('system', 'System'),
                        ('booking', 'Booking'),
                    ],
                    default='system', max_length=20,
                )),
                ('is_read', models.BooleanField(default=False)),
                ('action_url', models.URLField(blank=True, null=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('user', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='notifications',
                    to=settings.AUTH_USER_MODEL,
                )),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.CreateModel(
            name='NotificationSchedule',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('notification_type', models.CharField(
                    choices=[('events_expiring_this_week', 'Events Expiring This Week')],
                    db_index=True, max_length=64,
                )),
                ('cron_expression', models.CharField(
                    default='0 0 * * 1',
                    help_text=(
                        'Standard 5-field cron (minute hour dom month dow). '
                        'Evaluated in Asia/Kolkata.'
                    ),
                    max_length=100,
                )),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('last_run_at', models.DateTimeField(blank=True, null=True)),
                ('last_status', models.CharField(blank=True, default='', max_length=20)),
                ('last_error', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.AddIndex(
            model_name='notificationschedule',
            index=models.Index(
                fields=['is_active', 'notification_type'],
                name='notificatio_is_acti_26dfb5_idx',
            ),
        ),
        migrations.CreateModel(
            name='NotificationRecipient',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('email', models.EmailField(max_length=254)),
                ('display_name', models.CharField(blank=True, default='', max_length=200)),
                ('is_active', models.BooleanField(default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('schedule', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='recipients',
                    to='notifications.notificationschedule',
                )),
            ],
            options={
                'ordering': ['display_name', 'email'],
                'unique_together': {('schedule', 'email')},
            },
        ),
    ]
0
notifications/migrations/__init__.py
Normal file
102
notifications/models.py
Normal file
@@ -0,0 +1,102 @@
"""
|
||||
Two distinct concerns live in this app:
|
||||
|
||||
1. ``Notification`` — consumer-facing in-app inbox entries surfaced on the mobile
|
||||
SPA (/api/notifications/list/). One row per user per alert.
|
||||
|
||||
2. ``NotificationSchedule`` + ``NotificationRecipient`` — admin-side recurring
|
||||
email jobs configured from the Command Center Settings tab and dispatched by
|
||||
the ``send_scheduled_notifications`` management command (host cron).
|
||||
Not user-facing; strictly operational.
|
||||
"""
|
||||
from django.db import models
|
||||
|
||||
from accounts.models import User
|
||||
|
||||
|
||||
class Notification(models.Model):
|
||||
NOTIFICATION_TYPES = [
|
||||
('event', 'Event'),
|
||||
('promo', 'Promotion'),
|
||||
('system', 'System'),
|
||||
('booking', 'Booking'),
|
||||
]
|
||||
|
||||
user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='notifications')
|
||||
title = models.CharField(max_length=255)
|
||||
message = models.TextField()
|
||||
notification_type = models.CharField(max_length=20, choices=NOTIFICATION_TYPES, default='system')
|
||||
is_read = models.BooleanField(default=False)
|
||||
action_url = models.URLField(blank=True, null=True)
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
|
||||
def __str__(self):
|
||||
return f"{self.notification_type}: {self.title} → {self.user.email}"
|
||||
|
||||
|
||||
class NotificationSchedule(models.Model):
|
||||
"""One configurable recurring email job.
|
||||
|
||||
New types are added by registering a builder in ``notifications/emails.py``
|
||||
and adding the slug to ``TYPE_CHOICES`` below. Cron expression is evaluated
|
||||
in ``Asia/Kolkata`` by the dispatcher (matches operations team timezone).
|
||||
"""
|
||||
|
||||
TYPE_EVENTS_EXPIRING_THIS_WEEK = 'events_expiring_this_week'
|
||||
|
||||
TYPE_CHOICES = [
|
||||
(TYPE_EVENTS_EXPIRING_THIS_WEEK, 'Events Expiring This Week'),
|
||||
]
|
||||
|
||||
STATUS_SUCCESS = 'success'
|
||||
STATUS_ERROR = 'error'
|
||||
|
||||
name = models.CharField(max_length=200)
|
||||
notification_type = models.CharField(
|
||||
max_length=64, choices=TYPE_CHOICES, db_index=True,
|
||||
)
|
||||
cron_expression = models.CharField(
|
||||
max_length=100, default='0 0 * * 1',
|
||||
help_text='Standard 5-field cron (minute hour dom month dow). '
|
||||
'Evaluated in Asia/Kolkata.',
|
||||
)
|
||||
is_active = models.BooleanField(default=True, db_index=True)
|
||||
last_run_at = models.DateTimeField(null=True, blank=True)
|
||||
last_status = models.CharField(max_length=20, blank=True, default='')
|
||||
last_error = models.TextField(blank=True, default='')
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
updated_at = models.DateTimeField(auto_now=True)
|
||||
|
||||
class Meta:
|
||||
ordering = ['-created_at']
|
||||
indexes = [models.Index(fields=['is_active', 'notification_type'])]
|
||||
|
||||
def __str__(self):
|
||||
return f'{self.name} ({self.notification_type})'
|
||||
|
||||
|
||||
class NotificationRecipient(models.Model):
|
||||
"""Free-form recipient — not tied to a User row so external stakeholders
|
||||
(vendors, partners, sponsors) can receive notifications without needing
|
||||
platform accounts."""
|
||||
|
||||
schedule = models.ForeignKey(
|
||||
NotificationSchedule,
|
||||
on_delete=models.CASCADE,
|
||||
related_name='recipients',
|
||||
)
|
||||
email = models.EmailField()
|
||||
display_name = models.CharField(max_length=200, blank=True, default='')
|
||||
is_active = models.BooleanField(default=True)
|
||||
created_at = models.DateTimeField(auto_now_add=True)
|
||||
|
||||
class Meta:
|
||||
unique_together = [('schedule', 'email')]
|
||||
ordering = ['display_name', 'email']
|
||||
|
||||
def __str__(self):
|
||||
label = self.display_name or self.email
|
||||
return f'{label} ({self.schedule.name})'
|
||||
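The default `cron_expression` of `'0 0 * * 1'` means Monday 00:00 in Asia/Kolkata. The dispatcher presumably evaluates arbitrary expressions with `croniter` (it is pinned in the requirements in this same diff); the stdlib-only helper below is a hedged sketch of just the weekly default, to make the timezone semantics concrete. The function name is illustrative, not part of the codebase:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

IST = ZoneInfo('Asia/Kolkata')


def next_weekly_fire(now: datetime) -> datetime:
    """Next Monday 00:00 in Asia/Kolkata, i.e. the default '0 0 * * 1'."""
    local = now.astimezone(IST)
    days_ahead = (0 - local.weekday()) % 7  # Monday is weekday 0
    candidate = (local + timedelta(days=days_ahead)).replace(
        hour=0, minute=0, second=0, microsecond=0,
    )
    if candidate <= local:  # this week's slot already passed (or is right now)
        candidate += timedelta(days=7)
    return candidate


# From Wednesday 2026-04-22 12:00 IST the next fire is Monday 2026-04-27 00:00 IST
print(next_weekly_fire(datetime(2026, 4, 22, 12, 0, tzinfo=IST)))
```

Evaluating in a fixed zone rather than UTC matters here: a naive UTC Monday-midnight check would fire at 05:30 IST, not at the start of the operations team's week.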
8
notifications/urls.py
Normal file
@@ -0,0 +1,8 @@
from django.urls import path
from .views import NotificationListView, NotificationMarkReadView, NotificationCountView

urlpatterns = [
    path('list/', NotificationListView.as_view(), name='notification_list'),
    path('mark-read/', NotificationMarkReadView.as_view(), name='notification_mark_read'),
    path('count/', NotificationCountView.as_view(), name='notification_count'),
]
85
notifications/views.py
Normal file
@@ -0,0 +1,85 @@
import json
from django.views.decorators.csrf import csrf_exempt
from django.http import JsonResponse
from django.utils.decorators import method_decorator
from django.views import View
from mobile_api.utils import validate_token_and_get_user
from eventify_logger.services import log
from .models import Notification


@method_decorator(csrf_exempt, name='dispatch')
class NotificationListView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            page = int(data.get('page', 1))
            page_size = int(data.get('page_size', 20))
            offset = (page - 1) * page_size

            notifications = Notification.objects.filter(user=user)[offset:offset + page_size]
            total = Notification.objects.filter(user=user).count()

            items = [{
                'id': n.id,
                'title': n.title,
                'message': n.message,
                'notification_type': n.notification_type,
                'is_read': n.is_read,
                'action_url': n.action_url or '',
                'created_at': n.created_at.isoformat(),
            } for n in notifications]

            return JsonResponse({
                'status': 'success',
                'notifications': items,
                'total': total,
                'page': page,
                'page_size': page_size,
            })
        except Exception as e:
            log("error", "NotificationListView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class NotificationMarkReadView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            mark_all = data.get('mark_all', False)
            notification_id = data.get('notification_id')

            if mark_all:
                Notification.objects.filter(user=user, is_read=False).update(is_read=True)
                return JsonResponse({'status': 'success', 'message': 'All notifications marked as read'})

            if notification_id:
                Notification.objects.filter(id=notification_id, user=user).update(is_read=True)
                return JsonResponse({'status': 'success', 'message': 'Notification marked as read'})

            return JsonResponse({'error': 'Provide notification_id or mark_all=true'}, status=400)
        except Exception as e:
            log("error", "NotificationMarkReadView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class NotificationCountView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            count = Notification.objects.filter(user=user, is_read=False).count()
            return JsonResponse({'status': 'success', 'unread_count': count})
        except Exception as e:
            log("error", "NotificationCountView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)
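The list view pages with a plain offset slice over the per-user queryset. A minimal standalone sketch of the same page math and response shape, with the ORM swapped for a list (the function name is illustrative, not part of the codebase):

```python
def build_page(rows, page=1, page_size=20):
    """Mirror NotificationListView's offset pagination over a plain list."""
    offset = (page - 1) * page_size
    return {
        'status': 'success',
        'notifications': rows[offset:offset + page_size],  # slice is safely empty past the end
        'total': len(rows),
        'page': page,
        'page_size': page_size,
    }
```

Offset pagination can skip or repeat a row if a new notification arrives between page fetches; with the `-created_at` ordering on the model that drift is usually acceptable for an inbox feed.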
@@ -0,0 +1,70 @@
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('partner', '0001_initial'),
    ]

    operations = [
        # Profile extras
        migrations.AddField(
            model_name='partner',
            name='bio',
            field=models.TextField(blank=True, null=True),
        ),

        # Payout settings
        migrations.AddField(
            model_name='partner',
            name='payout_account_holder_name',
            field=models.CharField(blank=True, max_length=250, null=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='payout_account_number',
            field=models.CharField(blank=True, max_length=50, null=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='payout_ifsc_code',
            field=models.CharField(blank=True, max_length=20, null=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='payout_bank_name',
            field=models.CharField(blank=True, max_length=250, null=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='payout_schedule',
            field=models.CharField(
                choices=[('weekly', 'Weekly'), ('biweekly', 'Bi-weekly'), ('monthly', 'Monthly')],
                default='monthly',
                max_length=20,
            ),
        ),

        # Notification preferences
        migrations.AddField(
            model_name='partner',
            name='notif_new_booking',
            field=models.BooleanField(default=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='notif_event_status',
            field=models.BooleanField(default=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='notif_payout_update',
            field=models.BooleanField(default=True),
        ),
        migrations.AddField(
            model_name='partner',
            name='notif_weekly_report',
            field=models.BooleanField(default=False),
        ),
    ]
@@ -65,5 +65,28 @@ class Partner(models.Model):
    kyc_compliance_document_file = models.FileField(upload_to='kyc_documents/', blank=True, null=True)
    kyc_compliance_document_number = models.CharField(max_length=250, blank=True, null=True)

    # Profile extras
    bio = models.TextField(blank=True, null=True)

    # Payout settings
    PAYOUT_SCHEDULE_CHOICES = (
        ('weekly', 'Weekly'),
        ('biweekly', 'Bi-weekly'),
        ('monthly', 'Monthly'),
    )
    payout_account_holder_name = models.CharField(max_length=250, blank=True, null=True)
    payout_account_number = models.CharField(max_length=50, blank=True, null=True)
    payout_ifsc_code = models.CharField(max_length=20, blank=True, null=True)
    payout_bank_name = models.CharField(max_length=250, blank=True, null=True)
    payout_schedule = models.CharField(
        max_length=20, choices=PAYOUT_SCHEDULE_CHOICES, default='monthly'
    )

    # Notification preferences
    notif_new_booking = models.BooleanField(default=True)
    notif_event_status = models.BooleanField(default=True)
    notif_payout_update = models.BooleanField(default=True)
    notif_weekly_report = models.BooleanField(default=False)

    def __str__(self):
        return self.name
@@ -7,3 +7,7 @@ gunicorn==21.2.0
django-extensions==3.2.3
psycopg2-binary==2.9.9
djangorestframework-simplejwt==5.3.1
google-auth>=2.0.0
requests>=2.28.0
qrcode[pil]>=7.4.2
croniter>=2.0.0
@@ -1,3 +1,7 @@
Django>=4.2
Pillow
django-summernote
google-auth>=2.0.0
requests>=2.31.0
qrcode[pil]>=7.4.2
croniter>=2.0.0