Compare commits: `main...d74698f0b8` (37 commits)

CHANGELOG.md — 154 changed lines
@@ -5,160 +5,6 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), version

---

## [1.14.1] — 2026-04-22

### Fixed

- **`_serialize_review()` now returns `profile_photo`** (`mobile_api/views/reviews.py`) — the `/api/reviews/list` payload was missing the reviewer's photo URL, so the consumer app had no choice but to render DiceBear placeholders for every reviewer, regardless of whether they had uploaded a real profile picture.
  - Resolves `r.reviewer.profile_picture.url` when the field is non-empty and the file name is not `default.png` (the model's placeholder default); returns an empty string otherwise so the frontend can fall back cleanly to DiceBear.
  - Mirrors the existing pattern in `mobile_api/views/user.py` (`LoginView`, `StatusView`, `UpdateProfileView`) — the same defensive try/except around the FK dereference.
  - Pure serializer change — no migration, no URL change, no permission change; `kill -HUP 1` against the gunicorn master picks it up.
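The fallback rule above can be sketched as a small pure function (the helper name and the object shape here are illustrative; in the codebase the logic lives inline in `_serialize_review()`):

```python
def resolve_profile_photo(reviewer) -> str:
    """Return the reviewer's uploaded photo URL, or '' so the frontend
    falls back to DiceBear. A sketch of the rule described above: the
    placeholder file name ('default.png') and the attribute names come
    from the changelog; the helper name itself is hypothetical."""
    try:
        picture = reviewer.profile_picture
        # Empty FileField, or the model's placeholder default -> no real photo.
        if not picture or picture.name.rsplit('/', 1)[-1] == 'default.png':
            return ''
        return picture.url
    except (AttributeError, ValueError):
        # Defensive guard around the FK dereference, mirroring
        # mobile_api/views/user.py.
        return ''
```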
---

## [1.14.0] — 2026-04-21

### Added

- **Module-level RBAC scopes for Reviews, Contributions, Leads, Audit Log** — `SCOPE_DEFINITIONS` in `admin_api/views.py` extended with 13 new entries, so the admin dashboard's Roles & Permissions grid and the new Base Permissions tab can grant/revoke access at module granularity:
  - Reviews: `reviews.read`, `reviews.moderate`, `reviews.delete`
  - Contributions: `contributions.read`, `contributions.approve`, `contributions.reject`, `contributions.award`
  - Leads: `leads.read`, `leads.write`, `leads.assign`, `leads.convert`
  - Audit Log: `audit.read`, `audit.export`
- **`NotificationSchedule` audit emissions** in `admin_api/views.py` — `NotificationScheduleListView.post` and `NotificationScheduleDetailView.patch` / `.delete` now write `notification.schedule.created` / `.updated` / `.deleted` `AuditLog` rows. Update emits only when at least one field actually changed; delete captures `name`/`notification_type`/`cron_expression` before the row is deleted, so the audit trail survives the deletion.
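The shape of the 13 new entries can be sketched as follows (the scope keys are from the changelog; the label strings are hypothetical, and the real dict lives in `admin_api/views.py`):

```python
# Hypothetical sketch of the 13 new SCOPE_DEFINITIONS entries; label text
# in the real admin_api/views.py dict may differ.
NEW_SCOPE_DEFINITIONS = {
    'reviews.read': 'View reviews',
    'reviews.moderate': 'Approve / reject reviews',
    'reviews.delete': 'Delete reviews',
    'contributions.read': 'View contributions',
    'contributions.approve': 'Approve contributions',
    'contributions.reject': 'Reject contributions',
    'contributions.award': 'Award contribution points',
    'leads.read': 'View leads',
    'leads.write': 'Edit leads',
    'leads.assign': 'Assign leads',
    'leads.convert': 'Convert leads',
    'audit.read': 'View audit log',
    'audit.export': 'Export audit log',
}
```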

### Fixed

- **`StaffProfile.get_allowed_modules()`** in `admin_api/models.py` — `SCOPE_TO_MODULE` was missing the `'reviews': 'reviews'` entry, so staff granted `reviews.*` scopes could not see the Reviews module in their sidebar. Entry added.

---
## [1.13.0] — 2026-04-21

### Added

- **Full admin interaction audit coverage** — `_audit_log()` calls added to 12 views; every meaningful admin state change now writes an `AuditLog` row:

| View | Action slug(s) | Notes |
|---|---|---|
| `AdminLoginView` | `auth.admin_login`, `auth.admin_login_failed` | Uses the new `user=` kwarg (anonymous at login time) |
| `PartnerStatusView` | `partner.status_changed` | Wrapped in `transaction.atomic()` |
| `PartnerOnboardView` | `partner.onboarded` | Inside the existing `transaction.atomic()` block |
| `PartnerStaffCreateView` | `partner.staff.created` | Logged after `staff_user.save()` |
| `EventCreateView` | `event.created` | `title`, `partner_id`, `source` in details |
| `EventUpdateView` | `event.updated` | `changed_fields` list in details; wrapped in `transaction.atomic()` |
| `EventDeleteView` | `event.deleted` | `title` + `partner_id` captured BEFORE delete; wrapped in `transaction.atomic()` |
| `SettlementReleaseView` | `settlement.released` | prev/new status in details; `transaction.atomic()` |
| `ReviewDeleteView` | `review.deleted` | `reviewer_user_id` + `event_id` + `rating` captured BEFORE delete |
| `PaymentGatewaySettingsView` | `gateway.created`, `gateway.updated`, `gateway.deleted` | `changed_fields` on update |
| `EventPrimaryImageView` | `event.primary_image_changed` | prev + new primary image id in details |
| `LeadUpdateView` | `lead.updated` | `changed_fields` list; only emits if any field was changed |

- **`_audit_log` helper** — optional `user=None` kwarg so `AdminLoginView` can supply the authenticated user explicitly (`request.user` is still anonymous at that point in the login flow). All 20+ existing callers are unaffected (no kwarg falls through to `request.user`).
- **`admin_api/tests.py`** — `AuthAuditEmissionTests` (login success + failed login) and `EventCrudAuditTests` (create/update/delete) bring the total test count to 16, all green.

---

## [1.12.0] — 2026-04-21

### Added

- **Audit coverage for four moderation endpoints** — every admin state change now leaves a matching row in `AuditLog`, written in the same `transaction.atomic()` block as the state change, so the log can never disagree with the database:
  - `UserStatusView` (`PATCH /api/v1/users/<id>/status/`) — `user.suspended`, `user.banned`, `user.reinstated`, `user.flagged`; details capture `reason`, `previous_status`, `new_status`
  - `EventModerationView` (`PATCH /api/v1/events/<id>/moderate/`) — `event.approved`, `event.rejected`, `event.flagged`, `event.featured`, `event.unfeatured`; details include `reason`, `partner_id`, `previous_status`/`new_status`, `previous_is_featured`/`new_is_featured`
  - `ReviewModerationView` (`PATCH /api/v1/reviews/<id>/moderate/`) — `review.approved`, `review.rejected`, `review.edited`; details include `reject_reason`, an `edited_text` flag, and `original_text` on edits
  - `PartnerKYCReviewView` (`POST /api/v1/partners/<id>/kyc/review/`) — `partner.kyc.approved`, `partner.kyc.rejected`, `partner.kyc.requested_info` (the new `requested_info` decision leaves compliance state intact and only records the info request)
- **`GET /api/v1/rbac/audit-log/metrics/`** — `AuditLogMetricsView` returns `total`, `today`, `week`, `distinct_users`, and a `by_action_group` breakdown (`create`/`update`/`delete`/`moderate`/`auth`/`other`). Cached for 60 s under the key `admin_api:audit_log:metrics:v1`; pass `?nocache=1` to bypass (useful from the Django shell during incident response).
- **`GET /api/v1/rbac/audit-log/`** — free-text `search` parameter (a Q-filter over `action`, `target_type`, `target_id`, `user__username`, `user__email`); `page_size` is now bounded to `[1, 200]`, with a defensive fallback to defaults on non-integer input.
- **`accounts.User.ALL_MODULES`** — appended `audit-log`; `StaffProfile.get_allowed_modules()` adds `'audit'` → `'audit-log'` to `SCOPE_TO_MODULE` so scope-based staff resolve the module correctly.
- **`admin_api/migrations/0005_auditlog_indexes.py`** — composite indexes `(action, -created_at)` and `(target_type, target_id)` on `AuditLog` to keep the /audit-log page fast past ~10k rows; reversible via Django's default `RemoveIndex` reverse op.
- **`admin_api/tests.py`** — `AuditLogListViewTests`, `AuditLogMetricsViewTests`, `UserStatusAuditEmissionTests` covering list shape, search, pagination bounds, metrics shape + `nocache`, and audit emission on suspend / ban / reinstate.
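The 60-second cache with the `?nocache=1` bypass can be sketched like this (a plain dict stands in for Django's cache backend; the key and TTL are from the changelog, everything else is illustrative):

```python
import time

CACHE_KEY = 'admin_api:audit_log:metrics:v1'
CACHE_TTL = 60  # seconds

_cache: dict = {}  # stand-in for django.core.cache.cache


def get_metrics(compute, nocache=False, now=time.monotonic):
    """Return cached metrics unless the entry expired or the caller
    explicitly bypassed the cache with nocache."""
    entry = _cache.get(CACHE_KEY)
    if not nocache and entry and entry[1] > now():
        return entry[0]
    payload = compute()
    _cache[CACHE_KEY] = (payload, now() + CACHE_TTL)
    return payload
```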

### Deploy notes

Admin users created before this release won't have `audit-log` in their `allowed_modules` TextField. Backfill from the Django shell with:

```python
from accounts.models import User

for u in User.objects.filter(role__in=['admin', 'manager']):
    mods = [m.strip() for m in (u.allowed_modules or '').split(',') if m.strip()]
    if 'audit-log' not in mods and mods:  # only touch users with explicit lists
        u.allowed_modules = ','.join(mods + ['audit-log'])
        u.save(update_fields=['allowed_modules'])
```

Users on the implicit full-access list (empty `allowed_modules` + admin role) pick up the new module automatically via `get_allowed_modules()`.

---
## [1.11.0] — 2026-04-12

### Added

- **Worldline Connect payment integration** (`banking_operations/worldline/`)
  - `client.py` — `WorldlineClient`: HMAC-SHA256 signed requests, `create_hosted_checkout()`, `get_hosted_checkout_status()`, `verify_webhook_signature()`
  - `views.py` — `POST /api/payments/webhook/` (CSRF-exempt, signature-verified Worldline server callback) + `POST /api/payments/verify/` (frontend polls on the return URL)
  - `emails.py` — HTML ticket confirmation email with per-ticket QR codes embedded as base64 inline images
  - `WorldlineOrder` model in `banking_operations/models.py` — tracks each hosted-checkout session (`hosted_checkout_id`, `reference_id`, `status`, `raw_response`, `webhook_payload`)
- **`Booking.payment_status`** field — `pending / paid / failed / cancelled` (default `pending`); migration `bookings/0002_booking_payment_status`
- **`banking_operations/services.py::transaction_initiate`** — implemented (was a stub); calls the Worldline API, creates a `WorldlineOrder`, and returns `payment_url` back to `CheckoutAPI`
- **Settings**: `WORLDLINE_MERCHANT_ID`, `WORLDLINE_API_KEY_ID`, `WORLDLINE_API_SECRET_KEY`, `WORLDLINE_WEBHOOK_SECRET_KEY`, `WORLDLINE_API_ENDPOINT` (default: sandbox), `WORLDLINE_RETURN_URL`
- **Requirements**: `requests>=2.31.0`, `qrcode[pil]>=7.4.2`

### Flow

1. User adds tickets to cart → `POST /api/bookings/checkout/` creates Bookings + calls `transaction_initiate`
2. `transaction_initiate` creates a `WorldlineOrder` + calls Worldline → returns the redirect URL
3. Frontend redirects the user to the Worldline hosted checkout page
4. After payment, Worldline redirects to `WORLDLINE_RETURN_URL` (`app.eventifyplus.com/booking/confirm?hostedCheckoutId=...`)
5. SPA calls `POST /api/payments/verify/` — checks local status; if still pending, polls the Worldline API directly
6. Worldline webhook fires `POST /api/payments/webhook/` → generates Tickets (one per quantity), marks the Booking `paid`, sends the confirmation email with QR codes
7. Partner scans the QR code at the event → existing `POST /api/bookings/check-in/` marks `Ticket.is_checked_in=True`
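Step 6 only mutates state after the webhook signature checks out. The HMAC-SHA256 check inside `verify_webhook_signature()` can be sketched as below; note the base64 encoding of the digest and the exact header handling are assumptions here, so consult Worldline's webhook documentation for the real scheme:

```python
import base64
import hashlib
import hmac


def verify_webhook_signature(raw_body: bytes, signature: str, secret: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it to
    the signature Worldline sent. compare_digest avoids timing side channels."""
    expected = base64.b64encode(
        hmac.new(secret.encode(), raw_body, hashlib.sha256).digest()
    ).decode()
    return hmac.compare_digest(expected, signature)
```

Comparing against the raw request body (not a re-serialized JSON parse) matters: any re-encoding can change byte order or whitespace and invalidate the digest.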

### Deploy requirement

Set in the Django container `.env`:

```
WORLDLINE_MERCHANT_ID=...
WORLDLINE_API_KEY_ID=...
WORLDLINE_API_SECRET_KEY=...
WORLDLINE_WEBHOOK_SECRET_KEY=...
```

`WORLDLINE_API_ENDPOINT` defaults to the sandbox — set it to the production URL when going live.

---

## [1.10.0] — 2026-04-10

### Security

- **`GoogleLoginView` audience-check fix** (`POST /api/user/google-login/`) — **CRITICAL security patch**
  - `verify_oauth2_token(token, google_requests.Request())` was called **without** the third `audience` argument, meaning any valid Google-signed ID token from *any* OAuth client was accepted — token spoofing from external apps was trivially possible
  - Fixed to `verify_oauth2_token(token, google_requests.Request(), settings.GOOGLE_CLIENT_ID)` — only tokens whose `aud` claim matches our registered Client ID are now accepted
  - Added a fail-closed guard: if `settings.GOOGLE_CLIENT_ID` is empty, the view returns HTTP 503 instead of silently accepting all tokens
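The decision logic of the fix and the fail-closed guard can be sketched with the verifier injected, so the shape is testable without calling Google. In the real view, `verify` is `google.oauth2.id_token.verify_oauth2_token(token, Request(), audience)`; the function name and tuple return here are illustrative:

```python
def google_login_check(token: str, client_id: str, verify):
    """Return (http_status, claims). `verify(token, audience)` stands in
    for google-auth's verify_oauth2_token, which raises ValueError on a
    bad signature or a mismatched `aud` claim."""
    if not client_id:
        # Fail closed: without a registered Client ID we cannot check the
        # `aud` claim, so refuse every token rather than accept all of them.
        return 503, None
    try:
        claims = verify(token, client_id)
    except ValueError:
        return 401, None
    return 200, claims
```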

### Changed

- **Removed Clerk scaffolding** — the `@clerk/react` broker approach added in a prior iteration has been replaced with a direct Google Identity Services (GIS) ID-token flow on the frontend. Simpler architecture: one trust boundary instead of three.
  - Removed `ClerkLoginView`, `_clerk_jwks_client`, `_get_clerk_jwks_client()` from `mobile_api/views/user.py`
  - Removed `path('user/clerk-login/', ...)` from `mobile_api/urls.py`
  - Removed `CLERK_JWKS_URL` / `CLERK_ISSUER` / `CLERK_SECRET_KEY` from `eventify/settings.py`; replaced with `GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')`
  - Removed `PyJWT[crypto]>=2.8.0` and `requests>=2.31.0` from `requirements.txt` + `requirements-docker.txt` (no longer needed; `google-auth>=2.0.0` handles verification)

### Added

- **Settings**: `GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')` in `eventify/settings.py`
- **Tests**: `mobile_api/tests.py::GoogleLoginViewTests` — 4 cases: valid token creates a user (audience arg verified), missing `id_token` → 400, `ValueError` (wrong sig / wrong aud) → 401, existing user reuses the DRF token

### Context

- The consumer SPA (`app.eventifyplus.com`) now loads the Google Identity Services script dynamically and POSTs a Google ID token to the existing `/api/user/google-login/` endpoint. Django is the sole session authority. `localStorage.event_token` / `event_user` are unchanged.
- Deploy requirement: set `GOOGLE_CLIENT_ID` in the Django container `.env` **before** deploying — without it the view returns 503 (fail-closed by design).

---

## [1.9.0] — 2026-04-07

### Added

- **Lead Manager** — new `Lead` model in `admin_api` for tracking Schedule-a-Call form submissions and sales inquiries
  - Fields: name, email, phone, event_type, message, status (new/contacted/qualified/converted/closed), source (schedule_call/website/manual), priority (low/medium/high), assigned_to (FK to User), notes
  - Migration `admin_api/0003_lead` with indexes on status, priority, created_at, and email
- **Consumer endpoint** `POST /api/leads/schedule-call/` — public (AllowAny, CSRF-exempt) endpoint for the Schedule a Call modal; creates a Lead with `status=new`, `source=schedule_call`
- **Admin API endpoints** (all IsAuthenticated):
  - `GET /api/v1/leads/metrics/` — total, new today, counts per status
  - `GET /api/v1/leads/` — paginated list with filters (status, priority, source, search, date_from, date_to)
  - `GET /api/v1/leads/<id>/` — single lead detail
  - `PATCH /api/v1/leads/<id>/update/` — update status, priority, assigned_to, notes
- **RBAC**: `leads` added to `ALL_MODULES`, `get_allowed_modules()`, and `StaffProfile.SCOPE_TO_MODULE`
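The public endpoint boils down to payload validation plus a `Lead` insert with the fixed `status`/`source`. A sketch of the validation half (field names and choice sets are from the changelog; the function name and error shape are hypothetical):

```python
REQUIRED = ('name', 'email', 'phone')
EVENT_TYPES = {'private', 'ticketed', 'corporate', 'wedding', 'other'}


def validate_schedule_call(payload: dict):
    """Return (lead_kwargs, errors). On success the kwargs carry the
    status/source the endpoint always sets for this form."""
    errors = {f: 'required' for f in REQUIRED if not str(payload.get(f, '')).strip()}
    event_type = payload.get('event_type', 'private')
    if event_type not in EVENT_TYPES:
        errors['event_type'] = 'invalid choice'
    if errors:
        return None, errors
    return {
        'name': payload['name'].strip(),
        'email': payload['email'].strip(),
        'phone': payload['phone'].strip(),
        'event_type': event_type,
        'message': payload.get('message', ''),
        'status': 'new',            # endpoint always creates as new
        'source': 'schedule_call',  # fixed source for this form
    }, None
```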
---

## [1.8.3] — 2026-04-06

### Fixed


@@ -1,13 +0,0 @@

```python
from django.db import migrations


class Migration(migrations.Migration):
    """Merge migration to resolve conflicting 0013 migrations."""

    dependencies = [
        ('accounts', '0013_merge_eventify_id'),
        ('accounts', '0013_user_district_changed_at'),
    ]

    operations = [
    ]
```

```diff
@@ -68,10 +68,10 @@ class User(AbstractUser):
         help_text='Comma-separated module slugs this user can access',
     )

-    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]
+    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "financials", "settings"]

     def get_allowed_modules(self):
-        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]
+        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "financials", "settings"]
         if self.is_superuser or self.role == "admin":
             return ALL
         if self.allowed_modules:
```

@@ -1,53 +0,0 @@

```python
# Generated by Django 4.2.21 on 2026-04-07

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('admin_api', '0002_rbac_models'),
    ]

    operations = [
        migrations.CreateModel(
            name='Lead',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('email', models.EmailField(max_length=254)),
                ('phone', models.CharField(max_length=20)),
                ('event_type', models.CharField(choices=[('private', 'Private Event'), ('ticketed', 'Ticketed Event'), ('corporate', 'Corporate Event'), ('wedding', 'Wedding'), ('other', 'Other')], default='private', max_length=20)),
                ('message', models.TextField(blank=True, default='')),
                ('status', models.CharField(choices=[('new', 'New'), ('contacted', 'Contacted'), ('qualified', 'Qualified'), ('converted', 'Converted'), ('closed', 'Closed')], default='new', max_length=20)),
                ('source', models.CharField(choices=[('schedule_call', 'Schedule a Call'), ('website', 'Website'), ('manual', 'Manual')], default='schedule_call', max_length=20)),
                ('priority', models.CharField(choices=[('low', 'Low'), ('medium', 'Medium'), ('high', 'High')], default='medium', max_length=10)),
                ('assigned_to', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='assigned_leads', to=settings.AUTH_USER_MODEL)),
                ('notes', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={
                'ordering': ['-created_at'],
            },
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['status'], name='admin_api_lead_status_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['priority'], name='admin_api_lead_priority_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['created_at'], name='admin_api_lead_created_idx'),
        ),
        migrations.AddIndex(
            model_name='lead',
            index=models.Index(fields=['email'], name='admin_api_lead_email_idx'),
        ),
    ]
```

@@ -1,28 +0,0 @@

```python
# Generated by Django 4.2.21 on 2026-04-07

from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('admin_api', '0003_lead'),
    ]

    operations = [
        migrations.AddField(
            model_name='lead',
            name='user_account',
            field=models.ForeignKey(
                blank=True,
                null=True,
                on_delete=django.db.models.deletion.SET_NULL,
                related_name='submitted_leads',
                to=settings.AUTH_USER_MODEL,
                help_text='Consumer platform account that submitted this lead (auto-matched by email)',
            ),
        ),
    ]
```

@@ -1,31 +0,0 @@

```python
# Generated by Django 4.2.21 for the Audit Log module (admin_api v1.12.0).
#
# Adds two composite indexes to `AuditLog` so the new /audit-log admin page
# can filter by action and resolve "related entries" lookups without a full
# table scan once the log grows past a few thousand rows.

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('admin_api', '0004_lead_user_account'),
    ]

    operations = [
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['action', '-created_at'],
                name='auditlog_action_time_idx',
            ),
        ),
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['target_type', 'target_id'],
                name='auditlog_target_idx',
            ),
        ),
    ]
```

```diff
@@ -130,7 +130,7 @@ class StaffProfile(models.Model):
     def get_allowed_modules(self):
         scopes = self.get_effective_scopes()
         if '*' in scopes:
-            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'leads', 'financials', 'audit-log', 'settings']
+            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'financials', 'settings']
         SCOPE_TO_MODULE = {
             'users': 'users',
             'events': 'events',
@@ -140,9 +140,6 @@ class StaffProfile(models.Model):
             'settings': 'settings',
             'ads': 'ad-control',
             'contributions': 'contributions',
-            'leads': 'leads',
-            'audit': 'audit-log',
-            'reviews': 'reviews',
         }
         modules = {'dashboard'}
         for scope in scopes:
```

```diff
@@ -181,74 +178,6 @@ class AuditLog(models.Model):

     class Meta:
         ordering = ['-created_at']
-        indexes = [
-            # Fast filter-by-action ordered by time (audit log page default view)
-            models.Index(fields=['action', '-created_at'], name='auditlog_action_time_idx'),
-            # Fast "related entries for this target" lookups in the detail panel
-            models.Index(fields=['target_type', 'target_id'], name='auditlog_target_idx'),
-        ]

     def __str__(self):
         return f"{self.action} by {self.user} at {self.created_at}"
-
-
-# ---------------------------------------------------------------------------
-# Lead Manager
-# ---------------------------------------------------------------------------
-
-class Lead(models.Model):
-    EVENT_TYPE_CHOICES = [
-        ('private', 'Private Event'),
-        ('ticketed', 'Ticketed Event'),
-        ('corporate', 'Corporate Event'),
-        ('wedding', 'Wedding'),
-        ('other', 'Other'),
-    ]
-    STATUS_CHOICES = [
-        ('new', 'New'),
-        ('contacted', 'Contacted'),
-        ('qualified', 'Qualified'),
-        ('converted', 'Converted'),
-        ('closed', 'Closed'),
-    ]
-    SOURCE_CHOICES = [
-        ('schedule_call', 'Schedule a Call'),
-        ('website', 'Website'),
-        ('manual', 'Manual'),
-    ]
-    PRIORITY_CHOICES = [
-        ('low', 'Low'),
-        ('medium', 'Medium'),
-        ('high', 'High'),
-    ]
-
-    name = models.CharField(max_length=200)
-    email = models.EmailField()
-    phone = models.CharField(max_length=20)
-    event_type = models.CharField(max_length=20, choices=EVENT_TYPE_CHOICES, default='private')
-    message = models.TextField(blank=True, default='')
-    status = models.CharField(max_length=20, choices=STATUS_CHOICES, default='new')
-    source = models.CharField(max_length=20, choices=SOURCE_CHOICES, default='schedule_call')
-    priority = models.CharField(max_length=10, choices=PRIORITY_CHOICES, default='medium')
-    assigned_to = models.ForeignKey(
-        User, on_delete=models.SET_NULL, null=True, blank=True, related_name='assigned_leads'
-    )
-    user_account = models.ForeignKey(
-        User, on_delete=models.SET_NULL, null=True, blank=True, related_name='submitted_leads',
-        help_text='Consumer platform account that submitted this lead (auto-matched by email)'
-    )
-    notes = models.TextField(blank=True, default='')
-    created_at = models.DateTimeField(auto_now_add=True)
-    updated_at = models.DateTimeField(auto_now=True)
-
-    class Meta:
-        ordering = ['-created_at']
-        indexes = [
-            models.Index(fields=['status']),
-            models.Index(fields=['priority']),
-            models.Index(fields=['created_at']),
-            models.Index(fields=['email']),
-        ]
-
-    def __str__(self):
-        return f'Lead #{self.pk} — {self.name} ({self.status})'
```

@@ -1,325 +0,0 @@

```python
"""Tests for the Audit Log module (admin_api v1.12.0).

Covers:

* `AuditLogListView` — list + search + filter shape
* `AuditLogMetricsView` — shape + `?nocache=1` bypass
* `UserStatusView` — emits a row into `AuditLog` for every status change,
  inside the same transaction as the state change

Run with:

    python manage.py test admin_api.tests
"""
from __future__ import annotations

from django.core.cache import cache
from django.test import TestCase
from rest_framework_simplejwt.tokens import RefreshToken

from accounts.models import User
from admin_api.models import AuditLog


# ---------------------------------------------------------------------------
# Base — auth helper shared across cases
# ---------------------------------------------------------------------------

class _AuditTestBase(TestCase):
    """Gives each subclass an admin user + pre-issued JWT."""

    @classmethod
    def setUpTestData(cls):
        cls.admin = User.objects.create_user(
            username='audit.admin@eventifyplus.com',
            email='audit.admin@eventifyplus.com',
            password='irrelevant',
            role='admin',
        )
        cls.admin.is_superuser = True
        cls.admin.save()

    def setUp(self):
        # Metrics view caches by key; reset to keep cases independent.
        cache.delete('admin_api:audit_log:metrics:v1')
        access = str(RefreshToken.for_user(self.admin).access_token)
        self.auth = {'HTTP_AUTHORIZATION': f'Bearer {access}'}


# ---------------------------------------------------------------------------
# AuditLogListView
# ---------------------------------------------------------------------------

class AuditLogListViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_authenticated_returns_paginated_shape(self):
        AuditLog.objects.create(
            user=self.admin,
            action='user.suspended',
            target_type='user',
            target_id='42',
            details={'reason': 'spam'},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('results', 'total', 'page', 'page_size', 'total_pages'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')
        self.assertEqual(body['results'][0]['user']['email'], self.admin.email)

    def test_search_narrows_results(self):
        AuditLog.objects.create(
            user=self.admin, action='user.suspended',
            target_type='user', target_id='1', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='1', details={},
        )
        resp = self.client.get(self.url, {'search': 'suspend'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')

    def test_page_size_is_bounded(self):
        # page_size=999 must be clamped to the 200-row upper bound.
        resp = self.client.get(self.url, {'page_size': '999'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.json()['page_size'], 200)

    def test_invalid_pagination_falls_back_to_defaults(self):
        resp = self.client.get(self.url, {'page': 'x', 'page_size': 'y'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['page'], 1)
        self.assertEqual(body['page_size'], 50)


# ---------------------------------------------------------------------------
# AuditLogMetricsView
# ---------------------------------------------------------------------------

class AuditLogMetricsViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/metrics/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_returns_expected_shape(self):
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='7', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='department.created',
            target_type='department', target_id='3', details={},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('total', 'today', 'week', 'distinct_users', 'by_action_group'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 2)
        self.assertEqual(body['distinct_users'], 1)
        # Each group present so frontend tooltip can render all 6 rows.
        for group in ('create', 'update', 'delete', 'moderate', 'auth', 'other'):
            self.assertIn(group, body['by_action_group'])
        self.assertEqual(body['by_action_group']['moderate'], 1)
        self.assertEqual(body['by_action_group']['create'], 1)

    def test_nocache_bypasses_stale_cache(self):
        # Prime cache with a fake payload.
        cache.set(
            'admin_api:audit_log:metrics:v1',
            {
                'total': 999,
                'today': 0, 'week': 0, 'distinct_users': 0,
                'by_action_group': {
                    'create': 0, 'update': 0, 'delete': 0,
                    'moderate': 0, 'auth': 0, 'other': 0,
                },
            },
            60,
        )

        resp_cached = self.client.get(self.url, **self.auth)
        self.assertEqual(resp_cached.json()['total'], 999)

        resp_fresh = self.client.get(self.url, {'nocache': '1'}, **self.auth)
        self.assertEqual(resp_fresh.json()['total'], 0)  # real DB state


# ---------------------------------------------------------------------------
# UserStatusView — audit emission
# ---------------------------------------------------------------------------

class UserStatusAuditEmissionTests(_AuditTestBase):
    """Each status transition must leave a matching row in `AuditLog`.

    The endpoint wraps the state change + audit log in `transaction.atomic()`
    so the two can never disagree. These assertions catch regressions where a
    new branch forgets the audit call.
```
|
|
||||||
"""
|
|
||||||
|
|
||||||
def _url(self, user_id: int) -> str:
|
|
||||||
return f'/api/v1/users/{user_id}/status/'
|
|
||||||
|
|
||||||
def setUp(self):
|
|
||||||
super().setUp()
|
|
||||||
self.target = User.objects.create_user(
|
|
||||||
username='target@example.com',
|
|
||||||
email='target@example.com',
|
|
||||||
password='irrelevant',
|
|
||||||
role='customer',
|
|
||||||
)
|
|
||||||
|
|
||||||
def test_suspend_emits_audit_row(self):
|
|
||||||
resp = self.client.patch(
|
|
||||||
self._url(self.target.id),
|
|
||||||
data={'action': 'suspend', 'reason': 'spam flood'},
|
|
||||||
content_type='application/json',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 200, resp.content)
|
|
||||||
log = AuditLog.objects.filter(
|
|
||||||
action='user.suspended', target_id=str(self.target.id),
|
|
||||||
).first()
|
|
||||||
self.assertIsNotNone(log, 'suspend did not emit audit log')
|
|
||||||
self.assertEqual(log.details.get('reason'), 'spam flood')
|
|
||||||
|
|
||||||
def test_ban_emits_audit_row(self):
|
|
||||||
resp = self.client.patch(
|
|
||||||
self._url(self.target.id),
|
|
||||||
data={'action': 'ban'},
|
|
||||||
content_type='application/json',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 200, resp.content)
|
|
||||||
self.assertTrue(
|
|
||||||
AuditLog.objects.filter(
|
|
||||||
action='user.banned', target_id=str(self.target.id),
|
|
||||||
).exists(),
|
|
||||||
'ban did not emit audit log',
|
|
||||||
)
|
|
||||||
|
|
||||||
def test_reinstate_emits_audit_row(self):
|
|
||||||
self.target.is_active = False
|
|
||||||
self.target.save()
|
|
||||||
resp = self.client.patch(
|
|
||||||
self._url(self.target.id),
|
|
||||||
data={'action': 'reinstate'},
|
|
||||||
content_type='application/json',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 200, resp.content)
|
|
||||||
self.assertTrue(
|
|
||||||
AuditLog.objects.filter(
|
|
||||||
action='user.reinstated', target_id=str(self.target.id),
|
|
||||||
).exists(),
|
|
||||||
'reinstate did not emit audit log',
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
|
||||||
# AdminLoginView — audit emission
|
|
||||||
# ---------------------------------------------------------------------------
|
|
||||||
|
|
||||||
|
|
||||||
class AuthAuditEmissionTests(_AuditTestBase):
|
|
||||||
"""Successful and failed logins must leave matching rows in AuditLog."""
|
|
||||||
|
|
||||||
url = '/api/v1/admin/auth/login/'
|
|
||||||
|
|
||||||
def test_successful_login_emits_audit_row(self):
|
|
||||||
resp = self.client.post(
|
|
||||||
self.url,
|
|
||||||
data={'username': self.admin.username, 'password': 'irrelevant'},
|
|
||||||
content_type='application/json',
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 200, resp.content)
|
|
||||||
log = AuditLog.objects.filter(
|
|
||||||
action='auth.admin_login', target_id=str(self.admin.id),
|
|
||||||
).first()
|
|
||||||
self.assertIsNotNone(log, 'successful login did not emit audit log')
|
|
||||||
self.assertEqual(log.details.get('username'), self.admin.username)
|
|
||||||
|
|
||||||
def test_failed_login_emits_audit_row(self):
|
|
||||||
resp = self.client.post(
|
|
||||||
self.url,
|
|
||||||
data={'username': self.admin.username, 'password': 'wrong-password'},
|
|
||||||
content_type='application/json',
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 401, resp.content)
|
|
||||||
self.assertTrue(
|
|
||||||
AuditLog.objects.filter(action='auth.admin_login_failed').exists(),
|
|
||||||
'failed login did not emit audit log',
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
# ---------------------------------------------------------------------------
|
|
||||||
# EventCreateView / EventUpdateView / EventDeleteView — audit emission
|
|
||||||
# ---------------------------------------------------------------------------
|
|
||||||
|
|
||||||
|
|
||||||
class EventCrudAuditTests(_AuditTestBase):
|
|
||||||
"""Event CRUD operations must emit matching audit rows."""
|
|
||||||
|
|
||||||
def setUp(self):
|
|
||||||
super().setUp()
|
|
||||||
from events.models import EventType
|
|
||||||
self.event_type = EventType.objects.create(event_type='Test Category')
|
|
||||||
|
|
||||||
def _create_event_id(self):
|
|
||||||
resp = self.client.post(
|
|
||||||
'/api/v1/events/create/',
|
|
||||||
data={'title': 'Test Event', 'eventType': self.event_type.id},
|
|
||||||
content_type='application/json',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 201, resp.content)
|
|
||||||
return resp.json()['id']
|
|
||||||
|
|
||||||
def test_create_event_emits_audit_row(self):
|
|
||||||
event_id = self._create_event_id()
|
|
||||||
self.assertTrue(
|
|
||||||
AuditLog.objects.filter(action='event.created', target_id=str(event_id)).exists(),
|
|
||||||
'event create did not emit audit log',
|
|
||||||
)
|
|
||||||
|
|
||||||
def test_update_event_emits_audit_row(self):
|
|
||||||
event_id = self._create_event_id()
|
|
||||||
AuditLog.objects.all().delete()
|
|
||||||
resp = self.client.patch(
|
|
||||||
f'/api/v1/events/{event_id}/update/',
|
|
||||||
data={'title': 'Updated Title'},
|
|
||||||
content_type='application/json',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 200, resp.content)
|
|
||||||
log = AuditLog.objects.filter(action='event.updated', target_id=str(event_id)).first()
|
|
||||||
self.assertIsNotNone(log, 'event update did not emit audit log')
|
|
||||||
self.assertIn('title', log.details.get('changed_fields', []))
|
|
||||||
|
|
||||||
def test_delete_event_emits_audit_row(self):
|
|
||||||
event_id = self._create_event_id()
|
|
||||||
AuditLog.objects.all().delete()
|
|
||||||
resp = self.client.delete(
|
|
||||||
f'/api/v1/events/{event_id}/delete/',
|
|
||||||
**self.auth,
|
|
||||||
)
|
|
||||||
self.assertEqual(resp.status_code, 204)
|
|
||||||
self.assertTrue(
|
|
||||||
AuditLog.objects.filter(action='event.deleted', target_id=str(event_id)).exists(),
|
|
||||||
'event delete did not emit audit log',
|
|
||||||
)
|
|
||||||
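The pagination behaviour asserted above (default page size 50, a hard cap of 200 rows, and silent fallback to defaults on garbage input) can be isolated as a small standalone helper. This is a sketch inferred from the test assertions, not the list view's actual code; the function name and defaults are assumptions:

```python
def parse_pagination(params, default_size=50, max_size=200):
    """Normalize pagination query params (hypothetical helper).

    Mirrors what the audit-log list view is tested to do: non-numeric
    input falls back to defaults, and page_size is clamped to the
    200-row upper bound.
    """
    try:
        page = max(int(params.get('page', 1)), 1)
    except (TypeError, ValueError):
        page = 1
    try:
        page_size = int(params.get('page_size', default_size))
    except (TypeError, ValueError):
        page_size = default_size
    return page, min(max(page_size, 1), max_size)
```

With this shape, `test_page_size_is_bounded` and `test_invalid_pagination_falls_back_to_defaults` describe exactly the two failure paths: clamping and fallback.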
```diff
@@ -18,7 +18,6 @@ urlpatterns = [
     path('partners/<int:pk>/', views.PartnerDetailView.as_view(), name='partner-detail'),
     path('partners/<int:pk>/status/', views.PartnerStatusView.as_view(), name='partner-status'),
     path('partners/<int:pk>/kyc/review/', views.PartnerKYCReviewView.as_view(), name='partner-kyc-review'),
-    path('partners/<int:pk>/impersonate/', views.PartnerImpersonateView.as_view(), name='partner-impersonate'),
     path('partners/onboard/', views.PartnerOnboardView.as_view(), name='partner-onboard'),
     path('partners/<int:partner_id>/staff/', views.PartnerStaffCreateView.as_view(), name='partner-staff-create'),
     path('users/metrics/', views.UserMetricsView.as_view(), name='user-metrics'),
@@ -45,12 +44,6 @@ urlpatterns = [
     path('reviews/<int:pk>/moderate/', views.ReviewModerationView.as_view(), name='review-moderate'),
     path('reviews/<int:pk>/', views.ReviewDeleteView.as_view(), name='review-delete'),
-
-    # Lead Manager
-    path('leads/metrics/', views.LeadMetricsView.as_view(), name='lead-metrics'),
-    path('leads/', views.LeadListView.as_view(), name='lead-list'),
-    path('leads/<int:pk>/', views.LeadDetailView.as_view(), name='lead-detail'),
-    path('leads/<int:pk>/update/', views.LeadUpdateView.as_view(), name='lead-update'),

     path('gamification/submit-event/', views.GamificationSubmitEventView.as_view(), name='gamification-submit-event'),
     path('gamification/submit-event', views.GamificationSubmitEventView.as_view()),
     path('shop/items/', views.ShopItemsView.as_view(), name='shop-items'),
@@ -81,16 +74,6 @@ urlpatterns = [
     path('rbac/scopes/', views.ScopeListView.as_view(), name='rbac-scope-list'),
     path('rbac/org-tree/', views.OrgTreeView.as_view(), name='rbac-org-tree'),
     path('rbac/audit-log/', views.AuditLogListView.as_view(), name='rbac-audit-log'),
-    path('rbac/audit-log/metrics/', views.AuditLogMetricsView.as_view(), name='rbac-audit-log-metrics'),
-
-    # Notifications (admin-side recurring email jobs)
-    path('notifications/types/', views.NotificationTypesView.as_view(), name='notification-types'),
-    path('notifications/schedules/', views.NotificationScheduleListView.as_view(), name='notification-schedule-list'),
-    path('notifications/schedules/<int:pk>/', views.NotificationScheduleDetailView.as_view(), name='notification-schedule-detail'),
-    path('notifications/schedules/<int:pk>/recipients/', views.NotificationRecipientView.as_view(), name='notification-recipient-create'),
-    path('notifications/schedules/<int:pk>/recipients/<int:rid>/', views.NotificationRecipientDetailView.as_view(), name='notification-recipient-detail'),
-    path('notifications/schedules/<int:pk>/send-now/', views.NotificationScheduleSendNowView.as_view(), name='notification-schedule-send-now'),
-    path('notifications/schedules/<int:pk>/test-send/', views.NotificationScheduleTestSendView.as_view(), name='notification-schedule-test-send'),

     # Ad Control
     path('ad-control/', include('ad_control.urls')),
```
admin_api/views.py — 1034 lines changed; file diff suppressed because it is too large.
```diff
@@ -196,9 +196,3 @@ SIMPLE_JWT = {
     'USER_ID_FIELD': 'id',
     'USER_ID_CLAIM': 'user_id',
 }
-
-# --- Google OAuth (Sign in with Google via GIS ID-token flow) -----------
-# The Client ID is public (safe in VITE_* env vars and the SPA bundle).
-# There is NO client secret — we use the ID-token flow, not auth-code flow.
-# Set the SAME value in the Django container .env and in SPA .env.local.
-GOOGLE_CLIENT_ID = os.environ.get('GOOGLE_CLIENT_ID', '')
```
```diff
@@ -36,7 +36,6 @@ urlpatterns = [
     path('banking/', include('banking_operations.urls')),
     path('api/', include('mobile_api.urls')),
     path('api/v1/', include('admin_api.urls')),
-    path('api/notifications/', include('notifications.urls')),
     # path('web-api/', include('web_api.urls')),

     path('summernote/', include('django_summernote.urls')),
```
```diff
@@ -1,73 +0,0 @@
-"""
-Add contributed_by field to Event and backfill from overloaded source field.
-
-The admin dashboard stores community contributor identifiers (EVT-XXXXXXXX or email)
-in the source field. This migration:
-1. Adds a dedicated contributed_by CharField
-2. Copies user identifiers from source → contributed_by
-3. Normalizes source back to its intended choices ('eventify', 'community', 'partner')
-"""
-
-from django.db import migrations, models
-
-
-def backfill_contributed_by(apps, schema_editor):
-    """Move user identifiers from source to contributed_by."""
-    Event = apps.get_model('events', 'Event')
-
-    STANDARD_SOURCES = {'eventify', 'community', 'partner', 'eventify_team', 'official', ''}
-
-    for event in Event.objects.all().iterator():
-        source_val = (event.source or '').strip()
-        changed = False
-
-        # User identifier: contains @ (email) or starts with EVT- (eventifyId)
-        if source_val and source_val not in STANDARD_SOURCES and not source_val.startswith('partner:'):
-            event.contributed_by = source_val
-            event.source = 'community'
-            changed = True
-
-        # Normalize eventify_team → eventify
-        elif source_val == 'eventify_team':
-            event.source = 'eventify'
-            changed = True
-
-        # Normalize official → eventify
-        elif source_val == 'official':
-            event.source = 'eventify'
-            changed = True
-
-        if changed:
-            event.save(update_fields=['source', 'contributed_by'])
-
-
-def reverse_backfill(apps, schema_editor):
-    """Reverse: move contributed_by back to source."""
-    Event = apps.get_model('events', 'Event')
-    for event in Event.objects.exclude(contributed_by__isnull=True).exclude(contributed_by='').iterator():
-        event.source = event.contributed_by
-        event.contributed_by = None
-        event.save(update_fields=['source', 'contributed_by'])
-
-
-class Migration(migrations.Migration):
-
-    dependencies = [
-        ('events', '0010_merge_20260324_1443'),
-    ]
-
-    operations = [
-        # Step 1: Add the field
-        migrations.AddField(
-            model_name='event',
-            name='contributed_by',
-            field=models.CharField(
-                blank=True,
-                help_text='Eventify ID (EVT-XXXXXXXX) or email of the community contributor',
-                max_length=100,
-                null=True,
-            ),
-        ),
-        # Step 2: Backfill data
-        migrations.RunPython(backfill_contributed_by, reverse_backfill),
-    ]
```
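The backfill's classification rules can be read as a pure function over a raw `source` value. The sketch below isolates that branch logic for illustration; it is not code from the migration, and the tuple-returning shape is an assumption (the migration mutates the row in place and leaves `contributed_by` untouched on the normalize branches):

```python
STANDARD_SOURCES = {'eventify', 'community', 'partner', 'eventify_team', 'official', ''}

def classify_source(raw):
    """Return (source, contributed_by) per the backfill rules above.

    User identifiers (emails, EVT-XXXXXXXX ids) move to contributed_by
    and the row becomes 'community'; legacy labels normalize to
    'eventify'; everything else passes through unchanged.
    """
    source_val = (raw or '').strip()
    if source_val and source_val not in STANDARD_SOURCES and not source_val.startswith('partner:'):
        return 'community', source_val
    if source_val in ('eventify_team', 'official'):
        return 'eventify', None
    return source_val, None
```

Note that `partner:`-prefixed values deliberately fall through untouched, matching the `startswith('partner:')` guard in the migration.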
```diff
@@ -1,38 +0,0 @@
-from django.conf import settings
-from django.db import migrations, models
-import django.db.models.deletion
-
-
-class Migration(migrations.Migration):
-
-    dependencies = [
-        ('events', '0011_event_contributed_by'),
-        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
-    ]
-
-    operations = [
-        migrations.CreateModel(
-            name='EventLike',
-            fields=[
-                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
-                ('created_at', models.DateTimeField(auto_now_add=True)),
-                ('event', models.ForeignKey(
-                    on_delete=django.db.models.deletion.CASCADE,
-                    related_name='likes',
-                    to='events.event',
-                )),
-                ('user', models.ForeignKey(
-                    on_delete=django.db.models.deletion.CASCADE,
-                    related_name='event_likes',
-                    to=settings.AUTH_USER_MODEL,
-                )),
-            ],
-            options={
-                'unique_together': {('user', 'event')},
-            },
-        ),
-        migrations.AddIndex(
-            model_name='eventlike',
-            index=models.Index(fields=['user', '-created_at'], name='events_even_user_id_created_idx'),
-        ),
-    ]
```
```diff
@@ -58,11 +58,6 @@ class Event(models.Model):
     is_featured = models.BooleanField(default=False, help_text='Show this event in the featured section')
     is_top_event = models.BooleanField(default=False, help_text='Show this event in the Top Events section')
-
-    contributed_by = models.CharField(
-        max_length=100, blank=True, null=True,
-        help_text='Eventify ID (EVT-XXXXXXXX) or email of the community contributor',
-    )

     def __str__(self):
         return f"{self.name} ({self.start_date})"

@@ -76,26 +71,3 @@ class EventImages(models.Model):
         return f"{self.event_image}"
-
-
-class EventLike(models.Model):
-    user = models.ForeignKey(
-        'accounts.User',
-        on_delete=models.CASCADE,
-        related_name='event_likes'
-    )
-    event = models.ForeignKey(
-        Event,
-        on_delete=models.CASCADE,
-        related_name='likes'
-    )
-    created_at = models.DateTimeField(auto_now_add=True)
-
-    class Meta:
-        unique_together = ('user', 'event')
-        indexes = [
-            models.Index(fields=['user', '-created_at']),
-        ]
-
-    def __str__(self):
-        return f"{self.user.email} likes {self.event.name}"
```
```diff
@@ -1,89 +1,3 @@
-"""Unit tests for GoogleLoginView.
-
-Run with:
-    python manage.py test mobile_api.tests
-"""
-import json
-from unittest.mock import patch, MagicMock
-
-from django.test import TestCase, override_settings
-from rest_framework.authtoken.models import Token
-
-from accounts.models import User
-
-
-@override_settings(GOOGLE_CLIENT_ID='test-client-id.apps.googleusercontent.com')
-class GoogleLoginViewTests(TestCase):
-    url = '/api/user/google-login/'
-
-    def _valid_idinfo(self, email='new.user@example.com'):
-        return {
-            'email': email,
-            'given_name': 'New',
-            'family_name': 'User',
-            'aud': 'test-client-id.apps.googleusercontent.com',
-        }
-
-    @patch('google.oauth2.id_token.verify_oauth2_token')
-    def test_valid_token_creates_user(self, mock_verify):
-        mock_verify.return_value = self._valid_idinfo('fresh@example.com')
-
-        resp = self.client.post(
-            self.url,
-            data=json.dumps({'id_token': 'fake.google.jwt'}),
-            content_type='application/json',
-        )
-
-        self.assertEqual(resp.status_code, 200, resp.content)
-        body = resp.json()
-        self.assertEqual(body['email'], 'fresh@example.com')
-        self.assertEqual(body['role'], 'customer')
-        self.assertTrue(body['token'])
-
-        user = User.objects.get(email='fresh@example.com')
-        self.assertTrue(Token.objects.filter(user=user).exists())
-        # Confirm audience was passed to verify_oauth2_token
-        self.assertEqual(mock_verify.call_args[0][2], 'test-client-id.apps.googleusercontent.com')
-
-    def test_missing_id_token_returns_400(self):
-        resp = self.client.post(
-            self.url,
-            data=json.dumps({}),
-            content_type='application/json',
-        )
-        self.assertEqual(resp.status_code, 400)
-        self.assertEqual(resp.json()['error'], 'id_token is required')
-
-    @patch('google.oauth2.id_token.verify_oauth2_token')
-    def test_invalid_token_returns_401(self, mock_verify):
-        mock_verify.side_effect = ValueError('Token audience mismatch')
-
-        resp = self.client.post(
-            self.url,
-            data=json.dumps({'id_token': 'tampered.or.wrong-aud.jwt'}),
-            content_type='application/json',
-        )
-        self.assertEqual(resp.status_code, 401)
-        self.assertEqual(resp.json()['error'], 'Invalid Google token')
-
-    @patch('google.oauth2.id_token.verify_oauth2_token')
-    def test_existing_user_reuses_token(self, mock_verify):
-        existing = User.objects.create_user(
-            username='returning@example.com',
-            email='returning@example.com',
-            password='irrelevant',
-            role='customer',
-        )
-        existing_auth_token = Token.objects.create(user=existing)
-        mock_verify.return_value = self._valid_idinfo('returning@example.com')
-
-        resp = self.client.post(
-            self.url,
-            data=json.dumps({'id_token': 'returning.user.jwt'}),
-            content_type='application/json',
-        )
-        self.assertEqual(resp.status_code, 200)
-        self.assertEqual(resp.json()['token'], existing_auth_token.key)
-        # No duplicate user created
-        self.assertEqual(User.objects.filter(email='returning@example.com').count(), 1)
+from django.test import TestCase
+
+# Create your tests here.
```
```diff
@@ -1,8 +1,6 @@
 from django.urls import path
 from .views import *
-from mobile_api.views.user import ScheduleCallView
 from mobile_api.views.reviews import ReviewSubmitView, MobileReviewListView, ReviewHelpfulView, ReviewFlagView
-from mobile_api.views.favorites import ToggleLikeView, MyLikedIdsView, MyLikedEventsView
 from ad_control.views import ConsumerFeaturedEventsView, ConsumerTopEventsView


@@ -15,7 +13,6 @@ urlpatterns = [
     path('user/update-profile/', UpdateProfileView.as_view(), name='update_profile'),
     path('user/bulk-public-info/', BulkUserPublicInfoView.as_view(), name='bulk_public_info'),
     path('user/google-login/', GoogleLoginView.as_view(), name='google_login'),
-    path('leads/schedule-call/', ScheduleCallView.as_view(), name='schedule_call'),
 ]

 # Event URLS
@@ -40,10 +37,3 @@ urlpatterns += [
     path('reviews/helpful', ReviewHelpfulView.as_view()),
     path('reviews/flag', ReviewFlagView.as_view()),
 ]
-
-# Favorites URLs
-urlpatterns += [
-    path('events/like/', ToggleLikeView.as_view()),
-    path('events/my-likes/', MyLikedIdsView.as_view()),
-    path('events/my-liked-events/', MyLikedEventsView.as_view()),
-]
```
```diff
@@ -244,9 +244,9 @@ class EventListAPI(APIView):
         if pincode_qs.count() >= MIN_EVENTS_THRESHOLD:
             qs = pincode_qs

-        # Priority 3: Full-text search on title / name / description
+        # Priority 3: Full-text search on title / description
         if q:
-            qs = qs.filter(Q(title__icontains=q) | Q(name__icontains=q) | Q(description__icontains=q))
+            qs = qs.filter(Q(title__icontains=q) | Q(description__icontains=q))

         if per_type > 0 and page == 1:
             type_ids = list(qs.values_list('event_type_id', flat=True).distinct())
```
@@ -1,146 +0,0 @@
|
|||||||
from django.views import View
|
|
||||||
from django.utils.decorators import method_decorator
|
|
||||||
from django.views.decorators.csrf import csrf_exempt
|
|
||||||
from django.http import JsonResponse
|
|
||||||
from django.core.paginator import Paginator
|
|
||||||
|
|
||||||
from events.models import Event, EventLike, EventImages
|
|
||||||
from mobile_api.utils import validate_token_and_get_user
|
|
||||||
from eventify_logger.services import log
|
|
||||||
|
|
||||||
|
|
||||||
def _serialize_liked_event(event):
|
|
||||||
"""Serialize an Event for the liked-events list."""
|
|
||||||
primary_img = EventImages.objects.filter(
|
|
||||||
event=event, is_primary=True
|
|
||||||
).first()
|
|
||||||
if not primary_img:
|
|
||||||
primary_img = EventImages.objects.filter(event=event).first()
|
|
||||||
|
|
||||||
return {
|
|
||||||
'id': event.id,
|
|
||||||
'title': event.title or event.name,
|
|
||||||
'image': primary_img.event_image.url if primary_img else '',
|
|
||||||
'date': str(event.start_date) if event.start_date else None,
|
|
||||||
'location': event.place or '',
|
|
||||||
'venue': event.venue_name or '',
|
|
||||||
'event_type': event.event_type.event_type if event.event_type else '',
|
|
||||||
'event_status': event.event_status,
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
@method_decorator(csrf_exempt, name='dispatch')
|
|
||||||
class ToggleLikeView(View):
|
|
||||||
"""POST /api/events/like/ — toggle like on/off for an event."""
|
|
||||||
|
|
||||||
def post(self, request):
|
|
||||||
try:
|
|
||||||
user, token, data, error_response = validate_token_and_get_user(request)
|
|
||||||
if error_response:
|
|
||||||
return error_response
|
|
||||||
|
|
||||||
event_id = data.get('event_id')
|
|
||||||
if not event_id:
|
|
||||||
return JsonResponse(
|
|
||||||
{'status': 'error', 'message': 'event_id is required'},
|
|
||||||
status=400
|
|
||||||
)
|
|
||||||
|
|
||||||
try:
|
|
||||||
event = Event.objects.get(pk=event_id)
|
|
||||||
except Event.DoesNotExist:
|
|
||||||
return JsonResponse(
|
|
||||||
{'status': 'error', 'message': 'Event not found'},
|
|
||||||
status=404
|
|
||||||
)
|
|
||||||
|
|
||||||
like, created = EventLike.objects.get_or_create(user=user, event=event)
|
|
||||||
if not created:
|
|
||||||
like.delete()
|
|
||||||
return JsonResponse({'status': 'success', 'liked': False})
|
|
||||||
|
|
||||||
return JsonResponse({'status': 'success', 'liked': True})
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
log("error", "ToggleLikeView exception", request=request,
|
|
||||||
logger_data={"error": str(e)})
|
|
||||||
return JsonResponse(
|
|
||||||
{'status': 'error', 'message': 'An unexpected server error occurred.'},
|
|
||||||
status=500
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
@method_decorator(csrf_exempt, name='dispatch')
|
|
||||||
class MyLikedIdsView(View):
|
|
||||||
"""POST /api/events/my-likes/ — return all liked event IDs for the user."""
|
|
||||||
|
|
||||||
def post(self, request):
|
|
||||||
try:
|
|
||||||
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

            liked_ids = list(
                EventLike.objects.filter(user=user)
                .values_list('event_id', flat=True)
            )
            return JsonResponse({'status': 'success', 'liked_event_ids': liked_ids})

        except Exception as e:
            log("error", "MyLikedIdsView exception", request=request,
                logger_data={"error": str(e)})
            return JsonResponse(
                {'status': 'error', 'message': 'An unexpected server error occurred.'},
                status=500
            )


@method_decorator(csrf_exempt, name='dispatch')
class MyLikedEventsView(View):
    """POST /api/events/my-liked-events/ — paginated liked events with full data."""

    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request)
            if error_response:
                return error_response

            page = int(data.get('page', 1))
            page_size = min(int(data.get('page_size', 20)), 50)

            # Event IDs liked by this user, newest first
            liked_event_ids = list(
                EventLike.objects.filter(user=user)
                .order_by('-created_at')
                .values_list('event_id', flat=True)
            )

            # Preserve ordering from liked_event_ids
            from django.db.models import Case, When, IntegerField
            ordering = Case(
                *[When(pk=pk, then=pos) for pos, pk in enumerate(liked_event_ids)],
                output_field=IntegerField()
            )
            events_qs = Event.objects.filter(id__in=liked_event_ids).order_by(ordering)

            paginator = Paginator(events_qs, page_size)
            page_obj = paginator.get_page(page)

            events_data = [_serialize_liked_event(e) for e in page_obj]

            return JsonResponse({
                'status': 'success',
                'events': events_data,
                'total': paginator.count,
                'page': page,
                'page_size': page_size,
                'has_next': page_obj.has_next(),
            })

        except Exception as e:
            log("error", "MyLikedEventsView exception", request=request,
                logger_data={"error": str(e)})
            return JsonResponse(
                {'status': 'error', 'message': 'An unexpected server error occurred.'},
                status=500
            )
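The `Case(*[When(pk=pk, then=pos) ...])` annotation exists because `filter(id__in=...)` does not preserve the order of the id list. A database-free sketch of the same position-map idea (the dict rows are illustrative stand-ins for `Event` instances, not the real model):

```python
# Pure-Python restatement of what the Case/When annotation achieves: rows come
# back from the database in arbitrary order and are re-sorted to match each
# id's position in liked_event_ids.
liked_event_ids = [42, 7, 19]                  # newest like first
rows = [{'id': 7}, {'id': 19}, {'id': 42}]     # arbitrary DB order

position = {pk: pos for pos, pk in enumerate(liked_event_ids)}
ordered = sorted(rows, key=lambda r: position[r['id']])

print([r['id'] for r in ordered])  # → [42, 7, 19]
```

The SQL version pushes the same position map into an `ORDER BY CASE`, so pagination can happen in the database instead of in Python.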
@@ -29,20 +29,11 @@ def _serialize_review(r, user_interactions=None):
        uname = r.reviewer.username
    except Exception:
        uname = ''
    try:
        pic = r.reviewer.profile_picture
        if pic and pic.name and 'default.png' not in pic.name:
            profile_photo = pic.url
        else:
            profile_photo = ''
    except Exception:
        profile_photo = ''
    return {
        'id': r.id,
        'event_id': r.event_id,
        'username': uname,
        'display_name': display,
        'profile_photo': profile_photo,
        'rating': r.rating,
        'comment': r.review_text,
        'status': _STATUS_TO_JSON.get(r.status, r.status),
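The `default.png` guard makes the serializer emit an empty `profile_photo` for the model's placeholder file, so the client falls back to its DiceBear avatar. A minimal pure-function sketch of the same rule; `Pic` here is a hypothetical stand-in for Django's `ImageFieldFile` (only `name` and `url` matter):

```python
from dataclasses import dataclass

@dataclass
class Pic:
    """Hypothetical stand-in for a Django ImageFieldFile."""
    name: str
    url: str

def resolve_profile_photo(pic):
    # Empty string tells the frontend to render its DiceBear placeholder.
    try:
        if pic and pic.name and 'default.png' not in pic.name:
            return pic.url
        return ''
    except Exception:
        return ''

print(resolve_profile_photo(Pic('avatars/alice.jpg', '/media/avatars/alice.jpg')))
# → /media/avatars/alice.jpg
# resolve_profile_photo(Pic('default.png', '/media/default.png')) and
# resolve_profile_photo(None) both return the empty string.
```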
@@ -1,41 +1,19 @@
# accounts/views.py
import json
import secrets
from django.views.decorators.csrf import csrf_exempt
from django.http import JsonResponse
from django.utils.decorators import method_decorator
from django.views import View
from rest_framework.views import APIView
from rest_framework.authtoken.models import Token
from mobile_api.forms import RegisterForm, LoginForm, WebRegisterForm
from rest_framework.authentication import TokenAuthentication
from django.contrib.auth import logout
from django.db import connection
from mobile_api.utils import validate_token_and_get_user
from utils.errors_json_convertor import simplify_form_errors
from accounts.models import User
from eventify_logger.services import log


def _seed_gamification_profile(user):
    """Insert a gamification profile row for a newly registered user.

    Non-fatal: if the insert fails for any reason, registration still succeeds."""
    try:
        with connection.cursor() as cursor:
            cursor.execute("""
                INSERT INTO user_gamification_profiles (user_id, eventify_id)
                VALUES (%s, %s)
                ON CONFLICT (user_id) DO UPDATE
                SET eventify_id = COALESCE(
                    user_gamification_profiles.eventify_id,
                    EXCLUDED.eventify_id
                )
            """, [user.email, user.eventify_id])
    except Exception as e:
        log("warning", "Failed to seed gamification profile on registration",
            logger_data={"user": user.email, "error": str(e)})


@method_decorator(csrf_exempt, name='dispatch')
class RegisterView(View):
    def post(self, request):
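The `ON CONFLICT ... COALESCE` upsert keeps an existing `eventify_id` and only fills it in when the stored value is NULL, which is what makes the seed safe to re-run. Those semantics can be checked against SQLite (3.24+), which shares the `excluded.` syntax with Postgres; the table layout and values below are illustrative:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("""
    CREATE TABLE user_gamification_profiles (
        user_id TEXT PRIMARY KEY,
        eventify_id TEXT
    )
""")

# Unqualified eventify_id on the right-hand side refers to the existing row;
# the Postgres original spells it table-qualified.
UPSERT = """
    INSERT INTO user_gamification_profiles (user_id, eventify_id)
    VALUES (?, ?)
    ON CONFLICT (user_id) DO UPDATE
    SET eventify_id = COALESCE(eventify_id, excluded.eventify_id)
"""

conn.execute(UPSERT, ('a@example.com', None))      # fresh row, id not yet known
conn.execute(UPSERT, ('a@example.com', 'EV-001'))  # fills the NULL id
conn.execute(UPSERT, ('a@example.com', 'EV-999'))  # existing id is kept

row = conn.execute(
    "SELECT eventify_id FROM user_gamification_profiles WHERE user_id = ?",
    ('a@example.com',),
).fetchone()
print(row[0])  # → EV-001
```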
@@ -44,7 +22,6 @@ class RegisterView(View):
            form = RegisterForm(data)
            if form.is_valid():
                user = form.save()
                _seed_gamification_profile(user)
                token, _ = Token.objects.get_or_create(user=user)
                log("info", "API user registration", request=request, user=user)
                return JsonResponse({'message': 'User registered successfully', 'token': token.key}, status=201)
@@ -72,7 +49,6 @@ class WebRegisterView(View):
            if form.is_valid():
                print('2')
                user = form.save()
                _seed_gamification_profile(user)
                token, _ = Token.objects.get_or_create(user=user)
                print('3')
                log("info", "Web user registration", request=request, user=user)
@@ -156,7 +132,6 @@ class StatusView(View):
                "eventify_id": user.eventify_id or '',
                "district": user.district or '',
                "district_changed_at": user.district_changed_at.isoformat() if user.district_changed_at else None,
                "profile_photo": user.profile_picture.url if user.profile_picture else '',
            })

        except Exception as e:
@@ -388,169 +363,3 @@ class UpdateProfileView(View):
                'success': False,
                'error': 'An unexpected server error occurred. Please try again.'
            }, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class BulkUserPublicInfoView(APIView):
    """Internal endpoint for Node.js gamification server to resolve user details.

    Accepts POST with { emails: [...] } (max 500).
    Returns { users: { email: { district, display_name, eventify_id } } }
    """
    authentication_classes = []
    permission_classes = []

    def post(self, request):
        try:
            json_data = json.loads(request.body)
            emails = json_data.get('emails', [])
            if not emails or not isinstance(emails, list) or len(emails) > 500:
                return JsonResponse({'error': 'Provide 1-500 emails'}, status=400)

            users_qs = User.objects.filter(email__in=emails).values_list(
                'email', 'first_name', 'last_name', 'district', 'eventify_id'
            )
            result = {}
            for email, first, last, district, eid in users_qs:
                name = f"{first} {last}".strip() or email.split('@')[0]
                result[email] = {
                    'display_name': name,
                    'district': district or '',
                    'eventify_id': eid or '',
                }
            return JsonResponse({'users': result})
        except Exception as e:
            log("error", "BulkUserPublicInfoView error", logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class GoogleLoginView(View):
    """Verify a Google ID token, find or create the user, return the same response shape as LoginView."""
    def post(self, request):
        try:
            from google.oauth2 import id_token as google_id_token
            from google.auth.transport import requests as google_requests

            from django.conf import settings

            data = json.loads(request.body)
            token = data.get('id_token')
            if not token:
                return JsonResponse({'error': 'id_token is required'}, status=400)

            if not settings.GOOGLE_CLIENT_ID:
                log("error", "GOOGLE_CLIENT_ID not configured", request=request)
                return JsonResponse({'error': 'Google login temporarily unavailable'}, status=503)

            idinfo = google_id_token.verify_oauth2_token(
                token,
                google_requests.Request(),
                settings.GOOGLE_CLIENT_ID,
            )
            email = idinfo.get('email')
            if not email:
                return JsonResponse({'error': 'Email not found in Google token'}, status=400)

            user, created = User.objects.get_or_create(
                email=email,
                defaults={
                    'username': email,
                    'first_name': idinfo.get('given_name', ''),
                    'last_name': idinfo.get('family_name', ''),
                    'role': 'customer',
                },
            )
            if created:
                user.set_password(secrets.token_urlsafe(32))
                user.save()
                log("info", "Google OAuth new user created", request=request, user=user)

            auth_token, _ = Token.objects.get_or_create(user=user)
            log("info", "Google OAuth login", request=request, user=user)

            return JsonResponse({
                'message': 'Login successful',
                'token': auth_token.key,
                'eventify_id': user.eventify_id or '',
                'username': user.username,
                'email': user.email,
                'phone_number': user.phone_number or '',
                'first_name': user.first_name,
                'last_name': user.last_name,
                'role': user.role,
                'pincode': user.pincode or '',
                'district': user.district or '',
                'district_changed_at': user.district_changed_at.isoformat() if user.district_changed_at else None,
                'state': user.state or '',
                'country': user.country or '',
                'place': user.place or '',
                'latitude': user.latitude or '',
                'longitude': user.longitude or '',
                'profile_photo': user.profile_picture.url if user.profile_picture else '',
            }, status=200)
        except ValueError as e:
            log("warning", "Google OAuth invalid token", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'Invalid Google token'}, status=401)
        except Exception as e:
            log("error", "Google OAuth exception", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class ScheduleCallView(View):
    """Public endpoint for the 'Schedule a Call' form on the consumer app."""

    def post(self, request):
        from admin_api.models import Lead
        try:
            data = json.loads(request.body)
            name = (data.get('name') or '').strip()
            email = (data.get('email') or '').strip()
            phone = (data.get('phone') or '').strip()
            event_type = (data.get('eventType') or '').strip()
            message = (data.get('message') or '').strip()

            errors = {}
            if not name:
                errors['name'] = ['This field is required.']
            if not email:
                errors['email'] = ['This field is required.']
            if not phone:
                errors['phone'] = ['This field is required.']
            valid_event_types = [c[0] for c in Lead.EVENT_TYPE_CHOICES]
            if not event_type or event_type not in valid_event_types:
                errors['eventType'] = [f'Must be one of: {", ".join(valid_event_types)}']

            if errors:
                return JsonResponse({'errors': errors}, status=400)

            # Auto-link to a consumer account if one exists with this email
            from django.contrib.auth import get_user_model
            _User = get_user_model()
            try:
                consumer_account = _User.objects.get(email=email)
            except _User.DoesNotExist:
                consumer_account = None

            lead = Lead.objects.create(
                name=name,
                email=email,
                phone=phone,
                event_type=event_type,
                message=message,
                status='new',
                source='schedule_call',
                priority='medium',
                user_account=consumer_account,
            )
            log("info", f"New schedule-call lead #{lead.pk} from {email}", request=request)
            return JsonResponse({
                'status': 'success',
                'message': 'Your request has been submitted. Our team will get back to you soon.',
                'lead_id': lead.pk,
            }, status=201)
        except json.JSONDecodeError:
            return JsonResponse({'error': 'Invalid JSON body.'}, status=400)
        except Exception as e:
            log("error", "Schedule call exception", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)
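`ScheduleCallView` accumulates every field error into one dict before returning 400, so the client can highlight all invalid fields in a single round trip instead of failing on the first one. A stand-alone sketch of that accumulate-then-reject pattern (the event-type choices below are illustrative, not the real `Lead.EVENT_TYPE_CHOICES`):

```python
def validate_schedule_call(data, valid_event_types=('wedding', 'birthday', 'corporate')):
    """Collect every validation error instead of stopping at the first."""
    errors = {}
    for field in ('name', 'email', 'phone'):
        if not (data.get(field) or '').strip():
            errors[field] = ['This field is required.']
    event_type = (data.get('eventType') or '').strip()
    if event_type not in valid_event_types:
        errors['eventType'] = [f'Must be one of: {", ".join(valid_event_types)}']
    return errors

print(validate_schedule_call({'name': 'Asha', 'eventType': 'wedding'}))
# → {'email': ['This field is required.'], 'phone': ['This field is required.']}
```

An empty dict means the payload is acceptable and the lead can be created.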
@@ -1,10 +0,0 @@
from django.contrib import admin
from .models import Notification


@admin.register(Notification)
class NotificationAdmin(admin.ModelAdmin):
    list_display = ('title', 'user', 'notification_type', 'is_read', 'created_at')
    list_filter = ('notification_type', 'is_read', 'created_at')
    search_fields = ('title', 'message', 'user__email')
    readonly_fields = ('created_at',)
@@ -1,6 +0,0 @@
from django.apps import AppConfig


class NotificationsConfig(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'notifications'
@@ -1,181 +0,0 @@
"""HTML email builders for scheduled admin notifications.

Each builder is registered in ``BUILDERS`` keyed by ``NotificationSchedule.notification_type``
and returns ``(subject, html_body)``. Add new types by appending to the registry
and extending ``NotificationSchedule.TYPE_CHOICES``.

Week bounds for ``events_expiring_this_week`` are computed in Asia/Kolkata so the
"this week" semantics match the operations team's wall-clock week regardless of
``settings.TIME_ZONE`` (currently UTC).
"""
from datetime import date, datetime, timedelta
from html import escape

try:
    from zoneinfo import ZoneInfo
except ImportError:  # pragma: no cover — fallback for py<3.9
    from backports.zoneinfo import ZoneInfo  # type: ignore

from django.conf import settings
from django.core.mail import EmailMessage
from django.db.models import Q

from eventify_logger.services import log


IST = ZoneInfo('Asia/Kolkata')


def _today_in_ist() -> date:
    return datetime.now(IST).date()


def _upcoming_week_bounds(today: date) -> tuple[date, date]:
    """Return (next Monday, next Sunday) inclusive.

    If today is Monday the result is *this week* (today..Sunday).
    If today is any other weekday the result is *next week* (next Monday..next Sunday).
    Mon=0 per Python ``weekday()``.
    """
    days_until_monday = (7 - today.weekday()) % 7
    monday = today + timedelta(days=days_until_monday)
    sunday = monday + timedelta(days=6)
    return monday, sunday

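The Monday arithmetic in `_upcoming_week_bounds` can be verified with plain `datetime.date` values, no Django required; the dates below are arbitrary examples:

```python
from datetime import date, timedelta

def upcoming_week_bounds(today):
    """Re-derivation of _upcoming_week_bounds with plain dates."""
    days_until_monday = (7 - today.weekday()) % 7  # Mon=0, so Monday maps to 0
    monday = today + timedelta(days=days_until_monday)
    return monday, monday + timedelta(days=6)

# Wednesday 22 Apr 2026 rolls forward to next week: Mon 27 Apr .. Sun 3 May
assert upcoming_week_bounds(date(2026, 4, 22)) == (date(2026, 4, 27), date(2026, 5, 3))
# Monday 27 Apr 2026 stays in its own week: Mon 27 Apr .. Sun 3 May
assert upcoming_week_bounds(date(2026, 4, 27)) == (date(2026, 4, 27), date(2026, 5, 3))
```

The `% 7` is what collapses "7 days until Monday" to 0 when today already is Monday.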
def _build_events_expiring_this_week(schedule) -> tuple[str, str]:
    from events.models import Event

    today = _today_in_ist()
    monday, sunday = _upcoming_week_bounds(today)

    qs = (
        Event.objects
        .select_related('partner', 'event_type')
        .filter(event_status='published')
        .filter(
            Q(end_date__isnull=False, end_date__gte=monday, end_date__lte=sunday)
            | Q(end_date__isnull=True, start_date__gte=monday, start_date__lte=sunday)
        )
        .order_by('end_date', 'start_date', 'name')
    )

    events = list(qs)
    rows_html = ''
    for e in events:
        end = e.end_date or e.start_date
        title = e.title or e.name or '(untitled)'
        partner_name = ''
        if e.partner_id:
            try:
                partner_name = e.partner.name or ''
            except Exception:
                partner_name = ''
        category = ''
        if e.event_type_id and e.event_type:
            category = getattr(e.event_type, 'event_type', '') or ''
        rows_html += (
            '<tr>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(title)}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(partner_name or "—")}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(category or "—")}</td>'
            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">'
            f'{end.strftime("%a %d %b %Y") if end else "—"}</td>'
            '</tr>'
        )

    if not events:
        rows_html = (
            '<tr><td colspan="4" style="padding:24px;text-align:center;color:#888;">'
            'No published events are expiring next week.'
            '</td></tr>'
        )

    subject = (
        f'[Eventify] {len(events)} event(s) expiring '
        f'{monday.strftime("%d %b")}–{sunday.strftime("%d %b")}'
    )

    html = f"""<!doctype html>
<html><body style="margin:0;padding:0;background:#f5f5f5;font-family:Arial,Helvetica,sans-serif;color:#1a1a1a;">
<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="background:#f5f5f5;">
<tr><td align="center" style="padding:24px 12px;">
<table role="presentation" width="640" cellspacing="0" cellpadding="0" style="max-width:640px;background:#ffffff;border-radius:10px;overflow:hidden;box-shadow:0 2px 6px rgba(15,69,207,0.08);">
<tr><td style="background:#0F45CF;color:#ffffff;padding:24px 28px;">
<h2 style="margin:0;font-size:20px;">Events expiring next week</h2>
<p style="margin:6px 0 0;color:#d2dcff;font-size:14px;">
{monday.strftime("%A %d %b %Y")} → {sunday.strftime("%A %d %b %Y")}
· {len(events)} event(s)
</p>
</td></tr>
<tr><td style="padding:20px 24px;">
<p style="margin:0 0 12px;font-size:14px;color:#444;">
Scheduled notification: <strong>{escape(schedule.name)}</strong>
</p>
<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="border-collapse:collapse;font-size:14px;">
<thead>
<tr style="background:#f0f4ff;color:#0F45CF;">
<th align="left" style="padding:10px 12px;">Title</th>
<th align="left" style="padding:10px 12px;">Partner</th>
<th align="left" style="padding:10px 12px;">Category</th>
<th align="left" style="padding:10px 12px;">End date</th>
</tr>
</thead>
<tbody>{rows_html}</tbody>
</table>
</td></tr>
<tr><td style="padding:16px 24px 24px;color:#888;font-size:12px;">
Sent automatically by Eventify Command Center.
To change recipients or the schedule, open
<a href="https://admin.eventifyplus.com/settings" style="color:#0F45CF;">admin.eventifyplus.com › Settings › Notifications</a>.
</td></tr>
</table>
</td></tr>
</table>
</body></html>"""

    return subject, html


BUILDERS: dict = {
    'events_expiring_this_week': _build_events_expiring_this_week,
}


def render_and_send(schedule) -> int:
    """Render the email for ``schedule`` and deliver it to active recipients.

    Returns the number of recipients the message was sent to. Raises on SMTP
    failure so the management command can mark the schedule as errored.
    """
    builder = BUILDERS.get(schedule.notification_type)
    if builder is None:
        raise ValueError(f'No builder for notification type: {schedule.notification_type}')

    subject, html = builder(schedule)
    recipients = list(
        schedule.recipients.filter(is_active=True).values_list('email', flat=True)
    )
    if not recipients:
        log('warning', 'notification schedule has no active recipients', logger_data={
            'schedule_id': schedule.id,
            'schedule_name': schedule.name,
        })
        return 0

    msg = EmailMessage(
        subject=subject,
        body=html,
        from_email=settings.DEFAULT_FROM_EMAIL,
        to=recipients,
    )
    msg.content_subtype = 'html'
    msg.send(fail_silently=False)

    log('info', 'notification email sent', logger_data={
        'schedule_id': schedule.id,
        'schedule_name': schedule.name,
        'type': schedule.notification_type,
        'recipient_count': len(recipients),
    })
    return len(recipients)
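Every user-controlled field in the row loop goes through `html.escape` before being interpolated into the table, so an event titled `<b>Sale</b>` cannot inject markup into the admin email. A hypothetical trimmed-down version of that row builder:

```python
from html import escape

def row_html(title, partner_name):
    """Two-column sketch of the escaped row construction; the real builder
    emits four columns with the same inline style."""
    td = 'padding:10px 12px;border-bottom:1px solid #eee;'
    return (
        '<tr>'
        f'<td style="{td}">{escape(title)}</td>'
        f'<td style="{td}">{escape(partner_name or "—")}</td>'
        '</tr>'
    )

# "&" becomes "&amp;", "<b>" becomes "&lt;b&gt;", a missing partner shows "—"
print(row_html('Rock & Roll <b>Night</b>', None))
```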
@@ -1,153 +0,0 @@
"""Dispatch due ``NotificationSchedule`` jobs.

Host cron invokes this every ~15 minutes via ``docker exec``. The command
walks all active schedules, evaluates their cron expression against
``last_run_at`` using ``croniter``, and fires any that are due. A row-level
``select_for_update(skip_locked=True)`` prevents duplicate sends if two cron
ticks race or the container is restarted mid-run.

Evaluation timezone is **Asia/Kolkata** to match
``notifications/emails.py::_upcoming_week_bounds`` — the same wall-clock week
used in the outgoing email body.

Flags:
  --schedule-id <id>   Fire exactly one schedule, ignoring cron check.
  --dry-run            Resolve due schedules + render emails, send nothing.
"""
from datetime import datetime, timedelta

try:
    from zoneinfo import ZoneInfo
except ImportError:  # pragma: no cover — py<3.9
    from backports.zoneinfo import ZoneInfo  # type: ignore

from croniter import croniter
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from django.utils import timezone

from eventify_logger.services import log
from notifications.emails import BUILDERS, render_and_send
from notifications.models import NotificationSchedule


IST = ZoneInfo('Asia/Kolkata')


def _is_due(schedule: NotificationSchedule, now_ist: datetime) -> bool:
    """Return True if ``schedule`` should fire at ``now_ist``.

    ``croniter`` is seeded with ``last_run_at`` (or one year ago for a fresh
    schedule) and asked for the next fire time. If that time has already
    passed relative to ``now_ist`` the schedule is due.
    """
    if not croniter.is_valid(schedule.cron_expression):
        return False

    if schedule.last_run_at is not None:
        seed = schedule.last_run_at.astimezone(IST)
    else:
        seed = now_ist - timedelta(days=365)

    itr = croniter(schedule.cron_expression, seed)
    next_fire = itr.get_next(datetime)
    return next_fire <= now_ist

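The due-check reduces to: seed an iterator at the last run (or one year back for a never-run schedule), take the first fire time after the seed, and fire if that time has passed. A self-contained sketch of that logic with a fixed weekly interval standing in for the croniter cron expression, so no third-party package is needed:

```python
from datetime import datetime, timedelta

def is_due(last_run_at, now, interval=timedelta(days=7)):
    """Simplified stand-in for the croniter check. The cron expression is
    replaced by a fixed weekly interval; the real command seeds a fresh
    schedule one year back, which is why a never-run schedule fires on the
    first tick."""
    seed = last_run_at if last_run_at is not None else now - timedelta(days=365)
    next_fire = seed + interval
    return next_fire <= now

now = datetime(2026, 4, 22, 9, 0)
print(is_due(None, now))                       # → True  (fresh schedule)
print(is_due(datetime(2026, 4, 20, 9), now))   # → False (ran 2 days ago)
print(is_due(datetime(2026, 4, 13, 9), now))   # → True  (ran 9 days ago)
```

Because `last_run_at` is only advanced after a successful send, a failed dispatch stays due and is retried on the next cron tick.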
class Command(BaseCommand):
    help = 'Dispatch due NotificationSchedule email jobs.'

    def add_arguments(self, parser):
        parser.add_argument(
            '--schedule-id', type=int, default=None,
            help='Force-run a single schedule by ID, ignoring cron check.',
        )
        parser.add_argument(
            '--dry-run', action='store_true',
            help='Render and log but do not send or persist last_run_at.',
        )

    def handle(self, *args, **opts):
        schedule_id = opts.get('schedule_id')
        dry_run = opts.get('dry_run', False)

        now_ist = datetime.now(IST)
        qs = NotificationSchedule.objects.filter(is_active=True)
        if schedule_id is not None:
            qs = qs.filter(id=schedule_id)

        candidate_ids = list(qs.values_list('id', flat=True))
        if not candidate_ids:
            self.stdout.write('No active schedules to evaluate.')
            return

        fired = 0
        skipped = 0
        errored = 0

        for sid in candidate_ids:
            with transaction.atomic():
                locked_qs = (
                    NotificationSchedule.objects
                    .select_for_update(skip_locked=True)
                    .filter(id=sid, is_active=True)
                )
                schedule = locked_qs.first()
                if schedule is None:
                    skipped += 1
                    continue

                forced = schedule_id is not None
                if not forced and not _is_due(schedule, now_ist):
                    skipped += 1
                    continue

                if schedule.notification_type not in BUILDERS:
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = (
                        f'No builder registered for {schedule.notification_type!r}'
                    )
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                if dry_run:
                    self.stdout.write(
                        f'[dry-run] would fire schedule {schedule.id} '
                        f'({schedule.name}) type={schedule.notification_type}'
                    )
                    fired += 1
                    continue

                try:
                    recipient_count = render_and_send(schedule)
                except Exception as exc:  # noqa: BLE001 — wide catch, store msg
                    log('error', 'notification dispatch failed', logger_data={
                        'schedule_id': schedule.id,
                        'schedule_name': schedule.name,
                        'error': str(exc),
                    })
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = str(exc)[:2000]
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                schedule.last_run_at = timezone.now()
                schedule.last_status = NotificationSchedule.STATUS_SUCCESS
                schedule.last_error = ''
                schedule.save(update_fields=[
                    'last_run_at', 'last_status', 'last_error', 'updated_at',
                ])
                fired += 1
                self.stdout.write(
                    f'Fired schedule {schedule.id} ({schedule.name}) '
                    f'→ {recipient_count} recipient(s)'
                )

        summary = f'Done. fired={fired} skipped={skipped} errored={errored}'
        self.stdout.write(summary)
        log('info', 'send_scheduled_notifications complete', logger_data={
            'fired': fired, 'skipped': skipped, 'errored': errored,
            'dry_run': dry_run, 'forced_id': schedule_id,
        })
@@ -1,93 +0,0 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Notification',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=255)),
                ('message', models.TextField()),
                ('notification_type', models.CharField(
                    choices=[
                        ('event', 'Event'),
                        ('promo', 'Promotion'),
                        ('system', 'System'),
                        ('booking', 'Booking'),
                    ],
                    default='system', max_length=20,
                )),
                ('is_read', models.BooleanField(default=False)),
                ('action_url', models.URLField(blank=True, null=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('user', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='notifications',
                    to=settings.AUTH_USER_MODEL,
                )),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.CreateModel(
            name='NotificationSchedule',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('notification_type', models.CharField(
                    choices=[('events_expiring_this_week', 'Events Expiring This Week')],
                    db_index=True, max_length=64,
                )),
                ('cron_expression', models.CharField(
                    default='0 0 * * 1',
                    help_text=(
                        'Standard 5-field cron (minute hour dom month dow). '
                        'Evaluated in Asia/Kolkata.'
                    ),
                    max_length=100,
                )),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('last_run_at', models.DateTimeField(blank=True, null=True)),
                ('last_status', models.CharField(blank=True, default='', max_length=20)),
                ('last_error', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.AddIndex(
            model_name='notificationschedule',
            index=models.Index(
                fields=['is_active', 'notification_type'],
                name='notificatio_is_acti_26dfb5_idx',
            ),
        ),
        migrations.CreateModel(
            name='NotificationRecipient',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('email', models.EmailField(max_length=254)),
                ('display_name', models.CharField(blank=True, default='', max_length=200)),
                ('is_active', models.BooleanField(default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('schedule', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='recipients',
                    to='notifications.notificationschedule',
                )),
            ],
            options={
                'ordering': ['display_name', 'email'],
                'unique_together': {('schedule', 'email')},
            },
        ),
    ]
@@ -1,102 +0,0 @@
"""
Two distinct concerns live in this app:

1. ``Notification`` — consumer-facing in-app inbox entries surfaced on the mobile
   SPA (/api/notifications/list/). One row per user per alert.

2. ``NotificationSchedule`` + ``NotificationRecipient`` — admin-side recurring
   email jobs configured from the Command Center Settings tab and dispatched by
   the ``send_scheduled_notifications`` management command (host cron).
   Not user-facing; strictly operational.
"""
from django.db import models

from accounts.models import User


class Notification(models.Model):
    NOTIFICATION_TYPES = [
        ('event', 'Event'),
        ('promo', 'Promotion'),
        ('system', 'System'),
        ('booking', 'Booking'),
    ]

    user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='notifications')
    title = models.CharField(max_length=255)
    message = models.TextField()
    notification_type = models.CharField(max_length=20, choices=NOTIFICATION_TYPES, default='system')
    is_read = models.BooleanField(default=False)
    action_url = models.URLField(blank=True, null=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        ordering = ['-created_at']

    def __str__(self):
        return f"{self.notification_type}: {self.title} → {self.user.email}"


class NotificationSchedule(models.Model):
    """One configurable recurring email job.

    New types are added by registering a builder in ``notifications/emails.py``
    and adding the slug to ``TYPE_CHOICES`` below. Cron expression is evaluated
    in ``Asia/Kolkata`` by the dispatcher (matches operations team timezone).
    """

    TYPE_EVENTS_EXPIRING_THIS_WEEK = 'events_expiring_this_week'

    TYPE_CHOICES = [
        (TYPE_EVENTS_EXPIRING_THIS_WEEK, 'Events Expiring This Week'),
    ]

    STATUS_SUCCESS = 'success'
    STATUS_ERROR = 'error'

    name = models.CharField(max_length=200)
    notification_type = models.CharField(
        max_length=64, choices=TYPE_CHOICES, db_index=True,
    )
    cron_expression = models.CharField(
        max_length=100, default='0 0 * * 1',
        help_text='Standard 5-field cron (minute hour dom month dow). '
                  'Evaluated in Asia/Kolkata.',
    )
    is_active = models.BooleanField(default=True, db_index=True)
    last_run_at = models.DateTimeField(null=True, blank=True)
    last_status = models.CharField(max_length=20, blank=True, default='')
    last_error = models.TextField(blank=True, default='')
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ['-created_at']
        indexes = [models.Index(fields=['is_active', 'notification_type'])]

    def __str__(self):
        return f'{self.name} ({self.notification_type})'


class NotificationRecipient(models.Model):
    """Free-form recipient — not tied to a User row so external stakeholders
    (vendors, partners, sponsors) can receive notifications without needing
    platform accounts."""

    schedule = models.ForeignKey(
        NotificationSchedule,
        on_delete=models.CASCADE,
        related_name='recipients',
    )
    email = models.EmailField()
    display_name = models.CharField(max_length=200, blank=True, default='')
    is_active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        unique_together = [('schedule', 'email')]
        ordering = ['display_name', 'email']

    def __str__(self):
        label = self.display_name or self.email
        return f'{label} ({self.schedule.name})'
@@ -1,8 +0,0 @@
from django.urls import path
from .views import NotificationListView, NotificationMarkReadView, NotificationCountView

urlpatterns = [
    path('list/', NotificationListView.as_view(), name='notification_list'),
    path('mark-read/', NotificationMarkReadView.as_view(), name='notification_mark_read'),
    path('count/', NotificationCountView.as_view(), name='notification_count'),
]
@@ -1,85 +0,0 @@
import json
from django.views.decorators.csrf import csrf_exempt
from django.http import JsonResponse
from django.utils.decorators import method_decorator
from django.views import View
from mobile_api.utils import validate_token_and_get_user
from eventify_logger.services import log
from .models import Notification


@method_decorator(csrf_exempt, name='dispatch')
class NotificationListView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            page = int(data.get('page', 1))
            page_size = int(data.get('page_size', 20))
            offset = (page - 1) * page_size

            notifications = Notification.objects.filter(user=user)[offset:offset + page_size]
            total = Notification.objects.filter(user=user).count()

            items = [{
                'id': n.id,
                'title': n.title,
                'message': n.message,
                'notification_type': n.notification_type,
                'is_read': n.is_read,
                'action_url': n.action_url or '',
                'created_at': n.created_at.isoformat(),
            } for n in notifications]

            return JsonResponse({
                'status': 'success',
                'notifications': items,
                'total': total,
                'page': page,
                'page_size': page_size,
            })
        except Exception as e:
            log("error", "NotificationListView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class NotificationMarkReadView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            mark_all = data.get('mark_all', False)
            notification_id = data.get('notification_id')

            if mark_all:
                Notification.objects.filter(user=user, is_read=False).update(is_read=True)
                return JsonResponse({'status': 'success', 'message': 'All notifications marked as read'})

            if notification_id:
                Notification.objects.filter(id=notification_id, user=user).update(is_read=True)
                return JsonResponse({'status': 'success', 'message': 'Notification marked as read'})

            return JsonResponse({'error': 'Provide notification_id or mark_all=true'}, status=400)
        except Exception as e:
            log("error", "NotificationMarkReadView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)


@method_decorator(csrf_exempt, name='dispatch')
class NotificationCountView(View):
    def post(self, request):
        try:
            user, token, data, error_response = validate_token_and_get_user(request, error_status_code=True)
            if error_response:
                return error_response

            count = Notification.objects.filter(user=user, is_read=False).count()
            return JsonResponse({'status': 'success', 'unread_count': count})
        except Exception as e:
            log("error", "NotificationCountView error", request=request, logger_data={"error": str(e)})
            return JsonResponse({'error': 'An unexpected server error occurred.'}, status=500)
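All three views above accept a JSON POST body and answer with JSON, so a client call can be sketched with the stdlib alone. This is a hypothetical illustration: the base URL is a placeholder, and the assumption that the auth token travels in the body's `token` key is mine — the `validate_token_and_get_user` contract is not shown in this diff.

```python
import json
from urllib import request

# Placeholder host; substitute the real deployment URL.
BASE = "https://example.com/api/notifications"


def build_request(path: str, payload: dict) -> request.Request:
    """Assemble the POST shape the views expect: a JSON body and no CSRF
    token (the views are csrf_exempt)."""
    return request.Request(
        BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example usage (network call elided; 'token' key is an assumption):
#   req = build_request("/list/", {"token": "<token>", "page": 1, "page_size": 20})
#   with request.urlopen(req) as resp:
#       body = json.load(resp)  # e.g. {'status': 'success', 'notifications': [...], ...}
```

Note the list endpoint paginates via `offset = (page - 1) * page_size`, so `page` is 1-based.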
@@ -7,7 +7,3 @@ gunicorn==21.2.0
 django-extensions==3.2.3
 psycopg2-binary==2.9.9
 djangorestframework-simplejwt==5.3.1
-google-auth>=2.0.0
-requests>=2.28.0
-qrcode[pil]>=7.4.2
-croniter>=2.0.0
@@ -1,7 +1,3 @@
 Django>=4.2
 Pillow
 django-summernote
-google-auth>=2.0.0
-requests>=2.31.0
-qrcode[pil]>=7.4.2
-croniter>=2.0.0