Compare commits

3 commits: `170208d3e5...2c60a82704`

| Author | SHA1 | Date |
|---|---|---|
| | 2c60a82704 | |
| | 9cde886bd4 | |
| | a8751b5183 | |

CHANGELOG.md (+63)
@@ -5,6 +5,69 @@ Format follows [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), version

---

## [1.12.0] — 2026-04-21
### Added
- **Audit coverage for four moderation endpoints** — every admin state change now leaves a matching row in `AuditLog`, written in the same `transaction.atomic()` block as the state change so the log can never disagree with the database:
  - `UserStatusView` (`PATCH /api/v1/users/<id>/status/`) — `user.suspended`, `user.banned`, `user.reinstated`, `user.flagged`; details capture `reason`, `previous_status`, `new_status`
  - `EventModerationView` (`PATCH /api/v1/events/<id>/moderate/`) — `event.approved`, `event.rejected`, `event.flagged`, `event.featured`, `event.unfeatured`; details include `reason`, `partner_id`, `previous_status`/`new_status`, `previous_is_featured`/`new_is_featured`
  - `ReviewModerationView` (`PATCH /api/v1/reviews/<id>/moderate/`) — `review.approved`, `review.rejected`, `review.edited`; details include `reject_reason`, `edited_text` flag, `original_text` on edits
  - `PartnerKYCReviewView` (`POST /api/v1/partners/<id>/kyc/review/`) — `partner.kyc.approved`, `partner.kyc.rejected`, `partner.kyc.requested_info` (new `requested_info` decision leaves compliance state intact and only records the info request)
- **`GET /api/v1/rbac/audit-log/metrics/`** — `AuditLogMetricsView` returns `total`, `today`, `week`, `distinct_users`, and a `by_action_group` breakdown (`create`/`update`/`delete`/`moderate`/`auth`/`other`). Cached 60 s under key `admin_api:audit_log:metrics:v1`; pass `?nocache=1` to bypass (useful from the Django shell during incident response)
- **`GET /api/v1/rbac/audit-log/`** — free-text `search` parameter (Q-filter over `action`, `target_type`, `target_id`, `user__username`, `user__email`); `page_size` now bounded to `[1, 200]` with defensive fallback to defaults on non-integer input
- **`accounts.User.ALL_MODULES`** — appended `audit-log`; `StaffProfile.get_allowed_modules()` adds `'audit'` → `'audit-log'` to `SCOPE_TO_MODULE` so scope-based staff resolve the module correctly
- **`admin_api/migrations/0005_auditlog_indexes.py`** — composite indexes `(action, -created_at)` and `(target_type, target_id)` on `AuditLog` to keep the /audit-log page fast past ~10k rows; reversible via Django's default `RemoveIndex` reverse op
- **`admin_api/tests.py`** — `AuditLogListViewTests`, `AuditLogMetricsViewTests`, `UserStatusAuditEmissionTests` covering list shape, search, pagination bounds, metrics shape + `nocache`, and audit emission on suspend / ban / reinstate
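The bounded pagination described for `GET /api/v1/rbac/audit-log/` can be sketched as a small pure function. This is an illustration of the clamping and fallback rules stated above (clamp to `[1, 200]`, fall back to page 1 / size 50 on non-integer input), not the view's actual code:

```python
def parse_pagination(params, default_size=50, max_size=200):
    """Return (page, page_size), falling back to defaults on bad input."""
    try:
        page = max(1, int(params.get('page', 1)))
        page_size = int(params.get('page_size', default_size))
    except (TypeError, ValueError):
        # Defensive fallback: any non-integer value resets both to defaults.
        return 1, default_size
    # Clamp page_size into [1, max_size] rather than returning an error.
    return page, min(max_size, max(1, page_size))
```

This matches the behavior the tests pin down: `page_size=999` clamps to 200, and `page=x&page_size=y` falls back to `(1, 50)`.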
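The `by_action_group` breakdown returned by the metrics endpoint implies a mapping from dotted action slugs like `user.suspended` to one of six groups. A plausible sketch of that classification, using the verb at the end of the slug. The real grouping rules live in `AuditLogMetricsView` and may differ; everything here is illustrative:

```python
# Hypothetical classifier for the six `by_action_group` buckets.
MODERATION_VERBS = {
    'approved', 'rejected', 'flagged', 'featured', 'unfeatured',
    'suspended', 'banned', 'reinstated', 'edited', 'requested_info',
}

def action_group(action: str) -> str:
    """Map an action slug (e.g. 'event.approved') to its metrics bucket."""
    verb = action.rsplit('.', 1)[-1]
    if verb == 'created':
        return 'create'
    if verb in ('updated', 'changed'):
        return 'update'
    if verb == 'deleted':
        return 'delete'
    if verb in MODERATION_VERBS:
        return 'moderate'
    if action.startswith('auth.'):
        return 'auth'
    return 'other'
```

Under this sketch `event.approved` lands in `moderate` and `department.created` in `create`, which is consistent with the expectations in `AuditLogMetricsViewTests`.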
### Deploy notes
Admin users created before this release won't have `audit-log` in their `allowed_modules` TextField. Backfill with:

```python
# Django shell
from accounts.models import User

for u in User.objects.filter(role__in=['admin', 'manager']):
    mods = [m.strip() for m in (u.allowed_modules or '').split(',') if m.strip()]
    if 'audit-log' not in mods and mods:  # only touch users with explicit lists
        u.allowed_modules = ','.join(mods + ['audit-log'])
        u.save(update_fields=['allowed_modules'])
```

Users on the implicit full-access list (empty `allowed_modules` + admin role) pick up the new module automatically via `get_allowed_modules()`.
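The merge step of the backfill can be factored into a pure helper, which makes it easy to confirm the operation is idempotent and skips the implicit full-access case. This is an illustration only; the shell loop above is the actual procedure:

```python
def add_module(allowed_modules: str, module: str = 'audit-log') -> str:
    """Append `module` to a comma-separated module list.

    Empty lists (implicit full access) and lists that already contain the
    module are returned unchanged, so re-running the backfill is safe.
    """
    mods = [m.strip() for m in (allowed_modules or '').split(',') if m.strip()]
    if not mods or module in mods:
        return allowed_modules or ''
    return ','.join(mods + [module])
```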

---

## [1.11.0] — 2026-04-12
### Added
- **Worldline Connect payment integration** (`banking_operations/worldline/`)
  - `client.py` — `WorldlineClient`: HMAC-SHA256 signed requests, `create_hosted_checkout()`, `get_hosted_checkout_status()`, `verify_webhook_signature()`
  - `views.py` — `POST /api/payments/webhook/` (CSRF-exempt, signature-verified Worldline server callback) + `POST /api/payments/verify/` (frontend polls on return URL)
  - `emails.py` — HTML ticket confirmation email with per-ticket QR codes embedded as base64 inline images
  - `WorldlineOrder` model in `banking_operations/models.py` — tracks each hosted-checkout session (`hosted_checkout_id`, `reference_id`, `status`, `raw_response`, `webhook_payload`)
- **`Booking.payment_status`** field — `pending / paid / failed / cancelled` (default `pending`); migration `bookings/0002_booking_payment_status`
- **`banking_operations/services.py::transaction_initiate`** — implemented (was a stub); calls Worldline API, creates `WorldlineOrder`, returns `payment_url` back to `CheckoutAPI`
- **Settings**: `WORLDLINE_MERCHANT_ID`, `WORLDLINE_API_KEY_ID`, `WORLDLINE_API_SECRET_KEY`, `WORLDLINE_WEBHOOK_SECRET_KEY`, `WORLDLINE_API_ENDPOINT` (default: sandbox), `WORLDLINE_RETURN_URL`
- **Requirements**: `requests>=2.31.0`, `qrcode[pil]>=7.4.2`
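`client.py` signs outgoing requests with HMAC-SHA256. The pattern can be illustrated roughly as follows; note that the exact string-to-sign (method, content type, date, path) and the header format are dictated by Worldline's API and are assumptions here, not a reproduction of the real scheme:

```python
import base64
import hashlib
import hmac

def sign_request(secret_key: str, string_to_sign: str) -> str:
    """HMAC-SHA256 over a canonical request string, base64-encoded.

    The composition of `string_to_sign` is Worldline-specific and is
    treated as an opaque input in this sketch.
    """
    digest = hmac.new(
        secret_key.encode('utf-8'),
        string_to_sign.encode('utf-8'),
        hashlib.sha256,
    ).digest()
    return base64.b64encode(digest).decode('ascii')
```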
### Flow

1. User adds tickets to cart → `POST /api/bookings/checkout/` creates Bookings + calls `transaction_initiate`
2. `transaction_initiate` creates `WorldlineOrder` + calls Worldline → returns redirect URL
3. Frontend redirects user to Worldline hosted checkout page
4. After payment, Worldline redirects to `WORLDLINE_RETURN_URL` (`app.eventifyplus.com/booking/confirm?hostedCheckoutId=...`)
5. SPA calls `POST /api/payments/verify/` — checks local status; if still pending, polls Worldline API directly
6. Worldline webhook fires `POST /api/payments/webhook/` → generates Tickets (one per quantity), marks Booking `paid`, sends confirmation email with QR codes
7. Partner scans QR code at event → existing `POST /api/bookings/check-in/` marks `Ticket.is_checked_in=True`
### Deploy requirement
Set in Django container `.env`:

```
WORLDLINE_MERCHANT_ID=...
WORLDLINE_API_KEY_ID=...
WORLDLINE_API_SECRET_KEY=...
WORLDLINE_WEBHOOK_SECRET_KEY=...
```

`WORLDLINE_API_ENDPOINT` defaults to sandbox — set to production URL when going live.

---

## [1.10.0] — 2026-04-10

### Security

@@ -68,10 +68,10 @@ class User(AbstractUser):
         help_text='Comma-separated module slugs this user can access',
     )

-    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "settings"]
+    ALL_MODULES = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]

     def get_allowed_modules(self):
-        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "settings"]
+        ALL = ["dashboard", "partners", "events", "ad-control", "users", "reviews", "contributions", "leads", "financials", "audit-log", "settings"]
         if self.is_superuser or self.role == "admin":
             return ALL
         if self.allowed_modules:
admin_api/migrations/0005_auditlog_indexes.py (new file, +31)

@@ -0,0 +1,31 @@
# Generated by Django 4.2.21 for the Audit Log module (admin_api v1.12.0).
#
# Adds two composite indexes to `AuditLog` so the new /audit-log admin page
# can filter by action and resolve "related entries" lookups without a full
# table scan once the log grows past a few thousand rows.

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('admin_api', '0004_lead_user_account'),
    ]

    operations = [
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['action', '-created_at'],
                name='auditlog_action_time_idx',
            ),
        ),
        migrations.AddIndex(
            model_name='auditlog',
            index=models.Index(
                fields=['target_type', 'target_id'],
                name='auditlog_target_idx',
            ),
        ),
    ]
@@ -130,7 +130,7 @@ class StaffProfile(models.Model):
     def get_allowed_modules(self):
         scopes = self.get_effective_scopes()
         if '*' in scopes:
-            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'leads', 'financials', 'settings']
+            return ['dashboard', 'partners', 'events', 'ad-control', 'users', 'reviews', 'contributions', 'leads', 'financials', 'audit-log', 'settings']
         SCOPE_TO_MODULE = {
             'users': 'users',
             'events': 'events',

@@ -141,6 +141,7 @@ class StaffProfile(models.Model):
             'ads': 'ad-control',
             'contributions': 'contributions',
             'leads': 'leads',
+            'audit': 'audit-log',
         }
         modules = {'dashboard'}
         for scope in scopes:
@@ -179,6 +180,12 @@ class AuditLog(models.Model):

     class Meta:
         ordering = ['-created_at']
+        indexes = [
+            # Fast filter-by-action ordered by time (audit log page default view)
+            models.Index(fields=['action', '-created_at'], name='auditlog_action_time_idx'),
+            # Fast "related entries for this target" lookups in the detail panel
+            models.Index(fields=['target_type', 'target_id'], name='auditlog_target_idx'),
+        ]

     def __str__(self):
         return f"{self.action} by {self.user} at {self.created_at}"

admin_api/tests.py (new file, +231)
@@ -0,0 +1,231 @@
"""Tests for the Audit Log module (admin_api v1.12.0).

Covers:
* `AuditLogListView` — list + search + filter shape
* `AuditLogMetricsView` — shape + `?nocache=1` bypass
* `UserStatusView` — emits a row into `AuditLog` for every status change,
  inside the same transaction as the state change

Run with:
    python manage.py test admin_api.tests
"""
from __future__ import annotations

from django.core.cache import cache
from django.test import TestCase
from rest_framework_simplejwt.tokens import RefreshToken

from accounts.models import User
from admin_api.models import AuditLog


# ---------------------------------------------------------------------------
# Base — auth helper shared across cases
# ---------------------------------------------------------------------------


class _AuditTestBase(TestCase):
    """Gives each subclass an admin user + pre-issued JWT."""

    @classmethod
    def setUpTestData(cls):
        cls.admin = User.objects.create_user(
            username='audit.admin@eventifyplus.com',
            email='audit.admin@eventifyplus.com',
            password='irrelevant',
            role='admin',
        )
        cls.admin.is_superuser = True
        cls.admin.save()

    def setUp(self):
        # Metrics view caches by key; reset to keep cases independent.
        cache.delete('admin_api:audit_log:metrics:v1')
        access = str(RefreshToken.for_user(self.admin).access_token)
        self.auth = {'HTTP_AUTHORIZATION': f'Bearer {access}'}


# ---------------------------------------------------------------------------
# AuditLogListView
# ---------------------------------------------------------------------------


class AuditLogListViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_authenticated_returns_paginated_shape(self):
        AuditLog.objects.create(
            user=self.admin,
            action='user.suspended',
            target_type='user',
            target_id='42',
            details={'reason': 'spam'},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('results', 'total', 'page', 'page_size', 'total_pages'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')
        self.assertEqual(body['results'][0]['user']['email'], self.admin.email)

    def test_search_narrows_results(self):
        AuditLog.objects.create(
            user=self.admin, action='user.suspended',
            target_type='user', target_id='1', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='1', details={},
        )
        resp = self.client.get(self.url, {'search': 'suspend'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['total'], 1)
        self.assertEqual(body['results'][0]['action'], 'user.suspended')

    def test_page_size_is_bounded(self):
        # page_size=999 must be clamped to the 200-row upper bound.
        resp = self.client.get(self.url, {'page_size': '999'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        self.assertEqual(resp.json()['page_size'], 200)

    def test_invalid_pagination_falls_back_to_defaults(self):
        resp = self.client.get(self.url, {'page': 'x', 'page_size': 'y'}, **self.auth)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        self.assertEqual(body['page'], 1)
        self.assertEqual(body['page_size'], 50)


# ---------------------------------------------------------------------------
# AuditLogMetricsView
# ---------------------------------------------------------------------------


class AuditLogMetricsViewTests(_AuditTestBase):
    url = '/api/v1/rbac/audit-log/metrics/'

    def test_unauthenticated_returns_401(self):
        resp = self.client.get(self.url)
        self.assertEqual(resp.status_code, 401)

    def test_returns_expected_shape(self):
        AuditLog.objects.create(
            user=self.admin, action='event.approved',
            target_type='event', target_id='7', details={},
        )
        AuditLog.objects.create(
            user=self.admin, action='department.created',
            target_type='department', target_id='3', details={},
        )
        resp = self.client.get(self.url, **self.auth)
        self.assertEqual(resp.status_code, 200, resp.content)
        body = resp.json()
        for key in ('total', 'today', 'week', 'distinct_users', 'by_action_group'):
            self.assertIn(key, body)
        self.assertEqual(body['total'], 2)
        self.assertEqual(body['distinct_users'], 1)
        # Each group present so frontend tooltip can render all 6 rows.
        for group in ('create', 'update', 'delete', 'moderate', 'auth', 'other'):
            self.assertIn(group, body['by_action_group'])
        self.assertEqual(body['by_action_group']['moderate'], 1)
        self.assertEqual(body['by_action_group']['create'], 1)

    def test_nocache_bypasses_stale_cache(self):
        # Prime cache with a fake payload.
        cache.set(
            'admin_api:audit_log:metrics:v1',
            {
                'total': 999,
                'today': 0, 'week': 0, 'distinct_users': 0,
                'by_action_group': {
                    'create': 0, 'update': 0, 'delete': 0,
                    'moderate': 0, 'auth': 0, 'other': 0,
                },
            },
            60,
        )

        resp_cached = self.client.get(self.url, **self.auth)
        self.assertEqual(resp_cached.json()['total'], 999)

        resp_fresh = self.client.get(self.url, {'nocache': '1'}, **self.auth)
        self.assertEqual(resp_fresh.json()['total'], 0)  # real DB state


# ---------------------------------------------------------------------------
# UserStatusView — audit emission
# ---------------------------------------------------------------------------


class UserStatusAuditEmissionTests(_AuditTestBase):
    """Each status transition must leave a matching row in `AuditLog`.

    The endpoint wraps the state change + audit log in `transaction.atomic()`
    so the two can never disagree. These assertions catch regressions where a
    new branch forgets the audit call.
    """

    def _url(self, user_id: int) -> str:
        return f'/api/v1/users/{user_id}/status/'

    def setUp(self):
        super().setUp()
        self.target = User.objects.create_user(
            username='target@example.com',
            email='target@example.com',
            password='irrelevant',
            role='customer',
        )

    def test_suspend_emits_audit_row(self):
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'suspend', 'reason': 'spam flood'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        log = AuditLog.objects.filter(
            action='user.suspended', target_id=str(self.target.id),
        ).first()
        self.assertIsNotNone(log, 'suspend did not emit audit log')
        self.assertEqual(log.details.get('reason'), 'spam flood')

    def test_ban_emits_audit_row(self):
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'ban'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        self.assertTrue(
            AuditLog.objects.filter(
                action='user.banned', target_id=str(self.target.id),
            ).exists(),
            'ban did not emit audit log',
        )

    def test_reinstate_emits_audit_row(self):
        self.target.is_active = False
        self.target.save()
        resp = self.client.patch(
            self._url(self.target.id),
            data={'action': 'reinstate'},
            content_type='application/json',
            **self.auth,
        )
        self.assertEqual(resp.status_code, 200, resp.content)
        self.assertTrue(
            AuditLog.objects.filter(
                action='user.reinstated', target_id=str(self.target.id),
            ).exists(),
            'reinstate did not emit audit log',
        )
@@ -80,6 +80,16 @@ urlpatterns = [
     path('rbac/scopes/', views.ScopeListView.as_view(), name='rbac-scope-list'),
     path('rbac/org-tree/', views.OrgTreeView.as_view(), name='rbac-org-tree'),
     path('rbac/audit-log/', views.AuditLogListView.as_view(), name='rbac-audit-log'),
+    path('rbac/audit-log/metrics/', views.AuditLogMetricsView.as_view(), name='rbac-audit-log-metrics'),
+
+    # Notifications (admin-side recurring email jobs)
+    path('notifications/types/', views.NotificationTypesView.as_view(), name='notification-types'),
+    path('notifications/schedules/', views.NotificationScheduleListView.as_view(), name='notification-schedule-list'),
+    path('notifications/schedules/<int:pk>/', views.NotificationScheduleDetailView.as_view(), name='notification-schedule-detail'),
+    path('notifications/schedules/<int:pk>/recipients/', views.NotificationRecipientView.as_view(), name='notification-recipient-create'),
+    path('notifications/schedules/<int:pk>/recipients/<int:rid>/', views.NotificationRecipientDetailView.as_view(), name='notification-recipient-detail'),
+    path('notifications/schedules/<int:pk>/send-now/', views.NotificationScheduleSendNowView.as_view(), name='notification-schedule-send-now'),
+    path('notifications/schedules/<int:pk>/test-send/', views.NotificationScheduleTestSendView.as_view(), name='notification-schedule-test-send'),
+
     # Ad Control
     path('ad-control/', include('ad_control.urls')),
@@ -482,22 +482,67 @@ class PartnerStatusView(APIView):
 class PartnerKYCReviewView(APIView):
     permission_classes = [IsAuthenticated]

+    _DECISION_SLUG = {
+        'approved': 'partner.kyc.approved',
+        'rejected': 'partner.kyc.rejected',
+        'requested_info': 'partner.kyc.requested_info',
+    }
+
     def post(self, request, pk):
         from partner.models import Partner
         from django.shortcuts import get_object_or_404
         p = get_object_or_404(Partner, pk=pk)
         decision = request.data.get('decision')
-        if decision not in ('approved', 'rejected'):
-            return Response({'error': 'decision must be approved or rejected'}, status=400)
-        p.kyc_compliance_status = decision
-        p.is_kyc_compliant = (decision == 'approved')
-        p.kyc_compliance_reason = request.data.get('reason', '')
-        p.save(update_fields=['kyc_compliance_status', 'is_kyc_compliant', 'kyc_compliance_reason'])
+        audit_slug = self._DECISION_SLUG.get(decision)
+        if audit_slug is None:
+            return Response(
+                {'error': 'decision must be approved, rejected, or requested_info'},
+                status=400,
+            )
+
+        reason = request.data.get('reason', '') or ''
+        docs_requested = request.data.get('docs_requested') or []
+        previous_status = p.kyc_compliance_status
+        previous_is_compliant = bool(p.is_kyc_compliant)
+
+        with transaction.atomic():
+            if decision in ('approved', 'rejected'):
+                p.kyc_compliance_status = decision
+                p.is_kyc_compliant = (decision == 'approved')
+            # `requested_info` leaves the compliance state intact — it's a
+            # reviewer signalling they need more documents from the partner.
+            p.kyc_compliance_reason = reason
+            p.save(update_fields=[
+                'kyc_compliance_status', 'is_kyc_compliant', 'kyc_compliance_reason',
+            ])
+            _audit_log(
+                request,
+                action=audit_slug,
+                target_type='partner',
+                target_id=p.id,
+                details={
+                    'reason': reason,
+                    'docs_requested': docs_requested,
+                    'previous_status': previous_status,
+                    'new_status': p.kyc_compliance_status,
+                    'previous_is_compliant': previous_is_compliant,
+                    'new_is_compliant': bool(p.is_kyc_compliant),
+                },
+            )
+
+        # Preserve the existing response shape — frontend depends on it.
+        if decision == 'approved':
+            verification_status = 'Verified'
+        elif decision == 'rejected':
+            verification_status = 'Rejected'
+        else:
+            verification_status = 'Info Requested'
+
         return Response({
             'id': str(p.id),
             'kyc_compliance_status': p.kyc_compliance_status,
             'is_kyc_compliant': p.is_kyc_compliant,
-            'verificationStatus': 'Verified' if decision == 'approved' else 'Rejected',
+            'verificationStatus': verification_status,
         })
@@ -637,19 +682,50 @@ class UserDetailView(APIView):
 class UserStatusView(APIView):
     permission_classes = [IsAuthenticated]

+    # Map the external `action` slug sent by the admin dashboard to the
+    # audit-log slug we record + the `is_active` value that slug implies.
+    # Keeping the mapping data-driven means a new moderation verb (e.g.
+    # `flag`) only needs one new row, not scattered `if` branches.
+    _ACTION_MAP = {
+        'suspend': ('user.suspended', False),
+        'ban': ('user.banned', False),
+        'reinstate': ('user.reinstated', True),
+        'flag': ('user.flagged', None),
+    }
+
     def patch(self, request, pk):
         from django.contrib.auth import get_user_model
         from django.shortcuts import get_object_or_404
         User = get_user_model()
         u = get_object_or_404(User, pk=pk)
         action = request.data.get('action')
-        if action in ('suspend', 'ban'):
-            u.is_active = False
-        elif action == 'reinstate':
-            u.is_active = True
-        else:
-            return Response({'error': 'action must be suspend, ban, or reinstate'}, status=400)
-        u.save(update_fields=['is_active'])
+        mapping = self._ACTION_MAP.get(action)
+        if mapping is None:
+            return Response(
+                {'error': 'action must be suspend, ban, reinstate, or flag'},
+                status=400,
+            )
+        audit_slug, new_active = mapping
+        reason = request.data.get('reason', '') or ''
+        previous_status = _user_status(u)
+
+        # Audit + state change must succeed or fail together — otherwise the
+        # log says one thing and the database says another.
+        with transaction.atomic():
+            if new_active is not None and u.is_active != new_active:
+                u.is_active = new_active
+                u.save(update_fields=['is_active'])
+            _audit_log(
+                request,
+                action=audit_slug,
+                target_type='user',
+                target_id=u.id,
+                details={
+                    'reason': reason,
+                    'previous_status': previous_status,
+                    'new_status': _user_status(u),
+                },
+            )
         return Response({'id': str(u.id), 'status': _user_status(u)})


 # ---------------------------------------------------------------------------
@@ -759,6 +835,10 @@ class EventListView(APIView):
             qs = qs.filter(
                 Q(title__icontains=q) | Q(name__icontains=q) | Q(venue_name__icontains=q)
             )
+        if date_from := request.GET.get('date_from'):
+            qs = qs.filter(start_date__gte=date_from)
+        if date_to := request.GET.get('date_to'):
+            qs = qs.filter(start_date__lte=date_to)
         try:
             page = max(1, int(request.GET.get('page', 1)))
             page_size = min(100, int(request.GET.get('page_size', 20)))
@@ -839,29 +919,60 @@ class EventUpdateView(APIView):
|
|||||||
class EventModerationView(APIView):
|
class EventModerationView(APIView):
|
||||||
permission_classes = [IsAuthenticated]
|
permission_classes = [IsAuthenticated]
|
||||||
|
|
||||||
|
_ACTION_SLUG = {
|
||||||
|
'approve': 'event.approved',
|
||||||
|
'reject': 'event.rejected',
|
||||||
|
'flag': 'event.flagged',
|
||||||
|
'feature': 'event.featured',
|
||||||
|
'unfeature': 'event.unfeatured',
|
||||||
|
}
|
||||||
|
|
||||||
def patch(self, request, pk):
|
def patch(self, request, pk):
|
||||||
         from events.models import Event
         from django.shortcuts import get_object_or_404
         e = get_object_or_404(Event, pk=pk)
         action = request.data.get('action')
-        if action == 'approve':
-            e.event_status = 'published'
-            e.save(update_fields=['event_status'])
-        elif action == 'reject':
-            e.event_status = 'cancelled'
-            e.cancelled_reason = request.data.get('reason', '')
-            e.save(update_fields=['event_status', 'cancelled_reason'])
-        elif action == 'flag':
-            e.event_status = 'flagged'
-            e.save(update_fields=['event_status'])
-        elif action == 'feature':
-            e.is_featured = True
-            e.save(update_fields=['is_featured'])
-        elif action == 'unfeature':
-            e.is_featured = False
-            e.save(update_fields=['is_featured'])
-        else:
+        audit_slug = self._ACTION_SLUG.get(action)
+        if audit_slug is None:
             return Response({'error': 'Invalid action'}, status=400)
 
+        reason = request.data.get('reason', '') or ''
+        previous_status = e.event_status
+        previous_is_featured = bool(e.is_featured)
+
+        with transaction.atomic():
+            if action == 'approve':
+                e.event_status = 'published'
+                e.save(update_fields=['event_status'])
+            elif action == 'reject':
+                e.event_status = 'cancelled'
+                e.cancelled_reason = reason
+                e.save(update_fields=['event_status', 'cancelled_reason'])
+            elif action == 'flag':
+                e.event_status = 'flagged'
+                e.save(update_fields=['event_status'])
+            elif action == 'feature':
+                e.is_featured = True
+                e.save(update_fields=['is_featured'])
+            elif action == 'unfeature':
+                e.is_featured = False
+                e.save(update_fields=['is_featured'])
+
+            _audit_log(
+                request,
+                action=audit_slug,
+                target_type='event',
+                target_id=e.id,
+                details={
+                    'reason': reason,
+                    'partner_id': str(e.partner_id) if e.partner_id else None,
+                    'previous_status': previous_status,
+                    'new_status': e.event_status,
+                    'previous_is_featured': previous_is_featured,
+                    'new_is_featured': bool(e.is_featured),
+                },
+            )
+
         e.refresh_from_db()
         return Response(_serialize_event(e))
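The rewritten view resolves the audit slug up front via `self._ACTION_SLUG.get(action)`; the mapping itself sits outside this hunk. A minimal sketch of what such a mapping could look like, using the `event.*` slugs the changelog lists (the attribute's real contents are an assumption):

```python
# Hypothetical sketch: maps the request's `action` verb to an audit slug.
# The real mapping lives outside this hunk; the slug values follow the
# changelog's `event.*` naming.
ACTION_SLUG = {
    'approve': 'event.approved',
    'reject': 'event.rejected',
    'flag': 'event.flagged',
    'feature': 'event.featured',
    'unfeature': 'event.unfeatured',
}

def resolve_slug(action):
    """Return the audit slug for a moderation action, or None if invalid."""
    return ACTION_SLUG.get(action)
```

Resolving before mutating means an invalid action returns 400 without touching the event, which is what lets the audit row and the state change share one transaction.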
@@ -1089,8 +1200,15 @@ class ReviewModerationView(APIView):
         from django.shortcuts import get_object_or_404
         review = get_object_or_404(Review, pk=pk)
         action = request.data.get('action')
+
+        previous_status = review.status
+        previous_text = review.review_text
+        audit_slug = None
+        details = {}
+
         if action == 'approve':
             review.status = 'live'
+            audit_slug = 'review.approved'
         elif action == 'reject':
             rr = request.data.get('reject_reason', 'spam')
             valid_reasons = [c[0] for c in Review.REJECT_CHOICES]
@@ -1098,14 +1216,36 @@ class ReviewModerationView(APIView):
                 return Response({'error': 'Invalid reject_reason'}, status=400)
             review.status = 'rejected'
             review.reject_reason = rr
+            audit_slug = 'review.rejected'
+            details['reject_reason'] = rr
         elif action == 'save_and_approve':
             review.review_text = request.data.get('review_text', review.review_text)
             review.status = 'live'
+            audit_slug = 'review.edited'
+            details['edited_text'] = True
         elif action == 'save_live':
             review.review_text = request.data.get('review_text', review.review_text)
+            audit_slug = 'review.edited'
+            details['edited_text'] = True
         else:
             return Response({'error': 'Invalid action'}, status=400)
-        review.save()
+
+        details.update({
+            'previous_status': previous_status,
+            'new_status': review.status,
+        })
+        if previous_text != review.review_text:
+            details['original_text'] = previous_text
+
+        with transaction.atomic():
+            review.save()
+            _audit_log(
+                request,
+                action=audit_slug,
+                target_type='review',
+                target_id=review.id,
+                details=details,
+            )
         return Response(_serialize_review(review))
@@ -1962,10 +2102,28 @@ class OrgTreeView(APIView):
 
 
 # 14. AuditLogListView
+def _serialize_audit_log(log):
+    return {
+        'id': log.id,
+        'user': {
+            'id': log.user.id,
+            'username': log.user.username,
+            'email': log.user.email,
+        } if log.user else None,
+        'action': log.action,
+        'target_type': log.target_type,
+        'target_id': log.target_id,
+        'details': log.details,
+        'ip_address': log.ip_address,
+        'created_at': log.created_at.isoformat(),
+    }
+
+
 class AuditLogListView(APIView):
     permission_classes = [IsAuthenticated]
 
     def get(self, request):
+        from django.db.models import Q
         qs = AuditLog.objects.select_related('user').all()
         # Filters
         user_id = request.query_params.get('user')
@@ -1984,29 +2142,32 @@ class AuditLogListView(APIView):
         if date_to:
             qs = qs.filter(created_at__date__lte=date_to)
 
+        # Free-text search across action, target type/id, and actor identity.
+        # ORed so one box can locate entries regardless of which axis the
+        # admin remembers (slug vs. target vs. user).
+        search = request.query_params.get('search')
+        if search:
+            qs = qs.filter(
+                Q(action__icontains=search)
+                | Q(target_type__icontains=search)
+                | Q(target_id__icontains=search)
+                | Q(user__username__icontains=search)
+                | Q(user__email__icontains=search)
+            )
+
         # Pagination
-        page = int(request.query_params.get('page', 1))
-        page_size = int(request.query_params.get('page_size', 50))
+        try:
+            page = max(1, int(request.query_params.get('page', 1)))
+            page_size = max(1, min(200, int(request.query_params.get('page_size', 50))))
+        except (ValueError, TypeError):
+            page, page_size = 1, 50
         total = qs.count()
         start = (page - 1) * page_size
         end = start + page_size
         logs = qs[start:end]
 
         return Response({
-            'results': [{
-                'id': log.id,
-                'user': {
-                    'id': log.user.id,
-                    'username': log.user.username,
-                    'email': log.user.email,
-                } if log.user else None,
-                'action': log.action,
-                'target_type': log.target_type,
-                'target_id': log.target_id,
-                'details': log.details,
-                'ip_address': log.ip_address,
-                'created_at': log.created_at.isoformat(),
-            } for log in logs],
+            'results': [_serialize_audit_log(log) for log in logs],
             'total': total,
             'page': page,
             'page_size': page_size,
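The new bounded pagination can be read as one small pure function; a stand-alone sketch of the same clamping rule (the function name is illustrative, not from the source):

```python
def clamp_pagination(raw_page, raw_page_size, default_size=50, max_size=200):
    """Mirror the view's defensive parsing: page is clamped to >= 1,
    page_size to [1, max_size]; any non-integer input falls back to defaults."""
    try:
        page = max(1, int(raw_page))
        page_size = max(1, min(max_size, int(raw_page_size)))
    except (ValueError, TypeError):
        page, page_size = 1, default_size
    return page, page_size
```

Note that a bad `page_size` resets `page` too, matching the view's all-or-nothing fallback.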
@@ -2014,6 +2175,93 @@ class AuditLogListView(APIView):
         })
 
 
+# 15. AuditLogMetricsView — aggregated counts for the admin header bar.
+def _compute_action_groups(qs):
+    """Bucket action slugs into UI-visible groups without a DB roundtrip per slug.
+
+    Mirrors `actionGroup()` on the frontend so the tooltip numbers match the
+    badge colours.
+    """
+    groups = {
+        'create': 0,
+        'update': 0,
+        'delete': 0,
+        'moderate': 0,
+        'auth': 0,
+        'other': 0,
+    }
+    for action in qs.values_list('action', flat=True):
+        slug = (action or '').lower()
+        if '.deactivated' in slug or '.deleted' in slug or '.removed' in slug:
+            groups['delete'] += 1
+        elif '.created' in slug or '.invited' in slug or '.added' in slug:
+            groups['create'] += 1
+        elif (
+            '.updated' in slug or '.edited' in slug
+            or '.moved' in slug or '.changed' in slug
+        ):
+            groups['update'] += 1
+        elif (
+            slug.startswith('user.')
+            or slug.startswith('event.')
+            or slug.startswith('review.')
+            or slug.startswith('partner.kyc.')
+        ):
+            groups['moderate'] += 1
+        elif slug.startswith('auth.') or 'login' in slug or 'logout' in slug:
+            groups['auth'] += 1
+        else:
+            groups['other'] += 1
+    return groups
+
+
+class AuditLogMetricsView(APIView):
+    """GET /api/v1/rbac/audit-log/metrics/ — aggregated counts.
+
+    Cached for 60 s because the header bar is fetched on every page load and
+    the numbers are approximate by nature. If you need fresh numbers, bypass
+    the cache with `?nocache=1` (useful from the Django shell during incident
+    response).
+    """
+    permission_classes = [IsAuthenticated]
+
+    CACHE_KEY = 'admin_api:audit_log:metrics:v1'
+    CACHE_TTL_SECONDS = 60
+
+    def get(self, request):
+        from django.core.cache import cache
+
+        if request.query_params.get('nocache') == '1':
+            payload = self._compute()
+            cache.set(self.CACHE_KEY, payload, self.CACHE_TTL_SECONDS)
+            return Response(payload)
+
+        payload = cache.get(self.CACHE_KEY)
+        if payload is None:
+            payload = self._compute()
+            cache.set(self.CACHE_KEY, payload, self.CACHE_TTL_SECONDS)
+        return Response(payload)
+
+    def _compute(self):
+        from django.utils import timezone
+        from datetime import timedelta
+
+        now = timezone.now()
+        today_start = now.replace(hour=0, minute=0, second=0, microsecond=0)
+        week_start = today_start - timedelta(days=6)
+        qs = AuditLog.objects.all()
+        return {
+            'total': qs.count(),
+            'today': qs.filter(created_at__gte=today_start).count(),
+            'week': qs.filter(created_at__gte=week_start).count(),
+            'distinct_users': qs.exclude(user__isnull=True)
+                                .values('user_id').distinct().count(),
+            'by_action_group': _compute_action_groups(qs),
+        }
+
+
 # ---------------------------------------------------------------------------
 # Partner Onboarding API
 # ---------------------------------------------------------------------------
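The bucketing in `_compute_action_groups` is order-sensitive: verb suffixes such as `.edited` are checked before the `review.`/`event.` moderation prefixes, so `review.edited` counts as an update, not a moderation. A stand-alone single-slug version, for illustration:

```python
def action_group(slug):
    """Classify one audit action slug the way _compute_action_groups does.
    Order matters: verb suffixes are tested before domain prefixes."""
    slug = (slug or '').lower()
    if '.deactivated' in slug or '.deleted' in slug or '.removed' in slug:
        return 'delete'
    if '.created' in slug or '.invited' in slug or '.added' in slug:
        return 'create'
    if ('.updated' in slug or '.edited' in slug
            or '.moved' in slug or '.changed' in slug):
        return 'update'
    if slug.startswith(('user.', 'event.', 'review.', 'partner.kyc.')):
        return 'moderate'
    if slug.startswith('auth.') or 'login' in slug or 'logout' in slug:
        return 'auth'
    return 'other'
```

The sample slugs below are made up; only the ordering behaviour is the point.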
@@ -2517,3 +2765,318 @@ class LeadUpdateView(APIView):
         lead.save()
         log("info", f"Lead #{pk} updated: {', '.join(changed)}", request=request, user=request.user)
         return Response(_serialize_lead(lead))
+
+
+# ---------------------------------------------------------------------------
+# Notification schedules (admin-side recurring email jobs)
+# ---------------------------------------------------------------------------
+
+
+def _serialize_recipient(r):
+    return {
+        'id': r.pk,
+        'email': r.email,
+        'displayName': r.display_name,
+        'isActive': r.is_active,
+        'createdAt': r.created_at.isoformat(),
+    }
+
+
+def _serialize_schedule(s, include_recipients=True):
+    payload = {
+        'id': s.pk,
+        'name': s.name,
+        'notificationType': s.notification_type,
+        'cronExpression': s.cron_expression,
+        'isActive': s.is_active,
+        'lastRunAt': s.last_run_at.isoformat() if s.last_run_at else None,
+        'lastStatus': s.last_status,
+        'lastError': s.last_error,
+        'createdAt': s.created_at.isoformat(),
+        'updatedAt': s.updated_at.isoformat(),
+    }
+    if include_recipients:
+        payload['recipients'] = [
+            _serialize_recipient(r) for r in s.recipients.all()
+        ]
+        payload['recipientCount'] = s.recipients.filter(is_active=True).count()
+    return payload
+
+
+def _validate_cron(expr):
+    from croniter import croniter
+    if not expr or not isinstance(expr, str):
+        return False, 'cronExpression is required'
+    if not croniter.is_valid(expr.strip()):
+        return False, f'Invalid cron expression: {expr!r}'
+    return True, None
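`_validate_cron` delegates the real validation to `croniter.is_valid`. As a rough illustration of what it guards against, here is a deliberately simplified structural check (five whitespace-separated fields built from cron characters); this is a much weaker stand-in, not a replacement for croniter:

```python
import re

# Allows digits and the * / , - metacharacters; real cron syntax is richer
# (names, step ranges, etc.), so this is only a coarse first filter.
_CRON_FIELD = re.compile(r'^[\d*/,\-]+$')

def looks_like_cron(expr):
    """Rough structural check: exactly five fields of cron characters.
    Far weaker than croniter.is_valid."""
    if not expr or not isinstance(expr, str):
        return False
    fields = expr.strip().split()
    return len(fields) == 5 and all(_CRON_FIELD.match(f) for f in fields)
```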
+
+
+class NotificationTypesView(APIView):
+    permission_classes = [IsAuthenticated]
+
+    def get(self, request):
+        from notifications.models import NotificationSchedule
+        return Response({
+            'types': [
+                {'value': v, 'label': l}
+                for v, l in NotificationSchedule.TYPE_CHOICES
+            ],
+        })
+
+
+class NotificationScheduleListView(APIView):
+    permission_classes = [IsAuthenticated]
+
+    def get(self, request):
+        from notifications.models import NotificationSchedule
+        qs = NotificationSchedule.objects.prefetch_related('recipients').order_by('-created_at')
+        return Response({
+            'count': qs.count(),
+            'results': [_serialize_schedule(s) for s in qs],
+        })
+
+    def post(self, request):
+        from notifications.models import NotificationSchedule, NotificationRecipient
+        from django.db import transaction as db_tx
+        from eventify_logger.services import log
+
+        name = (request.data.get('name') or '').strip()
+        ntype = (request.data.get('notificationType') or '').strip()
+        cron = (request.data.get('cronExpression') or '').strip()
+        is_active = bool(request.data.get('isActive', True))
+        recipients_in = request.data.get('recipients') or []
+
+        if not name:
+            return Response({'error': 'name is required'}, status=400)
+        if ntype not in dict(NotificationSchedule.TYPE_CHOICES):
+            return Response({'error': f'Invalid notificationType: {ntype!r}'}, status=400)
+        ok, err = _validate_cron(cron)
+        if not ok:
+            return Response({'error': err}, status=400)
+
+        with db_tx.atomic():
+            schedule = NotificationSchedule.objects.create(
+                name=name,
+                notification_type=ntype,
+                cron_expression=cron,
+                is_active=is_active,
+            )
+            seen_emails = set()
+            for r in recipients_in:
+                email = (r.get('email') or '').strip().lower()
+                if not email or email in seen_emails:
+                    continue
+                seen_emails.add(email)
+                NotificationRecipient.objects.create(
+                    schedule=schedule,
+                    email=email,
+                    display_name=(r.get('displayName') or '').strip(),
+                    is_active=bool(r.get('isActive', True)),
+                )
+
+        log('info', f'Notification schedule #{schedule.pk} created: {schedule.name}',
+            request=request, user=request.user)
+        return Response(_serialize_schedule(schedule), status=201)
+
+
+class NotificationScheduleDetailView(APIView):
+    permission_classes = [IsAuthenticated]
+
+    def get(self, request, pk):
+        from notifications.models import NotificationSchedule
+        from django.shortcuts import get_object_or_404
+        s = get_object_or_404(
+            NotificationSchedule.objects.prefetch_related('recipients'), pk=pk,
+        )
+        return Response(_serialize_schedule(s))
+
+    def patch(self, request, pk):
+        from notifications.models import NotificationSchedule
+        from django.shortcuts import get_object_or_404
+        from eventify_logger.services import log
+        s = get_object_or_404(NotificationSchedule, pk=pk)
+
+        changed = []
+        if (name := request.data.get('name')) is not None:
+            name = str(name).strip()
+            if not name:
+                return Response({'error': 'name cannot be empty'}, status=400)
+            s.name = name
+            changed.append('name')
+
+        if (ntype := request.data.get('notificationType')) is not None:
+            if ntype not in dict(NotificationSchedule.TYPE_CHOICES):
+                return Response({'error': f'Invalid notificationType: {ntype!r}'}, status=400)
+            s.notification_type = ntype
+            changed.append('notification_type')
+
+        if (cron := request.data.get('cronExpression')) is not None:
+            ok, err = _validate_cron(cron)
+            if not ok:
+                return Response({'error': err}, status=400)
+            s.cron_expression = cron.strip()
+            changed.append('cron_expression')
+
+        if (is_active := request.data.get('isActive')) is not None:
+            s.is_active = bool(is_active)
+            changed.append('is_active')
+
+        s.save()
+        log('info', f'Notification schedule #{pk} updated: {", ".join(changed) or "no-op"}',
+            request=request, user=request.user)
+        return Response(_serialize_schedule(s))
+
+    def delete(self, request, pk):
+        from notifications.models import NotificationSchedule
+        from django.shortcuts import get_object_or_404
+        from eventify_logger.services import log
+        s = get_object_or_404(NotificationSchedule, pk=pk)
+        s.delete()
+        log('info', f'Notification schedule #{pk} deleted', request=request, user=request.user)
+        return Response(status=204)
+
+
+class NotificationRecipientView(APIView):
+    permission_classes = [IsAuthenticated]
+
+    def post(self, request, pk):
+        from notifications.models import NotificationSchedule, NotificationRecipient
+        from django.shortcuts import get_object_or_404
+        schedule = get_object_or_404(NotificationSchedule, pk=pk)
+
+        email = (request.data.get('email') or '').strip().lower()
+        if not email:
+            return Response({'error': 'email is required'}, status=400)
+        if NotificationRecipient.objects.filter(schedule=schedule, email=email).exists():
+            return Response({'error': f'{email} is already a recipient'}, status=409)
+
+        r = NotificationRecipient.objects.create(
+            schedule=schedule,
+            email=email,
+            display_name=(request.data.get('displayName') or '').strip(),
+            is_active=bool(request.data.get('isActive', True)),
+        )
+        return Response(_serialize_recipient(r), status=201)
+
+
+class NotificationRecipientDetailView(APIView):
+    permission_classes = [IsAuthenticated]
+
+    def patch(self, request, pk, rid):
+        from notifications.models import NotificationRecipient
+        from django.shortcuts import get_object_or_404
+        r = get_object_or_404(NotificationRecipient, pk=rid, schedule_id=pk)
+
+        if (email := request.data.get('email')) is not None:
+            email = str(email).strip().lower()
+            if not email:
+                return Response({'error': 'email cannot be empty'}, status=400)
+            clash = NotificationRecipient.objects.filter(
+                schedule_id=pk, email=email,
+            ).exclude(pk=rid).exists()
+            if clash:
+                return Response({'error': f'{email} is already a recipient'}, status=409)
+            r.email = email
+
+        if (display_name := request.data.get('displayName')) is not None:
+            r.display_name = str(display_name).strip()
+
+        if (is_active := request.data.get('isActive')) is not None:
+            r.is_active = bool(is_active)
+
+        r.save()
+        return Response(_serialize_recipient(r))
+
+    def delete(self, request, pk, rid):
+        from notifications.models import NotificationRecipient
+        from django.shortcuts import get_object_or_404
+        r = get_object_or_404(NotificationRecipient, pk=rid, schedule_id=pk)
+        r.delete()
+        return Response(status=204)
+
+
+class NotificationScheduleSendNowView(APIView):
+    """Trigger an immediate dispatch of one schedule, bypassing cron check."""
+    permission_classes = [IsAuthenticated]
+
+    def post(self, request, pk):
+        from notifications.models import NotificationSchedule
+        from notifications.emails import render_and_send
+        from django.shortcuts import get_object_or_404
+        from django.utils import timezone as dj_tz
+        from eventify_logger.services import log
+        schedule = get_object_or_404(NotificationSchedule, pk=pk)
+
+        try:
+            recipient_count = render_and_send(schedule)
+        except Exception as exc:  # noqa: BLE001
+            schedule.last_status = NotificationSchedule.STATUS_ERROR
+            schedule.last_error = str(exc)[:2000]
+            schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
+            log('error', f'Send-now failed for schedule #{pk}: {exc}',
+                request=request, user=request.user)
+            return Response({'error': str(exc)}, status=500)
+
+        schedule.last_run_at = dj_tz.now()
+        schedule.last_status = NotificationSchedule.STATUS_SUCCESS
+        schedule.last_error = ''
+        schedule.save(update_fields=[
+            'last_run_at', 'last_status', 'last_error', 'updated_at',
+        ])
+        log('info', f'Send-now fired for schedule #{pk} → {recipient_count} recipient(s)',
+            request=request, user=request.user)
+        return Response({
+            'ok': True,
+            'recipientCount': recipient_count,
+            'schedule': _serialize_schedule(schedule),
+        })
+
+
+class NotificationScheduleTestSendView(APIView):
+    """Send a preview of this schedule's email to a single address.
+
+    Does NOT update last_run_at / last_status — purely for previewing content.
+    """
+    permission_classes = [IsAuthenticated]
+
+    def post(self, request, pk):
+        from notifications.models import NotificationSchedule
+        from notifications.emails import BUILDERS
+        from django.conf import settings
+        from django.core.mail import EmailMessage
+        from django.shortcuts import get_object_or_404
+        from eventify_logger.services import log
+
+        schedule = get_object_or_404(NotificationSchedule, pk=pk)
+
+        email = (request.data.get('email') or '').strip().lower()
+        if not email:
+            return Response({'error': 'email is required'}, status=400)
+
+        builder = BUILDERS.get(schedule.notification_type)
+        if builder is None:
+            return Response(
+                {'error': f'No builder for type: {schedule.notification_type}'},
+                status=400,
+            )
+
+        try:
+            subject, html = builder(schedule)
+            subject = f'[TEST] {subject}'
+            msg = EmailMessage(
+                subject=subject,
+                body=html,
+                from_email=settings.DEFAULT_FROM_EMAIL,
+                to=[email],
+            )
+            msg.content_subtype = 'html'
+            msg.send(fail_silently=False)
+        except Exception as exc:  # noqa: BLE001
+            log('error', f'Test-send failed for schedule #{pk}: {exc}',
+                request=request, user=request.user)
+            return Response({'error': str(exc)}, status=500)
+
+        log('info', f'Test email sent for schedule #{pk} → {email}',
+            request=request, user=request.user)
+        return Response({'ok': True, 'sentTo': email})
181 notifications/emails.py Normal file
@@ -0,0 +1,181 @@
+"""HTML email builders for scheduled admin notifications.
+
+Each builder is registered in ``BUILDERS`` keyed by ``NotificationSchedule.notification_type``
+and returns ``(subject, html_body)``. Add new types by appending to the registry
+and extending ``NotificationSchedule.TYPE_CHOICES``.
+
+Week bounds for ``events_expiring_this_week`` are computed in Asia/Kolkata so the
+"this week" semantics match the operations team's wall-clock week regardless of
+``settings.TIME_ZONE`` (currently UTC).
+"""
+from datetime import date, datetime, timedelta
+from html import escape
+
+try:
+    from zoneinfo import ZoneInfo
+except ImportError:  # pragma: no cover — fallback for py<3.9
+    from backports.zoneinfo import ZoneInfo  # type: ignore
+
+from django.conf import settings
+from django.core.mail import EmailMessage
+from django.db.models import Q
+
+from eventify_logger.services import log
+
+
+IST = ZoneInfo('Asia/Kolkata')
+
+
+def _today_in_ist() -> date:
+    return datetime.now(IST).date()
+
+
+def _upcoming_week_bounds(today: date) -> tuple[date, date]:
+    """Return (next Monday, next Sunday) inclusive.
+
+    If today is Monday the result is *this week* (today..Sunday).
+    If today is any other weekday the result is *next week* (next Monday..next Sunday).
+    Mon=0 per Python ``weekday()``.
+    """
+    days_until_monday = (7 - today.weekday()) % 7
+    monday = today + timedelta(days=days_until_monday)
+    sunday = monday + timedelta(days=6)
+    return monday, sunday
+
+
+def _build_events_expiring_this_week(schedule) -> tuple[str, str]:
+    from events.models import Event
+
+    today = _today_in_ist()
+    monday, sunday = _upcoming_week_bounds(today)
+
+    qs = (
+        Event.objects
+        .select_related('partner', 'event_type')
+        .filter(event_status='published')
+        .filter(
+            Q(end_date__isnull=False, end_date__gte=monday, end_date__lte=sunday)
+            | Q(end_date__isnull=True, start_date__gte=monday, start_date__lte=sunday)
+        )
+        .order_by('end_date', 'start_date', 'name')
+    )
+
+    events = list(qs)
+    rows_html = ''
+    for e in events:
+        end = e.end_date or e.start_date
+        title = e.title or e.name or '(untitled)'
+        partner_name = ''
+        if e.partner_id:
+            try:
+                partner_name = e.partner.name or ''
+            except Exception:
+                partner_name = ''
+        category = ''
+        if e.event_type_id and e.event_type:
+            category = getattr(e.event_type, 'event_type', '') or ''
+        rows_html += (
+            '<tr>'
+            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(title)}</td>'
+            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(partner_name or "—")}</td>'
+            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">{escape(category or "—")}</td>'
+            f'<td style="padding:10px 12px;border-bottom:1px solid #eee;">'
+            f'{end.strftime("%a %d %b %Y") if end else "—"}</td>'
+            '</tr>'
+        )
+
+    if not events:
+        rows_html = (
+            '<tr><td colspan="4" style="padding:24px;text-align:center;color:#888;">'
+            'No published events are expiring next week.'
+            '</td></tr>'
+        )
+
+    subject = (
+        f'[Eventify] {len(events)} event(s) expiring '
+        f'{monday.strftime("%d %b")}–{sunday.strftime("%d %b")}'
+    )
+
+    html = f"""<!doctype html>
+<html><body style="margin:0;padding:0;background:#f5f5f5;font-family:Arial,Helvetica,sans-serif;color:#1a1a1a;">
+<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="background:#f5f5f5;">
+<tr><td align="center" style="padding:24px 12px;">
+<table role="presentation" width="640" cellspacing="0" cellpadding="0" style="max-width:640px;background:#ffffff;border-radius:10px;overflow:hidden;box-shadow:0 2px 6px rgba(15,69,207,0.08);">
+<tr><td style="background:#0F45CF;color:#ffffff;padding:24px 28px;">
+<h2 style="margin:0;font-size:20px;">Events expiring next week</h2>
+<p style="margin:6px 0 0;color:#d2dcff;font-size:14px;">
+{monday.strftime("%A %d %b %Y")} → {sunday.strftime("%A %d %b %Y")}
+· {len(events)} event(s)
+</p>
+</td></tr>
+<tr><td style="padding:20px 24px;">
+<p style="margin:0 0 12px;font-size:14px;color:#444;">
+Scheduled notification: <strong>{escape(schedule.name)}</strong>
+</p>
+<table role="presentation" width="100%" cellspacing="0" cellpadding="0" style="border-collapse:collapse;font-size:14px;">
+<thead>
+<tr style="background:#f0f4ff;color:#0F45CF;">
+<th align="left" style="padding:10px 12px;">Title</th>
+<th align="left" style="padding:10px 12px;">Partner</th>
+<th align="left" style="padding:10px 12px;">Category</th>
+<th align="left" style="padding:10px 12px;">End date</th>
+</tr>
+</thead>
+<tbody>{rows_html}</tbody>
+</table>
+</td></tr>
+<tr><td style="padding:16px 24px 24px;color:#888;font-size:12px;">
+Sent automatically by Eventify Command Center.
+To change recipients or the schedule, open
+<a href="https://admin.eventifyplus.com/settings" style="color:#0F45CF;">admin.eventifyplus.com › Settings › Notifications</a>.
+</td></tr>
+</table>
+</td></tr>
+</table>
+</body></html>"""
+
+    return subject, html
+
+
+BUILDERS: dict = {
+    'events_expiring_this_week': _build_events_expiring_this_week,
+}
+
+
+def render_and_send(schedule) -> int:
+    """Render the email for ``schedule`` and deliver it to active recipients.
+
+    Returns the number of recipients the message was sent to. Raises on SMTP
+    failure so the management command can mark the schedule as errored.
+    """
+    builder = BUILDERS.get(schedule.notification_type)
+    if builder is None:
+        raise ValueError(f'No builder for notification type: {schedule.notification_type}')
+
+    subject, html = builder(schedule)
+    recipients = list(
+        schedule.recipients.filter(is_active=True).values_list('email', flat=True)
+    )
+    if not recipients:
+        log('warning', 'notification schedule has no active recipients', logger_data={
+            'schedule_id': schedule.id,
+            'schedule_name': schedule.name,
+        })
+        return 0
+
+    msg = EmailMessage(
+        subject=subject,
+        body=html,
+        from_email=settings.DEFAULT_FROM_EMAIL,
+        to=recipients,
+    )
+    msg.content_subtype = 'html'
+    msg.send(fail_silently=False)
+
+    log('info', 'notification email sent', logger_data={
+        'schedule_id': schedule.id,
+        'schedule_name': schedule.name,
+        'type': schedule.notification_type,
+        'recipient_count': len(recipients),
+    })
+    return len(recipients)
notifications/management/__init__.py (new, empty)
notifications/management/commands/__init__.py (new, empty)
notifications/management/commands/send_scheduled_notifications.py (new file, 153 lines)
@@ -0,0 +1,153 @@
"""Dispatch due ``NotificationSchedule`` jobs.

Host cron invokes this every ~15 minutes via ``docker exec``. The command
walks all active schedules, evaluates their cron expression against
``last_run_at`` using ``croniter``, and fires any that are due. A row-level
``select_for_update(skip_locked=True)`` prevents duplicate sends if two cron
ticks race or the container is restarted mid-run.

Evaluation timezone is **Asia/Kolkata** to match
``notifications/emails.py::_upcoming_week_bounds`` — the same wall-clock week
used in the outgoing email body.

Flags:
  --schedule-id <id>   Fire exactly one schedule, ignoring the cron check.
  --dry-run            Resolve due schedules and render emails, but send nothing.
"""
from datetime import datetime, timedelta

try:
    from zoneinfo import ZoneInfo
except ImportError:  # pragma: no cover — py<3.9
    from backports.zoneinfo import ZoneInfo  # type: ignore

from croniter import croniter
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from django.utils import timezone

from eventify_logger.services import log
from notifications.emails import BUILDERS, render_and_send
from notifications.models import NotificationSchedule


IST = ZoneInfo('Asia/Kolkata')


def _is_due(schedule: NotificationSchedule, now_ist: datetime) -> bool:
    """Return True if ``schedule`` should fire at ``now_ist``.

    ``croniter`` is seeded with ``last_run_at`` (or one year ago for a fresh
    schedule) and asked for the next fire time. If that time has already
    passed relative to ``now_ist``, the schedule is due.
    """
    if not croniter.is_valid(schedule.cron_expression):
        return False

    if schedule.last_run_at is not None:
        seed = schedule.last_run_at.astimezone(IST)
    else:
        seed = now_ist - timedelta(days=365)

    itr = croniter(schedule.cron_expression, seed)
    next_fire = itr.get_next(datetime)
    return next_fire <= now_ist


class Command(BaseCommand):
    help = 'Dispatch due NotificationSchedule email jobs.'

    def add_arguments(self, parser):
        parser.add_argument(
            '--schedule-id', type=int, default=None,
            help='Force-run a single schedule by ID, ignoring the cron check.',
        )
        parser.add_argument(
            '--dry-run', action='store_true',
            help='Render and log but do not send or persist last_run_at.',
        )

    def handle(self, *args, **opts):
        schedule_id = opts.get('schedule_id')
        dry_run = opts.get('dry_run', False)

        now_ist = datetime.now(IST)
        qs = NotificationSchedule.objects.filter(is_active=True)
        if schedule_id is not None:
            qs = qs.filter(id=schedule_id)

        candidate_ids = list(qs.values_list('id', flat=True))
        if not candidate_ids:
            self.stdout.write('No active schedules to evaluate.')
            return

        fired = 0
        skipped = 0
        errored = 0

        for sid in candidate_ids:
            with transaction.atomic():
                locked_qs = (
                    NotificationSchedule.objects
                    .select_for_update(skip_locked=True)
                    .filter(id=sid, is_active=True)
                )
                schedule = locked_qs.first()
                if schedule is None:
                    skipped += 1
                    continue

                forced = schedule_id is not None
                if not forced and not _is_due(schedule, now_ist):
                    skipped += 1
                    continue

                if schedule.notification_type not in BUILDERS:
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = (
                        f'No builder registered for {schedule.notification_type!r}'
                    )
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                if dry_run:
                    self.stdout.write(
                        f'[dry-run] would fire schedule {schedule.id} '
                        f'({schedule.name}) type={schedule.notification_type}'
                    )
                    fired += 1
                    continue

                try:
                    recipient_count = render_and_send(schedule)
                except Exception as exc:  # noqa: BLE001 — wide catch, store msg
                    log('error', 'notification dispatch failed', logger_data={
                        'schedule_id': schedule.id,
                        'schedule_name': schedule.name,
                        'error': str(exc),
                    })
                    schedule.last_status = NotificationSchedule.STATUS_ERROR
                    schedule.last_error = str(exc)[:2000]
                    schedule.save(update_fields=['last_status', 'last_error', 'updated_at'])
                    errored += 1
                    continue

                schedule.last_run_at = timezone.now()
                schedule.last_status = NotificationSchedule.STATUS_SUCCESS
                schedule.last_error = ''
                schedule.save(update_fields=[
                    'last_run_at', 'last_status', 'last_error', 'updated_at',
                ])
                fired += 1
                self.stdout.write(
                    f'Fired schedule {schedule.id} ({schedule.name}) '
                    f'→ {recipient_count} recipient(s)'
                )

        summary = f'Done. fired={fired} skipped={skipped} errored={errored}'
        self.stdout.write(summary)
        log('info', 'send_scheduled_notifications complete', logger_data={
            'fired': fired, 'skipped': skipped, 'errored': errored,
            'dry_run': dry_run, 'forced_id': schedule_id,
        })
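The dispatcher pins evaluation to Asia/Kolkata so the due-check agrees with the wall-clock week the email body reports. `_upcoming_week_bounds` itself is outside this diff; the sketch below shows the kind of helper that name suggests, under two explicit assumptions: Monday-start weeks, and a fixed +05:30 offset standing in for `ZoneInfo('Asia/Kolkata')` (IST has no DST, so the offset is equivalent year-round).

```python
from datetime import datetime, timedelta, timezone
from typing import Optional, Tuple

# Fixed offset stands in for ZoneInfo('Asia/Kolkata') in this sketch.
IST = timezone(timedelta(hours=5, minutes=30))


def upcoming_week_bounds(now: Optional[datetime] = None) -> Tuple[datetime, datetime]:
    """Hypothetical sketch: half-open [start, end) of the current
    Monday-to-Sunday week in IST. The real ``_upcoming_week_bounds`` in
    notifications/emails.py is not shown in this diff and may differ.
    """
    now = (now or datetime.now(IST)).astimezone(IST)
    # weekday(): Monday == 0, so subtracting it lands on this week's Monday.
    start = (now - timedelta(days=now.weekday())).replace(
        hour=0, minute=0, second=0, microsecond=0,
    )
    return start, start + timedelta(days=7)
```

Computing the bounds in IST rather than UTC matters near the boundary: 01:00 IST on a Monday is still 19:30 UTC on Sunday, which would fall in the previous UTC week.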
notifications/migrations/0001_initial.py (new file, 93 lines)
@@ -0,0 +1,93 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='Notification',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('title', models.CharField(max_length=255)),
                ('message', models.TextField()),
                ('notification_type', models.CharField(
                    choices=[
                        ('event', 'Event'),
                        ('promo', 'Promotion'),
                        ('system', 'System'),
                        ('booking', 'Booking'),
                    ],
                    default='system', max_length=20,
                )),
                ('is_read', models.BooleanField(default=False)),
                ('action_url', models.URLField(blank=True, null=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('user', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='notifications',
                    to=settings.AUTH_USER_MODEL,
                )),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.CreateModel(
            name='NotificationSchedule',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
                ('notification_type', models.CharField(
                    choices=[('events_expiring_this_week', 'Events Expiring This Week')],
                    db_index=True, max_length=64,
                )),
                ('cron_expression', models.CharField(
                    default='0 0 * * 1',
                    help_text=(
                        'Standard 5-field cron (minute hour dom month dow). '
                        'Evaluated in Asia/Kolkata.'
                    ),
                    max_length=100,
                )),
                ('is_active', models.BooleanField(db_index=True, default=True)),
                ('last_run_at', models.DateTimeField(blank=True, null=True)),
                ('last_status', models.CharField(blank=True, default='', max_length=20)),
                ('last_error', models.TextField(blank=True, default='')),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('updated_at', models.DateTimeField(auto_now=True)),
            ],
            options={'ordering': ['-created_at']},
        ),
        migrations.AddIndex(
            model_name='notificationschedule',
            index=models.Index(
                fields=['is_active', 'notification_type'],
                name='notificatio_is_acti_26dfb5_idx',
            ),
        ),
        migrations.CreateModel(
            name='NotificationRecipient',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('email', models.EmailField(max_length=254)),
                ('display_name', models.CharField(blank=True, default='', max_length=200)),
                ('is_active', models.BooleanField(default=True)),
                ('created_at', models.DateTimeField(auto_now_add=True)),
                ('schedule', models.ForeignKey(
                    on_delete=django.db.models.deletion.CASCADE,
                    related_name='recipients',
                    to='notifications.notificationschedule',
                )),
            ],
            options={
                'ordering': ['display_name', 'email'],
                'unique_together': {('schedule', 'email')},
            },
        ),
    ]
notifications/migrations/__init__.py (new, empty)
notifications/models.py
@@ -1,4 +1,16 @@
"""
Two distinct concerns live in this app:

1. ``Notification`` — consumer-facing in-app inbox entries surfaced on the
   mobile SPA (/api/notifications/list/). One row per user per alert.

2. ``NotificationSchedule`` + ``NotificationRecipient`` — admin-side recurring
   email jobs configured from the Command Center Settings tab and dispatched by
   the ``send_scheduled_notifications`` management command (host cron).
   Not user-facing; strictly operational.
"""
from django.db import models

from accounts.models import User
@@ -23,3 +35,68 @@ class Notification(models.Model):
    def __str__(self):
        return f"{self.notification_type}: {self.title} → {self.user.email}"


class NotificationSchedule(models.Model):
    """One configurable recurring email job.

    New types are added by registering a builder in ``notifications/emails.py``
    and adding the slug to ``TYPE_CHOICES`` below. The cron expression is
    evaluated in ``Asia/Kolkata`` by the dispatcher (matches the operations
    team's timezone).
    """

    TYPE_EVENTS_EXPIRING_THIS_WEEK = 'events_expiring_this_week'

    TYPE_CHOICES = [
        (TYPE_EVENTS_EXPIRING_THIS_WEEK, 'Events Expiring This Week'),
    ]

    STATUS_SUCCESS = 'success'
    STATUS_ERROR = 'error'

    name = models.CharField(max_length=200)
    notification_type = models.CharField(
        max_length=64, choices=TYPE_CHOICES, db_index=True,
    )
    cron_expression = models.CharField(
        max_length=100, default='0 0 * * 1',
        help_text='Standard 5-field cron (minute hour dom month dow). '
                  'Evaluated in Asia/Kolkata.',
    )
    is_active = models.BooleanField(default=True, db_index=True)
    last_run_at = models.DateTimeField(null=True, blank=True)
    last_status = models.CharField(max_length=20, blank=True, default='')
    last_error = models.TextField(blank=True, default='')
    created_at = models.DateTimeField(auto_now_add=True)
    updated_at = models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ['-created_at']
        indexes = [models.Index(fields=['is_active', 'notification_type'])]

    def __str__(self):
        return f'{self.name} ({self.notification_type})'


class NotificationRecipient(models.Model):
    """Free-form recipient — not tied to a User row, so external stakeholders
    (vendors, partners, sponsors) can receive notifications without needing
    platform accounts."""

    schedule = models.ForeignKey(
        NotificationSchedule,
        on_delete=models.CASCADE,
        related_name='recipients',
    )
    email = models.EmailField()
    display_name = models.CharField(max_length=200, blank=True, default='')
    is_active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)

    class Meta:
        unique_together = [('schedule', 'email')]
        ordering = ['display_name', 'email']

    def __str__(self):
        label = self.display_name or self.email
        return f'{label} ({self.schedule.name})'
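The ``NotificationSchedule`` docstring gives the extension recipe: register a builder in ``notifications/emails.py``, then add the slug to ``TYPE_CHOICES``. Stripped of Django, the registry-and-dispatch pattern reduces to a plain dict, as in the sketch below; the ``partners_pending_kyc`` type and both builder bodies are hypothetical illustrations, not part of this codebase:

```python
# Minimal sketch of the builder-registry pattern described above.
# Both builder bodies and the 'partners_pending_kyc' slug are hypothetical.

def _build_events_expiring_this_week(schedule) -> tuple:
    # Stand-in for the real builder in notifications/emails.py.
    return ('Events expiring this week', '<html>...</html>')


def _build_partners_pending_kyc(schedule) -> tuple:
    # Step 1 of adding a type: write a builder returning (subject, html).
    return ('Partners awaiting KYC review', '<html>...</html>')


BUILDERS = {
    'events_expiring_this_week': _build_events_expiring_this_week,
    # Step 2: register the new slug here (and mirror it in TYPE_CHOICES
    # so the admin UI can offer it).
    'partners_pending_kyc': _build_partners_pending_kyc,
}


def dispatch(notification_type: str, schedule=None) -> tuple:
    # Mirrors render_and_send's guard: unknown slugs are rejected loudly
    # rather than silently skipped.
    builder = BUILDERS.get(notification_type)
    if builder is None:
        raise ValueError(f'No builder for notification type: {notification_type}')
    return builder(schedule)
```

Keeping the registry as data (rather than an if/elif chain) is what lets the management command pre-validate ``schedule.notification_type not in BUILDERS`` before attempting a send.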
@@ -9,3 +9,5 @@ psycopg2-binary==2.9.9
 djangorestframework-simplejwt==5.3.1
 google-auth>=2.0.0
 requests>=2.28.0
+qrcode[pil]>=7.4.2
+croniter>=2.0.0
@@ -2,3 +2,6 @@ Django>=4.2
 Pillow
 django-summernote
 google-auth>=2.0.0
+requests>=2.31.0
+qrcode[pil]>=7.4.2
+croniter>=2.0.0