Remove DingTalk and replace Mirage encryption #21
⚠️ Review failed: the pull request was closed or merged during review.

📝 Walkthrough

Removes DingTalk authentication/notification code, adds field-level encryption utilities and encrypted model fields, replaces mirage-based crypto with `cryptography` usage, and introduces management commands, tests, and a migration to re-encrypt legacy data. Templates, URLs, admin, and tests are updated accordingly.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant App as Application Code
    participant Enc as common.encryption
    participant MF as MultiFernet
    participant DB as Database
    App->>Enc: encrypt_value(plaintext)
    Enc->>Enc: _load_field_encryption_keys()
    Enc->>MF: get_multi_fernet(keys)
    MF->>MF: encrypt(plaintext)
    MF-->>Enc: ciphertext
    Enc-->>App: "enc1:" + ciphertext
    App->>DB: INSERT/UPDATE ("enc1:...")
    DB-->>App: SELECT ("enc1:...")
    App->>Enc: decrypt_value(value)
    alt value starts with "enc1:"
        Enc->>MF: decrypt(ciphertext)
        MF-->>Enc: plaintext
    else legacy format
        Enc->>Enc: decrypt_legacy_value(value)
        Enc-->>App: plaintext or raise DecryptionError
    end
    Enc-->>App: plaintext
```
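The round trip in the diagram can be sketched in a few lines. This is an illustrative reconstruction, not the actual `common.encryption` module: the names `encrypt_value`, `decrypt_value`, `get_multi_fernet`, and the `enc1:` prefix come from the diagram; everything else (signatures, encodings) is assumed.

```python
from cryptography.fernet import Fernet, MultiFernet

ENCRYPTED_VALUE_PREFIX = "enc1:"

def get_multi_fernet(keys):
    # MultiFernet encrypts with the first key and tries every key on
    # decrypt, which is what makes key rotation possible.
    return MultiFernet([Fernet(k) for k in keys])

def encrypt_value(plaintext, mf):
    if plaintext is None:
        return None
    token = mf.encrypt(plaintext.encode("utf-8"))
    return ENCRYPTED_VALUE_PREFIX + token.decode("ascii")

def decrypt_value(stored, mf):
    if stored is None:
        return None
    if stored.startswith(ENCRYPTED_VALUE_PREFIX):
        token = stored[len(ENCRYPTED_VALUE_PREFIX):].encode("ascii")
        return mf.decrypt(token).decode("utf-8")
    # Anything without the prefix would be routed to the legacy
    # decrypt path (or raise DecryptionError) in the real module.
    raise ValueError("not an enc1: value")

mf = get_multi_fernet([Fernet.generate_key()])
stored = encrypt_value("db-password", mf)
assert stored.startswith("enc1:")
assert decrypt_value(stored, mf) == "db-password"
```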
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 6
♻️ Duplicate comments (1)
common/test_encryption.py (1)
19-25: ⚠️ Potential issue | 🟠 Major — Avoid runtime ECB construction in tests; it keeps code scanning failing.

This helper reproduces legacy behavior correctly, but introducing `modes.ECB()` directly triggers the same security finding already reported and can block merge checks. Prefer a fixed precomputed legacy ciphertext fixture (or centralized test fixture) so no ECB primitive is instantiated in test code.

```bash
#!/bin/bash
set -euo pipefail
# Verify whether ECB mode is still present in Python sources.
# Expected after refactor: no matches for modes.ECB().
rg -n --type=py 'modes\.ECB\s*\('
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@common/test_encryption.py` around lines 19 - 25, The test helper _encrypt_legacy_mirage currently constructs modes.ECB() at runtime which triggers security scanning; remove the runtime ECB instantiation and instead return or load a fixed precomputed legacy ciphertext (e.g., a base64 string) from a centralized test fixture or fixture file used by tests; update callers of _encrypt_legacy_mirage to use the fixture value and delete or replace the AES/modes.ECB() usage in that function so no modes.ECB() is referenced anywhere in the test code.
🧹 Nitpick comments (2)
requirements.txt (1)
22-22: Bound `cryptography` to a tested major version.

`cryptography>=45.0.0` allows 46.0.0+ releases, which introduced breaking changes: removed Python 3.7 support, removed deprecated ciphers (CAST5, SEED, IDEA, Blowfish), and removed the `get_attribute_for_oid` method on CertificateSigningRequest. Use `cryptography>=45.0.0,<46.0.0` to receive security patches within the 45.x series while preventing breaking changes.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@requirements.txt` at line 22, The requirements entry for the cryptography dependency is too permissive (cryptography>=45.0.0) and may pull in breaking 46.x releases; change the requirement to pin the tested major series by updating the entry to cryptography>=45.0.0,<46.0.0 in requirements.txt so the project receives 45.x security patches but avoids breaking changes from 46.x.

sql/test_reencrypt_sensitive_fields.py (1)
19-25: Use a real legacy ciphertext fixture here.

This helper mirrors the new compatibility code, so the test can still pass if both copies drift from `django-mirage-field`'s actual wire format or from deployments that used non-default `MIRAGE_*` settings. Please pin this to a known-good ciphertext produced by the removed package or by the pre-migration branch instead of re-implementing the algorithm in the test.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@sql/test_reencrypt_sensitive_fields.py` around lines 19 - 25, The test currently re-implements the legacy Mirage encryption in _encrypt_legacy_mirage which can drift from the real wire format; replace this function with a pinned, known-good legacy ciphertext fixture (a hard-coded base64 ciphertext string produced by the original django-mirage-field or pre-migration branch) and update usages to decode/return that fixture instead of computing encryption at runtime (keep the helper name _encrypt_legacy_mirage so callers don't change). Ensure the fixture reflects the correct secret_key and encoding expected by the test harness and add a short comment referencing the source/origin of the pinned ciphertext for future audits.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@common/encryption.py`:
- Around line 89-94: The current code appends a legacy key from
_derive_secret_key_fernet_key() whenever FIELD_ENCRYPTION_KEYS is empty, which
must be removed: stop returning or appending legacy_key in the keys-loading code
so _derive_secret_key_fernet_key() is only used for legacy reads, and ensure the
write/encryption paths validate that FIELD_ENCRYPTION_KEYS is non-empty and
raise a clear error (e.g., RuntimeError or ValueError) if not; update the
key-loading function (the block that currently sets legacy_key and returns
[legacy_key]) to return an empty list when no configured keys exist, and add a
guard in the encryption function (the code that performs writes/encryption) to
refuse to encrypt when keys list is empty.
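The guard the prompt asks for could be sketched like this. It is a minimal standalone sketch, assuming the real `_load_field_encryption_keys` in `common/encryption.py` reads a comma-separated environment variable; the `require_field_encryption_keys` helper name and the `EncryptionKeyError` class are illustrative, not the project's actual API.

```python
import os

class EncryptionKeyError(RuntimeError):
    """Raised when a write path runs without any configured key."""

def _load_field_encryption_keys():
    # Return only explicitly configured keys; never silently append a
    # key derived from SECRET_KEY (that derivation stays read-only,
    # for decrypting legacy values).
    raw = os.environ.get("FIELD_ENCRYPTION_KEYS", "")
    return [key.strip() for key in raw.split(",") if key.strip()]

def require_field_encryption_keys():
    # Guard for the encrypt path: fail loudly instead of falling back.
    keys = _load_field_encryption_keys()
    if not keys:
        raise EncryptionKeyError(
            "FIELD_ENCRYPTION_KEYS is not set; refusing to encrypt. "
            "Generate one with `python manage.py generate_field_encryption_key`."
        )
    return keys
```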
- Around line 28-35: The encrypt_value function currently treats any input that
looks like the ENCRYPTED_VALUE_PREFIX (enc1:) as already-encrypted which
misclassifies legitimate plaintext; remove the early branch that checks
is_encrypted_value(value) and calls decrypt_value(value) so that encrypt_value
always encrypts non-None inputs (i.e., if value is None return None, otherwise
call get_multi_fernet().encrypt(force_bytes(value)) and wrap with
ENCRYPTED_VALUE_PREFIX), and update/adjust any tests or callers that relied on
the old shortcut; reference functions: encrypt_value, is_encrypted_value,
decrypt_value and the constant ENCRYPTED_VALUE_PREFIX.
- Around line 49-61: The function decrypt_legacy_value currently "fails open" by
returning the original non-empty string when neither _try_decrypt_mirage nor
_try_decrypt_old_hex succeed; change it so that for non-empty values it does not
return the raw input but instead raises a clear exception (e.g. DecryptionError
or ValueError) containing the input marker and context so callers know
decryption failed; keep the existing behavior for empty string/None, and update
any call sites that expect the old behavior to catch and handle this new
exception (refer to decrypt_legacy_value, _try_decrypt_mirage, and
_try_decrypt_old_hex).
In `@common/twofa/sms.py`:
- Around line 65-67: The verify() method currently calls
TwoFactorAuthConfig.objects.get(user=self.user, auth_type=self.auth_type) which
raises TwoFactorAuthConfig.DoesNotExist and causes a 500; wrap that lookup in a
try/except catching TwoFactorAuthConfig.DoesNotExist, and in the except return a
proper validation response (e.g., raise a ValidationError or return False/a
controlled error payload) instead of letting the exception bubble up; apply the
same fix to the analogous lookup in totp.py; keep the lookup of .phone as-is
inside the try and optionally log the missing config with context
(user/auth_type).
In `@common/twofa/totp.py`:
- Around line 28-29: The verify() implementation in common/twofa/totp.py must
guard the TwoFactorAuthConfig.objects.get(...) lookup to avoid an unhandled
DoesNotExist exception: wrap the lookup used to fetch .secret_key in a
try/except catching the model's DoesNotExist (same pattern as the base class
disable() in __init__.py) and return a validation-style failure (status 1 /
appropriate error response) when no TOTP config exists instead of allowing a
500; apply the same defensive change to the analogous lookup in
common/twofa/sms.py (the lookup around lines that fetch the SMS config) so both
verify() paths handle missing configs gracefully.
In `@sql/migrations/0003_remove_resourcegroup_ding_webhook_and_more.py`:
- Around line 64-68: Add a data migration RunPython before the
migrations.AlterField that converts instanceaccount.user to CharField: implement
a forward function that loads all InstanceAccount rows, uses the original
mirage.fields.EncryptedCharField decryption logic to decrypt the stored
ciphertext in the user column, writes back the plaintext to the model (or via
queryset.update) and saves, then a no-op reverse function; insert this
migrations.RunPython operation immediately before the migrations.AlterField for
model_name="instanceaccount" name="user" so the column contains plaintext before
the AlterField runs. Ensure the decryption uses the same code used by the
original EncryptedCharField (refer to mirage.fields.EncryptedCharField or the
reencrypt_sensitive_fields management command logic) and reference the migration
operation names migrations.RunPython and the AlterField for instanceaccount.user
when making the change.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 5e203207-4185-4e7a-be0c-dd67da2b8ae2
📒 Files selected for processing (37)
`archery/settings.py`, `archery/urls.py`, `common/auth.py`, `common/authenticate/dingding_auth.py`, `common/encryption.py`, `common/fields/__init__.py`, `common/fields/encrypted.py`, `common/middleware/check_login_middleware.py`, `common/templates/config.html`, `common/templates/login.html`, `common/test_encryption.py`, `common/tests.py`, `common/twofa/sms.py`, `common/twofa/totp.py`, `common/utils/ding_api.py`, `common/utils/sendmsg.py`, `conftest.py`, `mirage/__init__.py`, `mirage/fields.py`, `requirements.txt`, `sql/admin.py`, `sql/local_demo.py`, `sql/management/commands/generate_field_encryption_key.py`, `sql/management/commands/reencrypt_sensitive_fields.py`, `sql/migrations/0003_remove_resourcegroup_ding_webhook_and_more.py`, `sql/models.py`, `sql/notify.py`, `sql/resource_group.py`, `sql/test_notify.py`, `sql/test_reencrypt_sensitive_fields.py`, `sql/urls.py`, `sql/utils/tasks.py`, `sql/views.py`, `sql_api/serializers.py`, `sql_api/tests.py`, `sql_api/urls.py`, `sql_api/views.py`
💤 Files with no reviewable changes (15)
- conftest.py
- sql/local_demo.py
- common/middleware/check_login_middleware.py
- sql/urls.py
- archery/urls.py
- common/auth.py
- sql_api/urls.py
- common/utils/ding_api.py
- common/tests.py
- common/authenticate/dingding_auth.py
- sql/utils/tasks.py
- sql_api/views.py
- sql_api/tests.py
- sql/notify.py
- common/utils/sendmsg.py
In `common/twofa/totp.py`:

```python
            user=self.user, auth_type=self.auth_type
        ).secret_key
```
🧩 Analysis chain

🏁 Scripts executed:

```bash
#!/bin/bash
# Check other 2FA verify paths for the same unhandled lookup pattern
rg -n -C2 --type=py 'TwoFactorAuthConfig\.objects\.get\(' common/twofa

# Get the full verify() method from totp.py to see the complete flow
head -80 common/twofa/totp.py | tail -60

# Also check the SMS verify method for context on how it's structured
sed -n '50,100p' common/twofa/sms.py

# Check if there's any configuration validation before verify() is called
# Look for callers of verify() in totp.py
rg -n 'def verify\|\.verify\(' common/twofa/totp.py -A 5 -B 2

# Check what the verify() method returns to ensure suggested fix format is correct
rg -n 'return' common/twofa/totp.py

# Check Black formatting compliance on totp.py
pip install black > /dev/null 2>&1
black --check common/twofa/totp.py 2>&1 | head -20

# Look for callers of verify() to see if config existence is validated upstream
rg -n '\.verify\(' common/ -t py | grep -v 'def verify\|t\.verify\|status\.verify'

# Check the TwoFactorAuthConfig model to understand the exception type
rg -n 'class TwoFactorAuthConfig\|DoesNotExist' common/ -t py

# Look for TwoFactorAuthConfig definition - may be in models
rg -n 'TwoFactorAuthConfig' common/ -t py | head -10

# Search for where verify() is called - broader search in entire repo
rg -n '\.verify\(.*otp' --type=py -B 2

# Check imports at top of totp.py to understand model location
head -20 common/twofa/totp.py

# Look at the __init__.py implementation to see how they handle the same config lookup there
cat common/twofa/__init__.py
```
Guard missing TOTP config in verify() to prevent unhandled exceptions.
TwoFactorAuthConfig.objects.get(...) raises DoesNotExist when TOTP is not configured. In api_auth.py:99, verify() is called with only the OTP parameter, triggering the unguarded lookup. This causes an unhandled exception that becomes a 500 error instead of a validation-style failure (status 1).
The same vulnerability exists in common/twofa/sms.py:65–67. The base class disable() method in __init__.py demonstrates the correct pattern for handling this exception.
Suggested fix

```diff
-        secret_key = TwoFactorAuthConfig.objects.get(
-            user=self.user, auth_type=self.auth_type
-        ).secret_key
+        try:
+            secret_key = TwoFactorAuthConfig.objects.get(
+                user=self.user, auth_type=self.auth_type
+            ).secret_key
+        except TwoFactorAuthConfig.DoesNotExist:
+            return {"status": 1, "msg": "TOTP is not configured."}
```
Actionable comments posted: 3
🧹 Nitpick comments (1)
conftest.py (1)
4-6: Scope the default encryption key per test, not at import time.

`_load_field_encryption_keys()` falls back to `os.environ`, and `get_multi_fernet()` caches the first keyset it sees. Setting a process-wide default here makes "missing key" and per-test key override cases depend on manual cache clearing.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@conftest.py` around lines 4 - 6, The module-level call that sets FIELD_ENCRYPTION_KEYS in conftest.py should not run at import time because _load_field_encryption_keys() reads os.environ and get_multi_fernet() caches the first keyset; instead, remove the os.environ.setdefault(...) from module import and set the default inside a per-test setup (e.g., a pytest fixture or pytest_configure) so each test can monkeypatch or override FIELD_ENCRYPTION_KEYS before _load_field_encryption_keys() is invoked; if get_multi_fernet() exposes or relies on an internal cache, add/call a cache-clear helper or reset function in the fixture to ensure tests don’t reuse a previously cached keyset.
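One way to make the cached keyset controllable per test, sketched with a plain key list standing in for the real `MultiFernet` cache (the function names below are illustrative, not the actual `common.encryption` API):

```python
import os

_cached_keys = None  # module-level cache, mirroring get_multi_fernet()

def get_field_encryption_keys():
    # First read wins: later environment changes are invisible until reset.
    global _cached_keys
    if _cached_keys is None:
        raw = os.environ.get("FIELD_ENCRYPTION_KEYS", "")
        _cached_keys = [k for k in raw.split(",") if k]
    return _cached_keys

def reset_key_cache():
    # Hook for a per-test fixture: an autouse pytest fixture would
    # monkeypatch FIELD_ENCRYPTION_KEYS and then call this, so no
    # process-wide default is needed at import time.
    global _cached_keys
    _cached_keys = None
```

A fixture would then do `monkeypatch.setenv("FIELD_ENCRYPTION_KEYS", test_key)` followed by `reset_key_cache()`, and `monkeypatch` restores the environment after each test.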
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@sql/migrations/0003_remove_resourcegroup_ding_webhook_and_more.py`:
- Around line 14-24: The _decrypt_mirage function hardcodes SECRET_KEY and
AES-ECB, which breaks migrations for installs using MIRAGE_SECRET_KEY,
MIRAGE_CIPHER_MODE or MIRAGE_CIPHER_IV; update _decrypt_mirage to mirror
common/encryption.py: accept and use the provided secret_key (falling back to
MIRAGE_SECRET_KEY), derive the cipher_key the same way
(base64.urlsafe_b64encode(force_bytes(secret_key))[:32]) and construct the
Cipher using the configured mode (support at least ECB and CBC) and IV when
required (use MIRAGE_CIPHER_IV for CBC), then perform decryption and unpadding
as before so legacy Mirage keys and CBC-encrypted data decrypt correctly (refer
to function name _decrypt_mirage and any uses in the migration).
In `@src/docker-compose/.env`:
- Line 6: Replace the committed literal field-encryption key in the env template
by removing the current value for FIELD_ENCRYPTION_KEYS and placing a clear
placeholder (e.g. FIELD_ENCRYPTION_KEYS=<GENERATE_WITH_MANAGE_CMD>) so the
repository no longer contains a real key; update any README/setup docs to
instruct developers to run python manage.py generate_field_encryption_key during
setup to generate and populate the real key into their local .env or secret
store.
In `@src/docker-compose/.env.local-arm`:
- Line 5: Replace the hard-coded Fernet key in the env template by removing the
real value for FIELD_ENCRYPTION_KEYS and replacing it with a clear placeholder
(e.g. REPLACE_WITH_GENERATED_FERNET_KEY) so the repo no longer contains a fixed
encryption key; update any setup/bootstrapping logic (installer/script) to
generate and persist a Fernet key into the runtime .env or secrets store at
installation time and document this in your setup instructions so code paths
that read FIELD_ENCRYPTION_KEYS (the env variable name) will get an
installation-generated value instead of the checked-in secret.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: f441a42c-5962-4d2d-8b52-9e0e522e8d8f
📒 Files selected for processing (13)
`common/encryption.py`, `common/fields/encrypted.py`, `common/test_encryption.py`, `common/test_fixtures.py`, `common/tests.py`, `common/twofa/sms.py`, `common/twofa/totp.py`, `conftest.py`, `requirements.txt`, `sql/migrations/0003_remove_resourcegroup_ding_webhook_and_more.py`, `sql/test_reencrypt_sensitive_fields.py`, `src/docker-compose/.env`, `src/docker-compose/.env.local-arm`
🚧 Files skipped from review as they are similar to previous changes (5)
- common/twofa/sms.py
- common/twofa/totp.py
- common/tests.py
- common/fields/encrypted.py
- common/test_encryption.py
In `src/docker-compose/.env`:

```
# https://django-environ.readthedocs.io/en/latest/quickstart.html#usage
# https://docs.djangoproject.com/zh-hans/4.1/ref/settings/
DEBUG=false
FIELD_ENCRYPTION_KEYS=9R_Jxat_be2SV-UbCS0dAYQ0SGjZVf0JyN-VPkVNyi0=
```
Remove the committed field-encryption key from this env template.
This makes every stack bootstrapped from the checked-in file share the same at-rest key. Use a placeholder here and require `python manage.py generate_field_encryption_key` during setup instead.
🧰 Tools
🪛 Betterleaks (1.1.1)
[high] 6-6: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
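Generating a per-deployment key is one line with `cryptography`; presumably the management command boils down to something like this (the wrapper function is illustrative, not the command's actual source):

```python
from cryptography.fernet import Fernet

def generate_field_encryption_key() -> str:
    # A Fernet key: 32 random bytes, urlsafe-base64 encoded (44 chars).
    return Fernet.generate_key().decode("ascii")

print(f"FIELD_ENCRYPTION_KEYS={generate_field_encryption_key()}")
```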
In `src/docker-compose/.env.local-arm`:

```
DEBUG=false
SECRET_KEY=datamingle-local-secret-key-change-me
FIELD_ENCRYPTION_KEYS=9R_Jxat_be2SV-UbCS0dAYQ0SGjZVf0JyN-VPkVNyi0=
```
Don't ship a real field-encryption key in the ARM env template either.
Keeping a fixed Fernet key in version control means any deployment that reuses this file lands in the same decryption domain. This should be a placeholder/generated-at-setup value too.
🧰 Tools
🪛 Betterleaks (1.1.1)
[high] 5-5: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.
(generic-api-key)
🪛 dotenv-linter (4.0.0)
[warning] 5-5: [UnorderedKey] The FIELD_ENCRYPTION_KEYS key should go before the SECRET_KEY key
(UnorderedKey)
What changed

This PR removes DingTalk support end to end and replaces the `django-mirage-field` dependency with repo-owned encrypted fields. It introduces a shared encryption layer built on Fernet, adds management commands for key generation and re-encryption, updates the affected models to use supported field types, and removes the legacy DingTalk auth / notification / config / UI paths.
Why it changed

The goal is to remove two upgrade blockers before moving Django forward:

- `django-auth-dingding` is no longer needed
- `django-mirage-field` is niche and unmaintained for our upgrade path

The new implementation keeps sensitive values encrypted at rest while moving the encryption behavior into code we control.
User and developer impact

- `TwoFactorAuthConfig.username` is removed in favor of the existing `user` foreign key
- `InstanceAccount.user` remains queryable as plain text to preserve lookup and uniqueness behavior
- New management commands: `python manage.py generate_field_encryption_key` and `python manage.py reencrypt_sensitive_fields --batch-size 500`
- A `mirage.fields` compatibility shim is included so historical migrations still run on fresh installs without the third-party package
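The `--batch-size` pass of the re-encryption command can be sketched as a generator that skips values already in the new format. Stub codecs stand in for the real Fernet-based helpers here, and the function name is illustrative; the actual command operates on Django querysets.

```python
PREFIX = "enc1:"

def reencrypt_batches(values, decrypt_legacy, encrypt_value, batch_size=500):
    # Yield lists of (old, new) pairs, batch_size at a time, skipping
    # values that already carry the new prefix.
    batch = []
    for value in values:
        if value.startswith(PREFIX):
            continue
        batch.append((value, encrypt_value(decrypt_legacy(value))))
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

# Stub codecs for demonstration only.
def decrypt_stub(value):
    return value.upper()

def encrypt_stub(value):
    return PREFIX + value

batches = list(
    reencrypt_batches(["a", "enc1:x", "b", "c"], decrypt_stub, encrypt_stub, batch_size=2)
)
assert batches == [[("a", "enc1:A"), ("b", "enc1:B")], [("c", "enc1:C")]]
```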
Root cause

The previous implementation depended on third-party packages that are poor fits for a modern Django upgrade, and some encrypted fields also mixed storage concerns with runtime behavior in ways that made migration and verification harder.
Validation

- `docker exec datamingle-app pytest`
- `docker exec datamingle-app python manage.py makemigrations sql --check`
- `black --check --diff .`

Full pytest result on this branch: 844 passed, 1 skipped.

Summary by CodeRabbit
New Features
Removed Features
User-facing changes & Bug Fixes
Tests