Fix refresh token max length (occurs with Microsoft OAuth) #1601
richardARPANET wants to merge 1 commit into django-oauth:master
Conversation
Codecov Report: ✅ All modified and coverable lines are covered by tests.
@richardARPANET can you describe how this is set up with MS OAuth, so I can better understand the setup for testing? I'd like to do a live test. We'll need a test case to cover this, and we'll need to consider the performance implications of the larger token. Generally this happens when you're getting JWT tokens instead of opaque tokens...
Pull request overview
Extends the RefreshToken.token field length to prevent DB errors when storing longer refresh tokens (notably from Microsoft OAuth).
Changes:
- Increase `AbstractRefreshToken.token` max length from 255 to 8000.
- Add a Django migration to apply the schema change.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| oauth2_provider/models.py | Increases refresh token field max length to accommodate longer provider tokens |
| oauth2_provider/migrations/0013_alter_refreshtoken_token.py | Applies the DB schema change via AlterField migration |
```diff
         settings.AUTH_USER_MODEL, on_delete=models.CASCADE, related_name="%(app_label)s_%(class)s"
     )
-    token = models.CharField(max_length=255)
+    token = models.CharField(max_length=8000)
```
Using a very large CharField(max_length=8000) can have cross-database operational downsides (e.g., index-length limitations on some MySQL/MariaDB setups if this column is or becomes indexed/unique; also potentially larger on-row storage and slower comparisons). If lookups are supported via the existing checksum mechanism (note the 0012_add_token_checksum dependency in this PR), consider switching token to a TextField for storage and ensuring queries/indexing rely on the checksum field instead. If you keep CharField, it would help to document why 8000 was chosen (provider observation/spec reference) to avoid arbitrary magic-number drift.
Suggested change:

```diff
-    token = models.CharField(max_length=8000)
+    token = models.TextField()
```
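If token lookups go through the checksum column introduced by the `0012_add_token_checksum` dependency, the storage field can grow (or become a `TextField`) without hurting index performance. Below is a minimal standalone sketch of that idea using only `hashlib`; the helper name `token_checksum` is hypothetical, not the toolkit's actual API:

```python
import hashlib


def token_checksum(token: str) -> str:
    # Index and query a fixed-length SHA-256 hex digest rather than the
    # variable-length token itself; the digest is always 64 characters,
    # so the index stays small and uniform regardless of token size.
    return hashlib.sha256(token.encode("utf-8")).hexdigest()


# Even an 8000-character (JWT-style) refresh token maps to a 64-char key.
long_token = "x" * 8000
print(len(token_checksum(long_token)))  # 64
```

With this pattern, equality lookups hit the fixed-width checksum index, and the raw token column is only read after a checksum match.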
```python
    operations = [
        migrations.AlterField(
            model_name="refreshtoken",
            name="token",
```
This migration locks in the new length, but there’s no contextual hint for future maintainers about why 8000 is the chosen bound (vs. using an unbounded text-like field). Consider adding a brief comment in the model (not the migration) explaining the rationale (e.g., known provider token sizes / why not TextField) so the next schema change doesn’t require re-discovering the same provider constraints.
Suggested change:

```diff
             name="token",
+            # Use a bounded CharField instead of TextField so the token remains indexable
+            # in common databases; 8000 chars comfortably covers known provider token sizes.
```
@richardARPANET hey, do you have time to look into the Copilot feedback? I like the idea of moving to a larger text field, but we need to confirm lookups are using checksums first to keep the indexes performant. We'll also need the tests to get this in.
Fix for:

`value too long for type character varying(255)`

255 is far too short, as per OAuth guidelines (the spec sets no maximum token length). Changed to 8000. Now Microsoft OAuth works.
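For context, the error above is PostgreSQL rejecting an over-length value for a `varchar(255)` column. The sketch below mimics that length constraint in plain Python (the token value is fabricated; real Microsoft refresh tokens are opaque strings that can run well past 255 characters):

```python
def fits_varchar(value: str, limit: int) -> bool:
    # Postgres raises 'value too long for type character varying(N)'
    # when an inserted value exceeds the column's declared length N.
    return len(value) <= limit


# A fabricated stand-in for a long Microsoft-style refresh token.
token = "a" * 1300

print(fits_varchar(token, 255))   # False -> the original DB error
print(fits_varchar(token, 8000))  # True  -> fits after this PR
```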