Commit d6f9fa2
ImportedFile: use BigAutoField for primary key
We could disable search indexing while we do the migration, but I don't think that should be required. We have 11M records; migrating the SphinxDomain model took 15 min, and that table had ~56M:

```python
In [7]: ImportedFile.objects.count()
Out[7]: 11527437
```

So roughly 3 min of not being able to index new versions doesn't seem bad. There are two things that could happen:

- The query times out and we don't index that version.
- The query waits until the migration is done, and nothing gets lost.

But if we disable search indexing, we definitely won't index new versions. We don't use these models outside search indexing, so doc serving and such shouldn't be affected.

ref #9492
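For context, a back-of-the-envelope sketch (not part of the commit, just illustrative arithmetic) of why the primary key is being widened: Django's `AutoField` is a 32-bit integer that overflows a little past 2.1 billion ids, while `BigAutoField` is 64-bit. Ids can burn through that range much faster than the live row count suggests when rows are churned (deleted and recreated), so headroom matters even at 11M rows:

```python
# Assumption: AutoField maps to a 32-bit "integer" column and
# BigAutoField to a 64-bit "bigint" column (the Postgres mapping).
AUTOFIELD_MAX = 2**31 - 1     # 2147483647
BIGAUTOFIELD_MAX = 2**63 - 1  # 9223372036854775807

rows = 11_527_437  # ImportedFile.objects.count() from the commit message

print(AUTOFIELD_MAX)                         # 2147483647
print(round(100 * rows / AUTOFIELD_MAX, 2))  # 0.54 (% of 32-bit range used by live rows)
```

The live rows use well under 1% of the 32-bit range, but the id sequence only ever moves forward, so it can approach the 32-bit ceiling long before the table itself holds billions of rows.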
1 parent a09bc1a commit d6f9fa2

File tree

2 files changed: +19 -0 lines changed
```diff
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.15 on 2022-10-18 14:21
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("projects", "0093_migrate_null_fields"),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name="importedfile",
+            name="id",
+            field=models.BigAutoField(primary_key=True, serialize=False),
+        ),
+    ]
```

readthedocs/projects/models.py

+1
```diff
@@ -1345,6 +1345,7 @@ class ImportedFile(models.Model):
     things like CDN invalidation.
     """
 
+    id = models.BigAutoField(primary_key=True)
     project = models.ForeignKey(
         Project,
         verbose_name=_('Project'),
```
