Commit f041c62
ImportedFile: use BigAutoField for primary key (#9669)
We could disable search indexing while we run the migration, but I don't think that should be required. We have ~11.5M records, and migrating the SphinxDomain model, which had ~56M records, took 15 minutes:

```python
In [7]: ImportedFile.objects.count()
Out[7]: 11527437
```

So roughly 3 minutes of not being able to index new versions doesn't seem bad. Two things could happen while the migration runs:

- The query times out and we don't index that version.
- The query waits until the migration is done, and nothing gets lost.

But if we disable search indexing, we definitely won't index new versions. We don't use these models outside of search indexing, so doc serving and the like shouldn't be affected.

ref #9492
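The "roughly 3 minutes" figure scales linearly from the SphinxDomain migration. A quick sanity check of that arithmetic, plus the headroom gained by moving to a 64-bit key (row counts and timing are from the commit message; the linear-scaling assumption is mine, actual ALTER TABLE time depends on the database):

```python
# Back-of-the-envelope estimate: SphinxDomain (~56M rows) took 15 minutes,
# so scale linearly to ImportedFile's row count.
sphinxdomain_rows = 56_000_000
sphinxdomain_minutes = 15
importedfile_rows = 11_527_437  # from the commit message

estimated_minutes = sphinxdomain_minutes * importedfile_rows / sphinxdomain_rows
print(f"estimated migration time: {estimated_minutes:.1f} minutes")

# Headroom: Django's AutoField is a 32-bit signed integer column,
# BigAutoField is 64-bit. IDs are consumed even for deleted rows, so a
# high-churn table can exhaust a 32-bit sequence well before holding
# 2**31 rows.
AUTOFIELD_MAX = 2**31 - 1
BIGAUTOFIELD_MAX = 2**63 - 1
print(f"32-bit key exhausts after {AUTOFIELD_MAX:,} IDs")
print(f"64-bit key exhausts after {BIGAUTOFIELD_MAX:,} IDs")
```

This is why the migration is safe to run with indexing left on: the table lock window is short relative to how long a timed-out indexing task can simply be retried.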
1 parent 7a3a718 commit f041c62

File tree

2 files changed (+20, -0 lines changed)

@@ -0,0 +1,19 @@
from django.db import migrations, models
from django_safemigrate import Safe


class Migration(migrations.Migration):

    safe = Safe.after_deploy

    dependencies = [
        ("projects", "0143_addons_flyout_position"),
    ]

    operations = [
        migrations.AlterField(
            model_name="importedfile",
            name="id",
            field=models.BigAutoField(primary_key=True, serialize=False),
        ),
    ]

readthedocs/projects/models.py

+1
@@ -1488,6 +1488,7 @@ class ImportedFile(models.Model):
     things like CDN invalidation.
     """
 
+    id = models.BigAutoField(primary_key=True)
     project = models.ForeignKey(
         Project,
         verbose_name=_("Project"),
