PostgreSQL - Django ORM — Significant database alterations / data migrations
I have a Django-powered website with a PostgreSQL server on EC2. The website has a growing community and tons of user-submitted content. During development the models and views have gotten extremely messy as I tack on new features. I want to start fresh: rewrite the whole program and split the models and views into modular apps.
Just as an example, suppose I wanted to migrate from:
```python
# content/models.py
class Content(models.Model):
    user = models.ForeignKey(User)
    post = models.CharField(max_length=500)
    photo = models.ImageField(upload_to='images/%Y/%m/%d')
```
to:
```python
# content/models.py
class Content(models.Model):
    user = models.ForeignKey(User)
    post = models.CharField(max_length=500)
```
```python
# photo/models.py
class Photo(models.Model):
    photo = models.ImageField(upload_to='images/%Y/%m/%d')
    content = models.ForeignKey(content.models.Content)
```
What is the best way to go about this without losing data?
This case can be solved with 3 South migrations:
- [schema migration] create the `Photo` table
- [data migration] for each `Content` record, create a `Photo` record with its `photo` and `content` fields set
- [schema migration] delete the `photo` field from the `content` table
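Steps 1 and 3 are ordinary South schema migrations and can be generated from the models automatically; a sketch of the commands, assuming the apps are named `photo` and `content` (the names are illustrative, not from the answer):

```
# Step 1: generate and apply the migration that creates the Photo table
python manage.py schemamigration photo --initial
python manage.py migrate photo

# Step 3: after removing Content.photo from models.py,
# generate the migration that drops the column
python manage.py schemamigration content --auto
python manage.py migrate content
```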
Update: for step 2, run

```
python manage.py datamigration <app_name> copy_photos_to_separate_model
```

A new file is created in `<app_name>/migrations/####_copy_photos…`
Edit that file, filling in the `forwards` and `backwards` methods. The first one is called when migrating forward, the other when migrating backwards.

The `forwards` method creates separate `Photo` records out of the consolidated model. The `backwards` method has to pick one of the possibly many photos at a time and squeeze it back into the `Content` model.
The special `orm` object represents the DB's state at the time of the migration (regardless of how `models.py` looks at that moment; when deploying to production it will differ from how it looked when the migration was first run on the test/develop environment).
```python
def forwards(self, orm):
    Content = orm['<app_name>.Content']
    Photo = orm['<app_name>.Photo']
    for content in Content.objects.all():
        Photo.objects.get_or_create(content=content,
                                    defaults={'photo': content.photo})
```
Depending on how big the table is, you can try to optimize the number of queries.
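One common optimization (an assumption on my part, not spelled out in the answer) is to insert in batches with `bulk_create` instead of issuing a `get_or_create` per row; the batching helper itself is plain Python:

```python
def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# In the migration, each batch would feed something like:
#   Photo.objects.bulk_create(
#       [Photo(content=c, photo=c.photo) for c in batch])
# issuing one INSERT per batch instead of two queries per Content row.

print(list(chunked(range(5), 2)))  # [[0, 1], [2, 3], [4]]
```

Note that `bulk_create` skips `get_or_create`'s duplicate check, so it is only safe on a first, clean run of the migration.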
The `get_or_create` call would blow up if there were multiple `Photo` records per single `Content` in the DB, but that should not be the case here.
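The `backwards` direction is not spelled out above; here is a minimal runnable sketch of its logic. The `_Rows`, `Content`, and `Photo` stand-ins exist only to make the snippet self-contained outside Django, where South's `orm['<app_name>.Photo']` would supply the frozen models instead:

```python
class _Rows:
    """Tiny stand-in for a Django manager (illustrative only)."""
    def __init__(self):
        self._items = []
    def add(self, obj):
        self._items.append(obj)
    def all(self):
        return list(self._items)

class Content:
    objects = _Rows()
    def __init__(self, pk):
        self.pk, self.photo = pk, None
    def save(self):
        pass  # Django would UPDATE the row here

class Photo:
    objects = _Rows()
    def __init__(self, content, photo):
        self.content, self.photo = content, photo

def backwards(orm):
    # Squeeze each Photo back into its owning Content; if a Content
    # somehow had several Photos, the last one would win.
    for photo in orm['Photo'].objects.all():
        photo.content.photo = photo.photo
        photo.content.save()

orm = {'Content': Content, 'Photo': Photo}
c = Content(1)
Photo.objects.add(Photo(c, 'images/2013/01/01/pic.jpg'))
backwards(orm)
print(c.photo)  # images/2013/01/01/pic.jpg
```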