Django migration would delete important table - django-migrations

SHORT VERSION: Django's migration subsystem seems to want to drop and re-create my table, rather than just adding a column. How can I fix that?
LONG VERSION:
I'd like to add a field to one of my Django 3.0 models. No biggie, right?
add field to the class definition
manage.py makemigrations
manage.py migrate
So here's the strange issue: when I run makemigrations, I see this in the output:
Migrations for 'api':
  api/migrations/0006_auto_20200814_0953.py
    - Delete model APISearch
    [snip]
    - Create model APISearch
    [snip]
And sure enough, there are instructions in the resulting migrations to delete and then create an APISearch model.
That would be very bad...APISearch is a real table in my database, containing important data.
I think the issue has to do with the fact that a long time ago, APISearch was a proxy class (it has long since been changed to a concrete class). I can't figure out how Django is determining proxy-ness, so that I can correct it.

I think I found the solution.
Step one: edit the migration that originally created the class, so the model is no longer marked as a proxy:
operations = [
    migrations.CreateModel(
        name='APISearch',
        fields=[
        ],
        options={
            'proxy': False,  # changed from True so Django treats the model as concrete
            'indexes': [],
        },
        bases=('foo.search',),
    ),
]
Step two: create an empty migration and use the AddField operation to do what I want:
operations = [
    migrations.AddField(
        model_name='apisearch',
        name='my_new_field',
        field=models.NullBooleanField(blank=True, null=True),
    ),
]
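For completeness, here is a sketch of what that second, hand-written migration file might look like. The file name and the dependency entry are illustrative; running manage.py makemigrations api --empty generates the stub with the correct dependency already filled in:

# api/migrations/XXXX_add_my_new_field.py -- illustrative name
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('api', '0005_previous_migration'),  # illustrative; use the real predecessor
    ]

    operations = [
        migrations.AddField(
            model_name='apisearch',
            name='my_new_field',
            field=models.NullBooleanField(blank=True, null=True),
        ),
    ]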

Related

Entity Framework Code First post-migration step?

Is there some way to add a post-migration method to a Code First EF migration?
All the stored procs are in the Visual Studio project. Right now the approach is to load the stored proc resource from the file and put it into its own migration:
protected override void Up(MigrationBuilder migrationBuilder)
{
    var script = ScriptMgr.LoadStoredProc("StoredProcThatChanged.sql");
    migrationBuilder.Sql(script);
}
There is a weak link in this process: each time the script (StoredProcThatChanged.sql) changes, a new migration needs to be created to make sure it executes again. The problem is that the previous migration also loads the same file, so when a new script is generated, both migrations read in the one file and the previous migration is effectively changed, which is a classic no-no.
This would be resolved if there were a post-migration step where ALL stored procs could be reapplied to the DB. Is such a step possible? If so, how is it done?
I have been digging into the EF Core source code and it looks like it is possible. Not ideal, but there might be a way...
EF Core has an interface called IMigrator, which contains the method string GenerateScript(...). Its implementation, the class Migrator, has comments all over the place saying that its implementation of GenerateScript is internal and subject to change. But... it looks to me like I can achieve my end goal:
class MyMigrator : Microsoft.EntityFrameworkCore.Migrations.Internal.Migrator
{
    // note: Migrator has no parameterless constructor, so a derived class also
    // needs a constructor that forwards the base class's dependencies
    public override string GenerateScript(
        string? fromMigration = null,
        string? toMigration = null,
        MigrationsSqlGenerationOptions options = MigrationsSqlGenerationOptions.Default)
    {
        var result = base.GenerateScript(fromMigration, toMigration, options);
        result += MyPostSteps(...);  // append the post-migration SQL
        return result;
    }
}
Will this work and does anyone know how I might go about replacing the default Migrator with the MyMigrator?
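In case it helps, one plausible way to wire in the replacement (a sketch only, not verified against your EF Core version; the context class and connection string are illustrative) is EF Core's service-replacement hook on DbContextOptionsBuilder, which swaps the internal IMigrator registration for your own type. MyMigrator would need a constructor that forwards the base Migrator's dependencies so dependency injection can construct it:

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Migrations;

public class MyDbContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder
            .UseSqlServer("<connection string>")       // or whichever provider you actually use
            .ReplaceService<IMigrator, MyMigrator>();  // swap in the custom migrator
    }
}

Because Migrator lives in an Internal namespace, this relies on internal APIs and may break between releases, as the comments in the source warn.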

Will Syncdb use Meta or MongoMeta on Django-nonrel/mongodb Models?

I have a sample class below, but when I run manage.py syncdb, none of the indexes below get created. There is also a whole lot of confusion on the internet, to the point that I wonder whether I should be using "Meta" or "MongoMeta" (I tried both).
What is the proper way to allow automatic index creation in Django-nonrel with MongoDB?
class Item(models.Model):
    xxx
    xxx
    # ...

    class MongoMeta:
        unique_together = [("CountryRetailer", "ProductId")]
        indexes = [
            [('CountryRetailer', 1)],
            [('ProductId', 1)],
            [('OnlineRetailerName', 1)],
            [('UniqueKey', 1)],
            [('Type', 1), ('PriceValue', 1)],
            [('CreatedOn', -1)],
        ]
You have to add django_mongodb_engine to INSTALLED_APPS to make it work.
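For reference, a minimal sketch of the settings this implies (the database name is illustrative; the rest of your settings stay as they are):

# settings.py (sketch)
DATABASES = {
    'default': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'my_database',  # illustrative database name
    }
}

INSTALLED_APPS = (
    # ... your existing apps ...
    'django_mongodb_engine',
)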

Data models generated by Sqlautocode: 'RelationshipProperty' object has no attribute 'c'

Using PGModeler, we created a schema and then exported out some appropriate SQL code. The SQL commands were able to populate the appropriate tables and rows in our Postgres database.
From here, we wanted to create declarative Sqlalchemy models, and so went with Sqlautocode. We ran it at the terminal:
sqlautocode postgresql+psycopg2://username:password@host/db_name -o models.py -d
And it generated our tables and corresponding models as expected. So far, zero errors.
Then, when going to ipython, I imported everything from models.py and simply tried creating an instance of a class defined there. Suddenly, I get this error:
AttributeError: 'RelationshipProperty' object has no attribute 'c'
This one left me confused for a while. The other SO threads that discuss this had solutions nowhere near my issue (often related to a specific framework or syntax not being used by sqlautocode).
After finding the reason, I decided to document the issue at hand. See below.
Our problem was simply due to bad naming given to our variables when sqlautocode ran. Specifically, the bad naming happened with any model that had a foreign key to itself.
Here's an example:
# Note that all "relationship"s below are now "relation"
# it is labeled relationship here because I was playing around...
service_catalog = Table(u'service_catalog', metadata,
    Column(u'id', BIGINT(), nullable=False),
    Column(u'uuid', UUID(), primary_key=True, nullable=False),
    Column(u'organization_id', INTEGER(), ForeignKey('organization.id')),
    Column(u'type', TEXT()),
    Column(u'name', TEXT()),
    Column(u'parent_service_id', BIGINT(), ForeignKey('service_catalog.id')),
)

# Later on...
class ServiceCatalog(DeclarativeBase):
    __table__ = service_catalog

    # relation definitions
    organization = relationship('Organization', primaryjoin='ServiceCatalog.organization_id==Organization.id')
    activities = relationship('Activity', primaryjoin='ServiceCatalog.id==ActivityService.service_id', secondary=activity_service, secondaryjoin='ActivityService.activity_id==Activity.id')
    service_catalog = relationship('ServiceCatalog', primaryjoin='ServiceCatalog.parent_service_id==ServiceCatalog.id')
    organizations = relationship('Organization', primaryjoin='ServiceCatalog.id==ServiceCatalog.parent_service_id', secondary=service_catalog, secondaryjoin='ServiceCatalog.organization_id==Organization.id')
In ServiceCatalog.organizations, secondary is meant to be the service_catalog Table, but inside the class body that name has just been rebound to the service_catalog relationship defined on the previous line, so secondary receives a RelationshipProperty instead of a Table (hence the missing .c attribute). Swapping the order of those two relationship definitions (or renaming one of them) fixes the issue.
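A sketch of the fixed class, keeping the Table definition and imports from the generated models.py: either rename the self-referential relationship or, as here, define organizations before the name service_catalog is rebound inside the class body, so that secondary= still receives the module-level Table:

class ServiceCatalog(DeclarativeBase):
    __table__ = service_catalog

    # relation definitions
    organization = relationship('Organization', primaryjoin='ServiceCatalog.organization_id==Organization.id')
    activities = relationship('Activity', primaryjoin='ServiceCatalog.id==ActivityService.service_id', secondary=activity_service, secondaryjoin='ActivityService.activity_id==Activity.id')
    # defined while `service_catalog` still refers to the Table above
    organizations = relationship('Organization', primaryjoin='ServiceCatalog.id==ServiceCatalog.parent_service_id', secondary=service_catalog, secondaryjoin='ServiceCatalog.organization_id==Organization.id')
    # this rebinding now happens last, so it no longer shadows the Table for `secondary=`
    service_catalog = relationship('ServiceCatalog', primaryjoin='ServiceCatalog.parent_service_id==ServiceCatalog.id')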

Oil g Migration Not Writing Any Queries

Quick Question:
When I run this ...
php htdocs/travel-photo/oil generate migration follow user:string pass:text name:string photo:string email:string
I looked at the migrations folder, and this is what it shows me:
namespace Fuel\Migrations;

class Follow
{
    public function up()
    {
    }

    public function down()
    {
    }
}
What could be wrong here?
Thanks.
I should've read the documentation more carefully. You need to specify what action you want the migration to perform. So instead of what I created:
php htdocs/travel-photo/oil generate migration users user:string pass:text name:string photo:string email:string
I needed to use the create_ prefix:
php htdocs/travel-photo/oil generate migration create_users user:string pass:text name:string photo:string email:string
http://docs.fuelphp.com/packages/oil/generate.html#migrations
I see this going in one of two ways: (i) either generate a model using oil and have the migration auto-generated, or (ii) use a magic-named migration.
For option (i), use oil generate model and the rest should be pretty obvious - oil generates the migration as well as the model.
For option (ii), the magic-name tells oil what you are trying to accomplish with the migration, so you can create tables, add columns, drop tables, or pretty much anything else supported by oil, simply by defining a name that oil can interpret as one of these operations.
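To illustrate option (ii), a couple of magic-named commands of the kind described in the FuelPHP docs linked above (table and column names here are purely illustrative; see that page for the full set of recognised name patterns):

php oil generate migration create_users name:string email:string
php oil generate migration add_bio_to_users bio:text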

Can't persist an entity

I use JPA 2.0 for my project.
I delete an entity and then try to persist the same data, but it throws:
org.hibernate.ObjectDeletedException: deleted instance passed to merge.
Below is what I am doing:
for (Education edu : educations) {
    entManager.remove(edu);
    Education tempEdu = new Education();
    tempEdu.setCourse(edu.getCourse());
    tempEdu.setInstitution(edu.getInstitution());
    tempEdu.setPlace(edu.getPlace());
    tempEdu.setFromDate(edu.getFromDate());
    tempEdu.setToDate(edu.getToDate());
    tempEdu.setMember(updatedMem);
    entManager.merge(tempEdu);
}
Can you tell me how to remove an entity and then persist its data in another entity?
Regards,
Satya
I think the remove operation cascades to some of the relationships, which you are using later:
tempEdu.setCourse(edu.getCourse());
tempEdu.setInstitution(edu.getInstitution());
tempEdu.setPlace(edu.getPlace());
Try commenting out these lines and see whether it works. If so, you must either remove cascading on these relationships or create copies of these objects, just as you are doing with tempEdu.
And this leads to the question already asked by axtavt - what are you trying to achieve? You'll end up deleting a bunch of objects and then recreating them and saving them back...
As you posted in the comments - If any entity is deleted/edited/added in the UI, I delete all related entities and then recreate the ones which come from the UI ...
Maintain two lists, one for entities in database & another for UI.
//---
dbEntries.removeAll(uiEntries); // dbEntries will now hold the entities to be deleted
for (Education edu : dbEntries) {
    em.remove(edu); // deleting entries not found in the UI
}
for (Education edu : uiEntries) {
    em.merge(edu); // updating existing entities, persisting new ones
}
//---
This way you can achieve what you are trying to do; point out if I got it wrong.
Otherwise, in your current code, try merging tempEdu before removing edu.
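A sketch of that reordering, using the same loop from the question (whether it actually avoids the exception still depends on how the cascades are mapped):

for (Education edu : educations) {
    Education tempEdu = new Education();
    tempEdu.setCourse(edu.getCourse());
    tempEdu.setInstitution(edu.getInstitution());
    tempEdu.setPlace(edu.getPlace());
    tempEdu.setFromDate(edu.getFromDate());
    tempEdu.setToDate(edu.getToDate());
    tempEdu.setMember(updatedMem);
    entManager.merge(tempEdu); // merge the fresh copy first...
    entManager.remove(edu);    // ...then remove the original entity
}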