Dynamic composite key in Entity Framework?

I have a database where the convention is that tables keep historic records, numbered by an incrementing Version column. For example, a Product with Id 17 might have three records in the [Products] table:
Id  Version  Name
--  -------  ----
17  1        <null>
17  2        'Pinpogn Ball'
17  3        'Pingpong ball'
The most up-to-date record is the one with Id 17 and Version 3.
Furthermore, the convention in this database is that records referencing such Products specify only the Product Id, taking for granted that the record with MAX(Version) is the one being referred to.
For example, the [ProductColors] table might have a record:
Id  Version  ProductId  Color
--  -------  ---------  -----
99  1        17         'Fluorescent Green'
My challenge: I am trying to access this DB using Entity Framework 6 Code First. Is there a way for me to adapt EF6 to the conventions in the DB?
That is, can I "teach" EF to de-reference columns like the ProductId column above, resolving to the record with the matching Id and the highest Version?
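To make the convention concrete, this is the lookup the question wants EF to perform implicitly whenever another table stores only a Product Id (a sketch in T-SQL style; the question does not name the actual database engine):
-- Resolve ProductId 17 to its current record, i.e. the row with the highest Version.
-- This is the de-reference the asker wants EF6 to perform automatically.
SELECT TOP (1) p.Id, p.Version, p.Name
FROM [Products] p
WHERE p.Id = 17
ORDER BY p.Version DESC;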

Related

Moving data between PostgreSQL databases respecting conflicting keys

Situation
I have two databases which were at one time direct copies of each other, but they now contain new, different data.
What do I want to do
I want to move data from database "SOURCE" to database "TARGET". The problem is that the tables use auto-incremented keys, and since both databases are in use at the same time, a lot of the IDs are already taken in TARGET, so I cannot simply identity-insert the data coming from SOURCE.
But in theory we could just not use identity insert at all and let the database take care of assigning new IDs.
What makes it harder is that we have around 50 tables, all connected to each other by foreign keys. Clearly the foreign key values will also have to be remapped, or they will no longer reference the correct rows.
Let's see a very simplified example:
table Human {
id integer NOT NULL PK AutoIncremented
name varchar NOT NULL
parentId integer NULL FK -> Human.id
}
table Pet {
id integer NOT NULL PK AutoIncremented
name varchar NOT NULL
ownerId integer NOT NULL FK -> Human.id
}
SOURCE Human
Id  name     parentId
=====================
1   Aron     null
2   Bert     1
3   Anna     2

SOURCE Pet
Id  name     ownerId
====================
1   Frankie  1
2   Doggo    2

TARGET Human
Id  name     parentId
=====================
1   Armin    null
2   Cecil    1

TARGET Pet
Id  name     ownerId
====================
1   Gatto    2
Let's say I want to move Aron, Bert, Anna, Frankie and Doggo to the TARGET database.
But if we just insert them directly without caring about the original ids, the foreign keys will be garbled:
TARGET Human
Id  name     parentId
=====================
1   Armin    null
2   Cecil    1
3   Aron     null
4   Bert     1
5   Anna     2

TARGET Pet
Id  name     ownerId
====================
1   Gatto    2
2   Frankie  1
3   Doggo    2
Now the parent of Anna is Cecil instead of Bert, the owner of Doggo is also Cecil instead of Bert, and the parent of Bert is Armin instead of Aron.
How I want it to look is:
TARGET Human
Id  name     parentId
=====================
1   Armin    null
2   Cecil    1
3   Aron     null
4   Bert     3
5   Anna     4

TARGET Pet
Id  name     ownerId
====================
1   Gatto    2
2   Frankie  3
3   Doggo    4
Imagine having around 50 similar tables with thousands of rows, so the solution will have to be automated.
Questions
Is there a specific tool I can utilize?
Is there some simple SQL logic to precisely do that?
Do I need to roll my own software to do this (e.g. a service that connects to both databases, reads everything in EF including all relations, and saves it to the other DB)? I fear that there are too many gotchas and that it would be time-consuming.
Is there a specific tool? Not as far as I know.
Is there some simple SQL? Not exactly simple but not all that complex either.
Do you need to roll your own? Maybe; it depends on whether you think you can use the SQL below.
I would guess there is no direct path; the problem, as you note, is getting the FK values reassigned. The following adds a column to each of the tables which can be used to match rows across the two databases; for this I would use a uuid. With that in place you can copy from one table set to the other, except for the FK columns. After copying you can join on the uuid to fill in the FKs.
-- establish a reference field unique across databases.
alter table target_human add sync_id uuid default gen_random_uuid ();
alter table target_pet add sync_id uuid default gen_random_uuid ();
alter table source_human add sync_id uuid default gen_random_uuid ();
alter table source_pet add sync_id uuid default gen_random_uuid ();
-- copy source_human to target_human, except for the parentid FK
insert into target_human(name,sync_id)
select name, sync_id
from source_human;
-- reassign parentid in target_human by matching the original parent rows through sync_id
with conv (sync_parent, sync_child, new_parent) as
( select h2p.sync_id sync_parent, h2c.sync_id sync_child, h1.id new_parent
from source_human h2c
join source_human h2p on h2c.parentid = h2p.id
join target_human h1 on h1.sync_id = h2p.sync_id
)
update target_human h1
set parentid = c.new_parent
from conv c
where h1.sync_id = c.sync_child;
-----------------------------------------------------------------------------------------------
alter table target_pet alter column ownerId drop not null;
insert into target_pet(name, sync_id)
select name, sync_id
from source_pet ;
with conv ( sync_pet,new_owner) as
( select p2.sync_id, h1.id
from source_pet p2
join source_human h2 on p2.ownerid = h2.id
join target_human h1 on h2.sync_id = h1.sync_id
)
update target_pet p1
set ownerid = c.new_owner
from conv c
where p1.sync_id = c.sync_pet;
alter table target_pet alter column ownerId set not null;
See demo. You now reverse the source and target table definitions to complete the other side of the sync. You can then drop the uuid columns if desired, but you may want to keep them: if the databases have gotten out of sync once, they will again. You could even go a step further and make the UUID your PK/FK and then just copy the data and the keys will remain correct, but that might involve updating the apps to the revised DB structure. This does not address communication across databases, but I assume you already have that handled. You will need to repeat this for each table set; perhaps you can write a script to generate the statements (a sketch follows below). Further, I would guess this involves fewer gotchas and is less time-consuming than rolling your own. It is basically 5 queries per table set, but to clean up the current mess, 500 queries is not that much.
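As a sketch of that generation step (my addition, not part of the original answer): with Postgres you can build the per-table statements from information_schema instead of hand-writing 50 of them, assuming everything to be synced lives in the public schema.
-- Sketch: generate the "add sync_id" statements for every base table in the
-- public schema. Assumes Postgres 13+ (or the pgcrypto extension) for
-- gen_random_uuid(); adjust the schema filter to match your setup.
select format(
         'alter table %I add column sync_id uuid default gen_random_uuid();',
         table_name)
from information_schema.tables
where table_schema = 'public'
  and table_type = 'BASE TABLE';
In psql the generated statements can be executed directly with \gexec; the copy and FK-fixup queries per table still follow the pattern shown above.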

Import and merge rails database tables

I have 3 separate ruby on rails applications running PostgreSQL databases, all with the same tables (and columns) but different values.
For example:
app 1 TABLE
name   surname  postcode
------------------------
tom    smith    so211ux

app 2 TABLE
name   surname  postcode
------------------------
mark   smith    so2ddx

app 3 TABLE
name   surname  postcode
------------------------
james  roberts  F2D1ux
I am looking to export/dump/download from two of the databases and import into one consolidated database/app.
If someone could point me in the right direction/reading for this type of query I would be most grateful.
You can use foreign data wrappers (fdw), which let you access data from external PostgreSQL servers. Read more here: https://www.postgresql.org/docs/9.3/static/postgres-fdw.html
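A minimal sketch of that approach, run from the consolidated database (the server name, credentials, and the people table/column names are placeholders based on the example above):
-- Minimal postgres_fdw sketch; all names and credentials are placeholders.
CREATE EXTENSION IF NOT EXISTS postgres_fdw;

CREATE SERVER app2_server
  FOREIGN DATA WRAPPER postgres_fdw
  OPTIONS (host 'app2.example.com', dbname 'app2_production', port '5432');

CREATE USER MAPPING FOR CURRENT_USER
  SERVER app2_server
  OPTIONS (user 'app2_user', password 'secret');

-- Expose the remote table locally...
CREATE FOREIGN TABLE app2_people (
  name     varchar,
  surname  varchar,
  postcode varchar
) SERVER app2_server OPTIONS (schema_name 'public', table_name 'people');

-- ...then copy its rows into the consolidated table, letting the target
-- assign fresh primary keys.
INSERT INTO people (name, surname, postcode)
SELECT name, surname, postcode FROM app2_people;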

Java - JPA, introducing Table generator sequence, How to avoid SQLIntegrityConstraintViolationException

I have a table "MYTABLE" which has an ID & NAME as its column, mapped JPA entity is similar with class "MyPersistence" with attributes id & name.
This table already has few records with unique ids as:
ID          NAME
----------  ---------------
1254        DEV-SA12
234         DEV-SA345
Earlier the ids were generated manually, but now I want to introduce a table sequence generator. The sequence I am using is named "MY_GEN" and has a current value of 100 (restriction: this cannot be changed).
As you can see, Id 234 is already there, so when the sequence reaches 233, generates the next id, and tries to assign it, I get the following exception:
Internal Exception: java.sql.SQLIntegrityConstraintViolationException:
ORA-00001: unique constraint (DEV.MYTABLE_ID_PK) violated
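To make the collision concrete: any existing id above the generator's current value of 100 is an id the sequence will eventually try to hand out again. A quick check (plain SQL, as a sketch):
-- Sketch: list existing ids the generator (currently at 100) will collide with.
SELECT ID
FROM MYTABLE
WHERE ID > 100
ORDER BY ID;
-- With the sample data this returns 234 and 1254, so the first failure happens
-- when the generator reaches 234.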
Any ideas how to avoid this?
Thanks in advance.

entity framework not recognising data type / column format from SQLite

Scene:
SQLite3 database as external source
A number of views exist in the db
Entity Framework
The edmx file doesn't seem to recognise the data types of the columns in the view, even when they are explicitly CAST
Example:
In the SQLite db I have:
CREATE TABLE [dr] (
  [ID] INTEGER PRIMARY KEY AUTOINCREMENT,
  [CoalNum] INT,
  [InputTime] DATE);
-------------
CREATE VIEW "J13" AS
SELECT
  dr.ID,
  date(dr.InputTime) AS InputTime,
  CAST(count(*) AS INT) AS CoalNum
FROM dr
WHERE dr.InputTime >= '2010-01-13'
GROUP BY dr.ID, date(dr.InputTime);
When I update the edmx model, VS2010 can recognise "dr.ID", but it does not recognise "InputTime" and "CoalNum". The error message is "doesn't support datetype".
In the sqlite3 management studio, I checked this view ("J13") and found that the data types of "InputTime" and "CoalNum" are null:
cid  name       type     notnull  dflt_value  pk
---  ---------  -------  -------  ----------  --
0    ID         INTEGER  0                    0
1    InputTime           0                    0
2    CoalNum             0                    0
So I cannot update the data model in Entity Framework. Hopefully someone can help or provide further information before I yell bug.
Someone else had the same question, but I don't know whether he ever solved it.
https://forum.openoffice.org/en/forum/viewtopic.php?t=27214
SQLite uses dynamic typing, and thus does not have data types in computed columns.
Entity Framework assumes that all columns do have known types; this is not fully compatible with SQLite's architecture.
You could try a workaround: create a database with a fake J13 table with data types, update the model, and then replace the database file with the real database while Entity Framework isn't looking.
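A sketch of that workaround, reusing the J13 columns from the question (the declared types are assumptions mirroring the underlying dr table):
-- Sketch of the workaround: a throwaway database file in which J13 is a real
-- table with declared column types, used only so the EF designer can infer them.
CREATE TABLE [J13] (
  [ID] INTEGER PRIMARY KEY,
  [InputTime] DATE,
  [CoalNum] INT);
-- Update the .edmx against this file, then swap the real database (containing
-- the actual J13 view) back in.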

Entity Framework, Junction Tables with Timestamps

I was wanting to know if there is a good way to work timestamps into junction tables using Entity Framework (4.0). An example would be:
Clients
-------
ID      | uniqueidentifier
Name    | varchar(64)

Products
--------
ID      | uniqueidentifier
Name    | varchar(64)

Purchases
---------
Client  | uniqueidentifier
Product | uniqueidentifier
This works smoothly for joining the two together, but I'd like to add a timestamp. Whenever I do that, I'm forced to go through the middle table in my code. I don't think I can add the timestamp field to the junction table - but is there a different method that might be usable?
Well, your question says it all. You must either have a middle "Purchases" entity or not have the timestamp on Purchases. Actually, you can have the field on the table if you don't map it, but if you want it on your entity model then these are the only two choices.
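For reference, this is what the first choice looks like at the table level; the column name, type, and default are assumptions for illustration. Once the junction table carries a column of its own, EF maps it as a full Purchases entity rather than a hidden many-to-many link.
-- Sketch: junction table with a payload column. With PurchasedAt present,
-- EF 4 surfaces Purchases as its own entity instead of collapsing it into a
-- many-to-many association between Clients and Products.
CREATE TABLE Purchases (
  Client      uniqueidentifier NOT NULL REFERENCES Clients(ID),
  Product     uniqueidentifier NOT NULL REFERENCES Products(ID),
  PurchasedAt datetime         NOT NULL DEFAULT GETDATE(),
  PRIMARY KEY (Client, Product)
);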