Where and how is the Activiti model diagram stored in the Activiti database tables? - workflow

I have started working with Activiti. I wanted to know: after creating/editing a process model, where and how is the Activiti model diagram stored in the Activiti database tables?
Also, when we edit an Activiti model, the version remains at version 1. Is there any way to increment this version, and how can multiple versions be tracked in the Activiti database?
I can see that versions are incremented when there is a duplicate deployment, but how do I increment the version after editing a model?
Thanks for any help.

Please check out the "3.7. Database table names explained" and "5.4. Automatic resource deployment" sections of the user guide (https://www.activiti.org/userguide/)
You may also list table information and data: https://www.activiti.org/userguide/#_database_tables
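If you want to poke at the tables directly: the model metadata generally lives in ACT_RE_MODEL, and the diagram's editor source is stored as a byte array in ACT_GE_BYTEARRAY, referenced from the model row. Below is a minimal sketch of reading that back; it assumes the Activiti schema is hosted on SQL Server and that the ACT_RE_MODEL / ACT_GE_BYTEARRAY names match your Activiti version, so adjust the connection string and names to your setup.

// Rough sketch only: assumes an Activiti schema on SQL Server and that the
// ACT_RE_MODEL / ACT_GE_BYTEARRAY table and column names below match your Activiti version.
using System;
using System.Data.SqlClient;
using System.Text;

class DumpActivitiModels
{
    static void Main()
    {
        // Placeholder connection string - point it at your Activiti database.
        const string connectionString = "Server=.;Database=activiti;Integrated Security=true";

        // The editor JSON for the diagram is a blob in ACT_GE_BYTEARRAY,
        // referenced from ACT_RE_MODEL via EDITOR_SOURCE_VALUE_ID_.
        const string sql = @"SELECT m.ID_, m.NAME_, m.VERSION_, b.BYTES_
                             FROM ACT_RE_MODEL m
                             JOIN ACT_GE_BYTEARRAY b ON b.ID_ = m.EDITOR_SOURCE_VALUE_ID_";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    var bytes = (byte[])reader["BYTES_"];
                    var json = Encoding.UTF8.GetString(bytes);
                    Console.WriteLine("{0} (version {1}): {2} chars of editor JSON",
                        reader["NAME_"], reader["VERSION_"], json.Length);
                }
            }
        }
    }
}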

Related

Development process for Code First Entity Framework and SQL Server Data Tools Database Projects

I have been using Database First Entity Framework (EDMX) and SQL Server Data Tools Database Projects in combination very successfully - change the schema in the database and 'Update Model from Database' to get the changes into the EDMX. I see though that Entity Framework 7 will be dropping the EDMX format, and I am looking for a new process that will allow me to use Code First in combination with Database Projects.
Lots of my existing development and deployment processes rely on having a database project that contains the schema. This goes into source control, is deployed along with the code, and is used to update the production database, complete with data migration using pre- and post-deployment scripts. I would be reluctant to drop it.
I would be keen to split one big EDMX into many smaller models as part of this work. This will mean multiple Code First models referencing the same database.
Assuming that I have an existing database and a database project to go with it, I am thinking that I would start by using the following wizard to create an initial set of entity and context classes. I would do this for each of the models.
Add | New Item... | Visual C# Items | Data | ADO.NET Entity Data Model | Code first from database
My problem is - where do I go from there? How do I handle schema changes? As long as I can get the database schema updated, I can use a schema compare operation to get the changes into the project.
These are the options that I am considering.
Make changes in the database and use the wizard from above to regenerate. I guess that I would need to keep any modifications to the entity and/or context classes in partial classes so that they do not get overwritten. Automating this with a list of tables etc. to include would be handy. PowerShell or T4 templates maybe? SqlSharpener (suggested by Keith in comments) looks like it might help here. I would also look at disabling all but the checks for database existence and schema compatibility here, as suggested by Steve Green in the comments.
Make changes in code and use migrations to get these changes applied to the database. From what I understand, not having models map cleanly to database schemas (mine don't) might pose problems. I also see some complaints on the net that migrations do not cover all database object types - this was also my experience when I played around with Code First a while back - unique constraints I think were not covered. Has this improved in Entity Framework 7?
Make changes in the database and then use migrations as a kind of comparison between code and the database. See what the differences are and adjust the code to suit. Keep going until there are no differences.
Make changes manually in both code and the database. Obviously, this is not very appealing.
Which of these would be best? Is there anything that I would need to know before trying to implement it? Are there any other, better options?
So the path that we ended up taking was to create some T4 templates that generate both a DbContext and our entities. We provide the entity T4 with a list of tables from which to generate entities, and have a syntax to indicate that the entity based on one table should inherit from the entity based on another. Custom code goes in partial classes. So our solution looks most like my option 1 from above.
Also, we started out generating fluent configuration in OnModelCreating in the DbContext but have swapped to using attributes on the Entities (where attributes exist - HasPrecision was one that we had to use fluent configuration for). We found that it is more concise and easier to locate the configuration for a property when it is right there decorating that property.
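To make that last point concrete, here is a minimal sketch (not the poster's actual templates; Order, Total and ShopContext are made-up names) of an entity configured with attributes where they exist, falling back to fluent configuration for HasPrecision, which has no attribute equivalent:

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;

// "Generated" half of the entity: attributes carry most of the configuration.
[Table("Order")]
public partial class Order
{
    [Key]
    public int OrderId { get; set; }

    [Required, StringLength(50)]
    public string CustomerName { get; set; }

    public decimal Total { get; set; }   // precision/scale handled in OnModelCreating below
}

// Custom code would live in another partial class file, e.g.:
// public partial class Order { /* extra methods, computed properties, ... */ }

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // No data annotation exists for decimal precision, so this one stays fluent.
        modelBuilder.Entity<Order>()
                    .Property(o => o.Total)
                    .HasPrecision(18, 4);
    }
}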

How to avoid data loss with EF Model First database schema upgrade?

This is a long question, but I would be very very thankful if I can get some good advice on this. In short, I’m looking for a good approach for version upgrade of MS SQL database schema that also demands data being moved from deleted tables into new tables.
I think Stack Overflow is the most appropriate place for this question (not dba.stackexchange.com) because at its core, this is an issue for .NET developers using Entity Framework, and the database parts of this consists mostly of auto-generated sql scripts.
Background
A .NET application and a SQL database are running in Azure (the application in worker roles and the database in Azure SQL). Until now, version upgrades have worked fine, because all database schema changes have been simple (like adding a new column). However, from now on I also need to deal with moving data from one table to another during upgrades. (I'm able to fix this temporarily by creating a new database, generating a script with data from the old database and manually editing the script to make it fit the new schema, but I hope there is a better approach.)
I use Entity Framework and I use Model First. Entities and associations are defined in Visual Studio Data Model Designer, and this approach is very appropriate for my application.
I use a dacpac to upgrade the Azure SQL database, and this approach has worked well until now (but now I will get data loss, so now I must find a way to move data to new tables).
I hope I can continue to use entity framework and defining entities/associations in the designer, but it’s fine to switch away from dacpac upgrade to another technology if needed.
Upgrade approach until now
I add new entities (tables), associations (relations) and properties (columns) in the designer.
I right-click, pick “Generate Database from Model…” and this results in a .sql script that drops old database objects and creates the new database objects.
I create an empty database and run the script to create the tables/keys etc.
In SQL Server Management Studio, I right-click the database and pick "Tasks -> Extract Data-tier Application…". When the wizard completes I get the dacpac I need. (Actually, I can now delete the database, since I only created it to get the dacpac file; I don't think I can generate it in the Visual Studio Data Model Designer.)
I right-click the Azure SQL database and pick “Tasks -> Upgrade Data-tier Application…” and follow the wizard.
Until now I have never had data loss, so this has worked fine!
Current situation
This is a simplified example to illustrate the issue, but I will get into almost identical situations quite often from now on it seems. Look at the old and the new version of the schema in the figure below. Assume there is already data in the database. I need the data in ImageFile to end up in ImageFileOriginal or ImageFileProcessed depending on the IsOriginal boolean/bit value. Using “Upgrade Data-tier Application” I will get alerted of data loss. What approach would you recommend to deal with this? As I said earlier, it’s fine to switch away from dacpac upgrade to another technology if needed.
I have read about Visual Studio Database Projects, Fluent Migrator, Red Gate and Entity Designer Database Generation Power Pack (it doesn't support Visual Studio 2012), but I didn't find a good way to do this. I admit I haven't spent a whole day digging into each technology, but I certainly spent some time trying to find a good approach.
The best way to migrate the database schema (create/delete tables and columns) and also the data is to use SSDT (SQL Server Data Tools), available for Visual Studio 2010 and Visual Studio 2012.
Here are some very useful links:
http://msdn.microsoft.com/data/tools
http://blogs.msdn.com/b/ssdt
http://msdn.microsoft.com/en-us/data/hh297027
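For the concrete ImageFile example above, the data move itself is just two INSERT ... SELECT statements. The sketch below (column lists beyond Id/IsOriginal and the connection string are placeholders) runs them from code at a point where both the new tables and the old ImageFile table exist, i.e. before the upgrade that drops ImageFile:

using System.Data.SqlClient;

// Sketch only: copies ImageFile rows into the two new tables before ImageFile is dropped.
// Column lists are placeholders - fill in the real columns from your schema.
class ImageFileDataMove
{
    static void Main()
    {
        const string connectionString = "Server=.;Database=MyAppDb;Integrated Security=true"; // placeholder

        const string moveSql = @"
            INSERT INTO ImageFileOriginal (Id /*, other columns */)
            SELECT Id /*, other columns */ FROM ImageFile WHERE IsOriginal = 1;

            INSERT INTO ImageFileProcessed (Id /*, other columns */)
            SELECT Id /*, other columns */ FROM ImageFile WHERE IsOriginal = 0;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(moveSql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

With SSDT specifically, a common pattern is to stage the old rows in a pre-deployment script and insert them into the new tables in a post-deployment script, so the data survives the schema change.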
In the Configuration class set the constructor as below:
public Configuration()
{
    AutomaticMigrationsEnabled = true;
    AutomaticMigrationDataLossAllowed = false;
}
Setting the AutomaticMigrationsEnabled property to true means we are using automatic Code First migrations, and setting the AutomaticMigrationDataLossAllowed property to false means that no existing data in the database tables is lost during a migration.
The entire Configuration class is as follows.
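A minimal sketch of what that class usually looks like (MyDbContext is a placeholder; substitute your own DbContext type):

using System.Data.Entity;
using System.Data.Entity.Migrations;

// Placeholder context type - replace with your real DbContext.
public class MyDbContext : DbContext { }

internal sealed class Configuration : DbMigrationsConfiguration<MyDbContext>
{
    public Configuration()
    {
        // Apply pending model changes automatically...
        AutomaticMigrationsEnabled = true;
        // ...but never let an automatic migration drop existing data.
        AutomaticMigrationDataLossAllowed = false;
    }
}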

Entity Data Model Wizard not creating tables in EDMX file

I'm trying the database first approach by creating an ADO.NET Entity Data Model using the wizard with the Adventureworks2012 DB.
Testing DB connection works, and the connection string is added to the App.Config.
I'm selecting all the tables except the ones marked as (dbo) AWBuildVersion, DatabaseLog, and ErrorLog.
When the wizard finishes the .edmx file is blank, and if I view the file in XML view the EntityContainer is empty.
After the model is created it returns this error in the output window:
Unable to generate the model because of the following exception: 'The table AdventureWorks2012.Production.Document is referenced by a relationship, but cannot be found.'
I'm using VS 2010 & .NET Framework 4.0
It seems that Entity Framework does not know how to deal with data types like hierarchyid set on a table field. I removed the Production.Document table from the list of entities to include, which solved my problem.
Note also that the reference below was for AdventureWorks 2008R2 with EF version 1.0 from the CodePlex SQL Server samples site, and I am using AdventureWorks 2012 from the same CodePlex site with EF version 4.4.
Reference: http://msftdbprodsamples.codeplex.com/wikipage?title=AW2008Details
Note: EF 1.0 Compatibility Issues
The Entity Framework team would like us to let you know that AdventureWorks2008 is a little bit ahead of the curve in terms of the Katmai features it uses. Some datatypes in AdventureWorks2008 (such as hierarchyid and geometry) are not supported in the entity framework. The workaround is to exclude tables like Production.Document from your model if possible since there is currently no support for the hierarchyid datatype in Entity Framework 1.0. Unfortunately the Entity Framework tooling which updates your model from the database will pull in tables like Production.Document even if they were specifically excluded when the model was created, so use of that feature on AdventureWorks2008R2 is not supported at this time. We look forward to a follow-on release of Entity Framework which has full SQL Server 2008 type support.
There is a way to get around this IF you're trying to learn from this example and not doing anything meaningful. I deleted the foreign keys to the offending table, removed it, and was able to successfully generate the model.
Uncheck the [Allow Nulls] check-boxes (in the table design) for all the foreign keys of the tables that were not created (tables not converted to the model).
Then you can update your model to retrieve those tables by doing the following steps:
Step 1 - Right-click somewhere in your .EDMX file's design surface (e.g. Model1.EDMX [Diagram1]).
Step 2 - From the context menu, select "Update Model from Database...".
Step 3 - Select "Add".
Step 4 - Expand the "Tables" check-boxes and select your desired tables (the tables not created the first time).
Step 5 - Click the Finish button.
Step 6 - Save the solution and hope everything will be OK.
Note: I'm using Visual Studio 2013.
Good luck.

Entity Framework: "Update Database from Model" instead of "Generate Database from Model"

I have created an Entity Framework 4 model with Visual Studio 2010 and generated a database from it. Now I have found myself adding new properties (with default values), changing the documentation of columns, changing names of columns, and changing types of columns several times. All of these are tasks that, in my humble opinion, should not require much "extra work" to be achievable automatically. Every time, I did "Generate Database from Model" and of course lost the table data.
Is there a way to just update the database's schema, so to speak, leaving the table data untouched? Maybe with some user interaction, especially when changing types etc.? Or would this functionality simply be too difficult to implement in a reliable way?
Thanks in advance! Cheers, David
Noam Ben-Ami - MSFT1 (Microsoft Employee) answered my question at http://social.msdn.microsoft.com/Forums/en-us/adodotnetentityframework/thread/3adc080f-ee8c-4104-be29-95b2fb3fabe9 as follows:
We've built the entity designer database power pack to support this. You can download it here: http://visualstudiogallery.msdn.microsoft.com/en-us/df3541c3-d833-4b65-b942-989e7ec74c87
It includes a database generation workflow for the designer that does migration, rather than drop/create.
This posting is provided "AS IS" with no warranties, and confers no rights.
I haven't tested the tool yet, but I guess the info might be valuable for others, too.

Windows Workflow Persistence data (VS 2010 RC / .NET 4.0)

I have started working with Windows Workflow recently (the VS 2010 RC / .NET 4.0 version) and am struggling to get to grips with the SQL persistence functionality.
I have managed to attach persistence to my WorkflowServiceHost via an SqlWorkflowInstanceStoreBehavior object and in my database there are rows appearing in the [System.Activities.DurableInstancing].[InstancesTable] table.
However, I don't know how to make sense of any of this data (it seems as though quite a few columns are in binary format). How can I store custom data regarding my workflow in this? How do I retrieve this from the table for MI-style reporting?
I can't seem to find any info on the web regarding storing custom data (and then retrieving it again) - please help :)
Many thanks in advance!
The data you see is all used by the workflow persistence system and is not really suitable for your own consumption. If you want to query on your own data, you need to use a mechanism called property promotion, which stores the data in a queryable format using the InstancePromotedProperties table. Basically, you need to implement a PersistenceParticipant and override the CollectValues() function to add the values to one of the collections.
See here for more details.
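A rough sketch of what that can look like (the XML namespace, the OrderId value and the "Reporting" promotion name are all made up; the XNames you promote must match the ones your participant actually writes):

using System.Activities.DurableInstancing;
using System.Activities.Persistence;
using System.Collections.Generic;
using System.ServiceModel.Activities;
using System.Xml.Linq;

// Sketch of a persistence participant that writes one custom value so it can be
// promoted into the InstancePromotedProperties view for reporting.
public class ReportingParticipant : PersistenceParticipant
{
    static readonly XNamespace Ns = "urn:example/workflow-reporting"; // made-up namespace
    public static readonly XName OrderIdName = Ns.GetName("OrderId");

    // Set by the host (or your own code) before persistence happens - sketch only.
    public int CurrentOrderId { get; set; }

    protected override void CollectValues(
        out IDictionary<XName, object> readWriteValues,
        out IDictionary<XName, object> writeOnlyValues)
    {
        // Write-only values are persisted but never loaded back into the instance,
        // which is fine for reporting-style data.
        readWriteValues = null;
        writeOnlyValues = new Dictionary<XName, object>
        {
            { OrderIdName, CurrentOrderId }
        };
    }
}

static class ReportingSetup
{
    // Register the promotion on the store and the participant on the host.
    public static void Configure(SqlWorkflowInstanceStore store, WorkflowServiceHost host)
    {
        // Non-binary promoted values become queryable via the
        // [System.Activities.DurableInstancing].[InstancePromotedProperties] view.
        store.Promote("Reporting",
                      new[] { ReportingParticipant.OrderIdName },  // promoted properties
                      null);                                       // no binary promotions
        host.WorkflowExtensions.Add(new ReportingParticipant());
    }
}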