We're having a strange issue with one entity/data source seemingly caching data in the Data Access layer.
The tables are standard SQL Server tables (SQL Server 2008 R2). The code is generated the same way using the same template (NetTiers 2.3.1) and CodeSmith Generator 6.5; there's nothing unusual about this procedure for creating the DAL files.
BUT... when the tables are updated through custom procedures or otherwise outside of the DAL, our web app doesn't display the latest data. Sometimes it does, but sometimes it takes a few minutes for the latest data to come through. I can query the SQL database directly and see the updated data immediately, so it's not a database/lag issue.
Just to verify, I added a custom stored proc and tried getting the data that way rather than accessing the table directly through the repository. This doesn't work either, which suggests it's an issue with the entity itself.
Any ideas? I wondered about entity caching, but I'm not sure how to see the settings for that. Please note that we are using the following tools:
CodeSmith Generator 6.5
NetTiers 2.3.1
SQL Server 2008 R2
The project is hosted on IIS 7 with .NET Framework 2.0
This is a strange issue and we are running into a lot of problems with cached data. Please do reply if you have any ideas.
Thanks,
Shankar
Do you have entity tracking turned on (http://community.codesmithtools.com/nettiers/f/16/p/9639/35883.aspx)? What are your .netTiers config settings?
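For reference, the tracking/caching settings live in the generated web.config. Here's a sketch of the sort of provider entry to look for; the attribute names are from memory of netTiers 2.x and may differ in your generated project, so treat them as assumptions:

    <!-- hypothetical netTiers provider entry; check your generated web.config for exact names -->
    <netTiersService defaultProvider="SqlNetTiersProvider">
      <providers>
        <add name="SqlNetTiersProvider"
             connectionStringName="netTiersConnectionString"
             enableEntityTracking="false" />
      </providers>
    </netTiersService>

netTiers also uses the Enterprise Library Caching Application Block for entity caching, so it's worth checking whether your web.config contains a cachingConfiguration section and whether the generated repository calls are caching results.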
This code:
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;
makes my Entity Framework code generation very fast.
However, I have also learned that I need to set it back off once my code generation completes.
Can someone explain what this code actually does? I searched Google but could not find a clear explanation.
I am using SQL Server 2016 and Entity Framework 6.
SQL Server 2014 introduced a new cardinality estimator (CE), and that statement tells the optimizer to fall back to the pre-2014 ("legacy") estimator for the whole database. It looks like this is a known issue where the new CE produces poor plans for the reverse-engineering queries EF runs against the system catalog; the issue appears to be specific to SQL Server 2016 and has bounced back and forth between the SQL team and the EF team.
The first fix posted was the one you mention in your post: ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON.
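Since you need to turn it back off once generation completes (as you noted), the usual pattern is just to toggle the same setting, along these lines:

    -- switch to the pre-2014 (legacy) cardinality estimator for this database
    ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = ON;
    -- ... run the EF reverse engineering / code generation here ...
    -- then restore the default (new) estimator so normal workloads are unaffected
    ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = OFF;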
After that, updating statistics on the tables also looks like a viable solution.
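For example (the table name here is hypothetical):

    -- refresh statistics on a specific table so the new CE gets accurate row estimates
    UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;
    -- or refresh statistics across the whole database
    EXEC sp_updatestats;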
Check out this link - https://github.com/Microsoft/sql-server-samples/issues/57
I have been working on a side project for the last few weeks and built the system with Entity Framework Code First. This was very handy during development, as any changes I needed to make to the code were reflected in the DB nice and easily. But now that I want to launch the site while continuing development, I don't want to have to drop and recreate the DB every time I make a tweak to a model...
Is there a way to get EF to generate change scripts for a model change so I can deploy them myself to the production server? And how do I use the database somewhere else (a Windows Service running in the background of the site) without having to drop and recreate the tables, while using the same model I already have? Kind of like "Code First, but now I have a production DB, don't break it..."
Personally, I use the built-in data tools in VS2010 to do a database schema synchronization for updating production.
A cheaper tool if you don't have VS Premium is SQL Delta, which I've used in the past and which is really good.
Both tools connect to the two database versions and allow you to synchronise the table schemas. Both also have an export-to-SQL-script feature.
Coming up for EF is Migrations, which lets you solve exactly this problem within your solution; however, it's still in beta. Migrations lets you describe upgrade and downgrade events for your database in code.
No RTM version of EF has this feature. Once you go to production you must handle it yourself. The common way is to turn off the database initializer in production and use a tool like VS Premium or Red Gate's database compare to diff your production and dev databases and create a change SQL script.
You can also try EF Migrations, which is exactly the tool you are asking for. The problem is that it is still in beta (it should be part of EF 4.3 once completed), so it may not work in all cases, and the functionality/API can change before RTM.
Using Model First, what is the best way to preserve existing database data when the model changes and the database has to be regenerated?
The Database Power Pack extension no longer works (I've been trying to contact the author). I can't find anything that provides similar functionality.
R.
If Database Power Pack doesn't work, there is no other automatic way. The manual way is to run the generated SQL script against another database and then use the Visual Studio Database tools to create a difference script between the current database and the newly created one.
My small team used the ASP.NET MVC 2.0 / Entity Framework 4.0 (model-first approach) / Windows Server 2008 R2 / SQL Server 2008 R2 stack in our web site project. We've already completed the development process and have come to the web deployment stage. At this stage we face a problem: OK, we'll use the VS2010 features for the initial server/DB deployment, but what will we do in the future? Obviously some of our models may be modified after publishing to satisfy new requirements, and of course our server DB will contain user data, articles, etc. Is there any approach to updating the server DB with new model modifications without dropping the DB and converting the data from the old instance to the new one?
So far we have found only the DAC/DACPAC approach for updating the server DB, but we don't know how to tie automatic EF model generation to DAC.
Maybe another solution exists? Is there any standard way to resolve this kind of situation? Any advice?
Thanks
I'd be interested to know if you have found a solution to this yet?
Have you tried simply generating a database from your EF model and using a schema comparison tool such as SQL Compare to deploy the changes between the EF-generated database and your target production server?
I am testing an ADO.NET Data Service. I just created a web application with SQL Server; there is one table with about 900 rows in the database. I made a model that contains only one entity.
After building the application, I tested retrieving all the entities from a web browser, but it takes about 5 or 6 minutes to get all the data.
I don't know whether this is a normal situation or not. The source table only has 5 columns.
Am I missing something?
I am working with Visual Studio 2008 SP1 and .NET Framework 3.5.
Not sure what you've done, but 900-odd rows shouldn't take that long!
Maybe you could provide a little more info about your data and/or your connection to the web server?
This should not happen. Did you check your internet connection and data transfer rate? It doesn't look like a bug.
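One quick way to narrow it down is to fetch a small slice of the data first using the standard query options (service and entity-set names below are made up). If a small page comes back quickly, the time is going into serializing and transferring the full payload rather than into the query itself:

    http://localhost/MyDataService.svc/Products?$top=10
    http://localhost/MyDataService.svc/Products?$skip=100&$top=50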