Why is the Dataset Designer changing the case of my schema objects? - ado.net

I have two xsd files with the exact same schema. The first was generated from an SQL connection and the other was generated from a FoxPro OleDb connection by the Dataset Designer. I can't use the same XSD for generating my table adapters and tables because the SQL and OleDb providers generate different types. I can't use EF because it doesn't support FoxPro OleDb, or at least not officially.
I also have the problem that the SQL XSD file has table, field, and table adapter names in UPPERCASE while the FoxPro OleDb one has them in lowercase. This creates a situation where I can't trick the code into casting them to a common base type.
Is there a suitable workaround for this problem, or do I have to have two sets of code in my DAL? I would hate to have to rewrite CRUD operations for all these tables.

For clarification, are you connecting to BOTH "servers" for data? Are you trying to copy data from one server to another? You could always have the query from each return the same column names and comparable data types (text vs char, integer works in both, memo vs blob, etc.).
I've actually written a data querying wrapper object based on the INTERFACE types to perform querying and updating once per database connection, then subclassed from that for each individual table, so most of the grunt work is done once.
As for the interfaces, I'm referring to IDbConnection (SqlConnection vs OleDbConnection), IDbCommand (SqlCommand vs OleDbCommand), etc., and this has worked well for me connecting to SQL Server, MySQL, Access, SQLite and Visual FoxPro databases.
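In rough terms, the idea looks something like this (a simplified sketch; the class names and connection strings are illustrative, and the FoxPro provider string may differ on your machine):

```csharp
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

// Base class written once against the ADO.NET interfaces;
// the provider-specific part is confined to CreateConnection().
public abstract class DataAccessBase
{
    protected abstract IDbConnection CreateConnection();

    public DataTable Query(string sql)
    {
        using (IDbConnection conn = CreateConnection())
        using (IDbCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = sql;
            conn.Open();
            var table = new DataTable();
            using (IDataReader reader = cmd.ExecuteReader())
            {
                table.Load(reader);
            }
            return table;
        }
    }
}

public sealed class SqlServerAccess : DataAccessBase
{
    protected override IDbConnection CreateConnection()
        => new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True");
}

public sealed class FoxProAccess : DataAccessBase
{
    protected override IDbConnection CreateConnection()
        => new OleDbConnection(@"Provider=VFPOLEDB.1;Data Source=C:\data\mydbc.dbc");
}
```

Table-specific subclasses then only add their own CRUD statements on top of that shared plumbing.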
All that being said, what/how are you using your XSD files? With a little more clarification, I might be able to offer a little more help.

Related

Entity Framework Core configurations with existing database

In one of my projects, I am using an existing SQL Server database. All the database scripts are managed using DBUp and SQL script migrations.
In my application, I am using Entity Framework Core to communicate with this database. When I configure my entities in EF configurations, should I still define functions like IsRequired(), HasMaxLength(), etc.?
I am not using these EF configurations to generate migration scripts; all the migration is outside of EF. I am just using these configurations to communicate with the database.
When I configure my entities in EF configurations, should I still define functions like IsRequired(), HasMaxLength(), etc.?
Beyond table/column name mapping and data type mapping, it's not required, but the additional model metadata might be used by front-end components for validation.
In general, yes, you should keep them. Many of these configurations are used throughout EF to make decisions at runtime. For example, some queries can be further optimized if EF knows that a column is never NULL, the max length is used to configure the SQL parameters it sends to the database, and unique constraints are used to sort SQL statements during SaveChanges.
While a few things like constraint names, non-unique indexes, index filters, and sequences aren't currently used at runtime, it's hard to know which ones EF will and won't use, so it's best just to keep them all.
And sometimes database features, like Always Encrypted on SQL Server, will fail entirely if the mappings aren't precise.
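As a rough illustration (the entity and column names here are made up), a configuration that keeps this metadata might look like:

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata.Builders;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

public class CustomerConfiguration : IEntityTypeConfiguration<Customer>
{
    public void Configure(EntityTypeBuilder<Customer> builder)
    {
        builder.ToTable("Customer");          // maps to the existing table managed by DBUp
        builder.HasKey(c => c.Id);

        builder.Property(c => c.Name)
               .IsRequired()                  // EF can treat the column as non-nullable in queries
               .HasMaxLength(200);            // used to size the parameter EF sends to the database

        builder.HasIndex(c => c.Email)
               .IsUnique();                   // helps EF order statements during SaveChanges
    }
}
```

The configuration is then registered with modelBuilder.ApplyConfiguration(new CustomerConfiguration()) in OnModelCreating.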

Migrating a schema from one database to another

As part of a requirement, I need to migrate a schema from an existing database to a new schema in a different database. Part of it is already done, and now I need to compare the two schemas and make changes to the new schema based on the gaps I find.
I am not using a tool and was trying to work out the details by querying the SYSCAT catalog views, but without much success.
Any pointers on the best way to solve this?
Regards,
Ramakant
A tool really is the best way to solve this – IBM Data Studio is free and can compare schemas between databases.
Assuming you are using DB2 for Linux/UNIX/Windows, you can do a rudimentary compare by looking at selected columns in SYSCAT.TABLES and SYSCAT.COLUMNS (for table definitions), and SYSCAT.INDEXES (for indexes). Exporting this data to files and using diff may be the easiest method. However, doing this for more complex structures (tables with range or database partitioning, foreign keys, etc) will become very complex very quickly as this information is spread across a lot of different system catalog tables.
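As a rough sketch of that export step (C#, assuming you can open an ADO.NET connection to each database with whatever DB2 driver is available to you; the catalog column names are from DB2 for LUW):

```csharp
using System.Data;
using System.IO;

public static class CatalogDump
{
    // Dumps one schema's column definitions to a flat, ordered text file so the
    // two resulting files can be compared with an ordinary diff tool.
    public static void DumpColumns(IDbConnection conn, string schema, string outputPath)
    {
        using (IDbCommand cmd = conn.CreateCommand())
        using (var writer = new StreamWriter(outputPath))
        {
            // SYSCAT.COLUMNS holds one row per column; the ORDER BY keeps the output diff-friendly.
            cmd.CommandText =
                "SELECT TABNAME, COLNAME, TYPENAME, LENGTH, SCALE, NULLS " +
                "FROM SYSCAT.COLUMNS " +
                "WHERE TABSCHEMA = '" + schema.ToUpperInvariant() + "' " +  // fine for a one-off admin script
                "ORDER BY TABNAME, COLNO";

            using (IDataReader reader = cmd.ExecuteReader())   // connection is assumed to be open
            {
                while (reader.Read())
                {
                    writer.WriteLine(string.Join("\t",
                        reader["TABNAME"], reader["COLNAME"], reader["TYPENAME"],
                        reader["LENGTH"], reader["SCALE"], reader["NULLS"]));
                }
            }
        }
    }
}
```

Run it once against each database with the same schema filter and diff the two output files; the same pattern extends to SYSCAT.TABLES and SYSCAT.INDEXES.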
An alternative method would be to extract DDL using the db2look utility. However, you can't specify the order that db2look outputs objects (db2look extracts DDL based on the objects' CREATE_TIME), so you can't extract DDL for an entire schema into a file and expect to use diff to compare. You would need to extract DDL into a separate file for each table.
Use SchemaCrawler for IBM DB2, a free open-source tool designed to produce text output that is easy to diff. You can get very detailed information about your schema, including view and stored procedure definitions. All of the information you need is output to a single file and can be compared very easily using a standard diff tool.
Sualeh Fatehi, SchemaCrawler
Unfortunately, per company policy, I cannot use these tools at this point in time, so I am writing a program using JDBC to get the details and do the comparison.

Entity Framework with large number of tables

Our database has about 500 tables we'd like to use in our EF model. Of those, I'd be happy to start with 50 or fewer just to get our feet wet after working in plain ADO.NET for years.
The problem is, our SQL Server database contains many thousands of other tables that have been created through the years, many of them dynamically generated. Believe it or not:
select count(*) from INFORMATION_SCHEMA.TABLES
73261
So that's a lot of tables. I have found that pretty much every tool I've tried to design, build or template EF models or entities either hangs or does not return a list of tables. Even SQL Server Object Explorer in VS2012 won't list the tables and instead shows the Tables folder with a little "x" over the icon. So I can't even select a subset of tables.
What options do I have for using EF? Is there a template where I can explicitly define the tables that I want to use entities for? Even with 50 tables, I don't want to hand code each one in an empty EDMX.
Using a Database / Code First approach and avoiding connecting Visual Studio to the database at all (i.e. don't create an EDMX, or connect with Server Explorer) would allow you to do this easily. It does not give you any of the Model First advantages, but I think it sounds like your project would be better served with a Database / Code First approach anyway as:
You have an existing Model, and are not looking to push changes from your EDMX to the DB
You are looking to implement this on a subset of your database
This link has a good summation ( Code-first vs Model/Database-first ), with the caveat that in your case a Database/Code First approach does not have you pushing changes from code to the database, so the last two bullets under code first apply less, and yours is a Database/Code First hybrid.
With 70k tables, I think any GUI is going to be tricky. By "Database / Code First" I mean that you are not using the code to create/define and update your database. Someone may be able to put this more succinctly/accurately.
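Roughly, the approach looks like this (a minimal sketch; the Customer table is just a stand-in for one of your 50 tables):

```csharp
using System.Data.Entity;   // EF 4.1+ / EF 5 "Code First" API

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ReportingContext : DbContext
{
    static ReportingContext()
    {
        // Never let EF try to create or alter the existing database.
        Database.SetInitializer<ReportingContext>(null);
    }

    public ReportingContext() : base("name=ReportingDb") { }   // connection string name in app.config

    // Only the tables you map here are touched; EF never enumerates the other ~73,000.
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Customer>().ToTable("Customer", "dbo");
        modelBuilder.Entity<Customer>().HasKey(c => c.Id);
    }
}
```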
I know this is an old question, but for those who land here from a Google search: the only tool I have found that actually works with thousands of tables is The Sharp Factory.
It is an ORM, pretty simple to use. So if you are looking for an ORM that can work with a large number of tables and does not require you to write POCOs, mappings, or SQL, then this is the tool.
You can find it here: The Sharp Factory

Using SQL Server 2008 ADO.NET entity data model and PostgreSQL connection string

I'm using Visual Studio 2008, SQL Server 2008, ASP.NET MVC 2.
Now I have to change my database to PostgreSQL. I tried many approaches; here is what I ended up doing.
I created the data model based on SQL Server 2008 database.
I have the same structured database in postgresql.
So I just changed the connection string of the entity data model and I got the following error
The 'System.Data.SqlClient' provider from the specified SSDL
artifact(s) does not match the expected 'Npgsql' provider from the
connection string.
How do I solve this error?
Please let me know if this is the correct way to implement this.
Well, the physical layer of your model (the "SSDL" file in your metadata) will of course contain the physical aspects of your database model, including the database provider that the model is based on. You cannot simply change the database connection string and be done with it.
I see two options:
my preferred solution: just re-create the EF model based on your Postgres database - that's the cleanest way to go. If you're doing it right, this is contained in a single assembly in your project anyway, so you could more or less swap in a new assembly to go against Postgres instead of SQL Server
a hack in my opinion: you could have the metadata files (the *.ssdl, *.msl, *.csdl) for your model written out to disk, and then manually edit the SSDL file to switch to the Postgres provider. I have no idea if that will even work or what side-effects it might have! Do it at your own risk, and do it on a backup copy of your project first!
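For what it's worth, the provider name is baked into both the SSDL and the EF connection string, which is why a plain connection-string swap fails. A sketch of what the entity connection string would have to look like after switching (names and paths are illustrative):

```csharp
using System.Data.EntityClient;

// The EF connection string names the metadata files *and* the ADO.NET provider.
// The Provider value must match the Provider attribute inside the .ssdl,
// so both have to say "Npgsql" (not "System.Data.SqlClient") for Postgres.
var builder = new EntityConnectionStringBuilder
{
    Metadata = "res://*/MyModel.csdl|res://*/MyModel.ssdl|res://*/MyModel.msl",
    Provider = "Npgsql",
    ProviderConnectionString =
        "Server=localhost;Port=5432;Database=mydb;User Id=me;Password=secret"
};

string entityConnectionString = builder.ToString();
// e.g. pass entityConnectionString to the generated ObjectContext constructor
```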

Data Type for storing document in SQL Server 2008 via Entity Framework

I'm trying to store a document in SQL Server 2008 using the Entity Framework.
I believe I have the code for doing this completed. The problem I'm now facing is which Data Type to use in SQL Server and in my entity model.
'Image' was my first choice, but this causes an "Invalid mapping" error when I update the model. I see that there's no equivalent of 'Image' (going by the Type drop-down in the entity's properties).
So then I tried 'varbinary(MAX)' and I see that this maps to 'binary' in the entity model. However, when I run the code it tells me that the data would be truncated so it stopped. Upon investigation I see that the SQL Server Data Type 'binary' is 8000 bytes long - which is why I chose 'varbinary(MAX)' - so the entity model seems to be reducing/mapping 'varbinary(MAX)' to 'binary'.
Is this right?
If so, what should my Data Types be (in both SQL Server 2008 and in my entity model) please? Any suggestions?
Change the data type in your model to byte[] and it will be all right, dude; if you need more explanation, please leave a comment.
EDIT:
Dude, I had tried this before in LINQ to SQL, and this time I tried it in EF. In the conceptual model of your Foo.edmx file the type is Binary (you can open it through the Open With context menu in Visual Studio and then select XML Editor, or any other text editor like Notepad), but in the generated Foo.designer.cs file the data type is byte[].
And there is no limit like the one you mentioned above.
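In code, saving a document then just means assigning the byte[] (a sketch; MyEntities and Document stand in for your generated context and entity, with Content mapped to a varbinary(MAX) column):

```csharp
using System.IO;

// Read the file into a byte[] and let EF write it to the varbinary(MAX) column.
byte[] fileBytes = File.ReadAllBytes(@"C:\temp\report.pdf");

using (var context = new MyEntities())          // the context generated from the .edmx
{
    var doc = new Document
    {
        FileName = "report.pdf",
        Content  = fileBytes                    // byte[] property, Binary in the conceptual model
    };
    context.Documents.AddObject(doc);           // context.Documents.Add(doc) if you use DbContext
    context.SaveChanges();
}
```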
I tried it with 10,000 bytes and it inserted successfully without truncating my array. As for benchmarking document storage in the database versus the file system, I read an article which said that in SQL Server 7 the file system had better performance for retrieving stored data, but in later versions SQL Server overtook the file system, and it suggested saving documents in SQL Server.
IMHO, if the documents are not too large, I prefer to store them in the DB (NoSQL DBs have great performance here, as far as I know):
First: integrity of my data.
Second: better performance (because if a folder holds a large number of files, reading and writing in that folder slows down more and more unless you organize the files into several folders, preferably in a tree of folders).
Third: security policies that you can apply to them through your application more easily (although you can do this with the file system approach too, I think it's easier here).
Fourth: you can benefit from the facilities provided by your DBMS for querying, manipulating, and so on, for those files.
And much more ... :-)
Ideally you should not store documents in the database; instead, store the path to the document in the database, which then points to the physical document on the web server itself (or some other storage, CDN, etc.).
However, if you must store it in SQL Server (and seeing as you're on SQL 2008),
you should be using FILESTREAM.
And it is supported by EF4 (I believe). It maps to binary.
That said, I'm not sure how well this will perform; run some benchmarks, and if it's not performing well, try using the regular ADO.NET/FileStream API.
I still think you should put it on the file system, not in the database (my opinion, of course).
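For reference, a sketch of the "regular ADO.NET/FileStream API" route mentioned above, assuming a Documents table with a varbinary(MAX) FILESTREAM column named Content (table and column names are made up):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class DocumentStore
{
    // Streams a stored document out of SQL Server 2008 FILESTREAM storage to disk.
    public static void CopyDocumentToDisk(string connectionString, int documentId, string targetPath)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (SqlTransaction tran = conn.BeginTransaction())   // FILESTREAM access requires a transaction
            {
                var cmd = new SqlCommand(
                    "SELECT Content.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
                    "FROM Documents WHERE Id = @id", conn, tran);
                cmd.Parameters.AddWithValue("@id", documentId);

                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        string path = reader.GetString(0);
                        byte[] txContext = reader.GetSqlBytes(1).Buffer;

                        // SqlFileStream reads the BLOB straight from the NTFS file SQL Server manages.
                        using (var source = new SqlFileStream(path, txContext, FileAccess.Read))
                        using (var target = File.Create(targetPath))
                        {
                            source.CopyTo(target);
                        }
                    }
                }
                tran.Commit();
            }
        }
    }
}
```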