Changing schema name at runtime - Entity Framework

I need to change the storage schema of the entities at runtime.
I've followed a wonderful post, available here:
http://blogs.microsoft.co.il/blogs/idof/archive/2008/08/22/change-entity-framework-storage-db-schema-in-runtime.aspx?CommentPosted=true#commentmessage
This works perfectly, but only for queries, not for modifications.
Any idea why?

Well, I was looking for this piece of code all around the Internet. In the end I had to write it myself. It's based on Brandon Haynes's adapter, but this function is all you need to change the schema at runtime - and you don't need to replace the autogenerated context constructors.
public static EntityConnection Create(
    string schema, string connString, string model)
{
    // load the conceptual model (.csdl) from the embedded resources
    XmlReader[] conceptualReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".csdl")
        )
    };
    // load the mapping (.msl) from the embedded resources
    XmlReader[] mappingReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".msl")
        )
    };
    // load the storage model (.ssdl) so its schema attributes can be rewritten
    var storageReader = XmlReader.Create(
        Assembly
            .GetExecutingAssembly()
            .GetManifestResourceStream(model + ".ssdl")
    );
    XNamespace storageNS = "http://schemas.microsoft.com/ado/2009/02/edm/ssdl";
    var storageXml = XElement.Load(storageReader);

    // replace the Schema attribute on every EntitySet
    foreach (var entitySet in storageXml.Descendants(storageNS + "EntitySet"))
    {
        var schemaAttribute = entitySet.Attributes("Schema").FirstOrDefault();
        if (schemaAttribute != null)
        {
            schemaAttribute.SetValue(schema);
        }
    }

    StoreItemCollection storageCollection =
        new StoreItemCollection(
            new XmlReader[] { storageXml.CreateReader() }
        );
    EdmItemCollection conceptualCollection = new EdmItemCollection(conceptualReader);
    StorageMappingItemCollection mappingCollection =
        new StorageMappingItemCollection(
            conceptualCollection, storageCollection, mappingReader
        );

    // assemble a metadata workspace from the three item collections
    var workspace = new MetadataWorkspace();
    workspace.RegisterItemCollection(conceptualCollection);
    workspace.RegisterItemCollection(storageCollection);
    workspace.RegisterItemCollection(mappingCollection);

    // build the store connection from the EF connection string
    var connectionData = new EntityConnectionStringBuilder(connString);
    var connection = DbProviderFactories
        .GetFactory(connectionData.Provider)
        .CreateConnection();
    connection.ConnectionString = connectionData.ProviderConnectionString;

    return new EntityConnection(workspace, connection);
}
The resulting EntityConnection should be passed as a parameter when instantiating the context. You can also modify the function so that it rewrites all ssdl models, not only the specified one.
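For illustration, a minimal usage sketch (the context type MyEntities, the schema name and the resource name "MyModel" are all hypothetical; the generated context must expose a constructor accepting an EntityConnection):

// Hypothetical usage: "MyModel" must match the embedded resource names
// (MyModel.csdl / MyModel.ssdl / MyModel.msl), and connString must be a full
// EF connection string - the function extracts the provider part itself.
EntityConnection connection = Create("staging", connString, "MyModel");
using (var context = new MyEntities(connection))
{
    // queries and SaveChanges() now both target the "staging" schema
}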

I've managed to resolve this issue by using a brilliant library on CodePlex (courtesy of Brandon Haynes), named "Entity Framework Runtime Model Adapter", available here:
http://efmodeladapter.codeplex.com/
I've tweaked it a bit to fit our needs, without needing to replace the designer code at all.
So, I'm good.
Thanks anyway, and especially to Brandon - amazing job!

I needed to import data from a Postgres database, which by default uses the schema "public". I'm using Entity Framework CTP 4 "Code First", which by default uses the schema "dbo". To change it at runtime I use:
public class PublicSchemaContext : DbContext
{
    protected override void OnModelCreating(System.Data.Entity.ModelConfiguration.ModelBuilder builder)
    {
        // map the entity to the "public" schema instead of the default "dbo"
        builder.Entity<series_categories>().MapSingleType().ToTable("[public].[series_categories]");
    }

    public DbSet<series_categories> series_categories { get; set; }
}
It works for selecting, inserting, updating and deleting data, so the following test passes:
[Test]
public void AccessToPublicSchema()
{
    // Select
    var db = new PublicSchemaContext();
    var list = db.series_categories.ToList();
    Assert.Greater(list.Count, 0);
    Assert.IsNotNull(list.First().series_category);

    // Delete
    foreach (var item in db.series_categories.Where(c => c.series_category == "Test"))
        db.series_categories.Remove(item);
    db.SaveChanges();

    // Insert
    db.series_categories.Add(new series_categories { series_category = "Test", series_metacategory_id = 1 });
    db.SaveChanges();

    // Update
    var test = db.series_categories.Single(c => c.series_category == "Test");
    test.series_category = "Test2";
    db.SaveChanges();

    // Delete
    foreach (var item in db.series_categories.Where(c => c.series_category == "Test2"))
        db.series_categories.Remove(item);
    db.SaveChanges();
}

Not an answer per se, but a follow-up to Jan Matousek's Create method (which returns an EntityConnection) showing how to use it from a DbContext. Note that DB is the DbContext type passed to the generic repository.
public TxRepository(bool pUseTracking, string pServer, string pDatabase, string pSchema="dbo")
{
    // make our own EF database connection string using server and database names
    string lConnectionString = BuildEFConnectionString(pServer, pDatabase);

    // do nothing special for dbo as that is the default
    if (pSchema == "dbo")
    {
        // supply dbcontext with our connection string
        mDbContext = Activator.CreateInstance(typeof(DB), lConnectionString) as DB;
    }
    else // change the schema in the edmx file before we use it!
    {
        // Create an EntityConnection and use that to create an ObjectContext,
        // then use that to create a DbContext with a different default schema from
        // the one specified for the edmx file. This allows us to have parallel tables
        // in the database that we can make available using either schema or synonym renames.
        var lEntityConnection = CreateEntityConnection(pSchema, lConnectionString, "TxData");

        // create regular ObjectContext
        ObjectContext lObjectContext = new ObjectContext(lEntityConnection);

        // create a DbContext from an existing ObjectContext
        mDbContext = Activator.CreateInstance(typeof(DB), lObjectContext, true) as DB;
    }

    // finish EF setup
    SetupAndOpen(pUseTracking);
}
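The BuildEFConnectionString helper isn't shown in the original; here is a sketch of what it might look like, assuming SQL Server with integrated security and an edmx whose artifacts are embedded as TxData.* (all of these are assumptions):

// Hypothetical sketch - assembles a full EF connection string whose
// Provider/ProviderConnectionString parts CreateEntityConnection later extracts.
// Requires System.Data.SqlClient and System.Data.Entity.Core.EntityClient.
private string BuildEFConnectionString(string pServer, string pDatabase)
{
    var sqlBuilder = new SqlConnectionStringBuilder
    {
        DataSource = pServer,        // assumption: SQL Server
        InitialCatalog = pDatabase,
        IntegratedSecurity = true    // assumption: Windows auth
    };
    var efBuilder = new EntityConnectionStringBuilder
    {
        Provider = "System.Data.SqlClient",
        ProviderConnectionString = sqlBuilder.ConnectionString,
        Metadata = "res://*/TxData.csdl|res://*/TxData.ssdl|res://*/TxData.msl"
    };
    return efBuilder.ConnectionString;
}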

I was able to convert Jan Matousek's solution to work in VB.NET 2013 with Entity Framework 6. I will also try to explain how to use the code in VB.NET.
We have a JD Edwards database which uses different schemas for each environment (TESTDTA, CRPDTA, PRODDTA). This makes switching between environments cumbersome, as you have to manually modify the .edmx file if you want to change environments.
The first step is to create a partial class that allows you to pass a value to the constructor of your entities; by default it uses the values from your config file.
Partial Public Class JDE_Entities
    Public Sub New(ByVal myObjectContext As ObjectContext)
        MyBase.New(myObjectContext, True)
    End Sub
End Class
Next, create the function that will modify your store schema (.ssdl) at runtime.
Public Function CreateObjectContext(ByVal schema As String, ByVal connString As String, ByVal model As String) As ObjectContext
    Dim myEntityConnection As EntityConnection = Nothing
    Try
        Dim conceptualReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".csdl"))
        Dim mappingReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".msl"))
        Dim storageReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".ssdl"))
        Dim storageNS As XNamespace = "http://schemas.microsoft.com/ado/2009/11/edm/ssdl"

        Dim storageXml = XDocument.Load(storageReader)
        Dim conceptualXml = XDocument.Load(conceptualReader)
        Dim mappingXml = XDocument.Load(mappingReader)

        ' rewrite the Schema attribute on every EntitySet in the storage model
        For Each myItem As XElement In storageXml.Descendants(storageNS + "EntitySet")
            Dim schemaAttribute = myItem.Attributes("Schema").FirstOrDefault
            If schemaAttribute IsNot Nothing Then
                schemaAttribute.SetValue(schema)
            End If
        Next

        ' write the (modified) artifacts to temporary files and rebuild the metadata from them
        storageXml.Save("storage.ssdl")
        conceptualXml.Save("storage.csdl")
        mappingXml.Save("storage.msl")

        Dim storageCollection As StoreItemCollection = New StoreItemCollection("storage.ssdl")
        Dim conceptualCollection As EdmItemCollection = New EdmItemCollection("storage.csdl")
        Dim mappingCollection As StorageMappingItemCollection = New StorageMappingItemCollection(conceptualCollection, storageCollection, "storage.msl")

        Dim workspace = New MetadataWorkspace()
        workspace.RegisterItemCollection(conceptualCollection)
        workspace.RegisterItemCollection(storageCollection)
        workspace.RegisterItemCollection(mappingCollection)

        Dim connectionData = New EntityConnectionStringBuilder(connString)
        Dim connection = DbProviderFactories.GetFactory(connectionData.Provider).CreateConnection()
        connection.ConnectionString = connectionData.ProviderConnectionString

        myEntityConnection = New EntityConnection(workspace, connection)
        Return New ObjectContext(myEntityConnection)
    Catch ex As Exception
        ' NOTE: this swallows all exceptions and returns Nothing; consider logging or rethrowing
    End Try
End Function
Make sure that the hardcoded storageNS namespace value matches the one used in your model; you can verify this by debugging the code and examining the storageXml variable to see what was actually used.
Now you can pass a new schema name, and different database connection info at runtime when you create your entities. No more manual .edmx changes required.
Using Context As New JDE_Entities(CreateObjectContext("NewSchemaNameHere", ConnectionString_EntityFramework("ServerName", "DatabaseName", "UserName", "Password"), "JDE_Model"))
    Dim myWO = From a In Context.F4801 Where a.WADOCO = 400100
    If myWO IsNot Nothing Then
        For Each r In myWO
            Me.Label1.Text = r.WADL01
        Next
    End If
End Using
These were the .NET namespaces used:
Imports System.Data.Entity.Core.EntityClient
Imports System.Xml
Imports System.Data.Common
Imports System.Data.Entity.Core.Metadata.Edm
Imports System.Reflection
Imports System.Data.Entity.Core.Mapping
Imports System.Data.Entity.Core.Objects
Imports System.Data.Linq
Imports System.Xml.Linq
Hope that helps anyone out there with the same issues.

I had a lot of problems getting this to work when using EF6 with an OData data service, so I had to find an alternate solution. In my case, I didn't really need to do it on the fly; I could get away with changing the schema when deploying to some test environments, and in the installer.
Use Mono.Cecil to rewrite the embedded .ssdl resources straight in the DLLs. This worked just fine in my case.
Here is a simplified example of how you can do this:
var filename = "/path/to/some.dll";
var replacement = "Schema=\"new_schema\"";

var module = ModuleDefinition.ReadModule(filename);
// materialize the query first - we mutate module.Resources inside the loop
var ssdlResources = module.Resources.Where(x => x.Name.EndsWith(".ssdl")).ToList();

foreach (var resource in ssdlResources)
{
    var item = (EmbeddedResource)resource;

    // rewrite the schema name inside the embedded .ssdl text
    string rewritten;
    using (var reader = new StreamReader(item.GetResourceStream()))
    {
        var text = reader.ReadToEnd();
        rewritten = Regex.Replace(text, "Schema=\"old_schema\"", replacement);
    }

    // swap the old resource for the rewritten one
    var bytes = Encoding.UTF8.GetBytes(rewritten);
    var newResource = new EmbeddedResource(item.Name, item.Attributes, bytes);
    module.Resources.Remove(item);
    module.Resources.Add(newResource);
}

// persist the modified assembly (the original snippet omitted this step);
// write to a new path, or open the module with ReadWrite = true to overwrite in place
module.Write(filename + ".patched");

How to make sure locks are released with EF Core and Postgres?

I have a console program that moves data between two different servers (DatabaseA and DatabaseB).
DatabaseB is a Postgres server.
It calls a lot of stored procedures and other raw queries.
I use ExecuteSqlRaw a lot.
I also use NpgsqlBulk.EfCore.
The program uses the same context instance for DatabaseB during the whole run it takes to finish.
Somehow I get locks on some of my tables on DatabaseB that never get released.
This always happens on my table mytable_fromdatabase_import.
The code that runs on it is the following:
protected override void AddIdsNew()
{
    var toAdd = IdsNotInDatabaseB();
    var newObjectsToAdd = GetByIds(toAdd).Select(Converter.ConvertAToB);

    DatabaseBContext.Database.ExecuteSqlRaw("truncate mytable_fromdatabase_import; ");

    var uploader = new NpgsqlBulkUploader(DatabaseBContext);
    uploader.Insert(newObjectsToAdd); // inserts data into mytable_fromdatabase_import

    DatabaseBContext.Database.ExecuteSqlRaw("call insert_myTable_from_importTable();");
}
After I run it, the whole table is not accessible anymore, and when I query the locks on the server I can see there is a process holding them.
How can I make sure this process always closes and releases its locks on tables?
I thought EF Core would do that automatically.
-----------Edit-----------
I just wanted to add that this is not a temporary problem during the run of the console program. When I run this code and it is finished, my table is still locked and nothing can access it. My understanding was that the EF Core context would release everything after it is disposed (whether by error or by finishing).
The problem had nothing to do with EF Core but with a wrongly configured backup script. The program is now running with no changes to it and it works fine.
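For reference, one way to make the lock lifetime explicit is to wrap the three steps in a single transaction, since Postgres holds the ACCESS EXCLUSIVE lock taken by TRUNCATE until the surrounding transaction ends. This is only a sketch - it was not the eventual fix - and it assumes both that the bulk uploader participates in the ambient EF Core transaction and that the called procedure does no transaction control of its own:

using (var tx = DatabaseBContext.Database.BeginTransaction())
{
    // the truncate's ACCESS EXCLUSIVE lock is now scoped to this transaction
    DatabaseBContext.Database.ExecuteSqlRaw("truncate mytable_fromdatabase_import;");
    uploader.Insert(newObjectsToAdd);
    DatabaseBContext.Database.ExecuteSqlRaw("call insert_myTable_from_importTable();");
    tx.Commit(); // releases the locks; on an exception, disposing tx rolls back instead
}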
For a concrete task you need the right tools. You probably get locks when retrieving the Ids, and also when trying not to re-load already imported records. These steps are slow!
I would suggest using linq2db (disclaimer: I'm a co-author of this library).
Create two projects with models from the different databases:
Source.Model.csproj - install linq2db.SQLServer
Destination.Model.csproj - install linq2db.PostgreSQL
Follow the instructions in the T4 templates on how to generate a model from the two databases. It is easy, and you can ask questions on linq2db's GitHub site.
I'll post a helper class which I used for transferring tables in a previous project. It additionally uses the CodeJam library for mapping, but in your project you can of course use AutoMapper.
public class DataImporter
{
    private readonly DataConnection _source;
    private readonly DataConnection _destination;

    public DataImporter(DataConnection source, DataConnection destination)
    {
        _source = source;
        _destination = destination;
    }

    private long ImportDataPrepared<TSource, TDest>(IOrderedQueryable<TSource> source, Expression<Func<TSource, TDest>> projection) where TDest : class
    {
        var destination = _destination.GetTable<TDest>();
        var tableName = destination.TableName;

        var sourceCount = source.Count();
        if (sourceCount == 0)
            return 0;

        var currentCount = destination.Count();
        if (currentCount > sourceCount)
            throw new Exception($"'{tableName}' has more rows than the source - what happened here?");
        if (currentCount >= sourceCount)
            return 0;

        // skip records that were already transferred (requires a stable unique ordering)
        IQueryable<TSource> sourceQuery = source;
        if (currentCount > 0)
            sourceQuery = sourceQuery.Skip(currentCount);

        var projected = sourceQuery.Select(projection);

        var copied =
            _destination.BulkCopy(
                new BulkCopyOptions
                {
                    BulkCopyType = BulkCopyType.MultipleRows,
                    RowsCopiedCallback = (obj) => RowsCopiedCallback(obj, currentCount, sourceCount, tableName)
                }, projected);

        return copied.RowsCopied;
    }

    private void RowsCopiedCallback(BulkCopyRowsCopied obj, int currentRows, int totalRows, string tableName)
    {
        var percent = (currentRows + obj.RowsCopied) / (double)totalRows * 100;
        Console.WriteLine($"Copied {percent:N2}% \tto {tableName}");
    }

    public class ImporterHelper<TSource>
    {
        private readonly DataImporter _importer;
        private readonly IOrderedQueryable<TSource> _sourceQuery;

        public ImporterHelper(DataImporter importer, IOrderedQueryable<TSource> sourceQuery)
        {
            _importer = importer;
            _sourceQuery = sourceQuery;
        }

        public long To<TDest>() where TDest : class
        {
            var mapperBuilder = new MapperBuilder<TSource, TDest>();
            return _importer.ImportDataPrepared(_sourceQuery, mapperBuilder.GetMapper().GetMapperExpressionEx());
        }

        public long To<TDest>(Expression<Func<TSource, TDest>> projection) where TDest : class
        {
            return _importer.ImportDataPrepared(_sourceQuery, projection);
        }
    }

    public ImporterHelper<TSource> ImportData<TSource>(IOrderedQueryable<TSource> source)
    {
        return new ImporterHelper<TSource>(this, source);
    }
}
So, begin transferring. Note that I have used OrderBy/ThenBy to specify the Id order, so that already transferred records are not imported again - it is important that the order fields form a unique key combination. This makes the sample reentrant: it can be re-run when the connection is lost.
var sourceBuilder = new LinqToDbConnectionOptionsBuilder();
sourceBuilder.UseSqlServer(SourceConnectionString);

var destinationBuilder = new LinqToDbConnectionOptionsBuilder();
destinationBuilder.UsePostgreSQL(DestinationConnectionString);

using (var source = new DataConnection(sourceBuilder.Build()))
using (var destination = new DataConnection(destinationBuilder.Build()))
{
    var dataImporter = new DataImporter(source, destination);

    dataImporter.ImportData(source.GetTable<Source.Model.FirstTable>()
            .OrderBy(e => e.Id1)
            .ThenBy(e => e.Id2))
        .To<Dest.Model.FirstTable>();

    dataImporter.ImportData(source.GetTable<Source.Model.SecondTable>().OrderBy(e => e.Id))
        .To<Dest.Model.SecondTable>();
}
The boring OrderBy part can surely be generated automatically, but that would inflate this already long answer.
Also, play with BulkCopyOptions. The native Npgsql COPY may fail, in which case the multi-row variant should be used.
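As an illustration, a sketch of that fallback between the two bulk-copy modes (the option names are linq2db's; the try/catch fallback itself is just one way to structure it):

// Try native Npgsql binary COPY first; fall back to batched multi-row INSERTs.
try
{
    destination.BulkCopy(
        new BulkCopyOptions { BulkCopyType = BulkCopyType.ProviderSpecific },
        projected);
}
catch (Exception)
{
    destination.BulkCopy(
        new BulkCopyOptions { BulkCopyType = BulkCopyType.MultipleRows, MaxBatchSize = 5000 },
        projected);
}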

Entity Framework 6: is it possible to update a specific object property without fetching the whole object?

I have an object with several really large string properties. In addition, it has a simple timestamp property.
What I'm trying to achieve is to update only the timestamp property without fetching the whole huge object from the server.
Eventually, I would like to use EF to do, in the most performant way, something equivalent to this:
update [...]
set [...] = [...]
where [...]
Using the following, you can update a single column:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };

using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.SaveChanges();
}
OK, I managed to handle this. The solution is the same as the one proposed by Seany84, with the only addition of disabling validation in order to overcome an issue with required fields. Basically, I had to add the following line just before SaveChanges():
db.Configuration.ValidateOnSaveEnabled = false;
So, the complete solution is:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };

using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.Configuration.ValidateOnSaveEnabled = false;
    db.SaveChanges();
}

Testing with a fake DbContext, AutoFixture and Moq

So, following this example on how to make a fake DbContext for testing, my test using just that works fine:
[Test]
public void CiudadIndex()
{
    var ciudades = new FakeDbSet<Ciudad>
    {
        new Ciudad {CiudadId = 1, EmpresaId = 1, Descripcion = "Santa Cruz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 2, EmpresaId = 1, Descripcion = "La Paz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 3, EmpresaId = 1, Descripcion = "Cochabamba", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1}
    };

    // Create mock unit of work
    var mockData = new Mock<IContext>();
    mockData.Setup(m => m.Ciudades).Returns(ciudades);

    // Setup controller
    var homeController = new CiudadController(mockData.Object);

    // Invoke
    var viewResult = homeController.Index();
    var ciudades_de_la_vista = (IEnumerable<Ciudad>)viewResult.Model;

    // Assert..
}
I am now trying to use AutoFixture-Moq to create "ciudades", but I can't. I tried this:
var fixture = new Fixture();
var ciudades = fixture.Build<FakeDbSet<Ciudad>>().CreateMany<FakeDbSet<Ciudad>>();
var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
I get this error:
Can't convert System.Collections.Generic.IEnumerable<FakeDbSet<Ciudad>> to System.Data.Entity.IDbSet<Ciudad>
Implementation of IContext and FakeDbSet:
public interface IContext
{
    IDbSet<Ciudad> Ciudades { get; }
}

public class FakeDbSet<T> : IDbSet<T> where T : class
How can I make this work?
A minor point... In stuff like:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();
The second type arg is unnecessary and should be:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany();
I don't really understand why you need a FakeDbSet, and the article is a bit TL;DR... In general, I try to avoid faking and mucking with ORM bits, and instead deal with interfaces returning POCOs to the max degree possible.
That aside... The reason the normal syntax for initialising the list works is that there is an Add (and IEnumerable) in FakeDbSet. AutoFixture doesn't have a story for that pattern directly (after all, it is compiler syntactic sugar and not particularly amenable to reflection or in line with any other conventions), but you can use AddManyTo as long as there is an ICollection in play. Luckily, within the impl of FakeDbSet as in the article, the following gives us an in:
public ObservableCollection<T> Local
{
    get { return _data; }
}
As ObservableCollection<T> derives from ICollection<T>, you should be able to:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades.Local);

var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
It's possible to wire up a customization to make this prettier, but at least you have a way to manage it. The other option is to have something implement ICollection (or add a prop with a setter taking IEnumerable<T>) and have AF generate the parent object, causing said collection to be filled in.
Long superseded side note: In your initial question, you effectively have:
fixture.Build<FakeDbSet<Ciudad>>().CreateMany()
The problem then becomes clearer - you are asking AF to generate many FakeDbSet<Ciudad>s, which is not what you want.
I haven't used AutoFixture in a while, but shouldn't it be:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades);
For the moment I ended up doing this; I will keep reading about how to use AutoMoq, since I'm new to this:
var fixture = new Fixture();
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();

var ciudades = new FakeDbSet<Ciudad>();
foreach (var item in ciudades_fixture)
{
    ciudades.Add(item);
}

var mockData = new Mock<IContext>();
fixture.Create<Mock<IContext>>();
mockData.Setup(r => r.Ciudades).Returns(ciudades);

Execute stored proc with ExecuteStoreQuery in EF - is this a bug in EF?

Trying to execute a stored proc in EF using the following code:
var params = new object[] { new SqlParameter("@FirstName", "Bob") };
return this._repositoryContext.ObjectContext.ExecuteStoreQuery<ResultType>("GetByName", params);
but I keep getting this error:
Procedure or function 'GetByName' expects parameter '@FirstName', which was not supplied.
and from SQL Profiler:
exec sp_executesql N'GetByName',N'@FirstName nvarchar(100)',@FirstName=N'Bob'
What is wrong with the above ExecuteStoreQuery code?
Ignoring the fact that params is a reserved word...
I think your query needs to be:
var params = new object[] { new SqlParameter("@FirstName", "Bob") };
return this._repositoryContext.ObjectContext.ExecuteStoreQuery<ResultType>("exec GetByName @FirstName", params);
I should also say that if that proc is a standard part of your database and data model, you should import it into your EDM so it's available directly on your context.
Use ExecuteFunction instead of ExecuteStoreQuery, which is better suited to ad-hoc queries:
var parameters = new ObjectParameter[] { new ObjectParameter("FirstName", "Bob") };
return this._repositoryContext.ObjectContext.ExecuteFunction<ResultType>("GetByName", parameters);
Stored procedures can also be mapped as functions in the context and thus be used as typed methods. Take a look at Using stored procedures with Entity Framework.
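As a sketch of that mapped-function approach (assuming GetByName has been imported as a function in the EDM designer, which generates a typed method on your concrete context class; the context variable name here is hypothetical):

// The function import generates a typed method, so no SQL strings are needed:
var results = myGeneratedContext.GetByName("Bob").ToList();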
This is what I did to use an SP in EF, if you have multiple parameters:
public virtual ObjectResult<GetEpisodeCountByPracticeId_Result> GetEpisodeCountByPracticeId(Nullable<int> practiceId, Nullable<System.DateTime> dat1)
{
    SqlParameter practiceIdParameter = practiceId.HasValue ?
        new SqlParameter() { ParameterName = "practiceId", Value = practiceId, SqlDbType = SqlDbType.Int } :
        new SqlParameter() { ParameterName = "practiceId", SqlDbType = SqlDbType.Int };

    SqlParameter dat1Parameter = dat1.HasValue ?
        new SqlParameter() { ParameterName = "dat1", Value = dat1, SqlDbType = SqlDbType.DateTime } :
        new SqlParameter() { ParameterName = "dat1", SqlDbType = SqlDbType.DateTime };

    return ((IObjectContextAdapter)this).ObjectContext.ExecuteStoreQuery<GetEpisodeCountByPracticeId_Result>("exec GetEpisodeCountByPracticeId @practiceId, @dat1", practiceIdParameter, dat1Parameter);
}
If you don't add the parameters (e.g. @practiceId) in the command text, you get the error you received.

Returning a DataTable using Entity Framework ExecuteStoreQuery

I am working with a system that has many stored procedures whose results need to be displayed. Creating entities for each of my objects is not practical.
Is it possible, and how, to return a DataTable using ExecuteStoreQuery?
public ObjectResult<DataTable> MethodName(string fileSetName)
{
    using (var dataContext = new DataContext(_connectionString))
    {
        var returnDataTable = ((IObjectContextAdapter)dataContext).ObjectContext.ExecuteStoreQuery<DataTable>("SP_NAME", "SP_PARAM");
        return returnDataTable;
    }
}
Yes, it's possible, but it should only be used for dynamic result sets or raw SQL:
public DataTable ExecuteStoreQuery(string commandText, params Object[] parameters)
{
    DataTable retVal = context.ExecuteStoreQuery<DataTable>(commandText, parameters).FirstOrDefault();
    return retVal;
}
Edit: It's better to use classic ADO.NET to get the data rather than Entity Framework, because most probably you cannot use a DataTable this way even if the method compiles: context.ExecuteStoreQuery<DataTable>(commandText, parameters).FirstOrDefault();
ADO.NET example:
public DataSet GetResultReport(int questionId)
{
    DataSet retVal = new DataSet();

    // unwrap the store connection from the EF connection
    EntityConnection entityConn = (EntityConnection)context.Connection;
    SqlConnection sqlConn = (SqlConnection)entityConn.StoreConnection;

    SqlCommand cmdReport = new SqlCommand("YourSpName", sqlConn);
    SqlDataAdapter daReport = new SqlDataAdapter(cmdReport);
    using (cmdReport)
    {
        SqlParameter questionIdPrm = new SqlParameter("QuestionId", questionId);
        cmdReport.CommandType = CommandType.StoredProcedure;
        cmdReport.Parameters.Add(questionIdPrm);
        daReport.Fill(retVal);
    }
    return retVal;
}
No, I don't think that'll work - Entity Framework is geared towards returning entities and isn't meant to return DataTable objects.
If you need DataTable objects, use straight ADO.NET instead.
This method uses the connection string from Entity Framework to establish an ADO.NET connection, to a MySQL database in this example.
using MySql.Data.MySqlClient;

public DataSet GetReportSummary(int RecordID)
{
    var context = new catalogEntities();
    DataSet ds = new DataSet();
    using (MySqlConnection connection = new MySqlConnection(context.Database.Connection.ConnectionString))
    {
        using (MySqlCommand cmd = new MySqlCommand("ReportSummary", connection))
        {
            MySqlDataAdapter adapter = new MySqlDataAdapter(cmd);
            adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
            adapter.SelectCommand.Parameters.Add(new MySqlParameter("@ID", RecordID));
            adapter.Fill(ds);
        }
    }
    return ds;
}
Yes, it can easily be done like this:
var table = new DataTable();
using (var ctx = new SomeContext())
{
    var cmd = ctx.Database.Connection.CreateCommand();
    cmd.CommandText = "select Col1, Col2 from SomeTable";
    cmd.Connection.Open();
    table.Load(cmd.ExecuteReader());
}
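A variant of the same approach with a parameter, sticking to the provider-agnostic DbCommand members (table, column and value are illustrative):

var table = new DataTable();
using (var ctx = new SomeContext())
{
    var cmd = ctx.Database.Connection.CreateCommand();
    cmd.CommandText = "select Col1, Col2 from SomeTable where Col1 = @p0";

    // build the parameter through the provider-agnostic factory method
    var p = cmd.CreateParameter();
    p.ParameterName = "@p0";
    p.Value = 42;
    cmd.Parameters.Add(p);

    cmd.Connection.Open();
    table.Load(cmd.ExecuteReader());
}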
As a rule, you shouldn't use a DataSet inside an EF application. But if you really need to (for instance, to feed a report), this solution should work (it's EF 6 code):
DataSet GetDataSet(string sql, CommandType commandType, Dictionary<string, Object> parameters)
{
    // creates the resulting dataset
    var result = new DataSet();

    // creates a data access context (DbContext descendant)
    using (var context = new MyDbContext())
    {
        // creates a Command
        var cmd = context.Database.Connection.CreateCommand();
        cmd.CommandType = commandType;
        cmd.CommandText = sql;

        // adds all parameters
        foreach (var pr in parameters)
        {
            var p = cmd.CreateParameter();
            p.ParameterName = pr.Key;
            p.Value = pr.Value;
            cmd.Parameters.Add(p);
        }

        try
        {
            // executes
            context.Database.Connection.Open();
            var reader = cmd.ExecuteReader();

            // loops through all result sets (there may be more than one)
            do
            {
                // loads the DataTable (the schema is fetched automatically)
                var tb = new DataTable();
                tb.Load(reader);
                result.Tables.Add(tb);
            } while (!reader.IsClosed);
        }
        finally
        {
            // closes the connection
            context.Database.Connection.Close();
        }
    }

    // returns the DataSet
    return result;
}
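A usage sketch (the procedure name and parameter are hypothetical):

// one DataTable is added to the DataSet per result set the procedure returns
var ds = GetDataSet(
    "dbo.GetResultReport",
    CommandType.StoredProcedure,
    new Dictionary<string, object> { { "@QuestionId", 42 } });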
In my Entity Framework based solution I need to replace one of my LINQ queries with SQL, for efficiency reasons.
Also, I want my results in a DataTable from one stored procedure so that I can create a table-valued parameter to pass into a second stored procedure. So:
I'm using SQL
I don't want a DataSet
Iterating an IEnumerable probably isn't going to cut it - for efficiency reasons
Also, I am using EF6, so I would prefer DbContext.SqlQuery over ObjectContext.ExecuteStoreQuery, which the original poster asked about.
However, I found that this just didn't work:
_Context.Database.SqlQuery<DataTable>(sql, parameters).FirstOrDefault();
This is my solution. It returns a DataTable that is fetched using an ADO.NET SqlDataReader, which I believe is faster than a SqlDataAdapter on read-only data. It doesn't strictly answer the question, because it uses ADO.NET, but it shows how to do that after getting hold of the connection from the DbContext.
protected DataTable GetDataTable(string sql, params object[] parameters)
{
    // didn't work - table had no columns or rows:
    // return Context.Database.SqlQuery<DataTable>(sql, parameters).FirstOrDefault();

    DataTable result = new DataTable();
    SqlConnection conn = Context.Database.Connection as SqlConnection;
    if (conn == null)
    {
        throw new InvalidCastException("SqlConnection is invalid for this database");
    }

    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddRange(parameters);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            result.Load(reader);
        }
        return result;
    }
}
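A usage sketch (table, column and parameter names are illustrative):

// SqlParameter works here because the helper casts the connection to SqlConnection
var dt = GetDataTable(
    "select Id, Name from dbo.Customers where Country = @country",
    new SqlParameter("@country", "US"));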
The easiest way to return a DataTable using Entity Framework is to do the following:
MetaTable metaTable = Global.DefaultModel.GetTable("Your EntitySetName");
For example:
MetaTable metaTable = Global.DefaultModel.GetTable("Employees");
Maybe your stored procedure could return a complex type?
http://blogs.msdn.com/b/somasegar/archive/2010/01/11/entity-framework-in-net-4.aspx