.NET connector for PostgreSQL

The last time I used Npgsql (version 1.0), it was very slow. Is there any alternative to Npgsql?

Version 1.0 is three years old. Try using the newest one.

Npgsql is an excellent connector; just upgrade to the newest version, and that should solve the speed issue. Make sure you also take a look at the documentation; it is really good.
You asked about an alternative, so I also have to recommend another good connector: dotConnect for PostgreSQL, made by Devart. There is a simple free edition as well as a fully featured paid one; the paid edition has LINQ and Entity Framework support.
http://www.devart.com/dotconnect/postgresql/
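Either way, basic usage follows the standard ADO.NET pattern. A minimal sketch, assuming a recent Npgsql version and placeholder connection-string values:

using System;
using Npgsql;

// Open a (pooled) connection and run a simple query.
var connString = "Host=localhost;Username=postgres;Password=secret;Database=mydb";
using var conn = new NpgsqlConnection(connString);
conn.Open();

using var cmd = new NpgsqlCommand("SELECT id, name FROM customers", conn);
using var reader = cmd.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");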

I have experience with the .NET MySQL connector. What you are describing sounds like a DNS issue. If you are using a hostname in your connection string and are able to change it to an IP address, try that and see if the delay goes away.
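For illustration (hypothetical values; 192.0.2.10 is a documentation address), that just means swapping the host in the connection string:

// Same connection string, hostname vs. direct IP address.
var byName = "Host=db.example.com;Username=app;Password=secret;Database=mydb";
var byIp = "Host=192.0.2.10;Username=app;Password=secret;Database=mydb";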

Npgsql is still the go-to choice for .NET when connecting to PostgreSQL.
Since version 1.0 the connector has improved drastically; check out this presentation by Shay Rojansky. It is not the latest, but the performance boost was already quite impressive as of November 2018.
If you are upgrading from an old version, read the release notes of the latest version carefully; you might otherwise break functionality in your code.
I also strongly recommend tuning PostgreSQL itself. I work with it daily in a distributed enterprise environment with massive workloads; it can be tuned and tweaked with a dramatic impact on overall performance.

As #yojimbo87 said, upgrade to a newer connector version. Beyond that:
Use Entity Framework Core; Npgsql has an EF Core provider.
Use PostgreSQL 11.
Check your connection pool settings.
Like most ADO.NET providers, Npgsql uses connection pooling by default. When you Close() the NpgsqlConnection object, an internal object representing the actual underlying connection that Npgsql uses goes into a pool to be re-used, saving the overhead of creating another unnecessarily.
This suits most applications well, as it's common to want to use a connection several times in the space of a second.
If that doesn't suit your application, include the option Pooling=false in your connection string; this overrides the default, and Close() will then close the actual connection.
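A minimal sketch of the difference (placeholder credentials):

using Npgsql;

// With Pooling=false, disposing the connection closes the physical
// connection instead of returning it to the pool.
var connString = "Host=localhost;Username=postgres;Password=secret;Database=mydb;Pooling=false";
using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();
    // ... use the connection ...
} // The underlying socket is actually closed here.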
Npgsql has an Entity Framework (EF) Core provider. It behaves like other EF Core providers (e.g. SQL Server), so the general EF Core docs apply here as well. If you're just getting started with EF Core, those docs are the best place to start.
Development happens in the Npgsql.EntityFrameworkCore.PostgreSQL repository; all issues should be reported there.
https://www.npgsql.org/efcore/index.html
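A minimal sketch of a context wired to that provider (hypothetical model and placeholder connection string; UseNpgsql comes from the Npgsql.EntityFrameworkCore.PostgreSQL package):

using Microsoft.EntityFrameworkCore;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseNpgsql("Host=localhost;Database=mydb;Username=postgres;Password=secret");
}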

Related

Scaffold EF Core from Database Project

How can I scaffold EF Core directly from a Visual Studio SQL Server Database Project?
Solutions such as the following are preferred:
scaffold-dbcontext -connection "provider=ssdtproject, name=myprojectname.sqlproj"
scaffold-dbcontext -ddl "ssdtprojectoutput.sql"
scaffold-dbcontext -ssdtschema "ssdtproject.dacpac"
maintained-third-party-tool myprojectname.sqlproj -EfModelGenerationParameters
That's the whole question. What follows is my situation in more detail, so that you may be able to offer alternate solutions:
Although MS acknowledges EF Core is still not production-ready, it's also now 3-4 years since EF 6 progress ceased, and EF Core is the only code-similar, LINQ-based path forward with .NET Core compatibility. Thus begins the saga titled "So you're going to be using EF Core."
This part is opinionated, but to me (based on 25+ years of enterprise software design and development experience) Code-First is an absolute non-starter. It's fine for small week-one application concepts, but there's no reasonable pattern/process/practice that I can see to integrate constraints, views, etc. Without views designed-in, real business apps end up with devs repeating logic fundamentals in LINQ expressions all over the place, littering the code with static fields to support LINQ-to-SQL queries, confusing micro-combinatory patterns using LinqKit, etc. Without constraints we end up with ten times the defensive code requirements to handle runtime errors, rapidly blossoming unit and integration tests, and demo failures become the norm. Either our object-oriented experts need to become SQL experts or the converse, and we drastically increase the difficulty of finding and properly-compensating engineers. All of these issues I pointed out in a detailed conversation four years ago with Rowan Miller (who recently left the EF team, which doesn't bode well for near-term solutions).
Model-First (the visual .edmx designer in prior EF versions) is obviously off the table, since the MS solution to this was to claim Code-First really IS Model-First, and wash their hands of it. Consequently a truly neutral, let's call it "Contract-First" for clarity, approach doesn't exist in EF Core.
So, that rant (sorry, frustrated) brings me to Database-First, and thus Scaffold-DbContext. Our DB schema is currently a revision-controlled Visual Studio SQL Server Database Project. Aside from some known issues with this, it also seems ridiculous to have to take our DB schema (currently our single point of truth), rebuild a live database from it, and then back-generate code from the live database, all as part of our build process, just to verify database type alignment. I'd like instead to be able to simply detect changes and regenerate my DbContext and related entities directly from the Database Project.
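For reference, that live-database round trip is the standard EF Core flow (placeholder connection string, run from the Package Manager Console):

Scaffold-DbContext "Server=(localdb)\mssqllocaldb;Database=MyDb;Trusted_Connection=True" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Entities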
SSDT Database Projects seem to make Database-like objects available in many of the UIs where normally database connections are required. That makes me think it may be a short walk to use the database schema as a source for existing tools. For example, use a metadata provider in a connection string, make a simple modification to the EF Core code, etc.
SQL Sharpener "generate[s] at design-time using SQL files as the source-of-truth (such as those found in an SSDT project)", and was recommended as a solution to this problem for previous versions of EF, but it does not support EF Core.
SQLite and SQL Server Compact Toolbox just added support for generating EF Models directly from .DACPAC, but it appears to depend on the EntityFramework Reverse POCO Code First Generator for that functionality, which prominently lists "Support EF Core" on its TODO List. The primary contributor of this project confirms that incompatibility.
Help?
I was struggling with the same until I ran into the sublime EF Core Power Tools extension for Visual Studio. Its reverse engineering tool sounds just like what you need.
https://marketplace.visualstudio.com/items?itemName=ErikEJ.EFCorePowerTools

Entity Framework 6.1.3 code first fluent mappings compatible database on mono

I have an ASP.NET Web API 2 system that works with SQL Server. I developed it using Entity Framework 6.1.3 Code First data models and fluent mappings, with the typical add-migration/update-database workflow. I love it.
I have a need to create the exact same software with a lighter-weight DB to run on a Raspberry Pi device. It's the disconnected version of the software that will replay/resync all of its data to the cloud version (SQL Server).
I realize I may need to relax some of my constraints, but starting at the extreme, I would like to target the exact same code base with something like SQLite, xcopy-deploy it to my Raspberry Pi, and run it on Mono under the Kestrel web server.
Ideally, I'd just like to change my connection string to point to an empty SQLite DB, do an update-database, and have the exact same software initially run on my Windows development box (and then xcopy it over).
I have read a lot about SQLite Entity Framework support, but (a) it doesn't seem to support migrations, and (b) it doesn't seem to support fluent mappings.
I could get by using a tool to convert my SQL Server DB to SQLite (every time I change the schema) and thus avoid the need for update-database. But the lack of fluent mappings would still prevent the data model from being properly mapped to the existing SQLite schema.
Does anybody have some thoughts/recommendations for SQLite that may help me accomplish my goals?
Do you have any other database recommendations that would help me accomplish my goals? For instance, I looked at VistaDB, but I don't think they support fluent mappings either.
The Devart SQLite driver seems to support everything I need, but their examples are all old school, and AFAIK they don't have one single example of a modern Code First model with fluent mappings. And even if they did fully support Code First with fluent mappings, I am concerned there would be some syntax differences, and I am not sure my existing SQL Server-targeting code would be compatible with it. I asked the question on their forums and sent an email but haven't received a response yet.
Thanks
You could consider using EF7, which is an API-compatible new version of Entity Framework that fully supports migrations and fluent mappings with SQLite. EF7 runs on .NET 4.6 and .NET Core. Depending on which EF6 features you use, it could be an easy upgrade, particularly since you already use Code First.
http://ef.readthedocs.org/en/latest/getting-started/linux.html
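A minimal sketch of what that looks like (hypothetical model; UseSqlite comes from the SQLite provider package, Microsoft.EntityFrameworkCore.Sqlite under the current naming):

using Microsoft.EntityFrameworkCore;

public class Reading
{
    public int Id { get; set; }
    public string Sensor { get; set; }
}

public class DeviceContext : DbContext
{
    public DbSet<Reading> Readings => Set<Reading>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=device.db");

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Fluent mappings work the same way they do against SQL Server.
        modelBuilder.Entity<Reading>(e =>
        {
            e.ToTable("readings");
            e.Property(r => r.Sensor).IsRequired().HasMaxLength(64);
        });
    }
}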

What does Microsoft recommend for 2nd level Caching in Entity Framework?

I've used the "EF Provider Wrappers" made by Jarek Kowalski. They work fine, but I noticed the "Limitations and Disclaimers" section, where it says:
The providers have not been extensively tested beyond what’s included in the sample code, so you should use them at your own risk.
As with any other sample, Microsoft is not offering any kind of support for it, but if you find bugs or have feature suggestions, please use this blog’s contact form and let me know about them.
I'm a little confused here. Does Microsoft really expect developers to use Entity Framework on production websites without any official support for (or recommendation of) 2nd level caching?
There is no official 2nd level cache support. I'm not even sure whether the EF Provider Wrappers are compatible with .NET 4.5. A 2nd level cache is on the backlog for future versions of EF.
You can also implement your own solution, because EF is fully open source.
By the way, I have seen dozens of quite complex websites running in production without any cache...
There is now a 2nd level cache provider available for EF 6.x
Entity Framework does not currently support caching of query results. A sample EF Caching provider is available for Entity Framework version 5 and earlier but due to changes to the provider model this sample provider does not work with Entity Framework 6 and newer. This project is filling the gap by enabling caching of query results for Entity Framework 6.1 applications.
https://github.com/moozzyk/EFCache
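Wiring it up means registering the caching provider through a DbConfiguration; a sketch roughly following the EFCache README (assumes the EFCache NuGet package):

using System.Data.Entity;
using System.Data.Entity.Core.Common;
using EFCache;

public class CachingConfiguration : DbConfiguration
{
    public CachingConfiguration()
    {
        // The transaction handler invalidates cached entries on commit.
        var transactionHandler = new CacheTransactionHandler(new InMemoryCache());
        AddInterceptor(transactionHandler);

        // Wrap the provider services so query results are served from the cache.
        var cachingPolicy = new CachingPolicy();
        Loaded += (sender, args) => args.ReplaceService<DbProviderServices>(
            (services, _) => new CachingProviderServices(services, transactionHandler, cachingPolicy));
    }
}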
And there is a Redis provider implemented on top of it:
Extends EFCache by adding Redis support
I wanted to add L2 Cache to EF using Redis - there was nothing available at the time.
I found EFCache written by Pawel Kadluczka (moozzyk) over on CodePlex
https://github.com/silentbobbert/EFCache.Redis
Apache Ignite.NET provides a distributed in-memory 2nd level cache for Entity Framework: https://apacheignite-net.readme.io/docs/entity-framework-second-level-cache

Creating a Custom Entity Framework for Unsupported System

I could be totally misunderstanding Entity Framework here. I want to use it in my latest project (how else do you learn?). The problem is that the IBM i driver doesn't have built-in support for it. Is it possible to create that framework support from scratch? Is it worth it?
It sounds like you'd be writing your own ADO.NET data provider to connect to IBM DB2 for i. Microsoft provides documentation for creating your own provider and a sample.
The data provider would be responsible for communicating with the database, so I'm not sure how you'd accomplish that. Either you'd be implementing your own connection to the database server running on the i (maybe you can port the SQL piece of JTOpen), or you'd be delegating your calls to the IBM-provided data provider (if that's even possible) or other data access method.
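To give a sense of the surface area, a custom provider means subclassing the abstract ADO.NET base classes (DbConnection, DbCommand, DbDataReader, DbProviderFactory, ...). A bare, hypothetical skeleton of just the connection class:

using System;
using System.Data;
using System.Data.Common;

// Hypothetical skeleton of a custom ADO.NET connection for IBM DB2 for i.
public sealed class IbmIConnection : DbConnection
{
    public override string ConnectionString { get; set; }
    public override string Database => "placeholder";
    public override string DataSource => "placeholder";
    public override string ServerVersion => "placeholder";
    public override ConnectionState State => ConnectionState.Closed;

    public override void Open() { /* establish the wire-level connection */ }
    public override void Close() { /* tear it down */ }
    public override void ChangeDatabase(string databaseName) { }

    protected override DbTransaction BeginDbTransaction(IsolationLevel il)
        => throw new NotImplementedException();
    protected override DbCommand CreateDbCommand()
        => throw new NotImplementedException();
}

And that is only one of the classes; the command, reader, and factory need the same treatment, which is where the real work (the wire protocol, or delegation to IBM's provider) lives.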
I couldn't decide whether I thought this was (1) a huge pain in the butt or (2) an opportunity for an open source project. (I guess it could be both.) It seems like it'd be easier to lobby IBM to make this part of their stock provider. You might complain about it on MIDRANGE-L and see if people will take up the cause.
Disclaimer: I am a newbie in the .NET world, so maybe there's an easier way to accomplish what you're trying to do.

Is Classic ADO still viable for a mixed managed/unmanaged App?

We have a complex architecture with much logic in unmanaged code that needs database access.
Currently this is via ODBC drivers and MFC classes, and we're considering the issues of migrating our abstraction layer to use ADO or ADO.NET. In the latter case we'd have to push database logic back up into the .NET layer. I'm trying to decide whether the pain of invoking the database via .NET callbacks is offset by the improvements in ADO.NET.
The Wikipedia comparison was interesting, although I'm not sure I believe all the points in the comparison table (e.g., does ADO.NET always use XML to pass data?).
A 2005 comparison shows ADO.NET performing dramatically faster.
Microsoft's guide to ADO.NET for ADO programmers suggests we would gain much from going to ADO.NET, especially the way data is available in native (.NET) types rather than solely through OLE Automation's Variant.
e.g., does ADO.NET always use XML to pass data?
No. That sounds like misinformation on Wikipedia, then.
Two choices. First, I would REALLY get rid of ODBC and move at least to OLE DB, driver-wise, if possible (tell me about it - I have a .NET app using an ODBC driver to call a JDBC driver to call a third-party application server).
Now, you can go both ways - ADO on both sides, or managed ADO.NET exposed from the .NET layer - but this is really not a programmer decision; it is an architectural thing that should be seen in the larger context.
I would probably go for a .NET layer, possibly with an OData exposure layer at the same time, and try to consume that from the unmanaged layer.
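If you do push the database logic into a .NET layer, one well-trodden way for the unmanaged code to consume it is COM interop. A hypothetical sketch (the interface, class, and GUIDs are placeholders; the assembly would be registered with regasm):

using System;
using System.Runtime.InteropServices;

// Expose a .NET data-access facade to unmanaged callers via COM.
[ComVisible(true)]
[Guid("D4C5E8A1-1111-2222-3333-444444444444")]
[InterfaceType(ComInterfaceType.InterfaceIsIDispatch)]
public interface ICustomerStore
{
    string GetCustomerName(int id);
}

[ComVisible(true)]
[Guid("D4C5E8A1-5555-6666-7777-888888888888")]
[ClassInterface(ClassInterfaceType.None)]
public class CustomerStore : ICustomerStore
{
    public string GetCustomerName(int id)
    {
        // The ADO.NET call (e.g., SqlConnection/SqlCommand) would go here.
        return "placeholder";
    }
}

Unmanaged C++ would then CoCreateInstance the CustomerStore CLSID and call through IDispatch, keeping all database logic on the managed side.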