Scaffolding with geometry data in PostgreSQL: mapping error

I'm trying to scaffold (database-first) an existing PostgreSQL database with geometry data.
In the VS project I have installed all the necessary NuGet packages (EntityFrameworkCore, EntityFrameworkCore.Design, EntityFrameworkCore.Relational, EntityFrameworkCore.Tools, Npgsql.EntityFrameworkCore.PostgreSQL, Npgsql.EntityFrameworkCore.PostgreSQL.Design and Npgsql.NetTopologySuite).
In the VS Package Manager Console, when launching the command:
Scaffold-DbContext "Host=myserver;Database=spatial;Username=postgres;Password=xxxxxxxx" Npgsql.EntityFrameworkCore.PostgreSQL -Schemas spu -OutputDir Spatials
It gives me this exception:
Could not find type mapping for column 'spu.nuts.geom' with data type
'geometry(Geometry,4326)'. Skipping column.
It doesn't map the geometry columns; all the other columns are fine.
What am I doing wrong?
Can I make scaffolding use NetTopologySuite?
Thanks a lot
Edit: Solved.
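For future readers, since the edit doesn't say what the fix was, here is a hedged reconstruction (not the poster's confirmed solution): with the Npgsql EF Core provider, geometry columns are normally scaffolded only when the EF-level plugin package Npgsql.EntityFrameworkCore.PostgreSQL.NetTopologySuite is referenced; Npgsql.NetTopologySuite on its own is only the ADO.NET-level plugin.

Install-Package Npgsql.EntityFrameworkCore.PostgreSQL.NetTopologySuite
Scaffold-DbContext "Host=myserver;Database=spatial;Username=postgres;Password=xxxxxxxx" Npgsql.EntityFrameworkCore.PostgreSQL -Schemas spu -OutputDir Spatials

At runtime the plugin also has to be enabled when configuring the provider, roughly:

// Sketch: maps geometry(Geometry,4326) columns to NetTopologySuite types.
optionsBuilder.UseNpgsql(connectionString, o => o.UseNetTopologySuite());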

Related

EF with existing Firebird database model generation problem

I have a Firebird database with tables and need to generate a DbContext, models, etc., so I installed the FirebirdSql.EntityFrameworkCore.Firebird provider.
The connection string is OK:
Console.WriteLine("Starting");
FbConnection db = new FbConnection(csb.ToString());
db.Open();
Console.WriteLine($"Database state:{db.State.ToString()}");
The DB connection is OK. I try to create the models using the command:
Scaffold-DbContext "user id=ХХХ;password=ХХХ;database=ХХХ;data source=localhost;port number=3050"FirebirdSql.EntityFrameworkCore.Firebird -OutputDir Models
The output looks good; however, the generated DbContext is empty.
You are using FirebirdSql.EntityFrameworkCore.Firebird (7.10.1). This version doesn't support DbContext scaffolding.
Try version 8.0.0 (currently in alpha3).
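A minimal Package Manager Console sketch of the upgrade (the exact prerelease version string is an assumption):

Install-Package FirebirdSql.EntityFrameworkCore.Firebird -IncludePrerelease

-IncludePrerelease pulls the latest preview; pin with -Version 8.0.0-alpha3 if you need that specific build.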

Failure to find table when using multiple schemas in PostgreSQL

WPF PostgreSQL 11.1
Npgsql.PostgresException: '42P01: relation "testme" does not exist'
When attempting to use a PostgreSQL database with multiple schemas, I have defined the following connection strings in the App.config. Note that the only difference is in the SearchPath:
<system.data>
  <DbProviderFactories>
    <add name="Npgsql Data Provider" invariant="Npgsql" support="FF" description=".Net Framework Data Provider for Postgresql Server" type="Npgsql.NpgsqlFactory, Npgsql, Version=4.0.4.0, Culture=neutral" />
  </DbProviderFactories>
</system.data>
<connectionStrings>
  <clear />
  <add name="localconnection" providerName="Npgsql" connectionString="Server=127.0.0.1;Port=5432;Database=chaos;User Id=postgres;Password=****;Searchpath=nova" />
  <add name="phoenixconnection" providerName="Npgsql" connectionString="Server=127.0.0.1;Port=5432;Database=chaos;User Id=postgres;Password=****;SearchPath=phoenix;" />
</connectionStrings>
The Npgsql data provider was installed using NuGet (Runtime Version v4.0.30319, Version 4.0.4.0).
In PostgreSQL, in the Phoenix schema:
CREATE TABLE phoenix.testme
(
    name text COLLATE pg_catalog."default" NOT NULL
)
WITH (
    OIDS = FALSE
)
TABLESPACE pg_default;

ALTER TABLE phoenix.testme
    OWNER to postgres;
Using PgAdmin, displaying the testme table works without problem:
select * from phoenix.testme;
I have configured the WCF service using the above connection strings. Using PetaPoco, I write the following script:
public string SayHello()
{
    string msg;
    using (var db = new chaosDB("phoenixconnection"))
    {
        var m = db.ExecuteScalar<string>("select version()");
        msg = string.Format("Hello from {0}", m);
        m = db.ExecuteScalar<string>("select current_schema");
        msg = string.Format("{0} Current Schema is {1}", msg, m);
        var ss = db.ExecuteScalar<string>("show search_path");
        var s = db.Fetch<string>("select * from testme"); // <-- THIS FAILS!
        msg = string.Format("{0} I Am {1}", msg, m);
    }
    return msg;
}
All works correctly until "select * from testme" is executed, at which point I receive the above error. Note: ss from "show search_path" correctly returns "phoenix".
What am I doing wrong? How do I get this to work? Any help is most appreciated.
After much head scratching the answer became self-evident. First I reset the search_path in the database; this did not help. Then I rebuilt the POCOs with PetaPoco and quickly discovered that not only was no POCO created for the new "testme" table, no POCOs were created at all. Checking the Database.tt file in PetaPoco showed it had the wrong ConnectionStringName. Changing the ConnectionStringName to "phoenixconnection" allowed building the POCOs, but the "testme" table was still not found.
Then the mistake became readily apparent: as stated above, both "phoenixconnection" and "localconnection" pointed to the same port. From previous development, PostgreSQL v10.1 was still running on the same port as the newer v11.1, and v10.1 was the one receiving the connections (not the newer v11.1).
Going to services (services.msc), shutting down v10.1, and running Database.tt now gave the error:
System.InvalidOperationException: Sequence contains more than one matching element
Apparently v10.1 (which I was using for development) had only ONE schema, but v11.1 has multiple schemas. I take the error message to mean that PetaPoco was seeing multiple tables with the same name, i.e., it was not distinguishing between schemas.
So, the problem is now solved:
1) Fix the ports. The older, single-schema PostgreSQL v10.1 stays on port 5432; the newer, multi-schema v11.1 goes on port 5433. The v10.1 instance will be used for generating the POCOs.
2) Fix the connection strings in the App.config of the WCF service so that at run time the WCF uses the newer v11.1. Once generated, LEAVE THE POCOs alone and reference them in the WCF file.
Apparently PetaPoco can only work with one schema when generating its POCOs, but at runtime it reads the connection strings from the App.config of the WCF service to execute its queries, etc. So in the App.config where Database.tt resides, point PetaPoco to the "development" database having only a single schema; in the WCF environment, point the connection string to the new database with multiple schemas. The SearchPath of the connection string IS respected when running through Npgsql.
It would be nice if PetaPoco could generate POCOs specific to a schema in a multi-schema environment, but at the moment I guess it can't :(
Addendum: It turns out that a given instance of PostgreSQL can host multiple DATABASES. So if the Npgsql connection string used at design time points to a development database with only one schema, PetaPoco works great for creating the POCOs. Those POCOs can then be used directly in a WCF service project and uploaded to an IIS website, and the App.config of the website can point the connection string at the deployed run-time database. All works well! :)
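A minimal sketch of the two takeaways above, using plain Npgsql (server, port, and credentials are placeholders taken from the question): SearchPath from the connection string is honored, and schema-qualifying the table sidesteps search_path entirely.

using System;
using Npgsql;

class SearchPathCheck
{
    static void Main()
    {
        // Placeholder connection string; v11.1 assumed to be on port 5433 per the fix above.
        var cs = "Server=127.0.0.1;Port=5433;Database=chaos;User Id=postgres;Password=****;SearchPath=phoenix";
        using (var conn = new NpgsqlConnection(cs))
        {
            conn.Open();
            // The SearchPath from the connection string should come back here.
            using (var cmd = new NpgsqlCommand("show search_path", conn))
                Console.WriteLine(cmd.ExecuteScalar()); // expect: phoenix
            // Schema-qualifying the table avoids depending on search_path at all.
            using (var cmd = new NpgsqlCommand("select name from phoenix.testme limit 1", conn))
                Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}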

Oracle 12c IMPDP errors

The import command...
impdp user/password DIRECTORY=desktop_import DUMPFILE=SENIORS4_Feb1.dmp TABLES=(DOCUMENT_PUBLISH, MEDIA_APPLICANT) REMAP_TABLESPACE=SENIORDB:import_user REMAP_SCHEMA=SENIORDB:import_user
...causes these errors:
Connected to: Oracle Database 12c Standard Edition Release 12.2.0.1.0 - 64bit Production
ORA-39002: invalid operation
ORA-39166: Object IMPORT_USER.MEDIA_APPLICANT was not found or could not be exported or imported.
ORA-39166: Object IMPORT_USER.DOCUMENT_PUBLISH was not found or could not be exported or imported.
When I searched the web, most of these errors were associated with problems creating log files.
Try to precede table names with their owner name, i.e.
TABLES=(SENIORDB.DOCUMENT_PUBLISH, SENIORDB.MEDIA_APPLICANT)
or even by enclosing both of them into double quotes, such as
TABLES=("SENIORDB"."DOCUMENT_PUBLISH", "SENIORDB"."MEDIA_APPLICANT")

export data from mongo to hive

My input: a collection ("demo1") in MongoDB (version 3.4.4).
My output: the data imported into a database in Hive ("demo2") (version 1.2.1.2.3.4.7-4).
Purpose: create a connector between Mongo and Hive.
Error:
Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mongodb/util/JSON
I tried two solutions following these steps (but the error remains):
1) I create a local collection in Mongo (via Robomongo) connected to Docker.
2) I upload these versions of the jars and add them in Hive:
ADD JAR /home/.../mongo-hadoop-hive-2.0.2.jar;
ADD JAR /home/.../mongo-hadoop-core-2.0.2.jar;
ADD JAR /home/.../mongo-java-driver-3.4.2.jar;
Unfortunately the error doesn't change. Unsure of the right versions for my export, I also tried these:
ADD JAR /home/.../mongo-hadoop-hive-1.3.0.jar;
ADD JAR /home/.../mongo-hadoop-core-1.3.0.jar;
ADD JAR /home/.../mongo-java-driver-2.13.2.jar;
3) I create an external table
CREATE EXTERNAL TABLE demo2
(
    id INT,
    name STRING,
    password STRING,
    email STRING
)
STORED BY 'com.mongodb.hadoop.hive.MongoStorageHandler'
WITH SERDEPROPERTIES('mongo.columns.mapping'='{"id":"_id","name":"name","password":"password","email":"email"}')
TBLPROPERTIES('mongo.uri'='mongodb://localhost:27017/local.demo1');
Error returned in hive :
Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com/mongodb/util/JSON
How can I resolve this problem?
Copying the correct jar files (mongo-hadoop-core-2.0.2.jar, mongo-hadoop-hive-2.0.2.jar, mongo-java-driver-3.2.2.jar) on ALL the nodes of the cluster did the trick for me.
Other points to take care of:
Follow all steps mentioned here religiously - https://github.com/mongodb/mongo-hadoop/wiki/Hive-Usage#installation
Adhere to the requirements given here - https://github.com/mongodb/mongo-hadoop#requirements
Other useful links
https://github.com/mongodb/mongo-hadoop/wiki/FAQ#i-get-a-classnotfoundexceptionnoclassdeffounderror-when-using-the-connector-what-do-i-do
https://groups.google.com/forum/#!topic/mongodb-user/xMVoTSePgg0
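A hypothetical shell sketch of getting the jars onto every node (host names and the Hive lib path are assumptions; adjust to your cluster layout):

# Push the connector jars to the Hive lib directory on each node.
for host in node1 node2 node3; do
  scp mongo-hadoop-core-2.0.2.jar \
      mongo-hadoop-hive-2.0.2.jar \
      mongo-java-driver-3.2.2.jar \
      "$host:/usr/hdp/current/hive-client/lib/"
done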

Error while trying to create a new edmx file - VS 2013, Entity Framework 5.0

As per my requirement, I need to create an .edmx file and establish a connection to a SQL Server residing on a remote server. I am using VS 2013 and the wizard model to connect to the DB.
But it throws this error. What am I missing? Do I need to change any settings or install something?
First I got the SQL CLR Types error, so I installed that; the other error was about SharedManagementObjects, so I installed that MSI as well.
My SQL Server is SQL Server 2012 Standard SP2, version 11.0.5058.
Loading metadata from the database took 00:00:03.2592313.
Generating the model took 00:00:03.1761556.
Could not save the XML to the configuration file 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' because of the error 'Access to the path 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' is denied.'.
Unable to update the App.Config file because of the following exception: 'Access to the path 'D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config' is denied.'
Writing the .edmx file took 00:00:00.0009981.
I am getting this error when trying to create a new .edmx file using VS 2013 with Entity Framework 5.0.
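A hedged guess, not a confirmed fix from the thread: an "Access to the path ... is denied" error on App.config during model generation is often just the file being read-only, for example because it is checked into source control. Clearing the attribute before rerunning the wizard may help:

:: Assumption: App.config is read-only (e.g., under source control).
attrib -R D:\PoCSolutions\EDMXTEST_WINFORMS1\EDMXTEST_WINFORMS1\App.config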