I use this connection string:
Server=(localdb)\\MyInstance;Timeout=30;Database=MyDB;AttachDBFilename="C:\Temp\MyDB.mdf";Trusted_Connection=True;
Then I run a migration from my code using
dbContext.Database.Migrate();
Normally, this "simply" works: the database is not just migrated, its file also gets created for it.
However, on the device of a colleague, the same code results in this error message:
System.Data.SqlClient.SqlException: "Cannot attach the file 'C:\Temp\MyDB.mdf' as database 'MyDB'."
If I give my database files to my colleague and he places them in the appropriate directory first, everything works as expected, and the other code in that program can access everything in it as it normally would.
We've tried different paths and always checked the file-system rights. LocalDB or Entity Framework (I'm not sure which is normally responsible for creating database files) simply won't create the database file if it's missing on his device.
Are there any switches causing this? Can I explicitly tell LocalDB via the connection string that it should create the database file?
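If there is no such switch, would explicitly creating the database against the LocalDB master database before migrating be a reasonable workaround? A rough sketch of what I mean (instance, database, and file names are the ones from the connection string above; untested on the affected machine):
// requires: using System.Data.SqlClient;
// Create the database (and its .mdf file) explicitly if it does not exist yet,
// then run the EF migration as before.
using (var master = new SqlConnection(@"Server=(localdb)\MyInstance;Database=master;Trusted_Connection=True;"))
{
    master.Open();
    using (var cmd = master.CreateCommand())
    {
        cmd.CommandText = @"IF DB_ID('MyDB') IS NULL
                                CREATE DATABASE MyDB
                                ON (NAME = 'MyDB', FILENAME = 'C:\Temp\MyDB.mdf');";
        cmd.ExecuteNonQuery();
    }
}
dbContext.Database.Migrate();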
Related
I have recently started using Code First and migrations and I'm pretty happy with it. I have been following the usual add-migration / update-database pattern!
I have just tried to move from LocalDB to SQL Express and I'm having a real pain with it.
When I try to run the application, I get the following error:
Cannot find the object "dbo.AspNetUsers" because it does not exist or you do not have permissions.
In my Global.asax file I have:
Database.SetInitializer(new MigrateDatabaseToLatestVersion<ApplicationDbContext, Configuration>());
Any ideas? It looks like the core Identity tables (AspNetUsers, etc.) are not being created.
If I run my application without the initializer in the Global file, I get this:
Migrations is enabled for context 'ApplicationDbContext' but the database does not exist or contains no mapped tables. Use Migrations to create the database and its tables, for example by running the 'Update-Database' command from the Package Manager Console.
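For reference, the full wiring in my Application_Start looks roughly like this (the namespaces are illustrative; Configuration is the class Enable-Migrations generated in the Migrations folder):
using System.Data.Entity;
using MyApp.Migrations;   // namespace of the generated Configuration class (illustrative)
using MyApp.Models;       // namespace of ApplicationDbContext (illustrative)

protected void Application_Start()
{
    Database.SetInitializer(
        new MigrateDatabaseToLatestVersion<ApplicationDbContext, Configuration>());

    // Optionally force the migration to run at startup instead of on first DB access.
    using (var context = new ApplicationDbContext())
    {
        context.Database.Initialize(force: true);
    }

    // ...the usual MVC registration calls follow
}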
Thanks
Ste.
I am using the Enterprise Library Semantic Logging Application Block (out of process) with the SQL Database sink to dump all the messages. After putting everything in place and doing a test run, I am getting the following error: could not find stored procedure 'dbo.WriteTraces'.
Has anybody faced a similar issue? Please suggest.
The out-of-process Semantic Logging assembly comes with some PowerShell scripts and .sql files. We have to edit these scripts (to change the DB name) and run them. This will generate the stored procedures and the associated table for us.
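In case it helps, this is roughly how we ran the edited script against our logging database with sqlcmd (the server, database, and script file names below are placeholders for whatever your setup and the package's scripts folder contain):
sqlcmd -S .\SQLEXPRESS -d SemanticLoggingDb -i CreateSemanticLoggingDatabaseObjects.sql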
I encountered this same error, but it was because we were trying to use a schema other than dbo for our logging database. Once we changed it back to dbo, that resolved the problem. We were using the out-of-process SemanticLogging-svc.exe, which, from what I can tell, assumes that dbo is the schema name.
Environment:
Visual Studio 2012
LocalDB v11
A solution with 3 projects in it:
1st: a class library with an ORM database model and a LocalDb.mdf database inside the App_Data directory.
2nd: a web project that uses this database model.
3rd: a C# console project that uses this LocalDB database, referencing the 1st class library and having a LocalDB connection string defined in its app.config as:
Data Source=(LocalDB)\v11.0;AttachDbFilename=C:\_work_desarrollo\Apps\Business\OpenAccessAppsModel\App_Data\LocalDb.mdf;Integrated Security=True
My problems are:
I would like to make some changes to this database using the VS 2012 Server Explorer, like deleting tables, but I get "The database is readonly. Updates to the database will not succeed until the database is made read write".
And second, how can I make the database file available to the console application (the 3rd project), so that I can copy/paste the Release folder to "install" the console application?
How should the database connection be modified so that the database lives locally with the console application (in the same directory as the app)?
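Would something like this be the right direction for the console app? A sketch of what I have in mind (names and paths are illustrative): point DataDirectory at the folder the exe runs from and reference it with |DataDirectory| in the connection string, so the .mdf can simply be copied into the Release folder.
// At the very start of Main() in the console project:
// make |DataDirectory| resolve to the folder the .exe runs from.
AppDomain.CurrentDomain.SetData("DataDirectory", AppDomain.CurrentDomain.BaseDirectory);

// app.config connection string (the .mdf sits next to the .exe):
// Data Source=(LocalDB)\v11.0;AttachDbFilename=|DataDirectory|\LocalDb.mdf;Integrated Security=True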
Thanks a lot
Happened to me today.
You can grant/change permissions on the database files like this:
icacls mydatabase*.* /grant "NT Service\MSSQL$SQLEXPRESS":(F)
icacls mydatabase*.* /grant "MYMACHINENAME\Administrator":(F)
I already had the SQL Express permissions set, but found out that MYMACHINENAME\Administrator was also needed.
Hope it helps.
Please check the attributes of the .mdf and .ldf files and of the containing folder.
Check this answer as well.
I've done these two things and presto! It works!
The problem is just the permissions on the .mdf and .ldf files for the user or account the application runs under. Simply grant read and write security permissions in the file properties, and use a connection string such as "Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=C:\WinApps17\Cash\Invoices\Data\VFindx.mdf;Integrated Security=True;Connect Timeout=30"
I get this exception in PostgreSQL:
org.postgresql.util.PSQLException: ERROR: could not access file "$libdir/plpgsql": No such file or directory
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:1721)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1489)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:193)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:452)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:337)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:236)
at org.apache.commons.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:205)
I searched a lot and most solutions point to a broken installation. But this is my test DB, which has been running without issues for a long time. Also, inserts are working; the issue occurs only on SELECT queries.
Apparently, you moved your PostgreSQL lib directory out of place. To confirm this, try the following in psql:
> SET client_encoding TO iso88591;
ERROR: could not access file "$libdir/utf8_and_iso8859_1": No such file or directory
If you get an error message like this, then my theory is correct. You'll need to find out where those files ended up, or you can reinstall PostgreSQL to restore them.
To find out what $libdir is referring to, run the following command:
pg_config --pkglibdir
For me, this produces:
/usr/lib/postgresql
I had the same problem: another Postgres server instance (8.4) was interfering with the 9.1 one; once the 8.4 instance was removed, it worked.
The other instance can sometimes be removed from the system while still running (e.g. you do a Gentoo update and a depclean without stopping and migrating your data), so the error seems particularly mysterious.
The solution is usually to do a slot install/eselect of the old version (in Gentoo terms; on other distros, simply downgrade), run its pg_dumpall, and then uninstall/reinstall the new version and import the data.
This worked pretty painlessly for me.
When I run the application with the following connection string, the database file is created successfully.
<add name="ConnString1"
connectionString="Data Source=.\SQLEXPRESS;
Database=Database1;
Integrated Security=SSPI;
AttachDBFilename=|DataDirectory|aspnetdb.mdf;
User Instance=true"
providerName="System.Data.SqlClient" />
If I delete the database file and try to run the application again, the database file fails to be created and I get the following inner exceptions:
The underlying provider failed on Open.
{"Cannot open database \"Database1\" requested by the login. The login failed.\Database1\nLogin failed for user 'computer\\someuser'."}
If I change Database=Database1 to Database=Database2 in the connection string, then the database file is created successfully. The problem always repeats itself.
How can I recreate the database file without having to change the database name?
Check to make sure the directory rights allow you to delete the .mdf file, and that the login has drop schema/table privileges. Directory rights are a common issue with .mdf files, due to the high security placed on these files because of their potentially sensitive nature.
I can see this is an older post - hopefully this can help someone in the same predicament.
Using Code First, the first time the application runs it builds the DB with no problems - it knows the database doesn't exist because it hasn't previously built it. Code First also takes a hash value of the models used and stores that in the new database - check for a table called EdmMetadata; that's where the hash value is stored. It uses the hash value to subsequently check whether the model has changed from build to build, so it knows whether to drop the database and rebuild.
The second time through, after you've deleted the database, it looks for the missing database to compare the model hash value and can't find it, because the database is now missing.
My workaround is to add a meaningless field (remembering to delete it after development) to one of the models to force the rebuild, without deleting the database. Alternatively, you could just modify the DB hash value to force the rebuild.
This works with the code first application databases - not so sure with the membership database.
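As a footnote, here is a minimal sketch of the initializer that implements the drop-and-rebuild behaviour described above (EF 4.x/5 Code First; MyContext is a placeholder for your DbContext):
using System.Data.Entity;

// DropCreateDatabaseIfModelChanges compares the stored model hash
// (the EdmMetadata table in early EF versions) with the current model
// and drops/recreates the database when they differ.
Database.SetInitializer(new DropCreateDatabaseIfModelChanges<MyContext>());

using (var context = new MyContext())
{
    context.Database.Initialize(force: true);   // run the check immediately
}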