Any help would be appreciated: I am failing to restore an SSAS cube when the full UNC path to the folder is provided in the code.
The SSAS cube restore works when I give the normal drive path in the code (e.g. F:\OLAP), but when I use the UNC path to the same shared folder (\\servername\F\OLAP), the following error is generated:
[Analysis Services Execute DDL Task] Error: File system error: The following error occurred during a file operation: Cannot create a file when that file already exists. . (\\servername\F\OLAP\28AEC4E6A0A74DF88A3F\CUBE.5.db).
The code is similar to the one below, with the AllowOverwrite property set to 'true':
<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>\\10.10.10.10\e\Live\DWH\OLAP_backup.abf</File>
  <DatabaseName>OLAP</DatabaseName>
  <AllowOverwrite>true</AllowOverwrite>
  <DbStorageLocation xmlns="http://schemas.microsoft.com/analysisservices/2008/engine/100/100">F:\OLAP\</DbStorageLocation>
</Restore>
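For reference, the failing UNC variant differs only in the storage location (same anonymized server name as in the error above):

<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>\\10.10.10.10\e\Live\DWH\OLAP_backup.abf</File>
  <DatabaseName>OLAP</DatabaseName>
  <AllowOverwrite>true</AllowOverwrite>
  <DbStorageLocation xmlns="http://schemas.microsoft.com/analysisservices/2008/engine/100/100">\\servername\F\OLAP\</DbStorageLocation>
</Restore>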
Have any of you already come across this and know of the root cause or an easy fix?
Thank you
I am working in QGIS through the DB Manager, running some spatial queries. After I create new tables with the queries, I need to export those tables as new GeoPackages.
I've tried using the "Export to vector file" option inside the DB Manager, but I get the following error message:

Error 2 Creation of data source failed (OGR error:
sqlite3_open(/Users/xxx/Documents/xxx/xxx/xxx/xxxx/xx/new_geopackage_layer.gpkg) failed:
unable to open database file)
I've read a couple of posts saying I needed to create an empty GeoPackage first and then export the table into it, but that did not work either. When I try to save inside an existing GeoPackage, I get an error saying:

"geopackage.gpkg already exists. Do you want to replace it? A file or folder with the same name already exists in the folder xxx. Replacing it will overwrite its current contents."
If I choose to overwrite, I get a second error message saying:

"Error 1 Unable to create the datasource. /Users/xxx/Documents/xxx/xxx/xxx/xxx/xxxx/new_geopackage.gpkg exists and overwrite flag is false."
All I want is to be able to run spatial queries inside QGIS and export the tables created by those queries as GeoPackages.
It seems that, as of now, I won't be able to do this from inside QGIS and will instead need to use the ogr2ogr command to export to any file type; a sketch of what I mean is below.
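In case it helps, here's a minimal ogr2ogr sketch of what I mean (the output path, connection parameters, and table name are placeholders, and I'm assuming a PostGIS source; the source string would differ for a SpatiaLite or GeoPackage database):

ogr2ogr -f GPKG /path/to/new_geopackage_layer.gpkg PG:"host=localhost dbname=mydb user=me" -sql "SELECT * FROM my_query_table" -nln my_query_table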
Any help would be really appreciated.
Thank you
In the course of trying to upgrade the Serverless Framework, I received the following error:
Error: EXDEV: cross-device link not permitted, rename '/tmp/serverless-binary-tmp' -> '/home/<username>/.serverless/bin/serverless'
Other similar errors/questions on SO point out that this error arises when trying to move files across partitions/devices; the trouble is that, on my machine, /tmp is not on a separate partition from /.
So I first tried changing the /tmp folder location for the Serverless Framework, but was unable to find documentation or options to that effect.
Fortunately, a manual copy of the file seems to have been the only missing step:
cp '/tmp/serverless-binary-tmp' '/home/<username>/.serverless/bin/serverless'
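In case the target directory is missing too, a slightly fuller sketch of the same workaround (assuming a standard shell; $HOME stands in for /home/<username>):

mkdir -p "$HOME/.serverless/bin"
cp /tmp/serverless-binary-tmp "$HOME/.serverless/bin/serverless"
chmod +x "$HOME/.serverless/bin/serverless"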
I use this connection string:
Server=(localdb)\\MyInstance;Timeout=30;Database=MyDB;AttachDBFilename="C:\Temp\MyDB.mdf";Trusted_Connection=True;
Then I run a migration from my code using:
dbContext.Database.Migrate();
Normally, this "simply" works: the database is not just migrated, its file is also created if it does not exist yet.
However, on the device of a colleague, the same code results in this error message:
System.Data.SqlClient.SqlException: "Cannot attach the file 'C:\Temp\MyDB.mdf' as database 'MyDB'."
If I give my database files to my colleague and he places them in the appropriate directory first, everything works as expected, and the rest of the program can access the database as it normally would.
We've tried different paths and always checked the file system permissions. LocalDB or Entity Framework (I'm not sure which is normally responsible for creating database files) simply won't create the database file if it's missing on his device.
Are there any switches causing this? Can I explicitly tell LocalDB via the connection string that it should create the database file?
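For context, here's a sketch of the SqlLocalDB commands we could run on his machine to check and (re)create the instance, assuming the SqlLocalDB utility is on the PATH (MyInstance is the instance name from the connection string above):

sqllocaldb info
sqllocaldb create MyInstance
sqllocaldb start MyInstance
sqllocaldb info MyInstance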
I'm trying to install PostgreSQL on my Windows 10 computer for the first time. I got an error at the end of the installation saying there was a "problem running post-install step. Installation may not complete correctly. The database cluster initialization failed."
When I run the SQL Shell, I get an error on the default login saying 'chcp' is not recognized as an internal or external command. I set the PATH environment variable to the bin folder of the PostgreSQL directory in Program Files. I also tried a number of other (but very dated) solutions to similar problems, such as moving my data directory outside of the PostgreSQL directory entirely. Most of these solutions date back to 2012 and don't seem to work anymore.
The one that seemed closest to working is "postgresql installation failed".
However, I can't find "postgres" as a user. I get an error saying:
"An object named "postgres" cannot be found. Check the selected object types and location for accuracy and ensure that you typed the object name correctly, or remove this object from the selection."
Does anybody have any updated solutions/tips for this?
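In case it's useful for an answer: my understanding is that the failed post-install step essentially runs initdb, so a manual sketch might look like the following (the version number, data directory, and options are assumptions on my part):

cd "C:\Program Files\PostgreSQL\13\bin"
initdb.exe -D "C:\pgdata" -U postgres -E UTF8 -W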
I have 50 txt files on Windows and I would like to insert their data into a single table on Redshift.
I created the basic table structure and now I'm having issues inserting the data. I tried using the COPY command from SQL Workbench/J, but it didn't work out.
Here's the command:
copy feed
from 'F:\Data\feed\feed1.txt'
credentials 'aws_access_key_id=<access>;aws_secret_access_key=<key>'
Here's the error:
-----------------------------------------------
error: CREDENTIALS argument is not supported when loading from file system
code: 8001
context:
query: 0
location: xen_load_unload.cpp:333
process: padbmaster [pid=1970]
-----------------------------------------------;
Upon removing the Credentials argument, here's the error I get:
[Amazon](500310) Invalid operation: LOAD source is not supported. (Hint: only S3 or DynamoDB or EMR based load is allowed);
I'm not a UNIX user so I don't really know how this should be done. Any help in this regard would be appreciated.
@patthebug is correct in that Redshift cannot see your local Windows drive. You must push the data into an S3 bucket. There are some additional supported sources per http://docs.aws.amazon.com/redshift/latest/dg/t_Loading_tables_with_the_COPY_command.html, but they seem outside the context you're working with. I suggest you get a copy of Cloudberry Explorer (http://www.cloudberrylab.com/free-amazon-s3-explorer-cloudfront-IAM.aspx), which you can use to copy those files up to S3.
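For example, a rough sketch: stage the files with the AWS CLI (or Cloudberry), then point COPY at the bucket prefix. The bucket name here is a placeholder, the delimiter assumes tab-separated files, and your credentials string stays as it was:

aws s3 cp F:\Data\feed\ s3://my-bucket/feed/ --recursive

copy feed
from 's3://my-bucket/feed/'
credentials 'aws_access_key_id=<access>;aws_secret_access_key=<key>'
delimiter '\t';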