I'm trying to restore a MongoDB database from a dump, but I keep getting timeout errors from Azure during the restore, and only partial collections end up restored.
error: Request timed out.
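The post doesn't say what the root cause is, but request-rate throttling on the Cosmos DB side is a common reason for timeouts and partially restored collections. One thing worth trying, as a sketch (host, credentials, and dump path are placeholders), is to reduce mongorestore's parallelism so writes stay under the provisioned throughput:
# restore one collection at a time with a single insertion worker
mongorestore --host HOSTNAME:PORT -u USERNAME -p PASSWORD --ssl --sslAllowInvalidCertificates --numParallelCollections=1 --numInsertionWorkersPerCollection=1 DUMPED_DB_DIRECTORY_PATH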
I have accidentally deleted database files from the folder (PostgreSQL/Data/base) and am now getting the error logs below.
I am getting the following error in the postgresql.log file:
UTC LOG: database system is ready to accept connections
UTC FATAL: database "activevos" does not exist
And the following error in the scripts.log file:
Failed to backup data from the PostgreSQL database.
Process Server update failed.
Can anyone please suggest how to recover the process engine in Informatica Cloud?
Note: the server is running on Linux.
I am performing a Postgres server migration on Azure, but it keeps failing. Every time I run the migration, it fails on extensions with the following error:
Data migration could not be started for one or more of the DBSets. Error details: PGv2RestoreError: PG Restore failed for database 'DATABASE_NAME' with exit code '1' and error message 'error: could not execute query: ERROR: must be owner of extension hypopg'.
Any ideas? The data appears to be in the tables, but the migration in the Azure portal just says failed. This is happening on two servers.
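If the hypopg extension is not actually needed on the target, one workaround worth trying, assuming you can modify the source database and nothing depends on the extension, is to drop it on the source before re-running the migration (connection details are placeholders):
# remove the extension that the restoring role does not own
psql "host=SOURCE_SERVER dbname=DATABASE_NAME user=ADMIN_USER sslmode=require" -c "DROP EXTENSION IF EXISTS hypopg;"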
I am trying to restore a dumped DB to Cosmos DB using the mongorestore command, but it throws the following error:
error running create command: (Unauthorized) Error=13, Details='Response status code does not indicate success: Forbidden (403); Substatus: 0; ActivityId: 421d33eb-dea0-4372-92b0-ece63fd2b357; Reason: ("Operation 'POST' on resource 'dbs' is not allowed through Azure Cosmos DB endpoint. Please switch on such operations for your account, or perform this operation through Azure Resource Manager, Azure Portal, Azure CLI or Azure Powershell"
ActivityId: 421d33eb-dea0-4372-92b0-ece63fd2b357, Microsoft.Azure.Documents.Common/2.11.0, Please see CosmosDiagnostics, Windows/10.0.14393 cosmos-netstandard-sdk/3.3.2);
2020-08-25T20:47:42.088+0530 0 document(s) restored successfully. 0 document(s) failed to restore.
I am following this link provided by Microsoft to migrate data: https://azure.microsoft.com/en-in/resources/videos/using-mongodb-tools-with-azure-cosmos-db/
The command used for restoring the DB is as follows:
mongorestore --host HOSTNAME:PORT -u USERNAME -p PASSWORD --db DBNAME DUMPED_DB_DIRECTORY_PATH --ssl --sslAllowInvalidCertificates
Which account is it referring to, and how do I enable the operation on that account as indicated by the logs? Is there an equivalent of mongorestore in the Azure CLI or Portal?
I also tried restoring a single collection, but the same error came up. I was able to dump the data from another Cosmos DB instance successfully.
It turned out to be a problem on the Azure policy side. The Azure policy on the Cosmos DB account that prevents key-based metadata write access was enabled, so no changes could be made to the DB. The organisation had put that policy in place; the issue was resolved after disabling the policy.
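For reference, assuming the restriction was the account-level setting that disables key-based metadata write access, it can be turned back on through the Azure CLI (account and resource group names are placeholders):
# allow key-based metadata writes (e.g. mongorestore creating databases/collections)
az cosmosdb update --name COSMOS_ACCOUNT_NAME --resource-group RESOURCE_GROUP_NAME --disable-key-based-metadata-write-access false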
I am working on migrating projects from an on-premises TFS server to VSTS; for that I followed this link.
But when I run the command TfsMigrator import /importFile:C:\TFSDataImportFiles\import.json, I get the error VS403250: The dacpac or source database is not a detached TFS Collection database, even though I detached the collection from my on-premises TFS server.
Can you please tell me how to resolve the above error as soon as possible?
I got confused by the difference between detaching the database and detaching the collection. You have to detach the collection using either the TFS admin console or the tfsconfig command-line utility on the application tier; I prefer the command line. FYI: my first attempt to re-attach the collection using the admin console got hung up and appeared to run for six hours before I restarted the app tier and it finally errored out. Attaching a 160 GB collection with 48 projects took only a few minutes when it worked correctly using tfsconfig.
You create your DACPAC or back up the collection after detaching the collection. Then, if it is a dry run, re-attach the collection.
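A sketch of the command-line flow I mean, run from the application tier (collection and database names are placeholders):
# detach the collection before generating the DACPAC
TfsConfig collection /detach /collectionName:"DefaultCollection"
# for a dry run, re-attach it afterwards
TfsConfig collection /attach /collectionDb:"SQL_SERVER_INSTANCE;Tfs_DefaultCollection"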
According to the troubleshooting document you need to detach the collection database, then generate the DACPAC again.
VS403250: The dacpac is not a detached TFS Collection database.
The DACPAC is not built off a detached collection. The collection database will need to be detached and the DACPAC generated again.
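For context, after detaching you regenerate the DACPAC against the now-detached collection database with SqlPackage; a rough sketch along the lines of the import documentation (server, database, and output path are placeholders):
SqlPackage.exe /action:Extract /sourceconnectionstring:"Data Source=localhost;Initial Catalog=Tfs_DefaultCollection;Integrated Security=True" /targetFile:"C:\dacpac\Tfs_DefaultCollection.dacpac" /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true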
I am having the exact same issue; I have already detached the collection and created the DACPAC file.
[Info #11:26:21.170] Import couldn't be started due to an error!
[Error #11:26:21.171] + [Error] VS403250: The dacpac or source database is not a detached Azure DevOps Server Collection database. Please refer to the troubleshooting documentation for more details; https://aka.ms/AzureDevOpsImportTroubleshooting
[Info #11:26:21.172] See our documentation for assistance: https://aka.ms/AzureDevOpsImportTroubleshooting
[Error #11:26:21.180] Microsoft.VisualStudio.Services.WebApi.SourceIsNotADetachedDatabaseException: VS403250: The dacpac or source database is not a detached Azure DevOps Server Collection database. Please refer to the troubleshooting documentation for more details; https://aka.ms/AzureDevOpsImportTroubleshooting
at Microsoft.VisualStudio.Services.WebApi.VssHttpClientBase.<HandleResponseAsync>d__53.MoveNext()
I followed all the steps mentioned in https://learn.microsoft.com/en-us/azure/devops/migrate/migration-import-large-collections?view=azure-devops as my DB is huge.
For the dry run, I created a collection on a new server and restored the DB, and that was successful, but when I do it from the actual on-premises server running Azure DevOps Server, I get this error.
Screenshot of the server
I am using Couchbase v2.2. Midway through a backup or restore, the following error appears, after which the backup process aborts.
s0 error: async operation: error: SASL auth exception
Has anyone else faced this issue or have any leads on why this happens?
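This isn't confirmed by anything in the post, but a SASL auth exception during cbbackup or cbrestore usually points at bucket authentication. A sketch of passing explicit credentials and targeting a single bucket (host, backup path, bucket name, and credentials are placeholders):
/opt/couchbase/bin/cbbackup http://COUCHBASE_HOST:8091 /backups/cluster -u Administrator -p PASSWORD -b BUCKET_NAME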