Failed to retrieve long data for column varbinary(max) - tsql

We have created several packages using the import utility of MS SQL 2016. Each import copies data from one SQL 2016 database to another SQL 2016 database with the same schema. All the packages work fine except one, which copies a varbinary(max) column. When I execute this import through the import utility, everything works fine. When I execute it as a SQL Agent job, it fails on the source with the error: Failed to retrieve long data for column "Samples", Code: 0xC020901C. In the XML of the import .dtsx, I see that the column is interpreted as dataType="image". Should I change this data type? How can I make this work from the agent?

A teammate found the answer: the proxy credentials did not have write access to the temp directory of the SQL Server Agent account, and that temp directory was needed for writing the varbinary column. We didn't realise the temp directory was being used at all; although the tasks were running under the proxy credentials, the SQL Server Agent account's temp folder was the one being written to.
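For anyone debugging a similar failure: you can check which account the Agent service runs under (and therefore whose temp directory needs the write permission) from T-SQL. A minimal diagnostic sketch:

-- List the service accounts for the SQL Server and SQL Server Agent services
SELECT servicename, service_account
FROM sys.dm_server_services;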

Related

Error when trying to import with CSV file format in Cloud SQL

HTTPError 400: Unknown export file type was thrown when I tried to import a CSV file from my Cloud Storage bucket into my Cloud SQL DB. Any idea what I missed?
Reference:
gcloud sql import csv
CSV files are not supported by Cloud SQL for SQL Server. As mentioned here,
In Cloud SQL, SQL Server currently supports importing databases using
SQL and BAK files.
CSV import is, however, supported by the MySQL and PostgreSQL versions of Cloud SQL.
You could try one of the following solutions:
Change the database engine to either PostgreSQL or MySQL (where CSV files are supported).
If the data in your CSV file came from an on-premises SQL Server table, you can create a SQL file from it, then use that file to import into Cloud SQL for SQL Server.
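For reference, a hedged sketch of the SQL-file import; the instance, bucket, file, and database names below are placeholders:

# Import a SQL dump from Cloud Storage into Cloud SQL for SQL Server
gcloud sql import sql my-instance gs://my-bucket/table-dump.sql --database=my-database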

How do I dump data from an Oracle database without access to the database's file system

I am trying to dump the schema and data from an existing Oracle DB and import them into another Oracle DB.
I have tried using the "Export Wizard" provided by SQL Developer.
I found answers that use Oracle Data Pump; however, I do not have access to the filesystem of the DB server.
I expect to get a file that I can copy and import into another DB.
Without Data Pump, you have to make some concessions.
The biggest concession is that you're going to ask a client application, running somewhere on your network, to deal with a potentially HUGE amount of data/IO.
Within reasonable limits, you can use the Tools > Database Export wizard to build a series of SQL*Plus-style scripts, both DDL (CREATEs) and DATA (INSERTs).
Once you have those scripts, you can use SQL*Plus, SQLcl, or SQL Developer to run them on your new/target database.
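For a feel of what the wizard produces, the generated scripts are ordinary DDL and INSERT statements along these lines (the table, columns, and data here are invented for illustration):

-- DDL script generated by Tools > Database Export (illustrative)
CREATE TABLE employees (
  emp_id   NUMBER PRIMARY KEY,
  emp_name VARCHAR2(100)
);

-- Data script (illustrative)
INSERT INTO employees (emp_id, emp_name) VALUES (1, 'Alice');
INSERT INTO employees (emp_id, emp_name) VALUES (2, 'Bob');

You can then run each script on the target database from a SQL*Plus, SQLcl, or SQL Developer session, e.g. @export_ddl.sql (the file name is whatever the wizard wrote out).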

Where is the DATA_PUMP_DIR in SQL Developer

I'm trying to import a .dmp file using the Data Pump Import tool in Oracle SQL Developer.
I'm connected to an Oracle database running in a container on my local machine.
When I get to the step where I specify the dump file to import, where should I place the .dmp file?
DATA_PUMP_DIR is a default Oracle directory object. It isn't part of SQL Developer; the import tool is really just giving you a GUI equivalent of running impdp from the command line.
You can find the operating system location that Oracle directory object points to by querying the data dictionary:
select directory_path from all_directories where directory_name = 'DATA_PUMP_DIR';
The path it returns is on the database server (in your case that'll be inside your container too), and your dump file needs to go there.
You might want to create additional directory objects pointing to other locations, and grant suitable privileges to users to be able to access them; but they all need to be on the DB server and read/writable by the Oracle process owner on that server.
(They could be remote filesystems mounted on the server; they don't necessarily have to be local storage, but that's another issue and more operating-system specific. Again, in your case, you might be able to share a folder on your local machine with the container if you don't want to copy the file into the container.)
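If you do create your own directory object, the pattern looks like this; the directory name, path, and grantee below are placeholders, and the path must already exist on the database server:

-- Run as a privileged user (requires the CREATE ANY DIRECTORY privilege)
CREATE DIRECTORY my_dpump_dir AS '/opt/oracle/dumpfiles';
GRANT READ, WRITE ON DIRECTORY my_dpump_dir TO import_user;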

How to export data from SQL Server to PostgreSQL?

I need to export all tables from SQL Server to PostgreSQL.
Try: I tried from the SQL Server IDE, but at some stage it gives an error about the data types being different.
Question: How can I export data from SQL Server to PostgreSQL? Would COPY do the job? If yes, how can I export all tables including their records?
You can't export data from MS SQL and import it straight into PostgreSQL, because the syntax and data types are not the same, but you can use a tool to migrate the data from MS SQL to PostgreSQL.
See more in this topic:
migrate data from MS SQL to PostgreSQL?
Use https://dbeaver.io/
Create MS SQL and PostgreSQL database connections (login)
Create the target tables in PostgreSQL (same structure as in MS SQL)
Press F5 to see the new tables
Right-click on the new tables -> 'Import Data' -> You will see the 'Data Transfer' window
Choose the 'Table' type, then click 'Next' -> You will see 'Select input object', where you can choose tables from the MS SQL connection
Then just click 'Next', check the settings you need, and done :D
First export the schema into a file and run it against PostgreSQL until you've removed all incompatibilities.
You could try to do the same with the data you want to export, but you may be better off writing a Python script to migrate it.
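Many of the incompatibilities are simply type names. As a hedged before/after sketch (the table and columns are invented for illustration):

-- SQL Server DDL as exported
CREATE TABLE orders (
  id      INT IDENTITY(1,1) PRIMARY KEY,
  note    NVARCHAR(MAX),
  is_paid BIT
);

-- The same table rewritten for PostgreSQL
CREATE TABLE orders (
  id      INTEGER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  note    TEXT,
  is_paid BOOLEAN
);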
There is an absolutely simple way using the built-in SSIS tool via Management Studio. You can find the detailed answer here.
Use https://dbeaver.io/, as An Le mentioned.
After 40 years of DB development, migrating DB data is still a challenge. DBeaver is a free tool to use for data migration. But you still have to migrate the schema.
Exporting data from DBeaver
From the contextual menu of your SQL Server database or schema, select Tools > Create new Task > Common > Data Export.
You will generate SQL INSERT files or CSV files. For migration between database types, use CSV files.
Cons of SQL Server Migration Tool
Unable to migrate rows containing booleans.
The export ended in errors when migrating data with bool columns, complaining that a value is not boolean, although both the source and destination columns were of boolean type.
Unable to continue with the next tables after one table migration fails.
SQL Server - a single error stops the whole migration, even for tables that are not related to the initial error.
Configuring the tool over and over again while trying to export your data is a waste of time. The SQL Server migration task does not save the configuration of the source and destination connections, and the wizard is not user-friendly; spending your time on it is frustrating. I assume the migration project has been abandoned for at least 10 years.
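If you hit the boolean problem, one workaround (a sketch, assuming the column landed in PostgreSQL as an integer 0/1; the table and column names are invented) is to load the data as-is and convert afterwards:

-- Convert a column loaded as integer 0/1 into a proper boolean
ALTER TABLE orders
  ALTER COLUMN is_paid TYPE boolean
  USING (is_paid = 1);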

How to restore SQL Server 2008 R2 backup to a LocalDb 2012

I have a backup of a database (.bak) created in SQL Server 2008 R2.
To test some features, I'd like to import this backup into LocalDB (2012).
When I click on Restore and select the database the following error occurs:
Property MasterDBLogPath is not available for Information 'Microsoft.SqlServer.Management.Smo.Information'. This property may not exist for this object, or may not be retrievable due to insufficient access rights. (Microsoft.SqlServer.Smo)
You need to add the following 3 registry keys (Run > regedit):
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL11E.LOCALDB\MSSQLServer\DefaultData,
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL11E.LOCALDB\MSSQLServer\DefaultLog,
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL11E.LOCALDB\MSSQLServer\BackupDirectory
With an existing folder name as value (where you have write access), e.g. "C:\Databases".
Please have a look at the excellent walkthrough at http://www.roelvanlisdonk.nl/?p=2896 (which is where I copied this answer from).
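Once the registry keys are in place, the restore itself can also be run as plain T-SQL against the LocalDB instance. A sketch with placeholder paths and names; run RESTORE FILELISTONLY first to get the actual logical file names:

-- Inspect the logical file names inside the backup
RESTORE FILELISTONLY FROM DISK = 'C:\Databases\MyDb.bak';

-- Restore, relocating the files to a folder the instance can write to
RESTORE DATABASE MyDb
FROM DISK = 'C:\Databases\MyDb.bak'
WITH MOVE 'MyDb'     TO 'C:\Databases\MyDb.mdf',
     MOVE 'MyDb_log' TO 'C:\Databases\MyDb_log.ldf';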