Out of memory exception when running a data-only script (SQL Server 2008 R2)

When I run a data-only script in SQL Server 2008 R2, it shows this error:
Cannot execute script
Additional information:
Exception of type 'System.OutOfMemoryException' was thrown. (mscorlib)
The script file is 115 MB and contains only data.
When I open this script file, it shows:
Document contains one or more extremely long lines of text.
These lines cause the editor to respond slowly when you open the file.
Do you still want to open the file?
I ran the schema-only script first and then the data-only script.
Is there any way to fix this error?

I solved it by using the sqlcmd utility.
sqlcmd -S "Server\InstanceName" -U "userName" -P "password" -i FilePathForScriptFile
For example:
sqlcmd -S .\SQLEXPRESS -U sa -P 123 -i D:\myScript.sql

Zey's answer was helpful for me, but for completeness:
If you want to use Windows Authentication, just omit the user and password.
And don't forget to put quotes around the path if it contains spaces.
sqlcmd -S .\SQLEXPRESS -i "C:\Users\Stack Overflow\Desktop\script.sql"

If you're logged into the domain with the correct privileges and there's only one instance running, you also don't have to provide the user/password/instance arguments shown above. I was able to just execute:
sqlcmd -i myfile.sql
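If you also want to keep a record of what the script prints, sqlcmd can write its output to a file with -o, the same flag used in the related question below. A sketch, with placeholder file names:
sqlcmd -S .\SQLEXPRESS -i D:\myScript.sql -o D:\myScriptOutput.log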

Related

Execute a .sql file with psql and make a log file

I'm dealing with psql for the first time and I need a command that executes a .sql file and, after executing, writes a .log file with the script output. I'm looking for something similar to this command that I use with SQL Server:
sqlcmd -U userid -P password -S serveraddress -i path_to_the_sql_file -o path_where_to_save_log_file
Can you help me, please? :-)
I would spend some time at the psql page.
A quick example:
psql -U userid -h serveraddress -d some_db -f path_to_the_sql_file -L path_where_to_save_log_file
With Postgres you need to connect to a database with a client. There are other options for inputting commands and capturing output at the link above.
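As one alternative (a sketch, assuming a Unix-like shell and the same placeholder names as above), you can skip -L and simply redirect psql's output and errors to a file:
psql -U userid -h serveraddress -d some_db -f path_to_the_sql_file > path_where_to_save_log_file 2>&1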

Postgres output from a Unix shell script does not appear in the log

I am writing a batch job for Postgres for the first time. I have written a ".sh" file which contains a command, but it produces no output in the log or on the console.
Code
export PGPASSWORD=<password>
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log\cleardata.log 2>&1
What I did at the command line
su postgres
and ran ./cleardatasetup.sh
Nothing is happening.
Please note: when I try the psql command directly at the Unix command line, I get a SQL exception message, which is expected.
Can anyone please help me with this?
You probably wanted to create log/cleardata.log but you have a backslash where you need a slash. You will find that the result is a file named log\cleardata.log instead.
The backslash is just a regular character in the file's name, but it's special to the shell, so you'll need to quote or escape it to (unambiguously) manipulate it from the shell:
ls -l log\\cleardata.log # escaped
mv 'log\cleardata.log' log/cleardata.log # quoted
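So the fix in the .sh file is simply to use a forward slash in the redirection (assuming the log directory already exists):
psql -h <host> -p <port> -U <user> -d <database> --file cleardata.sql > log/cleardata.log 2>&1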

pg_dump through ssh stops after a few seconds when used in a script

I back up three PostgreSQL servers with pg_dump launched by a script through ssh. The command line in the script is:
sudo -u barman ssh postgres@$SERVER 'pg_dump -Fc -b $database 2> ~/dump_error.txt' | gzip > $DUMP_ROOT/$SERVER-$BACKUPDATE.gz
But the dump size is always about 1 KB, for all servers. When I execute this line in a shell, just replacing the variables with their values, it works perfectly. I executed it as root (sudo -u barman ssh postgres@server ...) and as user barman (ssh postgres@server ...); the dump is correct.
When I open the dump, I see the start of the dump, but then it suddenly stops.
The dump_error.txt file on the servers is empty.
There is nothing in the logs (Postgres log and syslog), on either the backup or the PostgreSQL servers.
The user barman can connect to the servers as user postgres without a password.
The shell limits are high enough not to block the script (open files 1024, file size unlimited, max user processes 13098).
I tried changing the cron hour of the script, thinking that another process might be consuming all the resources, but the result is always the same, and ps -e shows nothing special.
The PostgreSQL version is 9.1.
Why does this line never produce a complete dump when executed from the script, but works when executed in a shell?
Thanks for your help, Denis
Your problem is caused by the quoting. Single quotes prevent the string from being expanded, while double quotes expand what's inside. For instance:
>MYVARIABLE=test
>echo '$MYVARIABLE'
$MYVARIABLE
>echo "$MYVARIABLE"
test
In your case, ssh postgres@$SERVER 'pg_dump -Fc -b $database 2> ~/dump_error.txt' will execute the command on the remote computer without expanding variables. This means ssh will pass the expression pg_dump -Fc -b $database as-is, and bash will interpret the variable $database on the remote computer. If this variable doesn't exist there, it is treated as an empty string.
You can see the difference when you run ssh user@server 'echo $PWD' and ssh user@server "echo $PWD".
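Applied to your script, double quotes let your local shell expand $database (and $SERVER) before the command string is sent to the remote host; a sketch of the corrected line:
sudo -u barman ssh postgres@$SERVER "pg_dump -Fc -b $database 2> ~/dump_error.txt" | gzip > $DUMP_ROOT/$SERVER-$BACKUPDATE.gz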

postgres, psql and a bat file

I'm going around in circles and so need some help.
I want to be able to run a .sql file against a database as a scheduled task via a .bat file. However, when I run the .bat file manually I get prompted for a password. I enter the password and then get told that:
"psql: FATAL: password authentication failed for user "(my windows login)"
'-h' is not recognised as an internal or external command,
operable program or batch file.
At the moment my .bat file reads:
@echo off
"D:\Program Files (x86)\PostgreSQL\9.1\bin\psql.exe"
-h localhost -U postgres -d database_name -f D:/scripts/SQL/test.sql
pause
First, what command do I need to add so the password prompt is filled in automatically?
Second, what am I doing wrong with the rest of the statement that prevents it from loading the .sql file?
Thanks
By adding this line to your config file (pg_hba.conf), you can tell Postgres to allow local connections without authentication:
local <database> <user> trust
http://www.postgresql.org/docs/9.1/static/auth-pg-hba-conf.html
All that needs to go into a single line in your batch file (it seems you have two lines):
@echo off
"D:\Program Files (x86)\PostgreSQL\9.1\bin\psql.exe" -h localhost -U postgres -d database_name -f D:/scripts/SQL/test.sql
pause
To avoid the password prompt, set the environment variable PGPASSWORD before calling psql:
@echo off
setlocal
set PGPASSWORD=my_very_secret_password
"D:\Program Files (x86)\PostgreSQL\9.1\bin\psql.exe" -h localhost -U postgres -d database_name -f D:/scripts/SQL/test.sql
pause
endlocal
The setlocal/endlocal commands are there to make sure the variable is cleared after the batch file has finished.
To avoid having the password in plain text in the batch file you can create a pgpass.conf file that contains the password. For details please see the manual: http://www.postgresql.org/docs/current/static/libpq-pgpass.html
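For reference, each line of pgpass.conf has the form hostname:port:database:username:password, and on Windows the file is read from %APPDATA%\postgresql\pgpass.conf. For the batch file above the entry might look like this (password shown only as a placeholder):
localhost:5432:database_name:postgres:my_very_secret_password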

Unable to restore PostgreSQL data through the command prompt

I am trying to restore PostgreSQL data from a file, but it is not importing.
Here is the command I am using:
postgres-# psql -hlocalhost -p5432 -u postgres -d test -f C:/wamp/www/test/database_backups/backup004.sql
Please help me figure out what I am doing wrong.
I am using Windows, and the above command does not throw any error, but it does not import the data.
Regards
Surjan
The only immediate thing I can see there is the capitalisation of -u for the username (it should be -U).
Correction: You're typing the command line into the psql shell.
You should exit to the CMD.EXE shell, and try the command there. With the correct capitalisation of -U, by the way.
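For example, from the CMD.EXE prompt rather than from inside psql (same paths as in your question, with only the -U capitalisation corrected):
psql -h localhost -p 5432 -U postgres -d test -f C:/wamp/www/test/database_backups/backup004.sql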
OR, use this to replay the script into that psql shell:
\i C:/wamp/www/test/database_backups/backup004.sql
The forward slashes don't cause a problem on my Windows machine.