Escape Windows file path in Docker SQL string - tsql

I'm trying to build a SQL Server Docker image that already has a previous backup restored. My Dockerfile looks like this (I know the restore command isn't correct):
FROM microsoft/mssql-server-windows-developer
RUN mkdir "C:\\SQLServer"
COPY Backup.bak C:\\SQLServer
ENV sa_password=verysecurepassword
ENV ACCEPT_EULA=Y
RUN sqlcmd -S 127.0.0.1 -Q "RESTORE DATABASE [MyDatabase] FROM DISK = N'C:\\SQLServer\\Backup.bak' WITH FILE = 1, NOUNLOAD, REPLACE, NORECOVERY, STATS = 5"
When I run docker build with this file, I get the following error:
Sqlcmd: 'DATABASE [MyDatabase] FROM DISK = NC:\\SQLServer\\Backup.bak WITH FILE = 1 NOUNLOAD REPLACE NOR
= 5': Unexpected argument. Enter '-?' for help.
As you can see, it has stripped out the single quotes around the file path. I tried escaping the single quotes with backslashes, but the quotes still get stripped out, leaving the extra backslashes in the path as well.
If I run sqlcmd on the container directly, I can run the restore command (and get an error about files not being copied, which is expected). What is the format for embedding Windows paths in Dockerfile SQL strings?

You should escape the double quotes around the SQL with a \ for Docker.
You should escape the single quotes around the file name with a backtick (`) for PowerShell:
RUN sqlcmd -S 127.0.0.1 -Q \"RESTORE DATABASE [MyDatabase] FROM DISK = N`'C:\\SQLServer\\Backup.bak`' WITH FILE = 1, NOUNLOAD, REPLACE, NORECOVERY, STATS = 5\"
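For context, here is the question's Dockerfile with only the restore line swapped for the escaped version. This is a sketch of the quoting fix only; as the question itself notes, whether the RESTORE actually succeeds at build time (SQL Server must be running inside the build container) is a separate problem:
FROM microsoft/mssql-server-windows-developer
RUN mkdir "C:\\SQLServer"
COPY Backup.bak C:\\SQLServer
ENV sa_password=verysecurepassword
ENV ACCEPT_EULA=Y
RUN sqlcmd -S 127.0.0.1 -Q \"RESTORE DATABASE [MyDatabase] FROM DISK = N`'C:\\SQLServer\\Backup.bak`' WITH FILE = 1, NOUNLOAD, REPLACE, NORECOVERY, STATS = 5\"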

Related

Dockerized pgAdmin mapped volume + COPY not working

I have a scenario where a certain data set comes from a CSV, and I need to allow a non-dev to hit pgAdmin and update this data set. I want to be able to put this CSV in a folder mapped from the host system and then use the pgAdmin GUI to run a COPY command. So far pgAdmin is telling me:
ERROR: could not open file "/var/lib/pgadmin/data-files/some_data.csv" for reading: No such file or directory
Here are my steps so far, along with a sanity check inspect:
docker volume create --name=data-files
docker run -e PGADMIN_DEFAULT_EMAIL="pgadmin@example.com" -e PGADMIN_DEFAULT_PASSWORD=some_pass -v data-files:/var/lib/pgadmin/data-files -d -p 5050:80 --name pgadmin dpage/pgadmin4
docker volume inspect data-files --format '{{.Mountpoint}}'
/app/docker/volumes/data-files/_data
docker cp ./updated-data.csv pgadmin:/var/lib/pgadmin/data-files
And now I think that pgAdmin should be able to see updated-data.csv, so I try COPY, which I know works locally on my dev system where pgAdmin runs on bare metal:
COPY foo.bar(
...
)
FROM '/var/lib/pgadmin/data-files/updated-data.csv'
DELIMITER ','
CSV HEADER
ENCODING 'windows-1252';
Is there any glaring mistake here? When I do docker cp there's no feedback to stdout. No error, no mention of success or a hash or anything.
It looks like you assumed the file should be inside the pgAdmin container. However, COPY ... FROM is executed by the PostgreSQL server, so the file must be inside the Postgres container for the query to find it. I suggest you copy the file to the Postgres container:
docker cp <path_from_your_local>/file.csv <postgres_container_name>:/file.csv
Then, from the Query Tool in pgAdmin, the COPY will work without problems.
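Concretely, a minimal sketch, assuming the Postgres container is named postgres (substitute your real container name) and using /tmp as the destination:
docker cp ./updated-data.csv postgres:/tmp/updated-data.csv
Then, in the pgAdmin Query Tool:
COPY foo.bar(
...
)
FROM '/tmp/updated-data.csv'
DELIMITER ','
CSV HEADER
ENCODING 'windows-1252';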
I hope this helps others who end up here.

pg_dump through ssh stops after some seconds when used in a script

I back up three PostgreSQL servers with pg_dump, launched by a script over ssh. The command line in the script is:
sudo -u barman ssh postgres@$SERVER 'pg_dump -Fc -b $database 2> ~/dump_error.txt' | gzip > $DUMP_ROOT/$SERVER-$BACKUPDATE.gz
But the dump size is always about 1K, for all servers. When I execute this line in a shell, just replacing the variables with their values, it works perfectly. I executed it as root (sudo -u barman ssh postgres@server ...) and as user barman (ssh postgres@server ...), and the dump is correct.
When I open the dump, I see the start of the dump, but then it suddenly stops.
The dump_error.txt on the servers is empty.
There is nothing in the logs (Postgres log and syslog) on either the backup or the PostgreSQL servers.
The user barman can connect to the server as user postgres without a password.
The shell limits are high enough not to block the script (open files 1024, file size unlimited, max user processes 13098).
I tried changing the cron hour of the script, thinking another process might be consuming all the resources, but the result is always the same, and ps -e shows nothing special.
The PostgreSQL version is 9.1.
Why does this line never produce a complete dump when executed in the script, but only when executed in a shell?
Thanks for your help, Denis
Your problem is caused by bad quoting. Single quotes prevent the string from being expanded, while double quotes expand what's inside. For instance:
>MYVARIABLE=test
>echo '$MYVARIABLE'
$MYVARIABLE
>echo "$MYVARIABLE"
test
In your case, ssh postgres@$SERVER 'pg_dump -Fc -b $database 2> ~/dump_error.txt' will execute the command on the remote computer without expanding variables. This means ssh passes the expression pg_dump -Fc -b $database verbatim, and bash interprets the variable $database on the remote computer. If that variable doesn't exist there, it is treated as an empty string.
You can see the difference when you do ssh user@server 'echo $PWD' and ssh user@server "echo $PWD".
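So a sketch of the fix (assuming $database, $SERVER, $DUMP_ROOT and $BACKUPDATE are all defined on the backup host, as the script implies) is to use double quotes so that $database is expanded locally before ssh sends the command:
sudo -u barman ssh postgres@$SERVER "pg_dump -Fc -b $database 2> ~/dump_error.txt" | gzip > $DUMP_ROOT/$SERVER-$BACKUPDATE.gz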

Call PostgreSQL pg_dump.exe from Visual Basic

I'm trying to back up my database from my application written in Visual Basic (Visual Studio 2012).
I copied pg_dump.exe with the necessary DLL files to the application root.
I tested pg_dump by doing a backup from a cmd window, and it works OK.
This is the code I use to try to call the pg_dump exe, but apparently it does not receive the parameters I'm trying to send.
' New ProcessStartInfo created
Dim p As New ProcessStartInfo
Dim args As String = "-h serverip -p 5432 -U postgres db_name > " & txtPath.Text.ToString
' Specify the location of the binary
p.FileName = "pg_dump.exe"
' Use these arguments for the process
p.Arguments = args
' Maximize the process window
p.WindowStyle = ProcessWindowStyle.Maximized
' Start the process
Process.Start(p)
When the process starts I get this message:
pg_dump: too many command-line arguments (first is ">")
Try "pg_dump --help" for more information.
If I type this in cmd, the backup is done OK:
pg_dump.exe -h serverip -p 5432 -U postgres db_name > c:\db_bak.backup
But I can't make it work from Visual Basic.
First, understand what's happening: pg_dump doesn't understand >, <, etc. at all. pg_dump writes the dump to standard output, and > is an instruction to the shell (cmd.exe) to redirect that output into a file. I/O redirection only works when a shell is running the command. If you invoke pg_dump.exe directly, there is no shell, so > is passed to pg_dump as a literal argument, which is exactly the error you're getting. To get I/O redirection (>, etc.) you need to run the process via the cmd shell, not invoke it directly.
In this case it's probably better to use the -f filename option to pg_dump to tell it to write to a file instead of standard output. That way you avoid I/O redirection and don't need the shell anymore. It should be as simple as:
Dim args As String = "-h serverip -p 5432 -U postgres db_name -f " & txtPath.Text.ToString
Alternatively, you can use cmd /C to invoke pg_dump via the command shell. Visual Basic might offer a shortcut for doing that; I don't use it, so I can't comment specifically on the mechanics of process invocation in Visual Basic. Check the CreateProcess docs; VB likely uses CreateProcess under the hood.
Personally, I recommend the -f approach.
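As a minimal sketch of that recommendation (the server details are copied from the question; the doubled quotes around the path guard against spaces, and UseShellExecute/CreateNoWindow are assumptions about how you want the process to run):
' Build the pg_dump invocation with -f so no shell redirection is needed
Dim psi As New ProcessStartInfo
psi.FileName = "pg_dump.exe"
psi.Arguments = "-h serverip -p 5432 -U postgres -f """ & txtPath.Text & """ db_name"
psi.UseShellExecute = False ' launch the exe directly, no shell involved
psi.CreateNoWindow = True   ' keep the window genuinely hidden
Process.Start(psi)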

Out of memory exception when running data-only script

When I run a data-only script in SQL Server 2008 R2, it shows this error:
Cannot execute script
Additional information:
Exception of type 'System.OutOfMemoryException' was thrown. (mscorlib)
The script file is 115 MB, and it's only data.
When I open this script file, it shows:
Document contains one or more extremely long lines of text.
These lines cause the editor to respond slowly when you open the file.
Do you still want to open the file?
I ran the schema-only script first and then the data-only script.
Is there any way to fix this error?
I solved it by using the sqlcmd utility.
sqlcmd -S "Server\InstanceName" -U "userName" -P "password" -i FilePathForScriptFile
For example:
sqlcmd -S .\SQLEXPRESS -U sa -P 123 -i D:\myScript.sql
Zey's answer was helpful for me, but for completeness:
If you want to use Windows Authentication, just omit the user and password.
And don't forget the quotes around the path if it contains spaces.
sqlcmd -S .\SQLEXPRESS -i "C:\Users\Stack Overflow\Desktop\script.sql"
If you're logged into the domain with the correct privileges and there's only one instance running, you also do not have to provide the above user/pw/instance command args. I was able to just execute:
sqlcmd -i myfile.sql

MongoDB restore works on the CLI but not inside a bash script

I don't know why a command that runs OK on the command line gives an error inside a bash script.
mongorestore -h XXXX.mongohq.com:10025 -u USER -p PASS -db pr4 DIC24/pr4
This works OK on the CLI, but inside a bash script I get:
Mon Dec 24 19:48:52 ERROR: don't know what to do with file [DIC24/pr4]
I found that the script changed the working directory on another line (I was editing a bash file created by someone else), so mongorestore was running in a different directory and the relative path DIC24/pr4 no longer resolved.
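A simple guard against this, sketched on the assumption that the dump actually lives under /path/to/backups (substitute the real location): change to that directory first, or give mongorestore an absolute path, so the script's working directory no longer matters.
cd /path/to/backups
mongorestore -h XXXX.mongohq.com:10025 -u USER -p PASS -db pr4 DIC24/pr4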
Hard to say without knowing all the details of your connection settings, but watch out for special bash characters. When automating some console commands via bash scripting, I ran into issues because one of the credentials contained an exclamation point.
Another thought: are you executing the script from the directory it's stored in?