I have created a stored procedure which takes 3 arguments:
databaseName
backupType
backupLocation
If I run this stored procedure, it creates the backup.
I want to know how to write a batch file for this and how to schedule a job to run it at a specified time.
I am using SQL Server 2008.
Have a look at this article:
sqlcmd utility
You can use the sqlcmd utility in a batch file to execute your procedure.
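For example, a minimal batch file might look like this (the server name, procedure name, and parameter values are placeholders - substitute your own):

@echo off
rem Run the backup procedure via sqlcmd using Windows authentication (-E).
rem dbo.usp_BackupDatabase and its parameter values are placeholders for your procedure.
sqlcmd -S MYSERVER -E -d master -Q "EXEC dbo.usp_BackupDatabase @databaseName='MyDb', @backupType='FULL', @backupLocation='D:\Backups\'"

You can then schedule the batch file with Windows Task Scheduler, for example:

schtasks /create /tn "NightlyBackup" /tr "C:\Scripts\backup.bat" /sc daily /st 23:00

Since you are on SQL Server 2008, a SQL Server Agent job calling the procedure directly would also work if Agent is available to you.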
You can refer to this Microsoft link, which has stored procedure and batch file examples:
http://support.microsoft.com/kb/2019698
"'put' is not recognized as an internal or external command, operable program or batch file."
I am inputting the following: "put file://C:\FolderName\FileName.csv"
All I need to do is upload a csv from my C drive to the Snowflake cloud. I figured this would be easy, but I can't for the life of me figure out why I keep getting this message.
First, you need to connect to SnowSQL from the command prompt; after that, you will be able to execute the PUT/GET commands.
C:\> snowsql -a snowflake_accountname -u snowflake_username
snowsql> use database Database_name;
snowsql> use schema SCHEMA_NAME;
snowsql> use warehouse WAREHOUSE_NAME;
snowsql> ls @My_Stage_NAME;
snowsql> put file://C:\FolderName\FileName.csv @My_Stage_NAME;
Can you show the steps before this command is executed, to show how you have logged into SnowSQL? This command needs to be run inside SnowSQL, and the error message seems to suggest your system isn't seeing this.
@JNevill is right - the PUT command takes the file from local (as you have identified) and places it in a Snowflake internal stage. You can create one of these stages in Snowflake or use some of the automatically provisioned ones for your user or table(s). So if you have created the table "My_tableName", you can make use of the internal table stage with the reference "@%My_tableName", so:
snowsql> put file://C:\FolderName\FileName.csv @%My_tableName;
Once the file is loaded into the stage, you can use the COPY INTO command to load the data into the table:
https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html#loading-files-from-an-internal-stage
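For example, a minimal COPY INTO for the table stage above might look like this (the file format options are illustrative - adjust them to match your CSV):

copy into My_tableName
  from @%My_tableName
  file_format = (type = csv skip_header = 1);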
I am facing a problem in DB2. In my Oracle environment it was very easy to include multiple scripts in one master script, which were executed sequentially, e.g.:
Master.sql:
connect ....
@script1.sql
@script2.sql
Now I have to build up the same logic in DB2 LUW. Is there a simple way to include multiple scripts in one master script? I would like to have one single db2 call from the shell, which executes the master script and, within it, all subscripts.
Regards
Jan
There is nothing to stop you from creating a single file with multiple SQL batches. In the Windows world, it would look like this:
Note: first, you initialize the db2 command prompt:
db2cmd -c -w -i %1.bat
With as many of these as you want in the .bat file:
db2 -txf c:\Example\db2html.sql
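For example, a complete master file (the database name and script paths are placeholders) could be:

rem Connect once; within a db2cmd window the connection persists across db2 calls.
db2 connect to MYDB
db2 -tvf c:\Scripts\script1.sql
db2 -tvf c:\Scripts\script2.sql
db2 connect reset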
In Linux, the db2clp is included in the shell once you load the db2profile ('. /home/db2inst1/sqllib/db2profile'). In Windows, you need to call db2cmd in order to use the db2clp.
With an interactive db2clp, you cannot call db2 scripts via @scriptX; however, you can call them from the shell like
db2 -tvf script
However, if you use CLPPlus you can do almost everything you can do in SQL*Plus. For more information: https://www.ibm.com/developerworks/community/blogs/IMSupport/entry/tech_tip_db2_s_new_clp_plus_utility?lang=en
I have a script in a file (mysqlscript.sql) that is basically a bunch of inserts/updates/deletes separated by GO statements:
insert into ....
GO
update .....
GO
How do I run this script?
You can try to use $system.SQL.ImportDir().
And of course, you can read your file and execute each SQL query in your program.
The tool Caché Monitor uses GO as a statement separator and connects to InterSystems Caché. With this tool you can execute your script.
I have a bunch of SQL scripts (with shell script wrappers) to unload data like so:
EXPORT TO /tmp/out.csv OF DEL MODIFIED BY NOCHARDEL COLDEL, DATESISO
MESSAGES /tmp/out.msg SELECT WIDGETID
...
I want to add an error handler to the script the way Oracle does it:
WHENEVER SQLERROR EXIT FAILURE;
SPOOL /tmp/out.csv;
SELECT WIDGETID...
SPOOL OFF;
According to DB2's documentation, this can be done in stored procedures, C, Perl, REXX, and nothing else...
How can this be done in SQL scripts?
I am running DB2/LINUXX8664 9.7.2.
You could use the DB2 command line processor and check its return code: http://publib.boulder.ibm.com/infocenter/db2luw/v9r5/topic/com.ibm.db2.luw.admin.cmd.doc/doc/r0010411.html
Or you could use the SYSPROC.ADMIN_CMD procedure and check its return codes: http://publib.boulder.ibm.com/infocenter/db2luw/v9r5/topic/com.ibm.db2.luw.sql.rtn.doc/doc/r0023573.html
You could put the stored procedure calls in a script file and run something like db2 -tvf runexport.txt, or put the db2 commands in a Linux script file and use Linux scripting to handle the db2 return codes, as in the sketch below.
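A minimal sketch of that last approach, assuming the EXPORT statement above is saved in export.sql and MYDB is a placeholder for your database:

#!/bin/sh
# Load the DB2 environment (path is the typical default for instance db2inst1).
. /home/db2inst1/sqllib/db2profile
db2 connect to MYDB
db2 -tvf export.sql
rc=$?
# CLP return codes: 0 = success, 1 = no rows, 2 = warning, 4 = DB2/SQL error, 8 = system error.
if [ $rc -ge 4 ]; then
    echo "export failed with rc=$rc" >&2
    db2 terminate
    exit 1
fi
db2 terminate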
In Linux I can do something like this:
mysql -u user -p=pass somedb <(echo "create database foo;")
How can I do that with windows batch scripts?
Basically, I want to make a batch file that runs a sql script without having to keep the script in a separate file.
Thanks
One way is to echo the SQL commands into a file, run the mysql command with the option to read the SQL file, and then remove the file (if you really don't want to keep it).
You can do
echo create database foo;|mysql ...
just fine, but for multiple lines you really want to make a temporary file that you pass to MySQL to read, for example:
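Here the credentials and statements are placeholders:

@echo off
rem Write the SQL to a temporary file, feed it to mysql, then clean up.
(
  echo create database foo;
  echo show databases;
) > "%TEMP%\run.sql"
mysql -u user -ppass somedb < "%TEMP%\run.sql"
del "%TEMP%\run.sql"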