I have two scripts, master.sql and category.sql. I set the value of a substitution variable in master.sql and use it in category.sql to export data to a CSV file; the file name is stored in the substitution variable. Here is my code:
master.sql
set feedback off;
set serveroutput on;
SET SQLFORMAT csv;
var CatCode char(5) ;
COL fileName NEW_VALUE csvFile noprint
exec :CatCode := '18';
select ('c:\temp\CatCodes_' || trim(:CatCode) || '.csv') fileName from dual;
@category.sql;
exec :CatCode := '19';
select ('c:\temp\CatCodes_' || trim(:CatCode) || '.csv') fileName from dual;
category.sql
set termout off
spool &csvFile
SELECT Catgry_ID, Category_Name FROM Categories WHERE Catgry_ID = :CatCode;
spool off;
set termout off
When I run master.sql (F5) on one machine it works fine and creates two different CSV files in the c:\temp folder, but when I run the same script on a different machine it prompts for csvFile! I am sure it must be some settings issue, but I can't find it. I checked the DEFINE setting: on the machine where it does not prompt, show define reports define "&" (hex 26), and on the other one it shows define "&". Is there anything else I need to set to avoid the prompt?
Write this in your script
set define off
If you are using substitution variables, you can set define to some other character (ensure that character is not used elsewhere):
set define <<Character>>
Example:
set define #
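For instance, here is a minimal sketch (the file name and query are made up) of what a script looks like after switching the define character:
set define #
column fileName new_value csvFile noprint
select 'c:\temp\out.csv' fileName from dual;
spool #csvFile
-- the & below no longer triggers a prompt, while #csvFile above still substitutes
select 'Tom & Jerry' from dual;
spool off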
I ended up removing SQL Developer and reinstalling it and it worked :)
I have an sql script which copies data from a file:
--myscript.sql
\set file :dir 'data.csv'
copy data from :'file' csv;
and execute it with psql providing the dir variable
psql -v dir="D:\data\" -f myscript.sql
now I would like the copy command executed only if some other variable is defined, e.g. -v doit=
Are there any script control structures available for this? Looking for something like
$if :{?doit}
copy data from :'file' csv;
$endif;
I have tried to wrap it in an anonymous block
do '
begin
if ':{?doit}' then
copy data from :'file' csv;
end if;
end';
But it gives the error
Error: syntax error (approximate position "TRUE")
LINE 3: if 'TRUE' then
^
Answering my own question.
There were two issues there
psql variables cannot be directly substituted in do statements
This can be solved by putting a psql variable into a server setting, as suggested here (see the sketch below)
the copy command does not accept any functions for the file path name, so the first solution won't work here.
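For reference, here is a minimal sketch of that server-setting workaround (the setting name myvars.doit is made up); it handles the if test, but per the second point it still cannot supply the copy file path:
select set_config('myvars.doit', (:{?doit})::text, false);
do $$
begin
  if current_setting('myvars.doit')::boolean then
    raise notice 'doit is defined';
  end if;
end $$;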
I ended up formatting the do block like this:
select format('
begin
if ''%s''=''t'' then
copy rates from ''%s'' csv header delimiter E''\t'';
end if;
end;',:{?doit},:'file') as do \gset
do :'do';
The format function lets us use a multiline string. The resulting string is then assigned to a new variable with the help of \gset and fed to do.
Note: The format function renders the boolean TRUE as 't', so we have to compare against that.
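For completeness, the finished script would be invoked like this (same paths as above):
psql -v dir="D:\data\" -v doit= -f myscript.sql
(:{?doit} expands to TRUE, so the copy runs)
psql -v dir="D:\data\" -f myscript.sql
(:{?doit} expands to FALSE, so the copy is skipped)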
Thanks to @Mark for the directions.
I am using PSQL. My command line is:
$\copy (select current_timestamp) to '/home/myname/outputfile.txt'
I would like to know: how do I replace "(select current_timestamp)" with a file name that houses that same select statement?
ex:
$\copy (My_SQL_FILE.sql) to '/home/myname/outputfile.txt'
I've tried googling but I can't seem to find the answer.
$\copy (my_Sql_file.sql) to '/home/myname/outputfile.txt'
does not work
You want to run a query stored in a file in a \copy statement, i.e. execute that query and store the output in a file? That is doable.
I've come across this use case myself and wrote psql2csv. It takes the same arguments as psql, additionally accepts a query as a parameter (or through STDIN), and prints the output to STDOUT.
Thus, you could use it as follows:
$ psql2csv [CONNECTION_OPTIONS] < My_SQL_FILE.sql > /home/myname/outputfile.txt
What psql2csv will basically do is transform your query to
COPY ($YOUR_QUERY) TO STDOUT WITH (FORMAT csv, ENCODING 'UTF8', HEADER true)
and execute that through psql. Note that you can customize a bunch of things, such as headers, separator, etc.
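If you would rather stick with plain psql, a rough bash equivalent is the sketch below; it assumes the file contains a single query with no trailing semicolon, and the database name mydb is made up:
psql -d mydb -c "COPY ($(cat My_SQL_FILE.sql)) TO STDOUT WITH (FORMAT csv, HEADER true)" > /home/myname/outputfile.txt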
I need to remove redundant GO statements from a large SQL file before it gets passed through Invoke-sqlcmd for deployment.
Multiple GO statements together cause "There are no batches in the input script", and using -OutputSqlErrors $false masks all other errors.
Get-Unique deletes all duplicate data, which is not desirable; I would only like to delete the duplicate GO statements.
Current Script:
Exec (@SQLScript)
Print @SQLScript
End
GO
GO
if obj is not null
drop procedure
go
CREATE PROC
@al varchar(16),
@rule varchar(128)
END CATCH
GO
GO
If Exists (Select * From Table)
Go
Set @Start = DateAdd(m, 1, @Start)
End
GO
GO
I would like to get a script like this:
Exec (@SQLScript)
Print @SQLScript
End
GO
if obj is not null
drop procedure
go
CREATE PROC
@al varchar(16),
@rule varchar(128)
END CATCH
GO
If Exists (Select * From Table)
Go
Set @Start = DateAdd(m, 1, @Start)
End
GO
If you load the script into a variable, you can use a regular expression to match and replace the repeated "GO" statements. For example:
$ReplacedText = $OriginalScript -replace '(GO(\n)*){2,}',"GO`n"
The regular expression matches "GO", optionally followed by newlines, occurring two or more times, and replaces the whole run with a single "GO" followed by a newline.
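Put together, a minimal sketch (the file names and server instance are hypothetical); -Raw makes Get-Content return the script as one string instead of an array of lines, and -replace is case-insensitive by default, so lowercase go runs are caught too:
# load the whole script as a single string
$OriginalScript = Get-Content -Path 'deploy.sql' -Raw
# collapse consecutive GO lines (allowing Windows \r\n line endings) into one
$ReplacedText = $OriginalScript -replace '(GO(\r?\n)*){2,}', "GO`n"
# write the cleaned script and deploy it
Set-Content -Path 'deploy_clean.sql' -Value $ReplacedText
Invoke-Sqlcmd -ServerInstance 'MyServer' -InputFile 'deploy_clean.sql'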
I was trying to datestamp the output filename, but keep getting errors along the lines of:
Select * from Orders
output to 'c:'+ select (CONVERT(varchar(10), GETDATE(), 120)) + 'orders.csv'
Any help appreciated...
I had a similar problem on Sybase 9: I had to write each procedure / function body into a separate file named after that procedure. So I had to build the file name dynamically from the name of each procedure (or function). Below is my solution (works in isql):
begin
declare folder varchar (40);
declare fileName varchar(60);
-- folder parameter is our destination path - the 4 backslashes are important here
set folder = 'c:\\\\test\\\\sql\\\\';
-- here we are iterating over all procedures / functions
for p as curs dynamic scroll cursor for
Select distinct sysobjects.name
from sysobjects inner join syscomments
on sysobjects.id = syscomments.id where sysobjects.type = 'P'
do
-- each procedure must be inside separate file named like that procedure
set fileName = folder + name + '.sql';
-- finally, there are answer to original question:
-- we are exporting data into file, whose name is defined dynamically
unload select proc_defn from SYS.SYSPROCEDURE where proc_name = name
to fileName escapes off;
end for;
end
output to is a dbisql command, so it's interpreted on the client. This means you can't use expressions for the filename, since those are evaluated on the server. However, you can use the unload select statement (which does run on the server) with the into client file clause to do what you want.
See docs on the unload select statement here.
Disclaimer: I work for Sybase in SQL Anywhere engineering.
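Following that advice, a rough, untested sketch for the original date-stamped file name; it assumes the into client file clause accepts a variable (as the to clause in the loop above does) and reuses the four-backslash convention from that example:
begin
    declare fileName varchar(80);
    set fileName = 'c:\\\\temp\\\\orders_' + convert(varchar(10), getdate(), 120) + '.csv';
    unload select * from Orders into client file fileName;
end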
I have a stored procedure called "sp_BulkInsert" that inserts one .csv file into my database; you specify the full path of the file when you execute it. I am trying to create another stored procedure called "sp_ResultsDump" where you specify the folder path. It searches the folder for all .csv files, creates a table of the file names, and then loops through the rows of that table, executing "sp_BulkInsert" for each .csv file in the folder (the names of which are recorded in the previous table).
Here is the code:
--Step 0: Create Stored Procedure
CREATE PROCEDURE sp_ResultsDump
@PathFolder VARCHAR(2000)
AS
--Step 1: Create table of file names
IF EXISTS (SELECT 1
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE='BASE TABLE'
AND TABLE_NAME='Files')
DROP TABLE Files
CREATE TABLE Files(FileID INT IDENTITY NOT NULL, FileName VARCHAR(max))
DECLARE @PathExec VARCHAR(1000)
SET @PathExec = "dir '"+@PathFolder+"'.csv /B"
INSERT INTO Files(FileName) EXEC master..xp_cmdshell @PathExec
DELETE Files WHERE FileName IS NULL
--Step 2: Get # of files, declare and initialize iterator
DECLARE @RowCount INT, @I INT
SET @RowCount = (SELECT COUNT(FileName) FROM Files)
SET @I = 1
--Step 3: Loop through the rows of the table and execute sp_BulkInsert for each file
WHILE (@I <= @RowCount)
BEGIN
DECLARE @FileName VARCHAR(1000)
SELECT @FileName = FileName FROM Files WHERE FileID = @I
SELECT @FileName = @PathFolder+@FileName
EXEC sp_BulkInsert @FileName
SET @I = @I + 1
END
I have confirmed that Steps 1-3 work when I specify the folder directly (without creating a stored procedure or a dynamic @variable); however, storing the @PathFolder seems to be the problem. For example, I want to grab all .csv files from C:\, and each pass through the loop should read one file name from table Files, column FileName, into @FileName.
What I want to do is to be able to execute the following code so that I can get all .csv files in a specified folder and successfully bulk insert them into my database:
EXEC sp_ResultsDump 'c:\'
The reason for this is because the folder path may change later, and I want the user to be able to specify it.
I believe that "SELECT @FileName = @PathFolder+@FileName" is incorrect, and I tried all sorts of combinations of quotation marks and +'s. Steps 1 and 3 both seem to have problems with @PathFolder.
I guess I just need help with my while loop, because I think if my while loop is correct, this should be good.
Any suggestions? Simple syntax error somewhere? Thanks in advance.
I think your problem is with the following SET command
SET @PathExec = "dir '"+@PathFolder+"'.csv /B"
It appears to be mixing the double-quotes and single-quotes. Try changing it to this
SET @PathExec = 'dir "' + @PathFolder + '.csv" /B'
SET @PathExec = 'dir "' + @PathFolder + '.csv" /B'
It appears to be missing *. Try changing it to this
SET @PathExec = 'dir "' + @PathFolder + '*.csv" /B'
It's working fine for me. Otherwise use this:
EXEC master..xp_cmdshell 'DIR C:\inbox\*.csv /b'
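If the folder might be passed without a trailing backslash, a small defensive sketch (this guard is my own addition, not part of the answers above):
IF RIGHT(@PathFolder, 1) <> '\'
    SET @PathFolder = @PathFolder + '\'
SET @PathExec = 'dir "' + @PathFolder + '*.csv" /B'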