SQL Anywhere v10 Syntax error near OUTPUT

I'm attempting to output a table to an external file. I've found a few similar questions and followed the answers there, without any luck.
SELECT *
FROM transactions;
OUTPUT TO 'C:\Users\administrator\Desktop\Test.txt'
This is the statement I've been using. I've attempted different variations of formatting and file types, such as .csv, with no change.
It produces:
ErrorCode : 102
SQLState : 42W04
Message : SQL Anywhere Error -131: Syntax error near 'OUTPUT' on line 1
SQL =
OUTPUT TO 'C:\Users\administrator\Desktop\Test.txt'
Appreciate all your help

Are you running this through dbisql, or in a different application? OUTPUT TO is a dbisql command, not a SQL statement recognized by the database server. You can use the UNLOAD statement in any application to allow the server to create the file.
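For example, a server-side export of the same query could look something like this (a sketch only; UNLOAD ... TO writes the file on the machine running the database server, and the backslashes are doubled to sidestep escape-sequence handling in the string literal):
-- Sketch of the UNLOAD alternative; the path is resolved on the database
-- server machine, not on the client running the statement.
UNLOAD
SELECT * FROM transactions
TO 'C:\\Users\\administrator\\Desktop\\Test.txt'
DELIMITED BY ',';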
Disclaimer: I work for SAP in SQL Anywhere engineering.

Use SQL Workbench to read a variable from a file

UPDATE: in the workbench/J log file I am seeing this error:
ERROR Variable names may only contain characters (a-z, A-Z), numbers and underscores
I'm sure this is what is causing my process to fail, but I have no idea why because my variables are named appropriately. I've tried renaming them a few times just in case and the same thing happens.
ORIGINAL POST:
I am working on an automated process to dump the contents of a Postgres query to a text file and FTP it to someone. The process I have been using successfully is a Windows batch script that runs SQL Workbench/J to run the query, write the entire contents of the table to a text file, and FTP it.
Now I want to be able to use WBVarDef to load a variable from a text file and use it in my query. For reference, the variable is the unique id of the last record that was FTPed. This is the code I have:
WBVarDef -variable=id -contentFile=id.txt;
WBVardef today=@"select to_char(current_date,'mmddyyyy')";
WBExport -type=text
-file='c:/CLP/FTP/$[today]circ_trans.txt'
-delimiter='|'
-quoteAlways=true
-lineEnding=crlf
-encoding=utf8;
SELECT
*
FROM
transactions
WHERE
transactions.id > $[id]
ORDER BY
transactions.id;
The only thing new here is the reference to the text file that contains the id on its first line. This completely breaks the process, but as far as I can tell I am using it according to the SQL Workbench/J documentation.
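For illustration, if id.txt contained a single line with, say, 12345 (a made-up value), I would expect the $[id] reference to expand so that the query actually sent to Postgres reads:
-- Expected expansion only; 12345 stands in for whatever id.txt holds.
SELECT *
FROM transactions
WHERE transactions.id > 12345
ORDER BY transactions.id;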
Any help would be greatly appreciated.
I have figured this one out. I was running an older version of workbench that did not support this functionality. Now that I upgraded to build 119 this is working. I'm having other issues but that's a different story....

MySQL Workbench 5.2.47 CE: EDIT database.table command doesn't work

When I type the following command in the SQL editor, I get Error Code: 1064
EDIT my_database.my_table;
but the command
SELECT * from my_database.my_table;
works fine.
Thanks
The EDIT command was only a temporary workaround until we had proper parsing in place to determine whether a query result can be edited. The keyword has not been supported for a year or more.
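As a workaround sketch (assuming my_table has a primary key, which is what makes a result grid editable), a plain SELECT gives you the same editing ability directly in the grid:
-- Hypothetical example: select the table directly and edit rows in the
-- result grid instead of using the removed EDIT command.
SELECT * FROM my_database.my_table LIMIT 1000;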

How to use BCP to dump query (CDC function) retrieved data to a text file

I'm trying to use BCP to dump data from a CDC function into a .dat file. I'm using the following query (which works in SQL Server 2008 R2):
USE LEESWIJZER
DECLARE @begin_time datetime
      , @end_time datetime
      , @from_lsn binary(10)
      , @to_lsn binary(10)
SET @end_time = '2013-07-05 12:00:00.000';
SELECT @to_lsn = sys.fn_cdc_map_time_to_lsn('largest less than or equal', @end_time);
SELECT @from_lsn = sys.fn_cdc_get_min_lsn('dbo_LWR_CONTRIBUTIES')
SELECT sys.fn_cdc_map_lsn_to_time(__$start_lsn) AS ChangeDTS
     , *
FROM cdc.fn_cdc_get_net_changes_dbo_LWR_CONTRIBUTIES (@from_lsn, @to_lsn, 'all')
(edited for readability, used in BCP as single string)
my BCP string is:
BCP "Query above" queryout "C:\temp\LWRCONTRIBUTIES.dat" -w -t ";|" -r \n -T -S {server\\instance} -o "C:\temp\LWRCONTRIBUTIES.log"
As you can see, I want a resulting .dat file in Unicode, plus a log file. I'm guessing the "ChangeDTS" column added to the function output is causing my problem. The error message reads: "[Microsoft][SQL Native Client]Host-file columns may be skipped only when copying into the Server".
It may be resolvable using a format file, but since this code needs to run daily, likely more than once a day, and the tables are subject to change, I'm reluctant to constantly adjust my format files (there are hundreds of tables needing the same procedure).
Furthermore, this is run on a client's database, and they won't like me creating views in it.
Anybody got any idea how I can create a text file (.dat) with a selected number of columns from a cdc function?
Found the answer: regardless of which version of bcp is used, bcp can't handle declarations, it seems. If I edit those out, it works like a charm.
However, according to someone on a different forum, BCP should be able to handle declarations of variables. I'm happy it works for me now, but still confused about why it does now and didn't before.
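For reference, this is roughly what the query looks like with the declarations folded into the function calls (same literals as above; it still has to run in the LEESWIJZER database, and it gets collapsed to a single line for the bcp command):
-- Sketch only: the DECLARE/SET lines are replaced by inlining the function
-- calls, so the whole thing is one SELECT that bcp queryout will accept.
SELECT sys.fn_cdc_map_lsn_to_time(__$start_lsn) AS ChangeDTS
     , *
FROM cdc.fn_cdc_get_net_changes_dbo_LWR_CONTRIBUTIES (
       sys.fn_cdc_get_min_lsn('dbo_LWR_CONTRIBUTIES')
     , sys.fn_cdc_map_time_to_lsn('largest less than or equal', '2013-07-05 12:00:00.000')
     , 'all')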

Sybase: Incorrect syntax near 'go' in an 'IF EXISTS' block

This is my SQL statement:
IF EXISTS (select 1 from sysobjects where name = 'PNL_VALUE_ESTIMATE')
drop table dbo.PNL_VALUE_ESTIMATE
go
isql bails out with this error message:
Msg 102, Level 15, State 1:
Server 'DB_SERVER', Line 3:
Incorrect syntax near 'go'.
But the SQL statement looks correct to me. What's wrong?
Sybase version is 15
Try this:
IF EXISTS (select 1 from sysobjects where name = 'PNL_VALUE_ESTIMATE')
drop table dbo.PNL_VALUE_ESTIMATE
go
or this:
IF EXISTS (select 1 from sysobjects where name = 'PNL_VALUE_ESTIMATE')
BEGIN
drop table dbo.PNL_VALUE_ESTIMATE
END
go
or this:
IF EXISTS (select 1 from sysobjects where name = 'PNL_VALUE_ESTIMATE')
BEGIN
select 1
END
go
Does any of these work?
GO is not a keyword of T-SQL, but of the editor.
SSMS (among others) uses it as a divider between the batches of commands it sends to the database server. Executing it inside a stored procedure, or even a script file, won't work.
Edit: Maybe it works with Sybase, but I think it will need to be uppercase in that case.
From the documentation, the GO statement is a command of the editor you're using, not SQL itself:
GO is not a Transact-SQL statement; it is a command recognized by the
sqlcmd and osql utilities and SQL Server Management Studio Code
editor.
That said, Sybase's isql utility also recognizes the GO batch separator.
I've had the same problem, but with SQL Server Management Studio. The issue is that the editor does not support mixed newline types around certain statements, GO being one of them. In Management Studio, for example, only Windows-style newlines (CR + LF) are allowed, and if I use the Linux format (LF), it gives the exact same error as yours above.
Text editors such as Notepad++ (what I use) have an option for which type of end-of-line characters to use by default (Windows, Linux, Mac (CR)).
Try checking which newline character(s) are being used in your statements to see if that fixes the problem.
Shouldn't the object reference have
dbo..PNL_VALUE_ESTIMATE
because you haven't given a database name, and if you include the object owner you need '..' to skip the database name?
I'd go:
EXEC('DROP TABLE dbo..PNL_VALUE_ESTIMATE')
in the true part as well, because DROP TABLE is always compiled, and if the table isn't there you'll still have a failure.
Do you even need dbo? If your SQL always runs as dbo, just leave it out.
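Putting that together, a sketch of what I mean (dynamic SQL so the DROP is not compiled until the table is known to exist):
-- Sketch only: EXEC defers compilation of the DROP, so the batch parses
-- even when PNL_VALUE_ESTIMATE does not exist yet.
IF EXISTS (select 1 from sysobjects where name = 'PNL_VALUE_ESTIMATE')
BEGIN
    EXEC('drop table dbo.PNL_VALUE_ESTIMATE')
END
go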

FreeTDS runs out of memory from DBD::Sybase

When I add
client charset = UTF-8
to my freetds.conf file, my DBD::Sybase program emits:
Out of memory!
and terminates. This happens when I call execute() on an SQL query statement that returns any ntext fields. I can return numeric data, datetimes, and nvarchars just fine, but whenever one of the output fields is ntext, I get this error.
All these queries work perfectly fine without the UTF-8 setting, but I do need to handle some characters that throw warnings under the default character set. (See related question.)
The error message is not formatted the same way other DBD::Sybase error messages seem to be formatted. I do get a message that a rollback() is being issued, though. (My false AutoCommit flag is being honored.) I think I read somewhere that FreeTDS uses the iconv program to convert between character sets; is it possible that this message is being emitted from iconv?
If I execute the same query with the same freetds.conf settings in tsql (FreeTDS's command-line SQL shell), I don't get the error.
I'm connecting to SQL Server.
What do I need to do to get these queries to return successfully?
I saw this in the .conf file; see if it helps:
# Command and connection timeouts
; timeout = 10
; connect timeout = 10
# If you get out of memory errors, it may mean that your client
# is trying to allocate a huge buffer for a TEXT field.
# (Microsoft servers sometimes pretend TEXT columns are
# 4 GB wide!) If you have this problem, try setting
# 'text size' to a more reasonable limit
text size = 64512
These links seem relevant as well and show how the setting can be changed without modifying the freetds.conf file:
http://lists.ibiblio.org/pipermail/freetds/2002q1/006611.html
http://www.freetds.org/faq.html#textdata
The FAQ is particularly unhelpful, not listing the actual error message.
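A session-level alternative (a sketch on my side, assuming the server accepts the standard T-SQL command) is to set the limit yourself right after connecting, for example with $dbh->do in DBD::Sybase:
-- Does the same job as the freetds.conf 'text size' entry, but only for
-- the current connection.
SET TEXTSIZE 64512;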