How can I use autocomplete in proc SQL queries?
For example, when I use proc print I can use autocomplete for libnames, tables and fields.
How can I do the same in proc SQL?
Thanks
In Enterprise Guide 5.1, you can autocomplete table names only in PROC SQL, as far as I can tell. So:
proc sql;
select * from sashelp.class
where age > 13;
quit;
When you start typing sashelp it will not autocomplete for you, but when you start typing .class it will (assuming your SAS session is connected, and you wait a bit for it to search the metadata; in my experience I almost always have to type the ., then backspace over it, and type it again for it to work properly). You also can't autocomplete age.
In EG 7.1 (but not 6.1), you can also autocomplete the libname (sashelp). The where clause, however, still does not seem to be autocompleted. PROC SQL syntax is a bit more complex for the autocomplete parser to handle, so it's probably left out for that reason.
As Raphael notes in the comments, Ctrl+L is the hotkey for Autocomplete Library; this works anywhere in an editor window (including in open code), so it is a workaround for EG 5.1/6.1 not offering this by default in PROC SQL.
However, the key combination for Data Set Variable is Ctrl+Shift+V, and this does not work in PROC SQL (but does work in the Data Step, even in code that doesn't normally autocomplete this way). My guess is that it is harder to parse which dataset a variable comes from in SQL, and SAS hasn't worked out this functionality yet.
I was creating a function following an example from a database class, which included creating a temporary variable (base_salary) and using SELECT INTO to calculate its value later.
However, I did not realize I had used a different order for the syntax (SELECT ... FROM ... INTO base_salary), and the function could be used later without any visible issues (values worked as expected).
Is there any difference in using the "SELECT ... FROM ... INTO" syntax order? I tried looking for it in the PostgreSQL documentation but found nothing about it. A Google search did not provide any meaningful information either. The only thing I found related to it was in the MySQL documentation, which only mentioned supporting the different order in an older version.
There is no difference. From the docs of pl/pgsql:
The INTO clause can appear almost anywhere in the SQL command. Customarily it is written either just before or just after the list of select_expressions in a SELECT command, or at the end of the command for other command types. It is recommended that you follow this convention in case the PL/pgSQL parser becomes stricter in future versions.
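For instance, a minimal sketch (assuming a hypothetical employees table with emp_id and salary columns) showing that the PL/pgSQL parser accepts both placements:
DO $$
DECLARE
    base_salary numeric;
BEGIN
    -- customary placement, just after the select list:
    SELECT salary INTO base_salary FROM employees WHERE emp_id = 1;
    -- the placement from the question; also accepted:
    SELECT salary FROM employees INTO base_salary WHERE emp_id = 1;
END $$;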
Notice that in (non-procedural) SQL there is also a SELECT INTO command, which works like CREATE TABLE AS; in that version the INTO must come right after the SELECT clause.
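A quick sketch of that non-procedural form (names are hypothetical); here INTO creates a new table rather than assigning to a variable:
SELECT emp_id, salary
INTO high_earners
FROM employees
WHERE salary > 50000;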
I always use SELECT ... INTO ... FROM; I believe that is the standard supported notation:
https://www.w3schools.com/sql/sql_select_into.asp
I would recommend using this form, in case of future updates or in case the other order becomes unsupported, as you mentioned.
Using Oracle SQL Developer, I am trying to get this web example (link) working:
CREATE OR REPLACE FUNCTION concat_self(str VARCHAR2, cnt PLS_INTEGER)
  RETURN VARCHAR2 SQL_MACRO(SCALAR)
IS
BEGIN
  RETURN 'rpad(str, cnt * length(str), str)';
END;
/
But I get these errors, which I do not understand:
Function CONCAT_SELF compiled
LINE/COL ERROR
--------- -------------------------------------------------------------
2/37 PLS-00103: Encountered the symbol "SQL_MACRO" when expecting one of the following: . # % ; is authid as cluster order using external character deterministic parallel_enable pipelined aggregate result_cache accessible
3/4 PLS-00103: Encountered the symbol "BEGIN" when expecting one of the following: not null of nan infinite dangling a empty
5/0 PLS-00103: Encountered the symbol "end-of-file" when expecting one of the following: end not pragma final instantiable order overriding static member constructor map
Errors: check compiler log
Your code is perfectly 'valid' for any instance of Oracle where the SQL_MACRO keyword is recognized by the PL/SQL engine.
The errors start to make a bit more sense once you realize that the database doesn't understand what you're asking for: it does not recognize SQL_MACRO as a valid part of the CREATE OR REPLACE FUNCTION syntax.
Those errors let you see how the database's PL/SQL and SQL parsers take your request and break it down into things they know how to work with.
Everything after the first error is just the parser failing to get past the first problem it encountered.
This feature was introduced in version 21c of the database, as explained in the 21c New Features Guide.
You can create SQL Macros (SQM) to factor out common SQL expressions and statements into reusable, parameterized constructs that can be used in other SQL statements. SQL macros can either be scalar expressions, typically used in SELECT lists and WHERE, GROUP BY and HAVING clauses, to encapsulate calculations and business logic, or can be table expressions, typically used in a FROM clause.
SQL macros increase developer productivity, simplify collaborative development, and improve code quality.
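For illustration, on a 21c or later database where the function above compiles, a call might look like this (a sketch; the expansion is shown as a comment):
SELECT concat_self('abc', 3) AS repeated FROM dual;
-- the macro expands to rpad('abc', 3 * length('abc'), 'abc'), returning 'abcabcabc'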
I have a DB2 data source and an Oracle 12c target.
The Oracle side has a DB link to the DB2 defined, which is working in general.
Now I have a huge table in the DB2 which has a timestamp column (let's call it ROW_CHANGED) for row changes. I want to retrieve rows which have changed after a particular time.
Running
SELECT * FROM lib.tbl WHERE ROW_CHANGED >'2016-08-01 10:00:00'
on the DB2 side returns exactly 1 row after about 90 seconds, which is fine.
Now I try the same query from Oracle via the db link:
SELECT * FROM lib.tbl#dblink_name WHERE ROW_CHANGED >TO_TIMESTAMP('2016-08-01 10:00:00')
This runs for hours and ends up in a timeout.
I read some Oracle docs and found distributed query optimization tips, but most of them refer to joining a local table to a remote table, which is not my case.
In my desperation, I tried the DRIVING_SITE hint, without effect.
Now I wonder when the WHERE part of the query is evaluated. Since I have to use Oracle syntax rather than DB2 syntax for the query, is it possible that Oracle first copies the full table and applies the where clause afterwards? I did some research but did not find anything that would help me in this direction.
ROW_CHANGED is a hidden column in the DB2 table, if that matters.
Thanks for any hint in advance.
Update
Thanks @all for the help. I'll share what did the trick for me.
First of all, I used TO_TIMESTAMP since the DB2 column is also a timestamp (not a date), and I expected to circumvent implicit conversions this way.
Without the explicit conversion, I ran into ORA-28534: Heterogeneous Services preprocessing error, and I have no hope of touching the DB config within a reasonable time.
The explain plan, by the way, did not reveal much. It showed a FULL hint and no conversion on the predicates. Oddly, it showed the ROW_CHANGED column as Date; I wonder why.
I tried Justin's suggestion to use a bind variable; however, I got ORA-28534 again. The next thing I did was wrap it in a PL/SQL block (it will run in a stored procedure later anyway).
declare
v_tmstmp TIMESTAMP := '01.08.16 10:00:00';
begin
INSERT INTO ORAUSER.TMP_TBL (SRC_PK,ROW_CHANGED)
SELECT SRC_PK,ROW_CHANGED
FROM lib.tbl#dblink_name
WHERE ROW_CHANGED > v_tmstmp;
end;
This executed in the same time as in DB2 itself. The date format is DD.MM.YY here since, unfortunately, that is the default.
When changing the variable assignment to
v_tmstmp TIMESTAMP := TO_TIMESTAMP('01.08.16 10:00:00','DD.MM.YY HH24:MI:SS');
I got the same problem as before.
Meanwhile, the DB2 operators created an index on the ROW_CHANGED column, which I had requested earlier that day. This seems to have solved the problem in general; even my original query finishes in no time now.
If you are actually using an Oracle-specific conversion function like to_timestamp, that forces the predicate to be evaluated on the Oracle side. Oracle isn't going to know how to convert a built-in function like to_timestamp into an exactly equivalent function call in DB2.
If you used a bind variable, that would be more likely to get evaluated on the DB2 side. But that may be complicated by the data type mapping between different databases: there may not be a perfect mapping between one engine's date and another engine's timestamp data type. If this were a numeric column, a bind variable would be almost certain to get pushed. In this case, it probably involves playing around a bit to figure out exactly what data type to use for your variable that works for your framework, Oracle, and DB2.
If using a bind variable doesn't work, you can force the predicate to be evaluated on the remote server using the dbms_hs_passthrough package. That lets you send a query verbatim to the remote server which allows you to do things like use functions defined in your DB2 database. That's a bit of overkill in this situation, hopefully, but it's nice to have the hammer as your backup if the simpler solution doesn't work quickly enough.
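For reference, a minimal sketch of the dbms_hs_passthrough approach (table, column, and link names are taken from the question; the DB2-side literal syntax is an assumption):
DECLARE
  c     PLS_INTEGER;
  nrows PLS_INTEGER;
  v_pk  VARCHAR2(100);
BEGIN
  c := dbms_hs_passthrough.open_cursor@dblink_name;
  -- the statement is sent verbatim to DB2, so the predicate is evaluated there
  dbms_hs_passthrough.parse@dblink_name(c,
    'SELECT SRC_PK FROM lib.tbl WHERE ROW_CHANGED > ''2016-08-01 10:00:00''');
  LOOP
    nrows := dbms_hs_passthrough.fetch_row@dblink_name(c);
    EXIT WHEN nrows = 0;
    dbms_hs_passthrough.get_value@dblink_name(c, 1, v_pk);
    NULL; -- process v_pk here
  END LOOP;
  dbms_hs_passthrough.close_cursor@dblink_name(c);
END;
/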
I have an SSIS package where I am using an OLE DB source linked to a SQL Server 2005 table. All columns except a date column are NVARCHAR(255). I am using an Excel destination and a SQL statement to create the sheet in the Excel workbook; the SQL is in the Excel connection manager (effectively a CREATE TABLE statement that creates a sheet) and is derived from the mapping of the columns from the DB.
No matter what I have done, I keep getting this Unicode to non-Unicode conversion error between my source and destination. I tried a conversion to string [DT_STR] between source and destination, removed it, changed the SQL table's VARCHAR to NVARCHAR, and still get this flippin' error.
Because I am creating the sheet in Excel with a SQL statement, I do not see any way to pre-define what the data types of the columns in the Excel sheet will be. I imagine it gets default metadata, but I do not know.
So between my SQL Server source table and the Excel sheet created by this SSIS SQL statement, how can I stop this error from coming up?
My error is:
Error at Data Flow Task [OLE DB Source [1]]: Column "MyColumn" cannot convert between unicode and non-unicode string data types.
And for all nvarchar columns.
Appreciate any help
Thanks
Andrew
The below steps worked for me:
Right-click on the source task.
Click on "Show Advanced Editor".
Go to the "Input and Output Properties" tab.
Select the output column for which you are getting the error.
Its data type will be "string [DT_STR]".
Change that data type to "Unicode string [DT_WSTR]".
Save and close.
Add Data Conversion transformations to convert string columns from non-Unicode (DT_STR) to Unicode (DT_WSTR) strings.
You need to do this for all the string columns...
The missing piece here is the Data Conversion object. It should sit between the OLE DB Source and the Destination object.
First, add a Data Conversion block into your data flow diagram.
Open the Data Conversion block and tick the column for which the error is showing. Below it, change the data type to Unicode string [DT_WSTR] (or whatever data type is expected) and save.
Go to the destination block. Go to Mapping in it, map the newly created element to its corresponding destination column, and save.
Right-click your project in the Solution Explorer and select Properties. Select Configuration Properties, then Debugging, and set the Run64BitRuntime option to false (as Excel does not handle 64-bit applications very well).
Instead of adding the earlier suggested Data Conversion, you can cast the nvarchar column to a varchar column. This spares you an unnecessary step and has higher performance than the alternative.
In the SELECT of your SQL statement, replace the column (say, date) with CAST(date AS varchar([size])). For some reason this does not yet change the output data type. To do that, do the following:
Right click your OLE DB Source step and open the advanced editor.
Go to Input and Output Properties
Select Output Columns
Select your column
Under Data Type Properties change DataType to string [DT_STR]
Change Length to the length you specified in your CAST statement
After doing this your source data will be output as a varchar and your error will disappear.
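As a sketch (column and table names are hypothetical), the amended source query would look like:
SELECT CAST(MyColumn AS varchar(255)) AS MyColumn
FROM dbo.MyTable;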
I was having the same issue and tried everything written here, but it was still giving me the same error.
It turned out to be a NULL value in the column I was trying to convert.
Removing the NULL value solved my issue.
Cheers,
Ahmed
No one seems to mention this, but converting varchar to nvarchar in the source query also solves the issue.
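For example (hypothetical names), the reverse of the CAST shown earlier, so the source column arrives as DT_WSTR:
SELECT CAST(MyColumn AS nvarchar(255)) AS MyColumn
FROM dbo.MyTable;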
In the above example I kept losing the values; I think that delaying the validation will allow the new data types to be saved as part of the metadata.
On the connection manager for 'Excel Connection Manager', set Delay Validation to False from the Properties.
Then on the data flow destination task for Excel, set ValidateExternalMetadata to False, again from the Properties.
This will now allow you to right-click on the Excel destination task and go to Advanced Editor for Excel Destination --> far right tab, Input and Output Properties. In the External Columns folder section you will now be able to change the Data Type and Length values of the problematic columns, and this can be saved.
Good Luck!
I experienced this condition when I had the Oracle version 12 32-bit client installed, connected to an Oracle 12 server running on Windows.
Although both the Oracle source and the SQL Server destination are NOT Unicode, I kept getting this message, as if the Oracle columns were Unicode.
I solved the problem by inserting a Data Conversion box, selecting type DT_STR (non-Unicode) for varchar2 fields and DT_WSTR (Unicode) for numeric fields, and then dropping the 'Copy of' prefix from the output field names.
Note that I kept getting the error because I had connected the source box's arrow to the conversion box BEFORE setting the conversion types. So I had to reconnect the source box, and this cleared all the errors in the destination box.
When creating table in SQL Server make your table columns NVARCHAR instead of VARCHAR.
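A minimal sketch with hypothetical names:
CREATE TABLE dbo.MyTable (
    MyColumn NVARCHAR(255) NULL -- NVARCHAR rather than VARCHAR avoids the conversion
);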
I think people are missing this. In my case I had 100 character columns to convert between Oracle and MS SQL. All this stuff about Data Conversion and the Advanced Editor is incredibly tedious if you have 100 separate character columns to assign. Plus, SSIS being SSIS, it will sometimes reset all your 100 Advanced Editor changes even if you set ValidateExternalMetadata to false, which is incredibly obnoxious. I wouldn't mind doing the Data Conversion if there were some value to it, but 20 years ago ETL tools used to take Oracle character columns to MS SQL character columns without fussing.
What Bakalolo and Zafer say is the answer if you have a lot of character columns and you can live with nvarchar: just declare all your output MS SQL columns as nvarchar, and your data task will automatically assign your Oracle fields to the MS SQL fields with no manual overrides. I have also found that the new Oracle Source (2021) doesn't complain about a Unicode conversion to varchar in MS SQL. A colleague just told me that the SSIS wizard (it may be only in VS 2019+) for assigning Oracle character columns to MS SQL varchar will do the assignments automatically with no overrides, but I haven't tried that personally.
2022 update - I think this applies just to packages created in VS 2019 and later. An ADO.NET task reading a varchar MS SQL table and writing to an OLE DB (and, I think, ADO.NET) MS SQL varchar destination will throw the Unicode error. If you switch the input task to an OLE DB task reading the MS SQL varchar table, you won't have to do the Advanced Editor overrides for the varchar fields. If you don't want to do Advanced Editor overrides (who does?), try different tasks, and prefer OLE DB tasks.
I just encountered the same issue and solved it in my SQL query by using CONVERT directly:
CONVERT(NVARCHAR(50),'') AS MyVarName
I needed to put an empty (or fixed-size) string into the Excel file. The conversion forces the type of MyVarName from DT_STR to DT_WSTR (Unicode).
I know this is a very old post, but I ran into the same issue and found that I had to manually select the conversion component's output alias as the mapping in the Excel destination component. Since the names from the OLE DB Source match the Excel column names, it was mapping to the OLE DB columns and not to the output alias; for instance, the SourceID column from the OLE DB component is named Copy of SourceID after conversion. I don't see the original question saying they specifically selected the new alias name, just that they mapped to DB columns. @Serge Voloshenko's post comes the closest, but it also does not mention making sure the mapping happens. To a new SSIS user this might be overlooked.
I have a complicated dynamic query in TSQL that I want to export to Excel.
[The result table contains fields with text longer than 255 chars, if it matters]
I know I can export the result using the Management Studio menus, but I want to do it automatically, by code. Do you know how?
Thanks in advance.
You could have a look at sp_send_dbmail. This allows you to send an email from your query after it has run, containing an attached CSV of the result set. Obviously the viability of this method depends on how big your result set is.
Example from the linked document:
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'AdventureWorks2008R2 Administrator',
@recipients = 'danw@Adventure-Works.com',
@query = 'SELECT COUNT(*) FROM AdventureWorks2008R2.Production.WorkOrder
WHERE DueDate > ''2006-04-30''
AND DATEDIFF(dd, ''2006-04-30'', DueDate) < 2',
@subject = 'Work Order Count',
@attach_query_result_as_file = 1;
One way is to use bcp, which you can call from the command line - check out the examples in that reference, and in particular see the info on the -t argument, which you can use to set the field terminator (for CSV). There's a linked reference on Specifying Field and Row Terminators.
Or, directly in T-SQL, you could use OPENROWSET, as explained here by Pinal Dave.
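A sketch of that OPENROWSET approach (the file path, sheet, and column names are placeholders; Ad Hoc Distributed Queries must be enabled, and the Excel file with a header row must already exist):
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\export\results.xls;',
    'SELECT Col1, Col2 FROM [Sheet1$]')
SELECT col1, col2
FROM MyDb.dbo.MyTable;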
Update:
Re: 2008 64-bit & OPENROWSET - I wasn't aware of that; a quick dig throws up this on the MSDN forums, with a link given. Any help?
Aside from that, other options include writing an SSIS package, or using SQL CLR to write an export procedure in .NET to call directly from SQL. Or you could call bcp from T-SQL via xp_cmdshell - you have to enable it first, though, which will widen the possible "attack surface" of SQL Server. I suggest checking out this discussion.
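A sketch of the bcp-via-xp_cmdshell route (server name, path, and query are placeholders; -c writes character data, -t, sets a comma field terminator, and -T uses a trusted connection):
EXEC xp_cmdshell 'bcp "SELECT col1, col2 FROM MyDb.dbo.MyTable" queryout "C:\export\results.csv" -c -t, -S MyServer -T';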
Some approaches here: SQL Server Excel Workbench
I needed to accept a dynamic query and save the results to disk so I could download them through the web application.
INSERT INTO a data source didn't work out for me because of the continued effort of getting it to work.
Eventually I went with sending the query to PowerShell from SSMS.
Read my post here:
How do I create a document on the server by running an existing stored procedure or the SQL statement of that procedure on an R2008 SQL server
Single quotes, however, were a problem; at first I didn't trim my query and write it on one line, so it had line breaks in SQL Studio, which actually matters.