I've been at this for a little longer than I'd like to admit.
From a SQL Server database I'm trying to run a dynamic OPENQUERY against a Firebird database, but I'm running into all kinds of trouble with the datetime condition. I'm now trying a parametrized approach to sidestep potential issues with the datetime format, but I'm getting a token unknown error. Any advice is appreciated; full details below:
FYI, I know the query is redundant, but I'm just trying to get it to work.
T-SQL:
DECLARE @days INT
DECLARE @start DATETIME
DECLARE @end DATETIME
DECLARE @sql NVARCHAR(MAX)
SET @days = 1
SET @end = GETDATE()
SET @start = DATEADD(DAY,-@days,@end)
SET @sql = N'(SELECT OBJID FROM SALE WHERE MODIFIED >= @test1)'
DECLARE @TEST nvarchar(max)
SET @TEST = N'SELECT * FROM OPENQUERY(x,'+CHAR(39)+'SELECT * FROM SALE WHERE OBJID IN '+@sql+CHAR(39)+')'
EXEC sp_executesql @TEST, N'@test1 datetime', @start
error:
OLE DB provider "MSDASQL" for linked server "x" returned message "[ODBC Firebird Driver][Firebird]Dynamic SQL Error
SQL error code = -104
Token unknown - line 1, column 77
@".
I also tried the non-parametrized approach, but couldn't figure out what format to put the datetime in.
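For reference, this is roughly what I'd expect the non-parametrized version to look like; it assumes Firebird will accept an ISO-style 'yyyy-mm-dd hh:mm:ss' timestamp literal (which CONVERT style 120 produces), so treat it as a sketch rather than a confirmed fix:
DECLARE @days INT = 1
DECLARE @end DATETIME = GETDATE()
DECLARE @start DATETIME = DATEADD(DAY, -@days, @end)
-- The remote query must be complete text before it reaches OPENQUERY;
-- OPENQUERY cannot see sp_executesql parameters, so the datetime is embedded as a literal.
DECLARE @remote NVARCHAR(MAX) =
    N'SELECT * FROM SALE WHERE OBJID IN (SELECT OBJID FROM SALE WHERE MODIFIED >= '''
    + CONVERT(NVARCHAR(30), @start, 120) + N''')'
-- Double every quote in the remote query so it survives being embedded
-- as the string-literal argument of OPENQUERY.
DECLARE @TEST NVARCHAR(MAX) =
    N'SELECT * FROM OPENQUERY(x, ''' + REPLACE(@remote, '''', '''''') + N''')'
EXEC sp_executesql @TEST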
Related
I'm trying to identify new data in a Postgres database that does not yet exist on our local SQL Server 2017 instance, so I can download the new data into SQL Server. To do this, I'm connecting to the Postgres database through a linked server and using OPENQUERY.
I've inserted the MAX datetime2(7) from our local SQL "events" table into a temp table called #latest:
CREATE TABLE #latest (latest DATETIME2(7))
latest
2023-01-26 14:40:19.1470000
There is a "time" column within the "event" table in the linked server with a datatype of timestamp with time zone. Running this code returns successfully:
DECLARE @Query NVARCHAR(MAX)
SELECT
@Query = '
SELECT
[time],[eventID]
FROM OPENQUERY(
[PostgresServer]
,''
select
"time",
"eventID"
from "event" t
limit 1
''
)'
FROM #latest
EXECUTE sp_executesql @Query
time                          eventID
2022-11-17 11:05:17.2450000   730d544e-de4b-47a7-b8d0-80742dc4240d
However, when I try to add a where clause, I get an error:
DECLARE @Query NVARCHAR(MAX)
SELECT
@Query = '
SELECT
[time],[eventID]
FROM OPENQUERY(
[PostgresServer]
,''
select
"time",
"eventID"
from "event" t
where "time" > '[latest]'
limit 1
''
)'
FROM #latest
EXECUTE sp_executesql @Query
Msg 102, Level 15, State 1, Line 33
Incorrect syntax near 'latest'.
I added + to each side of [latest] but then I started getting operator errors:
The data types varchar and datetime2 are incompatible in the add operator.
So I've ended up with the below:
where "time" > '+CAST([latest] AS NVARCHAR(100))+'
But I get this error:
Msg 7399, Level 16, State 1, Line 22
The OLE DB provider "MSDASQL" for linked server "PostgresServer" reported an error. The provider did not give any information about the error.
Msg 7350, Level 16, State 2, Line 22
Cannot get the column information from OLE DB provider "MSDASQL" for linked server "PostgresServer".
Is there a better way I can identify new data on the linked server? Or fix the error?!
I think I've found the answer here: SQL Server : use datetime variable in openquery
I needed to use a varchar variable instead of the datetime column from my temp table :)
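In case it helps anyone else, this is roughly the shape I ended up with (a sketch, not verified line for line): the datetime2 value is converted to an nvarchar variable first, and that text is spliced into the OPENQUERY string, so the remote query is plain text by the time it runs:
DECLARE @latest NVARCHAR(30)
SELECT @latest = CONVERT(NVARCHAR(30), latest, 121) FROM #latest  -- 'yyyy-mm-dd hh:mi:ss.nnnnnnn'
DECLARE @Query NVARCHAR(MAX)
SET @Query = N'
SELECT
[time],[eventID]
FROM OPENQUERY(
[PostgresServer]
,''
select
"time",
"eventID"
from "event" t
where "time" > ''''' + @latest + N'''''
limit 1
''
)'
EXECUTE sp_executesql @Query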
I'm new to PostgreSQL; I'm used to working with SQL Server.
I'm having trouble understanding one scenario, quite simple from Sql Server perspective.
I need the value of a variable as the query result (i.e. the data set).
In SQL Server I'd have something like this:
Declare @cnt int
Delete from MyTable
set @cnt = @@ROWCOUNT
select @cnt as FinalResult
The expected result is the value of the variable (i.e. @cnt) as my result set.
I tried the same in PostgreSQL, using the following code:
DO $$
DECLARE
affected_rows integer;
BEGIN
delete from mySchema.MyTable;
GET DIAGNOSTICS affected_rows := ROW_COUNT;
raise notice 'affected records: %', affected_rows;
select affected_rows as result;
END $$;
Obviously it throws me an error.
Could you please advise on how to approach it?
Many thanks.
I assume the error you get is "query has no destination for result data" which is caused by the select affected_rows as result; line. In Postgres' PL/pgSQL (and essentially every other procedural language in relational databases - except for T-SQL), the result of a query needs to be stored somewhere.
A do block is like a temporary function that does not return anything, so the only way to give feedback from there, is to use raise notice to print the number of deleted rows. You can't have a SELECT statement returning something from a DO block.
But you don't need PL/pgSQL for that to begin with:
with deleted as (
  delete from mySchema.MyTable
  returning *
)
select count(*) as final_result
from deleted;
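If procedural code is genuinely needed (say the delete is one step of a larger routine), the usual alternative to a DO block is a function, since a function can return the count. A minimal sketch against the table from the question (the function name delete_mytable is just an example):
CREATE OR REPLACE FUNCTION mySchema.delete_mytable()
RETURNS integer
LANGUAGE plpgsql
AS $$
DECLARE
    affected_rows integer;
BEGIN
    DELETE FROM mySchema.MyTable;
    GET DIAGNOSTICS affected_rows := ROW_COUNT;
    RETURN affected_rows;  -- returned as the single-row result of the SELECT below
END;
$$;
SELECT mySchema.delete_mytable() AS final_result;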
I have a query that returns the value of a field whose name is sent through a parameter:
@Field nvarchar(50),
@ID int
...
execute('SELECT ' + @Field + ' from SampleTable where (ID=' + @ID + ');');
I'm doing this to have one stored procedure instead of several with the same structure.
Now I'm not sure whether this is safe or not.
You should use sp_executesql and quotename to be safe.
declare @SQL nvarchar(max)
set @SQL = 'select ' + quotename(@Field) + ' from SampleTable where ID = @ID'
exec sp_executesql @SQL, N'@ID int', @ID
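Wrapped up as the single stored procedure the question describes, it would look roughly like this (the procedure name GetFieldValue is just an example; SampleTable and the parameter types come from the question):
CREATE PROCEDURE dbo.GetFieldValue
    @Field nvarchar(50),
    @ID int
AS
BEGIN
    DECLARE @SQL nvarchar(max)
    -- QUOTENAME brackets the column name so it cannot break out of the identifier;
    -- @ID is passed as a real parameter, so it never becomes part of the SQL text.
    SET @SQL = N'select ' + QUOTENAME(@Field) + N' from SampleTable where ID = @ID'
    EXEC sp_executesql @SQL, N'@ID int', @ID
END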
The query is not safe.
A client using the web system could drop your entire database with a SQL injection attack,
for example by passing ' ; DROP DATABASE dbname -- instead of the id.
If you plan to use the above query, use a parametrized stored procedure to prevent SQL injection attacks.
More details below:
How to protect from SQL injection attacks in ASP.NET
I've tried running my SQL (T-SQL) that I generated into a variable, but I can't get it to run.
What I want to do is:
1. Run a big SQL script from a program
2. The big SQL generates a SELECT statement
3. Run the generated SQL like a normal SELECT and receive the data as usual.
I thought it could be done with sp_executesql, but it doesn't seem to be right for my case.
What I'm trying looks like this:
declare @sql varchar(max)
set @sql = 'select x, y from z'
exec @sql --this is the point where I'm stuck.
I know this must be quite a basic question, but I couldn't find anything that fits my problem on Google.
Thanks for your help!
Update
I solved my problem by using sp_sqlexec (which is no longer supported, but works the way I wanted).
declare @sql varchar(max)
set @sql = 'select x, y from z'
exec sp_sqlexec @sql
The correct solution is sp_executesql! (see sp_sqlexec vs. sp_executesql)
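For reference, the sp_executesql version of the same snippet would be the following (note the NVARCHAR requirement, which is part of what I'd have to rebuild):
declare @sql nvarchar(max)  -- sp_executesql requires nvarchar, not varchar
set @sql = N'select x, y from z'
exec sp_executesql @sql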
For my problem it would be quite time-consuming to rebuild everything so that I could use it.
Thanks for your help guys!
You need parentheses: exec (@sql)
Without them, SQL Server looks for a stored procedure whose name is the value in the @sql variable and complains: Could not find stored procedure 'select x, y from z'.
If you are using dynamic SQL, see The Curse and Blessings of Dynamic SQL for a good article on the topic.
You can also use sp_executesql, but note that it needs NVARCHAR (Unicode).
Also, if you are building dynamic filters, you can pass in parameters as shown below:
declare @SQL nvarchar(max)
set @SQL = N'select x, y from z where x = @someFilter'
exec sp_executesql @SQL, N'@someFilter bigint', @someFilter = 6034280
I am using SQL Server 2008 and Entity Framework.
I have the following table type and stored proc for bulk insert:
CREATE Type [dbo].[xxx] as Table (
[ErrorCode] [nvarchar](10),
[ErrorMessage] [nvarchar](300),
[FieldName] [nvarchar](50),
[FieldLable] [nvarchar](300)
)
CREATE procedure dbo.InsertAll(@Records xxx READONLY)
as
begin
insert into dbo.MyTable
select * from @Records;
end;
go
I am passing a DataTable as the parameter (type = Structured) that has multiple records.
This proc works when called using SqlCommand.ExecuteNonQuery, but does not do anything when called via contextObject.ExecuteStoreCommand; the return value (affected rows) is always 0.
What's wrong? Are such procedures not supported with EF? I am not even getting an exception :(
Update: after running a SQL trace, I just realized the difference in the SQL statements being generated.
When using contextObject.ExecuteStoreCommand
declare @p3 dbo.xxx
insert into @p3 values(N'M',N'ErrorMsg - 0',NULL,NULL)
insert into @p3 values(N'M',N'ErrorMsg - 1',NULL,NULL)
insert into @p3 values(N'M',N'ErrorMsg - 2',NULL,NULL)
exec sp_executesql N'InsertAll',N'@Records [xxx]
READONLY',@Records=@p3
When using SQLCommand.ExecuteNonQuery
declare @p1 dbo.xxx
insert into @p1 values(N'M',N'ErrorMsg - 0',NULL,NULL)
insert into @p1 values(N'M',N'ErrorMsg - 1',NULL,NULL)
insert into @p1 values(N'M',N'ErrorMsg - 2',NULL,NULL)
exec InsertAll @Records=@p1
How can I get contextObject.ExecuteStoreCommand to execute the same SQL statement as SqlCommand.ExecuteNonQuery does?
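For what it's worth, the statement shape that does forward the TVP even through an sp_executesql wrapper looks roughly like this (a sketch based on the trace above; the assumption is that the command text handed to EF would have to be 'EXEC dbo.InsertAll @Records' rather than just the procedure name):
declare @p3 dbo.xxx
insert into @p3 values(N'M', N'ErrorMsg - 0', NULL, NULL)
-- Because the batch names the parameter explicitly, sp_executesql passes the
-- table-valued parameter into the procedure instead of dropping it.
exec sp_executesql N'EXEC dbo.InsertAll @Records = @Records',
                   N'@Records dbo.xxx READONLY',
                   @Records = @p3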
I have found it necessary to provide extension methods when Entity Framework simply will not do what I want it to do. The good part is that adding an extension method to the context object usually keeps you from having to dig into the web.config or app.config for a connection string, and parameters and return values can be generic. I have personally seen quite a few elegant solutions using this strategy.