SQL query with XML parameter - tsql

EDIT: I have already found a relevant answer on Stack Overflow here:
XQuery [value()]: 'value()' requires a singleton (or empty sequence), found operand of type 'xdt:untypedAtomic *'
I have not dealt with XML in T-SQL before. I am modifying an existing legacy stored proc and picking most of it up through trial and error.
However, I have hit a problem where trial and error is proving fruitless and very slow, so I think it's time to appeal to the Stack Overflow gurus!
Here is some XML
<?xml version="1.0"?>
<Notification xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<NotificationId>0</NotificationId>
<UserNotifications>
<UserNotification>
<UserNotificationId>0</UserNotificationId>
<NotificationId>0</NotificationId>
<UserId>13514</UserId>
<MessageTypeId>1</MessageTypeId>
</UserNotification>
<UserNotification>
<UserNotificationId>0</UserNotificationId>
<NotificationId>0</NotificationId>
<UserId>13514</UserId>
<MessageTypeId>2</MessageTypeId>
</UserNotification>
</UserNotifications>
</Notification>
The Stored Proc in question accepts the above XML as a parameter:
CREATE PROCEDURE [dbo].[Notification_Insert]
@ParametersXml XML
AS
BEGIN
The XML contains child "UserNotification" elements. I would like to select the UserId, MessageTypeId of each UserNotification, into a table like this
UserId | MessageTypeId
13514 | 1
13514 | 2
Obviously the size of the collection is not fixed.
My current attempt (which doesn't work) is along these lines:
DECLARE @UserDetails TABLE ( UserId INT, MessageTypeId INT);
INSERT INTO @UserDetails (UserId, MessageTypeId)
SELECT Tab.Col.value('@UserId','INT'),
Tab.Col.value('@MessageTypeId','INT')
FROM @ParametersXml.nodes('/Notification/UserNotifications[not(@xsi:nil = "true")][1]/UserNotification') AS Tab(Col)
But this never inserts anything..
I have been playing around with this for a while now and not had any joy :(

I would suggest going through the links below. I found them short and quick to go through:
http://blog.sqlauthority.com/2009/02/12/sql-server-simple-example-of-creating-xml-file-using-t-sql/
http://blog.sqlauthority.com/2009/02/13/sql-server-simple-example-of-reading-xml-file-using-t-sql/

I found the solution to this problem through further searching stack overflow.
The query I need (thanks to XQuery [value()]: 'value()' requires a singleton (or empty sequence), found operand of type 'xdt:untypedAtomic *')
INSERT INTO @UserDetails (UserId, MessageTypeId)
SELECT UserNotification.value('UserId[1]','INT'),
UserNotification.value('MessageTypeId[1]','INT')
FROM @ParametersXml.nodes('//Notification/UserNotifications') AS x(Coll)
CROSS APPLY @ParametersXml.nodes('//Notification/UserNotifications/UserNotification') AS un(UserNotification)
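For what it's worth, the same result can also be had with a single nodes() call over the UserNotification elements. A minimal sketch, assuming @ParametersXml holds the XML shown above and @UserDetails is declared as before:
INSERT INTO @UserDetails (UserId, MessageTypeId)
SELECT un.n.value('(UserId)[1]', 'INT'),
       un.n.value('(MessageTypeId)[1]', 'INT')
FROM @ParametersXml.nodes('/Notification/UserNotifications/UserNotification') AS un(n);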

PGSQL seems to ignore param passed in [closed]

Issue: If I call with no parameter at all, I get an error that there is no such function, which is expected. If I call with a valid parameter, I get a whole lot of results I should not. So, while it seems aware of the parameter, it is not being used as I would expect.
Workaround: I can be redundantly specific with a where clause, but I know there's something I'm missing here. I expect to pass a territory id and get ONLY results for that territory id. I'm getting all territories right now.
select * from by_territory_id('TER-123') where territory = 'TER-123';
Of course I want to simply do the following, as that is what wrapping this in a function is for:
select * from by_territory_id('TER-123');
If anybody can see what is wrong here, please point it out. The parameter is supposed to be used in the WHERE clause of this function:
CREATE OR REPLACE FUNCTION by_territory_id (territory_id char)
RETURNS TABLE (customer_id char, territory char, scan_id int, scan_status iol.consignment_audit_scan_status) AS $$
select distinct con.customer_id, cus.primary_territory stm_territory, cas.id, cas.status
from iol.consignment con
join iol.products prod
on con.description = prod.description
join iol.customers cus
on con.customer_id = cus.customer_id
left join iol.territories_and_roles tr
on cus.primary_territory = tr.territory_id
left outer join iol.consignment_audits_scan cas
on cus.customer_id = cas.customer_id
where prod.lvl3_id in ('PREMIUM', 'STANDARD')
and tr.role = 'STM' and cus.primary_territory = territory_id
order by cas.id
$$ LANGUAGE SQL;
EDIT: Two down votes saying I'm not clear enough at the point of this edit. All the same, the one answer is correct. I hope it is okay that I really just needed eyes on this and I believe somebody else will run into the exact same issue, so rather than expand on the above I'm going to leave it be because the issue was indeed the name of my parameter (and PG gave no complaint or indication).
Looks like one of your tables has a column named territory_id, and that is what is being picked up by your unqualified reference. You can qualify it with the function name:
and tr.role = 'STM' and cus.primary_territory = by_territory_id.territory_id
Or better yet change the spelling of the parameter so that it is distinct.
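For illustration, a minimal sketch of the renamed-parameter approach (the p_ prefix and the cut-down body are assumptions, not your original function); with a distinct name, the unqualified reference can only resolve to the parameter:
CREATE OR REPLACE FUNCTION by_territory_id (p_territory_id char)
RETURNS TABLE (customer_id char, territory char) AS $$
  select cus.customer_id, cus.primary_territory
  from iol.customers cus
  where cus.primary_territory = p_territory_id  -- no column shares this name
$$ LANGUAGE SQL;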

Add a missing key to JSON in a Postgres table via Rails

I'm trying to use update_all to update any records that are missing a key in a JSON document stored in a table cell. ids contains the ids of those records, and I've tried the below...
User.where(id: ids).
update_all(
"preferences = jsonb_set(preferences, '{some_key}', 'true'"
)
Where the error returns is...
Caused by PG::SyntaxError: ERROR: syntax error at or near "WHERE"
LINE 1: ...onb_set(preferences, '{some_key}', 'true' WHERE "user...
The key takes a string value, so I'm not sure why the query is failing.
UPDATE:
Based on what was mentioned, I added the parentheses and also added / modified the last two arguments...
User.where(id: ids).
update_all(
"preferences = jsonb_set(preferences, '{some_key}', 'true'::jsonb, true)"
)
still running into issues, and this time it seems related to the key I'm passing:
1. I know this key doesn't currently exist for the set of ids
2. I added true for create_missing so that 1 isn't an issue
I get this error now...
Caused by PG::UndefinedFunction: ERROR: function jsonb_set(hstore, unknown, jsonb, boolean) does not exist
some_key should be a key in preferences
You're passing in raw SQL so you are 100% responsible for ensuring that is actually valid SQL. What you have there isn't. Check your parentheses:
User.where(id: ids).
update_all(
"preferences = jsonb_set(preferences, '{some_key}', 'true')"
)
If you look more closely at the error message, it was telling you there was a problem precisely at the introduction of the WHERE clause, right after ...true', so that was a good place to look for problems.
Syntax errors like this can be really annoying, but don't forget your database will usually do its best to pin down the place where the problem occurs.
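For reference, once Rails expands the relation, the corrected call boils down to plain SQL along these lines (the id list here is made up for illustration, and this assumes preferences really is a jsonb column):
UPDATE "users"
SET preferences = jsonb_set(preferences, '{some_key}', 'true')
WHERE "users"."id" IN (1, 2, 3);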

How to use Postgresql ts_delete function

I am trying to use PostgreSQL Full Text Search. I read that the stop words (words ignored for indexing) are implemented via a dictionary. But I would like to give the user limited control over the stop words (to insert new ones), so I grouped them in a table.
From the example below:
select strip(to_tsvector('simple', texto)) from longtxts where id = 23;
I can get the vector:
{'alta' 'aluno' 'cada' 'do' 'em' 'leia' 'livro' 'pedir' 'que' 'trecho' 'um' 'voz'}
And now I would like to remove the elements from the stopwords table:
select array(select palavra_proibida from stopwords);
That returns the array:
{a,as,ao,aos,com,default,e,eu,o,os,da,das,de,do,dos,em,lhe,na,nao,nas,no,nos,ou,por,para,pra,que,sem,se,um,uma}
Then, following documentation:
ts_delete(vector tsvector, lexemes text[]) returns tsvector
Removes any occurrence of lexemes in lexemes from vector
Example: ts_delete('fat:2,4 cat:3 rat:5A'::tsvector, ARRAY['fat','rat'])
I tried a lot. For example:
select ts_delete((select strip(to_tsvector('simple', texto)) from longtxts where id = 23), array[(select palavra_proibida from stopwords)]);
But I always receive the error:
ERROR: function ts_delete(tsvector, character varying[]) does not exist
LINE 1: select ts_delete((select strip(to_tsvector('simple', texto))...
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
Could anyone help me? Thanks in advance!
ts_delete was introduced in PostgreSQL 9.6. Based on the error message, you're using an earlier version. You may try select version(); to be sure.
When you land on the PostgreSQL online documentation with a web search, it may correspond to any version. The version is in the URL and there's a "This page in another version" set of links at the top of each page to help switching to the equivalent doc for a different version.
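Once you're on 9.6 or later, the call you're after should work along these lines. This is only a sketch: the array_agg aggregation over the stopwords table and the ::text cast are my assumptions, with the table and column names taken from the question:
SELECT ts_delete(
         strip(to_tsvector('simple', texto)),
         (SELECT array_agg(palavra_proibida::text) FROM stopwords)
       )
FROM longtxts
WHERE id = 23;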

How to cast a JSON value to its correspondent datatype?

JSON-string is SQL-text, JSON-number is SQL-numeric, JSON-boolean is SQL-boolean... So I can do "real good cast"... But:
SELECT to_jsonb('hello'::text)::text
is not good: it returns the value with quotes. The worst case:
SELECT to_jsonb(1::int)::int;
-- ERROR: cannot cast type jsonb to integer
So, how to do casting?
  ((this is a Wiki, you can edit!))
(Update) Modern PostgreSQL can cast directly!
Example using pg 13.5:
SELECT (('{"x":123}'::jsonb)->'x')::int; -- ok no error!
SELECT (('{"x":123.44}'::jsonb)->'x')::float; -- ok no error!
SELECT (('{"x":123.44}'::jsonb)->'x')::numeric; -- ok no error!
SELECT (('{"x":"hello !"}'::jsonb)->'x')::text; -- no error, but quotes
SELECT ('{"x":"hello !"}'::jsonb)->>'x'; -- ok!
-- Modern versions don't need a cast for simple operations:
SELECT (('{"x":123}'::jsonb)->'x')::int + 3; -- Ok, but...
SELECT jsonb_path_query('{"x":123}', '$.x + 3'); -- How about this!
OLD answers for pg 9.4
Ugly ways only...
Today PostgreSQL is not so serious about JSON... So, let's work around it.
Sorry, there are related questions (mine!) and answers about this already. The one with the simplest solution is this:
In pg v9.4.4+ using the #>> operator works for me: select to_json('test'::text)#>>'{}';
So, using the question's example:
SELECT (to_json(1::int)#>>'{}')::int; -- the cast works
A JSONb question... is the internal binary value reused?
SELECT (to_jsonB(10.7::numeric)#>>'{}')::numeric; -- internal repres. preserved?
SELECT (to_jsonB(10.7::float)#>>'{}')::float; -- internal repres. preserved?
-- internal JSONb was float or numeric?
SELECT (to_jsonB(true)#>>'{}')::boolean; -- internal representation preserved?
... there is no guarantee that PostgreSQL's internal parser is doing the right thing, reusing the internal numeric representation of JSONb to avoid CPU-time consumption (converting the jsonb number to SQL text, then casting the text to an SQL number).
"JSONb cast optimization", or "SQL datatype/JSONb datatype conversion optimization", still seems to be a big gap for PostgreSQL.
Here's a simpler solution that I've found for PostgreSQL.
Assuming:
{
"additionaldata" :
{
"Duration" : "123.44"
}
}
The following select works:
select
(additionaldata ->> 'Duration')::numeric as durationSeconds
from ...
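As a quick sanity check against the sample document above (a sketch; the literal is just the question's JSON inlined), the #>> path operator mentioned earlier pulls the nested value out as unquoted text, ready for a numeric cast:
SELECT ('{"additionaldata":{"Duration":"123.44"}}'::jsonb
        #>> '{additionaldata,Duration}')::numeric;  -- 123.44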

How do I get the contents of all of the nodes in a root node with SQL and XQuery?

I have the following table structure:
CREATE TABLE SpecialTable
(
Key UNIQUEIDENTIFIER,
XMLField VARCHAR(MAX)
)
In the first tuple:
Key = "28384841-17283848-7836-18292939"
XMLField =
"<RootNode>
<ForeignKey>92383829-27374848-1298-19283789</ForeignKey>
<ForeignKey>47585853-27374848-4759-19283939</ForeignKey>
<ForeignKey>37383829-27374848-3747-19283930</ForeignKey>
</RootNode>"
In another tuple, I see:
Key = "89984841-17283848-7836-18292939"
XMLField =
"<RootNode>
<ForeignKey>92383829-27374848-1298-19283789</ForeignKey>
<ForeignKey>37383829-27374848-3747-19283930</ForeignKey>
</RootNode>"
In a further tuple, I see:
Key = "11114841-17283848-7836-18292939"
XMLField =
"<RootNode>
<ForeignKey>37383829-27374848-3747-19283930</ForeignKey>
</RootNode>"
What I need to do is to get the following dataset out:
Key ForeignKey
28384841-17283848-7836-18292939 92383829-27374848-1298-19283789
28384841-17283848-7836-18292939 47585853-27374848-4759-19283939
28384841-17283848-7836-18292939 37383829-27374848-3747-19283930
89984841-17283848-7836-18292939 92383829-27374848-1298-19283789
89984841-17283848-7836-18292939 37383829-27374848-3747-19283930
11114841-17283848-7836-18292939 37383829-27374848-3747-19283930
I must say that this is a simplified data set (the real data is more complex than this), and I have got to a point where I cannot get any further.
I have tried this:
SELECT sp.Key,
x.XmlCol.Query('.')
FROM SpecialTable AS sp
CROSS APPLY sp.XMLField.nodes('/RootNode') x(XmlCol)
However, it seems just to show the Key and the whole of the XML of the XMLField.
Also, I tried this:
SELECT sp.Key,
x.XmlCol.Query('ForeignKey[text]')
FROM SpecialTable AS sp
CROSS APPLY sp.XMLField.nodes('/RootNode') x(XmlCol)
And I get only the first value in the first ForeignKey node and not the others.
What am I doing wrong?
Kindest regards,
QuietLeni
First of all - if your data looks like XML, quacks like XML, smells like XML - then it IS XML and you should use the XML datatype to store it!
Also: be aware that Key is a very generic term, and also a T-SQL reserved keyword, so it makes for a really bad column name - use something more meaningful that doesn't clash with a keyword!
Once you've done that, you should be able to use this code to get your desired results:
SELECT
[Key],
ForeignKey = xc.value('.', 'varchar(50)')
FROM
dbo.SpecialTable
CROSS APPLY
XMLField.nodes('/RootNode/ForeignKey') AS XT(XC)
This will only work if your column XMLField is of the XML datatype!! (which it really should be anyway)
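If changing the column type isn't immediately possible, a CAST inside a CROSS APPLY works as a stopgap. A sketch, reusing the table and column names from the question:
SELECT
    [Key],
    ForeignKey = xc.value('.', 'varchar(50)')
FROM dbo.SpecialTable
CROSS APPLY (SELECT CAST(XMLField AS XML)) AS c(x)
CROSS APPLY c.x.nodes('/RootNode/ForeignKey') AS XT(xc)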