Change titles of columns in MDX with T-SQL - tsql

I created a simple T-SQL query against a processed OLAP cube, like below:
SELECT * FROM OPENQUERY([linkedserver], 'SELECT NON EMPTY { [Measures].[Revenue] } ON COLUMNS,
NON EMPTY { ([Basic].[Name].[Name].ALLMEMBERS ) } ON ROWS
FROM [SummaryCube]');
The result of the query is a table. The column titles are the defaults, e.g. "[Basic].[Name].[Name].[MEMBER_CAPTION]", but I would like to change them to e.g. "Names". I cannot change them using aliases, or I'm using aliases the wrong way. Can anyone tell me how I can change the column names?

Instead of your SELECT *, double quote the names of the returned columns:
SELECT "[Basic].[Name].[Name].[MEMBER_CAPTION]" as Names
You need to use double quotes because SQL Server treats the square brackets in the MDX column names as part of the identifier, so the whole name must be wrapped in double quotes rather than brackets.
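Putting the alias together with the query from the question, the full statement would look something like this (assuming the default SET QUOTED_IDENTIFIER ON, so double quotes delimit identifiers):

```sql
SELECT
    "[Basic].[Name].[Name].[MEMBER_CAPTION]" AS Names,
    "[Measures].[Revenue]" AS Revenue
FROM OPENQUERY([linkedserver], 'SELECT NON EMPTY { [Measures].[Revenue] } ON COLUMNS,
NON EMPTY { ([Basic].[Name].[Name].ALLMEMBERS ) } ON ROWS
FROM [SummaryCube]');
```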

I usually do some conversions as well:
SELECT
Names = CONVERT(VARCHAR(100), "[Basic].[Name].[Name].[MEMBER_CAPTION]"),
Revenue = CONVERT(NUMERIC(18,2),CAST("[Measures].[Revenue]" AS FLOAT))
FROM
OPENQUERY
(
[linkedserver],
'SELECT
NON EMPTY { [Measures].[Revenue] } ON COLUMNS,
NON EMPTY { ([Basic].[Name].[Name].ALLMEMBERS ) } ON ROWS
FROM [SummaryCube]'
);
And if you switched to the better olapextensions add-in, it would be:
DECLARE @Server NVARCHAR(30) = 'SummaryCubeServerName';
DECLARE @Database NVARCHAR(50) = 'SummaryCubeDatabaseName';
DECLARE @MDX NVARCHAR(MAX) = '
SELECT
NON EMPTY { [Measures].[Revenue] } ON COLUMNS,
NON EMPTY { ([Basic].[Name].[Name].ALLMEMBERS ) } ON ROWS
FROM [SummaryCube];
';
CREATE TABLE #Results(
Names VARCHAR(250),
Revenue FLOAT
);
INSERT INTO #Results
EXEC ExecuteOLAP @Server, @Database, @MDX;
SELECT
Names,
Revenue = ISNULL(CONVERT(DECIMAL(28,8),Revenue),0.0)
FROM #Results;

Related

Could not get desired column names using MDX in SQL Openquery. Need MEMBERs instead of complete hierarchy in columns name

Below is the T-SQL I have written to execute MDX using OPENQUERY. However, I am unable to get the columns in the required format. I need the column titles to be the member captions only, and a Total column at the end of the table.
DECLARE @DateFrom varchar(100)
SET @DateFrom = '2022-03-01'
DECLARE @DateTo varchar(100)
SET @DateTo = '2022-03-30'
DECLARE @Measure varchar(100)
SET @Measure = '[Measures].[Total Items]';
DECLARE @sql NVARCHAR(MAX);
SET @sql = 'SELECT *
FROM OPENQUERY(POSCUBE,''
SELECT
(
[Date].[FirstDateOfWeek].Members,
{
'+@Measure+'
}
)
ON COLUMNS,
ORDER
(
(
[Product].[ProductName].Children,
[Product].[BrandName].Children
)
,
[Measures].[Total Items]
,
BDESC
)
ON ROWS
FROM [Model]
WHERE
([Date].[Date].['+@DateFrom+'] : [Date].[Date].['+@DateTo+']
) '')'
PRINT (@sql)
EXEC (@sql)
This gives me the result below:
(screenshot: current result)
I need a result like the one below, where the "All" row total is at the end and the column titles are the member values only:
(screenshot: required result)

Concatenate string instead of just replacing it

I have a table with standard columns where I want to perform regular INSERTs.
But one of the columns is of type varchar with special semantics. It's a string that's supposed to behave as a set of strings, where the elements of the set are separated by commas.
E.g. if one row has the value fish,sheep,dove in that varchar column, and I insert the string ,fish,eagle, I want the result to be fish,sheep,dove,eagle (i.e. eagle gets added to the set, but fish doesn't, because it's already in the set).
I have here this Postgres code that does the "set concatenation" that I want:
SELECT string_agg(unnest, ',') AS x FROM (SELECT DISTINCT unnest(string_to_array('fish,sheep,dove' || ',fish,eagle', ','))) AS x;
But I can't figure out how to apply this logic to insertions.
What I want is something like:
CREATE TABLE IF NOT EXISTS t00(
userid int8 PRIMARY KEY,
a int8,
b varchar);
INSERT INTO t00 (userid,a,b) VALUES (0,1,'fish,sheep,dove');
INSERT INTO t00 (userid,a,b) VALUES (0,1,',fish,eagle')
ON CONFLICT (userid)
DO UPDATE SET
a = EXCLUDED.a,
b = SELECT string_agg(unnest, ',') AS x FROM (SELECT DISTINCT unnest(string_to_array(t00.b || EXCLUDED.b, ','))) AS x;
How can I achieve something like that?
Storing comma-separated values is a huge mistake to begin with. But if you really want to make your life harder than it needs to be, you might want to create a function that merges two comma-separated lists:
create function merge_lists(p_one text, p_two text)
returns text
as
$$
select string_agg(item, ',')
from (
select e.item
from unnest(string_to_array(p_one, ',')) as e(item)
where e.item <> '' --< necessary because of the leading , in your data
union
select t.item
from unnest(string_to_array(p_two, ',')) t(item)
where t.item <> ''
) t;
$$
language sql;
If you are using Postgres 14 or later, unnest(string_to_array(..., ',')) can be replaced with string_to_table(..., ',').
Then your INSERT statement gets a bit simpler:
INSERT INTO t00 (userid,a,b) VALUES (0,1,',fish,eagle')
ON CONFLICT (userid)
DO UPDATE SET
a = EXCLUDED.a,
b = merge_lists(excluded.b, t00.b);
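Assuming the merge_lists function above has been created, a quick check with the values from the question (note that string_agg without an ORDER BY does not guarantee element order):

```sql
SELECT merge_lists('fish,sheep,dove', ',fish,eagle');
-- e.g. fish,sheep,dove,eagle (element order may vary)
```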
I think I was only missing parentheses around the SELECT statement:
INSERT INTO t00 (userid,a,b) VALUES (0,1,',fish,eagle')
ON CONFLICT (userid)
DO UPDATE SET
a = EXCLUDED.a,
b = (SELECT string_agg(unnest, ',') AS x FROM (SELECT DISTINCT unnest(string_to_array(t00.b || EXCLUDED.b, ','))) AS x);

SQL Server, variable in IN Clause

I want to use a variable inside IN clause, similar to this:
Declare @tt NVARCHAR(MAX)
SET @tt = '02ea2b81-07f0-4660-bca1-81563f65bf65','07728975-cb1d-484c-8894-14f5b793cbef','1071ee4f-a214-443f-8694-0b3e9d2dc77e','120d2881-b04f-4707-a925-e4d941f03201','23af54a7-6666-4747-a74a-c2101cda59b0','260d2ce5-f4f0-4a0b-aa0b-3e1d2b5fcfeb','2710a913-13e7-4300-91f1-2646e2f8449e','2cebc482-4917-4aa3-973b-2481619a78e7','2d2269a4-9164-4dae-a732-90448d761509','2d29c707-1c5f-4e00-bd3c-bfd2ec2c6e29','3ead72a1-de91-47e9-8038-cc504a5274ec','40a03f53-7fd7-488d-922d-3435652219cb','43c93954-2e75-4d47-a53f-848eee609cf1','441e1a59-d397-4981-b770-01fb4594152e','4dacc9df-0536-46f6-af5d-78610ed998cd','4e4910ee-db9b-45ba-8872-2819dcefdc2c','4f9fd3ef-ba81-44bb-8c75-7cf6998e115e','60d9c73f-46c3-4ab1-9a4e-5440d18a0fd8','63e0cc57-1803-473f-847d-f3318f70c993','6510de61-9a1d-4f69-bec4-a744ea2bb847','799e2e55-2ba8-4772-8aff-331ed1817225','7be022db-4d37-4964-9005-3de7c6286027','85ba80c3-5c8b-4097-b5c9-c0d55ac6cf2f','8bc45b07-6a65-43c2-a41e-e791b085a053','8ca2d4a7-f4d6-4b56-aa41-42550e3a11b5','8fa7c3f6-e042-4b93-829f-79b8946a909e','ab34d18a-9482-4146-adb2-7e45e32f8cdd','ac43b44b-651c-4a98-a55f-82878cc8c656','ad9f222c-a98e-44eb-af9e-6f083941be9e','af7e8d24-9126-4d9b-a48a-75bf344c3529','b0e95518-0fef-46ba-81f4-0d1356ebc135','b1f1810f-3044-40b3-b218-5bb02d8922bd','b32ebf2b-f247-4032-8a37-285e4c3488a9','b93a8bb7-c62f-47b7-86ba-0421eb67ca14','c5342d7e-1667-47cb-bccf-91c5e8e9f18c','e2cf46f6-a522-4a96-8a84-f1ce3818c364','f01f4010-a192-43ca-a3bf-157379f4779d','f0f168ec-f043-41ef-90d3-3eac68b90334','f99af706-e1bb-42ba-bdf9-348a3b02c25e','fe691dee-b133-4d1c-90a3-8889cd3482d2';
Select * from table where assessmentId IN(@tt)
But this query fails with Incorrect syntax near ','. The same query works if I don't use a variable in the IN clause and pass the IDs directly:
Select * from table where AssessmentId IN
('02ea2b81-07f0-4660-bca1-81563f65bf65','07728975-cb1d-484c-8894-14f5b793cbef','1071ee4f-a214-443f-8694-0b3e9d2dc77e','120d2881-b04f-4707-a925-e4d941f03201','23af54a7-6666-4747-a74a-c2101cda59b0','260d2ce5-f4f0-4a0b-aa0b-3e1d2b5fcfeb','2710a913-13e7-4300-91f1-2646e2f8449e','2cebc482-4917-4aa3-973b-2481619a78e7','2d2269a4-9164-4dae-a732-90448d761509','2d29c707-1c5f-4e00-bd3c-bfd2ec2c6e29','3ead72a1-de91-47e9-8038-cc504a5274ec','40a03f53-7fd7-488d-922d-3435652219cb','43c93954-2e75-4d47-a53f-848eee609cf1','441e1a59-d397-4981-b770-01fb4594152e','4dacc9df-0536-46f6-af5d-78610ed998cd','4e4910ee-db9b-45ba-8872-2819dcefdc2c','4f9fd3ef-ba81-44bb-8c75-7cf6998e115e','60d9c73f-46c3-4ab1-9a4e-5440d18a0fd8','63e0cc57-1803-473f-847d-f3318f70c993','6510de61-9a1d-4f69-bec4-a744ea2bb847','799e2e55-2ba8-4772-8aff-331ed1817225','7be022db-4d37-4964-9005-3de7c6286027','85ba80c3-5c8b-4097-b5c9-c0d55ac6cf2f','8bc45b07-6a65-43c2-a41e-e791b085a053','8ca2d4a7-f4d6-4b56-aa41-42550e3a11b5','8fa7c3f6-e042-4b93-829f-79b8946a909e','ab34d18a-9482-4146-adb2-7e45e32f8cdd','ac43b44b-651c-4a98-a55f-82878cc8c656','ad9f222c-a98e-44eb-af9e-6f083941be9e','af7e8d24-9126-4d9b-a48a-75bf344c3529','b0e95518-0fef-46ba-81f4-0d1356ebc135','b1f1810f-3044-40b3-b218-5bb02d8922bd','b32ebf2b-f247-4032-8a37-285e4c3488a9','b93a8bb7-c62f-47b7-86ba-0421eb67ca14','c5342d7e-1667-47cb-bccf-91c5e8e9f18c','e2cf46f6-a522-4a96-8a84-f1ce3818c364','f01f4010-a192-43ca-a3bf-157379f4779d','f0f168ec-f043-41ef-90d3-3eac68b90334','f99af706-e1bb-42ba-bdf9-348a3b02c25e','fe691dee-b133-4d1c-90a3-8889cd3482d2');
How can I use a variable in the IN clause, as in the first approach?
You will have to insert the values into a table variable (or temp table).
Something like
DECLARE @TempTable TABLE(
assessmentId VARCHAR(50)
)
INSERT INTO @TempTable
VALUES
('02ea2b81-07f0-4660-bca1-81563f65bf65'),
('07728975-cb1d-484c-8894-14f5b793cbef'),
('1071ee4f-a214-443f-8694-0b3e9d2dc77e'),
('120d2881-b04f-4707-a925-e4d941f03201'),
('23af54a7-6666-4747-a74a-c2101cda59b0'),
('260d2ce5-f4f0-4a0b-aa0b-3e1d2b5fcfeb'),
('2710a913-13e7-4300-91f1-2646e2f8449e'),
('2cebc482-4917-4aa3-973b-2481619a78e7')
SELECT *
FROM table
where AssessmentId IN (SELECT assessmentId FROM @TempTable)
Since you want to specify multiple values, use a data type that supports multiple values (as opposed to a scalar variable). Here we're using a table variable:
Declare @tt table (value nvarchar(50) not null)
insert into @tt (value) values
('02ea2b81-07f0-4660-bca1-81563f65bf65'),('07728975-cb1d-484c-8894-14f5b793cbef'),('1071ee4f-a214-443f-8694-0b3e9d2dc77e'),
('120d2881-b04f-4707-a925-e4d941f03201'),('23af54a7-6666-4747-a74a-c2101cda59b0'),('260d2ce5-f4f0-4a0b-aa0b-3e1d2b5fcfeb'),
...
Select * from table where assessmentId IN(select value from @tt)
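On SQL Server 2016 and later (database compatibility level 130+), you can also keep a single CSV variable and split it inline with STRING_SPLIT; a sketch using the table and column names from the question (note the variable holds a plain comma-separated string, without the per-value quoting from the original attempt):

```sql
DECLARE @tt NVARCHAR(MAX);
SET @tt = '02ea2b81-07f0-4660-bca1-81563f65bf65,07728975-cb1d-484c-8894-14f5b793cbef';
-- STRING_SPLIT turns the CSV string into a one-column table named "value"
SELECT *
FROM [table]
WHERE assessmentId IN (SELECT value FROM STRING_SPLIT(@tt, ','));
```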

T-SQL Loop in a stored proc

How do I loop through a comma-separated variable using T-SQL in a stored proc?
So for instance my list would look like this
"1,2,3,4,5,6,7,8,9,10"
and I would loop through this list and make the necessary table
inserts based on it.
You could do it a couple of ways, but if this is a list of IDs it could be done like this as well (it would change your list format a bit):
UPDATE table
SET column = value
WHERE ID in ('1','2','3','4','5','6','7','8','9','10')
You could do a loop as well
DECLARE @List CHAR(100)
DECLARE @ListItem int
DECLARE @Pos int
SET @List = '1,2,3,4,5,6,7,8,9,10'
WHILE LEN(@List) > 0
BEGIN
--Pull Item From List
SET @Pos = CHARINDEX(',', @List)
IF @Pos = 0
BEGIN
SET @ListItem = @List
END
ELSE
BEGIN
SET @ListItem = SUBSTRING(@List, 1, @Pos - 1)
END
UPDATE table
SET column = value
WHERE ID = @ListItem
--Remove Item From List
IF @Pos = 0
BEGIN
SET @List = ''
END
ELSE
BEGIN
SET @List = SUBSTRING(@List, @Pos + 1, LEN(@List) - @Pos)
END
END
I'd try to avoid looping and insert the rows directly from your comma list.
Use a table-valued parameter (new in SQL Server 2008). Set it up by creating the actual table parameter type:
CREATE TYPE IntTableType AS TABLE (ID INTEGER PRIMARY KEY)
Your procedure would then be:
Create Procedure up_TEST
@Ids IntTableType READONLY
AS
SELECT *
FROM ATable a
WHERE a.Id IN (SELECT ID FROM @Ids)
RETURN 0
GO
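To call the procedure, declare a variable of the table type, fill it, and pass it in; a sketch using the names from the snippet above:

```sql
DECLARE @MyIds IntTableType;
INSERT INTO @MyIds (ID) VALUES (1), (2), (3);
EXEC up_TEST @Ids = @MyIds;
```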
If you can't use table-valued parameters, see "Arrays and Lists in SQL Server 2005 and Beyond, When Table Value Parameters Do Not Cut it" by Erland Sommarskog; there are many ways to split a string in SQL Server, and that article covers the pros and cons of just about every method. In general, you need to create a split function. This is how a split function can be used to insert rows:
INSERT INTO YourTableA (colA)
SELECT
b.col1
FROM dbo.yourSplitFunction(@Parameter) b
I prefer the numbers-table approach to splitting a string in T-SQL, but there are numerous ways to split strings in SQL Server; see the previous link, which explains the pros and cons of each.
For the Numbers Table method to work, you need to do this one time table setup, which will create a table Numbers that contains rows from 1 to 10,000:
SELECT TOP 10000 IDENTITY(int,1,1) AS Number
INTO Numbers
FROM sys.objects s1
CROSS JOIN sys.objects s2
ALTER TABLE Numbers ADD CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED (Number)
Once the Numbers table is set up, create this split function:
CREATE FUNCTION [dbo].[FN_ListToTable]
(
@SplitOn char(1) --REQUIRED, the character to split the @List string on
,@List varchar(8000)--REQUIRED, the list to split apart
)
RETURNS TABLE
AS
RETURN
(
----------------
--SINGLE QUERY-- --this will not return empty rows
----------------
SELECT
ListValue
FROM (SELECT
LTRIM(RTRIM(SUBSTRING(List2, number+1, CHARINDEX(@SplitOn, List2, number+1)-number - 1))) AS ListValue
FROM (
SELECT @SplitOn + @List + @SplitOn AS List2
) AS dt
INNER JOIN Numbers n ON n.Number < LEN(dt.List2)
WHERE SUBSTRING(List2, number, 1) = @SplitOn
) dt2
WHERE ListValue IS NOT NULL AND ListValue!=''
);
GO
You can now easily split a CSV string into a table and join on it:
Create Procedure up_TEST
@Ids VARCHAR(MAX)
AS
SELECT * FROM ATable a
WHERE a.Id IN (SELECT ListValue FROM dbo.FN_ListToTable(',',@Ids))
GO
or insert rows from it:
Create Procedure up_TEST
@Ids VARCHAR(MAX)
,@OtherValue varchar(5)
AS
INSERT INTO YourTableA
(colA, colB, colC)
SELECT
ListValue, @OtherValue, GETDATE()
FROM dbo.FN_ListToTable(',',@Ids)
GO
Using a CTE (Common Table Expression) is the most elegant solution, I think. Check this question on Stack Overflow:
T-SQL: Opposite to string concatenation - how to split string into multiple records

Most succinct way to transform a CSV string to a table in T-SQL?

-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Question: How to get roles into a table view like this:
select 'Pub'
union
select 'RegUser'
union
select 'ServiceAdmin'
After posting this, I started playing with some dynamic SQL. This seems to work, but seems like there might be some security risks by using dynamic SQL - thoughts on this?
declare @rolesSql varchar(800)
select @rolesSql = 'select ''' + replace(@roles, ',', ''' union select ''') + ''''
exec(@rolesSql)
If you're working with SQL Server compatibility level 130 or higher, then the STRING_SPLIT function is now the most succinct method available.
Reference link: https://msdn.microsoft.com/en-gb/library/mt684588.aspx
Usage:
SELECT * FROM string_split('Pub,RegUser,ServiceAdmin',',')
RESULT:
value
-----------
Pub
RegUser
ServiceAdmin
See my answer from here
But basically you would:
Create this function in your DB:
CREATE FUNCTION dbo.Split(@origString varchar(max), @Delimiter char(1))
returns @temptable TABLE (items varchar(max))
as
begin
declare @idx int
declare @split varchar(max)
select @idx = 1
if len(@origString) < 1 or @origString is null return
while @idx != 0
begin
set @idx = charindex(@Delimiter, @origString)
if @idx != 0
set @split = left(@origString, @idx - 1)
else
set @split = @origString
if len(@split) > 0
insert into @temptable(items) values(@split)
set @origString = right(@origString, len(@origString) - @idx)
if len(@origString) = 0 break
end
return
end
and then call the function and pass in the string you want to split.
Select * From dbo.Split(@roles, ',')
Here's a thorough discussion of your options:
Arrays and Lists in SQL Server
What I do in this case is use some string replacement to convert the CSV to JSON and open the JSON like a table. It may not be suitable for every use case, but it is very simple to get running and works with strings and files. With files you just need to watch your line-break character; mostly I find it to be Char(13)+Char(10).
declare @myCSV nvarchar(MAX)= N'"Id";"Duration";"PosX";"PosY"
"•P001";223;-30;35
"•P002";248;-28;35
"•P003";235;-26;35'
--CSV to JSON
--convert to json by replacing some stuff
declare @myJson nvarchar(MAX)= '[['+ replace(@myCSV, Char(13)+Char(10), '],[' ) +']]'
set @myJson = replace(@myJson, ';',',') -- Optional: ensure comma delimiters for json if the current delimiter differs
-- set @myJson = replace(@myJson, ',,',',null,') -- Optional: empty in between
-- set @myJson = replace(@myJson, ',]',',null]') -- Optional: empty before linebreak
SELECT
ROW_NUMBER() OVER (ORDER BY (SELECT 0))-1 AS LineNumber, *
FROM OPENJSON( @myJson )
with (
col0 varchar(255) '$[0]'
,col1 varchar(255) '$[1]'
,col2 varchar(255) '$[2]'
,col3 varchar(255) '$[3]'
,col4 varchar(255) '$[4]'
,col5 varchar(255) '$[5]'
,col6 varchar(255) '$[6]'
,col7 varchar(255) '$[7]'
,col8 varchar(255) '$[8]'
,col9 varchar(255) '$[9]'
--any name column count is possible
) csv
order by (SELECT 0) OFFSET 1 ROWS --hide header row
Using SQL Server's built in XML parsing is also an option. Of course, this glosses over all the nuances of an RFC-4180 compliant CSV.
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Here's the XML way
select split.csv.value('.', 'varchar(100)') as value
from (
select cast('<x>' + replace(@roles, ',', '</x><x>') + '</x>' as xml) as data
) as csv
cross apply data.nodes('/x') as split(csv)
If you are using SQL 2016+, using string_split is better, but this is a common way to do this prior to SQL 2016.
Using BULK INSERT, you can import a CSV file into your SQL table:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
The accepted answer works fine, but I found this function to be much faster, even for thousands of records. Create the function below and use it:
IF EXISTS (
SELECT 1
FROM Information_schema.Routines
WHERE Specific_schema = 'dbo'
AND specific_name = 'FN_CSVToStringListTable'
AND Routine_Type = 'FUNCTION'
)
BEGIN
DROP FUNCTION [dbo].[FN_CSVToStringListTable]
END
GO
CREATE FUNCTION [dbo].[FN_CSVToStringListTable] (@InStr VARCHAR(MAX))
RETURNS @TempTab TABLE (Id NVARCHAR(max) NOT NULL)
AS
BEGIN
-- Ensure input ends with a single trailing comma
SET @InStr = REPLACE(@InStr + ',', ',,', ',')
DECLARE @SP INT
DECLARE @VALUE VARCHAR(1000)
WHILE PATINDEX('%,%', @InStr) <> 0
BEGIN
SELECT @SP = PATINDEX('%,%', @InStr)
SELECT @VALUE = LEFT(@InStr, @SP - 1)
SELECT @InStr = STUFF(@InStr, 1, @SP, '')
INSERT INTO @TempTab (Id)
VALUES (@VALUE)
END
RETURN
END
GO
---Test like this.
declare @v as NVARCHAR(max) = N'asdf,,as34df,234df,fs,,34v,5fghwer,56gfg,';
SELECT Id FROM dbo.FN_CSVToStringListTable(@v)
I was about to use the solution mentioned in the accepted answer, but more research led me to use table-valued types:
These are far more efficient, and you don't need a TVF (table-valued function) just to create a table from CSV. You can use the type directly in your scripts, or pass it to a stored procedure as a table-valued parameter. The type can be created as:
CREATE TYPE [UniqueIdentifiers] AS TABLE(
[Id] [varchar](20) NOT NULL
)
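Assuming the type above, a sketch of declaring and filling it directly in a script (the procedure name in the last line is hypothetical):

```sql
DECLARE @ids UniqueIdentifiers;
INSERT INTO @ids ([Id]) VALUES ('Pub'), ('RegUser'), ('ServiceAdmin');
SELECT [Id] FROM @ids;
-- or pass it to a stored procedure that declares a READONLY parameter of this type:
-- EXEC dbo.SomeProc @Ids = @ids;
```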