Process Tabular cube from stored procedure - tsql

I have a stored procedure that populates a few tables in a database which is the data source for a tabular SSAS cube. I want to process the cube as a final step in the stored procedure when I am done loading the tables. I found this code:
DECLARE @XMLA XML = '
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>' + @Database + '</DatabaseID>
    </Object>
    <Type>ProcessFull</Type>
    <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
  </Process>
</Batch>
';
DECLARE @Command VARCHAR(MAX) = CONVERT(VARCHAR(MAX), @XMLA);
EXEC (@Command) AT SSAS;
This accepts a database name, but what I can't figure out is how to make it run against a particular server or SSAS instance. My stored procedure and cube are on different named instances of SQL Server. Does anyone know how to either embed the server/instance name in the XMLA or run the XMLA against a specified instance?
Thanks in advance.

I discovered that the key is using a linked server (and TMSL, thanks Mitch Wheat):
DECLARE @TMSL VARCHAR(MAX) = '{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "' + @DataBaseName + '"
      }
    ]
  }
}';
EXEC (@TMSL) AT SSASTABULAR; -- linked server name
SSASTABULAR is a linked server pointing to my SSAS cube server instance.
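If the linked server doesn't exist yet, a minimal sketch of creating one to an SSAS Tabular instance could look like this (the instance and database names below are placeholders, not from the original setup):

-- Sketch: linked server to an SSAS Tabular instance (names are placeholders).
EXEC master.dbo.sp_addlinkedserver
    @server = N'SSASTABULAR',            -- name used in EXEC (...) AT SSASTABULAR
    @srvproduct = N'',
    @provider = N'MSOLAP',               -- Analysis Services OLE DB provider
    @datasrc = N'MyServer\TabularInstance',
    @catalog = N'MyTabularDatabase';

-- EXEC (...) AT requires RPC Out to be enabled on the linked server.
EXEC master.dbo.sp_serveroption @server = N'SSASTABULAR', @optname = N'rpc out', @optvalue = N'true';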

You can create a job in SQL Agent to process the cube. You can then call the SQL Agent job from your stored procedure using sp_start_job.
https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-start-job-transact-sql?view=sql-server-ver15
Doing it this way allows you to process the cube from different stored procedures without having to store the processing code in each one of them. It also makes it easier to update in case your cubes are moved to a different server; you only have to update the SQL Agent job to point to the new server.
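For example, the call from the stored procedure is just this (the job name here is a placeholder):

-- Start the cube-processing job from the stored procedure (job name is a placeholder).
EXEC msdb.dbo.sp_start_job @job_name = N'Process Tabular Cube';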

How to use a very long JSON as a text parameter in PowerShell?

TL;DR: I'm overflowing a string parameter with an over 500k character JSON.
I'm using an Azure-based solution to:
1. In Logic Apps, go through a list of SharePoint Lists stored in over 200 SharePoint subsites.
2. Send an HTTP request to the SharePoint API and download each list as JSON.
3. Call a stored procedure on the SQL Database that transforms and loads the data into the database.
After having some issues with step 3, namely timeout issues with the Logic Apps connection, I added a step:
2.5: Call an Automation Runbook that calls the stored procedure without timing out. This is based on this solution. Basically, it's a PowerShell script that creates an ADO.NET connection to the Azure SQL Database and then executes the stored procedure, with the SP parameters in turn requested as parameters in Logic Apps.
But with a few of the lists I'm getting an error indicating that I've busted the character limit on a PowerShell string variable:
{
"code": "BadRequest",
"message": "{\"Message\":\"The request is invalid.\",\"ModelState\":{\"job.properties.parameters\":[\"Job parameter values too long. Max allowed length:524288. Parameter names: Json\"]}}"
}
Here's the core of it: "Job parameter values too long. Max allowed length:524288. Parameter names: Json". This parameter is declared in PowerShell as follows:
[parameter(Mandatory=$True)]
[string] $Json,
Is there another data type I could declare for this that would not run into this limitation?
Following up on the suggestion by David Browne in the comments, I passed my large JSON responses from the SharePoint API into Blob Storage, and then passed the blob SAS URI as the parameter for the stored procedure. I also had some authorization issues executing this procedure, and the solution my boss pointed me to was to create a master key. This was executed once:
USE ***DataBase***
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '123'
GO
And then in the stored procedure, the part concerned with opening the blob content and reading it into a table variable and finally into a text variable was wrapped in an OPEN MASTER KEY ... CLOSE MASTER KEY block:
-- @vURI, @SplitChar, @vFileName, @vSQL and @vJson are declared earlier in the procedure.
DECLARE @vTable TABLE (BulkColumn NVARCHAR(MAX));  -- outer table variable that captures the dynamic SQL result

SET @vURI = '***Json BLOB URI***'
SET @SplitChar = CHARINDEX('?', @vURI)
SET @vFileName = REPLACE(SUBSTRING(@vURI, 1, @SplitChar - 1), 'https://***StorageAcc***.blob.core.windows.net/***ContainerName***/', '')

OPEN MASTER KEY DECRYPTION BY PASSWORD = '123'

-- Refresh the SAS secret on the database scoped credential used by the external data source.
SET @vSQL = 'ALTER DATABASE SCOPED CREDENTIAL dbscopedcredential WITH IDENTITY = ''SHARED ACCESS SIGNATURE'',
SECRET = ''***BLOB AUTHENTICATION***'';'
EXEC sp_executesql @stmt = @vSQL

-- Read the blob into a table variable inside dynamic SQL, then return it to the outer scope.
SET @vSQL = '
DECLARE @vTable TABLE (BulkColumn NVARCHAR(MAX));
INSERT INTO @vTable
SELECT * FROM OPENROWSET(
    BULK ''' + @vFileName + ''',
    DATA_SOURCE = ''externaldatasource_import'',
    SINGLE_CLOB) AS DataFile;
SELECT * FROM @vTable
'
INSERT INTO @vTable
EXEC sp_executesql @stmt = @vSQL

SELECT @vJson = [BulkColumn] FROM @vTable

CLOSE MASTER KEY
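For reference, the OPENROWSET ... BULK call above assumes that a database scoped credential and an external data source already exist. A one-time setup sketch (the storage account, container, and SAS values are placeholders) could look like this:

-- One-time setup (placeholder names and URLs); run after CREATE MASTER KEY.
CREATE DATABASE SCOPED CREDENTIAL dbscopedcredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '***SAS token without the leading ?***';

CREATE EXTERNAL DATA SOURCE externaldatasource_import
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://***StorageAcc***.blob.core.windows.net/***ContainerName***',
    CREDENTIAL = dbscopedcredential
);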

SSIS Import Files with changing layouts

I'm using SSIS 2008 and trying to work on a package for importing a specified file into a table created for its layout. It will take the destination table and source file as package variables.
The main problem I'm running into is that the file layouts are subject to change; they're not consistent. The table I'd be importing into will match the file, though. I had initial success, but soon after changing the source file/destination it throws the VS_NEEDSNEWMETADATA error.
Are there any workarounds discovered that could potentially be used here for files not fitting the layout the package was designed with?
Edit: These are .txt files, tab-delimited.
Edit2: Tried fiddling with OPENROWSET as well, hit a security error on our server.
I am assuming here that said file is a CSV file.
I was faced with the exact same problem a couple of weeks ago. You need to use dynamic SQL to achieve this.
Create a stored procedure on your database with the code below (change the 2 "C:\Folder\" locations to the location of your file):
CREATE PROCEDURE [dbo].[CreateAndImportCSVs] (@FILENAME NVARCHAR(200))
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @PATH NVARCHAR(4000) = N'C:\Folder\' + @FILENAME + ''
    DECLARE @TABLE NVARCHAR(50) = SUBSTRING(@FILENAME, 0, CHARINDEX('.', @FILENAME))  -- file name without extension
    DECLARE @SQL NVARCHAR(4000) = N'IF OBJECT_ID(''dbo.' + @TABLE + ''' , ''U'') IS NOT NULL DROP TABLE dbo.[' + @TABLE + ']
    SELECT * INTO [' + @TABLE + ']
    FROM OPENROWSET(''MSDASQL''
        ,''Driver={Microsoft Access Text Driver (*.txt, *.csv)};DefaultDir=C:\Folder;''
        ,''SELECT * FROM ' + @FILENAME + ''')'

    EXEC(@SQL)
END
You might need to download the Microsoft Access Database Engine from:
https://www.microsoft.com/en-gb/download/details.aspx?id=13255
and install on your machine/server for the Microsoft Access Text Driver to work.
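Also, because the procedure relies on OPENROWSET with an ad hoc connection, the server may need 'Ad Hoc Distributed Queries' enabled; the security error mentioned in the question's edit is often this setting. A sketch, to be run by an administrator:

-- Sketch: enable ad hoc OPENROWSET/OPENDATASOURCE queries (requires appropriate server permissions).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;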
Then create an Execute SQL Task in SSIS with the relevant connection details to your SQL Server database, and pass the file name to the stored procedure you created:
EXEC dbo.CreateAndImportCSVs 'filename.csv'
It will then create the table based on the structure and data contained within the CSV, and it names the table the same as the CSV file name.
This stored procedure can also be used to run through a list of files, as sketched below.
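A rough sketch of that (the table and column holding the file names are made up for illustration):

-- Sketch: run the import procedure for every file listed in a (hypothetical) dbo.FilesToImport table.
DECLARE @FileName NVARCHAR(200);

DECLARE FileList CURSOR LOCAL FAST_FORWARD FOR
    SELECT FileName FROM dbo.FilesToImport;

OPEN FileList;
FETCH NEXT FROM FileList INTO @FileName;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.CreateAndImportCSVs @FILENAME = @FileName;
    FETCH NEXT FROM FileList INTO @FileName;
END

CLOSE FileList;
DEALLOCATE FileList;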
Hope this helps!

How do I save results from a pass-through query to a local table?

I need to submit a series of queries to an Oracle server over ODBC from an MS SQL server and store the results as a table on the MS SQL server.
It has to be a pass-through query because it requires a server-side function defined on the Oracle server.
I can't save the table on the Oracle server and then access it via ODBC because of licensing restrictions from the vendor of the db running on Oracle.
Here's the code that returns the correct results, but I don't know how to save them:
DECLARE @BibID AS bigint
DECLARE BibList CURSOR FOR
SELECT BIB_ID FROM tblActiveSerialsThatHave740s
OPEN BibList
FETCH NEXT FROM BibList INTO @BibID
WHILE @@FETCH_STATUS = 0
BEGIN
    EXECUTE
    ('SELECT
        AMDB.BIB_DATA.BIB_ID As BIB_ID,
        AMDB.GetAllBibTag(AMDB.BIB_DATA.BIB_ID, ''740'', ''2'') As F740_All
    FROM
        AMDB.BIB_DATA
    WHERE
        AMDB.BIB_DATA.BIB_ID = ' + @BibID + '
    GROUP BY BIB_ID '
    )
    AT REPORT
    FETCH NEXT FROM BibList INTO @BibID
END
CLOSE BibList
DEALLOCATE BibList
You need to use INSERT INTO ... EXECUTE to capture the results of the pass-through EXECUTE.
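For example, assuming a local table with matching columns already exists (dbo.Bib740s is a placeholder name, and the cast of the bigint key to a string is added so the concatenation works), each iteration of the cursor could do something like this:

-- Sketch: capture each pass-through result into a local table (dbo.Bib740s is a placeholder).
DECLARE @BibIDText VARCHAR(20);
SET @BibIDText = CAST(@BibID AS VARCHAR(20));

INSERT INTO dbo.Bib740s (BIB_ID, F740_All)
EXECUTE ('SELECT
              AMDB.BIB_DATA.BIB_ID As BIB_ID,
              AMDB.GetAllBibTag(AMDB.BIB_DATA.BIB_ID, ''740'', ''2'') As F740_All
          FROM AMDB.BIB_DATA
          WHERE AMDB.BIB_DATA.BIB_ID = ' + @BibIDText + '
          GROUP BY BIB_ID') AT REPORT;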
Because you are executing a pass-through query, the Distributed Transaction Coordinator may come into play, so you might need to ensure that a distributed transaction is not created (a distributed transaction is unlikely to be necessary in your case) or that the Distributed Transaction Coordinator service is running:
http://blogs.msdn.com/b/sqlprogrammability/archive/2008/08/22/how-to-create-an-autonomous-transaction-in-sql-server-2008.aspx
http://technet.microsoft.com/en-us/library/ms178532.aspx
http://www.sqlservercentral.com/Forums/Topic861249-392-1.aspx
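If the INSERT ... EXEC ends up trying to promote a distributed transaction and that isn't wanted, one option (an assumption on my part, not taken from the links above) is to turn off transaction promotion on the linked server:

-- Sketch: stop EXEC (...) AT REPORT from promoting to a distributed transaction.
EXEC master.dbo.sp_serveroption @server = N'REPORT', @optname = N'remote proc transaction promotion', @optvalue = N'false';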

Execute stored procedure with SQL query passed in from another table?

Currently my development environment is SQL Server Express 2008 R2 and VS2010.
My question is best explained with a scenario.
Development goal:
I am developing Windows services in .NET C# for something like data mining or data warehousing, which means two or more databases are involved.
My scenario is like this:
I have a database with a table called SQL_Stored that contains a column named QueryToExec.
The first idea that came to mind was to write a stored procedure, and I tried to come up with one named Extract_Sources with two parameters passed in: ID and TableName.
My first step is to select the SQL that needs to be executed from the table SQL_Stored. I tried to get the SQL by using a simple SELECT statement such as:
Select Download_Sql As Query From SQL_Stored
Where ID=@ID AND TableName=@TableName
Is it possible to get the result that way, or is there another way to do so?
My second step is to execute the SQL that I get from the SQL_Stored table. Is it possible to execute the query selected in the previous step within this same stored procedure?
Do I need to create a variable to store the SQL?
Thank you, I appreciate all your help. Please don't hesitate to point out my errors or mistakes, because I can learn from them. Thank you.
PS_1: I am sorry for my poor English.
PS_2: I am new to stored procedures.
LiangCk
Try this:
DECLARE @download_sql VARCHAR(MAX)

SELECT
    @download_sql = Download_Sql
FROM
    SQL_Stored
WHERE
    AreaID = @AreaID
    AND TableName = @TableName

EXEC (@download_sql)
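Wrapped up as the stored procedure described in the question, that could look roughly like this (the parameter names and types are assumptions; the snippet above uses AreaID where the question says ID):

-- Sketch: the Extract_Sources procedure built around the snippet above (parameter types are assumed).
CREATE PROCEDURE dbo.Extract_Sources
    @AreaID INT,
    @TableName VARCHAR(200)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @download_sql VARCHAR(MAX);

    SELECT @download_sql = Download_Sql
    FROM SQL_Stored
    WHERE AreaID = @AreaID
      AND TableName = @TableName;

    -- Execute the SQL text stored in the table.
    EXEC (@download_sql);
END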

Creating a connection from Microsoft SQL server to an AS/400

I'm trying to connect from Microsoft SQL Server to an AS/400 so I can pull data from the AS/400 and then flag the data as having been pulled.
I've successfully created an OLE DB "IBMDASQL" connection and am able to pull some data, but I'm running into an issue when I try to pull data from a very large table.
This runs fine, and returns a count of 170 million:
select count(*)
from transactions
This query executed for 15 hours before I gave up on it. (It should return zero since I haven't flagged anything as 'In process' yet.)
select count(*)
from transactions
where processed = 'In process'
I'm a Microsoft guy, but my AS/400 guy says that there is an index on the 'processed' column and that locally, that query runs instantaneously.
Any thoughts on what I might be doing wrong? I found a table with only 68 records in it, and was able to run this query in about a second:
select count(*)
from smallTable
where RandomColumn = 'randomValue'
So I know that the AS/400 is at least able to understand that type of query.
I have had to fight this battle many times.
There are two ways of approaching this.
1) Stage your data from the AS400 into SQL server where you can optimize your indexes
2) Ask the AS400 folks to create logical views which speed up data retrieval. Your AS400 programmer is correct that an index will help, but I forget the term they use for a "view" similar to a SQL Server view; I believe it's something like "physical" vs. "logical" files. Logical is what you want.
Thirdly, 170 million is a lot of records, even for a relational database like SQL Server. Have you considered running an SSIS package nightly that stages your data into your own SQL table, to see if it improves performance?
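As a rough illustration of the staging idea (the linked server, library, table, and column names below are placeholders), something along these lines could be scheduled nightly, whether from SSIS or SQL Agent:

-- Sketch: refresh a local staging copy of the remote AS/400 data (all names are placeholders).
TRUNCATE TABLE dbo.Staging_Transactions;

INSERT INTO dbo.Staging_Transactions
SELECT *
FROM OPENQUERY(AS400, 'SELECT * FROM MYLIB.TRANSACTIONS');  -- optionally filter on the remote side
-- Local indexes on dbo.Staging_Transactions can then make the 'processed' lookups fast.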
I would suggest this approach for good performance. I assume you have at least SQL Server 2005; I haven't tested this yet, but here is a tip.
Let the AS400 perform the select natively by creating stored procedures on the AS400:
Open an AS400 session.
Launch STRSQL.
Create an AS400 stored procedure like this to get the recordset:
CREATE PROCEDURE MYSELECT (IN PARAM CHAR(10))
LANGUAGE SQL
DYNAMIC RESULT SETS 1
BEGIN
DECLARE C1 CURSOR FOR SELECT * FROM MYLIB.MYFILE WHERE MYFIELD=PARAM;
OPEN C1;
RETURN;
END
Create an AS400 stored procedure to update the recordset:
CREATE PROCEDURE MYUPDATE (IN PARAM CHAR(10))
LANGUAGE SQL
RESULT SETS 0
BEGIN
UPDATE MYLIB.MYFILE SET MYFIELD='newvalue' WHERE MYFIELD=PARAM;
END
Call those AS400 stored procedures from SQL Server:
declare @myParam char(10)
set @myParam = 'In process'
-- get the recordset
EXEC ('CALL NAME_AS400.MYLIB.MYSELECT(?) ', @myParam) AT AS400 -- AS400 = name of linked server
-- update
EXEC ('CALL NAME_AS400.MYLIB.MYUPDATE(?) ', @myParam) AT AS400
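For completeness, the AS400 linked server referenced above could be set up with something like the following; the provider and data source values here are assumptions based on the IBMDASQL provider mentioned in the question:

-- Sketch: define the AS400 linked server used by EXEC (...) AT AS400 (names are placeholders).
EXEC master.dbo.sp_addlinkedserver
    @server = N'AS400',
    @srvproduct = N'IBM',
    @provider = N'IBMDASQL',
    @datasrc = N'my-as400-hostname';

-- EXEC (...) AT requires RPC Out to be enabled on the linked server.
EXEC master.dbo.sp_serveroption @server = N'AS400', @optname = N'rpc out', @optvalue = N'true';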
Hope it helps
I recommend following the suggestions in the IBM Redbook SQL Performance Diagnosis on IBM DB2 Universal Database for iSeries to determine what's really happening.
IBM technical support can also be extremely helpful in diagnosing issues such as these. Don't be afraid to get in touch with them as the software support is generally included as part of the maintenance contract and there is no charge to talk to them.
I've seen OLE DB connections eat up 100% CPU for hours, while the same query run through Visual Explain (the query analyzer) is estimated to take mere seconds to execute.
We found that running the query like this performed as expected:
SELECT *
FROM OpenQuery( LinkedServer,
'select count(*)
from transactions
where processed = ''In process''')
GO
Could this be a collation problem? Your WHERE clause is testing a text field, and if the collations of the two servers don't match, the clause will be applied client-side rather than server-side, so you are first pulling all 170 million records down to the client and then applying the WHERE clause there.
Based on past interactions I have had, the query should take about the same amount of time no matter how you access the data. Another thought would be to create a view on the table to get the data you need, or to use a stored procedure.