How do I import data from SQL Server Views? - ssas-tabular

How do I import data into SSAS from SQL Server views? It only shows me tables. I have views already customized to provide the data the way it should be.

You can expand the "Advanced Options" section and use a SQL statement:
SELECT * FROM MY_VIEW_NAME
Note that any filters that reduce the size of the data should be applied in your SQL, because Power Query does not do any query folding after a SQL statement. That means if your SQL looks like the one above, you will have to wait until every row is sent to Power BI, and any further filters in your Power Query will then be applied by Power BI itself.
If you are not familiar with SQL, here is how to set filters with a WHERE clause:
SELECT * FROM MY_VIEW_NAME
WHERE COLUMN_1 = 'US'
AND COLUMN_2 > 0

Related

Pivot function without manually typing values in `for in`?

The documentation provides an example of using the pivot() function:
SELECT *
FROM (SELECT partname, price FROM part) PIVOT (
AVG(price) FOR partname IN ('prop', 'rudder', 'wing')
);
I would like to use pivot() without having to manually specify each value of partname. I want all parts. I tried:
SELECT *
FROM (SELECT partname, price FROM part) PIVOT (
AVG(price) FOR partname);
That gave an error. Then I tried:
SELECT *
FROM (SELECT partname, price FROM part) PIVOT (
AVG(price) FOR partname IN (select distinct partname from part)
);
That also threw an error.
How can I tell Redshift to include all values of partname in the pivot?
I don't think this can be done in a single, simple query. It would require the query compiler to work without knowing how many output columns will be produced, and I don't think it can do that.
You can do this with multiple queries: use one query to create the list of part names, then use that list to generate a second query that populates the IN list. So something needs to issue the first query and build and run the second. That can be code external to Redshift (lots of options) or a stored procedure in Redshift. Wherever this code lives, it should keep in mind that Redshift has a limit of 1,600 columns per table.
The Redshift docs are fairly good on the topic of dynamic SQL in stored procedures; the EXECUTE statement is what fires off the generated query inside a stored procedure. See: https://docs.aws.amazon.com/redshift/latest/dg/c_PLpgSQL-statements.html
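As a rough illustration of the stored-procedure route, here is a minimal sketch that builds the IN list with LISTAGG and runs the generated PIVOT with EXECUTE. The procedure and result-table names (pivot_all_parts, part_pivot) are invented for this example, and it assumes the part table from the question:
CREATE OR REPLACE PROCEDURE pivot_all_parts()
AS $$
DECLARE
    in_list VARCHAR(65535);
BEGIN
    -- Build a quoted, comma-separated list of the distinct part names.
    SELECT INTO in_list LISTAGG(DISTINCT QUOTE_LITERAL(partname), ', ') FROM part;

    -- Recreate the result table with one column per part name.
    EXECUTE 'DROP TABLE IF EXISTS part_pivot';
    EXECUTE 'CREATE TABLE part_pivot AS '
         || 'SELECT * FROM (SELECT partname, price FROM part) '
         || 'PIVOT (AVG(price) FOR partname IN (' || in_list || '))';
END;
$$ LANGUAGE plpgsql;
-- CALL pivot_all_parts(); then SELECT * FROM part_pivot;
Keep in mind the 1,600-column limit mentioned above: if there are more distinct part names than that, the generated CREATE TABLE will fail.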

SQL Script in DAX ( Many different Joins )

I have a problem in Power BI with DAX: I want to reproduce SQL joins in DAX. I built my data model with my three tables:
Table 1 (screenshot)
Table 2 (screenshot)
Table 3 (screenshot)
I want to reproduce, with DAX, the joins in this SQL query (screenshot) and create a table from the joined block.
Can you help me?
Kind regards.
I tried to do this in Power Query, but I ran into speed issues.
In Power BI, you would address this by creating relationships in the model:
https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-create-and-manage-relationships
Please note that all joins are equivalent to outer joins in SQL. Some of what is in your sample query is not valid SQL syntax - you can't have duplicate table names, and "join" is repeated. However, if you wish to filter a table, for example T3.U2 = 'NW', you can simply add that as a WHERE clause to the query that imports the data into the model, or filter out the rows you want with Power Query (using the Edit Queries dialogue in Power BI).
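For instance, a minimal sketch of such a filtered import query - the table and column names T3 and U2 are taken from the question's screenshots and are assumptions here:
SELECT *
FROM T3          -- hypothetical table name from the screenshot
WHERE U2 = 'NW'  -- filtering at import time so fewer rows reach the model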

Optimize "select * from table" query with 200 million entries - DB2 database

I have a simple
select * from table
query, but the table has ~200 million entries. How can I optimize the query and still retrieve all of the data from the table?
The database is DB2.
Enable parallelism by setting CURRENT DEGREE = 'ANY'. This lets DB2 use multiple processors to satisfy the query faster. If you have any WHERE clause search conditions, indexing may help.
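A minimal sketch of that setting - the table name my_table is a placeholder since the real name is not given, and whether parallelism actually kicks in also depends on your DB2 instance configuration:
SET CURRENT DEGREE = 'ANY';   -- special register enabling intra-query parallelism for this session
SELECT * FROM my_table;       -- my_table stands in for the 200-million-row table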

Pivot column in SSRS

I have a report in SSRS that is built from a stored procedure. The report currently shows data like below (it has additional columns as well):
What I want, at the back end (stored procedure level or in SSRS), is to filter the data for product_type A only, but still show the amount associated with product type F in a new column, as below:
Can anyone help me achieve this, please?
A simple LEFT JOIN of the table to itself will do the job:
SELECT t1.INVOICE_NO, t1.PRODUCT_TYPE, t1.AMOUNT, t2.AMOUNT AS FEE_AMOUNT
FROM tbl t1
LEFT JOIN tbl t2 ON t1.INVOICE_NO=t2.INVOICE_NO AND t2.PRODUCT_TYPE='F'
WHERE t1.PRODUCT_TYPE='A'
You can also use a matrix to achieve the same thing on the SSRS side. The trade-off is essentially neutral: pivot the data on the SQL side and render it in an SSRS table, versus leave the data as-is in SQL and use a matrix to pivot it.

Dynamic FROM clause in Postgres

Using PostgreSQL 9.1.13, I've written the following query to calculate some data:
WITH windowed AS (
SELECT a.person_id, a.category_id,
CAST(dense_rank() OVER w AS float) / COUNT(*) OVER (ORDER BY category_id) * 100.0 AS percentile
FROM (
SELECT DISTINCT ON (person_id, category_id) *
FROM performances s
-- Want to insert a FROM clause here
INNER JOIN person p ON s.person_id = p.ident
ORDER BY person_id, category_id, created DESC
) a
WINDOW w AS (PARTITION BY category_id ORDER BY score)
)
SELECT category_id,percentile FROM windowed
WHERE person_id = 1;
I now want to turn this into a stored procedure, but my issue is that in the middle there, where I showed the comment, I need to place a dynamic WHERE clause. For example, I'd like to add something like:
WHERE p.weight > 110 OR p.weight IS NULL
The calling application lets people pick filters, so I want to be able to pass the appropriate filters into the query. There could be zero or many filters, depending on the caller, but I could pass them all in as a properly formatted WHERE clause in a string parameter, for example.
The calling application just sends values to a webservice, which then builds the string and calls the stored procedure, so SQL injection attacks won't really be an issue.
Too many cooks spoil the broth.
Either let your webservice build the SQL statement or let Postgres do it. Don't use both on the same query. That leaves two possible weak spots for SQL injection attacks and makes debugging and maintenance a lot harder.
Here is a full code example for a plpgsql function that builds and executes a SQL statement dynamically while making SQL injection impossible (posted just two days ago):
Robust approach for building SQL queries programmatically
Details heavily depend on exact requirements.
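Applied to the query above, a minimal sketch of that pattern might look like the following. The function name, the parameter names, and the single optional weight filter are assumptions made for this illustration (the linked answer covers the general approach); the filter values travel through EXECUTE ... USING, so they are never concatenated into the SQL text:
CREATE OR REPLACE FUNCTION percentile_for_person(_person_id int, _min_weight numeric DEFAULT NULL)
  RETURNS TABLE (category_id int, percentile float)
  LANGUAGE plpgsql AS
$func$
DECLARE
   -- Static part of the statement; only fixed, trusted SQL text is concatenated.
   _sql text := '
      WITH windowed AS (
         SELECT a.person_id, a.category_id,
                CAST(dense_rank() OVER w AS float)
                / COUNT(*) OVER (ORDER BY a.category_id) * 100.0 AS percentile
         FROM  (
            SELECT DISTINCT ON (s.person_id, s.category_id) *
            FROM   performances s
            JOIN   person p ON s.person_id = p.ident
            WHERE  TRUE';
BEGIN
   -- Append optional filters as fixed SQL text; the values are bound via USING.
   IF _min_weight IS NOT NULL THEN
      _sql := _sql || ' AND (p.weight > $2 OR p.weight IS NULL)';
   END IF;

   _sql := _sql || '
            ORDER  BY s.person_id, s.category_id, s.created DESC
            ) a
         WINDOW w AS (PARTITION BY a.category_id ORDER BY a.score)
      )
      SELECT category_id, percentile FROM windowed
      WHERE  person_id = $1';

   RETURN QUERY EXECUTE _sql USING _person_id, _min_weight;
END
$func$;
-- Example call, filtering on weight > 110 (or NULL):
-- SELECT * FROM percentile_for_person(1, 110);
The webservice then only ever passes plain values (person id, weight threshold) to the function, never SQL fragments, which is what keeps injection off the table.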