MicroStrategy Lookup Table join

1. Lookup table (State)
In this table I have 4 columns: Region_id, RegionName, State_id, State_name.
2. Fact table
In this table I have a couple of columns like customer, product, date, Region_id, State_id, Revenue, Profit.
The lookup table contains many states, whereas the fact table only has data for a few of them.
My requirement is to display all the states in the report even though there is no corresponding data available in the fact table. I have set the State table as the lookup for the State attribute.
If I pull the State attribute and a metric from the fact table, and set the VLDB property to preserve lookup table elements, I get all the states with their metric values, and null for the ones that have no metric value.
Now, if I add another attribute such as Customer or Product along with them, I no longer get all the states; I only get the states that have data in the fact table, because once another attribute is added MicroStrategy does not hit the State table at all and brings the data from the fact table itself.
What could be done so that I can get all the states, with nulls, even with Customer and Product on the report?
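For reference, a rough DDL sketch of the two tables as described above (the table names LU_STATE and FACT_SALES are placeholders, not the actual warehouse names):

-- Placeholder lookup table for the State attribute
CREATE TABLE LU_STATE (
    Region_id   int,
    RegionName  varchar(50),
    State_id    int,
    State_name  varchar(50)
);

-- Placeholder fact table; only some of the states appear here
CREATE TABLE FACT_SALES (
    customer   int,
    product    int,
    date       date,
    Region_id  int,
    State_id   int,
    Revenue    decimal(18,2),
    Profit     decimal(18,2)
);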

You could try forcing your metrics to outer-join on the dataset itself while preserving final pass elements to make a left join.
Within Developer, navigate to your dataset/cube and open it in editable mode.
Select "Data" within the top bar of the cube/dataset editor >> Report Data Options >> Metric Join Type.
Here you will see your Metrics. Change the join type from inner to outer. Rerun the report and it should bring back all the data you need.
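As a rough illustration of what this changes (using the placeholder table names from above, not the exact SQL MicroStrategy will generate): with an inner metric join the report SQL only keeps the states present in the fact table, whereas an outer metric join preserves the lookup rows, roughly like this:

SELECT s.State_name,
       SUM(f.Revenue) AS Revenue
FROM LU_STATE s
LEFT OUTER JOIN FACT_SALES f
    ON s.State_id = f.State_id
GROUP BY s.State_name;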
Let me know how it goes.

Related

SQLAlchemy way to handle select based on outdated data

Assume I have a table sets with a field filters containing an array of key-value mappings, a table items against which a select query must be run to extract rows based on those filters, and an association table for the M:M relations that link each set to each item. I am looking for a method or mechanism to cancel the select query if sets.filters has been updated; otherwise the M:M relation will be built invalid, based on filters that have not yet been refreshed.
The concrete scenario where the problem occurs is:
Receive a file with item data, parse it, and insert into items, returning the new relevant ids (primary keys here);
After insertion, select the filters from the relevant sets;
Take the item ids and select from items using the filters;
Update the M:M association table for all the items returned at step 3.
Unfortunately, between steps 3 and 4 (or even earlier), an API call makes an update on one of the sets rows, changing its filters. As a result the M:M table is invalid, because one filter was changed (let's say the filters contained an expression like weight <= 100 kilos, but after the mentioned update it has become weight <= 50 kilos, so if there are some new items with weight greater than 50, those item ids should obviously not be in the M:M table).
Is there some efficient way to cancel the select query from items during the transaction? Or maybe there is a stronger query to use. My idea is to roll back the changes after the fact by checking the sets.modified_at column, but that seems like doing additional work that wastes disk and CPU time.
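To make the workflow concrete, here is a rough SQL sketch of steps 1-4 with that post-factum check folded into step 4 (the table sets_items and the columns weight, filters and modified_at are assumptions, as are the :set_id and :filters_read_at placeholders):

BEGIN;

-- Step 1: insert the parsed items and capture their new ids
INSERT INTO items (weight) VALUES (42), (120)
RETURNING id;

-- Step 2: read the relevant set's filters, remembering when they were read
SELECT id, filters, modified_at
FROM sets
WHERE id = :set_id;

-- Step 3: select the item ids using the filter read in step 2
SELECT id FROM items WHERE weight <= 100;

-- Step 4: build the M:M rows, but insert nothing if the set has been
-- modified since its filters were read
INSERT INTO sets_items (set_id, item_id)
SELECT :set_id, i.id
FROM items AS i
WHERE i.weight <= 100
  AND NOT EXISTS (
      SELECT 1 FROM sets
      WHERE id = :set_id
        AND modified_at > :filters_read_at
  );

COMMIT;

Note that under the default READ COMMITTED isolation this check can still race with a concurrent update, which is essentially the problem being asked about.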

data merging in Tableau

I have two sheets in Excel. One has CBG (neighborhood) IDs as shown below.
The second sheet has state and county names and IDs as shown below.
Now, the first 5 digits of the CBG ID are just the corresponding state and county IDs for that CBG.
I need to join this data together in Tableau so that I would have the state and county on the CBG sheet for each CBG.
Basically, I tried to blend the data and it didn't work. I also tried to perform a join calculation using the 5-digit code in the second sheet and the LEFT function to extract the first 5 digits of the CBG code, but that didn't seem to work either.
To fix it, I just needed to correct the join calculation on both sides of the join.
Also, it seems that both variables to be joined need to be the same data type.
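In SQL terms, the join calculation amounts to something like the following (the table and field names are illustrative; in Tableau the LEFT/STR expressions go into the join calculation dialog on each side of the join):

SELECT cbg.cbg_id,
       county.state_name,
       county.county_name
FROM cbg_sheet AS cbg
LEFT JOIN county_sheet AS county
    -- compare the first 5 digits of the CBG id against the county id,
    -- with both sides cast to the same (string) data type
    ON LEFT(CAST(cbg.cbg_id AS varchar(12)), 5) = CAST(county.state_county_id AS varchar(5));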
The data that you analyze in Tableau is often made up of a collection of tables that are related by specific fields (that is, columns). Joining is a method for combining the related data on those common fields. The result of combining data using a join is a virtual table that is typically extended horizontally by adding columns of data.
When joining tables, the fields that you join on must have the same data type. If you change the data type after you join the tables, the join will break.
Please go through the below steps for joining tables:
In Tableau Desktop: on the start page, under Connect, click a connector to connect to your data. This step creates the first connection in the Tableau data source.
In web authoring: Select New Workbook and connect to your data. This step creates the first connection in the Tableau data source.
Select the file, database, or schema, and then double-click or drag a table to the canvas.
Double-click or drag another table to the canvas, and then click the join relationship to add join clauses and select your join type.
Add one or more join clauses by selecting a field from one of the available tables used in the data source, a join operator, and a field from the added table. Inspect the join clause to make sure it reflects how you want to connect the tables.
When you are finished, close the Join dialog.
Thank you.

How to filter a dashboard based on quick filter values selected in Tableau?

I have Dashboard-1 with a data source from SQL Server Table-A, which has columns Col1, Col2, Col3.
Now I'm creating a new Dashboard-2 with Table-B as the data source, which has columns Col1, Col4, Col5.
But Col1, which is common to both tables, doesn't have common data.
E.g. Col1 from Table-A has records up to 100 and Table-B has records from 101 onward. Also, the data is not static; it keeps increasing in Table-B. Table-A is no longer being populated, but we still need the data from it.
Problem 1: how to merge the two columns into a single column for a filter in Tableau.
Problem 2: in the dashboard I need to show a single filter as a union of Col1 from both tables; if the user selects a value < 100 then Dashboard-1 should open, otherwise Dashboard-2.
Can someone suggest the correct approach?
1) Instead of merging after you have brought the data in, try merging the data using a SQL UNION (see the sketch after this list).
2) If that's not possible, do the same after importing both datasets into Tableau; for an example, see this official link.
3) Try different joins to see which one works for merging your table columns.
4) If all of the above fails, try setting up an Action Filter as explained in this link. Essentially you have to use Tiled Containers instead of Floating Containers and set up an action filter using a custom Parameter. This custom Parameter will help display Dashboard-1 when the user selects < 100 in the filter (for example) and Dashboard-2 when the user selects > 100 (again, for example).
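For point 1, a minimal UNION sketch, assuming both tables are reachable from the same SQL Server connection and using the column names from the question:

SELECT Col1, Col2, Col3, NULL AS Col4, NULL AS Col5, 'Table-A' AS SourceTable
FROM [Table-A]
UNION ALL
SELECT Col1, NULL AS Col2, NULL AS Col3, Col4, Col5, 'Table-B' AS SourceTable
FROM [Table-B];

The combined Col1 can then back a single filter, and SourceTable (or a < 100 test on Col1) can drive which dashboard is shown.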

Filter and display database audit / changelog (activity stream)

I'm developing an application with SQLAlchemy and PostgreSQL. Users of the system modify data in 8 or so tables. Consider this contrived example schema:
I want to add visible logging to the system to record what has changed, but not necessarily how it has changed. For example: "User A modified product Foo", "User A added user B" or "User C purchased product Bar". So basically I want to store:
Who made the change
A message describing the change
Enough information to reference the object that changed, e.g. the product_id and customer_id when an order is placed, so the user can click through to that entity
I want to show each user a list of recent and relevant changes when they log in to the application (a bit like the main timeline in Facebook etc). And I want to store subscriptions, so that users can subscribe to changes, e.g. "tell me when product X is modified", or "tell me when any products in store S are modified".
I have seen the audit trigger recipe, but I'm not sure it's what I want. That audit trigger might do a good job of recording changes, but how can I quickly filter it to show recent, relevant changes to the user? Options that I'm considering:
Have one column per ID type in the log and subscription tables, with an index on each column
Use full text search, combining the ID types as a tsvector
Use an hstore or json column for the IDs, and index the contents somehow
Store references as URIs (strings) without an index, and walk over the logs in reverse date order, using application logic to filter by URI
Any insights appreciated :)
Edit: It seems what I'm talking about is an activity stream. The suggestion in this answer to filter by time first sounds pretty good.
Since the objects all use uuid for the id field, I think I'll create the activity table like this:
Have a generic reference to the target object, with a uuid column with no foreign key, and an enum column specifying the type of object it refers to.
Have an array column that stores generic uuids (maybe as text[]) of the target object and its parents (e.g. parent categories, store and organisation), and search the array for matching subscriptions. That way a subscription for a parent category can match a child in one step (denormalised).
Put a btree index on the date column, and (maybe) a GIN index on the array UUID column.
I'll probably filter by time first to reduce the amount of searching required. Later, if needed, I'll look at using GIN to index the array column (this partially answers my question "Is there a trick for indexing an hstore in a flexible way?")
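A minimal sketch of those two tables under these assumptions (uuid[] rather than text[] for the reference array, the object type shown as plain text rather than a real enum, and the subscription columns inferred from the queries below):

CREATE TABLE activity (
    id          uuid PRIMARY KEY,
    created     timestamptz NOT NULL,
    object_type text NOT NULL,     -- which kind of object the activity refers to
    object_id   uuid NOT NULL,     -- generic reference, no foreign key
    object_ref  uuid[] NOT NULL,   -- target object plus its parents (category, store, organisation)
    title       text,              -- denormalised title, see the note at the end
    message     text
);

CREATE TABLE subscription (
    user_id    uuid NOT NULL,
    object_id  uuid NOT NULL,      -- matched against entries of activity.object_ref
    subscribed boolean NOT NULL DEFAULT true
);

CREATE INDEX activity_created_idx ON activity USING btree (created);
CREATE INDEX activity_object_ref_idx ON activity USING gin (object_ref);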
Update: this is working well. The SQL to fetch a timeline looks something like this:
SELECT *
FROM (
    SELECT DISTINCT ON (activity.created, activity.id)
        *
    FROM activity
    LEFT OUTER JOIN unnest(activity.object_ref) WITH ORDINALITY
        AS act_ref(act_ref, ordinality)
        ON true
    LEFT OUTER JOIN subscription
        ON subscription.object_id = act_ref.act_ref
    WHERE activity.created BETWEEN :lower_date AND :upper_date
      AND subscription.user_id = :user_id
    ORDER BY activity.created DESC,
             activity.id,
             act_ref.ordinality DESC
) AS sub
WHERE sub.subscribed = true;
Joining with unnest(...) WITH ORDINALITY, ordering by ordinality, and selecting distinct on the activity ID filters out activities that have been unsubscribed from at a deeper level. If you don't need to do that, then you can avoid the unnest and just use the array containment @> operator, with no subquery:
SELECT *
FROM activity
JOIN subscription
    ON activity.object_ref @> ARRAY[subscription.object_id]
WHERE subscription.user_id = :user_id
  AND activity.created BETWEEN :lower_date AND :upper_date
ORDER BY activity.created DESC;
You could also join with the other object tables to get the object titles - but instead, I decided to add a title column to the activity table. This is denormalised, but it doesn't require a complex join with many tables, and it tolerates objects being deleted (which might be the action that triggered the activity logging).

Reporting Services and Dynamic Fields

I'm new to Reporting Services, so this question might be insane. I am looking for a way to create an empty 'template' report (that is basically a form letter) rather than having to create one for every client in our system. Part of this form letter is a section that can include any number of 25 specific fields. The section is arranged as such:
Name: Jesse James
Date of Birth: 1/1/1800
Address: 123 Blah Blah Street
Anywhere, USA 12345
Another Field: Data
Another Field2: More Data
Those (and any of the other fields the client specifies) could be arranged in any order, and the label on the left could be whatever the client decides (for example, 'DOB' instead of 'Date of Birth'). Ideally, I'd like to have a web interface where you can click on the fields you want, specify the order in which they'll appear, and specify what the custom label is. I figured out a way to specify the labels and order them (and load them 'dynamically' in the report), but I wanted to take it one step further if I could and allow dynamic field (right side) selection and ordering. The catch is, I want to do this without using dynamic SQL.

I went down the path of having a configuration table that contained an ordinal, custom label text, and the actual column name, and attempting to join that table with the table that actually contains the data via information_schema.columns. Maybe querying ALL of the potential fields and having an INNER JOIN do my filtering (if there's a match from the 'configuration' table, etc.). That doesn't work like I thought it would :) I guess I was thinking I could simulate the functionality of a dataset (it having the value and field name baked into the object). I realize that this isn't the optimal tool for attempting such a feat; it's just what I'm forced to work with.
The configuration table would hold the configuration for many customers/reports, and I would be filtering by a customer ID. The config table would look something like this:
CustID  LabelText      ColumnName  Ordinal
1       First Name     FName       1
1       Last Name      LName       2
1       Date of Birth  DOBirth     3
2       Client ID      ClientID    1
2       Last Name      LName       2
2       Address 1      Address1    3
2       Address 2      Address2    4
All that to say:
Is there a way to pull off the above-mentioned query?
Am I being too picky about not using dynamic SQL as the section in question will only be pulling back one row? However, there are hundreds of clients running this report (letter) two or three times a day.
Also, keep in mind I am not trying to dynamically create text boxes on the report. I will either just concatenate the fields into a single string and dump that into a text box or I'll have multiple reports each with a set number of text boxes expecting a generic field name ("field1",etc). The more I type, the crazier this sounds...
If there isn't a way to do this I'll likely finagle something in custom code; but my OCD side wants to believe there is SQL beyond my current powers that can do this in a slicker way.
Not sure why you need this all returned in one row; it seems like SSRS would want this normalized further: return a row for every row in the configuration table for the current report. If you really need to concatenate, then do that in embedded code in the report, or consider just putting a table in the form letter. The query below makes some assumptions about your configuration table. Does it only hold the configuration for the current report, or does it hold the config for many customers/reports at once? Also, you didn't give much info about how you'll filter to the appropriate record, so I just used a customer ID.
SELECT
    config.Ordinal,
    config.LabelText,
    -- Map each configured column name to the matching value from the data record.
    -- Non-character columns are converted so every CASE branch returns the same type
    -- (assuming ClientID and DOB are not already stored as character data).
    CASE config.ColumnName
        WHEN 'FName'    THEN DataRecord.FirstName
        WHEN 'LName'    THEN DataRecord.LastName
        WHEN 'ClientID' THEN CONVERT(varchar(20), DataRecord.ClientID)
        WHEN 'DOBirth'  THEN CONVERT(varchar(10), DataRecord.DOB, 101)
        WHEN 'Address'  THEN DataRecord.Address
        WHEN 'Field'    THEN DataRecord.Field
        WHEN 'Field2'   THEN DataRecord.Field2
        ELSE NULL
    END AS response
FROM ConfigurationTable AS config
LEFT OUTER JOIN DataTable AS DataRecord
    ON config.CustID = DataRecord.CustomerID
WHERE DataRecord.CustomerID = @CustID
ORDER BY config.Ordinal
There are other ways to do this, in SSRS or in SQL, depending on more details of your requirements.