Can we add data to a continuous view in pipelinedb externally - pipelinedb

I would like to add data to a specific continuous view. I do not want to feed it through the stream, because I want to add it only to this specific view without disturbing the others.
I have tried adding rows to the cv_mrel table directly, but I was unable to because some columns of the view are of hll (HyperLogLog) type.
Is there any way, or a function, by which I can create/cast to this data structure from a value?

PipelineDB exposes functions for building and manipulating HLLs:
http://docs.pipelinedb.com/builtin.html#hyperloglog-functions
So in your case I'm guessing you'd want to build representative HLLs and then insert those into the _mrel table directly.
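A hedged sketch of what that could look like, using hll_agg() and hll_cardinality() from the docs page above. The table and column names (my_cv_mrel, day, uniques) are invented for illustration, and your _mrel table may have extra internal columns (such as a generated primary key) that also need to be accounted for:
-- Build an HLL from some representative values and insert it directly.
INSERT INTO my_cv_mrel (day, uniques)
SELECT DATE '2017-01-01', hll_agg(user_id)
FROM (VALUES (1), (2), (3)) AS v(user_id);
-- Sanity-check what was stored.
SELECT day, hll_cardinality(uniques) FROM my_cv_mrel WHERE day = DATE '2017-01-01';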

Related

Dynamically Refresh Schema in SSAS tabular cube

Is there any command to refresh the cube schema (structure) every day before processing the SSAS tabular cube?
Requirement: a few column names in an object keep changing (being renamed), so if we rename those columns in the DB view, the change should automatically be reflected in the cube after the overnight processing.
I tried adding 'select * from an Object' in the cube table properties and processed the cube.
Later, I renamed a column in the view and then processed the cube; it failed because of the different column name.
Is it possible to dynamically refresh the schema without manually changing the structure in the solution and re-deploying?
Please suggest.
This answer is not going to solve your issue; however, best practice would be NOT to handle these changes in the SSAS data model. That should be a consistent layer for your users. I would handle these changes at the database or source-system end, maybe by checking if a column exists and, if not, adding a dummy row or some other process.
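As a loose sketch of that "handle it at the source" idea (purely illustrative T-SQL; the view, table, and column names are invented), a nightly job could keep the column the cube expects stable regardless of the rename:
IF COL_LENGTH('dbo.SourceTable', 'CustomerName') IS NOT NULL
    EXEC('ALTER VIEW dbo.vw_ForCube AS SELECT CustomerName FROM dbo.SourceTable')
ELSE
    -- Fall back to a dummy column so the schema the cube sees does not change.
    EXEC('ALTER VIEW dbo.vw_ForCube AS SELECT CAST(NULL AS nvarchar(100)) AS CustomerName FROM dbo.SourceTable')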

How to call a sas dataset by its label or where to check its name

I have a problem in dealing with SAS Enterprise Guide that runs on the server of my client.
I do not have access to the libraries, so in order to use the datasets the only thing we can do is store them on the local C: drive of the computer and drag them into SAS.
We cannot create libraries because the server does not read local paths.
Once you drag a table, let's call it "mydata", into SAS, the table is automatically renamed "mydata9865", with random numbers at the end, and "mydata" becomes its label.
If you right-click the table and go to Properties, you can't find the name of the table, just the label.
The only way I found to check the real name of the dataset is to open the Query Builder and check the name in the code preview.
The problem is that I am dealing with tables of millions of records and the machine I am using is very slow, so whenever I want to open the Query Builder just to check the table's name, it sometimes takes as much as an hour.
I am not a SAS expert, so I am sure there is a smarter way to do so. Is it possible for instance to use the table by calling it with its label?
data mydata2;
set mydata;
run;
instead of
set mydata9865?
Or is there some place I can rapidly check the name of the table without going through the query builder?
I tried to google it but I couldn't find anything; I hope someone will be able to help me!
Thank you in advance.
Hover the mouse pointer over a data node to see its attributes. The data set name is the File name: value.
For example:
In this example I had renamed the nodes created by two different queries to be the same (doable: yes; smart: maybe not). NOTE: a data node's Label: is not necessarily the same as its underlying data set's label metadata.
Regarding
use the table by calling it with its label?
Two nodes can have the same label, which is a situation that defeats this approach.
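If hovering over every node is impractical, a hedged programmatic alternative (assuming the copies end up in the WORK library on the server; adjust the libname filter if Enterprise Guide puts them elsewhere) is to list data set names next to their labels:
/* Match the label shown in Enterprise Guide to the real data set name. */
proc sql;
    select memname, memlabel
        from dictionary.tables
        where libname = 'WORK';
quit;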
Use the COPY task to upload your data explicitly. It sounds like you're not adding your data to the project properly, so SAS automatically assigns a name, which wouldn't happen if you explicitly imported or loaded your data.
Problem solved! I should have simply uploaded the data to the server with Tasks -> Data -> Upload Data Sets to Server, but I didn't know about this task, so I didn't know it was possible at all!
https://communities.sas.com/t5/SAS-Enterprise-Guide/Importing-sas-data-sets-from-C-drive-into-SAS-EG-not-possible/td-p/135184
Thank you everybody for your help!

Tableau performance

I have a problem with a dashboard in Tableau. The dashboard contains many worksheets, and all the columns in the report are calculated fields. The problem is that the dashboard takes a very long time to build: the report contains approximately 2 million rows and generating it takes about 5 minutes.
Tell me, what are the solutions in this case?
Maybe I can somehow adjust the page display so that it does not show all the records at once?
To reduce the calculation time, try to exclude data you don't need with a data source filter in Tableau. You can also hide or delete unused calculated fields. Another thing you can do is remove sheets that are not used.
Here's a link: https://www.tableau.com/about/blog/2016/1/5-tips-make-your-dashboards-more-performant-48574
Steps to follow to reduce calculation time:
Extract the data and keep the connection type as extract instead of live; also replace the data source with the extract.
Use a "User Filter" so that Tableau will only display a particular user's data.
I hope this helps solve your problem.
I have one more idea to resolve this issue.
1) When your dashboard loads for the first time, use a dashboard action filter so that the first load excludes the data in your sheets: Dashboard menu -> Actions -> Add Action -> select the sheet and the exclude option.
2) Switch the data source from live to extract (select the Extract radio button).
3) Use a user filter.
I am following the other answers (use an extract, a dashboard action filter, ...) and I want to add one point:
Drag every field used by any worksheet on the dashboard onto "Detail" of every worksheet you are using on the dashboard. Tableau then loads all the needed data while loading the first worksheet and can reuse this data for the other sheets.
I.e. if a dashboard contains three worksheets (A, B, C), you drag every field used by A onto "Detail" of B and C, every field used by B onto "Detail" of A and C, and every field used by C onto "Detail" of A and B.
We are also having a similar issue with 150 million rows, but I want to check whether you are doing the following steps. They may help you; this goes back to the fundamentals of Tableau reporting.
1/ Try to make sure your data set is in star-schema format. This will help the report a lot.
2/ Try to have tables and views in the DB that expose only the columns used in Tableau. Any extra columns in the tables add to the performance issue.
3/ Make sure indexing is done properly for all the fields that are joined (see the sketch after this list).
4/ In my experience the dashboard adds extra performance lag, so make sure you get as much performance tuning done on the sheets as possible before even going to the dashboard.
5/ If required, try to use materialized views.
Hope this helps.
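As a generic illustration of point 3/ above (plain SQL; the table and column names are placeholders, not anything from this question), indexing the join keys might look like this:
-- Index the columns that Tableau joins or filters on (names are invented).
CREATE INDEX ix_sales_customer_id ON sales (customer_id);
CREATE INDEX ix_customers_region_id ON customers (region_id);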
Try to capture performance metrics using the performance recorder option in Tableau.
Check the underlying DB tables and the joins present in the data source layer.
Try using optimized sets and parameters as required, and get rid of less relevant filters.
Try using data extracts with a scheduled refresh and a data source filter limited to the business years you actually need.

Getting updated data from table

I have recently started working with UI5. In my current task, I have displayed data in a table using a JSONModel, and by using a TextField in the template I am allowing the user to update the data.
I have to get the complete updated table contents back as JSON so that I can write it back to the database.
I have tried Table.getContextByIndex() and getProperty(), however I am not getting the updated data. Please let me know how this can be done.
For this purpose SAPUI5 provides two-way data binding. If a value is changed in the view, it is reflected in the corresponding model. Two-way data binding is the default binding mode. You can then use the model's methods to get the data.
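A minimal sketch of that approach (control and path names such as oTable and "/rows" are assumptions, not from the question): with two-way binding, edits made through the template's input field are written back into the JSONModel, so reading the model afterwards returns the updated rows.
var oModel = new sap.ui.model.json.JSONModel({ rows: aRowsFromBackend });
oModel.setDefaultBindingMode(sap.ui.model.BindingMode.TwoWay); // TwoWay is the default anyway
oTable.setModel(oModel);
oTable.bindRows("/rows");
// Later, e.g. in the handler of a "Save" button:
var aUpdatedRows = oModel.getProperty("/rows"); // plain objects containing the edited values
var sJson = oModel.getJSON();                   // the whole model serialized as a JSON string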
I guess I found the problem. As per the link below, formatter functions allow one-way conversion only. This may be the reason why I am unable to see the changes in the model.
Formatting Property Values
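For completeness, a hedged illustration of that point (the control, path, and values here are made up, and sap.m.Input stands in for the TextField from the question): a formatter only converts model to view, while a binding type both formats and parses, so user edits flow back into the model.
// One-way: the formatter's output is never parsed back into the model.
new sap.m.Input({
    value: {
        path: "/rows/0/price",
        formatter: function (v) { return v + " EUR"; }
    }
});
// Two-way: a type formats and parses, so edits update the JSONModel.
new sap.m.Input({
    value: {
        path: "/rows/0/price",
        type: new sap.ui.model.type.Float()
    }
});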

Visio 2013: How to trigger a change in databinding of all shapes

I have a nice process overview for our ordering process in Visio. I have an external data source (SQL Server), which works fine. Every record in my data source represents one ordering process. Currently all my shapes of the process are linked to the first record of the data source.
Now I want to add a dynamic behavior. What I want to achieve is this:
A user provides the order reference in a textbox (order reference is a column in the data source)
Afterwards the user clicks a button
After the button click, the process is updated and all shapes are now linked to the external data source record that matches the provided order reference
So in short: the user should be able to select which process that needs to be visualized.
I assume that this is common functionality, but I don't see how I can deal with this requirement. I've already been searching on this issue for some days, but without any success.
Can you help me with this issue?
Thanks a lot!
Problem solved :-)
Some old-school VBA was required. Using the DataRecordset object did the trick. It contains a method, GetDataRowIDs, that you can use to query the external dataset. Once you have the record to visualize, it's just a matter of dynamically updating the shapes with the correct record. Use macro recording to see how to do this.
MSDN: http://msdn.microsoft.com/en-us/library/office/ms195694(v=office.12).aspx
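A hedged VBA sketch of that approach (the recordset index, the OrderReference column name, and re-linking every shape on the active page are assumptions made for illustration):
Sub RelinkShapesToOrder(orderRef As String)
    Dim drs As Visio.DataRecordset
    Dim rowIDs() As Long
    Dim shp As Visio.Shape

    ' Assumes the SQL Server data is the first linked recordset in this document.
    Set drs = ThisDocument.DataRecordsets(1)

    ' Find the data row whose order-reference column matches the user's input.
    rowIDs = drs.GetDataRowIDs("[OrderReference] = '" & orderRef & "'")

    ' Re-link every shape on the active page to that row
    ' (add your own handling for the case where no row matches).
    For Each shp In ActivePage.Shapes
        shp.LinkToData drs.ID, rowIDs(LBound(rowIDs))
    Next shp
End Sub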