Dynamic parameters in Tableau

I am looking for a solution to dynamically update a parameter in Tableau, so that it automatically populates new or changed values once the data is refreshed.
Just FYI: currently my parameter gets its values from a field that holds date values in string format (e.g. 2016Q1).
Request help.

This is not currently possible without serious XML hacking and scripting. There have been multiple requests for this feature from Tableau users, and it might be added in the future.
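If you do want to go the XML-hacking route, the usual approach is to script a rewrite of the parameter's list of values inside the .twb file whenever the data refreshes, then republish the workbook (for example with tabcmd publish). Below is a minimal Java sketch of that idea; the element and attribute names (column, caption, members, member, value) are assumptions based on what a parameter typically looks like in a workbook's XML, so inspect your own .twb before relying on them.

    import java.io.File;
    import java.util.List;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class RefreshParameterDomain {
        // Rewrite the list of values of a string parameter (e.g. "Quarter") inside a .twb file.
        public static void refresh(File twb, String paramCaption, List<String> newValues) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(twb);
            NodeList columns = doc.getElementsByTagName("column");
            for (int i = 0; i < columns.getLength(); i++) {
                Element col = (Element) columns.item(i);
                if (!paramCaption.equals(col.getAttribute("caption"))) continue;  // only touch the parameter column
                // Drop the old <members> block and rebuild it from the freshly queried values.
                NodeList members = col.getElementsByTagName("members");
                if (members.getLength() > 0) col.removeChild(members.item(0));
                Element rebuilt = doc.createElement("members");
                for (String v : newValues) {
                    Element m = doc.createElement("member");
                    m.setAttribute("value", "\"" + v + "\"");  // string parameter values appear quoted in the XML
                    rebuilt.appendChild(m);
                }
                col.appendChild(rebuilt);
            }
            TransformerFactory.newInstance().newTransformer()
                              .transform(new DOMSource(doc), new StreamResult(twb));
        }
    }

You would call refresh(new File("dashboard.twb"), "Quarter", valuesQueriedFromTheSource) from whatever job runs after your data refresh, then republish the workbook.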

Related

Talend: how can we estimate the tAggregateSortedRow record count parameter value?

We are trying out Talend and we wanted to aggregate some sorted data on a few keys.
Simple enough, but when we try to use tAggregateSortedRow it asks for the exact number of rows to be specified.
I am not sure how anyone can input this on the fly. Doesn't this value change for every run? Am I missing something? Surely they can't expect us to know the total record count before we run the job.
This has to do with how the Talend component tAggregateSortedRow is programmed. To avoid it omitting data, you need to provide the record count. A few other users have asked the same question:
https://www.talendforge.org/forum/viewtopic.php?id=50094
https://www.talendforge.org/forum/viewtopic.php?id=54231
https://www.talendforge.org/forum/viewtopic.php?id=7641
which I found simply by using Google.
Anyway, if you need to do sorting and aggregating, consider using the components tSortRow and tAggregateRow separately. It should work fine.
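To make that recommendation concrete, here is a rough Java sketch of what chaining tSortRow into tAggregateRow amounts to (Talend jobs compile down to Java anyway): sort on the grouping key, then walk the rows and build one aggregate per key. Note that the total record count never has to be known in advance; the field names below are made up.

    import java.util.Comparator;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class SortThenAggregate {
        record Row(String key, double amount) {}  // stand-in for a Talend schema with a key and a value

        // Equivalent of tSortRow (sort on the key) followed by tAggregateRow (SUM per key).
        public static Map<String, Double> totalsByKey(List<Row> rows) {
            rows.sort(Comparator.comparing(Row::key));            // tSortRow
            Map<String, Double> totals = new LinkedHashMap<>();   // tAggregateRow with a SUM operation
            for (Row r : rows) {
                totals.merge(r.key(), r.amount(), Double::sum);
            }
            return totals;
        }
    }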

PowerApps datasource to overcome 500 visible or searchable items limit

For PowerApps, what data sources, other than SharePoint lists, are accessible via PowerShell?
There are actually two issues that I am dealing with. The first is dynamic updating and the second is the 500-item limit that SharePoint lists are subject to.
I need to dynamically update my data source, which I am currently doing with PowerShell. My data source is not static, and updating records by hand is time-consuming and error prone. The driving force behind my question is that although the SharePoint list view threshold is 5,000 records, you are limited to 500 visible and searchable records when using SharePoint lists in the gallery view, and my data source contains more than 500 but fewer than 1,000 records. Any items beyond the 500th record that should match the filter criteria will not be found. So SharePoint lists are not an option for me until that limitation is remediated.
Reference: https://powerapps.microsoft.com/en-us/tutorials/function-filter-lookup/
To your first question: PowerShell can be used for almost anything on the Microsoft stack. You could use SQL Server, Dynamics 365, SharePoint, Azure, and in the future there will be an SDK for the Common Data Service. There are a lot of connectors, and PowerShell can work with a good majority of them.
Take note that working with these data sources through PowerShell is independent of PowerApps. PowerApps just takes whatever data the connector gives it, so if you have something updating the data in the background (PowerShell, a cron job, etc.), you can use a Timer control that calls Refresh() on your data source (e.g. from the timer's OnTimerEnd property) to keep the list up to date every ~5-20 seconds.
To your second question about SharePoint: an article came out around the time you asked this regarding working with large lists. I wouldn't say it completely solves your question, but it suggests that using the "Filter" function on basic column types may work for you:
...if you’d like to filter the set of items that you are showing in the gallery control, you will make use of a “Filter” expression, rather than the “Search” expression, which is the default that existing apps used. With our changes, SharePoint connector now supports “equals” type of queries on columns that support filtering (Single line of text, choice, numbers, dates and people), so make sure that the columns and the expressions you use are supported and watch for the same warning to avoid reverting back to the top 500 items.
It also notes that if you want to pull from a list larger than the 5k threshold, you would need to use indexes. I have not fully tested this yet, but it seems this could potentially solve your problem.

How to assign values to variables in Tableau

I have a field named Datavalue 2 and I have to manually assign a value to that field. How do I write the calculation for this? Any leads will be helpful.
Thanks in advance
Tableau is a read-only tool. It does not allow you to make persistent modifications to the original source data.
If you want to derive a value of a field that can change during the life of a Tableau visualization, but leave the original data source unchanged, that is entirely possible. The Tableau features that help with this are calculated fields and parameters. Parameters can be set interactively by the user. Both features are documented in the on-line help.
Tableau has some tricky issues, but it is possible to work with variables as well. I am assuming that your data is connected through a database.
In my experience, I was connected to data from an MSSQL server and needed to use variables for some data cleaning.
The solution I came upon hinged on the difference between using a 'Custom SQL' query and 'Initial SQL'.
Tableau will not allow declared variables, temp tables, or common table expressions in 'Custom SQL', but it will allow all of those in the 'Initial SQL' area.
Under 'Connections', Tableau should show the server connection; right-clicking on it lets you navigate to the 'Initial SQL' box.
After that, I had to do a little fiddling with the code, but ultimately it was a successful process. Hope that helps the OP or any newer viewers; there weren't quite enough tags to determine more about the issue.
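For anyone who wants to sanity-check an Initial SQL batch before pasting it into Tableau, here is a minimal JDBC sketch that mimics what Tableau does: run the Initial SQL once when the connection opens, then run the Custom SQL query on that same session. The table, column, and connection details are placeholders, not anything from the original question.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class InitialSqlCheck {
        public static void main(String[] args) throws Exception {
            // Declared variables and temp tables are fine here because Initial SQL runs as its own
            // batch at connection time; the #cleaned temp table then lives for the rest of the session.
            String initialSql =
                "DECLARE @cutoff date = '2016-01-01'; " +
                "SELECT id, LTRIM(RTRIM(raw_value)) AS clean_value " +
                "INTO #cleaned FROM dbo.RawData WHERE loaded_on >= @cutoff;";
            // The Custom SQL query just reads the temp table that Initial SQL built.
            String customSql = "SELECT id, clean_value FROM #cleaned";

            try (Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://localhost;databaseName=Sandbox;integratedSecurity=true")) {
                try (Statement st = con.createStatement()) {
                    st.execute(initialSql);              // what Tableau runs as Initial SQL
                }
                try (Statement st = con.createStatement();
                     ResultSet rs = st.executeQuery(customSql)) {  // what Tableau runs as Custom SQL
                    while (rs.next()) {
                        System.out.println(rs.getInt("id") + "  " + rs.getString("clean_value"));
                    }
                }
            }
        }
    }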

iReport: Setting Parameter Values from Query MongoDB

I am fairly new to JasperReports and iReport and am struggling with something, which seems should be basic.
If you use MongoDB then you know it does not support the concept of a 'JOIN'. Therefore, from the iReport main dataset query I want to set a parameter/variable from the results. Then I want to use the collection of values I just set as a query parameter/variable in a different dataset (not a table or list component, just a plain, simple dataset I create, which will also query MongoDB as its source).
It seems this would be a straightforward use case, but I don't see anything intuitive in iReport that would do this. Can this be done? If so, any clues you can give me would be wonderful and greatly appreciated.
Do you want to pass the values as a collection from one report to the other?
This can be done by writing the following in your filter expression: $P{parameter_name}.contains($F{field_name}). Additionally, you need to create a parameter with the same parameter_name and with class type java.util.Collection.
Now this report is ready to receive parameters as collections. This works for MongoDB; I have tried it out. Since you have already said that you are able to send the collection from the main report, the above method will work for receiving the parameter in the second report.
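If it helps, here is a minimal Java sketch of the collection hand-off when the second report is filled from code; inside iReport you would wire the same thing up as a dataset or subreport parameter. The report file name and parameter name ("selected_ids") are placeholders, and the parameter must be declared in the report with class java.util.Collection so that a filter expression like $P{selected_ids}.contains($F{field_name}) compiles.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import net.sf.jasperreports.engine.JREmptyDataSource;
    import net.sf.jasperreports.engine.JasperFillManager;
    import net.sf.jasperreports.engine.JasperPrint;

    public class FillWithCollectionParam {
        public static JasperPrint fill(List<String> idsFromMainQuery) throws Exception {
            Map<String, Object> params = new HashMap<>();
            params.put("selected_ids", idsFromMainQuery);  // any java.util.Collection works here
            // If the second report embeds its own MongoDB query, supply the connection the way your
            // MongoDB query executer is configured; the empty data source below is only a stand-in.
            return JasperFillManager.fillReport("second_report.jasper", params, new JREmptyDataSource());
        }
    }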

Mass Changing Fields in Crystal Reports

I'm currently going through all of our Crystal Reports and changing them to read from stored procedures instead of having the joins/tables inside the report itself.
The problem is, I have to manually remove and then re-add the fields. Is there a way to programmatically or mass-change the report fields to avoid wasted man-hours? Assume each field on the report exists under a (slightly) different name in the stored procedure.
Unfortunately, there isn't an easy way to do this. You can try going to Database > Set Datasource Location and replacing each table with the stored procedure. Each time you do this, you should be prompted to map each field that doesn't have a corresponding entry with the same field name.
Though I think Crystal will try to alias the stored procedure multiple times instead of pointing everything to the same one.
I meant to elaborate: when you come to designing more complex reports, it's a kind of "best practice" to create formula fields for EVERY field you use in the report. This makes life a lot easier when you come to do something like this in the future.