Can Row Level Security (RLS) be applied programmatically in Power BI?

We can apply Row Level Security (RLS) in Power BI by clicking the Manage Roles button, but I was wondering if this can also be done programmatically, for example through a REST API?
Also, that article says there's a known issue where you'll get an error message if you try to publish a previously published report from Power BI Desktop. If it's possible to do RLS programmatically, would that issue still remain, meaning we would still have to perform some additional manual step?
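For reference, one documented way to exercise RLS programmatically is the Reports - Generate Token REST call used for Power BI Embedded, which passes an effective identity (a username plus RLS roles) with the request; the roles themselves still have to be defined via Manage Roles in Power BI Desktop first. Below is a minimal sketch only: the workspace, report, and dataset IDs, the SalesRegionRole role name, and the PBI_AAD_TOKEN environment variable are all placeholders/assumptions.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class EmbedTokenWithRls
{
    static async Task Main()
    {
        var groupId  = "<workspace-id>";  // placeholder
        var reportId = "<report-id>";     // placeholder
        var aadToken = Environment.GetEnvironmentVariable("PBI_AAD_TOKEN"); // AAD bearer token

        // Effective identity: the user and the RLS role(s) to apply for this token.
        var body = @"{
            ""accessLevel"": ""View"",
            ""identities"": [{
                ""username"": ""user@contoso.com"",
                ""roles"": [""SalesRegionRole""],
                ""datasets"": [""<dataset-id>""]
            }]
        }";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", aadToken);

        var url = "https://api.powerbi.com/v1.0/myorg/groups/" + groupId +
                  "/reports/" + reportId + "/GenerateToken";
        var resp = await client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine(await resp.Content.ReadAsStringAsync()); // embed token that honors the RLS role
    }
}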

Related

Prevent users from creating new work items in Azure DevOps

I've been looking at organisation and project settings but I can't see a setting that would prevent users from creating work items in an Azure DevOps project.
I have a number of users who refuse to follow the guidelines we set out for our projects, so I'd like to inconvenience them and the wider project team enough that following the guidelines becomes the easier option. At the moment we've got one-word user stories and/or tasks with estimates of 60-70 hours, which isn't reflective of the way we should be planning.
I'd still want them to be able to edit stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
The Azure DevOps Aggregator project allows you to write simple scripts that get triggered when a work item is created or updated. It uses a service hook to trigger when such an event occurs and abstracts most of the API-specific stuff away, providing you with an instance of the work item to interact with directly.
You can't block the creation or update from such a policy; Azure DevOps informs the aggregator too late in the creation process to do so. But you can revert changes, close the work item, etc. There are also a few utility functions to send email.
You need to install the aggregator somewhere; it can be hosted in Azure Functions, and we provide a Docker container you can spin up anywhere you want. Then link it to Azure DevOps using a PAT with sufficient permissions and write your first policy.
A few sample rules can be found in the aggregator docs.
store.DeleteWorkItem(self);
should put the work item in the Recycle Bin in Azure DevOps. You can create a code snippet around it that checks the creator of the work item (self.CreatedBy.Id) against a list of known bad identities.
Be mindful that when Azure DevOps creates a new work item, the Created and Updated events may fire in rapid succession (this is caused by the mechanism that sets the backlog order on work items), so you may need to find a way to detect from the metadata whether a work item should be deleted. I generally check for a low Revision number (say, < 5) and that the last few revisions didn't change any field other than Backlog Priority.
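Putting those pieces together, a rule body might look roughly like the sketch below (aggregator rules are C# script, where self is the triggering work item and store is the work item store). The blockedCreators list is hypothetical, and the exact property names (Revision, CreatedBy.Id) should be checked against the aggregator docs.

// Sketch of a rule body, not a full program.
// Hypothetical list of identity IDs whose new work items get removed.
var blockedCreators = new[]
{
    "11111111-aaaa-bbbb-cccc-222222222222",
};

// Only act on freshly created items: low revision count, per the tip above.
bool isBlocked = Array.Exists(blockedCreators, id => id == self.CreatedBy.Id.ToString());
if (self.Revision < 5 && isBlocked)
{
    // Moves the item to the Recycle Bin rather than destroying it.
    store.DeleteWorkItem(self);
}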
I'd still want them to be able to edit stories or tasks and move statuses, but that initial creation should be off-limits for them (for a time at least). Is there a way to do this?
I am afraid there is no out-of-the-box setting to do this.
That is because the current permission settings for work items are not granular enough to cover this scenario.
The closest setting is here:
Project Settings -> Team configuration -> Area -> Security:
Setting the relevant permission (Edit work items in this node) to Deny will prevent users from creating new work items, but it will also prevent them from modifying existing ones.
For your request, you could add a suggestion for this feature on our UserVoice site (https://developercommunity.visualstudio.com/content/idea/post.html?space=21), which is our main forum for product suggestions.

Creating System Compute Profile

I'm trying to do some testing with Cloud Data Fusion; however, I'm running into connection issues when running my pipelines. I've come to understand that it is using the default network, and I would like to change my System Compute Profile over to a different network.
The problem is, I don't have the option to create a new System Compute Profile (The option doesn't show up under the Configuration tab). How can I go about getting the correct access to create a new compute profile? I have the role of Data Fusion Admin.
Thank you.
Creating a new compute profile is only available in Data Fusion Enterprise edition. In the basic edition, only the default compute profile can be used. But you can customize the profile when you run the pipeline. To do that:
Go to the pipeline page
Click Configure; in the Compute config section, click Customize.
This will pop up the settings for the profile; under General Settings, you can set the value for the network.
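If you'd rather script that run-time customization than click through the UI, Data Fusion pipelines can also be started through the underlying CDAP REST API with runtime arguments. The sketch below is only an assumption-laden illustration: the instance endpoint and pipeline name are placeholders, and the system.profile.properties.network argument follows the CDAP profile-property-override convention, so verify it against the docs for your version.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class StartPipeline
{
    static async Task Main()
    {
        var endpoint = "<cdap-instance-endpoint>"; // placeholder for your instance's API endpoint
        var token = Environment.GetEnvironmentVariable("GCP_ACCESS_TOKEN"); // e.g. from gcloud auth print-access-token

        // Start the pipeline's workflow, overriding the profile's network (assumed convention).
        var url = "https://" + endpoint +
                  "/v3/namespaces/default/apps/<pipeline-name>/workflows/DataPipelineWorkflow/start";
        var args = @"{ ""system.profile.properties.network"": ""my-custom-network"" }";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);
        var resp = await client.PostAsync(url, new StringContent(args, Encoding.UTF8, "application/json"));
        Console.WriteLine(resp.StatusCode);
    }
}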
Just an update on this thread for future viewers: custom compute profiles can be created in Cloud Data Fusion version 6.2.2 (Basic).

How to query a Work Item's state changes from Azure DevOps in Power BI

I want to track the changes of all Bug work items' states from Azure DevOps in Power BI, as close to live as possible. How do I get Power BI to identify a work item whose state has changed recently?
In its simplest form, I'd like to see the state graph (shown in the history view of each work item in Azure DevOps) and then add an additional rule that identifies work items that have gone through certain changes as soon as they happen/as soon as I look, which will be every hour or so.
So far I've attempted using Analytics views to identify changes and upload them to Power BI, but the Changed Date field applies to all changes, not just state changes.
I've also tried using Azure DevOps queries, but they don't identify previous state values.
So far I've attempted using Analytics views to identify changes and upload them to Power BI, but the Changed Date field applies to all changes, not just state changes.
For this issue, you can add the State Change Date field to your Analytics view.
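If you want to pull the raw revision history into Power BI instead, you can also query the Analytics OData endpoint directly. The sketch below is illustrative only: the organization/project values are placeholders, and the heuristic that a revision is a state transition when its StateChangeDate equals its ChangedDate is an assumption to verify against the Analytics docs.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class BugStateChanges
{
    static async Task Main()
    {
        var pat = Environment.GetEnvironmentVariable("AZDO_PAT"); // personal access token
        var url = "https://analytics.dev.azure.com/<org>/<project>/_odata/v2.0/WorkItemRevisions"
                + "?$filter=WorkItemType eq 'Bug'"
                + "&$select=WorkItemId,Revision,State,StateChangeDate,ChangedDate";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + pat)));

        var json = await client.GetStringAsync(url);
        Console.WriteLine(json); // filter client-side (or in Power Query) on StateChangeDate == ChangedDate
    }
}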
as soon as I look, which will be every hour or so.
For this issue, I am afraid it is currently impossible to achieve. The highest granularity available in an Analytics view of Azure DevOps is Daily, and for a single work item it captures only one state per day. If there are multiple state changes in a day, we lose that data and only get the row for the latest state. So, currently the Analytics view only shows the latest status within a day.
Here is a case exploring incomplete Analytics view results; please refer to Issue #2 of it.

Published workbooks or dashboards take quite a long time to open in Tableau Server

I am using Tableau Desktop 8.2 and Tableau Server 8.2 (licensed versions); the workbooks created in Tableau are successfully published to Tableau Server.
But when users want to see the views or workbooks, it takes a very long time to preview or open them.
The workbooks are created against an Amazon Redshift database with more than 5 million records.
Could somebody guide me on this? Why is it taking so long to preview or open even after being published to Tableau Server?
First question, are the views performant when opened using only Tableau Desktop? Get them working well on Desktop before introducing Server into the mix.
Then look at the logs in My Tableau Repository, which include query strings and timing info, to see if you can narrow down the cause. You can also try the Performance Recorder feature.
A typical problem is an overly expensive query just to display a dashboard. In that case, simplify. Start with a simple high-level summary viz and then introduce complexity, testing the impact on performance at each step. If one viz is too slow, there are usually alternative approaches available.
Completely agree with Alex; I had a similar issue with HP Vertica. I had a lot of actions set on the dashboard. Since the database structure was final, I created a Tableau extract and used the online Tableau extract in place of the live connection. Voila! That solved my problem, and the users are happy with the response time as well. Hope this helps you too.
Tableau provides two modes of data refresh:
Live: Tableau will execute the underlying queries every time the dashboard is referred to / refreshed. Apart from badly formulated queries, this is one of the reasons why your dashboard on Tableau Online might take forever to load.
Extract: The query will be executed once, according to (your) specified schedule, and the same data will be reflected every time the dashboard is refreshed.
In extract mode, time is taken only when the extract is being refreshed. However, if the extract is not refreshed periodically, the same stale data will be shown on the dashboard. Thus, extract mode is not recommended for representing live data.
You can toggle Live <--> Extract from the Data Source pane of Tableau Desktop.

Why is my K2 process not appearing in the reports?

I use K2 as my workflow engine. For some reason my processes are not available in any of the reporting views (on the workspace). Do I have to do something special when deploying to get them there?
Your K2 process should appear in the reports automatically. If it does not, maybe it has never been started?
Look at the _ProcInst table in the K2Server database to see if it's there.
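If you want to check that table programmatically, a quick sketch like the one below would do; the connection string is a placeholder, and the query deliberately avoids assuming any column names beyond the table name given above.

using System;
using System.Data.SqlClient;

class CheckProcInst
{
    static void Main()
    {
        // Connection string is a placeholder; point it at the K2Server database.
        var cs = "Server=<k2-sql-server>;Database=K2Server;Integrated Security=true";
        using var conn = new SqlConnection(cs);
        conn.Open();
        using var cmd = new SqlCommand("SELECT TOP 10 * FROM [_ProcInst]", conn);
        using var reader = cmd.ExecuteReader();
        Console.WriteLine(reader.HasRows
            ? "Process instances exist - check report permissions instead."
            : "No process instances found - the process may never have been started.");
    }
}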
Check that you have view permissions set for that process. If you do not have view permissions you will not see the process in reports.
TrueWill's comments are correct. However, the most likely cause of not seeing specific process data is the lack of required permissions. Make sure your account has either View, View Participate, or the Admin right on the process depending upon the requirement. View Participate requires that you participated in the process in some way, like being a destination user (which would usually mean a task is assigned to you), for you to see the reporting data for that instance.