I am currently administering a project on Azure DevOps and want to verify how many users are actively using the platform.
I want the following information -
1. Unique visitors per month
2. All visits per month
3. All visits by roles
The best you're going to be able to get is from the usage page (https://dev.azure.com/your_organization/_settings/usage) or the audit REST API. It doesn't include everything you want, but you can probably play with the data once you have it, aggregate it into a format that's more to your liking, and extract some of the information you're after.
In my opinion, everything you want can be obtained from the audit log.
Besides the API that Daniel mentioned, there is a direct download button for this data. Just select the time period and download it in CSV or JSON format:
I suggest downloading the CSV, because it is convenient to apply data filters to it.
Unique visitors per month
This can be obtained from the ActorDisplayName column.
All visits by roles
This can be achieved by combining the audit log with the users list. On the Users page there is also a button to download the user list as a .csv file.
Just join the data from auditlog.csv and users.csv and you will get the visit statistics by role.
All visits per month
I'm not sure exactly what you mean by a visit, but you should be able to get most of what you want from auditlog.csv, because it stores the access IP, user agent, and detailed operations.
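If you end up in Python instead of Excel, a minimal pandas sketch of all three metrics could look like the following. The column names (Timestamp and ActorDisplayName in auditlog.csv, Display Name and Access Level in users.csv) are assumptions; adjust them to whatever headers your downloads actually contain.

```python
import pandas as pd

# Column names below are assumptions; adjust them to your actual auditlog.csv / users.csv headers.
audit = pd.read_csv("auditlog.csv", parse_dates=["Timestamp"])
users = pd.read_csv("users.csv")

audit["Month"] = audit["Timestamp"].dt.to_period("M")

# 1. Unique visitors per month
unique_visitors = audit.groupby("Month")["ActorDisplayName"].nunique()

# 2. All visits per month (counting every audit event as a "visit")
visits_per_month = audit.groupby("Month").size()

# 3. All visits by role: join audit events to the user list and count per access level
merged = audit.merge(users, left_on="ActorDisplayName", right_on="Display Name", how="left")
visits_by_role = merged.groupby("Access Level").size()

print(unique_visitors, visits_per_month, visits_by_role, sep="\n\n")
```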
The raw CSV may not be friendly for viewing, so I recommend analyzing those statistics with Power BI. It can create histograms, pie charts, line charts, etc., which make the data much easier to analyze.
In order to find some Agile metrics for a project in Azure DevOps (ADO) from the last month or the last 2 sprints, I was trying to query data from ADO such as:
When was a user story committed and when did it reach the Done state?
How long did a developer spend on a user story before it went to QA?
How much time did QA take after receiving it from Dev?
How many Dev and QA resources worked on a user story, and for how long? Compare that with the estimate given at the time of sprint planning.
How many times was a user story returned to Dev for any reason, such as not fulfilling the Acceptance Criteria?
...
In ADO I can query the current data; however, there is no way to find historical data from the State Graph, History, or Discussion. If a user story has gone through multiple resources, I want to find out how much time each of them spent on it.
Can someone please give me some direction? Thanks in advance.
You can get the historical data using the Analytics OData API. The example below uses the WorkItemRevisions entity set to load all the revisions for a given work item:
https://analytics.dev.azure.com/{OrganizationName}/{ProjectName}/_odata/{version}/WorkItemRevisions?
$filter=WorkItemId eq {Id}
&$select=WorkItemId, Title, State
You can then run the OData query from Power BI to create a Power BI report. Also see the Power BI integration documentation, and check the example Calculate time-in-state.
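If you would rather pull the revisions from a script than from Power BI, here is a hedged Python sketch against the same WorkItemRevisions entity set. The OData version, organization/project names, work item id, and PAT are placeholders, and the rough time-in-state calculation at the end is just an illustration of the idea, not the official formula from the docs.

```python
import requests
from requests.auth import HTTPBasicAuth
from dateutil import parser as dtparser
from collections import defaultdict

ORG, PROJECT, WORK_ITEM_ID = "your_organization", "your_project", 1234   # placeholders
PAT = "your_personal_access_token"                                       # placeholder
ODATA_VERSION = "v3.0-preview"   # check which OData version your organization exposes

url = (
    f"https://analytics.dev.azure.com/{ORG}/{PROJECT}/_odata/{ODATA_VERSION}/WorkItemRevisions"
    f"?$filter=WorkItemId eq {WORK_ITEM_ID}"
    "&$select=WorkItemId,Title,State,ChangedDate"
    "&$orderby=ChangedDate asc"
)

# Azure DevOps accepts basic auth with an empty user name and a PAT as the password.
revisions = requests.get(url, auth=HTTPBasicAuth("", PAT)).json()["value"]

# Rough time-in-state: the gap between two consecutive revisions is attributed to
# the state the item was in before the later revision was saved.
time_in_state = defaultdict(float)
for prev, curr in zip(revisions, revisions[1:]):
    gap = dtparser.isoparse(curr["ChangedDate"]) - dtparser.isoparse(prev["ChangedDate"])
    time_in_state[prev["State"]] += gap.total_seconds() / 3600.0

for state, hours in sorted(time_in_state.items()):
    print(f"{state}: {hours:.1f} h")
```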
You can also use the REST API to get the historical data of a work item. See below:
Revisions - List
Comments - Get Comments
Updates - List
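For example, a minimal sketch of the Revisions - List call using the requests library; the organization, project, work item id, and PAT below are placeholders:

```python
import requests
from requests.auth import HTTPBasicAuth

ORG, PROJECT, WORK_ITEM_ID = "your_organization", "your_project", 1234   # placeholders
PAT = "your_personal_access_token"                                       # placeholder

# Revisions - List: every saved version of the work item, with the field values at that point
url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/workitems/"
       f"{WORK_ITEM_ID}/revisions?api-version=6.0")
for rev in requests.get(url, auth=HTTPBasicAuth("", PAT)).json()["value"]:
    fields = rev["fields"]
    print(rev["rev"], fields.get("System.ChangedDate"), fields.get("System.State"))
```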
For querying the historical data of a single work item you can use this Chrome extension - https://chrome.google.com/webstore/detail/azure-devops-workitem-inf/ociekhbkajgbjdenikmandhpjlekckee.
There you can see the changes to any field, and also how long a field stayed in a particular state.
I haven't come across any documentation from Tableau on this, but I'd like to ask whether anyone knows if the info columns in the Sites or Users section are customizable for anyone with Server Administrator privileges. These columns mainly show site/user metrics, but I'm not sure if you can add your own columns to that list to track your own metrics.
Unfortunately, no. Even Admins do not have the ability to add/sort/change columns on these pages (or any other pages, I believe.)
You can always make this suggestion on the Tableau Ideas Forum though. Others can vote your idea up and Tableau will see it.
You can create your own custom admin views of information stored in the Tableau Server repository. See this link: https://onlinehelp.tableau.com/current/server/en-us/adminview_postgres.htm
You could also use the server's REST API to query information and display it however you like. https://onlinehelp.tableau.com/current/api/rest_api/en-us/REST/rest_api.htm
If you're working in Python, there is an open source library that makes using the REST API more convenient:
https://tableau.github.io/server-client-python
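For example, a minimal sketch with that library (tableauserverclient) that pulls per-user details similar to what the built-in Users page shows; the server URL, credentials, and site name are placeholders:

```python
# pip install tableauserverclient
import tableauserverclient as TSC

# Server URL, credentials, and site name are placeholders.
tableau_auth = TSC.TableauAuth("admin_user", "password", site_id="your_site")
server = TSC.Server("https://your-tableau-server", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    all_users, pagination = server.users.get()
    for user in all_users:
        # Combine these with repository/admin-view data to build the extra
        # per-user "columns" that the built-in pages do not show.
        print(user.name, user.site_role, user.last_login)
```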
I am working on an internal reporting dashboard project. There are mainly 3 roles/levels for the internal reporting dashboard, such as higher management, project management, etc.
The breakdown of information for each role/level is different from the other roles.
For the internal reporting dashboard we have to create a database (let's say D, on SQL Server) whose data will come from 3 databases (let's say A, B, and C) after integrating them.
Based on my research so far, we can connect directly to database D using a Tableau live connection in Tableau Desktop (Professional edition) and use it to create a dashboard.
To host that workbook for users, I can publish it to Tableau Online, and to make data visible according to roles I can use filters to restrict the data.
Now my questions are:
1. Will this workflow be right? Am I missing any step or process that I would need to cater for?
2. How will changes be reflected in the dashboard once it is published? Let's say I have to add a filter/parameter to the dashboard. Do I need to make the changes to the workbook using Tableau Desktop, and will the changes be reflected automatically?
Or do I have to publish it again to Tableau Online? Please educate me on this too.
Thanks for the assistance. I have attached a proposed workflow image too.
Regards,
Manail Pasha
WORKFLOW IMAGE
If your source system is not a transactional database, I would avoid a live database connection. I would recommend using data blending techniques to create a data extract, a.k.a. a .tde file.
I would publish the dashboard with user filters that enable row-level security and ensure users can only see certain data.
Here is a diagram that I would follow if I were you.
To add a filter/parameter, you can either do it in Tableau Desktop and publish the workbook to Tableau Online, or log in to Tableau Online, add the filter/parameter in edit mode, and save it; the change will be reflected with either of these methods.
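If you ever want to script the republish step instead of doing it by hand, here is a hedged sketch using the open source tableauserverclient Python library; the token, site, project id, and file name are placeholders:

```python
import tableauserverclient as TSC

# Token name/value, site, project id, and file path are placeholders; username/password
# auth also works if you are not using personal access tokens.
tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="your_site")
server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    project_id = "your-project-id"          # look this up via server.projects.get()
    workbook_item = TSC.WorkbookItem(project_id)
    # Overwrite keeps the same workbook on the site, so viewers simply see the new version.
    server.workbooks.publish(workbook_item, "dashboard.twbx",
                             mode=TSC.Server.PublishMode.Overwrite)
```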
If your data changes frequently, I would recommend going with a live connection. Extract refreshes can be done incrementally, but the appropriate fields need to be chosen for that (you should also consider how to handle deleted or changed entries). It is up to you to decide whether to go with an extract or a live connection.
I would like to know if there's a way to automatically update my extracts.
I have a live connection to Redshift, use Tableau Desktop, and publish some workbooks to Tableau Online.
I'd like to share some reports using extracts and Tableau Reader, and I really need a way to update the extracts every day.
If I understand your question correctly, you should be able to refresh your Tableau data extracts by setting up "Refresh Schedules" on Tableau Online. Tableau Online supports a variety of data sources for setting up extracts instead of using a live connection.
Check this link on setting up refresh schedules.
Hope this helps!
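If the built-in schedules are not flexible enough, a hedged alternative is to trigger the refresh yourself from a small script (run daily via cron or Task Scheduler) with the tableauserverclient Python library; the token, site, and data source name below are placeholders:

```python
import tableauserverclient as TSC

# Token, site, and published data source name are placeholders.
tableau_auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="your_site")
server = TSC.Server("https://10ax.online.tableau.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    all_datasources, _ = server.datasources.get()
    target = next(ds for ds in all_datasources if ds.name == "My Extract")
    # Kicks off an extract refresh job on Tableau Online; run this daily if the
    # built-in refresh schedules do not fit your needs.
    job = server.datasources.refresh(target)
    print("Refresh job queued:", job.id)
```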
Yes, you can set a refresh schedule that will update your extract automatically without any manual effort, but there are some conditions you have to follow:
You have to publish your dashboards/workbooks/projects in "Extract" mode, not Live.
I personally recommend extracts instead of Live unless the person looking at the dashboard needs real-time updates.
Also, you need to set this schedule from the admin ID (publisher access) that you use to publish your work from Desktop to Online.
Note: you can only choose from the schedule options already available.
We are using VSTS only, and I am trying to figure out how I can report on changes made to Effort in the sprint backlog on a daily basis. The burndown chart gives good insight, but I want to get into what was changed. Also, I want to know what new items, if any, were added during the sprint (these could be Bugs, etc.).
I am managing a fairly large number of teams (18) that are distributed across continents.
Also, for reporting purposes, since VSTS does not roll up the totals from Tasks to Feature and Epic, I am trying to find out what options I have to automate those calculations.
You can use a work item query to make a daily report of Effort, Remaining Work, etc. for all backlog items or for a certain sprint. You can also send the query result as an email.
But for the daily changes, you need to manually compare with the day before.
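As a starting point for that daily report, here is a hedged Python sketch that runs a WIQL query through the REST API and dumps the Effort / Remaining Work fields; the organization, project, iteration path, and PAT are placeholders, and diffing today's output against yesterday's is left to you:

```python
import requests
from requests.auth import HTTPBasicAuth

ORG, PROJECT = "your_organization", "your_project"       # placeholders
PAT = "your_personal_access_token"                        # placeholder
auth = HTTPBasicAuth("", PAT)

# WIQL query for the current sprint's tasks and bugs; the iteration path is an assumption.
wiql = {"query":
        "SELECT [System.Id] FROM WorkItems "
        "WHERE [System.TeamProject] = @project "
        "AND [System.IterationPath] UNDER 'your_project\\Sprint 1' "
        "AND [System.WorkItemType] IN ('Task', 'Bug')"}
wiql_url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/wit/wiql?api-version=6.0"
ids = [wi["id"] for wi in requests.post(wiql_url, json=wiql, auth=auth).json()["workItems"]]

# Pull the scheduling fields for those items (200 ids max per call); save this output
# every day and diff it against the previous day's file to see what changed.
fields = ("System.Id,System.Title,System.WorkItemType,"
          "Microsoft.VSTS.Scheduling.Effort,Microsoft.VSTS.Scheduling.RemainingWork")
batch_url = (f"https://dev.azure.com/{ORG}/_apis/wit/workitems"
             f"?ids={','.join(map(str, ids[:200]))}&fields={fields}&api-version=6.0")
for item in requests.get(batch_url, auth=auth).json()["value"]:
    print(item["fields"])
```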
I have used a spreadsheet connected to Team Services to load the data. I then refresh the spreadsheet and analyze the data using pivot tables. For each period of time I want to keep, I manually copy the data from the pivot tables for my comparisons. This is not a very clean solution, but I have used it and it works.
Something I have not looked into in any depth yet is Power BI connected to Team Services. This looks like the data warehouse solution for Team Services, and it looks promising. Please see - https://www.visualstudio.com/en-us/docs/report/powerbi/connect-vso-pbi-vs
You can refer to these steps to achieve your requirement:
Build a web app (e.g. a Web API) to receive the web hook data and store it in your database (a minimal sketch follows this list)
Create web hooks for the Work item created and Work item updated events of the required team projects to send the requests to your web app
Add logic in your web app to build and send the report to team members
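As an illustration of step 1, here is a minimal, hedged Flask sketch of such a web app; the route, the SQLite table, and the payload field names (based on the workitem.updated sample payload) are assumptions you should adapt:

```python
# pip install flask -- minimal sketch of step 1: receive the service hook payload and store it.
import sqlite3
from flask import Flask, request

app = Flask(__name__)

@app.route("/ado-webhook", methods=["POST"])
def handle_work_item_event():
    event = request.get_json()
    resource = event.get("resource", {})
    # For workitem.updated payloads, resource["fields"] holds old/new values of the changed
    # fields; for workitem.created, the resource is the new work item itself.
    with sqlite3.connect("ado_events.db") as db:
        db.execute("CREATE TABLE IF NOT EXISTS events (event_type TEXT, work_item_id INTEGER, detail TEXT)")
        db.execute("INSERT INTO events VALUES (?, ?, ?)",
                   (event.get("eventType"),
                    resource.get("workItemId", resource.get("id")),
                    str(resource.get("fields", {}))))
    return "", 200

if __name__ == "__main__":
    app.run(port=5000)
```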