Activate Azure DevOps Server warehouse only

I am trying to restore the reports data in Azure DevOps Server.
I am wondering how to restore only the TFS_Warehouse database, as it is named in the documentation.
I have created an empty database named TFS_Warehouse, as mentioned in the documentation; testing the connection at that step succeeds.
I want to activate only the warehouse, not Analysis Services or the reports.
The reports wizard in the Azure DevOps administration console does not seem to allow this and requires that all fields are filled in.
How can I use only the warehouse?
Thanks in advance.

You can find documentation here: Manually process the TFS data warehouse and analysis services cube
Try the ProcessWarehouse command.
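A minimal sketch of where that command lives, assuming the warehouse control web service path given in the linked docs; the way it is invoked below (and any parameters such as a collection name) are assumptions, not confirmed API details:

```python
# Sketch: locating ProcessWarehouse on the TFS warehouse control web service.
# The service path matches the "Manually process the TFS data warehouse" docs;
# invoking it programmatically (rather than via the .asmx test page in a
# browser with Windows auth) is an assumption.
def warehouse_control_url(server_root, method="ProcessWarehouse"):
    """Build the URL of a method on the TFS warehouse control web service."""
    service = "TeamFoundation/Administration/v3.0/WarehouseControlService.asmx"
    return f"{server_root.rstrip('/')}/{service}/{method}"

url = warehouse_control_url("http://localhost:8080/tfs")
# Opening the .asmx page in a browser on the application-tier server and
# invoking ProcessWarehouse there is the approach the documentation describes.
```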

Understanding the roles required for migrating Azure DevOps Server to Services using the Data Migration Tool

This relates to the documentation available at the link below.
https://learn.microsoft.com/en-us/azure/devops/migrate/migration-overview?view=azure-devops
My question is: what is the minimum role a user needs to complete the migration successfully and without any permission issues?
For example, what roles and permissions does the user need on both Azure DevOps Server and Azure DevOps Services?
According to the Data Migration Tool guide, the user who runs the tool must have the following:
SQL Server's TFSEXECROLE role, and
access rights to the TFS collection and configuration databases.
My understanding, for example:
Azure DevOps Server: if we add the user to the Team Foundation Administrators group on the Azure DevOps Server, does that fulfill the requirement?
Azure DevOps Services: if we assign the same user who performs the migration the "Azure DevOps Administrator" role in Azure DevOps Services, does that fulfill the requirement?
Also, it would be useful if you could specify the maximum size of the .dacpac backup file that the Data Migration Tool supports (i.e., the maximum size of the project collection backup) for the migration to go through properly.
What permissions does the same user who runs the Data Migration Tool need in SQL Server to run SqlPackage.exe?
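For context, the DACPAC extraction step documented in the migration guide looks roughly as sketched below; the server name, database name, and output path are placeholders, not values from our environment:

```python
# A sketch of the SqlPackage.exe extract command the migration guide documents
# for producing a collection DACPAC, assembled as an argument list. All three
# inputs are placeholder assumptions.
def sqlpackage_extract_args(server, database, target_file):
    return [
        "SqlPackage.exe",
        f"/sourceconnectionstring:Data Source={server};"
        f"Initial Catalog={database};Integrated Security=True",
        f"/targetFile:{target_file}",
        "/action:extract",
        "/p:ExtractAllTableData=true",
        "/p:IgnoreUserLoginMappings=true",
        "/p:IgnorePermissions=true",
        "/p:Storage=Memory",
    ]

args = sqlpackage_extract_args("localhost", "Tfs_DefaultCollection",
                               r"C:\dacpac\Tfs_DefaultCollection.dacpac")
# subprocess.run(args) would execute this on the data-tier machine.
```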
Thanks in advance for the help; it would help us understand how best to use the Data Migration Tool.
Best regards

Is it possible to ingest work items from DevOps into Azure Synapse using a data factory?

We want to pull the work items from DevOps into our Synapse SQL database and then do some reporting on them using Power BI. Is there a way to do this?
Yes, it's possible. Follow this article; it has a detailed explanation of how to ingest work items from DevOps into Azure Synapse using a data factory.
If you want to connect Azure Synapse to Power BI, follow these steps:
Go to Manage -> Linked services -> click + New -> search for Power BI and select it -> click Continue -> enter a name for the linked service and select a workspace.
Expand Power BI and the workspace, and you can use it.
Reference:
https://www.techtalkcorner.com/enable-azure-devops-azure-synapse-analytics/
https://learn.microsoft.com/en-us/azure/synapse-analytics/quickstart-power-bi
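As a rough illustration of the ingestion side, this sketches the kind of REST call a Data Factory/Synapse pipeline would issue against the Azure DevOps work item API, per the linked article's approach; the organization, project, personal access token, and WIQL query below are placeholder assumptions:

```python
# Build (not send) a WIQL query request against the Azure DevOps REST API.
# An ADF REST/Web activity would issue the same POST; org, project, PAT, and
# the query text are placeholders.
import base64

def wiql_request(org, project, pat):
    url = f"https://dev.azure.com/{org}/{project}/_apis/wit/wiql?api-version=6.0"
    auth = base64.b64encode(f":{pat}".encode()).decode()  # PAT uses Basic auth
    headers = {"Authorization": f"Basic {auth}",
               "Content-Type": "application/json"}
    body = {"query": "SELECT [System.Id] FROM WorkItems"}
    return url, headers, body

url, headers, body = wiql_request("myorg", "myproject", "my-pat")
```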

Power Automate and Azure DevOps On Prem 2020 - Create a Workitem - TF400813 Not authorized to access this resource

I am a junior admin managing ADO 2020 on-prem. We have a developer who is able to create a work item on a board under a collection/project when logged in to ADO.
The developer is trying to automate work item creation using Power Automate. He is entering the correct information in the required fields in Power Automate, but when he tries to create a work item, he gets this error:
Details: {"$id":"1","innerException":null,"message":"TF400813: The user '157adfsd-912f-4244-xxxx-b45fcasda\\firstname.lastname#domainname.com' is not authorized to access this resource.","typeName":"Microsoft.TeamFoundation.Framework.Server.UnauthorizedRequestException, Microsoft.TeamFoundation.Framework.Server, Version=14.0.0.0, Culture=neutral, PublicKeyToken=acdb03fxxxxxxsdfdsdse","typeKey":"UnauthorizedRequestException","errorCode":0,"eventId":3000}
Question: from the ADO 2020 side, is there any permission I need to grant the developer? I am not 100% sure why we get this error, as the developer is able to create a work item manually.
To my understanding, Power Automate connects to Azure DevOps Services (the cloud-hosted version of Azure DevOps) via OAuth; when you create a Power Automate flow for Azure DevOps, the tooltip shown when selecting an organization tells you to make sure that third-party application access via OAuth is enabled.
I don't think OAuth 2.0 authentication (https://learn.microsoft.com/en-us/azure/devops/integrate/get-started/authentication/oauth?view=azure-devops) is available for the on-premises version, so you might be out of luck there.
There is an answer to a similar question on the Power Automate forum suggesting that the integration might be possible by installing an on-premises data gateway, but I don't know whether that is feasible.
https://powerusers.microsoft.com/t5/Connecting-To-Data/Power-Automate-with-Azure-Devops-Server-On-Premise/td-p/658618

Is it possible to trigger an ADF pipeline from a Power App?

I'm wondering if it is possible to trigger an Azure Data Factory pipeline from a Microsoft Power App, and if so, how one would go about configuring this.
I was unable to find a Power Apps connector trigger in Azure Data Factory.
I was unable to find a Power Apps connector trigger in Azure Logic Apps.
I have experience with Azure, but none with Power Apps.
If you have any ideas or information for me, that would be great.
Thanks!
You can trigger an Azure Data Factory pipeline using the REST API. The following sample shows how to run your pipeline manually:
POST
https://management.azure.com/subscriptions/mySubId/resourceGroups/myResourceGroup/providers/Microsoft.DataFactory/factories/myDataFactory/pipelines/copyPipeline/createRun?api-version=2017-03-01-preview
More information: Pipeline execution and triggers in Azure Data Factory → Manual execution (on-demand) → REST API
There is now a Power Apps connector you can use to create a pipeline run: https://learn.microsoft.com/en-us/connectors/azuredatafactory/#create-a-pipeline-run
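To make the REST option concrete, this builds the createRun URL from the sample above; the subscription, resource group, factory, and pipeline names are the sample's placeholders, and a real call additionally needs an Azure AD bearer token:

```python
# Assemble the createRun URL from the sample POST shown above. All four path
# segments are placeholders; sending the POST requires an Authorization header
# with an Azure AD bearer token.
def create_run_url(subscription, resource_group, factory, pipeline):
    return ("https://management.azure.com"
            f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
            f"/providers/Microsoft.DataFactory/factories/{factory}"
            f"/pipelines/{pipeline}/createRun?api-version=2017-03-01-preview")

url = create_run_url("mySubId", "myResourceGroup", "myDataFactory", "copyPipeline")
# A Power App could reach this endpoint through a custom connector or a flow
# that performs the HTTP POST.
```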

Connect to backend of VSO

Is there a way to get the server info of my VSO account and access it using SQL Server?
I've tried logging in using the URL
{account}.visualstudio.com
but I got a server-not-found error.
No. The back-end databases are SQL Azure instances, different from the on-premises TFS databases. I cannot see Microsoft ever giving you access to the database - maybe the data, but not the database.
You can only use the APIs (the old client API and the newer REST API) and Power BI tools to perform queries.
If you have a specific problem you are trying to solve, post it as a new question, because it may be possible without database access.
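As a minimal sketch of the REST route instead of database access, this builds a GET request for the account's project list; the account name and personal access token are placeholders:

```python
# Build (not send) a GET request for the project list of a *.visualstudio.com
# account via the REST API. Account name and PAT are placeholders.
import base64

def projects_request(account, pat):
    """Return URL and headers for listing the account's projects."""
    url = f"https://{account}.visualstudio.com/_apis/projects?api-version=6.0"
    auth = base64.b64encode(f":{pat}".encode()).decode()  # PAT uses Basic auth
    return url, {"Authorization": f"Basic {auth}"}

url, headers = projects_request("myaccount", "my-pat")
```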