Empty Build Tables When Connecting Power BI to Azure DevOps 2019 via OData Feed - azure-devops

We are using Azure DevOps Server 2019 on-premises. I am trying to pull Build data into Power BI via OData, as described here:
https://gordon.byers.me/azure/reporting-on-devops-build-pipelines-with-powerbi/
Here are a couple more examples:
https://colinsalmcorner.com/azure-devops-build-and-test-reports-using-odata-and-rest-in-powerbi/
https://wouterdekort.com/2019/09/12/measuring-your-way-around-azure-devops/
When I connect to our server, I see the various tables, but anything build-related is empty.
If I execute either of the following queries through Microsoft Edge, I see that everything should be available:
https://tfsvr2.<org>/tfs/DefaultCollection/<Name>/_odata/v3.0-preview/
https://tfsvr2.<org>/tfs/DefaultCollection/<Name>/_odata/v3.0-preview/$metadata
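To rule out the feed itself, an entity set can also be queried directly. Here is a minimal sketch in PowerShell (the Builds entity set name is taken from the $metadata above; $top just limits the output):
# Query the Builds entity set directly to see whether the feed returns rows at all,
# or whether only the Power BI import comes back empty.
# -UseDefaultCredentials passes Windows authentication to the on-premises server.
$uri = 'https://tfsvr2.<org>/tfs/DefaultCollection/<Name>/_odata/v3.0-preview/Builds?$top=5'
(Invoke-RestMethod -Uri $uri -UseDefaultCredentials).value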
I know that at some point, I think December 2019, Microsoft changed the Entity Property names in v3.0-preview and is currently on v4.0-preview. However, since we're on-premises, haven't updated anything, and the schema seems correct, I'm at a loss as to why Builds (and related tables such as BuildPipelines) are empty.

Related

How do I identify the Initial Catalog (name of the collection database) in a migration step from Azure DevOps Server to Services?

I am an administrator of Azure DevOps Server 2019 Update 1.1 in an organization.
I will migrate our collection from the on-premises server to Azure DevOps Services.
Currently, I am on the step of using SqlPackage.exe to generate a DACPAC file.
https://learn.microsoft.com/en-us/azure/devops/migrate/migration-import?view=azure-devops
According to this reference, the example command to generate a DACPAC is as below.
SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;Initial Catalog=Foo;Integrated Security=True" /targetFile:C:\DACPAC\Foo.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory
However, I cannot understand what the Initial Catalog is.
The reference says Initial Catalog - Name of the collection database.
But I could not find the name of the collection database in the Azure DevOps Server management console.
I referred to another article on dev.to:
https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-6-import-2obc
According to this article, Initial Catalog=[COLLECTION_NAME],
and the collection name in my Azure DevOps Server is "DefaultCollection" (the default name).
Then I tried the following command, which failed.
C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\130> ./SqlPackage.exe /sourceconnectionstring:"Data Source=localhost;Initial Catalog=DefaultCollection;Integrated Security=True" /targetFile:C:\DefaultCollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory
Connecting to database 'DefaultCollection' on server 'localhost'.
Extracting schema
Extracting schema from database
*** Error extracting database: Could not connect to database server.
(provider: Named Pipes Provider, error: 40
Is this error caused by a wrong Initial Catalog?
How do I find the correct Initial Catalog - Name of the collection database?
Environment and pre-conditions
Windows 10 Pro
SqlPackage.exe installed from SSDT for Visual Studio 2017
The machine where the commands are executed and the machine where Azure DevOps Server is running are the same,
so Data Source=localhost should be correct, I think
Detached my collection via the Azure DevOps Server management console
SQL Server Express for my Azure DevOps server is running
Look at the admin console on your app tier. That shows you all of the databases.
For what it's worth, the standard name for the default collection database is Tfs_DefaultCollection. It may be different in your case, but that's a safe bet.
Resolved.
The database name in my case is "AzureDevOps_DefaultCollection".
It can be found in the Azure DevOps Management Console, via the attach-collection feature (Application Tier -> Team Project Collections).
Alternatively, "AzureDevOps_DefaultCollection" can also be found using SQL Server Management Studio.
And, in my case, Data Source=localhost is wrong; Data Source=<hostname>\SQLEXPRESS is correct.
I noticed this when I connected to my database with SQL Server Management Studio.
Finally, I successfully generated a DACPAC file.
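For reference, here is the working command, reconstructed from the corrected values above and the log below (paths are from my environment):
./SqlPackage.exe /sourceconnectionstring:"Data Source=<hostname>\SQLEXPRESS;Initial Catalog=AzureDevOps_DefaultCollection;Integrated Security=True" /targetFile:C:\DACPAC\DefaultCollection.dacpac /action:extract /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true /p:IgnorePermissions=true /p:Storage=Memory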
Connecting to database 'AzureDevOps_DefaultCollection' on server '<my machine host name>\SQLEXPRESS'.
Extracting schema
Extracting schema from database
Resolving references in schema model
Validating schema model for data package
Validating schema
Exporting data from database
Exporting data
Processing Export.
Processing Table '[dbo].[tbl_PropertyValue]'.
Processing Table '[Task].[tbl_LibraryItemString]'.
...
...
...
Processing Table '[Search].[tbl_FileMetadataStore]'.
Processing Table '[dbo].[SequenceIds]'.
Successfully extracted database and saved it to file 'C:\DACPAC\DefaultCollection.dacpac'.
Thanks so much! Mario Dietner, Daniel Mann.

New SQL Azure databases are not visible in the portal nor via the PowerShell cmdlets

Last week I created 8 databases on a V12 SQL Azure server via PowerShell and ARM templates, and it worked fine. We started to use these databases in SQL Server Management Studio and have set up users, tables, etc. There is some data in them and we can select and update as expected. In short, they work!
But today I wanted to apply some resource locks to the databases using the Azure PowerShell cmdlet New-AzureRmResourceLock, but I'm finding that the command Get-AzureRmResource | Where-Object {$_.ResourceType -eq "Microsoft.Sql/servers/databases"} does not return the databases I'm looking for!
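For reference, the lock command I am ultimately trying to run is along these lines (a sketch; the resource group, server, and database names are placeholders):
# Prevent accidental deletion of a database. Note that the ResourceName of a
# database is "<server>/<database>"; all names here are placeholders.
New-AzureRmResourceLock -LockName "NoDelete" -LockLevel CanNotDelete `
    -ResourceGroupName "my-rg" -ResourceType "Microsoft.Sql/servers/databases" `
    -ResourceName "myserver/mydb"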
I also looked in the portal (https://portal.azure.com): I see the SQL servers listed, and when I enter the blade for my SQL server I see the databases. But if I click on a DB I'm led to a "resource not found" page. Also, when using the SQL Databases blade I don't see any of the databases listed.
As an aside, if I log on to the classic portal (https://manage.windowsazure.com) I can see the SQL server and all the databases, and click on them and configure them.
I don't really want to have to recreate all these databases, as we have started to set them up with schemas, users, and data, but I do need to be able to use the cmdlets to change them, especially to add resource locks.
Has anyone seen this before? What could I try to bring them back so I can use PowerShell to configure them again?
I was in touch with Microsoft support last week and they had a look; this is the resolution.
From: Microsoft support Email
I suspect that our case issue derives from stale subscription cache.
In summary, subscription cache can become stale when changes made
within a subscription occur over time. In an effort to mitigate our
case issue, I have refreshed the subscription cache from the backend.
After they had a look it was sorted out that day; both the portal and, more importantly, the command line are fixed.
Thanks All
Please provide your subscription id, server name and missing database names and I will have this investigated. Apologies for the inconvenience. You can send details to me at bill dot gibson at microsoft . com.

Unable to configure TFS backup using Backup wizard

When trying to configure the TFS 2010 backup using the TFS Power Tools, I kept running into the following error message:
Account TFS\tfsadmin failed to create backups using path \\tfs-xxxxxxx.local\TFSBackups
The strange thing is that TFS\TFSAdmin has full permissions on both the share and the file system, and the share path doesn't contain any spaces (thanks to the MSDN forums for pointing that out).
I tried backing up through SQL Server Management Studio, and sure enough, the backups fail there too.
It turns out that while the backup job is started using the account specified in the Create Backup Wizard of the TFS Power Tools, SQL Server will try to write the files to the share using its own service account.
So in addition to whoever needs access to the share, you need to add the service account running SQL Server to that share as well. In this case it was running under NETWORK SERVICE, so adding MACHINENAME$ to the share's list of permitted users did wonders.
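Scripted, the fix looks something like the sketch below (assuming the SmbShare cmdlets of Windows Server 2012 or later; on older servers the same change can be made through the share and folder permission dialogs). The share name, folder, and domain are placeholders:
# SQL Server runs as NETWORK SERVICE, which accesses the network as the machine
# account (DOMAIN\MACHINENAME$), so grant that account rights on the share...
Grant-SmbShareAccess -Name "TFSBackups" -AccountName "TFS\MACHINENAME$" -AccessRight Full -Force
# ...and on the folder behind it (NTFS permissions must allow writes too).
icacls "D:\TFSBackups" /grant "TFS\MACHINENAME$:(OI)(CI)M"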

TSQL: How do I move data between SQL Server instances?

I have two SQL Server instances, each with an identical schema. One is running in SQL Azure; the other is a standard SQL Server 2008 instance. I need to copy the data from the Azure database to my local instance.
Essentially I want to do this:
insert LOCAL_TABLE (col1, col2)
select col1, col2
from AZURE_TABLE
How would I do that?
In order to move data between SQL Server instances when one of them is SQL Azure, you have a couple of options:
SQL Azure Data Sync
Using SSIS
Write your own code that moves the data, most probably using the SqlBulkCopy class (see the sketch after this list).
If you would like to just copy all the data, you could also use the SQL Azure Migration Wizard - you can omit the option for copying the schema and let it just copy the data.
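Here is a minimal sketch of the SqlBulkCopy option in PowerShell; the connection strings, table names, and column names are all placeholders:
# Stream rows from the SQL Azure table straight into the local table;
# no staging file is needed.
$srcConnStr = "Server=tcp:<server>.database.windows.net;Database=<db>;User ID=<user>@<server>;Password=<password>;Encrypt=True"
$src = New-Object System.Data.SqlClient.SqlConnection($srcConnStr)
$src.Open()
$cmd = $src.CreateCommand()
$cmd.CommandText = "SELECT col1, col2 FROM AZURE_TABLE"
$reader = $cmd.ExecuteReader()
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy("Server=localhost;Database=<db>;Integrated Security=True")
$bulk.DestinationTableName = "LOCAL_TABLE"
$bulk.WriteToServer($reader)   # copies every row returned by the reader
$reader.Close()
$src.Close()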
EDIT
And, as per the original answer from Matthew PK, you could link to your SQL Azure server from your on-premises server, but this is only an option when you just want to do some ad-hoc testing. I would not use this approach in production for constantly syncing data.
You could accomplish that in a single statement using linked servers.
http://msdn.microsoft.com/en-us/library/ms188279(v=sql.100).aspx
EDIT: Here is a link which appears to explain how to link to SQL Azure:
http://blogs.msdn.com/b/sqlcat/archive/2011/03/08/linked-servers-to-sql-azure.aspx
EDIT: Here is a write-up on connecting to Azure with SSMS
http://www.silverlighthack.com/post/2009/11/11/Connecting-to-SQL-Azure-with-SQL-Server-Management-Studio-2008-R2.aspx
Otherwise I believe you need to do it in two statements.
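In outline, the linked-server approach looks something like this (a sketch following the SQLCAT post above; the server, database, and credential values are placeholders). Run it against the on-premises instance, for example with Invoke-Sqlcmd from the SQL Server PowerShell module:
$tsql = @"
-- Create a linked server pointing at SQL Azure and map a SQL login to it.
EXEC sp_addlinkedserver @server = N'AzureLink', @srvproduct = N'',
     @provider = N'SQLNCLI', @datasrc = N'<server>.database.windows.net',
     @catalog = N'<db>';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'AzureLink', @useself = 'FALSE',
     @rmtuser = N'<user>@<server>', @rmtpassword = N'<password>';
-- Then the copy really is a single statement, via the four-part name.
INSERT INTO LOCAL_TABLE (col1, col2)
SELECT col1, col2 FROM AzureLink.<db>.dbo.AZURE_TABLE;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $tsql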
Linked Server is not officially supported. However, here are a couple of resources that are supported and would help you do what you are looking for:
1) Check out the SQL Azure DAC examples: http://sqldacexamples.codeplex.com/
2) The other option is SQL Azure Data Sync.
Use a product like "SQL Data Compare" from Redgate. I am not an Azure user, but I am guessing it would work; I have used it for a few years and it's pretty solid.
Is this a one-time copy or ongoing?
If one-time, then use the SQL Azure Migration Wizard (from Codeplex)
If ongoing then use SQL Azure data sync
Also, you can verify that the schema is compliant using SQL Server Data Tools in Visual Studio: just set the target platform to SQL Azure (or to SQL Server 2012 or 2008), build, and any schema errors will show up.

How can I create & edit a SQL Azure database using SQL Server 2008 R2

I have a SQL Azure database, and creating and editing the database through the portal is a very tedious task due to its user interface. When I connect to it from my local SQL Server 2008 R2, I am not able to edit it or create tables from there.
Is there any way to make this possible? Please give me some solution for that.
At this time, the two options available are the web user interface (which will be improved over time) and SQL Server Management Studio (using queries; no user interface), for which SQL Azure support will also improve over time.
In the end I found a third-party client to manage SQL Azure: RazorSQL. Awesome tool! I have written about it on my blog; see here.
Navicat is a commercial application that offers access.
http://www.navicat.com/products/navicat-for-sqlserver
Personally, I vastly prefer it to the Microsoft web interface.