OpsHub - Import UserList error - azure-devops

My migration fails on the user mapping task with the following error:
OpsHub-014371: Could not instantiate metadata implementation for For User List | TFS Source 1412109029732 ALM TFS 1412109029738, due to (301)Moved Permanently
I am running on v1.0.1.006.
Thanks for the help.

If the machine you are running the utility on is behind a proxy, this issue will occur. A new version will be available in the first week of November.

Related

IBM Watson Studio API suddenly started throwing an error when trying to upload a model

I'm using the latest ibm_watson_machine_learning SDK (Python).
Until a few days/weeks ago my code was working fine, but now I get an error when running:
client.repository.store_model(model='./model.tar.gz', meta_props=model_metadata)
Here is some sample code:
https://github.com/IBMDecisionOptimization/oplrunonwml
Exception has occurred: IndexError
list index out of range
File "C:\Temp\oplrunonwml\oprunonwmlv2.py", line 126, in main
model_details = client.repository.store_model(model='./model.tar.gz', meta_props=model_metadata)
File "C:\Temp\oplrunonwml\oprunonwmlv2.py", line 215, in <module>
main(sys.argv[1:])
I get this error with various models (OPL/CPLEX/DOcplex); they all fail the same way.
What's strange is that the model is uploaded correctly to the Deployment Space, and I can use it without problems in deployments/jobs in the UI or from other scripts.
The code was working fine without any changes a few weeks ago, so I assume something changed on the API side.
Update:
I'm using a Cloud Lite account.
I'm also using the latest version of the SDK
client = APIClient(wml_credentials)
print(client.version) # 1.0.29
print(client.version_param) #2020-08-01
I deleted all my IBM services (ObjectStorage, WatsonStudio) and created new ones, but I still get the same error.
I would suspect the WML v2 instance deployment.
*** With the V2 plan, you need to use the updated Python SDK (ibm-watson-machine-learning 1.0.38). ***
If you had a v1 instance before, then depending on your plan it might have kept working without migration for a while.
Maybe you have reached the end of this compatibility period.
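For reference, here is a minimal sketch of what the store_model call looks like against a V2 (space-scoped) instance with the updated SDK. The space ID, model type, and software specification name below are assumptions for an OPL model; adapt them to your own setup:
from ibm_watson_machine_learning import APIClient

wml_credentials = {
    "apikey": "***",  # your IBM Cloud API key
    "url": "https://us-south.ml.cloud.ibm.com",
}

client = APIClient(wml_credentials)

# V2 instances are space-scoped: select your deployment space first.
client.set.default_space("<your-space-id>")  # assumption: your space ID

# Software specification for the DO runtime (name is an assumption).
software_spec_id = client.software_specifications.get_id_by_name("do_12.10")

model_metadata = {
    client.repository.ModelMetaNames.NAME: "my-opl-model",
    client.repository.ModelMetaNames.TYPE: "do-opl_12.10",  # assumption: OPL model
    client.repository.ModelMetaNames.SOFTWARE_SPEC_UID: software_spec_id,
}

model_details = client.repository.store_model(
    model="./model.tar.gz", meta_props=model_metadata
)
print(client.repository.get_model_uid(model_details))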
Can you clarify your plan type?
See https://medium.com/@AlainChabrier/migrate-your-python-code-for-do-in-wml-v2-instances-710025796f7
Alain

EF Core 3.1 using Authentication=Active Directory Integrated

[Update 1]
I could make it work using the following connection string
Server=tcp:mydatabaseserver.database.windows.net,1433;Initial Catalog=mydbname
and implementing an interceptor as mentioned in this article.
This proves that Azure is correctly configured, and the problem is somewhere in the application (maybe a missing package?).
Anyway, I would still like to be able to change the connection string and switch between AAD authentication and sql authentication, without additional logic in the application.
[/Update 1]
I'm using EF Core 3.1.4 on an Azure WebApp, and I would like to use the Azure AD identity assigned to the application for authentication, but I run into the following exception:
ArgumentException: Invalid value for key 'authentication'.
Microsoft.Data.Common.DbConnectionStringBuilderUtil.ConvertToAuthenticationType(string keyword, object value)
This is the connection string:
{
  "ConnectionStrings": {
    "Admin": "Server=tcp:mydatabaseserver.database.windows.net,1433;Initial Catalog=mydbname;Authentication=Active Directory Integrated"
  }
}
I initialize the context using the following code:
var connectionString = this.Configuration.GetConnectionString("Admin");
services.AddDbContext<NetCoreDataContext>(builder => builder.UseSqlServer(connectionString));
The Microsoft.Azure.Services.AppAuthentication package is also imported (version 1.5.0).
Active Directory Integrated wasn't working for me in .NET Core 3.1, but it started working once I installed the NuGet package Microsoft.Data.SqlClient (version 2.0.1). It now works with the following connection string:
"MyDbConnStr": "Server=tcp:mydbserver.database.windows.net,1433;Database=MyDb;Authentication=ActiveDirectoryIntegrated"
Note: it also works if I have spaces between the words like this:
"MyDbConnStr": "Server=tcp:mydbserver.database.windows.net,1433;Database=MyDb;Authentication=Active Directory Integrated"
And it also works if I include escaped quotes like this:
"MyDbConnStr": "Server=tcp:mydbserver.database.windows.net,1433;Database=MyDb;Authentication=\"Active Directory Integrated\""
Finally, note that there are additional properties which can also be used in the connection string:
;User ID=myruntimeuser@mydomain.com;Persist Security Info=true;Encrypt=true;TrustServerCertificate=true;MultipleActiveResultSets=true
Welcome to .NET frameworks/runtimes hell.
Currently, the ActiveDirectoryIntegrated and ActiveDirectoryInteractive authentication options are not supported for .NET Core apps.
The reason is that starting with v3.0, EF Core uses Microsoft.Data.SqlClient instead of System.Data.SqlClient, and the most recent version of Microsoft.Data.SqlClient at this time (including the preview versions) supports these two options only on .NET Framework.
You can see a similar question in their issue tracker, Why does SqlClient for .Net Core not allow an authentication method 'Active Directory Interactive'? #374, as well as the documentation of the SqlAuthenticationMethod enum - ActiveDirectoryIntegrated (emphasis is mine):
The authentication method uses Active Directory Integrated. Use Active Directory Integrated to connect to a SQL Database using integrated Windows authentication. Available for .NET Framework applications only.
With that being said, use the workaround described in the question update, or wait for this option to eventually be implemented for .NET Core.
Upgrading the NuGet packages Microsoft.EntityFrameworkCore and Microsoft.EntityFrameworkCore.SqlServer to 6.0.1 and using Authentication=Active Directory Managed Identity in the connection string helped me resolve the issue.
UPDATE
If you use Azure MSI, please read this document:
https://learn.microsoft.com/en-us/azure/app-service/app-service-web-tutorial-connect-msi
PREVIOUS
Your problem may be that it is not configured in the portal. You can follow the official document to finish the configuration, then try again.
First, you need to create a SQL managed instance, which may take a long time. Then you need to configure an Active Directory admin and your database. When you have finished, you will find ADO.NET (Active Directory password authentication) under your SQL database -> Connection strings in the portal. You can copy and paste it into your code to solve the issue.
I have tried it myself, and it works for me. For more detail, you can see this post.

VSTS work item creation throttling issue with vsts-node-api

I'm developing a VSTS extension with a build task which should create up to 20,000 work items in a single build. Work items are created using the WorkItemTrackingApi/createWorkItem function of the vsts-node-api package. In the current implementation, the extension sends one request per work item, and VSTS starts to throttle after creating about 100 work items.
Following are the errors logged in the build console.
• 2017-01-08T12:35:13.1385597Z Error: connect ETIMEDOUT 11.11.111:111:111
• 2017-01-08T12:36:45.0090704Z Error: Failed Request: Internal Server Error(500) - TF246020: Microsoft SQL Server encountered an error while processing the results from one of the Team Foundation Server databases. The error may be caused by insufficient resources on the server. Wait a few minutes and try the operation again. If the problem persists, contact a SQL Server administrator.
2017-01-08T12:36:45.0090704Z ThrottlingMode = Unknown, MildResourceType = None, SignificantResourceType = None
Is there a way to create work items in bulk with vsts-node-api?
Please advise how to resolve this.
There are VSTS REST APIs for creating work items in batches, but vsts-node-api does not wrap them yet as of January 2017.
You may want to call the work item batch REST APIs of VSTS directly with your preferred JavaScript-based library.
Please refer to Create Large Amount of Work Items in TFS Using JavaScript REST API.
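To illustrate the shape of the batch call, here is a rough sketch (in Python for brevity; the JSON payload is identical from any JavaScript HTTP client). The account name, project, chunk size, and the exact endpoint/api-version are assumptions, so check them against the batch API documentation:
import base64
import json
import requests

account = "myaccount"   # assumption: your VSTS account name
project = "MyProject"   # assumption: target team project
pat = "***"             # assumption: a personal access token

url = "https://%s.visualstudio.com/_apis/wit/$batch?api-version=1.0" % account
auth = base64.b64encode((":" + pat).encode()).decode()
headers = {"Authorization": "Basic " + auth}

titles = ["Item %d" % i for i in range(1, 20001)]

# Send the work items in chunks instead of one request each.
for start in range(0, len(titles), 200):
    batch = []
    for title in titles[start:start + 200]:
        batch.append({
            "method": "PATCH",
            "uri": "/%s/_apis/wit/workitems/$Task?api-version=1.0" % project,
            "headers": {"Content-Type": "application/json-patch+json"},
            # The inner body is a string containing a JSON Patch document.
            "body": json.dumps([
                {"op": "add", "path": "/fields/System.Title", "value": title}
            ]),
        })
    response = requests.post(url, json=batch, headers=headers)
    response.raise_for_status()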
You can use the snippet below for NTLM authentication:
var httpntlm = require('httpntlm'); // from the httpntlm package

httpntlm.patch(options, function (err, res) {
    if (err) return console.log(err);
    console.log("patch complete");
    console.log(res.body);
});

Why is my OpsHub migration periodically failing part way through for the included error?

I have been having constant issues with the OpsHub Migration Utility erroring at different stages in setting up a migration, but when I finally get a migration going it fails after 15 minutes or so with the following error:
"Unable to get prject map for system : {my on-premise TFS url} to {my VSTS url}| TFS Source 1462312056714 Source TFS 1462312056719 due to :
OpsHub-014301: Error in getting project metadata for system :
TFS.OH-TFS-Connector-0048: Failed to login to Team Foundation Server : {my on-premise TFS url} with user : null.
Server Error : TF400324: Team Foundation services are not available from server {my on-premise TFS url}.
Technical information (for administrator):
The underlying connection was closed: An unexpected error occurred on a receive.. You may have given wrong credentials or are not valid now..."
I previously tried a migration with ~6,500 changesets (our main project) and got a similar error after an hour, except that none of the changesets had gone through. So I tried migrating a smaller team project with 189 changesets, and it failed after 72 changesets.
This smaller migration then restarted on its own and was able to get to 148/189 changesets before failing again with the same error.
Eventually I was able to complete all 189 changesets, but it took a long time to do so (2 hours). I can't even begin to hope that a migration of 6,500 changesets will complete in a reasonable amount of time. It seems I shouldn't be having credential issues after starting a migration, so why is this error happening intermittently? And why is the user null?
I will add that prior to starting the smaller migration, I did clear the following:
1) %localappdata%\Microsoft\Team Foundation\3.0\Cache
2) %localappdata%\Microsoft\Team Foundation\4.0\Cache
3) %localappdata%\Microsoft\Team Foundation\5.0\Cache

Unable to import data into ArcSDE (9.3.1) and PostgreSQL (8.3.0)

I've just installed ArcGIS Server Enterprise Advanced with ArcSDE and PostgreSQL, on a virtual Windows Server 2008 box.
After installing, I've been trying to import a feature class (stored in a shapefile) into the geodatabase.
In order to do this I've created a connection to ArcSDE (not a direct database connection) using ArcCatalog -> Database Connections -> Add Spatial Database Connection. I've tested the connection successfully.
However, when I run the tool "Feature Class to Geodatabase", I get the following error message:
Failed to convert D:\NorthEnergyRiskMaps\RiskMapsLibraryTests\Resources\ProbabilityTools\TestFacies.shp. ERROR 000210: Cannot create output Database Connections\s2008NE.sde\arcgis.sde.TestFacies
Failed to execute (CopyFeatures).
According to this blog post, this error is a generic "catch-all".
The blog post suggests some debugging steps, which I've followed. I've had ArcMap create an intercept file; however, I'm none the wiser after looking at it (users at the ESRI forum say there are no errors in the intercept file). Maybe someone with more experience could interpret it better...
Also, I've scanned through the ArcSDE and PostgreSQL logs... The only reported errors are in the latter log; multiple SELECT queries are failing because the target tables don't exist. Some examples:
2009-09-29 13:33:38 CEST ERROR: relation "sde.sdb_surveydatasets" does not exist
2009-09-29 13:33:38 CEST STATEMENT: SELECT 1 FROM arcgis.sde.SDB_SurveyDatasets WHERE 1 = 0
2009-09-29 13:33:38 CEST ERROR: relation "sde.sch_dataset" does not exist
2009-09-29 13:33:38 CEST STATEMENT: SELECT 1 FROM arcgis.sde.SCH_DATASET WHERE 1 = 0
Help would be much appreciated.
Yes, ArcView is restricted to editing in file and personal geodatabases. You need an ArcEditor or higher license to edit ArcSDE.
See the section "Editing with ArcView" on this page.
Try the 'Feature Class to Feature Class' geoprocessor tool instead of 'Feature Class to Geodatabase'. Sometimes the individual geoprocessor tools execute differently or report errors differently.
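For what it's worth, here is a minimal sketch of driving that tool from the ArcGIS 9.3 geoprocessor scripting interface; the shapefile path is a placeholder and the .sde connection file name is taken from the question:
import arcgisscripting

gp = arcgisscripting.create(9.3)

# Assumptions: local path to the source shapefile and the SDE connection file.
in_shapefile = r"C:\data\TestFacies.shp"
out_workspace = r"Database Connections\s2008NE.sde"

# Scripted equivalent of the 'Feature Class to Feature Class' tool.
gp.FeatureClassToFeatureClass_conversion(in_shapefile, out_workspace, "TestFacies")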
If that doesn't work, try creating a new feature class directly in the SDE workspace and import the schema from the shapefile. Once it is successfully created, import data into the feature class from the shapefile.
I recommend first trying to create a new feature class from scratch to see if that works in your PostgreSQL environment, and then working on importing.