Problems when migrating from on-premises TFS to VSO using OpsHub - azure-devops

We are trying to migrate from our on-premises TFS 2010 to Visual Studio Online (VSO). We did it a first time just to test it out. The OpsHub application stopped a couple of times, but we finally succeeded. We did our testing and deleted the project from VSO.
The second time didn't go so well. It gave the following error after migrating many, but not all, of our change sets:
com.opshub.eai.config.exception.ConfigServiceException:
OH-CONFIG-0101: Exception while calling service, underlying cause :
could not execute query.
We can press OK and navigate to another view in the application, but the same error occurs on every view. We have restarted the application, the service and even the server, and we always get the same error. We decided to uninstall the application and reinstall it. That solved the problem, but the error appeared again after we had migrated about 60% of our source code (about a day later).
So, after three reinstallations we are about to give up. It doesn't crash on the same change set every time.
Does anyone know a way around this? Is there any way we can see the real cause of the problem and fix it?
We are using version V1.3.000B000 (latest version).
Not sure if this is the proper place to post the question, but OpsHub refers people to Stack Overflow for support.

Related

Deploying updated SSIS package doesn't work

The problem
So I am running into an interesting issue. I have been tasked with changing a query in a simple SSIS package in Visual Studio 2015, something I have done multiple times in the last six months.
After changing the package and deploying it (to an installation of SQL Server 2016, without errors!) I noticed that the execution of the package (scheduled with SSMS) produced the same result as the pre-update package, meaning the requested changes hadn't taken effect. As a test, I executed the package directly from VS2015 and got the result I wanted.
Ever since, I have been running tests and trying to find a solution. The problem seems to lie with the receiving side of the deployment process.
What I have tried
Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed, but the package didn't show up, so I had to restore an old version of the project.
Deploy the package from multiple different computers with access to VS2015 and the source code. No change...
Deploy the package to a new (empty) SSMS project: package does not appear in the project. This leads me to believe that the old package is kept when I publish the new version to the existing project in SSMS.
Regenerating/rebuilding the package in VS2015; frankly, this was never necessary and probably doesn't do anything for an SSIS package, but it may help you get an idea of my skill level.
In the past we have had issues with the encryption level blocking the deployment of packages. I have verified these settings and found no issues.
I have verified whether any updates were recently installed on the database server, which does not seem to be the case.
I have (of course) tried to google the issue, which is tricky due to the lack of errors. I have found the following links that describe the same or a similar issue, but their solutions haven't helped:
https://dba.stackexchange.com/questions/259672/ssis-package-not-being-deployed
Deployed SSIS Package not reflecting changes made to package
What is still left to try
Rebuild project from scratch to see if that version is deployable.
Unfortunately I don't have a lot of experience with this subject and no colleagues or contacts to ask for help.
Thanks in advance.
My workaround
After quite a bit of time attempting to solve the issue, I have resorted to working around the problem by manually importing the .ispac file into the database. While this is not the prettiest of solutions, at least it's a workable one. If anyone has any other ideas I'll gladly hear them, but for now the issue isn't nearly as pressing as it was.
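For completeness, here is a minimal sketch of what that manual import can look like in T-SQL, using the standard SSIS catalog procedure; the file path, folder name and project name below are placeholders, not the real ones from my project:

-- Sketch: deploy an .ispac straight into the SSIS catalog (SSISDB).
-- The path, folder and project names are placeholders for your own environment.
DECLARE @ProjectBinary varbinary(max);

-- Read the .ispac from disk (the SQL Server service account needs read access to this path).
SELECT @ProjectBinary = BulkColumn
FROM OPENROWSET(BULK N'C:\Deploy\MyPackageProject.ispac', SINGLE_BLOB) AS ispac;

-- Deploy into an existing catalog folder; this replaces the project if it already exists.
DECLARE @OperationId bigint;
EXEC SSISDB.catalog.deploy_project
     @folder_name    = N'MyFolder',
     @project_name   = N'MyPackageProject',
     @project_stream = @ProjectBinary,
     @operation_id   = @OperationId OUTPUT;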
From your post: "Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed, but the package didn't show up."
Are you 100% sure you are deploying it to the same project, on the same server and database? Are you refreshing after you deploy?
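If it helps, one way to double-check that, independent of the Object Explorer tree, is to query the catalog directly and look at the deployment timestamps. A quick sketch, assuming only the standard SSISDB catalog views:

-- Sketch: list projects in the SSIS catalog with their last deployment time,
-- to confirm which folder/project the new build actually landed in.
SELECT f.name AS folder_name,
       p.name AS project_name,
       p.last_deployed_time,
       p.deployed_by_name,
       p.object_version_lsn
FROM SSISDB.catalog.projects AS p
JOIN SSISDB.catalog.folders  AS f ON f.folder_id = p.folder_id
ORDER BY p.last_deployed_time DESC;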

SQLite not listed in Data Sources window

VS2015 Community is not showing SQLite in the list of available data sources in one place, but is showing it in another.
If I click New Connection button in Server Explorer and click Change, I get the following list of Data Sources:
If I add a new item to my project > choose Entity Model > from existing database > New Connection, I get the following list of Data Sources:
How can I get SQLite in the New Connection data sources list?
Background
The problem started when my existing EDMX failed to load with the infamous error message
The Operation could not be completed: Invalid Pointer
This error can be fixed by deleting the ComponentModelCache folder, as described in this post. This method has worked for me in the past, but not this time. I finally decided to recreate the EDMX from scratch. Since then I have been facing this issue.
A few things that might give some hint:
I have recently installed VS2017 Community side-by-side with VS2015. VS2017 can open the existing EDMX just fine, but cannot do Update from Database, so I came back to VS2015.
I uninstalled and reinstalled System.Data.SQLite provider several times, thinking that this might be a registration issue. Didn't do any good.
Note that VS2017 support is not there yet on System.Data.SQLite's download page. I'm using the last available version that supports VS2015 (version number 1.0.104.0).
The good news is that the issue is finally fixed, at least for VS2015. The bad news is that I don't know exactly what did the trick, so I'll list everything I tried and maybe this will help someone in the future. These steps are not in any particular order.
Uninstall all SQLite packages from NuGet.
Uninstall Entity Framework package too.
Reinstall all these packages.
Remove and reinstall the latest version of SQLite provider (1.0.104.0 as of this writing).
Use VS2015 only. VS2017 is currently not supported by SQLite provider.
Clear ComponentModelCache folder and restart Visual Studio.
I found an easy solution: just install the extension below from the marketplace and SQLite will be available in the data sources list.
SQLite/SQL Server Compact Toolbox

OpsHub - unable to migrate project

I am trying to migrate one project from on-premises TFS to VSTS. I am getting the following error:
Ops Hub error
I have successfully migrated a couple of projects before, but this one won't go through. I did notice one thing while I was migrating: it gave a different name to the default collection, as you can see in the screenshot. I have deleted the account and recreated it, and I am still getting the same error. Please advise.
Thanks
The issue seems to be a recent change in VSTS, as Nicolas pointed out. Currently, the OVSMU tool will not be usable directly. We will be releasing a patched version early next week which will solve this issue.
[Update]: This change is part of the following change announcement: https://www.visualstudio.com/news/2016-apr-13-vso
I've been able to solve this "page not found" exception by using Fiddler and the following rule:
oSession.url = oSession.url.Replace("/NameOfTheCollection/Services/v3.0/","/defaultcollection/Services/v3.0/");
But I face another issue with OpsHub.
Finally, I used the TFS migration tools plus the Fiddler rule described above, and I'm able to move forward with this migration.
Another option is to contact Microsoft in order to ask for early access to the TFS - VSO migration tool: https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/2264946-import-data-from-tfs-on-prem-into-vs-team-services
BR

Slow migration to Visual Studio Online from TFS 2013.4 (OpsHub)

I am trying to migrate from TFS 2013 Update 4 to Visual Studio Online. I have had real difficulties using the OpsHub Migration Utility, so I have come here seeking advice on how I may go forward.
I have 65 Team Projects to migrate, some with source code, others fairly empty with just a few work items, so they differ in size a fair bit. I want to maintain the branches we currently have, which cross Team Projects, so I initially tried selecting all 65 projects for migration. This took 11 hours at the "creating configuration" stage of the utility, and then at the end the tool complained it could not communicate with the OpsHub Visual Studio Online Migration Utility service. So back to square one.
The next approach was to batch the projects into sets of 10. The validation and configuration-creation stage was done in 10 minutes for each set, which was a big leap forward, and the migration actually started. It has currently been running for about 17 hours. It has done 29,000 work item revisions out of 480,000 across all Team Projects, so I think this may well take a couple of weeks to clear. It hasn't started on the version control data yet, which I hope it will do at a later stage, but it currently shows as not running.
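Rough arithmetic on those numbers: 29,000 revisions in 17 hours is about 1,700 revisions per hour, so the remaining ~451,000 revisions would need on the order of 265 hours, roughly 11 more days of continuous running before the version control data even starts.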
This is running on a fast i7 box with 16 GB RAM and SSDs, on a 100 Mbps internet connection.
I have emailed OpsHub to find out whether the commercial version will perform better. Any suggestions on what I could do better are welcome, including any alternatives to the migration tool.
The configuration creation taking a long time is assumed to be due to an issue communicating with the OpsHub Visual Studio Online service and waiting until the timeout.
The work item migration time seems close to what is expected; migration of the version control data will start once the work item migration is completed.
Also, since there is a large amount of data, it is taking time, but the migration will keep running in the background; you can use your source TFS instance (if required) without stopping the migration.
For more details on Professional Services you can drop an email to support@opshub.com.
There is another tool, named TFS Integration Platform, that was developed by Microsoft for TFS migration. You can try it to see whether it can migrate the data faster.

OpsHub Migration Tool Stuck at "Reading and Analysis"

I started a migration of source code (not work items) from TFS to Visual Studio Online, and it has been running for more than 24 hours but the status still shows "Reading and Analysis". CPU usage for the application is showing 0.
Should I keep waiting, or is there a way to find out if an exception has been thrown? I am not too familiar with debugging procedures for Java applications.
Which version are you using?
There is a performance issue with labels which is fixed in version 1.0.1.005.
Can you try the current version?
Error: TF14045: The identity 19b7a3e0-b25b-4c48-90a8-813202d6d1b4 is not a recognized identity.
The above error occurs while connecting to the TFS server. That is why the utility continuously displays "Reading and Analysis": every time the utility connects to the source TFS to fetch the total number of change sets and labels, it encounters the same error.
According to Microsoft blogs, people encounter this type of issue after an upgrade or database restore of a collection. After such a process, when the user tries to connect to TFS, it always throws the same error you have encountered. Have you performed either of these processes on the collection you are trying to migrate? You can refer to the following thread, which contains a resolution for the same error.
http://social.msdn.microsoft.com/Forums/vstudio/en-US/b67ad1d6-d763-4602-bf14-489a3389a30d/getting-tf14045-the-identity-guid-is-not-a-recognized-identity-after-db-restored?forum=tfsadmin
Let us know if you are not able to resolve this with the above information, and also if you have not performed any upgrade or database restore procedure. In that case we would like to know your source TFS version and the upgrade history of this instance (if any).