Ops Hub User Matching - azure-devops

Can you please tell me where the OpsHub Visual Studio Online Migration Utility pulls the user data from when doing a match, both on-prem and online? I am trying to match users for just one project, which has 41 users, yet the utility wants me to match 119 users, many of whom are not even associated with any project on my on-prem TFS server. Additionally, where is the online user information pulled from? Some users are displayed as email addresses, others seem to be usernames. Obviously I need to straighten this out so I can map to the correct users. Also, some users exist on-prem that will never map to any online user, so how can I get rid of them?

Users are pulled from the Project Collection Valid Users group at both endpoints. This is done to make the user mapping explicit: a user may have made changes in a project but no longer be a member of it, and if such a user is not mapped, information is lost when that data is migrated.
For on-prem users who will never exist in VSTS, you can map them to any other user as a default (or create a dedicated placeholder user for this purpose). All changes made by those users in your on-prem TFS will then be shown as made by the user they were mapped to.
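If you want to sanity-check the list the utility presents, you can enumerate the users each endpoint actually reports. On-prem, inspect the Project Collection Valid Users group in the collection's security settings. For the hosted side, here is a minimal C# sketch against the Azure DevOps Graph REST API; the organisation name and PAT are placeholders, and the host/api-version may need adjusting for your account:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ListHostedUsers
{
    // Placeholders: substitute your own organisation and a PAT with Graph (read) scope.
    const string Organization = "my-org";
    const string PersonalAccessToken = "<pat>";

    static async Task Main()
    {
        using var client = new HttpClient();

        // A PAT is sent as the password half of a basic-auth header with an empty user name.
        var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(":" + PersonalAccessToken));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

        // The Graph API returns every user the organisation knows about, i.e. the pool
        // the migration utility asks you to map, not just one project's members.
        var url = $"https://vssps.dev.azure.com/{Organization}/_apis/graph/users?api-version=7.1-preview.1";
        Console.WriteLine(await client.GetStringAsync(url));
    }
}
```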

Keeping users out of specific MarkLogic databases

My question is somewhat similar to this question, but not quite:
Hide a marklogic database to specific user (permissions)
Background: until now, developers who use database X were all admins on the server (a historic configuration that we have recently inherited), but now we want to add new developers to the server who definitely won't be admins, and who will work in a new database Y on the same server.
What we want to do is have several groups of developers using the same MarkLogic 10 server, but have it so that developer group X can only work in their database X, and developer group Y can only work in database Y. We don't care if they can see all databases on the server.
Does this mean we have to apply permissions to every document in every database, or can we control this via roles that limit access to specific databases?
Can someone suggest the right way to achieve this please?
Thanks in advance.
You have two tools to work with:
Granular privileges, which allow you to limit the scope of a privilege to a specific resource (such as a database or forest)
Document permissions, set per document to reflect the intended users of each database, as you already mentioned (a sketch of this follows at the end of this answer)
However, in my experience, I've generally found this use case is better served by having many small dev clusters rather than one large one, as resource contention (one app team pushing CPU to 100%) can become too much of an issue. It is pretty quick and painless to spin up and tear down dev clusters on AWS or Azure. Or, if you're self-hosting, you could look at running multiple MarkLogic containers on a single host.
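If you do stay on a single shared cluster, the document-permissions route boils down to writing every document with read/update capabilities reserved for the owning group's role. A rough C# sketch against the MarkLogic REST Client API is below; the port, role name, credentials, and the perm: request parameters are assumptions to verify against your own REST instance and security configuration:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class InsertWithPermissions
{
    static async Task Main()
    {
        // MarkLogic REST instances typically use digest auth; credentials are placeholders.
        var handler = new HttpClientHandler
        {
            Credentials = new NetworkCredential("dev-x-user", "password")
        };
        using var client = new HttpClient(handler);

        // Hypothetical role for developer group X: only holders of this role
        // get read/update on the document, so group Y simply cannot touch it.
        var url = "http://localhost:8000/v1/documents" +
                  "?uri=/projects/x/example.json" +
                  "&perm:dev-x-role=read" +
                  "&perm:dev-x-role=update";

        var body = new StringContent("{\"hello\":\"database X\"}", Encoding.UTF8, "application/json");
        var response = await client.PutAsync(url, body);
        Console.WriteLine(response.StatusCode);
    }
}
```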

Fetch all metadata of Salesforce

I've been trying to implement a way to download all the changes made by a particular user in Salesforce using a PowerShell script and create a package from them. The changes could be anything, added or modified: Apex classes, profiles, Account, etc., based on last modified by user, component ID, timestamp, and so on. Below is the URL that exposes the API, but it does not explain any way to do this by using a script.
https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_listmetadata.htm
Does anyone know how I can implement this?
Regards,
Kramer
Salesforce orgs other than scratch orgs do not currently provide source tracking, the feature that makes it possible to pinpoint user changes in metadata and extract only those changes. Source tracking is consumed by an SFDX/Metadata API client, like Salesforce DX or CumulusCI (disclaimer: I'm on the CumulusCI team).
I would not try to implement a Metadata API client in PowerShell; instead, harness one of the existing tools to do so.
Salesforce orgs other than scratch orgs don't provide source tracking at present. To identify user changes, you can either
Attempt to extract all metadata and diff it against your version control, which is considerably harder than it sounds and is implemented by a variety of commercial DevOps tools for Salesforce (Gearset, Copado, etc.).
Have the user manually add components to a Change Set or Unmanaged Package, and use a Metadata API client as above to retrieve the contents of that package. (Little-known fact: a Change Set can be retrieved as a package! A sketch of this follows at the end of this answer.)
To emphasize: DevOps on Salesforce does not work like other platforms. Working with the Metadata API requires a fair amount of time investment and specialization. Harness the existing work of the Salesforce community where you can, but be aware that the task you are laying out may be rather more involved than you think, and it's not necessarily something you can just throw together from off-the-shelf components.
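To make the second option concrete: assuming the user has added their components to a change set named "My Change Set", you can drive an existing client such as the sfdx CLI from your own tooling instead of re-implementing the Metadata API (the same one-line command works just as well from PowerShell). The flags below are from older sfdx releases, and the change set name and org alias are placeholders, so check `sfdx force:mdapi:retrieve --help` before relying on them:

```csharp
using System;
using System.Diagnostics;

class RetrieveChangeSet
{
    static void Main()
    {
        // Hypothetical change set name and org alias; on Windows the CLI may
        // need to be invoked as "sfdx.cmd" rather than "sfdx".
        var psi = new ProcessStartInfo
        {
            FileName = "sfdx",
            Arguments = "force:mdapi:retrieve " +
                        "--packagenames \"My Change Set\" " +
                        "--retrievetargetdir ./retrieved " +
                        "--targetusername my-org-alias " +
                        "--wait 10",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        using var process = Process.Start(psi);
        Console.WriteLine(process.StandardOutput.ReadToEnd());
        process.WaitForExit();
    }
}
```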

How can you move Azure Devops organisations to a different tenancy

We currently have an Azure DevOps organisation, containing several projects, related boards, etc., linked to a specific Azure Active Directory and tenant ID.
Does anyone know if there is a way I can move the organisation and all child objects to a new tenancy/Azure Active Directory?
We need to do this because we wish to decommission the original Active Directory.
I've googled for solutions and can see that other people were waiting for Microsoft to provide a solution.
I've done the same using the following instructions: https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/change-azure-ad-connection?view=azure-devops
The trick here is not to use a Work or School account.

Publishing and changes in workbook for tableau online

I am working on an internal reporting dashboard project. There are mainly three roles/levels for the internal reporting dashboard, such as higher management, project management, etc.
The breakdown of information for each role/level is different from the other roles.
For the internal reporting dashboard we have to create a database (let's say D, SQL Server) whose data will come from three databases (let's say A, B, and C) after integrating them.
As per my research so far, we can connect directly to database D using a Tableau live connection in Tableau Desktop (Professional edition) and use it to create a dashboard.
To host that workbook for users, I can use Tableau Online for publishing, and to make data visible according to the roles I can use filters to restrict the data.
Now my questions are:
1. Is this workflow right? Am I missing any step or process that I would need to cater for?
2. How will changes be reflected in the dashboard once it is published? Let's say I have to add a filter/parameter to the dashboard. Do I need to make the changes to the workbook using Tableau Desktop, and will they be reflected automatically,
or do I have to publish it to Tableau Online again? Please educate me on this too.
Thanks for the assistance. I have attached a proposed workflow image too.
Regards,
Manail Pasha
WORKFLOW IMAGE
If your system is not a transactional database, I would avoid a live database connection. I would recommend using data blending techniques to create a data extract (a .tde file).
I would publish the dashboard with user filters that enable row-level security, ensuring users can only see certain data.
Here is a diagram that I would follow if I were you.
To add a filter/parameter, you can either do it in Tableau Desktop and publish the workbook to Tableau Online, or log in to Tableau Online, add the filter/parameter in edit mode, and save it; the change will be reflected with either method.
If your data changes frequently, I would recommend going with a live connection. Extract refreshes can be done incrementally, but the appropriate fields need to be chosen to do it (you should also consider how to handle negated entries). It is up to you to decide whether to go with an extract or a live connection.

How to implement security on a local database created with Entity Framework (6.1)?

We have a desktop application that uses a local database (SQL Server 2012 LocalDb).
We do not want the end user to be able to modify the database directly, and we want to restrict viewing the database contents to certain users.
Moreover, we want to restrict certain actions that can be performed from within the application depending on the authorization level of the user that is logged in.
How can the first requirement be fulfilled? Is it possible through code-first?
Can the second requirement be integrated with the first?
Currently this is not supported out of the box. However, since EF 6 you can create your own migration operations, which lets you encapsulate granting rights to certain users and manage user permissions through migration steps.
You can read about creating a custom migration operation in this post: http://dolinkamark.wordpress.com/2014/05/03/creating-a-custom-migration-operation-in-entity-framework/
and this post has an example closer to your question: http://romiller.com/2013/02/27/ef6-writing-your-own-code-first-migration-operations/
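For illustration, here is a minimal sketch of the simpler route: issuing the permission statements directly from an empty code-first migration via DbMigration.Sql(). The role and user names are hypothetical, and the linked posts show how to wrap the same idea into a reusable custom migration operation.

```csharp
using System.Data.Entity.Migrations;

namespace MyApp.Migrations
{
    // Generated with Add-Migration RestrictDatabaseAccess (no model changes needed),
    // then filled in by hand. User and role names below are placeholders.
    public partial class RestrictDatabaseAccess : DbMigration
    {
        public override void Up()
        {
            // Role that only the application's own logins will be added to.
            Sql("CREATE ROLE app_readwrite;");
            Sql("GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO app_readwrite;");

            // Block ad-hoc querying by a specific (hypothetical) end-user login,
            // assuming that login already exists as a database user.
            Sql(@"DENY SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO [DOMAIN\SomeEndUser];");
        }

        public override void Down()
        {
            Sql(@"REVOKE SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo FROM [DOMAIN\SomeEndUser];");
            Sql("REVOKE SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo FROM app_readwrite;");
            Sql("DROP ROLE app_readwrite;");
        }
    }
}
```

Running Update-Database then applies the grants along with the rest of your schema changes, so the permission setup travels with the database itself.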