How to import users in CRM 2011 with source GUID

We have three organization tenants: Dev, Test and Live, all hosted on premises (CRM 2011 [5.0.9690.4376] [DB 5.0.9690.4376]).
Because of the way dialogs use GUIDs to reference records in lookups, we aim to keep the GUIDs of static records the same across all three tenants.
While all other entities work fine, I am failing to import users while also maintaining their GUIDs. I am using Export/Import to move the data from the master tenant (Dev) into the Test and Live tenants. This is very similar to what the Configuration Migration tool does in CRM 2013.
The issue I am facing is that for all other entities I can see the GUID field and can therefore map it in the import wizard, but no such field shows up for the SystemUser entity when running the wizard. For example, with Account I export an account, amend the CSV file and import it into the target tenant. When I do this, I map the target's AccountId to the source Account, and as a result the account's AccountId is the same in both source and target.
At this point I am about to give up, but that would mean every dialog that uses a User lookup will fail.
Thank you for your help,

Try the following steps. I would strongly recommend trying this on an old, out-of-use tenant before attempting it on a live system. I am not sure whether this is supported by Microsoft, but it works for me. (One other thing: you will have to manually assign business units and roles after the import.)
Create an Advanced Find. Include all required fields for the SystemUser record. Add criteria that select the list of users you would like to move across.
Export.
Save the file as CSV (this will reveal the first few hidden columns in Excel).
Rename the primary key column (in this case, User) and remove all the other columns marked Do Not Modify.
Import the file and map this User column (containing the GUID) to the User field in CRM.
After the import, check that the GUIDs match in both tenants.
Good luck.

My only suggestion is that you could try to write a small console application that connects to both your source and destination organisations.
Using that, you can duplicate the user records from the source to the destination, preserving the IDs in the process.
I can't say it'll work 100%, but I can't immediately think of a reason why it wouldn't. This is assuming all of the users you're copying over don't already exist in your target environments.
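A minimal sketch of what such a console application might look like with the CRM 2011 SDK. The organisation URLs, column list and credential handling below are placeholders, and on-premise user creation may additionally require a business unit to be supplied:

using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;

class CopyUsersPreservingIds
{
    static void Main()
    {
        // Placeholder URLs: point these at your Dev (source) and Test/Live (target) organisations.
        IOrganizationService source = Connect("http://crmdev/DevOrg/XRMServices/2011/Organization.svc");
        IOrganizationService target = Connect("http://crmtest/TestOrg/XRMServices/2011/Organization.svc");

        // Pull the users to migrate from the source tenant (add whatever columns you need).
        var query = new QueryExpression("systemuser")
        {
            ColumnSet = new ColumnSet("domainname", "firstname", "lastname", "internalemailaddress")
        };

        foreach (Entity sourceUser in source.RetrieveMultiple(query).Entities)
        {
            // Re-create the user in the target with the same GUID as the source record.
            var targetUser = new Entity("systemuser") { Id = sourceUser.Id };
            targetUser["domainname"] = sourceUser.GetAttributeValue<string>("domainname");
            targetUser["firstname"] = sourceUser.GetAttributeValue<string>("firstname");
            targetUser["lastname"] = sourceUser.GetAttributeValue<string>("lastname");
            targetUser["internalemailaddress"] = sourceUser.GetAttributeValue<string>("internalemailaddress");

            // Create() will fail if a user with this GUID already exists in the target;
            // business units and security roles still need to be assigned afterwards.
            target.Create(targetUser);
        }
    }

    static IOrganizationService Connect(string organizationServiceUrl)
    {
        var credentials = new ClientCredentials();
        credentials.Windows.ClientCredential = System.Net.CredentialCache.DefaultNetworkCredentials;
        return new OrganizationServiceProxy(new Uri(organizationServiceUrl), null, credentials, null);
    }
}

Run it under an account that has rights to create users in the target organisation; as with the export/import approach above, business units and roles would still need to be assigned afterwards.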

I prefer to resolve these issues by creating custom workflow activities. For example, you could create a custom workflow activity that returns a user record for an input domain name passed as a string.
This means your dialogs contain only shared configuration values, e.g. mydomain\james.wood, which are used to dynamically find the record you need. Your dialog is still linked to a specific record, but without having to encode the source GUID.
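A rough sketch of such an activity against the CRM 2011 SDK; the class name, argument names and retrieved columns here are illustrative rather than taken from an existing solution:

using System.Activities;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Workflow;

public class GetUserByDomainName : CodeActivity
{
    [Input("Domain Name")]
    public InArgument<string> DomainName { get; set; }

    [Output("User")]
    [ReferenceTarget("systemuser")]
    public OutArgument<EntityReference> User { get; set; }

    protected override void Execute(CodeActivityContext context)
    {
        IWorkflowContext workflowContext = context.GetExtension<IWorkflowContext>();
        IOrganizationServiceFactory factory = context.GetExtension<IOrganizationServiceFactory>();
        IOrganizationService service = factory.CreateOrganizationService(workflowContext.UserId);

        // Look up the systemuser whose domainname matches the value supplied by the dialog.
        var query = new QueryExpression("systemuser") { ColumnSet = new ColumnSet("systemuserid") };
        query.Criteria.AddCondition("domainname", ConditionOperator.Equal, DomainName.Get(context));

        Entity user = service.RetrieveMultiple(query).Entities.FirstOrDefault();
        if (user != null)
        {
            User.Set(context, user.ToEntityReference());
        }
    }
}

The dialog then stores only the domain name string, and the output EntityReference can be used wherever a User lookup is needed, so no environment-specific GUID is baked into the dialog definition.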

Related

What is the best way to import data into holochain from another source, like mongo?

MongoDB => Holochain Rust DHT
How to import, if possible
If I am using a different app backend, like Mongo, and I get my Holochain set up correctly and configured, is there a way to get the data from Mongo to Holochain? How would I do that?
Definitely technologically possible; you could write a nodejs script, fire up a Holochain container with the holochain-nodejs library, and import all the data as one agent. Then when users join the HC-based network, they vouch for their identity in some way and 'claim' all the data as theirs.
Here's a sketch of how it could look:
You (let's call you 'agent 0') import all the data.
For each user, you create an 'anchor' with the user's ID (I'll explain anchors in a sec) and link each piece of data to the anchor. You also record that user's password hash as a private entry on your own source chain.
A user joins the network and is required to prove continuity of identity. They do this by using node-to-node messaging to send their user ID and their password hash to you privately.
You authorise them to claim their identity by publishing an entry that says that "agent public key x = user ID". (You would probably want to link from your authorisation entry to their user ID anchor and their public key too, for convenience's sake.)
The user collects all their data by asking for all the links to their user ID anchor.
The user then publishes each piece of their data to their own source chain as a way of 'claiming' ownership of it.
Now, every redundant copy of the data in the DHT has two authors in its metadata fields -- you and the user that actually owns the data. Peers validate that piece of data by saying, "Is agent 0 already the author of this piece of data? If so, has agent 0 published an authorisation entry that says that the new author of this data is allowed to claim/republish it?"
Problems with this approach (not insurmountable):
Agent 0 has to be online all the time because they never know when a new user is going to sign up and try to claim their data.
Agent 0 has to import a ton of data. (I don't think it'd be vastly time-prohibitive though.)
For relational data, there's the chicken-and-egg problem of how to create links if the data doesn't exist. I'm thinking not of linking data to data -- that can be done on initial import -- but linking data to humans, who now have a public key which might not exist on the DHT yet because they haven't joined the network. That would always have to happen per-user once they join, and it could create some cyclic dependency problems.
Anchors
Re: anchors, an anchor is just a pattern that consists of a base and a link -- the base is a simple string, so it's easy for anyone who knows the string to find it by hash. It acts as, well, an anchor to hang links off of. That's why I'm recommending using it to connect legacy user IDs to pieces of content. You can get sample source code for implementing the anchor pattern at https://github.com/holochain/mixins/tree/master/anchors (note that this is for the legacy version of Holochain, so it's written in JavaScript).
(answer provided by pauldaoust)

Master Data Services - Domain based attributes

We are using Master Data Services as an MDM solution for our SQL Server BI environment. I have an entity containing a first name and a last name, and I have created a business rule that concatenates these two fields to form a full name, which is then stored in the "name" system field of the entity.
I use this as a domain-based entity in another entity, so the user can see the full name before linking it as an attribute in the second entity.
I want to restrict users from capturing data against the name attribute in the first entity, because the business rule handles the logic to populate this attribute. I have read that there are two ways to do this:
Set the attribute's display width to zero. This does not seem to work: the Explorer view still shows a narrow version of the field in the rows, and the user can still edit the field in the detail pane.
Use security to make the attribute read-only. I have tried different combinations of this, but it seems you cannot use this functionality for the name field (a system field).
This seems like pretty basic functionality that I require and it seems that there is no clear cut way to do this in MDS.
Any assistance will be appreciated.
Thanks
We do exactly the same thing.
I tested it, and whether you create a new member, or edit an existing member, the business rule just overwrites the manual input value in the name attribute.
Is there a specific business reason why you need to restrict data input in the name field? If it is for UX reasons, you can change the display name of the name attribute to something like 'Don't populate', or alternatively make it a '.', so that users won't know to input anything there.

Importing data into Address entity in CRM 13 failing

I have some Account info that contains multiple addresses which I need to enter into CRM 13 for a client. I was instructed by my PM to create a separate CSV doc with the duplicate accounts and their info and import them into the Address entity, and that this should tie those records to their respective account records. The problem I am having, though, is that every time I have tried the import, the process fails. When I look at the log, there are two references to the data failing to import because of a parent ID issue. I get both this error: "The parent is not valid"; and this one under the last column of the log: "parent id not set for address type 1701".
I have no idea what parent id I am supposed to map to, because there aren't any to map to when on the import field mapping set up. I have tried creating a lookup field on the Address entity that calls back to the Account form, but that is not allowed by the system. I have relabeled ALL the fields I have to map on the CSV doc to match the proper entity names, and it fails. I have even tried to map it to the one primary key field name on the drop down, and it fails. When I look at the fields individually, there is only one lookup field that goes back to the Account data: The Parent field. It's the ONLY one I can think I could use, but when you go into the form editor it doesn't exist. This seems to be the only way I can do this like my PM wanted.
I am at a loss. I have scoured the internet, and I can't find an answer to how to resolve this. If anyone knows how I can import this data so that it ties to the Account entity, I would greatly appreciate it.
Turns out that CRM 15 has some issues with showing fields in Firefox, at the moment, but not with IE. And it was the last thing I thought to look at, until a coworker brought it up. Problem solved.

Maximo Web Service Data Filter

I've created an enterprise web service in Maximo that uses EXTSYS1. In EXTSYS1 I've created a duplicate of MXPERSONInterface and managed to create a query from it (Sync was the default). Now that I've finished my web service, I can successfully query Maximo from a SoapUI client and get all the person data, but what I'd like to know is: can I select which data I want to export in my response? For example, ignoring everything except name/last name/email.
If anyone has done that, or knows how with any other MBO, any help would be very much appreciated. The thing is, I don't want all the raw data in my response; I want to make it as user-friendly as I can.
There is a way to do import/export of data via web services that are dynamically accessed from external applications.
Another thing to note when you're accessing pre-defined object structures in this way is that the response will always contain every single field that exists in that object structure.
Below is a brief tutorial on how to filter that data so that when you query your object structure you only get a subset of the data in the response. For the sake of this tutorial I will use MXPERSON and will export Firstname, Lastname, City, Country and Postalcode.
First go to Integration > Object Structures > Create New Object Structure.
Name it My_MXPERSON, set it to be consumed by INTEGRATION, set the Authorized Application to PERSON, add a new row under Source Objects and select PERSON from the object list. Now go to More Actions > Include/Exclude Fields. Here you should un-check everything except Firstname, Lastname, City, Country and Postalcode (only those five should remain checked). Click Save.
Now we need to create an enterprise service by going to Integration > Enterprise Services > New Enterprise Service. Call your service My_MXPERSON_ES, set the Operation to QUERY and for Object Structure select the My_MXPERSON you created earlier. Click Save.
The next step is to create a publish channel by going to Integration > Publish Channels > New Publish Channel. Name it My_MXPERSON_PC and for Object Structure select your My_MXPERSON (if you can't find it in the list, go to your object structure and uncheck the "Query Only" box). Click Save.
Now you have everything set up to create your external system: Integration > External Systems > New External System. Name it My_MXPERSON_EXTSYS and set the End Point to the format you want your response to be in; I use MXXMLFILE. On the left side there are three types of queue you need to set up; I have one option for the first two and two for the last one (select the upper one, which ends with cqin). Check Enabled.
Within your external system, go to Publish Channels, select your My_MXPERSON_PC and enable it.
Within your external system, go to Enterprise Services, select your My_MXPERSON_ES and enable it. Click Save.
The last thing you need to do is create your web service: go to Integration > Web Services > New Web Service from Enterprise Service. Name it My_MXPERSON_Query, select My_MXPERSON_EXTSYS_My_MXPERSON_ES from the list, then select your web service in the list and go to More Actions > Deploy.
Once your web service is deployed, you can access the WSDL file at servername/meaweb/wsdl/webservicename.wsdl.
To test, we will use SoapUI against the WSDL file.
Create a new SOAP project and copy/paste the URL of the WSDL file. If it loads successfully, paste this into the XML request field:
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:max="http://www.ibm.com/maximo">
   <soapenv:Header/>
   <soapenv:Body>
      <max:QueryMy_MXPERSON baseLanguage="EN" transLanguage="EN">
         <max:My_MXPERSONQuery>
            <max:PERSON>
               <max:Firstname> Name you want to query </max:Firstname>
            </max:PERSON>
         </max:My_MXPERSONQuery>
      </max:QueryMy_MXPERSON>
   </soapenv:Body>
</soapenv:Envelope>
Remember to swap "Name you want to query" with the actual name in your table.
Hope this guide helped.
Using Maximo 7.5.0.5, Go To > Integration > External Systems
In External Systems, pick your system that you want to filter records for
Go to the Publish Channels tab
Click on Data Export
In the Export Condition field, enter your where clause to filter your record set
I referenced these steps from IBM Help:
http://publib.boulder.ibm.com/infocenter/tivihelp/v27r1/index.jsp?topic=%2Fcom.ibm.itam.doc%2Fmigrating%2Ft_asset_disposal_export_data.html
Normally, I just reference the link. In my experience though, IBM's web site frequently changes URL structure and occasionally goes offline for "maintenance". For accessibility, I am including the text here. No offense to copyright.
Exporting asset disposal data
To provide information for review or for a company that you hire to dispose of assets, you can use the integration framework applications to export a data file with information about assets that you are planning to dispose of.
Before you begin
Before you attempt to export a file, check that the following tasks are completed:
JMS queues are configured. You can use either continuous queue or sequential queue, depending on your business process.
The external system for asset disposal integration is enabled.
The publish channel is enabled.
About this task
The following procedure explains how to export asset disposal data.
Procedure
1) On the navigation bar, click Go To > Integration > External Systems.
2) On the List tab, select the TAMITEXTSYS external system.
3) On the Publish Channels tab of the External Systems application, select the ITASSETDISPOSAL publish channel and click Data Export.
4) In the Export Condition field in the Data Export window, enter an SQL statement that is appropriate for the Maximo® database that you use. This statement specifies the export condition.
Typically conditions filter by location, by site ID, and by status, as shown in the following example.
location = 'DISPOSAL' and siteid = 'BEDFORD' and status not in ('DECOMMISSIONED','DISPOSED')
The SQL statement must use the database names for attributes as shown in the field help. To view the field help, position the cursor in a field and press Alt+F1. The field help displays the database table and column (attribute) in the following format: ASSET.SITEID, where SITEID is the attribute name.
5) Click OK to export the asset data.
What to do next
The location to which the file is exported depends on the global directory set for the system and on the filedir parameter for the endpoint of the external system. If no global directory is set, look in the root of the application server folder. If no filedir parameter is set for the external system, look in the 'flatfiles' sub-directory. For example,
C:\bea\user_projects\domains\maximo_database\flatfiles\TAMITEXTSYS_ITASSETDISPOSALInterface_1236264695765361846.dat
Another way to locate the file is to search the operating system file structure for TAMITEXTSYS_ITASSETDISPOSALInterface*.dat.

Can Primary-Keys be re-used once deleted?

0x80040237 Cannot insert duplicate key.
I'm trying to write an import routine for MSCRM4.0 through the CrmService.
This has been successful up until this point. Initially I was just letting CRM generate the primary keys of the records, but my client wanted the ability to set the key of our custom entity to predefined values. Potentially this enables us to know what data was created by our installer and what data was created post-install.
I tested to ensure that the GUIDs can be set when calling the CrmService.Update() method, and the results indicated that records were created with our desired values. I ran my import and everything seemed successful. While modifying my validation code for the import files, I deleted the data (through the CRM browser interface) and tried to re-import. Unfortunately it now throws a duplicate key error.
Why is this error being thrown? Does the CRM interface delete the record, or does it still exist but remain hidden from the user's eyes? Is there a way to ensure that a deleted record is permanently deleted and the GUID becomes free? In a live environment these GUIDs would never have existed, but during my development I need these imports to be successful.
By the way, considering I'm having this issue, does this imply that statically setting Guids is not a recommended practice?
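For illustration, setting a predefined GUID through the CRM 4.0 strongly-typed proxy classes looks roughly like this. This is a hedged sketch: the entity name new_widget, its attributes and the crmService instance are placeholders, not the actual import routine.

// Hedged sketch: "crmService" is an already-configured CrmService proxy and
// "new_widget" is a placeholder custom entity class generated from the CRM 4.0 WSDL.
static Guid CreateWidgetWithPredefinedId(CrmService crmService, Guid predefinedId)
{
    new_widget widget = new new_widget();
    widget.new_widgetid = new Key();
    widget.new_widgetid.Value = predefinedId;
    widget.new_name = "Created by installer";

    // Create() honours the supplied key, provided no record (including a
    // soft-deleted one, as discussed in the answer below) already occupies that GUID.
    return crmService.Create(widget);
}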
As far as I can tell, entities are soft-deleted, so it would not be possible to reuse that GUID unless you (or the deletion service) delete the entity out of the database.
For example, in the LeadBase table you will find a field called DeletionStateCode; a value of 0 means the record has not been deleted.
A value of 2 marks the record for deletion. There's a deletion service that runs every 2(?) hours to physically delete those records from the table.
I think Zahir is right, try running the deletion service and try again. There's some info here: http://blogs.msdn.com/crm/archive/2006/10/24/purging-old-instances-of-workflow-in-microsoft-crm.aspx
Zahir is correct.
After you import and delete the records, you can kick off the deletion service at a time you choose with this tool. That will make it easier to test imports and reimports.