Skip field validation when creating a record using the Dynamics 365 Web API (REST)

I am integrating Dynamics 365 with our product, and I am running into an issue when creating records on Dynamics 365 using the Web API.
I am creating a contact or lead using a set of fields such as email, name, etc. Some fields have validation, such as number-only fields or picklists. In these cases I get an error with a 400 status and the record is not created. Is there a way to create the record using the valid fields and simply skip the fields that fail validation?
This is the endpoint I am using for creating a contact:
/api/data/v9.1/contacts
Headers used:
Prefer:return=representation
Content-Type:application/json
OData-MaxVersion:4.0
OData-Version:4.0
If-None-Match:null
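For example, a request body like the following (standard contact attributes; the values are hypothetical) fails as a whole because of the one bad field:
{"firstname":"Matt","emailaddress1":"matt@example.com","numberofchildren":"two"}
Here numberofchildren is a whole-number field, so the non-numeric value causes the entire create to be rejected with a 400.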

You have two options:
Either perform the proper validation when the user enters values for those fields outside Dynamics,
or create custom fields on the Dynamics contact entity to store the values as plain strings.
The Web API payload has to be valid as a whole and cannot be truncated or partially applied for ad hoc scenarios like skipping failing data types (your requirement).
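As a sketch of the first option (Node-style JavaScript; the input field names are hypothetical, numberofchildren is a standard whole-number attribute on contact), validate before the POST so the payload never carries a failing value:

// Validate outside Dynamics before calling the Web API, so the POST never
// carries a value that will fail server-side validation.
function buildContactPayload(input) {
  const payload = { firstname: input.firstName, emailaddress1: input.email };
  // Only include the whole-number field if the raw input parses as an integer.
  if (/^\d+$/.test(input.numberOfChildren || '')) {
    payload.numberofchildren = Number(input.numberOfChildren);
  }
  return payload;
}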

Hope this helps. No matter what software or environment you are integrating with, it is important to know your data structure, including data types, data validations (including required yes/no) and data constraints (length, decimal places, etc.).
There are three ways you can integrate with Dynamics 365, factoring in the above.
1. Static code based on the Dynamics 365 configuration
If you can log in to Dynamics 365, you can view the environment definition by going to Settings > Customizations > Customize the System.
From here you can view all Entity and Attribute definitions and write your code accordingly. You can also "require" installation of your own solution with Entities and Attributes, giving you control over what you're integrating with.
2. Creating Early-Bound classes
You can generate Early-Bound classes containing the Entity and Attribute definitions from Dynamics 365 with the CrmSvcUtil tool.
For more information:
https://learn.microsoft.com/en-us/dynamics365/customer-engagement/developer/org-service/create-early-bound-entity-classes-code-generation-tool
3. Dynamics 365 Metadata service
Dynamics 365 provides a Metadata service, enabling you to retrieve the exact definition of all Entities and Attributes directly from the given Dynamics 365 environment. As such, you can retrieve the definition prior to integrating.
Considering performance, this is definitely not something you would want to do for every single message. To resolve this, you could retrieve the definition on request (manual trigger) or daily.
For more information: https://learn.microsoft.com/en-us/dynamics365/customer-engagement/customize/create-edit-metadata
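As a minimal sketch (the org URL is hypothetical and a valid bearer token is assumed), the same definitions are exposed through the Web API's EntityDefinitions endpoint, so you can fetch the contact attribute list once (daily, or on a manual trigger) and filter payloads against it:

// Retrieve the attribute definitions for contact from the Metadata service.
const ORG = 'https://yourorg.crm.dynamics.com'; // hypothetical org URL

async function getContactAttributes(token) {
  const res = await fetch(
    ORG + "/api/data/v9.1/EntityDefinitions(LogicalName='contact')/Attributes" +
      '?$select=LogicalName,AttributeType',
    { headers: { Authorization: 'Bearer ' + token, Accept: 'application/json' } }
  );
  return (await res.json()).value; // one entry per attribute, with its type
}

// Drop any payload fields the entity does not actually have.
function pickKnownFields(payload, attributes) {
  const known = new Set(attributes.map(a => a.LogicalName));
  return Object.fromEntries(
    Object.entries(payload).filter(([name]) => known.has(name))
  );
}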
When working with Dynamics 365, the XrmToolBox is a must-have tool for any developer or consultant. With its Metadata Browser plugin you can view the data that the Metadata service returns.

How to query a Domain in Jaspersoft with dynamic parameters

I am new to Jaspersoft reporting. I am currently designing and developing reports with the following requirements.
I want to create template-based reports where all dynamic parameters are passed into the SQL query.
While going through the Jaspersoft documentation I found that you can create join views and cache data by creating Domains, which reduces hits at the DB level.
While creating a report I found that I can't execute SQL scripts against Domain objects.
Please advise whether I am on the right track or not.
Basically I want to query the cached data (such as a Domain view) instead of hitting the DB directly.
Please suggest a workaround if one is available for this problem.
Please note, although JasperReports Server manages a cache for Ad Hoc Views and Ad Hoc Reports running on Domains, running a JRXML report (e.g. designed in Jaspersoft Studio) on a Domain does not guarantee hitting that cache.
You also have the option of using a layer that provides caching between JasperReports Server and your database. For example, support for TIBCO Data Virtualization (not a free product) was recently added in v7; see https://www.jaspersoft.com/introducing-jaspersoft-7.
In any case, Domains are not relational databases and therefore do not support straight SQL.
You can use the "Domain query language" though, which offers a subset of the features of SQL. The easiest way to write a query is using Jaspersoft Studio and selecting "domain" in the Language dropdown in the top-left corner of the Dataset and Query Dialog (as of Studio 6.4.0).
For example, a query designed this way against the Supermart Domain (provided with the sample data) will generate the following query and the required "dynamic" parameter as you requested – in this case a Collection, since the filter is 'Is One Of', which can take multiple values:
<query>
<queryFields>
<queryField id="sales_fact_ALL.sales__product.sales__product__product_name"/>
<queryField id="sales_fact_ALL.sales_fact_ALL__store_sales_2013"/>
</queryFields>
<queryFilterString>sales_fact_ALL.sales__store.sales__store__region.sales__store__region__sales_country in sales__store__region__sales_country_0</queryFilterString>
</query>
See here for another example of a query (current version of docs based on 7.1.0 release), in this case for use with the REST API: https://community.jaspersoft.com/documentation/tibco-jasperreports-server-rest-api-reference/v710/queryexecutor-service
The queryFilterString tag follows the DomEL syntax as documented here (also for 7.1.0): https://community.jaspersoft.com/documentation/tibco-jasperreports-server-user-guide/v71/domel-syntax

How to secure an entity in Sails?

I'm developing an API with Sails, and now I need to secure some attributes of an entity. Those attributes should be accessible only to an admin or the owning user.
I have a structure like this:
Employee (contains your employee records)
fullName
hourlyWage
phoneNumber
accountBank
Location (contains a record for each location you operate)
streetAddress
city
state
zipcode
...
I need to encrypt phoneNumber and accountBank so that nobody can see the values of these fields in the database, only the owner or an admin.
How can I do that? Thanks
You are looking for a way to encrypt data so that people without the required access rights cannot see it.
The solution is not Sails.js specific; Node actually comes with tools to encrypt data: https://nodejs.org/api/crypto.html.
The key rule here is to always keep your secret key safe.
As for integration in your Sails.js application, I would use lifecycle callbacks in your Models. The official documentation provides a good example here: http://sailsjs.org/documentation/concepts/models-and-orm/lifecycle-callbacks
Basically you define a function that is called each time a record is about to be created, fetched or updated, and apply your encrypt/decrypt functions there.
This will encrypt/decrypt your phone numbers and bank account numbers automatically.
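A minimal sketch of that idea, assuming Sails-style lifecycle callbacks, AES-256-CBC from Node's crypto module, and a hypothetical EMPLOYEE_ENC_KEY environment variable holding a 32-byte key in hex:

// api/models/Employee.js
const crypto = require('crypto');
const KEY = Buffer.from(process.env.EMPLOYEE_ENC_KEY, 'hex'); // 32-byte key

function encrypt(plain) {
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv('aes-256-cbc', KEY, iv);
  const enc = Buffer.concat([cipher.update(String(plain), 'utf8'), cipher.final()]);
  return iv.toString('hex') + ':' + enc.toString('hex'); // keep the IV with the ciphertext
}

function encryptSensitive(values) {
  if (values.phoneNumber) values.phoneNumber = encrypt(values.phoneNumber);
  if (values.accountBank) values.accountBank = encrypt(values.accountBank);
}

module.exports = {
  attributes: {
    fullName: { type: 'string' },
    hourlyWage: { type: 'float' },
    phoneNumber: { type: 'string' },
    accountBank: { type: 'string' }
  },
  // Encrypt sensitive fields before they ever reach the database.
  beforeCreate: function (values, cb) { encryptSensitive(values); cb(); },
  beforeUpdate: function (values, cb) { encryptSensitive(values); cb(); }
};

Decryption (the matching createDecipheriv call) belongs in the read path, and only after the access check has passed.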
Regarding access control, you can use Sails' policies along with authentication to determine whether the client has the right to access the resource. If not, you can always remove attributes from the response sent back to the client.
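For example, a policy along these lines could gate the sensitive fields (a sketch; it assumes your authentication layer populates req.session.user with an employeeId and an isAdmin flag):

// api/policies/canSeeSensitive.js
module.exports = function (req, res, next) {
  const user = req.session.user;
  // Allow admins, or the employee whose record is being requested.
  if (user && (user.isAdmin || String(user.employeeId) === String(req.param('id')))) {
    return next();
  }
  return res.forbidden();
};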

How to import users in CRM 2011 with source GUID

We have three Organization tenants: Dev, Test and Live, all hosted on premise (CRM 2011 [5.0.9690.4376] [DB 5.0.9690.4376]).
Because of the way dialogs use GUIDs to reference records in Lookups, we aim to keep the GUIDs of static records the same across all three tenants.
While all other entities are working fine, I am failing to import USERS while maintaining their GUIDs. I am using Export/Import to move the data from the master tenant (Dev) into the Test and Live tenants. It is very similar to what the Configuration Migration tool does in CRM 2013.
The issue I am facing is that for all other entities I can see the GUID field and map it during the import wizard, but no such field shows up for the SystemUser entity. For example, with Account, I export an Account, amend the CSV file and import it into the target tenant. When I do this, I map AccountId (in the target) to the Account of the source, and as a result this account's AccountId will be the same in both source and target.
At this point I am about to give up, but that means all dialogs that use a User lookup will fail.
Thank you for your help,
Try the following steps. I would strongly recommend trying this on an old, out-of-use tenant before trying it on a live system. I am not sure if this is supported by MS, but it works for me. (Another thing: you will have to manually assign BUs and Roles after the import.)
Create an Advanced Find. Include all required fields for the SystemUser record. Add criteria that select the list of users you would like to move across.
Export.
Save the file as CSV (this will reveal the first few hidden columns in Excel).
Rename the primary key field (in this case User) and remove all other fields marked Do Not Modify.
Import the file and map this User column (with the GUID) to the User field in CRM.
Import the file and check the GUIDs in both tenants.
Good luck.
My only suggestion is that you could try writing a small console application that connects to both your source and destination organisations.
Using that, you can duplicate the user records from the source to the destination, preserving the IDs in the process.
I can't say 100% it'll work, but I can't immediately think of a reason why it wouldn't. This assumes the users you're copying over don't already exist in your target environments.
I prefer to resolve these issues by creating custom workflow activities. For example, you could create a custom workflow activity that takes a domain name as a string input and returns the matching user record.
This means your dialogs contain only shared configuration values, e.g. mydomain\james.wood, which are used to dynamically find the record you need. Your dialog is then linked to a specific record, but without having to encode the source GUID.

Need pointers on how report generation can happen in CQ5

We have created a set of forms in CQ5, and we have a requirement that the content of these forms should be stored at a specific node. Our forms interact with third-party services and get some data from there as well; this is also stored on the same nodes.
Now we have to give authors permission to download these reports based on ACLs. I also have to provide them with start and end dates, and upon selecting these dates the content placed in these nodes should be exportable in CSV format.
Can anybody guide me on how to achieve this functionality? I have gone through report generation but need better clarity on how this can be achieved: how will I be able to use the QueryBuilder API, how can I export, and how do I provide the dates in the UI?
This was achieved as described below.
I had to override the default report generation mechanism, and I created my own custom report using the report generation tutorial in the CQ documentation.
Once the report templates and components were written, I also overrode the CQ report page component and provided input dates in body.jsp using the Granite date component.
Once users selected dates, I used the QueryBuilder API to search for nodes under a path (specified by the author; it can differ per form's data). I had also set an artificial resource type on the nodes where the data was stored, which led me to the exact nodes holding the data; this property was passed to QueryBuilder as well. The JSON returned in the QueryBuilder response was then supplied to a JS function which converted the data to CSV format.
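A sketch of the kind of QueryBuilder request this describes (the path, resource type and toCsv helper are hypothetical; path, property and daterange are standard QueryBuilder predicates):

// Client-side call to the QueryBuilder servlet with the author-selected dates.
const params = new URLSearchParams({
  'path': '/content/usergenerated/forms',            // hypothetical storage path
  'property': 'sling:resourceType',
  'property.value': 'myproject/components/formdata', // the artificial resource type
  'daterange.property': 'jcr:created',
  'daterange.lowerBound': '2013-01-01',              // start date from the UI
  'daterange.upperBound': '2013-01-31',              // end date from the UI
  'p.limit': '-1',                                   // return all hits
  'p.hits': 'full'                                   // include node properties in the JSON
});

fetch('/bin/querybuilder.json?' + params)
  .then(res => res.json())
  .then(json => toCsv(json.hits)); // toCsv: your JSON-to-CSV conversion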

Stubbing data in REST APIs for large system/integration tests

The Problem
Say I've got a cool REST resource /account.
I can create new accounts
POST /account
{accountName:"matt"}
which might produce some json response like:
{account:"/account/matt", accountName:"matt", created:"November 5, 2013"}
and I can look up accounts created within a date range by calling:
GET /account?created-range-start="June 01, 2013"&created-range-end="December 25, 2013"
which might also produce something like:
{accounts: [{account:"/account/matt", accountName:"matt", created:"November 5, 2013"}, {...}, ...]}
Now, let's say I want to set up some sample data and write some tests against the GET /account resource within some specified creation date range.
For example I want to somehow insert the following accounts into the system
name=account1, created=January 1, 2010
name=account2, created=January 2, 2010
name=account3, created=December 29, 2010
name=account4, created=December 30, 2010
then call
GET /account?created-range-start="January 2, 2010"&created-range-end="December 29, 2010"
and verify that only accounts 2 and 3 are returned.
How should I insert these sample accounts to write my tests?
Possible Solutions
1) I could use inversion of control and allow the user to specify the creation date for new accounts.
POST /account
{account:"matt", created="June 01, 2013"}
However, even if the created field were optional, I don't like this approach because I may not want to allow my users to set the creation date of their account. I certainly need to be able to do it for testing, but having that functionality as part of the public API seems wrong to me. Maybe I want to give a $5 credit to anyone who joined prior to some particular day; if users can specify their creation date, they can game the system. Not good.
2) I could add one or more testing configuration resources
PUT /account/creationDateTimestampProvider
{provider="DefaultProvider"}
or
PUT /account/creationDateTimestampProvider
{provider="FixedDateProvider", date="June 01, 2013"}
This approach affords me the ability to lock down these resources with security constraints so that only my test context can call them, but it also necessarily has side effects on the system that may become a pain to manage, especially if I have a bunch of backdoor configuration resources.
3) I could interact directly with the database circumventing the REST api altogether to set my sample data.
INSERT INTO ACCOUNTS ...
GET /account?...
However, this can put the system into states that the REST API would never allow, and as the DB model evolves, maintaining these SQL scripts might also become a pain.
So... how do I test my GET /account resource? Is there another way I'm not thinking of that is more elegant?
There are a lot of ways to do this, and you've come up with some solid (though maybe not perfect for your situation) solutions.
In the setup for the test, I would spin up an in-memory database like HSQLDB (there are others) and do the inserts. The test configuration will inject the appropriate database configuration into your service provider class. Run the tests, and then shut the database down on teardown.
This post provides a good example at least for the persistence side of things.
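HSQLDB is JVM-based; as a JavaScript analogue of the same pattern, here is a sketch using better-sqlite3's in-memory mode (the table and column names are hypothetical):

// Spin up an in-memory database and seed it in the test setup.
const Database = require('better-sqlite3');
const db = new Database(':memory:');

db.exec('CREATE TABLE accounts (name TEXT, created TEXT)');
const insert = db.prepare('INSERT INTO accounts (name, created) VALUES (?, ?)');
insert.run('account1', '2010-01-01');
insert.run('account2', '2010-01-02');
insert.run('account3', '2010-12-29');
insert.run('account4', '2010-12-30');

// Point the service under test at this database, run the GET /account tests,
// then tear everything down.
db.close();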
Incidentally, do not change the API of your service just to help facilitate a test. Maybe I misunderstood and you aren't anyway, but I thought I would mention it just in case.
Hope that helps.
For what it's worth, these days I'm primarily using the second approach for most of my system-level (black box) tests.
I create backdoor admin/test APIs that have security requirements which only my system tests can satisfy. These superpowered APIs allow me to seed data. I try to limit the scope of these APIs as much as possible so they are not overly coupled to the specific implementation details, but are flexible enough to allow specifying whatever is needed for the desired seed data.
The reason I prefer this approach to the database solution Vidya provided is that my tests aren't coupled to the specific data storage technology. If I decide to switch from Mongo to Dynamo or something like that, using an admin API frees me from having to update all of my tests; instead I only need to update the admin API and its implementation.
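A sketch of that approach (the /admin/account endpoint, base URL and TEST_ADMIN_TOKEN are hypothetical, and global fetch assumes Node 18+; the point is that the created override exists only behind the locked-down admin API, never on the public POST /account):

const assert = require('assert');
const BASE = 'https://api.example.test'; // hypothetical base URL

// Seed an account through the admin API, which accepts a `created` override
// that the public POST /account deliberately does not.
async function seedAccount(accountName, created) {
  const res = await fetch(BASE + '/admin/account', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer ' + process.env.TEST_ADMIN_TOKEN // test-only credential
    },
    body: JSON.stringify({ accountName, created })
  });
  assert.strictEqual(res.status, 201);
}

(async () => {
  await seedAccount('account1', 'January 1, 2010');
  await seedAccount('account2', 'January 2, 2010');
  await seedAccount('account3', 'December 29, 2010');
  await seedAccount('account4', 'December 30, 2010');

  // Exercise the public resource under test.
  const qs = new URLSearchParams({
    'created-range-start': 'January 2, 2010',
    'created-range-end': 'December 29, 2010'
  });
  const names = (await (await fetch(BASE + '/account?' + qs)).json())
    .accounts.map(a => a.accountName).sort();
  assert.deepStrictEqual(names, ['account2', 'account3']);
})();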