Tuesday, May 29, 2018

Create a custom Grid in MS Dynamics CRM using jQuery DataTables

Sometimes we need to show our data in a table format. We can use jQuery DataTables to show the data in a sub-grid (table) format.

Copy the code below and paste it into your editor, then change it as per your need. Here I will create a DataTable for the Case entity.


<html>
<head>
    <title>MS Dynamic CRM</title>
  
    <script src="ClientGlobalContext.js.aspx" type="text/javascript"></script> 
 <link rel="stylesheet" href="https://cdn.datatables.net/1.10.16/css/jquery.dataTables.min.css">
 <link rel="stylesheet" href="https://cdn.datatables.net/select/1.2.5/css/select.dataTables.min.css">
 <link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
 <script src="https://code.jquery.com/jquery-1.12.4.js"> </script>
 <script src="https://cdn.datatables.net/1.10.16/js/jquery.dataTables.min.js"> </script>
   
<script>

  var dataSet;
  var arrData = [];

$(document).ready(function() { 
 // Get the data in JSON format. Change the URL as per your need.
  var entityName = "incident";   // This is the entity (logical) name of Case.
  var url = window.parent.Xrm.Page.context.getClientUrl() + "/api/data/v8.2/" + entityName + "s?$select=title,ticketnumber,prioritycode";
  var myData = []; 
  var req = new XMLHttpRequest();
  req.open("GET",url, false);
  req.setRequestHeader("OData-MaxVersion", "4.0");
  req.setRequestHeader("OData-Version", "4.0");
  req.setRequestHeader("Accept", "application/json");
  req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
  req.setRequestHeader("Prefer", "odata.include-annotations=\"*\"");
  req.onreadystatechange = function() {
   if (this.readyState === 4) {
    req.onreadystatechange = null;
    if (this.status === 200) {    
       myData = JSON.parse(this.response); 
       dataSet=myData.value;     
    } else {
     Xrm.Utility.alertDialog(this.statusText);
    }
   }
  };
  req.send();
     
   // Convert the JSON data into a 2-D array
    var arrItems = [];
  $.each(dataSet, function (index, value) {  
   arrItems.push(value.title);
   arrItems.push(value.ticketnumber);
     // arrItems.push(value.prioritycode);   // would push the raw OptionSet value
   arrItems.push(value["prioritycode@OData.Community.Display.V1.FormattedValue"]);  // formatted (label) value of the OptionSet
   arrData.push(arrItems);   // Push The Values Inside the Array to Create 2-D Array
   arrItems = [];          
  });
        
  table(); // Call a table function to create table.  
});

function table() { 
 $('#customdatatable').DataTable( {
        data: arrData,
        columns: [
            { title: "Title" },  // Change the column name as per your need.
   { title: "Ticket Number" },
   { title: "Priority" }          
        ]
    } );
}
   
</script>
</head>

<body style="word-wrap: break-word;">
   
 <table id="customdatatable" class="display" width="100%"></table>

</body>

</html>

 Create a new HTML web resource, upload this code into it, and check the table.









Asynchronous Processes/Workflows Stuck in InProgress/Waiting Status in MS Dynamics CRM

Every CRM developer/user at some point faces the issue of asynchronous processes getting stuck in the same status, i.e. InProgress/Waiting/Pausing/Canceling. You can see that the system job status is not changing.

The Reason-

The main reasons behind this issue are:
  1. Many jobs are in the Waiting status.
  2. The AsyncOperationBase table has become full because many succeeded/canceled jobs occupy space.
  3. The asynchronous processing settings are not optimal.
  4. The asynchronous workflows are not configured properly.

Solution-

The solutions for this issue are:
  1. The very first step is to restart the Microsoft Dynamics CRM Asynchronous Processing Service. This alone resolves the issue in many cases.
  2. Many jobs are in the Waiting status ->
    You can update the job status to Canceled or Completed either by creating a console application or directly from the database:
    a. Create a console application that retrieves the waiting system jobs and cancels them through the SDK.
    b. Using a database query –
    Note: Create a restore point (checkpoint) first before working on the database directly, so that you can roll back the changes if needed.
    The query updates the waiting jobs ("where StatusCode = 10 -- Waiting") to a canceled state; a sketch is shown below.
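A minimal sketch of such an update, assuming an on-premises deployment where you can run SQL directly against the organization database (the table and column names come from the standard AsyncOperationBase schema):

-- Cancel every waiting workflow job by marking it Completed/Canceled.
-- StateCode: 1 = Suspended (Waiting), 3 = Completed.
-- StatusCode: 10 = Waiting, 32 = Canceled.
UPDATE AsyncOperationBase
SET StateCode = 3,
    StatusCode = 32
WHERE StateCode = 1
  AND StatusCode = 10      -- Waiting
  AND OperationType = 10;  -- 10 = Workflow; drop this filter to cancel all waiting job types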
  3. The AsyncOperationBase table has become full because many succeeded/canceled jobs occupy space –
    You need to clean up the database by deleting the succeeded and canceled jobs.
    Make sure that only the following async operation types are deleted, and only if the state code of the rows is 3 and the status code is 30 or 32:
    • Workflow Expansion Task (1)
    • Collect SQM data (9)
    • PersistMatchCode (12)
    • FullTextCatalogIndex (25)
    • UpdateContractStates (27)
    • Workflow (10)
    A sketch of a batched cleanup script is shown below.
    If the script takes a very long time, stop it, rebuild the indexes on the AsyncOperationBase and PrincipalObjectAccess tables, and then run the script again.
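A minimal sketch of a batched cleanup built from the operation types and state/status codes listed above; this is an illustrative script, so test it on a backup or during a maintenance window:

SET NOCOUNT ON;

DECLARE @BatchSize INT = 2000;
DECLARE @Deleted TABLE (AsyncOperationId UNIQUEIDENTIFIER NOT NULL PRIMARY KEY);
DECLARE @Continue INT = 1;

WHILE (@Continue = 1)
BEGIN
    BEGIN TRAN;

    -- Collect one batch of completed (3) jobs that succeeded (30) or were canceled (32)
    -- and that belong to the allowed operation types.
    INSERT INTO @Deleted (AsyncOperationId)
    SELECT TOP (@BatchSize) AsyncOperationId
    FROM AsyncOperationBase
    WHERE OperationType IN (1, 9, 12, 25, 27, 10)
      AND StateCode = 3
      AND StatusCode IN (30, 32);

    IF (@@ROWCOUNT = 0) SET @Continue = 0;

    -- Delete child rows first, then the async operations themselves.
    DELETE W
    FROM WorkflowLogBase AS W
    INNER JOIN @Deleted AS D ON W.AsyncOperationId = D.AsyncOperationId;

    DELETE B
    FROM BulkDeleteFailureBase AS B
    INNER JOIN @Deleted AS D ON B.AsyncOperationId = D.AsyncOperationId;

    DELETE A
    FROM AsyncOperationBase AS A
    INNER JOIN @Deleted AS D ON A.AsyncOperationId = D.AsyncOperationId;

    DELETE FROM @Deleted;

    COMMIT;
END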
  4. Check whether the following asynchronous service settings have optimal values:
    • AsyncItemsInMemoryHigh
    • AsyncItemsInMemoryLow
    • AsyncStateStatusUpdateInterval
    • AsyncMaximumThreadsPerCPU
    • AsyncSelectInterval
    • AsyncSelectParallelism
    • AsyncThrottlingConfiguration
    You can also check the ‘AsyncSdkRootDomain’ setting in [MSCRM_CONFIG].[dbo].[DeploymentProperties]; a sketch of the query is shown after this step.
    The recommended value of ‘AsyncSdkRootDomain’ is the same as ‘ADSdkRootDomain’; alternatively, you can use the CRM server name as the value.
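A minimal sketch of how to inspect (and, if needed, correct) that setting; it assumes the DeploymentProperties table stores these values in the NVarCharColumn column, so take a backup of MSCRM_CONFIG before updating anything:

-- Compare the async SDK root domain with the AD SDK root domain.
SELECT ColumnName, NVarCharColumn
FROM [MSCRM_CONFIG].[dbo].[DeploymentProperties]
WHERE ColumnName IN ('AsyncSdkRootDomain', 'ADSdkRootDomain');

-- If AsyncSdkRootDomain is blank or wrong, copy the ADSdkRootDomain value into it
-- (or set it to your CRM server name), then restart the Asynchronous Processing Service.
UPDATE [MSCRM_CONFIG].[dbo].[DeploymentProperties]
SET NVarCharColumn = (SELECT NVarCharColumn
                      FROM [MSCRM_CONFIG].[dbo].[DeploymentProperties]
                      WHERE ColumnName = 'ADSdkRootDomain')
WHERE ColumnName = 'AsyncSdkRootDomain';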
  5. Finally, review your asynchronous workflow logic.
    In many cases the workflows are stuck because of their own internal logic.
You have to restart the Microsoft Dynamics CRM Asynchronous Processing Service after executing any of the above steps.

Thursday, May 3, 2018

Bad Characters Messing Up Your Migration to Microsoft Dynamics CRM


Our migration process typically consists of moving the source data into a staging SQL Server database prior to the actual migration to CRM. Among other reasons, this gives us a place to do data cleansing prior to the CRM migration.
We run into many common issues, such as field-length differences and data-type mismatches, that are often found during the data-mapping process with the customer. One less common issue we encounter when testing a migration is that some characters in the source data are not supported by CRM when importing data via the API. Certain non-printable characters, such as carriage return and line feed, are supported; however, others like the record separator [char 30] or vertical tab [char 11] are often not accepted when migrating data to CRM.
We've developed a common SQL framework that lets us do some data analysis and clean-up of these invalid characters in our staging tables prior to pushing the data to CRM. In most cases we run the data migration without any cleansing and capture any failed rows into an error table, where we keep the source system record id and the CRM API error message. From there we can determine whether any entities had errors around invalid characters. Here is an example of what that error looks like; in our example we are using the KingswaySoft CRM Adapter for SQL Server Integration Services.
(Screenshot: the invalid-character error reported through the KingswaySoft adapter.)
Once we know what entities and fields have invalid characters we can start to build our cleanup routine from our base framework.

So which characters are going to cause us a problem?
In our research we found that the Microsoft Dynamics CRM API does not like ASCII characters below character number 32 (which is the space " " character). So we start with a list of 1 – 31 to represent potential bad characters. We also know that horizontal tab, carriage return, and line feed (CHAR(9), CHAR(13), and CHAR(10) respectively) are valid in CRM and should not be in this list of bad characters.
For the sake of example, here is a sample script to spin up 10 ‘note’ records with potentially bad data in the NoteText field.
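A minimal sketch of such a script, using an illustrative staging table called dbo.StageNotes (the table and column names are assumptions for this example):

-- Create a small staging table and fill it with 10 note rows whose NoteText
-- contains a control character (record separator, CHAR(30)) that CRM rejects.
IF OBJECT_ID('dbo.StageNotes', 'U') IS NOT NULL
    DROP TABLE dbo.StageNotes;

CREATE TABLE dbo.StageNotes
(
    SourceRecordId INT IDENTITY(1, 1) PRIMARY KEY,
    NoteText       NVARCHAR(MAX) NOT NULL
);

INSERT INTO dbo.StageNotes (NoteText)
SELECT TOP (10)
    'Note ' + CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS VARCHAR(10))
    + ' has a bad character here ->' + CHAR(30) + '<- in the middle of the text.'
FROM sys.all_objects;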
To create the list of bad characters, we used a Common Table Expression (CTE). The script below gives a numbered list containing the ASCII character values of known bad characters, called ‘BadCharacters’.
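A sketch of what that CTE can look like: it numbers all 256 possible ASCII values and keeps only the control characters below 32, excluding tab, line feed, and carriage return.

;WITH AllAscii AS
(
    -- 256 rows, one per possible ASCII value (0 - 255).
    SELECT TOP (256) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS CharValue
    FROM sys.all_objects
),
BadCharacters AS
(
    -- Control characters below the space character (32), excluding
    -- CHAR(9), CHAR(10) and CHAR(13), which CRM accepts.
    SELECT CharValue
    FROM AllAscii
    WHERE CharValue BETWEEN 1 AND 31
      AND CharValue NOT IN (9, 10, 13)
)
SELECT CharValue
FROM BadCharacters;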
From there it's a matter of writing a query that joins the known-bad-characters CTE with your stage table and inspects each character of the field that was reported as having bad characters in your data-migration error logging. Here the SQL CROSS APPLY clause comes in handy to make this a simple process. In this example we are migrating notes into the CRM notes entity. I know from my error logging shown above that the notetext field in my stage table has some bad characters that CRM did not like, so I cross apply my BadCharacters list with my notes staging table and have it inspect the notetext field for bad characters (using the above CTE definition).
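A sketch of that analysis query against the illustrative dbo.StageNotes table (the CTE is repeated here because a CTE is only visible to the single statement that follows it):

;WITH AllAscii AS
(
    SELECT TOP (256) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS CharValue
    FROM sys.all_objects
),
BadCharacters AS
(
    SELECT CharValue
    FROM AllAscii
    WHERE CharValue BETWEEN 1 AND 31
      AND CharValue NOT IN (9, 10, 13)
)
SELECT  n.SourceRecordId,
        n.NoteText,
        b.CharValue                              AS BadCharValue,
        CHARINDEX(CHAR(b.CharValue), n.NoteText) AS BadCharPosition
FROM    dbo.StageNotes AS n
CROSS APPLY (SELECT CharValue FROM BadCharacters) AS b   -- pair every row with every bad character
WHERE   CHARINDEX(CHAR(b.CharValue), n.NoteText) > 0;    -- keep only rows that actually contain one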
Here are the results of the above query on my data set. I can see exactly which records are affected, the raw value currently in that field, which bad character was reported, and where in the string it occurs.
(Screenshot: query results listing the record, the raw field value, the bad character found, and its position in the string.)
After I do my analysis and confirm that it’s acceptable to replace these characters, an update script is run against my stage table. Here is my final script, which I can include in my data-migration process to swap out any bad characters with a blank string in my stage table prior to sending these records to CRM.
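A sketch of such a cleanup pass; a simple character-by-character loop is used here (rather than a single set-based statement) so that rows containing several different bad characters are fully cleaned:

-- Replace every control character from 1-31 (except tab, LF and CR)
-- with an empty string in the staged note text.
DECLARE @c INT = 1;

WHILE @c <= 31
BEGIN
    IF @c NOT IN (9, 10, 13)
    BEGIN
        UPDATE dbo.StageNotes
        SET NoteText = REPLACE(NoteText, CHAR(@c), '')
        WHERE CHARINDEX(CHAR(@c), NoteText) > 0;
    END

    SET @c = @c + 1;
END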
It should be noted that the CTE spins up 256 possible rows to cover every possible ASCII character. In this case we know that we only want to do the cross apply on a subset of these potential values, but the BadCharacters CTE could be amended to include or exclude any ASCII characters.

Hope it helps you!
