Inok Systems

RemedyForce CMDB - REST Integration using Pentaho Client (ETL job)


The topic of this blog is Remedyforce integration with an external application using the REST API (with Pentaho as the client). The REST Client transformation step in Pentaho enables you to consume RESTful services. The same logic applies if you are doing ETL work against CouchDB or other big-data solutions.

The REST Client step is fairly simple: initiate it with a Generate Rows step and set your method in the HTTP Method drop-down. My difficulty came in understanding how to define the path in the JSON Input step; once I figured out the path, everything was much simpler.

You can create this as a job and create a batch file that calls the Kettle command-line batch file to run it.




Prerequisites:

  •  A RemedyForce account with Admin access.

Generate Rows Step (SetRESTURL)

Even though the REST Client step has a URL field, it doesn't seem to work, so you will need the Generate Rows step to pass in the URL. As you can see here, I’ve defined the URL and X-Authentication fields; these are the values required by the REST Client. Depending on the REST web service, the fields you need to define may differ.
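As a sketch of what gets passed downstream to the REST Client step, the URL and authentication fields could be composed like this in a script step. Every value below is a hypothetical placeholder — your instance URL, resource path, and token will differ:

```javascript
// Sketch only: composing the URL and X-Authentication fields that the
// REST Client step will consume. All values are hypothetical placeholders.
var host = "https://na1.salesforce.com";            // assumption: your instance URL
var servicePath = "/services/apexrest/CMDBService"; // hypothetical REST resource
var url = host + servicePath + "?limit=" + encodeURIComponent("200");

// Session token obtained from a prior login step (placeholder value)
var xAuthentication = "SESSION_TOKEN_PLACEHOLDER";

console.log(url);
```

In the actual transformation you would define these as fields in the Generate Rows step instead, but the concatenation logic is the same.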

REST Client Step (REST Query)

Accept the URL from the field defined in the previous step (‘URL’), then define your web service operation method and set the application type accordingly. The output of your query will be placed in the ‘result’ field. I also defined the headers, since they are required by my web service.

Note: If you are using a self-signed certificate, you might need to import the certificate into the Java keystore.

Json Input Step (JSON Input)

Check the ‘Source is defined in a field?’ option and set ‘Get source from field’ to the ‘result’ field. We then need to define the JSON path expressions; refer to the links below for more details:

JSONPath expressions: http://goessner.net/articles/JsonPath/index.html#e2

Test JSONPath expressions online: http://jsonpath.com/
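To illustrate what a path expression selects, here is a sketch in plain JavaScript. The payload shape below is invented for the example — the real Remedyforce response will differ:

```javascript
// Sketch: an illustrative JSON payload (not the actual Remedyforce
// response) and the JSONPath you would enter in the JSON Input step.
var result = JSON.stringify({
  records: [
    { Name: "host-01", SerialNumber: "SN123" },
    { Name: "host-02", SerialNumber: "SN456" }
  ]
});

// In the JSON Input step you would map fields like:
//   Name         -> $.records[*].Name
//   SerialNumber -> $.records[*].SerialNumber

// Equivalent extraction in plain JavaScript, to show what the path selects:
var names = JSON.parse(result).records.map(function (r) { return r.Name; });
console.log(names); // ["host-01", "host-02"]
```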

Modified Java Script Value Step (Transform JSON)

If you want to perform operations such as setting default values or concatenating fields, you can do so in this step. I’m defining the required success and failure file names here; these will be used in later steps for error/success logging.

// Timestamp used to make the log file names unique
var parsedDate = new java.text.SimpleDateFormat("yyyyMMddHHmmss").format(new java.util.Date());

// Success/failure log files, written to the job's directory
var successFileName = getVariable("Internal.Job.Filename.Directory", "") + "/" + "COMP_SUCCESS_ROWS_" + parsedDate;
var failureFileName = getVariable("Internal.Job.Filename.Directory", "") + "/" + "COMP_FAILURE_ROWS_" + parsedDate;

Select Values Step (Remove JSON values)

In this step we remove the fields that are no longer required downstream (e.g. result, URL).

Sort rows step (Sort rows)

Sort your data stream based on the unique key.

Salesforce Input Step (Salesforce Input [CMDB_Class])

Define the Salesforce web service URL. Using a SOQL query, we retrieve the Computer System class ID under which the records will be inserted; the returned fields are the record ID and the class name. Ensure that your login ID has the required permissions.

select id, BMCServiceDesk__ClassName__c from BMCServiceDesk__CMDB_Class__c where BMCServiceDesk__ClassType__c  = 'Class' and BMCServiceDesk__ClassName__c='BMC_COMPUTERSYSTEM'

Modified Java Script Value Step (Get CMDB Class Id)

This step converts the CMDB class name to upper case, which will be used in the upcoming comparison.
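A minimal sketch of the conversion (the field value below is illustrative, not taken from a real record):

```javascript
// Upper-case the class name so the later lookup is case-insensitive.
// The input value is illustrative only.
var className = "BMC_ComputerSystem";
var classNameUpper = className.toUpperCase();
console.log(classNameUpper); // "BMC_COMPUTERSYSTEM"
```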

Stream value Lookup (Get CMDB Class Id)

This step looks up the class name from the incoming stream against the stream coming from the JSON step.
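Expressed in plain JavaScript, the lookup behaves roughly like the sketch below. The field names and ID value are invented for illustration — in the transformation this join is configured in the Stream Value Lookup dialog, not in code:

```javascript
// Sketch: what the Stream Value Lookup does, in plain JavaScript.
// classRows stands in for the Salesforce Input output; dataRows for the
// JSON/transform stream. All field names and values are illustrative.
var classRows = [{ Id: "a0Ixx0000001", ClassName: "BMC_COMPUTERSYSTEM" }];
var dataRows = [
  { hostName: "host-01", className: "BMC_COMPUTERSYSTEM" },
  { hostName: "host-02", className: "BMC_COMPUTERSYSTEM" }
];

// Build a lookup table keyed on the upper-cased class name...
var lookup = {};
classRows.forEach(function (r) { lookup[r.ClassName.toUpperCase()] = r.Id; });

// ...and attach the matching class Id to every incoming row.
var joined = dataRows.map(function (row) {
  row.classId = lookup[row.className.toUpperCase()] || null;
  return row;
});
console.log(joined[0].classId); // "a0Ixx0000001"
```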

Sort (Sort rows)

In this step we sort all the rows that will be upserted into Salesforce.

Unique rows step (Unique rows)

Remove duplicate rows based on the unique identifier defined in the stream; the input must already be sorted on that key.
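The sort-then-dedupe pair of steps behaves like this plain-JavaScript sketch; the unique key (serialNumber) and row values are hypothetical:

```javascript
// Sketch: Sort rows + Unique rows, in plain JavaScript.
// The unique key here is a hypothetical serialNumber field.
var rows = [
  { serialNumber: "SN456", name: "host-02" },
  { serialNumber: "SN123", name: "host-01" },
  { serialNumber: "SN123", name: "host-01-dup" }
];

// Sort on the unique key (what the Sort rows step does).
rows.sort(function (a, b) { return a.serialNumber.localeCompare(b.serialNumber); });

// Unique rows keeps the first row of each run of equal keys.
var unique = rows.filter(function (r, i) {
  return i === 0 || r.serialNumber !== rows[i - 1].serialNumber;
});
console.log(unique.length); // 2
```

This is why the Unique rows step must come after a Sort: it only compares each row with the previous one.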

Salesforce Upsert step (Upsert into class Computer System)

Perform all the required mappings from the incoming transformation stream to the Remedyforce CMDB fields.

Text File Output (Failure Rows and Success Rows)

This step generates the files containing the failure/success records. You can define the fields that should appear in the log files.




Report Data Access based on User Profile in Smart Reporting (Part 2)

In the previous article (Report Data Access based on User Profile in Smart Reporting (Part 1)), we talked about how to create the source filter. In this article, we will talk about how to apply the source filter to a field in the View and create a report that uses that source filter.

After creating the source filter, you can apply it to a specific field in the view. For example, if you want to restrict users to viewing only the Service Requests that are requested for their department, you can do the following:

a. From the Admin Console of Smart Reporting, go to the “View” section, select “Service Request Management View” and click “Edit”.

b. Go to the second step from the top of the page.

c. Select the field from the view that you want to map to the source filter (in this example, double-click on “Customer Department”).

d. Go to the “Access” tab of the field options, select “Department” in the “Access Filter” field, and then click “Save”.

e. You can select the filter to apply as the default. For example, if you select “Department” as the default, reports created against this view will always have the “Department” filter selected by default.


Create Report and Apply the Source Filter

After mapping the field to the source filter, you can start creating reports using the “Service Request Management” view updated above. The report results will be filtered based on the user’s profile when the Source Filter is selected in the report.


You can apply other filter types as needed by updating the SQL query and mapping it to the corresponding objects. For example, you can add a filter to display incidents based on the support group the user belongs to, filter service requests based on the user’s site, and so on.


Hope this article is useful for you. Have a nice day! :)