Category: Blog (Page 1 of 4)

Inspect Maximo Outbound Request with Automation Script

Problem:

When building an integration in Maximo, using an inspection tool such as webhook.site to inspect the requests is extremely useful. It allows us to easily identify:

  • Whether Maximo published a message at all (to confirm the Event Filter)
  • What data Maximo sends to the external system (to confirm the data mapping logic)
  • The values of certain request headers (usually to confirm that the authentication details are correct)

If you are an integration developer and haven’t used webhook.site, I encourage you to test it with Maximo. Spending 10 minutes now will save you countless hours of head scratching in the future.

However, in certain situations, this approach does not work. For example, a friend recently came to me for help with setting up an endpoint which uses Microsoft Entra’s OAuth authentication. In this environment, the client has strict firewall and security settings, which means Maximo can only send requests to a few whitelisted IP addresses. Using an online or locally installed inspection tool is not possible. We suspected Maximo was not sending JD Edwards (the external system in this case) a correct Authorization header, but did not know how to confirm it.

Solution

An Automation Script can be used to build a fully functional web API. We can use it to capture the details of a request and write them to the system log or display them in the response. The request’s details are accessible in the autoscript via the following implicit variables:

  • request: exposes important methods such as request.getQueryParam("parameter") and request.getHeader("header") to retrieve values provided as a query parameter or header respectively. You can also access the UserInfo by calling request.getUserInfo().
  • requestBody: a string representation of the data submitted with the request (for POST/PATCH/PUT)
  • httpMethod: the HTTP method of the request (GET, POST, PUT, etc.)
  • responseBody: a byte[] or String to return in the response. This is not required if you do not intend to return a response.

For more information, please refer to IBM’s official Maximo Automation Script documentation.

To provide an example, I set up a mock API by creating an automation script named MOCKAPI as follows:
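A minimal sketch of such a script might look like the following. The function wrapper and the specific headers listed are illustrative assumptions; in Maximo, httpMethod, request, requestBody, and responseBody are implicit variables, so the script body would simply wire them into the helper as shown in the trailing comment:

```python
# Sketch of a MOCKAPI automation script. The helper below is plain
# Python so the logic can be read and tested outside Maximo.

def describe_request(http_method, headers, query_params, body):
    """Build a text summary of an incoming request."""
    lines = ["httpMethod: " + str(http_method)]
    for name in sorted(headers):
        lines.append("header %s: %s" % (name, headers[name]))
    for name in sorted(query_params):
        lines.append("param %s: %s" % (name, query_params[name]))
    if body:
        lines.append("body: " + body)
    return "\n".join(lines)

# Inside Maximo, the script body would be roughly:
# headers = dict((h, request.getHeader(h))
#                for h in ["Authorization", "Content-Type"])
# params = {"debug": request.getQueryParam("debug")}
# responseBody = describe_request(httpMethod, headers, params, requestBody)
```

Echoing the details back in responseBody makes the script behave like a self-hosted webhook.site that never leaves the Maximo server.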

To confirm that it is working, I run the script by opening its URL in a browser and can see the details of the browser’s request as follows:

In this Maximo environment, I have an endpoint to Microsoft Dynamics 365 – Business Central. It uses Microsoft Entra OAuth. If I want to inspect the requests to this endpoint, I need to change the URL to direct it to the MOCKAPI script as follows:
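Assuming the script is named MOCKAPI, the change amounts to pointing the endpoint’s URL field at the script’s REST URL (host names below are placeholders; Maximo exposes automation scripts at /oslc/script/{scriptname}):

```
Before: https://<business-central-host>/<original-api-path>
After:  https://<maximo-host>/maximo/oslc/script/MOCKAPI
```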

Note: Depending on the authentication settings in Maximo, we either have to pass the &_lid=xxx&_lpwd=yyy parameters in the URL, or set a MAXAUTH or APIKEY in the Headers field.

Now if I test the endpoint with some text data, it will give me a response with the details of the request as follows:

By using this method, in the investigation mentioned above, we were able to confirm that Maximo sent JD Edwards the correct token in the Authorization header. However, because JD Edwards required a few other header parameters which we had forgotten to include, it still gave us a confusing “Unauthorised” error. After updating the headers, the endpoint worked as expected.

PYODBC – Login Timeout Expired error

I just want to document a strange issue I had today. I have a small Python program that uses the pyodbc library to connect to a SQL Server database on my local machine. The code was working fine two weeks ago, before I left it to work on a different project. Below is a simplified version of the code, in which pyodbc is used to read data from SQL Server 2022:
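The code is reproduced here as a sketch. Server, database, and table names are placeholders; the relevant detail is the Driver={SQL Server} entry in the connection string. The pyodbc import sits inside the function only so the sketch can be read without pyodbc installed:

```python
# Simplified sketch of the original code. Note the old "SQL Server"
# driver name in the connection string.
conn_str = (
    "Driver={SQL Server};"
    "Server=localhost;"
    "Database=MyDatabase;"
    "Trusted_Connection=yes;"
)

def read_data(connection_string):
    # In the real script, "import pyodbc" sits at the top of the file.
    import pyodbc
    conn = pyodbc.connect(connection_string)
    cursor = conn.cursor()
    cursor.execute("SELECT TOP 10 * FROM MyTable")
    rows = cursor.fetchall()
    conn.close()
    return rows
```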

When I tried running it today, it did not work. I had the following error instead:

pyodbc.OperationalError: ('HYT00', '[HYT00] [Microsoft][ODBC SQL Server Driver]Login timeout expired (0) (SQLDriverConnect)')

While it looks like a network issue, there has been no change to my laptop or home network. Looking up this error on the internet did not give me any solution either.

However, this article suggests we can use a newer ODBC driver with pyodbc. My ODBC Data Sources tool (found in Windows’ Control Panel) showed that I have ODBC Driver 17 for SQL Server installed.

So, I changed my driver as in the example below and it worked fine:
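The only change was the driver name in the connection string (the other values are placeholders, as before):

```python
# Same connection string with the driver switched to
# "ODBC Driver 17 for SQL Server", which resolved the timeout error.
conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;"
    "Database=MyDatabase;"
    "Trusted_Connection=yes;"
)
# conn = pyodbc.connect(conn_str)  # connects where the old driver timed out
```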

I could not find out what caused the previous driver ({SQL Server}) to stop working.

Maximo training books from archive

This is a list of old training books created by MRO before it was acquired by IBM. I used these to learn Maximo when I started with it. While these courses were designed for Maximo 7, the core processes remain the same in the newer versions. You can use these books to learn Maximo if you can’t find a more recent course elsewhere.

Immersion Training: this book should be the starting point for an absolute beginner to learn the basic Maximo applications and processes. It covers the following topics:

  • Setting up the Organisation & Site structure
  • Creating Person and User records
  • Setting up Inventory items
  • Creating Asset records
  • Setting up Job Plans and Preventive Maintenance, and generating PM Work Orders
  • Reactive Work Management process
  • Requisition and Reordering

System Administration Training: this book is for a beginner who wants to learn administration tasks and application configuration. It covers the following topics:

  • Users & Security Control
  • Report Administration
  • Database Configuration
  • Domains
  • Other configuration applications: Escalations, Bulletin Board, and Communication Templates

Below are some other books that focus on certain specific topics:

  • Functional:
    • Inventory Management: covers Inventory management topics such as Item Master, Storeroom, Inventory Transactions, Reordering, and Item Kits
    • Procurement: covers the purchasing processes including setting up Companies, Contracts, Purchase Requisitions, Purchase Orders, Receipts, and Invoices.
    • Work Management: covers the work management processes including setting up Job Plans and Preventive Maintenance plans, generating Work Orders from PM, and scheduling, dispatching, executing, and completing Work Orders
  • Configuration:
    • Application Design: this book covers how to use the Application Designer to modify Maximo UI screens or create a new application
    • Workflow Management: how to use the Workflow Designer app to modify or create a new workflow in Maximo
    • Migration Manager: how to use Migration Manager to migrate configuration between Maximo environments

The matrix below shows the training tracks, which provide some guidance on what kind of knowledge you need to acquire for each user role.

How to use custom endpoint for dynamic delivery?

The challenges

Enterprise application integration needs to be fast and fault-tolerant. As a result, when it comes to integration with Maximo, we almost always opt for asynchronous, near real-time message delivery. This method of delivery has certain challenges, which are not a big problem when using an external enterprise integration tool; there are dozens of these tools available on the market.

However, running an enterprise application integration system is expensive, so many smaller companies opt for direct integration instead. The Maximo Integration Framework offers an arsenal of tools to handle all kinds of requirements. Yet, the framework is still lacking in two major areas:

  • Data mapping
  • Dynamic delivery

Previously, I provided an example of how to use XSLT for data mapping. In this article, I will discuss how to address the dynamic delivery requirement.

Dynamic delivery with a custom automation script endpoint

When is dynamic delivery required?

In many cases, we need to build an integration interface with one or more of the following requirements:

  • Different URLs for each Add/Update/Delete action: this is a common pattern for REST APIs nowadays. For example, the ServiceNow API has three different URL formats to Add, Update, and Cancel a Service Order.
  • The target application needs an Internal ID for the Update request: this is also a common pattern. The Maximo OSLC REST API itself is an example.
  • The need for retrying and reprocessing failed deliveries: message delivery is expected to fail due to many external factors. Thus, this is a mandatory requirement for almost all integration interfaces.

In general, the logic to address the above requirements should be handled during delivery, not during the publishing and transformation of the integration message.

Why use this approach?

When the external system requires an Internal ID for the Update action, one common strategy is to store the ID in the EXTERNALREFID field. For example, when a new PO is created, it is published to an external ERP system. Upon successful delivery, the receiving system responds with the Internal ID of the newly created record. We can store this value in the PO’s EXTERNALREFID field or a custom field. The next time the PO needs to be published, if this field has a value, we know that the PO has already been created in the ERP and thus will send an Update request instead of a Create New request.

But this strategy often does not work well with asynchronous integration. For example, if the first request to create a new PO takes a while to process due to the external system being busy, a subsequent update to the PO will send another Create New PO request, resulting in duplicated POs in the target system.

However, if we handle the logic to determine whether to send a Create New or an Update during delivery, this is not an issue. If the first request fails, the second request will still be determined to be a Create New. If both messages fail and are later reprocessed successfully, the first one that reaches the ERP will be delivered as a Create New, and the second one as an Update.

Two key benefits of this approach are:

  • No need for manual intervention
  • Can use a continuous outbound queue instead of a sequential queue

Real-world Implementation

Over the last 2-3 years, I have built a few interfaces using this approach, and so far they have been working well. The latest one was an interface between Maximo and MYOB Greentree ERP. In this case, Maximo handles the procurement process and Greentree handles the accounts payable process. The functionality of the Greentree API is somewhat limited: it does not support a Sync operation and requires an Internal ID for both the PO and the POLINE records. To address this requirement, the interface is implemented with the following key points:

  • Data mapping is handled by XSLT
  • Delivery is handled by an automation script endpoint:
    • On delivery, the endpoint queries Greentree to identify the Internal ID and status of the PO and PO Lines. If the PO already exists, it determines whether to Update or Add New PO lines depending on the status of each line.
    • The original payload is manipulated during delivery to match the current status of the PO in Greentree. This means that if a delivery fails, the retry will come up with a new payload matching the status of the record in Greentree at that point in time.
  • An autoscript crontask is added to send error alerts to the users in Excel format, but it only sends an email if there is at least one error. The email is delivered directly to the users because most of the errors come from missing or incorrect GL Accounts.

Automation Script example

To implement the requirement above, we can create a custom endpoint with automation script. The endpoint can be associated with the publish channel and external system like other standard endpoints. That means the interface is fully compatible with other MIF functionalities including:

  • Message Queues
  • Message Tracking
  • Message Reprocessing

Below is an example piece of Python code for the endpoint:
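A minimal sketch of the idea follows. It is written in Python 3 style with urllib.request for readability (Maximo’s Jython runtime would use urllib2 or Java’s HttpURLConnection instead); the target URL, headers, and function names are illustrative assumptions:

```python
# Minimal sketch of a custom automation-script endpoint. In Maximo, the
# published message arrives in the implicit variable requestData, and
# any uncaught exception marks the delivery as failed (the message goes
# to the error queue). TARGET_URL is a placeholder.

TARGET_URL = "https://erp.example.com/api/purchaseorders"

def build_request(url, payload):
    # Build a POST request carrying the published XML message
    from urllib import request
    return request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )

def deliver(payload):
    from urllib import request
    req = build_request(TARGET_URL, payload)
    resp = request.urlopen(req)  # raises HTTPError on 4xx/5xx responses,
                                 # which Maximo treats as a failed delivery
    return resp.read()

# In Maximo, the script body would end with:
# deliver(requestData)
```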

The basic example above does a similar job to a standard HTTP endpoint:

  • The published message is passed to the script in the implicit variable requestData
  • Any failure/exception at any line of the code is treated as a failed delivery:
    • The message is put back into the error queue.
    • The error is captured and can be seen in the Message Reprocessing application.

However, by doing this, we have full control over what happens during delivery, such as:

  • Query external system to determine if a record already exists, and if yes, get the Internal ID
  • Raise an exception to tell Maximo that the delivery has failed. This can be quite handy in certain scenarios, such as when an external system rejects a message but still responds with an HTTP 200 code and only provides an error message inside the XML response body.

Below is an example piece of code which does the following:

  • Query an external system to identify if a PO record exists
  • If the record exists, update the XML payload with the Internal ID before delivering it to the system as an Update request instead of a Create.
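The steps above can be sketched as follows. The element names (PONUM, INTERNALID), the payload shape, and the lookup function are illustrative assumptions, not a real ERP API; in a real endpoint, get_internal_id would query the external system:

```python
# Hypothetical sketch: decide between Create and Update during delivery.
import xml.etree.ElementTree as ET

def get_internal_id(ponum):
    """Return the external system's Internal ID for the PO, or None.
    A real endpoint would call the target system's query API here."""
    raise NotImplementedError

def prepare_payload(xml_payload, lookup=get_internal_id):
    """Return (payload, is_update); inject INTERNALID if the PO exists."""
    root = ET.fromstring(xml_payload)
    ponum = root.findtext(".//PONUM")
    internal_id = lookup(ponum)
    if internal_id is None:
        return xml_payload, False      # deliver as a Create New request
    parent = root.find(".//PONUM/..")  # element that holds the PO fields
    if parent is None:
        parent = root
    # Inject the Internal ID so the target treats this as an Update
    ET.SubElement(parent, "INTERNALID").text = internal_id
    return ET.tostring(root, encoding="unicode"), True
```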

How to remove HTML tags in Long Description field

Introduction

The Long Description field contains data in rich-text format. When sending this data to an external system, the special characters in the HTML tags often cause trouble for the integration interface. They can also result in unreadable text being displayed in the receiving application. This post provides instructions on how to easily strip the rich-text format from Maximo’s Long Description field, keeping only the plain text.

Common problems with Long Description

The Long Description field is used by many key objects such as Service Request, Purchase Order, and Work Log. In Maximo, Service Request and the Work Log’s Details are perhaps the most common places where this field is used.

When raising a ticket or recording some logs, users often copy the information from other sources such as email. This does not usually cause problems on the Maximo front-end, because most of the formatting copied from standard applications like Word, Outlook, or the browser is retained in this field.

However, when it comes to integration, it is a different story. For system integration, there are two common problems with the Long Description data due to the HTML tags of the richtext format:

  • Many integration tools have trouble escaping the special characters contained in the field data. This often results in integration failure. Even if the tool can handle these special characters gracefully, the additional characters added by escaping can exceed the field length limit of the receiving application.
  • The external application may not support this format; in that case, the rich-text content is displayed as-is with a lot of HTML tags, making it unreadable to the users.

In many cases, retaining the format of the text is not desirable, and the customer might prefer to keep the data as plain text. In such cases, disabling the Rich Text Editor is a better solution. However, if we want to retain the formatting and only remove it when sending the data to an external system, the following section describes how to achieve it.

How to strip rich-text tags from Long Description?

Requirement

Below is an example of an integration interface between IBM Maximo and Gentrack Unify CRM at a water utility. When carrying out maintenance work, if there are delays, the field workers put the Work Order on Hold and enter the details of the delay as a Work Log entry. This information must be sent to Unify CRM so that, if the customer calls up, the call centre staff can explain why the work has not been completed.

Setup a simple interface

To provide a simplified example, in a vanilla Maximo instance, we will set up the following integration objects:

  • Object Structure: ZZWORKLOG – make sure to include the LONGDESCRIPTION field
  • Publish Channel: ZZWORKLOG – make sure to enable the event listener and message tracking
  • External System: WEBHOOK – I added an HTTP endpoint to webhook.site for testing

Sample text with rich-text format

To test the code, we will create new Work Log entries using the same text with some formatting and hyperlinks as follows:
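Any Work Log entry with some formatting and a hyperlink will do. For illustration, the rich-text content stored in the field might look like this under the hood (the text itself is a made-up sample):

```html
<p>Work delayed due to <b>heavy rain</b> at the site.</p>
<p>Forecast: <a href="http://www.bom.gov.au">www.bom.gov.au</a></p>
```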

XML Output without stripping

Without any customisation, Maximo publishes an XML message to Webhook as shown in the image below. As you can see, it contains a lot of HTML tags, in which the special characters have been escaped to be compatible with the XML format.

Add User Exit logic with Automation Script

For this requirement, we can use the utility class psdi.util.HTML from the Maximo library, which is available out of the box. To strip the rich-text tags from the Long Description field before sending the data to an external system, we can create User Exit processing logic with an automation script as follows:

  • From the Automation Scripts application, choose Actions > Create Script for Integration
  • Choose the Publish Channel option and select the ZZWORKLOG publish channel
  • Choose User Exit option
  • Choose Before External Exit
  • Language: Python
  • Source Code:
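A sketch of the script source is below. In Maximo, irData is the implicit variable holding the record data being published, and psdi.util.HTML.toPlainText is the out-of-the-box tag-stripping utility; the function wrapper and the injected to_plain_text parameter are only there so the logic can be read and tested outside Maximo:

```python
# Sketch of the "Before External Exit" user exit logic (Jython in Maximo).

def strip_long_description(ir_data, to_plain_text):
    # ir_data: the record data for the published ZZWORKLOG entry
    longdesc = ir_data.getCurrentData("LONGDESCRIPTION")
    if longdesc:
        # Replace the rich-text content with its plain-text equivalent
        ir_data.setCurrentData("LONGDESCRIPTION", to_plain_text(longdesc))

# Inside Maximo, the script body would be:
# from psdi.util import HTML
# strip_long_description(irData, HTML.toPlainText)
```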

XML Output after removing formatting

If we create a new Work Log entry using the same text above, the XML output will only contain plain text, as shown below. The text is much more readable now, and it still contains the hyperlinks, which can be an important piece of information.

Other Notes

There is another frequent problem when sending Long Description data to an external system: it often exceeds the field length limit of the external system. While you are at it, double-check the length limit of the receiving field. With Description and Long Description, it is a good idea to always truncate the value to fit the target field. In the case of Long Description, we might even want to split the long text into multiple records to avoid integration failures in the future.

If we want to strip the rich-text formatting when it is first entered in Maximo, so that only the plain-text content is saved to the database, we can use the same toPlainText function in an automation script when saving the record.

Some problems with authentication in MAS and how to work around them

The new Maximo Application Suite is still new to me, and it took me a bit of time to get past some of the basics. This post contains a few quick notes; hopefully, they can save you a bit of time when facing the same problems. This is purely personal observation, and the solutions are based on trial and error. I haven’t dug through the official documentation to get a good understanding of how it works, so please don’t quote me if the information provided here is incomplete or if it doesn’t work for you.

Slow Login Process

In Maximo 7, opening the Maximo URL and getting past the login screen (with a saved password) usually takes 2-3 seconds. However, with MAS, the login screen is much slower to load. It is likely due to the SAML integration which we cannot do much about. After logging in, the Suite Navigator screen also takes a long time to load. The whole process takes about 15-20 seconds.

Once the screen is loaded, I reckon, 99% of the time, people will go to the Manage application (which is the old Maximo app).

To access Maximo faster, after logging in, I bookmarked the Manage screen. The next time, I can access Manage using the bookmark. Although I still have to log in, it skips the Suite Navigator screen and takes me straight to the Manage application, saving me about 10 seconds.

REST API URL

In Maximo 7.6.x, to retrieve a list of assets, I can send the request below:

https://[MAXIMO_ROOT]/maximo/oslc/os/mxasset?oslc.select=assetnum,location,description,status&oslc.pageSize=10&lean=1

It will result in something like this:
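With lean=1, the response is a plain JSON object with a member array, roughly like the following (the values here are illustrative):

```json
{
  "member": [
    {
      "assetnum": "11430",
      "location": "BR300",
      "description": "Centrifugal Pump 100 GPM",
      "status": "OPERATING"
    }
  ]
}
```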

However, in MAS (Maximo 8.x), when I sent a similar request using Postman with the same URL format, I got an HTML response with the error message: “You need to enable JavaScript to run this app.”

As it turns out, with MAS, to access the same API endpoint, we need to change the /oslc/ part to /api/ as follows:

https://[MAS_MANAGE_ROOT]/maximo/api/os/mxasset?oslc.select=assetnum,location,description,status&oslc.pageSize=10&lean=1

API Authentication

It appears to me that, with Postman, neither native authentication nor LDAP authentication works with MAS. The only method that seems to work is using an API key, sent in the apikey header as follows:
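For example, a request with the apikey header can be sketched as follows (the host name and key value are placeholders):

```python
# Build a MAS Manage REST API request authenticated with an API key.
from urllib import request

url = (
    "https://mas-manage.example.com/maximo/api/os/mxasset"
    "?oslc.select=assetnum,description,status&oslc.pageSize=10&lean=1"
)
req = request.Request(url, headers={"apikey": "YOUR_API_KEY"})
# resp = request.urlopen(req)  # uncomment to call a real MAS instance
```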

On a side note, I noticed that, with the new API Keys application, anyone who has access to it can see and use the API key of any other user, including the maxadmin account. This seems like a security concern to me. Once someone has access to the keys, they can use integration tools like MXLoader to manipulate data while pretending to be someone else.
