
Must-know tips & tricks to improve data entry efficiency in Maximo

A client recently asked me for a solution for importing fuel consumption data. Each day, their operators refuel about two hundred cranes and vehicles and record the data in an Excel file hosted on SharePoint. The finance team then has to manually key the data into Maximo from these files, which takes a few hours each week. The client asked if we could build an integration or a custom app to automate the process.

After looking at various more complex options, we ended up using Maximo’s standard import function. The accountants copy and paste the data from Excel into a defined template and then import it into Maximo. The time it takes has been reduced to a few minutes. Everyone is happy. The Maximo administrator is happy because this is a maintenance-free solution. The accountants are obviously happy as this helps with one of their most tedious tasks. Neither the finance team leader nor the Maximo administrator was aware of this standard capability, and they have started discussing applying it to applications like Purchase Orders and Invoices. From a few other recent conversations, I realised many Maximo users are not aware of some basic but useful features and tools. Hopefully this post will help close some of that gap.

Import/Export function

We can upload/download almost anything using the standard Import and Export functions. They are not enabled by default in most applications because they require some configuration, so most Maximo users are not aware of them. Even where the Import/Export feature is being used, users often don’t know how flexibly it can be configured to address complex requirements. We can even customise the inbound/outbound processing rules to perform data transformation.

As an example, for the scenario above, I set up a template that has almost exactly the same columns as the Excel data file the operators are using. This allows the accountants to copy and paste the data from the Excel file to the import template in just a few clicks.

For more details about this function, you can refer to the video below from Starboard Consulting:

Default Insert Fields

This is the feature I love the most in Maximo. Unfortunately, it is not used in most of the companies I have had the chance to work with. Earlier in my career, when I did a lot of greenfield implementations, this was the first thing I talked about when discussing screen design. It is so useful that it helped minimise a lot of the resistance from procurement and logistics users when Maximo was introduced to them. When creating a Purchase Requisition or issuing material, the data fields are usually repetitive.

For example, in a material issue transaction, the lines usually have the same charge details like work order, asset, and GL account. By putting these details in the default insert fields, the user will only have to enter the item number and quantity for each line. You can refer to the short demo below to see how it works:

“Stupid” Smart Fill / Type-Ahead / Asynchronous

In Maximo, there are various features we can tweak to increase data entry speed. Some of them are:

  • Smart Fill: on a look-up field such as Item Number, if you type BEAR and only one item matches those first few characters, such as BEARING1001, Maximo fills in the whole value for you. The issue is that when you type an exact item number and more than one option partially matches it, Maximo shows a magnifying-glass button, forcing the user to click and select. This means the user has to move their hand between the keyboard and the mouse. By turning Smart Fill off on a field, it accepts the exact value you entered without questioning it. I once helped turn this off on the Bin Number field for the Inventory Issue/Transfer screens. It only took a minute to make the change, but it made the user “exceedingly happy”; those were her exact words when giving feedback to my boss. Below is a quick demo of the difference when Smart Fill is turned off:
  • Type-Ahead: after typing a few characters, Maximo will show a list of available options for you to pick from. This needs to be configured to work.
  • Asynchronous Data Validation: after updating a field, the user can move on and update the next field instantly without having to wait for the server to validate the data. In most cases, however, the validation is almost instantaneous and this feature provides little benefit. On the other hand, after entering an invalid value, the user has to update another field before the error message appears. This can actually be counterproductive and annoying. The key takeaway here is that if you don’t like this behaviour, we can easily turn it off locally for certain fields, or for the whole system.

Bulk Apply Change

Many good applications provide the user with the capability to bulk-apply changes to multiple records. Unfortunately, Maximo doesn’t provide this as a standard feature. The good news is that it’s easy to implement a bulk-apply function on the List screen. In the past, this required some Java customization; in newer versions, we can set it up with an automation script. Below is an example of how it can be implemented to mass update work order scheduling details, probably the most common requirement for bulk updates in Maximo. I built this example to demonstrate the point; it took me less than 30 minutes by following this excellent tutorial.
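My example is not reproduced in full here, but the sketch below shows the general shape of such a script, assuming an action launch point on the WORKORDER object exposed on the List tab via a signature option; in that setup Maximo typically runs the action once per selected row, with the record available as the implicit variable mbo. The seven-day offset is an arbitrary placeholder.

    # Action launch point script on WORKORDER (sketch only).
    # When run from the List tab with rows selected, the action is applied
    # to each selected record; "mbo" is the current work order.
    from psdi.mbo import MboConstants
    from java.util import Calendar

    schedstart = mbo.getDate("SCHEDSTART")
    if schedstart is not None:
        cal = Calendar.getInstance()
        cal.setTime(schedstart)
        cal.add(Calendar.DAY_OF_MONTH, 7)  # push the scheduled start out one week
        mbo.setValue("SCHEDSTART", cal.getTime(), MboConstants.NOACCESSCHECK)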

MXLoader

MXLoader is a data upload/download tool written by Bruno Portaluri. It runs on top of Excel and is free to use. In case you don’t know what it is, below is a short introduction video. The only complaint I have about this tool is that it is too powerful. With MXLoader, we can update almost anything in Maximo, which can be a concern in terms of data security and integrity. Because it is so easy and fast to mass update data, the damage is multiplied when a user makes a mistake. Other than that, for data entry, nothing beats the efficiency of feeding data directly from Excel into Maximo.

Conclusion

This post does not introduce anything new. My goal is to remind you that if you are a Maximo user who has to do a lot of data entry, speak to your Maximo support people. There are simple tricks they could do to improve your experience. If you think data entry in Maximo is a pain in the neck and they don’t have an answer, shout at them, bully them, or threaten to fire them. People often find creative solutions under pressure. Cutting down a few clicks doesn’t sound like much, but if you have to enter a few hundred lines of data, it can make a big difference. If you have any other data entry problem that can’t be addressed by the tricks mentioned above, please share in the comments. I will see what else can be done to address it.

How to approve workflow via email using automation script?

Maximo has an Email Listener and an Email Interaction application. I have seen the Email Listener used quite effectively by some clients, despite some annoying bugs that required fixes with custom Java code. I haven’t seen the Email Interaction app used in a real production environment. Recently, I attempted to use the two applications without success and ended up writing a simple email listener using an automation script instead. This post outlines the detailed steps in case someone needs them in the future.

Requirement

A major port uses workflow for their purchase requisition approval. The first-level approvers are first-line supervisors who often work in the field and do not have access to a computer most of the time. This causes some delays in the approval process. They asked if they could reply to an email from their mobile phones to approve the workflow assignment instead.

Analysis

To address this requirement, my first solution was to include two hyperlinks at the end of the workflow assignment emails. In the client’s current workflow, when a new PR is submitted to a supervisor, he already gets an email notification from Maximo. We could add an Approve and a Reject link at the end of this email, each triggering the respective action. However, this approach requires the user to enter a username and password to authenticate with Maximo for the links to work. We could hard-code an API key in the hyperlink or remove the authentication requirement, but that is unacceptable for security reasons.

My next solution was to use the Email Interaction and Email Listener applications. I followed these instructions provided by IBM. The article is quite detailed; however, it misses a few key pieces of information that I had to decompile the code to figure out. Despite that, after spending 8 hours, I couldn’t get it to work for this simple scenario. There were two issues:

  • Email Interaction didn’t seem to populate the record ID correctly.
  • The Email Listener cron task gave me a NullPointerException error without any other information to troubleshoot with.

It looked like I needed to debug and write some custom Java code to address those issues, which would likely take at least another day. I decided to write a simple email listener instead. Below is the solution that I came up with.

Solution

Note: For this example, I used the standard Maximo demo instance. It already has a simple approval workflow named “PRSTATUS”.

First, I duplicated the standard WFASSIGN communication template to create a new template and named it PRAPPRASSIGN. I added the @@:OWNERID@@ placeholder to the email subject and added some additional instructions at the end of the email.

Duplicate and create a new email template with an ID placeholder in the subject
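To give a concrete idea (the wording here is my own, not the original template), the subject line of the new template might look something like the line below, with the @@ markers left around the placeholder so the listener can pick the ID out of the reply subject:

    New PR assignment @@:OWNERID@@ - reply 1 to approve or 2 to reject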

For the PRSTATUS workflow, I edited the SUP_APPR assignment node to enable sending emails when new assignments are created and set it to use the communication template created in the previous step.

Enable email notification on workflow assignment

As you can see, there is no crazy configuration here. The workflow will work exactly as before. The only change is that the outgoing email will also carry a record ID in the subject. When the user replies to the email, Maximo will use that ID to identify which PR record to approve.

The next step is to build a simple email listener. For testing, I created a new Gmail account and enabled App Password so it can be accessed from Maximo.

I created an automation script with the source code below and set up a cron task to execute it every 5 minutes. This simple script uses the JavaMail client library to access Gmail and finds any unread emails that have an ID in the subject. If it can match the ID with a PR record that has an active workflow assignment, it approves or rejects the request depending on whether the reply is 1 or 2. It also does a simple check to make sure the sender’s email matches the assignee’s email before routing the workflow.
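The original script is not reproduced here, so below is a rough sketch of the mail-polling half of such a listener based on the description above. The Gmail account details and the subject-parsing pattern are assumptions, and the final routing call is left as a placeholder because the exact workflow API call depends on the Maximo version.

    # Sketch of a simple email listener (run by a cron task every few minutes).
    import re
    from java.util import Properties
    from javax.mail import Session, Folder, Flags
    from javax.mail.search import FlagTerm
    from psdi.server import MXServer

    HOST = "imap.gmail.com"
    USER = "my.listener@gmail.com"   # hypothetical mailbox
    PWD = "app-password-here"        # Gmail app password

    props = Properties()
    props.put("mail.store.protocol", "imaps")
    store = Session.getInstance(props).getStore("imaps")
    store.connect(HOST, USER, PWD)
    inbox = store.getFolder("INBOX")
    inbox.open(Folder.READ_WRITE)

    mxserver = MXServer.getMXServer()
    userInfo = mxserver.getSystemUserInfo()

    # Process unread messages only
    for msg in inbox.search(FlagTerm(Flags(Flags.Flag.SEEN), False)):
        subject = msg.getSubject() or ""
        match = re.search(r"@@(\d+)@@", subject)   # assumes the ID sits between the @@ markers from the template
        if match is None:
            continue
        ownerId = match.group(1)
        sender = msg.getFrom()[0].getAddress().lower()
        reply = str(msg.getContent()).strip()[:1]  # assumes a plain-text reply: "1" approve, "2" reject

        # Find the active workflow assignment for this PR
        assignSet = mxserver.getMboSet("WFASSIGNMENT", userInfo)
        assignSet.setWhere("ownertable = 'PR' and ownerid = " + ownerId + " and assignstatus = 'ACTIVE'")
        assignment = assignSet.moveFirst()
        if assignment is not None:
            # Check that the sender is the assignee before routing
            personSet = mxserver.getMboSet("PERSON", userInfo)
            personSet.setWhere("personid = '" + assignment.getString("ASSIGNCODE") + "'")
            person = personSet.moveFirst()
            if person is not None and person.getString("PRIMARYEMAIL").lower() == sender:
                accept = (reply == "1")
                # Placeholder: complete/route the assignment here through
                # Maximo's workflow service; the exact call is version-specific.
            personSet.close()
        assignSet.close()
        msg.setFlag(Flags.Flag.SEEN, True)   # mark as processed

    inbox.close(False)
    store.close()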

Usage:

To test this, I set user Wilson to be Maxadmin’s supervisor and set Wilson’s email address to my own. Then, with the Maxadmin account, I picked a PR in WAPPR status and routed it through the workflow. The record was assigned to Wilson, and a notification was sent to my email inbox. To approve the assignment, I replied “1”. After a few minutes, the Maximo cron task kicked off the automation script, which read the reply and approved the workflow. As mentioned earlier, we can still use the Wilson account to route and approve the workflow in Maximo; there is no change to the process.

View Workflow History after PR has been approved by email

How to override HTTP headers for integration end-point with Automation Script?

For an HTTP end-point, we can set a fixed value in the request header. This doesn’t work for header values that must be generated on the fly, such as authorisation tokens. To override the headers of an end-point at runtime, the traditional approach is to write some custom Java code. This post explains how we can avoid Java by using an automation script end-point instead.

To illustrate the approach, I will use an example of an interface between Maximo and an application on Azure via Azure Event Hubs. The API requires a SAS token to be provided in the request header.

First, we will create an automation script end-point by following this tutorial by Alex at A3J Group:

  • Create an end-point:
    • Name: AZEVENTHUB
    • Handler: SCRIPT
    • Script: AZURE_EVENTHUB_ENDPOINT
    • Note: As explained by Alex, the SCRIPT handler is available OOTB from 7.6.1.1. For versions between 7.6.0.8 and 7.6.1.1, we’ll have to manually create a handler that utilises the Java class com.ibm.tivoli.maximo.script.ScriptRouterHandler
  • Create an automation script:
    • Name: AZURE_EVENTHUB_ENDPOINT
    • Language: python
    • Source Code:

This is a bare-minimum example. When the publish channel sends data to this end-point, it calls the automation script, which sets a Content-Type header and sends the payload (implicit variable requestData) by invoking an HTTP handler.
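The script body did not survive here, so below is a minimal sketch of what it could look like, assuming the psdi.iface.router.HTTPHandler class and the standard HTTP end-point property names (URL, HTTPMETHOD, HEADERS); the target URL is a placeholder.

    # AZURE_EVENTHUB_ENDPOINT - bare-minimum sketch
    from psdi.iface.router import HTTPHandler
    from java.util import HashMap
    from java.lang import String

    handler = HTTPHandler()
    meta = HashMap()
    meta.put("URL", "https://webhook.site/your-test-id")   # placeholder target
    meta.put("HTTPMETHOD", "POST")
    meta.put("HEADERS", "Content-Type:application/json")   # fixed header value
    # requestData is the implicit variable holding the outbound message
    responseData = handler.invoke(meta, String(requestData).getBytes("UTF-8"))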

In this case, we’ll use Webhook.site to test the request. To do that, we’ll quickly set up a publish channel named AZWO based on the standard MXWO object structure as follows:

Then we’ll set up an external system named AZEVENTHUB as follows:

To confirm the end-point is working, we update a work order. As a result, we should see a message posted to Webhook as follows:

Now, to get this end-point to work with Azure Event Hubs, we’ll have to populate a “Content-Type” and an “Authorization” header for the request, following Microsoft’s examples on how to generate a SAS token. To avoid having to import third-party libraries into Maximo when there is no suitable OOTB Python library available, I’ll use Java code instead. Below is the source code for the end-point:
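The source code itself is not reproduced here, so below is a sketch of how the end-point script could generate the token, following Microsoft’s published SAS scheme (HMAC-SHA256 over the URL-encoded resource URI and expiry, Base64 encoded). The namespace, event hub, policy name and key are placeholders, and the header-string format is the same assumption as in the previous sketch.

    # End-point script with an Authorization header generated at runtime (sketch)
    from java.lang import String, System
    from java.net import URLEncoder
    from javax.crypto import Mac
    from javax.crypto.spec import SecretKeySpec
    from java.util import Base64, HashMap
    from psdi.iface.router import HTTPHandler

    RESOURCE_URI = "https://mynamespace.servicebus.windows.net/myeventhub"  # placeholder
    KEY_NAME = "SendPolicy"                                                 # placeholder policy
    KEY_VALUE = "shared-access-key-here"                                    # placeholder key

    def build_sas_token(resourceUri, keyName, keyValue):
        expiry = str((System.currentTimeMillis() / 1000) + 3600)   # token valid for 1 hour
        encodedUri = URLEncoder.encode(resourceUri, "UTF-8")
        stringToSign = encodedUri + "\n" + expiry
        mac = Mac.getInstance("HmacSHA256")
        mac.init(SecretKeySpec(String(keyValue).getBytes("UTF-8"), "HmacSHA256"))
        signature = Base64.getEncoder().encodeToString(mac.doFinal(String(stringToSign).getBytes("UTF-8")))
        return ("SharedAccessSignature sr=" + encodedUri
                + "&sig=" + URLEncoder.encode(signature, "UTF-8")
                + "&se=" + expiry + "&skn=" + keyName)

    handler = HTTPHandler()
    meta = HashMap()
    meta.put("URL", RESOURCE_URI + "/messages")
    meta.put("HTTPMETHOD", "POST")
    meta.put("HEADERS", "Content-Type:application/json,Authorization:" + build_sas_token(RESOURCE_URI, KEY_NAME, KEY_VALUE))
    responseData = handler.invoke(meta, String(requestData).getBytes("UTF-8"))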

To test it, we’ll make another update to a work order. On webhook, we should see the result as follows:

How to export Maximo data to Excel using Automation Script?

Earlier, I provided an example of how to extract data and send it as CSV to a user via email. A friend asked me about a requirement he is dealing with. In his case, he has an escalation that sends an email whenever there is an integration error. The problem with this approach is that if there are many failed transactions, the administrator receives a lot of emails.

The alternative approach is to set up a scheduled BIRT report that lists all errors in one file. However, this approach also has a problem: on days when there are no failures, the admin still receives an email and still has to open the file to check whether there is an error.

This is actually a common requirement. Below are some examples:

  • Operation managers like to monitor a list of critical assets. Maximo should send out a maximum of one email per day with the list of active SR and WOs when the asset is down. Do not send emails if there are no issues.
  • Operators like to receive a list of all high-priority work orders reported daily in one email, such as work orders that deal with water quality issues or sewer overflow.
  • The system admin wants to get a list of suspicious login activities daily.
  • System owners like to monitor data quality issues. Only send out a report if there are issues.

Below is an example of how we can extract all P1 work orders in site BEDFORD and save the data to an Excel file. I didn’t include the code to attach the file and send out an email, as it has already been provided in my previous post.
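The script is not reproduced here, so below is a sketch of how such an export could be written, assuming the Apache POI classes that ship with Maximo; the column list and output path are placeholders.

    # Export P1 work orders in BEDFORD to an Excel file (sketch)
    from psdi.server import MXServer
    from org.apache.poi.xssf.usermodel import XSSFWorkbook
    from java.io import FileOutputStream

    COLUMNS = ["WONUM", "DESCRIPTION", "STATUS", "REPORTDATE"]   # attributes to export

    mxserver = MXServer.getMXServer()
    woSet = mxserver.getMboSet("WORKORDER", mxserver.getSystemUserInfo())
    woSet.setWhere("siteid = 'BEDFORD' and wopriority = 1 and istask = 0")

    workbook = XSSFWorkbook()
    sheet = workbook.createSheet("P1 Work Orders")

    # Header row
    header = sheet.createRow(0)
    for col in range(len(COLUMNS)):
        header.createCell(col).setCellValue(COLUMNS[col])

    # One row per work order
    rownum = 1
    wo = woSet.moveFirst()
    while wo is not None:
        row = sheet.createRow(rownum)
        for col in range(len(COLUMNS)):
            row.createCell(col).setCellValue(wo.getString(COLUMNS[col]))
        rownum += 1
        wo = woSet.moveNext()
    woSet.close()

    # Write the file somewhere the Maximo JVM can reach (placeholder path)
    out = FileOutputStream("/tmp/p1_workorders.xlsx")
    workbook.write(out)
    out.close()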

As usual, I test the script by calling it via API. Below is how the data looks when opened in Excel.

For data aggregation or when complex joins are required, we can also run an SQL query to retrieve data. Below is an example that provides a list of locations and the total number of work orders for each location.
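Again, the original script is not shown here; the sketch below illustrates the idea. The connection-handling calls follow a pattern commonly used in community automation scripts, so treat them as assumptions for your version.

    # Aggregate work order counts per location with a direct SQL query (sketch)
    from psdi.server import MXServer

    mxserver = MXServer.getMXServer()
    conKey = mxserver.getSystemUserInfo().getConnectionKey()
    con = mxserver.getDBManager().getConnection(conKey)

    rows = []
    try:
        stmt = con.createStatement()
        rs = stmt.executeQuery(
            "select location, count(*) as wocount "
            "from workorder where siteid = 'BEDFORD' and istask = 0 "
            "group by location order by count(*) desc")
        while rs.next():
            rows.append((rs.getString("location"), rs.getInt("wocount")))
        rs.close()
        stmt.close()
    finally:
        mxserver.getDBManager().releaseContext(conKey)

    # "rows" can now be written to a worksheet exactly as in the previous sketch.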

Below is the data exported by the script.

How to send email with CSV attachment using Automation Script

A client asked me to set up Maximo to automatically export work order data in CSV format and send it out via email. Initially, I suggested using the scheduled report function. We can build a simple BIRT report which has one data table. It can be scheduled to run automatically and send out an email with the data attached in Excel format. The user will simply have to open the file and save it in CSV format.

The solution was not accepted for two reasons:

  • It involves some manual intervention.
  • BIRT Excel format has a limitation of 10,000 rows. If there are more rows, the data is split into multiple worksheets.

To address this requirement, I wrote an automation script that does two things:

  • First, it fetches the MboSet and generates the CSV content using the csv library.
  • Second, it sends an email with the CSV file attached.

Below is a simplified version as an example; it works with the OOTB Maximo demo instance. Note that the csv Python library I used is included with Maximo out of the box, but it is not available to automation scripts by default, so we have to append the Lib folder to the library path near the top of the script.
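The script itself is not reproduced here; the sketch below follows the description above. The Lib path, SMTP details, sender, recipient and the work order filter are all placeholders, and the email is sent with the JavaMail classes already on the Maximo JVM.

    # Build a CSV from an MboSet and email it as an attachment (sketch)
    import sys
    # Make the bundled Jython standard library (csv module) visible to the script.
    # The actual Lib path depends on your installation.
    sys.path.append("/opt/IBM/SMP/maximo/tools/jython/Lib")

    import csv
    import StringIO
    from psdi.server import MXServer
    from java.util import Properties
    from javax.mail import Session, Message, Transport
    from javax.mail.internet import MimeMessage, MimeBodyPart, MimeMultipart, InternetAddress
    from javax.mail.util import ByteArrayDataSource
    from javax.activation import DataHandler

    COLUMNS = ["WONUM", "DESCRIPTION", "STATUS", "REPORTDATE"]

    # 1. Fetch the MboSet and build the CSV content in memory
    mxserver = MXServer.getMXServer()
    woSet = mxserver.getMboSet("WORKORDER", mxserver.getSystemUserInfo())
    woSet.setWhere("siteid = 'BEDFORD' and istask = 0 and status = 'WAPPR'")

    buf = StringIO.StringIO()
    writer = csv.writer(buf)
    writer.writerow(COLUMNS)
    wo = woSet.moveFirst()
    while wo is not None:
        writer.writerow([wo.getString(c) for c in COLUMNS])
        wo = woSet.moveNext()
    woSet.close()

    # 2. Send the CSV as an attachment via the SMTP server configured in Maximo
    props = Properties()
    props.put("mail.smtp.host", mxserver.getProperty("mail.smtp.host"))
    session = Session.getInstance(props)

    message = MimeMessage(session)
    message.setFrom(InternetAddress("maximo@example.com"))            # placeholder sender
    message.setRecipient(Message.RecipientType.TO, InternetAddress("admin@example.com"))
    message.setSubject("Work order export")

    body = MimeBodyPart()
    body.setText("Please find the work order export attached.")
    attachment = MimeBodyPart()
    attachment.setDataHandler(DataHandler(ByteArrayDataSource(buf.getvalue(), "text/csv")))
    attachment.setFileName("workorders.csv")

    multipart = MimeMultipart()
    multipart.addBodyPart(body)
    multipart.addBodyPart(attachment)
    message.setContent(multipart)

    Transport.send(message)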

I created it as a script without a launch point:

To confirm that it works, I call the script by accessing this URL from the browser:

Once confirmed working, I created a cron task to run the script on a schedule. We can also modify this script to address some simple file-based integration requirements by changing the delivery method to SFTP or HTTP POST.

Update: We can also use the Apache POI library (included in Maximo OOTB) to export data in Excel format and send it as an email attachment.

ArcGIS to Maximo synchronisation not working due to API limit

How to send ArcGIS data to Maximo

This is a weird issue with the ArcGIS – Maximo integration cron task. I’d like to record it just in case it hits me again.

Symptom:

The client reported the Maximo – ArcGIS Asset integration stopped working. New assets are not synchronised from GIS to Maximo by the ArcGISDataSync cron task. The history of the cron task instance shows an error: BMXAA6361I – caused by: BMXAA1482E – The response code received from the HTTP request from the endpoint is not successful. Not Found

If I copy the same REST query from the cron task instance and open it in a browser, ArcGIS does return data with an HTTP 200 – OK response, and the response JSON data looks normal.

However, the GIS specialist advised me that the request exceeded ArcGIS’s API limit, which was set at 2,000 records. Inspecting the response closely, there is an attribute “exceededTransferLimit”: true at the end of the JSON. In this case, the feature layer contained more than 5,000 records in the updated state that met the request’s filter criteria (MXCREATIONSTATE=1).

Cause: 

To confirm that the GIS limit caused the issue, we increased it to 10,000, and the cron task ran without error after that. It is unclear how Maximo captures this error: whether it reads the JSON response looking for the exceededTransferLimit attribute, or whether it receives a different HTTP status code from ArcGIS. We didn’t have time to debug Maximo’s code to find out.

Solution:

To fix the issue in this case, we resorted to a short-term solution: increasing the limit to 10,000 to clear the batch, then resetting it back to the default, as we didn’t expect more than 2,000 features to be updated or created per day.

For the long term, I proposed two options:

  • Option 1: Increase the limit to 10,000 or higher permanently. I don’t think it would cause much stress on the servers, although if there are more than 10,000 updates per feature layer per day, it will still fail.
  • Option 2: Set this layer/cron task instance in Maximo to run more frequently (i.e. hourly). This is processed by the background JVM and won’t cause performance degradation for end users. When the cron task runs but gets no results, it doesn’t consume many resources anyway. However, this approach won’t help if we have a large batch update in a short period (e.g. a manual data import or bulk update).