When building integrations in Maximo, an inspection tool such as webhook.site is extremely useful for inspecting outbound requests. It allows us to easily identify:
Whether Maximo published a message at all (to confirm the event filter)
What data it sends to the external system (to confirm the data mapping logic)
What values the request headers carry (usually to confirm the authentication details are correct)
If you are an integration developer and haven’t used webhook.site, I encourage you to test it with Maximo. Spending 10 minutes now will save you countless hours of head scratching in the future.
However, in certain situations, this approach does not work. For example, a friend recently asked me for help setting up an endpoint that uses Microsoft Entra’s OAuth authentication. In this environment, the client has strict firewall and security settings, meaning Maximo can only send requests to a few whitelisted IP addresses, so using an online or locally installed inspection tool is not possible. We suspected Maximo was not sending JD Edwards (the external system in this case) a correct Authorization header, but we did not know how to confirm it.
Solution
An automation script can be used to build a fully functional web API. We can use it to capture the details of a request and write them to the system log or display them in the response. The request’s details are accessible in the script via the following implicit variables:
request: exposes important methods such as request.getQueryParam("parameter") and request.getHeader("header") to retrieve values provided as a query parameter or header respectively. You can also access the UserInfo by calling request.getUserInfo().
requestBody: a string representation of the data submitted with the request (for POST/PATCH/PUT)
httpMethod: whether the request was a GET, POST, PUT, etc.
responseBody: a byte[] or String to return in the response. This is not required if you do not intend to return a response.
To provide an example, I set up a mock API by creating an automation script as follows:
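The script below is a minimal sketch of what such a mock API can look like. The implicit variables request, requestBody, and httpMethod only exist when Maximo invokes the script, so the echo logic lives in a plain helper function and the Maximo-specific part is guarded; the header names listed are just examples of values you might want to inspect.

```python
import json

def build_echo_response(method, headers, body):
    # Assemble a JSON document echoing the request details back to the caller.
    return json.dumps({
        "method": method,
        "headers": headers,
        "body": body,
    }, indent=2)

try:
    # Inside Maximo: collect a few headers of interest and echo everything back.
    headers = {}
    for name in ["Authorization", "Content-Type", "Accept"]:
        headers[name] = request.getHeader(name)
    responseBody = build_echo_response(httpMethod, headers, requestBody)
except NameError:
    # The implicit variables are only defined when run as a Maximo script.
    pass
```

Anything sent to this script (headers, method, body) is reflected in the response, which is exactly what an inspection tool like webhook.site gives us.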
To confirm that it is working, I run the script by opening it in the browser, and I can see the details of the browser’s request as follows:
In this Maximo environment, I have an endpoint to Microsoft Dynamics 365 – Business Central, which uses Microsoft Entra OAuth. If I want to inspect the requests to this endpoint, I need to change the URL to point it at the MOCKAPI script as follows:
Note: Depending on the authentication settings in Maximo, we either have to pass the &_lid=xxx&_lpwd=yyy parameters in the URL, or set a MAXAUTH or APIKEY in the Headers field.
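For reference, automation scripts can typically be invoked through Maximo’s REST API at a URL of roughly the following shape (the script name MOCKAPI is from the example above; the host and the credential placeholders are illustrative only):

```
https://<maximo-host>/maximo/oslc/script/MOCKAPI?_lid=xxx&_lpwd=yyy
```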
Now if I test the endpoint with some text data, it will give me a response with the details of the request as follows:
By using this method, in the investigation mentioned above, we were able to confirm that Maximo sent JD Edwards the correct token in the Authorization header. However, because the endpoint required a few other header parameters that we had missed, it still gave us a confusing “Unauthorised” error. After updating the headers, the endpoint worked as expected.
Maximo is a feature-rich and extensible Enterprise Asset Management (EAM) software. It has many modules that cover a wide range of business processes such as operations & maintenance, procurement, and inventory management. Yet a typical business also employs many other applications for processes that Maximo does not cover. Finance & Accounting software is the first to come to mind. Complex businesses can run an ERP or a Billing & CRM suite in tandem with EAM. Specialised applications are also common in asset-intensive industries.
This article discusses some of the most common integration requirements between Maximo and other enterprise applications. This is a wide topic, and it is impossible to cover everything in depth. Therefore, the intent is to cover the following key points:
Why is the integration required?
What are the common requirements?
Common implementation strategies and things to watch out for.
1. Integration between Maximo and DCS / SCADA / Data Historian
Why is it required?
It makes a lot of sense to feed meter readings to Maximo from a data collection system (DCS). The information is used for generating preventive maintenance work orders. For instance, the odometer reading of a vehicle is used to generate work orders to service the engine and change the oil. Other measurements such as pressure or vibration can be used to set up condition monitoring and generate work orders when an asset operates outside its normal working conditions.
What are the common requirements?
In general, the requirement for Maximo – SCADA integration is a one-way data feed from the source system to Maximo. In most cases, data synchronisation does not need to be high frequency. During the pre-sales stage of a greenfield project, the client often asks for a real-time interface, citing that alarms from critical equipment need immediate attention. In many cases, this is made a mandatory requirement to eliminate competitive bidders. In practice, however, it is rarely justified, because urgent maintenance work is usually identified by control systems or other monitoring tools. Emergency work is usually dealt with outside of Maximo, and the details are only entered into Maximo after the fact for record keeping. As such, meter data fed to Maximo is useful for generating preventive or planned corrective maintenance only.
Things to watch out for
The common implementation strategy for this integration is to have the DCS regularly export data in CSV format at a low-frequency interval. A daily data export works fine in this case.
If realtime integration is required as part of the contractual agreement, REST APIs or web services can be used. However, we need to ensure that the frequency and amount of data sent to Maximo is limited to a reasonable level: just enough for Maximo to generate work orders and produce reports. Indiscriminately sending a large volume of meter data to Maximo can slow down the system while adding little value. After all, Maximo is there to manage maintenance work, not to store every piece of meter data that can be collected; that is the job of a Data Historian.
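The “just enough data” idea can be as simple as summarising readings before they are sent. A minimal sketch in plain Python (the tuple layout here is an assumption for illustration, not a Maximo or DCS structure):

```python
def daily_summary(readings):
    """Collapse raw readings [(meter, day, value), ...] to the last value
    per meter per day -- enough for PM generation and condition monitoring,
    without flooding Maximo with every raw data point."""
    latest = {}
    for meter, day, value in readings:
        latest[(meter, day)] = value  # later readings overwrite earlier ones
    return latest
```

The same reduction could be done with a maximum or an average per day, depending on what the preventive maintenance rules are based on.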
2. Integration between Maximo and CRM
Why is it required?
The requirement to integrate Maximo with a CRM system is common in the Utilities sector. Utility companies provide basic amenities such as gas, water, electricity, and internet services. Billing and CRM is the core system of such businesses, while asset management software like Maximo is used to manage the distribution network infrastructure and track maintenance work.
At a high level, the end-to-end process works like this: when an end customer calls up to set up a new account or raise a new service repair request, it makes sense to keep track of all these details in one place: the CRM system. If new development or repair work is needed, the service order in CRM is sent to Maximo as a new service request. Subsequent work orders are created and assigned to the crew responsible for the geographical area where the work needs to be executed. Work progress and any related details recorded by the field crew in Maximo are sent back to CRM. This ensures all information is available in CRM, enabling the call-center staff to answer any query from the customer. The diagram below illustrates the high-level data flow.
What are the common requirements?
The common requirement for this interface is simple from a business process perspective. Service requests are sent from CRM to Maximo; work order status and any additional details are sent back to CRM. However, the technical requirements for this integration are demanding. It is critical to synchronise the data in realtime or near realtime, and the interface must be highly fault tolerant. This is because the Utilities sector is highly regulated: there are minimum SLA standards applicable to certain types of problems and service interruptions.
Things to watch out for
Although realtime, synchronous synchronisation sounds great for this integration, one must be careful that the performance of the CRM’s UI/UX is not affected by poor performance of the network or external systems. To illustrate this point: in a system I once had the chance to work with, when creating a new service request, Maximo queried the ArcGIS geocoding service to convert a free-text address into a formatted address record linked with an asset or location. Due to the poor performance of the ArcGIS service, it took more than a minute to populate the SR with a formatted address. As a result, the call-center staff and the customer had to stay on the line waiting for this before they could continue filling in other details.
With near realtime, asynchronous integration, one must ensure that the running intervals of the queues are set in seconds, not minutes. Needless to say, there should be no bottleneck throughout the end-to-end data transformation and delivery process.
To ensure the interface is fault tolerant, the integration logic should catch any possible fault scenarios such as:
CRM API is out-of-service, or timeout
Maximo MIF is out-of-service, or timeout
CRM or Maximo rejects a request
Extensive alarms should be set up for all such scenarios so that failed messages can be handled promptly. On that note, when a message cannot be delivered, the integration should provide an error message that is easy to understand and allow the system administrator to easily reprocess the message. It is unacceptable if a system administrator or support staff member has to go to the developer to troubleshoot every time a failure occurs; such instances should be the exception, not the norm.
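The retry side of that fault handling is generic enough to sketch. Here send() is a hypothetical callable standing in for the actual delivery step; mature integration platforms provide this behaviour out of the box, so this is only to make the idea concrete:

```python
import time

def deliver_with_retry(send, max_attempts=5, base_delay=1.0):
    """Retry transient delivery faults (timeouts, dropped connections)
    with exponential backoff; re-raise on the final attempt so the
    failure can be alarmed and later reprocessed by an administrator."""
    for attempt in range(max_attempts):
        try:
            return send()
        except (TimeoutError, ConnectionError):
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)
```

A rejected request (a business validation error, as opposed to a transient fault) should not be retried blindly; it belongs in an error queue for a human to review.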
3. Integration between Maximo and Finance / Accounting software
Why is it required?
For someone new to Maximo, it might appear to have full financial functionality, with its Chart of Accounts setup and finance-related fields seen everywhere. However, these are just ways for Maximo to reference financial transactions and have them ready for integration with external Finance & Accounting software.
Within Maximo, hundreds if not thousands of finance-related transactions can be created daily. Having an interface that sends these transactions to the accounting software not only eliminates manual work but also improves the accuracy of data and ensures the auditability of the books.
What are the common requirements?
The most common requirements for data synchronisation with accounting software are:
Fixed asset register and/or data related to fixed asset depreciation
Things to watch out for
With this integration, it is critical that the data is accurate and fully auditable. Any change to a dollar value should be backed by a transaction or a persistent record, and the numbers should add up, down to the last decimal. To support this, the integration should have full message tracking capability, and the messages should be kept for an extended period (a 30-day retention policy for this integration is probably too short). The integration should automatically retry failed messages and also allow resubmission of a failed transaction.
Another common pattern with this integration is that a lot of issues come up around the end of the financial year. When that happens, due to time constraints, many issues are addressed with temporary fixes and quickly forgotten, only to resurface at the next financial year-end. Having a proper process to follow up on and address any technical debt will help avoid recurring problems.
As an example, a company created a daily CSV export using an automation script in Maximo to send inventory reconciliation transactions to its accounting software. This mechanism worked well during the year, as it usually produced a few hundred transactions per day. However, on the last day of the financial year, the company reconciled more than ten thousand records, and the export failed because it exceeded the maximum number of records Maximo can process in one go (the default value of the mxe.db.fetchResultStopLimit setting is 5000). To quickly address the problem, the support team increased the limit to 15000, reran the export, then set the limit back to its previous value. The same issue happened on the same date for the next two years, each time as an urgent issue that caused a bit of drama for the business. Only then was the export process given a permanent fix: querying the data in smaller chunks.
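That permanent fix boils down to paging. A minimal sketch of the idea, where fetch_page and write_rows are hypothetical stand-ins for the Maximo query and the CSV writer:

```python
def export_in_chunks(fetch_page, write_rows, page_size=1000):
    """Export all rows page by page so that no single query exceeds a
    server-side fetch limit such as mxe.db.fetchResultStopLimit."""
    offset = 0
    total = 0
    while True:
        rows = fetch_page(offset, page_size)
        if not rows:
            break
        write_rows(rows)
        total += len(rows)
        offset += page_size
    return total
```

With a page size safely below the fetch limit, the export handles a few hundred rows and ten thousand rows the same way, so year-end volume is no longer a special case.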
4. Integration between Maximo and ERP
Why is it required?
The core module of an ERP system is Finance & Accounting. Thus, the points discussed in the previous section also apply here. In addition to that, there are a few other common problems specific to the integration between Maximo and ERP.
Between EAM and ERP, there are a few overlapping functions, mainly in the Procurement and Inventory modules. Integrating the two systems provides a seamless end-to-end business process across the organisation.
What are the common requirements?
For an organisation that is in the process of setting up the systems, a few questions are often asked:
Which system should be used to manage the following processes?
Inventory management
Procurement and Payment
What data needs to be synchronised, and in which direction?
There is no standard answer to these questions, and each company will have to make its own decision based on many factors specific to its business. Nonetheless, one important factor is the industry the company is in. For example, for a manufacturing company or a retailer, the ERP is the mission-critical system, while Maximo is only there to manage the assets that support the core business. In that case, the Procurement and Logistics departments handle much more than just the spare parts and services used by maintenance. As a result, it is more likely that these functions are managed in the ERP.
On the other hand, when the core function of a company is operations and maintenance – such as a power plant, a freeway operator, or a utility provider – it is likely that only the Finance & Accounting core module of the ERP software is used, and the procurement and logistics processes, which directly support maintenance, are managed in Maximo.
Integration between Maximo and ERP often demands realtime or near realtime data synchronisation using web services. Similar to the integration with Finance discussed above, auditability and fault tolerance are two important requirements.
Things to watch out for
This integration can be complex, but it doesn’t always have to be that way. Different software has different data structures and business logic, and correctly mapping all processes and data fields requires a lot of discussion and analysis. Working on such a project can result in analysis paralysis, especially if it involves multiple parties. An SME who understands the big picture and pays attention to detail will be a great asset.
During analysis and design, discussing the various scenarios and whether a certain design decision works with each of them can be an endless conversation. Having a system to keep track of and follow up on these side questions helps the team focus on the main problem and avoid getting distracted by questions to which the team does not have an immediate answer. A collaboration tool such as JIRA is great for managing these stories, questions, and issues: it can notify and follow up with people when they are tagged, and all discussions are kept in one place. If such a tool is not available, a shared Excel file on SharePoint works just fine.
5. Integration between Maximo and ArcGIS
Why is it required?
ArcGIS is big in the Utilities and Public Service sectors. For companies whose asset networks are spread across a large geographical area – with many assets underground – map software is crucial to operations and maintenance. For such companies, integration between Maximo and ArcGIS using the Maximo Spatial add-on is almost unavoidable.
What are the common requirements?
The requirement in this case is pretty much standardised across the industry. ArcGIS often acts as the master for managing the asset register. Maximo Spatial polls ArcGIS and synchronises the list of assets and any engineering details. Asset status updates and technical specifications are also sent back to ArcGIS. This integration can also support the following processes:
Creation of work orders in GIS
Identification of assets or locations based on provided address (geo-coding)
Identification of routes to the work location, or of adjacent assets and work
Things to watch out for
The business requirement for the Maximo – ArcGIS integration is straightforward; there isn’t much analysis or discussion involved. However, the technical implementation can be tedious: it can take weeks or months if the data mapping is configured manually. Fortunately, there are ways to generate the configuration for Maximo Spatial from the metadata in ArcGIS.
Another thing to watch out for when setting up Maximo Spatial is the initial synchronisation of a large number of assets. ArcGIS might have certain limitations which can cause problems for the data synchronisation cron task in Maximo. A developer who is not familiar with ArcGIS can struggle when dealing with such issues.
Working in IT, we deal with strange issues all the time. However, every once in a while, something would come up that leaves us scratching our heads for days. One such issue happened to us a few years back. It came back to me recently and this time, I thought to myself I should note it down.
Summary
Maximo – TechnologyOne integration error: work orders went missing.
There were no traces of the problem; everything appeared to be working fine.
The problem was that the F5 Load Balancer returned a maintenance page with an HTTP 200 status code, leading Maximo to think the outbound message had been received successfully by WebMethods.
The mysterious missing work orders
The issue was first reported to us when a user raised a ticket about missing work orders in TechnologyOne, the finance management system used by our client. Without work orders created in TechOne, users cannot report actual labour time or other costs, so this was considered a high-priority issue.
Integration Background
TechOne is integrated with Maximo using WebMethods, an enterprise integration platform. Unlike with direct integration, these types of problems are usually easy to deal with when an enterprise integration tool is used: we simply look at the transaction log, identify the failed transactions and their cause, fix the issue, and resubmit the message. All good integration tools have such fundamental capabilities.
In this case, we looked at WebMethods’ transaction history and couldn’t find any trace of the missing work orders. We also spent quite some time digging through the log files of each server in the cluster but couldn’t find anything relevant. Of course that was the case: if there had been an error, it would have been picked up, and the system would have raised alarms and email notifications through the several overlapping monitoring channels we had set up for this client.
Clueless
On the other hand, when we looked at Maximo’s Message Tracking and log files, everything looked normal, with work orders published to WebMethods correctly and without interruption. In other words, Maximo said it had sent the message, while WebMethods said it never received anything. This left us in limbo for a few days. And of course, when we had no clue, we did what we application people do best: we blamed the network guys.
The network team couldn’t find anything strange in their logs either. So we let the issue slip for a few days without any real progress. During this time, users kept reporting new missing work orders, not knowing that I wasn’t really doing any troubleshooting work; I was staring at the screen mindlessly all day long.
Light at the end of the tunnel
Then, of course, when you stare at something long enough, the problem reveals itself. With enough work orders reported, it became clear that all of the updates went missing only during the period between 9 and 11 PM, regardless of the type of work order or the data entered. When this pattern was mentioned, it didn’t take long for someone to point out that this is usually the time when IT does its Windows patching.
When a server is being updated, IT sets the F5 Load Balancer to redirect any user requests to a “Site Under Maintenance” page, which makes sense for a normal user accessing the service via a browser. The problem is that when Maximo published an integration message to WebMethods, it received the same web page. That alone is fine, as Maximo doesn’t process the response content; however, the response status was HTTP 200, which is not fine in this case. Since it was an HTTP 200 OK status, Maximo thought the message had been accepted by WebMethods and marked it as a successful delivery. WebMethods, on the other hand, never received the message.
Lesson Learned
The recommendation in this case is to set the status of the maintenance page to something other than HTTP 2xx. When Maximo receives a status other than 2xx, it marks the message as a delivery failure. This means the administrator will be notified if monitoring is set up, and the failed message will be listed as an error and can be resubmitted using the Message Reprocessing application.
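The underlying lesson is that treating any HTTP 2xx as proof of delivery is fragile. Where the sending side is under your control (for example, a custom automation-script endpoint rather than Maximo’s built-in handler), a slightly stricter check helps; a sketch, with the HTML content-type test being just one example heuristic:

```python
def delivery_succeeded(status_code, content_type):
    """A 2xx status alone is not proof of delivery -- a 200 carrying an
    HTML maintenance page should still be treated as a failed delivery
    so the message lands in the error queue instead of vanishing."""
    if not 200 <= status_code < 300:
        return False
    if "text/html" in (content_type or "").lower():
        return False
    return True
```

A more robust variant would check for a marker the receiving system is known to return, such as a JSON acknowledgement field.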
Due to the complex communication chain involved, I never heard back from the F5 team on what exactly was done to rectify the issue. However, from a quick search, it looks like it can be achieved easily by updating the rule in F5.
This same issue recently came back to me, so I added it to my list of common issues with load balancers. I think it is also fun enough to deserve a separate post. This is a lengthy story; if you made it this far, I hope it will be useful to you at some point.
I sometimes have issues with the message engine not running. Usually, I’ll just restart the whole system and hope the problem goes away.
If that doesn’t work, in most cases the problem is caused by a corrupted file store used by the message engine, and the common suggestion on the Internet is to delete these files, which seems to work fine.
I have had a very similar issue when the message engine uses a database store. Since I found it quite hard to identify the exact root cause, I chose the easier path: delete the whole message engine, create a new one, and give it a new schema name for the data store. This ensures new tables are created when the message engine is initialised for the first time.
Creating a new message engine and re-assigning the bus destinations usually takes less than 5 minutes, which seems a lot easier than troubleshooting and finding the root cause of the issue.
I want to post a simple JSON message to an external system and do not want to add any external library to Maximo as it would require a restart.
In the past, I used the Java HTTPClient library that comes with Maximo, but it would require half a page of boilerplate Jython code. Recently, I found a simpler solution below.
Step 1
First, I use webhook.site as a mock service for testing. Go to webhook.site and it will give us a unique URL to send requests to:
Step 2
Go to the End Points application in Maximo to create an HTTP endpoint:
End Point Name: ZZWEBHOOK
Handler: HTTP
Headers: Content-Type: application/json
URL: <copy/paste the Unique URL from webhook>
HttpMethod: POST
Step 3
To ensure it’s working, use the Test button: write some simple text, then click “TEST”. Maximo should show a successful result, and Webhook should show a new request received. (Note: if you get an error related to SSL, it’s because WebSphere doesn’t trust the target website. You’ll need to add the certificate of the target website to the WebSphere trust store.)
Step 4
To send a request, you need just two lines of code, as follows:
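A sketch of those two lines (Jython, running inside Maximo; the endpoint name ZZWEBHOOK is from the step above, and the exact invoke signature may differ between Maximo versions, so treat this as a starting point rather than a definitive implementation):

```python
from psdi.iface.router import Router

# Look up the endpoint configured above and post the JSON payload through it.
handler = Router.getHandler("ZZWEBHOOK")
responseBytes = handler.invoke({}, '{"status": "hello from Maximo"}')
```

Because the endpoint already carries the URL, method, and headers, the script only has to supply the payload.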
I needed to send an external system a file import request. The external system would take some time to process the file before the import result can be queried. Making a status query immediately after the import request would always return an “import process is still running”. It’s best to wait for a few seconds before making the first attempt to query the import status.
It took quite a bit of time searching the web for a “wait” or “sleep” function. Some posts suggested using Java flow; others recommended complex processes or an external library.
The easiest method I finally settled on is to use Repeat, as follows:
Essentially, the flow repeats once, with a 5-second interval, before getting to the next step (Main Mapping). The repeat loop does nothing other than write a line to the server log to make troubleshooting a bit easier.
I am a freelance Maximo consultant based in Melbourne. If you enjoy reading my blog, please connect with me on LinkedIn to get updates on new posts. If you or your company need any professional assistance, please leave me a message, I'll call you back.