
Solving problems with the UpdateDB process when installing or upgrading Maximo

When upgrading Maximo or installing new add-ons or fix packs, new source files are copied to the SMP folder, including Java classes and DBC (database configuration) script files. After that, the installer runs the UpdateDB process to update the database, runs the BuildMaximoEar process to build the EAR file, and then deploys the EAR file to WebSphere.
 
The DBC script files contain incremental changes to the Maximo database, which apply GUI changes, update data, and modify DB configuration objects. Most of the problems you get when installing fix packs or upgrading Maximo come from the UpdateDB process, which executes these DBC files in a set order.

Thus, if you have a problem with this process, you can follow these general steps to troubleshoot and fix it:

 
  • When you have an error with the UpdateDB process, note down the name of the log file that contains the error message. It has this format: [Process Name][YYYYMMDDhhmmss].log
  • If you have a problem with the UpdateDB process when running the installation program and don’t know the log file name, locate the [SMP]/maximo/tools/maximo folder and run the updatedb.bat (.sh) tool again. It will produce the same error.
  • Open the log file, which is located under the [SMP]/maximo/tools/maximo/log folder, to get the detailed error message.
  • Fix the problem, then run the UpdateDB tool again from the command line. It will continue from the last successful point.
 
Following is an example of a problem I ran into when trying to create a demo database from an SMP copy I had received from my client just a few hours earlier. When installing a demo instance by executing maxinst.bat, Maximo creates a standard initial database by running a bunch of SQL statements from [SMP]/maximo/tools/maximo/en/maxdemo.ora for Oracle (or maxdemo.db2 and maxdemo.sqs for DB2 and SQL Server respectively). Then it calls the UpdateDB process to apply incremental changes to this database for the add-ons and fix packs which are already installed in this SMP folder.
 
During this process, I got the following error:
 
 
To get more details about the error, I located the Updatedb20180712173523.log file in the folder [SMP]/maximo/tools/maximo/log. It gave me more details about the problem:
 
 
The problem occurred while it tried to execute the V7110_03 script file for the Oil & Gas add-on. In this case, scrolling up several lines, I could quickly see the cause: an ORA-01000 – maximum open cursors exceeded error. Basically, I had forgotten to set the open_cursors parameter to 1000 as recommended by IBM, leaving it at the default value of 300 when creating the new database. Thus, all I had to do in this case was increase it to 1000, then run the UpdateDB tool again. With a real production database, sometimes I have to increase it to 5000 or 10000 in order to apply some fix packs.
 
In many cases, the problem is not easily identified by just looking at this log file, so we have to scroll up a bit further to identify which statement is causing the issue. Then, we have to open the DBC script file, which can be found under [SMP]/maximo/tools/maximo/en/[add-on folder]/[dbc script file]. For example, in this case, it is …/en/oilandgas/V7110_03.ora
 
 
 
There are a number of script file types:
 
  • Files with .ora, .sqs, and .db2 extensions are just normal SQL files applicable to Oracle, SQL Server, and DB2 respectively. You can run the SQL statements from these files directly against the database to check for problems.
  • Files with the .msg extension are used to add or modify existing messages. You normally don’t have a problem with these.
  • Files with .mxs and .dbc extensions are used to modify application GUI design, and to apply data updates or database configuration changes.
 
Most of the problems come from .dbc files. To understand DBC script syntax and identify what each statement does, you can refer to the DBC XML Format Technical Reference.
 
Some common problems we have when running UpdateDB include:
 
  • Error when creating a unique index: in this case, we have to query and update the data to remove or fix records with duplicated keys.
  • Error when creating existing objects: most of the time, we can drop the existing object, as a new object will be created by the script anyway.
  • Missing script files, or script files that don’t follow the naming convention, which requires an ordered number. If there’s a missing script file, you can create an empty file by copying an existing dbc file and renaming it with the missing ordered number. For files which don’t follow the naming convention: I got a java.lang.ClassCastException a few times when running UpdateDB with the Linear add-on; it turned out there was a file named V7500_linear.dbc where UpdateDB expects a number after V7500, so all I had to do was rename it to V7500_02.dbc and it didn’t complain any more.
  • DB operation fails due to maximum open_cursors or processes exceeded (in Oracle): increase the open_cursors or processes parameter and run UpdateDB again.
  • Java NullPointerException: this one is quite difficult to figure out. We have to look at the DBC file to understand what it’s trying to do. This is often caused by bad data, such as orphan records referencing a non-existent parent.
  • Error with certain DB operations, such as being unable to create a new Workorder table after dropping the table (when there’s a change to the table structure): I’m not exactly sure what mechanism causes this, but I found the problem goes away if I run the UpdateDB process again (and again). I only attempt to fix it if the problem persists after the 3rd run.
 
I hope this helps you when the going gets tough.
 

How to bulk upload images via the Integration Framework?

In the previous post, I provided an example of how we can customise an Object Structure to enable import/export of binary data via MIF. That approach requires Java customisation. From Maximo version 7.6, the automation scripting framework has been greatly extended to support integration. With this update, we can enable import/export of binary data with a simple automation script. Below is an example of how we can configure Maximo 7.6 to bulk upload images to the Item Master application:

Step 1: Add a new Object Structure

  • In the Object Structures application, open and duplicate the MXITEM object structure to create a new one. Name it MXLITEM
  • Under “Source Objects for MXITEM”, delete all child objects, and add a new child object IMGLIB as shown below

Step 2: Add an Integration Automation Script

  • In the Automation Scripts application, choose Action > Create > Script for Integration
  • In the pop-up, enter the following details:
    • Select “Object Structure”, and choose “MXLITEM” for Object Structure
    • Select “Inbound Processing”
    • Language: Python
    • Copy/paste the piece of code below into the Source Code area.

Step 3: Upload images to Item Master

Download the Excel VBA tool from this GitHub repo and use it to bulk upload images to Item Master:

  • Update the Settings sheet with the URL and the username/password to connect to Maximo
  • Put all the image files in the same folder as the Excel file
  • Update the Data sheet with the list of Item Number and the name of the image file.
  • Click on “Upload Images”
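
If you prefer a script to the Excel tool, the upload can also be done through the JSON API directly. Below is a minimal Python sketch of this idea; the host name, credentials, and the imglib attribute names are assumptions, so adjust them to match your environment and the MXLITEM object structure created in Step 1:

import base64
import requests

MAXIMO_URL = "https://maximo.example.com/maximo"        # hypothetical host
AUTH = base64.b64encode(b"maxadmin:maxadmin").decode()  # native Maximo authentication

def upload_image(itemnum, filename):
    with open(filename, "rb") as f:
        img_b64 = base64.b64encode(f.read()).decode()
    payload = {
        "itemnum": itemnum,
        # IMGLIB is the child object added to MXLITEM in Step 1; the
        # attribute names here are assumptions - check your OS definition
        "imglib": [{"imagename": filename,
                    "mimetype": "image/jpeg",
                    "image": img_b64}],
    }
    r = requests.post(MAXIMO_URL + "/oslc/os/mxlitem",
                      json=payload,
                      headers={"maxauth": AUTH,
                               # SYNC override gives AddChange semantics, so
                               # existing items are updated, not duplicated
                               "x-method-override": "SYNC"})
    r.raise_for_status()

upload_image("PUMP-100", "pump-100.jpg")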

Notes & Updates

Note 1: be careful with photos taken with newer cameras; the files usually have high resolution and thus can be quite big. I’ve seen an eager team of engineers upload images for all assets and inventory items in a power plant, and overnight, the database grew from 1 GB to 20 GB. Thus, make sure you resize the images before uploading.

One quick and simple method in Windows is to select multiple files, then right-click and choose Send To > Mail Recipient. Windows will give you a pop-up to resize the files; choose the smallest size (640×480). Windows will resize the files and attach them to an Outlook email. In Outlook, select all the attached files and copy/paste them to a different folder. These files will be much smaller than the original, full-resolution files.
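
Alternatively, if you have Python available, a small sketch using the Pillow library achieves the same in bulk (assuming the JPEG files sit in the current folder):

import glob
from PIL import Image  # pip install Pillow

for path in glob.glob("*.jpg"):
    img = Image.open(path)
    img.thumbnail((640, 480))  # shrink in place, keeping the aspect ratio
    img.save("resized_" + path, "JPEG", quality=85)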

Update Aug/2023: I have posted a new article on how to automatically rescale the image to a smaller size

Creating a high-performance service using MaximoCache

Sometimes in our application, we need to build custom services that run when Maximo starts. We can extend the psdi.server.AppService class and register it with MXServer by inserting a new entry into the MAXSERVICE table. If the service executes slow-running queries, it is a good idea to cache the data in memory to improve performance. We can implement the MaximoCache interface for this purpose. By doing this, we can initialize the service when MXServer starts and pre-load all the data required by the service into JVM memory. When the service is called, it will only use cached data, providing an instant response and thus a much better user experience.
 
Below are the steps to create a sample service that loads all locations’ descriptions into memory. The service will provide a function to check whether an input string matches a location’s description. We will call this check when the user enters an item’s description, and it will throw an error whenever the input matches the description of any existing location. This is not a very practical use case, but for the sake of simplicity, I hope it gives you an idea of how it can be implemented.

1. Create a LocationCache class, implementing the psdi.mbo.MaximoCache interface:
  • For the getName() function, set a name for the cache, such as “LOCATION_CACHE”.
  • Create a loadData() method to query and load all locations’ descriptions into a HashSet
  • Call the loadData() method in the init(), reload(), and reload(String) functions overridden from the MaximoCache interface
  • Create a function isLocationExist(String) to check if an input String matches a description contained in the HashSet

2. Create a LocationCheck class extending the psdi.server.AppService class. Override the init() method to initialize a new LocationCache object and add it to the MXServer’s cache.

 
3. Insert a new entry into the MAXSERVICE table to register the service with MXServer so it is initialized when Maximo starts. A sample SQL statement to insert the record is as follows:

4. Extend the LocationSet class to override the fireEventsAfterDBCommit() method. Essentially, this method will reload the cache whenever there is a change to location data. This ensures that the information stored in the cache is always in sync with the latest updates.

5. Create a FldDescription class to override the validate() method, and associate it with the Item.Description field. In the validate() method, we check if the description entered is the same as a location’s description stored inside the cache, and throw an error dialog if there is a match.

You can view or download the full sample code from this GitHub repository: link

After deploying the code, associating the LocationSet class with the Location object and the FldDescription class with the Description field of the Item object, you can test whether the code works: open an item and enter a description that matches the description of a location; it should throw an error message as in the image below:

To check whether the cache is reloaded properly, create a new location or update an existing one, then enter that location’s description as an item’s description; it should also throw an error message.

MboSet performance, Memory Cache, and DB Calls

 

I recently had to look at ways to improve the performance of a custom-built operation in Maximo. Essentially, it is one of the many validation operations that take place after a user uploads a PO with a few hundred thousand lines. The team here had already built a high-performance engine to handle the process, but even with it, this particular operation still took around 15-17 milliseconds to process each line, which is too slow for their current standard. Imagine processing 200,000 PO lines: it would take nearly an hour for this operation alone. There are a few dozen of these operations that need to be executed, plus other standard Maximo operations like status change or save, so the whole process takes many hours.
 
With an operation measured in milliseconds, many assumptions and standard recommendations for improving performance may not hold. In some instances, following the standard recommendations actually makes things slower. For example, many Maximo blogs suggest setting a MboSet to DISCARDABLE to reduce memory usage, but if the MboSet is re-used repeatedly, it will need to query the DB again and again, which can be costly. In that case, not setting the MboSet to DISCARDABLE allows Maximo to cache the data in memory and thus makes it faster.
 
Usually, in performance tuning, the database query is the most expensive operation, and in Maximo, that generally means the getMboSet method. Therefore, a better insight into how it works will help us devise the best strategy for writing high-performing code. In this post, I’ll list the findings of some experiments I did around MboSet, which will hopefully save you some time if you have to deal with the same problem.

1. mboSet.count() versus getSize() versus isEmpty()
 
If we need to retrieve a MboSet via a relationship, such as itemSet = asset.getMboSet("ALL_ITEM"), and we need to check whether the result set is empty, we can use itemSet.count(), .getSize(), or .isEmpty(). The differences between them are:
  • count() will fire off a “Select count(*) From ITEM Where…” query; isEmpty() will fire off a “Select * From ITEM Where…” query.
  • getSize() will never fire off a DB query. It only returns the size of the mboSet which has been initialised in memory. Thus, right after itemSet = asset.getMboSet("ALL_ITEM"), calling cnt = itemSet.getSize() gives cnt = 0. After itemSet.moveFirst() is called, getSize() returns cnt = 2 (because Maximo initialises one object ahead), and after we have looped through the whole itemSet using itemSet.moveNext(), calling itemSet.getSize() returns the actual size of the whole set.
  • isEmpty(), together with the moveFirst(), moveNext(), or getMbo(i) functions, will fire a “Select * From ITEM Where…” query if the itemSet has never been initialised. But if the mboSet is already initialised, any subsequent call to those functions will just access the objects in memory and thus doesn’t trigger another DB query.
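
A small Jython illustration of the getSize() behaviour described above (ALL_ITEM being the same hypothetical relationship):

itemSet = asset.getMboSet("ALL_ITEM")
print itemSet.getSize()   # 0 - nothing has been fetched yet
itemSet.moveFirst()       # fires the "Select * From ITEM Where..." query
print itemSet.getSize()   # 2 - Maximo initialises one object ahead

mbo = itemSet.moveFirst()
while mbo is not None:    # loop through the whole set...
    mbo = itemSet.moveNext()
print itemSet.getSize()   # ...now it returns the actual size of the set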

Consider the following piece of code, shown here as a minimal Jython sketch of the pattern in question (a loop driven by count()):
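
itemSet = asset.getMboSet("ALL_ITEM")
# count() fires off a separate "Select count(*) From ITEM Where..." query
for i in range(itemSet.count()):
    item = itemSet.getMbo(i)  # the first call initialises the set with "Select * From..."
    print item.getString("ITEMNUM")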

With this code, the mboSet.count() method will fire off a “Select count(*)” query. Then, when the getMbo() method is called for the first time, it will fire off a “Select * From…” query to initialise the mboSet. Although inside the DB the Select count(*) costs much less, from the Maximo JVM with a small table it takes about the same amount of time (a few milliseconds) to execute mboSet.count() as mboSet.getMbo(i). As such, the piece of code above takes about twice as long to execute as a loop that doesn’t use the count() method. This is consistent with the recommendation by Bruno in his blog.

However, in a scenario where our logic only needs to check whether a result set is empty, with a massive table, calling count() can be much faster than calling isEmpty().

2. Retrieve MboSet via an MXServer object versus via relationship

When we retrieve a mboSet via a relationship, the data is cached in Maximo’s JVM memory. Thus, if the same relationship on the same mbo object is used again, it doesn’t fire off another DB query, improving performance significantly. However, if instead of using a relationship we retrieve the mboSet via mxserver.getMboSet(), it will query the DB every time, which can hurt performance if the same mboSet could have been re-used (when there is no change to the DB query). In that case, it is advisable to retrieve the data via a relationship instead. But be aware that when the same relationship is used with a different where clause, Maximo has to call the DB to execute a new query, so there is no difference in performance.
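
The difference in a small Jython sketch (ALL_ITEM is a hypothetical relationship defined on the asset object):

from psdi.server import MXServer

# queried from the DB on every call:
itemSet = MXServer.getMXServer().getMboSet("ITEM", asset.getUserInfo())

# queried once, then cached in JVM memory for this asset mbo:
itemSet = asset.getMboSet("ALL_ITEM")
itemSet = asset.getMboSet("ALL_ITEM")  # second call is served from the cache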

This leads to the scenario where you don’t want to add a new relationship to the object in Database Configuration for every specific operation you need. In this case, we can define a temporary relationship in our code, as in the following sample:

itemSet = asset.getMboSet("$TEMP_RELATIONSHIP", "ITEM", "status = 'ACTIVE'")

This adds a new temporary relationship at runtime, and as long as the where clause doesn’t change, the itemSet data is cached; the next time you call getMboSet using this temporary relationship, the data will be accessed from memory instead of being queried from the DB.

Sometimes you will want to shorten your code using dot notation in the mbo.getString(), getInt(), or getBoolean() methods. Here too, Maximo will try to use cached data instead of querying the database. Consider the following piece of code, again as a minimal Jython sketch:
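
# dot notation walks the ASSET_ITEM relationship from the asset mbo
if asset.getString("ASSET_ITEM.STATUS") == "ACTIVE":
    desc = asset.getString("ASSET_ITEM.DESCRIPTION")
    orderunit = asset.getString("ASSET_ITEM.ORDERUNIT")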

The first getString statement will initialise the mboSet retrieved by the “ASSET_ITEM” relationship, so the subsequent getString statements don’t query the DB again. Because of that, in terms of performance, it is similar to the following piece of code:

item = asset.getMboSet("ASSET_ITEM").getMbo(0);
if ("ACTIVE".equals(item.getString("STATUS"))) {
    desc = item.getString("DESCRIPTION");
    orderunit = item.getString("ORDERUNIT");
}

Using a relationship helps cache the data in memory, but if the data is only used once, all it does is increase memory usage, and thus could degrade performance instead of giving any benefit. So you should consider whether the data can be re-used. If not, setting the mboSet to DISCARDABLE will ensure the memory is freed up after use. My recommendation for dealing with performance issues is to always run your own tests and make sure the right strategy is used for your scenario.
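
For instance, a one-off lookup can be flagged as discardable right after retrieval (a small Jython sketch):

from psdi.mbo import MboConstants

itemSet = asset.getMboSet("ASSET_ITEM")
# the data is used once only, so don't keep the set cached in memory
itemSet.setFlag(MboConstants.DISCARDABLE, True)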

Import/Export Maximo ImageLib Data via Integration Framework

In Maximo, we can upload images as attachments (DOCLINKS), which are stored as files on the server, or as “profile” images, which are stored as binary data inside the IMAGELIB table. A profile image is quite useful for giving the user a quick view of what an inventory item, asset, or location looks like.

While Maximo allows us to upload DOCLINKS attachments via the Maximo Integration Framework (MIF), uploading images to the IMAGELIB table via MIF is not supported out of the box. Therefore, we can only upload images manually, one by one, via the Maximo UI.

For bulk loading, if we have access to the DB server, we can write a stored procedure to read the files and import the binary data directly into the database. However, there are two scenarios I have had in the past in which this approach doesn’t work:

  • When we built mobile apps and wanted to upload data to IMAGELIB. In that case, my teammate extended a REST handler class to achieve this requirement.
  • When we needed to bulk upload images but the client did not give us access to the database.

To address this requirement, we can extend the processing classes of an object structure (OS) to encode/decode binary data as a Base64 string, delivering the data in XML or JSON format. Since this processing is done at the object structure level, it supports both application Import/Export and sending/receiving binary data via the integration framework’s MEAWEB API or JSON API.

To encode binary data to a Base64 text string before exporting OS data, we can extend the MicSetOut class and associate it with the OS by specifying the class’s path in the “Outbound Definition Class” field. Below is the sample Java code which exports Item master data with images:

To decode the Base64 string back to binary data before storing it in the IMAGELIB table, we can extend the MicSetIn class and associate it with the OS by specifying the class’s path in the “Inbound Processing Class” field. Below is the sample Java code:

Once we have the customized classes in place, it is possible to export/import ImageLib data using XML format or via web services. It is also quite easy to write a simple Excel/VBA program to bulk upload images via MIF without needing direct access to the DB server. But those are different topics, and perhaps I’ll provide examples in future posts.

Federated MBO / JSON Resource – Filling the gap between Maximo and IoT

What is a Federated MBO?

Recently I got some time to explore the new Federated MBO feature, and I like it a lot because it is simple to configure and quite powerful. Prior to Maximo 7.6, the same functionality could be achieved with Java programming, but with this new feature, it is easy to incorporate data from external systems into Maximo business logic and create a seamless experience for end users.

Basically, a Federated MBO allows you to link an API from an external system and make it look like a standard object in Maximo, which you can treat like any other Maximo object: displaying its data on the GUI, setting up workflow conditions, or writing automation scripts against it. The key difference is that the data is not stored inside the Maximo database but queried from the API on the fly whenever the object is accessed.

Use Cases

I can immediately think of several applications that are enabled by this feature:

  • Integrate with Data Historian/SCADA systems: to display real-time operating conditions of assets. It is always beneficial for asset managers to know the current condition of their assets when planning work. Traditionally, operating conditions of critical assets can be imported/synchronized into asset meters for condition-monitoring PMs or analytical reports, but that is not real-time data, and generally we don’t import every meter reading collected into Maximo; it would fill up the database and consume valuable system resources. With this capability, we can query the data of a specific asset or location directly from the external system without storing it in the DB, yet still interact with the data as part of Maximo functionality. This is a distinct advantage over other methods, such as embedding the web page of an external system inside Maximo or directing the user to a different web application.
  • Integrate with security access systems: to display the current whereabouts of a worker in real time. This could help with work planning and coordination activities.
  • Integrate with external services: to verify IDs, certificates, etc. for the labour self-registration process.
  • Integrate with a weather service to assist the work planning process. For example, Maximo can warn the planner, when planning work, if extreme heat, high UV, or heavy rain is forecast. This allows them to plan the work on a different date or prepare proper PPE for the workers. The same can be done if we deploy sensors to report temperature, noise level, humidity, etc. at certain work locations.

Implementation

To demonstrate how the Federated MBO feature works, below is an exercise I did with the OpenWeatherMap API, which provides free access to 5-day weather forecasts, UV Index forecasts, and air pollution data. In this example, I will link the UV Index forecast API and make it available as an MBO. When the user sets a Scheduled Start Date, if the UV Index is forecast to be higher than 7, Maximo will display a warning message to remind the planner.

Step 1: Register OpenWeatherMap API account, and retrieve data on browser

  • Go to OpenWeatherMap.org to register a free account.
  • Sign in with your new account, go to the Account home page, open the “API keys” tab, and copy your API key; you will use it to access the API.
  • Paste the following link into a new browser tab, replacing the XXX with your API key. You should be able to retrieve JSON data of the UV Index forecast for the next 8 days, as shown in the image below
http://api.openweathermap.org/data/2.5/uvi/forecast?lat=-33.8688&lon=151.2093&cnt=7&appid=XXX
Test the OpenWeatherMap URL in browser
  • If you can’t retrieve the data, go to the site’s API documentation to read the instructions; the API might have been updated. The OpenWeatherMap API syntax is very simple.
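
The same test can be scripted. A minimal Python sketch (replace the appid value with your own key):

import requests

resp = requests.get(
    "http://api.openweathermap.org/data/2.5/uvi/forecast",
    params={"lat": -33.8688, "lon": 151.2093, "cnt": 7, "appid": "XXX"},
)
print(resp.json())  # a list of entries with date_iso and value fields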

Step 2: Create a JSON Resource in Maximo

  • In Maximo, go to the Integration > JSON Resource application. Under the More Actions menu, click Create JSON Resource to start the wizard
  • Create a new Resource
    • Resource: OW_UVI
    • Resource Usage: OBJECT
    • Resource Type: REST
    • URL: <copy/paste the URL in the previous step above>
  • Note: for simplicity, we hardcode the latitude/longitude of the location in the URL. Once it works, you can dynamically populate the values from the main object by putting the field names inside curly brackets, such as:
    lat={LOCATIONS.LATITUDE}&lon={LOCATIONS.LONGITUDE}
  • Click Next 
  • Copy/paste the sample JSON data we retrieved from OpenWeatherMap in the previous step into the JSON Data text area. This allows Maximo to parse the data structure returned from the API in the next step
  • For Parent Object, enter “WORKORDER”. This will create a relationship on the “WORKORDER” object to point to the new JSON object “OW_UVI” we are creating
  • Click Next to proceed to the next screen. This screen summarizes how Maximo parsed the sample data and will construct the new MBO structure from it.
  • Leave the default values and click Process to finish.

Step 3: Test the new JSON Resource

  • In the Database Configuration application, open the new OW_UVI object the wizard just created for us. It has also added an OW_UVI relationship to the WORKORDER object to link to this new MBO.
  • To check whether our JSON resource can query data from the API correctly, use Application Designer to add a table anywhere, such as on the “Map” tab. Open the table’s properties and set OW_UVI as the table’s relationship. Add two columns and assign the DATE_ISO and VALUE fields to them.
  • Open the Work Order Tracking app and open any work order; the table should display some UV forecast data:

Step 4: Create a warning using Automation Script

  • Open the Database Configuration app, choose Action > Messages to open the Message dialog. Create a new Warning Message:
    • Message Group: WOTRACK
    • Message Key: HighUVI
    • Value: High UV Index is forecasted for the scheduled date. Ensure proper sun protection PPE has been planned.
  • Note: the Message Group and Message Key fields are case-sensitive. Thus, in the automation script, we must use the exact same case for the message to be displayed
  • Open Automation Script app, Create an Attribute Launch Point:
    • Launch Point: SCHEDSTART
    • Object: WORKORDER
    • Attribute: SCHEDSTART
    • Event: Validate 
  • Click Next to go to the next screen, give the script a name such as Check_UVI, and choose Python as the language
  • Click Next. Copy/paste your script source into the Source Code text area (a sketch is shown after this list), then click Create to finish creating the script
  • In this script, I check whether the Scheduled Start Date falls within the forecast window (the next 8 days) and whether the UV Index for that day is forecast to be higher than 7. If both conditions are met, Maximo displays the warning we created above.
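
For reference, here is a minimal sketch of such a script. The errorgroup/errorkey implicit variables raise the WOTRACK.HighUVI message; DATE_ISO and VALUE are the field names Maximo generated from the sample JSON, so adjust them if your parsed structure differs:

from java.text import SimpleDateFormat

sched = mbo.getDate("SCHEDSTART")
if sched is not None:
    day = SimpleDateFormat("yyyy-MM-dd").format(sched)
    uvSet = mbo.getMboSet("OW_UVI")  # relationship created by the JSON resource wizard
    uv = uvSet.moveFirst()
    while uv is not None:
        # DATE_ISO is assumed to start with a yyyy-MM-dd day string;
        # a date outside the forecast window simply never matches
        if str(uv.getString("DATE_ISO"))[:10] == day and uv.getDouble("VALUE") > 7:
            errorgroup = "WOTRACK"  # raises the HighUVI warning defined above
            errorkey = "HighUVI"
            break
        uv = uvSet.moveNext()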

Extras

With Maximo 7.6, we can set up Service Addresses to assign coordinates to locations. When a work order is created against a location, it inherits the location’s coordinates. We can use these to fill in the parameters of the API request and retrieve the forecast for the work order’s actual location. If you’ve read my previous post on creating a custom chart control, we can also modify the app to display the UV Index forecast in a chart for visualization. A planner who receives the warning can then open the “Map” tab to see the location of the work and the UV Index forecast on a chart, and determine which day is better for carrying out the work.

I hope you have fun with this new Maximo feature. If you can think of any creative application of this feature to make Maximo better, please leave me a comment. I would love to try it out.
