Category: Database

How to generate INSERT script for Maximo data with SQL Server?

When dealing with a large amount of data or migrating workflow configuration, we might need to generate and use an INSERT script. This article describes how to do it with a SQL Server database.

Why do we need to generate INSERT script?

When deploying changes, the preferred method is a DBC script. There are various tools to generate DBC scripts for many types of configuration. However, in certain cases, such as when migrating workflow configuration, we still need to use a SQL script.

For deploying a data package, it is good practice to script and deploy the change as part of the DBC deployment process. It is easier to track the changes with a version control system such as Git or SVN.

As an example, in a recent project, the users used Maximo in DEV to enter and validate the data. After they finished, I extracted the data using the method described in this post and loaded it into Production as part of the DBC deployment process.

Did you know the UpdateDB and RunScriptFile tools can deploy SQL files directly? It is not necessary to put SQL statements into a DBC file with the FREEFORM tag.
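For example, a SQL file dropped into a folder under [SMP]/maximo/tools/maximo/en can be executed with a call along the lines below ("custom" and "my_data_load" are hypothetical folder and file names here, and the -c/-f flags are from memory, so verify them against the tool's own help output):

cd [SMP]/maximo/tools/maximo
runscriptfile -ccustom -fmy_data_load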

What is the limitation with SQL Server Management Studio?

With an Oracle database, generating INSERT statements is quite simple with SQL Developer: all we need to do is right-click on a table, then choose Export Data > Insert. With SQL Server, however, a similar function is buried deep in the menus and is hard to find. There are at least two other features with a similar purpose in the same menus, which causes confusion and makes people think this is not possible with SQL Server. These functions don't work for this requirement. They are:

  • Right-click on the table, then choose Script Table As > INSERT to
  • Right-click on the database, then choose Tasks > Export Data

For this requirement, we need to right-click on the database (not table), then choose Tasks > Generate Scripts…

However, this approach has some limitations:

  • It does not let you define a WHERE clause; you'll need to generate the script for the whole table.
  • In Maximo, every table has a ROWSTAMP field, which you need to exclude. Otherwise, the INSERT script does not work.

How to generate INSERT Script?

To address the limitations described above, we can create a VIEW or a staging TABLE from the source table. This allows us to set a WHERE clause to filter the data we want to extract and to exclude the ROWSTAMP column. A VIEW is more convenient in a development project because we can define any rule to manipulate the data, such as shifting the ID fields to a new range to avoid conflicts. However, note that older versions of SQL Server have a bug and cannot generate the script correctly from a view in Data Only mode. In that case, we can use a staging table instead.
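For illustration, a minimal view for this purpose might look like the sketch below (the column list is abbreviated; in practice it must name every persistent column except ROWSTAMP):

CREATE VIEW v_asset_extract AS
SELECT assetnum, description, location, siteid
FROM asset
WHERE description = 'Fire Extinguisher';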

To provide an example, below are the detailed steps to generate INSERT script for all “Fire Extinguisher” records in the ASSET table:

Step 1: Create a staging table:

  • Use one of the two SQL statements below to get a comma-separated list of column names from the table. The first query works for any SQL Server database, but you will need SELECT permission on INFORMATION_SCHEMA. The second query uses the MAXATTRIBUTE table and thus only works for a Maximo database.
SELECT string_agg(column_name, ', ')
    WITHIN GROUP (ORDER BY column_name)
FROM information_schema.columns
WHERE table_name = 'asset' AND column_name != 'rowstamp';

SELECT string_agg(attributename, ', ')
    WITHIN GROUP (ORDER BY attributeno)
FROM maxattribute
WHERE objectname = 'asset' AND persistent = 1;
  • Copy/paste the list of columns into the SELECT ... INTO statement below, then execute it to create and populate the staging table. In this step, we can also transform or map any ID fields (such as ASSETID and ASSETUID in this example) to avoid conflicts; see the variant after the statement below.
DROP TABLE IF EXISTS tmp_asset;

SELECT [Copy/Paste the column names here]
INTO tmp_asset
FROM asset
WHERE description = 'Fire Extinguisher';
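As mentioned in the step above, the same statement can also shift the ID columns to a new range. A variant might look like the sketch below (the column list is abbreviated and the 1000000 offset is arbitrary; pick a range you know is free in the target database):

SELECT assetnum, description, location, siteid,
    assetid + 1000000 AS assetid,
    assetuid + 1000000 AS assetuid
INTO tmp_asset
FROM asset
WHERE description = 'Fire Extinguisher';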

Step 2: Generate INSERT Script from the temp table:

  • Right-click on the database, choose Tasks > Generate Scripts…
  • In the Generate Scripts wizard, choose “Select specific database objects”, then select the tmp_asset table we created in the step above.
  • On the next page, "Set Scripting Options", click the "Advanced" button and, under the General group, change the "Types of data to script" option from "Schema Only" to "Data Only"
  • Back on the "Set Scripting Options" page, choose either to save the output as a SQL file or to open it in a new query window.

What are the alternative methods?

If the method described in this post does not work for you, two alternative methods you can try are:

How to reset sequence without restarting Maximo?

One error we often have to deal with is an incorrect sequence when adding new data to Maximo. There are many situations which can cause this issue, such as:

  • When loading data using MXLoader, or inserting data directly via SQL
  • Sequence corruption due to an unknown cause in Production, probably from errors caused by a cancelled/terminated job
  • Restoring database from a copy or after an upgrade.

When this happens, the user sees an error with a duplicated key value such as

BMXAA4211E - Database error number 2601 has occurred…

The solution is well-documented and straightforward: we just need to find the current maximum ID value used in the table and update the corresponding sequence to start from the next value.

For example, if the error occurs with the WORKORDERID field of the WORKORDER table, we can run this SQL update and restart Maximo:

UPDATE maxsequence
SET maxreserved = (SELECT max(workorderid) + 1 FROM workorder)
WHERE tbname = 'WORKORDER' AND name = 'WORKORDERID';

However, I like to avoid restarting Maximo whenever possible, for reasons like these:

  • I recently had to do a quick deployment which involved uploading some data. For some unknown reason, loading the data via MXLoader caused random sequence corruption a few times. For this client, which has a large cluster, restarting Maximo would require an additional 30-60 minutes of downtime.
  • A location hierarchy update required me to insert a few thousand new records into the LOCANCESTOR table, and I needed to update the sequence to a new value for the subsequent data upload via MIF to work. Since it is a cloud environment, avoiding a restart means we are not dependent on the availability of the cloud provider.

To address that problem, the simplest solution I have found to hot-reset the sequence cache without restarting Maximo is to call the reset sequence Java function via an automation script. The steps are as follows:

  • Create a new script with no launch point (the URL below assumes it is named RUNTASK):

Whenever we update the maxsequence table with a new value and need to reset the cache, just execute the script by calling it via the REST API:

[MAXIMO_URL]/maximo/oslc/script/runtask?_lid=maxadmin&_lpwd=maxadmin
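The same call can also be made from the command line, for example with curl (the host name and credentials below are placeholders):

curl "https://your-maximo-host/maximo/oslc/script/runtask?_lid=maxadmin&_lpwd=maxadmin"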

If it works correctly, you should see something like below.

Executing the reset sequence script by calling it via REST API

No restart during a deployment means we can all go to bed earlier. Best of luck.

UPDATE: On a clustered environment, I found it doesn't seem to refresh all the JVMs. Thus, to be sure, we might need to run it on each JVM separately (by accessing the script from each JVM's 908x port).

My failed attempt to get Maximo to work with Azure SQL database

Recently, I started playing with Azure by attempting to migrate a Maximo instance from my local VM to the Azure platform. Although the attempt was a success, I didn't realize at the time that SQL Server and Azure SQL are different databases (or more correctly, two different versions). There were a few issues during the process, but I figured out how to work around them and got Maximo running on an Azure VM and Azure SQL. After sharing the result on LinkedIn, there were some comments that Maximo couldn't be installed on Azure SQL and that IBM doesn't support it, so I spent a bit more time digging and thought I should share the details and some of my opinions on this matter.

First, let us be clear: Azure is a big cloud platform which offers many different services. I'm not a cloud expert, but from what I understand, we are talking about two main services here:

  • Azure VM: this is IaaS, and it lets us run a virtual machine in the cloud. You can run many different Windows or Linux versions on it, and it is transparent to standard business applications such as Maximo. In other words, there shouldn't be any issue running Maximo on an Azure VM as long as you stick to the OS versions supported by Maximo. For example, the Maximo 7.6.1.1 compatibility matrix lists Microsoft Hyper-V 2012 and 2012 R2 as supported, and with them, Windows Server 2012 and 2016 as supported guest OSes. In fact, many companies large and small are running Maximo on Azure VMs.
  • Azure SQL: this is a cloud version of SQL Server. It has the core engine of SQL Server, but certain aspects of it have been modified to work in a cloud environment and to support multi-tenancy. The current version of Maximo doesn't support Azure SQL, and IBM doesn't have any plan to support it in the near future for Maximo 7.6.1.x and 8 (according to their 2020 roadmap).

Even with that knowledge, I was still curious whether I could make it work from a technical perspective, so I did a few experiments:

A – Installing Maximo 7.6.1 from scratch on Azure VM and Azure SQL:

People mentioned that Maximo cannot be installed on Azure SQL, so I did just that, to find out exactly what the issue is (and secretly, I hoped I could fix it). I followed the standard installation process and managed to install WebSphere, prepare the environment, and install the Maximo SMP.

After the SMP was installed, I couldn't get the Configuration tool to connect to the Azure DB because it couldn't resolve the DB alias provided by Azure.

I attempted to execute this step manually from the command line by setting up the maximo.properties file, then running the maxinst.bat tool. However, the DBC script failed when it ran an undocumented DBCC 'CALLFULLTEXT' command.

To be honest, installing a fresh Maximo DB on Azure SQL shouldn't be a problem from a technical perspective, because we can always install it on an on-premise SQL Server, then move it to Azure SQL (using the process I described in my previous post). Still, this attempt is definitely a failure.

B – Upgrading Maximo from 7.6.0.6 to 7.6.1

It is critical to be able to upgrade and install updates, fix packs, or industry add-ons. From a technical standpoint, all of these activities are essentially the same process: running a large number of DBC scripts, which mostly consist of SQL commands. I wanted to see whether this works on Azure SQL.

I started out with a simple task first: upgrading from Maximo 7.6.0.6 to 7.6.1. To do this, I simply replaced the existing 7.6.0.6 SMP folder installed previously with a new 7.6.1 SMP folder, then ran the updatedb tool. On the first run, I had several issues related to the incompatible collations "Latin1_General_CI_AS" (the default setting of on-premise SQL Server) and "SQL_Latin1_General_CP1_CI_AS" (the default for Azure SQL databases, which can be changed, and the ONLY supported collation for master and tempdb).

Changing a database collation is a complex job which involves updating the value at all three levels: database, table, and column. To do it, we have to drop all indexes and constraints on the tables, run the update script, then re-create the indexes. I only wanted to find out whether an upgrade is possible, so I tried again with a fresh demo DB created with the "SQL_Latin1_General_CP1_CI_AS" collation. The upgrade was successful this time. So, that is a small success.
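Before attempting a move like this, it is worth comparing collations on both sides first. The standard catalog queries below work on both on-premise SQL Server and Azure SQL:

SELECT name, collation_name FROM sys.databases;
SELECT SERVERPROPERTY('Collation') AS server_collation;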

C – Install Add-ons

Next, I tried installing some serious industry add-ons (Scheduler 7.6.7, Oil and Gas 7.6.1, Utilities 7.6.0.4, Control Desk 7.6.1) on top of the working instance from the previous step. These are probably the most complex add-ons available for Maximo. As such, if they worked, there wouldn't be any problem getting any future upgrade/update to work either.

The process failed when it tried to run some scripts belonging to the ICD (Control Desk) add-on. This time, it had an issue when trying to access the spt_values object in the master DB, access to which will never be granted in a multi-tenant cloud environment. There are a few posts on the web showing how to work around this issue, but I stopped at this point, and that is the end of my attempt. There's no point spending too much time getting something to work knowing it is not supported by IBM (unless some clients are willing to pay for my time to do that in the future).

Conclusion: the result of this exercise confirms what others have reported: Maximo doesn't work with the Azure SQL database. Although the collation issue can be worked around (with a lot of effort), there is more than one issue with various DBC scripts requiring access to the master DB, which is not possible in Azure SQL. People with extensive knowledge of DBC scripting and custom DBC handlers might be able to modify the problem scripts to get it to work, but that is definitely not supported by IBM, and since support for Azure SQL is not on the roadmap, the effort cannot be justified in my opinion. At this stage, our only option is to use the on-premise DB version installed in the cloud as a separate DB server.

How to modify (almost) any Maximo data with no database access

As Maximo consultants, we often walk into an engagement where the client gives us MAXADMIN access to the system, but access to the database is an absolute no-no. This is usually the case with companies that have a clear separation between the App Admin and DB Admin roles. It is also one of the key restrictions with Maximo as a Service.
 
If you have done a bit of admin and config work, you will surely understand the limitations of having no database access. It's like working with tied hands. Luckily, we can use MXLoader to query/update almost any data table in Maximo. Below is an example of how to do it.
 
Let's say we're working with Maximo SaaS and IBM only gives us front-end admin access, with no access to the back-end. Sometimes Maximo gets stuck while applying DB config changes, and we cannot turn off Admin Mode. This is usually not a big problem when we have full DB access: we simply update the CONFIGURING flag in the MAXVARS table to 0. However, with Maximo on the cloud, under the standard procedure we'd need to raise a support ticket asking IBM to update the flag, which could take a few days to resolve. It's a horrible situation, since we cannot do anything while Maximo is in Admin Mode.
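For context, with direct DB access the fix is a single statement like the one below (VARVALUE is a text column, hence the quoted zero):

UPDATE maxvars SET varvalue = '0' WHERE varname = 'CONFIGURING';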
 
Without DB access, we can update the MAXVARS table ourselves with MXLoader, using the steps below:
 
1 – Create an Object Structure for the MAXVARS table:
  • OS Name: DBMAXVARS
  • Consumed By: Integration
  • Add a new row, set Object Name: MAXVARS
 
2 – In MXLoader, add a new 'PEOPLE' sheet:

With the new 'People' sheet added, replace the 'MXPERSON' object structure with 'DBMAXVARS' and the 'PERSON' object with 'MAXVARS'. To identify which columns are available in the MAXVARS table, we can open the 'DBMAXVARS' object structure and open the 'Exclude/Include Fields' dialog from the Select Action menu. In this case, I only need two fields: VARNAME and VARVALUE.

3 – Set the operation method to 'Query' and the where clause to VARNAME = 'CONFIGURING', then execute the query by clicking on the 'Run' button to retrieve the record.
 
 
 
4 – After we get the record for the 'CONFIGURING' variable, set VARVALUE to 0, change the operation method to 'Sync-Change', then execute the operation. We will now be able to turn off Admin Mode.
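Conceptually, the sheet ends up looking something like this simplified layout (illustrative only; the real MXLoader sheet has its own header rows and control columns):

DBMAXVARS     MAXVARS
VARNAME       VARVALUE
CONFIGURING   1    <- returned by the Query operation
CONFIGURING   0    <- edited value sent back with Sync-Change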
 
 
 
Some other examples of using MXLoader to update the system tables include:
 
  • MAXMENU – to move/add an app to a different module menu
  • APPFIELDDEFAULTS – to set a default value when inserting a new record
  • MAXSESSION – to release user sessions stuck in Admin Mode when the user cannot log in
  • MAXDOMAIN – to create a new synonym domain

In case we do not have permission to use the Object Structures application, another method is to create a new single-page application in Application Designer to expose the object's data in the GUI. It achieves the same result and only takes a few minutes to do. However, I like MXLoader a lot more, as we can use it to bulk insert/update normal app data, and it doesn't leave a bunch of junk apps in the system, which are quite difficult to remove.
 
The policy to restrict DB access is usually there for a good reason, and MXLoader is an extremely powerful tool: it makes it a lot easier to make mistakes, such as updating the wrong data or the wrong environment. So please be very careful when using it, and always keep a backup of the before and after versions of the data set you work on.

[Update 2023]: An easier and more powerful way to achieve this task is using an API automation script.

Solving problems with the UpdateDB process when installing or upgrading Maximo

When upgrading Maximo or installing new add-ons or fix packs, new source files are copied to the SMP folder, including Java classes and DBC (database configuration) script files. After that, the installer runs the UpdateDB process to update the database, runs the BuildMaximoEar process to build the EAR file, and then deploys the EAR file to WebSphere.
 
The DBC script files contain incremental changes to the Maximo database: they modify the GUI, update data, and change DB configuration objects. Most of the problems you get when installing fix packs or upgrading Maximo come from the UpdateDB process, which executes these DBC files in a set order.

Thus, if you have a problem with this process, you can follow the general steps below to troubleshoot and fix it:

 
  • When you have an error with the UpdateDB process, note down the name of the log file that contains the error message. It has this format: [Process Name][YYYYMMDDhhmiss].log
  • If the problem occurred while running the installation program and you don't know the log file name, go to the [SMP]/maximo/tools/maximo folder and run the updatedb.bat (.sh) tool again; it will produce the same error. See the example run after this list.
  • Open the log file, located under the [SMP]/maximo/tools/maximo/log folder, to get the detailed error message.
  • Fix the problem, then run the UpdateDB tool again from the command line. It will continue from the last success point.
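For example, on Windows:

cd [SMP]\maximo\tools\maximo
updatedb.bat

(On Linux/Unix, run ./updatedb.sh from the same folder.)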
 
The following is an example of a problem I ran into when trying to create a demo database from an SMP copy I had received from my client just a few hours earlier. When installing a demo instance by executing maxinst.bat, Maximo creates a standard initial database by running a bunch of SQL statements from [SMP]/maximo/tools/maximo/en/maxdemo.ora for Oracle (or maxdemo.db2 and maxdemo.sqs for DB2 and SQL Server respectively). Then it calls the UpdateDB process to apply incremental changes to this database for the add-ons/fix packs already installed in the SMP folder.
 
During this process, I got an error.
 
 
To get more details about the error, I located Updatedb20180712173523.log in the folder [SMP]/maximo/tools/maximo/log.
 
 
The problem occurred while it tried to execute the V7110_03 script for the Oil & Gas add-on. In this case, scrolling up several lines, I could quickly see the cause: an ORA-01000 – maximum open cursors exceeded error. Basically, I had forgotten to set the open_cursors parameter to 1000 as recommended by IBM, leaving it at the default value of 300 when creating the new database. Thus, all I had to do was increase it to 1000, then run the UpdateDB tool again (see below). With a real production database, sometimes I have to increase it to 5000 or 10000 in order to apply some fix packs.
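For reference, the parameter can be checked and raised on the fly (the SHOW command is SQL*Plus syntax; ALTER SYSTEM assumes the instance uses an spfile and you have the privilege to run it):

SHOW PARAMETER open_cursors;
ALTER SYSTEM SET open_cursors = 1000 SCOPE=BOTH;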
 
In many cases, the problem is not easily identified by just looking at the log file, so we have to scroll up a bit further to identify which statement is causing the issue. Then we have to open the DBC script file, which can be found under [SMP]/maximo/tools/maximo/en/[add-on folder]/[dbc script file]. For example, in this case it is …/en/oilandgas/V7110_03.ora
 
 
 
There are a number of script file types:
 
  • Files with .ora, .sqs, or .db2 extensions are just normal SQL files applicable to Oracle, SQL Server, and DB2 respectively. You can run the SQL statements from these files directly against the database to check for problems.
  • Files with the .msg extension are used to add/modify existing messages. You normally don't have a problem with these.
  • Files with .mxs and .dbc extensions are used to modify application GUI design and to apply data updates or database configuration changes.
 
Most of the problems come from .dbc files. To understand DBC scripts and identify what each statement does, you can refer to the DBC XML Format Technical Reference.
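As a quick illustration of the format, a minimal DBC file built around the FREEFORM tag looks roughly like the sketch below (written from memory, so treat the exact tags and attributes as assumptions and check them against the reference above):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE script SYSTEM "script.dtd">
<script author="admin">
  <description>Example: run a plain SQL statement via the FREEFORM tag</description>
  <statements>
    <freeform description="Reset the CONFIGURING flag">
      <sql target="all">update maxvars set varvalue = '0' where varname = 'CONFIGURING'</sql>
    </freeform>
  </statements>
</script>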
 
Some common problems we have when running UpdateDB include:
 
  • Error when creating a unique index: in this case, we have to query and update the data to remove or fix records with duplicated keys.
  • Error when creating an object that already exists: most of the time, we can drop the existing object, as a new one will be created by the script anyway.
  • Missing script files, or script files that don't follow the naming convention, which requires an ordered number. If a script file is missing, you can create an empty file by copying an existing dbc file and renaming it to the missing ordered number. As for files that don't follow the naming convention: I got a java.lang.ClassCastException a few times when running updatedb with the Linear add-on; it turned out there is a file named V7500_linear.dbc while updatedb expects a number after V7500, so all I had to do was rename it to V7500_02.dbc and it didn't complain any more.
  • DB operation fails due to maximum open_cursors or processes exceeded (on Oracle): increase the open_cursors/processes parameters and run UpdateDB again.
  • Java NullPointerException: this one is quite difficult to figure out. We have to look at the DBC file to understand what it's trying to do. It is often caused by bad data, such as orphan records referencing a non-existent parent.
  • Errors with certain DB operations, such as being unable to re-create the WORKORDER table after dropping it (when there's a change to the table structure): I'm not exactly sure what causes this, but I found the problem goes away if I run the UpdateDB process again (and again). I only attempt to fix it if the problem persists after the 3rd run.
 
I hope this helps you when the going gets tough.
 

How to bulk upload images via the Integration Framework?

In a previous post, I provided an example of how we can customise an Object Structure to enable import/export of binary data via MIF, which required Java customisation. From Maximo version 7.6, the automation scripting framework has been greatly extended to support integration. With this update, we can enable import/export of binary data with a simple automation script. Below is an example of how to configure Maximo 7.6 to bulk upload images to the Item Master application:

Step 1: Add a new Object Structure

  • In the Object Structures application, open and duplicate the MXITEM object structure to create a new one. Name it MXLITEM.
  • Under "Source Objects for MXLITEM", delete all child objects, and add a new child object, IMGLIB, as shown below

Step 2: Add an Integration Automation Script

  • In the Automation Script application, choose Action > Create > Script for Integration
  • In the pop-up, enter the following details:
    • Select "Object Structure", and choose "MXLITEM" for Object Structure
    • Select "Inbound Processing"
    • Language: Python
    • Copy/paste the piece of code below into the Source Code area.

Step 3: Upload images to Item Master

Download the Excel VBA tool from this GitHub repo and use it to bulk upload images to Item Master:

  • Update the Settings sheet with the URL and the username/password to connect to Maximo
  • Put all the image files in the same folder as the Excel file
  • Update the Data sheet with the list of Item Number and the name of the image file.
  • Click on “Upload Images”

Notes & Updates

Note 1: Be careful with photos taken with newer cameras; the files usually have high resolution and thus can be quite big. I've seen an eager team of engineers upload images for all assets and inventory items in a power plant, and overnight the database grew from 1 GB to 20 GB. So make sure you resize the images before uploading.

One quick and simple method in Windows is to select multiple files, right-click, and choose Send To > Mail Recipient. Windows will show a pop-up to resize the files; choose the smallest size (640×480). Windows will resize the files and attach them to Outlook. In Outlook, select all the files and copy/paste them to a different folder. These files will be much smaller than the original, full-resolution files.

Update Aug/2023: I have posted a new article on how to automatically rescale the images to a smaller size.
