
How to generate INSERT script for Maximo data with SQL Server?

When dealing with a large amount of data or migrating workflow configuration, we might need to generate and use an INSERT script. This article describes how to do it with a SQL Server database.

Why do we need to generate an INSERT script?

When deploying changes, the preferred method is a DBC script. There are different tools to generate DBC scripts for many types of configurations. However, in certain cases, such as when migrating workflow configuration, we still need to use a SQL script.

For deploying a data package, it is good practice to script the change and deploy it as part of the DBC deployment process. It is also easier to track the changes with a version control system such as Git or SVN.

As an example, in a recent project, the users used Maximo in DEV to enter and validate the data. After they finished, I extracted the data using the method described in this post and loaded it to Production as part of the DBC deployment process.

Did you know that the UpdateDB and RunScriptFile tools can deploy SQL files directly? It is not necessary to put SQL statements into a DBC file with the FREEFORM tag.

What is the limitation with SQL Server Management Studio?

With an Oracle database, generating INSERT statements is quite simple with SQL Developer: all we need to do is right-click on a table and choose Export Data > Insert. With SQL Server, however, a similar function is buried deep in the menus and is hard to find. There are at least two other features with a similar purpose in the same menus, which causes confusion and makes people think this is not possible with SQL Server. These functions do not work for this requirement. They are:

  • Right-click on the table, then choose Script Table As > INSERT to
  • Right-click on the database, then choose Tasks > Export Data

For this requirement, we need to right-click on the database (not table), then choose Tasks > Generate Scripts…

However, this approach has some limitations:

  • It does not let you define a WHERE clause, so you have to generate the script for the whole table.
  • With Maximo, all tables have the ROWSTAMP field, which you need to exclude; otherwise, the INSERT script does not work.

How to generate an INSERT script?

To address the limitations described above, we can create a VIEW or a staging TABLE from the source table. This allows us to set a WHERE clause to filter the data we want to extract and to exclude the ROWSTAMP column. A VIEW is more convenient in a development project because we can define any rule to manipulate the data, such as changing the ID field to a new range to avoid conflicts. However, note that there is a bug in older versions of SQL Server where the script cannot be generated correctly for a view in Data Only mode. In such a case, we can use a staging table instead.
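As a rough illustration, a sketch of the view approach is below. The view name tmp_asset_v is arbitrary, and the column list is abbreviated for readability; in practice, list every persistent column except ROWSTAMP.

-- Sketch only: filters the rows we want and leaves out ROWSTAMP
CREATE VIEW tmp_asset_v AS
SELECT assetnum, siteid, location, description, status, assetuid, assetid
FROM asset
WHERE description = 'Fire Extinguisher';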

To provide an example, below are the detailed steps to generate an INSERT script for all “Fire Extinguisher” records in the ASSET table:

Step 1: Create a temporary table:

  • Use one of the two SQL statements below to get a comma-separated list of column names for the table. The first query works for any SQL Server database, but you will need SELECT permission on INFORMATION_SCHEMA. The second query uses the MAXATTRIBUTE table and only works for a Maximo database.
-- Query 1: generic, works on any SQL Server database (STRING_AGG requires SQL Server 2017 or later)
SELECT string_agg(column_name, ', ')
  WITHIN GROUP (ORDER BY column_name)
FROM information_schema.columns
WHERE table_name = 'asset' AND column_name != 'rowstamp';

-- Query 2: Maximo-specific, uses the MAXATTRIBUTE metadata table
SELECT string_agg(attributename, ', ')
  WITHIN GROUP (ORDER BY attributeno)
FROM maxattribute
WHERE objectname = 'asset' AND persistent = 1;
  • Copy/paste the list of columns into the SELECT ... INTO statement below, then execute it to create the staging table. In this step, we can also transform or map any ID fields (such as ASSETID and ASSETUID in this example) to avoid conflicts (see the sketch after the statement below).
DROP TABLE IF EXISTS tmp_asset;

SELECT [Copy/Paste the column names here]
INTO tmp_asset
FROM asset
WHERE description = 'Fire Extinguisher';
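If the extracted IDs could clash with existing records in the target environment, the mapping mentioned above can be done directly in the SELECT list. A minimal sketch follows; the offset of 5000 is a made-up value, so pick a range that is free in the target database.

DROP TABLE IF EXISTS tmp_asset;

-- Abbreviated column list for illustration; paste the full list here,
-- removing ASSETUID and ASSETID so each column appears only once.
SELECT assetnum, siteid, description,
       assetuid + 5000 AS assetuid,   -- example offset: shift the IDs into a free range
       assetid  + 5000 AS assetid
INTO tmp_asset
FROM asset
WHERE description = 'Fire Extinguisher';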

Step 2: Generate an INSERT script from the temp table:

  • Right-click on the database, choose Tasks > Generate Scripts…
  • In the Generate Scripts wizard, choose “Select specific database objects”, then select the tmp_asset table we created in the step above.
  • On the next page, “Set Scripting Options”, click the “Advanced” button; under the General group, change the “Types of data to script” option from Schema Only to Data Only.
  • Back on the Set Scripting Options page, choose either to save the output as a SQL file or to open it in a new query window.
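The generated script is simply a series of INSERT statements against the staging table, roughly of the form below (the values are made up for illustration). Since the statements target tmp_asset, remember to change the table name to ASSET, or whatever the real target table is, before including the script in the deployment.

INSERT [dbo].[tmp_asset] ([assetnum], [siteid], [description], [assetuid], [assetid])
VALUES ('FE-1001', 'BEDFORD', 'Fire Extinguisher', 6001, 6001)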

What are the alternative methods?

If the method described in this post does not work for you, two alternative methods you can try are

ArcGIS to Maximo synchronisation not working due to API limit

How to send ArcGIS data to Maximo

This is a weird issue with the ArcGIS – Maximo integration cron task. I’d like to record it just in case it hits me again.

Symptom:

The client reported that the Maximo – ArcGIS Asset integration had stopped working: new assets were not synchronised from GIS to Maximo by the ArcGISDataSync cron task. The history of the cron task instance shows an error: BMXAA6361I – caused by: BMXAA1482E – The response code received from the HTTP request from the endpoint is not successful. Not Found

If I copy the same REST query used by the cron task instance and open it in a browser, ArcGIS does return data with HTTP response code 200 – OK, and the response JSON data looks normal.

However, the GIS specialist advised me that the request had exceeded ArcGIS’s API limit, which was set at 2,000. On closer inspection, there is an attribute “exceededTransferLimit”: true at the end of the JSON response. In this case, the feature layer contained more than 5,000 records in the updated state that met the request’s filter criteria (MXCREATIONSTATE=1).

Cause: 

To confirm the GIS limit caused the issue, we increased it to 10,000, and the cron task ran without error after that. It is unclear how Maximo captures this error: whether it reads the JSON response looking for the exceededTransferLimit attribute, or whether it receives a different HTTP status code from ArcGIS. We didn’t have time to debug Maximo’s code to find out.

Solution:

To fix the issue in this case, we resorted to a short-term solution: increasing the limit to 10,000 to clear the batch. We then reset it back to the default, as we did not expect more than 2,000 features to be updated or created per day.

For the long term, I proposed two options:

  • Option 1: Increase the limit to 10,000 or higher permanently. I don’t think it would cause much stress to the servers. However, if we ever have more than 10,000 updates for a feature layer in one day, it will still fail.
  • Option 2: Set this layer/cron task instance in Maximo to run more frequently (e.g. hourly). This is processed by the background JVM and won’t cause performance degradation for the end users. When the cron task runs but does not get any result, it doesn’t consume much resource anyway. However, this approach won’t help if we have a large batch update in a short period (e.g. a manual data import or bulk update).