Author: Viet Tran

Framework Manager crashes when creating a project

I tried to create a new Framework Manager project to build a package in which I join some big Maximo tables (Asset, Workorder, etc.) with some general geo/demographic data I pulled from the web. Framework Manager kept crashing with an annoying error. After a quick search on the web, I realized that FM crashing is not uncommon, and there can be a thousand reasons for it. To be honest, from the perspective of a software developer, most of these are simple and stupid exceptions which could easily be caught or fixed by the Cognos R&D team. Good software should give end users useful error messages about what they did wrong. But anyway, this is not another post to complain, just a quick note on another crash scenario so you can avoid it:


Problem: when trying to create a new project and include the tables/objects to import, Framework Manager took 10-15 minutes to process, then crashed:

Cause: In this case, because the GEO database I created only contains a few simple tables, I assumed I could just tick the database entry to import all the tables in it. What I should have done was pick only the tables I created. By ticking the database entry, FM tries to import every system object included in the database (functions, system tables, users, etc.), and that is what caused the trouble.

 

Solution: always select only the tables we want to use:

Display password stored in System Properties

I had to look at an existing interface in a Maximo production system which publishes data outbound. The interface works well, but we don't have any documentation. I like to use Postman to send some test requests to figure out how the API works. Unfortunately, the password for the API is kept in a System Property and is masked and encrypted.

I struggled for a while until I realized that, since System Properties are meant to be accessible by code, I can easily retrieve and display the value from code. So I created an automation script on the Save event of the ASSET object to display the value:
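The script itself is only a few lines. Reconstructed here as a simplified sketch, with the property name and asset number as placeholders for the real ones:

from psdi.server import MXServer

# Object launch point on ASSET, Save event (simplified sketch)
# Only run for one specific test asset so normal users are not affected
if mbo.getString("ASSETNUM") == "TEST-0001":
    # getProperty() returns the usable (decrypted) value of a system property
    value = MXServer.getMXServer().getProperty("custom.api.password")
    # Write it to the system log; raising it in an error message also works
    # if you want to see it on screen
    service.log("Property value: " + str(value))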

The code has a filter on a specific asset to minimize the chance of causing trouble for end users who might be updating asset data while I'm doing this. This is my favorite approach when I want to run some code on production.

Mess around with Azure: Migrate Maximo to the Cloud

Last week, while attending a call to discuss an integration interface between Maximo and an Azure SQL database, the other consultant mentioned a few Azure terms like “blob storage”, and I realized I didn’t know anything about Azure despite the fact that the platform has been adopted by a large number of the clients I work with. So today, I decided I should play around with it a bit by trying to migrate a Maximo instance from my local VM to Azure.
 
Before I go into the technical details, for the non-technical readers, I’d like to sum up the whole experience I had with this platform in one word: magical. That’s it. I went to the website, signed up for a trial account, followed some wizards and clicked on a few buttons or menus that I thought were the features I needed, and boom, I got a running Maximo instance in the cloud in just a few hours. No training, no muss, no fuss, everything just works. Not much else to say. I’m in love now. I think I will spend the next couple of weeks, months or even years learning more about Microsoft stuff.

Below are the steps that I took:
 
1.1. Sign up for a trial account. After creating the account, I was given 260 AUD in credit to spend on paid services.
 
1.2. I went straight to the portal and tried to create a new VM. Cheap as I am, I went with the lowest option of 1 CPU + 3.5GB RAM. I know that to install Maximo I’ll need at least 8GB, but in this case I just wanted to see if I could fit Windows, WebSphere and Maximo into a box this size.
1.3. On the next page, I gave the VM a 256GB SSD data disk.
1.4. For the network settings, I gave it a Public IP as I want to access Maximo from the public Internet.
1.5. On the next few pages, I left the default settings and proceeded to create the VM. After about 5 minutes, the VM was created. I clicked on “Connect”, and it downloaded an RDP shortcut which opens the Remote Desktop app on my laptop pre-populated with the IP address of the machine. I struggled at this step a bit because I forgot that I had created a Windows user account in the first step, so I tried to use my Microsoft account instead and it failed. After a few failed attempts and a quick google search, I remembered the details, entered MaxDM76AzureUser as the account name, and could log in with that. The machine is extremely fast and responsive; it feels like working on a local server rather than a remote session in the cloud (now I remember how frustrated I was having to deal with the sluggish Maximo SaaS Flex DEV server). Downloading files gave me a rate of 50-70 MB/s (that’s megabytes per second, not Mbps).
 
<This screen was captured after the fact as I forgot to take a screenshot of this step>
2.1 Next step, I tried to create an SQL DB. I put it in the same Resource Group as the Maximo App Server VM because, naturally, I think all resources for an application should be put in one Resource Group, as the name suggests. To be honest, I still have no idea what it really is. On the “Server” drop-down, since it was empty, I had to click on “Create New” and create a new server. I don’t think we can remote-access this one like a VM; it’s just a logical name to manage an SQL instance.
  
2.2 On the Networking page, I chose Public Endpoint for the Connectivity Method because I thought I needed to access the DB from SQL Studio on my laptop, and Allow Azure services and resources to access this server because I think this option lets Maximo on the App Server VM access the DB. It turned out I couldn’t figure out how to access the DB using SQL Studio on my laptop (more on this later), but the second option works exactly as it is supposed to.
2.3 I kept the default settings for the next few pages, then proceeded to create the DB.
2.4 I attempted to connect to the DB using SQL Studio on my laptop, but it didn’t work. However, when I installed SQL Studio on the Azure VM, it worked on the first try.
3.1 Now that I had a server and a DB, I proceeded with the Maximo migration. I backed up the SQL DB in my VM to a DMP file, uploaded it to the Azure VM, and tried to restore it to Azure SQL. It turned out on-premises SQL Server is different from Azure SQL, and I couldn’t do that. A quick google search revealed that I had to export the DB into a *.bacpac file, then import it to Azure.
3.2 The whole export/import of the bacpac file was straightforward, with a minimum number of clicks and no confusion. After it was done, I realized the import process had restored the DB from my VM as a new database, so I ended up with two DBs and had to delete the empty one I created in the earlier step.
3.3 I struggled a bit trying to create a new ‘maximo’ user. It turned out the way security logins and users are managed in Azure SQL is a bit different, so after a bit of googling, I found some scripts I could use. Here are the notes I wrote down for this step:
– Right-click the “Databases” folder to open a new query window:
    CREATE LOGIN maximo WITH PASSWORD = 'SuperSecret!';
– Right-click the ‘maxdm7609’ database to open a new query window:
    CREATE USER maximo FOR LOGIN maximo WITH DEFAULT_SCHEMA = dbo;
    ALTER ROLE db_owner ADD MEMBER maximo;
4.1 The next step was to install WebSphere. A quick check before installing showed I had 2.3 GB of RAM left; I thought I could give WebSphere 1GB and Maximo 1GB, so I proceeded with the installation.
4.2 It turned out that although I could install WebSphere successfully, the automated process to prepare the environment for Maximo failed with an out-of-memory error. I know we could manually configure WebSphere and make it work, but what’s the point in wasting time doing that? So I uninstalled WebSphere, upgraded the VM to the next lowest option, which has 2 CPUs and 8GB RAM, then attempted to install WebSphere again, which worked this time.
 
4.3 I uploaded a pre-built maximo.ear file from my local laptop to the Azure VM, deployed it to WebSphere, then updated the connection string to point to the Azure SQL DB. The connection string is almost exactly the same as the one used to connect Maximo to a local SQL Server DB, and it worked immediately when I started MXServer after deployment.
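For reference, the relevant entries in maximo.properties end up looking something like this (the server name below is a placeholder; the database name is the one created earlier):

mxe.db.driver=com.microsoft.sqlserver.jdbc.SQLServerDriver
mxe.db.url=jdbc:sqlserver://<your-server-name>.database.windows.net:1433;databaseName=maxdm7609
mxe.db.user=maximo
mxe.db.password=SuperSecret!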
(Installing WebSphere and deploying Maximo are not the topic of this post, so I’m not going to put the details of those steps here.)
 
5.1 Now that I had Maximo running on the VM, I wanted to access it from the web, so I went to the VM’s Networking page and opened port 80 inbound, but it still didn’t work. After a bit of fussing around, I remembered I had to open the port on the VM’s Windows Firewall as well. After opening port 80 on both firewalls, it worked like a charm: I could access Maximo from my laptop using the VM’s public IP.
5.2 I wanted a DNS name to access it as well, so in the VM’s IP settings I set the IP to Static (which I think prevents it from changing even if I shut down the VM) and put in a DNS name label.
5.3 Right after I clicked Save, I could access Maximo using the DNS alias. Mission accomplished.
I have my fair share of experience with various software and platforms, ranging from great and good to bad and horrible. I can live with bad software, but it irritates me to the bone; I feel good when playing with good software. In this case, I think Microsoft has done everything right in designing the Azure platform, and I felt nothing but joy throughout this whole learning process. My next step is probably to recommend Azure to anyone I meet who is considering moving Maximo to the cloud.

Convert Australian GDA94 Easting/Northing values to Latitude/Longitude in SQL Server

As I play around with the Cognos Analytics map control, it appears to me that it only understands Latitude/Longitude values, but the sample dataset I have is in UTM format, so I had to find a way to convert Easting/Northing values to Lat/Long. Since I use SQL Server, I attempted to create a column of the “Geometry” data type from the Easting/Northing values and then convert it to a “Geography” data type (Lat/Long), but that did not work for me.

After some scrambling on the web, the SQL functions below seem to work for converting Easting/Northing values (on the Australian GDA94, MGA Zone 56 standard). The key to getting them to work is setting the correct datum details in the constants section. A quick test of these functions does seem to give me the correct Lat/Long coordinates of the Sydney Harbour Bridge:

Coordinate of the Sydney Harbour bridge for testing

Calling the utm2lat and utm2lon functions to get the result

The two pieces of T-SQL code below are almost identical; the only difference is that one returns the Latitude and the other returns the Longitude value.
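As a side note, one way to double-check that the datum constants are set correctly is to compare the SQL output against a conversion done with a dedicated library. A minimal Python sketch using pyproj (EPSG:28356 is the code for GDA94 / MGA Zone 56; the Easting/Northing test values are placeholders):

# Cross-check a GDA94 / MGA Zone 56 Easting/Northing point against pyproj
from pyproj import Transformer

# EPSG:28356 = GDA94 / MGA Zone 56 (GRS80 ellipsoid, central meridian 153E,
# scale factor 0.9996, false easting 500,000 m, false northing 10,000,000 m)
# EPSG:4283  = GDA94 geographic (Latitude/Longitude)
transformer = Transformer.from_crs("EPSG:28356", "EPSG:4283", always_xy=True)

easting, northing = 334000.0, 6252000.0   # placeholder test values
lon, lat = transformer.transform(easting, northing)
print(lat, lon)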

My top 3 beginner gotchas with Cognos 11

Working with Maximo, perhaps we all share the same frustration with the constant changes by the IBM marketing team, who work hard and are very successful in their attempt to confuse us. The most visible problem is the continuous changes to their website, which essentially break any links from external sources pointing to KB pages that are more than a few years old. When I started with Cognos, I realized they have brought it (the confusion part) to the next level with this product. Below are the top 3 gotchas that I had to learn the hard way when I started with this product:

Gotcha #1: Cognos Business Intelligence vs Cognos Analytics
From version 11, IBM started calling it Cognos Analytics; previous versions were called Cognos Business Intelligence. This does not seem like a big problem on the surface, but as a beginner, I was totally lost when searching for technical information on the web. Initially, when I read something that mentioned Cognos BI or Cognos Analytics, to me it meant the same thing: “Cognos”. In many cases, I didn’t realize that when people said something is available in Cognos Analytics, or something is not possible in Cognos BI, they were talking about different versions. I learned this the hard way and thought about how much trouble it would have saved me if I had known about it sooner.
Gotcha #2: Visualization vs Charts
In version 11, IBM added a brand-new graphical component called ‘Visualization’ and made it the default option. IBM marketing is all out touting the cool new features, and of course they don’t mention any limitations. I didn’t realize Visualizations and Charts are totally different things, each with its own pros and cons. The good thing about Visualization is that it has a more modern look; the nasty thing about it is that it has certain limitations which make it less appealing to experienced report developers.
 
Older version charts are only visible if you click on this little icon on the top right corner of the dialog
For example, with a Column Visualization, it is not possible to add a Trend Line or Baseline, or to Group Small Items; these are standard features of the old Column Chart component. If appearance is the only reason to use a Visualization, in most cases customizing the old Chart control can make it look almost indistinguishable from the same chart type in the newer Visualization; adding the missing functions, on the other hand, is a lot more difficult.
Chart Annotations option is only available in the older version “Chart” control
When I had to add a Trend Line to a Column Visualization, it took me hours trying to find the “Chart Annotations” section in the Properties navigation bar; this section is not available with the new Visualization. It took me many more hours searching for and downloading “legacy Cognos chart” libraries to import into my Cognos environment. Only then did I find out about the little “Filter” button in the corner of the Select Visualization screen that makes the old Charts visible. I think IBM could do a few things to improve this:
  • Make old components more visible by showing them in the same list, under a separate group (e.g. Cognos v10 charts, Legacy Visualizations, etc.)
  • Standardize the Properties list across all chart types; any property that is not available for a certain chart type could be greyed out (rather than hidden), with a tooltip telling users why it is not available
  • New chart types should provide at least the same functionality as the older versions they are trying to replace.
Gotcha #3: Report vs Active Report vs Dashboard
Report, Active Report and Dashboard are totally different things and use different underlying technologies; I get it. But on the surface, they have controls with the same meaning to end users/developers, such as List, CrossTab or Chart. The problem I have with this one is that, when designing a report for example, the UI looks kind of the same (which gives us the impression that they are the same), but with some random inconsistencies that confuse the hell out of those who are new to the product. For example, a (static) Report can only have a List control, while an Active Report can have both a List control and a Data List control. The latter is advertised as a better and faster version. So why not use it to replace the old one? (I guess there are limitations which make the old control irreplaceable, but at this point I haven’t figured out what they are. As always, IBM will not tell us; we’ll have to figure it out ourselves the hard way.)
 
Next is the fun part. I needed to display a List with a little graphic like the image below, and I read somewhere that, in Cognos, we can include a Microchart in the List to achieve that.
 
This tiny graphical icon is called a Microchart

After spending a ridiculous amount of time, it turned out this simple feature is only available in an Active Report, with the Data List control (not the List control).
This Microchart function, in turn, has its own gotcha. For a Data List in an Active Report, the Microchart option is located on the Properties side navigation bar.
 
For Data List, Microchart option sits on the Property bar.
 
For a CrossTab control, which is available in both static and Active Reports, to add a Microchart we have to right-click on the Column or Cell, click on the ellipsis icon […] on the On-demand Toolbar, then find the “Insert Chart…” option.
 
For Crosstab, Microchart option is buried in the […] button of the On-Demand tool bar
 

I found a few dozen other little annoyances with Cognos, but it all comes down to one thing: consistency, or the lack of it, at both the business and the software-design level, which makes the software so difficult to use. However, to be fair, I think in this version (v11) IBM has done a great job modernizing Cognos and switching it to a web-based GUI. When software gets a new version, it is always received with criticism due to the changed design and interface. But over time, people get used to it or find ways to work around the limitations, or the vendor finds ways to improve it. With Cognos, I’m kind of optimistic about its future.

BMXAA4017E Error when running custom code

The “Object cannot be saved” error

A client hired me to do a bit of house-cleaning and provide minor enhancements for their Maximo system. It has some pieces of buggy Java code, and I took the opportunity to de-customize the system by rewriting the logic with automation scripts. One piece of logic worked well in the Development environment; however, when deployed to Production, it did not work.

The automation script is triggered when the user saves a record, but it executes at the post-commit launch point. Thus, the script does not give any error on the front end. However, there is an error “BMXAA4017E object cannot be saved with the data that was provided” in the SystemOut.log file. The error came from the mbo.checkQualifiedRestriction method.

Error BMXAA4017E object cannot be saved with the data that was provided

Background Information

The BMXAA4017E error is a common one. It often occurs because of a QUALIFIED rule set up in a security data restriction, preventing a user from interacting with data he/she should not have access to.

However, this client has an interesting environment with four satellite Maximo servers located in Antarctica, synchronized to a central server via satellite using a data sync solution provided by SRO. Although the number of active users is low, they have 500-600 users across nearly 20 logical sites. There are many data restriction rules configured at various levels: global, site, security group, etc.

This issue gave me chills down the spine during the deployment to Production because I simply couldn’t think of where to start looking. When I was about to announce it as a failed deployment, a random thought came up and it helped me solve the problem.

Problem and Solution

In short, the problem is that, to update a record, I fetched the MboSet from the database using the MXServer.getMboSet() method. I passed the system UserInfo to this method as I always do, thinking it is linked to the MAXADMIN account and has the ultimate access rights. Below is the code:

The getSystemUserInfo() method is the cause of the problem
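In essence, the fetch looked something like this (a simplified sketch; the object name is a placeholder for the real one):

from psdi.server import MXServer

mxserver = MXServer.getMXServer()
# getSystemUserInfo() returns the internal system account's UserInfo,
# which is what ended up failing the restriction check
woSet = mxserver.getMboSet("WORKORDER", mxserver.getSystemUserInfo())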

That is where the problem comes from: the “SystemUser” profile does not belong to any Security Group and is not listed in any security restriction rules. One of those rules must have failed and caused the error. To fix it, I changed the code to use the currently logged-in user’s security profile as follows:

Changing to getUserInfo fixed the issue
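Roughly, the fixed fetch looks like this (again a sketch; mbo is the launch point’s implicit record variable, and the object name is a placeholder):

from psdi.server import MXServer

# Use the UserInfo of the user who triggered the script instead
woSet = MXServer.getMXServer().getMboSet("WORKORDER", mbo.getUserInfo())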

To be honest, I still haven’t figured out why the same code using SystemUserInfo works in the DEV and TEST environments but not in PROD. But as an engineer, not a scientist, I guess it is OK not to know the Why sometimes. As long as it works, everyone is happy 😊
