
My failed attempt to get Maximo to work with Azure SQL database

Recently, I started playing with Azure by attempting to migrate a Maximo instance from my local VM to the Azure platform. Although the attempt was a success, I didn't realize SQL Server and Azure SQL are different databases (or more correctly, two different versions). There were a few issues during the process, but I figured out how to work around them and got Maximo running on an Azure VM and Azure SQL. After sharing the result on LinkedIn, there were some comments that Maximo couldn't be installed on Azure SQL and that IBM doesn't support it, so I spent a bit more time digging and thought I should share the details and some of my opinions on this matter.

First, let us be clear, Azure is a big cloud platform which offers many different services. I’m not a cloud expert, but from what I understand, we are talking about two main services:

  • Azure VM: this is IaaS; it lets us run a virtual machine in the cloud. You can run many different Windows or Linux versions on it, and it is transparent to standard business applications such as Maximo. In other words, there shouldn't be any issue running Maximo on an Azure VM as long as you stick to the OS versions supported by Maximo. For example, the Maximo 7.6.1.1 compatibility matrix lists Microsoft Hyper-V 2012 and 2012 R2 as supported, and with them, Windows Server 2012 and 2016 are supported as guest OS. In fact, many companies large and small are running Maximo on Azure VMs.
  • Azure SQL: this is a cloud version of SQL Server. It has the core engine of the SQL Server database, but certain aspects of it have been modified to work in a cloud environment and to support multi-tenancy. The current version of Maximo doesn't support Azure SQL, and IBM doesn't have any plans to support it in the near future with Maximo 7.6.1.x and 8 (according to their 2020 roadmap).

Even with that knowledge, I was still curious whether I could make it work from a technical perspective, so I did a few experiments:

A – Installing Maximo 7.6.1 from scratch on Azure VM and Azure SQL:

People mentioned that Maximo cannot be installed on Azure SQL, so I tried exactly that, just to find out what exactly the issue is (and secretly, I hoped I could fix it). I followed the standard installation process and managed to install WebSphere, prepare the environment, and install the Maximo SMP.

After the SMP was installed, I couldn't get the Configuration tool to connect to the Azure DB because it couldn't resolve the DB alias provided by Azure.

I attempted to execute this step manually from the command line by setting up the maximo.properties file, then running the maxinst.bat tool. However, the DBC script failed when it ran the undocumented DBCC 'CALLFULLTEXT' command.
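For those who haven't run maxinst manually before, the command-line route only needs a handful of connection entries in maximo.properties. Below is a minimal sketch; the server name, database name and password are placeholders, and I'm assuming the default dbo schema owner:

  # hypothetical maximo.properties entries pointing at an Azure SQL database
  mxe.db.driver=com.microsoft.sqlserver.jdbc.SQLServerDriver
  mxe.db.url=jdbc:sqlserver://<yourserver>.database.windows.net:1433;databaseName=maxdb76;encrypt=true
  mxe.db.user=maximo
  mxe.db.password=<password>
  mxe.db.schemaowner=dbo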

To be honest, installing a fresh Maximo DB on Azure SQL shouldn't be a problem from a technical perspective, because we can always install it on an on-premises SQL Server first, then migrate it to Azure SQL (using the process I described in my previous post). However, this attempt is definitely a failure.

B – Upgrading Maximo from 7.6.0.6 to 7.6.0.9

It is critical to be able to upgrade and install updates, fix packs, or industry add-ons. From a technical standpoint, all of these activities are essentially the same process: running a large number of DBC scripts, which mostly consist of SQL commands. I wanted to see if that works on Azure SQL.

I started out with a simple task first: upgrading from Maximo 7.6.0.6 to 7.6.1. To do this, I simply replaced the existing 7.6.0.6 SMP folder installed previously with a new 7.6.1 SMP folder, then ran the updatedb tool. On the first run, I had several issues related to incompatible collation between "Latin1_General_CI_AS" (the default setting of on-premises SQL Server) and "SQL_Latin1_General_CP1_CI_AS" (the default collation of Azure SQL databases, which can be changed, and the ONLY supported collation of master and tempdb).

Changing a database collation is a complex job which involves updating the value at all three levels: database, table, and column. To do it, we have to drop all indexes and constraints on the tables, run the update script, then re-create the indexes. I only wanted to find out if an upgrade is possible, so I tried again with a fresh demo DB created with the "SQL_Latin1_General_CP1_CI_AS" collation. The upgrade was successful this time. So, that is a small success.
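For anyone who wants to avoid this up front, it is easy to confirm the collation of an existing database and to create the target database with the matching collation from the start. A minimal sketch (the database name is just an example):

  -- check the collation of the current database
  SELECT DATABASEPROPERTYEX(DB_NAME(), 'Collation') AS current_collation;

  -- create the target DB with the Azure SQL default collation up front
  CREATE DATABASE maxdm76 COLLATE SQL_Latin1_General_CP1_CI_AS;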

C – Install Add-ons

Next, I tried installing some serious industry add-ons (Scheduler 7.6.7, Oil and Gas 7.6.1, Utilities 7.6.0.4, Control Desk 7.6.1) on top of the working instance from the previous step. These are probably the most complex add-ons we could get for Maximo. As such, if they worked, there shouldn't be any problem getting any future upgrade/update to work.

The process failed when it tried to run some scripts belonging to the ICD (Control Desk) add-on. This time, it had an issue when trying to access the object spt_values in the master DB, access that will never be granted in a multi-tenant cloud environment. There are a few posts on the web showing how to work around this issue, but I stopped at this point, and that is the end of my attempt. There's no point in spending too much time getting something to work knowing it is not supported by IBM (unless there are some clients willing to pay for my time to do that in the future).
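To illustrate the kind of statement that breaks: on SQL Server, master..spt_values is often used as a convenient built-in numbers/lookup table, and it is exactly that three-part reference to master that Azure SQL Database rejects. A hedged sketch of the pattern and one possible local substitute (I haven't checked how the ICD scripts actually use the object):

  -- typical on-premises pattern: works on SQL Server, rejected by Azure SQL Database
  SELECT number FROM master..spt_values WHERE type = 'P' AND number < 100;

  -- one possible substitute generated inside the user database
  SELECT TOP (100) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS number
  FROM sys.all_objects;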

Conclusion: the result of this process confirms what others have reported: Maximo doesn't work with the Azure SQL database. Although the collation issue can be worked around (with a lot of effort), more than one DBC script requires access to the master DB, which is not possible in Azure SQL. People with extensive knowledge of DBC scripting and custom DBC handlers might be able to modify the problem scripts to get it to work, but that is definitely not supported by IBM, and in my opinion the effort cannot be justified, as support for Azure SQL is not on the roadmap. At this stage, our only option is to run the on-premises DB version installed in the cloud as a separate DB server.

Framework Manager crashes when creating Project

I tried to create a new Framework Manager project to build a package in which I would join some big Maximo tables (Assets, Workorder, etc.) with some general geo/demographic data I pulled from the web. Framework Manager kept crashing with this annoying error. After some quick searching on the web, I realized that FM crashing is not uncommon, and there can be a thousand reasons for it. To be honest, from the perspective of a software developer, most of these are simple and stupid exceptions which should be caught or fixed easily by the Cognos R&D team. Good software should give end users useful error messages about what they did wrong. But anyway, this is not another post to complain, just a quick note on another crash scenario so you can avoid it:


Problem: when trying to create a new project and select the tables/objects to import, Framework Manager took 10 to 15 minutes to process, then crashed:

Cause: in this case, because the GEO database I created only contains a few simple tables, I assumed I could just tick the database itself to import all of its tables. What I did wrong: I should have picked only the tables I created. By ticking the database entry, FM tries to import every system object in the database (functions, system tables, users, etc.), and that is what caused the trouble.

 

Solution: always select only the tables we want to use:

Display password stored in System Properties

I had to look at an existing interface in a Maximo production environment which publishes data outbound. The interface is working well, but we don't have any documentation. I like to use Postman to send some test requests to figure out how the API works. Unfortunately, the password for the API is kept in a System Property, which is masked and encrypted.

I struggled for a while until I realized that, since System Properties are meant to be accessible by code, I can easily retrieve and display the value from code. So, I created an automation script on the Save event of the ASSET object to display the value:

The code has a filter on a specific asset to minimize the chance of causing trouble for end users who might update asset data while I'm doing this. This is my favorite approach when I want to run some code in production.

Mess around with Azure: Migrate Maximo to Cloud

Last week, while attending a call to discuss an integration interface between Maximo and an Azure SQL database, the other consultant mentioned a few Azure terms like "blob storage"; I realized I didn't know anything about Azure despite the fact that the platform has been adopted by a large number of the clients I work with. So today, I decided I should play around with it a bit by trying to migrate a Maximo instance from my local VM to Azure.
 
Before I go into the technical details, for the non-technical readers, I'd like to sum up the whole experience I had with this platform in one word: magical. That's it. I went to the website, signed up for a trial account, followed some wizards, clicked on a few buttons or menus that I thought were the features I needed, and boom, I got a running Maximo instance in the cloud in just a few hours. No training, no muss, no fuss, everything just works. Not much else to say. I'm in love now. I think I will spend the next couple of weeks, months or even years learning more about Microsoft stuff.

Below are the steps that I took:
 
1.1. Sign up for a trial account. After creating the account, I was given $260 AUD of credit to spend on paid services.
 
1.2. I went straight to the portal and tried to create a new VM. Cheap as I am, I went with the lowest option of 1 CPU + 3.5GB RAM. I know that to install Maximo I'll need at least 8GB, but in this case, I just wanted to see if I could fit Windows, WebSphere and Maximo into a box this size.
1.3. On the next page, I gave the VM a 256GB SSD data disk.
1.4. For the network settings, I gave it a Public IP, as I wanted to access Maximo from the public Internet.
1.5. On the next few pages, I left the default settings and proceeded to create the VM. After about 5 minutes, the VM was created. I clicked on "Connect", and it downloaded an RDP shortcut which opened the Remote Desktop app on my laptop, pre-populated with the IP address of the machine. I struggled at this step a bit because I forgot that I had created a Windows user account in the first step; thus, I tried to use my Microsoft account instead, and it failed. After a few failed attempts and a quick Google search, I remembered the details, entered MaxDM76AzureUser for the account name, and could log in with that. The machine is extremely fast and responsive; it feels like working on a local server rather than a remote session in the cloud (now I remember how frustrated I was having to deal with the sluggish Maximo SaaS Flex DEV server). Downloading files gave me a rate of 50-70 MB/s (that's megabytes per second, not Mbps).
 
<This screen was captured after the fact as I forgot to take a screenshot of this step>
2.1 Next step, I tried to create an SQL DB. I put it in the same Resource Group as the Maximo App Server VM because, naturally, I think all resources for an application should be put in one Resource Group, as the name suggests. To be honest, I still have no idea what it really is. On the "Server" drop-down, since it was empty, I had to click on "Create New" and create a new server. I don't think we can remote into this one like a VM; it is just a logical name to manage an SQL instance.
  
2.2 On the Networking page, I chose Public Endpoint for the Connectivity Method because I thought I needed to access the DB from SQL Studio on my laptop, and "Allow Azure services and resources to access this server" because I thought this option lets Maximo on the App Server VM access the DB. It turned out I couldn't figure out how to access the DB using SQL Studio on my laptop (more on this later), but the second option works exactly as it is supposed to.
2.3 I left the default settings on the next few pages, then proceeded to create the DB.
2.4 I attempted to connect to the DB using SQL Studio on my laptop, but it didn't work. However, installing SQL Studio on the Azure VM and connecting from there worked on the first try.
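In hindsight, the laptop connection was most likely blocked by the server-level firewall of the Azure SQL logical server; the "Allow Azure services" option only covers traffic coming from inside Azure. I haven't verified this was the cause in my case, but adding a client IP rule from a query window connected to the master database would look roughly like this (the rule name and IP addresses are placeholders):

  -- run against the master database of the Azure SQL logical server
  EXECUTE sp_set_firewall_rule
      @name = N'laptop-home-ip',
      @start_ip_address = '203.0.113.25',
      @end_ip_address = '203.0.113.25';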
3.1 Now that I had a server and a DB, I proceeded with the Maximo migration. I backed up the SQL DB in my VM to a DMP file, uploaded it to the Azure VM, and tried to restore it to Azure SQL. It turned out on-premises SQL Server is different from Azure SQL, and I couldn't do that. A quick Google search revealed that I had to export the DB into a *.bacpac file, then import it to Azure.
3.2 The whole export/import of the bacpac file was straightforward, with a minimum number of clicks and no confusion. After it was done, I realized the import process had restored the DB from my VM as a new database, so I ended up with two DBs and had to delete the empty one I created in the earlier step.
3.3 I struggled a bit trying to create a new 'maximo' user. It turned out the way security logins and users are managed in Azure SQL is a bit different, so after a bit of googling, I found some scripts I could use. Here are the notes I wrote down for this step:
– Right-click the "Databases" folder (this runs against the master database) and open a new query window:
  CREATE LOGIN maximo WITH PASSWORD = 'SuperSecret!';
– Right-click the 'maxdm7609' database and open a new query window:
  CREATE USER maximo FOR LOGIN maximo WITH DEFAULT_SCHEMA = dbo;
  ALTER ROLE db_owner ADD MEMBER maximo;
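As a quick sanity check, the query below, run against the maxdm7609 database, should show the new user as a member of db_owner:

  -- list members of db_owner in the current database
  SELECT dp.name, dp.type_desc
  FROM sys.database_role_members rm
  JOIN sys.database_principals dp ON dp.principal_id = rm.member_principal_id
  JOIN sys.database_principals r ON r.principal_id = rm.role_principal_id
  WHERE r.name = 'db_owner';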
4.1 The next step was to install WebSphere. A quick check before installing showed I had 2.3 GB of RAM left; I thought I could make WebSphere take 1GB and Maximo 1GB, so I proceeded with the installation.
4.2 It turned out that although I could install WebSphere successfully, the automated process to prepare the environment for Maximo failed due to an Out-of-Memory error. I know we could manually configure WebSphere and make it work, but what's the point in wasting time doing that? So I uninstalled WebSphere, upgraded the VM to the next lowest option, which has 2 CPUs and 8GB RAM, then attempted to install WebSphere again, which worked this time.
 
4.3 I uploaded a pre-built maximo.ear file from my laptop to the Azure VM, deployed it to WebSphere, then updated the connection string to point to the Azure SQL DB. The connection string is almost exactly the same as when connecting Maximo to a local SQL DB, and Maximo worked immediately when I started MXServer after deployment.
(Installing WebSphere and deploying Maximo are not the topic of this post, so I'm not going to put the details of those steps here.)
 
5.1 Now that I had Maximo running on the VM, I wanted to access it from the web, so I went to the VM's Networking page and opened port 80 inbound for the VM, but it still didn't work. After a bit of fussing around, I remembered I had to open the port on the VM's Windows Firewall as well. After opening port 80 on both firewalls, it worked like a charm; I could access Maximo from my laptop using the VM's public IP.
5.2 I wanted to have a DNS name to access it as well, so in the VM's IP settings, I set the IP to Static (which I think prevents it from changing even if I shut down the VM) and put in a DNS name label.
5.3 Right after I clicked on Save, I could access Maximo using the DNS alias. Mission accomplished.
I have had my fair share of experience with various software and platforms, ranging from great to good, bad, and horrible. I can live with bad software, but it irritates me to the bone; I feel good when playing with good software. In this case, I think Microsoft has done everything right designing the Azure platform, and I felt nothing but joy throughout this whole learning process. My next step is probably to recommend that anyone I meet put Maximo on Azure if they are considering moving to the cloud.

Convert Australian GDA94 Easting/Northing values to Latitude/Longitude in SQL Server

As I play around with the Cognos Analytics map controls, it appears that they only understand Latitude/Longitude values, but the sample dataset I have is in UTM format. So, I have to find a way to convert Easting/Northing values to Lat/Long. Since I use SQL Server, I tried creating a "Geometry" column from the Easting/Northing values and then converting it to a "Geography" data type (Lat/Long), but that did not work for me.

After some scrambling on the web, the SQL functions below seem to work for me to convert Easting/Northing values (on the Australian GDA94, Zone 56 standard). The key to getting it to work is setting the correct datum details in the constants section. A quick test of this function does seem to give me the correct Lat/Long coordinates of the Sydney Harbour Bridge:
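For context, the values the constants section needs for GDA94 / MGA Zone 56 are the standard GRS80 ellipsoid and Map Grid of Australia projection parameters. The variable names below are just illustrative; the actual functions may structure them differently:

  -- GDA94 uses the GRS80 ellipsoid; MGA Zone 56 is a UTM-style projection
  DECLARE @a FLOAT = 6378137.0;                -- semi-major axis (metres)
  DECLARE @inv_f FLOAT = 298.257222101;        -- inverse flattening
  DECLARE @k0 FLOAT = 0.9996;                  -- central scale factor
  DECLARE @false_easting FLOAT = 500000.0;     -- false easting (metres)
  DECLARE @false_northing FLOAT = 10000000.0;  -- false northing, southern hemisphere (metres)
  DECLARE @central_meridian FLOAT = 153.0;     -- Zone 56 central meridian (degrees east)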

Coordinate of the Sydney Harbour bridge for testing

Calling the utm2lat and utm2lon functions to get the result

The two pieces of T-SQL code below are almost identical; the only difference is that one returns the Latitude and the other returns the Longitude value.
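Assuming the functions take the easting and northing (in metres) as their two parameters, a test call looks something like the statement below. The coordinates here are my rough MGA Zone 56 figures for the Sydney Harbour Bridge, so the result should come back near latitude -33.85 and longitude 151.21:

  -- approximate MGA94 Zone 56 coordinates of the Sydney Harbour Bridge
  SELECT dbo.utm2lat(334480, 6252800) AS latitude,   -- expect roughly -33.85
         dbo.utm2lon(334480, 6252800) AS longitude;  -- expect roughly 151.21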

My top 3 beginner gotchas with Cognos 11

Working with Maximo, perhaps we all share the same frustration with the constant changes by the IBM marketing team, who work hard and are very successful in their attempt to confuse us. The most visible problem is the continuous change to their website, which essentially breaks any links from external sources pointing to KB pages that are more than a few years old. When I started with Cognos, I realized they have brought it (the confusion part) to the next level with this product. Below are the top 3 gotchas that I had to learn the hard way when I started with this product:

Gotcha #1: Cognos Business Intelligence vs Cognos Analytics
From version 11, IBM started calling it Cognos Analytics; previous versions were called Cognos Business Intelligence. This does not seem like a big problem on the surface, but as a beginner, I was totally lost when searching for technical information on the web. Initially, when I read something that mentioned Cognos BI or Cognos Analytics, to me it meant the same thing: "Cognos". In many cases, I didn't realize that when people said something is available in Cognos Analytics, or something is not possible in Cognos BI, they were talking about different versions of the product. I learned this the hard way and thought about how much trouble it would have saved me had I known about it sooner.
Gotcha #2: Visualization vs Charts
In version 11, IBM added a brand-new graphical component called 'Visualization' and made it the default option. IBM marketing is all out touting the cool new features, and of course they don't mention any limitations. I didn't realize that Visualization and Chart are totally different things, each with its own pros and cons. The good thing about Visualization is that it has a more modern look; the nasty thing is that it has certain limitations which make it less appealing to experienced report developers.
 
Older version charts are only visible if you click on this little icon on the top right corner of the dialog
For example, with a Column Visualization, it is not possible to add a Trend Line, a Base Line, or Group Small Items. These are standard features of the old Column Chart component. If appearance is the only reason to use a Visualization, in most cases customizing the old Chart control can make it look almost indistinguishable from the same chart type in the newer Visualization; adding the missing functions, on the other hand, is a lot more difficult.
Chart Annotations option is only available in the older version “Chart” control
When I had to add a Trend Line to a Column Visualization, it took me hours trying to find the "Chart Annotations" section in the Properties navigation bar; this section is not available with the new Visualization. It took me many more hours searching for and downloading the "legacy Cognos chart" libraries to import into my Cognos environment. Only then did I find out about the little "Filter" button in the corner of the Select Visualization screen that makes the old Charts visible. I think IBM could do a few things to improve this:
  • Make old components more visible by showing them in the same list, under a separate group (e.g. Cognos v10 charts, Legacy Visualizations, etc.)
  • Standardize the Properties list of all charts; any property that is not available to a certain chart type could be greyed out (rather than hidden), with a tooltip telling users why it is not available
  • New chart types should provide at least the same functionality as the older versions they try to replace.
Gotcha #3: Report vs Active Report vs Dashboard
Report, Active Report, and Dashboard are totally different things and use different underlying technologies; I get it. But on the surface, they have controls with the same meaning to end users/developers, such as List, CrossTab, or Chart. The problem I have with this one is that, when designing a report for example, the UI looks kind of the same (which gives us the impression that they are the same), but with some random inconsistencies that confuse the hell out of those who are new to the product. For example, a (static) Report can only have a List control, while an Active Report can have both a List control and a Data List control. The latter is advertised as a better and faster version. So why not use it to replace the old one? (I guess there are limitations which make the old control irreplaceable, but at this point I haven't figured them out. As always, IBM will not tell us; we'll have to figure it out ourselves the hard way.)
 
Next is the fun part. I needed to display a List with a little graphic like the image below, and I read somewhere that, in Cognos, we can include a Microchart in the List to achieve that.
 
This tiny graphical icon is called a Microchart

After spending a ridiculous amount of time, it turned out this simple feature is only available in an Active Report, with the Data List control (not List). This Microchart function, in turn, has its own gotcha. For a Data List in an Active Report, the Microchart option can be found on the Properties side navigation bar.
 
For Data List, Microchart option sits on the Property bar.
 
For a CrossTab control, which is available in both static and Active Reports, to add a Microchart we have to right-click on the Column or Cell, click on the ellipsis icon [...] of the On-demand Toolbar, then find the "Insert Chart…" option.
 
For Crosstab, Microchart option is buried in the […] button of the On-Demand tool bar
 

I found a few dozen other little annoyances with Cognos, but it all comes down to one thing: consistency, or the lack of it, at both the business and software design levels, which makes the software so difficult to use. However, to be fair, I think in this version (v11) IBM has done a great job modernizing Cognos and switching it to a web-based GUI. When a piece of software has a new version, it is always received with criticism due to the changed design and interface, but over time people get used to it, find ways to work around limitations, or the vendor finds ways to improve it. With Cognos, I'm kind of optimistic about its future.