PYODBC – Login Timeout Expired error

Just want to document a strange issue I had today. I have this small Python program that uses the pyodbc library to connect to an SQL Server database on my local machine. The code was working fine two weeks ago before I left it to work on a different project. Below is the simplified version of the code in which pyodbc is used to read data from SQL Server 2022:
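(A minimal reconstruction of the snippet; the server, database, and query names are placeholders.)

    import pyodbc

    # The legacy "SQL Server" driver that later stopped working.
    conn = pyodbc.connect(
        "DRIVER={SQL Server};"
        "SERVER=localhost;"
        "DATABASE=mydb;"          # placeholder database
        "Trusted_Connection=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT TOP 10 * FROM dbo.asset")  # placeholder query
    for row in cursor.fetchall():
        print(row)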

When I tried running it today, it did not work. Instead, I got the following error:

pyodbc.OperationalError: ('HYT00', '[HYT00] [Microsoft][ODBC SQL Server Driver]Login timeout expired (0) (SQLDriverConnect)')

While it looks like a network issue, there has been no change to my laptop or home network. Searching for this error on the internet did not turn up a solution either.

However, this article suggests that we can use a different ODBC driver with pyodbc. My ODBC Data Source Administrator (found in Windows’ Control Panel) showed that I have ODBC Driver 17 for SQL Server installed.

So, I changed my driver as in the example below and it worked fine:
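(Again, the connection details are placeholders; only the driver name needs to change.)

    # Same connection, with the driver switched to the one listed in the
    # ODBC Data Source Administrator.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;"
        "DATABASE=mydb;"
        "Trusted_Connection=yes;"
    )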

I could not find out what caused the previous driver (the legacy “SQL Server” driver) to stop working.

Maximo training books from archive

This is a list of old training books created by MRO before it was acquired by IBM. I used these to learn Maximo when I started with it. While these courses were designed for Maximo 7, the core processes remain the same in the newer versions. You can use them to learn Maximo if you can’t find a recent course elsewhere.

Immersion Training: this book should be the starting point for an absolute beginner to learn the basic Maximo applications and processes. It covers the following topics:

  • Setting up Organisation & Site structure
  • Creating Person and User records
  • Setting up Inventory items
  • Creating Asset records
  • Setting up Job Plans and Preventive Maintenance, and generating PM Work Orders
  • Reactive Work Management process
  • Requisition and Reordering

System Administration Training: this book is for a beginner who wants to learn administration tasks and application configuration. It covers the following topics:

  • Users & Security Control
  • Report Administration
  • Database Configuration
  • Domains
  • Other configuration applications: Bulletin Board, Escalations, and Communication Templates

Below are some other books that focus on specific topics:

  • Functional:
    • Inventory Management: covers Inventory management topics such as Item Master, Storeroom, Inventory Transactions, Reordering, and Item Kits
    • Procurement: covers the purchasing processes including setting up Companies, Contracts, Purchase Requisitions, Purchase Orders, Receipts, and Invoices.
    • Work Management: covers the work management processes including setting up Job Plans and Preventive Maintenance plans, generating Work Orders from PM, and scheduling / dispatching / executing / completing Work Orders
  • Configuration:
    • Application Design: this book covers how to use the Application Designer to modify Maximo UI screens or create a new application
    • Workflow Management: how to use the Workflow Designer app to modify an existing workflow or create a new one in Maximo
    • Migration Manager: how to use Migration Manager to migrate configuration between Maximo environments

The matrix below shows the training tracks, which provide some guidance on the kind of knowledge you need to acquire for each user role.

Top 5 common integration requirements for Maximo

Maximo is a feature-rich and extensible Enterprise Asset Management (EAM) software. It has many modules that cover a wide range of business processes such as operations & maintenance, procurement, and inventory management. Yet, a typical business employs many other applications for processes that Maximo does not cover. Finance & Accounting software is the first to come to mind. Complex businesses can run an ERP or a Billing & CRM suite in tandem with EAM. Specialised applications are also common in asset-intensive industries.

This article discusses some of the most common integration requirements between Maximo and other enterprise applications. This is a wide topic, and it is impossible to cover everything in depth. Therefore, the intent is to cover the following key points:

  • Why is the integration required?
  • What are the common requirements?
  • Common implementation strategies and things to watch out for.

1. Integration between Maximo and DCS / SCADA / Data Historian

Why is it required?

It makes a lot of sense to feed meter readings to Maximo from a distributed control system (DCS) or similar data collection system. The information is used for generating preventive maintenance work orders. For instance, the odometer reading of a vehicle is used to generate work orders to service the engine and change the oil. Other measurements such as pressure or vibration can be used to set up condition monitoring and generate work orders when an asset operates outside of its normal working conditions.

What are the common requirements?

In general, the requirement for Maximo – SCADA integration is a one-way data feed from the source system to Maximo. In most cases, data synchronisation does not need to be high frequency. During the pre-sales stage of a greenfield project, the client often asks for a real-time interface, citing that alarms from critical equipment will need immediate attention. In many cases, this is made a mandatory requirement to eliminate competing bidders. In practice, however, it is rarely justified because urgent maintenance work is usually identified by control systems or other monitoring tools. Emergency work is typically dealt with outside of Maximo, and the details are only entered into Maximo after the fact for record keeping. As such, meter data fed into Maximo is useful for generating preventive or planned corrective maintenance only.

Integration between Maximo and OSIsoft PI

Things to watch out for

The common implementation strategy for this integration is to have the DCS regularly export data in CSV format at a low-frequency interval. A daily data export works fine in this case.

If realtime integration is required as part of the contractual agreement, the REST API or Web Services can be used. However, we need to ensure that the frequency and amount of data sent to Maximo are limited to a reasonable level, just enough for Maximo to generate work orders and produce reports. Indiscriminately sending a large volume of meter data to Maximo can slow down the system while adding little value. After all, Maximo is used to manage maintenance work, not to store every piece of meter data that can be collected; that is the job of a Data Historian. A daily feed can be as simple as a small script that parses the CSV export and posts each reading to Maximo, as sketched below.
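The sketch below assumes the readings are posted to Maximo’s OSLC REST API through a meter data object structure (MXMETERDATA here); the host name, credentials, and field names are all illustrative, so check what is actually exposed in your environment:

    import base64
    import csv

    import requests

    MAXIMO_URL = "https://maximo.example.com/maximo/oslc/os/mxmeterdata"  # placeholder
    AUTH = base64.b64encode(b"interface.user:password").decode()  # use a service account

    def post_reading(row):
        # Field names depend on the object structure definition in your system.
        payload = {
            "assetnum": row["asset"],
            "siteid": row["site"],
            "metername": row["meter"],
            "reading": row["reading"],
            "readingdate": row["timestamp"],
        }
        resp = requests.post(
            MAXIMO_URL,
            json=payload,
            headers={"MAXAUTH": AUTH},  # native-auth header; adjust for your auth setup
            timeout=30,
        )
        resp.raise_for_status()  # let the scheduler flag failed rows for a retry

    with open("dcs_daily_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            post_reading(row)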

2. Integration between Maximo and CRM

Why is it required?

The requirement to integrate Maximo with a CRM system is common in the Utilities sector. Utility companies provide basic amenities such as gas, water, electricity, and internet services. Billing and CRM is the core system of such businesses, while asset management software like Maximo is used to manage the distribution network infrastructure and track maintenance work.

At a high level, the end-to-end process works like this: when an end customer calls up to set up a new account or raise a new service repair request, it makes sense to keep track of all these details in one place: the CRM system. If new development or repair work is needed, the service order in CRM is sent to Maximo as a new service request. Subsequent work orders are created and assigned to the crew responsible for the geographical area where the work needs to be executed. Work progress and any related details recorded by the field crew in Maximo are sent back to CRM. This ensures all information is available in CRM, enabling the call-center staff to answer any query from the customer. The diagram below illustrates the high-level data flow.

Integration between Maximo and CRM

What are the common requirements?

The common requirement for this interface is simple from a business process perspective. Service requests are sent from CRM to Maximo; Work Order status and any additional details are sent back to CRM. However, the technical requirement for this integration is demanding: it is critical to synchronise the data in realtime or near realtime, and the interface must be highly fault tolerant. Because the Utilities sector is highly regulated, there are minimum SLA standards applicable to certain types of problems or service interruptions.

Things to watch out for

Although realtime, synchronous synchronisation sounds great for this integration, one must be careful that the performance of the CRM’s UI/UX is not affected by the poor performance of the network or external systems. To illustrate this point: in a system I once had the chance to work with, when creating a new service request, Maximo queried the ArcGIS geocoding service to convert a free-text address into a formatted address record linked with an asset or location. Due to the poor performance of the ArcGIS service, it took more than a minute to populate the SR with a formatted address. As a result, the call-center staff and the customer had to stay on the line waiting for this before they could continue filling in other details.

With near realtime, asynchronous integration, one must ensure that the running interval of the queues is set in seconds, not minutes. Needless to say, there should be no bottleneck throughout the end-to-end data transformation and delivery process.

To ensure the interface is fault tolerant, the integration logic should catch all possible fault scenarios, such as:

  • The CRM API is out of service, or times out
  • The Maximo MIF is out of service, or times out
  • CRM or Maximo rejects a request

Extensive alerts should be set up for all possible scenarios so that failed messages can be handled promptly. On that note, when a message cannot be delivered, the integration should provide an error message that is easy to understand and allow the system administrator to easily reprocess the message. It is unacceptable if a system administrator or support staff has to go to the developer for troubleshooting every time a failure occurs; such instances should be the exception, not the norm. A sketch of this fault handling is shown below.
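As a generic illustration (plain Python, not Maximo-specific), the delivery logic can separate retryable faults from hard rejections, and record an error message an administrator can act on without a developer:

    import requests

    class RetryableFault(Exception):
        """Target system unreachable or timed out -- safe to redeliver later."""

    class RejectedMessage(Exception):
        """Target system rejected the payload -- needs review before reprocessing."""

    def deliver(url, payload):
        try:
            resp = requests.post(url, json=payload, timeout=30)
        except (requests.ConnectionError, requests.Timeout) as exc:
            raise RetryableFault("%s unreachable or timed out: %s" % (url, exc))
        if resp.status_code >= 500:
            raise RetryableFault("%s returned %d; will retry" % (url, resp.status_code))
        if resp.status_code >= 400:
            # Include the response body so the administrator can see *why*.
            raise RejectedMessage("%s rejected message (%d): %s"
                                  % (url, resp.status_code, resp.text[:500]))
        return resp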

3. Integration between Maximo and Finance / Accounting software

Why is it required?

For someone new to Maximo, it might appear that it has full financial functionality, with the Chart of Accounts setup and finance-related fields seen everywhere. However, these are just ways for Maximo to reference financial transactions and have them ready for integration with external Financial & Accounting software.

Within Maximo, hundreds if not thousands of finance-related transactions can be created daily. Having an interface that sends these transactions to the accounting software will not only eliminate manual work but also improve the accuracy of the data and ensure the auditability of the books.

What are the common requirements?

The most common requirements for data synchronisation with accounting software are:

  • G/L chart of accounts
  • Any finance-related transactions, such as:
    • Inventory transactions (receipts, issues, adjustments)
    • Actual labour hours or services
  • Fixed asset register and/or data related to fixed asset depreciation

Integration between Maximo and Finance

Things to watch out for

With this integration, it is critical that the data is accurate and fully auditable. Any change to a dollar value should be backed by a transaction or a persistent record, and the numbers should add up, down to the last decimal. To support this, the integration should have full message tracking capability, and the messages should be kept for an extended period (a 30-day retention policy is probably too short for this integration). The integration should automatically retry failed messages and also allow resubmission of an error transaction.

Another common pattern with this integration is that a lot of issues come up around the end of the financial year. When that happens, due to time constraints, many issues are addressed with temporary fixes and quickly forgotten, only to resurface at the next financial year-end. Having a proper process to follow up and address any technical debt will help avoid recurring problems.

As an example, a company created a daily CSV export using an automation script in Maximo to send inventory reconciliation transactions to its accounting software. This mechanism worked well during the year, as it usually produced a few hundred transactions per day. However, on the last day of the financial year, the company reconciled more than ten thousand records. The export failed because it exceeded the default maximum number of records Maximo can process in one go (the default value of the mxe.db.fetchResultStopLimit setting is 5000). To quickly address the problem, the support team increased the limit to 15000, reran the export, then set it back to the previous value. The same issue happened again on the same date for the next two years, and each instance was an urgent issue that caused a bit of drama for the business. Only then was the export process updated with a permanent fix: querying the data in smaller chunks, as sketched below.
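A minimal sketch of that permanent fix as a Maximo automation script (Jython): walk the transactions in primary-key order and process them in fixed-size chunks so no single fetch approaches the limit. The object and attribute names are illustrative, and the date filter is omitted for brevity:

    from psdi.server import MXServer

    CHUNK = 1000  # stay well under mxe.db.fetchResultStopLimit
    server = MXServer.getMXServer()
    user_info = server.getSystemUserInfo()

    last_id = 0
    done = False
    while not done:
        trans_set = server.getMboSet("INVTRANS", user_info)
        try:
            # Key the query on the primary key so each pass reads a bounded window.
            trans_set.setWhere("invtransid > " + str(last_id))
            trans_set.setOrderBy("invtransid")
            count = 0
            trans = trans_set.moveFirst()
            while trans is not None and count < CHUNK:
                # ... write this transaction to the CSV export here ...
                last_id = trans.getLong("INVTRANSID")
                count += 1
                trans = trans_set.moveNext()
            done = count < CHUNK  # fewer rows than a full chunk means we are finished
        finally:
            trans_set.close()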

4. Integration between Maximo and ERP

Why is it required?

The core module of an ERP system is Finance & Accounting. Thus, the points discussed in the previous section also apply here. In addition to that, there are a few other common problems specific to the integration between Maximo and ERP.

Between EAM and ERP, there are a few overlapping functions, mainly in the Procurement and Inventory modules. Integrating the two systems provides a seamless end-to-end business process across the organisation.

What are the common requirements?

For an organisation that is in the process of setting up these systems, a few questions often come up:

  • Which system should be used to manage the following processes?
    • Inventory management
    • Procurement and Payment
  • What data needs to be synchronised, and in which direction?

There is no standard answer to these questions, and each company will have to make its own decision based on many factors specific to its business. Nonetheless, one important factor is the industry the company is in. For example, for a manufacturing company or a retailer, the ERP is the mission-critical system, while Maximo is only there to manage the assets which support the core business. In that case, the Procurement and Logistics departments handle far more than just the spare parts and services used by maintenance. As a result, it is more likely that these functions are managed in the ERP.

Integration with ERP when ERP manages Procurement and Inventory
An example when ERP manages Procurement and Inventory

On the other hand, when the core function of a company is operations and maintenance – such as a power plant, a freeway operator, or a utility provider – it is likely that only the Finance & Accounting core module of the ERP software is used, while the procurement and logistics processes are managed in Maximo, where they directly support the maintenance processes.

Integration with ERP when Maximo manages Procurement and Inventory
An example when Maximo manages Procurement and Inventory

Integration between Maximo and ERP often demands realtime or near realtime data synchronisation using web services. Similar to the integration with Finance discussed above, auditability and fault tolerance are two important requirements.

Things to watch out for

This integration can be complex, but it doesn’t always have to be that way. Different software has different data structures and business logic, and correctly mapping all processes and data fields requires a lot of discussion and analysis. Working on such a project can result in “analysis paralysis”, especially if it involves multiple parties. An SME who understands the big picture and pays attention to details will be a great asset.

During analysis and design, discussing the various scenarios and whether a certain design decision works with each scenario can be an endless conversation. Having a system to keep track of and follow up on these side questions will help the team focus on the main problem and avoid getting distracted by questions the team does not have an immediate answer to. A collaboration tool such as JIRA is great for managing these stories, questions, or issues: it can notify and follow up with people when they are tagged, and all discussion is kept in one place. If such a tool is not available, a shared Excel file on SharePoint works just fine.

5. Integration between Maximo and ArcGIS

Why is it required?

ArcGIS is big in the Utilities and Public Service sectors. For companies whose asset networks are spread across a large geographical area, with many assets underground, map software is crucial to operations and maintenance. For such companies, integration between Maximo and ArcGIS using the Maximo Spatial add-on is almost unavoidable.

What are the common requirements?

The requirement in this case is pretty much standardized across the industry. ArcGIS often acts as the master system for the asset register. Maximo Spatial polls ArcGIS and synchronises the list of assets and any engineering details. Asset status updates and technical specifications are also sent back to ArcGIS. This integration can also support the following processes:

  • Creation of work orders in GIS
  • Identification of assets or locations based on a provided address (geocoding)
  • Identification of the route to a work location, or of adjacent assets or work

Things to watch out for

The business requirement for the Maximo – ArcGIS integration is straightforward, and there isn’t much analysis and discussion involved. However, the technical implementation can be tedious: it can take weeks or months if the data mapping is configured manually. That said, there are ways to generate the configuration for Maximo Spatial from the metadata in ArcGIS.

Another thing to watch out for when setting up Maximo Spatial is the initial synchronisation of a large number of assets. ArcGIS might have certain limitations which can cause problems for the data synchronisation cron task in Maximo. A developer who is not familiar with ArcGIS can struggle when dealing with such issues.

How to use a custom endpoint for dynamic delivery?

The challenges

Enterprise application integration needs to be fast and fault-tolerant. As a result, when it comes to integration with Maximo, we almost always opt for asynchronous, near real-time message delivery. This method of delivery has certain challenges, which are not a big problem when using an external enterprise integration tool; there are dozens of these tools available on the market.

However, running an enterprise application integration system is expensive, so many smaller companies opt for direct integration. The Maximo Integration Framework offers an arsenal of tools to handle all kinds of requirements. Yet, the framework is still lacking in two major areas:

  • Data mapping
  • Dynamic delivery

Previously, I provided an example of how to use XSLT for data mapping. In this article, I will discuss how to address the dynamic delivery requirement.

Dynamic delivery with a custom automation script endpoint

When is dynamic delivery required?

In many cases, we need to build an integration interface with one or more of the following requirements:

  • A different URL for each Add/Update/Delete action: this is a common pattern for REST APIs nowadays. For example, the ServiceNow API has three different URL formats to Add, Update, and Cancel a Service Order.
  • The target application needs an Internal ID for the Update request: this is also a common pattern. The Maximo OSLC REST API itself is an example.
  • The need to retry and reprocess failed deliveries: message delivery is expected to fail occasionally due to many external factors, so this is a mandatory requirement for almost all integration interfaces.

In general, the logic to address the above requirements should be handled during delivery, not during the publishing and transformation of the integration message.

Why use this approach?

When the external system requires an Internal ID for the Update action, one common strategy is to store the ID in the EXTERNALREFID field. For example, when a new PO is created, it is published to an external ERP system. Upon successful delivery, the receiving system responds with the Internal ID of the newly created record, which we can store in the PO’s EXTERNALREFID field or a custom field. The next time the PO needs to be published, if this field has a value, we know the PO has already been created in the ERP and will send an Update request instead of a Create New request.

But this strategy often does not work well with asynchronous integration. For example, if the first request to create a new PO takes a while to process because the external system is busy, a subsequent update to the PO will send another Create New request, resulting in duplicate POs in the target system.

However, if we handle the logic to determine whether to send a Create New or an Update during delivery, this is not an issue. If the first request fails, the second request will still be identified as a Create New. If both messages fail and are later reprocessed successfully, the first one that reaches the ERP will be delivered as a Create New and the second as an Update.

Two key benefits of this approach are:

  • No need for manual intervention
  • Can use continuous outbound queue instead of sequential queue

Real-world Implementation

Over the last 2-3 years, I have built a few interfaces using this approach, and so far they have been working well. The latest one was an interface between Maximo and MYOB Greentree ERP. In this case, Maximo handles the procurement process and Greentree handles the payables process. The functionality of the Greentree API is somewhat limited: it does not support a Sync operation, and it requires an Internal ID for both the PO and the POLINE records. To address this, the interface was implemented with the following key points:

  • Data mapping is handled by XSLT
  • Delivery is handled by an automation script endpoint:
    • On delivery, the endpoint queries Greentree to identify the Internal ID and status of the PO and PO Lines. If the PO already exists, it determines whether to Update or Add New PO lines depending on the status of each line.
    • The original payload is manipulated during delivery to match the current status of the PO in Greentree. This means that if the delivery fails, the retry will come up with a new payload to match the status of the record in Greentree at that point in time.
  • An autoscript crontask is added to send error alerts to the users in Excel format, but only when there is at least one error. The email is delivered directly to the users because most of the errors come from missing or incorrect GL Accounts.

Automation Script example

To implement the requirement above, we can create a custom endpoint with an automation script. The endpoint can be associated with a publish channel and external system like any other standard endpoint, which means the interface is fully compatible with other MIF functionalities, including:

  • Message Queues
  • Message Tracking
  • Message Reprocessing

Below is an example piece of Python code for the endpoint:
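(A minimal sketch; the target URL is a placeholder.)

    # Script endpoint that replicates a plain HTTP POST. The outbound message
    # arrives in the implicit variable requestData. Any uncaught exception
    # marks the delivery as failed and sends the message to the error queue.
    from java.net import URL
    from java.io import OutputStreamWriter

    conn = URL("https://erp.example.com/api/po").openConnection()  # placeholder URL
    conn.setRequestMethod("POST")
    conn.setDoOutput(True)
    conn.setConnectTimeout(30000)
    conn.setReadTimeout(30000)
    conn.setRequestProperty("Content-Type", "application/xml")

    writer = OutputStreamWriter(conn.getOutputStream(), "UTF-8")
    writer.write(requestData)
    writer.close()

    status = conn.getResponseCode()
    if status < 200 or status > 299:
        # The exception text is what shows up against the failed message in
        # the Message Reprocessing application.
        raise Exception("Delivery failed with HTTP %d: %s"
                        % (status, conn.getResponseMessage()))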

The basic example above does a similar job to a standard HTTP endpoint:

  • The published message is passed to the script in the implicit variable requestData
  • Any failure/exception at any line of the code is treated as a failed delivery:
    • The message is put back into the error queue.
    • The error is captured and can be seen in the Message Reprocessing application.

However, by doing this, we have full control over what happens during delivery, such as:

  • Query the external system to determine if a record already exists and, if yes, get the Internal ID
  • Raise an exception to tell Maximo that the delivery has failed. This can be quite handy in certain scenarios, such as when an external system rejects a message but still responds with an HTTP 200 code and only provides an error message inside the XML response body.

Below is an example piece of code which does the following:

  • Query an external system to identify if a PO record exists
  • If the record exists, update the XML payload with the Internal ID before delivering it to the system as an Update request instead of a Create.
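A sketch of that logic, extending the endpoint above. The URLs, paths, and tag names are illustrative, and a real implementation should use a proper XML parser rather than regular expressions:

    import re
    from java.net import URL
    from java.io import OutputStreamWriter, BufferedReader, InputStreamReader

    def http_call(method, target, body=None):
        conn = URL(target).openConnection()
        conn.setRequestMethod(method)
        conn.setConnectTimeout(30000)
        conn.setReadTimeout(30000)
        conn.setRequestProperty("Content-Type", "application/xml")
        if body is not None:
            conn.setDoOutput(True)
            writer = OutputStreamWriter(conn.getOutputStream(), "UTF-8")
            writer.write(body)
            writer.close()
        status = conn.getResponseCode()
        text = ""
        if status < 400:
            reader = BufferedReader(InputStreamReader(conn.getInputStream(), "UTF-8"))
            line = reader.readLine()
            while line is not None:
                text += line
                line = reader.readLine()
            reader.close()
        return status, text

    payload = requestData
    ponum = re.search(r"<PONUM>(.*?)</PONUM>", payload).group(1)

    # Does the PO already exist in the target system?
    status, body = http_call("GET", "https://erp.example.com/api/po?ponum=" + ponum)
    match = re.search(r"<InternalId>(.*?)</InternalId>", body) if status == 200 else None

    if match:
        # Update: inject the Internal ID the target system expects.
        payload = payload.replace("<PO>", "<PO><InternalId>%s</InternalId>" % match.group(1), 1)
        status, body = http_call("PUT", "https://erp.example.com/api/po/" + match.group(1), payload)
    else:
        status, body = http_call("POST", "https://erp.example.com/api/po", payload)

    if status < 200 or status > 299:
        raise Exception("Delivery failed with HTTP %d: %s" % (status, body[:500]))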

How to remove HTML tags in Long Description field

Introduction

The Long Description field contains data in rich-text format. When sending this data to an external system, the special characters in the HTML tags often cause trouble for the integration interface. They can also result in unreadable text displayed in the receiving application. This post provides instructions on how to easily strip the rich-text format from Maximo’s Long Description field, keeping only the plain text.

Common problems with Long Description

The Long Description field is used by many key objects such as Service Request, Purchase Order, and Work Log. In Maximo, Service Request and the Work Log’s Details are perhaps the most common places where this field is used.

When raising a ticket or recording some logs, users often copy information from other sources such as email. This does not usually cause much of a problem on the Maximo front-end, because most of the formatting, when copied from standard applications like Word, Outlook, or a browser, is retained in this field.

However, when it comes to system integration, it is a different story. There are two common problems with the Long Description data due to the HTML tags of the rich-text format:

  • Many integration tools have trouble escaping the special characters contained in the field data. This often results in integration failure. Even if the tool can handle these special characters gracefully, the additional characters added can exceed the field length limit of the receiving application.
  • The external application may not support this format; in that case, the rich-text content is displayed as-is with a lot of HTML tags, making it unreadable to the users.

In many cases, retaining the format of the text is not desirable, and the customer might prefer to keep the data as plain text. In such cases, disabling the Rich Text Editor is a better solution. However, if we want to retain the formatting and only remove it when sending the data to an external system, the following section describes how to achieve that.

How to strip rich-text tags from Long Description?

Requirement

Below is an example from an integration interface between IBM Maximo and Gentrack Unify CRM at a water utility. When carrying out maintenance work, if there are delays, the field workers put the Work Order on Hold and enter the details of the delay as a Work Log entry. This information must be sent to Unify CRM so that, if the customer calls up, the call center staff can explain why the work has not been completed.

Set up a simple interface

To provide a simplified example, in a vanilla Maximo instance we will set up the following integration objects:

  • Object Structure: ZZWORKLOG – make sure to include the LONGDESCRIPTION field
  • Publish Channel: ZZWORKLOG – make sure to enable event listener and message tracking
  • External System: WEBHOOK – I added an HTTP endpoint pointing to webhook.site for testing

Sample text with rich-text format

To test the code, we will create new Work Log entries using the same text, with some formatting and hyperlinks, as follows:

XML Output without stripping

Without any customisation, Maximo publishes an XML message to Webhook as shown in the image below. As you can see, it contains a lot of HTML tags, whose special characters have in turn been escaped to be compatible with the XML format.

Add User Exit logic with Automation Script

For this requirement, we can use the utility class psdi.util.HTML from the Maximo library, which is available out of the box. To strip the rich-text tags from the Long Description field before sending data to an external system, we can create User Exit processing logic with an automation script as follows:

  • From the Automation Scripts application, choose Actions > Create Script for Integration
  • Choose the Publish Channel option, select the ZZWORKLOG publish channel
  • Choose User Exit option
  • Choose Before External Exit
  • Language: Python
  • Source Code:
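(A minimal version of the script; irData is the implicit variable that exposes the outbound record in integration scripts.)

    # Strip the rich-text tags from the Long Description before the message
    # leaves Maximo.
    from psdi.util import HTML

    longdesc = irData.getCurrentData("LONGDESCRIPTION")
    if longdesc:
        irData.setCurrentData("LONGDESCRIPTION", HTML.toPlainText(longdesc))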

XML Output after removing formatting

If we create a new Work Log entry using the same text above, the XML output will only contain plain text, as shown below. The text is much more readable now, and it still contains the hyperlinks, which can be an important piece of information.

Other Notes

There is another frequent problem when sending Long Description data to an external system: it often exceeds the field length limit of the receiving system. While you are at it, double-check the length limit of the receiving field. With Description and Long Description, it is a good idea to always truncate the text to fit the target field. In the case of Long Description, we might even want to split the long text into multiple records to avoid integration failures in the future.

If we want to strip the rich-text formatting when the text is first entered in Maximo, so that only the plain-text content is saved to the database, we can use the same toPlainText function in an automation script when saving the record, as sketched below.
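A sketch of that save-time variant, as an object launch point script on the WORKLOG object (the launch point setup and object name are assumptions to adapt to your case):

    # Convert the Long Description to plain text before the record is saved,
    # so only plain text ever reaches the database.
    from psdi.util import HTML

    longdesc = mbo.getString("LONGDESCRIPTION")
    if longdesc:
        mbo.setValue("LONGDESCRIPTION", HTML.toPlainText(longdesc))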

The performance effect of the Download button in Maximo

Introduction

If you have been working with Maximo for a while, you already know about the Download button at the top left of every table in Maximo. With one click, it exports everything displayed in the table into an Excel file, which is great for further data analysis and reporting. It is so simple and convenient, right? Not quite.

Performance Degradation

The danger with the Download button is that, because it is so convenient, everyone uses it for every data requirement. One frequent problem is that users keep asking us to add more columns to the List tab of key applications like Work Order Tracking. Most of the time, the Maximo administrator will comply with such requests in a blink (another problem of Maximo being too easy to customize). Often, many of those columns are retrieved via relationships. One additional column usually does not make much difference, but with a high number of records, and as the amount of user activity increases, it creates a snowball effect that degrades overall system performance, and people start complaining about Maximo being slow.

However, the real problem with the Download button is that, by default, there is no limit set for it. Right after Maximo is first implemented, it usually works great. After several years, the amount of data grows, people start using this method to download data for various reporting requirements, and the Download button begins to significantly affect system performance.

Measure the impact

For a client I recently worked with, many users frequently used the Download button to extract large amounts of data (e.g. all work orders in one year) to create their own reports in Excel. This put a tremendous amount of stress on the servers.

First, we must realize that the output of this “Download” function is actually an XML file (although the extension is XLS), and Maximo consumes a lot of processing power and memory to generate it. To fully understand how it affects the server, I did a small test by setting up a copy of the client’s database on a local VM. I opened the Work Order Tracking app and tried to download all active work orders (15k records). It took around 15 minutes to generate and download the file. Then I tried to download all work orders reported in the last year (both closed and active work orders). That took 50 minutes, and during this whole time the VM’s CPU and memory utilization was saturated.

CPU and Memory usage while Maximo is processing the download request
CPU and Memory usage after the processing completed

Crashing the server

To test the worst-case scenario, I opened the SR application, clicked on “All Records”, then clicked the Download button to download all 600 thousand records. Users can easily make this “mistake”, and once they have clicked the Download button there is no option for them to cancel the process.

At first, the process saturated CPU and memory utilization for more than an hour; after that, the session expired. However, in the background, the SQL Server process continued running, consuming 30-40% CPU for the entire day. I left it running for about 6-7 hours until I got fed up and had to restart SQL Server to kill the process. Theoretically, since I was the only user in the system, when the running process reached the maximum JVM heap size, the Garbage Collector would try to clean up the MBOs it had already used and free up some memory.

However, in the production environment, as happened to our client, when the server load is high the Garbage Collector sometimes cannot free memory quickly enough, resulting in an OutOfMemory error that crashes the server.

What can we do about this problem?

To reduce the impact of the Download function, we should set a limit on the number of records a user can download by setting a value for the webclient.maxdownloadrows property. There are already some IBM tech notes that talk about it.
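For example (the value below is illustrative; pick a limit that suits your environment):

    Go To > System Configuration > Platform Configuration > System Properties
      Property:     webclient.maxdownloadrows
      Global Value: 200
    Apply the change with Live Refresh (or restart, depending on your version).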

However, the next question is: once we have set a limit, what is the alternative method for users to download the data they need? I can think of a few options, such as building a simple BIRT report, which allows the users to choose XLS as the output format. We can also set up the Application Export function with a flat-file output format. But my favourite option is the “Create Report” function. By default, when we create and run a report in “Preview” mode, it exports exactly the same columns shown on the List tab of the application, and we can then “Export Data” from that report. The process takes a few clicks, but the processing time is usually less than one minute compared to 10 or 20 minutes. That’s a quick win. Also, once users get used to it, they can extract any data they like, which means less work for the Maximo administrator.
