An Effective Package Management Strategy Will Streamline DevOps on Your Mainframe

Disclaimer: This blog is based on an article authored by Insoft-Infotel’s Colin Oakhill which appears in the May/June edition of IBM Systems Magazine, a publication of MSP TechMedia.

As the IT skills gap widens on the mainframe, the Fortune 500 enterprises that rely on it are in a constant struggle to accomplish more with fewer skilled mainframe professionals than their distributed counterparts have. While 93% of respondents in BMC’s 2019 Mainframe Survey believe in the long-term strength of Big Iron, a report from Forrester Research found that 23% of mainframe staff retired between 2013 and 2018, and that two-thirds of those vacancies have gone unfilled. To keep development on the mainframe moving without interruption, teams are increasingly turning to the DevOps methodology.

Research from DZone in 2017 indicated that 74% of global organizations had adopted a DevOps approach, but the size and complexity of development on the mainframe make implementation an even bigger challenge. In the Db2 environment, each module change that includes SQL also produces a new DBRM module, a new Db2 package version, entries in Explain-type tables, and a new level of the changed application module in a specific level or stage LOAD library.

Cleaning Up the Package Mess

While some packages are required, others simply clutter your database and negatively impact performance and availability. To determine which modules should be checked, it’s important to build cross references of all module CSECTs and DBRM or package names. There are a number of ways to locate LOAD and DBRM libraries, but the fastest is through the z/OS Catalog Search Interface (CSI).
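On z/OS, the CSI (program IGGCSI00) returns catalog entries matching a generic filter key; a utility would then separate LOAD libraries from DBRM libraries before scanning them. The sketch below illustrates only that filtering step in Python, using an in-memory list in place of real CSI output. The dataset names and the `.LOADLIB`/`.DBRMLIB` suffix convention are invented for illustration; real shops may use different naming standards.

```python
# Hypothetical sketch: splitting a catalog listing into candidate LOAD
# and DBRM libraries. On z/OS the Catalog Search Interface (IGGCSI00)
# would supply these names; a plain Python list stands in for its output.

def find_libraries(dataset_names, load_suffix=".LOADLIB", dbrm_suffix=".DBRMLIB"):
    """Split catalog entries into candidate LOAD and DBRM libraries
    based on a (site-specific) dataset naming convention."""
    loadlibs = [n for n in dataset_names if n.endswith(load_suffix)]
    dbrmlibs = [n for n in dataset_names if n.endswith(dbrm_suffix)]
    return loadlibs, dbrmlibs

# Invented catalog entries standing in for CSI results
catalog = [
    "PROD.APP1.LOADLIB",
    "PROD.APP1.DBRMLIB",
    "TEST.APP1.LOADLIB",
    "PROD.APP1.SOURCE",
]
loadlibs, dbrmlibs = find_libraries(catalog)
```

In practice the filtering key is passed to the CSI itself, which is far faster than reading library directories one by one; the point here is simply that library identification is a cheap, name-driven step that precedes the expensive module scan.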

Modules are stored in PDS or PDSE libraries and require rapid processing, particularly when PDSE libraries hold thousands or even tens of thousands of modules. Scanning modules for consistency tokens (CONTOKENs) unlocks three big benefits: application integrity, performance, and cost.
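Conceptually, the consistency check boils down to comparing the CONTOKEN embedded in each load module against the CONTOKEN of the package bound from the matching DBRM. The following is an illustrative sketch only, not a real z/OS utility: the module names and token values are invented, and a real scan would extract tokens from module CSECTs and the Db2 Catalog rather than from Python dictionaries.

```python
# Illustrative sketch (invented names and tokens): flagging modules
# whose load-module CONTOKEN disagrees with the CONTOKEN recorded for
# the corresponding bound Db2 package, or that have no package at all.

def check_contokens(load_tokens, package_tokens):
    """Return module names whose load-module CONTOKEN does not match
    the bound package's CONTOKEN (including modules never bound)."""
    mismatches = []
    for module, token in load_tokens.items():
        if package_tokens.get(module) != token:
            mismatches.append(module)
    return sorted(mismatches)

load_tokens = {"PAYROLL": "18F2A9C0", "INVOICE": "5B7D11E4", "REPORTX": "9A0C33F1"}
package_tokens = {"PAYROLL": "18F2A9C0", "INVOICE": "FFFFFFFF"}  # REPORTX never bound
print(check_contokens(load_tokens, package_tokens))  # -> ['INVOICE', 'REPORTX']
```

Any module this kind of check flags is a candidate for a rebind (or for cleanup, if the module itself is obsolete), before it can cause a -805 style failure in production.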

Application integrity is the most obvious benefit: an application won’t function properly in production unless its DBRMs and load modules match. If the application is mission-critical, the price of downtime adds up quickly. And while downtime is the worst-case scenario, large Explain-type tables full of unnecessary data also degrade performance even when everything is running. Reducing the size of Explain tables helps BIND steps complete with less CPU usage and at lower cost.

To develop an effective package management strategy, focus on these best practices:

  1. Perform consistency checks on all CONTOKENs, whether they originate from the Db2 Catalog, a DBRM or a LOADLIB module
  2. Identify relevant LOADLIBs and DBRMs
  3. Resolve specific LOADLIBs into modules with all BINDer timestamp information
  4. Resolve DBRM modules, keeping header options to compare and identify empty or invalid contents
  5. Construct cross-reference data sets with package names or Plan-DBRM names that match LOADLIB and DBRM counterparts
  6. Check for inconsistencies and set condition codes as required
  7. Use a clean-up utility to eliminate inconsistent or unnecessary entries in PLAN_TABLEs or Explain-type tables
  8. Regenerate DBRM members from the Db2 Catalog
  9. Determine whether to avoid or enforce PACKAGE BINDs
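To make the middle of the list concrete, here is a hypothetical sketch of steps 5 through 7: build a cross-reference of package names to their LOADLIB and DBRM CONTOKENs, flag inconsistencies, and identify Explain-table rows that belong to inconsistent or unknown packages and are therefore candidates for cleanup. All package names, tokens, and rows are invented for illustration; a real clean-up utility would read the Db2 Catalog and issue SQL DELETEs.

```python
# Hypothetical sketch of steps 5-7 above (all data invented):
# xref maps package name -> (loadlib_contoken, dbrm_contoken);
# explain_rows is a list of (package, bind_timestamp) tuples.

def find_stale_explain_rows(xref, explain_rows):
    """Flag packages whose LOADLIB and DBRM CONTOKENs disagree, then
    return those packages plus any Explain rows referencing a package
    that is inconsistent or absent from the cross-reference."""
    inconsistent = {pkg for pkg, (load_tok, dbrm_tok) in xref.items()
                    if load_tok != dbrm_tok}
    stale = [row for row in explain_rows
             if row[0] in inconsistent or row[0] not in xref]
    return inconsistent, stale

xref = {"PKGA": ("AAAA", "AAAA"), "PKGB": ("BBBB", "CCCC")}
explain_rows = [("PKGA", "2019-01-01"), ("PKGB", "2019-02-01"), ("PKGZ", "2018-05-05")]
bad, rows = find_stale_explain_rows(xref, explain_rows)
```

In this invented data, PKGA is consistent and its Explain row is kept; PKGB’s tokens disagree and PKGZ is unknown, so their rows are the ones a clean-up utility would remove, shrinking the Explain tables and speeding up subsequent BINDs.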

DevOps implementation on the mainframe isn’t easy, but it’s increasingly necessary for enterprises facing a stark mainframe developer shortage. By focusing on the above nine steps in your application development process, you’ll enjoy a cleaner database, improved performance, and reduced mainframe computing costs.
