Although some processes had been updated with agile tooling and techniques, the database components were holding back deployments by a matter of weeks and the quality of the schema changes varied widely.
The customer needed a way of speeding up the delivery of DB2 schema changes, while improving the consistency and quality of the associated Data Definition Language (DDL) statements.
The first step was to examine the existing development processes and tooling. Working with the development teams, project management, and centralised mainframe support teams, we produced an overview of the current process, the challenges faced, and what an ideal scenario would look like. It was important to get the viewpoint of all the teams involved to ensure the solution could address as many issues as possible.
The existing DB2 for z/OS DDL standards were also reviewed to ensure they were up to date and suitable for use within the planned automation.
As some elements of DevOps had already been introduced, it would be preferable to integrate any new processes with the existing continuous delivery pipeline and reuse the existing toolset where possible. This ensured the solution utilised existing familiarity with the toolset, and minimised the cost and disruption involved with introducing new products. As the customer was already using SonarQube* for static code analysis as part of a pipeline controlled by UrbanCode Deploy and Jenkins, a solution using these tools would be ideal. There was just one issue – the database was DB2 for z/OS, and SonarQube did not have a native module for this system.
After working with the customer to evaluate possible solutions, it was decided that we would write a custom plugin to extend the SonarQube product to perform static code analysis on mainframe DB2 assets, including DDL and DML files. This would enable the use of their preferred toolset while providing the capability to automatically analyse code quality for their database artefacts according to rules they could customise at will. Triton worked with the customer teams responsible for development, operations and DevOps infrastructure to write a Java-based SonarQube plugin and associated UrbanCode Deploy hooks that provided analysis rules, reporting and delivery pipeline integration matched precisely to the customer's requirements. The SonarQube rules were based on the pre-existing DB2 DDL standards.
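To give a flavour of the kind of checks such a plugin performs, the sketch below applies two hypothetical DDL quality rules to a statement and returns any findings. The rule IDs, messages, and rules themselves are illustrative assumptions for this example; they are not the customer's actual standards, and this standalone class deliberately avoids the real SonarQube plugin API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

// Illustrative sketch only: two hypothetical DDL quality rules,
// not the customer's actual standards or the SonarQube plugin API.
public class DdlRuleChecker {

    // A finding produced when a rule is violated.
    public record Finding(String ruleId, String message) {}

    public static List<Finding> check(String ddl) {
        List<Finding> findings = new ArrayList<>();
        String upper = ddl.toUpperCase();

        // Hypothetical rule DDL001: every CREATE TABLE must place the
        // table in an explicit tablespace via an IN clause.
        if (upper.contains("CREATE TABLE") && !upper.contains(" IN ")) {
            findings.add(new Finding("DDL001",
                "CREATE TABLE should name an explicit tablespace with IN"));
        }

        // Hypothetical rule DDL002: table names should be uppercase.
        Pattern lowerCaseName = Pattern.compile("CREATE TABLE\\s+\\S*[a-z]");
        if (lowerCaseName.matcher(ddl).find()) {
            findings.add(new Finding("DDL002",
                "Table names should be uppercase"));
        }

        return findings;
    }
}
```

In a real plugin, each rule would be registered with SonarQube so that findings surface as issues in the dashboard and can gate the UrbanCode Deploy pipeline; the value of the approach is that the rule set lives in code the customer's own teams can extend.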
Throughout the process we were engaged with the customer teams who would use and support the new database static code analysis functionality, so they were already familiar with what the solution did, how it was achieved, and how it would be supported. We also presented at customer DevOps roadshows around the country, which ensured a high level of awareness throughout the organisation and resulted in several other divisions expressing interest in using the solution.
This close integration with the customer teams ensured we were ready to go with pilot users for the new tools and processes, and that the appropriate support teams were engaged. Working with these pilot users, the new plugin was successfully tested and integrated into the customer DevOps toolset, at which point it became available to development teams across the customer organisation.
Following the Triton engagement, the customer now has a custom static code analysis capability for mainframe DDL and DML assets that fits into their preferred DevOps CI/CD toolset and is fully documented, supported, and extensible by the customer's own support teams.
Implementing this capability has automated the code analysis formerly performed manually by the database administrators, reducing the time it takes to deploy mainframe assets from weeks to hours while freeing the DBAs to concentrate on higher-value tasks such as improving the performance and reliability of the database systems. Indeed, with this focus on adding value, and with the DevOps processes driving interaction between the database and development teams, the customer is experiencing the benefits of increased communication in addition to the original goals of the engagement. As the SonarQube analysis is an integral part of every deployment process, the quality rules are applied consistently and can be easily tailored to match evolving requirements.
With the faster turnaround to deploy mainframe assets, the mainframe database platform can now be considered an integral part of the DevOps processes being implemented by the customer. This is essential for the platform to continue to thrive and to play a central role in driving the customer's business forward.
As a result of this and several other successful mainframe DevOps initiatives for this customer, Triton continues to be engaged to produce innovative and effective solutions that provide a competitive advantage and enhance the return on investment from the customer's mainframe estate.
*SONARQUBE is a trademark belonging to SonarSource SA