Legacy COBOL applications can be challenging.
Any developer, whether fresh out of college or a gnarled senior principal who ‘worked with IBM mainframes in the 70s’, will think twice before changing code. The risk is always high, and answering questions about execution flow, potential impact, where the data is used, and copybook dependencies is time-consuming when you rely on mainstream searches and custom scripts.
But not for everyone.
Customers with strong code analysis tools, such as COBOL Analyzer, gather this information quickly and confidently. The benefits are huge: why spend time on research when it could be spent clearing your backlog?
But knowing how to extract maximum benefit from your tools could make all the difference. Check out our quick guide to the features that help hard-pressed devs deliver better COBOL applications faster.
When you ‘see’ the program control flow immediately, you can focus on the relevant logic. Redundant execution flows disappear, and you zoom into the code path (or paths) that leads to and from a specific paragraph or section. Navigation is synchronized, so moving backward or forward in the diagram also takes you to the relevant source line.
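As a rough illustration of what such a view is built from, here is a minimal sketch that extracts a PERFORM call graph from COBOL source. The paragraph names and the simplified parsing (left-margin paragraph headers, simple PERFORM targets) are assumptions for the example; a real analyzer parses the full COBOL grammar, including GO TO, THRU ranges, and fall-through.

```python
import re
from collections import defaultdict

# Hypothetical COBOL fragment: paragraph headers at the left margin,
# statements indented (a simplification of real reference-format source).
COBOL = """\
MAIN-LOGIC.
    PERFORM READ-INPUT
    PERFORM VALIDATE-RECORD
    PERFORM WRITE-OUTPUT.
READ-INPUT.
    PERFORM CHECK-EOF.
VALIDATE-RECORD.
    CONTINUE.
CHECK-EOF.
    CONTINUE.
WRITE-OUTPUT.
    CONTINUE.
"""

def perform_graph(source):
    """Map each paragraph to the paragraphs it PERFORMs, in order."""
    graph = defaultdict(list)
    current = None
    for line in source.splitlines():
        stripped = line.strip()
        if not line.startswith(" ") and stripped.endswith("."):
            current = stripped.rstrip(".")  # new paragraph header
        else:
            m = re.search(r"\bPERFORM\s+([A-Z0-9-]+)", stripped)
            if m and current:
                graph[current].append(m.group(1))
    return dict(graph)

print(perform_graph(COBOL))
# MAIN-LOGIC performs READ-INPUT, VALIDATE-RECORD and WRITE-OUTPUT;
# READ-INPUT performs CHECK-EOF.
```

From a graph like this, the tool can render only the paths that reach a chosen paragraph and hide everything else.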
This feature alone can save days of research. Changing a data item, table column, or data file record layout is a common task. While the change itself is small and simple, estimating the affected code is highly complex: it requires knowing which data fields in the database will be affected, which intermediate fields might contain the field or use it in calculations, and so on.
Change Analysis does this automatically: it assesses all usages recursively and creates an interactive report of all the affected code, with traces that explain how each piece is affected.
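What “assessing all usages recursively” amounts to can be sketched as a transitive closure over field-to-field flows. The field names and the FLOWS table below are invented for the example; a real analyzer harvests these relationships from MOVE, COMPUTE, REDEFINES, and SQL statements across all programs.

```python
from collections import deque

# Hypothetical field-to-field flows, as (source field, target field),
# harvested in practice from MOVE and COMPUTE statements.
FLOWS = [
    ("CUST-ID", "WS-CUST-ID"),
    ("WS-CUST-ID", "RPT-CUST-ID"),
    ("WS-CUST-ID", "DB-CUST-KEY"),
    ("ORDER-AMT", "WS-TOTAL"),
]

def impact_set(field, flows):
    """All fields transitively affected by changing `field` (BFS)."""
    affected, queue = set(), deque([field])
    while queue:
        f = queue.popleft()
        for src, dst in flows:
            if src == f and dst not in affected:
                affected.add(dst)
                queue.append(dst)
    return affected

print(sorted(impact_set("CUST-ID", FLOWS)))
# ['DB-CUST-KEY', 'RPT-CUST-ID', 'WS-CUST-ID']
```

Changing CUST-ID ripples through the working-storage copy into both the report field and the database key, which is exactly the kind of chain a manual review tends to miss.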
COBOL’s strongest suit is data processing – one of the reasons it is still kicking – and it creates highly complex inter-program and intra-program data flows. Understanding how data moves from an input field in the frontend to a committed database record, through every calculation and across different programs, can be labor-intensive.
Micro Focus COBOL code analysis includes a unique step, called ‘data flow analysis’, which makes this information available on request. You can view it, track it with an interactive tool, or generate a report with all the details. That information makes changing code, extracting and exposing business rules, and refactoring a lot easier.
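The kind of answer data flow analysis produces can be sketched as a path search over the same field-to-field flows: given a frontend input field and a database column, find the chain of moves that carries the value. All names and the FLOWS table are hypothetical; the real analysis also accounts for calculations, group moves, and calls between programs.

```python
# Hypothetical flows from a screen field down to a database column.
FLOWS = [
    ("SCREEN-AMOUNT", "WS-AMOUNT"),
    ("WS-AMOUNT", "WS-NET-AMOUNT"),
    ("WS-AMOUNT", "WS-AUDIT-AMOUNT"),
    ("WS-NET-AMOUNT", "DB-ORDER-AMOUNT"),
]

def trace(field, target, flows, path=None):
    """Find one chain of moves carrying `field` into `target` (DFS)."""
    path = path or [field]
    if field == target:
        return path
    for src, dst in flows:
        if src == field and dst not in path:
            found = trace(dst, target, flows, path + [dst])
            if found:
                return found
    return None  # no flow connects the two fields

print(trace("SCREEN-AMOUNT", "DB-ORDER-AMOUNT", FLOWS))
# ['SCREEN-AMOUNT', 'WS-AMOUNT', 'WS-NET-AMOUNT', 'DB-ORDER-AMOUNT']
```

Each hop in the returned chain corresponds to a trace line in the interactive report: which statement, in which program, moved the value along.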
Code analysis scans can improve quality, reduce defects, and optimize application performance. They can run as a pre-commit task or as part of your Continuous Integration system. Ensuring there are no coding-standards violations, that potential performance or security issues are highlighted, and that dead or unreachable code is detected early can make a huge difference to team velocity.
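One of those checks, dead-code detection, can be sketched in a few lines: flag paragraphs that are defined but never PERFORMed. The program fragment and the entry-point name are invented for the example, and the sketch deliberately ignores GO TO, THRU ranges, and COBOL’s fall-through semantics, all of which a real scan must handle.

```python
import re

# Hypothetical program: OLD-TAX-CALC is left over and never invoked.
PROGRAM = """\
MAIN-LOGIC.
    PERFORM PRINT-TOTALS
    STOP RUN.
PRINT-TOTALS.
    DISPLAY WS-TOTAL.
OLD-TAX-CALC.
    COMPUTE WS-TAX = WS-TOTAL * 0.08.
"""

def dead_paragraphs(source, entry="MAIN-LOGIC"):
    """Paragraphs defined but never PERFORMed (entry point excluded)."""
    defined = set(re.findall(r"(?m)^([A-Z0-9-]+)\.", source))
    performed = set(re.findall(r"\bPERFORM\s+([A-Z0-9-]+)", source))
    return defined - performed - {entry}

print(dead_paragraphs(PROGRAM))
# {'OLD-TAX-CALC'}
```

Run as a pre-commit hook or CI step, a check like this fails the build as soon as orphaned logic appears, instead of letting it accumulate for years.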
Many complex ideas are better conveyed with a single still image. With the right tool, devs can look at application-wide, cross-technology call maps, data flows, source dependencies, and more. Zoom in to a specific scenario, such as login or adding a new customer; filter out noise; and group objects by type and purpose. This is the most effective way to document and present your system’s architecture, and unlike traditional static documentation, such as files and papers, it is always up to date.
Which programs access this SQL column? Need to find all numeric data items of a specific length with names that contain ‘sum’, or discover all evaluations of a specific variable that are not dead code? Or locate all code that is similar to a given snippet?
Resolve all these queries and many more in seconds using advanced code searches that remove the need to write countless scripts and that filter out thousands of false positives.
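To make one of those queries concrete, here is a minimal sketch of the ‘numeric items of a specific length named sum’ search over a DATA DIVISION fragment. The copybook content is invented, and the PIC parsing is deliberately naive (plain `PIC 9(n)` only, no `V`, `S`, or `COMP` clauses), which is precisely why a query engine that understands the language beats ad-hoc scripts.

```python
import re

# Hypothetical working-storage fragment to search.
DATA_DIVISION = """\
01  WS-FIELDS.
    05  WS-SUM-AMOUNT     PIC 9(7)V99.
    05  WS-SUM-COUNT      PIC 9(5).
    05  WS-CUST-NAME      PIC X(30).
    05  WS-CHECK-SUM      PIC 9(5).
"""

def find_numeric_items(copybook, name_part="SUM", digits=5):
    """Plain numeric (PIC 9) items of a given length whose name contains name_part."""
    pattern = re.compile(r"\b([A-Z0-9-]+)\s+PIC\s+9\((\d+)\)\s*\.")
    return [name for name, size in pattern.findall(copybook)
            if name_part in name and int(size) == digits]

print(find_numeric_items(DATA_DIVISION))
# ['WS-SUM-COUNT', 'WS-CHECK-SUM']
```

Note what the naive regex silently skips: WS-SUM-AMOUNT is numeric too, but its `V99` clause defeats the pattern. A search built on real parse data has no such blind spots.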
A lack of application insight could be holding you and your development team back and putting your organization at a disadvantage. Take a look at this video introduction or get in touch to see how to regain your advantage.