
Ensuring Code Quality in Cloud Transformations: A Marine Industry Case Study


Introduction

At ProCogia, we prioritize quality and resilience in cloud transformations. For a marine industry client, we transitioned transformation code from local development to Azure DevOps while maintaining high code coverage. By integrating unit testing and automating validation through a structured build pipeline, we ensured the code remained reliable and adaptable across diverse data scenarios.

 

Challenge

For this client, the team developed the transformation code locally rather than in Databricks or Synapse notebooks. This approach made it essential to verify that the code was resilient and adaptable to diverse data scenarios, so unit tests were written during local development to validate its robustness. After the project moved to Azure DevOps, however, the challenge was to preserve that reliability in the new environment, ensuring the code continued to meet quality standards and handle varied data scenarios effectively.

 

Approach

To ensure the reliability and quality of the code, code coverage was employed as a key strategy. This involved setting up a build pipeline in Azure DevOps, defined by a YAML file authored locally. The YAML file outlined a series of steps, including specifying the Python version, running the unit test files developed locally, and generating test data within Azure to validate the code’s compatibility. This automated pipeline ensured that every time code was pushed from the local environment to the Azure Repository, the full suite of testing and validation steps ran automatically.

Below is a step-by-step guide to creating the YAML file that defines the build pipeline.

 

Steps for creating the YAML file

After ensuring that all prerequisites are in place, proceed with the following steps to construct the YAML file that Azure DevOps will use to run the build pipeline with code coverage integration.

 

Step 1: Define the Pipeline Trigger and Pool 

The pipeline is set to trigger whenever there is a commit to the main branch. Additionally, it specifies the use of an ubuntu-latest virtual machine image to run the pipeline. 

YAML Configuration:
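The exact snippet will vary by project; a minimal sketch using standard Azure Pipelines syntax looks like this:

    # Run the pipeline on every commit to the main branch
    trigger:
      - main

    # Use a Microsoft-hosted Ubuntu agent
    pool:
      vmImage: 'ubuntu-latest'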

 

Step 2: Set up a Python Version Strategy

The pipeline uses a matrix strategy to specify the Python version (3.12) for the build environment. This helps ensure consistency in testing with the desired Python version.

YAML Configuration:
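A sketch of the matrix block described above; the entry name Python312 is illustrative:

    # Single-entry matrix that pins the Python version for the build
    strategy:
      matrix:
        Python312:
          python.version: '3.12'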

 

Step 3: Configure Python Version for the Environment

This step sets up the specific Python version for the pipeline environment. The UsePythonVersion task ensures that the pipeline uses Python 3.12, as defined in the strategy.

YAML Configuration:
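A sketch of this step, using the built-in UsePythonVersion task to select the version defined in the matrix:

    steps:
      # Make Python 3.12 (from the matrix variable) the active interpreter
      - task: UsePythonVersion@0
        inputs:
          versionSpec: '$(python.version)'
        displayName: 'Use Python $(python.version)'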

 

Step 4: Install Dependencies

Dependencies required for the project are installed in this step. The script upgrades pip and then installs all packages listed in the requirements.txt file.

YAML Configuration:
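A sketch of the install step (the displayName is illustrative):

      # Upgrade pip, then install everything listed in requirements.txt
      - script: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
        displayName: 'Install dependencies'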

For example, one of the requirements is pytest, which the pipeline environment needs in order to run the test cases from the scripts.

 

Step 5: Generate Test Data

To create a consistent test environment, this step runs the generate_test_data.py script. This script prepares necessary data for running the tests.

YAML Configuration:
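A sketch of this step, assuming generate_test_data.py sits at the repository root:

      # Build the fixture data the unit tests rely on
      - script: |
          python generate_test_data.py
        displayName: 'Generate test data'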

 

Step 6: Run Unit Tests with Coverage for Each Test Suite

This step runs the unit tests for the project. The tests in each test file are executed, and a coverage report is generated in XML format. For each test file, pytest is run with the --cov option to track coverage of the main code file and produce the report.

YAML Configuration for Test Suite 1:
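A sketch of the step for the first suite; it assumes the pytest-cov plugin is installed via requirements.txt, since that plugin supplies the --cov options:

      # Run the first test suite, measure coverage of main_code.py,
      # and rename the XML report so it is not overwritten later
      - script: |
          pytest test_file_1.py --cov=main_code.py --cov-report=xml
          mv coverage.xml coverage_file_1.xml
        displayName: 'Run test_file_1.py with coverage'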

YAML Configuration for Test Suite 2:
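And an equivalent sketch for the second suite:

      # Same pattern for the second test suite
      - script: |
          pytest test_file_2.py --cov=main_code.py --cov-report=xml
          mv coverage.xml coverage_file_2.xml
        displayName: 'Run test_file_2.py with coverage'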

In this example:

  • Each test file (test_file_1.py, test_file_2.py) runs its corresponding unit tests.
  • The --cov=main_code.py flag ensures that code coverage is tracked for the main_code.py file.
  • The --cov-report=xml flag generates the coverage report in XML format.
  • The mv command renames each generated coverage.xml file to maintain unique file names.

 

Step 7: Publish Code Coverage Results

This step consolidates and publishes the code coverage results to the Azure DevOps dashboard. It uses the PublishCodeCoverageResults task to process all XML files (e.g., coverage_file_1.xml, coverage_file_2.xml) into a readable report.

YAML Configuration:
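A sketch of the publish step; the task version is an assumption (v2 accepts a wildcard pattern and auto-detects the Cobertura format that pytest-cov emits):

      # Merge and publish every renamed coverage report to the dashboard
      - task: PublishCodeCoverageResults@2
        inputs:
          summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage_file_*.xml'
        displayName: 'Publish code coverage results'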

Result

Once the report is generated, we can navigate to the Tests tab in the run results to gain valuable insights. This section provides a breakdown of how many tests have passed for each test file, along with a graphical representation of code coverage. The graph highlights the percentage of code covered by tests versus the portions left untested, helping us identify whether the uncovered sections are critical for testing or consist of generic, non-essential calls. This visualization ensures clarity on the extent of code coverage and provides a deeper understanding of the code’s resilience and completeness.

 

Conclusion

By implementing a structured approach that combines local testing with an automated Azure DevOps pipeline, we successfully ensured the reliability and consistency of the transformation code. The integration of unit testing, code coverage, and pipeline automation allowed us to validate the code across diverse scenarios while maintaining high-quality standards. This end-to-end process not only streamlined the testing workflow but also enhanced confidence in the code’s adaptability and robustness within the Azure environment.
