Automating environment-specific verification queries ensures your database changes are validated reliably across development, staging, and production despite data differences. By combining Liquibase and Harness Database DevOps, teams can simplify developer workflows, catch issues early, and ensure only validated changes make it to production.
In database management, ensuring that data migrations and schema changes are verified consistently across different environments is a common challenge. Verification queries are an approach to writing unit tests for individual changes performed on a database. They are crucial for validating the success of database operations, but when environments like development, staging, and production hold different data sets, these queries must be flexible, especially when verifying DML changes (changes to data). This blog post will guide you through automating environment-specific verification queries using Liquibase and Harness Database DevOps, ensuring that your database changes are validated correctly in every environment.
In many organizations, the data in different environments can vary significantly. For instance, a staging environment might contain a subset of production data, or PII-sanitized test data, leading to differences in record counts or other metrics. When executing data migrations or schema changes, it's essential to ensure that verification queries account for these differences. A verification query that expects a specific record count in production might not be valid in staging due to these discrepancies.
To address this challenge, we can leverage the following components:
- Liquibase: A popular open-source tool for database change management that supports both schema and data changes.
- Harness Database DevOps: A platform that integrates with Liquibase to automate database deployments and includes features for governance, visibility, and verification.
- Stored Procedures: Encapsulate the logic for running verification queries and comparing expected vs. actual results.
A demo video walking through this solution is also available on our YouTube channel.
- Environment-Specific Properties: Define properties that vary by environment (e.g., expected record counts, environment-specific verification queries). The solution puts these in a CSV loaded alongside the change, so the only property passed to Liquibase is the environment name.
- Liquibase Changesets: Use Liquibase to define database changes and include verification queries that reference the environment-specific properties.
- Stored Procedures: A stored procedure executes the verification queries and compares the results.
Many companies have developers provide verification queries to DBAs, who then run them by hand. For an automated solution to be viable, it must ensure a good experience for the developers using it. This solution simplifies their experience by abstracting away the complexities of environment-specific validations, allowing developers to focus on writing code rather than worrying about the nuances of environment-specific testing.
Developers can write their database changes once, without creating separate changesets for different environments. Instead of maintaining multiple versions of SQL scripts for development, staging, and production, developers simply write their changes in a single file. Alongside the change, they provide a CSV file containing their verification queries, with two columns: one specifying the environment and the other the query to run.
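As a sketch, such a testcase CSV might look like the following. Each query must return two columns, Expected and Actuals, which the stored procedure described later compares (the `customers` table and the counts here are hypothetical):

```csv
environment,query
dev,"SELECT 100 AS Expected, (SELECT COUNT(*) FROM customers) AS Actuals"
staging,"SELECT 2500 AS Expected, (SELECT COUNT(*) FROM customers) AS Actuals"
prod,"SELECT 1000000 AS Expected, (SELECT COUNT(*) FROM customers) AS Actuals"
```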
For example, a developer can write a single SQL script that includes both the changes and the expected results for verification. Suppose the change has this folder structure:
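One plausible layout, inferred from the relative paths used in the changeset below (the changelog filename is assumed):

```
changelog.yml
sql/
├── MyJiraTicketNumber.sql
└── MyJiraTicketNumber-testcase.csv
```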
Then the developer’s change can be as simple as:
- changeSet:
    id: MyJiraTicketNumber
    author: satwell
    changes:
      - loadUpdateData:
          relativeToChangelogFile: true
          file: sql/MyJiraTicketNumber-testcase.csv
          tableName: testcases
          primaryKey: environment,query
      - sqlFile:
          relativeToChangelogFile: true
          path: sql/MyJiraTicketNumber.sql
      - sql:
          sql: EXECUTE RunTestCases @environment = N'${env}'
This change starts by updating the test case data, then runs the developer's change, and finally runs the tests for the environment the changelog is being deployed to. The only parts that differ from change to change are the changeset ID and the paths to the SQL and CSV files.
When changes are deployed, the pipeline executes the RunTestCases stored procedure, automatically passing the environment name into the environment parameter so the appropriate verification queries run. If the actual results match the expected results, the changes are committed; otherwise, the transaction is rolled back, ensuring that only validated changes are applied to the database.
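A verification query is any statement whose result set the procedure can check: it should return one or more rows with two columns, Expected and Actuals. A minimal sketch (the `orders` table and its `status` column are illustrative):

```sql
-- Passes only if every order received a status during the migration
SELECT
    (SELECT COUNT(*) FROM orders) AS Expected,
    (SELECT COUNT(*) FROM orders WHERE status IS NOT NULL) AS Actuals;
```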
By streamlining the process of writing and validating database changes, this solution empowers developers to deliver high-quality code faster while ensuring the integrity and consistency of the database across all environments.
This procedure assumes you have already set up a simple configuration of Harness Database DevOps. If you haven’t, you can follow our quickstart to create a starting point.
First, create a table to store your test cases. Here’s an example change you can add to your changelog to create this table:
- changeSet:
    id: testCaseTable
    author: harness
    changes:
      - createTable:
          tableName: testcases
          columns:
            - column:
                name: environment
                type: VARCHAR(50)
            - column:
                name: query
                type: VARCHAR(MAX)
Next, create the RunTestCases stored procedure. This procedure will take an environment parameter, fetch the relevant test cases, execute each query, and compare the results.
Again, here is a change you can add to your changelog to create this stored procedure.
- changeSet:
    id: setupEnvAwareProcedure
    author: satwell
    changes:
      - createProcedure:
          procedureName: RunTestCases
          replaceIfExists: false
          procedureText: |-
            CREATE PROCEDURE RunTestCases
                @environment NVARCHAR(100)
            AS
            BEGIN
                SET NOCOUNT ON;
                DECLARE @testQuery NVARCHAR(MAX);
                DECLARE @id INT;
                DECLARE @message NVARCHAR(4000);
                DECLARE @err NVARCHAR(4000);

                -- Table to store each test query's result
                DECLARE @result TABLE (
                    Expected NVARCHAR(MAX),
                    Actuals NVARCHAR(MAX)
                );

                -- Output startup message
                SET @message = 'Running tests for ' + @environment;
                PRINT @message;

                -- Cursor to iterate through test cases for the given environment
                DECLARE testcase_cursor CURSOR FOR
                    SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS id, query
                    FROM testcases
                    WHERE environment = @environment;

                OPEN testcase_cursor;
                FETCH NEXT FROM testcase_cursor INTO @id, @testQuery;

                WHILE @@FETCH_STATUS = 0
                BEGIN
                    -- Clear results left over from the previous iteration
                    DELETE FROM @result;

                    BEGIN TRY
                        -- Execute test query
                        INSERT INTO @result (Expected, Actuals)
                        EXEC sp_executesql @testQuery;

                        -- Check for mismatched rows
                        IF EXISTS (
                            SELECT 1
                            FROM @result
                            WHERE ISNULL(Expected, '') <> ISNULL(Actuals, '')
                        )
                        BEGIN
                            SET @message = 'Test case #' + CAST(@id AS NVARCHAR) + ' failed: Mismatched Expected vs Actuals values.';
                            THROW 51000, @message, 1;
                        END
                        ELSE
                        BEGIN
                            SET @message = 'Test case #' + CAST(@id AS NVARCHAR) + ' passed.';
                            PRINT @message;
                        END
                    END TRY
                    BEGIN CATCH
                        SET @err = ERROR_MESSAGE();
                        SET @message = 'Test case #' + CAST(@id AS NVARCHAR) + ' execution error: ' + @err;
                        THROW 51001, @message, 1;
                    END CATCH

                    FETCH NEXT FROM testcase_cursor INTO @id, @testQuery;
                END

                CLOSE testcase_cursor;
                DEALLOCATE testcase_cursor;
            END;
    rollback:
      - dropProcedure:
          procedureName: RunTestCases
In the Harness UI, open each database instance for your Database Schema. Under 'Substitute Properties (Optional)', add a property named 'env' whose value matches the environment name you use in your CSV testcase files.
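With that property in place, Liquibase substitutes ${env} at deployment time. For an instance whose property value is, say, staging (an illustrative name), the final statement of each change effectively becomes:

```sql
EXECUTE RunTestCases @environment = N'staging'
```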
Now your developers can define changes as described in the prior section. When your Harness Database DevOps pipeline deploys them, Harness applies the changes and then runs the verification queries for that environment. If the expected and actual values match, the deployment proceeds; if they don't, the deployment fails and the transaction is rolled back.
You can automate environment-specific verification queries by leveraging Liquibase and Harness Database DevOps, ensuring your database changes are validated consistently across all environments. This approach improves efficiency, reduces the risk of errors, and ensures compliance with your organization's standards.
Start implementing this solution today to streamline your database deployments and maintain the integrity of your data across environments. Let us know if you’d like to discuss how Harness Database DevOps can help you automate your environment-specific verification queries.