If you’ve visited my blog before, you’ve likely seen several posts here on a similar topic, Liquibase. A few years ago we enhanced that open source library to make it easier to generate Liquibase changeSets for your Oracle Database objects, schemas, ORDS REST APIs, and APEX applications.
This allowed developers to ‘do their work’ – and ship that work to the Oracle Database in an automated, managed fashion using Liquibase.
Today I want to introduce the next evolution in CI/CD for the Oracle Database with SQLcl’s new project command.
But first, let’s discuss why we built this Project mechanism, and contrast it with the existing Liquibase commands.
Why the Project command was needed
While using the liquibase command provided a ton of benefits over putting together your own custom SQL scripts, there were a few things it didn’t do:
- fit neatly into your existing Git workflows
- implement easy-to-read (diffs!) SQL-based changeSets
- bundle up your changes into drop-dead-simple artifacts for deployments
Database changes are also computed/delivered differently
This is perhaps the biggest difference between liquibase update and project deploy – how the CREATE/ALTER scripts are generated and deployed.
SQLcl Liquibase command changeLogs
These are comprised of XML changeSets that SQLcl uses to describe the state of the database object. When an ‘update’ is issued, SQLcl looks to see what the current state is in the target database, and dynamically generates the DDL/PL/SQL necessary to get to that state.
That means the target system could be in a state where the object to be altered doesn’t even exist yet, and we’d create it for you.
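For illustration, a generic Liquibase XML changeSet looks something like this (the table and column names are made up; the changeSets SQLcl actually generates use Oracle-specific extensions and carry more detail):

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <!-- describes the desired state; Liquibase computes the DDL at update time -->
  <changeSet id="create-regions-table" author="dev">
    <createTable tableName="REGIONS">
      <column name="REGION_ID" type="NUMBER">
        <constraints primaryKey="true"/>
      </column>
      <column name="REGION_NAME" type="VARCHAR2(50)"/>
    </createTable>
  </changeSet>
</databaseChangeLog>
```

The key point is that the XML describes a desired end state, and the actual SQL is worked out against whatever the target database currently looks like.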
SQLcl projects command changeLogs
These are comprised of SQL changeSets that SQLcl executes as-is when deployed to a database. The SQL is computed not at deployment time on the target system, but when the local src code is used to stage the deployment scripts.
The target system – your UAT or Prod instances, for example – will be at an expected state (release), and will have these scripts applied, assuming the dependencies and conditions described in the changeLog are satisfied.
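By contrast, a projects-style changeSet is plain SQL, along the lines of Liquibase’s formatted-SQL convention (object names here are hypothetical):

```sql
-- liquibase formatted sql
-- changeset hr:regions-add-description
alter table regions add (
  description varchar2(400)
);
```

Because it’s just SQL, a reviewer can read the exact statements that will run, and a Git diff of the file shows precisely what changed between releases.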

Which way to go? Which method or command should our team pick?
If you’re already using our enhanced Liquibase command in SQLcl, and you’re happy with it – keep going!
If you’re about to start work on a new application, it’s our official recommendation that you go with a SQLcl project. In fact, it’s what we’re using here at Oracle for several critical features.
We will be maintaining both features going forward, and we’re using our enhanced Liquibase library under the covers for the Project command. Our clear path forward, and the technology receiving the most updates, will be the projects command.
Let’s do a brief walkthrough of what the Project command does, and how it works.
If you want a deep-dive demo of exactly how you can use this, you’re in luck!
I highly recommend the LiveLab. And it comes with a bonus: see how a React web app can be powered by ORDS REST APIs!
SQLcl Project resources
A reasonable way to approach the SQLcl Project command is to break it down into its sub-commands. Let’s do that now. Note: I’m not going to show how this would integrate into your Git workflow in this post.
For the complete, step-by-step demo, see this.
What is the project command, and what does it do?

init – sets up our local directories and files; it’s also where you specify the schema or schemas that will be used by our project.
The initial directory structure looks like this –
├── .dbtools
│ ├── filters
│ │ └── project.filters
│ ├── project.config.json
│ └── project.sqlformat.xml
├── dist
│ └── install.sql
└── src
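As a rough sketch, initializing a project from an SQLcl session looks something like this (the connection details, project name, and schema are illustrative):

```
-- run from the root of your Git working copy, connected in SQLcl
SQL> project init -name demo_app -schemas HR
```

Run it from inside your repository so the generated directories land alongside your other project files.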
export – this uses a database connection to take all of your database objects and PL/SQL packages, REST APIs, and APEX apps, and writes them to local files in your source (src) folder. We’ll organize these by object type.
This ‘src’ will always contain the ‘source’ of truth for your table, procedure, or REST API. It will be the complete representation of the object, defined as SQL or PL/SQL.
stage – this will populate the dist (distributable) directory tree with the SQL scripts used to deploy a ‘release’ or ‘version’ of our application schema. The ‘dist’ files are computed from the ‘src’ by comparing them to the accompanying files in your Git project’s active branch.
It’s at this point that it should become obvious why using Git to maintain your Oracle schema objects and PL/SQL packages is not optional. We’re NOT using the database to generate these distributable objects, but your own Git project files! Now, of course you could argue we’re getting these files from the database, but it is an important distinction.
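The export-then-stage flow looks roughly like this (a sketch; I’m assuming the default options for both sub-commands):

```
-- pull schema objects, ORDS modules, and APEX apps into ./src
SQL> project export

-- commit that source in Git, then compute the ./dist scripts
-- by comparing the working tree to the active branch
SQL> project stage
```

Committing between the two steps matters: stage diffs against your Git branch, not against the database.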
release – instead of working off of ‘master’ or ‘next,’ we’re going to have a release, such as 1.0.0, that will be used to deploy our project to the database (install), or say a 1.1.0 that’s used to upgrade an application from 1.0.0.
gen-artifact – this packages up the release dist files along with its install.sql script, and bundles it up in a Zip archive. This can then be used by the deploy sub-program to be applied to a database.

deploy – the simplest command; this takes an artifact (Zip), and deploys it to your database to be applied.
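Putting the last three sub-commands together, a release cycle might look like this (the version number, flag names, and artifact file name are illustrative):

```
-- cut a release, then package it as a Zip artifact
SQL> project release -version 1.0.0
SQL> project gen-artifact

-- connected to the target (UAT/Prod) database, apply the artifact
SQL> project deploy -file demo_app-1.0.0.zip
```

Because the artifact is just a Zip of pre-computed SQL scripts, the same file can be promoted unchanged from UAT to Prod.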
Summary
SQLcl’s Liquibase support has taken a major step forward with the Project command. This new workflow is a highly structured process allowing development teams to:
- maintain their Oracle Database schemas in Git
- automatically generate and manage their src and dist files
- easily review what’s changed by keeping everything in easy-to-read SQL scripts
- set up development and production pipelines to ensure consistent and reliable database schema installs and upgrades, backed by rigorous and automated testing via their CI/CD systems