#162: release automation and documentation update #163
Conversation
Pull request overview
This PR modernizes Atum’s release process by migrating from the legacy Sonatype OSSRH publishing setup to sbt-ci-release, aligning release automation with other AbsaOSS sbt-based repositories and updating related documentation/CI configuration.
Changes:
- Adopt `sbt-ci-release` and remove the legacy publish/signing configuration (including `version.sbt`-pinned versioning).
- Add a manual GitHub Actions Release workflow (`workflow_dispatch`) to run `sbt ci-release`.
- Update CI/tooling configuration (SBT version bump, add `versionScheme`, update README/release docs and checkout action pinning).
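A manual release workflow of this shape could be sketched as follows. This is not the PR's exact file: the checkout/setup-java versions, JDK choice, and job layout here are assumptions, though the four secret names are the ones documented by sbt-ci-release.

```yaml
# .github/workflows/release.yml -- illustrative sketch, not the PR's exact file
name: Release

on:
  workflow_dispatch:

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # sbt-ci-release derives the version from git tags
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: 8
      - run: sbt ci-release
        env:
          PGP_PASSPHRASE: ${{ secrets.PGP_PASSPHRASE }}
          PGP_SECRET: ${{ secrets.PGP_SECRET }}
          SONATYPE_USERNAME: ${{ secrets.SONATYPE_USERNAME }}
          SONATYPE_PASSWORD: ${{ secrets.SONATYPE_PASSWORD }}
```

With this in place, a release is triggered by pushing a `v`-prefixed tag and then dispatching the workflow, rather than by editing a version in the build files.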
Reviewed changes
Copilot reviewed 10 out of 11 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| `version.sbt` | Removes build-pinned snapshot version in favor of CI/tag-based versioning. |
| `build.sbt` | Adds `versionScheme` for semver metadata with sbt versioning. |
| `publish.sbt` | Removes old Sonatype publish target configuration; keeps project metadata. |
| `project/plugins.sbt` | Adds sbt-ci-release and removes explicit sbt-pgp. |
| `project/build.properties` | Updates sbt version. |
| `RELEASE.md` | Documents the new sbt-ci-release-based release procedure. |
| `README.md` | Updates Maven Central link target. |
| `.github/workflows/release.yml` | Adds manual release workflow running `sbt ci-release`. |
| `.github/workflows/build-sbt.yml` | Pins `actions/checkout` to a SHA and adjusts checkout settings. |
| `.github/workflows/jacoco_check.yml` | Pins `actions/checkout` to a SHA and adjusts checkout settings. |
| `.github/workflows/license_check.yml` | Pins `actions/checkout` to a SHA and adjusts checkout settings. |
Comments suppressed due to low confidence (1)
`publish.sbt:47`

`pomIncludeRepository := (_ => false)` was removed. Without explicitly disabling repository entries in the generated POM, published artifacts can end up embedding resolver/repository information, which is typically undesired for Maven Central publishing. If `sbt-ci-release` (or another setting) doesn't already enforce this, please add it back at the build level.
```scala
ThisBuild / homepage := Some(url("https://github.com/AbsaOSS/atum"))
ThisBuild / description := "Dynamic data completeness and accuracy at enterprise scale in Apache Spark"
ThisBuild / organizationName := "ABSA Group Limited"
ThisBuild / startYear := Some(2018)
ThisBuild / licenses += "Apache-2.0" -> url("https://www.apache.org/licenses/LICENSE-2.0.txt")
```
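If the setting flagged in the review does need to be restored, it is a one-liner at the build level. This is a sketch only; sbt-ci-release's bundled sbt-sonatype plugin may already apply an equivalent default:

```scala
// publish.sbt -- keep <repositories> entries out of the generated POM,
// as Maven Central publishing expects
ThisBuild / pomIncludeRepository := { _ => false }
```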
….g. "1.2.3+9-0c9da44f-SNAPSHOT"
- JaCoCo code coverage report - scala:2.11.12
- JaCoCo code coverage report - scala:2.12.15
- JaCoCo code coverage report - scala:2.13.14
…SparkTestBase Details: setting `spark.sql.queryExecutionListeners.async=false` to prevent a race condition in Spark 3.2+, where the `_INFO` file written by the `QueryExecutionListener` may not yet exist by the time the test reads it, because the listener fires on a background thread by default.
```diff
       .replaceAll("""(?<="processStartTime"\s?:\s?")([-+: \d]+)""", testingDateTime1)
       .replaceAll("""(?<="processEndTime"\s?:\s?")([-+: \d]+)""", testingDateTime2)
-      .replaceAll("""(?<="version"\s?:\s?")([-\d\.A-z]+)""", testingVersion)
+      .replaceAll("""(?<="version"\s?:\s?")([-+\d\.A-z]+)""", testingVersion)
```
note - I removed the version from the SBT files because it will be detected from the git tag, and the tests are now more exact than before
```diff
       .config("spark.ui.enabled", "false")
       .config("spark.testing.memory", 1024*1024*1024) // otherwise may fail based on local machine settings
+      // Spark 3.2+ fires listeners async by default; sync needed for _INFO file tests
+      .config("spark.sql.queryExecutionListeners.async", value = false)
```
Potential race condition in `HdfsInfoIntegrationSuite.scala` for Spark 3.2+
Closes #162
Adopting https://github.com/sbt/sbt-ci-release so that the releases stay functional and this process is consistent with our other AbsaOSS projects, such as https://github.com/AbsaOSS/atum-service/ or https://github.com/AbsaOSS/spark-data-standardization