2 changes: 1 addition & 1 deletion .github/workflows/lint.yaml
@@ -52,7 +52,7 @@ jobs:

- uses: golangci/golangci-lint-action@v3
with:
version: latest
version: v1.60.1
💡 Verification agent

🧩 Analysis chain

LGTM! Version pinning improves CI reproducibility.

Pinning golangci-lint-action's version input to a specific value (v1.60.1) instead of "latest" keeps builds reproducible and shields CI from breaking changes introduced by newer releases.

Let me verify that this version exists and is valid:


🏁 Script executed:

#!/bin/bash
# Description: Verify golangci-lint-action version v1.60.1 exists on GitHub
# Expected: Should return version information if valid

gh api repos/golangci/golangci-lint-action/releases --jq '.[] | select(.tag_name == "v1.60.1") | {tag_name, published_at, name}'

Length of output: 130


Let’s recheck the release tag using a direct GitHub API call:


🏁 Script executed:

#!/bin/bash
set -euo pipefail
# Check HTTP status code for the golangci-lint-action v1.60.1 release tag
status=$(curl -s -o /dev/null -w "%{http_code}" \
  https://api.github.com/repos/golangci/golangci-lint-action/releases/tags/v1.60.1)
echo "HTTP status code: $status"

Length of output: 208


Invalid golangci-lint-action version pin – v1.60.1 not found

The GitHub API returned HTTP 404 for releases/tags/v1.60.1, so CI will break trying to fetch a non-existent release. Please update the version on line 55 of .github/workflows/lint.yaml to a valid tag (see https://github.com/golangci/golangci-lint-action/releases):

• File: .github/workflows/lint.yaml
Line 55:

- version: v1.60.1
+ version: <VALID_TAG_HERE>  # e.g., v1.59.1 or the latest stable release
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
version: v1.60.1
version: <VALID_TAG_HERE> # e.g., v1.59.1 or the latest stable release
🤖 Prompt for AI Agents
In .github/workflows/lint.yaml at line 55, the golangci-lint-action version is
pinned to v1.60.1, which does not exist and causes CI failures. Update the
version on line 55 to a valid release tag by checking the official
golangci-lint-action releases page and replacing v1.60.1 with an existing
version to ensure the action runs correctly.
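Before re-pinning, it may be worth double-checking which repository the tag should exist in: the action's `version` input is generally understood to select the golangci-lint binary that the action installs, not a release of golangci-lint-action itself (treat this as an assumption to confirm against the action's documentation). A minimal sketch of that check, mirroring the script above but pointed at the golangci-lint repository:

#!/bin/bash
set -euo pipefail
# Sketch only: look up the v1.60.1 tag on the golangci-lint tool repository,
# assuming the action's `version` input refers to the linter binary version.
status=$(curl -s -o /dev/null -w "%{http_code}" \
  https://api.github.com/repos/golangci/golangci-lint/releases/tags/v1.60.1)
echo "golangci-lint v1.60.1 release lookup returned HTTP $status"

If that lookup returns 200, the pin is valid as written and only the verification target above was off.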

args: --timeout=5m

# Count issues reported by disabled linters. The command always
79 changes: 39 additions & 40 deletions .github/workflows/test.yaml
@@ -10,28 +10,28 @@ on:

jobs:
go-test:
runs-on: ubuntu-20.04
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
- uses: actions/checkout@v4
- uses: actions/setup-go@v4
with:
go-version: 1.x
go-version: 1.19
- run: make check
- run: make check-generate

- name: Ensure go.mod is tidy
run: go mod tidy && git diff --exit-code -- go.mod

kubernetes-api:
runs-on: ubuntu-20.04
runs-on: ubuntu-24.04
needs: [go-test]
strategy:
fail-fast: false
matrix:
kubernetes: ['default']
kubernetes: ['1.23.5']
steps:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
- uses: actions/checkout@v4
- uses: actions/setup-go@v4
with: { go-version: 1.x }
- run: go mod download
- run: ENVTEST_K8S_VERSION="${KUBERNETES#default}" make check-envtest
@@ -41,33 +41,32 @@ jobs:

# Upload coverage to GitHub
- run: gzip envtest.coverage
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: "kubernetes-api=${{ matrix.kubernetes }}"
path: envtest.coverage.gz
retention-days: 1

kubernetes-k3d:
if: "${{ github.repository == 'IvorySQL/ivory-operator' }}"
runs-on: ubuntu-20.04
runs-on: ubuntu-24.04
needs: [go-test]
strategy:
fail-fast: false
matrix:
kubernetes: [v1.28, v1.25]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
- uses: actions/checkout@v4
- uses: actions/setup-go@v4
with: { go-version: 1.x }

- name: Start k3s
uses: ./.github/actions/k3d
with:
k3s-channel: "${{ matrix.kubernetes }}"
prefetch-images: |
docker.io/ivorysql/pgbackrest:ubi8-2.47-2.0-1
docker.io/ivorysql/pgbouncer:ubi8-1.21-2.0-1
docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1
docker.io/ivorysql/pgbackrest:ubi8-2.54.1-4.0-1
docker.io/ivorysql/pgbouncer:ubi8-1.23.0-4.0-1
docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1

- run: make createnamespaces check-envtest-existing
env:
@@ -76,35 +75,35 @@

# Upload coverage to GitHub
- run: gzip envtest-existing.coverage
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: "kubernetes-k3d=${{ matrix.kubernetes }}"
path: envtest-existing.coverage.gz
retention-days: 1

kuttl-k3d:
runs-on: ubuntu-20.04
runs-on: ubuntu-24.04
needs: [go-test]
strategy:
fail-fast: false
matrix:
kubernetes: [v1.25, v1.24, v1.23, v1.22, v1.21]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
- uses: actions/checkout@v4
- uses: actions/setup-go@v4
with: { go-version: 1.x }

- name: Start k3s
uses: ./.github/actions/k3d
with:
k3s-channel: "${{ matrix.kubernetes }}"
prefetch-images: |
docker.io/ivorysql/pgadmin:ubi8-8.0-2.0-1
docker.io/ivorysql/pgbackrest:ubi8-2.47-2.0-1
docker.io/ivorysql/pgbouncer:ubi8-1.21-2.0-1
docker.io/ivorysql/postgres-exporter:ubi8-0.15.0-2.0-1
docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1
docker.io/ivorysql/postgis:ubi8-3.4-2.0-1
docker.io/ivorysql/pgadmin:ubi8-8.14.0-4.0-1
docker.io/ivorysql/pgbackrest:ubi8-2.54.1-4.0-1
docker.io/ivorysql/pgbouncer:ubi8-1.23.0-4.0-1
docker.io/ivorysql/postgres-exporter:ubi8-0.17.0-4.0-1
docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1
docker.io/ivorysql/postgis:ubi8-3.4-4.0-1
registry.developers.crunchydata.com/crunchydata/crunchy-upgrade:ubi8-5.3.1-0
- run: go mod download
- name: Build executable
@@ -119,13 +118,13 @@ jobs:
docker run --detach --network host --read-only \
--volume "$(pwd):/mnt" --workdir '/mnt' --env 'PATH=/mnt/bin' \
--env 'KUBECONFIG=hack/.kube/ivory-operator/ivyo' \
--env 'RELATED_IMAGE_PGADMIN=docker.io/ivorysql/pgadmin:ubi8-8.0-2.0-1' \
--env 'RELATED_IMAGE_PGBACKREST=docker.io/ivorysql/pgbackrest:ubi8-2.47-2.0-1' \
--env 'RELATED_IMAGE_PGBOUNCER=docker.io/ivorysql/pgbouncer:ubi8-1.21-2.0-1' \
--env 'RELATED_IMAGE_PGEXPORTER=docker.io/ivorysql/postgres-exporter:ubi8-0.15.0-2.0-1' \
--env 'RELATED_IMAGE_PGADMIN=docker.io/ivorysql/pgadmin:ubi8-8.14.0-4.0-1' \
--env 'RELATED_IMAGE_PGBACKREST=docker.io/ivorysql/pgbackrest:ubi8-2.54.1-4.0-1' \
--env 'RELATED_IMAGE_PGBOUNCER=docker.io/ivorysql/pgbouncer:ubi8-1.23.0-4.0-1' \
--env 'RELATED_IMAGE_PGEXPORTER=docker.io/ivorysql/postgres-exporter:ubi8-0.17.0-4.0-1' \
--env 'RELATED_IMAGE_PGUPGRADE=registry.developers.crunchydata.com/crunchydata/crunchy-upgrade:ubi8-5.3.1-0' \
--env 'RELATED_IMAGE_IVORY_16=docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1' \
--env 'RELATED_IMAGE_IVORY_16_GIS_3.4=docker.io/ivorysql/postgis:ubi8-3.4-2.0-1' \
--env 'RELATED_IMAGE_IVORY_17=docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1' \
--env 'RELATED_IMAGE_IVORY_17_GIS_3.4=docker.io/ivorysql/postgis:ubi8-3.4-4.0-1' \
--env 'IVYO_FEATURE_GATES=TablespaceVolumes=true' \
--name 'ivory-operator' ubuntu \
ivory-operator
@@ -136,11 +135,11 @@ jobs:

- run: make generate-kuttl
env:
KUTTL_PG_UPGRADE_FROM_VERSION: '16'
KUTTL_PG_UPGRADE_TO_VERSION: '16'
KUTTL_PG_VERSION: '16'
KUTTL_PG_UPGRADE_FROM_VERSION: '17'
KUTTL_PG_UPGRADE_TO_VERSION: '17'
KUTTL_PG_VERSION: '17'
KUTTL_POSTGIS_VERSION: '3.4'
KUTTL_IVORY_IMAGE: 'docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1'
KUTTL_IVORY_IMAGE: 'docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1'
- run: |
make check-kuttl && exit
failed=$?
@@ -158,10 +157,10 @@ jobs:
- kubernetes-api
- kubernetes-k3d
steps:
- uses: actions/checkout@v3
- uses: actions/setup-go@v3
- uses: actions/checkout@v4
- uses: actions/setup-go@v4
with: { go-version: 1.x }
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with: { path: download }

# Combine the coverage profiles by taking the mode line from any one file
@@ -185,7 +184,7 @@ jobs:

# Upload coverage to GitHub
- run: gzip total-coverage.html
- uses: actions/upload-artifact@v3
- uses: actions/upload-artifact@v4
with:
name: coverage-report
path: total-coverage.html.gz
@@ -10468,7 +10468,7 @@ spec:
postgresVersion:
description: The major version of PostgreSQL installed in the PostgreSQL
image
maximum: 16
maximum: 17
minimum: 10
type: integer
proxy:
4 changes: 2 additions & 2 deletions config/crd/bases/ivory-operator.ivorysql.org_ivyupgrades.yaml
@@ -857,7 +857,7 @@ spec:
type: object
fromIvoryVersion:
description: The major version of IvorySQL before the upgrade.
maximum: 16
maximum: 17
minimum: 10
type: integer
image:
@@ -936,7 +936,7 @@ spec:
type: string
toIvoryVersion:
description: The major version of IvorySQL to be upgraded to.
maximum: 16
maximum: 17
minimum: 10
type: integer
tolerations:
4 changes: 2 additions & 2 deletions config/default/kustomization.yaml
@@ -11,5 +11,5 @@ bases:

images:
- name: ivory-operator
newName: docker.io/balinor2017/ivory-operator
newTag: v0.4
newName: docker.io/ivorysql/ivory-operator
newTag: v4.0
10 changes: 5 additions & 5 deletions config/manager/manager.yaml
@@ -19,15 +19,15 @@ spec:
- name: IVORY_DEBUG
value: "true"
- name: RELATED_IMAGE_IVORY_16
value: "docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1"
value: "docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1"
- name: RELATED_IMAGE_PGADMIN
value: "docker.io/ivorysql/pgadmin:ubi8-8.0-2.0-1"
value: "docker.io/ivorysql/pgadmin:ubi8-8.14.0-4.0-1"
- name: RELATED_IMAGE_PGBACKREST
value: "docker.io/ivorysql/pgbackrest:ubi8-2.47-2.0-1"
value: "docker.io/ivorysql/pgbackrest:ubi8-2.54.1-4.0-1"
- name: RELATED_IMAGE_PGBOUNCER
value: "docker.io/ivorysql/pgbouncer:ubi8-1.21-2.0-1"
value: "docker.io/ivorysql/pgbouncer:ubi8-1.23.0-4.0-1"
- name: RELATED_IMAGE_PGEXPORTER
value: "docker.io/ivorysql/postgres-exporter:ubi8-0.15.0-2.0-1"
value: "docker.io/ivorysql/postgres-exporter:ubi8-0.17.0-4.0-1"
- name: RELATED_IMAGE_PGUPGRADE
value: "registry.developers.crunchydata.com/crunchydata/crunchy-upgrade:ubi8-5.3.1-0"
- name: RELATED_IMAGE_POSTGRES_16_GIS_3.4
4 changes: 2 additions & 2 deletions config/singlenamespace/kustomization.yaml
@@ -13,5 +13,5 @@ patches:

images:
- name: ivory-operator
newName: docker.io/balinor2017/ivory-operator
newTag: v0.4
newName: docker.io/ivorysql/ivory-operator
newTag: v4.0
@@ -10468,7 +10468,7 @@ spec:
postgresVersion:
description: The major version of PostgreSQL installed in the PostgreSQL
image
maximum: 16
maximum: 17
minimum: 10
type: integer
proxy:
2 changes: 1 addition & 1 deletion examples/kustomize/install/default/kustomization.yaml
@@ -13,7 +13,7 @@ bases:
images:
- name: ivory-operator
newName: docker.io/ivorysql/ivory-operator
newTag: v1.1
newTag: v4.0

patchesJson6902:
- target: { group: apps, version: v1, kind: Deployment, name: ivyo }
4 changes: 2 additions & 2 deletions examples/kustomize/install/manager/manager.yaml
@@ -27,9 +27,9 @@ spec:
- name: IVORY_DEBUG
value: "true"
- name: RELATED_IMAGE_IVORY_3
value: "docker.io/ivorysql/ivorysql:ubi8-3.0-2.0-1"
value: "docker.io/ivorysql/ivorysql:ubi8-4.5-4.0-1"
- name: RELATED_IMAGE_PGBACKREST
value: "docker.io/ivorysql/pgbackrest:ubi8-2.47-2.0-1"
value: "docker.io/ivorysql/pgbackrest:ubi8-2.54.1-4.0-1"

securityContext:
allowPrivilegeEscalation: false
2 changes: 1 addition & 1 deletion examples/kustomize/install/singlenamespace/kustomization.yaml
@@ -13,7 +13,7 @@ bases:
images:
- name: ivory-operator
newName: docker.io/ivorysql/ivory-operator
newTag: v1.1
newTag: v4.0

💡 Verification agent

🧩 Analysis chain

Operator image updated to v4.0 release.

The image tag now points to the IvorySQL Operator v4.0 release, in line with the PR's V4 release objectives.

Verify that the v4.0 image tag exists and is available:


🏁 Script executed:

#!/bin/bash
# Verify the availability of the v4.0 image tag
docker pull docker.io/ivorysql/ivory-operator:v4.0 2>/dev/null && echo "Image v4.0 exists" || echo "Image v4.0 not found"

Length of output: 103


IvorySQL Operator image tag v4.0 not found

The kustomization is referencing a nonexistent operator image tag. Please ensure that docker.io/ivorysql/ivory-operator:v4.0 has been published and is available before updating the manifest. If the v4.0 tag isn’t yet released, revert to the latest published tag or wait until the image is pushed.

• File: examples/kustomize/install/singlenamespace/kustomization.yaml (line 16)
• Current snippet:

  newTag: v4.0
🤖 Prompt for AI Agents
In examples/kustomize/install/singlenamespace/kustomization.yaml at line 16, the
newTag is set to v4.0, but the corresponding Docker image tag
docker.io/ivorysql/ivory-operator:v4.0 does not exist or is not yet published.
To fix this, verify the availability of the v4.0 image tag; if it is not
available, revert the newTag value to the latest published image tag or wait
until the v4.0 image is pushed before updating the manifest.
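As a cross-check that does not depend on docker pull (which can also fail from anonymous pull rate limits or missing registry credentials in CI), a registry lookup can be used instead. A rough sketch, assuming the repository is public and that Docker Hub's v2 tags endpoint is available:

#!/bin/bash
set -euo pipefail
# Sketch only: ask Docker Hub whether the v4.0 tag exists without pulling the image.
# HTTP 200 means the tag is published; 404 means it is not.
status=$(curl -s -o /dev/null -w "%{http_code}" \
  https://hub.docker.com/v2/repositories/ivorysql/ivory-operator/tags/v4.0)
echo "Docker Hub tag lookup returned HTTP $status"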


patchesJson6902:
- target: { group: apps, version: v1, kind: Deployment, name: ivyo }
12 changes: 11 additions & 1 deletion internal/controller/ivorycluster/apply_test.go
@@ -79,7 +79,17 @@ func TestServerSideApply(t *testing.T) {
after := constructor()
assert.NilError(t, cc.Patch(ctx, after, client.Apply, reconciler.Owner))
assert.Assert(t, after.GetResourceVersion() != "")
assert.Assert(t, after.GetResourceVersion() == before.GetResourceVersion())
switch {
case serverVersion.LessThan(version.MustParseGeneric("1.25.15")):
case serverVersion.AtLeast(version.MustParseGeneric("1.26")) && serverVersion.LessThan(version.MustParseGeneric("1.26.10")):
case serverVersion.AtLeast(version.MustParseGeneric("1.27")) && serverVersion.LessThan(version.MustParseGeneric("1.27.7")):

assert.Assert(t, after.GetResourceVersion() != before.GetResourceVersion(),
"expected https://issue.k8s.io/116861")

default:
assert.Assert(t, after.GetResourceVersion() == before.GetResourceVersion())
}

// Our apply method generates the correct apply-patch.
again := constructor()
5 changes: 3 additions & 2 deletions internal/controller/ivorycluster/instance.go
@@ -75,7 +75,7 @@ func (i Instance) IsPrimary() (primary bool, known bool) {
return false, false
}

return i.Pods[0].Labels[naming.LabelRole] == naming.RolePatroniLeader, true
return i.Pods[0].Labels[naming.LabelRole] == naming.RolePatroniLeader || i.Pods[0].Labels[naming.LabelRole] == naming.RolePrimary, true
}

// IsReady returns whether or not this instance is ready to receive IvorySQL
@@ -142,7 +142,8 @@ func (i Instance) IsWritable() (writable, known bool) {

// TODO(cbandy): Update this to consider when Patroni is paused.

return strings.HasPrefix(member[role:], `"role":"master"`), true
return strings.HasPrefix(member[role:], `"role":"master"`) ||
strings.HasPrefix(member[role:], `"role":"primary"`), true
}

// PodMatchesPodTemplate returns whether or not the Pod for this instance
2 changes: 0 additions & 2 deletions internal/controller/ivorycluster/ivory.go
@@ -434,7 +434,6 @@ func (r *Reconciler) reconcileIvoryUsersInIvorySQL(

// Find the IvorySQL instance that can execute SQL that writes system
// catalogs. When there is none, return early.

for _, instance := range instances.forCluster {
if terminating, known := instance.IsTerminating(); terminating || !known {
continue
@@ -469,7 +468,6 @@ func (r *Reconciler) reconcileIvoryUsersInIvorySQL(
write := func(ctx context.Context, exec ivory.Executor) error {
return ivory.WriteUsersInIvorySQL(ctx, exec, specUsers, verifiers)
}

revision, err := safeHash32(func(hasher io.Writer) error {
// Discard log messages about executing SQL.
return write(logging.NewContext(ctx, logging.Discard()), func(
2 changes: 1 addition & 1 deletion internal/controller/ivorycluster/patroni.go
@@ -106,7 +106,7 @@ func (r *Reconciler) handlePatroniRestarts(
return r.PodExec(pod.Namespace, pod.Name, container, stdin, stdout, stderr, command...)
})

return errors.WithStack(exec.RestartPendingMembers(ctx, "master", naming.PatroniScope(cluster)))
return errors.WithStack(exec.RestartPendingMembers(ctx, "primary", naming.PatroniScope(cluster)))

💡 Verification agent

🧩 Analysis chain

LGTM! Role terminology standardization to "primary".

The change from "master" to "primary" aligns with modern inclusive terminology standards. Ensure that all role-dependent components (Patroni, instance recognition logic, etc.) consistently use the "primary" designation.

Verify that role recognition is consistent across the codebase:


🏁 Script executed:

#!/bin/bash
# Description: Verify consistent role terminology usage across the codebase.

# Search for any remaining "master" role references that might need updating
echo "=== Checking for remaining 'master' role references ==="
rg -i 'role.*master|master.*role' --type go --type yaml

echo "=== Verifying 'primary' role usage ==="
rg -i 'role.*primary|primary.*role' --type go --type yaml

echo "=== Checking RestartPendingMembers function signature ==="
ast-grep --pattern 'func $_ RestartPendingMembers($$$) $$$'

Length of output: 8665


Ensure Complete Migration from “master” to “primary”

A repository-wide search shows numerous remaining references to the old “master” role in both code and tests. To avoid inconsistencies at runtime and in your CI tests, please update every occurrence to the new primary designation. Key areas to address:

• internal/naming/labels.go
– Deprecate or remove RolePatroniLeader = "master" and consolidate on RolePrimary = "primary".
• internal/controller/ivorycluster/instance.go
– Replace checks like Labels[naming.LabelRole] == "master" with == naming.RolePrimary.
• All *_test.go under internal/controller/ivorycluster/
– Change assertions and fixture annotations from "role":"master" or naming.LabelRole: "master" to use "primary" or naming.RolePrimary.
• internal/patroni/reconcile_test.go
– Update pod status annotation tests ({"role":"master"} → {"role":"primary"}).
• testing/kuttl/e2e/**/*.yaml
– Replace YAML selectors and asserts so they reference primary instead of master.

A search-and-replace helps here, but a blanket pass such as rg -l 'master' | xargs sed -i '' 's/master/primary/g' will also rewrite unrelated occurrences of "master" (branch names, URLs, third-party identifiers). A narrower pass scoped to the role string, followed by targeted adjustments to constants and tests, is safer; a sketch follows below.
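A minimal sketch of such a narrower pass, limited to the role string itself (GNU sed assumed; the exact file set, label keys, and constant handling should be confirmed against the repository before running anything):

#!/bin/bash
set -euo pipefail
# Sketch only: rewrite the Patroni role *string*, not every occurrence of the word "master".
# Constants such as naming.RolePatroniLeader still need a deliberate manual decision,
# and every hit should be reviewed before committing.
rg -l '"role":"master"' --type go --type yaml \
  | xargs --no-run-if-empty sed -i 's/"role":"master"/"role":"primary"/g'
rg -l 'role: master' --type yaml \
  | xargs --no-run-if-empty sed -i 's/role: master/role: primary/g'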

🤖 Prompt for AI Agents
In internal/controller/ivorycluster/patroni.go at line 109, the role terminology
has been updated from "master" to "primary" in the RestartPendingMembers call.
To ensure consistency and avoid runtime or test failures, perform a
repository-wide update replacing all remaining "master" role references with
"primary". This includes constants in internal/naming/labels.go, role checks in
internal/controller/ivorycluster/instance.go, test files under
internal/controller/ivorycluster/, internal/patroni/reconcile_test.go, and YAML
files in testing/kuttl/e2e/. Use a global search-and-replace followed by manual
verification and adjustments to constants and tests to complete the migration.

}

// When the primary does not need to restart but a replica does, restart all
Expand Down