
Return an actual newest version of the directory latest is not available#576

Merged
knative-prow-robot merged 5 commits into
knative:main from
houshengbo:fix-key-latest-version
May 17, 2021

Conversation

@houshengbo
Contributor

@houshengbo houshengbo commented May 10, 2021

Fixes #556

Proposed Changes

  • The field spec.version can be set to latest. If the directory latest is not available, return the newest actual version instead.
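The proposed fallback can be sketched as follows. This is a minimal, stdlib-only illustration of the selection rule described above, not the operator's actual implementation; the function and variable names are hypothetical, and in the real code the directory names come from the operator's kodata releases:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// lessVersion compares dotted version strings ("0.19.0" < "0.22.0")
// component by component, numerically.
func lessVersion(a, b string) bool {
	as, bs := strings.Split(a, "."), strings.Split(b, ".")
	for i := 0; i < len(as) && i < len(bs); i++ {
		ai, _ := strconv.Atoi(as[i])
		bi, _ := strconv.Atoi(bs[i])
		if ai != bi {
			return ai < bi
		}
	}
	return len(as) < len(bs)
}

// targetVersion prefers the "latest" directory when it exists and
// otherwise falls back to the newest actual version directory.
func targetVersion(dirs []string) string {
	newest := ""
	for _, d := range dirs {
		if d == "latest" {
			return d
		}
		if newest == "" || lessVersion(newest, d) {
			newest = d
		}
	}
	return newest
}

func main() {
	fmt.Println(targetVersion([]string{"0.19.0", "0.22.0", "0.21.0"})) // 0.22.0
	fmt.Println(targetVersion([]string{"0.19.0", "latest"}))           // latest
}
```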

@google-cla google-cla Bot added the cla: yes Indicates the PR's author has signed the CLA. label May 10, 2021
@knative-prow-robot knative-prow-robot added the size/L Denotes a PR that changes 100-499 lines, ignoring generated files. label May 10, 2021
Contributor

@knative-prow-robot knative-prow-robot left a comment


@houshengbo: 0 warnings.

Details

In response to this:

Fixes #556

Proposed Changes

Release Note


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.

@houshengbo
Contributor Author

/assign @nak3

@houshengbo houshengbo changed the title Return an actual neest version of the directory latest is not available Return an actual newest version of the directory latest is not available May 10, 2021
Contributor

@nak3 nak3 left a comment


It seems that the upgrade test is still panicking.
Is it possible to fix the upgrade test with this change?
Please refer to k8s.log.txt

default/knative-operator-578d98d87b-7tm8x[knative-operator]: panic: runtime error: slice bounds out of range [1:0]
...
default/knative-operator-578d98d87b-7tm8x[knative-operator]:    knative.dev/operator/pkg/reconciler/knativeserving/ingress/ingress.go:95 +0x419
...
default/knative-operator-578d98d87b-7tm8x[knative-operator]:    knative.dev/operator/pkg/reconciler/knativeserving/ingress/ingress.go:107 +0x57
default/knative-operator-578d98d87b-7tm8x[knative-operator]: knative.dev/operator/pkg/reconciler/common.Stages.Execute(0xc0004a18a8, 0x9, 0x9, 0x201e860, 0xc0009804e0, 0xc001860100, 0x204aa80, 0xc00153b400, 0x0, 0x0)
default/knative-operator-578d98d87b-7tm8x[knative-operator]:    knative.dev/operator/pkg/reconciler/common/stages.go:37 +0x83
default/knative-operator-578d98d87b-7tm8x[knative-operator]: knative.dev/operator/pkg/reconciler/knativeserving.(*Reconciler).ReconcileKind(0xc0000f8000, 0x201e860, 0xc0009804e0, 0xc00153b400, 0x0, 0x0)
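The `slice bounds out of range [1:0]` in the trace above is the failure mode of slicing an empty string: `semver.MajorMinor` returns `""` for a non-semver input such as `latest`, and `""[1:]` then panics with exactly that message. A minimal, stdlib-only sketch reproducing it (the helper name is hypothetical; the real call site is in ingress.go):

```go
package main

import "fmt"

// stripLeadingV mimics the failing line: semver.MajorMinor returns ""
// for a non-semver string such as "latest", and slicing "" with [1:]
// panics with "slice bounds out of range [1:0]", as in the log above.
func stripLeadingV(majorMinor string) (out string, err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("%v", r)
		}
	}()
	return majorMinor[1:], nil // panics when majorMinor == ""
}

func main() {
	_, err := stripLeadingV("") // what MajorMinor yields for "latest"
	fmt.Println(err)            // runtime error: slice bounds out of range [1:0]
}
```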

@houshengbo
Contributor Author

@nak3 I should have added a test on getIngress.
Weird, how could the test show a green mark?

@nak3
Contributor

nak3 commented May 11, 2021

Weird, how could the test show a green mark?

I guess that something else needs to be fixed in the upgrade test aside from this fix.

@houshengbo houshengbo force-pushed the fix-key-latest-version branch from 54c46f4 to d1a4fae Compare May 11, 2021 17:31
@knative-prow-robot knative-prow-robot added size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files. and removed size/L Denotes a PR that changes 100-499 lines, ignoring generated files. labels May 11, 2021
@houshengbo houshengbo force-pushed the fix-key-latest-version branch from d8f925b to b69f39f Compare May 11, 2021 17:41
@houshengbo
Contributor Author

@nak3 I fixed the getIngress function,
and added more test coverage as well.

@nak3
Contributor

nak3 commented May 12, 2021

Thank you @houshengbo. Could you please tell me how to use spec.version locally? Is it different from the steps in #556 (comment)?

@houshengbo
Contributor Author

@nak3 There should be no difference.
You can set spec.version to latest, or to any other version, like 0.22, 0.22.0, etc.
If the semantic version is not available, the operator will log the error in the CR.

@nak3
Contributor

nak3 commented May 12, 2021

Hmm... In that case, I think this fix still does not work. I tested it, but the release label is v0.22.0:

$ kubectl get deploy --show-labels 
NAME                               READY   UP-TO-DATE   AVAILABLE   AGE   LABELS
activator                          1/1     1            1           12h   serving.knative.dev/release=v0.22.0
autoscaler                         1/1     1            1           12h   serving.knative.dev/release=v0.22.0
  ...

$ kubectl get knativeserving ks -o template="{{.spec.version}}" 
latest

The upgrade e2e test did not catch this issue either, so I think the upgrade test should be improved.

@houshengbo
Contributor Author

houshengbo commented May 13, 2021

@nak3 That is what we expected.
If latest is not available, the operator returns 0.22 instead in your case:
spec.version is latest, but status.version is 0.22.0.

@houshengbo
Contributor Author

houshengbo commented May 13, 2021

@nak3 Here is what I did:
I created a dir called latest and copied all the manifests under 0.19.0 into it.
Then I ran the command ko apply -f config.
Everything is launched.

Apply the CR:

apiVersion: v1
kind: Namespace
metadata:
  name: knative-serving
---
apiVersion: operator.knative.dev/v1alpha1
kind: KnativeServing
metadata:
  name: knative-serving
  namespace: knative-serving
spec:
  version: latest

When the CR is ready, check the labels: k get deploy -n knative-serving --show-labels

NAME               READY   UP-TO-DATE   AVAILABLE   AGE     LABELS
activator          1/1     1            1           6m      serving.knative.dev/release=v0.19.0
autoscaler         1/1     1            1           6m      serving.knative.dev/release=v0.19.0
autoscaler-hpa     1/1     1            1           5m57s   autoscaling.knative.dev/autoscaler-provider=hpa,serving.knative.dev/release=v0.19.0
controller         1/1     1            1           5m59s   serving.knative.dev/release=v0.19.0
istio-webhook      1/1     1            1           5m54s   networking.knative.dev/ingress-provider=istio,serving.knative.dev/release=v0.19.0
networking-istio   1/1     1            1           5m55s   networking.knative.dev/ingress-provider=istio,serving.knative.dev/release=v0.19.0
webhook            1/1     1            1           5m58s   serving.knative.dev/release=v0.19.0

The label is 0.19.0, which means the label matches what I put in the manifests. OK!

Check the CM of the net-istio: k get cm config-istio -n knative-serving --show-labels

NAME           DATA   AGE     LABELS
config-istio   1      8m57s   networking.knative.dev/ingress-provider=istio,serving.knative.dev/release=v0.19.0

Check the CR:

apiVersion: operator.knative.dev/v1alpha1
kind: KnativeServing
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"operator.knative.dev/v1alpha1","kind":"KnativeServing","metadata":{"annotations":{},"name":"knative-serving","namespace":"knative-serving"},"spec":{"version":"latest"}}
  creationTimestamp: "2021-05-13T12:49:09Z"
  finalizers:
  - knativeservings.operator.knative.dev
  generation: 1
  managedFields:
  - apiVersion: operator.knative.dev/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:annotations:
          .: {}
          f:kubectl.kubernetes.io/last-applied-configuration: {}
      f:spec:
        .: {}
        f:version: {}
    manager: kubectl-client-side-apply
    operation: Update
    time: "2021-05-13T12:49:09Z"
  - apiVersion: operator.knative.dev/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:finalizers:
          .: {}
          v:"knativeservings.operator.knative.dev": {}
      f:spec:
        f:cluster-local-gateway: {}
        f:controller-custom-certs:
          .: {}
          f:name: {}
          f:type: {}
        f:knative-ingress-gateway: {}
        f:registry: {}
      f:status:
        .: {}
        f:conditions: {}
        f:manifests: {}
        f:observedGeneration: {}
        f:version: {}
    manager: operator
    operation: Update
    time: "2021-05-13T12:49:21Z"
  name: knative-serving
  namespace: knative-serving
  resourceVersion: "1489581"
  selfLink: /apis/operator.knative.dev/v1alpha1/namespaces/knative-serving/knativeservings/knative-serving
  uid: 5c424b2d-75a0-40ae-977e-59a86f43bd98
spec:
  version: latest
status:
  conditions:
  - lastTransitionTime: "2021-05-13T12:49:21Z"
    status: "True"
    type: DependenciesInstalled
  - lastTransitionTime: "2021-05-13T12:51:41Z"
    status: "True"
    type: DeploymentsAvailable
  - lastTransitionTime: "2021-05-13T12:49:21Z"
    status: "True"
    type: InstallSucceeded
  - lastTransitionTime: "2021-05-13T12:51:41Z"
    status: "True"
    type: Ready
  - lastTransitionTime: "2021-05-13T12:49:09Z"
    status: "True"
    type: VersionMigrationEligible
  manifests:
  - /var/run/ko/knative-serving/latest
  observedGeneration: 1
  version: latest

spec.version is latest and status.version is latest as well. Everything seems ok.

@houshengbo
Contributor Author

Then I removed the dir latest and cleaned the cluster.
I rebuilt the image with ko apply -f config.
Everything comes up.

Apply the CR:

apiVersion: v1
kind: Namespace
metadata:
  name: knative-serving
---
apiVersion: operator.knative.dev/v1alpha1
kind: KnativeServing
metadata:
  name: knative-serving
  namespace: knative-serving
spec:
  version: latest

When the CR is ready, check the labels: k get deploy -n knative-serving --show-labels

NAME               READY   UP-TO-DATE   AVAILABLE   AGE   LABELS
activator          1/1     1            1           41s   serving.knative.dev/release=v0.22.0
autoscaler         1/1     1            1           40s   serving.knative.dev/release=v0.22.0
autoscaler-hpa     1/1     1            1           38s   autoscaling.knative.dev/autoscaler-provider=hpa,serving.knative.dev/release=v0.22.0
controller         1/1     1            1           40s   serving.knative.dev/release=v0.22.0
istio-webhook      1/1     1            1           35s   networking.knative.dev/ingress-provider=istio,serving.knative.dev/release=v0.22.1
networking-istio   1/1     1            1           35s   networking.knative.dev/ingress-provider=istio,serving.knative.dev/release=v0.22.1
webhook            1/1     1            1           39s   serving.knative.dev/release=v0.22.0

The newest version is 0.22, so the labels are correct.

Check the CR:

apiVersion: v1
items:
- apiVersion: operator.knative.dev/v1alpha1
  kind: KnativeServing
  metadata:
    annotations:
      kubectl.kubernetes.io/last-applied-configuration: |
        {"apiVersion":"operator.knative.dev/v1alpha1","kind":"KnativeServing","metadata":{"annotations":{},"name":"knative-serving","namespace":"knative-serving"},"spec":{"version":"latest"}}
    creationTimestamp: "2021-05-13T13:03:39Z"
    finalizers:
    - knativeservings.operator.knative.dev
    generation: 1
    managedFields:
    - apiVersion: operator.knative.dev/v1alpha1
      fieldsType: FieldsV1
      fieldsV1:
        f:metadata:
          f:annotations:
            .: {}
            f:kubectl.kubernetes.io/last-applied-configuration: {}
        f:spec:
          .: {}
          f:version: {}
      manager: kubectl-client-side-apply
      operation: Update
      time: "2021-05-13T13:03:39Z"
    - apiVersion: operator.knative.dev/v1alpha1
      fieldsType: FieldsV1
      fieldsV1:
        f:metadata:
          f:finalizers:
            .: {}
            v:"knativeservings.operator.knative.dev": {}
        f:spec:
          f:cluster-local-gateway: {}
          f:controller-custom-certs:
            .: {}
            f:name: {}
            f:type: {}
          f:knative-ingress-gateway: {}
          f:registry: {}
        f:status:
          .: {}
          f:conditions: {}
          f:manifests: {}
          f:observedGeneration: {}
          f:version: {}
      manager: operator
      operation: Update
      time: "2021-05-13T13:03:51Z"
    name: knative-serving
    namespace: knative-serving
    resourceVersion: "1497356"
    selfLink: /apis/operator.knative.dev/v1alpha1/namespaces/knative-serving/knativeservings/knative-serving
    uid: a2431493-fdd3-4803-a3d8-c41c5d64dd0a
  spec:
    version: latest
  status:
    conditions:
    - lastTransitionTime: "2021-05-13T13:03:51Z"
      status: "True"
      type: DependenciesInstalled
    - lastTransitionTime: "2021-05-13T13:04:21Z"
      status: "True"
      type: DeploymentsAvailable
    - lastTransitionTime: "2021-05-13T13:03:51Z"
      status: "True"
      type: InstallSucceeded
    - lastTransitionTime: "2021-05-13T13:04:21Z"
      status: "True"
      type: Ready
    - lastTransitionTime: "2021-05-13T13:03:39Z"
      status: "True"
      type: VersionMigrationEligible
    manifests:
    - /var/run/ko/knative-serving/0.22.0
    observedGeneration: 1
    version: 0.22.0
kind: List
metadata:
  resourceVersion: ""
  selfLink: ""

spec.version is latest, status.version is 0.22.0.
They seem correct.

@nak3
Contributor

nak3 commented May 13, 2021

Thank you. Okay, I understand the "rule": when spec.version: latest is used for serving, the latest directory must be created in cmd/operator/kodata/ingress as well. Then it worked.

@houshengbo
Contributor Author

houshengbo commented May 13, 2021

Ingresses need to stay in sync with the knative serving structure at all times.
There is no validation for the ingress alone; the validation is for knative serving. So any version that works with knative serving must match the version for the ingresses.

Maybe I can change the part for the ingress version by checking the latest dir as well.

@houshengbo houshengbo force-pushed the fix-key-latest-version branch from b69f39f to 65b45f2 Compare May 13, 2021 19:18
@nak3 nak3 removed their assignment May 14, 2021
@nak3
Contributor

nak3 commented May 14, 2021

/assign @knative/operations-writers

Contributor

@nak3 nak3 left a comment


Thank you @houshengbo.
It looks good to me overall, but isn't it possible to catch the current panic issue #556 (comment) in the upgrade test before merging this PR?
If you could improve the upgrade test to catch it, you could prove that this PR actually fixes it, and similar issues caused by any future change would be noticed.

@houshengbo
Contributor Author

@nak3 I can add some logic to this function,
func getIngress(version string, manifest *mf.Manifest) error, to check whether the version is a valid semantic version before running semver.MajorMinor(common.SanitizeSemver(version))[1:].
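The guard could look roughly like the sketch below. This is stdlib-only and hypothetical: isSemver, sanitize, and majorMinor are simplified stand-ins for semver.IsValid, common.SanitizeSemver, and semver.MajorMinor, and ingressVersion is not the real getIngress signature; the point is only that validation happens before the [1:] slice, so "latest" falls through instead of panicking:

```go
package main

import (
	"fmt"
	"regexp"
)

// isSemver is a stand-in for a semver validity check such as
// golang.org/x/mod/semver.IsValid; it requires at least major.minor.
var semverRE = regexp.MustCompile(`^v\d+\.\d+(\.\d+)?$`)

func isSemver(v string) bool { return semverRE.MatchString(v) }

// sanitize mimics prefixing a "v" (as common.SanitizeSemver does).
func sanitize(v string) string {
	if len(v) == 0 || v[0] == 'v' {
		return v
	}
	return "v" + v
}

// majorMinor mimics semver.MajorMinor: "vMAJOR.MINOR" or "" on failure.
func majorMinor(v string) string {
	return regexp.MustCompile(`^v\d+\.\d+`).FindString(v)
}

// ingressVersion validates before slicing, so a non-semver value like
// "latest" is returned as-is rather than triggering the [1:0] panic.
func ingressVersion(version string) (string, error) {
	sv := sanitize(version)
	if !isSemver(sv) {
		return version, nil // e.g. "latest": use the directory name as-is
	}
	return majorMinor(sv)[1:], nil // safe: sv validated above
}

func main() {
	for _, v := range []string{"0.22.0", "latest"} {
		out, _ := ingressVersion(v)
		fmt.Println(v, "->", out)
	}
}
```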

@nak3
Contributor

nak3 commented May 17, 2021

That's also good, but I mean that the current upgrade test prints all "Passed" even though the operator keeps panicking. It could be an upgrade-test bug. Can we not fix it?

@houshengbo
Contributor Author

@nak3 I have no idea why prow does not catch the panic :-(
What I can do in the operator is to avoid the panic in the Go code as much as possible.


@maximilien maximilien left a comment


/ok-to-test
/approve
/lgtm

@knative-prow-robot knative-prow-robot added ok-to-test Indicates a non-member PR verified by an org member that is safe to test. lgtm Indicates that a PR is ready to be merged. labels May 17, 2021
@knative-metrics-robot

The following is the coverage report on the affected files.
Say /test pull-knative-operator-go-coverage to re-run this coverage report

File Old Coverage New Coverage Delta
pkg/reconciler/common/releases.go 91.6% 87.7% -3.8
pkg/reconciler/knativeserving/ingress/ingress.go 97.9% 100.0% 2.1

@maximilien

/approve

1 similar comment
@houshengbo
Contributor Author

/approve

@knative-prow-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: houshengbo, maximilien

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Details: Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@knative-prow-robot knative-prow-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label May 17, 2021
@knative-prow-robot knative-prow-robot merged commit 5f894ab into knative:main May 17, 2021

Labels

approved Indicates a PR has been approved by an approver from all required OWNERS files. cla: yes Indicates the PR's author has signed the CLA. lgtm Indicates that a PR is ready to be merged. ok-to-test Indicates a non-member PR verified by an org member that is safe to test. size/XXL Denotes a PR that changes 1000+ lines, ignoring generated files.

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Usage of spec.version: latest is not clear

5 participants