Closed
Labels: kind/bug (Categorizes issue or PR as related to a bug.)
Description
What version of Knative?
- [ ] 0.9.x
- [ ] 0.10.x
- [ ] 0.11.x
- [x] main
Expected Behavior
When I use the restricted profile for Pod Security Standards and deploy a Knative Service, the deployment should succeed.
Actual Behavior
There are issues with both the user-container and the queue-proxy: the deployment cannot start, and the ReplicaSet reports errors like the following.
❯ k get deployments.apps tomato-00001-deployment -oyaml
apiVersion: apps/v1
kind: Deployment
metadata:
annotations:
deployment.kubernetes.io/revision: "1"
serving.knative.dev/creator: ckauzlaric@vmware.com
creationTimestamp: "2023-09-13T20:13:14Z"
generation: 1
labels:
app: tomato-00001
serving.knative.dev/configuration: tomato
serving.knative.dev/configurationGeneration: "1"
serving.knative.dev/configurationUID: 7a129f79-5cc8-467d-810d-9e297cfdabbd
serving.knative.dev/revision: tomato-00001
serving.knative.dev/revisionUID: 6680b843-1a7e-40e2-ae10-bdd318a5e94a
serving.knative.dev/service: tomato
serving.knative.dev/serviceUID: 764c5939-4640-4548-bba9-d909e3159f1d
name: tomato-00001-deployment
namespace: default
ownerReferences:
- apiVersion: serving.knative.dev/v1
blockOwnerDeletion: true
controller: true
kind: Revision
name: tomato-00001
uid: 6680b843-1a7e-40e2-ae10-bdd318a5e94a
resourceVersion: "54227699"
uid: ccc9b830-4572-4066-b19c-018623185475
spec:
progressDeadlineSeconds: 600
replicas: 1
revisionHistoryLimit: 10
selector:
matchLabels:
serving.knative.dev/revisionUID: 6680b843-1a7e-40e2-ae10-bdd318a5e94a
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 0
type: RollingUpdate
template:
metadata:
annotations:
serving.knative.dev/creator: ckauzlaric@vmware.com
creationTimestamp: null
labels:
app: tomato-00001
serving.knative.dev/configuration: tomato
serving.knative.dev/configurationGeneration: "1"
serving.knative.dev/configurationUID: 7a129f79-5cc8-467d-810d-9e297cfdabbd
serving.knative.dev/revision: tomato-00001
serving.knative.dev/revisionUID: 6680b843-1a7e-40e2-ae10-bdd318a5e94a
serving.knative.dev/service: tomato
serving.knative.dev/serviceUID: 764c5939-4640-4548-bba9-d909e3159f1d
spec:
containers:
- env:
- name: TARGET
value: tomato
- name: PORT
value: "8080"
- name: K_REVISION
value: tomato-00001
- name: K_CONFIGURATION
value: tomato
- name: K_SERVICE
value: tomato
image: gcr.io/knative-samples/helloworld-go@sha256:2babda8ec819e24d5a6342095e8f8a25a67b44eb7231ae253ecc2c448632f07e
imagePullPolicy: IfNotPresent
lifecycle:
preStop:
httpGet:
path: /wait-for-drain
port: 8022
scheme: HTTP
name: user-container
ports:
- containerPort: 8080
name: user-port
protocol: TCP
resources: {}
securityContext:
allowPrivilegeEscalation: false
capabilities:
drop:
- ALL
seccompProfile:
type: RuntimeDefault
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: FallbackToLogsOnError
- env:
- name: SERVING_NAMESPACE
value: default
- name: SERVING_SERVICE
value: tomato
- name: SERVING_CONFIGURATION
value: tomato
- name: SERVING_REVISION
value: tomato-00001
- name: QUEUE_SERVING_PORT
value: "8012"
- name: QUEUE_SERVING_TLS_PORT
value: "8112"
- name: CONTAINER_CONCURRENCY
value: "0"
- name: REVISION_TIMEOUT_SECONDS
value: "300"
- name: REVISION_RESPONSE_START_TIMEOUT_SECONDS
value: "0"
- name: REVISION_IDLE_TIMEOUT_SECONDS
value: "0"
- name: SERVING_POD
valueFrom:
fieldRef:
apiVersion: v1
fieldPath: metadata.name
- name: SERVING_POD_IP
valueFrom:
fieldRef:
apiVersion: v1
fieldPath: status.podIP
- name: SERVING_LOGGING_CONFIG
- name: SERVING_LOGGING_LEVEL
- name: SERVING_REQUEST_LOG_TEMPLATE
value: '{"httpRequest": {"requestMethod": "{{.Request.Method}}", "requestUrl":
"{{js .Request.RequestURI}}", "requestSize": "{{.Request.ContentLength}}",
"status": {{.Response.Code}}, "responseSize": "{{.Response.Size}}", "userAgent":
"{{js .Request.UserAgent}}", "remoteIp": "{{js .Request.RemoteAddr}}",
"serverIp": "{{.Revision.PodIP}}", "referer": "{{js .Request.Referer}}",
"latency": "{{.Response.Latency}}s", "protocol": "{{.Request.Proto}}"},
"traceId": "{{index .Request.Header "X-B3-Traceid"}}"}'
- name: SERVING_ENABLE_REQUEST_LOG
value: "false"
- name: SERVING_REQUEST_METRICS_BACKEND
value: prometheus
- name: SERVING_REQUEST_METRICS_REPORTING_PERIOD_SECONDS
value: "5"
- name: TRACING_CONFIG_BACKEND
value: none
- name: TRACING_CONFIG_ZIPKIN_ENDPOINT
- name: TRACING_CONFIG_DEBUG
value: "false"
- name: TRACING_CONFIG_SAMPLE_RATE
value: "0.1"
- name: USER_PORT
value: "8080"
- name: SYSTEM_NAMESPACE
value: knative-serving
- name: METRICS_DOMAIN
value: knative.dev/internal/serving
- name: SERVING_READINESS_PROBE
value: '{"tcpSocket":{"port":8080,"host":"127.0.0.1"},"successThreshold":1}'
- name: ENABLE_PROFILING
value: "false"
- name: SERVING_ENABLE_PROBE_REQUEST_LOG
value: "false"
- name: METRICS_COLLECTOR_ADDRESS
- name: HOST_IP
valueFrom:
fieldRef:
apiVersion: v1
fieldPath: status.hostIP
- name: ENABLE_HTTP2_AUTO_DETECTION
value: "false"
- name: ROOT_CA
image: us.gcr.io/daisy-284300/clay/queue-39be6f1d08a095bd076a71d288d295b6@sha256:5ee5f554893ed3ce2cedbd9547a2088c274f2aa65cf63c9d1104fa83f5ffe94b
imagePullPolicy: IfNotPresent
name: queue-proxy
ports:
- containerPort: 8022
name: http-queueadm
protocol: TCP
- containerPort: 9090
name: http-autometric
protocol: TCP
- containerPort: 9091
name: http-usermetric
protocol: TCP
- containerPort: 8012
name: queue-port
protocol: TCP
- containerPort: 8112
name: https-port
protocol: TCP
readinessProbe:
failureThreshold: 3
httpGet:
httpHeaders:
- name: K-Network-Probe
value: queue
path: /
port: 8012
scheme: HTTP
periodSeconds: 10
successThreshold: 1
timeoutSeconds: 1
resources:
requests:
cpu: 25m
securityContext:
allowPrivilegeEscalation: false
capabilities:
drop:
- ALL
readOnlyRootFilesystem: true
runAsNonRoot: true
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
dnsPolicy: ClusterFirst
enableServiceLinks: false
restartPolicy: Always
schedulerName: default-scheduler
securityContext: {}
terminationGracePeriodSeconds: 300
status:
conditions:
- lastTransitionTime: "2023-09-13T20:13:14Z"
lastUpdateTime: "2023-09-13T20:13:14Z"
message: Created new replica set "tomato-00001-deployment-7b8cd5d98d"
reason: NewReplicaSetCreated
status: "True"
type: Progressing
- lastTransitionTime: "2023-09-13T20:13:14Z"
lastUpdateTime: "2023-09-13T20:13:14Z"
message: Deployment does not have minimum availability.
reason: MinimumReplicasUnavailable
status: "False"
type: Available
- lastTransitionTime: "2023-09-13T20:13:14Z"
lastUpdateTime: "2023-09-13T20:13:14Z"
message: 'admission webhook "pod-security-webhook.kubernetes.io" denied the request:
pods "tomato-00001-deployment-7b8cd5d98d-7gdj7" is forbidden: violates PodSecurity
"restricted:latest": runAsNonRoot != true (pod or container "user-container"
must set securityContext.runAsNonRoot=true), seccompProfile (pod or container
"queue-proxy" must set securityContext.seccompProfile.type to "RuntimeDefault"
or "Localhost")'
reason: FailedCreate
status: "True"
type: ReplicaFailure
observedGeneration: 1
unavailableReplicas: 1
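The webhook message above names the exact fields the restricted profile requires. As a sketch of a user-side workaround (not a fix for this bug, since the queue-proxy's securityContext is injected by Knative and cannot be set from the Service spec), the user container can declare the missing fields explicitly so that at least user-container passes admission:

```yaml
# Sketch: an explicit securityContext on the user container that
# satisfies the "restricted" Pod Security Standard. The queue-proxy
# container is generated by Knative, so its missing seccompProfile
# cannot be set from here -- that is the bug being reported.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: tomato
  namespace: restricted
spec:
  template:
    spec:
      containers:
        - name: user-container
          image: gcr.io/knative-samples/helloworld-go
          securityContext:
            runAsNonRoot: true              # addresses "runAsNonRoot != true"
            allowPrivilegeEscalation: false
            capabilities:
              drop: ["ALL"]
            seccompProfile:
              type: RuntimeDefault          # addresses the seccompProfile violation
```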
Steps to Reproduce the Problem
Tested on main, with a GKE 1.27 cluster.
- Create a restricted namespace
apiVersion: v1
kind: Namespace
metadata:
name: restricted
labels:
pod-security.kubernetes.io/enforce: restricted
pod-security.kubernetes.io/enforce-version: v1.27
# We are setting these to our _desired_ `enforce` level.
pod-security.kubernetes.io/audit: restricted
pod-security.kubernetes.io/audit-version: v1.27
pod-security.kubernetes.io/warn: restricted
pod-security.kubernetes.io/warn-version: v1.27
- Install Knative Serving
- Configure `secure-pod-defaults` to `enabled` in `config-features`
- Create a Knative Service
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
name: tomato
namespace: restricted
spec:
template:
metadata:
annotations:
autoscaling.knative.dev/initial-scale: "1"
autoscaling.knative.dev/min-scale: "1"
spec:
containers:
- env:
- name: TARGET
value: tomato
image: gcr.io/knative-samples/helloworld-go
name: user-container
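The `secure-pod-defaults` step above amounts to editing the `config-features` ConfigMap; a minimal sketch, assuming Knative Serving is installed in the default `knative-serving` namespace:

```yaml
# config-features with the secure-pod-defaults feature flag enabled.
# With this flag on, Knative is supposed to default securityContext
# fields on generated containers -- per this report, the defaults it
# applies are not yet sufficient for the "restricted" profile.
apiVersion: v1
kind: ConfigMap
metadata:
  name: config-features
  namespace: knative-serving
data:
  secure-pod-defaults: "enabled"
```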