Support pod template for Spark 3.x applications #2141
base: master
Conversation
Force-pushed from 07e334a to dc671cb.
/hold
Force-pushed from 27696b4 to a7c5002.
```diff
@@ -302,6 +304,12 @@ func driverConfOption(app *v1beta2.SparkApplication) ([]string, error) {
 	property = fmt.Sprintf(common.SparkKubernetesDriverLabelTemplate, common.LabelLaunchedBySparkOperator)
 	args = append(args, "--conf", fmt.Sprintf("%s=%s", property, "true"))

+	// If Spark version is less than 3.0.0 or driver pod template is not defined, then the driver pod needs to be mutated by the webhook.
```
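The condition described in that comment can be sketched as a small, self-contained predicate. This is an illustrative sketch, not the operator's actual code; `needsWebhookMutation` and `majorVersion` are hypothetical helpers introduced here for clarity:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// majorVersion extracts the leading major version from a "X.Y.Z"-style
// Spark version string; returns 0 if it cannot be parsed.
func majorVersion(v string) int {
	parts := strings.SplitN(v, ".", 2)
	n, err := strconv.Atoi(parts[0])
	if err != nil {
		return 0
	}
	return n
}

// needsWebhookMutation mirrors the check described in the diff comment:
// the webhook must mutate the driver pod when Spark is older than 3.0.0
// or when no driver pod template is defined. (Hypothetical helper, not
// the operator's exact implementation.)
func needsWebhookMutation(sparkVersion string, hasDriverTemplate bool) bool {
	return majorVersion(sparkVersion) < 3 || !hasDriverTemplate
}

func main() {
	fmt.Println(needsWebhookMutation("2.4.8", true))  // true: Spark 2.x always needs the webhook
	fmt.Println(needsWebhookMutation("3.5.3", false)) // true: no template defined
	fmt.Println(needsWebhookMutation("3.5.3", true))  // false: the pod template covers customization
}
```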
As discussed during the community call, we can think about a future release (maybe 2.1) where we announce the deprecation of the webhook and of support for Spark 2.x.
Would we take the approach of moving to users only being able to specify a pod template, or map the existing struct fields and construct a pod template on the fly during submission?
I feel like mapping the existing fields to a pod template on the fly, plus a new `templateFile` parameter, would be the easiest migration path for existing users.
For now I would personally find it useful to support webhook mutations and a template on the same SparkApplication.
/lgtm
@ChenYi015 Thanks for adding this much needed feature! PR looks good to me. You may need to resolve the conflicts to move forward.
e78f6b6
to
bc8d1ba
Compare
New changes are detected. LGTM label has been removed.
@vara-bonthu Thanks for the review. I have resolved the merge conflicts.
/unhold
This is great stuff! Nicely done @ChenYi015 🚀
Force-pushed from 373a5e8 to bfd00ac.
Signed-off-by: Yi Chen <[email protected]>
Force-pushed from bfd00ac to bdc9eca.
```yaml
    spark.apache.org/version: 3.5.3
spec:
  containers:
  - name: spark-kubernetes-driver
```
@ChenYi015 Shall we show nodeSelectors and taints as well in the example? The mutating admission webhook is currently required for those, but it is no longer needed once we merge this pod templates feature.
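For illustration, a driver pod template carrying the scheduling constraints mentioned above might look like this. This is a hypothetical sketch using standard Kubernetes Pod spec fields; the specific node selector key and toleration values are made-up examples, not part of this PR:

```yaml
# Hypothetical driver pod template showing constraints that previously
# required the mutating webhook: a node selector and a toleration.
apiVersion: v1
kind: Pod
spec:
  nodeSelector:
    node.kubernetes.io/instance-type: m5.xlarge   # example key/value
  tolerations:
  - key: dedicated        # example taint key
    operator: Equal
    value: spark
    effect: NoSchedule
  containers:
  - name: spark-kubernetes-driver
```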
/approve
[APPROVALNOTIFIER] This PR is APPROVED. This pull-request has been approved by: vara-bonthu, yuchaoran2011. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files.
Approvers can indicate their approval by writing
I know that the PR has been approved, but I feel like introducing a new key is not the right implementation here. We should either build the template from the current CR if the Spark version is >= 3, or allow specifying a template file name to use. This implementation forces a migration on our users if they want to get rid of the webhook, but it should be free for them (which is not the case here).
My user perspective is that I like introducing the new key. Before the webhook is deprecated, though, I think the template should be built from the current CR as ImpSy suggests.
Purpose of this PR
Close #2101
Close #1690
Proposed changes:
- Add `spec.driver.template` and `spec.executor.template` to the SparkApplication CRD
- Introduce the label `sparkoperator.k8s.io/mutated-by-spark-operator="true"`, which will be added by the controller as needed
- Add an example pod template file `spark-pi-pod-template.yaml`
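A sketch of how the new fields might be used in a SparkApplication manifest. This assumes `template` accepts a standard Kubernetes pod template spec; the label values and node selector below are illustrative placeholders, and the exact schema is defined by the CRD in this PR:

```yaml
# Hypothetical usage of the new spec.driver.template / spec.executor.template fields.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  sparkVersion: "3.5.3"
  driver:
    template:
      metadata:
        labels:
          team: data-platform        # example label
      spec:
        containers:
        - name: spark-kubernetes-driver
  executor:
    template:
      spec:
        nodeSelector:
          workload: spark            # example node selector
```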