
[Merged by Bors] - Fix PVC-related tests on managed k8s #90


Closed — 33 commits (changes shown from 32 commits).

Commits:
4fac471 wip (adwk67, Jun 17, 2022)
4da42c1 wip: working test (adwk67, Jun 20, 2022)
988a2e2 formatting (adwk67, Jun 20, 2022)
3978283 code cleanup (adwk67, Jun 20, 2022)
5b435f2 regenerate charts (adwk67, Jun 20, 2022)
223408e updated changelog (adwk67, Jun 21, 2022)
a0cdc26 use access_style from s3 struct (adwk67, Jun 21, 2022)
5af14ca log out warning if tls is specified (adwk67, Jun 21, 2022)
927ad77 updated documentation (adwk67, Jun 21, 2022)
309485b updated changelog (adwk67, Jun 22, 2022)
9316329 Merge branch 'main' into s3-impl-update (adwk67, Jun 22, 2022)
e662046 corrected changelog (adwk67, Jun 22, 2022)
05287e1 test mixed access modes (adwk67, Jun 24, 2022)
b938193 renamed function (adwk67, Jun 24, 2022)
f527b7d changes following review (adwk67, Jun 24, 2022)
cf47871 Update rust/crd/src/lib.rs (adwk67, Jun 24, 2022)
4aa7ccd minor cleanup (adwk67, Jun 24, 2022)
5131dea refactoring of secret/access keys as per review suggestion (adwk67, Jun 24, 2022)
889821b Merge branch 's3-impl-update' into pvc-tests (adwk67, Jun 24, 2022)
748d188 reverted earlier changes (adwk67, Jun 24, 2022)
8ff8078 specify node selection (adwk67, Jun 24, 2022)
d2ed01d Merge branch 'main' into pvc-tests (adwk67, Jun 24, 2022)
6eeade4 use different nodes for each job (adwk67, Jun 27, 2022)
9a2912d applied TTL to job (adwk67, Jun 27, 2022)
edc20d6 job-specific naming (adwk67, Jun 27, 2022)
efc1bf4 consolidate changes (adwk67, Jun 27, 2022)
104964b commented out TTLs (adwk67, Jun 27, 2022)
39aa778 add node selector to job (adwk67, Jun 27, 2022)
b15cfc3 removed commented-out directives (adwk67, Jun 27, 2022)
ae5b689 regenerate charts (adwk67, Jun 27, 2022)
8a04f54 updated changelog (adwk67, Jun 27, 2022)
34a6e2d run all jobs on node=1 for Azure tests (adwk67, Jun 28, 2022)
e61f5c3 use driver node selection for job (adwk67, Jun 28, 2022)
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -9,8 +9,10 @@ All notable changes to this project will be documented in this file.
 ### Changed

 - BREAKING: Use current S3 connection/bucket structs ([#86])
+- Add node selector to top-level job and specify node selection in PVC-relevant tests ([#90])

 [#86]: https://github.com/stackabletech/spark-k8s-operator/pull/86
+[#90]: https://github.com/stackabletech/spark-k8s-operator/pull/90

 ## [0.2.0] - 2022-06-21
5 changes: 5 additions & 0 deletions deploy/crd/sparkapplication.crd.yaml
@@ -272,6 +272,11 @@ spec:
               mode:
                 nullable: true
                 type: string
+              nodeSelector:
+                additionalProperties:
+                  type: string
+                nullable: true
+                type: object
               s3bucket:
                 description: Operators are expected to define fields for this type in order to work with S3 buckets.
                 nullable: true
5 changes: 5 additions & 0 deletions deploy/helm/spark-k8s-operator/crds/crds.yaml
@@ -274,6 +274,11 @@ spec:
               mode:
                 nullable: true
                 type: string
+              nodeSelector:
+                additionalProperties:
+                  type: string
+                nullable: true
+                type: object
               s3bucket:
                 description: Operators are expected to define fields for this type in order to work with S3 buckets.
                 nullable: true
5 changes: 5 additions & 0 deletions deploy/manifests/crds.yaml
@@ -275,6 +275,11 @@ spec:
               mode:
                 nullable: true
                 type: string
+              nodeSelector:
+                additionalProperties:
+                  type: string
+                nullable: true
+                type: object
               s3bucket:
                 description: Operators are expected to define fields for this type in order to work with S3 buckets.
                 nullable: true
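The schema addition above surfaces as a top-level `spec.nodeSelector` map on a `SparkApplication`. A minimal manifest sketch using the same `node: "1"` label the PR's tests use (the `apiVersion` and metadata name here are illustrative assumptions, not taken from this diff):

```yaml
# Hypothetical SparkApplication showing the new top-level nodeSelector;
# apiVersion and name are assumed for illustration.
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: example-spark-app
spec:
  mode: cluster
  nodeSelector:     # pins the spark-submit Job pod to nodes labelled node=1
    node: "1"
```

Because `additionalProperties: type: string` is set, the map accepts arbitrary label keys, matching Kubernetes' own `nodeSelector` shape.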
6 changes: 6 additions & 0 deletions rust/crd/src/lib.rs
@@ -100,6 +100,8 @@ pub struct SparkApplicationSpec {
     pub volumes: Option<Vec<Volume>>,
     #[serde(default, skip_serializing_if = "Option::is_none")]
     pub env: Option<Vec<EnvVar>>,
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub node_selector: Option<std::collections::BTreeMap<String, String>>,
 }

 #[derive(Clone, Debug, Deserialize, Eq, JsonSchema, PartialEq, Serialize, Display, EnumString)]
@@ -411,6 +413,10 @@ impl SparkApplication {
             .as_ref()
             .and_then(|executor_config| executor_config.node_selector.clone())
     }
+
+    pub fn job_node_selector(&self) -> Option<std::collections::BTreeMap<String, String>> {
+        self.spec.node_selector.clone()
+    }
 }
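The new accessor simply clones the optional map out of the spec so the controller gets an owned copy. A self-contained sketch of the same pattern, using simplified stand-ins for the CRD types (not the operator's actual structs, which also carry serde/JsonSchema derives):

```rust
use std::collections::BTreeMap;

// Simplified stand-ins for the CRD types in rust/crd/src/lib.rs.
#[derive(Default)]
struct SparkApplicationSpec {
    node_selector: Option<BTreeMap<String, String>>,
}

#[derive(Default)]
struct SparkApplication {
    spec: SparkApplicationSpec,
}

impl SparkApplication {
    // Mirrors the new accessor: hand callers an owned copy of the
    // optional selector map instead of a borrow into the spec.
    fn job_node_selector(&self) -> Option<BTreeMap<String, String>> {
        self.spec.node_selector.clone()
    }
}

fn main() {
    let mut selector = BTreeMap::new();
    selector.insert("node".to_string(), "1".to_string());
    let app = SparkApplication {
        spec: SparkApplicationSpec {
            node_selector: Some(selector),
        },
    };

    // An absent selector yields None; a present one is cloned out intact.
    let picked = app.job_node_selector().expect("selector was set");
    assert_eq!(picked.get("node").map(String::as_str), Some("1"));
    println!("job node selector: {:?}", picked);
}
```

Cloning a small `BTreeMap` per reconcile is cheap, and returning an owned value lets the result be moved straight into the pod spec below.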
1 change: 1 addition & 0 deletions rust/operator-binary/src/spark_k8s_controller.rs
@@ -388,6 +388,7 @@ fn spark_job(
                 .fs_group(1000)
                 .build()
                 .into(), // Needed for secret-operator
+            node_selector: spark_application.job_node_selector(),
             ..PodSpec::default()
         }),
     };
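In the controller the selector is spliced into the job's `PodSpec` with struct-update syntax, so every other field keeps its default. A stdlib-only sketch of that pattern — the `PodSpec` here is a hand-rolled stand-in, not the `k8s-openapi` type the operator actually uses:

```rust
use std::collections::BTreeMap;

// Hand-rolled stand-in for k8s_openapi::api::core::v1::PodSpec,
// reduced to the two fields needed to show the pattern.
#[derive(Debug, Default)]
struct PodSpec {
    node_selector: Option<BTreeMap<String, String>>,
    restart_policy: Option<String>,
}

// Stand-in for spark_application.job_node_selector().
fn job_node_selector() -> Option<BTreeMap<String, String>> {
    let mut m = BTreeMap::new();
    m.insert("node".to_string(), "1".to_string());
    Some(m)
}

fn main() {
    // Only node_selector is set explicitly; `..PodSpec::default()`
    // fills everything else, mirroring the line added in
    // spark_k8s_controller.rs.
    let pod_spec = PodSpec {
        node_selector: job_node_selector(),
        ..PodSpec::default()
    };

    assert!(pod_spec.restart_policy.is_none()); // untouched default
    assert_eq!(
        pod_spec
            .node_selector
            .as_ref()
            .and_then(|m| m.get("node"))
            .map(String::as_str),
        Some("1")
    );
    println!("{:?}", pod_spec.node_selector);
}
```

Leaving the field as `None` when the CRD omits `nodeSelector` means the job pod schedules freely, so the change is backwards-compatible with existing manifests.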
2 changes: 2 additions & 0 deletions tests/templates/kuttl/spark-ny-public-s3/02-deps-volume.yaml
@@ -17,6 +17,8 @@ metadata:
 spec:
   template:
     spec:
+      nodeSelector:
+        node: "1"
       restartPolicy: Never
       volumes:
         - name: job-deps
@@ -18,6 +18,8 @@ spec:
   mode: cluster
   mainClass: tech.stackable.demo.spark.NYTLCReport
   mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0.jar
+  nodeSelector:
+    node: "1"
   volumes:
     - name: spark-ny-deps
       persistentVolumeClaim:
@@ -48,6 +50,8 @@ spec:
         mountPath: /dependencies
       - name: cm-job-arguments
         mountPath: /arguments
+    nodeSelector:
+      node: "1"
   executor:
     cores: 1
     instances: 3
@@ -57,3 +61,5 @@ spec:
         mountPath: /dependencies
       - name: cm-job-arguments
         mountPath: /arguments
+    nodeSelector:
+      node: "1"
6 changes: 4 additions & 2 deletions tests/templates/kuttl/spark-pi-private-s3/02-deps-volume.yaml
@@ -2,7 +2,7 @@
 apiVersion: v1
 kind: PersistentVolumeClaim
 metadata:
-  name: spark-pi-pvc
+  name: spark-pi-private-pvc
 spec:
   accessModes:
     - ReadWriteOnce
@@ -17,11 +17,13 @@ metadata:
 spec:
   template:
     spec:
+      nodeSelector:
+        node: "1"
       restartPolicy: Never
       volumes:
         - name: job-deps
          persistentVolumeClaim:
-           claimName: spark-pi-pvc
+           claimName: spark-pi-private-pvc
       containers:
         - name: aws-deps
          image: docker.stackable.tech/stackable/tools:0.2.0-stackable0
@@ -9,6 +9,8 @@ spec:
   mode: cluster
   mainClass: org.apache.spark.examples.SparkPi
   mainApplicationFile: s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['spark'] }}.jar
+  nodeSelector:
+    node: "1"
   s3bucket:
     inline:
       bucketName: my-bucket
@@ -22,7 +24,7 @@ spec:
   volumes:
     - name: spark-pi-deps
       persistentVolumeClaim:
-        claimName: spark-pi-pvc
+        claimName: spark-pi-private-pvc
   sparkConf:
     spark.hadoop.fs.s3a.aws.credentials.provider: "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider"
     spark.driver.extraClassPath: "/dependencies/jars/hadoop-aws-3.2.0.jar:/dependencies/jars/aws-java-sdk-bundle-1.11.375.jar"
@@ -34,10 +36,14 @@ spec:
     volumeMounts:
       - name: spark-pi-deps
        mountPath: /dependencies
+    nodeSelector:
+      node: "1"
   executor:
     cores: 1
     instances: 1
     memory: "512m"
     volumeMounts:
       - name: spark-pi-deps
        mountPath: /dependencies
+    nodeSelector:
+      node: "1"
6 changes: 4 additions & 2 deletions tests/templates/kuttl/spark-pi-public-s3/02-deps-volume.yaml
@@ -2,7 +2,7 @@
 apiVersion: v1
 kind: PersistentVolumeClaim
 metadata:
-  name: spark-pi-pvc
+  name: spark-pi-public-pvc
 spec:
   accessModes:
     - ReadWriteOnce
@@ -17,11 +17,13 @@ metadata:
 spec:
   template:
     spec:
+      nodeSelector:
+        node: "1"
       restartPolicy: Never
       volumes:
         - name: job-deps
          persistentVolumeClaim:
-           claimName: spark-pi-pvc
+           claimName: spark-pi-public-pvc
       containers:
         - name: aws-deps
          image: docker.stackable.tech/stackable/tools:0.2.0-stackable0
@@ -10,10 +10,12 @@ spec:
   mode: cluster
   mainClass: org.apache.spark.examples.SparkPi
   mainApplicationFile: s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['spark'] }}.jar
+  nodeSelector:
+    node: "1"
   volumes:
     - name: spark-pi-deps
       persistentVolumeClaim:
-        claimName: spark-pi-pvc
+        claimName: spark-pi-public-pvc
   s3bucket:
     inline:
       bucketName: my-bucket
@@ -33,10 +35,14 @@ spec:
     volumeMounts:
      - name: spark-pi-deps
        mountPath: /dependencies
+    nodeSelector:
+      node: "1"
   executor:
     cores: 1
     instances: 1
     memory: "512m"
     volumeMounts:
      - name: spark-pi-deps
        mountPath: /dependencies
+    nodeSelector:
+      node: "1"