
Commit a3b8580

maltesander and razvan authored

New versions 3.4.1 and 3.5.0 (#291)

* new examples for tests
* bump versions
* add jmx exporter for history server
* adapt test matrix to new versions
* fix tests
* adapt test versions
* fix test
* fix examples
* adapted changelog
* bump operator-rs to 0.55.0
* Update tests/templates/kuttl/iceberg/10-deploy-spark-app.yaml.j2

Co-authored-by: Razvan-Daniel Mihai <[email protected]>

* do not use pyspark image anymore
* vector and operator-rs bumps

---------

Co-authored-by: Razvan-Daniel Mihai <[email protected]>

1 parent 53af467 commit a3b8580


48 files changed: +128 -78 lines changed

CHANGELOG.md

+9-2

Cargo.lock

+4-4
Some generated files are not rendered by default.

Cargo.toml

+1-1
@@ -21,7 +21,7 @@ serde = { version = "1.0", features = ["derive"] }
 serde_json = "1.0"
 serde_yaml = "0.9"
 snafu = "0.7"
-stackable-operator = { git = "https://github.com/stackabletech/operator-rs.git", tag = "0.52.1" }
+stackable-operator = { git = "https://github.com/stackabletech/operator-rs.git", tag = "0.55.0" }
 strum = { version = "0.25", features = ["derive"] }
 tokio = { version = "1.29", features = ["full"] }
 tracing = "0.1"

deploy/helm/spark-k8s-operator/crds/crds.yaml

+60-20
Large diffs are not rendered by default.

docs/modules/spark-k8s/examples/example-encapsulated.yaml

+1-1
@@ -6,7 +6,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0 # <1>
+    productVersion: 3.5.0 # <1>
   mode: cluster
   mainClass: org.apache.spark.examples.SparkPi
   mainApplicationFile: /stackable/spark/examples/jars/spark-examples.jar # <2>

docs/modules/spark-k8s/examples/example-history-app.yaml

+1-1
@@ -6,7 +6,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
     pullPolicy: IfNotPresent
   mode: cluster
   mainClass: org.apache.spark.examples.SparkPi

docs/modules/spark-k8s/examples/example-history-server.yaml

+1-1
@@ -5,7 +5,7 @@ metadata:
   name: spark-history
 spec:
   image:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   logFileDirectory: # <1>
     s3:
       prefix: eventlogs/ # <2>

docs/modules/spark-k8s/examples/example-sparkapp-configmap.yaml

+1-1
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.1.0.jar # <3>
   mainClass: tech.stackable.demo.spark.NYTLCReport

docs/modules/spark-k8s/examples/example-sparkapp-external-dependencies.yaml

+1-1
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny_tlc_report.py # <1>
   args:

docs/modules/spark-k8s/examples/example-sparkapp-image.yaml

+1-1
@@ -8,7 +8,7 @@ spec:
   version: "1.0"
   image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0 # <1>
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py # <2>
   args:

docs/modules/spark-k8s/examples/example-sparkapp-pvc.yaml

+1-1
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.0-SNAPSHOT.jar # <1>
   mainClass: org.example.App # <2>

docs/modules/spark-k8s/examples/example-sparkapp-s3-private.yaml

+1-1
@@ -6,7 +6,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: s3a://my-bucket/spark-examples.jar # <1>
   mainClass: org.apache.spark.examples.SparkPi # <2>

docs/modules/spark-k8s/examples/example-sparkapp-streaming.yaml

+1-1
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/examples/src/main/python/streaming/hdfs_wordcount.py
   args:

docs/modules/spark-k8s/examples/getting_started/getting_started.sh

+1-1
@@ -59,7 +59,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
   driver:

docs/modules/spark-k8s/examples/getting_started/getting_started.sh.j2

+1-1
@@ -59,7 +59,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
   driver:

docs/modules/spark-k8s/pages/crd-reference.adoc

+1-1

docs/modules/spark-k8s/pages/usage-guide/history-server.adoc

+1-1

docs/modules/spark-k8s/partials/supported-versions.adoc

+3-2

examples/README-examples.md

+2-2

examples/ny-tlc-report-external-dependencies.yaml

+1-1
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
     pullPolicy: IfNotPresent
   mode: cluster
   mainApplicationFile: s3a://my-bucket/ny_tlc_report.py

examples/ny-tlc-report-image.yaml

+1-1
@@ -8,7 +8,7 @@ spec:
   version: "1.0"
   # everything under /jobs will be copied to /stackable/spark/jobs
   image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.5.0-stackable0.0.0-dev
   sparkImagePullPolicy: IfNotPresent
   mode: cluster
   mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py

examples/ny-tlc-report.yaml

+2-2
@@ -14,9 +14,9 @@ metadata:
 spec:
   version: "1.0"
   sparkImage:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   mode: cluster
-  mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.3.0.jar
+  mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.5.0.jar
   mainClass: tech.stackable.demo.spark.NYTLCReport
   volumes:
     - name: cm-job-arguments

rust/crd/src/affinity.rs

+1-1
@@ -47,7 +47,7 @@ mod test {
   name: spark-history
 spec:
   image:
-    productVersion: 3.3.0
+    productVersion: 3.5.0
   logFileDirectory:
     s3:
       prefix: eventlogs/

rust/operator-binary/src/history/history_controller.rs

+18-6
@@ -5,7 +5,6 @@ use stackable_operator::{
     builder::{ConfigMapBuilder, ContainerBuilder, ObjectMetaBuilder, PodBuilder, VolumeBuilder},
     cluster_resources::{ClusterResourceApplyStrategy, ClusterResources},
     commons::product_image_selection::ResolvedProductImage,
-    duration::Duration,
     k8s_openapi::{
         api::{
             apps::v1::{StatefulSet, StatefulSetSpec},
@@ -31,6 +30,7 @@ use stackable_operator::{
         },
     },
     role_utils::RoleGroupRef,
+    time::Duration,
 };
 use stackable_spark_k8s_crd::{
     constants::{
@@ -55,6 +55,8 @@ use stackable_operator::k8s_openapi::DeepMerge;
 use stackable_operator::logging::controller::ReconcilerError;
 use strum::{EnumDiscriminants, IntoStaticStr};

+const METRICS_PORT: u16 = 18081;
+
 #[derive(Snafu, Debug, EnumDiscriminants)]
 #[strum_discriminants(derive(IntoStaticStr))]
 #[allow(clippy::enum_variant_names)]
@@ -415,6 +417,7 @@ fn build_stateful_set(
         .command(vec!["/bin/bash".to_string()])
         .args(command_args(s3_log_dir))
         .add_container_port("http", 18080)
+        .add_container_port("metrics", METRICS_PORT.into())
         .add_env_vars(env_vars(s3_log_dir))
         .add_volume_mounts(s3_log_dir.volume_mounts())
         .add_volume_mount(VOLUME_MOUNT_NAME_CONFIG, VOLUME_MOUNT_PATH_CONFIG)
@@ -515,15 +518,23 @@ fn build_service(
         .ownerreference_from_resource(shs, None, Some(true))
         .context(ObjectMissingMetadataForOwnerRefSnafu)?
         .with_recommended_labels(labels(shs, app_version_label, &group_name))
+        .with_label("prometheus.io/scrape", "true")
         .build(),
     spec: Some(ServiceSpec {
         type_: Some(service_type),
         cluster_ip: service_cluster_ip,
-        ports: Some(vec![ServicePort {
-            name: Some(String::from("http")),
-            port: 18080,
-            ..ServicePort::default()
-        }]),
+        ports: Some(vec![
+            ServicePort {
+                name: Some(String::from("http")),
+                port: 18080,
+                ..ServicePort::default()
+            },
+            ServicePort {
+                name: Some(String::from("metrics")),
+                port: METRICS_PORT.into(),
+                ..ServicePort::default()
+            },
+        ]),
         selector: Some(selector),
         ..ServiceSpec::default()
     }),
@@ -634,6 +645,7 @@ fn env_vars(s3logdir: &S3LogDir) -> Vec<EnvVar> {
         format!(
             "-Djava.security.properties={VOLUME_MOUNT_PATH_CONFIG}/{JVM_SECURITY_PROPERTIES_FILE}"
         ),
+        format!("-javaagent:/stackable/jmx/jmx_prometheus_javaagent.jar={METRICS_PORT}:/stackable/jmx/config.yaml")
     ];
     if tlscerts::tls_secret_name(&s3logdir.bucket.connection).is_some() {
         history_opts.extend(

rust/operator-binary/src/pod_driver_controller.rs

+2-2
@@ -1,6 +1,6 @@
 use stackable_operator::{
-    client::Client, duration::Duration, k8s_openapi::api::core::v1::Pod,
-    kube::runtime::controller::Action,
+    client::Client, k8s_openapi::api::core::v1::Pod, kube::runtime::controller::Action,
+    time::Duration,
 };
 use stackable_spark_k8s_crd::{
     constants::POD_DRIVER_CONTROLLER_NAME, SparkApplication, SparkApplicationStatus,

rust/operator-binary/src/spark_k8s_controller.rs

+1-1
@@ -6,7 +6,7 @@ use std::{
     vec,
 };

-use stackable_operator::{duration::Duration, product_config::writer::to_java_properties_string};
+use stackable_operator::{product_config::writer::to_java_properties_string, time::Duration};
 use stackable_spark_k8s_crd::{
     constants::*, s3logdir::S3LogDir, tlscerts, RoleConfig, SparkApplication, SparkApplicationRole,
     SparkContainer, SubmitConfig,

tests/README-templating.md

+6-12

tests/templates/kuttl/iceberg/10-assert.yaml.j2

-5
@@ -9,9 +9,4 @@ kind: SparkApplication
 metadata:
   name: pyspark-iceberg
 status:
-{% if test_scenario['values']['spark'].startswith("3.3") %}
-  # Spark 3.3 is expected to fail because of this https://issues.apache.org/jira/browse/SPARK-35084
-  phase: Failed
-{% else %}
   phase: Succeeded
-{% endif %}

tests/templates/kuttl/iceberg/10-deploy-spark-app.yaml.j2

+2-1
@@ -45,7 +45,8 @@ spec:
       mountPath: /stackable/spark/jobs
   deps:
     packages:
-      - org.apache.iceberg:iceberg-spark-runtime-{{ test_scenario['values']['spark'].rstrip('.0') }}_2.12:1.3.1
+      # need to extract only the major and minor versions
+      - org.apache.iceberg:iceberg-spark-runtime-{{ test_scenario['values']['spark'].rsplit('.', maxsplit=1)[0] }}_2.12:1.4.0
   volumes:
     - name: script
       configMap:
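The switch from `rstrip('.0')` to `rsplit('.', maxsplit=1)[0]` matters for the new 3.4.1 release: `str.rstrip` strips a *set* of trailing characters, not a literal suffix, so it only happened to work for `.0` patch versions. A quick Python sketch (Jinja expressions use the same Python string methods; the helper name is illustrative):

```python
def major_minor(version: str) -> str:
    # rsplit splits once from the right, dropping only the patch component
    return version.rsplit(".", maxsplit=1)[0]

# rstrip(".0") removes any trailing '.' or '0' characters, not the literal
# suffix ".0" -- it leaves 3.4.1 untouched and over-strips 3.0.0.
assert "3.4.1".rstrip(".0") == "3.4.1"   # patch component not removed
assert "3.0.0".rstrip(".0") == "3"       # too much removed

assert major_minor("3.4.1") == "3.4"
assert major_minor("3.5.0") == "3.5"
```

This is why the old expression could not have produced a valid `iceberg-spark-runtime-3.4_2.12` coordinate for Spark 3.4.1.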
15 binary files not shown.

0 commit comments