[Merged by Bors] - Add support for Spark 3.4.0 #243

Closed · wants to merge 14 commits
Changes from all commits
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,7 @@ All notable changes to this project will be documented in this file.
### Added

- Generate OLM bundle for Release 23.4.0 ([#238]).
- Add support for Spark 3.4.0 ([#243]).

### Changed

@@ -15,10 +16,15 @@ All notable changes to this project will be documented in this file.
- Use testing-tools 0.2.0 ([#236])
- Run as root group ([#241]).

### Fixed

- Fix quoting issues when spark config values contain spaces ([#243]).

[#235]: https://github.com/stackabletech/spark-k8s-operator/pull/235
[#236]: https://github.com/stackabletech/spark-k8s-operator/pull/236
[#238]: https://github.com/stackabletech/spark-k8s-operator/pull/238
[#241]: https://github.com/stackabletech/spark-k8s-operator/pull/241
[#243]: https://github.com/stackabletech/spark-k8s-operator/pull/243

## [23.4.0] - 2023-04-17

1 change: 1 addition & 0 deletions deploy/helm/spark-k8s-operator/templates/roles.yaml
@@ -29,6 +29,7 @@ rules:
verbs:
- create
- delete
- deletecollection
- get
- list
- patch
4 changes: 2 additions & 2 deletions docs/modules/spark-k8s/examples/example-encapsulated.yaml
@@ -5,9 +5,9 @@ metadata:
name: spark-pi
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0 # <1>
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev # <1>
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: /stackable/spark/examples/jars/spark-examples_2.12-3.3.0.jar # <2>
mainApplicationFile: /stackable/spark/examples/jars/spark-examples.jar # <2>
executor:
instances: 3
4 changes: 2 additions & 2 deletions docs/modules/spark-k8s/examples/example-history-app.yaml
@@ -5,11 +5,11 @@ metadata:
name: spark-pi-s3-1
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: s3a://my-bucket/spark-examples_2.12-3.3.0.jar
mainApplicationFile: s3a://my-bucket/spark-examples.jar
s3connection: # <1>
inline:
host: test-minio
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.1.0.jar # <3>
mainClass: tech.stackable.demo.spark.NYTLCReport
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny_tlc_report.py # <1>
args:
@@ -7,7 +7,7 @@ metadata:
spec:
version: "1.0"
image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0 # <1>
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py # <2>
args:
2 changes: 1 addition & 1 deletion docs/modules/spark-k8s/examples/example-sparkapp-pvc.yaml
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.0-SNAPSHOT.jar # <1>
mainClass: org.example.App # <2>
@@ -5,9 +5,9 @@ metadata:
name: example-sparkapp-s3-private
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: s3a://my-bucket/spark-examples_2.12-3.3.0.jar # <1>
mainApplicationFile: s3a://my-bucket/spark-examples.jar # <1>
mainClass: org.apache.spark.examples.SparkPi # <2>
s3connection: # <3>
inline:
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/streaming/hdfs_wordcount.py
args:
@@ -58,7 +58,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
driver:
@@ -58,7 +58,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
driver:
1 change: 1 addition & 0 deletions docs/modules/spark-k8s/partials/supported-versions.adoc
@@ -5,3 +5,4 @@
- 3.2.1-hadoop3.2
- 3.2.1-hadoop3.2-python39
- 3.3.0-hadoop3
- 3.4.0-hadoop3
2 changes: 1 addition & 1 deletion examples/ny-tlc-report-external-dependencies.yaml
@@ -6,7 +6,7 @@ metadata:
namespace: default
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
# Always | IfNotPresent | Never
sparkImagePullPolicy: IfNotPresent
mode: cluster
2 changes: 1 addition & 1 deletion examples/ny-tlc-report-image.yaml
@@ -8,7 +8,7 @@ spec:
version: "1.0"
# everything under /jobs will be copied to /stackable/spark/jobs
image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.0.0-dev
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py
2 changes: 1 addition & 1 deletion examples/ny-tlc-report.yaml
@@ -13,7 +13,7 @@ metadata:
name: spark-ny-cm
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.3.0
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.0.0-dev
mode: cluster
mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.3.0.jar
mainClass: tech.stackable.demo.spark.NYTLCReport
8 changes: 4 additions & 4 deletions rust/crd/src/lib.rs
@@ -618,7 +618,7 @@ impl SparkApplication {
}
// ...before being added to the command collection
for (key, value) in submit_conf {
submit_cmd.push(format!("--conf {key}={value}"));
submit_cmd.push(format!("--conf \"{key}={value}\""));
}

submit_cmd.extend(
@@ -939,7 +939,7 @@ spec:
sparkImage: docker.stackable.tech/stackable/spark-k8s:3.2.1-hadoop3.2-python39-aws1.11.375-stackable0.3.0
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/spark-examples_2.12-3.2.1.jar
mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/spark-examples.jar
sparkConf:
"spark.hadoop.fs.s3a.aws.credentials.provider": "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
driver:
@@ -960,7 +960,7 @@ spec:
spark_application.spec.main_class
);
assert_eq!(
Some("s3a://stackable-spark-k8s-jars/jobs/spark-examples_2.12-3.2.1.jar".to_string()),
Some("s3a://stackable-spark-k8s-jars/jobs/spark-examples.jar".to_string()),
spark_application.spec.main_application_file
);
assert_eq!(
@@ -1113,7 +1113,7 @@ spec:
- name: myregistrykey
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples_2.12-3.2.1.jar
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples.jar
sparkConf:
spark.kubernetes.node.selector.node: "2"
driver:
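The quoting change in `rust/crd/src/lib.rs` above is what the CHANGELOG entry "Fix quoting issues when spark config values contain spaces" refers to. The following standalone Rust sketch only illustrates the effect; `push_conf_args` and the conf key/value used here are made up for the example and are not part of the operator's code.

```rust
// Minimal sketch of the --conf quoting shown in the hunk above, assuming
// `submit_cmd` collects shell fragments that are later joined into one
// spark-submit invocation.
use std::collections::BTreeMap;

fn push_conf_args(submit_conf: &BTreeMap<String, String>, submit_cmd: &mut Vec<String>) {
    for (key, value) in submit_conf {
        // Quoting the whole key=value pair keeps a value that contains spaces
        // as a single spark-submit argument instead of letting the shell
        // split it into separate words.
        submit_cmd.push(format!("--conf \"{key}={value}\""));
    }
}

fn main() {
    // Hypothetical conf entry with spaces in the value.
    let mut conf = BTreeMap::new();
    conf.insert(
        "spark.driver.extraJavaOptions".to_string(),
        "-Dhttp.proxyHost=proxy -Dhttp.proxyPort=8080".to_string(),
    );
    let mut cmd = vec!["/stackable/spark/bin/spark-submit".to_string()];
    push_conf_args(&conf, &mut cmd);
    // Prints: ... --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=proxy -Dhttp.proxyPort=8080"
    println!("{}", cmd.join(" "));
}
```

Without the surrounding double quotes, a value such as `-Dhttp.proxyHost=proxy -Dhttp.proxyPort=8080` would be broken at the space once the fragments are joined and executed, and spark-submit would only see the first token.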
4 changes: 2 additions & 2 deletions rust/operator-binary/src/history_controller.rs
@@ -536,9 +536,9 @@ fn command_args(s3logdir: &S3LogDir) -> Vec<String> {

if let Some(secret_dir) = s3logdir.credentials_mount_path() {
command.extend(vec![
format!("export AWS_ACCESS_KEY_ID=$(cat {secret_dir}/{ACCESS_KEY_ID})"),
format!("export AWS_ACCESS_KEY_ID=\"$(cat {secret_dir}/{ACCESS_KEY_ID})\""),
"&&".to_string(),
format!("export AWS_SECRET_ACCESS_KEY=$(cat {secret_dir}/{SECRET_ACCESS_KEY})"),
format!("export AWS_SECRET_ACCESS_KEY=\"$(cat {secret_dir}/{SECRET_ACCESS_KEY})\""),
"&&".to_string(),
]);
}
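The matching change in `rust/operator-binary/src/history_controller.rs` above wraps the command substitutions in double quotes for the same reason. A minimal, self-contained sketch of that fragment follows; the constant values and the secret path are placeholders, not the operator's actual ones.

```rust
// Illustrative sketch of the credential exports built above; the fragments
// are assumed to be joined and run via `sh -c` like the rest of command_args.
const ACCESS_KEY_ID: &str = "accessKey";
const SECRET_ACCESS_KEY: &str = "secretKey";

fn credential_exports(secret_dir: &str) -> Vec<String> {
    vec![
        // The double quotes around the command substitution keep the exported
        // value intact even if the mounted secret content contains spaces.
        format!("export AWS_ACCESS_KEY_ID=\"$(cat {secret_dir}/{ACCESS_KEY_ID})\""),
        "&&".to_string(),
        format!("export AWS_SECRET_ACCESS_KEY=\"$(cat {secret_dir}/{SECRET_ACCESS_KEY})\""),
        "&&".to_string(),
    ]
}

fn main() {
    // Placeholder mount path for demonstration only.
    println!("{}", credential_exports("/stackable/secrets/s3-credentials").join(" "));
}
```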
@@ -6,12 +6,11 @@ metadata:
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:{{ test_scenario['values']['spark'].split('-stackable')[0] }}-stackable{{ test_scenario['values']['spark'].split('-stackable')[1] }}
sparkImagePullPolicy: IfNotPresent
image: docker.stackable.tech/stackable/ny-tlc-report:{{ test_scenario['values']['ny-tlc-report'] }}
vectorAggregatorConfigMapName: spark-vector-aggregator-discovery
mode: cluster
mainClass: org.apache.spark.examples.SparkALS
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples.jar
job:
logging:
enableVectorAgent: true
@@ -40,12 +40,11 @@ metadata:
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/spark-k8s:{{ test_scenario['values']['spark'].split('-stackable')[0] }}-stackable{{ test_scenario['values']['spark'].split('-stackable')[1] }}
sparkImagePullPolicy: IfNotPresent
image: docker.stackable.tech/stackable/ny-tlc-report:{{ test_scenario['values']['ny-tlc-report'] }}
vectorAggregatorConfigMapName: spark-vector-aggregator-discovery
mode: cluster
mainClass: org.apache.spark.examples.SparkALS
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar
mainApplicationFile: local:///stackable/spark/examples/jars/spark-examples.jar
job:
logging:
enableVectorAgent: true
@@ -6,7 +6,6 @@ metadata:
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:{{ test_scenario['values']['spark'].split('-stackable')[0] }}-stackable{{ test_scenario['values']['spark'].split('-stackable')[1] }}
sparkImagePullPolicy: IfNotPresent
vectorAggregatorConfigMapName: spark-vector-aggregator-discovery
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/als.py
@@ -40,7 +40,6 @@ metadata:
spec:
version: "1.0"
sparkImage: docker.stackable.tech/stackable/pyspark-k8s:{{ test_scenario['values']['spark'].split('-stackable')[0] }}-stackable{{ test_scenario['values']['spark'].split('-stackable')[1] }}
sparkImagePullPolicy: IfNotPresent
vectorAggregatorConfigMapName: spark-vector-aggregator-discovery
mode: cluster
mainApplicationFile: local:///stackable/spark/examples/src/main/python/als.py
@@ -12,7 +12,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkALS
mainApplicationFile: "local:///stackable/spark/examples/jars/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar"
mainApplicationFile: "local:///stackable/spark/examples/jars/spark-examples.jar"
job:
logging:
enableVectorAgent: {{ lookup('env', 'VECTOR_AGGREGATOR') | length > 0 }}
@@ -4,10 +4,9 @@ kind: TestStep
commands:
# give minio enough time to start
- command: sleep 10
- command: kubectl cp -n $NAMESPACE spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar minio-client:/tmp
- command: kubectl cp -n $NAMESPACE spark-examples_{{ test_scenario['values']['spark'].split('-stackable')[0] }}.jar minio-client:/tmp/spark-examples.jar
- command: kubectl exec -n $NAMESPACE minio-client -- sh -c 'mc alias set test-minio http://test-minio:9000 $$MINIO_SERVER_ACCESS_KEY $$MINIO_SERVER_SECRET_KEY'
- command: kubectl exec -n $NAMESPACE minio-client -- mc mb test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE eventlog-minio-client -- sh -c 'mc alias set eventlog-minio http://eventlog-minio:9000 $$MINIO_SERVER_ACCESS_KEY $$MINIO_SERVER_SECRET_KEY'
- command: kubectl exec -n $NAMESPACE eventlog-minio-client -- mc mb eventlog-minio/spark-logs/eventlogs
- script: >-
kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples.jar test-minio/my-bucket
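The test step above now picks the versioned examples jar that matches the Spark product version (derived by splitting the test scenario's `spark` value on `-stackable`) and uploads it under the fixed name `spark-examples.jar` that the manifests reference. A rough Rust rendering of that Jinja expression, using a hypothetical helper:

```rust
// Rough equivalent of `spark-examples_{{ spark.split('-stackable')[0] }}.jar`
// from the test step above; `versioned_examples_jar` is illustrative only.
fn versioned_examples_jar(spark_value: &str) -> String {
    // e.g. "3.4.0-stackable0.0.0-dev" -> "spark-examples_3.4.0.jar"
    let product_version = spark_value.split("-stackable").next().unwrap_or(spark_value);
    format!("spark-examples_{product_version}.jar")
}

fn main() {
    assert_eq!(
        versioned_examples_jar("3.4.0-stackable0.0.0-dev"),
        "spark-examples_3.4.0.jar"
    );
    // The kubectl cp then renames it to the version-independent target name,
    // so SparkApplication manifests can always use s3a://my-bucket/spark-examples.jar.
    println!("{} -> spark-examples.jar", versioned_examples_jar("3.4.0-stackable0.0.0-dev"));
}
```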
@@ -12,7 +12,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: "s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar"
mainApplicationFile: "s3a://my-bucket/spark-examples.jar"
s3connection:
reference: spark-data-s3-connection
logFileDirectory:
@@ -12,7 +12,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: "s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar"
mainApplicationFile: "s3a://my-bucket/spark-examples.jar"
s3connection:
reference: spark-data-s3-connection
logFileDirectory:
@@ -4,14 +4,10 @@ kind: TestStep
commands:
# give minio enough time to start
- command: sleep 10
- command: kubectl cp -n $NAMESPACE ny-tlc-report-1.1.0-{{ test_scenario['values']['examples'] }}.jar minio-client:/tmp
- command: kubectl cp -n $NAMESPACE ny-tlc-report-1.1.0-{{ test_scenario['values']['spark'].split('-stackable')[0] }}.jar minio-client:/tmp/ny-tlc-report.jar
- command: kubectl cp -n $NAMESPACE yellow_tripdata_2021-07.csv minio-client:/tmp
- command: kubectl exec -n $NAMESPACE minio-client -- sh -c 'mc alias set test-minio http://test-minio:9000 $$MINIO_SERVER_ACCESS_KEY $$MINIO_SERVER_SECRET_KEY'
- command: kubectl exec -n $NAMESPACE minio-client -- mc mb test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc policy set public test-minio/my-bucket
- script: >-
kubectl exec -n $NAMESPACE minio-client --
mc cp /tmp/ny-tlc-report-1.1.0-{{ test_scenario['values']['examples'] }}.jar test-minio/my-bucket
- script: >-
kubectl exec -n $NAMESPACE minio-client --
mc cp /tmp/yellow_tripdata_2021-07.csv test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/ny-tlc-report.jar test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/yellow_tripdata_2021-07.csv test-minio/my-bucket
@@ -20,7 +20,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: tech.stackable.demo.spark.NYTLCReport
mainApplicationFile: "s3a://my-bucket/ny-tlc-report-1.1.0-{{ test_scenario['values']['examples'] }}.jar"
mainApplicationFile: "s3a://my-bucket/ny-tlc-report.jar"
volumes:
- name: cm-job-arguments
configMap:
@@ -4,8 +4,7 @@ kind: TestStep
commands:
# give minio enough time to start
- command: sleep 10
- command: kubectl cp -n $NAMESPACE spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar minio-client:/tmp
- command: kubectl cp -n $NAMESPACE spark-examples_{{ test_scenario['values']['spark'].split('-stackable')[0] }}.jar minio-client:/tmp/spark-examples.jar
- command: kubectl exec -n $NAMESPACE minio-client -- sh -c 'mc alias set test-minio http://test-minio:9000 $$MINIO_SERVER_ACCESS_KEY $$MINIO_SERVER_SECRET_KEY'
- command: kubectl exec -n $NAMESPACE minio-client -- mc mb test-minio/my-bucket
- script: >-
kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples.jar test-minio/my-bucket
@@ -12,7 +12,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar
mainApplicationFile: s3a://my-bucket/spark-examples.jar
s3connection:
inline:
host: test-minio
@@ -4,9 +4,8 @@ kind: TestStep
commands:
# give minio enough time to start
- command: sleep 10
- command: kubectl cp -n $NAMESPACE spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar minio-client:/tmp
- command: kubectl cp -n $NAMESPACE spark-examples_{{ test_scenario['values']['spark'].split('-stackable')[0] }}.jar minio-client:/tmp/spark-examples.jar
- command: kubectl exec -n $NAMESPACE minio-client -- sh -c 'mc alias set test-minio http://test-minio:9000 $$MINIO_SERVER_ACCESS_KEY $$MINIO_SERVER_SECRET_KEY'
- command: kubectl exec -n $NAMESPACE minio-client -- mc mb test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc policy set public test-minio/my-bucket
- script: >-
kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar test-minio/my-bucket
- command: kubectl exec -n $NAMESPACE minio-client -- mc cp /tmp/spark-examples.jar test-minio/my-bucket
@@ -12,7 +12,7 @@ spec:
sparkImagePullPolicy: IfNotPresent
mode: cluster
mainClass: org.apache.spark.examples.SparkPi
mainApplicationFile: s3a://my-bucket/spark-examples_2.12-{{ test_scenario['values']['examples'] }}.jar
mainApplicationFile: s3a://my-bucket/spark-examples.jar
s3connection:
inline:
host: test-minio