[Merged by Bors] - Bump image version to 3.3.0-stackable0.2.0 #145

Closed · wants to merge 3 commits
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,12 @@ All notable changes to this project will be documented in this file.
 
 ## [Unreleased]
 
+### Changed
+
+- Bumped image to `3.3.0-stackable0.2.0` in tests and docs ([#145])
+
+[#145]: https://github.com/stackabletech/spark-k8s-operator/pull/145
+
 ## [0.5.0] - 2022-09-06
 
 ### Added
2 changes: 1 addition & 1 deletion docs/modules/ROOT/examples/example-encapsulated.yaml
@@ -5,7 +5,7 @@ metadata:
   name: spark-pi
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0 # <1>
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0 # <1>
   mode: cluster
   mainClass: org.apache.spark.examples.SparkPi
   mainApplicationFile: /stackable/spark/examples/jars/spark-examples_2.12-3.3.0.jar # <2>
11 changes: 1 addition & 10 deletions docs/modules/ROOT/examples/example-sparkapp-configmap.yaml
@@ -6,38 +6,29 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.1.0.jar # <3>
   mainClass: tech.stackable.demo.spark.NYTLCReport
   volumes:
-    - name: job-deps
-      persistentVolumeClaim:
-        claimName: pvc-ksv
     - name: cm-job-arguments
       configMap:
         name: cm-job-arguments # <4>
   args:
     - "--input /arguments/job-args.txt" # <5>
   sparkConf:
     "spark.hadoop.fs.s3a.aws.credentials.provider": "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
-    "spark.driver.extraClassPath": "/dependencies/jars/*"
-    "spark.executor.extraClassPath": "/dependencies/jars/*"
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
     volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
       - name: cm-job-arguments # <6>
         mountPath: /arguments # <7>
   executor:
     cores: 1
     instances: 3
     memory: "512m"
     volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
       - name: cm-job-arguments # <6>
         mountPath: /arguments # <7>
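
The `cm-job-arguments` ConfigMap referenced at <4> must pre-exist and is not part of this diff. For orientation, here is a minimal sketch of what it might look like: the key `job-args.txt` follows from the `--input /arguments/job-args.txt` argument at <5>, while the data value shown is purely hypothetical.

---
apiVersion: v1
kind: ConfigMap
metadata:
  name: cm-job-arguments  # must match the name referenced at <4>
  namespace: default
data:
  # surfaced to the job at /arguments/job-args.txt via the mounts at <6>/<7>
  job-args.txt: |
    s3a://some-bucket/some-input.csv  # hypothetical contents; the real file holds the job's arguments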
@@ -6,7 +6,7 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny_tlc_report.py # <1>
   args:
14 changes: 1 addition & 13 deletions docs/modules/ROOT/examples/example-sparkapp-image.yaml
@@ -7,7 +7,7 @@ metadata:
 spec:
   version: "1.0"
   image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0 # <1>
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py # <2>
   args:
@@ -17,23 +17,11 @@ spec:
     - tabulate==0.8.9 # <4>
   sparkConf: # <5>
     "spark.hadoop.fs.s3a.aws.credentials.provider": "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
-    "spark.driver.extraClassPath": "/dependencies/jars/*"
-    "spark.executor.extraClassPath": "/dependencies/jars/*"
-  volumes:
-    - name: job-deps # <6>
-      persistentVolumeClaim:
-        claimName: pvc-ksv
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies # <7>
   executor:
     cores: 1
     instances: 3
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies # <7>
2 changes: 1 addition & 1 deletion docs/modules/ROOT/examples/example-sparkapp-pvc.yaml
@@ -6,7 +6,7 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: s3a://stackable-spark-k8s-jars/jobs/ny-tlc-report-1.0-SNAPSHOT.jar # <1>
   mainClass: org.example.App # <2>
12 changes: 1 addition & 11 deletions docs/modules/ROOT/examples/example-sparkapp-s3-private.yaml
@@ -5,7 +5,7 @@ metadata:
   name: example-sparkapp-s3-private
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: s3a://my-bucket/spark-examples_2.12-3.3.0.jar # <1>
   mainClass: org.apache.spark.examples.SparkPi # <2>
@@ -23,21 +23,11 @@ spec:
     spark.hadoop.fs.s3a.aws.credentials.provider: "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider" # <6>
     spark.driver.extraClassPath: "/dependencies/jars/hadoop-aws-3.2.0.jar:/dependencies/jars/aws-java-sdk-bundle-1.11.375.jar"
     spark.executor.extraClassPath: "/dependencies/jars/hadoop-aws-3.2.0.jar:/dependencies/jars/aws-java-sdk-bundle-1.11.375.jar"
-  volumes:
-    - name: spark-pi-deps # <7>
-      persistentVolumeClaim:
-        claimName: spark-pi-pvc
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
-    volumeMounts:
-      - name: spark-pi-deps
-        mountPath: /dependencies # <8>
   executor:
     cores: 1
     instances: 3
     memory: "512m"
-    volumeMounts:
-      - name: spark-pi-deps
-        mountPath: /dependencies # <8>
6 changes: 1 addition & 5 deletions docs/modules/ROOT/pages/usage.adoc
@@ -42,8 +42,6 @@ include::example$example-sparkapp-image.yaml[]
 <3> Job argument (external)
 <4> List of python job requirements: these will be installed in the pods via `pip`
 <5> Spark dependencies: the credentials provider (the user knows what is relevant here) plus dependencies needed to access external resources (in this case, in an S3 store)
-<6> the name of the volume mount backed by a `PersistentVolumeClaim` that must be pre-existing
-<7> the path on the volume mount: this is referenced in the `sparkConf` section where the extra class path is defined for the driver and executors
 
 === JVM (Scala): externally located artifact and dataset
 
@@ -71,8 +69,6 @@ include::example$example-sparkapp-s3-private.yaml[]
 <4> Credentials referencing a secretClass (not shown in is example)
 <5> Spark dependencies: the credentials provider (the user knows what is relevant here) plus dependencies needed to access external resources...
 <6> ...in this case, in an S3 store, accessed with the credentials defined in the secret
-<7> the name of the volume mount backed by a `PersistentVolumeClaim` that must be pre-existing
-<8> the path on the volume mount: this is referenced in the `sparkConf` section where the extra class path is defined for the driver and executors
 
 === JVM (Scala): externally located artifact accessed with job arguments provided via configuration map
 
@@ -174,7 +170,7 @@ Below are listed the CRD fields that can be defined by the user:
 |User-supplied image containing spark-job dependencies that will be copied to the specified volume mount
 
 |`spec.sparkImage`
-| Spark image which will be deployed to driver and executor pods, which must contain spark environment needed by the job e.g. `docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0`
+| Spark image which will be deployed to driver and executor pods, which must contain spark environment needed by the job e.g. `docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0`
 
 |`spec.sparkImagePullPolicy`
 | Optional Enum (one of `Always`, `IfNotPresent` or `Never`) that determines the pull policy of the spark job image
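
For a quick look at how `spec.sparkImage` and `spec.sparkImagePullPolicy` combine, the sketch below assembles a minimal SparkApplication purely from values that appear elsewhere in this PR (the SparkPi example plus the pull-policy enum); it is illustrative, not an additional official example.

---
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: spark-pi
spec:
  version: "1.0"
  # the tag this PR bumps across tests and docs
  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0
  # Always | IfNotPresent | Never
  sparkImagePullPolicy: IfNotPresent
  mode: cluster
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: /stackable/spark/examples/jars/spark-examples_2.12-3.3.0.jar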
@@ -53,7 +53,7 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
   driver:
@@ -53,7 +53,7 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
   driver:
2 changes: 1 addition & 1 deletion examples/ny-tlc-report-external-dependencies.yaml
@@ -6,7 +6,7 @@ metadata:
   namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   # Always | IfNotPresent | Never
   sparkImagePullPolicy: IfNotPresent
   mode: cluster
14 changes: 1 addition & 13 deletions examples/ny-tlc-report-image.yaml
@@ -8,7 +8,7 @@ spec:
   version: "1.0"
   # everything under /jobs will be copied to /stackable/spark/jobs
   image: docker.stackable.tech/stackable/ny-tlc-report:0.1.0
-  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/pyspark-k8s:3.3.0-stackable0.2.0
   sparkImagePullPolicy: IfNotPresent
   mode: cluster
   mainApplicationFile: local:///stackable/spark/jobs/ny_tlc_report.py
@@ -27,23 +27,11 @@ spec:
       accessStyle: Path
   sparkConf:
     spark.hadoop.fs.s3a.aws.credentials.provider: "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
-    spark.driver.extraClassPath: "/dependencies/jars/*"
-    spark.executor.extraClassPath: "/dependencies/jars/*"
-  volumes:
-    - name: job-deps
-      persistentVolumeClaim:
-        claimName: pvc-ksv
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
   executor:
     cores: 1
     instances: 3
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
12 changes: 1 addition & 11 deletions examples/ny-tlc-report.yaml
@@ -11,17 +11,13 @@ apiVersion: spark.stackable.tech/v1alpha1
 kind: SparkApplication
 metadata:
   name: spark-ny-cm
-  namespace: default
 spec:
   version: "1.0"
-  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.1.0
+  sparkImage: docker.stackable.tech/stackable/spark-k8s:3.3.0-stackable0.2.0
   mode: cluster
   mainApplicationFile: s3a://my-bucket/ny-tlc-report-1.1.0-3.3.0.jar
   mainClass: tech.stackable.demo.spark.NYTLCReport
   volumes:
-    - name: job-deps
-      persistentVolumeClaim:
-        claimName: pvc-ksv
     - name: cm-job-arguments
       configMap:
         name: cm-job-arguments
@@ -37,23 +33,17 @@ spec:
       accessStyle: Path
   sparkConf:
     spark.hadoop.fs.s3a.aws.credentials.provider: "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
-    spark.driver.extraClassPath: "/dependencies/jars/*"
-    spark.executor.extraClassPath: "/dependencies/jars/*"
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
     volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
       - name: cm-job-arguments
         mountPath: /arguments
   executor:
     cores: 1
     instances: 3
     memory: "512m"
     volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies
       - name: cm-job-arguments
         mountPath: /arguments
@@ -1,8 +1,6 @@
 ---
 apiVersion: kuttl.dev/v1beta1
 kind: TestAssert
-metadata:
-  name: minio
 timeout: 900
 ---
 apiVersion: apps/v1
13 changes: 0 additions & 13 deletions tests/templates/kuttl/pyspark-ny-public-s3-image/02-assert.yaml

This file was deleted.

This file was deleted.

@@ -1,8 +1,6 @@
 ---
 apiVersion: kuttl.dev/v1beta1
 kind: TestAssert
-metadata:
-  name: pyspark-ny-public-s3-image
 timeout: 900
 ---
 # The Job starting the whole process
@@ -26,23 +26,11 @@ spec:
       accessStyle: Path
   sparkConf:
     spark.hadoop.fs.s3a.aws.credentials.provider: "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider"
-    spark.driver.extraClassPath: "/dependencies/jars/*"
-    spark.executor.extraClassPath: "/dependencies/jars/*"
-  volumes:
-    - name: job-deps
-      persistentVolumeClaim:
-        claimName: pyspark-ny-pvc
   driver:
     cores: 1
     coreLimit: "1200m"
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies/jars
   executor:
     cores: 1
     instances: 3
     memory: "512m"
-    volumeMounts:
-      - name: job-deps
-        mountPath: /dependencies/jars
2 changes: 0 additions & 2 deletions tests/templates/kuttl/pyspark-ny-public-s3/00-assert.yaml
@@ -1,8 +1,6 @@
 ---
 apiVersion: kuttl.dev/v1beta1
 kind: TestAssert
-metadata:
-  name: minio
 timeout: 900
 ---
 apiVersion: apps/v1