Commit 58333ac

Merge branch 'main' of https://github.com/stackabletech/demos into bump/nifi-2.2.0-git-registry-client

2 parents: 33701c4 + 8468d5d

File tree

34 files changed (+39 / -38 lines)

.github/ISSUE_TEMPLATE/pre-release-upgrade-testing.md

Lines changed: 1 addition & 0 deletions

@@ -57,6 +57,7 @@ Replace the items in the task lists below with the applicable Pull Requests (if
 - [ ] [data-lakehouse-iceberg-trino-spark](https://docs.stackable.tech/home/nightly/demos/data-lakehouse-iceberg-trino-spark)
 - [ ] [end-to-end-security](https://docs.stackable.tech/home/nightly/demos/end-to-end-security)
 - [ ] [hbase-hdfs-load-cycling-data](https://docs.stackable.tech/home/nightly/demos/hbase-hdfs-load-cycling-data)
+- [ ] [jupyterhub-keycloak](https://docs.stackable.tech/home/nightly/demos/jupyterhub-keycloak)
 - [ ] [jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data](https://docs.stackable.tech/home/nightly/demos/jupyterhub-pyspark-hdfs-anomaly-detection-taxi-data)
 - [ ] [logging](https://docs.stackable.tech/home/nightly/demos/logging)
 - [ ] [nifi-kafka-druid-earthquake-data](https://docs.stackable.tech/home/nightly/demos/nifi-kafka-druid-earthquake-data)

demos/data-lakehouse-iceberg-trino-spark/create-spark-ingestion-job.yaml

Lines changed: 3 additions & 3 deletions

@@ -43,13 +43,13 @@ data:
         stackable.tech/vendor: Stackable
     spec:
       sparkImage:
-        productVersion: 3.5.2
+        productVersion: 3.5.5
       mode: cluster
       mainApplicationFile: local:///stackable/spark/jobs/spark-ingest-into-lakehouse.py
       deps:
         packages:
-          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.7.0
-          - org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.2
+          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.8.1
+          - org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.5
       s3connection:
         reference: minio
       sparkConf:
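The Spark version bumps in this file touch both `sparkImage.productVersion` and the Maven coordinates in `deps.packages`, because the `spark-sql-kafka` connector is released in lockstep with Spark itself and the Iceberg runtime artifact embeds the Spark minor line (`3.5`) and Scala binary version (`2.12`) in its artifact id. A small illustrative consistency check (the helper names are hypothetical, not part of this repository):

```python
# Illustrative check: do the deps.packages coordinates agree with
# sparkImage.productVersion? Coordinates look like group:artifact:version.

def spark_line(product_version: str) -> str:
    """Return the Spark minor line, e.g. '3.5.5' -> '3.5'."""
    return ".".join(product_version.split(".")[:2])

def check_packages(product_version: str, packages: list[str]) -> list[str]:
    """Return the coordinates that disagree with the given Spark version."""
    line = spark_line(product_version)
    bad = []
    for coord in packages:
        _group, artifact, version = coord.split(":")
        # spark-sql-kafka ships with Spark, so its version must match exactly.
        if artifact.startswith("spark-sql-kafka") and version != product_version:
            bad.append(coord)
        # iceberg-spark-runtime embeds the Spark line in its artifact id.
        if artifact.startswith("iceberg-spark-runtime") and f"-{line}_" not in artifact:
            bad.append(coord)
    return bad
```

With the bumped values from this diff, `check_packages("3.5.5", [...])` returns an empty list; leaving the old `spark-sql-kafka-0-10_2.12:3.5.2` in place would flag it.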

demos/end-to-end-security/create-spark-report.yaml

Lines changed: 2 additions & 2 deletions

@@ -55,12 +55,12 @@ data:
       name: spark-report
     spec:
       sparkImage:
-        productVersion: 3.5.2
+        productVersion: 3.5.5
      mode: cluster
       mainApplicationFile: local:///stackable/spark/jobs/spark-report.py
       deps:
         packages:
-          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.7.0
+          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.8.1
       sparkConf:
         spark.driver.extraClassPath: /stackable/config/hdfs
         spark.executor.extraClassPath: /stackable/config/hdfs

demos/spark-k8s-anomaly-detection-taxi-data/create-spark-anomaly-detection-job.yaml

Lines changed: 3 additions & 3 deletions

@@ -37,13 +37,13 @@ data:
       name: spark-ad
     spec:
       sparkImage:
-        productVersion: 3.5.2
+        productVersion: 3.5.5
       mode: cluster
       mainApplicationFile: local:///spark-scripts/spark-ad.py
       deps:
         packages:
-          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0
-          - org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0
+          - org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.8.1
+          - org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.5
        requirements:
          - scikit-learn==1.4.0
       s3connection:

docs/modules/demos/pages/airflow-scheduled-job.adoc

Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ This demo should not be run alongside other demos.
 To run this demo, your system needs at least:

 * 2.5 https://kubernetes.io/docs/tasks/debug/debug-cluster/resource-metrics-pipeline/#cpu[cpu units] (core/hyperthread)
-* 9GiB memory
+* 10GiB memory
 * 24GiB disk storage

 == Overview

docs/modules/demos/pages/trino-iceberg.adoc

Lines changed: 1 addition & 1 deletion

@@ -77,7 +77,7 @@ As an alternative, you can use https://trino.io/download.html[trino-cli] by runn

 [source,console]
 ----
-$ java -jar ~/Downloads/trino-cli-455-executable.jar --user admin --insecure --password --server https://172.18.0.2:30856
+$ java -jar ~/Downloads/trino-cli-470-executable.jar --user admin --insecure --password --server https://172.18.0.2:30856
 ----

 Make sure to replace the server endpoint with the endpoint listed in the `stackablectl stacklet list` output.
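The docs reference the self-executing trino-cli jar by version (here bumped from 455 to 470). Assuming the usual Maven Central layout for `io.trino:trino-cli` (an assumption; the docs only point at trino.io/download.html), the download URL for a given release can be sketched as:

```python
# Hypothetical helper, not part of the docs: build a trino-cli download URL.
# Assumes the Maven Central path io/trino/trino-cli/<v>/trino-cli-<v>-executable.jar,
# matching the jar name used in the docs (trino-cli-470-executable.jar).
def trino_cli_url(version: int) -> str:
    base = "https://repo1.maven.org/maven2/io/trino/trino-cli"
    return f"{base}/{version}/trino-cli-{version}-executable.jar"
```

For example, `trino_cli_url(470)` yields a URL ending in `trino-cli-470-executable.jar`, the jar name the updated docs invoke with `java -jar`.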

stacks/_templates/jupyterhub.yaml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ name: jupyterhub
 repo:
   name: jupyterhub
   url: https://jupyterhub.github.io/helm-chart/
-version: 4.0.0
+version: 4.1.0 # 5.2.1
 options:
   hub:
     config:
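In these stack templates, the upstream application version is recorded as a trailing comment next to the Helm chart version (here `version: 4.1.0 # 5.2.1`; the minio templates use the same convention). A minimal illustrative parser for that convention (helper name is hypothetical):

```python
# Split a "version:" line from the stack templates into the chart version
# and the optional trailing comment noting the bundled application version.
def split_version_line(line: str) -> tuple:
    _key, _, rest = line.strip().partition(":")
    value, _, comment = rest.partition("#")
    return value.strip(), (comment.strip() or None)
```

For instance, `split_version_line("version: 4.1.0 # 5.2.1")` gives `("4.1.0", "5.2.1")`, while a line without a comment yields `None` for the second element.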

stacks/_templates/keycloak.yaml

Lines changed: 1 addition & 1 deletion

@@ -17,7 +17,7 @@ spec:
     spec:
       containers:
         - name: keycloak
-          image: quay.io/keycloak/keycloak:26.0.5
+          image: quay.io/keycloak/keycloak:26.1.4
           args:
             - start
             - --hostname-strict=false

stacks/_templates/minio-distributed-small.yaml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ name: minio
 repo:
   name: minio
   url: https://charts.min.io/
-version: 5.3.0
+version: 5.4.0 # RELEASE.2024-12-18T13-15-44Z
 options:
   additionalLabels:
     stackable.tech/vendor: Stackable

stacks/_templates/minio-distributed.yaml

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ name: minio
 repo:
   name: minio
   url: https://charts.min.io/
-version: 5.3.0
+version: 5.4.0 # RELEASE.2024-12-18T13-15-44Z
 options:
   additionalLabels:
     stackable.tech/vendor: Stackable
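The comment recorded next to the bumped minio chart version, `RELEASE.2024-12-18T13-15-44Z`, follows MinIO's UTC-timestamp release-tag scheme, which makes release tags comparable by date. An illustrative parse of that tag format (the helper is a sketch, not part of the templates):

```python
from datetime import datetime

# Parse a MinIO release tag of the form RELEASE.<UTC timestamp> into a
# datetime, so two server releases can be ordered chronologically.
def parse_minio_release(tag: str) -> datetime:
    return datetime.strptime(tag, "RELEASE.%Y-%m-%dT%H-%M-%SZ")
```

Parsing the tag from this diff gives the release instant 2024-12-18 13:15:44 UTC; comparing two parsed tags with `<` tells you which server build is older.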
