Commit f2f6f4a

updating versions for release 1.1.0

1 parent 63bebaf · commit f2f6f4a

File tree: 1 file changed (+20 −19 lines)


README.md

Lines changed: 20 additions & 19 deletions
````diff
@@ -1,4 +1,4 @@
-# Kotlin for Apache® Spark™ [![Maven Central](https://img.shields.io/maven-central/v/org.jetbrains.kotlinx.spark/kotlin-spark-api-parent.svg?label=Maven%20Central)](https://search.maven.org/search?q=g:org.jetbrains.kotlinx.spark%20AND%20v:1.0.2) [![official JetBrains project](http://jb.gg/badges/official.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub) [![Join the chat at https://gitter.im/JetBrains/kotlin-spark-api](https://badges.gitter.im/JetBrains/kotlin-spark-api.svg)](https://gitter.im/JetBrains/kotlin-spark-api?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
+# Kotlin for Apache® Spark™ [![Maven Central](https://img.shields.io/maven-central/v/org.jetbrains.kotlinx.spark/kotlin-spark-api-parent.svg?label=Maven%20Central)](https://search.maven.org/search?q=g:org.jetbrains.kotlinx.spark%20AND%20v:1.1.0) [![official JetBrains project](http://jb.gg/badges/official.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub) [![Join the chat at https://gitter.im/JetBrains/kotlin-spark-api](https://badges.gitter.im/JetBrains/kotlin-spark-api.svg)](https://gitter.im/JetBrains/kotlin-spark-api?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
 
 
 Your next API to work with [Apache Spark](https://spark.apache.org/).
````
````diff
@@ -31,20 +31,21 @@ We have opened a Spark Project Improvement Proposal: [Kotlin support for Apache
 
 ## Supported versions of Apache Spark
 
-| Apache Spark | Scala | Kotlin for Apache Spark |
+| Apache Spark | Scala |     Kotlin for Apache Spark     |
 |:------------:|:-----:|:-------------------------------:|
-| 3.0.0+ | 2.12 | kotlin-spark-api-3.0:1.0.2 |
-| 2.4.1+ | 2.12 | kotlin-spark-api-2.4_2.12:1.0.2 |
-| 2.4.1+ | 2.11 | kotlin-spark-api-2.4_2.11:1.0.2 |
-| 3.2.0+ | 2.12 | kotlin-spark-api-3.2:1.0.3 |
+| 3.2.1+ | 2.12 | kotlin-spark-api-3.2:1.1.0 |
+| 3.1.3+ | 2.12 | kotlin-spark-api-3.1:1.1.0 |
+| 3.0.3+ | 2.12 | kotlin-spark-api-3.0:1.1.0 |
+| 2.4.1+ | 2.12 | kotlin-spark-api-2.4_2.12:1.0.2 |
+| 2.4.1+ | 2.11 | kotlin-spark-api-2.4_2.11:1.0.2 |
 
 ## Releases
 
 The list of Kotlin for Apache Spark releases is available [here](https://github.com/JetBrains/kotlin-spark-api/releases/).
 The Kotlin for Spark artifacts adhere to the following convention:
 `[Apache Spark version]_[Scala core version]:[Kotlin for Apache Spark API version]`
 
-[![Maven Central](https://img.shields.io/maven-central/v/org.jetbrains.kotlinx.spark/kotlin-spark-api-parent.svg?label=Maven%20Central)](https://search.maven.org/search?q=g:"org.jetbrains.kotlinx.spark"%20AND%20a:"kotlin-spark-api-3.0")
+[![Maven Central](https://img.shields.io/maven-central/v/org.jetbrains.kotlinx.spark/kotlin-spark-api-parent.svg?label=Maven%20Central)](https://search.maven.org/search?q=g:"org.jetbrains.kotlinx.spark"%20AND%20a:"kotlin-spark-api-3.2")
 
 ## How to configure Kotlin for Apache Spark in your project
````
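As a worked reading of that naming convention (this decomposition is an illustration, not text from the README), the coordinates in the updated table break down as:

```
kotlin-spark-api-3.2:1.1.0       →  Spark 3.2 (single Scala build, 2.12) : API version 1.1.0
kotlin-spark-api-2.4_2.12:1.0.2  →  Spark 2.4 _ Scala 2.12 : API version 1.0.2
```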

````diff
@@ -55,7 +56,7 @@ Here's an example `pom.xml`:
 ```xml
 <dependency>
   <groupId>org.jetbrains.kotlinx.spark</groupId>
-  <artifactId>kotlin-spark-api-3.0</artifactId>
+  <artifactId>kotlin-spark-api-3.2</artifactId>
   <version>${kotlin-spark-api.version}</version>
 </dependency>
 <dependency>
````
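For Gradle builds, the same coordinate can be sketched in the Kotlin DSL. This fragment mirrors the `pom.xml` above under the released versions; it is an illustrative sketch, not taken from the README, and assumes the standard Maven Central repository is configured:

```kotlin
// build.gradle.kts — hypothetical sketch mirroring the Maven snippet above
dependencies {
    // group:artifact:version per the naming convention in this README
    implementation("org.jetbrains.kotlinx.spark:kotlin-spark-api-3.2:1.1.0")
}
```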
````diff
@@ -84,7 +85,7 @@ To it, simply add
 to the top of your notebook. This will get the latest version of the API, together with the latest version of Spark.
 To define a certain version of Spark or the API itself, simply add it like this:
 ```jupyterpython
-%use kotlin-spark-api(spark=3.2, version=1.0.4)
+%use kotlin-spark-api(spark=3.2, v=1.1.0)
 ```
 
 Inside the notebook a Spark session will be initiated automatically. This can be accessed via the `spark` value.
````
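Assuming the session is exposed as the `spark` value, as the context line above states, a first notebook cell after the `%use` line might look like this (a sketch; `dsOf` as a session-level Dataset builder is taken from the API's own examples later in the README):

```jupyterpython
// `spark` is created by the %use line above
spark.dsOf(1, 2, 3).show()
```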
````diff
@@ -134,8 +135,8 @@ Do not use this when running the Kotlin Spark API from a Jupyter notebook.
 ```kotlin
 withSpark {
     dsOf(1, 2)
-            .map { it X it } // creates Tuple2<Int, Int>
-            .show()
+        .map { it X it } // creates Tuple2<Int, Int>
+        .show()
 }
 ```
````
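The `X` infix in the hunk above builds a `Tuple2` from two values. A Spark-free sketch of the same shape, using a locally defined stand-in built on Kotlin's `Pair` (this is not the Spark API's own `X`):

```kotlin
// Local stand-in for the API's tuple-building infix function:
// pairs a left value with a right value, like `to` does for Pair.
infix fun <L, R> L.X(that: R): Pair<L, R> = this to that

fun main() {
    // Mirrors dsOf(1, 2).map { it X it } from the diff, on a plain list
    val pairs = listOf(1, 2).map { it X it }
    println(pairs) // [(1, 1), (2, 2)]
}
```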

````diff
@@ -152,14 +153,14 @@ To solve these problems we've added `withCached` function
 ```kotlin
 withSpark {
     dsOf(1, 2, 3, 4, 5)
-            .map { tupleOf(it, it + 2) }
-            .withCached {
-                showDS()
-
-                filter { it._1 % 2 == 0 }.showDS()
-            }
-            .map { tupleOf(it._1, it._2, (it._1 + it._2) * 2) }
-            .show()
+        .map { tupleOf(it, it + 2) }
+        .withCached {
+            showDS()
+
+            filter { it._1 % 2 == 0 }.showDS()
+        }
+        .map { tupleOf(it._1, it._2, (it._1 + it._2) * 2) }
+        .show()
 }
 ```
````
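The snippet above feeds one cached Dataset to two consumers: a filtered branch shown inside `withCached`, and a widening `map` afterwards. A Spark-free sketch of that dataflow on Kotlin collections (`Pair`/`Triple` standing in for `Tuple2`/`Tuple3`; in Spark, `withCached` exists so the shared Dataset is not recomputed per consumer):

```kotlin
fun main() {
    // tupleOf(it, it + 2) over 1..5, as a list of Pairs
    val ds = listOf(1, 2, 3, 4, 5).map { it to it + 2 }

    // Branch shown inside withCached: keep pairs with an even first component
    val evens = ds.filter { it.first % 2 == 0 }
    println(evens)   // [(2, 4), (4, 6)]

    // Main branch: widen each pair to a triple, tupleOf(_1, _2, (_1 + _2) * 2)
    val widened = ds.map { (a, b) -> Triple(a, b, (a + b) * 2) }
    println(widened) // [(1, 3, 8), (2, 4, 12), (3, 5, 16), (4, 6, 20), (5, 7, 24)]
}
```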
