SDK migration guides
This guide provides hints for upgrading your SDK projects to the latest versions. Each section includes an overview of key changes, compatibility considerations, and best practices to ensure a smooth transition.
Scala 2.12 to Scala 2.13 migration guide
This section describes high-level changes that have to be made to migrate your app from Scala 2.12 to Scala 2.13.
Scala 2.13 support was introduced as part of the HERE Data SDK for Java & Scala 2.80 release.
1. New SDK BOM files with the `_2.13` suffix were added for each environment. To migrate to Scala 2.13, update the SDK BOM `artifactId`s to include the `_2.13` suffix as follows:
   - `sdk-batch-bom_2.12` -> `sdk-batch-bom_2.13`
   - `sdk-stream-bom_2.12` -> `sdk-stream-bom_2.13`
   - `sdk-standalone-bom_2.12` -> `sdk-standalone-bom_2.13`
2. Change your app's dependencies with the `_2.12` suffix to use `_2.13`. It is recommended to use `_${scala.compat.version}` as the suffix. For instance:
```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.compat.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
```

For convenience, all SDK BOM files declare the `${scala.compat.version}` property. To use the property, the SDK BOM must be added as a parent of your project.
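As a sketch, the BOM can be declared as the project parent so that `${scala.compat.version}` resolves automatically. The `groupId` and version below are illustrative assumptions; check the SDK release notes for the exact coordinates:

```xml
<parent>
    <groupId>com.here.platform</groupId>
    <artifactId>sdk-batch-bom_2.13</artifactId>
    <!-- Illustrative version; use the SDK release you are migrating to -->
    <version>2.80.0</version>
    <relativePath/>
</parent>
```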
3. Schemas maintained by HERE have been updated to support Scala 2.13:
- HERE Map Content Schema
- SDII Schema
- Sensoris Schema
- Real-Time Traffic Schema
- Weather Schema
- Optimized Map for Location Library
All Scala 2.13 bindings have the `_2.13` suffix. To use the above-mentioned schemas, add the `_2.13` suffix to the schema `artifactId`. For detailed steps on how to update or add a new Scala 2.13 module to your own schema projects, see Adding Scala 2.13 module to your schema project.
For more details on how to include a schema in a Maven or sbt project, refer to the schema page on the platform; see the Sensoris schema as an example.
4. Your own schemas published on the platform must be updated to support Scala 2.13 by adding Scala 2.13 bindings to the schema project.
5. Since the 2.80 release, the HERE platform supports two additional pipeline environments for Scala 2.13. To run Scala 2.13 applications on the platform, use the newest versions:
   - `stream-6.0` -> `stream-6.0.1`
   - `batch-4.3` -> `batch-4.0.16`
If you use the OLP CLI, you must change the `<cluster type>` parameter in the `olp pipeline template create` command.
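For orientation, the command shape is roughly as follows. This is an illustrative sketch only: the template name, JAR path, main class, and catalog option are hypothetical, and the exact argument order should be checked against the OLP CLI reference.

```shell
# Illustrative only; consult the OLP CLI reference for exact arguments.
olp pipeline template create my-template stream-6.0.1 \
    target/my-app-fat.jar com.example.MyMain \
    --input-catalog-ids input-catalog
```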
Additional Migration Notes
Code-level changes in Scala 2.13
Scala 2.13 introduced significant changes to the standard collections library:
- `import scala.collection.JavaConverters._` should be replaced with `import scala.jdk.CollectionConverters._`.
- Some methods were renamed or dropped (`.to[Collection]` replaced `CanBuildFrom`, `Stream` was replaced with `LazyList`, and so on).
- Scala 2.13 shows a warning if a non-exhaustive match expression has no default case.

Refer to the Scala 2.13 Migration Guide for more detailed examples and changes.
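The collection-related changes above can be sketched as follows; this is a minimal example (the object name and values are illustrative, not part of the SDK):

```scala
// Scala 2.13: scala.jdk.CollectionConverters replaces scala.collection.JavaConverters
import scala.jdk.CollectionConverters._

object CollectionsMigration extends App {
  // Converting a Java list to a Scala collection via .asScala still works;
  // only the import location changed.
  val javaList = java.util.Arrays.asList(1, 2, 3)
  println(javaList.asScala.toList) // List(1, 2, 3)

  // LazyList replaces the deprecated scala.collection.immutable.Stream.
  val evens = LazyList.from(0).filter(_ % 2 == 0).take(3).toList
  println(evens) // List(0, 2, 4)
}
```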
Adding Scala 2.13 module to your schema project
Follow these steps to add the new `scala_2.13` module to your project:
1. Create a new module in your schema:
   - Create a `scala_2.13` directory in your root project folder.
   - Copy the Scala 2.12 Maven module configuration from `scala_2.12/pom.xml` to `scala_2.13/pom.xml`.
   - Update the `modules` section of the `pom.xml` file in your root project directory to include the new module. Example:
```xml
<modules>
    <module>proto</module>
    <module>java</module>
    <module>scala_2.12</module>
    <module>scala_2.13</module>
    <module>ds</module>
</modules>
```
2. Update `scala_2.13/pom.xml` as follows:
   - Rename the `artifactId` from `<schema-name>_scala_2.12` to `<schema-name>_scala_2.13`, where `<schema-name>` is the name of your schema. Note that no changes should be made in `scala_2.12/pom.xml`, so that your schema remains backwards compatible with previous versions.
   - Update the `name` and `description` appropriately for the new module.
   - Update the `scala-library` version in the `dependencies` section to 2.13.x as shown in the example below:
```xml
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.13.16</version>
</dependency>
```
   - In the `dependencies` section, replace the `scalapb-runtime_2.12` dependency with the `scalapb-runtime_2.13` one as follows:
```xml
<dependency>
    <groupId>com.thesamet.scalapb</groupId>
    <artifactId>scalapb-runtime_2.13</artifactId>
    <version>0.11.19</version>
    <exclusions>
        <exclusion>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```
3. In the `build > plugins > protoc-scala-maven-plugin` plugin:
   - Rename the `artifactId` from `protoc-scala-maven-plugin_2.12` to `protoc-scala-maven-plugin_2.13`.
   - Remove the `dependencies` section of the plugin if it exists. The final `protoc-scala-maven-plugin_2.13` configuration may look like this:
```xml
<plugin>
    <groupId>com.here.platform.schema.maven_plugins</groupId>
    <artifactId>protoc-scala-maven-plugin_2.13</artifactId>
    <version>${here.plugin.version}</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>scala-protoc-mojo</goal>
            </goals>
            <configuration>
                <protocVersion>${protoc.version}</protocVersion>
                <inputDirectories>
                    <include>${project.build.directory}/proto</include>
                </inputDirectories>
                <includeDirectories>
                    <include>${project.build.directory}/proto-lib</include>
                </includeDirectories>
                <includeStdTypes>false</includeStdTypes>
            </configuration>
        </execution>
    </executions>
</plugin>
```
Note: If your schema project contains both Scala 2.12 and 2.13 modules, do not use the `scala.version` or `scala.compat.version` Maven properties. These variables may lead to runtime issues that make either the Scala 2.12 or the Scala 2.13 bindings unusable.
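For example, hard-code the suffix in each module instead of deriving it from a shared property. This is a sketch of the idea; the artifact shown is only one of the affected dependencies:

```xml
<!-- scala_2.13/pom.xml: hard-coded suffix, safe for multi-module schemas -->
<dependency>
    <groupId>com.thesamet.scalapb</groupId>
    <artifactId>scalapb-runtime_2.13</artifactId>
</dependency>

<!-- Avoid: a shared ${scala.compat.version} property would resolve to the
     same value in both the scala_2.12 and the scala_2.13 module -->
<!--
<artifactId>scalapb-runtime_${scala.compat.version}</artifactId>
-->
```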
Known Limitations & Workarounds
Apache Flink 1.19.2
Apache Flink 1.19.2 does not provide Scala 2.13 binaries.
Because of this, some dependencies must be adjusted when migrating:
- Replace `org.apache.flink:flink-streaming-scala_2.12` with `org.apache.flink:flink-streaming-java` (recommended).
- Replace `org.apache.flink:flink-scala_2.12` with the Scala 2.13 library `org.flinkextended:flink-scala-api_2.13`.
These changes ensure compatibility with Scala 2.13 while continuing to run on Flink 1.19.2.
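In Maven terms, the replacement might look like the following sketch. The version property for `flink-scala-api` is a placeholder assumption; align both versions with your SDK BOM and the flink-scala-api release matrix:

```xml
<!-- Recommended: the Java streaming API, which has no Scala suffix -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>1.19.2</version>
    <scope>provided</scope>
</dependency>
<!-- Community Scala API for Flink with Scala 2.13 bindings -->
<dependency>
    <groupId>org.flinkextended</groupId>
    <artifactId>flink-scala-api_2.13</artifactId>
    <!-- Placeholder property; pick the release matching Flink 1.19.x -->
    <version>${flink.scala.api.version}</version>
</dependency>
```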
Migrate the SDK Projects from Spark 2.4.7 to 3.4.1
This section describes high-level changes that have to be made to migrate your app from Spark 2.4.7 to Spark 3.4.1.
Spark 3.4.1 support was introduced as part of
the HERE Data SDK for Java & Scala 2.58 release.
To migrate your project to use Spark 3.4.1, do the following:
- Update the `sdk-batch-bom_2.12` version to `2.58.2` or later.
- If you use `com.thesamet.scalapb:sparksql-scalapb_2.12`, replace it with `com.thesamet.scalapb:sparksql33-scalapb0_10_2.12` to support Spark 3.4.1.
- See the Spark Migration Guide to cover the specific cases of your project.
- Use the `batch-4.0` cluster type to run your application on the platform. If you use the OLP CLI, you must change the `<cluster type>` parameter in the `olp pipeline template create` command.
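For reference, the replacement dependency might be declared like this sketch; the version property is a placeholder, and in practice the version should come from the SDK BOM's dependency management:

```xml
<dependency>
    <groupId>com.thesamet.scalapb</groupId>
    <artifactId>sparksql33-scalapb0_10_2.12</artifactId>
    <!-- Placeholder; prefer the version managed by the SDK BOM -->
    <version>${sparksql.scalapb.version}</version>
</dependency>
```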
Migrate the SDK Projects from Akka to Pekko
Starting from version 2.68.3, the HERE SDK for Java and Scala has been migrated from Apache Akka 2.5.32 to Apache Pekko 1.0.2.
To migrate your project, follow these steps:
1. Use SDK BOM version `2.68.3` or later.
2. Upgrade from Akka `2.5.32` to Akka `2.6.21`. Follow the official Akka migration guide: Akka 2.5.x to 2.6.x Migration Guide.
3. Upgrade from Akka `2.6.21` to Pekko `1.0.2`. Follow the official Pekko migration guide: Pekko Migration Guide.
4. For Flink-related projects, ensure that all `reference.conf` files from the various dependencies are appended and merged into a single file in the fat JAR to avoid configuration issues. For more information, see the Data Client Developer Guide.
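With the Maven Shade plugin, this merge can be done with an `AppendingTransformer`; the sketch below omits the plugin version, so add one that matches your build:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Appends all reference.conf files instead of keeping only one -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>reference.conf</resource>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```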
Migrate the SDK Projects from Spark 3.4.4 to 4.0.1
This section describes high-level changes that have to be made to migrate your app from Spark 3.4.4 to Spark 4.0.1.
Spark 4.0.1 support was introduced as part of
the HERE Data SDK for Java & Scala 2.81 release.
To migrate your project to use Spark 4.0.1, do the following:
- Update the `sdk-batch-bom_2.13` version to `2.81.5` or later.
- If you use `com.thesamet.scalapb:sparksql-scalapb_2.13`, remove it and replace the ScalaPB/Frameless encoders with standard Spark 4 encoders:
  - Use `Encoders.kryo[YourProtoClass]` for Protobuf message types.
  - Use `Encoders.BINARY` for `Array[Byte]` columns.
  - Avoid importing `scalapb.spark.Implicits._`, which is not compatible with Spark 4.
- See the Spark Migration Guide to cover the specific cases of your project.
- Use the `batch-5.0` cluster type to run your application on the platform. If you use the OLP CLI, you must change the `<cluster type>` parameter in the `olp pipeline template create` command.
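The encoder replacement above can be sketched as follows; `MyProtoMessage` is a hypothetical generated Protobuf class, shown commented out so the sketch stays self-contained:

```scala
import org.apache.spark.sql.{Encoder, Encoders}

object Spark4EncodersExample {
  // Hypothetical generated Protobuf class; substitute your own:
  // implicit val protoEncoder: Encoder[MyProtoMessage] =
  //   Encoders.kryo[MyProtoMessage]

  // Built-in binary encoder for Array[Byte] columns
  implicit val bytesEncoder: Encoder[Array[Byte]] = Encoders.BINARY

  // Note: do not import scalapb.spark.Implicits._ on Spark 4.
}
```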
Migrate the SDK Projects from Flink 1.13.5 to 1.19.1
This section outlines the key changes required to migrate your application from Flink 1.13.5 to Flink 1.19.1.
Support for Flink 1.19.1 was introduced in HERE Data SDK for Java & Scala 2.72.
Key Changes
Starting from Flink 1.15, one major improvement is the removal of Scala dependencies from the classpath.
Previously, Flink included Scala libraries, leading to version mismatches and dependency conflicts. Now, Flink no longer ships with Scala dependencies, making it more modular and stable.
As a result:
- `environment-stream` no longer includes Scala-related libraries.
- These libraries are now part of `sdk-dep-stream_2.12` and must be packaged into your fat JAR.
Vulnerability Fixes
With environment-stream, the following vulnerabilities have been fixed:
The CVE-2022-42004, CVE-2020-36518, CVE-2021-46877, and CVE-2022-42003 vulnerabilities were fixed by upgrading the com.fasterxml.jackson.core/jackson-databind library from version 2.12.1 to 2.14.2 in the sdk-dep-stream_2.12 BOM.
The CVE-2022-39135 vulnerability was fixed by removing org.apache.calcite/calcite-core:1.26.0 from the environment-stream BOM.
The CVE-2022-36364 vulnerability was fixed by removing org.apache.calcite.avatica/avatica-core:1.17.0 from the environment-stream BOM.
Migration steps
To migrate your project to use Flink 1.19.1, do the following:
1. Upgrade SDK Dependencies
Upgrade the `sdk-stream-bom_2.12` version to `2.72.4` or later. For a complete list of libraries in the SDK BOM files, see:
- `sdk-stream-bom_2.12` dependencies
- `sdk-dep-stream_2.12` dependencies
- `environment-stream` dependencies
2. Update Your Application
Make the following updates if needed:
- Update Flink dependencies.
- Replace imports.
- Fix deprecated APIs.
- Add a custom serializer.
- Resolve classpath issues for local runs.
3. Recompile & Package
Recompile your pipeline code with the updated sdk-stream-bom_2.12 and generate a new fat JAR file.
For details, see Stream Pipeline.
4. Update Stream Pipeline
- Use the OLP CLI or Platform UI to create a new pipeline template and a pipeline version with the `stream-6.0` runtime environment and the newly compiled JAR file.
- In the Platform UI, you can speed up this process by copying the existing pipeline version that runs on the `stream-5.0` runtime environment, then modifying it to use `stream-6.0` with the updated JAR. For more information, see Deploy pipelines.
5. Upgrade Pipeline Version
Upgrade to the new pipeline version. See:
ClassNotFoundException Issues with Flink Applications Using Maven or SBT
If you encounter ClassNotFoundException errors (such as for org.apache.flink.api.common.ExecutionConfig) when running your Flink application via Maven or SBT,
it's often due to classpath issues during job execution.
This problem can surface after upgrading Flink versions, such as from 1.13 to 1.20.
Fixes for Maven Projects:
Switch from `mvn exec:java` to `mvn exec:exec` to resolve the issue, ensuring that the correct classpath is included during execution.
Before:

```shell
mvn compile exec:java -D"exec.mainClass"="DevelopFlinkApplication" \
    -Dpipeline-config.file=pipeline-config.conf
```

After:

```shell
mvn compile exec:exec -Dexec.executable="java" \
    -Dexec.args="-cp %classpath -Dpipeline-config.file=pipeline-config.conf DevelopFlinkApplication"
```

Fixes for SBT Projects:
Add the following configuration to your `build.sbt` file to properly handle Java options and classpath issues:

```scala
Compile / run / fork := true
Compile / javaOptions ++= sys.props.toSeq.map { case (k, v) => s"-D$k=$v" }
```

For more details on Flink migration, see the DCL Migration Guide.
Migrate the SDK Projects from Flink 1.19.1 to 2.2.0
This section outlines the key changes required to migrate your application from Flink 1.19.1 to Flink 2.2.0.
Support for Flink 2.2.0 was introduced in HERE Data SDK for Java & Scala 2.85.
Key Changes
Flink 2.2.0 is part of the Flink 2.x major release line and introduces significant breaking changes compared to Flink 1.19.x.
Below is a summary of the most impactful changes:
- `SourceFunction`, `SinkFunction`, and Sink V1 have been removed. Migrate to the Source V2 and Sink V2 APIs.
- The Scala DataStream and DataSet APIs have been removed. Migrate to the Java DataStream API.
- Kryo has been upgraded to version 5.6, which is faster and more memory-efficient but may require serializer updates.
- Many deprecated APIs and configuration options have been removed. See the DCL Migration Guide for Flink 2.2.0 and the Flink 2.0 Release Notes for a comprehensive list.
Vulnerability Fixes
With environment-stream, the following vulnerabilities have been fixed:
The CVE-2025-52999 vulnerability was fixed by upgrading the com.fasterxml.jackson.core/jackson-databind library from version 2.14.2 to 2.18.2 in the sdk-dep-stream_2.12 BOM.
Migration steps
To migrate your project to use Flink 2.2.0, do the following:
1. Upgrade SDK Dependencies
Upgrade the `sdk-stream-bom_2.13` version to `2.85.8` or later. For a complete list of libraries in the SDK BOM files, see:
- `sdk-stream-bom_2.13` dependencies
- `sdk-dep-stream_2.13` dependencies
- `environment-stream` dependencies
2. Update Your Application
Make the following updates if needed:
- Migrate `SourceFunction` and `SinkFunction` implementations to the Source V2 and Sink V2 APIs.
- Port Scala DataStream and DataSet code to the Java DataStream API.
- Review custom serializers for compatibility with Kryo 5.6.
- Replace removed deprecated APIs and configuration options.
3. Recompile & Package
Recompile your pipeline code with the updated sdk-stream-bom_2.13 and generate a new fat JAR file.
For details, see Stream Pipeline.
4. Update Stream Pipeline
- Use the OLP CLI or Platform UI to create a new pipeline template and a pipeline version with the `stream-7.0` runtime environment and the newly compiled JAR file.
- In the Platform UI, you can speed up this process by copying the existing pipeline version that runs on the `stream-6.1` runtime environment, then modifying it to use `stream-7.0` with the updated JAR. For more information, see Deploy pipelines.
5. Upgrade Pipeline Version
Upgrade to the new pipeline version. See:
For more details on Flink 2.2.0 migration, see the DCL Migration Guide for Flink 2.2.0.
Migrate the SDK Projects from Java 8 to Java 17
This section outlines the key changes required to migrate your application from Java 8 to Java 17.
Support for Java 8 was removed starting from HERE Data SDK for Java & Scala version 2.75.
The latest version of the SDK BOM that supports Java 8 is 2.74.4.
To use SDK BOM version 2.75.5 or newer, your application must be compatible with Java 17.
Follow the instructions below to complete the migration.
1. Update the `sdk-<batch|stream|standalone>-bom_2.12` version to `2.75.5` or later.
2. Update your application to compile and run with Java 17. For detailed steps and guidance, refer to the Application migration section below.
3. If you are deploying your application to the HERE platform, ensure you are using a Java 17-compatible cluster type:
   - For Spark: use `batch-4.1`
   - For Flink: use `stream-6.1`

   Note: If using the OLP CLI, update the `<cluster-type>` parameter in the `olp pipeline template create` command accordingly.
Application migration
Migrating to Java 17 may require code changes, configuration updates, and compatibility checks with dependencies.
Below are key considerations and steps:
- Update build tools: Ensure your build tool (e.g., Maven, Gradle, SBT) supports Java 17.
- Set the Java version explicitly:
  - Maven:

    ```xml
    <maven.compiler.source>17</maven.compiler.source>
    <maven.compiler.target>17</maven.compiler.target>
    ```

  - SBT:

    ```scala
    javacOptions ++= Seq("--release", "17")
    ```

  - Gradle:

    ```groovy
    java.toolchain.languageVersion.set(JavaLanguageVersion.of(17))
    ```
- Update dependencies: Make sure your application uses the library versions specified in the SDK BOM. If you're using libraries or plugins that are not included in the SDK BOM, ensure they are compatible with Java 17.
- Runtime compatibility for Spark & Flink: Starting with Java 16, applications must comply with the Java Platform Module System (JPMS), introduced through Project Jigsaw. This modularization enforces strong encapsulation of internal JDK APIs, meaning that access to certain internal classes must be explicitly granted using the `--add-opens` or `--add-exports` JVM options at runtime. When running Flink or Spark applications locally on Java 17, you may encounter runtime issues due to reflective access to internal JDK classes. Both Spark and Flink make extensive use of reflection for serialization, object instantiation, and internal optimizations. A list of recommended `--add-opens` JVM options for running your Spark or Flink applications locally can be found here:
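As an illustration, a local run might use options like the following. This is a hedged, incomplete sample (`my-app.jar` and `MyApplication` are hypothetical names), and the exact set of opened modules depends on your Spark or Flink version:

```shell
java \
  --add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/java.lang.invoke=ALL-UNNAMED \
  --add-opens=java.base/java.util=ALL-UNNAMED \
  --add-opens=java.base/java.nio=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  -cp my-app.jar MyApplication
```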
Refer to the JDK migration guide for detailed insights on changes between JDK 8 and later versions.
Migrate the SDK Projects from Protobuf 3.x to 4.x
This section outlines the key changes required to migrate your application from Protobuf 3.19.x to Protobuf 4.32.x.
Support for Protobuf 4.32.0 was introduced in HERE Data SDK for Java & Scala 2.81.
The latest version of the SDK BOM that supports Protobuf 3.19 is 2.80.2.
Migration steps
To migrate your project to use Protobuf 4.32.x, follow these steps:
1. Update the protoc Compiler Version
You must upgrade the `protoc` compiler to a 4.32.x version.
```xml
<properties>
    <protoc.version>4.32.0</protoc.version>
</properties>
```

Make sure any CI/CD jobs or local scripts that invoke `protoc` are using the updated binary.
2. Update All com.google.protobuf Library Versions
Every dependency in your project (or its modules) that pulls in `com.google.protobuf:*` must also be updated to a compatible 4.32.x version.
This includes:
- direct dependencies (e.g., `protobuf-java`, `protobuf-java-util`)
- transitive dependencies brought in by gRPC, Akka, Flink, Kafka, etc.
- Spring Boot starters that package protobuf
- any JARs or libraries that internally use protobuf 3.x and were pinned
We highly recommend using the SDK BOM version 2.81.5 or later, and relying on dependency management to ensure version alignment.
Use Maven's dependency tree to ensure there are no leftovers:
```shell
mvn dependency:tree | grep protobuf
```
All protobuf-related artifacts must use the same major version (4.32.x).
3. Update Protobuf-Related Maven Plugins
If your project uses:
```xml
<plugin>
    <groupId>com.github.os72</groupId>
    <artifactId>protoc-jar-maven-plugin</artifactId>
    <version>3.11.4</version>
    ...
</plugin>
```

This plugin is deprecated, incompatible with Protobuf 4+, and must be replaced with the new officially maintained plugin:
```xml
<plugin>
    <groupId>io.github.ascopes</groupId>
    <artifactId>protobuf-maven-plugin</artifactId>
    <version>${protobuf-maven-plugin.version}</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <protocVersion>${protoc.version}</protocVersion>
        <sourceDirectories>
            <sourceDirectory>${project.build.directory}/proto</sourceDirectory>
        </sourceDirectories>
        <importPaths>
            <importPath>${project.build.directory}/proto</importPath>
            <importPath>${project.build.directory}/proto-lib</importPath>
        </importPaths>
    </configuration>
</plugin>
```

Do not keep both plugins. Only the Ascopes plugin correctly supports the new 4.32.x protoc and handles include paths properly.
4. Optional: Update scalapb and stb
If your project uses ScalaPB or stb, update them to versions compatible with Protobuf 4:
- `scalapb` → 1.0.0-alpha.3
- `stb` → the latest version supporting Protobuf 4
Older ScalaPB releases rely on protobuf 3.x runtime and will break with Protobuf 4.x descriptors.
5. Rebuild & Verify
Once everything is updated, build your project and ensure there are no compilation errors.
If you encounter the following error:
```
[ERROR] Failed to execute goal io.github.ascopes:protobuf-maven-plugin:3.6.0:generate (default)
on project generated_proto_schema_v1_java: The plugin io.github.ascopes:protobuf-maven-plugin:3.6.0
requires Maven version 3.8
```
This means that your current Maven installation is older than 3.8, which is the minimum version required
by io.github.ascopes:protobuf-maven-plugin:3.6.0.
How to fix it
- Upgrade Maven to 3.8.x or newer
- Ensure your environment (local machine, CI system, Docker image) uses the upgraded Maven version
- If using a wrapper, update `maven-wrapper.properties` to point to a 3.8+ distribution
Also, look for errors such as:
- `NoClassDefFoundError: com/google/protobuf/RuntimeVersion$RuntimeDomain`
- incompatible descriptor errors
- plugin execution failures
- inconsistent protobuf runtime version warnings
These usually indicate a leftover 3.x runtime artifact somewhere in your dependency tree.
For more details on Protobuf migration, see the Protobuf Official Migration Guide.