SDK migration guides

This guide provides hints for upgrading your SDK projects to leverage the latest versions. Each section includes an overview of key changes, compatibility considerations, and best practices to ensure a smooth transition.

Scala 2.12 to Scala 2.13 migration guide

This section describes high-level changes that have to be made to migrate your app from Scala 2.12 to Scala 2.13. Scala 2.13 support was introduced as part of the HERE Data SDK for Java & Scala 2.80 release.

  1. New SDK BOM files with the _2.13 suffix were added for each environment. To migrate to Scala 2.13, update the SDK BOM artifactIds to include the _2.13 suffix.
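As a sketch, assuming your project imports the batch BOM (the groupId and version property below are illustrative; keep the coordinates from your existing BOM import), the change amounts to switching the artifactId suffix:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.here.platform</groupId>
      <!-- was sdk-batch-bom_2.12 -->
      <artifactId>sdk-batch-bom_2.13</artifactId>
      <version>${sdk-bom.version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```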

  2. Change your app's dependencies with the _2.12 suffix to use _2.13. It is recommended that you use _${scala.compat.version} as a suffix. For instance,

   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_${scala.compat.version}</artifactId>
     <version>${spark.version}</version>
     <scope>provided</scope>
   </dependency>

For convenience, all SDK BOM files declare the ${scala.compat.version} property. To use the property, the SDK BOM must be added as a parent to your project.

  3. Schemas maintained by HERE have been updated to support Scala 2.13:

  • HERE Map Content Schema
  • SDII Schema
  • Sensoris Schema
  • Real-Time Traffic Schema
  • Weather Schema
  • Optimized Map for Location Library

All Scala 2.13 bindings have the _2.13 suffix. To use the above-mentioned schemas, add the _2.13 suffix to the schema artifactId.
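For illustration only (the groupId and artifactId below are placeholders, not real schema coordinates), a schema dependency would gain the suffix like this:

```xml
<dependency>
  <groupId>com.example.schema</groupId>
  <!-- was some-schema_v1_scala_2.12 -->
  <artifactId>some-schema_v1_scala_2.13</artifactId>
  <version>${schema.version}</version>
</dependency>
```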

For detailed steps on how to update or add a new Scala 2.13 module to your own schema projects, see Adding a Scala 2.13 module to your schema project.

For more details on how to include a schema in a Maven or sbt project, refer to the schema's page on the platform; see the Sensoris schema as an example.

  4. Your own schemas published on the platform must be updated to support Scala 2.13 by adding
    Scala 2.13 bindings to the schema project.

  5. Since the 2.80 release, the HERE platform supports two additional pipeline environments for Scala 2.13. To run Scala 2.13 applications on the platform, use the newest versions:

    If you use the OLP CLI, you must change the <cluster type> parameter in the olp pipeline template create command.

Additional Migration Notes

Code-level changes in Scala 2.13

Scala 2.13 introduced significant changes to the standard collections library:

  • import scala.collection.JavaConverters._ should be replaced with import scala.jdk.CollectionConverters._
  • Some methods were renamed or dropped (.to[Collection] replaced CanBuildFrom, Stream replaced with LazyList, etc.).
  • Scala 2.13 shows a warning if there is no default case in a match expression that is not exhaustive.
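The first two changes above can be sketched in plain Scala, with no SDK dependencies involved:

```scala
// Scala 2.13: scala.jdk.CollectionConverters replaces the deprecated
// scala.collection.JavaConverters
import scala.jdk.CollectionConverters._

object CollectionMigration extends App {
  val javaList = java.util.List.of("a", "b", "c")

  // .asScala still works, but now comes from the new import
  val scalaSeq: Seq[String] = javaList.asScala.toSeq

  // LazyList replaces the deprecated Stream
  val evens: LazyList[Int] = LazyList.from(0).filter(_ % 2 == 0)

  println(scalaSeq)
  println(evens.take(3).toList) // List(0, 2, 4)
}
```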

Refer to the Scala 2.13 Migration Guide for more detailed examples and changes.

Adding a Scala 2.13 module to your schema project

Follow these steps to add the new scala_2.13 module to your project:

  1. Create a new module in your schema:

    • Create a scala_2.13 directory in your root project folder.
    • Copy the Scala 2.12 Maven module configuration from scala_2.12/pom.xml to scala_2.13/pom.xml.
    • Update the modules section of the pom.xml file in your root project directory to include the new module. Example:
     <modules>
       <module>proto</module>
       <module>java</module>
       <module>scala_2.12</module>
       <module>scala_2.13</module>
       <module>ds</module>
     </modules>
  2. Update scala_2.13/pom.xml as follows:

    • Rename artifactId from <schema-name>_scala_2.12 to <schema-name>_scala_2.13, where <schema-name> is the name of your schema. Note that no changes should be made in scala_2.12/pom.xml so that your schema remains backward compatible with previous versions.
    • Update name and description appropriately for the new module.
    • Update the scala-library version in the dependencies section to 2.13.x as shown in the example below:
     <dependency>
       <groupId>org.scala-lang</groupId>
       <artifactId>scala-library</artifactId>
       <version>2.13.16</version>
     </dependency>
  • In the dependencies section, replace the scalapb-runtime_2.12 dependency with the scalapb-runtime_2.13 one as follows:
     <dependency>
       <groupId>com.thesamet.scalapb</groupId>
       <artifactId>scalapb-runtime_2.13</artifactId>
       <version>0.11.19</version>
       <exclusions>
         <exclusion>
           <groupId>org.scala-lang</groupId>
           <artifactId>scala-library</artifactId>
         </exclusion>
       </exclusions>
     </dependency>
  • In the build > plugins > protoc-scala-maven-plugin plugin:

    • Rename the artifactId from protoc-scala-maven-plugin_2.12 to protoc-scala-maven-plugin_2.13.
    • Remove the dependencies section of the plugin if it exists. The final protoc-scala-maven-plugin_2.13 configuration may look like this:
       <plugin>
         <groupId>com.here.platform.schema.maven_plugins</groupId>
         <artifactId>protoc-scala-maven-plugin_2.13</artifactId>
         <version>${here.plugin.version}</version>
         <executions>
           <execution>
             <phase>generate-sources</phase>
             <goals>
               <goal>scala-protoc-mojo</goal>
             </goals>
             <configuration>
               <protocVersion>${protoc.version}</protocVersion>
               <inputDirectories>
                 <include>${project.build.directory}/proto</include>
               </inputDirectories>
               <includeDirectories>
                 <include>${project.build.directory}/proto-lib</include>
               </includeDirectories>
               <includeStdTypes>false</includeStdTypes>
             </configuration>
           </execution>
         </executions>
       </plugin>
  3. If your schema project contains both Scala 2.12 and 2.13 modules, do not use the scala.version or scala.compat.version Maven properties. These properties may lead to runtime issues, making either the Scala 2.12 or the Scala 2.13 bindings unusable.

Known Limitations & Workarounds

Apache Flink 1.19.2

Apache Flink 1.19.2 does not provide Scala 2.13 binaries.
Because of this, some dependencies must be adjusted when migrating:

  • Replace org.apache.flink:flink-streaming-scala_2.12 with org.apache.flink:flink-streaming-java (recommended).
  • Replace org.apache.flink:flink-scala_2.12 with the Scala 2.13 library from org.flinkextended:flink-scala-api_2.13.

These changes ensure compatibility with Scala 2.13 while continuing to run on Flink 1.19.2.
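The second replacement might be declared as follows (the version property is a placeholder; pick the flink-scala-api release that matches your Flink version):

```xml
<dependency>
  <groupId>org.flinkextended</groupId>
  <artifactId>flink-scala-api_2.13</artifactId>
  <version>${flink-scala-api.version}</version>
</dependency>
```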

Migrate the SDK Projects from Spark 2.4.7 to 3.4.1

This section describes high-level changes that have to be made to migrate your app from Spark 2.4.7 to Spark 3.4.1. Spark 3.4.1 support was introduced as part of the HERE Data SDK for Java & Scala 2.58 release.

To migrate your project to use Spark 3.4.1, do the following:

  1. Update the sdk-batch-bom_2.12 version to 2.58.2 or later.
  2. If you use com.thesamet.scalapb:sparksql-scalapb_2.12, replace it with com.thesamet.scalapb:sparksql33-scalapb0_10_2.12 to support Spark 3.4.1.
  3. See the Spark Migration Guide to cover specific cases of your project.
  4. Use the batch-4.0 cluster type to run your application on the Platform. If you use the OLP CLI, you must change the <cluster type> parameter in the olp pipeline template create command.
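The ScalaPB replacement from step 2 can be expressed like this (the version is managed by the SDK BOM if you import it; otherwise the property below is a placeholder you need to define):

```xml
<dependency>
  <groupId>com.thesamet.scalapb</groupId>
  <artifactId>sparksql33-scalapb0_10_2.12</artifactId>
  <version>${sparksql-scalapb.version}</version>
</dependency>
```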

Migrate the SDK Projects from Akka to Pekko

Starting from version 2.68.3, the HERE SDK for Java and Scala has been migrated from Apache Akka 2.5.32 to Apache Pekko 1.0.2.

To migrate your project, follow these steps:

  1. Use SDK BOM version 2.68.3 or later.

  2. Upgrade from Akka 2.5.32 to Akka 2.6.21.

    Follow the official Akka migration guide: Akka 2.5.x to 2.6.x Migration Guide

  3. Upgrade from Akka 2.6.21 to Pekko 1.0.2.

    Follow the official Pekko migration guide: Pekko Migration Guide

  4. For Flink-related projects, ensure that all reference.conf files from various dependencies are appended and merged into a single file in the fat JAR to avoid configuration issues. For more information, see the Data Client Developer Guide.
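One common way to merge reference.conf files with Maven is the shade plugin's AppendingTransformer (a sketch; sbt-assembly users would use a concat merge strategy for reference.conf instead):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.5.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Appends all reference.conf files into one
               instead of keeping only the first on the classpath -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```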

Migrate the SDK Projects from Spark 3.4.4 to 4.0.1

This section describes high-level changes that have to be made to migrate your app from Spark 3.4.4 to Spark 4.0.1. Spark 4.0.1 support was introduced as part of the HERE Data SDK for Java & Scala 2.81 release.

To migrate your project to use Spark 4.0.1, do the following:

  1. Update the sdk-batch-bom_2.13 version to 2.81.5 or later.
  2. If you use com.thesamet.scalapb:sparksql-scalapb_2.13, remove it and replace ScalaPB/Frameless encoders with standard Spark 4 encoders:
    • Use Encoders.kryo[YourProtoClass] for Protobuf message types.
    • Use Encoders.BINARY for Array[Byte] columns.
    • Avoid importing scalapb.spark.Implicits._, which is not compatible with Spark 4.
  3. See the Spark Migration Guide to cover specific cases of your project.
  4. Use the batch-5.0 cluster type to run your application on the Platform. If you use the OLP CLI, you must change the <cluster type> parameter in the olp pipeline template create command.
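The encoder changes from step 2 can be sketched as follows (HypotheticalProto stands in for one of your generated Protobuf classes; compiling this requires a Spark 4 dependency on the classpath):

```scala
import org.apache.spark.sql.{Encoder, Encoders, SparkSession}

// Placeholder for a generated Protobuf message class
class HypotheticalProto extends Serializable

object Spark4Encoders {
  // Instead of importing scalapb.spark.Implicits._ (not compatible with
  // Spark 4), define the encoders explicitly:
  implicit val protoEncoder: Encoder[HypotheticalProto] =
    Encoders.kryo[HypotheticalProto]

  // For raw serialized bytes, use the built-in binary encoder
  val bytesEncoder: Encoder[Array[Byte]] = Encoders.BINARY

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("encoders").getOrCreate()
    val ds = spark.createDataset(Seq(new HypotheticalProto))(protoEncoder)
    println(ds.count())
    spark.stop()
  }
}
```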

Migrate the SDK Projects from Flink 1.13.5 to 1.19.1

This section outlines the key changes required to migrate your application from Flink 1.13.5 to Flink 1.19.1. Support for Flink 1.19.1 was introduced in HERE Data SDK for Java & Scala 2.72.

Key Changes

Starting from Flink 1.15, one major improvement is the removal of Scala dependencies from the classpath.

Previously, Flink included Scala libraries, leading to version mismatches and dependency conflicts. Now, Flink no longer ships with Scala dependencies, making it more modular and stable.

As a result:

Vulnerability Fixes

With environment-stream, the following vulnerabilities have been fixed:

  • CVE-2022-42004, CVE-2020-36518, CVE-2021-46877, and CVE-2022-42003 were fixed by upgrading the com.fasterxml.jackson.core/jackson-databind library from version 2.12.1 to 2.14.2 in the sdk-dep-stream_2.12 BOM.
  • CVE-2022-39135 was fixed by removing org.apache.calcite/calcite-core:1.26.0 from the environment-stream BOM.
  • CVE-2022-36364 was fixed by removing org.apache.calcite.avatica/avatica-core:1.17.0 from the environment-stream BOM.

Migration steps

To migrate your project to use Flink 1.19.1, do the following:

1. Upgrade SDK Dependencies

Upgrade the sdk-stream-bom_2.12 version to 2.72.4 or later. For a complete list of libraries in the SDK BOM files, see

2. Update Your Application

Make the following updates if needed:

3. Recompile & Package

Recompile your pipeline code with the updated sdk-stream-bom_2.12 and generate a new fat JAR file.

For details, see Stream Pipeline.

4. Update Stream Pipeline

  • Use the OLP CLI or Platform UI to create a new pipeline template and a pipeline version with the stream-6.0 runtime environment and the newly compiled JAR file.
  • In the Platform UI, you can speed up this process by copying the existing pipeline version that runs on stream-5.0 runtime environment, then modifying it to use stream-6.0 with the updated JAR. For more information, see Deploy pipelines.

5. Upgrade Pipeline Version

Upgrade to the new pipeline version. See:

ClassNotFoundException Issues with Flink Applications Using Maven or SBT

If you encounter ClassNotFoundException errors (such as for org.apache.flink.api.common.ExecutionConfig) when running your Flink application via Maven or SBT, it's often due to classpath issues during job execution. This problem can surface after upgrading Flink versions, such as from 1.13 to 1.20.

Fixes for Maven Projects:

Switch from the exec:java goal to the exec:exec goal of the exec-maven-plugin, which ensures that the correct classpath is included during execution.

Before:

mvn compile exec:java -D"exec.mainClass"="DevelopFlinkApplication" \
-Dpipeline-config.file=pipeline-config.conf

After:

mvn compile exec:exec -Dexec.executable="java" \
-Dexec.args="-cp %classpath -Dpipeline-config.file=pipeline-config.conf DevelopFlinkApplication"

Fixes for SBT Projects:

Add the following configuration to your build.sbt file to properly handle Java options and classpath issues:

      Compile / run / fork := true
      Compile / javaOptions ++= sys.props.toSeq.map { case (k, v) => s"-D$k=$v" }

For more details on Flink migration, see the DCL Migration Guide.

Migrate the SDK Projects from Flink 1.19.1 to 2.2.0

This section outlines the key changes required to migrate your application from Flink 1.19.1 to Flink 2.2.0. Support for Flink 2.2.0 was introduced in HERE Data SDK for Java & Scala 2.85.

Key Changes

Flink 2.2.0 is part of the Flink 2.x major release line and introduces significant breaking changes compared to Flink 1.19.x.

Below is a summary of the most impactful changes:

Vulnerability Fixes

With environment-stream, the following vulnerabilities have been fixed:

The CVE-2025-52999 vulnerability was fixed by upgrading the com.fasterxml.jackson.core/jackson-databind library from version 2.14.2 to 2.18.2 in the sdk-dep-stream_2.12 BOM.

Migration steps

To migrate your project to use Flink 2.2.0, do the following:

1. Upgrade SDK Dependencies

Upgrade the sdk-stream-bom_2.13 version to 2.85.8 or later. For a complete list of libraries in the SDK BOM files, see

2. Update Your Application

Make the following updates if needed:

3. Recompile & Package

Recompile your pipeline code with the updated sdk-stream-bom_2.13 and generate a new fat JAR file.

For details, see Stream Pipeline.

4. Update Stream Pipeline

  • Use the OLP CLI or Platform UI to create a new pipeline template and a pipeline version with the stream-7.0 runtime environment and the newly compiled JAR file.
  • In the Platform UI, you can speed up this process by copying the existing pipeline version that runs on stream-6.1 runtime environment, then modifying it to use stream-7.0 with the updated JAR. For more information, see Deploy pipelines.

5. Upgrade Pipeline Version

Upgrade to the new pipeline version. See:

For more details on Flink 2.2.0 migration, see the DCL Migration Guide for Flink 2.2.0.

Migrate the SDK Projects from Java 8 to Java 17

This section outlines the key changes required to migrate your application from Java 8 to Java 17. Support for Java 8 was removed starting from HERE Data SDK for Java & Scala version 2.75. The latest version of the SDK BOM that supports Java 8 is 2.74.4.

To use SDK BOM version 2.75.5 or newer, your application must be compatible with Java 17.

Follow the instructions below to complete the migration.

  1. Update the sdk-<batch|stream|standalone>-bom_2.12 version to 2.75.5 or later.

  2. Update your application to compile and run with Java 17. For detailed steps and guidance, refer to the Application Migration section below.

  3. If you are deploying your application to the HERE Platform, ensure you are using a Java 17-compatible cluster type:

    Note: If using the OLP CLI, update the <cluster-type> in the olp pipeline template create command accordingly.

Application migration

Migrating to Java 17 may require code changes, configuration updates, and compatibility checks with dependencies.

Below are key considerations and steps:

  • Update build tools: Ensure your build tool (e.g., Maven, Gradle, SBT) supports Java 17.
  • Set Java version explicitly:
    • Maven:
    <maven.compiler.source>17</maven.compiler.source>
    <maven.compiler.target>17</maven.compiler.target>
  • SBT:
    javacOptions ++= Seq("--release", "17")
  • Gradle:
    java.toolchain.languageVersion.set(JavaLanguageVersion.of(17))
  • Update dependencies: Make sure your application uses library versions specified in the SDK BOM. If you're using libraries or plugins that are not included in the SDK BOM, ensure they are compatible with Java 17.

  • Runtime Compatibility for Spark & Flink

    Starting with Java 16, applications must comply with the Java Platform Module System (JPMS), introduced through Project Jigsaw. This modularization enforces strong encapsulation of internal JDK APIs, meaning that access to certain internal classes must be explicitly granted using --add-opens or --add-exports JVM options at runtime.

    When running Flink or Spark applications LOCALLY on Java 17, you may encounter runtime issues due to reflective access to internal JDK classes. Both Spark and Flink make extensive use of reflection for serialization, object instantiation, and internal optimizations.

    A list of recommended --add-opens JVM options for running your Spark or Flink applications locally can be found here:
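As an illustration only (the authoritative list is the one referenced above; the jar and class names below are placeholders), a local run on Java 17 typically needs options along these lines:

```
java \
  --add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/java.lang.invoke=ALL-UNNAMED \
  --add-opens=java.base/java.util=ALL-UNNAMED \
  --add-opens=java.base/java.nio=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.ch=ALL-UNNAMED \
  -cp my-app.jar com.example.MyPipeline
```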

Refer to the JDK migration guide for detailed insights on changes between JDK 8 and later versions.

Migrate the SDK Projects from Protobuf 3.x to 4.x

This section outlines the key changes required to migrate your application from Protobuf 3.19.x to Protobuf 4.32.x. Support for Protobuf 4.32.0 was introduced in HERE Data SDK for Java & Scala 2.81. The latest version of the SDK BOM that supports Protobuf 3.19 is 2.80.2.

Migration steps

To migrate your project to use Protobuf 4.32.x, follow these steps:

1. Update the protoc Compiler Version

You must upgrade the protoc compiler to a 4.32.x version.

<properties>
    <protoc.version>4.32.0</protoc.version>
</properties>

Make sure any CI/CD jobs or local scripts that invoke protoc are using the updated binary.

2. Update All com.google.protobuf Library Versions

Every dependency in your project (or its modules) that pulls in com.google.protobuf:* must also be updated to a compatible 4.32.x version.

This includes:

  • direct dependencies (e.g., protobuf-java, protobuf-java-util)
  • transitive dependencies brought by gRPC, Akka, Flink, Kafka, etc.
  • Spring Boot starters that package protobuf
  • any JARs or libraries that internally use protobuf 3.x and were pinned

We highly recommend using the SDK BOM version 2.81.5 or later, and relying on dependency management to ensure version alignment.

Use Maven's dependency tree to ensure there are no leftovers:

mvn dependency:tree | grep protobuf

All protobuf-related artifacts must use the same major version (4.32.x).

3. Update Protobuf-Related Maven Plugins

If your project uses:

<plugin>
    <groupId>com.github.os72</groupId>
    <artifactId>protoc-jar-maven-plugin</artifactId>
    <version>3.11.4</version>
    ...
</plugin>

This plugin is deprecated and incompatible with Protobuf 4+, and must be replaced with the actively maintained protobuf-maven-plugin:

<plugin>
    <groupId>io.github.ascopes</groupId>
    <artifactId>protobuf-maven-plugin</artifactId>
    <version>${protobuf-maven-plugin.version}</version>
    <executions>
        <execution>
            <phase>generate-sources</phase>
            <goals>
                <goal>generate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <protocVersion>${protoc.version}</protocVersion>
        <sourceDirectories>
            <sourceDirectory>${project.build.directory}/proto</sourceDirectory>
        </sourceDirectories>
        <importPaths>
            <importPath>${project.build.directory}/proto</importPath>
            <importPath>${project.build.directory}/proto-lib</importPath>
        </importPaths>
    </configuration>
</plugin>

Do not keep both plugins. Only the Ascopes plugin correctly supports the new 4.32.x protoc and handles include paths properly.

4. Optional: Update scalapb and stb

If your project uses ScalaPB or stb, update them to versions compatible with Protobuf 4:

  • scalapb → 1.0.0-alpha.3
  • stb → latest version supporting Protobuf 4

Older ScalaPB releases rely on protobuf 3.x runtime and will break with Protobuf 4.x descriptors.
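For a Maven project, the ScalaPB bump might look like this (the coordinates follow the usual ScalaPB naming convention; verify the exact artifact and version against the ScalaPB release notes):

```xml
<dependency>
  <groupId>com.thesamet.scalapb</groupId>
  <artifactId>scalapb-runtime_2.13</artifactId>
  <version>1.0.0-alpha.3</version>
</dependency>
```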

5. Rebuild & Verify

Once everything is updated, build your project and ensure there are no compilation errors.

If you encounter the following error:

[ERROR] Failed to execute goal io.github.ascopes:protobuf-maven-plugin:3.6.0:generate (default)
on project generated_proto_schema_v1_java: The plugin io.github.ascopes:protobuf-maven-plugin:3.6.0
requires Maven version 3.8

This means that your current Maven installation is older than 3.8, which is the minimum version required by io.github.ascopes:protobuf-maven-plugin:3.6.0.

How to fix it

  • Upgrade Maven to 3.8.x or newer
  • Ensure your environment (local machine, CI system, Docker image) uses the upgraded Maven version
  • If using a wrapper, update maven-wrapper.properties to point to a 3.8+ distribution
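For instance, the wrapper configuration in .mvn/wrapper/maven-wrapper.properties would point at a newer distribution (the 3.9.9 version below is illustrative; any 3.8+ release works):

```properties
distributionUrl=https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.9.9/apache-maven-3.9.9-bin.zip
```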

Also, look for errors such as:

  • NoClassDefFoundError: com/google/protobuf/RuntimeVersion$RuntimeDomain
  • incompatible descriptor errors
  • plugin execution failures
  • inconsistent protobuf runtime version warnings

These usually indicate a leftover 3.x runtime artifact somewhere in your dependency tree.

For more details on Protobuf migration, see the Protobuf Official Migration Guide.