
How to run an archiving application locally

After you have created an archiving application using the Data Archiving Library, you may want to run the application locally to test it before deploying it as a pipeline in the HERE Workspace. There are two ways you can run the application locally:

  • Run locally with Maven
  • Run locally with a local Flink cluster

The following sections show how to run the SDK example apps using both methods.

Run with Maven

  1. Download the HERE Data SDK examples project.

  2. Fill in the necessary information in the examples/data-archive/java/avro-example/src/main/resources/application.conf file.

  3. If you want to use a custom logger, modify the logback.xml file inside the resources folder.

  4. Go to your example project root folder (examples/data-archive/java/avro-example) and run the following command:

    mvn compile exec:java -Dexec.mainClass=com.here.platform.data.archive.example.Main -Padd-dependencies-for-local-run

    The command starts the archiving application locally, which consumes data from the stream layer and archives it to the index layer. The application remains idle if there is no data to consume from the stream layer.
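For repeated local test runs, the steps above can be wrapped in a small launcher script. The sketch below is not part of the SDK: the example path is the default from the steps above, and the DRY_RUN switch is a convenience added here. The script refuses to start if application.conf has not been created yet.

```shell
# Sketch: launcher for the avro-example (paths are assumptions, not SDK-provided).
# Set DRY_RUN=1 to print the Maven command instead of executing it.

run_archiver() {
  example_dir="${1:-examples/data-archive/java/avro-example}"
  conf="$example_dir/src/main/resources/application.conf"

  # Fail fast if the example has not been configured yet.
  if [ ! -f "$conf" ]; then
    echo "Missing $conf - fill it in before running." >&2
    return 1
  fi

  cmd="mvn compile exec:java -Dexec.mainClass=com.here.platform.data.archive.example.Main -Padd-dependencies-for-local-run"

  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "$cmd"
  else
    (cd "$example_dir" && $cmd)
  fi
}
```

Run it as `run_archiver` from the repository root, or pass a different example directory as the first argument.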

Run with a local Flink cluster

Note

Apache Flink 2.2.0 does not officially support Scala 2.13. To run a local Flink cluster with Scala 2.13, replace the Scala 2.12 components with their Scala 2.13 counterparts as outlined below.

  1. Download and unpack Flink 2.2.0:

    curl -fL -O "https://archive.apache.org/dist/flink/flink-2.2.0/flink-2.2.0-bin-scala_2.12.tgz"
    tar -xvf flink-2.2.0-bin-scala_2.12.tgz
    chmod 777 flink-2.2.0

  2. Remove flink-scala_2.12 from the lib folder:

    cd flink-2.2.0
    rm -f lib/flink-scala_2.12-*.jar

  3. Download the Flink Scala API for Scala 2.13 into the lib folder, then start the cluster:

    curl -fL -o lib/flink-scala-api-2_2.13-2.2.0.jar "https://repo1.maven.org/maven2/org/flinkextended/flink-scala-api-2_2.13/2.2.0/flink-scala-api-2_2.13-2.2.0.jar"
    cd bin
    ./start-cluster.sh
  4. Download the HERE Data SDK examples project.

  5. Fill in the necessary information in the following file:

    examples/data-archive/java/avro-example/src/main/resources/application.conf

  6. Get a credentials.properties file containing the credentials to allow the example application to access the input and output catalogs, and place the file in the ~/.here/ folder. For instructions, see the Identity & access management guide.

  7. Make sure that the credentials you use to generate the credentials.properties file provide read permission to the input stream layer and read/write permission to the index layer. The credentials should match those in the application.conf file.

    Alternatively, you can place the credentials.properties file in the folder:

    examples/data-archive/java/avro-example/src/main/resources/

    Note that the ~/.here/ folder takes priority over the examples/data-archive/java/avro-example/src/main/resources/ folder. The format for the credentials.properties file is:

   here.client.id = <Client Id>
   here.access.key.id = <Access Key Id>
   here.access.key.secret = <Access Key Secret>
   here.token.endpoint.url = <Token Endpoint>
  8. Go to your example project root folder (examples/data-archive/java/avro-example) and run the following command:

    mvn clean install

    This command builds the JAR file to upload to the local Flink cluster. The output JAR file should be generated in the folder:

    examples/data-archive/java/avro-example/target

  9. Go to your local Flink UI at http://localhost:8081. In the left menu, click Submit new job, then click Add New to upload the JAR file (it must be the platform JAR file).

  10. To run the application, select your uploaded JAR file using the checkbox on the left.

  11. Set the Entry class field to com.here.platform.dal.DALMain, then click Submit.

  12. Go to Running Jobs in the left menu to check whether your job is running successfully. You can also look at the Logs tab inside each job to see the generated logs. There is a logback.xml file inside the src/main/resources/ folder that you can use to customize logger behaviour.
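Before building and submitting the job, it can save a round trip to confirm that the credentials file actually defines the four keys shown in step 7. The helper below is a sketch, not part of the SDK; it only checks that the keys are present, not that their values are valid.

```shell
# Sketch: verify that a credentials.properties file defines the four keys
# expected by the archiving application (presence check only).

check_credentials() {
  file="${1:-$HOME/.here/credentials.properties}"
  missing=0

  if [ ! -f "$file" ]; then
    echo "Credentials file not found: $file" >&2
    return 1
  fi

  # Look for each key at the start of a line, allowing surrounding whitespace.
  for key in here.client.id here.access.key.id \
             here.access.key.secret here.token.endpoint.url; do
    if ! grep -q "^[[:space:]]*$key[[:space:]]*=" "$file"; then
      echo "Missing key: $key" >&2
      missing=1
    fi
  done
  return "$missing"
}
```

Call `check_credentials` with no arguments to check ~/.here/credentials.properties, or pass an explicit path, for example the copy under examples/data-archive/java/avro-example/src/main/resources/.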