How to run an archiving application locally
After you have created an archiving application using the Data Archiving Library, you may want to run the application locally to test it before deploying it as a pipeline in the HERE Workspace. There are two ways you can run the application locally:
- Run locally with Maven
- Run locally with a local Flink cluster
The following information shows how to run the SDK example apps using both these methods.
Run with Maven
- Download the HERE Data SDK examples project.
- Fill in the necessary information in the `examples/data-archive/java/avro-example/src/main/resources/application.conf` file.
- If you want to use a custom logger, modify the `logback.xml` file inside the `resources` folder.
- Go to your example project root folder (`examples/data-archive/java/avro-example`) and run the following command:

  ```shell
  mvn compile exec:java -Dexec.mainClass=com.here.platform.data.archive.example.Main -Padd-dependencies-for-local-run
  ```

  The command starts the archiving application locally, which consumes data from the stream layer and archives it to the index layer. The application will be idle if there is no data to consume from the stream layer.
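Before running the command, it can help to confirm that the local prerequisites are available. The following is a minimal sketch that only checks whether Maven and a JDK are on your `PATH`:

```shell
# Sketch: report whether the local prerequisites (Maven and a JDK) are on PATH.
prereq_report=$(
  for tool in mvn java; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool: found"
    else
      echo "$tool: not found on PATH"
    fi
  done
)
echo "$prereq_report"
```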
Run with a local Flink cluster
Note: Apache Flink 2.2.0 does not officially support Scala 2.13. To run a local Flink cluster with Scala 2.13, replace the Scala 2.12 components with their Scala 2.13 counterparts as outlined below.
- Download Flink 2.2.0:

  ```shell
  curl -fL -O "https://archive.apache.org/dist/flink/flink-2.2.0/flink-2.2.0-bin-scala_2.12.tgz"
  tar -xvf flink-2.2.0-bin-scala_2.12.tgz
  chmod 777 flink-2.2.0
  ```

- Remove `flink-scala_2.12` from `lib`:

  ```shell
  cd flink-2.2.0
  rm -f lib/flink-scala_2.12-*.jar || true
  ```

- Download the Flink Scala API for 2.13 into `lib`, then start the cluster:

  ```shell
  curl -fL -o lib/flink-scala-api-2_2.13-2.2.0.jar "https://repo1.maven.org/maven2/org/flinkextended/flink-scala-api-2_2.13/2.2.0/flink-scala-api-2_2.13-2.2.0.jar"
  cd bin
  ./start-cluster.sh
  ```
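If the cluster started correctly, the Flink web dashboard listens on port 8081. A quick way to confirm this is to probe the REST API; this sketch assumes the default REST port and that `curl` is available:

```shell
# Probe the Flink REST API (assumes the default port 8081;
# adjust if you changed the port in the Flink configuration).
if curl -sf http://localhost:8081/overview >/dev/null 2>&1; then
  flink_status="up"
else
  flink_status="not reachable on :8081"
fi
echo "Flink cluster: $flink_status"
```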
- Download the HERE Data SDK examples project.
- Fill in the necessary information in the `examples/data-archive/java/avro-example/src/main/resources/application.conf` file.
- Get a `credentials.properties` file containing the credentials that allow the example application to access the input and output catalogs, and place the file in the `~/.here/` folder. For instructions, see the Identity & access management guide.
- Make sure that the credentials you use to generate the `credentials.properties` file provide read permission to the input stream layer and read/write permission to the index layer. The credentials should match those in the `application.conf` file.

  Alternatively, you can place the `credentials.properties` file in the `examples/data-archive/java/avro-example/src/main/resources/` folder. Note that the `~/.here/` folder takes priority over the `examples/data-archive/java/avro-example/src/main/resources/` folder.

  The format of the `credentials.properties` file is:

  ```
  here.client.id = <Client Id>
  here.access.key.id = <Access Key Id>
  here.access.key.secret = <Access Key Secret>
  here.token.endpoint.url = <Token Endpoint>
  ```
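As a convenience, the file can be created from the shell. This is a minimal sketch that writes the placeholder values shown above; replace them with your real credentials before running the application:

```shell
# Sketch: place a credentials.properties skeleton in the folder the
# application checks first. The <...> placeholders must be replaced
# with your actual credential values.
mkdir -p "$HOME/.here"
cat > "$HOME/.here/credentials.properties" <<'EOF'
here.client.id = <Client Id>
here.access.key.id = <Access Key Id>
here.access.key.secret = <Access Key Secret>
here.token.endpoint.url = <Token Endpoint>
EOF
```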
- Go to your example project root folder (`examples/data-archive/java/avro-example`) and run the following command:

  ```shell
  mvn clean install
  ```

  This command builds the JAR file to upload to the local Flink cluster. The output JAR file is generated in the `examples/data-archive/java/avro-example/target` folder.
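If you prefer the command line to the web UI, the Flink CLI can submit the same JAR. This is only a sketch: the `FLINK_HOME` location and the JAR file name pattern are assumptions about your local layout, so adjust both as needed:

```shell
# Sketch: submit the built example JAR with the Flink CLI instead of the web UI.
# FLINK_HOME and the JAR name pattern are assumptions; adjust to your layout.
FLINK_HOME="${FLINK_HOME:-$PWD/flink-2.2.0}"
JAR=$(ls examples/data-archive/java/avro-example/target/*platform*.jar 2>/dev/null | head -n 1)
if [ -x "$FLINK_HOME/bin/flink" ] && [ -n "$JAR" ]; then
  "$FLINK_HOME/bin/flink" run -c com.here.platform.dal.DALMain "$JAR"
else
  echo "Flink CLI or built JAR not found; run 'mvn clean install' and set FLINK_HOME first."
fi
```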
- Go to your local Flink UI at `http://localhost:8081`. Click Submit new job in the left menu, then Add New to upload the JAR file (it must be the platform JAR file).
- To run the application, click the checkbox on the left to select your uploaded JAR file.
- Set the Entry class field to `com.here.platform.dal.DALMain`, then click Submit.
- Go to Running job in the left menu to check whether your job is running successfully. You can also look at the Logs tab inside each job to see the generated logs. There is a `logback.xml` file inside the `src/main/resources/` folder that you can use to customize logger behaviour.
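For reference, a minimal `logback.xml` could look like the following. This is a sketch with an assumed console appender and log pattern, not the file shipped with the example; tune the level and pattern to your needs:

```xml
<configuration>
  <!-- Console appender with a simple timestamped pattern -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <!-- Raise to DEBUG for more detail while testing locally -->
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```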