HERE Workspace for Java and Scala developers tutorials
This document contains a set of developer tutorials to provide an overview of available HERE Workspace functionalities.
This set of tutorials teaches you how to create applications using the Workspace. These tutorials also introduce you to the key concepts and functionalities of the Workspace's primary components.
For more information about the HERE Data SDK for Java & Scala, see HERE Data SDK for Java & Scala.
For a list of the HERE Workspace components and their uses, see Which Tool to Use?.
For the terms and conditions covering this documentation, see the HERE Documentation License.
Prerequisites
- A HERE Workspace user account
- A development environment
- Basic familiarity with the Workspace data model, as organized into catalogs, layers, and partitions
List of tutorials
Verify that you have access to the HERE platform portal, and that you have configured your Maven settings (repository credentials) and HERE credentials (platform credentials).
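For reference, platform credentials are typically stored in a `credentials.properties` file under `~/.here/`. The sketch below shows the usual shape of that file; the values are placeholders, and you should download the actual file from the platform portal rather than assembling it by hand:

```properties
# Sketch of ~/.here/credentials.properties — values are placeholders.
# Download the real file from the platform portal; do not hand-edit the keys.
here.user.id = <your-user-id>
here.client.id = <your-client-id>
here.access.key.id = <your-access-key-id>
here.access.key.secret = <your-access-key-secret>
here.token.endpoint.url = https://account.api.here.com/oauth2/token
```

The Maven repository credentials go into your `settings.xml` separately; the "Verify your Maven settings" tutorial below walks through that setup.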
The HERE Workspace offers a wide range of functionality across a rich software stack. This series of tutorials guides you through the Workspace's major components:
- Verify your Maven settings demonstrates how to write and run your first program with the Data SDK for Java & Scala.
- Verify your credentials demonstrates how to connect to a catalog to read some meta-information in order to verify that you set up the credentials correctly.
- Organize your work in projects demonstrates how to create your first project, manage project access, and add a new catalog with a versioned layer to the project using the OLP Command Line Interface (CLI).
- Develop a Flink application demonstrates how to develop, debug and run a Flink application.
- Run a Flink application on the platform demonstrates how to run a simple Flink application on the platform and monitor the application execution.
- Develop a Spark application demonstrates how to develop, debug and run a Spark application.
- Run a Spark application on the platform demonstrates how to run a simple Spark application on the platform and monitor the application execution.
- Publish JSON schema to the platform demonstrates how to generate and build a JSON schema and publish it to the platform.
- Publish Protobuf schema to the platform demonstrates how to generate and build a Protobuf schema and publish it to the platform.
- Implement GeoJSON Renderer demonstrates how to implement the GeoJSON rendering plugin using the Data Inspector.
- Calculate partition Tile IDs demonstrates how to calculate partition Tile IDs for certain geocoordinate area queries and levels.
- Read from a catalog in a batch application demonstrates how to create your first Spark application using the Data SDK for Java & Scala.
- Copy a catalog using the Data Processing Library demonstrates how to write a Data Processing Library application that copies a catalog populated with GeoJSON content.
- Use HERE platform service demonstrates how to use HERE platform services through the example of the HERE Traffic API client.
- Subscribe to catalog and layer-level changes demonstrates how to create and manage subscriptions to catalog and layer-level changes.
- Path matching in Spark demonstrates how to use the Location Library path matcher inside a Spark application.
- Correlate road attributes to segment geometry demonstrates how to follow the attribution referencing model of HERE Map Content to correlate road attributes to segment geometry.
- Use Spark connector to read and write data demonstrates how to implement an application leveraging the Spark connector to read and write data from different layer types and data formats.
- Local development and testing with CLI and Data Client Library demonstrates how to implement and test an application with local catalogs.
- Read and write to the Object store layer using Hadoop FS support in Spark demonstrates how to read and write to the Object store layer using the Hadoop FS Support library with Spark in Java & Scala.
- Bring your data to Object store layer demonstrates how to bring your data into the Object store layer on the HERE platform using the OLP CLI and Apache Hadoop.
- Use Flink Connector to Read and Write Data demonstrates how to implement an application using the Flink connector to read, transform and write data.
All the tutorials strive to be single-file code, with a single-file build system, for ease of readability and copy-pasting snippets into your IDE.
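As a taste of that single-file style, the tile-ID topic from the list above can be sketched in a few lines. The sketch below assumes HERE's documented quadtree scheme (an equirectangular split into 2^L columns and 2^(L-1) rows at level L, with the tile ID encoded as a base-4 quadkey behind a leading "1" digit); treat it as an illustration, not a replacement for the Data SDK's own tiling utilities:

```java
public final class HereTile {

    /**
     * HEREtile partition ID for a WGS84 coordinate at the given level.
     * The ID is 4^level (the leading quadkey digit) plus the Morton
     * interleave of the tile's column (x) and row (y) indices.
     */
    public static long hereTileId(double latitude, double longitude, int level) {
        long columns = 1L << level;        // 2^L columns over 360 degrees of longitude
        long rows = 1L << (level - 1);     // 2^(L-1) rows over 180 degrees of latitude
        long x = Math.min(columns - 1,
                (long) Math.floor((longitude + 180.0) / 360.0 * columns));
        long y = Math.min(rows - 1,
                (long) Math.floor((latitude + 90.0) / 180.0 * rows));
        long id = 1L << (2 * level);       // leading marker digit (4^level)
        for (int i = 0; i < level; i++) {
            id |= (x >> i & 1L) << (2 * i);      // x bits on even positions
            id |= (y >> i & 1L) << (2 * i + 1);  // y bits on odd positions
        }
        return id;
    }

    public static void main(String[] args) {
        // Berlin city centre at level 12
        System.out.println(hereTileId(52.52, 13.405, 12)); // prints 23618402
    }
}
```

The "Calculate partition Tile IDs" tutorial in the list covers this topic in depth, including area queries spanning multiple tiles and levels.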