HERE Workspace & Marketplace 2.8 release
Highlights for this release
OLP Marketplace as a Neutral Server
In December 2016, the European Automobile Manufacturers' Association (ACEA) announced plans to make vehicle sensor data available through external, third-party "neutral servers". The ACEA envisioned an ecosystem in which third parties could access vehicle data in a secure manner to create innovative services and safer driving experiences.
We agree, and have now extended the HERE OLP Marketplace to provide these Neutral Server capabilities, aligned with the ACEA Extended Vehicle concept, including:
- Access to vehicle sensor data - the Marketplace can now act as a secure, neutral, GDPR-compliant hub for data consumers to gain access to vehicle data from participating Automotive manufacturers
- Consent management - ensures privacy for owners and drivers of vehicles by allowing them to grant and revoke consent for specific third parties to access data generated by their vehicles
- ISO 20078 "ExVe" compliant interface - simplifies platform integration for data providers and consumers
For Data Providers - the Marketplace offers a new distribution channel for Automotive manufacturers who want to make their vehicle data available to third parties. HERE operates the secure, scalable data access point, offers your data to third parties, and handles the commercial details, while providing you with anonymized usage reporting. With integrated Consent Management, buyers and drivers of your vehicles can be confident that their privacy is protected: permission to access their data remains fully under their control, in compliance with privacy regulations such as GDPR.
For Data Consumers - the Marketplace provides standardized ISO 20078-compliant access to OEM data. When subscribing to Neutral Server listings, Data Consumers license and access data from HERE, leveling the playing field by protecting the Data Consumer's identity from Data Providers. For more information on subscribing to Neutral Server data, refer to the Marketplace Consumer User Guide.
Note: OLP Marketplace is currently not available in China.
Japan Map Coverage
HERE OLP Workspace now has highly detailed map coverage of Japan (not available in OLP China).
The Japan map data has been published in the HERE Map Content (HMC) proprietary schema, in a catalog named HERE Map Content Japan (hrn:here:data:::here-map-content-japan-2). You can develop with this new Japan map coverage like any other map coverage from HERE, leveraging all the same capabilities found in Workspace, including the OLP SDK and its location library.
Analytics availability in China
New additions to OLP China will include the Optimized Map for Analytics (OMA) catalog, built from the China map available in that region, and the OLP SDK for Python, bringing the analytics capabilities in that region to parity with what's available in OLP outside of China.
Big Data Connectors
Additional Big Data connectors are available with this release to facilitate your data analysis, consumption, and processing via Flink and Spark pipelines. These connectors integrate OLP proprietary catalogs and layers with the industry-standard abstractions of Flink (TableSource) and Spark (DataFrame). Where you previously had to implement this functionality yourself, you can now use these connectors to leverage the full searching, filtering, mapping, and sorting functionality offered by the standard Apache Flink and Spark frameworks. Spend less time writing low-level pipeline and data-integration code and more time on your use-case-specific business logic (see the example after the list below):
- A new Flink connector for reading from and writing to a stream layer.
- The Spark connector now also supports writing to a Versioned layer. This additional functionality completes the full read and write support for Versioned and Index layers.
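As a rough illustration of working with the Spark connector's output, here is a minimal Scala sketch. The readVersionedLayer helper, catalog HRN, layer ID, and column name are placeholders (the actual reader API and options are defined by the Data Client Library and its documentation); only the DataFrame operations that follow it are standard Spark:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object VersionedLayerReadSketch {

  // Hypothetical stand-in for the connector's reader call; consult the
  // Data Client Library documentation for the real API and options.
  def readVersionedLayer(spark: SparkSession, catalogHrn: String, layerId: String): DataFrame = ???

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("olp-spark-connector-sketch")
      .getOrCreate()

    // Illustrative HRN and layer ID only.
    val df: DataFrame = readVersionedLayer(
      spark,
      catalogHrn = "hrn:here:data:::example-catalog",
      layerId = "example-versioned-layer")

    // From here on, plain Spark: filter, project, and aggregate as usual.
    // "partition" is an illustrative column name.
    df.filter(col("partition").startsWith("2331"))
      .groupBy(col("partition"))
      .agg(count(lit(1)).alias("rows"))
      .show()

    spark.stop()
  }
}
```

Once the connector hands you a DataFrame, everything Spark SQL offers (joins, window functions, writing results back out) is available without additional integration code.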
Changes, Additions and Known Issues
SDK for Java and Scala
To read about updates to the SDK for Java and Scala, please visit the SDK Release Notes.
Web & Portal
Issue: The custom run-time configuration for a Pipeline Version has a limit of 64 characters for the property name and 255 characters for the value.
Workaround: For the property name, define a shorter name in the configuration and map it to the actual, longer name within the pipeline code (see the sketch below). For the property value, you must stay within the limit.
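One possible pattern for the property-name limit, sketched below in Scala: keep the key short in the Pipeline Version run-time configuration and translate it to the longer, descriptive name inside your pipeline code. All names here are illustrative, and how the run-time properties are surfaced to your code (system properties are used purely as an example) depends on your pipeline setup:

```scala
object RuntimeConfigMapping {
  // Map of short property names (set in the Pipeline Version run-time
  // configuration, within the 64-character limit) to the longer, descriptive
  // names used throughout the pipeline code. All names are illustrative.
  private val shortToLongName: Map[String, String] = Map(
    "src.ttl" -> "com.example.pipeline.source.cache.time-to-live-seconds",
    "out.cmp" -> "com.example.pipeline.output.compression-codec"
  )

  /** Resolve a descriptive property name to its value by looking up the
    * short key in whatever configuration source the pipeline receives
    * (system properties are used here only as an example).
    */
  def property(longName: String): Option[String] =
    shortToLongName
      .collectFirst { case (short, long) if long == longName => short }
      .flatMap(short => Option(System.getProperty(short)))
}

// Usage:
// RuntimeConfigMapping.property("com.example.pipeline.output.compression-codec")
```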
Issue: Pipeline Templates can't be deleted from the Portal UI.
Workaround: Use the CLI or API to delete Pipeline Templates.
Issue: In the Portal, new jobs and operations are not automatically added to the list of jobs and operations for a pipeline version while the list is open for viewing.
Workaround: Refresh the Jobs and Operations pages to see the latest job or operation in the list.
Account & Permissions
Issue: A finite number of access tokens (approximately 250) is available for each app or user. Depending on the number of resources included, this number may be smaller.
Workaround: Create a new app or user if you reach the limitation.
Issue: Only a finite number of permissions are allowed for each app or user in the system, across all services. The effective limit is reduced depending on the resources included and the types of permissions granted.
Issue: All users and apps in a group are granted permissions to perform all actions on any pipeline associated with that group. There is no support for users or apps with limited permissions; for example, you cannot have a reduced role that can only view pipeline status but not start and stop a pipeline. Limit the users in a pipeline's group to only those users who should have full control over the pipeline.
Issue: When updating permissions, it can take up to an hour for changes to take effect.
Data
Issue: Partition list information normally available within the Layer/Partitions tab of the Portal UI does not display correctly for Volatile layers created after September 10. This known issue is actively being worked on and should be resolved shortly.
Temporary workaround: To get partition list information for volatile layers created after September 10, use the appropriate CLI command with the --modified-since parameter, passing either a year (--modified-since 2018) or a specific point in time (--modified-since <time-of-interest>).
Example: olp catalog layer partition list <your-catalog-HRN> <your-volatile-layer-ID> --modified-since 2018
Issue: Catalogs not associated with a realm are not visible in OLP.
Issue: Visualization of Index Layer data is not yet supported.
Issue: When you use the Data API or Data Library to create a Data Catalog or Layer, the app credentials used do not automatically enable the user who created those credentials to discover, read, write, manage, and share those catalogs and layers.
Workaround: After the catalog is created, use the app credentials to enable sharing with the user who created the app credentials. You can also share the catalog with other users, apps, and groups.
Pipelines
Fixed: Previously, when a Stream pipeline version running with high-availability mode enabled was paused or canceled from the OLP Portal, or a new pipeline version was created within the same pipeline, high-availability mode remained enabled when the existing pipeline version was resumed or re-activated, or when the new pipeline version was activated. The correct behavior is for high-availability mode to be disabled by default in these cases, and a fix has been released to enforce this.
Issue: A pipeline failure or exception can sometimes take several minutes to be reported.
Issue: Pipelines can still be activated after a catalog is deleted.
Workaround: The pipeline will fail when it starts running and will show an error message about the missing catalog. Re-check the missing catalog or use a different catalog.
Issue: If several pipelines are consuming data from the same stream layer and belong to the same Group (pipeline permissions are managed via a Group), then each of those pipelines will only receive a subset of the messages from the stream. This is because, by default, the pipelines share the same Application ID.
Workaround: Use the Data Client Library to configure how your pipelines consume from the stream. If your pipelines/applications use the Direct Kafka connector, specify a unique Kafka consumer group ID per pipeline/application; when the consumer group IDs are unique, each pipeline/application receives all the messages from the stream (see the sketch below).
If your pipelines use the HTTP connector, we recommend creating a new Group for each pipeline/application, each with its own Application ID.
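For context, the essential point with the Direct Kafka connector is that each pipeline consumes with its own Kafka consumer group. Where exactly this is set depends on your Data Client Library configuration; the Scala sketch below only shows the generic kafka-clients consumer properties involved, with illustrative bootstrap servers and group naming:

```scala
import java.util.Properties
import org.apache.kafka.clients.consumer.ConsumerConfig

object StreamConsumerSettings {
  /** Build Kafka consumer properties with a consumer group unique to this
    * pipeline, so that several pipelines reading the same stream layer each
    * receive every message instead of splitting one group's partitions.
    * The bootstrap servers and naming scheme are illustrative placeholders.
    */
  def forPipeline(pipelineName: String): Properties = {
    val props = new Properties()
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<kafka-bootstrap-servers>")
    // A unique group.id per pipeline/application is what prevents message splitting.
    props.put(ConsumerConfig.GROUP_ID_CONFIG, s"my-app-$pipelineName")
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
    props
  }
}
```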
Issue: The Pipeline Status Dashboard in Grafana can currently be edited by users. Any changes a user makes will be lost when updated dashboards are published in future releases, and the ability to edit this dashboard will be removed in a future release.
Workaround: Duplicate the dashboard or create a new dashboard.
Issue: For Stream pipeline versions running with the high-availability mode, in a rare scenario, the selection of the primary Job Manager fails.
Workaround: Restart the stream pipeline.
Issue: When a paused Batch pipeline version is resumed from the Portal, the option to change the execution mode is displayed, but the change does not actually take effect. This functionality is not yet supported, and the option will be removed soon.
Issue: When a paused pipeline is resumed from the Portal, the option to change the runtime credentials is displayed, but the change does not actually take effect. This will be fixed soon, after which you will be able to change the runtime credentials from the Portal while resuming a pipeline version.
Marketplace
Added Neutral Server capabilities to Marketplace: As an automotive data provider, you can now create a ticket with Technical Support to inquire about integrating with our ISO 20078-compliant interface and letting the HERE Marketplace act as a Neutral Server. Once your data is listed, Data Consumers can find it under the provider name HERE Neutral Server and subscribe to it by selecting one of the available subscription options. For more information, refer to the Marketplace Consumer User Guide.
Added consent management for PII data distribution: For automotive data providers, the Neutral Server can facilitate access to your data containing personally identifiable information (PII) by managing your data subjects' consents before sharing. When Data Consumers subscribe to a data listing that contains PII, they are directed to the consent management workflow to request the data subjects' consent before data access is granted. For more information, refer to the Neutral Server and Consent Management section in the Marketplace Consumer User Guide.
Issue: Users do not receive stream data usage metrics when reading or writing data via the Direct Kafka connector.
Workaround: When writing data into a stream layer, you must use the ingest API to receive usage metrics. When reading data, you must use the Data Client Library, configured to use the HTTP connector type, to receive usage metrics and read data from a stream layer.
Issue: When the Technical Accounting system is busy, usage metrics can be lost.
Workaround: If you suspect you are losing usage metrics, contact HERE technical support for assistance rerunning queries and validating data.
Notebooks
Deprecated: OLP Notebooks has been deprecated; please use the new OLP SDK for Python instead, which offers a number of advantages and enhancements. Download your existing notebooks and refer to the Zeppelin Notebooks Migration Guide for further instructions.
OLP SDK for Python
Issue: Currently, only macOS and Linux distributions are supported.
Workaround: If you are using Windows, we recommend using a virtual machine.
Optimized Map for Analytics (OMA)
Added: OMA now includes all ADAS layer attributes.