Real-time data linkage via Linked Data Event Streams

Exchanging data in real time across domains and applications is challenging: data formats are incompatible, datasets arrive late or go stale, quality varies, and metadata and context are often missing. A Linked Data Event Stream (LDES) is a new data publishing approach that lets you publish any dataset as a collection of immutable objects. The focus of an LDES is to allow clients to replicate the history of a dataset and efficiently synchronize with its latest changes. (via towardsai.net)

Using a Linked Data Event Stream (LDES), data can be shared seamlessly between different systems and organizations. In this way, companies and organizations can ensure that their data is well structured, interoperable, and easily consumable by other systems and services. LDES has emerged as a standard for representing and sharing up-to-date data streams.
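
To make the replicate-and-synchronize idea concrete, here is a minimal Python sketch of an LDES client that walks the pages of a stream, following the TREE relations between pages and collecting the immutable members. The start URL is hypothetical and the sketch assumes the server publishes Turtle pages; real streams may paginate or serialize differently, and the article's own setup uses Apache NiFi rather than a hand-rolled client.

```python
# Hypothetical LDES replication sketch: follow tree:relation links between
# pages and yield every tree:member (the immutable versioned objects).
from rdflib import Graph, Namespace

TREE = Namespace("https://w3id.org/tree#")

START_PAGE = "https://example.org/observations/ldes"  # hypothetical LDES view


def replicate(start_page: str):
    """Visit every reachable page of the stream and yield its member IRIs."""
    to_visit, seen_pages, seen_members = [start_page], set(), set()
    while to_visit:
        page = to_visit.pop()
        if page in seen_pages:
            continue
        seen_pages.add(page)

        g = Graph()
        g.parse(page, format="turtle")  # assumes the server serves Turtle

        # Collect the immutable members published on this page.
        for _, _, member in g.triples((None, TREE.member, None)):
            if member not in seen_members:
                seen_members.add(member)
                yield member, g

        # Follow tree:relation -> tree:node links to the next pages.
        for _, _, relation in g.triples((None, TREE.relation, None)):
            for _, _, node in g.triples((relation, TREE.node, None)):
                to_visit.append(str(node))


if __name__ == "__main__":
    for member, page_graph in replicate(START_PAGE):
        print(member)
```

Because members are immutable, synchronizing later is just a matter of re-walking the relations and skipping members already seen, which is what dedicated LDES clients do incrementally.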

Further in the article:

  • Linked Data Event Streams explained in 8 minutes
  • Onboarding of a Linked Data Event Stream
  • Interlink multiple Linked Data Event Streams
  • SPARQL query
  • Combining multiple data streams via their semantics

To replicate the whole proof of concept (LDES 2 GraphDB), please go to the provided GitHub repository. It describes how to set up GraphDB and Apache NiFi via Docker, after which the data flow can be started using the supplied Apache NiFi setup file. Nice one!
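
Once the NiFi flow has loaded the stream members into GraphDB, the repository can be queried over the standard SPARQL protocol. The sketch below is an assumption-laden example, not taken from the referenced repository: the repository id "ldes" and the exact shape of the stored triples are placeholders.

```python
# Hedged sketch: query a GraphDB repository over the SPARQL protocol.
# Repository id "ldes" and the triple shapes are assumptions for illustration.
import requests

GRAPHDB_ENDPOINT = "http://localhost:7200/repositories/ldes"  # assumed repository id

QUERY = """
PREFIX ldes: <https://w3id.org/ldes#>
PREFIX tree: <https://w3id.org/tree#>

SELECT ?member ?p ?o
WHERE {
  ?stream a ldes:EventStream ;
          tree:member ?member .
  ?member ?p ?o .
}
LIMIT 25
"""

response = requests.post(
    GRAPHDB_ENDPOINT,
    data={"query": QUERY},                                  # form-encoded SPARQL query
    headers={"Accept": "application/sparql-results+json"},  # ask for JSON results
)
response.raise_for_status()

for binding in response.json()["results"]["bindings"]:
    print(binding["member"]["value"], binding["p"]["value"], binding["o"]["value"])
```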

[Read More]

Tags data-science streaming performance how-to big-data apache