High volume message upload with Streaming Analytics and exactly once semantics

Description

Data Historian - Event Streams to Object Storage Demo

The demo covers a very common use case: input data is read from Event Streams and written to IBM Cloud Object Storage (COS). The objects created on COS can then be queried, for example, with the IBM SQL Query service.
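
A minimal SPL sketch of this flow is shown below. It assumes an application configuration named "eventstreams" holds the Event Streams credentials; the topic name, bucket URI, and endpoint are placeholders, not values from the demo:

    use com.ibm.streamsx.messagehub::MessageHubConsumer;
    use com.ibm.streamsx.objectstorage::ObjectStorageSink;

    composite DataHistorianSketch {
        graph
            // Read messages from an Event Streams topic; with no partition
            // parameter given, Kafka consumer groups balance the topic
            // partitions across the running consumers.
            stream<rstring message> Messages = MessageHubConsumer() {
                param
                    appConfigName: "eventstreams"; // assumed configuration name
                    topic: "input";                // placeholder topic
            }

            // Write the received messages as objects to a COS bucket,
            // closing an object after a fixed number of tuples. COS
            // credentials are assumed to come from an application
            // configuration as well.
            () as COSSink = ObjectStorageSink(Messages) {
                param
                    objectStorageURI: "cos://my-bucket/";                   // placeholder bucket
                    endpoint: "s3.us.cloud-object-storage.appdomain.cloud"; // placeholder endpoint
                    objectName: "data_%OBJECTNUM.json";
                    tuplesPerObject: 10000l;
            }
    }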

The demo applications integrate IBM Streams capabilities such as consistent regions, user-defined parallelism, and optional data types to provide the following features (see the annotation sketch after this list):

  • Scalability
  • Data integrity - Guaranteed processing with exactly once semantics
  • Resiliency
  • Simplicity - Takes advantage of Kafka consumer groups
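
As a rough illustration of how these features map onto SPL, the consumer invocation can carry the annotations below; the trigger period and channel width are illustrative values, not the demo's actual settings:

    // @consistent starts a periodic consistent region, which provides the
    // guaranteed, exactly-once processing noted above; @parallel replicates
    // the consumer, and the replicas join the same Kafka consumer group.
    @consistent(trigger = periodic, period = 60.0)
    @parallel(width = 3)
    stream<rstring message> Messages = MessageHubConsumer() {
        param
            appConfigName: "eventstreams"; // assumed configuration name
            topic: "input";                // placeholder topic
    }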

Import

The demo contains two IBM Streams applications.

Requirements

IBM Streams 4.3

Set up the IBM Cloud services: Setup

To achieve high throughput with large data volumes, the Streaming Analytics premium service plan is required; see setup with premium service plan.

Customize Streams Console Dashboard

Optional: Import dashboard configuration file: Dashboard

Launch the applications

Instructions on how to launch the SPL applications in the Streaming Analytics service: Launch SPL applications

To achieve high throughput with large data volumes, the Streaming Analytics premium service plan is required; see Launch SPL applications with premium service plan.

Alternatively, try the IBM Streams Python Application Demo: Launch Python application

Utilized Toolkits

  • com.ibm.streamsx.json
  • com.ibm.streamsx.messagehub
  • com.ibm.streamsx.objectstorage