IBM MQ demo

This repository demonstrates how to use the IBM MQ connector. Two connectors will be started: a Datagen source, which mocks clickstream data, and an IBM MQ source connector. We'll then use KSQL to join the two sources together. No sink connector is configured.

Make commands

make build
make cluster
# wait a minute for the cluster to spin up
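Before moving on, you can confirm the cluster is actually up. A minimal sketch, assuming the demo is driven by a docker-compose file in this directory (service names will vary):

```shell
# List the demo's containers and their state; every service should
# report "Up" before you create topics and connectors.
docker-compose ps
```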

Make the topics and connectors

make topic
make connect
# wait a minute before moving on to the next step
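Rather than waiting a fixed minute, you can poll the Kafka Connect REST API to see when both connectors are registered. This assumes Connect is exposed on localhost:8083, the usual cp-all-in-one default:

```shell
# List registered connectors; both should appear before the next step.
curl -s http://localhost:8083/connectors

# Then check a connector's status by name (substitute a name from the
# list above), looking for "state": "RUNNING".
curl -s http://localhost:8083/connectors/<connector-name>/status
```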

Open the IBM MQ Dashboard

Log in with:

UserName=admin
Password=passw0rd
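The MQ web console in the official IBM MQ Docker image usually listens on port 9443. If the compose file keeps that default mapping (an assumption; check the compose file if this doesn't load), you can open the dashboard with:

```shell
# Open the IBM MQ web console in a browser.
# macOS shown; use xdg-open on Linux.
open https://localhost:9443/ibmmq/console/
```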

Show AVRO schema in C3 topics

Go to the link below to view the AVRO schema the Datagen connector registered with Schema Registry. clickstream schema

You need to send a message to IBM MQ before the schema will appear in the topic in C3.
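You can also check which schemas have been registered from the command line. A hedged sketch, assuming Schema Registry is exposed on localhost:8081 (the cp-all-in-one default) and the default TopicNameStrategy subject naming:

```shell
# List registered subjects. Expect to see the clickstream value subject
# once datagen has produced data, and the ibmmq value subject only
# after the first MQ message has been sent.
curl -s http://localhost:8081/subjects
```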

  • Select DEV.QUEUE.1 under "Queues on MQ1"

ibmmq

  • Add a message

add image

  • You can now see the schema assigned to the ibmmq topic

ibmmq topic

AVRO messages appear in the consumer

Run the ibmmq consumer to see messages coming in from DEV.QUEUE.1

make consumer
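Under the hood, `make consumer` will run something like the Avro console consumer. A hand-run equivalent might look like the following; the service name, broker address, and ports are assumptions based on typical cp-all-in-one compose files, so adjust to match this demo's Makefile:

```shell
# Consume Avro-decoded records from the ibmmq topic from the beginning.
docker-compose exec connect kafka-avro-console-consumer \
  --bootstrap-server broker:9092 \
  --property schema.registry.url=http://schema-registry:8081 \
  --topic ibmmq \
  --from-beginning
```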

KSQL

Create the stream from the CLICKSTREAM topic in C3

Make sure to leave the timestamp blank to use the topic timestamp by default

clickstream stream
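If you prefer the KSQL Editor to the C3 form, an equivalent statement would look like the one below (a sketch: it assumes the topic is named CLICKSTREAM, as the heading above suggests, and omits the TIMESTAMP property so the topic timestamp is used by default):

```sql
CREATE STREAM CLICKSTREAM
  WITH (KAFKA_TOPIC='CLICKSTREAM',
        VALUE_FORMAT='AVRO');
```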

Add another message to DEV.QUEUE.1

You can use the user names bobk_43 or akatz1022 to capture clickstreams for those users with a KSQL join.

Create the Stream for the IBMMQ topic

This time we will use KSQL to create the stream. Paste the KSQL statement into the KSQL Editor.

CREATE STREAM ibmmq
  WITH (KAFKA_TOPIC='ibmmq',
        VALUE_FORMAT='AVRO');

JOIN the 2 streams

Paste the KSQL statement into the KSQL Editor to perform the join.

SELECT *
  FROM CLICKSTREAM
  JOIN IBMMQ WITHIN 5 SECONDS
    ON text = username;

join