The ProptechOS Streaming API is delivered via Azure Event Hubs and can be consumed with a Kafka consumer.
See Apache - Kafka clients and Apache - Kafka quickstart for general information on Kafka and Kafka consumers.
See the quickstart tutorials at Github - Azure eventhubs for kafka for the specifics of consuming Azure Event Hubs via the Kafka protocol.
The Microsoft docs resource Event Hubs for Kafka ecosystem overview explains how Event Hubs concepts translate to Kafka concepts.
In summary:
- a Kafka topic translates to an Event Hub
- the Kafka 'bootstrap.servers' property translates to the Event Hubs namespace path (e.g. sb://idun-myproptechos-eventhubs-streamingapi.servicebus.windows.net/); Kafka clients connect to the namespace FQDN on port 9093
- the Kafka 'security.protocol' property should be set to 'SASL_SSL'
- the Kafka 'sasl.username' property should be set to '$ConnectionString' (literally)
- the Kafka 'sasl.password' property should be set to '<the connection string>'
See a full working example: Java Spring ProptechOS Streaming API consumer.
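For orientation, below is a minimal sketch (not the full example above) of a plain Java Kafka consumer wired up with the settings listed above; the event hub (topic) name and connection string are placeholders. Note that the Kafka Java client passes the SASL username and password through the 'sasl.jaas.config' property, whereas the separate 'sasl.username'/'sasl.password' properties are used by librdkafka-based clients.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class StreamingApiConsumer {

    public static void main(String[] args) {
        // Placeholder values -- substitute your own namespace, event hub name and connection string.
        String namespace = "idun-myproptechos-eventhubs-streamingapi.servicebus.windows.net";
        String eventHubName = "<your event hub / topic name>";
        String connectionString = "<the connection string>";

        Properties props = new Properties();
        // The Event Hubs Kafka endpoint is the namespace FQDN on port 9093.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, namespace + ":9093");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "$Default");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Authentication: SASL_SSL + PLAIN, with '$ConnectionString' as the literal username
        // and the Event Hubs connection string as the password. In the Java client these are
        // supplied via 'sasl.jaas.config' rather than separate username/password properties.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"$ConnectionString\" password=\"" + connectionString + "\";");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // The Kafka topic corresponds to the Event Hub.
            consumer.subscribe(Collections.singletonList(eventHubName));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value()); // JSON payload carrying the observations
                }
            }
        }
    }
}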
...
"observations":
[
{
"value": "21.5",
"quantityKind": "Temperature",
"sensorId": "0234c884-f8dc-48d6-b627-2f0d8f8705d6",
"observationTime": "2019-06-06T13:37:32.379Z"
}
]
The Idun Streaming message contains a JSON formatted RealEstateCore (https://www.realestatecore.io) Observation and the URI of the sensor that produced the observation. To get additional information on the message and the sensor, follow the sensor URI to the related entities (e.g. the Device, the BuildingStructureComponent, the RealEstate or related Actuators).
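As a sketch, assuming a message body whose "observations" array matches the sample above (the real payload may carry additional fields), the JSON could be mapped to simple POJOs with a library such as Jackson:

import java.util.List;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ObservationParser {

    // Hypothetical POJOs mirroring the sample payload shown above.
    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class StreamingMessage {
        public List<Observation> observations;
    }

    @JsonIgnoreProperties(ignoreUnknown = true)
    public static class Observation {
        public String value;
        public String quantityKind;
        public String sensorId;
        public String observationTime;
    }

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static StreamingMessage parse(String json) throws Exception {
        // Unknown fields are ignored, so extra properties in the real message do not break parsing.
        return MAPPER.readValue(json, StreamingMessage.class);
    }
}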
For information about Event Hubs and how to consume messages via the EventProcessorHost, please refer to https://docs.microsoft.com/en-us/azure/event-hubs/event-hubs-programming-guide
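In the current Azure SDK for Java the EventProcessorHost model is provided by EventProcessorClient; a minimal, hypothetical sketch (connection strings, hub name and checkpoint container below are placeholders) might look like:

import com.azure.messaging.eventhubs.EventProcessorClient;
import com.azure.messaging.eventhubs.EventProcessorClientBuilder;
import com.azure.messaging.eventhubs.checkpointstore.blob.BlobCheckpointStore;
import com.azure.storage.blob.BlobContainerAsyncClient;
import com.azure.storage.blob.BlobContainerClientBuilder;

public class EventHubProcessorExample {

    public static void main(String[] args) throws InterruptedException {
        // Placeholders -- substitute your Event Hubs connection string, hub name,
        // and a blob storage container used for checkpointing.
        String eventHubConnectionString = "<the connection string>";
        String eventHubName = "<your event hub name>";
        String storageConnectionString = "<your storage account connection string>";
        String storageContainerName = "<your checkpoint container>";

        BlobContainerAsyncClient blobContainer = new BlobContainerClientBuilder()
                .connectionString(storageConnectionString)
                .containerName(storageContainerName)
                .buildAsyncClient();

        EventProcessorClient processor = new EventProcessorClientBuilder()
                .connectionString(eventHubConnectionString, eventHubName)
                .consumerGroup("$Default")
                .checkpointStore(new BlobCheckpointStore(blobContainer))
                .processEvent(eventContext -> {
                    // Each event body is the JSON streaming message described above.
                    System.out.println(eventContext.getEventData().getBodyAsString());
                    eventContext.updateCheckpoint();
                })
                .processError(errorContext ->
                        System.err.println("Error: " + errorContext.getThrowable().getMessage()))
                .buildEventProcessorClient();

        processor.start();
        Thread.sleep(60_000); // run for a minute in this sketch, then shut down
        processor.stop();
    }
}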