IoT Analytics Lifecycle Tutorials & Snippets
for developers

Tutorials


SAS ESP for ONNX Runtime
In this tutorial, learn how to integrate ONNX Runtime with SAS Event Stream Processing (ESP) on a Tiny YOLOv2 model, a real-time object detection network.

Key takeaways from the example: 
 • integrate ONNX Runtime with SAS Event Stream Processing
 • leverages an Intel CPU with an optional CUDA TensorRT GPU
 • sample UI developed with SAS ESP Connect API

>> Resources on GitHub <<

Overview

Open Neural Network Exchange (ONNX) is an open standard format to represent machine learning models. ONNX Runtime is an accelerator for machine learning models with support for multiple platforms and the flexibility to be integrated with a variety of frameworks. This enables developers to approach more complex use cases while maintaining efficient building and testing efforts.

This demo showcases the integration of ONNX Runtime with SAS Event Stream Processing on a Tiny YOLOv2 model, a real-time object detection network. The demo runs on an Intel CPU, with or without a CUDA TensorRT GPU, as per user specifications. It is based on a Docker configuration, so having Docker installed is required. The sample UI was developed specifically for this demo with a separate API, SAS ESP Connect.


Stream data to SAS ESP model using Azure Event Hub Connector
In this tutorial, learn how to stream data from an Azure event hub into a SAS Event Stream Processing (ESP) model using the Azure EventHub Connector.

Key takeaways from the example: 
 • configure an Azure Logic application to query a service
 • format response data and send it to an Azure Event Hub
 • create an ESP project to receive events from the Event Hub using the ESP Azure Event Hub Connector

>> Resources on GitHub <<

Overview

In this project we will configure an Azure Logic application to query the Azure Maps Weather service once an hour. The logic app will then format the data returned from the weather API into the required JSON string and send it to an Azure Event Hub. Once in the Event Hub, it is available to other Azure resources and applications. We will create an ESP project that receives the events from the Event Hub using the ESP Azure Event Hub Connector.
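As a rough sketch of the receiving end, an ESP Source window takes the events in through a connector block. The connector class and property names below are illustrative placeholders rather than confirmed values; consult the tutorial files and the SAS ESP connector documentation for the exact names:

<window-source name="weather_in" insert-only="true">
  <schema-string>id*:int64,obs_time:stamp,temperature:double</schema-string>
  <connectors>
    <!-- type=pub means the connector publishes events INTO the model -->
    <connector class="eventhubs" name="eh_in">
      <properties>
        <property name="type">pub</property>
        <!-- placeholder property names; verify against the SAS documentation -->
        <property name="connectionstring">Endpoint=sb://...</property>
        <property name="eventhubname">weatherhub</property>
      </properties>
    </connector>
  </connectors>
</window-source>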


Process Azure IoT Hub Messages With SAS ESP
In this tutorial, learn how to process messages from an Azure IoT Hub with a SAS Event Stream Processing (ESP) project.

Key takeaways from the example: 
 • configure a simulated IoT Device to send sensor data to an IoT Hub
 • configure the IoT Hub to expose the message stream via an event hub endpoint
 • connect a SAS Event Stream Processing project to process the messages

>> Resources on GitHub <<

Overview

In this project we will configure a simulated IoT Device to send sensor data to an IoT Hub, configure the IoT Hub to expose the message stream via an event hub endpoint, and then connect a SAS Event Stream Processing project to process the messages.


Deploy ASTORE from SAS Model Manager in SAS ESP
In this tutorial, we focus on the integrated process of model management and analytics deployment.

Key takeaways from the example: 
 • import an Analytics Store (ASTORE) from models built in SAS Model Studio, SAS Studio, or Jupyter Notebook into SAS Model Manager
 • deploy ASTORE from SAS Model Manager in SAS Event Stream Processing Studio

>> Resources on GitHub <<

Overview

The IoT analytical life cycle expands traditional analysis processing beyond investigation of stored data (i.e., at rest) to analytical investigation of streaming data (i.e., in motion) or at the edge. The combined capability of data management, streaming analytics execution, and intelligent decisioning enables fast and confident decision making from data, to discovery, to deployment.

SAS Analytics for IoT offers an optimized IoT solution ecosystem and addresses the entire analytical life cycle.

In this tutorial we are focusing on the integrated process of model management and analytics deployment.

Tags

Category: Model Management
Sub-Category: ASTORE Models

Processing Streaming Trade Data
In this tutorial, learn how to process streaming trade data using SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • view and edit a model using a text editor and SAS ESP Studio
 • execute the model using the SAS ESP XML Server
 • subscribe to the output using SAS ESP Streamviewer

>> Resources on GitHub <<

Overview

The Processing Streaming Trade Data model is an XML model included in the examples that are installed with SAS Event Stream Processing (ESP). It includes five ESP window types that perform various tasks on the stream.

1. Data Source Windows
There are two data source windows. The Trades window inputs the transactional records for each trade. The Traders window inputs the names of the traders, which are joined with the stream. Neither window includes an input data connector. Therefore, a file/socket adapter command must be used to connect the files to the windows.

2. Filter Window
Filter windows use expressions, user-defined functions, and registered plug-in functions to set up a filter condition. The LargeTrades window uses the simple expression, quantity >= 100, to filter out small trades.

3. Join Window
Streaming joins take event streams from two windows and combine them into a single stream based on a key field in each stream. This model performs a one-to-many join because one trader name affects many trade events. Therefore, this is a left-outer join.

4. Compute Window
Compute windows take the input stream and create an output stream using computational manipulation. New output field values are created using expressions, user-defined functions, or plug-in functions. Fields from the input stream can also be passed to the output stream without manipulation. This model passes through all input fields and creates the field TotalCost using the simple expression price*quantity.

5. Aggregate Window
Aggregate windows place input events into groups based on one or more key fields. Output values of non-key fields are aggregated using available functions. In our model we use the sum function to output total quantity and cost by security code.
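For orientation, here is a skeleton of those five windows in ESP model XML. This is a simplified reconstruction from the window descriptions above, not a copy of the shipped example, so treat the field names and expressions as illustrative:

<contquery name="trades_cq">
  <windows>
    <!-- 1. Data source windows (no connectors; fed by the file/socket adapter) -->
    <window-source name="Trades">
      <schema-string>tradeID*:int64,symbol:string,price:double,quantity:int32,traderID:int64</schema-string>
    </window-source>
    <window-source name="Traders">
      <schema-string>ID*:int64,name:string</schema-string>
    </window-source>
    <!-- 2. Filter window: drop small trades -->
    <window-filter name="LargeTrades">
      <expression>quantity >= 100</expression>
    </window-filter>
    <!-- 3. Join window: one-to-many left-outer join on the trader key -->
    <window-join name="AddTraderName">
      <join type="leftouter">
        <conditions>
          <fields left="traderID" right="ID"/>
        </conditions>
      </join>
      <output>
        <field-selection name="symbol" source="l_symbol"/>
        <field-selection name="price" source="l_price"/>
        <field-selection name="quantity" source="l_quantity"/>
        <field-selection name="name" source="r_name"/>
      </output>
    </window-join>
    <!-- 4. Compute window: pass fields through and add TotalCost -->
    <window-compute name="AddTotalCost">
      <schema-string>tradeID*:int64,symbol:string,quantity:int32,totalCost:double</schema-string>
      <output>
        <field-expr>symbol</field-expr>
        <field-expr>quantity</field-expr>
        <field-expr>price*quantity</field-expr>
      </output>
    </window-compute>
    <!-- 5. Aggregate window: totals by security code -->
    <window-aggregate name="BySymbol">
      <schema-string>symbol*:string,totalQuantity:int32,totalCost:double</schema-string>
      <output>
        <field-expr>ESP_aSum(quantity)</field-expr>
        <field-expr>ESP_aSum(totalCost)</field-expr>
      </output>
    </window-aggregate>
  </windows>
  <edges>
    <edge source="Trades" target="LargeTrades"/>
    <edge source="LargeTrades" target="AddTraderName"/>
    <edge source="Traders" target="AddTraderName"/>
    <edge source="AddTraderName" target="AddTotalCost"/>
    <edge source="AddTotalCost" target="BySymbol"/>
  </edges>
</contquery>

Because neither source window has a connector, the two files are streamed in at run time with a file/socket adapter command such as dfesp_fs_adapter, as noted above.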

Tags

Category: Data Transformation
Sub-Category: Aggregate
Sub-Category: Compute
Sub-Category: Join

Aggregating Stock Transactions
In this tutorial, learn how to aggregate stock transactions using SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • aggregate values, execute an XML model, and subscribe with a file/socket adapter
 • model description, editing, executing, and subscribing
 • view, edit, and test model

>> Resources on GitHub <<

Overview

The Aggregating Stock Transactions model is a simple XML model included in the examples that are installed with SAS Event Stream Processing (ESP). It includes a Source window with an Input Data Connector and an Aggregate window to perform aggregate functions on the stream. 

The model reads stock transactions for a set of stock symbols. The input stream includes the following for each transaction:
     • stock symbol
     • number of shares
     • price of the stock

The model then aggregates the total number of shares traded for each symbol. This repository includes the files required to execute the example and video demonstrations that include the following topics:
     • viewing and editing the model using a text editor
     • viewing and editing the model in SAS ESP Studio (optional)
     • executing the model using the SAS ESP XML Server
     • subscribing to the output using the file/socket adapter command and writing the results to a CSV file
     • subscribing to the output using SAS ESP Streamviewer (optional)
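Unlike the trade data example above, the Source window here embeds its own input data connector, so no separate adapter command is needed to feed it. A minimal sketch, with illustrative field and file names:

<window-source name="transactions" insert-only="true">
  <schema-string>ID*:int64,symbol:string,shares:int32,price:double</schema-string>
  <connectors>
    <!-- file/socket connector publishing a CSV file into the model -->
    <connector class="fs" name="input">
      <properties>
        <property name="type">pub</property>
        <property name="fstype">csv</property>
        <property name="fsname">transactions.csv</property>
      </properties>
    </connector>
  </connectors>
</window-source>
<window-aggregate name="totalShares">
  <schema-string>symbol*:string,sharesTotal:int32</schema-string>
  <output>
    <!-- running total of shares per symbol (the group-by key) -->
    <field-expr>ESP_aSum(shares)</field-expr>
  </output>
</window-aggregate>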

Tags

Category: Data Transformation
Sub-Category: Aggregate

Deploy SAS Tiny YoloV2 models on OpenVINO
In this tutorial, learn how to convert and score SAS Tiny YoloV2 computer vision models leveraging the Intel OpenVINO framework and ONNX format using SAS IoT Analytics.

Key takeaways from the example: 
 • convert and score SAS Tiny YoloV2 computer vision models
 • leverage the Intel OpenVINO framework and ONNX format
 • functional differences between ONNX format and native SAS ASTORE

>> Resources on GitHub <<

Overview

The purpose of this tutorial is to provide a guideline on how to convert and score SAS Tiny YoloV2 computer vision models leveraging the Intel OpenVINO framework and ONNX format.

We will also highlight some of the functional differences between the ONNX format and the native SAS ASTORE format to take into consideration.

OpenVINO is an inferencing (scoring) toolkit provided by Intel that aims to ensure quick deployment on Intel hardware of applications and solutions that emulate human vision by:
     • enabling Convolutional Neural Networks (CNN) based deep learning Inference (Scoring) on the edge
     • supporting heterogeneous execution across Intel CPU, Intel Integrated Graphics, Intel FPGA, Intel Movidius Neural Compute Stick, Intel Neural Compute Stick 2 and Intel Vision Accelerator Design with Intel Movidius VPUs

Additional information on the toolkit can be found on the OpenVINO toolkit page.

Tags

Category: Computer Vision
Sub-Category: Image Classification
Analytical Method: Tiny YoloV2

Streaming Live Weather Data
In this tutorial, learn how to stream live weather data into any SAS Event Stream Processing (ESP) model.

Key takeaways from the example: 
 • capture live weather data using REST APIs
 • use ESP's URL connector to publish data into a model
 • read and transform JSON messages into events readable by ESP

>> Resources on GitHub <<

Overview

The purpose of this tutorial is to demonstrate how to stream live weather data into any SAS Event Stream Processing (ESP) model. The GitHub repository also includes an examples directory that contains sets of configuration, model, and output files.

This example works with any weather service that returns a JSON message by way of an API call. A configuration file defines the URL connector, which requests the data using an API call, transforms the data, and publishes events to the model.

After starting the model, the ESP XML Server uses the output data connector to create a CSV file from the returned data.
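A sketch of the publishing side, assuming the url connector class: the configUrl property points at the configuration file that defines the request URL, polling interval, and JSON-to-event mapping. Treat the property name as an assumption and verify it against the SAS connector documentation:

<window-source name="weather" insert-only="true">
  <schema-string>id*:int64,city:string,temperature:double,pressure:double</schema-string>
  <connectors>
    <connector class="url" name="weather_api">
      <properties>
        <property name="type">pub</property>
        <!-- the referenced file holds the API URL, interval, and field mappings -->
        <property name="configUrl">file:///config/weather-config.json</property>
      </properties>
    </connector>
  </connectors>
</window-source>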


Event Retention and Calculating Throughput
Learn how to retain events and how to calculate the event throughput rate using SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • configure event retention on time and volume
 • calculate throughput based on a function
 • use a Copy window to retain events
 • write and execute user-defined functions in a Compute window

>> Resources on GitHub <<

Overview

The Event Retention and Calculating Throughput model is an XML model included in the examples that are installed with SAS Event Stream Processing (ESP). It includes one source window, one compute window, three copy windows, and three aggregate windows that perform various tasks on the stream.

The model is based on the following components:

1. Data Source Window
There is a single source window named source_win. The window inputs the transactional records using an input data connector.

2. Compute Window
Compute windows take the input stream and create an output stream using computational manipulation. This model uses a user-defined function to calculate the throughput rate. User-defined functions contain two parts: an initializer function that executes when the model starts, and a main portion that executes each time an event passes through the stream.

3. Copy Window 
There are three copy windows, each retaining events based on a different retention technique. The retained events are then passed to the aggregate windows.

4. Aggregate Window
Aggregate windows place input events into groups based on one or more key fields. In this model there are three aggregate windows, each tied to one of the copy windows. The aggregate windows are identical; they differ only in their input streams, which are controlled by the retention values in the copy windows. Each aggregate window has an output data connector to create a CSV file of the output.

Once setup is complete and data is streaming through the model, you can subscribe to the four output windows using SAS ESP Streamviewer. 
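The retention behavior lives entirely in the copy windows. A sketch of two of the variants, using ESP's documented retention types (the values here are illustrative):

<!-- sliding time-based retention: keep only the last 300 seconds of events -->
<window-copy name="copy_sliding">
  <retention type="bytime_sliding">300</retention>
</window-copy>

<!-- jumping count-based retention: clear the window after every 1000 events -->
<window-copy name="copy_jumping">
  <retention type="bycount_jumping">1000</retention>
</window-copy>

Each copy window then feeds its own aggregate window, so identical aggregations over differently retained streams can be compared side by side in Streamviewer.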

Tags

Category: Data Transformation
Sub-Category: Aggregate
Sub-Category: Compute
Sub-Category: Copy

SAS Analytics for IoT Enhanced Data Load Macro
The SAS Analytics for IoT Enhanced Data Load Macro is an alternate tool for use when loading the data mart for a SAS Analytics for IoT application.

Key takeaways from the example: 
 • used to load the data mart when the traditional data load macro does not meet your needs
 • executed entirely in CAS
 • divided into individual macro steps each acting independently

>> Resources on GitHub <<

Overview

The enhanced data load macro process is very similar to the production data load process. A parameter file continues to be used to control the process and the SAS Analytics for IoT data mart is loaded upon successful completion of the process.

There are several steps you need to perform to execute the enhanced data load macro:
    •  Load the macro source code
    •  Create a program file (.sas) that includes the following:
         •  Statement to create a CAS session
         •  Statement to assign a caslib for the input data source
         •  Statement to assign a caslib for the output data, if used
         •  Statement to call the macro with appropriate parameters
    •  Execute the program file you created

Tags

Category: Data Model
Sub-Category: Enhanced Data Load Macro

Create a SAS Viya Image Denoising Model using SAS DLPy
This example notebook uses SAS DLPy to train and validate a SAS Viya deep learning image denoising model.

Key takeaways from the example: 
 • uses the concept of pixel-wise CNN segmentation regression to remove noise from images
 • the CNN architecture consists of an encoder-decoder framework along with a pixel-wise regression layer
 • the trained model cleanses and restores noisy images to prepare them for further analytic consumption

>> Resources on GitHub <<

Overview

This example notebook uses SAS DLPy to create, train, and deploy a SAS Viya deep learning image denoising model. The image denoising model uses a pixel-wise CNN segmentation regression architecture to replace pixels in noisy images with predicted pixel values. The trained image denoise model can be used to cleanse and restore noisy images to prepare them for further analytic consumption.

CNN segmentation models are used for tasks such as object detection, image classification, and extracting information from low-level images. The model in this example structurally resembles a pixel-wise classification model, but the output task is different. This model uses regression to predict values for image pixels, rather than using classification to predict the category for image pixels.

The example data uses a set of images with generated salt-and-pepper noise. Salt-and-pepper noise is commonly seen when an image signal is disturbed, creating black-and-white pixel corruption in the output image. Given an input data set of noisy images, the trained model up-samples input image pixels with predicted values, and outputs cleaned, denoised images.

The pixel-wise segmentation regression model in this SAS DLPy notebook consists of an encoder network, a corresponding decoder network, and the final pixel-wise regression layer. The encoder network consists of convolutional and pooling layers. The decoder network consists of convolution and transpose convolution layers for up-sampling. The appropriate decoders use transposed convolution to perform a non-linear pixel-level up-sampling of the input feature maps.


Zambretti Algorithm for Weather Forecasting
In this tutorial, learn about various types of SAS Event Stream Processing (ESP) windows and their functions using the Zambretti algorithm for weather forecasting.

Key takeaways from the example: 
 • use the ESP Expression Language to perform calculations
 • filter outliers using Filter windows
 • detect trends using Pattern windows
 • rejoin a split stream using a Union window
 • use a Remove State window to convert Upsert events into Insert events

>> Resources on GitHub <<

Overview

This is a fun one! The Zambretti algorithm is based on an instrument developed by Negretti and Zambra in the mid-1800s to forecast local weather. The output of the instrument or algorithm is one of 26 forecast statements, such as "becoming unsettled." The forecast can be as much as 94% accurate in the Northern Hemisphere.

The forecast is determined based on four parameters:
    •  Value of the pressure at sea level
    •  Whether the pressure is falling, rising, or steady
    •  Whether the pressure meets range requirements
    •  Wind direction

Follow the steps in this tutorial to set up and run the project based on input weather data.
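For reference, a widely circulated numeric approximation of the original instrument (used in many hobbyist ports; the ESP project's expressions may differ in detail) computes a forecast number $Z$ from the adjusted sea-level pressure $P$ in hPa, with one formula per pressure trend:

\[
Z_{\text{falling}} = 130 - \frac{P}{81}, \qquad
Z_{\text{steady}} = 147 - \frac{5P}{376}, \qquad
Z_{\text{rising}} = 179 - \frac{2P}{129}
\]

$Z$ is rounded and mapped to one of the 26 lettered forecast statements, after $P$ has been adjusted for wind direction and, in some versions, season.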

Tags

Category: Data Transformation
Sub-Category: Pattern
Sub-Category: Union
Sub-Category: Remove State

Tracking the International Space Station
In this tutorial, learn about tracking the International Space Station (ISS) using a Geofences ESP example.

Key takeaways from the example: 
 • learn the basics of a geofence
 • create the required windows: source, geoCircle, filter, and pattern
 • create and use GEO maps using ESP Streamviewer

>> Resources on GitHub <<

Overview

Beam me up! Welcome to Tracking the International Space Station, an introductory Geofence ESP example. In this example, you will get an introduction to the Geofence window, including:
    •  How to define areas of interest for a geofence
    •  How to use the Geofence window to detect a position within a circle
    •  How to use position proximity analysis to detect a position's proximity to a polygon
    •  How to create a GEO Map in SAS ESP Streamviewer
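As a taste of the configuration, here is a minimal sketch of a Geofence window checking the ISS position against a circular area of interest. The element and attribute names are reconstructed from memory of the ESP window-geofence schema and should be treated as illustrative, not authoritative:

<window-geofence name="iss_fence">
  <geofence coordinate-type="geographic"/>
  <!-- position fields come from the ISS feed (longitude/latitude every 10 seconds) -->
  <position x-field="longitude" y-field="latitude"/>
  <!-- geometry events define the circle(s) to test against -->
  <geometry desc-fieldname="areaDesc" x-field="centerLon" y-field="centerLat" radius-field="radius"/>
  <output geoid-fieldname="geoID" geodesc-fieldname="geoDesc"/>
</window-geofence>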

ISS background

The International Space Station (ISS) is a modular space station in low Earth orbit. It maintains an altitude of 251 miles and completes an orbit of the Earth in about 93 minutes. The ISS makes 15.54 orbits per day and is travelling at 7.66 km/s (27,600 km/h; 17,100 mph). The API used by this project publishes the latitude and longitude of the ISS every 10 seconds.


Connect multiple streaming models using SAS ESP Routers
In this tutorial, learn about how Event Stream Processing (ESP) Routers work, along with examples easily extended to more complex use cases.

Key takeaways from the example: 
 • learn the basics of ESP routers
 • use ESP routers to divide streaming applications into modules
 • learn how to scale applications by adding or removing modules, while limiting interruptions

>> Resources on GitHub <<

Overview

When a streaming application grows beyond a single ESP project, it often makes sense to divide it into modules: separate projects that each handle part of the logic, possibly running on different ESP servers. An ESP router moves events between running projects, so modules can be added or removed while limiting interruptions to the rest of the application.

We will go over the basics of how ESP routers work, with examples that are easily extended to more complex use cases.
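Routers are defined in their own XML, separate from any project. Below is a condensed sketch that routes events from a window in one engine to a source window in another; the element names are based on the documented esp-router schema but are reconstructed from memory, so verify them against the SAS documentation:

<esp-routers>
  <esp-router name="router1">
    <esp-engines>
      <esp-engine name="engine1" host="esp-host-1" port="31415"/>
      <esp-engine name="engine2" host="esp-host-2" port="31415"/>
    </esp-engines>
    <esp-destinations>
      <!-- where routed events get published -->
      <publish-destination name="to_project2" opcode="insert">
        <publish-target>
          <engine-func>engine2</engine-func>
          <project-func>project2</project-func>
          <contquery-func>cq1</contquery-func>
          <window-func>input_win</window-func>
        </publish-target>
      </publish-destination>
    </esp-destinations>
    <esp-routes>
      <!-- which windows to subscribe to, and where their events go -->
      <esp-route name="route1" to="to_project2">
        <engine-expr>engine1</engine-expr>
        <window-expr>output_win</window-expr>
      </esp-route>
    </esp-routes>
  </esp-router>
</esp-routers>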


Setting up a Kubernetes monitoring stack for SAS ESP
In this tutorial, learn how to set up the Prometheus monitoring tool for Kubernetes and use the integration with Grafana to query, visualize, alert on, and explore metrics.

Key takeaways from the example: 
 • Kubernetes overview
 • what to monitor; how to monitor
 • SAS Event Stream Processing and Prometheus

>> Resources on GitHub <<

Overview

Prometheus is quickly becoming the de-facto monitoring tool for Kubernetes, in part because of the simplicity of its installation and configuration processes, but also because of the integration with Grafana, an open source visualization and analytics software that allows you to query, visualize, alert on, and explore metrics no matter where they are stored.

The documents found in this project provide a high-level introduction to Prometheus and Grafana, as well as a description of the steps required to set up a monitoring stack for a Kubernetes cluster that is running the SAS ESP operator. The goal is to make the whole deployment easy to understand and perform, while ultimately providing the ability to monitor resources and minimize performance degradation.


Splitting Data Streams in SAS ESP
In this tutorial, learn how to incorporate splitters into your SAS Event Stream Processing (ESP) projects.

Key takeaways from the example: 
 • learn how to split ESP streams
 • split to add logic to individual streams
 • split to increase performance

>> Resources on GitHub <<

Overview

When building an ESP project to consume streams of data, we frequently encounter situations in which it makes sense to have certain events take one path and for other events to take other paths. Sometimes, we want to have different branches in an ESP project that will have their own logic, and we need to route events based on that. Another case would be when we are not achieving the performance we need, so we will want to parallelize the processing in order to increase throughput.

In all these cases, we need to create a splitter to help us route the events. Let’s see how to split a stream of data in ESP. We’ll first go over a few simple cases, and then something a bit more intricate.
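The simplest splitter is an expression attached to a window; downstream edges then select events by slot number. A minimal sketch, in which the boolean expression evaluates to 1 or 0 and that value serves as the slot (field names are illustrative):

<window-source name="trades">
  <schema-string>id*:int64,symbol:string,quantity:int32</schema-string>
  <splitter-expr>
    <expression>quantity >= 100</expression>
  </splitter-expr>
</window-source>
<!-- downstream windows omitted -->
<edges>
  <!-- slot 1 = expression true (large trades), slot 0 = false -->
  <edge source="trades" target="large_trades" slot="1"/>
  <edge source="trades" target="small_trades" slot="0"/>
</edges>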


Integrate SAS ESP with Relational Databases using DB Connector
In this tutorial, learn how to utilize the Database Connector with DataDirect Drivers in SAS Event Stream Processing (ESP) to read and write data to relational databases like Microsoft SQL Server and PostgreSQL on Azure.

Key takeaways from the example: 
 • use the database connector with DataDirect drivers
 • configure the connector using a configuration file, e.g., odbc.ini
 • understand the configuration file "odbc.ini" and parameter considerations

>> Resources on GitHub <<

Overview

The database connector in SAS Event Stream Processing (ESP) can be utilized in two ways: with DataDirect Drivers or with SAS Threaded Kernel Drivers. This repository will explain how to utilize the Database Connector with DataDirect Drivers in ESP to read and write data to relational databases like Microsoft SQL Server and PostgreSQL on Azure.

Currently, the DataDirect ODBC drivers are only certified for the following databases: Oracle, MySQL, IBM DB2, Greenplum, PostgreSQL, SAP Sybase ASE, Teradata, Microsoft SQL Server, IBM Informix, and Sybase IQ. In this tutorial we connect to databases on Azure, but these steps can also be used to connect to an on-premises database.
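As an illustration, a subscriber connector that writes a window's output to a SQL Server table might look like the following. The DSN name must match an entry in the odbc.ini file that the DataDirect driver manager reads; the property set shown here is a sketch rather than a complete configuration:

<connector class="db" name="sql_out">
  <properties>
    <!-- type=sub means the connector subscribes to the window's output -->
    <property name="type">sub</property>
    <property name="connectstring">DSN=sqlserverazure;UID=myuser;PWD=mypassword</property>
    <property name="desttablename">dbo.sensor_events</property>
    <property name="snapshot">true</property>
  </properties>
</connector>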

Tags

Category: Database Connector
Sub-Category: Integrate ESP with Relational Databases

Ingest data into CAS from SAS ESP
In this tutorial, learn how to use the SAS CAS loadStream action set to ingest data from a running SAS Event Stream Processing (ESP) project into CAS.

Key takeaways from the example: 
 • data ingestion through the SAS ESP CAS adapter
 • data ingestion through the loadStream CAS action set
 • works for both cloud-based and non-cloud-based environments

>> Resources on GitHub <<

Overview

Follow this how-to document and learn how to programmatically ingest data from SAS ESP into CAS using the CAS loadStream action set. The scenario described in the document applies to both cloud-based (Viya 4 and later) and non-cloud-based (Viya 3.5 and earlier) environments, and assumes the data is stored in a CSV file.

However, any other storage means can be used as long as ESP has access to it through one of its adapters, because the overall process is independent of the location of the input data. Please check the SAS documentation for a comprehensive list of available adapters based on the version of ESP used.


Dates and Times in SAS ESP
In this tutorial, learn various ways of working with dates and times in SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • ways to publish data containing dates and times to ESP
 • how ESP handles dates and times internally
 • best practices and considerations when working with dates and times

>> Resources on GitHub <<

Overview

SAS Event Stream Processing provides multiple ways to work with dates and times. Based on how the overall data stream is structured, ESP uses different internal languages and engines to handle the dates and times accordingly.

This document describes various ways of processing dates and times in ESP, with example models. It also provides explanations to help you navigate some of the common scenarios you may face when working with such diverse data.
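For context, ESP schemas distinguish two temporal field types: date (second granularity) and stamp (microsecond granularity). A sketch of a source window that parses text timestamps from a CSV feed follows; the dateformat connector property uses the usual strptime-style pattern, but verify the property name and pattern syntax against the documentation for your ESP version:

<window-source name="readings" insert-only="true">
  <schema-string>id*:int64,taken_at:stamp,recorded_on:date,value:double</schema-string>
  <connectors>
    <connector class="fs" name="input">
      <properties>
        <property name="type">pub</property>
        <property name="fstype">csv</property>
        <property name="fsname">readings.csv</property>
        <!-- how incoming text timestamps are parsed into date/stamp fields -->
        <property name="dateformat">%Y-%m-%d %H:%M:%S</property>
      </properties>
    </connector>
  </connectors>
</window-source>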

Tags

Category: Streaming Data
Sub-Category: Dates and Times

SAS ESP Azure Event Hub Integration Using Kafka
In this tutorial, learn how to stream data from an Azure event hub into a SAS Event Stream Processing (ESP) model.

Key takeaways from the example: 
 • use the ESP Kafka connector to connect to an event hub
 • use a Functional window to transpose a JSON message into ESP events
 • write data back to the event hub
 • drive messages into an Azure Event Hub from a logic application

>> Resources on GitHub <<

Overview

In this project, we will configure an Azure Logic application to query the Azure Maps Weather service once an hour. The logic app will then format the data returned from the weather API into the required JSON string and send it to an Azure Event Hub. Once in the Event Hub, the data is available to any application running a Kafka listener. We will create an ESP project that retrieves these events from the Event Hub using an ESP Kafka connector. Once in ESP, a Functional window parses the incoming JSON into an ESP schema.
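A sketch of the subscribing side in ESP: Azure Event Hubs exposes a Kafka-compatible endpoint on port 9093, so the standard ESP Kafka connector can read from it. The kafkaglobalconfig property name, used here to pass SASL/TLS settings through to librdkafka, is an assumption to verify against the connector documentation:

<window-source name="raw_json" insert-only="true">
  <!-- one string field receives the whole JSON message for the Functional window to parse -->
  <schema-string>id*:int64,message:string</schema-string>
  <connectors>
    <connector class="kafka" name="eventhub_in">
      <properties>
        <property name="type">pub</property>
        <property name="kafkahostport">mynamespace.servicebus.windows.net:9093</property>
        <property name="kafkatopic">weatherhub</property>
        <property name="kafkatype">opaquestring</property>
        <property name="kafkapartition">0</property>
        <!-- assumed property: librdkafka settings for Event Hubs SASL authentication -->
        <property name="kafkaglobalconfig">security.protocol=SASL_SSL,sasl.mechanism=PLAIN,...</property>
      </properties>
    </connector>
  </connectors>
</window-source>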


How to use the SAS ESP Functional Window for JSON Processing
In this tutorial, learn how to use the Functional window in SAS Event Stream Processing (ESP) to parse and create event loops over JSON objects.

Key takeaways from the example: 
 • learn Functional window basics
 • parse and process JSON objects
 • use the "Function Context" and "Event Generation" options

>> Resources on GitHub <<

Overview

The Functional window in ESP is very useful but not widely understood. There are too many things that can be done with the Functional window to cover in one repository, so we will focus on parsing simple JSON objects and processing more complex JSON objects.
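As a flavor of the syntax, a Functional window computes each output field with a function. The sketch below assumes a json() extraction function operating on a string field from the upstream window; the exact function names and path syntax should be checked against the Functional window documentation:

<window-functional name="parse_weather">
  <schema-string>id*:int64,city:string,temperature:double</schema-string>
  <function-context>
    <functions>
      <!-- $message references the incoming JSON string field -->
      <function name="city">json($message, 'location.name')</function>
      <function name="temperature">json($message, 'current.temperature')</function>
    </functions>
  </function-context>
</window-functional>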

Tags

Category: Data Transformation
Sub-Category: Functional Window

How to use the SAS ESP Transpose window
In this tutorial, learn how to utilize the SAS Event Stream Processing (ESP) Transpose window in ESP Studio.

Key takeaways from the example: 
 • use the Transpose window to transpose table data
 • transpose data from long to wide configuration
 • transpose data from wide to long configuration

>> Resources on GitHub <<

Overview

In this tutorial, learn how to use the Transpose window in SAS ESP Studio and follow a few examples to illustrate functionality. Explore the source window configuration and view examples of both long-to-wide and wide-to-long transpose configurations.
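For a sense of what the configuration involves, here is a long-to-wide sketch that pivots per-measure rows into one row per device. The element names are reconstructed from memory of the window-transpose schema and are illustrative only:

<!-- input events: device, measure ("temperature" or "pressure"), value -->
<window-transpose name="to_wide" mode="wide">
  <schema-string>device*:string,temperature:double,pressure:double</schema-string>
  <!-- field whose values become output columns -->
  <tag-name>measure</tag-name>
  <tags-included>temperature,pressure</tags-included>
  <group-by>device</group-by>
</window-transpose>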

Tags

Category: Data Transformation
Sub-Category: Transpose Window

Code Snippets


SAS ESP Transformations Windows Code Snippets
View code snippets for the Transformation windows of SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • includes examples for the Aggregate, Compute, Copy, Filter, and other transformation windows
 • a directory for each example contains everything you need for execution
 • a document for each code snippet provides instructions unique to the example

>> Resources on GitHub <<


SAS ESP Utilities Windows Code Snippets
View code snippets for the Utility windows of SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • includes examples for Pattern and Geofence windows
 • a directory for each example contains everything you need for execution
 • a document for each code snippet provides instructions unique to the example

>> Resources on GitHub <<


SAS ESP Analytics Windows Code Snippets
View code snippets for the Analytics windows of SAS Event Stream Processing (ESP).

Key takeaways from the example: 
 • includes examples for Calculate, Train, and Score windows
 • a directory for each example contains everything you need for execution
 • a document for each code snippet provides instructions unique to the example

>> Resources on GitHub <<
