Amazon Kinesis makes it easy to collect, process, and analyze real-time, streaming data so you can get timely insights and react quickly to new information. In this scenario, the browser will send draw points to a Kinesis stream. Normally, we would create a Kinesis consumer client and read the events from the stream directly and process them, and we will do this in our next scenario. But for this scenario, let’s assume we want the messages to be delivered to an SQS queue like we have done in our previous scenarios.
Since there is no in-built way to route messages that have been published to a Kinesis stream into an SQS queue, we need an intermediary to do that routing for us. In this scenario, we will implement the intermediary as an AWS Lambda function that acts as a bridge between the Kinesis stream and the SQS queue we want to target.
The architecture for this scenario looks like this: Similar to a previous scenario, messages are rendered by a single subscriber to an Amazon SQS queue. But in this case, the messages are published to a Kinesis stream, and then transferred to the SQS queue by the Lambda function.
To start this demo, click the drop-down list on the website and choose AWS Kinesis publisher to SQS subscriber. When the page initialises, draw a shape on the Kinesis Publisher canvas. Notice that the Messages Sent counter increments, but no shape is rendered on the SQS Subscriber canvas. This is because the messages are being published into the Kinesis stream, but no Kinesis consumer is processing them yet:
In the AWS Lambda console, open the function named Kinesissam-LambdaFunction, which belongs to the KinesisToSQS project for Lab 5 Kinesis to SQS. Its initial handler implementation simply logs each record it receives:
public class LambdaFunctionHandler implements RequestHandler&lt;KinesisEvent, Integer&gt; {

    @Override
    public Integer handleRequest(KinesisEvent event, Context context) {
        context.getLogger().log("Input: " + event);

        for (KinesisEventRecord record : event.getRecords()) {
            String payload = new String(record.getKinesis().getData().array());
            context.getLogger().log("Payload: " + payload);
        }
        return event.getRecords().size();
    }
}
In the Lambda console search bar, type Kinesissam-LambdaFunction and press Enter to open the function's monitoring view. Depending on how many draw points you emitted into the Kinesis stream, you will see a different number for Invocation count. When we enabled the trigger, we set the Batch size to 100 (which was the default), so the Lambda function will be invoked with up to 100 records at a time. If you sent 225 messages into the stream, they will be consumed by 3 invocations of the Lambda function - Math.ceil(225/100) == 3.
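To make the batching arithmetic concrete, here is a small sketch (the helper name is illustrative, not part of the lab code) that computes the number of Lambda invocations for a given record count and batch size, using integer arithmetic equivalent to Math.ceil:

```java
public class InvocationCountDemo {

    // Number of Lambda invocations needed to consume `records` events
    // when the trigger delivers at most `batchSize` records per invocation.
    // Integer-arithmetic equivalent of Math.ceil((double) records / batchSize).
    static int invocations(int records, int batchSize) {
        return (records + batchSize - 1) / batchSize;
    }

    public static void main(String[] args) {
        System.out.println(invocations(225, 100)); // 3 - two full batches plus one partial
        System.out.println(invocations(100, 100)); // 1 - exactly one full batch
    }
}
```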
When writing your Lambda function code in Java, you can emit log strings into CloudWatch logs by simply calling the Logger’s .log() method:
context.getLogger().log("Log message to emit: " + someMessage);
Getting back to our scenario, we want to read the messages out of the Kinesis stream, and publish them into an SQS queue, where our browser implementation can read them and draw them on the subscriber’s canvas.
package com.amazonaws.lambda.demo;

import java.awt.geom.AffineTransform;
import java.awt.geom.Point2D;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Iterator;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent;
import com.amazonaws.services.lambda.runtime.events.KinesisEvent.KinesisEventRecord;
import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import com.amazonaws.services.sqs.model.SendMessageBatchRequest;
import com.amazonaws.services.sqs.model.SendMessageBatchRequestEntry;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LambdaFunctionHandler implements RequestHandler&lt;KinesisEvent, Integer&gt; {

    // SQS allows at most 10 entries in a single SendMessageBatch request
    final int MAX_SQS_BATCH_SIZE = 10;

    private AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

    @Override
    public Integer handleRequest(KinesisEvent event, Context context) {
        Iterator&lt;KinesisEventRecord&gt; iterator = event.getRecords().iterator();
        AffineTransform transformer = new AffineTransform(1, 0, 0, -1, 0, 400);
        ObjectMapper mapper = new ObjectMapper();
        Point2D dstPoint = new Point2D.Double();
        boolean sendMessagesToSQS = true;

        try
        {
            //
            // Read the SQS Queue Target from the Lambda Environment Variables
            //
            String sqsUrl = System.getenv("TargetSQSUrl");
            if ( sqsUrl == null || sqsUrl.isEmpty() )
            {
                context.getLogger().log("WARNING:: Environment Variable [TargetSQSUrl] is not set. No messages will be sent via SQS");
                sendMessagesToSQS = false;
            }

            while ( iterator.hasNext() )
            {
                int messageCounter = 0;

                // Prepare a batch request to write all the messages we have received in this invocation
                Collection&lt;SendMessageBatchRequestEntry&gt; entries = new ArrayList&lt;SendMessageBatchRequestEntry&gt;();

                while ( iterator.hasNext() && messageCounter++ < MAX_SQS_BATCH_SIZE )
                {
                    String payload = new String(iterator.next().getKinesis().getData().array());
                    context.getLogger().log("Payload: " + payload);

                    DrawPoint transformedPoint = new DrawPoint();
                    try {
                        transformedPoint = mapper.readValue(payload, DrawPoint.class);

                        // Transform the point
                        Point2D srcPoint = new Point2D.Double(transformedPoint.getX(), transformedPoint.getY());
                        transformer.transform(srcPoint, dstPoint);

                        // Update the payload
                        transformedPoint.setX((int) dstPoint.getX());
                        transformedPoint.setY((int) dstPoint.getY());

                        // Add this payload into our batch
                        SendMessageBatchRequestEntry entry = new SendMessageBatchRequestEntry(
                            "msg_" + messageCounter,
                            mapper.writeValueAsString(transformedPoint)
                        );
                        entries.add(entry);
                    }
                    catch (Exception e)
                    {
                        context.getLogger().log("Unable to deserialise " + payload + " as a DrawPoint! " + e.getMessage());
                    }
                }

                if ( sendMessagesToSQS && entries.size() > 0 )
                {
                    // We have reached the end of the records or we have reached the maximum
                    // batch size allowed for SQS, so we need to send our entries
                    context.getLogger().log("Sending batch of " + entries.size() + " events to SQS...");
                    SendMessageBatchRequest batch = new SendMessageBatchRequest()
                        .withQueueUrl(sqsUrl);
                    batch.setEntries(entries);

                    // Perform the message sending
                    sqs.sendMessageBatch(batch);
                }
            }
        }
        catch (Exception e)
        {
            context.getLogger().log("EXCEPTION:: Aborting Lambda processing: " + e);
        }
        return event.getRecords().size();
    }

    // Inner class must be marked as static in order for the JSON mapper to deserialise it
    private static class DrawPoint {
        private int x;
        private int y;
        private long timestamp;
        private boolean clear;

        public int getX() {
            return x;
        }

        public void setX(int x) {
            this.x = x;
        }

        public int getY() {
            return y;
        }

        public void setY(int y) {
            this.y = y;
        }

        public long getTimestamp() {
            return timestamp;
        }

        public void setTimestamp(long timestamp) {
            this.timestamp = timestamp;
        }

        public boolean isClear() {
            return clear;
        }

        public void setClear(boolean clear) {
            this.clear = clear;
        }
    }
}
Run mvn package to build the updated Lambda function.
Once the updated Kinesissam-LambdaFunction code is deployed, return to the website and draw another shape on the Kinesis Publisher canvas. The drawing points should now be received via the SQS queue, and rendered on the canvas. Notice that the shape you draw is rendered inverted in the Y-dimension. This is because the Lambda function implementation executes a transform on the drawing points, just to demonstrate how we can manipulate the data in the Kinesis stream before passing it to the SQS queue.
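The Y-inversion comes from the AffineTransform constructed in the handler with the matrix (1, 0, 0, -1, 0, 400): x is left unchanged, while y is mapped to 400 - y, flipping each point vertically within the 400-pixel-high canvas. A standalone sketch (the class and helper names are illustrative) shows the effect on a single point:

```java
import java.awt.geom.AffineTransform;
import java.awt.geom.Point2D;

public class TransformDemo {

    // Applies the same matrix the handler uses: x' = x, y' = 400 - y
    static Point2D flip(double x, double y) {
        AffineTransform transformer = new AffineTransform(1, 0, 0, -1, 0, 400);
        Point2D dst = new Point2D.Double();
        transformer.transform(new Point2D.Double(x, y), dst);
        return dst;
    }

    public static void main(String[] args) {
        Point2D p = flip(10, 30);
        // A point near the top of the canvas ends up near the bottom
        System.out.println(p.getX() + "," + p.getY()); // 10.0,370.0
    }
}
```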
Take a few moments to review the handler implementation code to ensure you understand how it works.