
How to Monitor Super SIM Connection Events using AWS ElasticSearch and Kibana


Twilio Event Streams is a new, unified mechanism to help you track your application's interactions with Twilio products. Event Streams spans Twilio's product line to provide a common event logging system that supports product-specific event types in a consistent way. You use a single API to subscribe to the events that matter to you.

Super SIM provides a set of event types, called Connection Events, which are focused on devices' attempts to attach to cellular networks and, once they are connected, the data sessions they establish to send and receive information.

This tutorial will show you how to integrate Super SIM Connection Events into a typical business process: routing the data to AWS ElasticSearch so it can feed your Kibana dashboards. To do so, you'll set up Event Streams to send Super SIM Connection Events to an AWS Kinesis Data Stream and into ElasticSearch via AWS Kinesis Firehose.

Info: If you'd prefer to start with a more basic guide to using Super SIM Connection Events, or your use case doesn't require AWS, we have another tutorial that focuses on streaming Super SIM Connection Events to a webhook in your cloud.

With the AWS resources in place to receive events and feed them to Kibana, you'll explore the event data, set up visualizations to help you spot underlying trends, and configure a dashboard so you're ready to monitor event data regularly.

This tutorial assumes you're working on a Linux box or a Mac, but it should also work under Windows if you're running Windows Subsystem for Linux. Whatever platform you're using, we'll take it as a given that you're familiar with the command line.

You don't need experience working with AWS accounts and processes unless you plan to change or upgrade the resources configured during the tutorial. All of the AWS setup required by the tutorial is driven by a script, so you can focus on the flow rather than keying in code or navigating the AWS console. The script is fully commented so you can see exactly what will be installed.

Info: If you've already set up Event Streams for another Twilio product, you may prefer to jump straight to the documentation that describes the event types unique to Super SIM. No problem.

Let's get started.


1. Review the requirements


To work through this tutorial, you'll need your Twilio Account SID and Auth Token to log in with the Twilio CLI tool. If you don't have these credentials handy, you can get them from the Console.

You'll also need an AWS account set up with an AWS user that has resource creation permissions. You'll make use of these in Steps 2 and 3.

Warning: The tutorial creates and uses AWS resources, so please be aware that this will come at a cost to your AWS account holder. We've kept the resources used as few and as limited as possible. For more details, check out AWS' pricing page.

You may also find that certain configurations are not available in your preferred AWS region, so you may need to modify the config you use accordingly. Make sure you thoroughly review the supersim_events.tf script in Step 3, which has all the details of the config you'll use and will be where you'll make any changes you need.


2. Set up your machine


You may have already completed some of these tasks for other tutorials or as part of your own workflow, so skip any steps you've already done. For each step, we've linked to the relevant setup guide; jump straight there if you need further assistance, then hop back here when you're done. A quick check you can run once everything is installed follows the list.

  1. Install and configure the Twilio CLI using this guide.
  2. Install and configure the AWS CLI. Amazon has full guidance for you.
  3. Install the JSON processor JQ from its website. JQ is used to process data in a script you'll run shortly to validate the Event Streams Sink you set up.
  4. Install the Terraform CLI. Terraform lets you build out infrastructure using code. Because this tutorial involves the creation of many AWS resources of various types, you'll use Terraform and a ready-to-go setup script to automate the process. Terraform's developer, HashiCorp, has a guide to get you started with the installation process — come back here as soon as you've installed the CLI tool; there's no need to go further.
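
To confirm the tools are in place before moving on, you can run a quick check like the following (a minimal sketch: the version numbers you see will differ, and aws sts get-caller-identity simply confirms your AWS credentials are configured):

# Check each CLI tool is installed and on your PATH
twilio --version
aws --version
jq --version
terraform -version

# Confirm the AWS CLI can reach your account with the credentials you configured
aws sts get-caller-identity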

3. Build out your AWS resources


In this step you'll put in place all of the AWS resources you need and connect them together so that they're ready to receive your Super SIM Connection Events and make them available to Kibana. You will create:

  • A Kinesis Data Stream as an entry point for Super SIM Connection Events.
  • An ElasticSearch domain to receive, store, and work with the events, which will be presented by Kibana.
  • A Kinesis Firehose to bridge the Data Stream and ElasticSearch.
  • An S3 Bucket as a store for records Firehose could not pass to ElasticSearch.
  • Assorted Roles and Policies to authorize Twilio to write to your stream, and for the AWS components to interact with each other.

The entire setup process is automated using Terraform's infrastructure-as-code approach and an easy-to-understand script you'll download shortly. Feel free to review the script before you run it.

  1. Open a terminal and run the following commands to prepare your working directory:


    mkdir supersim_events
    cd supersim_events

  2. Copy the first of the three scripts below and save it to your supersim_events directory as setup_aws.sh, then run chmod +x setup_aws.sh to make the script executable. You'll update the script with your preferred AWS region in step 5.
  3. Get the second script, save it as validate_sink.sh, and run chmod +x validate_sink.sh.
  4. Copy the third script and save it to your supersim_events directory as supersim_events.tf. This does not need to be made executable: it is read by the Terraform CLI, which uses it to plan and build your AWS setup. You should review this script to make sure you're happy with its role and policy settings.
  5. Edit setup_aws.sh and set your AWS region where marked, e.g. us-east-2.
  6. You're now ready to put your AWS infrastructure into place. Remember, there is a cost implication for your account holder, which we've attempted to minimize by instantiating as few and as limited a set of AWS resources as possible. For more details, check out AWS' pricing page.
  7. Run:


    ./setup_aws.sh

    This script sets the variables Terraform will use, then calls the Terraform CLI: first to initialize the working directory, then to validate the supersim_events.tf script, and finally to build the infrastructure it describes using your AWS credentials. Terraform will output what it is going to create and ask you how you'd like to proceed. Review its plan, then type yes and hit Enter.

  8. As the AWS resources are created, you'll see progress reports in the terminal, and at the end Terraform will output some key data: the External ID value, the stream ARN, and the role ARN that you'll use to set up your Twilio Event Streams Sink, plus the URL you'll use to access Kibana. Keep all of these values handy for the steps that follow.

Info: If you subsequently make changes to your Terraform script, just run terraform apply to see the changes Terraform will make and, if you agree, apply them. You don't need to run setup_aws.sh again, but if you do, make sure you replace the $(...) section on line 3 with the external ID output on the first run. If you change the external ID, either via the script or manually, you will need to recreate your Sink.
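
If you need any of these values again later, Terraform can replay them from its state. Here's a minimal sketch, run from your supersim_events working directory; the output names match those defined at the end of supersim_events.tf, and terraform destroy is how you'd eventually tear everything down to stop incurring AWS charges:

# Print all of the outputs at once
terraform output

# Or grab individual values, ready to paste into Twilio CLI commands
terraform output -raw EXTERNAL_ID
terraform output -raw YOUR_KINESIS_STREAM_ARN
terraform output -raw YOUR_KINESIS_ROLE_ARN
terraform output -raw KIBANA_WEB_URL

# When you're completely done with the tutorial, remove all of the AWS resources
# terraform destroy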

Here are the scripts, or just jump to the next step.

Info: You can also find all of these files in our public GitHub repo.

Script #1 — setup_aws.sh


#!/bin/bash
export TF_VAR_your_aws_region='<YOUR_AWS_REGION>'
export TF_VAR_external_id=$(openssl rand -hex 40)
export TF_VAR_your_computer_external_ip=$(curl -s https://checkip.amazonaws.com/)
terraform init
terraform validate
terraform apply
Script #2 — validate_sink.sh


Download this script from the Event Streams documentation.
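
Purely as an illustration of what that script does (it polls the Kinesis stream with the AWS CLI and uses JQ to pick out and decode the base64-encoded records), here's a minimal sketch. Use the official script for the tutorial, as it may differ from this illustration:

#!/bin/bash
# Illustrative sketch only -- use the official validate_sink.sh from the Event Streams docs.
# Usage: ./validate_sink.sh <stream-name>
STREAM_NAME="$1"

# Find the stream's first (and, in this tutorial, only) shard
SHARD_ID=$(aws kinesis list-shards --stream-name "$STREAM_NAME" | jq -r '.Shards[0].ShardId')

# Start reading from the tip of the stream
ITERATOR=$(aws kinesis get-shard-iterator --stream-name "$STREAM_NAME" \
  --shard-id "$SHARD_ID" --shard-iterator-type LATEST | jq -r '.ShardIterator')

# Poll for new records and print each one, decoded from base64
while true; do
  RESULT=$(aws kinesis get-records --shard-iterator "$ITERATOR")
  echo "$RESULT" | jq -r '.Records[].Data' | while read -r RECORD; do
    echo "$RECORD" | base64 --decode   # on older macOS, use 'base64 -D'
    echo
  done
  ITERATOR=$(echo "$RESULT" | jq -r '.NextShardIterator')
  sleep 2
done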

Script #3 — supersim_events.tf


/*
 * Define Terraform variables
 * These are set in the 'setup_aws.sh' script
 */
variable "your_aws_region" {
  type = string
}

// This is required to give your computer access to Kibana
variable "your_computer_external_ip" {
  type = string
}

// Randomly generated string to verify connections from Twilio
variable "external_id" {
  type = string
}

/*
 * Base Terraform setup
 */
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.42.0"
    }
  }

  required_version = ">= 0.15.0"
}

provider "aws" {
  profile = "default"
  region  = var.your_aws_region
}

/*
 * Set up policies
 */

// Create a Policy to permit Twilio to write records to our Kinesis Stream
resource "aws_iam_policy" "supersim_kinesis_stream_record_write_policy" {
  name = "supersim-kinesis-stream-record-write-policy"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Resource = "*"
        Action = [
          "kinesis:PutRecord",
          "kinesis:PutRecords"
        ]
      },
      {
        Effect   = "Allow"
        Resource = "*"
        Action = [
          "kinesis:ListShards",
          "kinesis:DescribeLimits"
        ]
      }
    ]
  })
}

// Create a policy to provide read access to ElasticSearch Kibana
// NOTE We limit access to your computer's external (eg. router) IP address,
//      which is required for web access to Kibana
resource "aws_elasticsearch_domain_policy" "supersim_elasticsearch_kibana_access_policy" {
  domain_name = aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.domain_name
  access_policies = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = [
          "es:ESHttp*",
          "es:DescribeElasticsearchDomain",
          "es:ListDomainNames",
          "es:ListTags"
        ]
        Effect   = "Allow"
        Resource = "${aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn}/*"
        Principal = {
          "AWS": "*"
        }
        Condition = {
          "IpAddress": {
            "aws:SourceIp": [
              var.your_computer_external_ip
            ]
          }
        }
      }
    ]
  })
}

// Set up a policy to manage Firehose's access to various resources:
// * To write records to ElasticSearch
// * To read from ElasticSearch (may not be necessary)
// * To write to S3 records it could not write to ElasticSearch
// * To read records from the Kinesis Data Stream
// * To access EC2 resources for data transfer (may not be necessary)
resource "aws_iam_policy" "supersim_firehose_rw_access_policy" {
  name = "supersim-firehose-rw-access-policy"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = [
          "es:DescribeElasticsearchDomain",
          "es:DescribeElasticsearchDomains",
          "es:DescribeElasticsearchDomainConfig",
          "es:ESHttpPost",
          "es:ESHttpPut"
        ]
        Resource = [
          "${aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn}",
          "${aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn}/*"
        ]
      },
      {
        Effect = "Allow"
        Action = [
          "es:ESHttpGet"
        ]
        Resource = [
          "${aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn}",
          "${aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn}/*"
        ]
      },
      {
        Effect = "Allow"
        Action = [
          "s3:AbortMultipartUpload",
          "s3:GetBucketLocation",
          "s3:GetObject",
          "s3:ListBucket",
          "s3:ListBucketMultipartUploads",
          "s3:PutObject"
        ]
        Resource = [
          "${aws_s3_bucket.supersim_failed_report_bucket.arn}",
          "${aws_s3_bucket.supersim_failed_report_bucket.arn}/*"
        ]
      },
      {
        Effect = "Allow"
        Action = [
          "kinesis:DescribeStream",
          "kinesis:GetShardIterator",
          "kinesis:GetRecords",
          "kinesis:ListShards"
        ]
        Resource = aws_kinesis_stream.supersim_connection_events_stream.arn
      },
      {
        Effect = "Allow"
        Action = [
          "ec2:DescribeVpcs",
          "ec2:DescribeVpcAttribute",
          "ec2:DescribeSubnets",
          "ec2:DescribeSecurityGroups",
          "ec2:DescribeNetworkInterfaces",
          "ec2:CreateNetworkInterface",
          "ec2:CreateNetworkInterfacePermission",
          "ec2:DeleteNetworkInterface"
        ]
        Resource = "*"
      }
    ]
  })
}

/*
 * Set up roles
 */

// Create a Role Twilio will assume to access the Stream
resource "aws_iam_role" "supersim_twilio_access_role" {
  name = "supersim-twilio-access-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          "AWS" = "arn:aws:iam::177261743968:root"
        }
        Condition = {
          StringEquals = {
            "sts:ExternalId" = var.external_id
          }
        }
      }
    ]
  })
}

// Create a Role Firehose will assume to access ElasticSearch
resource "aws_iam_role" "supersim_firehose_access_role" {
  name = "supersim-firehose-access-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Action = "sts:AssumeRole"
        Principal = {
          "Service" = "firehose.amazonaws.com"
        }
      }
    ]
  })
}

/*
 * Attach policies to roles
 */

// Attach the Stream write Policy to the Twilio access Role
resource "aws_iam_role_policy_attachment" "supersim_attach_write_policy_to_twilio_access_role" {
  role       = aws_iam_role.supersim_twilio_access_role.name
  policy_arn = aws_iam_policy.supersim_kinesis_stream_record_write_policy.arn
}

// Attach the resource read/write/access Policy to the Firehose access Role
resource "aws_iam_role_policy_attachment" "supersim_attach_rw_policy_to_firehose_access_role" {
  role       = aws_iam_role.supersim_firehose_access_role.name
  policy_arn = aws_iam_policy.supersim_firehose_rw_access_policy.arn
}

/*
 * Set up AWS resources
 */

// Set up a Kinesis Stream to receive streamed events
// NOTE One shard is sufficient for the tutorial and testing
resource "aws_kinesis_stream" "supersim_connection_events_stream" {
  name        = "supersim-connection-events-stream"
  shard_count = 1
}

// Create our ElasticSearch Domain
// This uses minimal server resources for the tutorial, but
// a real-world application would require greater resources
resource "aws_elasticsearch_domain" "supersim_elastic_search_kibana_domain" {
  domain_name           = "supersim-es-kibana-domain"
  elasticsearch_version = "7.10"

  cluster_config {
    instance_type  = "t2.small.elasticsearch"
    instance_count = 1
  }

  ebs_options {
    ebs_enabled = true
    volume_type = "standard"
    volume_size = 25
  }

  domain_endpoint_options {
    enforce_https       = true
    tls_security_policy = "Policy-Min-TLS-1-2-2019-07"
  }
}

// Create an S3 Bucket
// This is used by Firehose to dump records it could not pass
// to ElasticSearch. In a real-world app, you might also choose
// to store all received records
data "aws_canonical_user_id" "current_user" {}

resource "aws_s3_bucket" "supersim_failed_report_bucket" {
  bucket = "supersim-failed-report-bucket"
  grant {
    id          = data.aws_canonical_user_id.current_user.id
    type        = "CanonicalUser"
    permissions = ["FULL_CONTROL"]
  }
}

// Create a Kinesis Firehose to link the Kinesis Data Stream (input)
// to ElasticSearch (output)
resource "aws_kinesis_firehose_delivery_stream" "supersim_firehose_pipe" {
  name        = "supersim-firehose-pipe"
  destination = "elasticsearch"

  kinesis_source_configuration {
    kinesis_stream_arn = aws_kinesis_stream.supersim_connection_events_stream.arn
    role_arn           = aws_iam_role.supersim_firehose_access_role.arn
  }

  elasticsearch_configuration {
    domain_arn = aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.arn
    role_arn   = aws_iam_role.supersim_firehose_access_role.arn
    index_name = "super-sim"

    processing_configuration {
      enabled = "false"
    }
  }

  s3_configuration {
    role_arn        = aws_iam_role.supersim_firehose_access_role.arn
    bucket_arn      = aws_s3_bucket.supersim_failed_report_bucket.arn
    buffer_interval = 60
    buffer_size     = 1
  }
}

/*
 * Outputs -- useful values printed at the end
 */
output "EXTERNAL_ID" {
  value       = var.external_id
  description = "The External ID you will use to create your Twilio Event Streams Sink"
}

output "KIBANA_WEB_URL" {
  value       = aws_elasticsearch_domain.supersim_elastic_search_kibana_domain.kibana_endpoint
  description = "The URL you will use to access Kibana"
}

output "COMPUTER_IP_ADDRESS" {
  value = var.your_computer_external_ip
}

output "YOUR_KINESIS_STREAM_ARN" {
  value = aws_kinesis_stream.supersim_connection_events_stream.arn
}

output "YOUR_KINESIS_ROLE_ARN" {
  value = aws_iam_role.supersim_twilio_access_role.arn
}

4. Configure Twilio Event Streams 1: create a Sink


A Sink is an Event Streams destination. To set up a Sink, you create a Sink resource using the Event Streams API. Event Streams currently supports two Sink types: AWS Kinesis and webhooks. You'll use the former, the Kinesis Data Stream you created and set up in the previous step.

In your terminal, run the following command. The details you need to add to the command — <YOUR_KINESIS_STREAM_ARN>, <YOUR_KINESIS_ROLE_ARN>, and <EXTERNAL_ID> — were output by Terraform at the end of the previous step.


twilio api:events:v1:sinks:create --description SuperSimSink \
  --sink-configuration '{"arn":"<YOUR_KINESIS_STREAM_ARN>", "role_arn":"<YOUR_KINESIS_ROLE_ARN>", "external_id":"<EXTERNAL_ID>"}' \
  --sink-type kinesis

The command creates a Sink and outputs the new Sink's SID, which you'll need to paste into the commands in the next steps. If you get an error, check that you entered the command correctly.
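
If you didn't note the SID, or want to confirm the Sink was created, you can list your Sinks. This is a quick check, assuming the auto-generated Twilio CLI command for the Sinks resource; add the --properties flag if you want to choose which fields are shown:

twilio api:events:v1:sinks:list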


5. Configure Twilio Event Streams 2: validate the Sink


Unlike webhook Sinks, Kinesis Sinks must be validated before events can be delivered to them. To validate a Sink, you ask Twilio to send a test message to it, retrieve that message from the stream, and then confirm receipt back to Twilio. This proves the Sink is operational.

  1. In one terminal tab or window, run the validate_sink.sh script:


    ./validate_sink.sh supersim-connection-events-stream

    It will wait for records to enter the stream and display them when they arrive.

  2. Open a second terminal tab or window and run the following command to get Twilio to send a test message. Replace <SINK_SID> with the SID value you got in Step 4.


    twilio api:events:v1:sinks:test:create --sid <SINK_SID>

  3. Switch back to the first terminal and you'll shortly see a block of JSON. Look for the data key and note the value of its nested test_id key. You're done with the validation script, so hit Ctrl-C to quit it.
  4. Execute the following command, using your Sink SID and the test ID returned just now:


    twilio api:events:v1:sinks:validate:create --sid <SINK_SID> \
      --test-id <TEST_ID>

    This should display the result valid in the terminal — your Sink has been validated and is ready to receive Super SIM Connection Events.


6. Configure Twilio Event Streams 3: subscribe to events


Event Streams uses a publish-subscribe (aka 'pub-sub') model: you subscribe to the events that interest you, and Twilio will publish those events to your Sink.

There are six Super SIM Connection event types, each identified by a reverse domain format string:

Event Type           | ID String
Attachment Accepted  | com.twilio.iot.supersim.connection.attachment.accepted
Attachment Rejected  | com.twilio.iot.supersim.connection.attachment.rejected
Attachment Failed    | com.twilio.iot.supersim.connection.attachment.failed
Data Session Started | com.twilio.iot.supersim.connection.data-session.started
Data Session Updated | com.twilio.iot.supersim.connection.data-session.updated
Data Session Ended   | com.twilio.iot.supersim.connection.data-session.ended

The types are largely self-explanatory. The exception is Attachment Failed, which is a generic 'could not connect' error that you may encounter when your device tries to join a cellular network.

In a typical connection sequence, you would expect to receive one Attachment Accepted event, one Data Session Started event, and then multiple Data Session Updated events. When your device disconnects, you'll receive a Data Session Ended event.

Now let's set up some subscriptions.

To get events posted to your new Sink, you need to create a Subscription resource. This essentially tells Twilio what events you're interested in. Once again, copy and paste the following command, adding in your Sink SID from Step 4:


twilio api:events:v1:subscriptions:create \
  --description "Super SIM events subscription" \
  --sink-sid <SINK_SID> \
  --types '{"type":"com.twilio.iot.supersim.connection.attachment.accepted","schema_version":1}' \
  --types '{"type":"com.twilio.iot.supersim.connection.attachment.rejected","schema_version":1}' \
  --types '{"type":"com.twilio.iot.supersim.connection.attachment.failed","schema_version":1}' \
  --types '{"type":"com.twilio.iot.supersim.connection.data-session.started","schema_version":1}' \
  --types '{"type":"com.twilio.iot.supersim.connection.data-session.ended","schema_version":1}' \
  --types '{"type":"com.twilio.iot.supersim.connection.data-session.updated","schema_version":1}'

You specify your chosen event types as --types arguments: each one is a JSON structure that combines the type, as an event ID string, and the schema version you want to use.

The command will output the new Subscription resource's SID and confirm the subscription has been applied. The events you've subscribed to will now start flowing into ElasticSearch as your Super SIMs connect to cellular networks and bring up and tear down data sessions.
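
You'll need the Subscription SID again if you later want to remove the Subscription. If you misplace it, you can list your Subscriptions. This is another quick check, assuming the auto-generated Twilio CLI command for the Subscriptions resource:

twilio api:events:v1:subscriptions:list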

Warning: We've included the Data Session Updated event because it will allow you to monitor your Super SIMs' data usage in Kibana, but please be aware that this event is emitted every six minutes for every Super SIM in your account, so you can quickly use up your free event allowance. If this might be an issue for you, consider omitting the last line of the command.

To halt the flow of events, you can also remove the Subscription from your Sink. To do so, run the following command; you'll need the Subscription SID output when you set up the Subscription:


twilio api:events:v1:subscriptions:remove \
  --sid <SUBSCRIPTION_SID>


7. Monitor events in Kibana 1 — explore the data


The URL you use to access Kibana was output at the end of Step 3. If it's not still visible in your terminal, navigate to the working directory, run terraform show, and scan the results for the kibana_endpoint key — its value will give you the URL you need.

If you know your way around Kibana, you can jump out of the tutorial at this point if you wish. The remainder focuses on using Kibana to explore and chart the incoming Connection Events data. You should, however, check out the Super SIM Connection Events documentation, which describes the information contained within each event.
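
Before diving in, you can optionally check that events are reaching the super-sim index. Here's one way, a sketch that assumes you use your ElasticSearch domain's endpoint (the Kibana URL without its /_plugin/kibana/ path, also shown in the AWS console) and run it from the computer whose IP address you authorized in the Terraform script:

# Fetch one document from the index Firehose writes to
curl "https://<YOUR_ES_ENDPOINT>/super-sim/_search?size=1&pretty"

If events have arrived, the JSON response will show a non-zero hit count and a sample event document.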

  1. Staying with us? Good. Paste the Kibana URL into a browser and hit Enter.
  2. Click on the hamburger menu at the top left and select Stack Management under Management.
  3. Click on Create index pattern:

  4. Under Index pattern name, enter super-sim* and then click Next Step.
  5. Under Time field, select data.timestamp, and then click Create Index Pattern. What you've done is establish the data fields Kibana can use and the particular field that it uses to track the flow and order of incoming events. Let's see what we can do with these events.
  6. Click on the hamburger menu at the top left and select Discover under Kibana. This will show you the most recent records received. How recent is set by the date value at the top right of the screen; you can click this to change the time period in focus:

  7. You can click Refresh to update the list of events, but it's better to click the calendar icon next to the date range, add a period under Refresh every, and then hit Start. This will refresh the data automatically:

  8. At the left, under Available fields, is a selection of all the data types available in each Super SIM connection event. Scroll down a little and locate data.sim_iccid. Move your mouse over it to reveal a + symbol, which you should click.
  9. The events are now segmented by each of your Super SIMs' ICCIDs. Look back at the left-hand list of data fields and add data.network.friendly_name to the table (click on the + that appears alongside it). You can now see the networks your SIMs have connected through:

  10. Go to Available fields and add data.rat_type to the table. Now you can see what cellular technologies your devices used to connect:

  11. Pick a Super SIM and move your mouse to one of its rows. You'll see + and - icons appear at the right of the data.sim_iccid column. Click the + and Kibana will show only events experienced by that SIM:

  12. Filters like the one you just set up are listed above the chart; click on any filter's X symbol to clear it. You can do the same with column headings to remove them from the table. The << and >> symbols you'll see when you move your mouse over a column heading allow you to reorder the columns. If you like, click Save — it's in the menu at the top right — and save your search parameters for use again.

8. Monitor events in Kibana 2 — looking for patterns

  1. Click on the hamburger menu at the top left and select Visualize under Kibana.
  2. Click Create visualization .
  3. In the New Visualization dialog, locate Vertical Bar (it's toward the bottom of the list) and select it.
  4. Under New Line / Choose a source, click super-sim*.
  5. You're going to visualize how much data each of your Super SIMs has downloaded. This will be the Y axis value; you'll list the SIMs on the X axis. First, click on Y-axis Count under Metrics. Then, under Aggregation, click Count, scroll down through the list of options, and click Sum.
  6. Click on Select a Field, start typing down, and select data.data_download when it's suggested:

  7. Click Add under Buckets. Click on X-axis in the pop-up that appears, and then, under Aggregation, click on Terms.
  8. Click on the Field text field and start typing icc — Kibana will suggest data.sim_iccid.keyword, so select it:

  9. Click Update at the bottom right of the screen. You'll now see a chart showing how much data in bytes each Super SIM has downloaded:

  10. Depending on the number of Super SIMs you have, you might need to adjust the Size setting under Buckets > X-axis, and the date range you're using (this works just the way you saw in Step 7).
  11. Let's include upload data volumes too. Click Add under Metrics and select Y-axis. Under Aggregation, click Sum.
  12. Under Field, click Select a field, start typing upload and select data.data_upload when it's suggested:

  13. Click Update. You'll see something like this:

  14. Again, you might need to adjust the date range to see meaningful data.

9. Monitor events in Kibana 3 — build a dashboard


Setting up visualizations for occasional use is all very well, but what you really want to do is add them to a dashboard that you can check regularly. Let's do that now with the visualization you just made.

  1. Click on Save at the top right. Give the visualization the name Super SIM Data Usage and click the Save button.
  2. Click on the hamburger menu at the top left and select Dashboard under Kibana.
  3. Click Create new dashboard .
  4. Kibana invites you to create a new dashboard widget — or "object" — but let's use the visualization you just created. Click Add in the menu at the top right, and under Add panels select Super SIM Data Usage.
  5. Click the X in the top right of the panel to close it:

  6. Click Save in the top right menu to save the dashboard so it's accessible next time you visit.


Well done — you've come a long way. You've used Terraform to put in place your AWS infrastructure and then set up Twilio Event Streams to first create a connection to your infrastructure, called a Sink. You've subscribed to a series of Super SIM Connection Events which have then begun to flow from Twilio to your AWS setup as your IoT devices' Super SIMs connect to networks and transfer data.

You've used Kibana to explore the event data that has been received, from all your SIMs right down to a single SIM, and you've learned how to create graphs to help you spot trends in your devices' data usage. You've added that graph to a dashboard you can use regularly to monitor the behavior of the devices in your fleet.

We've only just scratched the surface of what Kibana can do, but having real, meaningful data to work with — all those Super SIM Connection Events — will make it much easier to explore the rest to see how it can help your business.

Try adding some more visualizations to your dashboard that will help you understand which radio technologies your devices are using to connect.

Imagine you're fielding a support query from a customer. Drill down to examine how a specific Super SIM or small group of SIMs behaved around a certain point in time.

Want to know more? Kibana's developer, Elastic, has a bunch of tutorials which will walk you through common scenarios and help you grow your knowledge.

We can't wait to see what business intelligence tools you build with Super SIM Connection Events!

