
At Timeplus, we are committed to building the next-generation streaming analytics platform, and we leverage various SaaS/PaaS tools so we can stay focused on our core mission. As Head of Product, I am happy to get my hands dirty and build a few internal tools with Superblocks, without bothering our busy engineering team.
If you haven’t heard of Superblocks, it’s an all-in-one internal tooling platform for developers.
Assemble any UI. Query any datasource. Automate any workflow. Schedule any job. All in one place.
I'd like to track how these internal tools are being used. In the spirit of "eating our own dog food," I configured Superblocks to send audit logs and user activities to Timeplus, then built dashboards and alerts in our own platform to understand usage and catch potential issues.
I wrote this tutorial so that other Superblocks users can leverage our free-to-use Timeplus Cloud to gain similar real-time visibility.
A sample dashboard looks like this:

Step 1: Send Superblocks logs to Kafka
Assuming you have built one or more applications in Superblocks, go to the Observability tab to connect to either Kafka or Confluent Cloud.

In my case, I chose Confluent Cloud, a fully managed Apache Kafka service. All I needed to do was create a topic, say 'superblocks', with 1 partition and no schema binding.
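If you prefer the command line over the Confluent Cloud web UI, the topic can also be created with the Confluent CLI. This is a sketch that assumes the CLI is installed, logged in, and pointed at your cluster:

```
# create a 1-partition topic named 'superblocks' in the current cluster
confluent kafka topic create superblocks --partitions 1
```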

Simply add the Confluent Cloud or Apache Kafka broker, topic, and credentials in Superblocks' admin UI.

In a few seconds, new user activity from your Superblocks apps will be logged and available in the Kafka topic, in JSON format.

You can expand one of the messages and see the full JSON document, like this:
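With the screenshot omitted, here is what such a message could look like. The values are hypothetical, and the exact set of fields may vary, but the key names match the ones queried later in this tutorial:

```json
{
  "time": "2023-01-15T18:30:00Z",
  "level": "info",
  "msg": "login",
  "resource-type": "application",
  "resource-name": "user admin tool",
  "user-email": "someone@example.com"
}
```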

Timeplus provides great support for JSON data. For documents like this, with many key/value pairs, you don't need to create a schema explicitly. Timeplus makes it easy to set everything up with just a few clicks.
Step 2: Collect data in Timeplus Cloud
Go to https://timeplus.cloud to create an account and set up a workspace. Go to the Source tab and create a Kafka source.

Timeplus Cloud provides an intuitive UI to connect to your Kafka data. Simply set the broker, topic, etc.

Make sure you change the value of the last option (Data Format) from JSON to TEXT. If you choose "JSON", Timeplus will create the stream based on the current schema of the JSON documents, with each key becoming a column. This works, but if Superblocks adds new key/value pairs in the future, those messages will fail to insert. Saving each JSON document as TEXT also lets a single stream hold JSON documents with different schemas.
Click the Next button. Timeplus will fetch the sample data from Confluent Cloud. You can let Timeplus automatically create a new stream with a single `raw` column in `string` type.
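For reference, the stream Timeplus auto-creates in this setup is roughly equivalent to the following DDL sketch (the stream name is an assumption; yours may differ):

```sql
-- a single string column holding each raw JSON document
CREATE STREAM superblocks_o11y (raw string);
```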

Click the Next button again to confirm the settings.

Click the Create the source button to finish the setup.
Step 3: Explore streaming data
Once the source is created, Timeplus will start loading data from Confluent Cloud into the specified stream. Go to the Query tab and click the name of your stream in the "Stream Catalog" section to generate the SQL (you can also type it manually). Click the Run Query button to run the streaming query.

You can view details of the raw JSON data by clicking the eye icon on the left.

You can pick which keys in the JSON document you want to explore or visualize, then use the EasyJSON feature in Timeplus to query them with a JSON path. For example, `raw:level` gets the string value of the `level` key. For keys with special characters, such as `http.method` or `resource-name`, wrap them in double quotes, e.g. `raw:"resource-type"`.
Here is a sample query to list all events from the past day (`now()-1d` is shorthand for the timestamp 24 hours ago):
SELECT to_time(raw:time) AS _tp_time, raw:level, raw:msg, raw:"resource-type", raw:"resource-name", raw:"user-email" AS email
FROM superblocks_o11y WHERE _tp_time > now() - 1d

Step 4: Build charts and alerts
For the last step, you can build dashboards with this real-time data, or even set up real-time alerts to stay on top of unexpected events.
Simply switch to the Visualization tab, choose a chart type, and configure the settings for data binding and chart styles. Finally, add it to a dashboard.
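A chart on a dashboard is typically backed by a windowed aggregation. The following is a sketch, assuming the stream is named `superblocks_o11y` as above, using Timeplus' `tumble` function for fixed, non-overlapping time windows:

```sql
-- count events per resource type in 1-minute tumbling windows,
-- e.g. to drive a bar or line chart on a dashboard
SELECT window_start, raw:"resource-type" AS resource_type, count(*) AS events
FROM tumble(superblocks_o11y, 1m)
GROUP BY window_start, resource_type
```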

You can also push unexpected events to downstream systems such as Kafka or Snowflake, or simply send a Slack message or email to your security team. Here is an example of an unexpected login outside the organization.
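Such an alert can be driven by a streaming query that matches only the suspicious events. This is a sketch, assuming login events carry `login` in the `msg` field and that your organization's email domain is `example.com` (replace both with your own values):

```sql
-- flag logins from email addresses outside the organization
SELECT to_time(raw:time) AS ts, raw:msg, raw:"user-email" AS email
FROM superblocks_o11y
WHERE raw:msg LIKE '%login%' AND raw:"user-email" NOT LIKE '%@example.com'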

Summary
By integrating Superblocks with Timeplus, you gain real-time visibility into internal tool usage: never miss a notable security event, and better understand which features are being used. Sign up for a free account at https://timeplus.cloud and build your own dashboards and alerts today.