An easy-to-use monitoring solution for Redis

Recently, I discovered a nice way of plugging in monitoring for Redis using Grafana, thanks to this great Data Source plugin that works with any Redis database, including Azure Cache for Redis!

It’s really easy to set up and try:

Set up an Azure Cache for Redis instance

Start Grafana in Docker:

docker run -d -p 3000:3000 --name=grafana -e "GF_INSTALL_PLUGINS=redis-datasource" grafana/grafana

Access the Grafana dashboard - browse to http://localhost:3000/

Enter admin as the username and password

Add the Data Source
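If you’d rather script that last step than click through the UI, Grafana also accepts data sources over its HTTP API. A minimal sketch, assuming the plugin installed above - the cache hostname and access key are placeholders:

# register the Redis data source (admin:admin are the default credentials from the step above)
curl -X POST http://admin:admin@localhost:3000/api/datasources \
  -H "Content-Type: application/json" \
  -d '{
        "name": "Azure Cache for Redis",
        "type": "redis-datasource",
        "access": "proxy",
        "url": "my-cache.redis.cache.windows.net:6380",
        "secureJsonData": { "password": "<azure-cache-access-key>" }
      }'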

[Read More]

Getting started with Rust and Redis

Are you learning Rust and looking for ways to get some hands-on practice with concrete examples? A good approach might be to try and integrate Rust with external systems. Why not try to use it with Redis? It is a powerful, versatile database but dead simple to get started with!

In this blog post, you will learn how to use the Rust programming language to interact with Redis using the redis-rs client. We will walk through commonly used Redis data structures such as String, Hash, and List. The Redis client used in the sample code exposes both high-level and low-level APIs, and you will see both styles in action.
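If you want a quick feel for those data structures before writing any Rust, the same operations look like this in redis-cli (the key names here are made up for illustration):

redis-cli SET greeting "hello world"                      # String
redis-cli HSET user:1 name jane email jane@example.com    # Hash
redis-cli HGETALL user:1
redis-cli LPUSH jobs job-1 job-2                          # List
redis-cli LRANGE jobs 0 -1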

[Read More]

Build a pipeline to join streams of real-time data

With traditional architectures, it’s quite hard to counter the challenges posed by real-time streaming data - one such use case is joining streams of data from disparate sources. For example, think about a system that accepts processed orders from customers (a real-time, high-velocity data source), where the requirement is to enrich these “raw” orders with additional customer info such as name, email, and location. A possible solution is to build a service that fetches customer data for each customer ID from an external system (for example, a database), performs the join in memory, and stores the enriched data in another database, perhaps as a materialized view. This approach has several problems, though, one of which is the inability to keep up with high-volume data, i.e. to process it with low latency.

[Read More]

PostgreSQL pgoutput plugin for change data capture

Set up a Change Data Capture architecture on Azure using Debezium, Postgres and Kafka was a tutorial on using Debezium for change data capture from Azure PostgreSQL, sending the change events to Azure Event Hubs for Kafka - it used the wal2json output plugin.

What about the pgoutput plugin?

This blog provides a quick walkthrough of how to use the pgoutput plugin, and clarifies a point raised by Denis Arnaud (thank you for bringing it up!)
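For context, pgoutput is the logical decoding output plugin that ships with PostgreSQL itself (version 10 onwards), so there is nothing extra to install. Here is a minimal sketch of trying it out from psql, assuming wal_level is set to logical - the publication and slot names are placeholders:

psql -c "CREATE PUBLICATION my_publication FOR ALL TABLES;"
psql -c "SELECT pg_create_logical_replication_slot('my_slot', 'pgoutput');"
# peek at pending change events (pgoutput emits a binary protocol, hence the binary variant)
psql -c "SELECT * FROM pg_logical_slot_peek_binary_changes('my_slot', NULL, NULL, 'proto_version', '1', 'publication_names', 'my_publication');"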

[Read More]

How to use the Azure Go SDK to manage Azure Data Explorer clusters

Getting started with Azure Data Explorer using the Go SDK covered how to use the Azure Data Explorer Go SDK to ingest and query data. In this blog, you will use the Azure Go SDK to manage Azure Data Explorer clusters and databases.

Azure Data Explorer (also known as Kusto) is a fast and scalable data exploration service for analyzing large volumes of diverse data from any data source, such as websites, applications, IoT devices, and more. This data can then be used for diagnostics, monitoring, reporting, machine learning, and additional analytics capabilities.
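If you want a feel for the management operations before reading the Go version, the Azure CLI exposes rough equivalents. A sketch with placeholder names - I’m writing the flags from memory, so double-check them against az kusto --help:

# create a cluster (the SKU value is illustrative)
az kusto cluster create -n mykustocluster -g my-resource-group --location westus2 --sku "Dev(No SLA)_Standard_D11_v2"
# create a database in that cluster
az kusto database create --cluster-name mykustocluster -g my-resource-group -n mykustodb --soft-delete-period P30D --hot-cache-period P7D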

[Read More]

Tutorial: Getting started with Azure Data Explorer using the Go SDK

With the help of an example, this blog post will walk you through how to use the Azure Data Explorer Go SDK to ingest data from an Azure Blob storage container and query it programmatically using the SDK. After a quick overview of how to set up an Azure Data Explorer cluster (and a database), we will explore the code to understand what’s going on (and how), and finally test the application using a simple CLI interface.

[Read More]

Change Data Capture architecture using Debezium, Postgres and Kafka

Change Data Capture (CDC) is a technique used to track row-level changes in database tables in response to create, update, and delete operations. Different databases use different techniques to expose these change data events - for example, logical decoding in PostgreSQL, the MySQL binary log (binlog), etc. This is a powerful capability, but it is useful only if there is a way to tap into these event logs and make them available to other services that depend on that information.
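That “tap” is exactly what Debezium provides, in the form of Kafka Connect connectors. As a flavor of how little glue is involved, here is a sketch of registering a Debezium PostgreSQL connector with a local Kafka Connect worker - the hostnames, credentials, and names are all placeholders:

curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "orders-connector",
        "config": {
          "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
          "database.hostname": "localhost",
          "database.port": "5432",
          "database.user": "postgres",
          "database.password": "<password>",
          "database.dbname": "ordersdb",
          "database.server.name": "orders-server"
        }
      }'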

[Read More]

Azure Event Hubs 'Role-Based Access Control' in action

Azure Event Hubs is a streaming platform and event ingestion service that can receive and process millions of events per second. In this blog, we are going to cover one of the security aspects of Azure Event Hubs.

Shared Access Signature (SAS) is a commonly used authentication mechanism for Azure Event Hubs which can be used to enforce granular control over the type of access you want to grant - it works by configuring rules on Event Hubs resources (namespace or topic). However, it is recommended that you use Azure AD credentials (over SAS) whenever possible since it provides similar capabilities without the need to manage SAS tokens or worry about revoking a compromised SAS.
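With the Azure AD route, granting access becomes a role assignment instead of a token to distribute. A sketch using the built-in Event Hubs data roles - the IDs, names, and scope below are placeholders:

# allow an app (service principal) or user to send events to a namespace
az role assignment create \
  --assignee "<app-or-user-object-id>" \
  --role "Azure Event Hubs Data Sender" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.EventHub/namespaces/<namespace>"

Use the "Azure Event Hubs Data Receiver" role for consuming applications.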

[Read More]

Tip: Using the latest TLS version with Azure Cache for Redis

Azure Cache for Redis provides an in-memory data store based on the open-source software Redis.

As part of the industry-wide push toward the exclusive use of Transport Layer Security (TLS) version 1.2 or later, Azure Cache for Redis will stop supporting TLS versions 1.0 and 1.1, i.e. your application will be required to use TLS 1.2 or later to communicate with your cache.
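If you want to enforce this on your own cache ahead of time, here is a sketch using the Azure CLI - the names are placeholders, and the exact property path is worth verifying against the current az redis docs:

az redis update -n my-cache -g my-resource-group --set minimumTlsVersion="1.2"
# verify connectivity over TLS (requires redis-cli 6.0+ built with TLS support)
redis-cli -h my-cache.redis.cache.windows.net -p 6380 -a "<access-key>" --tls PING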

To read the details, please refer to this page from the product documentation.

[Read More]