Redis Streams in Action - Part 3 (Java app to process tweets with Redis Streams)

Welcome to this series of blog posts, which covers Redis Streams with the help of a practical example. We will use a sample application to make Twitter data available for search and query in real time. RediSearch and Redis Streams serve as the backbone of this solution, which consists of several co-operating components, each of which will be covered in a dedicated blog post.

The code is available in this GitHub repo - https://github.com/abhirockzz/redis-streams-in-action

This blog post covers a Java-based tweets processor application whose role is to pick up tweets from the Redis Stream and store them (as HASHes) so that they can be queried using RediSearch (the accurate term for this is “indexing documents” in RediSearch). You will deploy the application to Azure, validate it, and run a few RediSearch queries to search tweets. Finally, there is a section that walks through the code to explain “how things work”.
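To make the processing loop concrete before you dive in, here is a minimal sketch of the read-from-stream-and-store-as-hash pattern using Jedis 4.x. The stream, consumer group, and key names are illustrative assumptions, not necessarily what the sample app uses; the real code is in the GitHub repo above.

import redis.clients.jedis.JedisPooled;
import redis.clients.jedis.StreamEntryID;
import redis.clients.jedis.exceptions.JedisDataException;
import redis.clients.jedis.params.XReadGroupParams;
import redis.clients.jedis.resps.StreamEntry;

import java.util.List;
import java.util.Map;

public class TweetsProcessorSketch {
    public static void main(String[] args) {
        JedisPooled redis = new JedisPooled("localhost", 6379);

        // Create the consumer group if it does not exist yet.
        try {
            redis.xgroupCreate("tweets_stream", "tweets-group", StreamEntryID.LAST_ENTRY, true);
        } catch (JedisDataException e) {
            // BUSYGROUP: the group already exists, which is fine.
        }

        // Read a batch of new (undelivered) entries as part of the consumer group.
        List<Map.Entry<String, List<StreamEntry>>> results = redis.xreadGroup(
                "tweets-group", "consumer-1",
                XReadGroupParams.xReadGroupParams().count(10),
                Map.of("tweets_stream", StreamEntryID.UNRECEIVED_ENTRY));

        if (results == null) return; // nothing to process right now
        for (Map.Entry<String, List<StreamEntry>> stream : results) {
            for (StreamEntry entry : stream.getValue()) {
                // Store the tweet as a HASH; RediSearch indexes hashes by key prefix.
                redis.hset("tweet:" + entry.getID(), entry.getFields());
                // Acknowledge so the entry leaves the group's pending entries list.
                redis.xack("tweets_stream", "tweets-group", entry.getID());
            }
        }
    }
}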

[Read More]

Redis Streams in Action — Part 2 (Tweets consumer app)

Welcome to this series of blog posts, which covers Redis Streams with the help of a practical example. We will use a sample application to make Twitter data available for search and query in real time. RediSearch and Redis Streams serve as the backbone of this solution, which consists of several co-operating components, each of which will be covered in a dedicated blog post.

The code is available in this GitHub repo - https://github.com/abhirockzz/redis-streams-in-action

In this part, we look at the service that interacts with the Twitter Streaming API to consume tweets and move them along to the next stage of the processing pipeline.
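As a rough sketch of that hand-off, each incoming tweet can be written to a Redis Stream with XADD, where the downstream processor (part 3) picks it up. This uses Jedis 4.x, and the stream name and fields are illustrative assumptions rather than the sample app's actual values.

import redis.clients.jedis.JedisPooled;
import redis.clients.jedis.params.XAddParams;

import java.util.Map;

public class TweetsIngestSketch {
    public static void main(String[] args) {
        JedisPooled redis = new JedisPooled("localhost", 6379);

        // Each tweet from the Twitter Streaming API becomes one stream entry.
        Map<String, String> tweet = Map.of(
                "user", "redislabs",
                "text", "Redis Streams in action!",
                "location", "earth");

        // XADD with an auto-generated ID; MAXLEN keeps the stream from growing unbounded.
        redis.xadd("tweets_stream", XAddParams.xAddParams().maxLen(10_000), tweet);
    }
}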

[Read More]

Redis Streams in Action: Part 1 (Intro and overview)

Welcome to this series of blog posts, which covers Redis Streams with the help of a practical example. We will use a sample application to make Twitter data available for search and query in real time. RediSearch and Redis Streams serve as the backbone of this solution, which consists of several co-operating components, each of which will be covered in a dedicated blog post.

The code is available in this GitHub repo - https://github.com/abhirockzz/redis-streams-in-action

This is the first part, which explores the use case and motivations, and provides a high-level overview of the Redis features used in the solution.

[Read More]

Getting started with Kafka and Rust: Part 2

This is a two-part series to help you get started with Rust and Kafka. We will be using the rust-rdkafka crate, which is itself based on librdkafka (a C library).

In this post we will cover the Kafka Consumer API.

Initial setup

Make sure you install a Kafka broker; a local setup should suffice. Of course, you will also need Rust installed (version 1.45 or above).

[Read More]

Getting started with Kafka and Rust: Part 1

This is a two-part series to help you get started with Rust and Kafka. We will be using the rust-rdkafka crate, which is itself based on librdkafka (a C library).

In this post we will cover the Kafka Producer API.

Initial setup

Make sure you install a Kafka broker; a local setup should suffice. Of course, you will also need Rust installed (version 1.45 or above).

[Read More]

RediSearch in Action

Redis has a versatile set of data structures ranging from simple Strings all the way to powerful abstractions such as Redis Streams. The native data types can take you a long way, but there are certain use cases that may require a workaround. One example is the requirement to use secondary indexes in Redis in order to go beyond the key-based search/lookup for richer query capabilities. Though you can use Sorted Sets, Lists, and so on to get the job done, you’ll need to factor in some trade-offs.
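As a taste of what the post covers, here is a minimal, hypothetical sketch of a secondary index with RediSearch using Jedis 4.x (which bundles a RediSearch client): define a schema over hashes with a given key prefix, write a hash, and query by field content rather than by key. The index and key names are made up for illustration.

import redis.clients.jedis.JedisPooled;
import redis.clients.jedis.search.IndexDefinition;
import redis.clients.jedis.search.IndexOptions;
import redis.clients.jedis.search.Query;
import redis.clients.jedis.search.Schema;
import redis.clients.jedis.search.SearchResult;

import java.util.Map;

public class RediSearchSketch {
    public static void main(String[] args) {
        JedisPooled redis = new JedisPooled("localhost", 6379);

        // A secondary index over every HASH whose key starts with "user:".
        Schema schema = new Schema()
                .addTextField("name", 1.0)
                .addNumericField("age");
        IndexDefinition definition = new IndexDefinition().setPrefixes("user:");
        redis.ftCreate("user-index",
                IndexOptions.defaultOptions().setDefinition(definition), schema);

        // Matching hashes are indexed automatically as they are written.
        redis.hset("user:1", Map.of("name", "jane", "age", "42"));

        // Query by field content instead of a key-based lookup.
        SearchResult result = redis.ftSearch("user-index", new Query("@name:jane"));
        System.out.println(result.getTotalResults());
    }
}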

[Read More]

Real-Time Search and Analytics with Confluent, Azure, Redis, and Spring Cloud

Self-managing a distributed system like Apache Kafka®, along with building and operating Kafka connectors, is complex and resource-intensive. It requires significant Kafka skills and expertise in the development and operations teams of your organization. Additionally, the higher the volume of real-time data you work with, the more challenging it becomes to ensure that all of the infrastructure scales efficiently and runs reliably.

Confluent and Microsoft are working together to make the process of adopting event streaming easier than ever by alleviating the typical infrastructure management needs that often pull developers away from building critical applications. With Azure and Confluent seamlessly integrated, you can collect, store, and process event streams in real time and feed them to multiple Azure data services. The integration helps reduce the burden of managing resources across Azure and Confluent.

[Read More]

Autoscaling Redis applications on Kubernetes 🚀🚀

This blog post demonstrates how to autoscale your Redis-based applications on Kubernetes. Redis is a widely used (and loved!) database that supports a rich set of data structures (String, Hash, Streams, Geospatial, and more), as well as other features such as pub/sub messaging and clustering (HA). One such data structure is the List, which supports operations such as inserts (LPUSH, RPUSH, LINSERT, etc.), reads (LRANGE), and deletes (LREM, LPOP, etc.). But that’s not all!
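For a quick feel of the List operations mentioned above, here is a tiny Jedis 4.x sketch (the key and item names are made up); a queue-like List such as this is exactly the kind of thing whose length an autoscaler can watch.

import redis.clients.jedis.JedisPooled;

public class ListOpsSketch {
    public static void main(String[] args) {
        JedisPooled redis = new JedisPooled("localhost", 6379);

        // Producers push work items onto the head of the list.
        redis.lpush("jobs", "job-1", "job-2", "job-3");

        // Consumers read and remove items.
        System.out.println(redis.lrange("jobs", 0, -1)); // peek at all pending items
        System.out.println(redis.rpop("jobs"));          // take the oldest item from the tail
        System.out.println(redis.llen("jobs"));          // queue depth: a natural scaling metric
    }
}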

[Read More]

Azure Cosmos DB: Use Cases and Trade-Offs

Azure Cosmos DB is a fully managed, elastically scalable, and globally distributed database with a multi-model approach that lets you work with document, key-value, wide-column, or graph-based data.

We will drill further into the multi-model capabilities and explore the options that are available to store and access data. Hopefully, this can help you make an informed decision on which API is the right choice.

  • Core (SQL) API: Flexibility of a NoSQL document store combined with the power of SQL for querying (a quick query sketch follows this list).
  • MongoDB API: Supports the MongoDB wire protocol so that existing MongoDB clients continue to work with Azure Cosmos DB as if they were running against an actual MongoDB database.
  • Cassandra API: Supports the Cassandra wire protocol so that existing Apache drivers compliant with CQLv4 continue to work with Azure Cosmos DB as if they were running against an actual Cassandra database.
  • Gremlin API: Supports graph data with Apache TinkerPop (a graph computing framework) and the Gremlin query language.
  • Table API: Provides premium capabilities for applications written for Azure Table storage.
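To illustrate just the first of these, here is a minimal sketch of a Core (SQL) API query using the azure-cosmos Java SDK (v4); the endpoint, key, database, and container names are placeholders you would replace with your own.

import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.models.CosmosQueryRequestOptions;
import com.fasterxml.jackson.databind.JsonNode;

public class CosmosSqlSketch {
    public static void main(String[] args) {
        // Placeholder endpoint and key: use your account's values from the Azure portal.
        CosmosClient client = new CosmosClientBuilder()
                .endpoint("https://<account>.documents.azure.com:443/")
                .key("<access-key>")
                .buildClient();

        CosmosContainer container = client
                .getDatabase("appdb")
                .getContainer("users");

        // SQL-style query over JSON documents.
        container.queryItems("SELECT * FROM c WHERE c.city = 'Seattle'",
                        new CosmosQueryRequestOptions(), JsonNode.class)
                .forEach(item -> System.out.println(item.toPrettyString()));
    }
}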

You can read further on some of the Key Benefits

[Read More]

An easy to use monitoring solution for Redis

Recently, I discovered a nice way of plugging in monitoring for Redis using Grafana, thanks to this great Data Source plugin that works with any Redis database, including Azure Cache for Redis!

It’s really easy to set up and try:

Set up an Azure Cache for Redis instance

Start Grafana in Docker:

docker run -d -p 3000:3000 --name=grafana -e "GF_INSTALL_PLUGINS=redis-datasource" grafana/grafana

Access the Grafana dashboard - browse to http://localhost:3000/

Enter admin as the username and password

Add the Data Source
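For an Azure Cache for Redis instance, the data source settings look roughly like this (the exact field labels depend on the plugin version, so treat this as an assumption; the access key comes from the Azure portal):

Address: <cache-name>.redis.cache.windows.net:6380
Password: <access key from the Azure portal>
TLS: enabled (Azure Cache for Redis serves TLS on port 6380)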

[Read More]