diff --git a/docs/common/_graph-eol.mdx b/docs/common/_graph-eol.mdx new file mode 100644 index 00000000000..7977150410b --- /dev/null +++ b/docs/common/_graph-eol.mdx @@ -0,0 +1,7 @@ +:::info End-of-Life Notice +Redis is phasing out **RedisGraph**. [**This blog post**](https://redis.com/blog/redisgraph-eol/) explains the motivation behind this decision and the implications for existing Redis customers and community members. + +End of support is scheduled for January 31, 2025. + +_Beginning with Redis Stack 7.2.x-y, Redis Stack will no longer include graph capabilities (RedisGraph)._ +::: \ No newline at end of file diff --git a/docs/create/aws/analytics-using-aws/index-analytics-using-aws.mdx b/docs/create/aws/analytics-using-aws/index-analytics-using-aws.mdx index f2c37478613..a7fc28ddbd0 100644 --- a/docs/create/aws/analytics-using-aws/index-analytics-using-aws.mdx +++ b/docs/create/aws/analytics-using-aws/index-analytics-using-aws.mdx @@ -6,6 +6,10 @@ slug: /create/aws/analytics-using-aws authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + An interactive analytics dashboard serves several purposes. They allow you to share data and provide you with all those vital information to make game-changing decisions at a faster pace. Building a real-time dynamic dashboard using a traditional relational database might require a complex set of queries. By using a NoSQL database like Redis, you can build a powerful interactive and dynamic dashboard with a small number of Redis commands. Let’s take a look at how this was achieved. @@ -26,7 +30,7 @@ Ready to get started? Ok, let’s dive straight in. ### What will you need? - [NodeJS](https://developer.redis.com/develop/node): used as an open-source, cross-platform, backend JavaScript runtime environment that executes Javascript code outside a web browser. -- [Redis Enterprise Cloud](https://developer.redis.com/create/rediscloud): used as a real-time database, cache, and message broker. +- [Redis Cloud](https://redis.com/try-free): used as a real-time database, cache, and message broker. - [NPM](https://www.npmjs.com/): used as a package manager. It allows you to build node apps. ### Getting Started @@ -36,13 +40,13 @@ Ready to get started? Ok, let’s dive straight in. - Install Node - v12.19.0 - Install NPM - v6.14.8 -### Step 1. Sign up for a Free Redis Enterprise Cloud Account +### Step 1. Sign up for a Free Redis Cloud Account -[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Enterprise Cloud account. +[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Cloud account. ![image](analytics3.png) -Choose AWS as a Cloud vendor while creating your new subscription. At the end of the database creation process, you will get a Redis Enterprise CLoud database endpoint and password. You can save it for later use. +Choose AWS as a Cloud vendor while creating your new subscription. At the end of the database creation process, you will get a Redis Cloud database endpoint and password. You can save it for later use. 
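The endpoint, port, and password are everything a Redis client needs to connect. Before wiring them into the Node.js backend, it can help to run a quick sanity check and see how little is needed for bitmap-style analytics. The sketch below uses redis-py purely for illustration (the tutorial's backend is Node.js, and the commands are identical in any client); the host, port, password, and key names are placeholders, not values from this project.

```python
import redis

# Placeholders: substitute the endpoint, port, and password you saved above.
r = redis.Redis(
    host="redis-12345.c1.us-east-1-2.ec2.cloud.redislabs.com",
    port=12345,
    password="your-database-password",
)
print(r.ping())  # True confirms the endpoint and credentials work

# A typical bitmap layout for analytics: one bit per user ID,
# one key per metric and day (hypothetical key names).
r.setbit("visits:2021-12-01", 42, 1)    # user 42 visited that day
r.setbit("visits:2021-12-01", 7, 1)     # user 7 visited that day
print(r.bitcount("visits:2021-12-01"))  # unique visitors that day -> 2
```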
![image](analytics4.png) @@ -62,7 +66,7 @@ Go to /server folder (cd ./server) and then execute the below command: cp .env.example .env ``` -Open .env file and add Redis Enterprise Cloud Database Endpoint URL, port and password as shown below: +Open .env file and add Redis Cloud Database Endpoint URL, port and password as shown below: ``` @@ -432,5 +436,5 @@ to - [Project Source Code](https://github.com/redis-developer/basic-analytics-dashboard-redis-bitmaps-nodejs) - [Use cases of Bitmaps](https://redis.io/topics/data-types-intro) -- [How to Build a Slack Bot to Retrieve Lost Files Using AWS S3 and RediSearch](/create/aws/slackbot) +- [How to Build a Slack Bot to Retrieve Lost Files Using AWS S3 and Search](/create/aws/slackbot) - [How to Deploy and Manage Redis Database on AWS Using Terraform](/create/aws/terraform) diff --git a/docs/create/aws/bidding-on-aws/index-bidding-on-aws.mdx b/docs/create/aws/bidding-on-aws/index-bidding-on-aws.mdx index 7fc58a1ccbc..cd3fc0907a3 100644 --- a/docs/create/aws/bidding-on-aws/index-bidding-on-aws.mdx +++ b/docs/create/aws/bidding-on-aws/index-bidding-on-aws.mdx @@ -6,6 +6,10 @@ slug: /create/aws/bidding-on-aws authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + Digital technology has propelled us forward to an exciting new era and has transformed almost every aspect of life. We’re more interconnected than ever as communication has become instant. Working from home has now become the norm, helping us pivot to a new way of working during the pandemic. And our ability to reduce carbon emissions by attending work-related events online has meant that we’ve taken greater strides to combat global warming. Continuing this trend is [Shamshir Anees and his team](https://github.com/shamshiranees), who have created an application that can host digital auctions. By using Redis, data transmission between components was carried out with maximum efficiency, providing users with real-time bidding updates on the dashboard. Let’s take a look at how this was achieved. We’d also like to point out that we have a diverse range of exciting applications for you to check out on the [Redis Launchpad](https://launchpad.redis.com). @@ -28,8 +32,8 @@ Ready to get started? Ok, let’s dive straight in. - [NodeJS](https://developer.redis.com/develop/node): used as an open-source, cross-platform, backend JavaScript runtime environment that executes Javascript code outside a web browser. - [Amazon Cognito](https://aws.amazon.com/cognito/): used to securely manage and synchronize app data for users on mobile. -- [Redis Enterprise Cloud](https://developer.redis.com/create/rediscloud): used as a real-time database, cache, and message broker. -- [RedisJSON](https://developer.redis.com/howtos/redisjson/getting-started): used to store, update and fetch JSON values from Redis. +- [Redis Cloud](https://redis.com/try-free): used as a real-time database, cache, and message broker. +- [Redis Stack](https://developer.redis.com/quick-start): used to store, update and fetch JSON values from Redis. - [Socket.IO](https://socket.io/): used as a library that provides real-time, bi-directional, and event-based communication between the browser and the server. - [AWS Lambda](https://aws.amazon.com/lambda/): used as a serverless compute service that runs your code in response events and manages the underlying compute service automatically for you. 
- [Amazon SNS/Amazon SES](https://aws.amazon.com/sns/): a fully managed messaging service for both application-to-application (A2A) and application-to-person (A2P) communication. @@ -42,13 +46,13 @@ Ready to get started? Ok, let’s dive straight in. #### All auctions -NodeJS connects to the Redis Enterprise Cloud database. +NodeJS connects to the Redis Cloud database. The frontend then communicates with the NodeJS backend through API calls. `GET : /api/auctions` fetches all the keys from Auctions Hash. -NodeJS uses the Redis module to work with Redis Enterprise Cloud. The Redis client is then created using the Redis credentials and hmget(). This is the equivalent of the HMSET command that’s used to push data to the Redis database. +NodeJS uses the Redis module to work with Redis Cloud. The Redis client is then created using the Redis credentials and hmget(). This is the equivalent of the HMSET command that’s used to push data to the Redis database. #### Each auction @@ -81,13 +85,13 @@ NodeJS uses the Redis module to work with Redis Cloud. The Redis client is then - [NodeJS](https://nodejs.org/en/) - [NPM](https://www.npmjs.com/) -### Step 1. Sign up for a Free Redis Enterprise Cloud Account +### Step 1. Sign up for a Free Redis Cloud Account -[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Enterprise Cloud account. +[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Cloud account. ![image](sign3.png) -Choose AWS as a Cloud vendor while creating your new subscription. At the end of the database creation process, you will get a Redis Enterprise CLoud database endpoint and password. You can save it for later use. +Choose AWS as a Cloud vendor while creating your new subscription. At the end of the database creation process, you will get a Redis Cloud database endpoint and password. You can save it for later use. ![image](sign4.png) @@ -169,7 +173,7 @@ npm start ### How data is stored -The Redis Enterprise Cloud Database with RedisJSON module is what you’ll use to install the data. +The [Redis Cloud](https://redis.com/try-free) database with Redis Stack is what you’ll use to install the data. ### Auctions @@ -248,7 +252,7 @@ You’ll then be taken to the sign-up page. Enter your details and click ‘sign ### Placing a bid -Go to the homepage to have access to view all of the items and their auction details. All of the data here is being populated by RedisJSON and Redis Cloud. Scroll through the page and click on the item that you want to place a bid for. +Go to the homepage to have access to view all of the items and their auction details. All of the data here is being populated by Redis Stack and Redis Cloud. Scroll through the page and click on the item that you want to place a bid for. ![placing](images/image_9.png) @@ -284,23 +288,4 @@ But thanks to Redis, the components that made up the architecture system became [NR-Digital-Auction](https://launchpad.redis.com/?id=project%3ANR-digital-auction-frontend) is a fantastic example of how innovations can be brought to life by using Redis. Everyday programmers are experimenting with Redis to build applications that are impacting everyday life from around the world and you can too! -So what can you build with Redis? For more inspiration, you can head over to the Redis Launchpad to access an exciting range of applications. - -Don't miss out on your $200 credit. Plus, your opportunity win a Tesla! - -
- - - Redis Try Free - - -
+So what can you build with Redis? For more inspiration, you can head over to the [Redis Launchpad](https://launchpad.redis.com/) to access an exciting range of applications. If you're ready to get started building, quickly spin up a free database [Redis Cloud](https://redis.com/try-free/). diff --git a/docs/create/aws/chatapp/index-chatapp.mdx b/docs/create/aws/chatapp/index-chatapp.mdx index 1c906da2088..647861343e4 100644 --- a/docs/create/aws/chatapp/index-chatapp.mdx +++ b/docs/create/aws/chatapp/index-chatapp.mdx @@ -6,6 +6,10 @@ slug: /create/aws/chatapp authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + Real time chat messaging apps are surging in popularity exponentially. Mobile apps like WhatsApp, Facebook, Telegram, Slack, Discord have become “a part and parcel” of our life. Users are addicted to these live chat mobile app conversations as they bring a personal touch and offer a real-time interaction ![chatapp](image_chatapp1.png) @@ -21,30 +25,30 @@ There’s been a rise in the number of social media apps that bring social eleme ### 1. What will you build? -In this tutorial, we will see how to build a realtime chat app built with Flask, Socket.IO and Redis Enterprise Cloud running on Amazon Web Services. This example uses pub/sub features combined with web-sockets for implementing the message communication between client and server. +In this tutorial, we will see how to build a realtime chat app built with Flask, Socket.IO and Redis Cloud running on Amazon Web Services. This example uses pub/sub features combined with web-sockets for implementing the message communication between client and server. ![image](chatapp2.png) ### 2. What will you need? - Frontend - React, Socket.IO -- Backend - Python(Flask), Redis Enterprise Cloud hosted on AWS +- Backend - Python(Flask), Redis Cloud hosted on AWS ### 3. Getting Started -### Step 1. Sign up for a Free Redis Enterprise Cloud Account +### Step 1. Sign up for a Free Redis Cloud Account -[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Enterprise Cloud account. +[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for a free Redis Cloud account. If you already have an existing account, then all you need are your login credentials to access your subscription. ![image](chatapp3.png) -Choose AWS as the Cloud vendor while creating your new subscription. While creating a new database, ensure that you set your own password. At the end of the database creation process, you will get a Redis Enterprise Cloud database endpoint and port. Save these, you will need them later. +Choose AWS as the Cloud vendor while creating your new subscription. While creating a new database, ensure that you set your own password. At the end of the database creation process, you will get a Redis Cloud database endpoint and port. Save these, you will need them later. ![image](chatapp4.png) -:::info TIP -You don't need to create an AWS account for setting up your Redis database. Redis Enterprise Cloud on AWS is a fully managed database-as-a-service trusted by thousands of customers for high performance, infinite scalability, true high availability, and best-in-class support. +:::tip +You don't need to create an AWS account for setting up your Redis database. Redis Cloud on AWS is a fully managed database-as-a-service trusted by thousands of customers for high performance, infinite scalability, true high availability, and best-in-class support. ::: ### Step 2. 
Clone the repository @@ -114,7 +118,7 @@ The demo data initialization is handled in multiple steps: We create a new user id: INCR total_users. Then we set a user ID lookup key by user name: e.g. -``` +```bash SET username:nick user:1 ``` @@ -123,14 +127,14 @@ And finally, the rest of the data is written to the hash set: Example: ```bash - HSET user:1 username "nick" password "bcrypt_hashed_password". +HSET user:1 username "nick" password "bcrypt_hashed_password". ``` Additionally, each user is added to the default "General" room. For handling rooms for each user, we have a set that holds the room ids. Here's an example command of how to add the room: ```bash - SADD user:1:rooms "0" +SADD user:1:rooms "0" ``` #### Populating private messages between users @@ -140,16 +144,16 @@ First, private rooms are created: if a private room needs to be established, for E.g. Create a private room between 2 users: ```bash - SADD user:1:rooms 1:2 and SADD user:2:rooms 1:2 +SADD user:1:rooms 1:2 and SADD user:2:rooms 1:2 ``` Then we add messages for each conversation to this room by writing to a sorted set: ```bash - ZADD room:1:2 1615480369 "{'from': 1, 'date': 1615480369, 'message': 'Hello', 'roomId': '1:2'}" +ZADD room:1:2 1615480369 "{'from': 1, 'date': 1615480369, 'message': 'Hello', 'roomId': '1:2'}" ``` -We are using a stringified JSON to keep the message structure and simplify the implementation details for this demo-app. You may choose to use a Hash or RedisJSON +We are using a stringified JSON to keep the message structure and simplify the implementation details for this demo-app. You may choose to use a Hash or JSON ### Populate the "General" room with messages @@ -172,7 +176,7 @@ When a WebSocket connection is established, we can start to listen for events: A global set with online_users key is used for keeping the online state for each user. So on a new connection, a user ID is written to that set: ```bash - SADD online_users 1 +SADD online_users 1 ``` Here we have added user with id 1 to the set online_users @@ -222,7 +226,7 @@ User data is stored in a hash set where each user entry contains the next values Get User HGETALL user:{id}. ```bash - HGETALL user:2 +HGETALL user:2 ``` where we get data for the user with id: 2. diff --git a/docs/create/aws/chatapp/index-chatapp.mdx_python b/docs/create/aws/chatapp/index-chatapp.mdx_python index d50e68bc2b0..309d2677c4b 100644 --- a/docs/create/aws/chatapp/index-chatapp.mdx_python +++ b/docs/create/aws/chatapp/index-chatapp.mdx_python @@ -8,7 +8,7 @@ slug: /howtos/chatapp import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; In this tutorial, we will see how to build a basic chat application built with Flask, Socket.IO and Redis. @@ -28,7 +28,7 @@ In this tutorial, we will see how to build a basic chat application built with F - Python 3.6+ -### Step 2. Clone the repository +### Step 2. Clone the repository ``` git clone https://github.com/redis-developer/basic-redis-chat-app-demo-python @@ -81,7 +81,7 @@ python3 app.py ``` ``` -python3 app.py +python3 app.py * Restarting with stat * Debugger is active! * Debugger PIN: 220-696-610 @@ -103,7 +103,7 @@ The demo data initialization is handled in multiple steps: #### Creating of demo users -We create a new user id: INCR total_users. Then we set a user ID lookup key by user name: e.g. +We create a new user id: INCR total_users. 
Then we set a user ID lookup key by user name: e.g. ``` SET username:nick user:1 @@ -111,8 +111,8 @@ SET username:nick user:1 And finally, the rest of the data is written to the hash set: e.g. HSET user:1 username "nick" password "bcrypt_hashed_password". -Additionally, each user is added to the default "General" room. -For handling rooms for each user, we have a set that holds the room ids. Here's an example command of how to add the room: +Additionally, each user is added to the default "General" room. +For handling rooms for each user, we have a set that holds the room ids. Here's an example command of how to add the room: ``` SADD user:1:rooms "0" @@ -120,13 +120,13 @@ SADD user:1:rooms "0" Populate private messages between users. At first, private rooms are created: if a private room needs to be established, for each user a room id: room:1:2 is generated, where numbers correspond to the user ids in ascending order. -E.g. Create a private room between 2 users: +E.g. Create a private room between 2 users: ``` SADD user:1:rooms 1:2 and SADD user:2:rooms 1:2 ``` -Then we add messages to this room by writing to a sorted set: +Then we add messages to this room by writing to a sorted set: ``` ZADD room:1:2 1615480369 "{'from': 1, 'date': 1615480369, 'message': 'Hello', 'roomId': '1:2'}" @@ -198,7 +198,7 @@ User data is stored in a hash set where each user entry contains the next values ### How the data is accessed: -Get User HGETALL user:{id}. +Get User HGETALL user:{id}. ``` HGETALL user:2 @@ -208,8 +208,8 @@ where we get data for the user with id: 2. - Online users: SMEMBERS online_users. This will return ids of users which are online -- Get room ids of a user: SMEMBERS user:{id}:rooms. -Example: +- Get room ids of a user: SMEMBERS user:{id}:rooms. +Example: ``` SMEMBERS user:2:rooms @@ -217,8 +217,8 @@ SMEMBERS user:2:rooms This will return IDs of rooms for user with ID: 2 -- Get list of messages ZREVRANGE room:{roomId} {offset_start} {offset_end}. -Example: +- Get list of messages ZREVRANGE room:{roomId} {offset_start} {offset_end}. +Example: ``` ZREVRANGE room:1:2 0 50 @@ -244,4 +244,3 @@ It will return 50 messages with 0 offsets for the private room between users wit - diff --git a/docs/create/aws/import/index-database-migration-aws-elasticache-redis-enterprise-cloud.mdx b/docs/create/aws/import/index-database-migration-aws-elasticache-redis-enterprise-cloud.mdx index 801665d4de5..e24859a7b3e 100644 --- a/docs/create/aws/import/index-database-migration-aws-elasticache-redis-enterprise-cloud.mdx +++ b/docs/create/aws/import/index-database-migration-aws-elasticache-redis-enterprise-cloud.mdx @@ -6,9 +6,13 @@ slug: /create/aws/import/database-migration-aws-elasticache-redis-enterprise-clo authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + Most of the database migration tools available today are offline in nature. They are complex and require manual intervention. -If you want to migrate your data from Amazon ElastiCache to Redis Enterprise Cloud, for example, the usual process is to back up your ElastiCache data to an Amazon S3 bucket and then import your data using the Redis Enterprise Cloud UI. This process can require painful downtime and could result in data loss. Other available techniques include creating point-in-time snapshots of the source Redis server and applying the changes to the destination servers to keep both the servers in sync. 
That might sound like a good approach, but it can be challenging when you have to maintain dozens of scripts to implement the migration strategy. +If you want to migrate your data from Amazon ElastiCache to Redis Cloud, for example, the usual process is to back up your ElastiCache data to an Amazon S3 bucket and then import your data using the Redis Cloud UI. This process can require painful downtime and could result in data loss. Other available techniques include creating point-in-time snapshots of the source Redis server and applying the changes to the destination servers to keep both the servers in sync. That might sound like a good approach, but it can be challenging when you have to maintain dozens of scripts to implement the migration strategy. So we’ve come up with a different approach: @@ -16,15 +20,15 @@ So we’ve come up with a different approach: ![image](../../../../static/img/ajeet-riot-blog-1.png) -RIOT is an open source online migration tool built by Julien Ruaux, a Solution Architect at Redis. RIOT implements client-side replication using a producer/consumer approach. The producer is the combination of the key and value readers that have a connection to ElastiCache. The key reader component identifies keys to be replicated using scan and keyspace notifications. For each key, the value reader component performs a DUMP and handles the resulting key+bytes to the consumer (writer), which performs a RESTORE on the Redis Enterprise connection. +RIOT is an open source online migration tool built by Julien Ruaux, a Solution Architect at Redis. RIOT implements client-side replication using a producer/consumer approach. The producer is the combination of the key and value readers that have a connection to ElastiCache. The key reader component identifies keys to be replicated using scan and keyspace notifications. For each key, the value reader component performs a DUMP and handles the resulting key+bytes to the consumer (writer), which performs a RESTORE on the Redis Cloud connection. -This blog post will show how to perform a seamless online migration of databases from ElastiCache to Redis Enterprise Cloud. +This blog post will show how to perform a seamless online migration of databases from ElastiCache to Redis Cloud. ## Prerequisites: You will require a few resources to use the migration tool: -- A Redis Enterprise Cloud subscription, sign up [here](https://redis.com/try-free/) +- A Redis Cloud subscription, sign up [here](https://redis.com/try-free/) - Amazon ElastiCache (a primary endpoint in the case of a single-master EC and a configuration endpoint in the case of a clustered EC: Refer to Finding Connection Endpoints on the ElastiCache documentation to learn more) - An Amazon EC2 instance based on Linux @@ -115,14 +119,14 @@ Commands: ping, p Execute PING command ``` -Once Java and RIOT are installed, we are all set to begin the migration process with the command below, which replicates data directly from the source (ElastiCache) to the target (Redis Enterprise Cloud). +Once Java and RIOT are installed, we are all set to begin the migration process with the command below, which replicates data directly from the source (ElastiCache) to the target (Redis Cloud). 
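Under the hood, the per-key replication RIOT performs boils down to a DUMP on the source and a RESTORE on the target. As a rough illustration only — RIOT also adds keyspace-notification-based live replication, batching, threading, and retries — the core loop looks roughly like this in redis-py, with hypothetical endpoints:

```python
import redis

# Hypothetical connections: source ElastiCache and target Redis Cloud.
src = redis.Redis(host="my-elasticache.abc123.use1.cache.amazonaws.com", port=6379)
dst = redis.Redis(
    host="redis-12345.c1.us-east-1-2.ec2.cloud.redislabs.com",
    port=12345,
    password="your-redis-cloud-password",
)

# For each key found by SCAN, DUMP the serialized value from the source
# and RESTORE it (with its remaining TTL) on the target.
for key in src.scan_iter(count=1000):
    payload = src.dump(key)
    if payload is None:
        continue  # key expired between SCAN and DUMP
    ttl_ms = src.pttl(key)
    dst.restore(key, ttl_ms if ttl_ms > 0 else 0, payload, replace=True)
```

Because the serialized values travel directly from the source to the target connection, no intermediate backup or S3 bucket is involved.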
## Step 4 - Migrate the data -Finally, it’s time to replicate the data from ElastiCache to Redis Enterprise Cloud by running the following command: +Finally, it’s time to replicate the data from ElastiCache to Redis Cloud by running the following command: ``` -sudo ./riot-redis -r redis://:6379 replicate -r redis://password@:port --live +sudo ./riot-redis -r redis://:6379 replicate -r redis://password@:port --live ``` ElastiCache can be configured in two ways: clustered and non-clustered. In the chart below, the first row shows what commands you should perform for the non-clustered scenario, while the second row shows the command for the clustered scenario with a specific database namespace: @@ -132,9 +136,9 @@ As you can see, whenever you have a clustered ElastiCache, you need to pass the ## Important notes - Perform user acceptance testing of the migration before using it in production. -- Once the migration is complete, ensure that application traffic gets successfully redirected to the Redis Enterprise endpoint. +- Once the migration is complete, ensure that application traffic gets successfully redirected to the Redis Cloud endpoint. - Perform the migration process during a period of low traffic to minimize the chance of data loss. ## Conclusion -If you’re looking for a simple and easy-to-use live migration tool that can help you move data from Amazon ElastiCache to Redis Enterprise Cloud with no downtime, RIOT is a promising option. +If you’re looking for a simple and easy-to-use live migration tool that can help you move data from Amazon ElastiCache to Redis Cloud with no downtime, RIOT is a promising option. diff --git a/docs/create/aws/index-aws.mdx b/docs/create/aws/index-aws.mdx index 29be073300d..e8c5bfd0581 100644 --- a/docs/create/aws/index-aws.mdx +++ b/docs/create/aws/index-aws.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /create/aws --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provide you with the available options to run apps on AWS using Redis: @@ -19,8 +19,8 @@ The following links provide you with the available options to run apps on AWS us
@@ -48,7 +48,7 @@ The following links provide you with the available options to run apps on AWS us page="/create/aws/import/database-migration-aws-elasticache-redis-enterprise-cloud" /> - +
+ +Redis Cloud on AWS is fully managed Redis as a service. Designed for modern distributed applications, Redis Cloud on AWS is known for its high performance, infinite scalability and true high availability. + +Follow the below steps to setup Redis Cloud hosted over AWS Cloud: ### Step 1. Create free cloud account -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. +Create your free Redis Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! +:::tip +For a limited time, use **TIGER200** to get **$200** credits on Redis Cloud and try all the advanced capabilities! :tada: [Click here to sign up](https://redis.com/try-free) @@ -25,7 +29,7 @@ For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise ### Step 2. Create Your subscription -Next, you will have to create Redis Enterprise Cloud subscription. In the Redis Enterprise Cloud menu, click "Create your Subscription". +Next, you will have to create Redis Cloud subscription. In the Redis Cloud menu, click "Create your Subscription". ![My Image](images/create_subscription.png) @@ -63,11 +67,6 @@ Click "Activate" and wait for few seconds till it gets activated. Once fully act ![My Image](images/launch_database.png) -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) - ##
@@ -76,13 +75,11 @@ Click "Activate" and wait for few seconds till it gets activated. Once fully act target="_blank" rel="noopener" className="link"> - Redis Launchpad -
diff --git a/docs/create/aws/slackbot/index-slackbot.mdx b/docs/create/aws/slackbot/index-slackbot.mdx index 665a74e7544..2133b8affa0 100644 --- a/docs/create/aws/slackbot/index-slackbot.mdx +++ b/docs/create/aws/slackbot/index-slackbot.mdx @@ -1,11 +1,15 @@ --- id: index-slackbot -title: How to Build a Slack Bot to Retrieve Lost Files Using AWS S3 and RediSearch -sidebar_label: Building a Slack Bot using AWS S3 and RediSearch from scratch +title: How to Build a Slack Bot to Retrieve Lost Files Using AWS S3 and Redis Search and Query Engine +sidebar_label: Building a Slack Bot using AWS S3 and Redis Search and Query Engine slug: /create/aws/slackbot authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + ![alt_text](images/image7.png) If you work remotely then you’re likely to have come across Slack at some point. And if you use Slack on a daily basis, then you’ll be all too aware of how easy it can be to lose files. Being pinged every day by different employees across different channels makes it difficult to keep track of files. @@ -36,13 +40,13 @@ Ready to get started? OK, let’s dive straight in. ### Step 2. What will you need? -- [Slack:](https://slack.com/intl/en-gb/): used as an instant messaging app that connects employees with one another. +- [Slack](https://slack.com/intl/en-gb/): used as an instant messaging app that connects employees with one another. - [Slack Block Kit](https://api.slack.com/block-kit): used as a UI framework for Slack apps that offers a balance of control and flexibility when building experiences. -- [Python:](https://www.python.org/): the preferred programming language to connect Redis in the application. -- [RediSearch](https://redis.com/modules/redis-search/): Provides querying, secondary indexing, and full-text search for Redis. -- [S3 bucket](https://aws.amazon.com/es/s3/): Used as a public cloud storage resource in Amazon Web Services (AWS). -- [AWS Textract](https://aws.amazon.com/es/textract/): Used as a machine learning service that automatically extracts text. -- [Nodejs](https://nodejs.org/en/): Responsible for image generation. +- [Python](https://www.python.org/): the preferred programming language to connect Redis in the application. +- [Redis Stack](https://redis.io/docs/stack/): includes a built-in Search and Query feature that provides querying, secondary indexing and full-text search. +- [S3 bucket](https://aws.amazon.com/es/s3/): used as a public cloud storage resource in Amazon Web Services (AWS). +- [AWS Textract](https://aws.amazon.com/es/textract/): used as a machine learning service that automatically extracts text. +- [Nodejs](https://nodejs.org/en/): responsible for image generation. ### Step 3. Architecture @@ -53,36 +57,36 @@ Let’s look at each of the components that creates the Reeko-Slack bot: #### 1. file_shared - When a new file is shared in any public slack channel the file_share event is sent to the Slack Bot app. -- The file name is added as a suggestion using the [FT.SUGADD](https://oss.redis.com/redisearch/Commands/) command in RediSearch. -- All file data is added using the [JSON.SET](https://oss.redis.com/redisjson/commands/#jsonset) command. +- The file name is added as a suggestion using the [`FT.SUGADD`](https://redis.io/commands/?group=search) command in Redis. +- All file data is added using the [`JSON.SET`](https://redis.io/commands/?group=json) command. - The file is then stored on the S3 bucket as an object with the key as the filename. #### 2. 
S3-get -- The [JSON.GET ](https://oss.redis.com/redisjson/commands/#jsonget)command checks whether the desired file exists. +- The [`JSON.GET`](https://redis.io/commands/?group=json) command checks whether the desired file exists. - The file will then be retrieved from the S3 bucket if found. #### 3. S3-search -- The [FT.SEARCH ](https://oss.redis.com/redisearch/Commands/#ftsearch)command uses RediSearch to look for documents in the S3 bucket- Users are presented will be prompted with different file name suggestions based on what they’ve typed in the search bar. +- The [`FT.SEARCH`](https://redis.io/commands/?group=search) command uses the Redis Search and Query engine to look for documents in the S3 bucket- Users are presented will be prompted with different file name suggestions based on what they’ve typed in the search bar. - Once the user chooses one of the file suggestions, it is then downloaded and sent back to Slack. #### 4. S3-delete -- User types the file name from the command["text'] parameter -- The file data is deleted from RedisJson using the [JSON.DEL](https://oss.redis.com/redisjson/commands/#jsondel) command and is also removed from RediSearch's suggestions using the FT.SUGDEL command. +- User types the file name from the command['text'] parameter +- The file data is deleted from Redis using the [`JSON.DEL`](https://redis.io/commands/?group=json) command and is also removed from Redis's suggestions using the `FT.SUGDEL` command. #### 5. Summarise-document - The file name is identified from the command['text'] parameter. -- It is then retrieved from the S3 bucket through the[ JSON.GET](https://oss.redis.com/redisjson/commands/#jsonget) command. +- It is then retrieved from the S3 bucket through the [JSON.GET](https://redis.io/commands/?group=json) command. - Users can either download the pdf or png file locally from the S3 bucket. -- The text is extracted using [AWS Textract ](https://aws.amazon.com/textract/). -- The extracted text is then summarised using Hugging face transformers summarization pipeline. The text summary is also added back to the JSON document using[ JSON.SET](https://oss.redis.com/redisjson/commands/#jsonset) command. +- The text is extracted using [AWS Textract](https://aws.amazon.com/textract/). +- The extracted text is then summarised using Hugging face transformers summarization pipeline. The text summary is also added back to the `JSON` document using [`JSON.SET`](https://redis.io/commands/?group=json) command. - A post request is then sent to the /create-image on the NodeJS backend with the file name and summary text. - An image is generated using a base template. - The image that is returned is saved to the S3 bucket and sent back to Slack. -- The image URL is also added to the JSON document using [JSON.SET ](https://oss.redis.com/redisjson/commands/#jsonset)command. +- The image URL is also added to the `JSON` document using [`JSON.SET`](https://redis.io/commands/?group=json) command. ### What is the S3 bucket? @@ -103,7 +107,7 @@ The [S3 bucket](https://aws.amazon.com/s3/) is a simple storage service from Ama This simple container image bundles together the latest stable releases of Redis and select Redis modules from Redis Labs. This image is based on the official image of Redis from Docker. By default, the container starts with Redis' default configuration and all included modules loaded. ```bash - docker run -d -p 6379:6379 redislabs/redismod + docker run -d -p 6379:6379 redis/redis-stack ``` ### 2. 
Setup a Python environment @@ -193,7 +197,7 @@ SLACK_SIGNING_SECRET=your-signing-secret ![alt_text](images/image8.png) -1. Make sure you have followed the steps in [Cloning the repo t](https://github.com/redis-developer/Reeko-Slack-Bot/tree/master/python-backend#Cloning-the-repo)o start the bolt app. The HTTP server is using a built-in development adapter, which is responsible for handling and parsing incoming events from Slack on port 3000. +1. Make sure you have followed the steps in [Cloning the repo](https://github.com/redis-developer/Reeko-Slack-Bot/tree/master/python-backend#Cloning-the-repo) to start the bolt app. The HTTP server is using a built-in development adapter, which is responsible for handling and parsing incoming events from Slack on port 3000. ``` python3 app.py @@ -201,7 +205,7 @@ python3 app.py ![alt_text](images/image6.png) -Open a new terminal and ensure that you've installed [ngrok. Make sure to](https://github.com/redis-developer/Reeko-Slack-Bot/tree/master/python-backend#ngrok) tell ngrok to use port 3000 (which Bolt for Python uses by default): +Open a new terminal and ensure that you've installed [ngrok](https://github.com/redis-developer/Reeko-Slack-Bot/tree/master/python-backend#ngrok). Make sure to tell ngrok to use port 3000 (which Bolt for Python uses by default): ``` ngrok http 3000 @@ -212,16 +216,15 @@ ngrok http 3000 For local slack development, we'll use your ngrok URL from above, so copy it to your clipboard. ``` -For example: https://your-own-url.ngrok.io (copy to clipboard) - +https://your-own-url.ngrok.io ``` 1. Now we’re going to subscribe to events. Your app can listen to all sorts of events that are happening around your workspace - messages being posted, files being shared and more. On your app configuration page, select the _Event Subscriptions_ sidebar. You'll be presented with an input box to enter a Request URL, which is where Slack sends the events your app is subscribed to. Hit the _save_ button. -By default Bolt for Python listens for all incoming requests at the /slack/events route, so for the Request URL you can enter your ngrok URL appended with /slack/events. +By default Bolt for Python listens for all incoming requests at the /slack/events route, so for the Request URL you can enter your ngrok URL appended with /slack/events: ``` -Request URL: https://your-own-url.ngrok.io/slack/events +https://your-own-url.ngrok.io/slack/events ``` If the challenge was successful, you’ll get “verified” right next to the Request URL. @@ -241,18 +244,18 @@ Add the following scopes ![alt_text](images/image26.png) -1. Select the _Interactivity & Shortcuts_ sidebar and toggle the switch as on. Again, for the Request URL, enter your ngrok URL appended with /slack/events. +1. Select the _Interactivity & Shortcuts_ sidebar and toggle the switch as on. Again, for the Request URL, enter your ngrok URL appended with /slack/events: ``` -Request URL: https://your-own-url.ngrok.io/slack/events +https://your-own-url.ngrok.io/slack/events ``` ![alt_text](images/image16.png) -1. Scroll down to the _Select Menus_ section in the Options Load URL and enter your ngork URL appended with /slack/events. +1. 
Scroll down to the _Select Menus_ section in the Options Load URL and enter your ngork URL appended with /slack/events: ``` -Options Load URL: https://your-own-url.ngrok.io/slack/events +https://your-own-url.ngrok.io/slack/events ``` ![alt_text](images/image32.png) @@ -303,7 +306,11 @@ You can find the suitable release from [http://www.graphicsmagick.org/download.h #### Nodejs -Note: Please follow all the steps in [python-backend/README.md](https://github.com/redis-developer/Reeko-Slack-Bot/blob/master/python-backend/README.md) first. +:::note + +Please follow all the steps in [python-backend/README.md](https://github.com/redis-developer/Reeko-Slack-Bot/blob/master/python-backend/README.md) first. + +::: Copy the AWS credentials from the [python-backend/.env](https://github.com/redis-developer/Reeko-Slack-Bot/blob/master/python-backend/README.md) to the [config.json](https://github.com/redis-developer/Reeko-Slack-Bot/blob/master/nodejs-backend/src/config/config.json) file. @@ -319,7 +326,7 @@ Install all the packages and run the server. ``` npm install -npm run start +npm start ``` ![alt_text](images/image18.png) @@ -344,7 +351,7 @@ JSON.GET amazonshareholderletterpdf This command involves deleting files from the S3 bucket. To achieve this you simply need to type in the file name in the search bar and Reeko will pull up the file as demonstrated below. -You’ll have the option to permanently delete the file from the S3 bucket. The file data is deleted from RedisJson using the JSON.DEL command and is removed from RediSearch's suggestions using the `FT.SUGDEL` command. You’ll be informed when the file is deleted. +You’ll have the option to permanently delete the file from the S3 bucket. The file data is deleted from Redis using the JSON.DEL command and is removed from Search suggestions using the `FT.SUGDEL` command. You’ll be informed when the file is deleted. ``` FT.SUGDEL file-index "amazon-shareholder-letter.pdf" @@ -354,7 +361,7 @@ JSON.DEL amazonshareholderletterpdf #### Step 8: File searching -Have you ever searched for a file without being entirely sure what it is you’re looking for? You may remember snippets of the content but not enough to manually track down its location. Well due to RediSearch’s autocomplete functionality this will no longer be a problem. +Have you ever searched for a file without being entirely sure what it is you’re looking for? You may remember snippets of the content but not enough to manually track down its location. Well due to Search’s autocomplete functionality this will no longer be a problem. #### /s3-search @@ -375,7 +382,6 @@ In this step, Reeko will extract all of the text from the documents and summariz ``` JSON.GET amazonshareholderletterpdf - ``` 3. Download the pdf or png file locally from S3 bucket @@ -399,7 +405,7 @@ Below we’ve used the [Amazon 2020 shareholder letter](https://s2.q4cdn.com/299 #### 5. How it works -The Slack app is built using Bolt for Python framework. To connect the AWS S3 bucket and AWS Textract, use their respective [boto3 ](https://github.com/boto/boto3)clients. +The Slack app is built using Bolt for Python framework. To connect the AWS S3 bucket and AWS Textract, use their respective [boto3](https://github.com/boto/boto3) clients. Slack is receptive to all events around your workspace such as messages being posted, files being shared, users joining the team, and more. To listen to events, Slack uses the Events API. And to enable custom interactivity, you can use the Block Kit. 
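To make this flow concrete, here is a minimal sketch of a Bolt for Python listener for the `file_shared` event described in the architecture section. The environment variable names and handler body are illustrative placeholders — in Reeko this is where the file name would be added to the suggestion index and the metadata written to Redis before the file is copied to S3 — see the repository for the full implementation.

```python
import os
from slack_bolt import App

# Tokens come from the .env values configured earlier (placeholder names).
app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)

@app.event("file_shared")
def handle_file_shared(event, client, logger):
    # Look up the file's metadata from Slack; this is the point where the
    # app would add the name to the search suggestions and store the details in Redis.
    info = client.files_info(file=event["file_id"])
    logger.info("New file shared: %s", info["file"]["name"])

if __name__ == "__main__":
    # Bolt's built-in development server; it listens on /slack/events by default,
    # which is the route the ngrok setup above forwards Slack requests to.
    app.start(port=3000)
```

Slash commands are registered analogously with `@app.command()`.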
@@ -407,10 +413,10 @@ Slash commands work in the following way. First they consider the text you enter In the application there are two Redis Modules: -- [RedisJSON](https://oss.redislabs.com/redisjson/) - store file information like filename, summary and image url. -- [RediSearch](https://oss.redislabs.com/redisearch/) - searches for files in the S3 bucket +- [Redis JSON](https://redis.io/docs/stack/json/) - store file information like filename, summary and image url. +- [Redis Search and Query](https://redis.io/docs/stack/search/) - searches for files in the S3 bucket -The code below is used to initialize RediSearch in redisearch_connector.py. This is done by creating an index with the name `file_index.` +The code below is used to initialize Redis in redisearch_connector.py. This is done by creating an index with the name `file_index.` ``` from redisearch import Client, TextField, AutoCompleter, Suggestion @@ -422,7 +428,7 @@ class RedisSearchConnector(): self.ac = AutoCompleter(self.index_name) ``` -Use the following code to initialise RedisJSON in redisjson_connector.py. +Use the following code to initialise Redis JSON in redisjson_connector.py. ``` from rejson import Client, Path @@ -432,7 +438,7 @@ class RedisJsonConnector(): self.rj = Client(decode_responses=True) ``` -And the code below is used to create an index on RediSearch +And the code below is used to create an index in Redis ``` FT.CREATE file-index ON HASH SCHEMA file_name TEXT SORTABLE file_id TEXT created TEXT timestamp TEXT mimetype TEXT filetype TEXT user_id TEXT size @@ -440,7 +446,7 @@ FT.CREATE file-index ON HASH SCHEMA file_name TEXT SORTABLE file_id TEXT created ### Conclusion: preventing lost files with Redis -The advanced capabilities of Redis allowed this Launchpad App to create an invaluable asset to remote workers - to never lose a file again on Slack. RediSearch offered a simple yet effective way of transmitting data to and from the S3 bucket with no lags, no pauses and no delays whatsoever. You can discover more about the ins and outs of how this app was made by simply [clicking here](https://launchpad.redis.com/?id=project%3AReeko-Slack-Bot). +The advanced capabilities of Redis allowed this Launchpad App to create an invaluable asset to remote workers - to never lose a file again on Slack. Redis Stack offered a simple yet effective way of transmitting data to and from the S3 bucket with no lags, no pauses and no delays whatsoever. You can discover more about the ins and outs of how this app was made by simply [clicking here](https://launchpad.redis.com/?id=project%3AReeko-Slack-Bot). Reeko is an innovative application that joins our _exciting_ collection of apps that we currently have on the [Redis Launchpad](https://launchpad.redis.com/). By using Redis, programmers from all over the world are creating breakthrough applications that are having an impact on daily lives… _and you can too_. 
@@ -459,5 +465,3 @@ To discover more about his work and his activity on GitHub, you can [check out h ### References - [Create Redis database on AWS](/create/aws/redis-on-aws) -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) diff --git a/docs/create/aws/terraform/index-terraform.mdx b/docs/create/aws/terraform/index-terraform.mdx index ceab161e0e4..fea67d60e37 100644 --- a/docs/create/aws/terraform/index-terraform.mdx +++ b/docs/create/aws/terraform/index-terraform.mdx @@ -6,6 +6,10 @@ slug: /create/aws/terraform authors: [ajeet, rahul] --- +import Authors from '@theme/Authors'; + + + ![terraform](terraform_arch.png) Development teams today are embracing more and more DevOps principles, such as continuous integration and continuous delivery (CI/CD). Therefore, the need to manage infrastructure-as-code (IaC) has become an essential capability for any cloud service. IaC tools allow you to manage infrastructure with configuration files rather than through a graphical user interface. IaC allows you to build, change, and manage your infrastructure in a safe, consistent, and repeatable way by defining resource configurations that you can version, reuse, and share. @@ -28,10 +32,10 @@ Terraform is an open source IaC software tool that provides a consistent CLI wor - It is not intended to give low-level programmatic access to providers, but instead provides a high-level syntax for describing how cloud resources and services should be created, provisioned, and combined. - It provides a simple, unified syntax, allowing almost any resource to be managed without learning new tooling. -### The HashiCorp Terraform Redis Enterprise Cloud provider +### The HashiCorp Terraform Redis Cloud provider -Redis has developed a Terraform provider for Redis Enterprise Cloud. The HashiCorp Terraform Redis Enterprise Cloud provider allows customers to deploy and manage Redis Enterprise Cloud subscriptions, databases, and network peering easily as code, on any cloud provider. It is a plugin for Terraform that allows Redis Enterprise Cloud Flexible customers to manage the full life cycle of their subscriptions and related Redis databases. -The Redis Enterprise Cloud provider is used to interact with the resources supported by Redis Enterprise Cloud. The provider needs to be configured with the proper credentials before it can be used. Use the navigation to the left to read about the available provider resources and data sources. +Redis has developed a Terraform provider for Redis Cloud. The HashiCorp Terraform Redis Cloud provider allows customers to deploy and manage Redis Cloud subscriptions, databases, and network peering easily as code, on any cloud provider. It is a plugin for Terraform that allows Redis Cloud Flexible customers to manage the full life cycle of their subscriptions and related Redis databases. +The Redis Cloud provider is used to interact with the resources supported by Redis Cloud. The provider needs to be configured with the proper credentials before it can be used. Use the navigation to the left to read about the available provider resources and data sources. ![rediscloud](terraform_rediscloud.png) @@ -104,11 +108,15 @@ Data sources allow Terraform to use information defined outside of Terraform, de A data block requests that Terraform read from a given data source ("rediscloud_payment_method") and export the result under the given local name ("card"). 
The name is used to refer to this resource from elsewhere in the same Terraform module, but has no significance outside of the scope of a module. Within the block body (between { and }) are query constraints defined by the data source. Most arguments in this section depend on the data source, and indeed in this example card_type and last_four_numbers are all arguments defined specifically for the rediscloud_payment_method data source. -### Configure Redis Enterprise Cloud programmatic access +### Configure Redis Cloud programmatic access + +In order to set up authentication with the Redis Cloud provider, a programmatic API key must be generated for Redis Cloud. The Redis Cloud documentation contains the most up-to-date instructions for creating and managing your key(s) and IP access. + +:::note -In order to set up authentication with the Redis Enterprise Cloud provider, a programmatic API key must be generated for Redis Enterprise Cloud. The Redis Enterprise Cloud documentation contains the most up-to-date instructions for creating and managing your key(s) and IP access. +Flexible and Annual Redis Cloud subscriptions can leverage a RESTful API that permits operations against a variety of resources, including servers, services, and related infrastructure. The REST API is not supported for Fixed or Free subscriptions. -Please note that Flexible and Annual Redis Enterprise Cloud subscriptions can leverage a RESTful API that permits operations against a variety of resources, including servers, services, and related infrastructure. The REST API is not supported for Fixed or Free subscriptions. +::: ``` provider "rediscloud" { } # Example resource configuration @@ -118,7 +126,7 @@ Please note that Flexible and Annual Redis Enterprise Cloud subscriptions can le ### Prerequisites: - Install Terraform on MacOS. -- Create a free Redis Enterprise Cloud account. +- Create a free Redis Cloud account. - Create your first subscription. - Enable API @@ -130,15 +138,15 @@ Use Homebrew to install Terraform on MacOS as shown below: brew install terraform ``` -### Step 2: Sign up for a free Redis Enterprise Cloud account +### Step 2: Sign up for a free Redis Cloud account -[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for free Redis Enterprise Cloud account. +[Follow this tutorial](https://developer.redis.com/create/aws/redis-on-aws) to sign up for free Redis Cloud account. ![Redis Cloud](tryfree.png) -### Step 3: Enable Redis Enterprise Cloud API +### Step 3: Enable Redis Cloud API -If you have a Flexible (or Annual) Redis Enterprise Cloud subscription, you can use a REST API to manage your subscription programmatically. The Redis Cloud REST API is available only to Flexible or Annual subscriptions. It is not supported for Fixed or Free subscriptions. +If you have a Flexible (or Annual) Redis Cloud subscription, you can use a REST API to manage your subscription programmatically. The Redis Cloud REST API is available only to Flexible or Annual subscriptions. It is not supported for Fixed or Free subscriptions. For security reasons, the Redis Cloud API is disabled by default. To enable the API: Sign in to your Redis Cloud subscription as an account owner. @@ -388,7 +396,7 @@ Apply complete! Resources: 3 added, 0 changed, 0 destroyed. 
You can now verify the new database created under Subscription named “db-json.” -Deploy a Redis Database with RedisJSON modules on AWS using Terraform: +Deploy a Redis Database with Redis JSON modules on AWS using Terraform: ``` terraform { @@ -544,7 +552,5 @@ Destroy complete! Resources: 3 destroyed. ### Further References: -- [Provision and Manage Redis Enterprise Cloud Anywhere with HashiCorp Terraform](https://redis.com/blog/provision-manage-redis-enterprise-cloud-hashicorp-terraform/) -- [The HashiCorp Terraform Redis Enterprise Cloud provider](https://registry.terraform.io/providers/RedisLabs/rediscloud/latest) -- [Azure Cache for Redis Enterprise using Terraform](https://developer.redis.com/create/azure/terraform-simple) -- [Azure Cache for Redis Enterprise using Terraform with Private Link](https://developer.redis.com/create/azure/terraform-private-endpoint) +- [Provision and Manage Redis Cloud Anywhere with HashiCorp Terraform](https://redis.com/blog/provision-manage-redis-enterprise-cloud-hashicorp-terraform/) +- [The HashiCorp Terraform Redis Cloud provider](https://registry.terraform.io/providers/RedisLabs/rediscloud/latest) diff --git a/docs/create/azure/index-azure.mdx b/docs/create/azure/index-azure.mdx index 82cba5e4a8d..fb85376ecf8 100644 --- a/docs/create/azure/index-azure.mdx +++ b/docs/create/azure/index-azure.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /create/azure --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provides you with the available options for create instances of Azure Cache for Redis @@ -17,21 +17,7 @@ The following links provides you with the available options for create instances page="/create/azure/portal" />
-
- -
-
- -
diff --git a/docs/create/azure/portal/index-azure-portal.mdx b/docs/create/azure/portal/index-azure-portal.mdx index 44f7ea72153..953f6024e7f 100644 --- a/docs/create/azure/portal/index-azure-portal.mdx +++ b/docs/create/azure/portal/index-azure-portal.mdx @@ -9,7 +9,7 @@ Redis is an open source, in-memory, key-value data store most commonly used as a The Azure cloud platform has more than 200+ products and cloud services designed to help you bring new solutions to life-to solve today's challenges and create the future. Azure services help you to build, run, and manage applications across multiple clouds, on-premises, and at the edge, with the tools and frameworks of your choice. -Azure Cache for Redis is a native fully-managed service on Microsoft Azure. Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis (Redis Enterprise) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and accessible to any application within or outside of Microsoft Azure. +Azure Cache for Redis is a native fully-managed service on Microsoft Azure. Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis (Redis Cloud) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and accessible to any application within or outside of Microsoft Azure. Azure Cache for Redis dashboard uses Azure Monitor to provide several options for monitoring your cache instances.[Learn more](https://docs.microsoft.com/en-us/azure/azure-monitor/insights/redis-cache-insights-overview) Use Azure Monitor to: @@ -22,7 +22,7 @@ Use Azure Monitor to: ### Step 1. Getting Started -Search for "azure redis cache " in the search dashboard and launch [Azure Cache for Redis Enterprise](https://portal.azure.com) +Search for "azure redis cache " in the search dashboard and launch [Azure Cache for Redis Cloud](https://portal.azure.com) ![RedisLabs Azure Page](azure7.png) @@ -43,7 +43,11 @@ sudo redis-cli -h demos.redis.cache.windows.net -p 6379 demos.redis.cache.windows.net:6379> ``` -Please note that you can have multiple clients connected to a Redis database at the same time. The above Redis client command might require a password if you have setup authentication in your Redis configuration file. You can insert data to Redis using the `SET` command and then fetch it back with the `GET` command. You can also run the Redis `INFO` command to get the statistics about the health of the Redis server (for example, memory usage, Redis server load etc). +:::tip + +You can have multiple clients connected to a Redis database at the same time. The above Redis client command might require a password if you have setup authentication in your Redis configuration file. You can insert data to Redis using the `SET` command and then fetch it back with the `GET` command. You can also run the Redis `INFO` command to get the statistics about the health of the Redis server (for example, memory usage, Redis server load etc). 
+ +::: ### Resources @@ -60,7 +64,6 @@ Please note that you can have multiple clients connected to a Redis database at ### Next Steps -- [Getting Started with .Net and Redis](/develop/dotnet/) - [Best Practices for Azure Cache for Redis](https://docs.microsoft.com/en-in/azure/azure-cache-for-redis/cache-best-practices) - [Quickstart: Use Azure Cache for Redis in .NET Framework](https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/cache-dotnet-how-to-use-azure-redis-cache) @@ -72,13 +75,11 @@ Please note that you can have multiple clients connected to a Redis database at target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/create/azure/terraform-private-endpoint/index-terraform-private-endpoint.mdx b/docs/create/azure/terraform-private-endpoint/index-terraform-private-endpoint.mdx deleted file mode 100644 index cd2b5553c6f..00000000000 --- a/docs/create/azure/terraform-private-endpoint/index-terraform-private-endpoint.mdx +++ /dev/null @@ -1,125 +0,0 @@ ---- -id: index-azure-terraform-private-endpoint -title: Azure Cache for Redis Enterprise using Terraform with Private Link -sidebar_label: Azure Cache for Redis Enterprise using Terraform with Private Link -slug: /create/azure/terraform-private-endpoint ---- - -Azure Private Link for Azure Cache for Redis provides private connectivity from a virtual network to your cache instance. This means that you can now use Azure Private Link to connect to an Azure Cache for Redis instance from your virtual network via a private endpoint, which is assigned a private IP address in a subnet within the virtual network.It simplifies the network architecture and secures the connection between endpoints in Azure by eliminating data exposure to the public internet. Private Link carries traffic privately, reducing your exposure to threats and helps you meet compliance standards. - -Azure Resource Manager(a.k.a AzureRM) is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You can use management features, like access control, locks, and tags, to secure and organize your resources after deployment. The "azurerm_redis_enterprise_cluster" is a resource that manages a Redis Enterprise cluster. This is a template to get started with the 'azurerm_redis_enterprise_cluster' resource available in the 'azurerm' provider with Terraform. - -### Prerequisite - -1. [Terraform](https://terraform.io) -2. [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli) - -### Step 1. Getting Started - -Login in Azure using the Azure CLI - -```bash -az login -``` - -> Login with a Service Principal will also work - -Login using an Azure Service Principal - -```bash -az login --service-principal --username APP_ID --tenant TENANT_ID --password [password || /path/to/cert] -``` - -### Step 2: Clone the repository - -```bash -git clone https://github.com/redis-developer/acre-terraform -``` - -### Step 3: Initialize the repository - -```bash -cd acre-terraform -terraform init -``` - -> The output should include: `Terraform has been successfully initialized` - -### Step 4: Modify the variables(optional) - -The default variables are setup to deploy the smallest 'E10' instance into the 'East US' region. -Changes can be made by updating the `variables.tf` file. - -### Step 5: Verify the plan - -The 'plan' output will show you everything being created by the template. 
- -```bash -terraform plan -``` - -> The output should include: `Plan: 18 to add, 0 to change, 0 to destroy.` - -### Step 6: Apply the plan - -When the plan looks good, 'apply' the template. - -```bash -terraform apply -``` - -> The output should include: `Apply complete! Resources: 18 added, 0 changed, 0 destroyed.` - -### Step 7: Connect using generated output - -The access key is sensitive, so viewing the outputs must be requested specifically. -The output is also in JSON format. - -```bash -terraform output redisgeek_config -``` - -> Example output: - -``` -{ -"hostname" = "redisgeek-8jy4.eastus.redisenterprise.cache.azure.net" -"access_key" = "DQYABC3uRMXXXXXXXXXXXXXXXXTRkfgOXXXPjs82Y=" -"port" = "10000" -} -``` - -### Resources - -##### 1. How to use Redis Cache for Redis like a Pro - -
- -
- -##### 2. Do More with Azure Cache for Redis, Enterprise Tiers - -
- -
- -### References - -- [Azure Cache for Redis Enterprise](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redislabs_cloudpartner_cta1) -- [Accelerate Modern Application Delivery with Redis Enterprise on Microsoft Azure](https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RWGGx3) -- [.Net and Redis](/develop/dotnet/) -- [Quickstart: Create a Redis Enterprise cache](https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/quickstart-create-redis-enterprise) diff --git a/docs/create/azure/terraform-simple/index-azure-terraform-simple.mdx b/docs/create/azure/terraform-simple/index-azure-terraform-simple.mdx deleted file mode 100644 index 92552e47813..00000000000 --- a/docs/create/azure/terraform-simple/index-azure-terraform-simple.mdx +++ /dev/null @@ -1,135 +0,0 @@ ---- -id: index-azure-terraform-simple -title: Azure Cache for Redis Enterprise using Terraform -sidebar_label: Redis Enterprise with Terraform -slug: /create/azure/terraform-simple ---- - -The Enterprise Tiers of Azure Cache for Redis is generally available as a native fully managed service on Microsoft Azure. This offering combines Azure’s global presence, flexibility, security, and compliance with Redis Enterprise’s unmatched availability, performance, and extended data structure functionality to create the best experience for enterprises. Enterprise features include: - -- [Open source Redis 6.0](https://redis.com/blog/diving-into-redis-6/) -- [Zone redundancy, with up to 99.99% availability](https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/cache-high-availability#zone-redundancy) -- [Active geo-replication, with up to 99.999% availability](https://redis.com/redis-enterprise/technology/active-active-geo-distribution/) - Preview -- [Redis on Flash (RoF)](https://redis.com/redis-enterprise/technology/redis-on-flash/) -- [Disk persistence with recovery](https://redis.com/redis-enterprise/technology/durable-redis/) - Preview -- Redis Enterprise modules: - - [RediSearch 2.0](https://redis.com/blog/redisearch-2-build-modern-applications-interactive-search/) - - [RedisTimeSeries](https://redis.com/modules/redis-timeseries/) - - [RedisBloom](https://redis.com/modules/redis-bloom/) -- Scaling - - Datasets up to 13TB - - Up to 2M concurrent client connections - - Over 1M ops/second -- Security - - [Private Link support](https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/cache-private-link) - - TLS connectivity -- Integrated billing and the ability to apply Azure-commitment spend - -Azure Resource Manager(a.k.a AzureRM) is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You use management features, like access control, locks, and tags, to secure and organize your resources after deployment. - -The "azurerm_redis_enterprise_cluster" is a resource that manages a Redis Enterprise cluster. This is a template to get started with the 'azurerm_redis_enterprise_cluster' resource available in the 'azurerm' provider with Terraform. - -### Prerequisite - -1. [Terraform CLI](https://terraform.io) -2. [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli) - -### Step 1. 
Getting Started - -Login in Azure using the Azure CLI - -```bash -az login -``` - -### Step 2: Clone the repository - -```bash -git clone https://github.com/redis-developer/acre-terraform-simple -``` - -### Step 3: Initialize the repository - -```bash -cd acre-terraform-simple -terraform init -``` - -> The output should include: `Terraform has been successfully initialized` - -### Step 4: Modify the variables(optional) - -The default variables are setup to deploy the smallest 'E10' instance into the 'East US' region. -Changes can be made by updating the `variables.tf` file. - -### Step 5: Verify the plan - -The 'plan' output will show you everything being created by the template. - -```bash -terraform plan -``` - -> The plan step does not make any changes in Azure - -### Step 6: Apply the plan - -When the plan looks good, 'apply' the template. - -```bash -terraform apply -``` - -### Step 7: Connect using generated output - -The access key is sensitive, so viewing the outputs must be requested specifically. -The output is also in JSON format. - -```bash -terraform output redisgeek_config -``` - -> Example output: - -``` -{ -"hostname" = "redisgeek-8jy4.eastus.redisenterprise.cache.azure.net" -"access_key" = "DQYABC3uRMyDguEXXXXXXXXXXWTRkfgOPjs82Y=" -"port" = "10000" -} -``` - -### Resources - -##### How to use Redis Cache for Redis like a Pro - -
- -
- -##### Do More with Azure Cache for Redis, Enterprise Tiers - -
- -
- -### References - -- [Azure Cache for Redis Enterprise](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redislabs_cloudpartner_cta1) -- [Accelerate Modern Application Delivery with Redis Enterprise on Microsoft Azure](https://query.prod.cms.rt.microsoft.com/cms/api/am/binary/RWGGx3) -- [.Net and Redis](/develop/dotnet/) -- [Quickstart: Create a Redis Enterprise cache](https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/quickstart-create-redis-enterprise) diff --git a/docs/create/azurefunctions/index-azurefunctions.mdx b/docs/create/azurefunctions/index-azurefunctions.mdx index cf017472810..ae4cca67516 100644 --- a/docs/create/azurefunctions/index-azurefunctions.mdx +++ b/docs/create/azurefunctions/index-azurefunctions.mdx @@ -6,6 +6,10 @@ slug: /create/azurefunctions authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + ![alt_text](images/preview_azurefunction.png) [Azure Functions](https://azure.microsoft.com/en-in/services/functions/) is an event-based, serverless compute platform offered by [Microsoft](https://azure.microsoft.com/en-in/blog/microsoft-named-a-leader-in-forrester-wave-functionasaservice-platforms/) to accelerate and simplify serverless application development. It allows developers to write less code, build and debug locally without additional setup, and deploy and operate at scale in the cloud. @@ -256,7 +260,7 @@ If you connect to the Redis database and run the `MONITOR` command, you should s ### Step 14. Run query using RedisInsight -[Follow this link to set up RedisInsight](https://developer.redis.com/explore/redisinsightv2/getting-started) on your local system and get connected to the Redis database. Once connected, you should be able to run the following queries: +Set up RedisInsight on your local system and get connected to the Redis database. Once connected, you should be able to run the following queries: ![Redis Insight](images/image18.png) @@ -276,5 +280,5 @@ If you connect to the Redis database and run the `MONITOR` command, you should s ### Additional references: - [Introduction to Azure Functions](https://azure.microsoft.com/en-in/services/functions/) -- [Fully Managed Redis Enterprise for Azure](https://redis.com/cloud-partners/microsoft-azure/) -- [Azure Cache for Redis Enterprise & Flash](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redis_cloudpartner_cta1) +- [Fully Managed Redis Cloud for Azure](https://redis.com/cloud-partners/microsoft-azure/) +- [Azure Cache for Redis Cloud & Flash](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redis_cloudpartner_cta1) diff --git a/docs/create/cloud/aws/index-aws.mdx b/docs/create/cloud/aws/index-aws.mdx index 124594e6fbe..b40d43ada4a 100644 --- a/docs/create/cloud/aws/index-aws.mdx +++ b/docs/create/cloud/aws/index-aws.mdx @@ -5,9 +5,9 @@ sidebar_label: AWS slug: /create/cloud/aws --- -Redis Enterprise Cloud on AWS is a fully Managed Redis Enterprise as a service. Designed for modern distributed applications, Redis Enterprise Cloud on AWS is known for its high performance, infinite scalability and true high availability. +Redis Cloud on AWS is fully managed Redis as a service. Designed for modern distributed applications, Redis Cloud on AWS is known for its high performance, infinite scalability and true high availability. 
-Follow the below steps to setup Redis Enterprise Cloud hosted over AWS Cloud: +Follow the below steps to setup Redis Cloud hosted over AWS Cloud: ### Step 1. Getting Started @@ -28,8 +28,3 @@ For the cloud provider, select Amazon AWS and choose Free plan. ### Step 4. Click "Activate" ![AWS Cloud](aws6.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) diff --git a/docs/create/cloud/azure/index-azure.mdx b/docs/create/cloud/azure/index-azure.mdx index 505f9bd035f..ec11e5d6efd 100644 --- a/docs/create/cloud/azure/index-azure.mdx +++ b/docs/create/cloud/azure/index-azure.mdx @@ -5,11 +5,11 @@ sidebar_label: Azure Cache for Redis slug: /create/cloud/azure --- -Azure Cache for Redis is a native fully-managed service on Microsoft Azure. Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis (Redis Enterprise) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and accessible to any application within or outside of Azure. +Azure Cache for Redis is a native fully-managed service on Microsoft Azure. Azure Cache for Redis offers both the Redis open-source (OSS Redis) and a commercial product from Redis (Redis Cloud) as a managed service. It provides secure and dedicated Redis server instances and full Redis API compatibility. The service is operated by Microsoft, hosted on Azure, and accessible to any application within or outside of Azure. ### Step 1. Getting Started -Launch [Azure Cache for Redis Enterprise & Flash](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redislabs_cloudpartner_cta1) +Launch [Azure Cache for Redis Cloud & Flash](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/garantiadata.redis_enterprise_1sp_public_preview?ocid=redisga_redislabs_cloudpartner_cta1) ![RedisLabs Azure Page](azure1.png) diff --git a/docs/create/cloud/gcp/index-gcp.mdx b/docs/create/cloud/gcp/index-gcp.mdx index d6671990d2e..b4ffae7bf79 100644 --- a/docs/create/cloud/gcp/index-gcp.mdx +++ b/docs/create/cloud/gcp/index-gcp.mdx @@ -5,11 +5,11 @@ sidebar_label: Google Cloud slug: /create/cloud/gcp --- -Redis Enterprise Cloud delivers fully managed Redis Enterprise as a Service. It offers all the capabilities of Redis Enterprise while taking care of all the operational aspects associated with operating Redis in the most efficient manner on Google Cloud Platform. Redis Enterprise Cloud is built on a complete serverless concept, so users don’t need to deal with nodes and clusters +Redis Cloud delivers fully managed Redis as a service. It offers all the capabilities of Redis Enterprise while taking care of all the operational aspects associated with operating Redis in the most efficient manner on Google Cloud Platform. Redis Cloud is built on a complete serverless concept, so users don’t need to deal with nodes and clusters ### Step 1. 
Getting Started -Launch [Redis Enterprise Cloud page](https://console.cloud.google.com/apis/library/gcp.redisenterprise.com?pli=1) on Google Cloud Platform +Launch [Redis Cloud page](https://console.cloud.google.com/apis/library/gcp.redisenterprise.com?pli=1) on Google Cloud Platform ![Google Cloud](gcp2.png) @@ -43,5 +43,4 @@ Launch [Redis Enterprise Cloud page](https://console.cloud.google.com/apis/libra ### Next Steps -- [Connecting to the database using RedisInsight](/explore/redisinsight/) - [How to list & search Movies database using Redisearch](/howtos/shoppingcart/) diff --git a/docs/create/cloud/index-cloud.mdx.orig b/docs/create/cloud/index-cloud.mdx.orig deleted file mode 100644 index 206203b9758..00000000000 --- a/docs/create/cloud/index-cloud.mdx.orig +++ /dev/null @@ -1,51 +0,0 @@ ---- -id: index-cloud -title: Create a database using Redis Enterprise Cloud -sidebar_label: Redis Enterprise Cloud -slug: /create/cloud/ ---- - -### Step 1. Create free cloud account -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -### Step 2. Add subscription -Next, you will have to add Redis Enterprise Cloud Essentials subscription. In the Redis Enterprise Cloud menu, click Subscriptions. At the bottom of the page, click the “+” sign. - -![My Image](images/subscription.png) - -### Step 3. Select cloud provider -For the cloud provider, select Amazon AWS - -![My Image](images/aws.png) - -### Step 4. Selection region -For the region where you want to use the subscription, select ap-south-1. Please note that it’s currently available only in the AWS/Mumbai - -![My Image](images/region.png) - -### Step 5. Select free cloud plan -In the Redis Enterprise Cloud service levels, select the Redis Cloud Essentials 30MB/1 Database level - -![My Image](images/plan.png) - -### Step 6. Create database -Click Create. After you create a subscription, you can create a database: - -![My Image](images/createdatabase.png) - -### Step 7. Add database details -Enter a name for the database of your choice - -![My Image](images/choosemodule.png) - -### Step 8. Launch database -Click "Activate" and wait for few seconds till it gets activated. Once fully activated, you will see the database endpoints as shown below: - -![My Image](images/activate.png) - - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) - diff --git a/docs/create/cloud/rediscloud/index-recloud.mdx b/docs/create/cloud/rediscloud/index-recloud.mdx index 29e4d9b3ec6..6039ea35555 100644 --- a/docs/create/cloud/rediscloud/index-recloud.mdx +++ b/docs/create/cloud/rediscloud/index-recloud.mdx @@ -1,15 +1,15 @@ --- id: index-rediscloud -title: Create Database using Redis Enterprise Cloud -sidebar_label: Redis Enterprise Cloud +title: Create Database using Redis Cloud +sidebar_label: Redis Cloud slug: /create/cloud/rediscloud --- -Redis Enterprise Cloud is a fully managed cloud service by Redis. Built for modern distributed applications, Redis Enterprise Cloud enables you to run any query, simple or complex, at sub-millisecond performance at virtually infinite scale without worrying about operational complexity or service availability. 
With modern probabilistic data structures and extensible data models, including Search, JSON, Graph, and Time Series, you can rely on Redis as your data-platform for all your real-time needs. +Redis Cloud is a fully managed cloud service by Redis. Built for modern distributed applications, Redis Cloud enables you to run any query, simple or complex, at sub-millisecond performance at virtually infinite scale without worrying about operational complexity or service availability. With modern probabilistic data structures and extensible data models, including Search, JSON, Graph, and Time Series, you can rely on Redis as your data-platform for all your real-time needs. ### Step 1. Create free cloud account -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. +Create your free Redis Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. ![My Image](tryfree.png) @@ -17,7 +17,7 @@ Create your free -
- -
-
+ Thanks to [Node.js](https://nodejs.dev/) - Millions of frontend developers that write JavaScript for the browser are now able to write the server-side code in addition to the client-side code without the need to learn a completely different language. Node.js is a free, open-sourced, cross-platform JavaScript run-time environment. It is capable to handle thousands of concurrent connections with a single server without introducing the burden of managing thread concurrency, which could be a significant source of bugs. ![Nginx-node](docker_nginx.png) diff --git a/docs/create/docker/redis-on-docker/images/README.md b/docs/create/docker/redis-on-docker/images/README.md deleted file mode 100644 index cd01cea8763..00000000000 --- a/docs/create/docker/redis-on-docker/images/README.md +++ /dev/null @@ -1 +0,0 @@ -# List of Images diff --git a/docs/create/docker/redis-on-docker/images/resoftware-1.png b/docs/create/docker/redis-on-docker/images/resoftware-1.png deleted file mode 100644 index 0ac05bdb94e..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-1.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/images/resoftware-2.png b/docs/create/docker/redis-on-docker/images/resoftware-2.png deleted file mode 100644 index 739e91e0dda..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-2.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/images/resoftware-3.png b/docs/create/docker/redis-on-docker/images/resoftware-3.png deleted file mode 100644 index ca2bd300cfc..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-3.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/images/resoftware-4.png b/docs/create/docker/redis-on-docker/images/resoftware-4.png deleted file mode 100644 index 8d6c4b87cf9..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-4.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/images/resoftware-5.png b/docs/create/docker/redis-on-docker/images/resoftware-5.png deleted file mode 100644 index 00456b3ab51..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-5.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/images/resoftware-7.png b/docs/create/docker/redis-on-docker/images/resoftware-7.png deleted file mode 100644 index 7fc070de3b8..00000000000 Binary files a/docs/create/docker/redis-on-docker/images/resoftware-7.png and /dev/null differ diff --git a/docs/create/docker/redis-on-docker/index-redis-on-docker.mdx b/docs/create/docker/redis-on-docker/index-redis-on-docker.mdx deleted file mode 100644 index 68a4cd0f83f..00000000000 --- a/docs/create/docker/redis-on-docker/index-redis-on-docker.mdx +++ /dev/null @@ -1,181 +0,0 @@ ---- -id: index-redis-on-docker -title: How to Deploy and Run Redis in a Docker container -sidebar_label: Redis on Docker -slug: /create/docker/redis-on-docker -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - - - - - -### Pre-requisite - -Ensure that Docker is installed in your system. - -If you're new, refer https://docs.docker.com/docker-for-mac/install/ to install Docker on Mac. - -To pull and start the Redis Enterprise Software Docker container, run this docker run command in the terminal or command-line for your operating system. 
- -Note: On Windows, make sure Docker is configured to run Linux-based containers. - -``` -docker run -d --cap-add sys_resource --name rp -p 8443:8443 -p 9443:9443 -p 12000:12000 redislabs/redis -``` - -In the web browser on the host machine, go to https://localhost:8443 to see the Redis Enterprise Software web console. - -### Step 1: Click on “Setup” - -Click Setup to start the node configuration steps. - -![My Image](images/resoftware-1.png) - -### Step 2: Enter your preferred FQDN - -In the Node Configuration settings, enter a cluster FQDN such as demo.redis.com. Then click Next button. - -![My Image](images/resoftware-2.png) - -Enter your license key, if you have one. If not, click the Next button to use the trial version. - -### Step 3: Enter the admin credentials - -Enter an email and password for the admin account for the web console. - -![My Image](images/resoftware-4.png) - -These credentials are also used for connections to the REST API. -Click OK to confirm that you are aware of the replacement of the HTTPS SSL/TLS certificate on the node, and proceed through the browser warning. - -### Step 4: Create a Database: - -Select “redis database” and the “single region” deployment, and click Next. - -![My Image](images/resoftware-5.png) - -Enter a database name such as demodb and click Activate to create your database - -![My Image](images/resoftware-7.png) - -You now have a Redis database! - -### Step 5: Connecting using redis-cli - -After you create the Redis database, you are ready to store data in your database. redis-cli is a built-in simple command-line tool to interact with Redis database. Run redis-cli, located in the /opt/redislabs/bin directory, to connect to port 12000 and store and retrieve a key in database1 - -``` -$ docker exec -it rp bash -redislabs@fd8dca50f905:/opt$ - /opt/redislabs/bin/redis-cli -p 12000 -127.0.0.1:12000> auth -OK -127.0.0.1:12000> set key1 123 -OK -127.0.0.1:12000> get key1 -"123" -``` - - - - -### Pre-requisite - -Ensure that Docker is installed in your system. Follow https://docs.docker.com/engine/install/ if you haven’t installed yet. - -You can run Redis Stack using a Docker container. There are two types of Docker images available in Docker Hub. - -- The `redis/redis-stack` Docker image contains both Redis Stack server and RedisInsight. This container is recommended for local development because you can use RedisInsight to visualize your data. - -- The `redis/redis-stack-server` provides Redis Stack but excludes RedisInsight. This container is best for production deployment. - -### Getting started - -To start Redis Stack server using the redis-stack image, run the following command in your terminal: - -```bash - docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest -``` - -You can use `redis-cli` to connect to the server, just as you connect to any Redis instance. -If you don’t have redis-cli installed locally, you can run it from the Docker container: - -```bash - docker exec -it redis-stack redis-cli -``` - -:::info TIP -The `docker run` command above also exposes RedisInsight on port 8001. You can use RedisInsight by pointing your browser to http://localhost:8001. -::: - -To persist your Redis data to a local path, specify -v to configure a local volume. 
This command stores all data in the local directory local-data: - -```bash - docker run -v /local-data/:/data redis/redis-stack:latest -``` - -If you want to expose Redis Stack server or RedisInsight on a different port, update the left hand of portion of the `-p` argument. This command exposes Redis Stack server on port 10001 and RedisInsight on port 13333: - -```bash - docker run -p 10001:6379 -p 13333:8001 redis/redis-stack:latest -``` - -By default, the Redis Stack Docker containers use internal configuration files for Redis. To start Redis with local a configuration file, you can use the -v volume options: - -```bash - docker run -v `pwd`/local-redis-stack.conf:/redis-stack.conf -p 6379:6379 -p 8001:8001 redis/redis-stack:latest -``` - -To pass in arbitrary configuration changes, you can set any of these environment variables: - -- `REDIS_ARGS`: extra arguments for Redis -- `REDISEARCH_ARGS`: arguments for RediSearch -- `REDISJSON_ARGS`: arguments for RedisJSON -- `REDISGRAPH_ARGS`: arguments for RedisGraph -- `REDISTIMESERIES_ARGS`: arguments for RedisTimeSeries -- `REDISBLOOM_ARGS`: arguments for RedisBloom - -For example, here’s how to use the `REDIS_ARGS` environment variable to pass the `requirepass` directive to Redis: - -``` - docker run -e REDIS_ARGS="--requirepass redis-stack" redis/redis-stack:latest -``` - - - - - -### Next Steps - -- [Connect to Redis database using RedisInsight](/explore/redisinsightv2/) -- [Connect to Redis database using Redis datasource for Grafana](/explore/redisdatasource/) - -## - - diff --git a/docs/create/from-source/index-from-source.mdx b/docs/create/from-source/index-from-source.mdx deleted file mode 100644 index 8042b43d42b..00000000000 --- a/docs/create/from-source/index-from-source.mdx +++ /dev/null @@ -1,64 +0,0 @@ ---- -id: index-from-source -title: Create Redis database from Source -sidebar_label: Redis from Source -slug: /create/from-source/ -authors: [ajeet] ---- - -### Step 1: Download, extract and compile Redis - -Redis stands for REmote DIctionary Server. Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis cache delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. - -In order to install Redis from source, first you need to download the latest Redis source code. -The Redis source code is available to download [here](https://download.redis.io/redis-stable.tar.gz). You can verify the integrity of these downloads by checking them against the digests in the [redis-hashes git repository](https://github.com/redis/redis-hashes) - -``` -wget https://download.redis.io/redis-stable.tar.gz -tar xvzf redis-stable.tar.gz -cd redis-stable -make -``` - -It is a good idea to copy both the Redis server and the command line interface into the proper places, either manually using the following commands: - -``` -sudo cp src/redis-server /usr/local/bin/ -sudo cp src/redis-cli /usr/local/bin/ -``` - -Or just using `sudo make install.` - -The binaries that are now compiled are available in the src directory. - -### Step 2: Running Redis Server - -Install the Redis server by running the following command: - -``` -$ redis-server -``` - -Please note that you don't need to restart the Redis service. - -### Step 3: Interacting with Redis Client - -Once the Redis installation has completed, you can use the Redis client to connect to the Redis server. 
-Use the following commands to store and retrieve a string: - -``` -$ src/redis-cli -redis> set foo bar -OK -redis> get foo -"bar" -``` - -`redis.conf` is the Redis configuration file, used to configure the behavior of the Redis Server. For more information on the available configuration options, check out the [documentation on redis.io](https://redis.io/docs/manual/config/). - -### Next Steps - -- [Connect to a Redis database using RedisInsight](/explore/redisinsight) -- [Develop with Java and Redis](/develop/java) -- [High availability with Redis Sentinel](https://redis.io/docs/manual/sentinel/) -- [Develop with Python and Redis](/develop/python) diff --git a/docs/create/gcp/gcp1.png b/docs/create/gcp/gcp1.png deleted file mode 100644 index ed72d7097df..00000000000 Binary files a/docs/create/gcp/gcp1.png and /dev/null differ diff --git a/docs/create/gcp/gcp10.png b/docs/create/gcp/gcp10.png deleted file mode 100644 index b2a1e720ecb..00000000000 Binary files a/docs/create/gcp/gcp10.png and /dev/null differ diff --git a/docs/create/gcp/gcp11.png b/docs/create/gcp/gcp11.png deleted file mode 100644 index ce9934cf6cc..00000000000 Binary files a/docs/create/gcp/gcp11.png and /dev/null differ diff --git a/docs/create/gcp/gcp2.png b/docs/create/gcp/gcp2.png deleted file mode 100644 index 192e1171572..00000000000 Binary files a/docs/create/gcp/gcp2.png and /dev/null differ diff --git a/docs/create/gcp/gcp3.png b/docs/create/gcp/gcp3.png deleted file mode 100644 index e200ada0e15..00000000000 Binary files a/docs/create/gcp/gcp3.png and /dev/null differ diff --git a/docs/create/gcp/gcp4.png b/docs/create/gcp/gcp4.png deleted file mode 100644 index ad2fdd37eae..00000000000 Binary files a/docs/create/gcp/gcp4.png and /dev/null differ diff --git a/docs/create/gcp/gcp5.png b/docs/create/gcp/gcp5.png deleted file mode 100644 index 7429afcc7a8..00000000000 Binary files a/docs/create/gcp/gcp5.png and /dev/null differ diff --git a/docs/create/gcp/gcp6.png b/docs/create/gcp/gcp6.png deleted file mode 100644 index 5511e0e6d4e..00000000000 Binary files a/docs/create/gcp/gcp6.png and /dev/null differ diff --git a/docs/create/gcp/gcp7.png b/docs/create/gcp/gcp7.png deleted file mode 100644 index 13782da5a6d..00000000000 Binary files a/docs/create/gcp/gcp7.png and /dev/null differ diff --git a/docs/create/gcp/gcp8.png b/docs/create/gcp/gcp8.png deleted file mode 100644 index 80a8134f656..00000000000 Binary files a/docs/create/gcp/gcp8.png and /dev/null differ diff --git a/docs/create/gcp/gcp9.png b/docs/create/gcp/gcp9.png deleted file mode 100644 index dadc3a18d00..00000000000 Binary files a/docs/create/gcp/gcp9.png and /dev/null differ diff --git a/docs/create/gcp/index-gcp.mdx b/docs/create/gcp/index-gcp.mdx deleted file mode 100644 index 1a6ee4d0a66..00000000000 --- a/docs/create/gcp/index-gcp.mdx +++ /dev/null @@ -1,67 +0,0 @@ ---- -id: index-gcp -title: Create Redis database using Google Cloud -sidebar_label: Redis on Google Cloud -slug: /create/gcp -authors: [ajeet] ---- - -Redis Enterprise Cloud delivers fully managed Redis Enterprise as a Service. It offers all the capabilities of Redis Enterprise while taking care of all the operational aspects associated with operating Redis in the most efficient manner on Google Cloud Platform. Redis Enterprise Cloud is built on a complete serverless concept, so users don’t need to deal with nodes and clusters - -### Step 1. 
Getting Started - -Launch [Redis Enterprise Cloud page](https://console.cloud.google.com/apis/library/gcp.redisenterprise.com?pli=1) on Google Cloud Platform - -![Google Cloud](gcp2.png) - -### Step 2. Click "Manage via Redis Labs" - -![Google Cloud](gcp3.png) - -### Step 3. Create Subscription - -![Google Cloud](gcp4.png) - -### Step 4. Specify the database name - -![Google Cloud](gcp5.png) - -### Step 5. Enter sizing details - -![Google Cloud](gcp6.png) - -### Step 6: Review & Create - -![Google Cloud](gcp7.png) - -### Step 7. Verify the details - -![Google Cloud](gcp8.png) - -### Step 8. Finalising the setup - -![Google Cloud](gcp10.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/shoppingcart/) - -## - - diff --git a/docs/create/gcp/launchpad.png b/docs/create/gcp/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/create/gcp/launchpad.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/create_heroku.png b/docs/create/heroku/herokugo/create_heroku.png deleted file mode 100644 index 68d7813f4d2..00000000000 Binary files a/docs/create/heroku/herokugo/create_heroku.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/heroku_app1_env.png b/docs/create/heroku/herokugo/heroku_app1_env.png deleted file mode 100644 index b30b5307b33..00000000000 Binary files a/docs/create/heroku/herokugo/heroku_app1_env.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/heroku_chatapp_go.png b/docs/create/heroku/herokugo/heroku_chatapp_go.png deleted file mode 100644 index d15456cc321..00000000000 Binary files a/docs/create/heroku/herokugo/heroku_chatapp_go.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/heroku_environment.png b/docs/create/heroku/herokugo/heroku_environment.png deleted file mode 100644 index ae3220cc190..00000000000 Binary files a/docs/create/heroku/herokugo/heroku_environment.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/heroku_leaderboard_ruby.png b/docs/create/heroku/herokugo/heroku_leaderboard_ruby.png deleted file mode 100644 index cd62ee84773..00000000000 Binary files a/docs/create/heroku/herokugo/heroku_leaderboard_ruby.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/index-herokugo.mdx b/docs/create/heroku/herokugo/index-herokugo.mdx deleted file mode 100644 index badf375b31a..00000000000 --- a/docs/create/heroku/herokugo/index-herokugo.mdx +++ /dev/null @@ -1,269 +0,0 @@ ---- -id: index-herokugo -title: Deploy a Go app on Heroku using Redis -sidebar_label: How to deploy a Go based application on Heroku using Redis -slug: /create/heroku/herokugo -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Heroku is a PaaS (platform as a service) for building and running software applications in the cloud. Heroku today supports programming languages such as Java, Python, Ruby, Node.js and Go. Heroku manages your app portfolio in a straightforward Dashboard or with a CLI. Heroku's horizontally scalable, share-nothing architecture is designed for building services in today's world of containerized applications. - -Here are few popular terminologies used in Heroku: - -- Dynos: The Heroku Platform uses the container model to run and scale all Heroku apps. The containers used at Heroku are called “dynos". Dynos are isolated, virtualized Linux containers that are designed to execute code based on a user-specified command. 
-- Buildpack: This is a config script for the build automation process, describing how a container image should be created. -- Add-ons: These are tools and services for extending a Heroku application's functionality, such as data storage and processing, monitoring, or analytics. -- Heroku CLI: This is a tool for building and running Heroku apps from within the terminal. (Docker, too, uses its own CLI for working with the platform.) Learn about [Heroku Dev Center](https://devcenter.heroku.com/) -- Git: A popular version control system for tracking changes to a software's source code. Heroku makes it easy to manage your app deployments with git, and has built-in integrations with the GitHub hosting platform for git repositories. - -Heroku recognizes an app as being written in Go by the existence of a `go.mod` file in the root directory. Heroku also supports govendor, godep & GB, but this tutorial focuses only on Go modules. Here's a quickstart guide to deploy Go apps on Heroku using Redis. We will be deploying a sample Chat application written in Go. - -### Step 1. Create a Redis Enterprise Cloud Database - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free). -Creating a Heroku account is free of charge. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -### Step 2. Create a Heroku account - -If you are using Heroku for the first time, visit the Heroku website and create your new Heroku account [through this link](https://signup.heroku.com/login). - -![heroku](create_heroku.png) - -### Step 3. Install the Heroku CLI on your system - -The Heroku Command Line Interface (CLI) lets you create and manage Heroku apps directly from the terminal. It's an essential part of using Heroku. In order to install Heroku CLI, run the following command: - -```macos - brew install heroku -``` - -### Step 4. Login to Heroku - -Use the following Heroku Command Line commands to login to Heroku dashboard: - -```bash - heroku login - heroku: Press any key to open up the browser to login or q to exit: - Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA - Logging in... done - Logged in as your_email_address -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a [Sample Redis Chat app](https://github.com/redis-developer/basic-redis-chat-demo-go). - -#### Clone the GitHub repository - -First you will need to clone the GitHub repository to configure it as a local repository. - -```bash - git clone https://github.com/redis-developer/basic-redis-chat-demo-go -``` - -`basic-redis-chat-demo-go` is the name of the project directory. -Run the commands below to get a functioning Git repository that contains a simple application as well as a `app.json` file. - -``` -heroku create -Creating app... done, ⬢ stark-island-03510 -https://stark-island-03510.herokuapp.com/ | https://git.heroku.com/stark-island-03510.git -``` - -### Step 6. 
Setting up Environment Variables - -Go to the Heroku dashboard, click "Settings" and set the following parameters under Config Vars: - -- SERVER_ADDRESS=:5555 -- CLIENT_LOCATION=/api/public -- REDIS_HOST= -- REDIS_PASSWORD= - -![heroku](heroku_environment.png) - -You now have a functioning Git repository that contains a simple application as well as a `app.json` file. - -### Step 7. Deploy your code - -``` -$ git push heroku -``` - -Wait for few seconds and you will see the messages below: - -``` -remote: -remote: Verifying deploy... done. -To https://git.heroku.com/stark-island-03510 - * [new branch] master -> master -``` - -### Step 8. Accessing the application - -Open `https://stark-island-03510.herokuapp.com/` to access your web application on the browser. -Please note that the Web URL is unique, hence it will be different in your case. - -![heroku](heroku_chatapp_go.png) - -### How does it work? - -The chat server works as a basic REST API which involves managing sessions and handling the user state in the chat rooms (besides the WebSocket/real-time part). -When the server starts, the initialization step occurs. At first, a new Redis connection is established and the code checks whether or not to load the demo data. - -#### Initialization - -For simplicity, a key named `total_users` is checked: if it does not exist, we fill the Redis database with initial data. `EXISTS total_users` (checks if the key exists). -The demo data initialization is handled in multiple steps: - -#### Creating demo users - -We create a new user id: `INCR total_users`. Then we set a user ID lookup key by user name: e.g. - -``` -SET username:nick user:1 -``` - -And finally, the rest of the data is written to a Redis hash: - -Example: - -```bash - HSET user:1 username "nick" password "bcrypt_hashed_password". -``` - -Additionally, each user is added to the default "General" room. -For handling rooms for each user, we have a set that holds the room ids. Here's an example command of how to add the room: - -```bash - SADD user:1:rooms "0" -``` - -Next, we need to populate private messages between users. At first, private rooms are created: if a private room needs to be established, for each user a room id: `room:1:2` is generated, where numbers correspond to the user ids in ascending order. - -E.g. Create a private room between 2 users: - -```bash - SADD user:1:rooms 1:2 and SADD user:2:rooms 1:2 -``` - -Then we add messages to this room by writing to a sorted set: - -```bash - ZADD room:1:2 1615480369 "{'from': 1, 'date': 1615480369, 'message': 'Hello', 'roomId': '1:2'}" -``` - -We use a stringified JSON object for keeping the message structure and simplify the implementation details for this demo-app. - -Populate the "General" room with messages. Messages are added to the sorted set with id of the "General" room: `room:0`. - -#### Pub/sub - -After initialization, a pub/sub subscription is created: `SUBSCRIBE MESSAGES`. At the same time, each server instance will run a listener on a message on this channel to receive real-time updates. - -Again, for simplicity, each message is serialized to JSON, which we parse and then handle in the same manner, as WebSocket messages. - -Pub/sub allows us to connect multiple servers written in different languages without taking into consideration the implementation detail of each server. - -#### Real-time chat and session handling - -A WebSocket/real-time server is instantiated, which then listens for the next events: - -- Connection. A new user is connected. 
At this point, a user ID is captured and saved to the session (which is cached in Redis). Note, that session caching is language/library-specific and it's used here purely for persistence and maintaining the state between server reloads. - -A global set with online_users key is used for keeping the online state for each user. So on a new connection, a user ID is written to that set: - -```bash - SADD online_users 1 -``` - -Here we have added a user with id 1 to the set `online_users` - -After that, a message is broadcast to the clients to notify them that a new user is joined the chat. - -- Disconnect. It works similarly to the connection event, except we need to remove the user for online_users set and notify the clients: `SREM online_users 1` (makes user with id 1 offline). - -- Message. A user sends a message, and it needs to be broadcast to the other clients. The pub/sub also allows us to broadcast this message to all server instances which are connected to this Redis server: - -``` - PUBLISH message "{'serverId': 4132, 'type':'message', 'data': {'from': 1, 'date': 1615480369, 'message': 'Hello', 'roomId': '1:2'}}" -``` - -Note we send additional data related to the type of the message and the server id. Server id is used to discard the messages by the server instance which sends them since it is connected to the same MESSAGES channel. - -The type field of the serialized JSON corresponds to the real-time method we use for real-time communication (connect/disconnect/message). - -The data is method-specific information. In the example above it's related to the new message. - -### How is the data stored? - -Redis is used mainly as a database to keep the user/messages data and for sending messages between connected servers. - -The real-time functionality is handled by Socket.IO for server-client messaging. Additionally each server instance subscribes to the `MESSAGES` pub/sub channel and dispatches messages once they arrive. Note that, the server transports pub/sub messages with a separate event stream (handled by Server Sent Events), this is due to the need to run the pub/sub message loop separately from socket.io signals. - -The chat data is stored in various keys and various data types. -User data is stored in a hash where each user entry contains the next values: - -- username: unique user name; -- password: hashed password - -- Additionally a set of rooms is associated with user -- Rooms are sorted sets which contains messages where score is the timestamp for each message -- Each room has a name associated with it -- The "online" set is global for all users is used for keeping track on which user is online. -- User hashes are accessed by key `user:{userId}`. The data for it stored with `HSET` key field data. User id is calculated by incrementing the `total_users` key (`INCR total_users`) - -- Usernames are stored as separate keys (`username:{username}`) which returns the userId for quicker access and stored with `SET username:{username} {userId}`. - -- Rooms that a user belongs to are stored at `user:{userId}:rooms` as a set of room ids. A room is added by the `SADD user:{userId}:rooms {roomId}` command. - -- Messages are stored at `room:{roomId}` key in a sorted set (as mentioned above). They are added with the `ZADD room:{roomId} {timestamp} {message}` command. The message is serialized to an app-specific JSON string. - -### How is the data accessed? - -Get User `HGETALL user:{id}`. - -```bash - HGETALL user:2 -``` - -which gets data for the user with id: 2. 
- -- Online users: `SMEMBERS online_users`. This will return ids of users who are online - -- Get the ids of rooms that a user is in: `SMEMBERS user:{id}:rooms`. - Example: - -``` - SMEMBERS user:2:rooms -``` - -This will return IDs of rooms for the user whose ID is 2 - -- Get a list of messages: `ZREVRANGE room:{roomId} {offset_start} {offset_end}`. - Example: - -``` - ZREVRANGE room:1:2 0 50 -``` - -This returns 50 messages with 0 offsets for the private room between users with IDs 1 and 2. - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [Accessing Go-based apps over Redis LaunchPad](https://launchpad.redis.com/) -- [Deploy Java apps on Heroku using Redis](/create/heroku/herokujava) -- [Deploy NodeJS apps on Heroku using Redis](/create/heroku/herokujava) diff --git a/docs/create/heroku/herokugo/launch_database.png b/docs/create/heroku/herokugo/launch_database.png deleted file mode 100644 index 67d35afa3be..00000000000 Binary files a/docs/create/heroku/herokugo/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/herokugo/try-free.png b/docs/create/heroku/herokugo/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/herokugo/try-free.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/create_heroku.png b/docs/create/heroku/herokujava/create_heroku.png deleted file mode 100644 index 68d7813f4d2..00000000000 Binary files a/docs/create/heroku/herokujava/create_heroku.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku-redis.png b/docs/create/heroku/herokujava/heroku-redis.png deleted file mode 100644 index 578f0359f09..00000000000 Binary files a/docs/create/heroku/herokujava/heroku-redis.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku1.png b/docs/create/heroku/herokujava/heroku1.png deleted file mode 100644 index 71bc7adc960..00000000000 Binary files a/docs/create/heroku/herokujava/heroku1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku2.png b/docs/create/heroku/herokujava/heroku2.png deleted file mode 100644 index 7c81f3a4059..00000000000 Binary files a/docs/create/heroku/herokujava/heroku2.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_access.png b/docs/create/heroku/herokujava/heroku_access.png deleted file mode 100644 index 442fac110aa..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_access.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_access2.png b/docs/create/heroku/herokujava/heroku_access2.png deleted file mode 100644 index 2c223bc2b4f..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_access2.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_addons.png b/docs/create/heroku/herokujava/heroku_addons.png deleted file mode 100644 index 0cfaca1bf52..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_addons.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_app01.png b/docs/create/heroku/herokujava/heroku_app01.png deleted file mode 100644 index 1279bba0a44..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_app01.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_app1_env.png b/docs/create/heroku/herokujava/heroku_app1_env.png deleted file mode 100644 index a64460ea284..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_app1_env.png and /dev/null differ diff --git 
a/docs/create/heroku/herokujava/heroku_appname.png b/docs/create/heroku/herokujava/heroku_appname.png deleted file mode 100644 index f9960449251..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_appname.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_deploymethod.png b/docs/create/heroku/herokujava/heroku_deploymethod.png deleted file mode 100644 index a2913a1ce60..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_deploymethod.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_env.png b/docs/create/heroku/herokujava/heroku_env.png deleted file mode 100644 index becec8add3e..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_env.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_env1.png b/docs/create/heroku/herokujava/heroku_env1.png deleted file mode 100644 index 149617cdcc6..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_env1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_finaldeploy.png b/docs/create/heroku/herokujava/heroku_finaldeploy.png deleted file mode 100644 index 73f3601302c..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_finaldeploy.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_finalratelimit.png b/docs/create/heroku/herokujava/heroku_finalratelimit.png deleted file mode 100644 index f7e07c347c4..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_finalratelimit.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_gitconnect.png b/docs/create/heroku/herokujava/heroku_gitconnect.png deleted file mode 100644 index d89e6e34513..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_gitconnect.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_logo.png b/docs/create/heroku/herokujava/heroku_logo.png deleted file mode 100644 index 705784c613a..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_logo.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_newapp.png b/docs/create/heroku/herokujava/heroku_newapp.png deleted file mode 100644 index d21803f9f09..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_newapp.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_orderform.png b/docs/create/heroku/herokujava/heroku_orderform.png deleted file mode 100644 index ea3a73e7d78..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_ratelimiter.png b/docs/create/heroku/herokujava/heroku_ratelimiter.png deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_ratelimiter.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_ratelimiting1.png b/docs/create/heroku/herokujava/heroku_ratelimiting1.png deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_ratelimiting1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_ratelimiting_dash.png b/docs/create/heroku/herokujava/heroku_ratelimiting_dash.png deleted file mode 100644 index 33102c4a677..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_ratelimiting_dash.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_ratelimting1.png b/docs/create/heroku/herokujava/heroku_ratelimting1.png deleted file mode 100644 index 
dd1d55b655e..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_ratelimting1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_recloud.png b/docs/create/heroku/herokujava/heroku_recloud.png deleted file mode 100644 index 3e4cae3b7e7..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_recloud.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_recloud0.png b/docs/create/heroku/herokujava/heroku_recloud0.png deleted file mode 100644 index 43fc430fecc..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_recloud0.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_recloudinstall1.png b/docs/create/heroku/herokujava/heroku_recloudinstall1.png deleted file mode 100644 index 1942ad72145..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_recloudinstall1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_recloudinstall2.png b/docs/create/heroku/herokujava/heroku_recloudinstall2.png deleted file mode 100644 index f8fb0b30ad6..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_recloudinstall2.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_rediscloud.png b/docs/create/heroku/herokujava/heroku_rediscloud.png deleted file mode 100644 index 1462bb4a8da..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_rediscloud1.png b/docs/create/heroku/herokujava/heroku_rediscloud1.png deleted file mode 100644 index 82fd167be38..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_rediscloud1.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_selectgit.png b/docs/create/heroku/herokujava/heroku_selectgit.png deleted file mode 100644 index 2e51d2a2194..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_selectgit.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/heroku_selectrecloud.png b/docs/create/heroku/herokujava/heroku_selectrecloud.png deleted file mode 100644 index beaa601e6b0..00000000000 Binary files a/docs/create/heroku/herokujava/heroku_selectrecloud.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/image13.png b/docs/create/heroku/herokujava/image13.png deleted file mode 100644 index 71f6c3dfa87..00000000000 Binary files a/docs/create/heroku/herokujava/image13.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/index-herokujava.mdx b/docs/create/heroku/herokujava/index-herokujava.mdx deleted file mode 100644 index a86f12b33b6..00000000000 --- a/docs/create/heroku/herokujava/index-herokujava.mdx +++ /dev/null @@ -1,116 +0,0 @@ ---- -id: index-herokujava -title: Deploy Java app on Heroku using Redis -sidebar_label: How to deploy a Java based application on Heroku using Redis -slug: /create/heroku/herokujava -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Heroku is a cloud service provider and software development platform which facilitates fast and effective building, deploying and scaling of web applications. It offers you a ready-to-use environment that allows you to deploy your code fast. 
- -Some of the notable benefits of Heroku include: - -- Users can get started with the free tier of Heroku -- Let developers concentrate on coding and not server management -- Integrates with familiar developer workflows -- Enhance the productivity of cloud app development teams -- Helps your development, QA, and business stakeholders create a unified dashboard -- Support for Modern Open Source Languages - -#### Step 1. Create Redis Enterprise Cloud - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free) - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -### Step 2. Create a Heroku account - -If you are using Heroku for the first time, create your new Heroku account [through this link](https://signup.heroku.com/login) - -![heroku](create_heroku.png) - -### Step 3. Install Heroku CLI on your system - -```macos - brew install heroku -``` - -### Step 4. Login to Heroku - -```bash - heroku login - heroku: Press any key to open up the browser to login or q to exit: - Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA - Logging in... done - Logged in as your_email_address -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a [Sample Rate Limiting application](https://github.com/redis-developer/basic-rate-limiting-demo-java). - -#### Clone the repository - -```bash - git clone https://github.com/redis-developer/basic-rate-limiting-demo-java -``` - -``` -heroku create -Creating app... done, ⬢ hidden-woodland-03996 -https://hidden-woodland-03996.herokuapp.com/ | https://git.heroku.com/hidden-woodland-03996.git -``` - -### Step 6. Setting up Environment Variables - -Go to Heroku dashboard, click "Settings" and set REDIS_ENDPOINT_URI and REDIS_PASSWORD under the Config Vars. -Refer to Step 1 for the correct values to use. - -![heroku](heroku_app1_env.png) - -You now have a functioning Git repository that contains a simple application as well as a package.json file, which is used by Node’s dependency manager. - -### Step 7. Deploy your code - -Heroku generates a random name (in this case hidden-woodland-03996) for your app, or you can pass a parameter to specify your own app name. -Now deploy your code: - -``` -$ git push heroku -remote: BUILD SUCCESSFUL in 1m 5s -remote: 12 actionable tasks: 12 executed -remote: -----> Discovering process types -remote: Procfile declares types -> web -remote: -remote: -----> Compressing... -remote: Done: 298.9M -remote: -----> Launching... -remote: Released v3 -remote: https://hidden-woodland-03996.herokuapp.com/ deployed to Heroku -remote: -remote: Verifying deploy... done. -To https://git.heroku.com/hidden-woodland-03996.git - * [new branch] master -> master -``` - -### Step 8. 
Accessing the application - -Open https://hidden-woodland-03996.herokuapp.com/ to see your application - -![heroku](heroku_ratelimiter.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) diff --git a/docs/create/heroku/herokujava/launch_database.png b/docs/create/heroku/herokujava/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/heroku/herokujava/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/orderform.png b/docs/create/heroku/herokujava/orderform.png deleted file mode 100644 index 7a047793e9f..00000000000 Binary files a/docs/create/heroku/herokujava/orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/pricing.png b/docs/create/heroku/herokujava/pricing.png deleted file mode 100644 index 27ac39b9008..00000000000 Binary files a/docs/create/heroku/herokujava/pricing.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/ratelimiting.png b/docs/create/heroku/herokujava/ratelimiting.png deleted file mode 100644 index 331ef253da3..00000000000 Binary files a/docs/create/heroku/herokujava/ratelimiting.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/rediscloud.png b/docs/create/heroku/herokujava/rediscloud.png deleted file mode 100644 index 67b07808347..00000000000 Binary files a/docs/create/heroku/herokujava/rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokujava/try-free.png b/docs/create/heroku/herokujava/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/herokujava/try-free.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/create_heroku.png b/docs/create/heroku/herokunodejs/create_heroku.png deleted file mode 100644 index 68d7813f4d2..00000000000 Binary files a/docs/create/heroku/herokunodejs/create_heroku.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku-redis.png b/docs/create/heroku/herokunodejs/heroku-redis.png deleted file mode 100644 index 578f0359f09..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku-redis.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku1.png b/docs/create/heroku/herokunodejs/heroku1.png deleted file mode 100644 index 71bc7adc960..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku2.png b/docs/create/heroku/herokunodejs/heroku2.png deleted file mode 100644 index 7c81f3a4059..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku2.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_access.png b/docs/create/heroku/herokunodejs/heroku_access.png deleted file mode 100644 index 442fac110aa..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_access.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_access2.png b/docs/create/heroku/herokunodejs/heroku_access2.png deleted file mode 100644 index 2c223bc2b4f..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_access2.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_addons.png b/docs/create/heroku/herokunodejs/heroku_addons.png deleted file mode 100644 index 0cfaca1bf52..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_addons.png and /dev/null differ 
diff --git a/docs/create/heroku/herokunodejs/heroku_app01.png b/docs/create/heroku/herokunodejs/heroku_app01.png deleted file mode 100644 index 1279bba0a44..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_app01.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_app1_env.png b/docs/create/heroku/herokunodejs/heroku_app1_env.png deleted file mode 100644 index a64460ea284..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_app1_env.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_appname.png b/docs/create/heroku/herokunodejs/heroku_appname.png deleted file mode 100644 index f9960449251..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_appname.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_deploymethod.png b/docs/create/heroku/herokunodejs/heroku_deploymethod.png deleted file mode 100644 index a2913a1ce60..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_deploymethod.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_env.png b/docs/create/heroku/herokunodejs/heroku_env.png deleted file mode 100644 index becec8add3e..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_env.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_env1.png b/docs/create/heroku/herokunodejs/heroku_env1.png deleted file mode 100644 index 149617cdcc6..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_env1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_finaldeploy.png b/docs/create/heroku/herokunodejs/heroku_finaldeploy.png deleted file mode 100644 index 73f3601302c..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_finaldeploy.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_finalratelimit.png b/docs/create/heroku/herokunodejs/heroku_finalratelimit.png deleted file mode 100644 index f7e07c347c4..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_finalratelimit.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_gitconnect.png b/docs/create/heroku/herokunodejs/heroku_gitconnect.png deleted file mode 100644 index d89e6e34513..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_gitconnect.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_logo.png b/docs/create/heroku/herokunodejs/heroku_logo.png deleted file mode 100644 index 705784c613a..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_logo.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_newapp.png b/docs/create/heroku/herokunodejs/heroku_newapp.png deleted file mode 100644 index d21803f9f09..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_newapp.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_orderform.png b/docs/create/heroku/herokunodejs/heroku_orderform.png deleted file mode 100644 index ea3a73e7d78..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_ratelimiter.png b/docs/create/heroku/herokunodejs/heroku_ratelimiter.png deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_ratelimiter.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_ratelimiting1.png b/docs/create/heroku/herokunodejs/heroku_ratelimiting1.png 
deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_ratelimiting1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_ratelimiting_dash.png b/docs/create/heroku/herokunodejs/heroku_ratelimiting_dash.png deleted file mode 100644 index 33102c4a677..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_ratelimiting_dash.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_ratelimting1.png b/docs/create/heroku/herokunodejs/heroku_ratelimting1.png deleted file mode 100644 index dd1d55b655e..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_ratelimting1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_recloud.png b/docs/create/heroku/herokunodejs/heroku_recloud.png deleted file mode 100644 index 3e4cae3b7e7..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_recloud.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_recloud0.png b/docs/create/heroku/herokunodejs/heroku_recloud0.png deleted file mode 100644 index 43fc430fecc..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_recloud0.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_recloudinstall1.png b/docs/create/heroku/herokunodejs/heroku_recloudinstall1.png deleted file mode 100644 index 1942ad72145..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_recloudinstall1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_recloudinstall2.png b/docs/create/heroku/herokunodejs/heroku_recloudinstall2.png deleted file mode 100644 index f8fb0b30ad6..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_recloudinstall2.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_rediscloud.png b/docs/create/heroku/herokunodejs/heroku_rediscloud.png deleted file mode 100644 index 1462bb4a8da..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_rediscloud1.png b/docs/create/heroku/herokunodejs/heroku_rediscloud1.png deleted file mode 100644 index 82fd167be38..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_rediscloud1.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_selectgit.png b/docs/create/heroku/herokunodejs/heroku_selectgit.png deleted file mode 100644 index 2e51d2a2194..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_selectgit.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/heroku_selectrecloud.png b/docs/create/heroku/herokunodejs/heroku_selectrecloud.png deleted file mode 100644 index beaa601e6b0..00000000000 Binary files a/docs/create/heroku/herokunodejs/heroku_selectrecloud.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/index-herokunodejs.mdx b/docs/create/heroku/herokunodejs/index-herokunodejs.mdx deleted file mode 100644 index 01604370d0c..00000000000 --- a/docs/create/heroku/herokunodejs/index-herokunodejs.mdx +++ /dev/null @@ -1,105 +0,0 @@ ---- -id: index-herokunodejs -title: Deploy a NodeJS app on Heroku using Redis -sidebar_label: How to deploy a NodeJS based application on Heroku using Redis -slug: /create/heroku/herokunodejs -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Heroku is a platform as a service (PaaS) that enables developers to build, run, and operate applications entirely in 
the cloud. It is a platform for data as well as apps - providing a secure, scalable database-as-a-service with tons of developers tools like database followers, forking, dataclips and automated health checks.Heroku is widely popular as it makes the processes of deploying, configuring, scaling, tuning, and managing apps as simple and straightforward as possible, so that developers can focus on what’s most important: building great apps that delight and engage customers. - -#### Step 1. Create Redis Enterprise Cloud - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free) - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -### Step 2. Create a Heroku account - -If you are using Heroku for the first time, create your new Heroku account [through this link](https://signup.heroku.com/login) - -![heroku](create_heroku.png) - -### Step 3. Install Heroku CLI on your system - -```macos - brew install heroku -``` - -### Step 4. Login to Heroku - -```bash - heroku login - heroku: Press any key to open up the browser to login or q to exit: - Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA - Logging in... done - Logged in as your_email_address -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a [Sample Rate Limiting application](https://github.com/redis-developer/basic-rate-limiting-demo-nodejs) - -#### Clone the repository - -```bash - git clone https://github.com/redis-developer/basic-redis-rate-limiting-demo-nodejs -``` - -Run the commands below to get a functioning Git repository that contains a simple application as well as a package.json file. - -``` -heroku create -Creating app... done, ⬢ rocky-lowlands-06306 -https://rocky-lowlands-06306.herokuapp.com/ | https://git.heroku.com/rocky-lowlands-06306.git -``` - -### Step 6. Setting up environment variables - -Go to the Heroku dashboard, click "Settings" and set `REDIS_ENDPOINT_URI` and `REDIS_PASSWORD` under the Config Vars. -Refer to Step 1 for the correct values to use. - -![heroku](heroku_app1_env.png) - -You now have a functioning Git repository that contains a simple application as well as a package.json file, which is used by Node’s dependency manager. - -### Step 7. Deploy your code - -``` -$ git push heroku -``` - -Wait for few seconds and you will see the messages below: - -``` -remote: -----> Launching... -remote: Released v3 -remote: https://rocky-lowlands-06306.herokuapp.com/ deployed to Heroku -remote: -remote: Verifying deploy... done. -To https://git.heroku.com/rocky-lowlands-06306.git - * [new branch] main -> main - -``` - -### Step 8. 
Accessing the application - -Open https://rocky-lowlands-06306.herokuapp.com/ to see your application - -![heroku](heroku_ratelimiter.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) diff --git a/docs/create/heroku/herokunodejs/launch_database.png b/docs/create/heroku/herokunodejs/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/heroku/herokunodejs/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/orderform.png b/docs/create/heroku/herokunodejs/orderform.png deleted file mode 100644 index 7a047793e9f..00000000000 Binary files a/docs/create/heroku/herokunodejs/orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/pricing.png b/docs/create/heroku/herokunodejs/pricing.png deleted file mode 100644 index 27ac39b9008..00000000000 Binary files a/docs/create/heroku/herokunodejs/pricing.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/ratelimiting.png b/docs/create/heroku/herokunodejs/ratelimiting.png deleted file mode 100644 index 331ef253da3..00000000000 Binary files a/docs/create/heroku/herokunodejs/ratelimiting.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/rediscloud.png b/docs/create/heroku/herokunodejs/rediscloud.png deleted file mode 100644 index 67b07808347..00000000000 Binary files a/docs/create/heroku/herokunodejs/rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokunodejs/try-free.png b/docs/create/heroku/herokunodejs/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/herokunodejs/try-free.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/create_heroku.png b/docs/create/heroku/herokupython/create_heroku.png deleted file mode 100644 index 68d7813f4d2..00000000000 Binary files a/docs/create/heroku/herokupython/create_heroku.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku-redis.png b/docs/create/heroku/herokupython/heroku-redis.png deleted file mode 100644 index 578f0359f09..00000000000 Binary files a/docs/create/heroku/herokupython/heroku-redis.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku1.png b/docs/create/heroku/herokupython/heroku1.png deleted file mode 100644 index 71bc7adc960..00000000000 Binary files a/docs/create/heroku/herokupython/heroku1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku2.png b/docs/create/heroku/herokupython/heroku2.png deleted file mode 100644 index 7c81f3a4059..00000000000 Binary files a/docs/create/heroku/herokupython/heroku2.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_access.png b/docs/create/heroku/herokupython/heroku_access.png deleted file mode 100644 index 442fac110aa..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_access.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_access2.png b/docs/create/heroku/herokupython/heroku_access2.png deleted file mode 100644 index 2c223bc2b4f..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_access2.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_addons.png b/docs/create/heroku/herokupython/heroku_addons.png deleted file mode 100644 index 0cfaca1bf52..00000000000 Binary files 
a/docs/create/heroku/herokupython/heroku_addons.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_app01.png b/docs/create/heroku/herokupython/heroku_app01.png deleted file mode 100644 index 1279bba0a44..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_app01.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_app1_env.png b/docs/create/heroku/herokupython/heroku_app1_env.png deleted file mode 100644 index a64460ea284..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_app1_env.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_appname.png b/docs/create/heroku/herokupython/heroku_appname.png deleted file mode 100644 index f9960449251..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_appname.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_deploymethod.png b/docs/create/heroku/herokupython/heroku_deploymethod.png deleted file mode 100644 index a2913a1ce60..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_deploymethod.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_env.png b/docs/create/heroku/herokupython/heroku_env.png deleted file mode 100644 index becec8add3e..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_env.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_env1.png b/docs/create/heroku/herokupython/heroku_env1.png deleted file mode 100644 index 149617cdcc6..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_env1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_finaldeploy.png b/docs/create/heroku/herokupython/heroku_finaldeploy.png deleted file mode 100644 index 73f3601302c..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_finaldeploy.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_finalratelimit.png b/docs/create/heroku/herokupython/heroku_finalratelimit.png deleted file mode 100644 index f7e07c347c4..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_finalratelimit.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_gitconnect.png b/docs/create/heroku/herokupython/heroku_gitconnect.png deleted file mode 100644 index d89e6e34513..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_gitconnect.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_logo.png b/docs/create/heroku/herokupython/heroku_logo.png deleted file mode 100644 index 705784c613a..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_logo.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_newapp.png b/docs/create/heroku/herokupython/heroku_newapp.png deleted file mode 100644 index d21803f9f09..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_newapp.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_orderform.png b/docs/create/heroku/herokupython/heroku_orderform.png deleted file mode 100644 index ea3a73e7d78..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_ratelimiter.png b/docs/create/heroku/herokupython/heroku_ratelimiter.png deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_ratelimiter.png and /dev/null differ diff --git 
a/docs/create/heroku/herokupython/heroku_ratelimiting1.png b/docs/create/heroku/herokupython/heroku_ratelimiting1.png deleted file mode 100644 index e32f44510b3..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_ratelimiting1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_ratelimiting_dash.png b/docs/create/heroku/herokupython/heroku_ratelimiting_dash.png deleted file mode 100644 index 33102c4a677..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_ratelimiting_dash.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_ratelimting1.png b/docs/create/heroku/herokupython/heroku_ratelimting1.png deleted file mode 100644 index dd1d55b655e..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_ratelimting1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_recloud.png b/docs/create/heroku/herokupython/heroku_recloud.png deleted file mode 100644 index 3e4cae3b7e7..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_recloud.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_recloud0.png b/docs/create/heroku/herokupython/heroku_recloud0.png deleted file mode 100644 index 43fc430fecc..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_recloud0.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_recloudinstall1.png b/docs/create/heroku/herokupython/heroku_recloudinstall1.png deleted file mode 100644 index 1942ad72145..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_recloudinstall1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_recloudinstall2.png b/docs/create/heroku/herokupython/heroku_recloudinstall2.png deleted file mode 100644 index f8fb0b30ad6..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_recloudinstall2.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_rediscloud.png b/docs/create/heroku/herokupython/heroku_rediscloud.png deleted file mode 100644 index 1462bb4a8da..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_rediscloud1.png b/docs/create/heroku/herokupython/heroku_rediscloud1.png deleted file mode 100644 index 82fd167be38..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_rediscloud1.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_selectgit.png b/docs/create/heroku/herokupython/heroku_selectgit.png deleted file mode 100644 index 2e51d2a2194..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_selectgit.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/heroku_selectrecloud.png b/docs/create/heroku/herokupython/heroku_selectrecloud.png deleted file mode 100644 index beaa601e6b0..00000000000 Binary files a/docs/create/heroku/herokupython/heroku_selectrecloud.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/index-herokupython.mdx b/docs/create/heroku/herokupython/index-herokupython.mdx deleted file mode 100644 index 71330dace2c..00000000000 --- a/docs/create/heroku/herokupython/index-herokupython.mdx +++ /dev/null @@ -1,135 +0,0 @@ ---- -id: index-herokupython -title: Deploy a Python app on Heroku using Redis -sidebar_label: How to deploy a Python based application on Heroku using Redis -slug: /create/heroku/herokupython -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - 
-Heroku is a container-based cloud Platform as a Service (PaaS). It is a new way of building and deploying web apps. Heroku lets app developers spend 100% of their time on their application code, not managing servers, deployment, ongoing operations, or scaling. Developers use Heroku to deploy, manage, and scale modern apps. The Heroku platform is elegant, flexible, and easy to use, offering developers the simplest path to getting their apps to market. - -Some of the notable features offered by Heroku are: - -- Agile deployment for Node.js, Java, Python, Ruby, Go and Scala -- Run and scale any type of app -- Flexibility to customize and support unique DevOps workflow needs -- Total visibility across your entire app -- Offers a powerful dashboard and CLI - -#### Step 1. Create Redis Enterprise Cloud - -Redis Enterprise Cloud is a fully managed cloud service by Redis. Built for modern distributed applications, Redis Enterprise Cloud enables you to run any query, simple or complex, at sub-millisecond performance at virtually infinite scale without worrying about operational complexity or service availability. With modern probabilistic data structures and extensible data models, including Search, JSON, Graph, and Time Series, you can rely on Redis as your data-platform for all your real-time needs. - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free) - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -You can use the Redis CLI to quickly verify the connection to the Redis instance URL and access the database. - -### Step 2. Create a Heroku account - -If you are using the Heroku platform for the first time, create your new Heroku account [through this link](https://signup.heroku.com/login). -You can refer to [Heroku documentation](https://devcenter.heroku.com/categories/reference) - -![heroku](create_heroku.png) - -### Step 3. Install Heroku CLI on your system - -Run the following command to install the Heroku CLI on your system. - -```macos - brew install heroku -``` - -### Step 4. Login to Heroku - -```bash - heroku login - heroku: Press any key to open up the browser to login or q to exit: - Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA - Logging in... done - Logged in as your_email_address -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a [Sample Rate Limiting application](https://github.com/redis-developer/basic-rate-limiting-demo-python). - -#### Clone the repository - -```bash - git clone https://github.com/redis-developer/basic-rate-limiting-demo-python -``` - -Run the commands below to get a functioning Git repository that contains a simple application as well as a package.json file. - -``` -$ heroku create -Creating app... done, ⬢ fast-reef-76278 -https://fast-reef-76278.herokuapp.com/ | https://git.heroku.com/fast-reef-76278.git -``` - -### Step 6. Setting up Environment Variables - -Go to the Heroku dashboard, click "Settings" and set REDIS_ENDPOINT_URI and REDIS_PASSWORD under the Config Vars. 
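The same config vars can also be set from the Heroku CLI with `heroku config:set`, if you prefer not to use the dashboard. The sketch below uses placeholder values for the endpoint, port and password — substitute the ones you saved in Step 1:

```bash
# Placeholder values shown — replace them with the endpoint, port and
# password of your own Redis Cloud database (see Step 1).
heroku config:set REDIS_ENDPOINT_URI=redis-12345.c1.us-east-1-2.ec2.cloud.redislabs.com:12345 -a fast-reef-76278
heroku config:set REDIS_PASSWORD=your_database_password -a fast-reef-76278

# Confirm the values were applied
heroku config -a fast-reef-76278
```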
-Refer to Step 1 for the correct values to use. - -![heroku](heroku_app1_env.png) - -### Step 7. Deploy your code - -Heroku generates a random name (in this case fast-reef-76278) for your app, or you can pass a parameter to specify your own app name. -Now deploy your code: - -``` -$ git push heroku -Enumerating objects: 512, done. -Counting objects: 100% (512/512), done. -Delta compression using up to 12 threads -Compressing objects: 100% (256/256), done. -Writing objects: 100% (512/512), 1.52 MiB | 660.00 KiB/s, done. -Total 512 (delta 244), reused 512 (delta 244) -remote: Compressing source files... done. -remote: Building source: -remote: -remote: -----> Building on the Heroku-20 stack -remote: -----> Determining which buildpack to use for this app -remote: -----> Python app detected -… - -emote: -----> Compressing... -remote: Done: 59.3M -remote: -----> Launching... -remote: Released v5 -remote: https://fast-reef-76278.herokuapp.com/ deployed to Heroku -remote: -remote: Verifying deploy... done. -To https://git.heroku.com/fast-reef-76278.git - * [new branch] master -> master -``` - -### Step 8. Accessing the application - -Open https://fast-reef-76278.herokuapp.com/ to see your application - -![heroku](heroku_ratelimiter.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) -- [Django Buildpacks](https://elements.heroku.com/buildpacks/django/django) -- [Deploying Python and Django Apps on Heroku](https://devcenter.heroku.com/articles/deploying-python) -- [Securing Heroku Redis](https://devcenter.heroku.com/articles/securing-heroku-redis) -- [Heroku Connect](https://devcenter.heroku.com/articles/heroku-connect) diff --git a/docs/create/heroku/herokupython/launch_database.png b/docs/create/heroku/herokupython/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/heroku/herokupython/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/orderform.png b/docs/create/heroku/herokupython/orderform.png deleted file mode 100644 index 7a047793e9f..00000000000 Binary files a/docs/create/heroku/herokupython/orderform.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/pricing.png b/docs/create/heroku/herokupython/pricing.png deleted file mode 100644 index 27ac39b9008..00000000000 Binary files a/docs/create/heroku/herokupython/pricing.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/ratelimiting.png b/docs/create/heroku/herokupython/ratelimiting.png deleted file mode 100644 index 331ef253da3..00000000000 Binary files a/docs/create/heroku/herokupython/ratelimiting.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/rediscloud.png b/docs/create/heroku/herokupython/rediscloud.png deleted file mode 100644 index 67b07808347..00000000000 Binary files a/docs/create/heroku/herokupython/rediscloud.png and /dev/null differ diff --git a/docs/create/heroku/herokupython/try-free.png b/docs/create/heroku/herokupython/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/herokupython/try-free.png and /dev/null differ diff --git a/docs/create/heroku/herokuruby/create_heroku.png b/docs/create/heroku/herokuruby/create_heroku.png deleted file mode 100644 index 68d7813f4d2..00000000000 Binary files a/docs/create/heroku/herokuruby/create_heroku.png and /dev/null differ diff --git 
a/docs/create/heroku/herokuruby/heroku_app1_env.png b/docs/create/heroku/herokuruby/heroku_app1_env.png deleted file mode 100644 index b30b5307b33..00000000000 Binary files a/docs/create/heroku/herokuruby/heroku_app1_env.png and /dev/null differ diff --git a/docs/create/heroku/herokuruby/heroku_leaderboard_ruby.png b/docs/create/heroku/herokuruby/heroku_leaderboard_ruby.png deleted file mode 100644 index cd62ee84773..00000000000 Binary files a/docs/create/heroku/herokuruby/heroku_leaderboard_ruby.png and /dev/null differ diff --git a/docs/create/heroku/herokuruby/index-herokuruby.mdx b/docs/create/heroku/herokuruby/index-herokuruby.mdx deleted file mode 100644 index 3f582ffcc35..00000000000 --- a/docs/create/heroku/herokuruby/index-herokuruby.mdx +++ /dev/null @@ -1,185 +0,0 @@ ---- -id: index-herokuruby -title: Deploy a Ruby app on Heroku using Redis -sidebar_label: How to deploy a Ruby based application on Heroku using Redis -slug: /create/heroku/herokuruby -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Heroku is a popular PaaS offering that allows software developers to easily deploy their code without worrying about the underlying infrastructure. By using a simple 'git push heroku' command, developers are able to deploy their application flawlessly. This platform offers support for a wide range of programming languages such as Java, Ruby, PHP, Node.js, Python, Scala, and Clojure. - -Here's a quickstart guide to deploy Ruby apps on Heroku using Redis. We will be deploying a sample Leaderboard app and will be using company valuation and stock tickers as its domain. - -### Step 1. Create a Redis Enterprise Cloud Database - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free). - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -### Step 2. Create a Heroku account - -If you are using Heroku for the first time, create your new Heroku account [through this link](https://signup.heroku.com/login). - -### Step 3. Install the Heroku CLI on your system - -```macos - brew install heroku -``` - -### Step 4. Login to Heroku - -```bash - heroku login - heroku: Press any key to open up the browser to login or q to exit: - Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA - Logging in... done - Logged in as your_email_address -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a [Sample Redis Leaderboard app](https://github.com/redis-developer/basic-redis-leaderboard-demo-ruby). - -#### Clone the repository - -```bash - git clone https://github.com/redis-developer/basic-redis-leaderboard-demo-ruby -``` - -Run the commands below to get a functioning Git repository that contains a simple application as well as a `app.json` file. - -``` -heroku create -Creating app... done, ⬢ thawing-shore-07338 -https://thawing-shore-07338.herokuapp.com/ | https://git.heroku.com/thawing-shore-07338.git -``` - -### Step 6. 
Setting up Environment Variables - -Go to the Heroku dashboard, click "Settings" and set `REDIS_HOST` and `REDIS_PASSWORD` under the Config Vars. -Refer to Step 1 for the correct values to use. - -![heroku](heroku_app1_env.png) - -You now have a functioning Git repository that contains a simple application as well as a `app.json` file, which is used by Node’s dependency manager. - -### Step 7. Deploy your code - -``` -$ git push heroku -``` - -Wait for few seconds and you will see the messages below: - -``` -remote: -----> Discovering process types -remote: Procfile declares types -> (none) -remote: Default types for buildpack -> console, rake, web -remote: -remote: -----> Compressing... -remote: Done: 125.9M -remote: -----> Launching... -remote: Released v10 -remote: https://thawing-shore-07338.herokuapp.com/ deployed to Heroku -remote: -remote: Verifying deploy... done. -To https://git.heroku.com/thawing-shore-07338.git - * [new branch] master -> master -``` - -### Step 8. Accessing the application - -Open https://thawing-shore-07338.herokuapp.com/ to access your application on the browser. -Please note that the Web URL is unique, hence it will be different in your case. - -![heroku](heroku_leaderboard_ruby.png) - -### How does it work? - -#### How the data is stored: - -- The AAPL's details - market cap of 2.6 triillions and USA origin - are stored in a Redis hash like this: - - ```bash - HSET "company:AAPL" symbol "AAPL" market_cap "2600000000000" country USA - ``` - -- The market capitalization for each company is also stored in a ZSET (Redis Sorted Set). - - ```bash - ZADD companyLeaderboard 2600000000000 company:AAPL - ``` - -#### How the data is accessed: - -- Top 10 companies: - - ```bash - ZREVRANGE companyLeaderboard 0 9 WITHSCORES - ``` - -- All companies: - - ```bash - ZREVRANGE companyLeaderboard 0 -1 WITHSCORES - ``` - -- Bottom 10 companies: - - ```bash - ZRANGE companyLeaderboard 0 9 WITHSCORES - ``` - -- Between rank 10 and 15: - - ```bash - ZREVRANGE companyLeaderboard 9 14 WITHSCORES - ``` - -- Show rank for AAPL, FB and TSLA: - - ```bash - ZREVRANGE companyLeaderBoard company:AAPL company:FB company:TSLA - ``` - -- Add 1 billion to the market cap of the FB company: - - ```bash - ZINCRBY companyLeaderBoard 1000000000 "company:FB" - ``` - -- Reduce the market cap of the FB company by 1 billion: - - ```bash - ZINCRBY companyLeaderBoard -1000000000 "company:FB" - ``` - -- How many companies have a market cap between 500 billion and 1 trillion?: - - ```bash - ZCOUNT companyLeaderBoard 500000000000 1000000000000 - ``` - -- How many companies have a market cap over a Trillion?: - - ```bash - ZCOUNT companyLeaderBoard 1000000000000 +inf - ``` - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [Accessing Ruby-based apps over Redis LaunchPad](https://launchpad.redis.com/) -- [Deploy Java apps on Heroku using Redis](/create/heroku/herokujava) -- [Deploy NodeJS apps on Heroku using Redis](/create/heroku/herokujava) diff --git a/docs/create/heroku/herokuruby/launch_database.png b/docs/create/heroku/herokuruby/launch_database.png deleted file mode 100644 index 67d35afa3be..00000000000 Binary files a/docs/create/heroku/herokuruby/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/herokuruby/try-free.png b/docs/create/heroku/herokuruby/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/herokuruby/try-free.png and /dev/null differ diff --git 
a/docs/create/heroku/index-heroku.mdx b/docs/create/heroku/index-heroku.mdx index 3a216f7489b..e47c0e62beb 100644 --- a/docs/create/heroku/index-heroku.mdx +++ b/docs/create/heroku/index-heroku.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /create/heroku --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provide you with the available options to run apps on Heroku using Redis: @@ -17,53 +17,4 @@ The following links provide you with the available options to run apps on Heroku page="/create/heroku/portal" />
diff --git a/docs/create/heroku/portal/index-heroku.mdx b/docs/create/heroku/portal/index-heroku.mdx index 89326d10421..50d377676cb 100644 --- a/docs/create/heroku/portal/index-heroku.mdx +++ b/docs/create/heroku/portal/index-heroku.mdx @@ -8,46 +8,40 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + Heroku is a cloud Platform as a Service (PaaS) supporting multiple programming languages that is used as a web application deployment model. Heroku lets the developer build, run and scale applications in a similar manner across all the languages(Java, Node.js, Scala, Clojure, Python, PHP, Ruby and Go). -### Using Redis Enterprise Cloud directly +### Using Redis Cloud directly Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis cache delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. -Redis Cloud is a fully-managed cloud service for hosting and running your Redis dataset in a highly-available and scalable manner, with predictable and stable top performance. Redis Enterprise cloud allows you to run Redis server over the Cloud and access instance via multiple ways like RedisInsight, redis command line as well as client tools. You can quickly and easily get your apps up and running with Redis Cloud through its Redis Heroku addons , just tell us how much memory you need and get started instantly with your first Redis database. You can then add more Redis databases (each running in a dedicated process, in a non-blocking manner) and increase or decrease the memory size of your plan without affecting your existing data. +Redis Cloud is a fully-managed cloud service for hosting and running your Redis dataset in a highly-available and scalable manner, with predictable and stable top performance. Redis Cloud allows you to run Redis server over the Cloud and access instance via multiple ways like RedisInsight, redis command line as well as client tools. You can quickly and easily get your apps up and running with Redis Cloud through its Redis Heroku addons , just tell us how much memory you need and get started instantly with your first Redis database. You can then add more Redis databases (each running in a dedicated process, in a non-blocking manner) and increase or decrease the memory size of your plan without affecting your existing data. ::tip INFO Heroku addons are set of tools and services for developing, extending, and operating your app. ::: -You can quickly and easily get your apps up and running with Redis Enterprise Cloud directly. Follow the below steps: - -#### Step 1. Create Redis Enterprise Cloud - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free) +You can quickly and easily get your apps up and running with Redis Cloud directly. Follow the below steps: -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! +#### Step 1. 
Create Redis Cloud -:tada: [Click here to sign up](https://redis.com/try-free) - -::: +Create your free Redis Cloud account by visiting [this link](https://redis.com/try-free) ![recloud](tryfree.png) -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. +[Follow this link to create a Redis Cloud](https://redis.com/try-free) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. Before you proceed with heroku redis, ensure that you can connect to Redis instance and verify if it is accessible via redis-cli command. You can run `info` command that is available in redis client software to see the version, memory usage, stats, and modules enabled in the Redis cloud database. @@ -73,7 +67,7 @@ If you are using Heroku for the first time, create your new Heroku account [thro Logged in as your_email_address ``` -#### Step 5. Connect your application to Redis Enterprise Cloud +#### Step 5. Connect your application to Redis Cloud For this demonstration, we will be using a [Sample Rate Limiting application](https://github.com/redis-developer/basic-rate-limiting-demo-python). @@ -95,7 +89,13 @@ Run the commands below to get a functioning Git repository that contains a simpl #### Step 6. Setting up environment variables -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database connection as shown below: Go to the Heroku dashboard, click "Settings" and set `REDIS_URL` and `REDIS_PASSWORD` under the Config Vars. (Please note that the Redis URL endpoint is unique and might be different in your case. Please enter the values accordingly) +[Follow this link to create a Redis Cloud](https://redis.com/try-free) subscription and database connection as shown below: Go to the Heroku dashboard, click "Settings" and set `REDIS_URL` and `REDIS_PASSWORD` under the Config Vars. + +:::note + +The Redis URL endpoint is unique and might be different in your case. Please enter the values accordingly + +::: Refer to [Step 1](/create/heroku/portal#step-1-create-redis-enterprise-cloud) for the correct values to use. @@ -139,11 +139,13 @@ Check the logs: #### Using Heroku CLI -:::important -Please note that this method won't allow you to choose Redis Modules while creating your Redis database. Also, it doesn't provide you with the flexibility to choose the Cloud platform of your choice. It is recommended to use Redis Enterprise Cloud directly. [Click here to learn more](/create/rediscloud). +:::note + +Please note that this method won't allow you to choose Redis Modules while creating your Redis database. Also, it doesn't provide you with the flexibility to choose the Cloud platform of your choice. It is recommended to use Redis Cloud directly. [Click here to learn more](https://redis.com/try-free). + ::: -In this section, we will create a Heroku account, use the Heroku CLI to login and add Redis Enterprise Cloud as an add-on. +In this section, we will create a Heroku account, use the Heroku CLI to login and add Redis Cloud as an add-on. 
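For orientation, once you are logged in the whole flow reduces to a couple of CLI commands. The sketch below is only an outline: the plan name (`rediscloud:30`, the free 30MB tier) and `your-app-name` are assumptions, so adjust them to your own plan and app.

```bash
# Provision the Redis Cloud add-on on an existing Heroku app (free 30MB plan assumed)
heroku addons:create rediscloud:30 -a your-app-name

# The add-on exposes the database connection string as a config var
heroku config:get REDISCLOUD_URL -a your-app-name
```

The steps below walk through the same flow in detail.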
#### Step 1: Install Heroku @@ -210,13 +212,13 @@ Once Redis Cloud has been added, you will notice a REDISCLOUD_URL config var in REDISCLOUD_URL: redis://default:ajSE7DuqhmGG7u2ZbSU0HTuEqTx1FuEQ@redis-17268.c256.us-east-1-2.ec2.cloud.redislabs.com:17268 ``` -#### Step 5. Accessing the Redis Enterprise Cloud dashboard +#### Step 5. Accessing the Redis Cloud dashboard Go to Heroku and click on “Installed add-ons”: ![heroku](heroku_addons.png) -Click on “Redis Enterprise Cloud” and you will be redirected to the Redis Enterprise Cloud Dashboard. +Click on “Redis Cloud” and you will be redirected to the Redis Cloud Dashboard. ![heroku](heroku_rediscloud1.png) @@ -247,8 +249,10 @@ Open https://lit-island-48230.herokuapp.com/ and access the rate limiting app. #### Using Heroku Dashboard -:::important -Please note that this method won't allow you to choose Redis Modules while creating a Redis database. Also, it doesn't provide you with the flexibility to choose the Cloud platform of your choice. It is recommended to use Redis Enterprise Cloud directly. [Click here to learn more](/create/rediscloud). +:::note + +Please note that this method won't allow you to choose Redis Modules while creating a Redis database. Also, it doesn't provide you with the flexibility to choose the Cloud platform of your choice. It is recommended to use Redis Cloud directly. [Click here to learn more](https://redis.com/try-free). + ::: #### Step 1: Sign up for a Heroku account @@ -302,11 +306,6 @@ Click "Open App" on the top right corner. -### Next Steps - -- [How to build a Java based Rate Limiting application on Heroku using Redis](/howtos/herokujava) -- [How to build a NodeJS based Rate Limiting application on Heroku using Redis](/howtos/herokunodejs) - ##
@@ -315,13 +314,11 @@ Click "Open App" on the top right corner. target="_blank" rel="noopener" className="link"> - Redis Launchpad -
diff --git a/docs/create/heroku/portal/index-heroku.mdx.march25 b/docs/create/heroku/portal/index-heroku.mdx.march25 index 241e033521b..83451dc0e64 100644 --- a/docs/create/heroku/portal/index-heroku.mdx.march25 +++ b/docs/create/heroku/portal/index-heroku.mdx.march25 @@ -1,23 +1,23 @@ --- id: index-heroku -title: Create Redis database on Heroku +title: Create Redis database on Heroku sidebar_label: Redis on Heroku slug: /create/heroku --- import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; -Heroku is a cloud Platform as a Service (PaaS) supporting multiple programming languages that is used as a web application deployment model.Heroku lets the developer build, run and scale applications in a similar manner across all the languages(Java, Node.js, Scala, Clojure, Python, PHP, Ruby and Go). +Heroku is a cloud Platform as a Service (PaaS) supporting multiple programming languages that is used as a web application deployment model.Heroku lets the developer build, run and scale applications in a similar manner across all the languages(Java, Node.js, Scala, Clojure, Python, PHP, Ruby and Go). @@ -25,25 +25,25 @@ Heroku is a cloud Platform as a Service (PaaS) supporting multiple programming l -## Using Redis Enterprise Cloud +## Using Redis Cloud ![heroku](heroku_rediscloud.png) -You can quickly and easily get your apps up and running with Redis Enterprise Cloud directly. Follow the below steps: +You can quickly and easily get your apps up and running with Redis Cloud directly. Follow the below steps: -### Step 1. Create Redis Enterprise Cloud +### Step 1. Create Redis Cloud -Create your free Redis Enterprise Cloud account. Follow this link to know how to create Redis Enterprise Cloud subscription and database as shown below: +Create your free Redis Cloud account. Follow this link to know how to create Redis Cloud subscription and database as shown below: ![heroku](rediscloud.png) Save the database endpoint URL and password for future reference. -### Step 2. Connect your application to Redis Enterprise Cloud +### Step 2. Connect your application to Redis Cloud -Here’s a sample rate limiting application that you can connect to Redis Enterprise Cloud: +Here’s a sample rate limiting application that you can connect to Redis Cloud: #### Cloning the repository @@ -53,7 +53,7 @@ Here’s a sample rate limiting application that you can connect to Redis Enterp #### Install the below software: -- Python +- Python - Docker - Docker Compose @@ -70,7 +70,7 @@ The above command will run Redis container ``` $ docker-compose ps - Name Command State Ports + Name Command State Ports --------------------------------------------------------------------------------------------------- redis.redisratelimiting.docker docker-entrypoint.sh redis ... Up 127.0.0.1:55561->6379/tcp ``` @@ -117,7 +117,7 @@ Replace: ![heroku](create_heroku.png) -### Step 4. Create a Heroku app +### Step 4. Create a Heroku app Login to Heroku @@ -158,16 +158,16 @@ You now have a functioning Git repository that contains a simple application as 7.4.3 urllib3-1.26.2 uvicorn-0.13.2 uvloop-0.14.0 remote: -----> $ python server/manage.py collectstatic --noinput remote: 137 static files copied to '/tmp/build_3e723f51/server/static_root'. - remote: + remote: remote: -----> Discovering process types remote: Procfile declares types -> web - remote: + remote: remote: -----> Compressing... 
remote: Done: 59.4M remote: -----> Launching... remote: Released v5 remote: https://salty-harbor-93142.herokuapp.com/ deployed to Heroku - remote: + remote: remote: Verifying deploy... done. To https://git.heroku.com/salty-harbor-93142.git * [new branch] master -> master @@ -180,9 +180,9 @@ You now have a functioning Git repository that contains a simple application as ## Addons using Heroku CLI -You can use Heroku CLI to login and add Redis Enterprise Cloud as an add-on. +You can use Heroku CLI to login and add Redis Cloud as an add-on. -### Step 1: Install Heroku +### Step 1: Install Heroku ``` @@ -194,14 +194,14 @@ Assuming that you already have Heroku account created, run the below command to ``` $ heroku login - heroku: Press any key to open up the browser to login or q to exit + heroku: Press any key to open up the browser to login or q to exit Opening browser to https://cli-auth.heroku.com/auth/cli/browser/XXXXXXXXXXA Logging in... done Logged in as youremailaddress ``` -### Step 2. Installing Redis Enterprise Cloud Add-on +### Step 2. Installing Redis Cloud Add-on Create a Heroku app @@ -269,11 +269,11 @@ You can even browse it through Heroku Dashboard: ![heroku](heroku_selectrecloud.png) -Click on “Redis Enterprise Cloud” and it will be redirected over Redis Enterprise Cloud Dashboard +Click on “Redis Cloud” and it will be redirected over Redis Cloud Dashboard ![heroku](heroku_recloud.png) -As shown above, a database called "redis-kickstartredis-XXX" gets created over Redis Enterprise Cloud dashboard. +As shown above, a database called "redis-kickstartredis-XXX" gets created over Redis Cloud dashboard. @@ -284,14 +284,14 @@ As shown above, a database called "redis-kickstartredis-XXX" gets created over R -### Step 1: Sign-in to Heroku +### Step 1: Sign-in to Heroku -Open https://dashboard.heroku.com/apps and create a new sample application. For this demo, I have deployed an application with the name “kickstartredis”. +Open https://dashboard.heroku.com/apps and create a new sample application. For this demo, I have deployed an application with the name “kickstartredis”. -### Step 2: Install Redis Enterprise Cloud +### Step 2: Install Redis Cloud -Open https://elements.heroku.com/addons/rediscloud and click on “Install Redis Enterprise Cloud” to sign up with Heroku. +Open https://elements.heroku.com/addons/rediscloud and click on “Install Redis Cloud” to sign up with Heroku. ![heroku](heroku2.png) @@ -307,17 +307,17 @@ Once you sign in, you will see “Online Order Form” as shown below: ![heroku](orderform.png) -### Step 5. Provision Redis Enterprise Cloud database +### Step 5. Provision Redis Cloud database -Provision Redis Enterprise Cloud on your personal application(which in the above case is “kickstartredis”. +Provision Redis Cloud on your personal application(which in the above case is “kickstartredis”. ![heroku](heroku_selectrecloud.png) -Click on “Redis Enterprise Cloud” and it will be redirected over Redis Enterprise Cloud Dashboard +Click on “Redis Cloud” and it will be redirected over Redis Cloud Dashboard ![heroku](heroku_recloud.png) -As shown above, a database called "redis-kickstartredis-XXX" gets created over Redis Enterprise Cloud dashboard. +As shown above, a database called "redis-kickstartredis-XXX" gets created over Redis Cloud dashboard. ### Step 6. 
Accessing the database @@ -333,7 +333,4 @@ redis-11324.c82.us-east-1-2.ec2.cloud.redislabs.com:11324> - - - - + diff --git a/docs/create/heroku/portal/index-heroku.mdx.python b/docs/create/heroku/portal/index-heroku.mdx.python index 67e46a5dda7..3227c2e815e 100644 --- a/docs/create/heroku/portal/index-heroku.mdx.python +++ b/docs/create/heroku/portal/index-heroku.mdx.python @@ -1,13 +1,13 @@ --- id: index-heroku -title: Create Redis database on Heroku +title: Create Redis database on Heroku sidebar_label: Redis on Heroku slug: /create/heroku --- import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; There are two ways to create Redis database on Heroku: - Using Heroku Dashboard UI @@ -28,36 +28,36 @@ There are two ways to create Redis database on Heroku: ### Prerequisite: -### Step 1: +### Step 1: -Sign in to Heroku, open https://dashboard.heroku.com/apps and create a new sample application. For this demo, I have deployed an application with the name “kickstartredis”. +Sign in to Heroku, open https://dashboard.heroku.com/apps and create a new sample application. For this demo, I have deployed an application with the name “kickstartredis”. -### Step 2: +### Step 2: -Open https://elements.heroku.com/addons/rediscloud and click on “Install Redis Enterprise Cloud” to sign up with Heroku. +Open https://elements.heroku.com/addons/rediscloud and click on “Install Redis Cloud” to sign up with Heroku. ![heroku](heroku2.png) -### Step 3: +### Step 3: For this demonstration, we will be picking up a 30MB Free plan(see Plans and Pricing below). ![heroku](pricing.png) -### Step 4: +### Step 4: Once you sign in, you will see “Online Order Form” as shown below: ![heroku](orderform.png) -### Step 5. +### Step 5. -Provision Redis Enterprise Cloud on your personal application(which in the above case is “kickstartredis”. +Provision Redis Cloud on your personal application(which in the above case is “kickstartredis”. ### References -- [Redis Enterprise Cloud on Heroku](https://elements.heroku.com/addons/rediscloud) +- [Redis Cloud on Heroku](https://elements.heroku.com/addons/rediscloud) - [Pricing](https://elements.heroku.com/addons/rediscloud#pricing) - [Region Availability](https://elements.heroku.com/addons/rediscloud#region-map) - [Documentation](https://elements.heroku.com/addons/rediscloud#docs) @@ -70,7 +70,7 @@ Provision Redis Enterprise Cloud on your personal application(which in the above ## Using Heroku CLI -You can use Heroku CLI to login and add Redis Enterprise Cloud as an add-on. +You can use Heroku CLI to login and add Redis Cloud as an add-on. ### Step 1: Install Heroku on MacOS @@ -84,7 +84,7 @@ Assuming that you already have Heroku account created, run the below command to ``` $ heroku login -heroku: Press any key to open up the browser to login or q to exit: +heroku: Press any key to open up the browser to login or q to exit: Opening browser to https://cli-auth.heroku.com/auth/cli/browser/4788f936-3557-439f-ab37-95338b735cf2?requestor=XXXXXXXXXXXA.vhF7XtVTtsp9xliwwrHG5ytuirrmn9EfT6Ef3WuzXFE Logging in... 
done Logged in as your_email_address @@ -151,7 +151,7 @@ You can even browse it through Heroku Dashboard: ![heroku](heroku_selectrecloud.png) -Click on “Redis Enterprise Cloud” and it will be redirected over Redis Enterprise Cloud Dashboard +Click on “Redis Cloud” and it will be redirected over Redis Cloud Dashboard ![heroku](heroku_recloud.png) @@ -186,7 +186,7 @@ You now have a functioning Git repository that contains a simple application as ### Deploy the app -It is recommended to use Redis Enterprise Cloud Page for creating the database as it allows you to add Redis modules of your choice. Also, it provides you freedom to choose Cloud other than AWS for creating the database. +It is recommended to use Redis Cloud Page for creating the database as it allows you to add Redis modules of your choice. Also, it provides you freedom to choose Cloud other than AWS for creating the database. In this step you will deploy the app to Heroku. @@ -203,14 +203,14 @@ When you create an app, a git remote (called heroku) is also created and associa Heroku generates a random name (in this case fast-reef-76278) for your app, or you can pass a parameter to specify your own app name. Now deploy your code: - + ``` $ git push heroku main ``` ``` -$ git push heroku +$ git push heroku Enumerating objects: 512, done. Counting objects: 100% (512/512), done. Delta compression using up to 12 threads @@ -219,7 +219,7 @@ Writing objects: 100% (512/512), 1.52 MiB | 660.00 KiB/s, done. Total 512 (delta 244), reused 512 (delta 244) remote: Compressing source files... done. remote: Building source: -remote: +remote: remote: -----> Building on the Heroku-20 stack remote: -----> Determining which buildpack to use for this app remote: -----> Python app detected @@ -230,7 +230,7 @@ remote: Done: 59.3M remote: -----> Launching... remote: Released v5 remote: https://fast-reef-76278.herokuapp.com/ deployed to Heroku -remote: +remote: remote: Verifying deploy... done. 
To https://git.heroku.com/fast-reef-76278.git * [new branch] master -> master @@ -258,10 +258,3 @@ content-goes-here - - - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) diff --git a/docs/create/heroku/ratelimiting-go/images/image1.png b/docs/create/heroku/ratelimiting-go/images/image1.png deleted file mode 100644 index ade7fc97756..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image1.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/image11.png b/docs/create/heroku/ratelimiting-go/images/image11.png deleted file mode 100644 index 974c27e27f8..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image11.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/image2.png b/docs/create/heroku/ratelimiting-go/images/image2.png deleted file mode 100644 index 82a5a27d435..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image2.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/image3.png b/docs/create/heroku/ratelimiting-go/images/image3.png deleted file mode 100644 index 1e16593df16..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image3.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/image4.png b/docs/create/heroku/ratelimiting-go/images/image4.png deleted file mode 100644 index 0c24820e77f..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image4.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/image5.png b/docs/create/heroku/ratelimiting-go/images/image5.png deleted file mode 100644 index 3fbb2162caf..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/image5.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/launch_database.png b/docs/create/heroku/ratelimiting-go/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/launch_database.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/images/try-free.png b/docs/create/heroku/ratelimiting-go/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/heroku/ratelimiting-go/images/try-free.png and /dev/null differ diff --git a/docs/create/heroku/ratelimiting-go/index-ratelimitinggo.mdx b/docs/create/heroku/ratelimiting-go/index-ratelimitinggo.mdx deleted file mode 100644 index 94acbe9de44..00000000000 --- a/docs/create/heroku/ratelimiting-go/index-ratelimitinggo.mdx +++ /dev/null @@ -1,213 +0,0 @@ ---- -id: index-ratelimitinggo -title: Deploy a Redis Rate Limiting app on Heroku -sidebar_label: How to deploy a Redis Rate Limiting application on Heroku -slug: /create/heroku/ratelimiting-go -authors: [ajeet] ---- - -Rate limiting is a mechanism that many developers may have to deal with at some point in their life. It’s useful for a variety of purposes like sharing access to limited resources or limiting the number of requests made to an API endpoint and responding with a 429 status code. Building a rate limiter with Redis is easy because of two commands [INCR](https://redis.io/commands/incr) and [EXPIRE](https://redis.io/commands/expire). The basic concept is that you want to limit requests to a particular service in a given time period. 
Let’s say we have a service that has users identified by an API key. This service states that it is limited to 20 requests in any given minute. - -In this tutorial, we will see how to deploy Rate Limiting using Redis and Go. - -#### Step 1. Create Redis Enterprise Cloud - -Create your free Redis Enterprise Cloud account by visiting [this link](https://redis.com/try-free) - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![recloud](images/try-free.png) - -[Follow this link to create a Redis Enterprise Cloud](/create/rediscloud) subscription and database. Once you create the database, you will be provisioned with a unique database endpoint URL, port and password. Save these for future reference. - -### Step 2. Create a Heroku account - -If you are using Heroku for the first time, create your new Heroku account [through this link](https://signup.heroku.com/login). - -![alt_text](images/image2.png) - -### Step 3. Install Heroku CLI on your system - -```bash - brew install heroku -``` - -### Step 4. Login to Heroku - -```bash - heroku login -``` - -### Step 5. Connect your application to Redis Enterprise Cloud - -For this demonstration, we will be using a Simple Rate Limiting application using Go. - -#### Clone the repository - -```bash - git clone https://github.com/basic-redis-rate-limiting-demo-go-lang -``` - -```bash - heroku create - - Creating app... done, ⬢ powerful-fortress-83061 - - https://powerful-fortress-83061.herokuapp.com/ | https://git.heroku.com/powerful-fortress-83061.git -``` - -![alt_text](images/image3.png) - -### Step 6. Setting up Environment Variables - -Go to the Heroku dashboard, click "Settings" and set REDIS_HOST, REDIS_PORT and REDIS_PASSWORD under the Config Vars. Refer to Step 1 for the correct values to use. - -![alt_text](images/image4.png) - -You now have a functioning Git repository that contains a simple application as well as a package.json file, which is used by Node’s dependency manager. - -### Step 7. Deploy your code - -Heroku generates a random name (in this case [powerful-fortress-83061](https://powerful-fortress-83061.herokuapp.com/)) for your app, or you can pass a parameter to specify your own app name. Now deploy your code: - -```bash - git push heroku -``` - -````bash - Enumerating objects: 171, done. - Counting objects: 100% (171/171), done. - Delta compression using up to 12 threads - Compressing objects: 100% (86/86), done. - Writing objects: 100% (171/171), 5.65 MiB | 2.18 MiB/s, done. - Total 171 (delta 74), reused 171 (delta 74), pack-reused 0 - remote: Compressing source files... done. - remote: Building source: - remote: ** ** - remote: ** Installed the following binaries:** - remote: ** ./bin/basic-redis-rate-limiting-demo-go-lang** - remote: ** ** - remote: ** Created a Procfile with the following entries:** - remote: ** web: bin/basic-redis-rate-limiting-demo-go-lang** - remote: ** ** - remote: ** If these entries look incomplete or incorrect please create a Procfile with the required entries.** - remote: ** See https://devcenter.heroku.com/articles/procfile for more details about Procfiles** - remote: ** ** - remote: -----> Discovering process types - remote: Procfile declares types -> web - remote: - remote: -----> Compressing... - remote: Done: 9.6M - remote: -----> Launching... 
- remote: Released v7 - remote: https://powerful-fortress-83061.herokuapp.com/ deployed to Heroku - remote: - remote: Verifying deploy... done. -To https://git.heroku.com/powerful-fortress-83061.git - - * [new branch] master -> master - -If your app doesn point to the right repository, you can manually add it: - -```bash - heroku git:remote -a powerful-fortress-83061 -```` - -### Step 8. Accessing the app - -Open [https://powerful-fortress-83061.herokuapp.com/](https://powerful-fortress-83061.herokuapp.com/) to see your application. - -![alt_text](images/image5.png) - -## How it works - -This app will block connections from a client after surpassing certain amount of requests (default: 10) per time period (default: 10 sec). The application returns the following headers in response to each request. The values of these headers tell the user how many requests they have remaining before they reach the limit. On the 10th run the server should return an HTTP status code of **429 Too Many Requests** - -### Cookies - -The application uses cookies to identify users. On the first request, the user will receive a cookie back from the server if one didn't previously exist. -`CookieName: user-limiter` -`CookieValue: md5()` -`` - request time in a format: `2006-01-02 15:04:05.999999999 -0700 MST` - -### Redis Commands - -- Read requests for user by `user-limiter` cookie: `GET requests.` - get `USER_IDENTIFIER` from request cookie - - E.g `GET requests.0cbc6611f5540bd0809a388dc95a615b` -- Set request counter with expired 10 sec if not exist in `requests.`: `SETEX requests. 10 0` - - E.g `SETEX requests.0cbc6611f5540bd0809a388dc95a615b 10 0` -- Increment requests counter for each of user request: `INC requests.` - - E.g `INC requests.0cbc6611f5540bd0809a388dc95a615b` -- Get requests number for user: `GET requests.` - - E.g `GET requests.0cbc6611f5540bd0809a388dc95a615b` - -### Code for Rate Limiting - -```Go -func (c Controller) AcceptedRequest(user string, limit int) (int, bool) { - key := c.key(user) - - if _, err := c.r.Get(key); err == redis.Nil { - err := c.r.Set(key, "0", time.Second * time.Duration(limit)) - if err != nil { - log.Println(err) - return 0, false - } - } - - if err := c.r.Inc(key); err != nil { - log.Println(err) - return 0, false - } - - requests, err := c.r.Get(key) - if err != nil { - log.Println(err) - return 0,false - } - requestsNum, err := strconv.Atoi(requests) - if err != nil { - log.Println(err) - return 0, false - } - - if requestsNum > limit { - return requestsNum, false - } - - return requestsNum, true - } -``` - -Where `c` corresponds to the active controller and `c.r` is a Redis client. 
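For reference, the same window-and-counter flow can be reproduced directly with `redis-cli`. This is only an illustrative sketch that reuses the example cookie hash from the command list above, assuming the default limit of 10 requests per 10-second window:

```bash
# Illustrative only: the key reuses the example cookie hash shown above.
KEY="requests.0cbc6611f5540bd0809a388dc95a615b"

# Create the counter with a 10-second expiry (the controller above only
# does this when the preceding GET returns nil, i.e. a fresh window).
redis-cli SETEX "$KEY" 10 0

# Each incoming request increments the per-user counter.
redis-cli INCR "$KEY"

# Read the counter back; a value above the limit maps to HTTP 429.
redis-cli GET "$KEY"

# Seconds remaining in the current window.
redis-cli TTL "$KEY"
```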
- -### Response - -#### Status codes - -- `200 - OK` - responded `PONG` -- `406 - Not Acceptable` - could not read cookie from request, returned when cookies are not allowed on the client side -- `429 - Too Many Requests` - user send more than 10 requests / 10sec - -#### Headers - -- `X-RateLimit-Limit: 10` - allowed number of limits per 10sec -- `X-RateLimit-Remaining: 9` - number of left request in 10sec window - -### Available commands - -- [SETEX](https://redis.io/commands/setex) -- [GET](https://redis.io/commands/get) -- [DEL](https://redis.io/commands/del) -- [INCR](https://redis.io/commands/incr) - -### References - -- [Deploy a Java app on Heroku using Redis](/create/heroku/herokujava) -- [Deploy a NodeJS app on Heroku using Redis](/create/heroku/herokunodejs) -- [Deploy a Python app on Heroku using Redis](/create/heroku/herokupython) diff --git a/docs/create/homebrew/images/add_database.png b/docs/create/homebrew/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/create/homebrew/images/add_database.png and /dev/null differ diff --git a/docs/create/homebrew/images/database_creds.png b/docs/create/homebrew/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/create/homebrew/images/database_creds.png and /dev/null differ diff --git a/docs/create/homebrew/images/database_details.png b/docs/create/homebrew/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/create/homebrew/images/database_details.png and /dev/null differ diff --git a/docs/create/homebrew/images/details_database.png b/docs/create/homebrew/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/homebrew/images/details_database.png and /dev/null differ diff --git a/docs/create/homebrew/images/select_cloud_vendor.png b/docs/create/homebrew/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/create/homebrew/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis1.png b/docs/create/homebrew/images/testredis1.png deleted file mode 100644 index 88bfdf751b5..00000000000 Binary files a/docs/create/homebrew/images/testredis1.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis2.png b/docs/create/homebrew/images/testredis2.png deleted file mode 100644 index 712038efad5..00000000000 Binary files a/docs/create/homebrew/images/testredis2.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis3.png b/docs/create/homebrew/images/testredis3.png deleted file mode 100644 index c366b3b8ca6..00000000000 Binary files a/docs/create/homebrew/images/testredis3.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis4.png b/docs/create/homebrew/images/testredis4.png deleted file mode 100644 index 15cbdccf451..00000000000 Binary files a/docs/create/homebrew/images/testredis4.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis5.png b/docs/create/homebrew/images/testredis5.png deleted file mode 100644 index 00e05e4c408..00000000000 Binary files a/docs/create/homebrew/images/testredis5.png and /dev/null differ diff --git a/docs/create/homebrew/images/testredis6.png b/docs/create/homebrew/images/testredis6.png deleted file mode 100644 index dce00ef0632..00000000000 Binary files a/docs/create/homebrew/images/testredis6.png and /dev/null differ diff --git 
a/docs/create/homebrew/index-homebrew.mdx b/docs/create/homebrew/index-homebrew.mdx deleted file mode 100644 index 4a5993a2121..00000000000 --- a/docs/create/homebrew/index-homebrew.mdx +++ /dev/null @@ -1,245 +0,0 @@ ---- -id: index-homebrew -title: Create a Redis database on Mac OS -sidebar_label: Redis on Mac OS -slug: /create/homebrew/ -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - - - - -To install Redis Stack on mac OS, use Homebrew. Make sure that you have Homebrew installed before starting on the installation instructions below. - -Follow the instructions below to setup Redis Stack on your Mac OS: - -### Step 1. Install Redis Stack using Homebrew - -First, tap the Redis Stack Homebrew tap and then run `brew install` as shown below: - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -This will install all Redis and Redis Stack binaries. How you run these binaries depends on whether you already have Redis installed on your system. - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -:::info INFO - -If this is the first time you’ve installed Redis on your system, then all Redis Stack binaries will be installed and accessible from the `$PATH`. On M1 Macs, this assumes that `/opt/homebrew/bin` is in your path. On Intel-based Macs, `/usr/local/bin` should be in your path. - -To check this, run: - -```bash - echo $PATH -``` - -Then, confirm that the output contains `/opt/homebrew/bin` (M1 Mac) or `/usr/local/bin` (Intel Mac). If these directories are not in the output, see the “Existing Redis installation” instructions below. -::: - -### Start Redis Stack Server - -You can now start Redis Stack Server as follows: - -```bash - redis-stack-server -``` - -### Existing Redis installation - -If you have an existing Redis installation on your system, then you’ll need to modify your `PATH` environment variable to ensure that you’re using the latest Redis Stack binaries. - -Open the file `~/.bashrc` or `~/zshrc` (depending on your shell), and add the following line. - -```bash - export PATH=/usr/local/Caskroom/redis-stack-server//bin:$PATH -``` - -Go to Applications and click "RedisInsight Preview" to bring up the Redis Desktop GUI tool. - -### Step 2. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 3. Enter Redis database details - -Add the local Redis database endpoint and port. - -![access redisinsight](images/testredis1.png) - -### Step 5. Redis for time series - -Redis Stack provides you with a native time series data structure. Let's see how a time series might be useful in our bike shop. - -As we have multiple physical shops too, alongside our online shop, it could be helpful to have an overview of the sales volume. We will create one time series per shop tracking the total amount of all sales. In addition, we will mark the time series with the appropriate region label, east or west. This kind of representation will allow us to easily query bike sales performance per certain time periods, per shop, per region or across all shops. 
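To make the idea concrete before touching RedisInsight, here is a minimal `redis-cli` sketch of the kind of queries this layout enables. It assumes the `bike_sales_*` series created in the step below already exist, and the sale amount is purely illustrative:

```bash
# Record a sale of 100 in shop 1 ('*' lets the server pick the timestamp;
# quoted so the shell does not expand it).
redis-cli TS.ADD bike_sales_1 '*' 100

# Hourly sales totals for a single shop.
redis-cli TS.RANGE bike_sales_1 - + AGGREGATION sum 3600000

# Hourly sales totals across every shop carrying the east region label.
redis-cli TS.MRANGE - + AGGREGATION sum 3600000 FILTER region=east
```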
- -Click "Guides" icon (just below the key) in the left sidebar and choose "Redis for the time series" for this demonstration. - -![redis for timeseries](images/testredis2.png) - -### Step 6. Create time series per shop - -```bash - TS.CREATE bike_sales_1 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_2 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_3 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_4 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_5 DUPLICATE_POLICY SUM LABELS region west compacted no -``` - -As shown in the following query, we make the shop id (1,2,3,4,5) a part of the time series name. You might also notice the `DUPLICATE_POLICY SUM` argument; this describes what should be done when two events in the same time series share the same timestamp: In this case, it would mean that two sales happened at exactly the same time, so the resulting value should be a sum of the two sales amounts. - -Since the metrics are collected with a millisecond timestamp, we can compact our time series into sales per hour: - -![create time series per shop](images/testredis3.png) - -### Step 7. Running the query - -![execute the query](images/testredis4.png) - -### Step 8. Time series compaction - -RedisTimeSeries supports downsampling with the following aggregations: avg, sum, min, max, range, count, first and last. If you want to keep all of your raw data points indefinitely, your data set grows linearly over time. However, if your use case allows you to have less fine-grained data further back in time, downsampling can be applied. This allows you to keep fewer historical data points by aggregating raw data for a given time window using a given aggregation function. - -#### Example: - -``` - TS.CREATERULE bike_sales_5 bike_sales_5_per_day AGGREGATION sum 86400000 -``` - -![time series compaction](images/testredis6.png) - - - - -There are two ways to install Redis on Mac OS: - -- [Installing Redis from source](/create/from-source/) -- Using Homebrew - -Homebrew is the easiest and most flexible way to install Redis on Mac OS. It is a package management software for Mac OS. -It automates the Redis installation process, making it quick and easy to add Redis to your system. - -Follow the below steps to install Redis on Mac OS using `brew service`: - -### Step 1: Install Homebrew - -Run the following command to install and start brew service: - -```bash - /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" -``` - -### Step 2: Install Redis using Homebrew package manager - -Use the following commands to install Redis using brew service: - -```bash - brew update - brew install redis -``` - -### Step 3: Start Redis server - -Run the following command to start the Redis database in the background: - -```bash - brew services start redis -``` - -In order to run the latest version of Redis, you will need to compile Redis from the source. -[Follow this link](/create/from-source/) to learn more about it. - -### Step 4: Test if Redis server is running. - -```bash - redis-cli ping -``` - -It should return PONG. This command is often used to test if a connection is still alive. 
- -### Step 5: Launch Redis on system boot - -```bash - ln -sfv /usr/local/opt/redis/*.plist ~/Library/LaunchAgents -``` - -### Start Redis server via “launchctl” command - -```bash - launchctl load ~/Library/LaunchAgents/homebrew.mxcl.redis.plist -``` - -### Step 6: Run Redis service using a Redis configuration file - -```bash - redis-server /usr/local/etc/redis.conf -``` - -### Step 7: Interacting with Redis Client - -```bash - redis-cli - redis> set foo bar - OK - redis> get foo - "bar" -``` - -### Step 8: Stop the Redis service - -``` - brew services stop redis -``` - -### Step 9: Uninstall Redis - -```bash - brew uninstall redis -``` - -### Next Steps - -- [Connect to Redis database using RedisInsight](/explore/redisinsight) -- [Develop with Java and Redis](/develop/java) -- [Develop with Python and Redis](/develop/python) - - - - -## - - diff --git a/docs/create/homebrew/launchpad.png b/docs/create/homebrew/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/create/homebrew/launchpad.png and /dev/null differ diff --git a/docs/create/index-create.mdx b/docs/create/index-create.mdx index ebf06e91a69..a05647d6b99 100644 --- a/docs/create/index-create.mdx +++ b/docs/create/index-create.mdx @@ -5,19 +5,11 @@ sidebar_label: Overview - All Quick Starts slug: /create --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following quick starts shows various ways of how to get started and create a new Redis database:
diff --git a/docs/create/index-create.mdx.orig b/docs/create/index-create.mdx.orig deleted file mode 100644 index ff5eedfff3e..00000000000 --- a/docs/create/index-create.mdx.orig +++ /dev/null @@ -1,74 +0,0 @@ ---- -id: index-create -title: Create Database -sidebar_label: Create Database -slug: /create/ ---- - -import useBaseUrl from '@docusaurus/useBaseUrl'; - -The following links provides you with the available options to create a new Redis database either on the Cloud or using local software. (Docker, Redis Enterprise, from Sources) - -
[Deleted content: a card grid linking to the Redis Enterprise Cloud, Docker, Redis from Source, and Redis Enterprise & Kubernetes quick starts (JSX markup not recoverable)]
diff --git a/docs/create/jenkins/index-jenkins.mdx b/docs/create/jenkins/index-jenkins.mdx index 97e8e3ec6f0..00d14f3729e 100644 --- a/docs/create/jenkins/index-jenkins.mdx +++ b/docs/create/jenkins/index-jenkins.mdx @@ -6,6 +6,10 @@ slug: /create/jenkins authors: [ajeet, matthew] --- +import Authors from '@theme/Authors'; + + + [Jenkins](https://www.jenkins.io/) is currently [the most popular CI(Continuous Integration) tool](https://cd.foundation/announcement/2019/08/14/jenkins-celebrates-15-years/), with ~15M users. It is an open source automation server which enables developers to reliably build, test, and deploy their software. It was forked in 2011 from a project called Hudson after a [dispute with Oracle](https://www.infoq.com/news/2011/01/jenkins/), and is used for [Continuous Integration and Continuous Delivery (CI/CD)](https://stackoverflow.com/questions/28608015/continuous-integration-vs-continuous-delivery-vs-continuous-deployment) and test automation. Jenkins is based on Java and provides over [1700 plugins](https://plugins.jenkins.io/) to automate your developer workflow and save a lot of your time in executing your repetitive tasks. ![image](images/image1.png) diff --git a/docs/create/kubernetes/kubernetes-gke/gke1.png b/docs/create/kubernetes/_kubernetes-gke/gke1.png similarity index 100% rename from docs/create/kubernetes/kubernetes-gke/gke1.png rename to docs/create/kubernetes/_kubernetes-gke/gke1.png diff --git a/docs/create/kubernetes/kubernetes-gke/gke2.png b/docs/create/kubernetes/_kubernetes-gke/gke2.png similarity index 100% rename from docs/create/kubernetes/kubernetes-gke/gke2.png rename to docs/create/kubernetes/_kubernetes-gke/gke2.png diff --git a/docs/create/kubernetes/kubernetes-gke/gke3.png b/docs/create/kubernetes/_kubernetes-gke/gke3.png similarity index 100% rename from docs/create/kubernetes/kubernetes-gke/gke3.png rename to docs/create/kubernetes/_kubernetes-gke/gke3.png diff --git a/docs/create/kubernetes/kubernetes-gke/gke4.png b/docs/create/kubernetes/_kubernetes-gke/gke4.png similarity index 100% rename from docs/create/kubernetes/kubernetes-gke/gke4.png rename to docs/create/kubernetes/_kubernetes-gke/gke4.png diff --git a/docs/create/kubernetes/kubernetes-gke/index-kubernetes-gke.mdx b/docs/create/kubernetes/_kubernetes-gke/index-kubernetes-gke.mdx similarity index 97% rename from docs/create/kubernetes/kubernetes-gke/index-kubernetes-gke.mdx rename to docs/create/kubernetes/_kubernetes-gke/index-kubernetes-gke.mdx index 882315ef0b5..c3e9a162c42 100644 --- a/docs/create/kubernetes/kubernetes-gke/index-kubernetes-gke.mdx +++ b/docs/create/kubernetes/_kubernetes-gke/index-kubernetes-gke.mdx @@ -6,10 +6,9 @@ slug: /create/kubernetes/kubernetes-gke authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ### Step 1. 
Pre-requisites @@ -150,13 +149,11 @@ Open `https://localhost:8443` in the browser to see the Redis Enterprise Softwar target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/create/docker/redis-on-docker/launchpad.png b/docs/create/kubernetes/_kubernetes-gke/launchpad.png similarity index 100% rename from docs/create/docker/redis-on-docker/launchpad.png rename to docs/create/kubernetes/_kubernetes-gke/launchpad.png diff --git a/docs/create/kubernetes/kubernetes-gke/re_kubernetes.png b/docs/create/kubernetes/_kubernetes-gke/re_kubernetes.png similarity index 100% rename from docs/create/kubernetes/kubernetes-gke/re_kubernetes.png rename to docs/create/kubernetes/_kubernetes-gke/re_kubernetes.png diff --git a/docs/create/kubernetes/index-kubernetes.mdx b/docs/create/kubernetes/index-kubernetes.mdx index ac572766a15..ca4cd19e10a 100644 --- a/docs/create/kubernetes/index-kubernetes.mdx +++ b/docs/create/kubernetes/index-kubernetes.mdx @@ -5,19 +5,11 @@ sidebar_label: Overview slug: /create/kubernetes/ --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provide you with the available options to create a new Redis database on Kubernetes Platforms
![My Image](images/image1.png) @@ -219,7 +218,7 @@ Kubernetes makes physical storage devices available to your cluster in the form An Operator is basically an application-specific controller that can help you manage a Kubernetes application. It is a way to package, run, and maintain a Kubernetes application. It is designed to extend the capabilities of Kubernetes, and also simplify application management. This is especially useful for stateful applications, which include persistent storage and other elements external to the application, and may require extra work to manage and maintain. -:::info TIP +:::tip The Operator Framework is an open source project that provides developer and runtime Kubernetes tools, enabling you to accelerate the development of an operator. [Learn more about operator framework here](https://operatorframework.io/) ::: @@ -323,6 +322,5 @@ In the next tutorial, you will learn how to get started with the Redis Enterpris - [Create Redis database on Google Kubernetes Engine](/create/kubernetes/) - [Redis Enterprise Software on Kubernetes architecture ](https://docs.redis.com/latest/kubernetes/deployment/quick-start/) -- [Installing RedisInsight using Helm chart](/explore/redisinsight/usinghelm) - [Deploy Redis Enterprise Software on Kubernetes](https://docs.redis.com/latest/kubernetes/deployment/quick-start/) - [Operator Framework](https://operatorframework.io/) diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth.png deleted file mode 100644 index 36b06add560..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth2.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth2.png deleted file mode 100644 index c50139a9e90..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-auth2.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-deploylog.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-deploylog.png deleted file mode 100644 index a672863105b..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-deploylog.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importcode.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importcode.png deleted file mode 100644 index ee65b7bd4dc..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importcode.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importexisting.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importexisting.png deleted file mode 100644 index f67320ed1ba..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-importexisting.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-signin.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-signin.png deleted file mode 100644 index 5c1217d6347..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-signin.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-success.png 
b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-success.png deleted file mode 100644 index aefd150c803..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify-success.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify.png deleted file mode 100644 index 72ba227ce5e..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_buildenviron.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_buildenviron.png deleted file mode 100644 index bf8e55423c5..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_buildenviron.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_cd.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_cd.png deleted file mode 100644 index 81c9541fea6..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_cd.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_clickpreview.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_clickpreview.png deleted file mode 100644 index c0d53f0c3dc..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_clickpreview.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_connectgit.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_connectgit.png deleted file mode 100644 index 4533dca51db..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_connectgit.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dash.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dash.png deleted file mode 100644 index 2cffed31be9..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dash.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dashboard.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dashboard.png deleted file mode 100644 index bf096a53903..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_dashboard.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_devhub.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_devhub.png deleted file mode 100644 index 536dc932d4a..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_devhub.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_git.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_git.png deleted file mode 100644 index 43dcc3ed757..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_git.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_import-to-github.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_import-to-github.png deleted file mode 100644 index f6746bc61a2..00000000000 Binary files 
a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_import-to-github.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_sitedeploy-inprogress.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_sitedeploy-inprogress.png deleted file mode 100644 index 400831cda09..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_sitedeploy-inprogress.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_trigger.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_trigger.png deleted file mode 100644 index 9c869292d70..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlify_trigger.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyflow.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyflow.png deleted file mode 100644 index 93e0b33a73c..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyflow.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyimage.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyimage.png deleted file mode 100644 index a1e3a82d3b1..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyimage.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyinit.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyinit.png deleted file mode 100644 index 9a9cb15c5c3..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyinit.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyworkflow.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyworkflow.png deleted file mode 100644 index 78ec0a26c8d..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/netlifyworkflow.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/images/preview.png b/docs/create/netlify/deploy-docusaurus-to-netlify/images/preview.png deleted file mode 100644 index b5d585f7b26..00000000000 Binary files a/docs/create/netlify/deploy-docusaurus-to-netlify/images/preview.png and /dev/null differ diff --git a/docs/create/netlify/deploy-docusaurus-to-netlify/index-deploy-docusaurus-to-netlify.mdx b/docs/create/netlify/deploy-docusaurus-to-netlify/index-deploy-docusaurus-to-netlify.mdx deleted file mode 100644 index 6cec542ad6d..00000000000 --- a/docs/create/netlify/deploy-docusaurus-to-netlify/index-deploy-docusaurus-to-netlify.mdx +++ /dev/null @@ -1,121 +0,0 @@ ---- -id: index-deploy-docusaurus-to-netlify -title: How to Deploy Docusaurus to Netlify in 5 Minutes -sidebar_label: Deploy Docusaurus to Netlify in 5 Minutes -slug: /create/netlify/deploy-docusaurus-to-netlify -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Millions of developers use Netlify to instantly build, deploy and scale their modern web applications. The platform comes with the first class-support for every popular framework like JAMstack, React, VueJS, NextJS, Gatsby, AngularJS, Nuxt, Eleventy, Svelte, Hugo, Astro and so on. - -![MyImage](images/preview.png) - -The Netlify platform allows developers to build and deploy their website to the global network (CDN) from Git in a convenient way. 
It delivers out-of-the-box continuous integration and continuous deployment. Developers love Netlify because it allows them to focus on building and deploying apps by abstracting all the maintenance work away from them. Features like free SSL, Custom Domain, deploy previews, functions and workflows etc makes Netlify the most comprehensive platform for web projects. - -Netlify CMS is an open source tool that allows non-technical users to easily manage and update content generated by a static site generator. -[Here's a documentation link](https://www.netlifycms.org/docs/intro/) for Netlify CMS if you're interested to learn more. - -
- -[In the last blog post](/create/netlify/getting-started-with-netlify), we leveraged [Netlify CLI](https://docs.netlify.com/cli/get-started/) to build a simple Next.js application built using TailwindCSS and Redis. -In this blog, you will see how to deploy a Docusaurus website to Netlify Dashboard UI in 5 minutes. - -Let's get started.. - -### Table of Contents - -- Step 1. Sign-in for a new Netlify Account -- Step 2. Connect Netlify to a Git Provider -- Step 3. Import the GitHub repo to your GitHub account -- Step 4. Provide Netlify access to your GitHub repo -- Step 5. Configure site settings for Netlify -- Step 6. Deploy your static website -- Step 7. Visit your new Docusaurus site on Netlify - -### Step 1. Sign-in for a new Netlify Account - -Visit [https://app.netlify.com/](https://app.netlify.com/) and sign up for a Netlify account. - -![MyImage](images/netlify-signin.png) - -### Step 2. Connect your Netlify account to your Git Provider - -Netlify allows you to sign-in using various authentication services, including GitHub, GitLab, Bitbucket, Email and SSO. -For this demo we'll use GitHub. Sign into GitHub to connect it to Netlify. - -![MyImage](images/netlify_git.png) - -### Step 3. Import the project files to your GitHub account - -Once you connect your Netlify account to GitHub, you can start collaborating with your other team members. -Before we do that, let's push a sample Docusaurus site to our Git repository. You can use a generic Docusaurus site if you want. The Redis Developer Hub is built on Docusaurus, so we are using that instead. - -![MyImage](images/netlify-importexisting.png) - -### Step 4. Allow Netlify to access the GitHub repository - -Next, Netlify will allow you to import an existing project from a GitHub repository as shown below: - -![MyImage](images/netlify-importcode.png) - -### Step 5. Configure site settings for Netlify - -There are two essential settings/changes that need to be configured. -First, change the URL under `docusaurus.config.js` to any other random URL as shown below: - -```javascript title="docusaurus.config.js" -.... -module.exports = { - title: 'Redis Developer Hub', - tagline: 'The Home of Redis Developers', - url: 'https://docusaurus-2.netlify.app', - baseUrl: '/', - onBrokenLinks: 'throw', -... -... -``` - -Secondly, you will need to add a build command as shown below: -![MyImage](images/netlify_buildenviron.png) - -### Step 6. Deploy your static website - -Click "Deploys" on the top navigation, you will see an option "Trigger Deploy" on the right-side. -Choose "Deploy site". If you are performing it for the second time, then choose "Clear cache and deploy site" option. - -![MyImage](images/netlify_sitedeploy-inprogress.png) - -Monitor the "Deploy Log" carefully to see if any error messages appear in the log. - -![MyImage](images/netlify-deploylog.png) - -You should now be able to see your Docusaurus site hosted on port 3000. -![MyImage](images/netlify-success.png) - -### Step 7. Visit your new Docusaurus site on Netlify - -Go to "Sites" on the top navigation menu and click on the latest preview build. - -![MyImage](images/netlify_clickpreview.png) - -You will able to see that Netlify uploads site assets to a content delivery network and makes your site available. 
-![MyImage](images/netlify_devhub.png) - -### References - -- [Redis Developer Hub Source Code](https://github.com/redis-developer/redis-developer.github.io) -- [Netlify Build - A Modern CI-CD Infrastructure for Frontend Teams](https://www.netlify.com/products/build/) -- [Netlify Functions](https://www.netlify.com/products/functions/) -- [Netlify Edge](https://www.netlify.com/products/edge/) -- [Netlify WorkFlow](https://www.netlify.com/products/workflow/) -- [Netlify Analytics](https://www.netlify.com/products/analytics/) diff --git a/docs/create/netlify/getting-started-with-netlify/access.png b/docs/create/netlify/getting-started-with-netlify/access.png deleted file mode 100644 index bf40ced4f10..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/access.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/cachingapp.png b/docs/create/netlify/getting-started-with-netlify/cachingapp.png deleted file mode 100644 index ba1f9444c99..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/cachingapp.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/clicktoaccess.png b/docs/create/netlify/getting-started-with-netlify/clicktoaccess.png deleted file mode 100644 index bdc2886078c..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/clicktoaccess.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/database_details.png b/docs/create/netlify/getting-started-with-netlify/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/database_details.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/demoapp.png b/docs/create/netlify/getting-started-with-netlify/demoapp.png deleted file mode 100644 index f67e1420246..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/demoapp.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/details_database.png b/docs/create/netlify/getting-started-with-netlify/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/details_database.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/environ_2.png b/docs/create/netlify/getting-started-with-netlify/environ_2.png deleted file mode 100644 index c024372b4ee..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/environ_2.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/finalimage.png b/docs/create/netlify/getting-started-with-netlify/finalimage.png deleted file mode 100644 index 0d9fce95b01..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/finalimage.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/helloreact.png b/docs/create/netlify/getting-started-with-netlify/helloreact.png deleted file mode 100644 index 3952158d763..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/helloreact.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/index-getting-started-with-netlify.mdx b/docs/create/netlify/getting-started-with-netlify/index-getting-started-with-netlify.mdx deleted file mode 100644 index d533c6756a3..00000000000 --- 
a/docs/create/netlify/getting-started-with-netlify/index-getting-started-with-netlify.mdx +++ /dev/null @@ -1,208 +0,0 @@ ---- -id: index-getting-started-with-netlify -title: Getting Started with Netlify and Redis -sidebar_label: Getting Started with Netlify and Redis -slug: /create/netlify/getting-started-with-netlify -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -Netlify is a popular static site hosting serverless platform. It is a popular way to build, deploy, and scale modern web applications in a much more scalable and secure way. - -Netlify helps developers to launch websites and campaigns in minutes with no fuss. Netlify is built primarily for JAMstack sites, which unify JavaScript and APIs to allow applications that are well suited for both developers and content editors. - -![netlify](netlify_redis.png) - -### Features of Netlify - -- It delivers out-of-the-box continuous integration. -- The platform allows continuous deployment through its support for Git repository deployment. -- It allows developers to focus on building and deploying apps by abstracting all the maintenance work away from the developers. -- The platform provides free SSL, CDN, and continuous integration. -- It has built-in DNS management & SSL certificates. - -### How Netlify works? - -![netlify](netlifyimage.png) - -1. The developer writes code and stores it in a version control repository (e.g. GitHub). -2. When a new change is merged into the main branch of the repository, a webhook notifies Netlify to deploy a new site. -3. Netlify pulls the latest version of the app from the repository and runs a build command to generate the static site files -4. Netlify then uses Plugins and internal code to make adjustments to your site, pre-render all of your pages in static HTML and improves it further - / -5. Once the build process gets completed, Netlify takes the static assets and pushes them to its global CDN for fast delivery. - -In this tutorial, you will see how to deploy a simple Redis caching app built with Next.js and TailwindCSS to Netlify in 5 minutes. - -### Table of Contents - -- Step 1. Setup a Free Redis Enterprise Cloud Account -- Step 2. Install Netlify CLI -- Step 3. Clone the GitHub repository -- Step 4. Login to Netlify via CLI -- Step 5. Configure Continuous Deployment -- Step 6. Pushing the changes to GitHub -- Step 7. Open the Netlify Admin URL -- Step 8. Add Environment Variable -- Step 9. Trigger the deployment -- Step 10. Accessing the app - -### Step 1. Setup a free Redis Enterprise Cloud account - -Visit [https://developer.redis.com/create/rediscloud/](https://developer.redis.com/create/rediscloud/) and create a free Redis Enterprise Cloud account. Enable the “RediSearch” module while you create the Redis Enterprise Cloud database. Once you complete the tutorial, you will be provided with the database Endpoint URL and password. Save it for future reference. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![rediscloud](details_database.png) - -### Step 2. Install Netlify CLI - -Netlify’s command line interface (CLI) lets you configure continuous deployment directly from the command line. 
Run the below command to install Netlify CLI on your local laptop: - -```bash - -npm install netlify-cli -g -``` - -Verify if Netlify is installed or not by running the below command: - -```bash - netlify version - netlify-cli/8.15.3 darwin-x64 node-v14.17.3 -``` - -### Step 3. Clone the repository - -```bash - git clone https://github.com/redis-developer/nextjs-redis-netlify -``` - -### Step 4. Login to Netlify via CLI - -To authenticate and obtain an access token using the command line, run the following command to login to your Netlify account: - -```bash - netlify login -``` - -This will open a browser window, asking you to log in with Netlify and grant access to Netlify CLI. -Once you authenticate, it will ask you to close the window and display the below results: - -
Result - -``` - Already logged in via netlify config on your machine - - Run netlify status for account details - - To see all available commands run: netlify help -``` - -
- -### Step 5. Configure continuous deployent - -The `netlify init` command allows you to configure continuous deployment for a new or existing site. -It will also allow you to create netlify.toml file if it doesn't exists. - -``` -netlify init -``` - -
Result - -``` -netlify init -? What would you like to do? + Create & configure a new site -? Team: Redis -Choose a unique site name (e.g. super-cool-site-by-redisdeveloper.netlify.app) or leave it blank for a random name. You can update the site name later. -? Site name (optional): undefined - -Site Created - -Admin URL: https://app.netlify.com/sites/super-cool-site-by-redisdeveloper -URL: https://super-cool-site-by-redisdeveloper.netlify.app -Site ID: a70bcfb7-b7b1-4fdd-be8b-5eb3b5dbd404 - -Linked to super-cool-site-by-redis-developer in /Users/redisdeveloper/projects/netlify/basic-caching-demo-nodejs/.netlify/state.json -? Your build command (hugo build/yarn run build/etc): yarn start -? Directory to deploy (blank for current dir): dist -? Netlify functions folder: functions -Adding deploy key to repository... -Deploy key added! - -Creating Netlify GitHub Notification Hooks... -Netlify Notification Hooks configured! - -Success! Netlify CI/CD Configured! - -This site is now configured to automatically deploy from github branches & pull requests - -Next steps: - - git push Push to your git repository to trigger new site builds - netlify open Open the Netlify admin URL of your site -``` - -
- -The above step creates a `netlify.toml` file with the following content - -```javacript title="netlify.toml" - [build] - command = "npm run build" - publish = ".next" - -[[plugins]] - package = "@netlify/plugin-nextjs" - -``` - -### Step 6. Pushing the changes to GitHub - -As instructed by Netlify, run the below commands to push the latest changes to GitHub: - -``` -git add . -git commit -m “Pushing the latest changes” -git push -``` - -### Step 7. Open the Netlify Admin URL - -```bash - netlify open --admin -``` - -### Step 8. Add Environment Variable for Redis Enterprise Cloud - -![Environment Variable](environ_2.png) - -### Step 9. Trigger the deployment - -Click "Trigger deploy" to deploy the site - -![MyImage1](redisdeveloper_3.png) - -## Step 10. Accessing the app - -Click on the deploy URL and you should be able to access the app as shown: - -![MyImage](runningapp.png) - -### Try it Yourself - -[![Deploy to Netlify](https://www.netlify.com/img/deploy/button.svg)](https://app.netlify.com/start/deploy?repository=https://github.com/redis-developer/nextjs-redis-netlify) - -### References - -- [Redis Caching app using Next.js, TailwindCSS and Redis](https://github.com/redis-developer/nextjs-redis-netlify) -- [Introduction to Netlify](https://www.netlify.com/) -- [Netlify Functions](https://www.netlify.com/products/functions/) -- [Overview of JAMstack](https://www.netlify.com/jamstack/) diff --git a/docs/create/netlify/getting-started-with-netlify/localsite.png b/docs/create/netlify/getting-started-with-netlify/localsite.png deleted file mode 100644 index e76ab0212c2..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/localsite.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/login.png b/docs/create/netlify/getting-started-with-netlify/login.png deleted file mode 100644 index 8d5af1dbfff..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/login.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlify-logo.jpeg b/docs/create/netlify/getting-started-with-netlify/netlify-logo.jpeg deleted file mode 100644 index 0a937d52539..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlify-logo.jpeg and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlify.png b/docs/create/netlify/getting-started-with-netlify/netlify.png deleted file mode 100644 index 72ba227ce5e..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlify.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlify_dash.png b/docs/create/netlify/getting-started-with-netlify/netlify_dash.png deleted file mode 100644 index 2cffed31be9..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlify_dash.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlify_dashboard.png b/docs/create/netlify/getting-started-with-netlify/netlify_dashboard.png deleted file mode 100644 index bf096a53903..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlify_dashboard.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlify_redis.png b/docs/create/netlify/getting-started-with-netlify/netlify_redis.png deleted file mode 100644 index bd585721cca..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlify_redis.png and /dev/null differ diff --git 
a/docs/create/netlify/getting-started-with-netlify/netlifyflow.png b/docs/create/netlify/getting-started-with-netlify/netlifyflow.png deleted file mode 100644 index 93e0b33a73c..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlifyflow.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlifyimage.png b/docs/create/netlify/getting-started-with-netlify/netlifyimage.png deleted file mode 100644 index a1e3a82d3b1..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlifyimage.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlifyinit.png b/docs/create/netlify/getting-started-with-netlify/netlifyinit.png deleted file mode 100644 index 9a9cb15c5c3..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlifyinit.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlifylogo.gif b/docs/create/netlify/getting-started-with-netlify/netlifylogo.gif deleted file mode 100644 index 6fe723cc1b2..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlifylogo.gif and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/netlifyworkflow.png b/docs/create/netlify/getting-started-with-netlify/netlifyworkflow.png deleted file mode 100644 index 78ec0a26c8d..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/netlifyworkflow.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/recloud_1.png b/docs/create/netlify/getting-started-with-netlify/recloud_1.png deleted file mode 100644 index f37fd1352b2..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/recloud_1.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/recloud_search.png b/docs/create/netlify/getting-started-with-netlify/recloud_search.png deleted file mode 100644 index eaeb5232e3d..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/recloud_search.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/rediscloud.png b/docs/create/netlify/getting-started-with-netlify/rediscloud.png deleted file mode 100644 index 766590adcd7..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/rediscloud.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/rediscloud_redisearch.png b/docs/create/netlify/getting-started-with-netlify/rediscloud_redisearch.png deleted file mode 100644 index 9d5dbdc9c99..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/rediscloud_redisearch.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/rediscloud_search.png b/docs/create/netlify/getting-started-with-netlify/rediscloud_search.png deleted file mode 100644 index bc851c77fd3..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/rediscloud_search.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/redisdeveloper_3.png b/docs/create/netlify/getting-started-with-netlify/redisdeveloper_3.png deleted file mode 100644 index 8259e57f317..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/redisdeveloper_3.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/runningapp.png b/docs/create/netlify/getting-started-with-netlify/runningapp.png deleted file mode 
100644 index 1a90e78ffb0..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/runningapp.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/sitename.png b/docs/create/netlify/getting-started-with-netlify/sitename.png deleted file mode 100644 index 915a60cbd57..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/sitename.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/siteview.png b/docs/create/netlify/getting-started-with-netlify/siteview.png deleted file mode 100644 index 9826b33792b..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/siteview.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/success.png b/docs/create/netlify/getting-started-with-netlify/success.png deleted file mode 100644 index c8d9e7a75d8..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/success.png and /dev/null differ diff --git a/docs/create/netlify/getting-started-with-netlify/viewsite.png b/docs/create/netlify/getting-started-with-netlify/viewsite.png deleted file mode 100644 index 318bf24161c..00000000000 Binary files a/docs/create/netlify/getting-started-with-netlify/viewsite.png and /dev/null differ diff --git a/docs/create/openshift/index-openshift.mdx b/docs/create/openshift/index-openshift.mdx index 39a3488b8e1..4b9188eb492 100644 --- a/docs/create/openshift/index-openshift.mdx +++ b/docs/create/openshift/index-openshift.mdx @@ -6,6 +6,10 @@ slug: /create/openshift authors: [karan, sumit, ajeet] --- +import Authors from '@theme/Authors'; + + + Deploying and managing containerized applications is not easy. The rise of microservice architecture has made it cumbersome to deploy containers across multiple environments. Given that containers can be spun up in seconds and at a much higher volume compared to VMs, managing containers across multiple platforms can be extremely challenging. [Red Hat OpenShift](https://www.redhat.com/en/technologies/cloud-computing/openshift/kubernetes-engine) to the rescue! OpenShift, the leading enterprise Kubernetes platform, enables a cloud-like experience wherever it’s deployed—in the cloud, on premises, or at the edge. OpenShift is built upon Kubernetes, which is designed and built from the ground up to deploy and manage containerized applications across hundreds of compute nodes. 
@@ -363,4 +367,3 @@ In our next tutorial, we will learn how to deploy a sample Real-time chat applic - [Deploy Redis Enterprise Software on Kubernetes with OpenShift](https://docs.redis.com/latest/kubernetes/deployment/openshift/) - [Official OpenShift Documentation](https://docs.openshift.com/) - [Install a Cluster quickly on GCP](https://docs.openshift.com/container-platform/4.7/installing/installing_gcp/installing-gcp-default.html) -- [Redis Enterprise on Kubernetes Github Repository](https://github.com/RedisLabs/redis-enterprise-k8s-docs) diff --git a/docs/create/portainer/images/portainer.png b/docs/create/portainer/images/portainer.png deleted file mode 100644 index c1baf9c6d63..00000000000 Binary files a/docs/create/portainer/images/portainer.png and /dev/null differ diff --git a/docs/create/portainer/images/redis1.png b/docs/create/portainer/images/redis1.png deleted file mode 100644 index b8b4c21492c..00000000000 Binary files a/docs/create/portainer/images/redis1.png and /dev/null differ diff --git a/docs/create/portainer/images/redis10.png b/docs/create/portainer/images/redis10.png deleted file mode 100644 index 41b6ad3f214..00000000000 Binary files a/docs/create/portainer/images/redis10.png and /dev/null differ diff --git a/docs/create/portainer/images/redis11.png b/docs/create/portainer/images/redis11.png deleted file mode 100644 index 71197e0cebc..00000000000 Binary files a/docs/create/portainer/images/redis11.png and /dev/null differ diff --git a/docs/create/portainer/images/redis12.png b/docs/create/portainer/images/redis12.png deleted file mode 100644 index e03972bf974..00000000000 Binary files a/docs/create/portainer/images/redis12.png and /dev/null differ diff --git a/docs/create/portainer/images/redis13.png b/docs/create/portainer/images/redis13.png deleted file mode 100644 index 7e2cde0f8e2..00000000000 Binary files a/docs/create/portainer/images/redis13.png and /dev/null differ diff --git a/docs/create/portainer/images/redis14.png b/docs/create/portainer/images/redis14.png deleted file mode 100644 index fbb4291f0c0..00000000000 Binary files a/docs/create/portainer/images/redis14.png and /dev/null differ diff --git a/docs/create/portainer/images/redis15.png b/docs/create/portainer/images/redis15.png deleted file mode 100644 index e0cd1444955..00000000000 Binary files a/docs/create/portainer/images/redis15.png and /dev/null differ diff --git a/docs/create/portainer/images/redis16.png b/docs/create/portainer/images/redis16.png deleted file mode 100644 index a1468365bd0..00000000000 Binary files a/docs/create/portainer/images/redis16.png and /dev/null differ diff --git a/docs/create/portainer/images/redis17.png b/docs/create/portainer/images/redis17.png deleted file mode 100644 index b543f8f57a2..00000000000 Binary files a/docs/create/portainer/images/redis17.png and /dev/null differ diff --git a/docs/create/portainer/images/redis18.png b/docs/create/portainer/images/redis18.png deleted file mode 100644 index 74f33f36d67..00000000000 Binary files a/docs/create/portainer/images/redis18.png and /dev/null differ diff --git a/docs/create/portainer/images/redis19.png b/docs/create/portainer/images/redis19.png deleted file mode 100644 index d9f4253fd7f..00000000000 Binary files a/docs/create/portainer/images/redis19.png and /dev/null differ diff --git a/docs/create/portainer/images/redis2.png b/docs/create/portainer/images/redis2.png deleted file mode 100644 index 059bfa16df2..00000000000 Binary files a/docs/create/portainer/images/redis2.png and /dev/null differ diff --git 
a/docs/create/portainer/images/redis3.png b/docs/create/portainer/images/redis3.png deleted file mode 100644 index e148d4b38b6..00000000000 Binary files a/docs/create/portainer/images/redis3.png and /dev/null differ diff --git a/docs/create/portainer/images/redis4.png b/docs/create/portainer/images/redis4.png deleted file mode 100644 index e6b4c8b66b6..00000000000 Binary files a/docs/create/portainer/images/redis4.png and /dev/null differ diff --git a/docs/create/portainer/images/redis5.png b/docs/create/portainer/images/redis5.png deleted file mode 100644 index 2e5e388b225..00000000000 Binary files a/docs/create/portainer/images/redis5.png and /dev/null differ diff --git a/docs/create/portainer/images/redis6.png b/docs/create/portainer/images/redis6.png deleted file mode 100644 index 9ad313587ff..00000000000 Binary files a/docs/create/portainer/images/redis6.png and /dev/null differ diff --git a/docs/create/portainer/images/redis7.png b/docs/create/portainer/images/redis7.png deleted file mode 100644 index c14c7cc55c8..00000000000 Binary files a/docs/create/portainer/images/redis7.png and /dev/null differ diff --git a/docs/create/portainer/images/redis8.png b/docs/create/portainer/images/redis8.png deleted file mode 100644 index 4988bad6021..00000000000 Binary files a/docs/create/portainer/images/redis8.png and /dev/null differ diff --git a/docs/create/portainer/images/redis9.png b/docs/create/portainer/images/redis9.png deleted file mode 100644 index 9c504d9b717..00000000000 Binary files a/docs/create/portainer/images/redis9.png and /dev/null differ diff --git a/docs/create/portainer/index-portainer.mdx b/docs/create/portainer/index-portainer.mdx deleted file mode 100644 index 4c8bb356fde..00000000000 --- a/docs/create/portainer/index-portainer.mdx +++ /dev/null @@ -1,130 +0,0 @@ ---- -id: index-portainer -title: Getting Started with Portainer and Redis -sidebar_label: Getting Started with Portainer and Redis -slug: /create/portainer -authors: [ryan] ---- - -Redis is an in-memory data structure store, used as a distributed, in-memory key–value database, cache and message broker, with optional durability. Redis supports different kinds of abstract data structures, such as strings, lists, maps, sets, sorted sets, HyperLogLogs, bitmaps, streams, and spatial indices. [~Wikipedia](https://en.wikipedia.org/wiki/Redis) - -![portainer and redis](images/portainer.png) - -You can’t travel far in the modern software world without finding that you need an in-memory data store. Today, the answer to “Which data store?” is often Redis. Also today, the answer to “How to Redis?” can be Portainer. - -In this blog we'll take you through using Portainer to set up Redis in three common scenarios. - -### Scenario 1: Kubernetes - -Many organizations are either already using Kubernetes in some capacity or they are on a journey to adopt it, so let’s start there. - -First, log into Portainer and select a Kubernetes environment to manage. Then, in the navigation menu on the left, click **"Namespaces"**, and then click the **"Add namespace with form"** button. - -- Give the namespace a name, in the case “**redis**”. -- Turn off **"Resource assignment"** (only for the purpose of simplicity for this demo). -- Click the **“Create namespace”** button. - -![create_namespace](images/redis1.png) - -Now that we have a namespace for Redis to operate in, let’s install it. - -In the navigation menu on the left, click **“Helm”**. 
If you haven’t already done so, add the Bitnami charts repository by typing https://charts.bitnami.com/bitnami into the **“Additional repositories”** form, and then click **“Add repository”**. - -![helm](images/redis2.png) - -Once the Bitnami charts repository has been added, you should see a list of Charts on this page. Find **"Redis"** and click on it. - -Note, you will see redis-cluster listed as an option. The redis-cluster Helm chart configures a six node cluster; three masters, and three slaves. The redis Helm chart we will use configures a much simpler three node cluster; one master, and two slaves. There are a number of other differences between these two Helm charts. For a complete list of differences, Bitnami has a good description [here](https://docs.bitnami.com/kubernetes/infrastructure/redis/get-started/compare-solutions/). - -![redis-in-helm-list](images/redis3.png) - -Next, scroll to the top of the page to configure Redis for deployment. - -- Select **“redis”** in the Namespace dropdown. -- Enter **“redis”** for the Name. -- Click **“Show custom values”**. I am going to expose Redis via NodePort 31000. I picked port 31000 because I know that ports 31000-31010 are open to my cluster. To get this done I will set the **.service.type\* to NodePort and the **.service.nodePorts.redis\* to 31000. As you can see in the screenshot below, these can currently be found on lines 431 and 441 in the Helm chart. -- Click the **“Install”** button. - -![redis-helm-custom-values](images/redis4.png) - -When it’s finished, you can head to the Applications page by clicking Applications in the navigation menu on the left. When Kubernetes is finished bringing up Redis, you will see the Status as **“Ready”**. - -![redis-ready](images/redis5.png) - -Note that what is deployed is a STATEFULSET application, which means it persists data. You can see the volumes which have been created (which use the default storage class of your system) by clicking **"Volumes"**. - -![volumes](images/redis6.png) - -See that each pod has its own copy/replica of the DB content, and note that it defaults to 8GB in size. If you need to change this, then its line 409 in the values file of the HELM deployment. - -![change_volumes_size](images/redis7.png) - -And that’s it. The only thing left to do is test it. Before we do that, we’re going to take a short detour. There are two facts about the Helm chart install of Redis that you should know. First, Redis will come up requiring authentication to connect to it. Second, a random password was created during installation. To find it, in Portainer click on “ConfigMaps & Secrets” in the navigation menu on the left. Find the secret named **“redis”** and click on it. The password that you’ll need to authenticate with is the value of the _redis-password_ key. - -![find-redis-password](images/redis8.png) - -With that in hand, you can test that your Redis server is running. My typical rudimentary test of a Redis deployment is to connect to it with the redis-cli from my laptop and increment an arbitrary key. You’ll see that the client is connecting to our NodePort here and using the password we found in our Secrets. - -![k8s-test](images/redis9.png) -In a shocking plot twist, this worked. - -### Scenario 2: Docker Swarm - -Long live Docker Swarm! There is still plenty of Swarm in the wild and Portainer is managing a lot of it. Let’s bring up Redis in Swarm. 
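Before stepping through the Portainer walkthrough that follows, it can help to see roughly what will be created. The block below is only a sketch of a single-instance Redis service on Swarm using the plain Docker CLI, not the multi-node Redis Cluster that the App Template in the next steps deploys; the service name, image tag, and `my-password-here` password are illustrative placeholders.

```bash
# Sketch only: one Redis replica as a Swarm service, published on Redis's default port.
docker service create \
  --name redis \
  --replicas 1 \
  --publish 6379:6379 \
  redis:7 redis-server --requirepass my-password-here

# Quick check from any node in the swarm (expected reply: PONG):
redis-cli -p 6379 -a my-password-here --no-auth-warning PING
```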
- -For the purposes of this demo, we will use a Portainer App Template to deploy Redis, and this assumes you are using Portainer’s provided App Template Repo. You can check that under “Settings”. - -![settings](images/redis10.png) - -Now that you know you are good to go, - -- Click "App Templates" in the navigation menu on the left. -- Find and click on Redis Cluster. - -![redis-cluster](images/redis11.png) - -- Fill in the name, in this case **"redis"**. -- Provide a SECURE password for **Redis**. -- Click the **“Deploy the stack”** button. - -![deploy-stack](images/redis12.png) - -You will be taken to the “Stacks list” page and will see the new stack named “redis”. Click on “redis” to see information about this stack. Like this: - -![stack-info](images/redis13.png) - -To test, expand one of the services that you see on the stack details page (above). Then click on the container’s “exec console” icon. - -![console-access-icon](images/redis14.png) - -- Click on the “Connect” button to start the shell. - -![console-connect](images/redis15.png) - -Once the console opens, the Redis deployment can be tested like so: -In case it’s difficult to see, the command used to connect to a redis node is `redis-cli -h redis-node-1 -c -a my-password-here --no-auth-warning` - -![swarm-test](images/redis16.png) - -### Scenario 3: Docker & Docker Swarm - Can I just have a container? - -Sometimes I just want a Redis container, not a whole situation. Just a quick, unsophisticated Redis container. Here’s how to get that done in Portainer. - -- Click **"App Templates"** in the navigation menu on the left. -- Toggle the **"Show container templates"** switch to on. - -![show-container-templates-toggle](images/redis17.png) - -- Find and click on Redis. -- Give the application a name, in this case "redis". -- Click on “Show advanced options”. -- Set the port to map. In this example, the Docker **host**'s port 6379 is forwarded to the **container**'s port 6379, Redis’s default port for most communications. -- Click on the “Deploy the container” button. - -![deploy-container](images/redis18.png) - -That’s it. You can test in the same way as before. - -![container-container](images/redis19.png) - -Three different scenarios - Three easy Redis deployments using Portainer diff --git a/docs/create/redis-functions/index-redis-functions.mdx b/docs/create/redis-functions/index-redis-functions.mdx index ff62eedda1a..e0d9dbed54c 100644 --- a/docs/create/redis-functions/index-redis-functions.mdx +++ b/docs/create/redis-functions/index-redis-functions.mdx @@ -6,8 +6,12 @@ slug: /create/redis-functions authors: [elena] --- +import Authors from '@theme/Authors'; + # Getting started with Redis Functions + + The most impactful addition to Redis version 7.0 is **Redis Functions** - a new programmability option, improving on scripts by **adding modularity, reusability, and better overall developer experience**. Functions are, in contrast to scripts, persisted in the .rdb and .aof files as well as automatically replicated to all the replicas, which makes them a first-class citizen of Redis. 
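To make that concrete, here is a minimal sketch of loading and calling a function with `redis-cli` against a Redis 7.x server; the library name `mylib` and function name `myping` are placeholders used only for illustration.

```bash
# Save a tiny function library; the shebang line names the library.
$ cat > mylib.lua <<'EOF'
#!lua name=mylib
redis.register_function('myping', function(keys, args)
  return 'PONG from a function'
end)
EOF

# redis-cli -x passes the file contents as the last argument to FUNCTION LOAD,
# which replies with the library name. FCALL then invokes the function; the
# trailing 0 is the number of keys being passed.
$ cat mylib.lua | redis-cli -x FUNCTION LOAD
"mylib"
$ redis-cli FCALL myping 0
"PONG from a function"
```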
@@ -31,7 +35,11 @@ Alternatively, you can spin up a Docker container with Redis Stack: $ docker run -p 6379:6379 --name redis-7.0 -it --rm redis/redis-stack:7.0.0-RC4 ``` -> Note: In the rest of this tutorial we’ll use the `$` character to indicate that the command needs to be run on the command prompt and `redis-cli>` to indicate the same for a redis-cli prompt.\_ +:::note + +In the rest of this tutorial we’ll use the `$` character to indicate that the command needs to be run on the command prompt and `redis-cli>` to indicate the same for a redis-cli prompt. + +::: ### Warm-Up diff --git a/docs/create/redis-stack/images/Verify_subscription.png b/docs/create/redis-stack/images/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/create/redis-stack/images/Verify_subscription.png and /dev/null differ diff --git a/docs/create/redis-stack/images/add_credentials.png b/docs/create/redis-stack/images/add_credentials.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/create/redis-stack/images/add_credentials.png and /dev/null differ diff --git a/docs/create/redis-stack/images/add_database.png b/docs/create/redis-stack/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/create/redis-stack/images/add_database.png and /dev/null differ diff --git a/docs/create/redis-stack/images/create_database.png b/docs/create/redis-stack/images/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/create/redis-stack/images/create_database.png and /dev/null differ diff --git a/docs/create/redis-stack/images/create_subscription.png b/docs/create/redis-stack/images/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/create/redis-stack/images/create_subscription.png and /dev/null differ diff --git a/docs/create/redis-stack/images/database_creds.png b/docs/create/redis-stack/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/create/redis-stack/images/database_creds.png and /dev/null differ diff --git a/docs/create/redis-stack/images/database_details.png b/docs/create/redis-stack/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/create/redis-stack/images/database_details.png and /dev/null differ diff --git a/docs/create/redis-stack/images/details_database.png b/docs/create/redis-stack/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/redis-stack/images/details_database.png and /dev/null differ diff --git a/docs/create/redis-stack/images/endpoint.png b/docs/create/redis-stack/images/endpoint.png deleted file mode 100644 index 2f29297b95d..00000000000 Binary files a/docs/create/redis-stack/images/endpoint.png and /dev/null differ diff --git a/docs/create/redis-stack/images/final_subscription.png b/docs/create/redis-stack/images/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/create/redis-stack/images/final_subscription.png and /dev/null differ diff --git a/docs/create/redis-stack/images/json_keys.png b/docs/create/redis-stack/images/json_keys.png deleted file mode 100644 index 326c21895f0..00000000000 Binary files a/docs/create/redis-stack/images/json_keys.png and /dev/null differ diff --git a/docs/create/redis-stack/images/json_results.png b/docs/create/redis-stack/images/json_results.png deleted 
file mode 100644 index d9cef33678f..00000000000 Binary files a/docs/create/redis-stack/images/json_results.png and /dev/null differ diff --git a/docs/create/redis-stack/images/json_workbench.png b/docs/create/redis-stack/images/json_workbench.png deleted file mode 100644 index d1f769b28f3..00000000000 Binary files a/docs/create/redis-stack/images/json_workbench.png and /dev/null differ diff --git a/docs/create/redis-stack/images/launch_database.png b/docs/create/redis-stack/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/redis-stack/images/launch_database.png and /dev/null differ diff --git a/docs/create/redis-stack/images/local_database_creds.png b/docs/create/redis-stack/images/local_database_creds.png deleted file mode 100644 index 4f91de2bf15..00000000000 Binary files a/docs/create/redis-stack/images/local_database_creds.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud1.png b/docs/create/redis-stack/images/recloud1.png deleted file mode 100644 index ec9e61f84bc..00000000000 Binary files a/docs/create/redis-stack/images/recloud1.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud2.png b/docs/create/redis-stack/images/recloud2.png deleted file mode 100644 index d2ed5dfaae8..00000000000 Binary files a/docs/create/redis-stack/images/recloud2.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud3.png b/docs/create/redis-stack/images/recloud3.png deleted file mode 100644 index aaead51bdfd..00000000000 Binary files a/docs/create/redis-stack/images/recloud3.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud4.png b/docs/create/redis-stack/images/recloud4.png deleted file mode 100644 index 1688e1b4493..00000000000 Binary files a/docs/create/redis-stack/images/recloud4.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud5.png b/docs/create/redis-stack/images/recloud5.png deleted file mode 100644 index 671cc8b92e1..00000000000 Binary files a/docs/create/redis-stack/images/recloud5.png and /dev/null differ diff --git a/docs/create/redis-stack/images/recloud_1.png b/docs/create/redis-stack/images/recloud_1.png deleted file mode 100644 index f37fd1352b2..00000000000 Binary files a/docs/create/redis-stack/images/recloud_1.png and /dev/null differ diff --git a/docs/create/redis-stack/images/redis-stack-components.png b/docs/create/redis-stack/images/redis-stack-components.png deleted file mode 100644 index 3d59b516be5..00000000000 Binary files a/docs/create/redis-stack/images/redis-stack-components.png and /dev/null differ diff --git a/docs/create/redis-stack/images/redis-stack.png b/docs/create/redis-stack/images/redis-stack.png deleted file mode 100644 index 63e43de504f..00000000000 Binary files a/docs/create/redis-stack/images/redis-stack.png and /dev/null differ diff --git a/docs/create/redis-stack/images/redisinsight_access.png b/docs/create/redis-stack/images/redisinsight_access.png deleted file mode 100644 index 9108228f946..00000000000 Binary files a/docs/create/redis-stack/images/redisinsight_access.png and /dev/null differ diff --git a/docs/create/redis-stack/images/redisinsight_creds.png b/docs/create/redis-stack/images/redisinsight_creds.png deleted file mode 100644 index bb14c5476cf..00000000000 Binary files a/docs/create/redis-stack/images/redisinsight_creds.png and /dev/null differ diff --git a/docs/create/redis-stack/images/redisstack.png b/docs/create/redis-stack/images/redisstack.png deleted file 
mode 100644 index 56839f81811..00000000000 Binary files a/docs/create/redis-stack/images/redisstack.png and /dev/null differ diff --git a/docs/create/redis-stack/images/security_enable.png b/docs/create/redis-stack/images/security_enable.png deleted file mode 100644 index abae839b8cf..00000000000 Binary files a/docs/create/redis-stack/images/security_enable.png and /dev/null differ diff --git a/docs/create/redis-stack/images/select_cloud.png b/docs/create/redis-stack/images/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/create/redis-stack/images/select_cloud.png and /dev/null differ diff --git a/docs/create/redis-stack/images/select_cloud_vendor.png b/docs/create/redis-stack/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/create/redis-stack/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/create/redis-stack/images/select_subscription.png b/docs/create/redis-stack/images/select_subscription.png deleted file mode 100644 index 764333343f7..00000000000 Binary files a/docs/create/redis-stack/images/select_subscription.png and /dev/null differ diff --git a/docs/create/redis-stack/images/stack1.png b/docs/create/redis-stack/images/stack1.png deleted file mode 100644 index f8d7e70c008..00000000000 Binary files a/docs/create/redis-stack/images/stack1.png and /dev/null differ diff --git a/docs/create/redis-stack/images/stack2.png b/docs/create/redis-stack/images/stack2.png deleted file mode 100644 index f89f2c60b22..00000000000 Binary files a/docs/create/redis-stack/images/stack2.png and /dev/null differ diff --git a/docs/create/redis-stack/images/stack3.png b/docs/create/redis-stack/images/stack3.png deleted file mode 100644 index 455d2282b09..00000000000 Binary files a/docs/create/redis-stack/images/stack3.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis1.png b/docs/create/redis-stack/images/testredis1.png deleted file mode 100644 index 88bfdf751b5..00000000000 Binary files a/docs/create/redis-stack/images/testredis1.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis2.png b/docs/create/redis-stack/images/testredis2.png deleted file mode 100644 index 712038efad5..00000000000 Binary files a/docs/create/redis-stack/images/testredis2.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis3.png b/docs/create/redis-stack/images/testredis3.png deleted file mode 100644 index c366b3b8ca6..00000000000 Binary files a/docs/create/redis-stack/images/testredis3.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis4.png b/docs/create/redis-stack/images/testredis4.png deleted file mode 100644 index 15cbdccf451..00000000000 Binary files a/docs/create/redis-stack/images/testredis4.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis5.png b/docs/create/redis-stack/images/testredis5.png deleted file mode 100644 index 00e05e4c408..00000000000 Binary files a/docs/create/redis-stack/images/testredis5.png and /dev/null differ diff --git a/docs/create/redis-stack/images/testredis6.png b/docs/create/redis-stack/images/testredis6.png deleted file mode 100644 index dce00ef0632..00000000000 Binary files a/docs/create/redis-stack/images/testredis6.png and /dev/null differ diff --git a/docs/create/redis-stack/images/try-free.png b/docs/create/redis-stack/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files 
a/docs/create/redis-stack/images/try-free.png and /dev/null differ diff --git a/docs/create/redis-stack/images/tryfree.png b/docs/create/redis-stack/images/tryfree.png deleted file mode 100644 index 53eeaba1727..00000000000 Binary files a/docs/create/redis-stack/images/tryfree.png and /dev/null differ diff --git a/docs/create/redis-stack/index-redis-stack.mdx b/docs/create/redis-stack/index-redis-stack.mdx deleted file mode 100644 index 9e50e2c7d20..00000000000 --- a/docs/create/redis-stack/index-redis-stack.mdx +++ /dev/null @@ -1,382 +0,0 @@ ---- -id: index-redis-stack -title: Introduction to Redis Stack -sidebar_label: Redis Stack -slug: /create/redis-stack -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; - -![redis stack](images/redis-stack.png) - -Redis Stack is an extension of Redis that adds modern data models and processing engines to provide a complete developer experience. Redis Stack provides a simple and seamless way to access different data models such as full-text search, document store, graph, time series, and probabilistic data structures enabling developers to build any real-time data application. - -### Redis OSS vs. Redis Stack - -In addition to all of the features of OSS Redis, Redis Stack supports: - -- Queryable JSON documents -- Full-text search -- Time series data (ingestion & querying) -- Graph data models with the Cypher query language -- Probabilistic data structures - -### Redis Stack License - -Redis Stack is made up of several components, licensed as follows: - -- Redis Stack Server combines open source Redis with RediSearch, RedisJSON RedisGraph, RedisTimeSeries and RedisBloom is licensed under the [Redis Source Available License](https://github.com/RediSearch/RediSearch/blob/master/LICENSE) (RSAL). -- RedisInsight is licensed under the [Server Side Public License](https://en.wikipedia.org/wiki/Server_Side_Public_License) (SSPL). - -### Which client libraries support Redis Stack? - -The following core client libraries support Redis Stack: - -- [Jedis](https://github.com/redis/jedis) >= 4.0 -- [node-redis](https://github.com/redis/node-redis) >= 4.0 -- [redis-py](https://github.com/redis/redis-py/) >= 4.0 - -The Redis OM client libraries let you use the document modeling, indexing, and querying capabilities of Redis Stack much like the way you’d use an ORM. The following Redis OM libraries support Redis Stack: - -- [Redis OM .NET](https://github.com/redis/redis-om-dotnet) -- [Redis OM Node](https://github.com/redis/redis-om-node) -- [Redis OM Python](https://github.com/redis/redis-om-python) -- [Redis OM Spring](https://github.com/redis/redis-om-spring) - -### Getting Started - - - - -#### Using Redis Enterprise Cloud - -### Step 1. Create free cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -For the cloud provider, select your preferred cloud. -Select the region of your choice and then click "Let's start free". 
- -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database". -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Listing the database details - -Once fully activated, you will see the database endpoints as shown below: - -![verify database](images/details_database.png) - -### Step 4. Connecting to the database via RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -You can install Redis Stack on your local system to get RedisInsight GUI tool up and running. Ensure that you have the `brew` package installed in your Mac system. - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -You can easily find the Applications folder on your Mac with Finder. Search "RedisInsight-v2" and click the icon to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Try Redis Stack tutorials - -In this tutorial, we will go through an example of a bike shop. We will show the different capabilities of Redis Stack. - -Choose "Redis Stack" from the left sidebar menu. - -![access json workbench](images/stack1.png) - -### Step 9. Store and Manage JSON - -Let's examine the query for creating a single bike. Click "Create a bike" button: - -![access json keys](images/stack2.png) - -It will display a `JSON.SET` command with model, brand, price, type, specs and description details. The `bikes:1` is the name of the Redis key that the JSON will be stored in. - -### Step 10. Accessing part of a stored JSON document - -Click "Get specific fields" to access parts of a stored JSON document as shown in the following image: - -![access json keys](images/stack3.png) - - - - - -Use homebrew to install Redis Stack on macOS, by following these instructions: - -### Step 1. Install Homebrew - -```bash - /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)" -``` - -### Step 2. Install Redis Stack using Homebrew - -First, tap the Redis Stack Homebrew tap and then run `brew install` as shown below: - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -This will install all Redis and Redis Stack binaries. How you run these binaries depends on whether you already have Redis installed on your system. 
- -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -:::info TIP - -If this is the first time you’ve installed Redis on your system, then all Redis Stack binaries will be installed and on your path. On M1 Macs, this assumes that `/opt/homebrew/bin` is in your path. On Intel-based Macs, `/usr/local/bin` should be in the path. - -To check this, run: - -```bash - echo $PATH -``` - -Then, confirm that the output contains `/opt/homebrew/bin` (M1 Mac) or `/usr/local/bin` (Intel Mac). If these directories are not in the output, see the “Existing Redis installation” instructions below. -::: - -### Start Redis Stack Server - -You can now start Redis Stack Server as follows: - -```bash - redis-stack-server -``` - -### Existing Redis installation - -If you have an existing Redis installation on your system, then you’ll need to modify your path to ensure that you’re using the latest Redis Stack binaries. - -Open the file `~/.bashrc` or `~/zshrc` (depending on your shell), and add the following line. - -```bash - export PATH=/usr/local/Caskroom/redis-stack-server//bin:$PATH -``` - -Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 3. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 4. Enter Redis database details - -Add the local Redis database endpoint and port. - -![access redisinsight](images/testredis1.png) - -### Step 5. Redis for time series - -Redis Stack provides you with a native time series data structure. Let's see how a time series might be useful in our bike shop. - -As we have multiple physical shops too, alongside our online shop, it could be helpful to have an overview of the sales volume. We will create one time series per shop tracking the total amount of all sales. In addition, we will mark the time series with the appropriate region label, east or west. This kind of representation will allow us to easily query bike sales performance per certain time periods, per shop, per region or across all shops. - -Click the "Guides" icon (just below the key) in the left sidebar and choose "Redis for time series" for this demonstration. - -![redis for timeseries](images/testredis2.png) - -### Step 6. Create time series per shop - -```bash - TS.CREATE bike_sales_1 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_2 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_3 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_4 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_5 DUPLICATE_POLICY SUM LABELS region west compacted no -``` - -As shown in the following query, we make the shop id (1,2,3,4,5) a part of the time series name. You might also notice the `DUPLICATE_POLICY SUM` argument; this describes what should be done when two events in the same time series share the same timestamp: In this case, it would mean that two sales happened at exactly the same time, so the resulting value should be a sum of the two sales amounts. - -Since the metrics are collected with a millisecond timestamp, we can compact our time series into sales per hour: - -![create time series per shop](images/testredis3.png) - -### Step 7. Run the query - -![execute the query](images/testredis4.png) - -### Step 8. 
Time series Aggregations - -RedisTimeSeries supports downsampling with the following aggregations: avg, sum, min, max, range, count, first and last. If you want to keep all of your raw data points indefinitely, your data set grows linearly over time. However, if your use case allows you to have less fine-grained data further back in time, downsampling can be applied. This allows you to keep fewer historical data points by aggregating raw data for a given time window using a given aggregation function. - -#### Example: - -``` - TS.CREATERULE bike_sales_5 bike_sales_5_per_day AGGREGATION sum 86400000 -``` - -![time series compaction](images/testredis6.png) - - - - - -You can run Redis Stack using a Docker container. There are two types of Docker images available in Docker Hub. - -- The `redis/redis-stack` Docker image contains both Redis Stack server and RedisInsight. This container is recommended for local development because you can use RedisInsight to visualize your data. - -- The `redis/redis-stack-server` provides Redis Stack but excludes RedisInsight. This container is best for production deployment. - -### Getting started - -To start Redis Stack server using the redis-stack image, run the following command in your terminal: - -```bash - docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack:latest -``` - -You can use `redis-cli` to connect to the server, just as you connect to any Redis instance. -If you don’t have redis-cli installed locally, you can run it from the Docker container: - -```bash - docker exec -it redis-stack redis-cli -``` - -:::info TIP -The `docker run` command above also exposes RedisInsight on port 8001. You can use RedisInsight by pointing your browser to http://localhost:8001. -::: - -To persist your Redis data to a local path, specify -v to configure a local volume. This command stores all data in the local directory `local-data`: - -```bash - docker run -v /local-data/:/data redis/redis-stack:latest -``` - -If you want to expose Redis Stack server or RedisInsight on a different port, update the left hand of portion of the `-p` argument. This command exposes Redis Stack server on port 10001 and RedisInsight on port 13333: - -```bash - docker run -p 10001:6379 -p 13333:8001 redis/redis-stack:latest -``` - -By default, the Redis Stack Docker containers use internal configuration files for Redis. To start Redis with a local configuration file, you can use the `-v` volume options: - -```bash - docker run -v `pwd`/local-redis-stack.conf:/redis-stack.conf -p 6379:6379 -p 8001:8001 redis/redis-stack:latest -``` - -To pass in arbitrary configuration changes, you can set any of these environment variables: - -- `REDIS_ARGS`: extra arguments for Redis -- `REDISEARCH_ARGS`: arguments for RediSearch -- `REDISJSON_ARGS`: arguments for RedisJSON -- `REDISGRAPH_ARGS`: arguments for RedisGraph -- `REDISTIMESERIES_ARGS`: arguments for RedisTimeSeries -- `REDISBLOOM_ARGS`: arguments for RedisBloom - -For example, here’s how to use the `REDIS_ARGS` environment variable to pass the `requirepass` directive to Redis: - -``` - docker run -e REDIS_ARGS="--requirepass redis-stack" redis/redis-stack:latest -``` - -Here’s how to set a retention policy for RedisTimeSeries: - -``` - docker run -e REDISTIMESERIES_ARGS="RETENTION_POLICY=20" redis/redis-stack:latest -``` - - - - -### Next Steps - -
-#### Storing and querying JSON documents
-[Follow this tutorial](/howtos/redisjson/getting-started) to learn how to store and query JSON documents using Redis Stack.
-#### Full-text search
-Learn how to [perform full-text search](/howtos/redisearch/) using Redis Stack.
-#### Probabilistic data structures
-Follow this tutorial to learn [how to implement low latency and compact Probabilistic data structures](/howtos/redisbloom) using Redis Stack.
-#### Storing and querying time series data
-[Learn how to store and query time series data](/howtos/redistimeseries/getting-started) using Redis Stack.
diff --git a/docs/create/rediscloud/deployment.png b/docs/create/rediscloud/deployment.png deleted file mode 100644 index ac7b5328fa2..00000000000 Binary files a/docs/create/rediscloud/deployment.png and /dev/null differ diff --git a/docs/create/rediscloud/images/.wd b/docs/create/rediscloud/images/.wd deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/rediscloud/images/.wd and /dev/null differ diff --git a/docs/create/rediscloud/images/README.md b/docs/create/rediscloud/images/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/create/rediscloud/images/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/create/rediscloud/images/Verify_subscription.png b/docs/create/rediscloud/images/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/create/rediscloud/images/Verify_subscription.png and /dev/null differ diff --git a/docs/create/rediscloud/images/activate.png b/docs/create/rediscloud/images/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/create/rediscloud/images/activate.png and /dev/null differ diff --git a/docs/create/rediscloud/images/add_database.png b/docs/create/rediscloud/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/create/rediscloud/images/add_database.png and /dev/null differ diff --git a/docs/create/rediscloud/images/aws.png b/docs/create/rediscloud/images/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 Binary files a/docs/create/rediscloud/images/aws.png and /dev/null differ diff --git a/docs/create/rediscloud/images/choosemodule.png b/docs/create/rediscloud/images/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/create/rediscloud/images/choosemodule.png and /dev/null differ diff --git a/docs/create/rediscloud/images/create_database.png b/docs/create/rediscloud/images/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/create/rediscloud/images/create_database.png and /dev/null differ diff --git a/docs/create/rediscloud/images/create_subscription.png b/docs/create/rediscloud/images/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/create/rediscloud/images/create_subscription.png and /dev/null differ diff --git a/docs/create/rediscloud/images/createdatabase.png b/docs/create/rediscloud/images/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/create/rediscloud/images/createdatabase.png and /dev/null differ diff --git a/docs/create/rediscloud/images/database_creds.png b/docs/create/rediscloud/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/create/rediscloud/images/database_creds.png and /dev/null differ diff --git a/docs/create/rediscloud/images/database_details.png b/docs/create/rediscloud/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/create/rediscloud/images/database_details.png and /dev/null differ diff --git a/docs/create/rediscloud/images/deployment.png b/docs/create/rediscloud/images/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/create/rediscloud/images/deployment.png and /dev/null differ diff --git a/docs/create/rediscloud/images/details_database.png b/docs/create/rediscloud/images/details_database.png deleted file 
mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/rediscloud/images/details_database.png and /dev/null differ diff --git a/docs/create/rediscloud/images/final_subscription.png b/docs/create/rediscloud/images/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/create/rediscloud/images/final_subscription.png and /dev/null differ diff --git a/docs/create/rediscloud/images/launch_database.png b/docs/create/rediscloud/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/create/rediscloud/images/launch_database.png and /dev/null differ diff --git a/docs/create/rediscloud/images/plan.png b/docs/create/rediscloud/images/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/create/rediscloud/images/plan.png and /dev/null differ diff --git a/docs/create/rediscloud/images/recloud1.png b/docs/create/rediscloud/images/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/create/rediscloud/images/recloud1.png and /dev/null differ diff --git a/docs/create/rediscloud/images/recloud2.png b/docs/create/rediscloud/images/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/create/rediscloud/images/recloud2.png and /dev/null differ diff --git a/docs/create/rediscloud/images/recloud3.png b/docs/create/rediscloud/images/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/create/rediscloud/images/recloud3.png and /dev/null differ diff --git a/docs/create/rediscloud/images/region.png b/docs/create/rediscloud/images/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/create/rediscloud/images/region.png and /dev/null differ diff --git a/docs/create/rediscloud/images/select_cloud.png b/docs/create/rediscloud/images/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/create/rediscloud/images/select_cloud.png and /dev/null differ diff --git a/docs/create/rediscloud/images/select_cloud_vendor.png b/docs/create/rediscloud/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/create/rediscloud/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/create/rediscloud/images/select_subscription.png b/docs/create/rediscloud/images/select_subscription.png deleted file mode 100644 index 764333343f7..00000000000 Binary files a/docs/create/rediscloud/images/select_subscription.png and /dev/null differ diff --git a/docs/create/rediscloud/images/stack1.png b/docs/create/rediscloud/images/stack1.png deleted file mode 100644 index f8d7e70c008..00000000000 Binary files a/docs/create/rediscloud/images/stack1.png and /dev/null differ diff --git a/docs/create/rediscloud/images/stack2.png b/docs/create/rediscloud/images/stack2.png deleted file mode 100644 index f89f2c60b22..00000000000 Binary files a/docs/create/rediscloud/images/stack2.png and /dev/null differ diff --git a/docs/create/rediscloud/images/stack3.png b/docs/create/rediscloud/images/stack3.png deleted file mode 100644 index 455d2282b09..00000000000 Binary files a/docs/create/rediscloud/images/stack3.png and /dev/null differ diff --git a/docs/create/rediscloud/images/subscription.png b/docs/create/rediscloud/images/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/create/rediscloud/images/subscription.png and /dev/null differ diff --git 
a/docs/create/rediscloud/images/try-free.png b/docs/create/rediscloud/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/create/rediscloud/images/try-free.png and /dev/null differ diff --git a/docs/create/rediscloud/images/tryfree.png b/docs/create/rediscloud/images/tryfree.png deleted file mode 100644 index 53eeaba1727..00000000000 Binary files a/docs/create/rediscloud/images/tryfree.png and /dev/null differ diff --git a/docs/create/rediscloud/index-recloud.mdx b/docs/create/rediscloud/index-recloud.mdx deleted file mode 100644 index a77d78ea8da..00000000000 --- a/docs/create/rediscloud/index-recloud.mdx +++ /dev/null @@ -1,116 +0,0 @@ ---- -id: index-rediscloud -title: Create database using Redis Enterprise Cloud -sidebar_label: Redis Enterprise Cloud -slug: /create/rediscloud -authors: [ajeet] ---- - -Redis Enterprise Cloud is a fully managed cloud service by Redis. Built for modern distributed applications, Redis Enterprise Cloud enables you to run any query, simple or complex, at sub-millisecond performance at virtually infinite scale without worrying about operational complexity or service availability. With modern probabilistic data structures and extensible data models, including Search, JSON, Graph, and Time Series, you can rely on Redis as your data-platform for all your real-time needs. - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database" option shown in the image. -::: - -![create database](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Install RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -You can install Redis Stack on your local system to get RedisInsight GUI tool up and running. Ensure that you have `brew` package installed in your Mac system. - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! 
- ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Try Redis Stack tutorials - -In this tutorial, we will go through an example of a bike shop. We will show the different capabilities of Redis Stack. - -Choose "Redis Stack" in the left sidebar. - -![access json workbench](images/stack1.png) - -### Step 9. Store and Manage JSON - -Let's examine the query for creating a single bike. Click "Create a bike" button: - -![access json keys](images/stack2.png) - -It will display a `JSON.SET` command with model, brand, price, type, specs and description details. `bikes:1` is the name of the Redis key where the JSON is stored. - -### Step 10. Accessing parts of a stored JSON document - -Click "Get specific fields" to access part of a stored JSON document as shown in the following diagram: - -![access json keys](images/stack3.png) - -### Next Steps - -- [Connecting to the database using RedisInsight](/explore/redisinsight/) -- [How to list & search Movies database using Redisearch](/howtos/moviesdatabase/getting-started/) - - - - - - Redis Launchpad - - diff --git a/docs/create/rediscloud/launchpad.png b/docs/create/rediscloud/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/create/rediscloud/launchpad.png and /dev/null differ diff --git a/docs/create/rediscloud/tryfree.png b/docs/create/rediscloud/tryfree.png deleted file mode 100644 index 53eeaba1727..00000000000 Binary files a/docs/create/rediscloud/tryfree.png and /dev/null differ diff --git a/docs/create/vercel/images/details_database.png b/docs/create/vercel/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/create/vercel/images/details_database.png and /dev/null differ diff --git a/docs/create/vercel/images/rediscloud_endpoint.png b/docs/create/vercel/images/rediscloud_endpoint.png deleted file mode 100644 index c62da0e3053..00000000000 Binary files a/docs/create/vercel/images/rediscloud_endpoint.png and /dev/null differ diff --git a/docs/create/vercel/images/vercel1.png b/docs/create/vercel/images/vercel1.png deleted file mode 100644 index 83e3977f9b5..00000000000 Binary files a/docs/create/vercel/images/vercel1.png and /dev/null differ diff --git a/docs/create/vercel/images/vercel13.png b/docs/create/vercel/images/vercel13.png deleted file mode 100644 index 4eca4a87392..00000000000 Binary files a/docs/create/vercel/images/vercel13.png and /dev/null differ diff --git a/docs/create/vercel/images/vercel2.png b/docs/create/vercel/images/vercel2.png deleted file mode 100644 index 763483f9333..00000000000 Binary files a/docs/create/vercel/images/vercel2.png and /dev/null differ diff --git a/docs/create/vercel/images/vercel3.png b/docs/create/vercel/images/vercel3.png deleted file mode 100644 index 9a9f7c1636c..00000000000 Binary files a/docs/create/vercel/images/vercel3.png and /dev/null differ diff --git a/docs/create/vercel/images/vercel_redis.png b/docs/create/vercel/images/vercel_redis.png deleted file mode 100644 index b9e7f3cc391..00000000000 
Binary files a/docs/create/vercel/images/vercel_redis.png and /dev/null differ diff --git a/docs/create/vercel/index-vercel.mdx b/docs/create/vercel/index-vercel.mdx deleted file mode 100644 index d4812a84893..00000000000 --- a/docs/create/vercel/index-vercel.mdx +++ /dev/null @@ -1,196 +0,0 @@ ---- -id: index-vercel -title: Getting Started with Vercel and Redis -sidebar_label: Getting Started with Vercel and Redis -slug: /create/vercel -authors: [ajeet] ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -[Vercel ](https://vercel.com/)is a popular static web hosting serverless platform for frontend developers. The platform allows developers to host websites and web services, deploy instantly, and scale automatically with minimal configuration. - -Vercel is the preferred platform to host [Next.js-based web applications](https://vercel.com/docs/concepts/next.js/overview). It allows you to deploy serverless functions that take an HTTP request and provide a response. You can use [serverless functions](https://vercel.com/docs/concepts/functions/serverless-functions) to handle user authentication, form submission, database queries, custom Slack commands, and more. - -Vercel integrates well with popular tools, such as [GitHub](https://vercel.com/docs/concepts/git/vercel-for-github), [GitLab](https://vercel.com/docs/concepts/git/vercel-for-gitlab), [Lighthouse](https://vercel.com/integrations/lighthouse), [Doppler](https://vercel.com/integrations/doppler), and [Divjoy](https://divjoy.com/). NodeJS, Go, Python, and Ruby are the leading [official runtimes supported by Vercel](https://vercel.com/docs/runtimes). - -![vercel](images/vercel_redis.png) - -### Features of Vercel - -- Vercel is focused on the build and deployment aspects of the [JAMstack approach](https://jamstack.org/what-is-jamstack/). -- [The Vercel API](https://vercel.com/docs/rest-api) provides full control over the Vercel platform, exposed as simple HTTP service endpoints over SSL. -- All endpoints live under the URL [https://api.vercel.com](https://api.vercel.com) and follow the REST architecture. -- Vercel provides custom domains to deploy your code on the live server (vercel.app as the suffix in the domain). -- Vercel provides you with an option to choose any framework of the repository you wish to deploy either Node.js, React, Gatsby, or [Next.js](https://nextjs.org/) (a full-stack React serverless framework that integrates seamlessly with Vercel). -- Vercel integrates with a GitHub repository for automatic deployments upon commits. - -In this tutorial, you will learn how to deploy a Node.js based Redis chat application using Vercel in just 5 minutes. - -### Table of Contents - -- Step 1. Set up a free Redis Enterprise Cloud account -- Step 2. Install Vercel CLI -- Step 3. Log in to your Vercel Account -- Step 4. Clone your GitHub repository -- Step 5. Create a vercel.json file -- Step 6. Set up environment variables -- Step 7. Deploy the Node.js app -- Step 8. Access your app - -### Step 1. Set up a free Redis Enterprise Cloud account - -Visit [developer.redis.com/create/rediscloud/](https://developer.redis.com/create/rediscloud/) and create [a free Redis Enterprise Cloud account](https://redis.com/try-free/). Once you complete this tutorial, you will be provided with the database endpoint URL and password. Save it for future reference. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! 
- -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -![alt_text](images/details_database.png) - -### Step 2. Install Vercel CLI - -``` -npm i -g vercel - -vercel -v -Vercel CLI 23.1.2 -23.1.2 - -``` - -### Step 3. Log in to your Vercel account - -The `vercel login` command allows you to log in to your Vercel account through Vercel CLI. - -``` -vercel login -Vercel CLI 23.1.2 -> Log in to Vercel github -> Success! GitHub authentication complete for xx@xx.com -Congratulations! You are now logged in. In order to deploy something, run `vercel`. -💡 Connect your Git Repositories to deploy every branch push automatically (https://vercel.link/git). -``` - -Once Vercel gets connected to your GitHub account, it displays your public repositories. Let us clone [https://github.com/redis-developer/basic-redis-chat-app-demo-nodejs](https://github.com/redis-developer/basic-redis-chat-app-demo-nodejs) to the local repository. - -### Step 4. Clone the GitHub repository - -The complete source code of the Redis Chat application is hosted [here](https://github.com/redis-developer/basic-redis-chat-app-demo-nodejs). React and Socket.IO are used for building the frontend while Node.js and Redis power the backend. Redis is used mainly as a database to keep the user/messages data and for sending messages between connected servers. - -``` -git clone https://github.com/redis-developer/basic-redis-chat-app-demo-nodejs -``` - -### Step 5. Create a vercel.json for Node.js app - -If you run the `vercel init` command, it will list various frameworks but you won’t be able to find Node.js, hence you will need to create a `vercel.json` file as shown below: - -``` -{ - "version": 2, - "builds": [ - { - "src": "./index.vercel.js", - "use": "@vercel/node" - } - ], - "routes": [ - { - "src": "/(.*)", - "dest": "/" - } - ] -} - -``` - -### Step 6. Set up environment variables - -The `vercel env` command is used to manage [environment variables](https://vercel.com/docs/concepts/projects/environment-variables) under a Project, providing functionality to list, add, remove, and pull. - -Let us first set up environment variables. - -``` - vercel env add -Vercel CLI 23.1.2 -? What's the name of the variable? REDIS_ENDPOINT_URI -? What's the value of REDIS_ENDPOINT_URI? redis-XXXX.c110-qa.us-east-1-1-1.ec2.qa-cloud.redislabs.com:XXX - -``` - -Listing the environment variables: - -``` -vercel env ls -Vercel CLI 23.1.2 -> Environment Variables found in Project basic-redis-chat-app-demo-nodejs [684ms] - - name value environments created - REDIS_PASSWORD Encrypted Production, Preview, Development 2d ago - REDIS_ENDPOINT_URL Encrypted Production, Preview, Development 2d ago - -``` - -### Step 7. Deploy the Node.js app - -When you run a vercel command in a directory for the first time, [Vercel CLI](https://vercel.com/cli) needs to know which [scope](https://vercel.com/docs/cli#options/global-options/scope) and [Project](https://vercel.com/docs/concepts/projects/overview) you want to deploy your directory to. You can choose to either link an existing project or create a new one. - -``` -vercel -Vercel CLI 23.1.2 -? Set up and deploy "~/projects/10feb/basic-redis-chat-app-demo-nodejs"? [Y/n] y -? Which scope do you want to deploy to? redis-developer -? Found project "redis-developer/basic-redis-chat-app-demo-nodejs". Link to it? 
[Y/n] y -🔗 Linked to redis-developer/basic-redis-chat-app-demo-nodejs (created .vercel) -🔍 Inspect: https://vercel.com/redis-developer/basic-redis-chat-app-demo-nodejs/5KZydRNsXwnjRxDYa65x4Ak8KwZT [4s] -✅ Preview: https://basic-redis-chat-app-demo-nodejs-redis-developer.vercel.app [copied to clipboard] [27s] -📝 To deploy to production (basic-redis-chat-app-demo-nodejs.vercel.app), run `vercel --prod` -❗️ Due to `builds` existing in your configuration file, the Build and Development Settings defined in your Project Settings will not apply. Learn More: https://vercel.link/unused-build-settings - -``` - -Once the deployment process has completed, a new `.vercel` directory will be added to your directory. The `.vercel` directory contains both the organization and project ID of your project. - -The "project.json" file contains: - -- The ID of the Vercel project that you linked ("projectId") - -- The ID of the user or team your Vercel project is owned by ("orgId") - -:::note -Vercel CLI automatically detects the framework you are using and offers default project settings accordingly. -::: - -### Step 8. Accessing the app - -Run the following command to deploy the Redis chat app to the Prod environment. - -``` -vercel --prod -Vercel CLI 23.1.2 -🔍 Inspect: https://vercel.com/redis-developer/basic-redis-chat-app-demo-nodejs/GoRdy7LKxqhBfJNW8hSvvFLQC6EN [2s] -✅ Production: https://basic-redis-chat-app-demo-nodejs.vercel.app [copied to clipboard] [14s] - - -``` - -By now, you will be able to login to Chat app and start chatting. - -![alt_text](images/vercel13.png) - -The chat server works as a basic REST API which involves keeping the session and handling the user state in the chat rooms (besides the WebSocket/real-time part). When a WebSocket/real-time server is instantiated, which listens for the next events: - -![alt_text](images/vercel3.png) - -If you want to know how the chat app works internally, [refer to this detailed blog tutorial](/howtos/chatapp#how-it-works) - -### References - -- [Getting Started with Vercel](https://vercel.com/docs/get-started) -- [Serverless Functions under Vercel](https://vercel.com/docs/concepts/functions/serverless-functions) -- [A Look at Vercel Supported Languages](https://vercel.com/docs/concepts/functions/supported-languages) -- [Next.js on Vercel](https://vercel.com/docs/concepts/next.js/overview) diff --git a/docs/create/windows/index-windows.mdx b/docs/create/windows/index-windows.mdx index fe5dc7085bc..32e7a910adc 100644 --- a/docs/create/windows/index-windows.mdx +++ b/docs/create/windows/index-windows.mdx @@ -6,10 +6,9 @@ slug: /create/windows authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + You can run Redis on Windows 10 using Windows Subsystem for Linux(a.k.a WSL2). WSL2 is a compatibility layer for running Linux binary executables natively on Windows 10 and Windows Server 2019. WSL2 lets developers run a GNU/Linux environment (that includes command-line tools, utilities, and applications) directly on Windows. @@ -31,7 +30,7 @@ Reboot Windows after making the change — note that you only need to do this on start ms-windows-store: ``` -Then search for Ubuntu, or your preferred distribution of Linux, and download the latest version. +Then search for Ubuntu, or your preferred distribution of Linux, and download the latest version. 
### Step 3: Install Redis server @@ -44,7 +43,11 @@ Installing Redis is simple and straightforward. The following example works with sudo apt-get install redis-server ``` -Please note that the `sudo` command might or mightn't be required based on the user configuration of your system. +:::note + +The `sudo` command may or may not be required based on the user configuration of your system. + +::: ### Step 4: Restart the Redis server @@ -65,7 +68,11 @@ Use the `redis-cli` command to test connectivity to the Redis database. "Jane" ``` -Please note: By default, Redis has 0-15 indexes for databases, you can change that number databases NUMBER in redis.conf. +:::note + +By default, Redis has 16 databases (indexes 0-15); you can change that by setting the `databases` directive in redis.conf. + +::: ### Step 6: Stop the Redis Server @@ -77,18 +84,6 @@
- -#### How to run Redis GUI tool on Windows - -![Windows logo](/img/logos/windows.png) - -[Follow this tutorial](/explore/redisinsightv2/windows) in order to run RedisInsight on Windows - -
@@ -96,15 +91,12 @@ Please note: By default, Redis has 0-15 indexes for databases, you can change th ![redis OM logo](/img/logos/redisom.gif) -Learn how to [connect to a Redis database with Redis OM Dotnet](/develop/dotnet/redis-om-dotnet/getting-started) -
### References -- [Redis Enterprise For Windows](https://redis.com/lp/redis-windows/) - [List of .Net Clients](https://redis.io/clients#c) - [Redis client for .NET languages](https://github.com/StackExchange/StackExchange.Redis) diff --git a/docs/develop/C/index-c.mdx b/docs/develop/C/index-c.mdx index 4f33cf3a814..08f6ed7f706 100644 --- a/docs/develop/C/index-c.mdx +++ b/docs/develop/C/index-c.mdx @@ -8,8 +8,9 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + Find tutorials, examples and technical articles that will help you to develop with Redis and C. diff --git a/docs/develop/deno/index-deno.mdx b/docs/develop/deno/index-deno.mdx index e0f0423ee99..43ca1c01787 100644 --- a/docs/develop/deno/index-deno.mdx +++ b/docs/develop/deno/index-deno.mdx @@ -6,6 +6,10 @@ slug: /develop/deno/ authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + [With over 80,000 stars and 670+ contributors](https://github.com/denoland/deno), Deno is a popular modern runtime for JavaScript and TypeScript. It is built on [V8](https://v8.dev/), an open-source JavaScript engine developed by the Chromium Project for Google Chrome and Chromium web browsers. ![deno](deno.png) @@ -35,12 +39,12 @@ https://deno.land/std@0.126.0/examples In order to use Redis with Deno you will need a Deno Redis client. In the following sections, we will demonstrate the use of [an experimental implementation of a Redis client for Deno](https://deno.land/x/redis@v0.25.3). -### Step 1. Set up a free Redis Enterprise Cloud account +### Step 1. Set up a free Redis Cloud account -Visit [developer.redis.com/create/rediscloud/](/create/rediscloud/) and create [a free Redis Enterprise Cloud account](https://redis.com/try-free/). Once you complete this tutorial, you will be provided with the database endpoint URL and password. Save it for future reference. +Visit [redis.com/try-free](https://redis.com/try-free) and create [a free Redis Cloud account](https://redis.com/try-free/). Once you complete this tutorial, you will be provided with the database endpoint URL and password. Save it for future reference. -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! +:::tip +For a limited time, use **TIGER200** to get **$200** credits on Redis Cloud and try all the advanced capabilities! :tada: [Click here to sign up](https://redis.com/try-free) @@ -111,13 +115,11 @@ OK target="_blank" rel="noopener" className="link"> - Redis Launchpad -
diff --git a/docs/develop/dotnet/aspnetcore/rate-limiting/fixed-window/fixed-window.mdx b/docs/develop/dotnet/aspnetcore/rate-limiting/fixed-window/fixed-window.mdx index b5a8237c137..6a8b60ce60a 100644 --- a/docs/develop/dotnet/aspnetcore/rate-limiting/fixed-window/fixed-window.mdx +++ b/docs/develop/dotnet/aspnetcore/rate-limiting/fixed-window/fixed-window.mdx @@ -6,6 +6,10 @@ slug: /develop/dotnet/aspnetcore/rate-limiting/fixed-window authors: [steve] --- +import Authors from '@theme/Authors'; + + + In this tutorial, we will build an app that implements basic fixed-window rate limiting using Redis & ASP.NET Core. ## Prerequisites @@ -232,13 +236,11 @@ You will see some of your requests return a `200`, and at least one request retu target="_blank" rel="noopener" className="link"> - Redis Launchpad -
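As context for reviewers of the rate-limiting page above: the fixed-window pattern it describes boils down to an `INCR` on a per-client, per-window key plus an `EXPIRE` when that key is first created. The sketch below is only an illustration of that idea with StackExchange.Redis; the key format, the one-minute window, and the limit are assumptions for this example, not the tutorial's actual code.

```csharp
using System;
using System.Threading.Tasks;
using StackExchange.Redis;

// Minimal fixed-window limiter sketch: one counter per client per minute.
// Assumes a local Redis on the default port.
var muxer = await ConnectionMultiplexer.ConnectAsync("localhost:6379");
var db = muxer.GetDatabase();

async Task<bool> IsAllowedAsync(string clientId, int limit)
{
    // e.g. ratelimit:alice:2130, so a new key (and a fresh window) every minute
    var key = $"ratelimit:{clientId}:{DateTime.UtcNow:HHmm}";
    var count = await db.StringIncrementAsync(key);
    if (count == 1)
    {
        // First request in this window: let the counter expire along with the window.
        await db.KeyExpireAsync(key, TimeSpan.FromMinutes(1));
    }
    return count <= limit;
}

Console.WriteLine(await IsAllowedAsync("alice", 10) ? "allowed" : "throttled");
```

A production implementation would normally make the increment and the expiry atomic (for example with a Lua script) so that a failure between the two calls cannot leave a counter without a TTL.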
diff --git a/docs/develop/dotnet/index-dotnet.mdx b/docs/develop/dotnet/index-dotnet.mdx index 8f3d692939b..1e12a2e3b3a 100644 --- a/docs/develop/dotnet/index-dotnet.mdx +++ b/docs/develop/dotnet/index-dotnet.mdx @@ -2,14 +2,15 @@ id: index-dotnet title: .NET and Redis sidebar_label: Getting Started -slug: /develop/dotnet/ +slug: /develop/dotnet authors: [steve] --- import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ## Getting Started @@ -25,7 +26,7 @@ There are a few ways to Install the Package: {label: '.NET CLI', value: 'cli'}, {label: 'PM Console', value: 'pmConsole'}, {label: 'Package Reference', value: 'csproj'}, - {label: 'NuGet GUI', value: 'gui'} + {label: 'NuGet GUI', value: 'gui'} ]}> diff --git a/docs/develop/dotnet/redis-om-dotnet/add-and-retrieve-objects/add-and-retrieve-objects.md b/docs/develop/dotnet/redis-om-dotnet/add-and-retrieve-objects/add-and-retrieve-objects.md index 2e012bd2c7c..1e93e44e01c 100644 --- a/docs/develop/dotnet/redis-om-dotnet/add-and-retrieve-objects/add-and-retrieve-objects.md +++ b/docs/develop/dotnet/redis-om-dotnet/add-and-retrieve-objects/add-and-retrieve-objects.md @@ -6,7 +6,7 @@ slug: /develop/dotnet/redis-om-dotnet/add-and-retrieve-objects authors: [steve] --- -The Redis OM library supports declarative storage and retrieval of objects from Redis. Without the RediSearch and RedisJson modules, this is limited to using hashes, and id lookups of objects in Redis. You will still use the `Document` Attribute to decorate a class you'd like to store in Redis. From there, all you need to do is either call `Insert` or `InsertAsync` on the `RedisCollection` or `Set` or `SetAsync` on the RedisConnection, passing in the object you want to set in Redis. You can then retrieve those objects with `Get` or `GetAsync` with the `RedisConnection` or with `FindById` or `FindByIdAsync` in the RedisCollection. +The Redis OM library supports declarative storage and retrieval of objects from Redis. Without [Redis Stack](https://redis.io/docs/stack), this is limited to using hashes, and id lookups of objects in Redis. You will still use the `Document` Attribute to decorate a class you'd like to store in Redis. From there, all you need to do is either call `Insert` or `InsertAsync` on the `RedisCollection` or `Set` or `SetAsync` on the RedisConnection, passing in the object you want to set in Redis. You can then retrieve those objects with `Get` or `GetAsync` with the `RedisConnection` or with `FindById` or `FindByIdAsync` in the RedisCollection. ```csharp public class Program diff --git a/docs/develop/dotnet/redis-om-dotnet/creating-an-index/creating-an-index.md b/docs/develop/dotnet/redis-om-dotnet/creating-an-index/creating-an-index.md index b7fc206e555..30080de3489 100644 --- a/docs/develop/dotnet/redis-om-dotnet/creating-an-index/creating-an-index.md +++ b/docs/develop/dotnet/redis-om-dotnet/creating-an-index/creating-an-index.md @@ -48,15 +48,15 @@ public partial class Person As shown above, you can declare a class as being indexed with the `Document` Attribute. 
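To make the options in the table that follows concrete, here is a small, purely illustrative declaration; the index name, prefix, and filter values are invented for this sketch and are not taken from the tutorial.

```csharp
using Redis.OM.Modeling;

// Illustrative only: store the object as JSON, with an explicit index name,
// key prefix, and a filter so that only documents matching @Age>=18 are indexed.
[Document(StorageType = StorageType.Json,
          IndexName = "person-idx",
          Prefixes = new[] { "Person:" },
          Filter = "@Age>=18")]
public partial class Person
{
    [RedisIdField] public string Id { get; set; }
    [Indexed] public string Name { get; set; }
    [Indexed] public int Age { get; set; }
}
```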
In the `Document` attribute, you can set a few fields to help build the index: -| Property Name | Description | Default | Optional | -| -------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------- | -------- | -| StorageType | Defines the underlying data structure used to store the object in Redis, options are `HASH` and `JSON`, Note JSON is only useable with the [RedisJson module](https://oss.redis.com/redisjson/) | HASH | true | -| IndexName | The name of the index | `$"{SimpleClassName.ToLower()}-idx}` | true | -| Prefixes | The key prefixes for redis to build an index off of | `new string[]{$"{FullyQualifiedClassName}:"}` | true | -| Language | Language to use for full-text search indexing | `null` | true | -| LanguageField | The name of the field in which the document stores its Language | null | true | -| Filter | The filter to use to determine whether a particular item is indexed, e.g. `@Age>=18` | null | true | -| IdGenerationStrategy | The strategy used to generate Ids for documents, if left blank it will use a [ULID](https://github.com/ulid/spec) generation strategy | UlidGenerationStrategy | true | +| Property Name | Description | Default | Optional | +| -------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------- | -------- | +| StorageType | Defines the underlying data structure used to store the object in Redis, options are `HASH` and `JSON`, Note JSON is only useable with [Redis Stack](https://redis.io/docs/stack) | HASH | true | +| IndexName | The name of the index | `$"{SimpleClassName.ToLower()}-idx}` | true | +| Prefixes | The key prefixes for redis to build an index off of | `new string[]{$"{FullyQualifiedClassName}:"}` | true | +| Language | Language to use for full-text search indexing | `null` | true | +| LanguageField | The name of the field in which the document stores its Language | null | true | +| Filter | The filter to use to determine whether a particular item is indexed, e.g. `@Age>=18` | null | true | +| IdGenerationStrategy | The strategy used to generate Ids for documents, if left blank it will use a [ULID](https://github.com/ulid/spec) generation strategy | UlidGenerationStrategy | true | ## Field Level Declarations diff --git a/docs/develop/dotnet/redis-om-dotnet/getting-started/getting-started.md b/docs/develop/dotnet/redis-om-dotnet/getting-started/getting-started.md deleted file mode 100644 index f3a874a4c04..00000000000 --- a/docs/develop/dotnet/redis-om-dotnet/getting-started/getting-started.md +++ /dev/null @@ -1,78 +0,0 @@ ---- -id: getting-started -title: Getting Started with Redis OM .NET -sidebar_label: Getting Started -slug: /develop/dotnet/redis-om-dotnet/getting-started -authors: [steve] ---- - -Redis OM is designed to make using Redis easier for .NET developers, so naturally the first question one might ask is where to start? - -## Prerequisites - -- A .NET Standard 2.0 compatible version of .NET. This means that all .NET Framework versions 4.6.1+, .NET Core 2.0+ and .NET 5+ will work with Redis OM .NET. -- An IDE for writing .NET, Visual Studio, Rider, VS Code will all work. 
- -## Installation - -To install Redis OM .NET all you need to do is add the [`Redis.OM`](https://www.nuget.org/packages/Redis.OM/) NuGet package to your project. This can be done by running `dotnet add package Redis.OM` - -## Connecting to Redis. - -The next major step for getting started with Redis OM .NET is to connect to Redis. - -The Redis OM library is an abstraction above a lower level (closer to Redis) library—[StackExchange.Redis](https://github.com/StackExchange/StackExchange.Redis)—which it uses to manage connections to Redis. That is however, an implementation detail which should not be a concern to the user. `RedisConnectionProvider` class contains the connection logic, and provides for connections to Redis. The RedisConnectionProvider should only be initialized once in your app's lifetime. - -## Initializing RedisConnectionProvider - -RedisConnectionProvider takes a [Redis URI](https://github.com/redis-developer/Redis-Developer-URI-Spec/blob/main/spec.md) and uses that to initialize a connection to Redis. - -Consequentially, all that needs to be done to initialize the client is calling the constructor of `RedisConnectionProvider` with a Redis uri. Alternatively, you can connect with a ConnectionConfiguration object, or if you have a ConnectionMultiplexer in your DI container already, you can construct it with your ConnectionMultiplexer. - -#### Connecting to a Standalone Instance of Redis No Auth - -```csharp -var provider = new RedisConnectionProvider("redis://hostname:port"); -``` - -#### Connecting to Standalone Instance of Redis Just Password - -```csharp -var provider = new RedisConnectionProvider("redis://:password@hostname:port"); -``` - -#### Connecting to Standalone Instance of Redis or Redis Enterprise Username and Password - -```csharp -var provider = new RedisConnectionProvider("redis://username:password@hostname:port"); -``` - -#### Connecting to Standalone Instance of Redis Particular Database - -```csharp -var provider = new RedisConnectionProvider("redis://username:password@hostname:port/4"); -``` - -#### Connecting to Redis Sentinel - -When connecting to Redis Sentinel, you will need to provide the sentinel - -```csharp -var provider = new RedisConnectionProvider("redis://username:password@sentinel-hostname:port?endpoint=another-sentinel-host:port&endpoint=yet-another-sentinel-hot:port&sentinel_primary_name=redisprimary"); -``` - -#### Connecting to Redis Cluster - -Connecting to a Redis Cluster is similar to connecting to a standalone server, it is advisable however to include at least one other alternative endpoint in the URI as a query parameter in case of a failover event. - -```csharp -var provider = new RedisConnectionProvider("redis://username:password@hostname:port?endpoint=another-primary-host:port"); -``` - -## Getting the RedisConnection, RedisCollection, and RedisAggregationSet - -There are three primary drivers of Redis in this Library, which can all be accessed from the `provider` object after it's been initialize. - -- The RedisConnection - this provides a command level interface to Redis, a limited set of commands are directly implemented, but any command can be executed via the `Execute` and `ExecuteAsync` commands. To get a handle to the RedisConnection just use `provider.Connection` -- `RedisCollection` - This is a generic collection used to access Redis. It provides a fluent interface for retrieving data stored in Redis. 
To create a `RedisCollection` use `provider.RedisCollection()` -- `RedisAggregationSet` - This is another generic collection used to aggregate data in Redis. It provides a fluent interface for performing mapping & reduction operations on Redis. To create a `RedisAggregationSet`use `provider.AggregationSet()` diff --git a/docs/develop/dotnet/redis-om-dotnet/searching/simple-text-queries/simple-text-queries.md b/docs/develop/dotnet/redis-om-dotnet/searching/simple-text-queries/simple-text-queries.md index 8fd10a8667a..c5d88320b8f 100644 --- a/docs/develop/dotnet/redis-om-dotnet/searching/simple-text-queries/simple-text-queries.md +++ b/docs/develop/dotnet/redis-om-dotnet/searching/simple-text-queries/simple-text-queries.md @@ -6,7 +6,7 @@ slug: /develop/dotnet/redis-om-dotnet/simple-text-queries authors: [steve] --- -The `RedisCollection` provides a fluent interface for querying objects stored in redis. This means that if you store an object in Redis with the Redis OM library, and you have [RediSearch](https://oss.redis.com/redisearch/) enabled, you can query objects stored in Redis with ease using the LINQ syntax you're used to. +The `RedisCollection` provides a fluent interface for querying objects stored in redis. This means that if you store an object in Redis with the Redis OM library, and you have [Redis Stack](https://redis.io/docs/stack) running, you can query objects stored in Redis with ease using the LINQ syntax you're used to. ## Define the Model diff --git a/docs/develop/dotnet/streams/blocking-reads/blocking-reads.md b/docs/develop/dotnet/streams/blocking-reads/blocking-reads.md deleted file mode 100644 index c08f91abcd4..00000000000 --- a/docs/develop/dotnet/streams/blocking-reads/blocking-reads.md +++ /dev/null @@ -1,23 +0,0 @@ ---- -id: blocking-reads -title: Blocking Stream Reads -sidebar_label: Blocking Stream Reads -slug: /develop/dotnet/streams/blocking-reads -authors: [steve] ---- - -[Redis Streams](https://redis.io/topics/streams-intro) can be used to build a message bus for our applications. The ability of multiple readers to consume messages from a Redis Stream in a consumer group makes Redis Streams ideal for a variety of use cases where you want the assurance of message delivery and where you have high volumes of data you want to distribute across multiple consumers. - -One of the great things about Redis Streams is that you can reduce the number of requests you need to make to Redis by having consumers use blocking requests and wait for new messages to come into the stream. In terms of commands, this would look something like this: - -```bash -XREADGROUP GROUP average avg1 COUNT 1 BLOCK 1000 STREAMS numbers > -``` - -Or, for a simple XREAD, you can wait for the next message to come in: - -```bash -127.0.0.1:6379> XREAD BLOCK 1000 STREAMS numbers $ -``` - -The main .NET Redis client [StackExchange.Redis](https://github.com/StackExchange/StackExchange.Redis) does not support this particular feature. The reason for this lack of support is architectural, the StackExchange.Redis client centers all commands to Redis around a single connection. Because of this, blocking that connection for a single client will block all other requests to Redis. If we want to do blocking stream reads with Redis in .NET we'll need to use different clients to do so. 
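If switching client libraries is not an option, a pragmatic workaround with StackExchange.Redis itself is to poll the stream with a short delay instead of blocking. The loop below is a rough sketch of that approach; the `numbers` stream name and the 500 ms delay are assumptions for this example, not part of the tutorials that follow.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;
using StackExchange.Redis;

// Poll for anything newer than the last ID we have seen, instead of a blocking XREAD.
var muxer = await ConnectionMultiplexer.ConnectAsync("localhost:6379");
var db = muxer.GetDatabase();
var lastId = "0-0"; // start from the beginning of the stream

while (true)
{
    var entries = await db.StreamReadAsync("numbers", lastId, count: 10);
    foreach (var entry in entries)
    {
        lastId = entry.Id; // remember our position in the stream
        var fields = string.Join(", ", entry.Values.Select(v => $"{v.Name}={v.Value}"));
        Console.WriteLine($"{entry.Id}: {fields}");
    }
    await Task.Delay(500); // no blocking support, so sleep briefly between polls
}
```

Polling trades a little latency and a few extra `XREAD` round trips for staying on the multiplexed connection; the client-specific tutorials below avoid that trade-off by using true blocking reads.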
Contained in this section are tutorials for doing so with both [ServiceStack.Redis](blocking-reads/service-stack) and [CsRedis](blocking-reads/cs-redis) diff --git a/docs/develop/dotnet/streams/blocking-reads/cs-redis/cs-redis.md b/docs/develop/dotnet/streams/blocking-reads/cs-redis/cs-redis.md deleted file mode 100644 index 6034004cbf6..00000000000 --- a/docs/develop/dotnet/streams/blocking-reads/cs-redis/cs-redis.md +++ /dev/null @@ -1,166 +0,0 @@ ---- -id: cs-redis -title: Blocking Stream reads with CSRedis -sidebar_label: Blocking Stream Reads with CSRedis -slug: /develop/dotnet/streams/blocking-reads/cs-redis -authors: [steve] ---- - -[CSRedis](https://github.com/2881099/csredis) is an MIT Licensed Open source project which provides a straightforward interface for executing commands. CSRedis can be used effectively for performing blocking stream reads with the one major downside that it does not support any async API for them. - -## Start Redis - -Before we begin, we'll start up Redis. If you are developing locally, which we'll assume you are for the duration of this tutorial, you can start Redis with a simple docker command. - -```bash -docker run -p 6379:6379 redis -``` - -## Create the app - -We will build a simple console application for streaming telemetry using the library. To do so, use the `dotnet new` command: - -```bash -dotnet new console -n StreamsWithCSRedis -``` - -## Add the package to your app - -Run the `cd StreamsWithCSRedis` command to change directories into the application's directory and run the following to add the CSRedis package - -```bash -dotnet add package CSRedisCore -``` - -## Create group - -When we start up our app, the first thing we'll do is create our `avg` group. To make this group, open up `Program.cs` and add to it the following: - -```csharp -var cancellationTokenSource = new CancellationTokenSource(); -var token = cancellationTokenSource.Token; - -var client = new CSRedisClient("localhost"); -if (!client.Exists("stream") || client.XInfoStream("stream").groups == 0) -{ - client.XGroupCreate("stream", "avg", "$", MkStream: true); -} -``` - -This code will create a cancellation token for the threads we'll spin up to do the writes/reads to the stream, create a client, check if our `avg` group already exists, and finally create the `avg` group if it doesn't. - -## Write to the stream - -Next, we'll write out to the stream. We'll call the stream `stream`, and send a `temp` and `time` field along with the stream. We'll do this every 2 seconds. We'll put this on its own thread, since this operation isn't actually 'blocking' in the Redis sense, it may be alright to spin it out on its task, but as the other two operations in here are blocking, it's better to spin it off on its own thread as well. Add the following to your `Program.cs` file: - -```csharp -var writeThread = new Thread(() => -{ - var writeClient = new CSRedisClient("localhost"); - var random = new Random(); - while (!token.IsCancellationRequested) - { - writeClient.XAdd("stream", new (string, string)[]{new ("temp", random.Next(50,65).ToString()), new ("time", DateTimeOffset.Now.ToUnixTimeSeconds().ToString())}); - Thread.Sleep(2000); - } -}); -``` - -## Parsing read results - -The next issue we'll need to dispose of is parsing the read results from the `XREAD` and `XREADGROUP` commands. CSRedis handles return types generally as tuples in a reply, so we'll need a way to parse the result into something more useable. In this case, we'll parse the results into a dictionary. 
For the sake of brevity, we will keep everything in this project in `Program.cs` on the top-level method, so we'll declare a `Func` to handle the parsing. This function will pull the first message from the first stream and arrange the values returned into a dictionary. A couple of things to consider here if you wanted to expand this further is that you could reply with a dictionary of dictionaries if you were pulling back multiple messages from multiple streams. This complexity is intentionally left out. - -```csharp -Func<(string key, (string id, string[] items)[] data), Dictionary> parse = delegate((string key, (string id, string[] items)[] data) streamResult) -{ - var message = streamResult.data.First().items; - var result = new Dictionary(); - for (var i = 0; i < message.Length; i += 2) - { - result.Add(message[i], message[i+1]); - } - - return result; -}; -``` - -## Blocking XREAD - -There are two primary types of 'read' methods, `XREAD` and `XREADGROUP`, this is in addition to the various range methods, which are their category and operate semantically differently from the read operations. `XREAD` lets you read off a given stream and read the _next_ item that hit's the stream. You can do this with the special `$` id. For our purposes here, we are going to block for two seconds, or whenever we get a response back from redis, whichever comes first: - -```csharp -var readThread = new Thread(() => -{ - var readClient = new CSRedisClient("localhost"); - while (!token.IsCancellationRequested) - { - var result = readClient.XRead(1, 5000, new (string key, string id)[] {new("stream", "$")}); - if (result != null) - { - var dictionary = parse(result[0]); - Console.WriteLine($"Most recent message, time: {dictionary["time"]} temp: {dictionary["temp"]}"); - } - } -}); -``` - -## Blocking XREADGROUP - -Blocking `XREADGROUP` commands operate very similarly to `XREAD`. In this case, however, the creation of the group told us what id to start at, and by passing in the `>` we necessarily start off at the next message in the queue. Because we are reading out of a group, we'll also want to `XACK` to any messages that we pull down. Also, since this is our average group, we'll maintain an average for our stream's temperatures. - -```csharp -var total = 0; -var count = 0; -var groupReadThread = new Thread(() => -{ - var groupReadClient = new CSRedisClient("localhost"); - var id = string.Empty; - while (!token.IsCancellationRequested) - { - if (!string.IsNullOrEmpty(id)) - { - client.XAck("stream", "avg", id); - } - var result = - groupReadClient.XReadGroup("avg", "avg-1", 1, 5000, new (string key, string id)[] {new("stream", ">")}); - if (result != null) - { - id = result.First().data.First().id; - var dictionary = parse(result[0]); - if (dictionary.ContainsKey("temp")) - { - count++; - total += int.Parse(dictionary["temp"]); - double avg = (double) total / count; - Console.WriteLine($"Most recent group message, time: {dictionary["time"]} temp: {dictionary["temp"]} avg: {avg:00.00}"); - } - } - } -}); -``` - -## Spin up threads - -The last thing we'll need to do is start up all the threads, set a cancellation timeout (so the app doesn't run forever), and join all the threads back together: - -```csharp -readThread.Start(); -writeThread.Start(); -groupReadThread.Start(); - -cancellationTokenSource.CancelAfter(TimeSpan.FromSeconds(10)); - -readThread.Join(); -writeThread.Join(); -groupReadThread.Join(); -``` - -## Run the app - -Now that the app is written, all that's left to do is run it. 
You can do so by running `dotnet run in your terminal. - -## Resources: - -- The source for this tutorial is in [GitHub](https://github.com/redis-developer/redis-streams-with-dotnet/tree/main/StreamsWithCSRedis) -- Redis University has an extensive [course](https://university.redis.com/courses/ru202/) on Redis Streams where you can learn everything you need to know about them. -- You can learn more about Redis Streams in the [Streams Info](https://redis.io/topics/streams-intro) article on redis.io diff --git a/docs/develop/dotnet/streams/blocking-reads/service-stack/service-stack.md b/docs/develop/dotnet/streams/blocking-reads/service-stack/service-stack.md deleted file mode 100644 index 733ebf60116..00000000000 --- a/docs/develop/dotnet/streams/blocking-reads/service-stack/service-stack.md +++ /dev/null @@ -1,197 +0,0 @@ ---- -id: service-stack -title: How to handle blocking stream reads with ServiceStack.Redis -sidebar_label: Blocking Stream Reads with ServiceStack.Redis -slug: /develop/dotnet/streams/blocking-reads/service-stack -authors: [steve] ---- - -[ServiceStack.Redis](https://github.com/ServiceStack/ServiceStack.Redis) is part of the ServiceStack suite, it has some restrictions when used for commercial purposes - see their [license](https://github.com/ServiceStack/ServiceStack.Redis/blob/master/license.txt) - -## Start Redis - -If you're developing locally (which is what we will assume for the balance of this tutorial), you can start Redis fairly quickly with docker: - -```bash -docker run -p 6379:6379 redis -``` - -## Create the app - -We will build a simple console application for streaming telemetry using the library. To do so, use the `dotnet new` command: - -```bash -dotnet new console -n StreamsWithServiceStack -``` - -### Add the package to your app - -You can add this package to your app with: - -```bash -dotnet add package ServiceStack.Redis -``` - -### Initialize the client manager - -To initialize a client with ServiceStack, you'll need to create a [`RedisClientManager`](https://github.com/ServiceStack/ServiceStack.Redis#redis-client-managers). Then, add the following to `Program.cs`. - -```csharp -var manager = new BasicRedisClientManager("localhost"); -``` - -### Add items to streams - -Redis streams are not yet fully supported by ServiceStack.Redis, however, you can run raw commands easily with the `CustomAsync` method. So let's create a new class called `Producer.cs` and add the following to it. - -```csharp -public static class Producer -{ - public static async Task Produce(BasicRedisClientManager manager, CancellationToken token) - { - var client = await manager.GetClientAsync(token); - var random = new Random(); - while (!token.IsCancellationRequested) - { - await client.CustomAsync("XADD", "telemetry", "*", "temp",random.Next(50,65), "time", DateTimeOffset.Now.ToUnixTimeSeconds()); - await Task.Delay(10000, token); - } - } -} -``` - -This code will send new telemetry every 10 seconds to the `telemetry` stream, with a `temp` record and a `time` record. - -### Reading messages - -As mentioned earlier, ServiceStack does not have native support for the Streams API, so we need to do a bit of work after retrieving a record from a stream. However, this isn't a complex operation since the resulting structure is a predictable set of nested arrays going from an array of the streams requested to an array of messages retrieved from each stream to the message itself split between its id and its attributes. 
Finally, the field value pairs within a message; this looks something like this: - -``` -127.0.0.1:6379> XREAD COUNT 1 BLOCK 20000 STREAMS telemetry $ -1) 1) "telemetry" - 2) 1) 1) "1642857504469-0" - 2) 1) "temp" - 2) "57" - 3) "time" - 4) "1642857504" -``` - -This data structure is pretty predictable to parse, so we'll add a little parsing method. First, Create `Consumer.cs` and add the following to it: - -```csharp -using ServiceStack.Redis; - -namespace StreamsWithServicestack; - -public static class Consumer -{ - public static IDictionary ParseStreamResult(RedisText text, out string id) - { - var result = new Dictionary(); - - var fieldValPairs = text.Children[0].Children[1].Children[0].Children[1].Children; - id = text.Children[0].Children[1].Children[0].Children[0].Text; - for (var i = 0; i < fieldValPairs.Count; i += 2) - { - result.Add(fieldValPairs[i].Text, fieldValPairs[i+1].Text); - } - - return result; - } -} -``` - -`ParseStreamResult` will yield the first message from the first stream of an `XREAD` or `XREADGROUP`, this isn't a fully generalized solution but will serve our purposes here. - -### Reading a stream outside a group with XREAD - -To read the next message in a stream, which is necessarily a blocking operation, you will use the `XREAD` command with the `BLOCK` option and the special `$` id. Then, in the `Consumer` class, add the following, which will read off the stream in a continuous loop, blocking for 20 seconds at each request. - -```csharp -public static async Task Consume(IRedisClientsManagerAsync manager, CancellationToken token) -{ - var client = await manager.GetClientAsync(token); - while (!token.IsCancellationRequested) - { - string id; - var result = await client.CustomAsync("XREAD", "COUNT", 1, "BLOCK", 20000, "STREAMS", "telemetry", "$"); - var fvp = ParseStreamResult(result, out id); - Console.WriteLine($"read: result {id} - temp: {fvp["temp"]} time: {fvp["time"]}"); - } -} -``` - -### Reading with consumer groups - -Reading messages in a consumer group can be helpful in cases where you have a common task that you want to distribute across many consumers in a high-throughput environment. It's a two-step process: - -1. Read the stream -2. Acknowledge receipt of the message - -This task can be done by running an `XREADGROUP` and a `XACK` back to back. The `XREADGROUP` will take, in addition to the parameters we spoke about for the `XREAD`, the `GROUP` name, the consumer's name, and instead of taking the special `$` id, it will take the special `>` id, which will have it take the next unassigned id for the group. We'll then extract the information from it, update our average, and then acknowledge the receipt of the message. - -```csharp -public static async Task ConsumeFromGroup(IRedisClientsManagerAsync manager, CancellationToken token) -{ - var client = await manager.GetClientAsync(token); - var total = 0; - var num = 0; - while (!token.IsCancellationRequested) - { - string id; - var result = await client.CustomAsync("XREADGROUP", "GROUP", "avg", "avg-1", "COUNT", "1", "BLOCK", - 20000, "STREAMS", "telemetry", ">"); - var fvp = ParseStreamResult(result, out id); - total += int.Parse(fvp["temp"]); - num++; - Console.WriteLine( - $"Group-read: result {id} - temp: {fvp["temp"]} time: {fvp["time"]}, current average: {total / num}"); - await client.CustomAsync("XACK", "telemetry", "avg", id); - } -} -``` - -### Create the group and start the tasks - -The final bit we need is to create the group and start up all the tasks. 
We'll use the `XGROUP` command with the `MKSTREAM` option to create the group. We'll then start up all the tasks we need for our producer and consumers, and we'll await everything. Add the following to your `Program.cs` file: - -```csharp -using ServiceStack.Redis; -using StreamsWithServicestack; - -var manager = new BasicRedisClientManager("localhost"); -var asyncClient = await manager.GetClientAsync(); - -var tokenSource = new CancellationTokenSource(); -var token = tokenSource.Token; - -try -{ - await asyncClient.CustomAsync("XGROUP", "CREATE", "telemetry", "avg", "0-0", "MKSTREAM"); -} -catch (Exception ex) -{ - Console.WriteLine(ex); -} - -var writeTask = Producer.Produce(manager, token); -var readTask = Consumer.Consume(manager, token); -var groupReadTask = Consumer.ConsumeFromGroup(manager, token); - -await Task.WhenAll(writeTask, readTask, groupReadTask); - -``` - -## Run the app - -All that's left to do is to run the app, and you'll see a continuous stream of messages coming in every 10 seconds. You can run the app by running: - -```bash -dotnet run -``` - -## Resources: - -- The source for this tutorial is in [GitHub](https://github.com/redis-developer/redis-streams-with-dotnet/tree/main/StreamsWithServicestack) -- Redis University has an extensive [course](https://university.redis.com/courses/ru202/) on Redis Streams where you can learn everything you need to know about them. -- You can learn more about Redis Streams in the [Streams Info](https://redis.io/topics/streams-intro) article on redis.io diff --git a/docs/develop/dotnet/streams/streams-basics.md b/docs/develop/dotnet/streams/streams-basics.md index bb677d453ee..254089c733b 100644 --- a/docs/develop/dotnet/streams/streams-basics.md +++ b/docs/develop/dotnet/streams/streams-basics.md @@ -90,11 +90,15 @@ The results retrieved from Redis will be in a reasonably readable form; all the Dictionary ParseResult(StreamEntry entry) => entry.Values.ToDictionary(x => x.Name.ToString(), x => x.Value.ToString()); ``` -> Note: Stream messages enforce no requirement that field names be unique. We use a dictionary for clarity sake in this example, but you will need to ensure that you are not passing in multiple fields with the same names in your usage to prevent an issue using a dictionary. +:::note + +Stream messages enforce no requirement that field names be unique. We use a dictionary for clarity sake in this example, but you will need to ensure that you are not passing in multiple fields with the same names in your usage to prevent an issue using a dictionary. + +::: ## Spin up most recent element task -Next, we'll need to spin up a task to read the most recent element off of the stream. To do this, we'll use the `StreamRangeAsync` method passing in two special ids, `-` which means the lowest id, and `+`, which means the highest id. Running this command will result in some duplication. This redundancy is necessary because the `StackExchange.Redis` library does not support blocking stream reads and does not support the special `$` character for stream reads. Overcoming this behavior is explored in-depth in the [Blocking Reads](blocking-reads) tutorial. For this tutorial, you can manage these most-recent reads with the following code: +Next, we'll need to spin up a task to read the most recent element off of the stream. To do this, we'll use the `StreamRangeAsync` method passing in two special ids, `-` which means the lowest id, and `+`, which means the highest id. Running this command will result in some duplication. 
This redundancy is necessary because the `StackExchange.Redis` library does not support blocking stream reads and does not support the special `$` character for stream reads. For this tutorial, you can manage these most-recent reads with the following code: ```csharp var readTask = Task.Run(async () => diff --git a/docs/develop/golang/images/bkup/app_preview_image.png b/docs/develop/golang/images/bkup/app_preview_image.png deleted file mode 100644 index a5eb2eae369..00000000000 Binary files a/docs/develop/golang/images/bkup/app_preview_image.png and /dev/null differ diff --git a/docs/develop/golang/images/bkup/leaderboardgo.png b/docs/develop/golang/images/bkup/leaderboardgo.png deleted file mode 100644 index 899d1d37386..00000000000 Binary files a/docs/develop/golang/images/bkup/leaderboardgo.png and /dev/null differ diff --git a/docs/develop/golang/images/bkup/ratelimitinggo.png b/docs/develop/golang/images/bkup/ratelimitinggo.png deleted file mode 100644 index 441aed59144..00000000000 Binary files a/docs/develop/golang/images/bkup/ratelimitinggo.png and /dev/null differ diff --git a/docs/develop/golang/images/leaderboardgo.png b/docs/develop/golang/images/leaderboardgo.png deleted file mode 100644 index dca8553d8c9..00000000000 Binary files a/docs/develop/golang/images/leaderboardgo.png and /dev/null differ diff --git a/docs/develop/golang/images/ratelimitinggo.png b/docs/develop/golang/images/ratelimitinggo.png deleted file mode 100644 index c49a809960a..00000000000 Binary files a/docs/develop/golang/images/ratelimitinggo.png and /dev/null differ diff --git a/docs/develop/golang/index-golang.mdx b/docs/develop/golang/index-golang.mdx deleted file mode 100644 index c2c94aee80a..00000000000 --- a/docs/develop/golang/index-golang.mdx +++ /dev/null @@ -1,248 +0,0 @@ ---- -id: index-golang -title: Go and Redis -sidebar_label: Go -slug: /develop/golang/ -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -Find tutorials, examples and technical articles that will help you to develop with Redis and Golang. - -## Getting Started - -Golang community has built many client libraries that you can find [here](https://redis.io/clients#go). -For your first steps with Golang and Redis, this article will show how to use the recommended library: [redigo](https://github.com/gomodule/redigo). - - - - -The `redigo` library is located in the `https://github.com/gomodule/redigo` that you must import in your application. - -#### Step 1. Import the `redigo` module - -```bash - go get github.com/gomodule/redigo/redis -``` - -```go - import ( - "fmt" - "context" - "github.com/gomodule/redigo/redis" - ) -``` - -#### Step 2. Create a connection pool - -```go - func newPool() *redis.Pool { - return &redis.Pool{ - MaxIdle: 80, - MaxActive: 12000, - Dial: func() (redis.Conn, error) { - c, err := redis.Dial("tcp", ":6379") - if err != nil { - panic(err.Error()) - } - return c, err - }, - } - } -``` - -#### Step 3. 
Write your application code - -```go - package main - - import ( - "fmt" - - "github.com/gomodule/redigo/redis" - ) - - var pool = newPool() - - func main() { - - client := pool.Get() - defer client.Close() - - _, err := client.Do("SET", "mykey", "Hello from redigo!") - if err != nil { - panic(err) - } - - value, err := client.Do("GET", "mykey") - if err != nil { - panic(err) - } - - fmt.Printf("%s \n", value) - - _, err = client.Do("ZADD", "vehicles", 4, "car") - if err != nil { - panic(err) - } - _, err = client.Do("ZADD", "vehicles", 2, "bike") - if err != nil { - panic(err) - } - - vehicles, err := client.Do("ZRANGE", "vehicles", 0, -1, "WITHSCORES") - if err != nil { - panic(err) - } - fmt.Printf("%s \n", vehicles) - - - } - - func newPool() *redis.Pool { - return &redis.Pool{ - MaxIdle: 80, - MaxActive: 12000, - Dial: func() (redis.Conn, error) { - c, err := redis.Dial("tcp", ":6379") - if err != nil { - panic(err.Error()) - } - return c, err - }, - } - } -``` - -Find more information about Golang & Redis connections in the "[Redis Connect](https://github.com/redis-developer/redis-connect/tree/master/golang/redigo)". - - - - -Go-redis is a type-safe, Redis client library for Go with support for features like Pub/Sub, sentinel, and pipelining.It is a Redis client able to support a Redis cluster and is designed to store and update slot info automatically with a cluster change. Below are the attractive features of Go-redis: - -- Go-redis has pooling capabilities.(Pools allow you to safely handle go-routines, auto reconnect if any error occurs) -- It supports both standard, OSS cluster AIP, and Sentinel -- Comes with Auto reconnects / Auto-rediscovers cluster slots on error/migration -- Support instrumentations -- Allows for a custom dialer (this is useful for Enterprise) -- Support for Redis Sentinel - -The go-redis library is located in the https://github.com/go-redis/redis that you must import in your application. -Do check [Redis Cache Library for Golang](https://github.com/go-redis/cache) - -#### Step 1. Run a Redis server - -Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. You can run a Redis database directly over your local mac os or in a container. If you have Docker installed in your sytem, type the following command: - -```bash - docker run -d -p 6379:6379 redislabs/redismod -``` - -You can connect to Redis server using the `redis-cli` command like this: - -``` - redis-cli -``` - -The above command will make a connection to the Redis server. It will then present a prompt that allows you to run Redis commands. - -#### Step 2. Initialise the Go Module - -In order to connect to the Redis instance and return some data value, first you need to initialize the Go module as shown: - -```bash - go mod init github.com/my/repo -``` - -#### Step 3. Install redis/v8 - -```bash - go get github.com/go-redis/redis/v8 -``` - -#### Step 4. 
Create a main.go file - -Let us create a `main.go` file and write the following code to check for your Redis instance connection - -```bash - package main - -import ( - "fmt" - "github.com/go-redis/redis" -) - -func main() { - fmt.Println("Testing Golang Redis") - - client := redis.NewClient(&redis.Options{ - Addr: "localhost:6379", - Password: "", - DB: 0, - }) - - pong, err := client.Ping(client.Context()).Result() - fmt.Println(pong, err) - - } -``` - -#### Step 5. Begin the compilation - -``` - go run main.go -``` - -By now, the Go application should successfully connect to the Redis instance and return data value (a successful "PONG" response). - - - - - -### Redis Launchpad - -Redis Launchpad is like an “App Store” for Redis sample apps. You can easily find apps for your preferred frameworks and languages. -Check out a few of these apps below, or [click here to access the complete list](https://launchpad.redis.com). - -
- -#### Rate-Limiting app in Go - -![launchpad](images/ratelimitinggo.png) - -[Rate Limiting app](http://launchpad.redis.com/?id=project%3Abasic-redis-rate-limiting-demo-go-lang) built in Go - -
- -#### Leaderboard app in Go - -![launchpad](images/leaderboardgo.png) - -[How to implement leaderboard app](https://launchpad.redis.com/?id=project%3Abasic-redis-leaderboard-demo-python) in Go - -
- -### Technical Articles & Whitepapers - -**[Redis and Golang: Designed to Improve Performance](https://redis.com/blog/redis-go-designed-improve-performance/)** -**[Redisqueue - A producer and consumer of a queue that uses Redis streams](https://pkg.go.dev/github.com/robinjoseph08/redisqueue#section-readme) -**[A High Performance Recommendation Engine with Redis and Go](https://redis.com/docs/ultra-fast-recommendations-engine-using-redis-go/)\*\* diff --git a/docs/develop/guides/netlify/getting-started/index-getting-started.mdx b/docs/develop/guides/netlify/getting-started/index-getting-started.mdx index ad9a7e56f3e..71f11b6631a 100644 --- a/docs/develop/guides/netlify/getting-started/index-getting-started.mdx +++ b/docs/develop/guides/netlify/getting-started/index-getting-started.mdx @@ -7,8 +7,6 @@ slug: /develop/guides/netlify/getting-started import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; -
diff --git a/docs/develop/java/getting-started/index.md b/docs/develop/java/getting-started/index.md index 818d3de928d..643b68af1fe 100644 --- a/docs/develop/java/getting-started/index.md +++ b/docs/develop/java/getting-started/index.md @@ -8,23 +8,13 @@ slug: /develop/java/getting-started import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; Find tutorials, examples and technical articles that will help you to develop with Redis and Java. ## Getting Started -Java community has built many client libraries that you can find [here](https://redis.io/clients#java). For your first steps with Java and Redis, this article will show how to use the two main libraries: [Jedis](https://github.com/redis/jedis) and [Lettuce](https://lettuce.io/). - -The blog post “[Jedis vs. Lettuce: An Exploration](https://redis.com/blog/jedis-vs-lettuce-an-exploration/)” will help you to select the best for your application; keeping in mind that both are available in Spring & SpringBoot framework. - - - +Java community has built many client libraries that you can find [here](https://redis.io/clients#java). For your first steps with Java and Redis, this article will show how to use [Jedis](https://github.com/redis/jedis), the supported Redis client for Java. Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. @@ -39,8 +29,7 @@ Use these commands to setup a Redis server locally on Mac OS: ``` :::info INFO -Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack bundles five Redis modules: RedisJSON, RedisSearch, RedisGraph, RedisTimeSeries, and RedisBloom. -[Learn more](/create/redis-stack) +Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack provides the following in addition to Redis Open Source: JSON, Search, Time Series, and Probabilistic data structures. ::: Ensure that you are able to use the following Redis command to connect to the Redis instance. @@ -58,7 +47,7 @@ Ensure that you are able to use the following Redis command to connect to the Re redis.clients jedis - 3.4.0 + 5.0.2 ``` @@ -106,53 +95,6 @@ Once you have access to the connection pool you can now get a Jedis instance and Find more information about Java & Redis connections in the "[Redis Connect](https://github.com/redis-developer/redis-connect/tree/master/java/jedis)". - - - -## Using Lettuce - -### Step 1. Add dependencies Jedis dependency to your Maven (or Gradle) project file: - -```xml - - io.lettuce - lettuce-corea - 6.0.1.RELEASE - -``` - -### Step 2. Import the Jedis classes - -```java - import io.lettuce.core.RedisClient; - import io.lettuce.core.api.StatefulRedisConnection; - import io.lettuce.core.api.sync.RedisCommands; -``` - -### Step 3. 
Write your application code - -```java - RedisClient redisClient = RedisClient.create("redis://localhost:6379/"); - StatefulRedisConnection connection = redisClient.connect(); - RedisCommands syncCommands = connection.sync(); - - syncCommands.set("mykey", "Hello from Lettuce!"); - String value = syncCommands.get("mykey"); - System.out.println( value ); - - syncCommands.zadd("vehicles", 0, "car"); - syncCommands.zadd("vehicles", 0, "bike"); - List vehicles = syncCommands.zrange("vehicles", 0, -1); - System.out.println( vehicles ); - - connection.close(); - redisClient.shutdown(); -``` - -Find more information about Java & Redis connections in the "[Redis Connect](https://github.com/redis-developer/redis-connect/tree/master/java/lettuce)". - - - ### Redis Launchpad Redis Launchpad is like an “App Store” for Redis sample apps. You can easily find apps for your preferred frameworks and languages. @@ -167,7 +109,7 @@ Check out a few of these apps below, or [click here to access the complete list] ![launchpad](images/moviedatabasejava.png) -[Movie Database app in Java](http://launchpad.redis.com/?id=project%3Ademo-movie-app-redisearch-java) based on RediSearch capabilities +[Movie Database app in Java](http://launchpad.redis.com/?id=project%3Ademo-movie-app-redisearch-java) based on Search capabilities
@@ -248,14 +190,8 @@ As developer you can use the Java client library directly in your application, o ### More developer resources -
- -#### Sample Code - **[Brewdis - Product Catalog (Spring)](https://github.com/redis-developer/brewdis)** -See how to use Redis and Spring to build a product catalog with streams, hashes and RediSearch +See how to use Redis and Spring to build a product catalog with streams, hashes and Search **[Redis Stream in Action (Spring)](https://github.com/redis-developer/redis-streams-in-action)** See how to use Spring to create multiple producer and consumers with Redis Streams @@ -263,27 +199,6 @@ See how to use Spring to create multiple producer and consumers with Redis Strea **[Rate Limiting with Vert.x](https://github.com/redis-developer/vertx-rate-limiting-redis)** See how to use Redis Sorted Set with Vert.x to build a rate limiting service. -**[Redis Java Samples with Lettuce](https://github.com/redis-developer/vertx-rate-limiting-redis)** -Run Redis Commands from Lettuce - -
-
-
- -
- -#### Technical Articles - -**[Getting Started with Redis Streams and Java (Lettuce)](https://redis.com/blog/getting-started-with-redis-streams-and-java/)** - -**[Jedis vs. Lettuce: An Exploration](https://redis.com/blog/jedis-vs-lettuce-an-exploration/)** - -
- -
- ---- - ### Redis University ### [Redis for Java Developers](https://university.redis.com/courses/ru102j/) @@ -294,8 +209,6 @@ Redis for Java Developers teaches you how to build robust Redis client applicati -## - diff --git a/docs/develop/java/index-java.mdx b/docs/develop/java/index-java.mdx index 5d40bb17cb9..0cbc4e12e89 100644 --- a/docs/develop/java/index-java.mdx +++ b/docs/develop/java/index-java.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /develop/java --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; Explore the many different ways to build Java applications powered by Redis: diff --git a/docs/develop/java/spring/getting-started.mdx b/docs/develop/java/spring/getting-started.mdx index 0e6cd44edc2..d72fd319b50 100644 --- a/docs/develop/java/spring/getting-started.mdx +++ b/docs/develop/java/spring/getting-started.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/ authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + #### Data-Driven Applications with Spring Boot and Redis diff --git a/docs/develop/java/spring/index-spring.mdx b/docs/develop/java/spring/index-spring.mdx index 556a01a79f9..f5f2679302e 100644 --- a/docs/develop/java/spring/index-spring.mdx +++ b/docs/develop/java/spring/index-spring.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /develop/java/spring --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; Discover the many different ways to build powerful SpringBoot applications with Redis: diff --git a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-gears.mdx b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-gears.mdx index b234869947a..fda5e8febd5 100644 --- a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-gears.mdx +++ b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-gears.mdx @@ -5,10 +5,11 @@ sidebar_label: Atomicity with Gears slug: /develop/java/spring/rate-limiting/fixed-window/reactive-gears --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +:::warning LETTUCE + +This tutorial uses Lettuce, which is an unsupported Redis library. For production applications, we recommend using [**Jedis**](https://github.com/redis/jedis) + +::: ## Improving atomicity and performance with RedisGears @@ -115,10 +116,10 @@ In order to use our RedisGear function from our SpringBoot application we need t [LettuceMod](https://github.com/redis-developer/lettucemod) is a Java client for Redis Modules based on Lettuce created by [Julien Ruaux ](https://github.com/jruaux). 
It supports the following modules in standalone or cluster configurations: -- RedisGears -- RedisJSON -- RediSearch -- RedisTimeSeries +- Triggers and Functions +- JSON +- Search +- Time Series To use LettuceMod we'll add the dependency to the Maven POM as shown: diff --git a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-lua.mdx b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-lua.mdx index ca3a18e99e7..77f941148c2 100644 --- a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-lua.mdx +++ b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive-lua.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/rate-limiting/fixed-window/reactive-lua authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ## Improving atomicity and performance with Lua diff --git a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive.mdx b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive.mdx index 4a224b7cda2..2366384db09 100644 --- a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive.mdx +++ b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window-reactive.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/rate-limiting/fixed-window/reactive authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ## A basic Spring Web Flux App diff --git a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window.mdx b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window.mdx index 06c1d2fa9b1..e1a3c0af85c 100644 --- a/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window.mdx +++ b/docs/develop/java/spring/rate-limiting/fixed-window/index-fixed-window.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/rate-limiting/fixed-window authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + The simplest approach to build a rate limiter is the "fixed window" implementation in which we cap the maximum number of requests in a fixed window of time. For exmaple, if the window size is 1 hour, we can diff --git a/docs/develop/java/spring/rate-limiting/getting-started/getting-started.mdx b/docs/develop/java/spring/rate-limiting/getting-started/getting-started.mdx index e75eab9d981..3be06421f72 100644 --- a/docs/develop/java/spring/rate-limiting/getting-started/getting-started.mdx +++ b/docs/develop/java/spring/rate-limiting/getting-started/getting-started.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/rate-limiting/getting-started authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + In this series of mini-tutorials we'll explore several approaches to implement rate limiting in Spring applications using Redis. 
We’ll start with the most basic of Redis recipes and we’ll slowly increase the complexity of our implementations. diff --git a/docs/develop/java/spring/rate-limiting/index-ratelimiting.mdx b/docs/develop/java/spring/rate-limiting/index-ratelimiting.mdx index e74f2cd080b..8b387a5be02 100644 --- a/docs/develop/java/spring/rate-limiting/index-ratelimiting.mdx +++ b/docs/develop/java/spring/rate-limiting/index-ratelimiting.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /develop/java/spring/rate-limiting --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provides you with the available options to develop your application using NodeJS and Redis @@ -48,8 +48,8 @@ The following links provides you with the available options to develop your appl
diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight-query-graph.png b/docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight-query-graph.png similarity index 100% rename from docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight-query-graph.png rename to docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight-query-graph.png diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight-query-table.png b/docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight-query-table.png similarity index 100% rename from docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight-query-table.png rename to docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight-query-table.png diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight.png b/docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight.png similarity index 100% rename from docs/develop/java/spring/redis-and-spring-course/lesson_8/images/redis-insight.png rename to docs/develop/java/spring/redis-and-spring-course/_lesson_8/images/redis-insight.png diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_8/index-lesson_8.mdx b/docs/develop/java/spring/redis-and-spring-course/_lesson_8/index-lesson_8.mdx similarity index 99% rename from docs/develop/java/spring/redis-and-spring-course/lesson_8/index-lesson_8.mdx rename to docs/develop/java/spring/redis-and-spring-course/_lesson_8/index-lesson_8.mdx index 435c72d9808..bd48c32ad83 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_8/index-lesson_8.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/_lesson_8/index-lesson_8.mdx @@ -3,11 +3,14 @@ id: index-lesson_8 title: 'Recommendations with RedisGraph' sidebar_label: Recommendations w/ RedisGraph slug: /develop/java/redis-and-spring-course/lesson_8 +authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; -Author: [Brian Sam-Bodden](https://twitter.com/bsbodden) + + ### Objectives diff --git a/docs/develop/java/spring/redis-and-spring-course/index-redis-and-spring-course.mdx b/docs/develop/java/spring/redis-and-spring-course/index-redis-and-spring-course.mdx index c195a860fc8..5305ec57d7e 100644 --- a/docs/develop/java/spring/redis-and-spring-course/index-redis-and-spring-course.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/index-redis-and-spring-course.mdx @@ -6,10 +6,10 @@ slug: /develop/java/redis-and-spring-course authors: [bsb] --- -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; +import RedisCard from '@theme/RedisCard'; +import Authors from '@theme/Authors'; + + This is a complete online course for Java/Spring developers wanting to learn how Redis can serve as your primary database in Spring Applications and how to leverage the @@ -87,8 +87,8 @@ along with other teaching assistants. Join us on the Redis Discord server.
@@ -97,23 +97,15 @@ along with other teaching assistants. Join us on the Redis Discord server.
-
- -
-
diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_1/index-lesson_1.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_1/index-lesson_1.mdx index b11f4552612..d7893841988 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_1/index-lesson_1.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_1/index-lesson_1.mdx @@ -6,7 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_1 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives @@ -86,19 +88,16 @@ Read the [Git documentation on submodules](https://git-scm.com/book/en/v2/Git-To You’ll find this file in the [redismod-docker-compose repo](https://github.com/redis-developer/redismod-docker-compose), hosted under the [redis-developer](https://github.com/redis-developer/) organization in Github. -The Repo contains a [Docker Compose](https://docs.docker.com/compose) file configured to use the Redis “redismod” image, which is a Docker image -that includes Redis built with select modules. In particular, we will use the [RedisJSON](https://redisjson.io), [RediSearch](https://redisearch.io), -and [RedisGraph](https://redisgraph.io) modules while building this application. +The Repo contains a [Docker Compose](https://docs.docker.com/compose) file configured to use the Redis “redismod” image, which is a Docker image that includes Redis built with support for JSON, Search, Graph, Time Series, Triggers and Functions, and Probabilistic data structures. Modules included in the container: -- **RediSearch**: a full-featured search engine -- **RedisGraph**: a graph database -- **RedisTimeSeries**: a time-series database -- **RedisAI**: a tensor and deep learning model server -- **RedisJSON**: a native JSON data type -- **RedisBloom**: native Bloom and Cuckoo Filter data types -- **RedisGears**: a dynamic execution framework +- **Search**: a full-featured search engine +- **Graph**: a graph database +- **Time Series**: a time-series database +- **JSON**: a native JSON data type +- **Probabilistic**: native Bloom and Cuckoo Filter data types +- **Triggers and Functions**: a dynamic execution framework To add the submodule, we use the git submodule command at the root of the project: @@ -298,6 +297,6 @@ This makes Redis an excellent choice for applications that require real time dat Here's some resources that we think will be useful to you as you discover Redis: - [redis.io](https://redis.io/) - the official website of open source Redis. -- [Redis Enterprise Cloud](https://redis.com/redis-enterprise-cloud/overview/) - a fully managed cloud service from Redis with a free plan for getting started. +- [Redis Cloud](https://redis.com/redis-enterprise-cloud/overview/) - a fully managed cloud service from Redis with a free plan for getting started. - The official [Redis Docker image](https://hub.docker.com/_/redis/). - For a comprehensive introduction to Redis, we recommend taking a look at the [RU101: Introduction to Redis Data Structures](https://university.redis.com/courses/ru101/) course at Redis University. In this free online course, you’ll learn about the data structures in Redis, and you’ll see how to practically apply them in the real world. 
diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_2/index-lesson_2.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_2/index-lesson_2.mdx index 7522098beab..45efff84056 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_2/index-lesson_2.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_2/index-lesson_2.mdx @@ -6,7 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_2 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives @@ -61,7 +63,7 @@ import org.springframework.data.redis.core.RedisTemplate; Notice that while the template types are generic, it is up to the serializers/deserializers to convert the given Objects to-and-from binary data correctly. -We could also configure the Redis host and port programmatically by defining a `@Bean` annotated method that returns a `RedisConnectionFactory` (either a `JedisConnectionFactory` or `LettuceConnectionFactory`) and use the `setHostName` and `setPort` methods. +We could also configure the Redis host and port programmatically by defining a `@Bean` annotated method that returns a `RedisConnectionFactory` (such as a `JedisConnectionFactory`) and use the `setHostName` and `setPort` methods. But since Spring Data Redis can configure the beans using a properties file (either Java Properties or YAML), we will use the `applications.properties` file instead. diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_3/index-lesson_3.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_3/index-lesson_3.mdx index 1f48ec49307..ee033813b5e 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_3/index-lesson_3.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_3/index-lesson_3.mdx @@ -6,7 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_3 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_4/index-lesson_4.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_4/index-lesson_4.mdx index 024814c1838..0ccc9ffffad 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_4/index-lesson_4.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_4/index-lesson_4.mdx @@ -6,7 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_4 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_5/index-lesson_5.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_5/index-lesson_5.mdx index 278e4cc0a38..945efea3f6f 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_5/index-lesson_5.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_5/index-lesson_5.mdx @@ -6,7 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_5 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_6/index-lesson_6.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_6/index-lesson_6.mdx index 1b8ad1d6904..0b5d6174aa4 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_6/index-lesson_6.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_6/index-lesson_6.mdx @@ -1,16 +1,18 @@
--- id: index-lesson_6 -title: 'Domain Models with RedisJSON' -sidebar_label: Domain Models w/ RedisJSON +title: Domain Models with Redis +sidebar_label: Domain Models w/ Redis slug: /develop/java/redis-and-spring-course/lesson_6 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives -Add a JSON-backed domain model to Redi2Read using the RedisJSON Redis module. +Add a JSON-backed domain model to Redi2Read using [Redis Stack](https://redis.io/docs/stack/). ### Agenda @@ -22,23 +24,40 @@ In this lesson, you'll learn how to: If you get stuck: - The progress made in this lesson is available on the redi2read github repository at https://github.com/redis-developer/redi2read/tree/course/milestone-6 +:::note + +As of [Jedis 4.0.0](https://github.com/redis/jedis) this library is deprecated. It's features have been merged into Jedis. + +::: + ### Carts and Cart Items -We will implement the `Cart` and `CartItem` models backed by a custom Spring Repository that uses the RedisJSON API via the JRedisJSON client library. +We will implement the `Cart` and `CartItem` models backed by a custom Spring Repository that uses the [Redis JSON API](https://redis.io/commands/?group=json) via the JRedisJSON client library. We will represent a user’s cart as a JSON document containing cart item subdocuments. As you can see in the class diagram, a `Cart` has zero or more `CartItems`, and it belongs to a `User`. ![Carts and Cart Items](./images/carts_and_cartitems.png) -### RedisJSON +### Redis Stack + +[Redis Stack](https://redis.io/docs/stack/about/) extends the core capabilities of Redis OSS and provides a complete developer experience for debugging and more. In addition to all of the features of Redis OSS, Redis Stack supports: + +- Queryable JSON documents +- Querying across hashes and JSON documents +- Time series data support (ingestion & querying), including full-text search +- Probabilistic data structures -RedisJSON is a Redis module that lets you store, update, and fetch JSON values natively in Redis. -JSON can be a better fit for modeling complex data in Redis than Hashes because, unlike Hashes, -JSON values can contain nested arrays and objects. ### JRedisJSON -JRedisJSON (https://github.com/RedisJSON/JRedisJSON) is a Java client that provides access to RedisJSON's Redis API and provides Java serialization using Google’s GSON library. +:::note + +As of [Jedis 4.0.0](https://github.com/redis/jedis) this library is deprecated. It's features have been merged into Jedis. + +::: + + +JRedisJSON (https://github.com/RedisJSON/JRedisJSON) is a Java client that provides access to Redis's JSON API and provides Java serialization using Google’s GSON library. #### Adding JRedisJSON as a Dependency @@ -188,7 +207,7 @@ Now, when JSON serialization occurs in the REST controllers, the user collection ### The Cart Repository -RedisJSON is not yet seamlessly integrated with Spring, but that does not prevent us from using RedisJSON the “Spring Way”. We have provided an implementation of Spring’s CrudRepository so that we can implement our services and controllers. +Below, we have provided an implementation of Spring’s CrudRepository so that we can implement our services and controllers. 
Add the file src/main/java/com/redislabs/edu/redi2read/repositories/CartRepository.java with the following contents: ```java diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_7/index-lesson_7.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_7/index-lesson_7.mdx index 42b083bec7b..e63512be1f9 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_7/index-lesson_7.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_7/index-lesson_7.mdx @@ -1,38 +1,52 @@ --- id: index-lesson_7 -title: 'Search with RediSearch' -sidebar_label: Search w/ RediSearch +title: Search with Redis +sidebar_label: Search w/ Redis slug: /develop/java/redis-and-spring-course/lesson_7 authors: [bsb] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +:::warning LETTUCE + +This tutorial uses Lettuce, which is an unsupported Redis library. For production applications, we recommend using [**Jedis**](https://github.com/redis/jedis) + +::: + +import Authors from '@theme/Authors'; + + ### Objectives -Learn how the RediSearch module can bridge the querying gap between SQL and NoSQL systems. We’ll focus on two everyday use cases: full-text search and auto-complete. +Learn how the built-in Search and Query engine in Redis can bridge the querying gap between SQL and NoSQL systems. We’ll focus on two everyday use cases: full-text search and auto-complete. ### Agenda In this lesson, you'll learn: -- How to create search indexes with RediSeach using spring-redisearch and lettuce-search. -- How to use RediSearch in a Spring Boot application to implement faceted search. -- How to use the RediSearch suggestions feature to implement auto-complete. +- How to create search indexes with Redis using spring-redisearch and lettuce-search. +- How to use Redis in a Spring Boot application to implement faceted search. +- How to use the Redis suggestions feature to implement auto-complete. If you get stuck: - The progress made in this lesson is available on the redi2read github repository at https://github.com/redis-developer/redi2read/tree/course/milestone-7 -### RediSearch +### Redis Stack Search and Query engine -RediSearch is a source-available module for querying, secondary indexing, and full-text search in Redis. -Redisearch implements a secondary index in Redis, but unlike other Redis indexing libraries, it does not use internal data structures such as sorted sets. -This also enables more advanced features, such as multi-field queries, aggregation, and full-text search. Also, RediSearch supports exact phrase matching and numeric filtering for text queries, neither possible nor efficient with traditional Redis indexing approaches. +Redis Stack is a source-available version of Redis used for querying, secondary indexing, and full-text search in Redis. +Redis Stack implements a secondary index in Redis, but unlike other Redis indexing libraries, it does not use internal data structures such as sorted sets. +This also enables more advanced features, such as multi-field queries, aggregation, and full-text search. Also, Redis Stack supports exact phrase matching and numeric filtering for text queries, neither possible nor efficient with traditional Redis indexing approaches. Having a rich query and aggregation engine in your Redis database opens the door to many new applications that go well beyond caching. You can use Redis as your primary database even when you need to access the data using complex queries without adding complexity to the code to update and index data. 
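Because the warning above steers production applications toward Jedis, a rough Jedis counterpart of this lesson's index-and-search flow may be useful for comparison. This is a sketch rather than part of the course code: it assumes Jedis 5.x's `ftCreate`/`ftSearch` API (exact signatures may vary between Jedis versions), a Redis Stack instance on `localhost:6379`, and an illustrative key prefix for the `Book` hashes.

```java
import java.util.List;

import redis.clients.jedis.JedisPooled;
import redis.clients.jedis.exceptions.JedisDataException;
import redis.clients.jedis.search.FTCreateParams;
import redis.clients.jedis.search.IndexDataType;
import redis.clients.jedis.search.SearchResult;
import redis.clients.jedis.search.schemafields.TagField;
import redis.clients.jedis.search.schemafields.TextField;

public class BooksSearchSketch {

    public static void main(String[] args) {
        JedisPooled jedis = new JedisPooled("localhost", 6379);

        // Create the index over hashes; the key prefix below is an assumption for illustration only.
        try {
            jedis.ftCreate("books-idx",
                FTCreateParams.createParams()
                    .on(IndexDataType.HASH)
                    .prefix("com.redislabs.edu.redi2read.models.Book:"),
                List.of(TextField.of("title"),
                        TextField.of("subtitle"),
                        TagField.of("authors.[0]")));
        } catch (JedisDataException e) {
            // Index already exists; nothing to do.
        }

        // Full-text query, equivalent to: FT.SEARCH books-idx "networking"
        SearchResult result = jedis.ftSearch("books-idx", "networking");
        result.getDocuments().forEach(doc -> System.out.println(doc.getId()));
    }
}
```

The `ftSearch` call issues the same kind of `FT.SEARCH books-idx "networking"` query shown later in this lesson.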
### Using spring-redisearch -Spring RediSearch (https://github.com/RediSearch/spring-redisearch) is a library built on LettuSearch (https://github.com/RediSearch/lettusearch), providing access to RediSearch from Spring applications. -LettuSearch is a Java client for RediSearch based on the popular Redis Java client library Lettuce. +:::warning + +Spring Redis Search and LettuSearch have been merged into the multi-module client [**LettuceMod**](https://github.com/redis-developer/lettucemod). Please use LettuceMod instead. + +::: + +Spring Redis Search (https://github.com/RediSearch/spring-redisearch) is a library built on LettuSearch (https://github.com/RediSearch/lettusearch), providing access to Redis Stack from Spring applications. +LettuSearch is a Java client for Redis Stack based on the popular Redis Java client library Lettuce. Adding the `spring-redisearch` dependency In your Maven `pom.xml`, add the following dependency: @@ -55,7 +69,7 @@ For the `Book` model, you will be indexing four fields: #### Authors -Creating the index is done using the FT.CREATE command. The RediSearch engine will scan the database using one or more PREFIX key pattern values and update the index based on the schema definition. +Creating the index is done using the FT.CREATE command. The Redis Search and Query engine will scan the database using one or more PREFIX key pattern values and update the index based on the schema definition. This active index maintenance makes it easy to add an index to an existing application. To create our index, we’ll use the now-familiar `CommandLineRunner` recipe. We will keep the name of the soon to be created index in the application's property field as shown:
-To see more options and all field types, see https://oss.redis.com/redisearch/Commands/#ftcreate +To see more options and all field types, see https://redis.io/commands/ft.create/ On server restart, you should run your Redis CLI MONITOR to see the following commands: ``` @@ -172,7 +186,7 @@ This snippet from the FT.INFO command output for the `“books-idx”` index sho ### Full-text Search Queries -RediSearch is a full-text search engine, allowing the application to run powerful queries. For example, to search all books that contain “networking”-related information, you would run the following command: +Redis Stack is a full-text search engine, allowing the application to run powerful queries. For example, to search all books that contain “networking”-related information, you would run the following command: ```bash 127.0.0.1:6379> FT.SEARCH books-idx "networking" RETURN 1 title @@ -243,7 +257,7 @@ Unions: 127.0.0.1:6379> FT.SEARCH books-idx "rust | %scal%" RETURN 3 title subtitle authors.[0] ``` -You can find more information about the query syntax in the RediSearch documentation. +You can find more information about the query syntax in the [Redis Search documentation](https://redis.io/docs/stack/search/). Adding Search to the Books Controller To add full-text search capabilities to the `BooksController`, we'll first inject a `StatefulRediSearchConnection` and simply pass a text query param to the search method available from the `RediSearchCommands` interface: @@ -315,11 +329,11 @@ This returns: ] ``` -### Adding and getting Auto-complete suggestions +### Adding and getting auto-complete suggestions -RediSearch provides a completion suggester that is typically used for auto-complete/search-as-you-type functionality. +Redis Stack provides a completion suggester that is typically used for auto-complete/search-as-you-type functionality. This is a navigational feature to guide users to relevant results as they are typing, improving search precision. -RediSearch provides completion suggestions with four commands: +Redis provides completion suggestions with four commands: - FT.SUGADD: Adds a suggestion string to an auto-complete dictionary. - FT.SUGGET: Get a list of suggestions for a string. 
diff --git a/docs/develop/java/spring/redis-and-spring-course/lesson_9/index-lesson_9.mdx b/docs/develop/java/spring/redis-and-spring-course/lesson_9/index-lesson_9.mdx index d72e71ee067..bd21e0e0301 100644 --- a/docs/develop/java/spring/redis-and-spring-course/lesson_9/index-lesson_9.mdx +++ b/docs/develop/java/spring/redis-and-spring-course/lesson_9/index-lesson_9.mdx @@ -6,10 +6,9 @@ slug: /develop/java/redis-and-spring-course/lesson_9 authors: [bsb] --- -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + ### Objectives diff --git a/docs/develop/java/spring/redis-om/index-redis-om-spring-hash.mdx b/docs/develop/java/spring/redis-om/index-redis-om-spring-hash.mdx index 4825cf486f6..9a2f1f23160 100644 --- a/docs/develop/java/spring/redis-om/index-redis-om-spring-hash.mdx +++ b/docs/develop/java/spring/redis-om/index-redis-om-spring-hash.mdx @@ -5,11 +5,6 @@ sidebar_label: Working with Hashes slug: /develop/java/spring/redis-om/redis-om-spring-hash --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - ## Introduction The Spring Data Redis (SDR) framework makes it easy to write Spring applications that use the Redis as a store @@ -20,8 +15,8 @@ Redis OM Spring, builds on top of SDR to improve and optimize the interaction wi Redis' rich module ecosystem. For Java objects mapped with SDR's `@RedisHash` annotation we enhance the object-mapping by: -- Eliminating the need for client-side maintained secondary indices and instead using Redis' native search engine: RediSearch. -- Implementing dynamic repository finders using RediSearch fast and flexible querying +- Eliminating the need for client-side maintained secondary indices and instead using Redis' native Search and Query engine. +- Implementing dynamic repository finders with fast and flexible querying - Using ULIDs instead of traditional UUIDs for performance, readability and interoperability ## What You Will build @@ -58,9 +53,17 @@ The dependencies included are: - _Lombok_: Java annotation library which helps to reduce boilerplate code. - _Spring Boot DevTools_: Provides fast application restarts, LiveReload, and configurations for enhanced development experience. -_NOTE:_ If your IDE has the Spring Initializr integration, you can complete this process from your IDE. +:::note + +If your IDE has the Spring Initializr integration, you can complete this process from your IDE. + +::: + +:::note + +You can also fork the project from Github and open it in your IDE or other editor. -_NOTE:_ You can also fork the project from Github and open it in your IDE or other editor. +::: ## Adding Redis OM Spring @@ -70,12 +73,18 @@ To use Redis OM Spring, open the `pom.xml` file and add the Redis OM Spring Mave ```xml - com.redis.om.spring - redis-om-spring - 0.1.0-SNAPSHOT + com.redis.om + redis-om-spring + 0.5.2-SNAPSHOT ``` +:::note + +Please check the official [Redis OM Spring GitHub repository](https://github.com/redis/redis-om-spring) for the latest version information + +::: + ### Gradle If using gradle add the dependency as follows: @@ -112,7 +121,7 @@ public class RomsHashesApplication { ## 🚀 Launch Redis -Redis OM Spring relies on the power of the [RediSearch][redisearch-url] and [RedisJSON][redis-json-url] modules. 
+Redis OM Spring relies on the power of [Redis Stack](https://redis.io/docs/stack/). The docker compose YAML file below can get started quickly. You can place at the root folder of your project and name it `docker-compose.yml`: @@ -120,21 +129,13 @@ The docker compose YAML file below can get started quickly. You can place at the version: '3.9' services: redis: - image: 'redislabs/redismod:edge' + image: 'redis/redis-stack:latest' ports: - '6379:6379' volumes: - ./data:/data - entrypoint: > - redis-server - --loadmodule /usr/lib/redis/modules/redisai.so - --loadmodule /usr/lib/redis/modules/redisearch.so - --loadmodule /usr/lib/redis/modules/redisgraph.so - --loadmodule /usr/lib/redis/modules/redistimeseries.so - --loadmodule /usr/lib/redis/modules/rejson.so - --loadmodule /usr/lib/redis/modules/redisbloom.so - --loadmodule /var/opt/redislabs/lib/modules/redisgears.so - --appendonly yes + environment: + - REDIS_ARGS: --save 20 1 deploy: replicas: 1 restart_policy: @@ -319,10 +320,10 @@ If every goes as expected, you should see the familiar Spring Boot banner fly by If you were watching the Redis CLI monitor you should have seen a barrage of output fly by. Let's break it down and inspect it using another Redis CLI so as to understand the inner workings of the system. -### RediSearch Indices +### Redis Search Indices -At the top you should have seen the `FT.CREATE` command which using the annotations in our POJO determined a RediSearch -index recipe. Since our POJO is annotated with `@Document` we get a RediSearch index `ON JSON` against any keys starting +At the top you should have seen the `FT.CREATE` command which using the annotations in our POJO determined an +index recipe. Since our POJO is annotated with `@Document` we get an index `ON JSON` against any keys starting with `com.redis.om.documents.domain.Company:` (which is the default key prefix for Spring Data Redis and also for ROMS): ```bash @@ -492,7 +493,7 @@ Let's format the resulting JSON: } ``` -Inspecting the Redis CLI MONITOR we should see the RediSearch query issued: +Inspecting the Redis CLI MONITOR we should see the query issued: ```bash 1638342334.137327 [0 172.19.0.1:63402] "FT.SEARCH" "UserIdx" "@lastName:{Morello} " @@ -538,16 +539,11 @@ Formatting the resulting JSON we can see the record for `Brad Wilk` is returned ] ``` -Back on the Redis CLI monitor we can see the RediSearch query generated by our repository method: +Back on the Redis CLI monitor we can see the query generated by our repository method: ```bash 1638343589.454213 [0 172.19.0.1:63406] "FT.SEARCH" "UserIdx" "@firstName:{Brad} @lastName:{Wilk} " ``` Redis OM Spring, extends Spring Data Redis with search capabilities that rival the flexibility of JPA -queries by using Redis' native search engine; RediSearch. - - - -[redisearch-url]: https://oss.redis.com/redisearch/ -[redis-json-url]: https://oss.redis.com/redisjson/ +queries by using Redis' native Search and Query engine. 
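To connect the monitored `FT.SEARCH` calls above back to application code, here is a minimal sketch of the kind of hash-backed model and repository that Redis OM Spring turns into those queries. The class, field, and finder names are illustrative; only the annotations and the `RedisEnhancedRepository` base interface described in this tutorial are assumed.

```java
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.annotation.Id;
import org.springframework.data.redis.core.RedisHash;

import com.redis.om.spring.annotations.EnableRedisEnhancedRepositories;
import com.redis.om.spring.annotations.Indexed;
import com.redis.om.spring.repository.RedisEnhancedRepository;

import lombok.Data;

// Enables the enhanced @RedisHash repositories this tutorial describes.
@SpringBootApplication
@EnableRedisEnhancedRepositories
class RomsHashesSketchApplication {}

// Hash-backed entity; @Indexed fields are added to the generated search index.
@Data
@RedisHash
class User {
  @Id private String id;
  @Indexed private String firstName;
  @Indexed private String lastName;
}

// Derived finders are translated into FT.SEARCH queries against that index,
// e.g. findByLastName("Morello") issues "@lastName:{Morello}".
interface UserRepository extends RedisEnhancedRepository<User, String> {
  Iterable<User> findByLastName(String lastName);
  Iterable<User> findByFirstNameAndLastName(String firstName, String lastName);
}
```

A derived finder such as `findByFirstNameAndLastName("Brad", "Wilk")` is what produces the `@firstName:{Brad} @lastName:{Wilk}` query seen in the monitor output above.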
diff --git a/docs/develop/java/spring/redis-om/index-redis-om-spring-json.mdx b/docs/develop/java/spring/redis-om/index-redis-om-spring-json.mdx index 951cf60d3d5..6456b8150d5 100644 --- a/docs/develop/java/spring/redis-om/index-redis-om-spring-json.mdx +++ b/docs/develop/java/spring/redis-om/index-redis-om-spring-json.mdx @@ -6,10 +6,9 @@ slug: /develop/java/spring/redis-om/redis-om-spring-json authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ## Introduction @@ -19,9 +18,9 @@ a several document-oriented databases like CouchDB and MongoDB gaining in popula a data format eliminates the rigidity of relational database schemas, allow applications to evolve more naturally. -But did you know that Redis is a full-fledge document database supporting JSON natively? The -RedisJSON module adds JSON as a native Redis datatype `ReJSON-RL` and it is seamlessly integrated -with Redis' Search Engine RediSearch. In this tutorial we'll build a simple Document application +But did you know that Redis is a full-fledge document database supporting JSON natively? Redis Stack +adds JSON as a native Redis datatype `ReJSON-RL` and it is seamlessly integrated +with Redis' Search and Query engine. In this tutorial we'll build a simple Document application using Redis OM Spring. ## What You Will build @@ -58,9 +57,17 @@ The dependencies included are: - _Lombok_: Java annotation library which helps to reduce boilerplate code. - _Spring Boot DevTools_: Provides fast application restarts, LiveReload, and configurations for enhanced development experience. -_NOTE:_ If your IDE has the Spring Initializr integration, you can complete this process from your IDE. +:::note + +If your IDE has the Spring Initializr integration, you can complete this process from your IDE. + +::: + +:::note -_NOTE:_ You can also fork the project from Github and open it in your IDE or other editor. +You can also fork the project from Github and open it in your IDE or other editor. + +::: ## Adding Redis OM Spring @@ -70,12 +77,18 @@ To use Redis OM Spring, open the `pom.xml` file and add the Redis OM Spring Mave ```xml - com.redis.om.spring - redis-om-spring - 0.1.0-SNAPSHOT + com.redis.om + redis-om-spring + 0.5.2-SNAPSHOT ``` +:::note + +Please check the official [Redis OM Spring GitHub repository](https://github.com/redis/redis-om-spring) for the latest version information + +::: + ### Gradle If using gradle add the dependency as follows: @@ -129,7 +142,7 @@ public class RomsDocumentsApplication { ## 🚀 Launch Redis -Redis OM Spring relies on the power of the [RediSearch](https://redisearch.io/) and [RedisJSON](https://redisjson.io) modules. +Redis OM Spring relies on the power of [Redis Stack](https://redis.io/docs/stack/about/). The docker compose YAML file below can get started quickly. You can place at the root folder of your project and name it `docker-compose.yml`: @@ -137,21 +150,13 @@ The docker compose YAML file below can get started quickly. 
You can place at the version: '3.9' services: redis: - image: 'redislabs/redismod:edge' + image: 'redis/redis-stack:latest' ports: - '6379:6379' volumes: - ./data:/data - entrypoint: > - redis-server - --loadmodule /usr/lib/redis/modules/redisai.so - --loadmodule /usr/lib/redis/modules/redisearch.so - --loadmodule /usr/lib/redis/modules/redisgraph.so - --loadmodule /usr/lib/redis/modules/redistimeseries.so - --loadmodule /usr/lib/redis/modules/rejson.so - --loadmodule /usr/lib/redis/modules/redisbloom.so - --loadmodule /var/opt/redislabs/lib/modules/redisgears.so - --appendonly yes + environment: + - REDIS_ARGS: --save 20 1 deploy: replicas: 1 restart_policy: @@ -242,7 +247,7 @@ and the `yearFounded` and a `boolean` as to whether the company is `publiclyList ## Redis OM Spring Document Repositories Working with Redis OM Spring Document Repositories lets you seamlessly convert and store domain objects in Redis JSON keys, -apply custom mapping strategies, and use secondary indexes maintained by RediSearch. +apply custom mapping strategies, and use secondary indexes maintained by Redis. To create the component responsible for storage and retrieval, we need to define a repository interface. The `RedisDocumentRepository` extends the familiar `PagingAndSortingRepository` from the core `org.springframework.data.repository` package. @@ -358,10 +363,10 @@ If every goes as expected, you should see the familiar Spring Boot banner fly by If you were watching the Redis CLI monitor you should have seen a barrage of output fly by. Let's break it down and inspect it using another Redis CLI so as to understand the inner workings of the system. -### RediSearch Indices +### Redis Stack Search Indices -At the top you should have seen the `FT.CREATE` command which using the annotations in our POJO determined a RediSearch -index recipe. Since our POJO is annotated with `@Document` we get a RediSearch index `ON JSON` against any keys starting +At the top you should have seen the `FT.CREATE` command which using the annotations in our POJO determined an +index recipe. Since our POJO is annotated with `@Document` we get an index `ON JSON` against any keys starting with `com.redis.om.documents.domain.Company:` (which is the default key prefix for Spring Data Redis and also for ROMS): ```bash @@ -392,7 +397,7 @@ Finally, for each of the `Company` POJOs we should see a sequence of REDIS comma ``` The first line checks whether the object already exists in the Redis SET of primary keys using the `SISMEMBER` command. Then, -the RedisJSON `JSON.SET` commands is used to save the JSON serialization of the entity. Once that operation succeeds, the +the `JSON.SET` commands is used to save the JSON serialization of the entity. Once that operation succeeds, the `id` property of the object is addded to the primary keys set using the `SADD` command. Let's inspect the data using the Redis CLI. We'll start by listing the keys prefixed with `com.redis.om.documents.domain.Company`: @@ -528,14 +533,14 @@ Let's format the resulting JSON: } ``` -Inspecting the Redis CLI Monitor shows the resulting RediSearch query: +Inspecting the Redis CLI Monitor shows the resulting query: ```bash 1638344903.218982 [0 172.19.0.1:63410] "FT.SEARCH" "CompanyIdx" "@name:Redis " ``` Notice that you can use `redis` (all lowercase) or `rEdI` and you will get a match for `Redis`, if you go below 4 -characters, say you try `red` or `RED` you will get no hits. 
RediSearch limits the minimun string match size to 4 +characters, say you try `red` or `RED` you will get no hits. Redis limits the minimun string match size to 4 characters to prevent potentially millions of results being returned. ### Storing and Querying Geospatial Data @@ -591,19 +596,19 @@ Formatting the JSON result we get a JSON array containing one entry: `Redis`. ] ``` -Inspecting the Redis CLI Monitor shows the resulting RediSearch query: +Inspecting the Redis CLI Monitor shows the resulting query: ```bash 1638344951.451871 [0 172.19.0.1:63410] "FT.SEARCH" "CompanyIdx" "@location:[-122.064 37.384 30.0 mi] " ``` -### Native RediSearch Queries +### Native Redis Stack Searches and Queries -There might be occassions where you just need to reach for the raw querying power of RediSearch +There might be occassions where you just need to reach for the raw querying power of Redis Stack (just like when you need raw SQL over JPA). For these scenario, we provide the `@Query` (`com.redis.om.spring.annotations.Query`) and the `@Aggregation` (`com.redis.om.spring.annotations.Aggregation`) annotations. These annotations expose the raw querying API provided by the `JRediSearch` library. ROMS adds parameter -parsing and results mapping so you can use raw RediSearch queries and aggregations in your repositories. +parsing and results mapping so you can use raw queries and aggregations in your repositories. ```java // find by tag field, using JRediSearch "native" annotation @@ -651,7 +656,7 @@ Formatting the JSON we can see that the results include companies with the tag ` ] ``` -Inspecting the Redis CLI Monitor we see the RediSearch query that produced the results: +Inspecting the Redis CLI Monitor we see the query that produced the results: ```bash 1638345120.384300 [0 172.19.0.1:63412] "FT.SEARCH" "CompanyIdx" "@tags:{reliable} " diff --git a/docs/develop/java/spring/redis-om/index-redis-om-spring.mdx b/docs/develop/java/spring/redis-om/index-redis-om-spring.mdx index c55b5a21597..b1f8866e538 100644 --- a/docs/develop/java/spring/redis-om/index-redis-om-spring.mdx +++ b/docs/develop/java/spring/redis-om/index-redis-om-spring.mdx @@ -6,10 +6,10 @@ slug: /develop/java/spring/redis-om/redis-om-spring authors: [bsb] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; +import Authors from '@theme/Authors'; + + ## Introduction @@ -23,7 +23,7 @@ The current **preview** release provides all of SDRs capabilities plus: - A `@Document` annotation to map Spring Data models to Redis JSON documents - Enhancements to SDR's `@RedisHash` via `@EnableRedisEnhancedRepositories` to: - - use Redis' native search engine (RediSearch) for secondary indexing + - use Redis' native search engine (Redis Search) for secondary indexing - use [ULID](https://github.com/ulid/spec) indentifiers for `@Id` annotated fields - `RedisDocumentRepository` with automatic implementation of Repository interfaces for complex querying capabilities using `@EnableRedisDocumentRepositories` - Declarative Search Indices via `@Indexable` diff --git a/docs/develop/node/gettingstarted/index-gettingstarted.mdx b/docs/develop/node/gettingstarted/index-gettingstarted.mdx index 7883ff980dc..2649da63241 100644 --- a/docs/develop/node/gettingstarted/index-gettingstarted.mdx +++ b/docs/develop/node/gettingstarted/index-gettingstarted.mdx @@ -8,8 +8,9 @@ authors: [ajeet, simon] import Tabs 
from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + Find tutorials, examples and technical articles that will help you to develop with Redis and Node.js/JavaScript: @@ -44,8 +45,7 @@ Use the following commands to setup a Redis server locally: ``` :::info INFO -Redis Stack unifies and simplifies the developer experience of the leading Redis data store, modules and the capabilities they provide. Redis Stack bundles five Redis modules: RedisJSON, RedisSearch, RedisGraph, RedisTimeSeries, and RedisBloom. -[Learn more](/create/redis-stack) +Redis Stack unifies and simplifies the developer experience of the leading Redis data store, modules and the capabilities they provide. Redis Stack supports the following in addition to Redis: JSON, Search, Time Series, Triggers and Functions, and Probabilistic data structures. ::: Ensure that you are able to use the following Redis command to connect to the Redis instance. @@ -171,7 +171,7 @@ Check out a few of these apps below, or [click here to access the complete list] ![marketplace](images/hackernews.png) -[A Hacker News Clone project](https://launchpad.redis.com/?id=project%3Aredis-hacker-news-demo) built in NextJS, NodeJS and Express based on RediSearch & RedisJSON +[A Hacker News Clone project](https://launchpad.redis.com/?id=project%3Aredis-hacker-news-demo) built in NextJS, NodeJS and Express based on Search and JSON
diff --git a/docs/develop/node/index-node.mdx b/docs/develop/node/index-node.mdx index 452e92a7a69..f8e97b4fed6 100644 --- a/docs/develop/node/index-node.mdx +++ b/docs/develop/node/index-node.mdx @@ -6,7 +6,10 @@ slug: /develop/node authors: [ajeet] --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; +import Authors from '@theme/Authors'; + + The following links provides you with the available options to develop your application using NodeJS and Redis diff --git a/docs/develop/node/index-node.mdx.bkp b/docs/develop/node/index-node.mdx.bkp deleted file mode 100644 index e03d39ef74a..00000000000 --- a/docs/develop/node/index-node.mdx.bkp +++ /dev/null @@ -1,179 +0,0 @@ ---- -id: index-node -title: Redis and Node.js -sidebar_label: Overview -slug: /develop/node ---- - -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; - -This is a complete online course for Node.js developers wanting to learn about Redis, -the ioredis client and the Redis modules ecosystem. - -In this course, you'll learn about using Redis with Node.js through a blend of video and -text-based training. You can also get hands-on with some optional workshop exercises -where you'll add new functionality to an existing Node.js application. - -
-
- -
- - -
- -
- -
- -
-
- -
- -
- -
- -
- -
- -
- -
-
- -
-
- -
- - -
- -
- -
- -
-
- -
-
- -
- -
- -
- -
- -
-
- -
-
- -
- -
- -
- -
- -
-
- -
-
- -
- -
- -
- -
- -
-
- - diff --git a/docs/develop/node/node-crash-course/advancedstreams/index-advancedstreams.mdx b/docs/develop/node/node-crash-course/advancedstreams/index-advancedstreams.mdx index e7acf9bcb26..d4595480622 100644 --- a/docs/develop/node/node-crash-course/advancedstreams/index-advancedstreams.mdx +++ b/docs/develop/node/node-crash-course/advancedstreams/index-advancedstreams.mdx @@ -6,7 +6,9 @@ slug: /develop/node/nodecrashcourse/advancedstreams authors: [simon] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + +
-Finding Bigfoot RESTfuly with Express + RediSearch: +Finding Bigfoot RESTfully with Express + Redis Stack:
-Learn more about RedisJSON at its [official homepage](https://redisjson.io/). +Learn more about JSON at https://redis.io/docs/stack/json/. diff --git a/docs/develop/node/node-crash-course/runningtheapplication/index-runningtheapplication.mdx b/docs/develop/node/node-crash-course/runningtheapplication/index-runningtheapplication.mdx index 68c7816100c..93306883e49 100644 --- a/docs/develop/node/node-crash-course/runningtheapplication/index-runningtheapplication.mdx +++ b/docs/develop/node/node-crash-course/runningtheapplication/index-runningtheapplication.mdx @@ -6,7 +6,9 @@ slug: /develop/node/nodecrashcourse/runningtheapplication authors: [simon] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + + Let's get hands on, clone the application repository from GitHub, start up Redis in a Docker container, and load the sample data! @@ -55,7 +57,7 @@ Creating rediscrashcourse ... done $ docker ps ``` -The output from the docker ps command should show one container running, using the "redislabs/redismod" image. This container runs Redis 6 with the RediSearch, RedisJSON and RedisBloom modules. +The output from the docker ps command should show one container running, using the "redis/redis-stack" image. This container runs Redis with the Search, JSON, Time Series, and Probabilistic data structures. ### Load the Sample Data into Redis diff --git a/docs/develop/node/node-crash-course/sampleapplicationoverview/index-sampleapplicationoverview.mdx b/docs/develop/node/node-crash-course/sampleapplicationoverview/index-sampleapplicationoverview.mdx index ad0543b1da4..ccf2850b298 100644 --- a/docs/develop/node/node-crash-course/sampleapplicationoverview/index-sampleapplicationoverview.mdx +++ b/docs/develop/node/node-crash-course/sampleapplicationoverview/index-sampleapplicationoverview.mdx @@ -6,7 +6,9 @@ slug: /develop/node/nodecrashcourse/sampleapplicationoverview authors: [simon] --- -import useBaseUrl from '@docusaurus/useBaseUrl'; +import Authors from '@theme/Authors'; + +
+
+ +### References + +- [Python based application on Heroku using Redis](/howtos/herokupython/) +- [How to build a Rate Limiter using Redis](/howtos/ratelimiting/) + +## + + diff --git a/docs/develop/python/fastapi/index-fastapi.mdx b/docs/develop/python/fastapi/index-fastapi.mdx index b124e46037d..ca200e1be8a 100644 --- a/docs/develop/python/fastapi/index-fastapi.mdx +++ b/docs/develop/python/fastapi/index-fastapi.mdx @@ -6,10 +6,9 @@ slug: /develop/python/fastapi authors: [andrew] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + This tutorial helps you get started with Redis and FastAPI. @@ -26,8 +25,7 @@ Redis. Unlike most databases, Redis excels at low-latency access because it's an In this tutorial, we'll walk through the steps necessary to use Redis with FastAPI. We're going to build _IsBitcoinLit_, an API that stores Bitcoin -sentiment and price averages in [RedisTimeSeries](https://oss.redis.com/redistimeseries/), then rolls these averages up for -the last three hours. +sentiment and price averages in [Redis Stack](https://redis.io/docs/stack/timeseries/) using a timeseries data structure, then rolls these averages up for the last three hours. Next, let's look at the learning objectives of this tutorial. @@ -37,7 +35,7 @@ The learning objectives of this tutorial are: 1. Learn how to install aioredis-py and connect to Redis 2. Learn how to integrate aioredis-py with FastAPI -3. Learn how to use RedisTimeSeries to store and query timeseries data +3. Learn how to use Redis to store and query timeseries data 4. Learn how to use Redis as a cache with aioredis-py Let's get started! @@ -73,13 +71,11 @@ GitHub](https://github.com/redis-developer/fastapi-redis-tutorial). [Follow the README](https://github.com/redis-developer/fastapi-redis-tutorial/blob/master/README.md) to the project running. -## About RedisTimeSeries - -RedisTimeSeries is a source available Redis Module that adds a timeseries data type to Redis. Timeseries is a great way to model any data that you want to query over time, like in this case, the ever-changing price of Bitcoin. +## About Redis for time series data -You can get started by following the [setup instructions](https://oss.redis.com/redistimeseries/#setup) in the RedisTimeSeries documentation. +Redis Stack adds a time series data type to Redis. Time Series is a great way to model any data that you want to query over time, like in this case, the ever-changing price of Bitcoin. -However, note that this tutorial's example project configures RedisTimeSeries automatically for you with the redismod Docker image. You can even use Docker Compose to build up your Redis server. +You can get started by following the [setup instructions](https://redis.io/docs/stack/timeseries/) in the Redis Stack documentation. ## An Asyncio Primer @@ -113,9 +109,13 @@ IsBitcoinLit includes a `pyproject.toml` file that Poetry uses to manage the pro Once you have a `pyproject.toml` file, and assuming you already added FastAPI and any other necessary dependencies, you could add aioredis-py to your project like this: - $ poetry add aioredis-py@2.0.0b1 + $ poetry add aioredis@2.0.0 + +:::note + +This tutorial uses aioredis-py 2.0. The 2.0 version of aioredis-py features an API that matches the most popular synchronous Redis client for Python, [redis-py](https://github.com/andymccurdy/redis-py). 
-**NOTE**: This tutorial uses a beta version of aioredis-py 2.0. The 2.0 version of aioredis-py features an API that matches the most popular synchronous Redis client for Python, [redis-py](https://github.com/andymccurdy/redis-py). +::: The aioredis-py client is now installed. Time to write some code! @@ -123,19 +123,23 @@ The aioredis-py client is now installed. Time to write some code! We're going to use Redis for a few things in this FastAPI app: -1. Storing 30-second averages of sentiment and price for the last 24 hours with RedisTimeSeries -2. Rolling up these averages into a three-hour snapshot with RedisTimeSeries +1. Storing 30-second averages of sentiment and price for the last 24 hours with Redis Time Series +2. Rolling up these averages into a three-hour snapshot with Redis Time Series 3. Caching the three-hour snapshot Let's look at each of these integration points in more detail. -### Creating the Timeseries +### Creating the time series The data for our app consists of 30-second averages of Bitcoin prices and sentiment ratings for the last 24 hours. We pull these from the [SentiCrypt API](https://senticrypt.com/docs.html). -**NOTE**: We have no affiliation with SentiCrypt or any idea how accurate these numbers are. This example is **just for fun**! +:::note -We're going to store price and sentiment averages in a timeseries with RedisTimeSeries, so we want to make sure that when the app starts up, the timeseries exists. +We have no affiliation with SentiCrypt or any idea how accurate these numbers are. This example is **just for fun**! + +::: + +We're going to store price and sentiment averages in a time series with Redis Stack, so we want to make sure that when the app starts up, the time series exists. We can use a [startup event](https://fastapi.tiangolo.com/advanced/events/) to accomplish this. Doing so looks like the following: @@ -146,7 +150,7 @@ async def startup_event(): await initialize_redis(keys) ``` -We'll use the `TS.CREATE` RedisTimeSeries command to create the timeseries within our `initialize_redis()` function: +We'll use the `TS.CREATE` command to create the time series within our `initialize_redis()` function: ```python async def make_timeseries(key): @@ -167,13 +171,17 @@ async def make_timeseries(key): ) except ResponseError as e: # Time series probably already exists - log.info('Could not create timeseries %s, error: %s', key, e) + log.info('Could not create time series %s, error: %s', key, e) ``` -**TIP**: An interesting point to note from this code is that when we create a timeseries, we can use the `DUPLICATE_POLICY` option to specify how to handle duplicate pairs of timestamp and values. +:::tip + +When you create a time series, use the `DUPLICATE_POLICY` option to specify how to handle duplicate pairs of timestamp and values. + +::: -### Storing Sentiment and Price Data in RedisTimeSeries +### Storing Sentiment and Price Data in Redis A `/refresh` endpoint exists in the app to allow a client to trigger a refresh of the 30-second averages. This is the entire function: @@ -208,7 +216,7 @@ The first thing we do is get the latest sentiment and price data from SentiCrypt ] ``` -Then we save the data into two timeseries in Redis with the `persist()` function. That ends up calling another helper, `add_many_to_timeseries()`, like this: +Then we save the data into two time series in Redis with the `persist()` function. 
That ends up calling another helper, `add_many_to_timeseries()`, like this: ```python await add_many_to_timeseries( @@ -219,7 +227,7 @@ Then we save the data into two timeseries in Redis with the `persist()` function ) ``` -The `add_many_to_timeseries()` function takes a list of (timeseries key, sample key) pairs and a list of samples from SentiCrypt. For each sample, it reads the value of the sample key in the SentiCrypt sample, like "btc_price," and adds that value to the given timeseries key. +The `add_many_to_timeseries()` function takes a list of (time series key, sample key) pairs and a list of samples from SentiCrypt. For each sample, it reads the value of the sample key in the SentiCrypt sample, like "btc_price," and adds that value to the given time series key. Here's the function: @@ -249,11 +257,11 @@ async def add_many_to_timeseries( This code is dense, so let's break it down. -We're using the `TS.MADD` RedisTimeSeries command to add many samples to a timeseries. We use `TS.MADD` because doing so is faster than `TS.ADD` for adding batches of samples to a timeseries. +We're using the `TS.MADD` command to add many samples to a time series. We use `TS.MADD` because doing so is faster than `TS.ADD` for adding batches of samples to a time series. -This results in a single large `TS.MADD` call that adds price data to the price timeseries and sentiment data to the sentiment timeseries. Conveniently, `TS.MADD` can add samples to multiple timeseries in a single call. +This results in a single large `TS.MADD` call that adds price data to the price time series and sentiment data to the sentiment time series. Conveniently, `TS.MADD` can add samples to multiple time series in a single call. -## Calculating Three-Hour Averages with RedisTimeSeries +## Calculating Three-Hour Averages with Redis Clients use IsBitcoinLit to get the average price and sentiment for each of the last three hours. But so far, we've only stored 30-second averages in Redis. How do we calculate the average of these averages for the last three hours? @@ -304,7 +312,7 @@ So where does this leave us? With **averages of the averages**, one for each of Let's review. We have code that achieves the following: 1. Gets the latest sentiment and price data from SentiCrypt. -2. Saves the data into two timeseries in Redis. +2. Saves the data into two time series in Redis. 3. Calculates the average of the averages for the last three hours. The snapshot of averages for the last three hours is the data we want to serve clients when they hit the `/is-bitcoin-lit` endpoint. We could run this calculation every time a client requests data, but that would be inefficient. Let's cache it in Redis! @@ -394,4 +402,4 @@ Putting all the pieces together, we now have a FastAPI app that can retrieve Bit Here are a few **notes to consider**: 1. We manually controlled caching in this tutorial, but you can also use a library like [aiocache](https://github.com/aio-libs/aiocache) to cache data in Redis. -2. We ran RedisTimeSeries commands like `TS.MADD` using the `execute_command()` method in aioredis-py. If you are instead using redis-py in a synchronous project, you can use the [redistimeseries-py](https://github.com/RedisTimeSeries/redistimeseries-py) library to run RedisTimeSeries commands. +2. We ran Redis commands like `TS.MADD` using the `execute_command()` method in aioredis-py. If you are instead using [redis-py](https://pypi.org/project/redis/) in a synchronous project, you can use the same commands.
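For readers who want to try these two integration points outside of the IsBitcoinLit project, the sketch below shows the general shape of both techniques with aioredis-py 2.0: creating a time series, appending samples with `TS.MADD` through `execute_command()`, and caching a computed snapshot with an expiration. The connection URL, key names, 60-second TTL, and the placeholder summary are illustrative assumptions rather than values from the tutorial's code.

```python
import asyncio
import json

import aioredis
from aioredis.exceptions import ResponseError

# Illustrative values; the tutorial's project uses its own settings and key names.
REDIS_URL = 'redis://localhost:6379'
TIMESERIES_KEY = 'example:btc-price-30s'
CACHE_KEY = 'example:summary-cache'

redis = aioredis.from_url(REDIS_URL, decode_responses=True)


async def ensure_timeseries():
    """Create the time series up front, as the tutorial does at startup."""
    try:
        await redis.execute_command(
            'TS.CREATE', TIMESERIES_KEY, 'DUPLICATE_POLICY', 'first',
        )
    except ResponseError:
        pass  # The time series probably already exists.


async def add_samples(samples):
    """Append (timestamp, value) pairs to the series with a single TS.MADD call."""
    args = []
    for timestamp, value in samples:
        # TS.MADD takes repeating (key, timestamp, value) triples.
        args += [TIMESERIES_KEY, timestamp, value]
    return await redis.execute_command('TS.MADD', *args)


async def get_summary():
    """Cache-aside: return the cached snapshot if present, else rebuild and cache it."""
    cached = await redis.get(CACHE_KEY)
    if cached:
        return json.loads(cached)
    summary = {'hours': []}  # Stand-in for the real three-hour rollup.
    await redis.set(CACHE_KEY, json.dumps(summary), ex=60)  # Expire after 60 seconds.
    return summary


async def main():
    await ensure_timeseries()
    await add_samples([(1640995200000, 46200.0), (1640995230000, 46210.5)])
    print(await get_summary())


if __name__ == '__main__':
    asyncio.run(main())
```

Because `execute_command()` passes its arguments straight through to Redis, the same pattern works for any time series command the client does not wrap with a dedicated helper.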
diff --git a/docs/develop/python/index-python.mdx b/docs/develop/python/index-python.mdx deleted file mode 100644 index c141f27be75..00000000000 --- a/docs/develop/python/index-python.mdx +++ /dev/null @@ -1,181 +0,0 @@ ---- -id: index-python -title: Python and Redis -sidebar_label: Overview -slug: /develop/python/ -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -Find tutorials, examples and technical articles that will help you to develop with Redis and Python. - -### Getting Started - -The Python community has built many client libraries that you can find here . For your first steps with Python and Redis, this article will show how to use the recommended library: [redis-py](https://github.com/redis/redis-py) - - - - -#### Step 1. Run a Redis server - -Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Unlike relational databases, Redis database delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. - -Redis (also called remote dictionary server) is a multi-model database, and provides several built-in data structures/data type such as Lists, Hashes, Geospatial indexes, Strings, Sets etc. -You can either run Redis server in a Docker container or directly on your machine. Follow the below command line to setup a Redis server on Mac OS: - -``` - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -:::info INFO -Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack bundles five Redis modules: RedisJSON, RedisSearch, RedisGraph, RedisTimeSeries, and RedisBloom -[Learn more](/create/redis-stack) -::: - -Ensure that you are able to use the following Redis command to connect to the Redis instance. - -```bash - redis-cli - 127.0.0.1:6379> -``` - -Now, you should be able to get Redis data by using Redis commands. - -### Step 2. Install the Redis client library using `pip` - -The following Python code allows you to connect to the default Redis server instance . - -```bash - pip3 install redis -``` - -### Step 2. Write your application code - -```python - import redis - - redis = redis.Redis( - host= 'localhost', - port= '6379') - - redis.set('mykey', 'Hello from Python!') - value = redis.get('mykey') - print(value) - - redis.zadd('vehicles', {'car' : 0}) - redis.zadd('vehicles', {'bike' : 0}) - vehicles = redis.zrange('vehicles', 0, -1) - print(vehicles) -``` - -Find more information about Redis database instances & Redis connections in the "[Redis Connect](https://github.com/redis-developer/redis-connect/tree/master/python/redispy)". - - - - -### More developer resources - -
- -
- -#### Sample Code - -**[Flask Simple Rate limiting Example ](https://github.com/redis-developer/basic-caching-demo-nodejs)** -Application that shows how to do rate limiting using various Redis datastructure. - -
- -
- -#### Technical Articles & Videos - -**[Beyond the Cache with Python](https://redis.com/blog/beyond-the-cache-with-python/)** - -
-
- ---- - -### Redis Launchpad - -Redis Launchpad is like an “App Store” for Redis sample apps. You can easily find apps for your preferred frameworks and languages. -Check out a few of these apps below, or [click here to access the complete list](https://launchpad.redis.com). - -
- -
-
- -#### Rate-Limiting app in Python & Django - -![launchpad](images/ratelimitingdjango.png) - -[Rate Limiting app](https://launchpad.redis.com/?id=project%3Abasic-rate-limiting-demo-python) built in Python & Django - -
-
- -
-
- -#### Leaderboard app in Python & Django - -![launchpad](images/leaderboarddjango.png) - -[How to implement leaderboard app](https://launchpad.redis.com/?id=project%3Abasic-redis-leaderboard-demo-python) in Python & Django - -
-
-
- -## Redis University - -### [Redis for Python Developers](https://university.redis.com/courses/ru102py/) - -A complete introduction to Redis for Python developers. - -
- -
- -### References - -- [How to store JSON documents in Redis with Python](/howtos/redisjson/using-python) -- [Python based application on Heroku using Redis](/howtos/herokupython/) -- [How to build a Rate Limiter using Redis](/howtos/ratelimiting/) -- [Writing Your Serverless function using RedisGears Browser Tool](/explore/redisinsight/redisgears/) - -## - - diff --git a/docs/develop/python/redis-om/index-redis-om.mdx b/docs/develop/python/redis-om/index-redis-om.mdx index a6e97503477..cbff4502b56 100644 --- a/docs/develop/python/redis-om/index-redis-om.mdx +++ b/docs/develop/python/redis-om/index-redis-om.mdx @@ -6,10 +6,9 @@ slug: /develop/python/redis-om authors: [andrew] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + [Redis OM for Python](https://github.com/redis/redis-om-python) makes it easy to model and query data in Redis using declarative models that will feel right at home to users of Peewee, SQLAlchemy, @@ -47,9 +46,13 @@ complete this tutorial. The latest version of Redis is available from [Redis.io](https://redis.io/). You can also install Redis with your operating system's package manager. -**NOTE:** This tutorial will guide you through starting Redis locally, but the +:::note + +This tutorial will guide you through starting Redis locally, but the instructions will also work if Redis is running on a remote server. +::: + ### Installing Redis On Windows Redis doesn't run directly on Windows, but you can use Windows Subsystem for @@ -72,14 +75,16 @@ features. You can also use the official Redis Docker image, which is hosted on [Docker Hub](https://hub.docker.com/_/redis). -**NOTE**: We'll talk about how to actually start Redis with Docker when we -discuss _running_ Redis later in this guide. +:::note + +We'll talk about how to actually start Redis with Docker when we discuss _running_ Redis later in this guide. -## Recommended: RediSearch and RedisJSON +::: -Redis OM relies on the [RediSearch][redisearch-url] and -[RedisJSON][redis-json-url] Redis modules to support rich queries and embedded -models. + +## Recommended: Redis Search and JSON + +Redis OM relies on the Search and JSON support of Redis Stack. You don't need these Redis modules to use Redis OM's data modeling, validation, and persistence features, but we recommend them to get the most out of Redis OM. @@ -646,10 +651,14 @@ You can view the data stored in Redis for any Redis OM model. First, get the key of a model instance you want to inspect. The `key()` method will give you the exact Redis key used to store the model. -**NOTE:** The naming of this method may be confusing. This is not the primary +:::note + +The naming of this method may be confusing. This is not the primary key, but is instead the Redis key for this model. For this reason, the method name may change. +::: + In this example, we're looking at the key created for the `Customer` model we've been building: @@ -770,13 +779,12 @@ class Customer(HashModel): # Now, if we use this model with a Redis deployment that has the -# RediSearch module installed, we can run queries like the following. +# Redis Stack installed, we can run queries like the following. # Before running queries, we need to run migrations to set up the # indexes that Redis OM will use. You can also use the `migrate` # CLI tool for this! 
-redis = get_redis_connection() -Migrator(redis).run() +Migrator().run() # Find all customers with the last name "Brookins" diff --git a/docs/develop/ruby/index-ruby.mdx b/docs/develop/ruby/index-ruby.mdx index 805614ea24d..af71bb6d33a 100644 --- a/docs/develop/ruby/index-ruby.mdx +++ b/docs/develop/ruby/index-ruby.mdx @@ -7,8 +7,6 @@ slug: /develop/ruby/ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; Find tutorials, examples and technical articles that will help you to develop with Redis and Ruby. @@ -37,8 +35,7 @@ Follow the below commands to setup a Redis server on Mac OS: ``` :::info INFO -Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack bundles five Redis modules: RedisJSON, RedisSearch, RedisGraph, RedisTimeSeries, and RedisBloom. -[Learn more](/create/redis-stack) +Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. ::: Ensure that you are able to use the following Redis command to connect to the Redis instance. diff --git a/docs/develop/rust/index-rust.mdx b/docs/develop/rust/index-rust.mdx index dd38ec7b529..19e2074ebfa 100644 --- a/docs/develop/rust/index-rust.mdx +++ b/docs/develop/rust/index-rust.mdx @@ -7,15 +7,13 @@ slug: /develop/rust/ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; Find tutorials, examples and technical articles that will help you to develop with Redis and Rust. ### Getting Started Rust community has built many client libraries that you can find here. For your first steps with Java and Rust, this article will show how to use a popula library: redis-rs -The web page “Redis Enterprise and Rust” will help you to get started with Redis Enterprise and Rust in a much easier manner. +The web page “Redis Cloud and Rust” will help you to get started with Redis Cloud and Rust in a much easier manner. redis-rs is a rust implementation of a Redis client library. It exposes a general purpose interface to Redis and also provides specific helpers for commonly used functionality. @@ -66,7 +64,7 @@ It exposes a general purpose interface to Redis and also provides specific helpe ### Further References -- [Redis Enterprise and Rust](https://redis.com/lp/redis-enterprise-rust/) +- [Redis Cloud and Rust](https://redis.com/lp/redis-enterprise-rust/) - [Getting Started with Redis & Rust](https://github.com/redis-developer/redis-rust-getting-started)
diff --git a/docs/develop/rust/index-rust.mdx.orig b/docs/develop/rust/index-rust.mdx.orig deleted file mode 100644 index 6cd73f834bd..00000000000 --- a/docs/develop/rust/index-rust.mdx.orig +++ /dev/null @@ -1,78 +0,0 @@ ---- -id: index-rust -title: Rust and Redis -sidebar_label: Rust -slug: /develop/rust/ ---- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -Find tutorials, examples and technical articles that will help you to develop with Redis and Rust. - - - - -# Getting Started with redis-rs - - -Rust community has built many client libraries that you can find here. For your first steps with Java and Rust, this article will show how to use a popula library: redis-rs -The web page “Redis Enterprise and Rust” will help you to get started with Redis Enterprise and Rust in a much easier manner. -redis-rs is a rust implementation of a Redis client library. -It exposes a general purpose interface to Redis and also provides specific helpers for commonly used functionality. - -## Install rust - -Rust is installed and managed by the rustup tool. - - ```bash - curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh - ``` - -## Configure your current shell: - - ```bash - source $HOME/.cargo/env - ``` - -## Verify Rust compiler: - - ```bash - rustc --version - rustc 1.49.0 - ``` - -## Creating Cargo.toml with Redis dependency: - - ```bash - [dependencies] - redis = "0.8.0" - ``` - - -## Clone the repository - - ```bash - git clone https://github.com/redis-developer/redis-rust-getting-started - ``` - - -## Run the application - - ```bash - cargo run - ``` - -# Further Reference: - -- [Redis Enterprise and Rust](https://redis.com/lp/redis-enterprise-rust/) -- https://github.com/redis-developer/redis-rust-getting-started - - - - diff --git a/docs/ebooks/nosql-data-modeling-patterns.mdx b/docs/ebooks/nosql-data-modeling-patterns.mdx new file mode 100644 index 00000000000..d647d744573 --- /dev/null +++ b/docs/ebooks/nosql-data-modeling-patterns.mdx @@ -0,0 +1,189 @@ +--- +id: nosql-data-modeling-patterns +title: Learn 8 NoSQL Data Modeling Patterns in Redis +image: /img/ebooks/nosql-data-modeling-patterns/8-data-modeling-patterns-in-redis.jpg +sidebar_label: NoSQL Data Modeling Patterns +slug: /ebooks/8-nosql-data-modeling-patterns +editUrl: false +showLastUpdateTime: false +--- + +import Excerpt from '@theme/Excerpt'; + + + +## Introduction + +When someone is looking to use NoSQL for an application, the question that most often comes up is, “How do I structure +my data?” The short answer to this question is, as you might guess, it depends. There are several questions that can +inform how to structure your data in a NoSQL database. +Is your application read heavy, or write heavy? What does the user experience of your application look like? How does +your data need to be presented to the user? How much data will you be storing? What performance considerations do +you need to account for? How do you anticipate scaling your application? + +These questions are only a small subset of what you need to ask yourself when you start working with NoSQL. A common +misconception with NoSQL databases is that since they are “schemaless” you don’t need to worry about your schema. +In reality, your schema is incredibly important regardless of what database you choose. You also need to ensure that the +schema you choose will scale well with the database you plan to use. 
+ +In this e-book you will learn how to approach data modeling in NoSQL, specifically within the context of Redis. Redis is a +great database for demonstrating several NoSQL patterns and practices. Not only is Redis commonly used and loved by +developers, it also is a multi-model database. This means that while many of the patterns covered in this e-book apply to +different types of databases (e.g. document, graph, time series, etc.), with Redis you can apply all of the patterns in +a single database. + +:::note + +By the end of this, you should have + +- A firm understanding of how to approach modeling data in Redis as well as in NoSQL generally. +- An understanding of several NoSQL data modeling patterns, their pros and cons, as well as use cases for + them in practice. +- A working knowledge of how to actually write code (with examples) to take advantage of NoSQL patterns + within Redis. + +::: + +## SQL versus NoSQL + +I’m sure at a certain level you understand the difference between SQL and NoSQL. SQL is a structured query language +whereas NoSQL can mean several different things depending on the context. However, generally speaking, the approach +to modeling data is fundamentally different in NoSQL than in SQL. There are also differences in terms of scalability, with +NoSQL being easier to scale horizontally. + +When building applications you are probably using an object-oriented language like JavaScript, Java, C#, or others. +Your data is represented as strings, lists, sets, hashes, JSON, and so on. However, if you store data in a SQL database +or a document database, you need to squeeze and transform the data into several tables or collections. You also need +complex queries (such as SQL queries) to get the data out. This is called **impedance mismatch** and is the fundamental +reason why NoSQL exists. + +A large application might use other systems for data storage such as Neo4J for graph data, MongoDB for document +data, InfluxDB for time series, etc. Using separate databases turns an impedance mismatch problem into a database +orchestration problem. You have to juggle multiple connections to different databases, as well as learn the different client +libraries used. + +With Redis, in addition to the basic data structures such as strings, lists, sets, and hashes, you can also store advanced +data structures such as JSON for documents, Search for secondary indexing, +Time Series for time-series data, and Probabilistic data (think leaderboards). + +This reduces impedance mismatch because your data is stored in one of 15 structures with little or no transformations. +You can also use a single connection (or connection pool) and client library to access your data. What you end up with is +a simplified architecture with purpose-built models that are blazing fast and simple to manage. For this reason, this e-book +will use Redis to explain several of the NoSQL data modeling patterns. + +Most developers have at least a little understanding of SQL and how to model data in it. This is because SQL is widely +used and there are several incredible books and even full courses devoted to it. NoSQL is quickly growing and becoming +more popular. But given that when you’re talking about NoSQL you’re talking about more than just a document store, there +is a lot of ground to cover. That’s why when covering certain NoSQL data modeling patterns in this e-book, you will be +presented with what it might look like to model the data in SQL as well. 
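To make the impedance-mismatch point above concrete, here is a small illustration that is not part of the e-book: it assumes a local Redis Stack instance and the redis-py client, and the key name and product data are made up. The object the application already works with goes into Redis, and comes back out, in the same shape.

```python
import redis

# Connect to a local Redis Stack instance (illustrative connection settings).
r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# The nested object the application already uses, with no mapping layer.
product = {
    'name': '4K Monitor',
    'price': 350,
    'details': {'manufacturer': 'Acme', 'weight': '5 kg'},
    'tags': ['electronics', 'monitors'],
}

# One command stores the whole nested structure as a JSON document...
r.json().set('product:1', '$', product)

# ...and one command reads it back in the shape the application expects.
print(r.json().get('product:1'))
```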
+ +When you approach data modeling in SQL you are typically focused on relationships, as SQL is meant for set-based +operations on relational data. NoSQL doesn’t have this constraint and is more flexible in the way you model data. +However, this can lead to schemas that are overly complex. When considering NoSQL schema design, always think about +performance and try to keep things simple. + +So to kick things off, let’s start by looking at something that is very near and dear to a SQL developer’s heart: **relationships**. + +## Modeling 1-to-1 Relationships + +Imagine that you are creating a retail app that sells electronics. Let’s use **Picture 1** and **Picture 2** as an example of the +UI for a standard retail e-commerce app. First, you’ll create a list view of all the electronics and then a detailed view +that shows all the details of each item. There is a 1-to-1 relationship between each item in the list view and the detailed +view (shown in **Picture 2**) of the item. The detailed view shows all the details such as multiple photos, description, +manufacturer, dimensions, weight, and so on. + +
+
+
+ Picture 1 + Picture 1 1-to-1 List View +
+
+ Picture 2 + Picture 2 1-to-1 Detailed View +
+
+
+ +### 1-to-1 Relationships using SQL + +In a relational database, you may create a table called `products` where each row holds just enough data to display the information in the list view. Then, you may create another table called `product_details` where each row holds the rest of the details. You would also need a `product_images` table, where you store all of the images for a product. You can see the entity relationship diagram in **Picture 3**. + +Picture 3 +Picture 3 1-to-1 Entity Diagram + +Picture 3 depicts the entity relationships between `products`, `product_details`, and `product_images` and represents a normalized data model with a single denormalized field image in the `products` table. The reason for this is to avoid having to use a SQL JOIN when selecting the products for the list view. Using this model, the SQL query used to get the data needed for the list view might resemble **Code Example 1**. + +```sql title="Code Example 1" +SELECT + p.id, p.name, p.image, p.price, pi.url +FROM + products p +``` + +### 1-to-1 Relationships using Redis + +In Redis, similar to a relational database, you can create a collection called `products` and another called `product_details`. But with Redis JSON you can improve this by simply embedding `product_images` and `product_details` directly into the `Products` collection. Then, when you query the `Products` collection, specify which fields you need based on which view you are trying to create. + +This will allow you to easily keep all the data in one place. This is called the **Embedded Pattern** and is one of the most common patterns you will see in NoSQL document databases like Redis JSON. **Code Example 2** uses Python and a client library called Redis OM (an ORM for Redis) to model `Products` and `ProductDetails`. Note that `ProductDetails` is embedded into `Products` directly, so all of the data for a product will be stored within the same document. + +```python title="Code Example 2" +class ProductDetail(EmbeddedJsonModel): + description: str + manufacturer: str + dimensions: str + weight: str + images: List[str] + +class Product(JsonModel): + name: str = Field(index=True) + image: str = Field(index=True) + price: int = Field(index=True) + details: Optional[ProductDetail] +``` + +**Code Example 2** also shows how you can index fields using Redis OM and Redis Search. Doing this turns Redis into not only a document store but also a search engine since Redis Search enables secondary indexing and searching. When you create models using Redis OM, it will automatically manage secondary indexes with Redis Search on your behalf. + +Using Redis OM we can write a function to retrieve our `products` list for the list view, as shown in **Code Example 3**. + +```python title="Code Example 3" +async def get_product_list(): + results = await connections \ + .get_redis_connection() \ + .execute_command( + f'FT.SEARCH {Product.Meta.index_name} * LIMIT 0 10 RETURN 3 name image price' + ) + return Product.from_redis(results) +``` + +Notice that in **Code Example 3** we are using the `FT.SEARCH` command, which specifies the index managed on our behalf by Redis OM and returns three fields: name, image, and price. While the documents all have details and images embedded, we don’t want to display them in the list view so we don’t need to query them. When we want the detailed view, we can query an entire Product document. See **Code Example 4** for how to query an entire document. 
+ +```python title="Code Example 4" +async def get_product_details(product_id: str): + return await Product.get(product_id) +``` + +When using Redis, you can use RedisInsight as a GUI tool to visualize and interact with the data in your database. **Picture 4** shows you what a `Products` document looks like. + +Picture 4 +Picture 4 1-to-1 RedisInsight + +## Download the E-book + +
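As a companion to the excerpt above, the following self-contained sketch (not taken from the e-book) ties Code Examples 2 through 4 together using the synchronous Redis OM client: it creates the index, saves one product with its details embedded, reads the whole document back by primary key, and runs an indexed query. It assumes a local Redis Stack instance; the import paths, field values, and the price filter are illustrative.

```python
from typing import List, Optional

# Import paths are an assumption based on the redis-om-python package;
# the e-book's own examples elide their imports.
from redis_om import EmbeddedJsonModel, Field, JsonModel, Migrator


class ProductDetail(EmbeddedJsonModel):
    description: str
    manufacturer: str
    dimensions: str
    weight: str
    images: List[str]


class Product(JsonModel):
    name: str = Field(index=True)
    image: str = Field(index=True)
    price: int = Field(index=True)
    details: Optional[ProductDetail]


# Create the secondary indexes that Redis OM manages for the indexed fields.
Migrator().run()

# Store one product with its details embedded in the same JSON document.
product = Product(
    name='4K Monitor',
    image='https://example.com/monitor.jpg',
    price=350,
    details=ProductDetail(
        description='A 27-inch 4K monitor',
        manufacturer='Acme',
        dimensions='61x36x5 cm',
        weight='5 kg',
        images=['https://example.com/monitor-side.jpg'],
    ),
)
product.save()

# Detailed view: fetch the whole document by primary key, as in Code Example 4.
print(Product.get(product.pk))

# List-view-style query on an indexed field, backed by the same Search index.
print([p.name for p in Product.find(Product.price < 500).all()])
```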
diff --git a/docs/ebooks/three-caching-design-patterns.mdx b/docs/ebooks/three-caching-design-patterns.mdx new file mode 100644 index 00000000000..7f689aad3cb --- /dev/null +++ b/docs/ebooks/three-caching-design-patterns.mdx @@ -0,0 +1,124 @@ +--- +id: three-caching-design-patterns +title: 3 design patterns to speed up MEAN and MERN stack applications +image: /img/ebooks/three-caching-design-patterns/three-caching-design-patterns.png +sidebar_label: 3 design patterns to speed up MEAN and MERN stack applications +slug: /ebooks/three-caching-design-patterns +editUrl: false +showLastUpdateTime: false +--- + +import Excerpt from '@theme/Excerpt'; +import CachingMovieAppDesign from '../howtos/solutions/caching-architecture/common-caching/caching-movie-app.mdx'; +import SourceCodeMovieApp from '../howtos/solutions/caching-architecture/common-caching/source-code-movie-app.mdx'; + + + +## Introduction + +**If you don't design and build software with attention to performance, your applications can encounter significant bottlenecks when they go into production.** + +Over time, the development community has learned common techniques that work as reliable **design patterns** to solve well-understood problems, including application performance. + +So what are design patterns? They are recommended practices to solve recurring design problems in software systems. A design pattern has four parts: a name, a problem description (a particular set of conditions to which the pattern applies), a solution (the best general strategy for resolving the problem), and a set of consequences. + +Two development stacks that have become popular ways to build Node.js applications are the **MEAN** stack and the **MERN** stack. The MEAN stack is made up of the MongoDB database, the Express and Angular.js frameworks, and Node.js. It is a pure JavaScript stack that helps developers create every part of a website or application. In contrast, the MERN stack is made up of MongoDB, the Express and ReactJS frameworks, and Node.js. + +Both stacks work well, which accounts for their popularity. But it doesn't mean the software generated runs as fast as it can—or as fast as it needs to. + +In this post, we share one popular design pattern that developers use with Redis to improve application performance with MEAN and MERN stack applications: the `master data-lookup pattern`. We explain the pattern in detail and accompany it with an overview, typical use cases, and a code example. Our intent is to help you understand when and how to use this particular pattern in your own software development. The Ebook has other patterns too like `The cache-aside pattern` and `The write-behind pattern` + +## Building a movie application + + + +This tutorial uses a GitHub sample demo that was built using the following tools: + +- **Frontend**: ReactJS (18.2.0) +- **Backend**: Node.js (16.17.0) +- **Database**: MongoDB +- **Cache and database**: Redis stack (using Docker) + + + +## The master data-lookup pattern + +One ongoing developer challenge is to (swiftly) create, read, update, and (possibly) delete data that lives long, changes infrequently, and is regularly referenced by other data, directly or indirectly. That's a working definition of master data, especially when it also represents the organization's core data that is considered essential for its operations. + +Master data generally changes infrequently. Country lists, genres, and movie languages usually stay the same. That presents an opportunity to speed things up. 
You can address access and manipulation operations so that [data consistency](https://redis.com/blog/database-consistency/) is preserved and data access happens quickly. + +From a developer's point of view, master data lookup refers to the process by which master data is accessed in business transactions, in application setup, and any other way that software retrieves the information. Examples of master data lookup include fetching data for user interface (UI) elements (such as drop-down dialogs, select values, multi-language labels), fetching constants, user access control, theme, and other product configuration. And you can do that even when you rely primarily on MongoDB as a persistent data store. + +![pattern](/img/ebooks/three-caching-design-patterns/pattern-01.jpg) + +To serve master data from Redis, preload the data from MongoDB. + +1. Read the master data from MongoDB on application startup and store a copy of the data in Redis. This pre-caches the data for fast retrieval. Use a script or a cron job to repeatedly copy master data to Redis. +1. The application requests master data. +1. Instead of MongoDB serving the data, the master data will be served from Redis. + +### Use cases + +Consider this pattern when you need to + +- **Serve master data at speed**: By definition, nearly every application requires access to master data. Pre-caching master data with Redis delivers it to users at high speed. +- **Support massive master tables**: Master tables often have millions of records. Searching through them can cause performance bottlenecks. Use Redis to perform real-time search on the master data to increase performance with sub-millisecond response. +- **Postpone expensive hardware and software investments**: Defer costly infrastructure enhancements by using Redis. Get the performance and scaling benefits without asking the CFO to write a check. + +### Demo + +The image below illustrates a standard way to showcase a UI that is suitable for master data lookups. The developer responsible for this application would treat certain fields as master data, including movie language, country, genre, and ratings, because they are required for common application transactions. + +Consider the pop-up dialog that appears when a user who wants to add a new movie clicks the movie application plus the icon. The pop-up includes drop-down menus for both country and language. In this demonstration, Redis loads the values. + +![demo-03](/img/ebooks/three-caching-design-patterns/demo-03.png) + +### Code + +The two code blocks below display a fetch query of master data from both MongoDB and Redis that loads the country and language drop-down values. + +Previously, if the application used MongoDB, it searched the static database to retrieve the movie's country and language values. That can be time-consuming if it's read from persistent storage—and is inefficient if the information is static. + +```js +*** BEFORE (MongoDB)*** +*** MongoDB regular search query *** +function getMasterCategories() { + ... + db.collection("masterCategories").find({ + statusCode: { + $gt: 0, + }, + category: { + $in: ["COUNTRY", "LANGUAGE"], + }, + }); + ... +} +``` + +Instead, the “after” views in the code blocks show that the master data can be accessed with only a few lines of code—and much faster response times. + +```js +*** AFTER (Redis) *** +*** Redis OM Node query *** +function getMasterCategories() { + ... 
+ masterCategoriesRepository + .search() + .where("statusCode") + .gt(0) + .and("categoryTag") + .containOneOf("COUNTRY", "LANGUAGE"); + ... +} +``` + +## Download the E-book + +**Sensing a pattern here?** +The master data-lookup pattern is not the only design pattern you can use to improve application performance. + + diff --git a/docs/explore/datadog/index-datadog.mdx b/docs/explore/datadog/index-datadog.mdx index ddb554a99bc..6fdb1c4acf1 100644 --- a/docs/explore/datadog/index-datadog.mdx +++ b/docs/explore/datadog/index-datadog.mdx @@ -6,6 +6,10 @@ slug: /explore/datadog authors: [ajeet, christian] --- +import Authors from '@theme/Authors'; + + + ![Datadog](images/datadog-redis.png) Devops and SRE practitioners are already keenly aware of the importance of system reliability, as it’s one of the shared goals in every high performing organization. Defining clear reliability targets based on solid data is crucial for productive collaboration between developers and SREs. This need spans the entire infrastructure from application to backend database services. diff --git a/docs/explore/import/index-import.mdx b/docs/explore/import/index-import.mdx index bcfeef6ef7f..03acb83e83e 100644 --- a/docs/explore/import/index-import.mdx +++ b/docs/explore/import/index-import.mdx @@ -6,7 +6,10 @@ slug: /explore/import/ authors: [ajeet] --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; +import Authors from '@theme/Authors'; + + Redis offers multiple ways to import data into a database; from an file, an script or from an existing Redis database. @@ -63,7 +66,7 @@ _- **Warning**: Importing data erases all existing content in the database._ [Redis Input/Output Tools (RIOT)](/explore/riot) is a set of import/export command line utilities for Redis: -- RIOT DB: migrate from an RDBMS to Redis, RediSearch, RedisJSON, ... +- RIOT DB: migrate from an RDBMS to Redis, Search, JSON, ... - RIOT File: bulk import/export data from/to files. - RIOT Gen: generate sample Redis datasets for new feature development and proof of concept. - RIOT Redis: live replication from any Redis database (including AWS Elasticache) to another Redis database. @@ -74,7 +77,7 @@ _- **Warning**: Importing data erases all existing content in the database._
@@ -84,7 +87,7 @@ _- **Warning**: Importing data erases all existing content in the database._ ## Import data into Redis Enterprise -You can easily import data into Redis Enterprise and Redis Enterprise Cloud, take a look to the following documentation: +You can easily import data into Redis Enterprise and Redis Cloud; take a look at the following documentation: - [Redis Enterprise Software: Importing Data into a Database](https://docs.redis.com/latest/rs/administering/import-export/importing-data/) -- [Redis Enterprise Cloud: Databases Backup and Import](https://docs.redis.com/latest/rc/api/examples/back-up-and-import-data/) +- [Redis Cloud: Databases Backup and Import](https://docs.redis.com/latest/rc/api/examples/back-up-and-import-data/) diff --git a/docs/explore/index-explore.mdx b/docs/explore/index-explore.mdx index 56e635e44ab..a024e7d686d 100644 --- a/docs/explore/index-explore.mdx +++ b/docs/explore/index-explore.mdx @@ -5,29 +5,11 @@ sidebar_label: Overview slug: /explore --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links provides you with the available options to explore a new Redis database either on the Cloud or using local software. <div class="row"> 
-
- -
-
- -
-
- -
-
- -
-
-
-
-

RedisInsight

-
- -
-
-
- Read More -
-
- -
-
-
-
-

Redis Datasource for Grafana

-
- -
-
-
- Read More -
-
- -
- diff --git a/docs/explore/redisdatasource/index-redisdatasource.mdx b/docs/explore/redisdatasource/index-redisdatasource.mdx index 7d0860ba755..4f8a938e2a5 100644 --- a/docs/explore/redisdatasource/index-redisdatasource.mdx +++ b/docs/explore/redisdatasource/index-redisdatasource.mdx @@ -8,8 +8,9 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + The Redis Data Source for Grafana is a plug-in that allows users to connect to the Redis database and build dashboards in Grafana to easily monitor Redis and application data. It provides an out-of-the-box predefined dashboard, but also lets you build customized dashboards tuned to your specific needs. @@ -22,10 +23,9 @@ The Redis Data Source for Grafana is a plug-in that allows users to connect to t - Redis Cluster and Sentinel supported since version 1.2. - Data Source supports: - - [RedisTimeSeries](https://oss.redis.com/redistimeseries/): TS.GET, TS.INFO, TS.MRANGE, TS.QUERYINDEX, TS.RANGE - - [RedisGears](https://oss.redis.com/redisgears/): RG.DUMPREGISTRATIONS, RG.PYEXECUTE, RG.PYSTATS - - [RedisSearch](https://oss.redis.com/redisearch/): FT.INFO - - [RedisGraph](https://oss.redis.com/redisgraph/): GRAPH.QUERY, GRAPH.SLOWLOG + - [Time Series](https://redis.io/docs/stack/timeseries/): TS.GET, TS.INFO, TS.MRANGE, TS.QUERYINDEX, TS.RANGE + - [Search and Query](https://redis.io/docs/stack/search/): FT.INFO + - [Graph](https://redis.io/docs/stack/graph/): GRAPH.QUERY, GRAPH.SLOWLOG - - - Redis Launchpad - - -
diff --git a/docs/explore/redisinsight/autodiscover/images/image1.png b/docs/explore/redisinsight/autodiscover/images/image1.png deleted file mode 100644 index f85043ea7e4..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image10.png b/docs/explore/redisinsight/autodiscover/images/image10.png deleted file mode 100644 index eb6113b536a..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image11.png b/docs/explore/redisinsight/autodiscover/images/image11.png deleted file mode 100644 index 7afd4ff07d2..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image12.png b/docs/explore/redisinsight/autodiscover/images/image12.png deleted file mode 100644 index 445fb64be80..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image13.png b/docs/explore/redisinsight/autodiscover/images/image13.png deleted file mode 100644 index 4af6bac7225..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image14.png b/docs/explore/redisinsight/autodiscover/images/image14.png deleted file mode 100644 index 2b170abfac9..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image15.png b/docs/explore/redisinsight/autodiscover/images/image15.png deleted file mode 100644 index bf2b056338e..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image16.png b/docs/explore/redisinsight/autodiscover/images/image16.png deleted file mode 100644 index 15398a0661f..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image17.png b/docs/explore/redisinsight/autodiscover/images/image17.png deleted file mode 100644 index ba724b0a677..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image18.png b/docs/explore/redisinsight/autodiscover/images/image18.png deleted file mode 100644 index d370d194e56..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image19.png b/docs/explore/redisinsight/autodiscover/images/image19.png deleted file mode 100644 index 4e72fca1170..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image2.png b/docs/explore/redisinsight/autodiscover/images/image2.png deleted file mode 100644 index 2067fc2ed40..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image20.png b/docs/explore/redisinsight/autodiscover/images/image20.png deleted file mode 100644 index d0f464032d0..00000000000 
Binary files a/docs/explore/redisinsight/autodiscover/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image3.png b/docs/explore/redisinsight/autodiscover/images/image3.png deleted file mode 100644 index 81038f76593..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image4.png b/docs/explore/redisinsight/autodiscover/images/image4.png deleted file mode 100644 index 2450a964c20..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image5.png b/docs/explore/redisinsight/autodiscover/images/image5.png deleted file mode 100644 index e3f725dcf70..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image6.png b/docs/explore/redisinsight/autodiscover/images/image6.png deleted file mode 100644 index 85422f26423..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image7.png b/docs/explore/redisinsight/autodiscover/images/image7.png deleted file mode 100644 index 2b847472c32..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image8.png b/docs/explore/redisinsight/autodiscover/images/image8.png deleted file mode 100644 index abde193ff2d..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/images/image9.png b/docs/explore/redisinsight/autodiscover/images/image9.png deleted file mode 100644 index cc53380a65d..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/autodiscover/index-autodiscover.mdx b/docs/explore/redisinsight/autodiscover/index-autodiscover.mdx deleted file mode 100644 index 2ae3c870fff..00000000000 --- a/docs/explore/redisinsight/autodiscover/index-autodiscover.mdx +++ /dev/null @@ -1,179 +0,0 @@ ---- -id: index-autodiscover -title: Utilize Elasticache Auto Discovery For Redis with RedisInsight -sidebar_label: Utilize Elasticache Auto Discovery For Redis with RedisInsight -slug: /explore/redisinsight/autodiscover -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight is a 100% free Redis GUI that allows you to visualise, monitor, and optimize while developing your applications with Redis. It provides an intuitive and efficient GUI for Redis allowing developers like you to interact with your databases and manage your data. RedisInsight comes with the compatibility to connect to your database through the Sentinel instance too. Please note that RedisInsight v2.0 is an open source visual tool built by Redis that lets you do both GUI- and CLI-based interactions with your Redis database. - -RedisInsight lets you automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. RedisInsight also allows you to automatically discover Elasticache Redis caches. 
- -:::important - -ElastiCache Redis caches cannot be accessed from outside the VPC, as they don’t have public IP addresses assigned to them.If you want to work with ElastiCache Redis caches with RedisInsight, you can either setup an SSH tunnel between RedisInsight and your Elasticache instance, in case you're not using Redis Cluster. -::: - -This tutorial shows how to: - -- Setup and configure Amazon Elasticache -- Configure the VPC -- Configuring the security groups -- Configure and setup Amazon EC2 -- Create and configure IAM role -- Assign the permissions -- Connect to Elasticache from EC2 instance -- Setup RedisInsight -- Access RedisInsight -- Autodiscover Elasticache Instance - -### Step 1. Setup and configure Amazon Elasticache - -Login to [AWS Management Console](https://aws.amazon.com) and click "Get Started now" - -![elasticache](images/image1.png) - -Choose "Redis" as the cluster engine - -![elasticache](images/image2.png) - -Configure Redis settings: - -![elasticache](images/image3.png) - -Copy and save the Elasticache primary endpoint URL: - -![elasticache](images/image5.png) - -### Step 2. Configure the VPC - -Configure and chose VPC that has your ElastiCache instances - -![elasticache](images/image6.png) - -### Step 3. Configure the Security Groups - -![elasticache](images/image7.png) - -Configure inbound and outbound rules to allow RedisInsight and Redis ports: - -![elasticache](images/image12.png) - -### Step 4. Configure and setup Amazon EC2 - -![elasticache](images/image8.png) - -### Step 5. Create and configure IAM role - -You can use the AWS Management Console to create a role that an IAM user can assume - -![elasticache](images/image9.png) - -Under Select type of trusted entity, choose EC2. In other words, the role is used by an EC2 instance - -![elasticache](images/image10.png) - -Click “Next”. - -### Step 6. Assign the permissions - -Assign the below permissions: - -- AmazonS3ReadOnlyAccess -- AmazonElastiCacheReadOnlyAccess - -![elasticache](images/image11.png) - -### Step 7. Connect to Elasticache from EC2 instance - -Use the `redis-cli` command to connect to the remote Amazon Elasticache for Redis server endpoint URL. - -```bash - ubuntu@ip-10-0-0-254:~$ redis-cli -h redisinsightdemo.8cfnjo.ng.0001.use1.cache.amazonaws.com -p 6379 - redisinsightdemo.8cfnjo.ng.0001.use1.cache.amazonaws.com:6379> -``` - -### Step 8. Setup RedisInsight - -In order to access the RedisInsight GUI, run the following Docker command: - -```bash - ubuntu@ip-10-0-0-254:~$ sudo docker run -v redisinsight:/db -p 8001:8001 redislabs/redisinsight:latest - Unable to find image 'redislabs/redisinsight:latest' locally - latest: Pulling from redislabs/redisinsight -``` - -```bash - sudo docker ps - CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES - caf3d674fb81 redislabs/redisinsight:latest "bash ./docker-entry…" 4 seconds ago Up 3 seconds 0.0.0.0:8001->8001/tcp, :::8001->8001/tcp cool_pascal -``` - -### Step 9. Access RedisInsight - -To access the RedisInsight GUI, open your preferred browser and access https://localhost:8001 - -![elasticache](images/image13.png) -![elasticache](images/image14.png) - -### Step 10. Autodiscover Elasticache Instance - -:::important -In case you encounter the below error message: - -This EC2 instance does not have permissions to discover your ElastiCache instances. To grant permission, create an IAM role with the DescribeCacheClusters permission and attach the role to this EC2 instance. 
- -You might have to attach IAM role to the instance as shown below: - -![elasticache](images/image15.png) - -::: - -Now you can should be able to autodiscover Elasticache - -![elasticache](images/image16.png) - -![elasticache](images/image18.png) - -Add the selected instance: - -![elasticache](images/image19.png) - -Add the discovered instance: - -![elasticache](images/image20.png) - -### References - -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Unified Search and Analytics using RediSearch Browser Tool](/explore/redisinsight/redisearch) -- [Managing time-series data using RedisTimeSeries Browser Tool](/explore/redisinsight/redistimeseries) -- [Analyze Your Redis commands using RedisInsight Profiler tool](/explore/redisinsight/profiler) -- [Debugging Redis using RedisInsight Slowlog Tool](/explore/redisinsight/slowlog) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/autodiscover/launchpad.png b/docs/explore/redisinsight/autodiscover/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/autodiscover/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image1.png b/docs/explore/redisinsight/browser/images/image1.png deleted file mode 100644 index e6ca70713ba..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image10.png b/docs/explore/redisinsight/browser/images/image10.png deleted file mode 100644 index 3f492b80148..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image11.png b/docs/explore/redisinsight/browser/images/image11.png deleted file mode 100644 index 7a64194a579..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image12.png b/docs/explore/redisinsight/browser/images/image12.png deleted file mode 100644 index 3f70b390c91..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image13.png b/docs/explore/redisinsight/browser/images/image13.png deleted file mode 100644 index 61be03eaf13..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image14.png b/docs/explore/redisinsight/browser/images/image14.png deleted file mode 100644 index ee965b171b7..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image15.png b/docs/explore/redisinsight/browser/images/image15.png deleted file mode 100644 index 22af2a3abc1..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image16.png b/docs/explore/redisinsight/browser/images/image16.png deleted file mode 100644 index 86a523d911f..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image17.png b/docs/explore/redisinsight/browser/images/image17.png 
deleted file mode 100644 index 3171127e07b..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image18.png b/docs/explore/redisinsight/browser/images/image18.png deleted file mode 100644 index f0f3c7c634d..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image19.png b/docs/explore/redisinsight/browser/images/image19.png deleted file mode 100644 index b789b6682eb..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image2.png b/docs/explore/redisinsight/browser/images/image2.png deleted file mode 100644 index e72743fa637..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image20.png b/docs/explore/redisinsight/browser/images/image20.png deleted file mode 100644 index 338722b66e0..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image3.png b/docs/explore/redisinsight/browser/images/image3.png deleted file mode 100644 index d37e734cd30..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image4.png b/docs/explore/redisinsight/browser/images/image4.png deleted file mode 100644 index 953309f20c7..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image5.png b/docs/explore/redisinsight/browser/images/image5.png deleted file mode 100644 index fd9cd4f7d1d..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image6.png b/docs/explore/redisinsight/browser/images/image6.png deleted file mode 100644 index 8c2fd68d231..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image7.png b/docs/explore/redisinsight/browser/images/image7.png deleted file mode 100644 index 8766b9a5fc9..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image8.png b/docs/explore/redisinsight/browser/images/image8.png deleted file mode 100644 index 34d48cb581c..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/image9.png b/docs/explore/redisinsight/browser/images/image9.png deleted file mode 100644 index 68a163a2300..00000000000 Binary files a/docs/explore/redisinsight/browser/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/redisinsight4.png b/docs/explore/redisinsight/browser/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/browser/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/images/redisinsightinstall.png b/docs/explore/redisinsight/browser/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files 
a/docs/explore/redisinsight/browser/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/browser/index-browser.mdx b/docs/explore/redisinsight/browser/index-browser.mdx deleted file mode 100644 index b36729d3a32..00000000000 --- a/docs/explore/redisinsight/browser/index-browser.mdx +++ /dev/null @@ -1,160 +0,0 @@ ---- -id: index-browser -title: Visualize Redis database keys using RedisInsight Browser Tool -sidebar_label: Visualize Redis database keys using RedisInsight Browser Tool -slug: /explore/redisinsight/browser -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight is a 100% free Redis GUI that allows you to visualise, monitor, and optimize while developing your applications with Redis. It provides an intuitive and efficient GUI for Redis allowing developers like you to interact with your databases and manage your data. - -RedisInsight Browser lets you explore keys in your redis server. You can add, edit and delete a key. You can even update the key expiry and copy the key name to be used in different places of the application. - -In order to understand the capabilities of the browser tool, let us take a simple example and demonstrate capabilities of each of browse tool options: - -## Step 1: Create a Redis Database - -[Follow this link ](https://developer.redis.com/create)to create Redis database - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3: Open "Browser Tool" - -![alt_text](images/image1.png) - -## Step 4: Importing keys - -Let us import a user database( 6k keys). This dataset contains users stored as Redis Hash. - -### - -**Users** - -The user hashes contain the following fields: - -- `user:id` : The key of the hash. -- `first_name` : First Name. -- `last_name` : Last name. -- `email` : email address. -- `gender` : Gender (male/female). -- `ip_address` : IP address. -- `country` : Country Name. -- `country_code` : Country Code. -- `city` : City of the user. -- `longitude` : Longitude of the user. -- `latitude` : Latitude of the user. -- `last_login` : EPOC time of the last login. - -## Step 5: Cloning the repository - -```bash - git clone https://github.com/redis-developer/redis-datasets - cd redis-datasets/user-database -``` - -Importing the user database: - -```bash - redis-cli -h localhost -p 6379 < ./import_users.redis -``` - -Refresh the keys database by clicking as shown below: - -Click on “Scan More” to scan all 6k keys - -![alt_text](images/image3.png) - -You can get a real-time view of the data in your Redis database as shown below: - -![alt_text](images/image4.png) - -Select any key in the key database and the results gets displayed in the right hand side that includes Fields and values. - -![alt_text](images/image5.png) - -## Step 6. 
Adding a new key - -![alt_text](images/image6.png) - -Enter key name, field and value. - -![alt_text](images/image7.png) - -## Step 7. Searching the hash key - -You can search the key by “user:9999” and you will see the result. - -![alt_text](images/image8.png) - -Let us add fields for user:9999 as shown below: - -You can even search by adding “\*” and typing the first few letters. - -![alt_text](images/image10.png) - -## Step 8: Filter keys by Data Type - -![alt_text](images/image12.png) - -## Step 9: Setting up the Expiry value - -Let us set it to 2 seconds and you won’t be able to search for the same key as it gets expired. - -![alt_text](images/image13.png) - -## Step 10: Using CLI - -RedisInsight CLI lets you run commands against a redis server. You don’t need to remember the syntax - the integrated help shows you all the arguments and validates your command as you type. - -``` -> HMGET user:3333 first_name last_name city - -1) "Myrlene" -2) "McGrane" -3) "Qinghu" -``` - -![alt_text](images/image14.png) - -## Further References - -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) - -## - - diff --git a/docs/explore/redisinsight/browser/launchpad.png b/docs/explore/redisinsight/browser/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/browser/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster10.png b/docs/explore/redisinsight/cluster/cluster10.png deleted file mode 100644 index d877e4d574d..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster10.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster11.png b/docs/explore/redisinsight/cluster/cluster11.png deleted file mode 100644 index 654e3200380..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster11.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster116.png b/docs/explore/redisinsight/cluster/cluster116.png deleted file mode 100644 index 2bfe64c8c5e..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster116.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster13.png b/docs/explore/redisinsight/cluster/cluster13.png deleted file mode 100644 index f2845d8a45d..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster13.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster14.png b/docs/explore/redisinsight/cluster/cluster14.png deleted file mode 100644 index 778f6349fc5..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster14.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster15.png b/docs/explore/redisinsight/cluster/cluster15.png deleted file mode 100644 index 55fc7e18aa8..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster15.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster16.png b/docs/explore/redisinsight/cluster/cluster16.png deleted file mode 100644 index 6f64aee1caa..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster16.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster17.png b/docs/explore/redisinsight/cluster/cluster17.png deleted file mode 100644 index 2bfe64c8c5e..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster17.png and 
/dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster18.png b/docs/explore/redisinsight/cluster/cluster18.png deleted file mode 100644 index 1024b678934..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster18.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster19.png b/docs/explore/redisinsight/cluster/cluster19.png deleted file mode 100644 index 6aa8932d1c2..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster19.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster2.png b/docs/explore/redisinsight/cluster/cluster2.png deleted file mode 100644 index 2dd3339b175..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster2.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster20.png b/docs/explore/redisinsight/cluster/cluster20.png deleted file mode 100644 index 6667c6b474d..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster20.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster21.png b/docs/explore/redisinsight/cluster/cluster21.png deleted file mode 100644 index 4edcaf5e81d..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster21.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster22.png b/docs/explore/redisinsight/cluster/cluster22.png deleted file mode 100644 index 0763816cb02..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster22.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster23.png b/docs/explore/redisinsight/cluster/cluster23.png deleted file mode 100644 index 95fb5c41721..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster23.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster24.png b/docs/explore/redisinsight/cluster/cluster24.png deleted file mode 100644 index 30e589bfcf6..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster24.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster25.png b/docs/explore/redisinsight/cluster/cluster25.png deleted file mode 100644 index 7b8b4bf1ba3..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster25.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster26.png b/docs/explore/redisinsight/cluster/cluster26.png deleted file mode 100644 index 15bfacb3e2f..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster26.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster27.png b/docs/explore/redisinsight/cluster/cluster27.png deleted file mode 100644 index 15bfacb3e2f..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster27.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster28.png b/docs/explore/redisinsight/cluster/cluster28.png deleted file mode 100644 index 76a241330cf..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster28.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster29.png b/docs/explore/redisinsight/cluster/cluster29.png deleted file mode 100644 index c139e0602f1..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster29.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster3.png b/docs/explore/redisinsight/cluster/cluster3.png deleted file mode 100644 index e9571f7857b..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster3.png and /dev/null differ diff --git 
a/docs/explore/redisinsight/cluster/cluster30.png b/docs/explore/redisinsight/cluster/cluster30.png deleted file mode 100644 index ba7f5d6bfc7..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster30.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster31.png b/docs/explore/redisinsight/cluster/cluster31.png deleted file mode 100644 index 8a57dff4c2b..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster31.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster32.png b/docs/explore/redisinsight/cluster/cluster32.png deleted file mode 100644 index c58347a028f..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster32.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster33.png b/docs/explore/redisinsight/cluster/cluster33.png deleted file mode 100644 index 7d06c97b075..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster33.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster34.png b/docs/explore/redisinsight/cluster/cluster34.png deleted file mode 100644 index fc359d1aeb2..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster34.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster35.png b/docs/explore/redisinsight/cluster/cluster35.png deleted file mode 100644 index 1a26c011b7f..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster35.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster36.png b/docs/explore/redisinsight/cluster/cluster36.png deleted file mode 100644 index c5ca3a3bd02..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster36.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster4.png b/docs/explore/redisinsight/cluster/cluster4.png deleted file mode 100644 index 79790496134..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster4.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster5.png b/docs/explore/redisinsight/cluster/cluster5.png deleted file mode 100644 index c3c13bfb194..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster5.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster6.png b/docs/explore/redisinsight/cluster/cluster6.png deleted file mode 100644 index 46b196326c3..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster6.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster7.png b/docs/explore/redisinsight/cluster/cluster7.png deleted file mode 100644 index d05eb3bf4ed..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster7.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster8.png b/docs/explore/redisinsight/cluster/cluster8.png deleted file mode 100644 index 07a38f5d228..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster8.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/cluster9.png b/docs/explore/redisinsight/cluster/cluster9.png deleted file mode 100644 index 1f92a9dc906..00000000000 Binary files a/docs/explore/redisinsight/cluster/cluster9.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/images/redisinsight4.png b/docs/explore/redisinsight/cluster/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/cluster/images/redisinsight4.png and /dev/null differ diff --git 
a/docs/explore/redisinsight/cluster/images/redisinsightinstall.png b/docs/explore/redisinsight/cluster/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/cluster/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/cluster/index-cluster.mdx b/docs/explore/redisinsight/cluster/index-cluster.mdx deleted file mode 100644 index 57263d3fc47..00000000000 --- a/docs/explore/redisinsight/cluster/index-cluster.mdx +++ /dev/null @@ -1,402 +0,0 @@ ---- -id: index-cluster -title: Manage Your Redis Cluster using RedisInsight Cluster Management Tool -sidebar_label: Manage Your Redis Cluster using RedisInsight Cluster Management Tool -slug: /explore/redisinsight/cluster -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -A full-featured desktop GUI client, RedisInsight is an essential tool for Redis developers. It is a lightweight multi-platform management visualization tool that helps you design, develop, and optimize your application capabilities in a single easy-to-use environment. RedisInsight not just makes it easier to interact with your databases and manage your data, but also helps in managing Redis Cluster with ease. - -## Getting Started - -Redis Cluster is an active-passive cluster implementation that consists of master and replicas nodes. -There are two kinds of nodes: master and replicas nodes. To get started with Redis Cluster, follow the below steps to build 3 Master and 3 Replicas. - -:::important - -Please Note: In Redis, slave nodes are generally called replicas as they hold the replicated slots that their masters have. - -::: - -## Step 1. Install Redis from source - -[Follow this link](/create/from-source/) to build Redis from source in your local system. - -## Step 2: Executing the create-cluster script - -By default, if you compile Redis from source, you will find a simple bash script called create-cluster under /utils/ directory. -In order to start a 6 nodes cluster with 3 masters and 3 replicas, just type the following commands: - -```bash - cd /utils/create-cluster - ./create-cluster start -``` - -```bash - Starting 30001 - Starting 30002 - Starting 30003 - Starting 30004 - Starting 30005 - Starting 30006 -``` - -## Step 3. Access Redis Instance - -```bash - redis-cli -c -p 30001 - 127.0.0.1:30001> set foo bar - -> Redirected to slot [12182] located at 127.0.0.1:30003 - OK - 127.0.0.1:30003> -``` - -## Step 4. Verify the cluster nodes - -``` - cluster nodes - b37c153b7cb63a863b51fa08bdde46bfda9c6a98 127.0.0.1:30005@40005 slave 3e85f061bebd9b566e1cbf7f03cbe3e1859babbc 0 1620304753134 3 connected - 8a1a0ba49e1845feff5314fbb8b73a2ec99e3647 127.0.0.1:30001@40001 master - 0 1620304753033 1 connected 0-5460 - bd7326d7b907a04214372fe41189e41763a1e1df 127.0.0.1:30006@40006 slave 8a1a0ba49e1845feff5314fbb8b73a2ec99e3647 0 1620304753033 1 connected - 3e85f061bebd9b566e1cbf7f03cbe3e1859babbc 127.0.0.1:30003@40003 myself,master - 0 1620304753000 3 connected 10923-16383 - 67bbe43901031fa4bfe4cee6105d284f4fe7733b 127.0.0.1:30002@40002 master - 0 1620304753033 2 connected 5461-10922 - 0e3fb1de10b722458c959b35f1468275c34ba49f 127.0.0.1:30004@40004 slave 67bbe43901031fa4bfe4cee6105d284f4fe7733b 0 1620304753134 2 connected -``` - -## Step 5. 
Set Protected mode off - -```bash - redis-cli -c -p 30001 - 127.0.0.1:30001> CONFIG SET protected-mode no - OK - 127.0.0.1:30001> exit -``` - -## Step 6: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -![cluster](cluster2.png) - -Enter the details - host IP, port and name of the database of your choice. - -![cluster](cluster3.png) - -## Step 7. Select the seed nodes of your cluster - -![cluster](cluster4.png) - -## Step 8. Choose the database - -![cluster](cluster5.png) - -## Step 9. Click "Cluster Management" - -On the left menu of the RedisInsight, click 'Cluster Management' option to check the cluster health and cluster nodes. - -![cluster](cluster6.png) - -## Step 10. Choose "Master Layout View" - -This view only contains information about the masters present in the Redis Cluster. The information present is - slot ranges, host, port and few metrics gathered from redis INFO Command. - -![cluster](cluster7.png) - -Cluster Management comes with three different views to analyze your cluster architecture(as shown above). - -- Master Layout - This view only contains information about the masters present in the Redis Cluster. The information present is - slot ranges, host, port and few metrics gathered from redis INFO Command. -- Master-Replica Layout - This view contains masters along with their replicas. This view contains information about slots ranges, host, port, etc for both master and replicas. -- Physical Layout - This view gives you a representation of your server i.e. it groups all nodes according to the physical server they reside in. - -## Step 11. Resharding - -Resharding basically means to move hash slots from a set of nodes to another set of nodes, and like cluster creation it is accomplished using the redis-cli utility. - -Typically, to start a resharding, you pass “--cluster reshard” option to the redis client CLI as shown below: - -```bash - redis-cli --cluster reshard 127.0.0.1:7000 -``` - -You only need to specify a single node, redis-cli will find the other nodes automatically. - -With RedisInsight, it can be performed over the GUI. Open Cluster Management > Manual Resharding option > Destination node > Source Nodes and enter the slot range. -Before we perform resharding, let us try to insert real-time keys into the cluster. 
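If you only want a few test keys and would rather not install Ruby, a quick alternative is a small `redis-cli` loop (a minimal sketch; it assumes the cluster created in Step 2 is listening on 127.0.0.1:30001):

```bash
# Write 100 keys through any cluster node; the -c flag makes redis-cli
# follow MOVED redirections so each key lands on the owning master.
for i in $(seq 1 100); do
  redis-cli -c -p 30001 SET "foo${i}" "${i}" > /dev/null
done

# Spot-check a couple of the keys:
redis-cli -c -p 30001 GET foo1
redis-cli -c -p 30001 GET foo42
```

The Ruby script below does the same thing continuously, which is more useful for watching keys while a reshard is in progress.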
To run the loader script below, you need Ruby installed on your system. - -```ruby - - require './cluster' - - if ARGV.length != 2 - startup_nodes = [ - {:host => "127.0.0.1", :port => 30001}, - {:host => "127.0.0.1", :port => 30003} - ] - else - startup_nodes = [ - {:host => ARGV[0], :port => ARGV[1].to_i} - ] - end - - rc = RedisCluster.new(startup_nodes,32,:timeout => 0.1) - - last = false - - while not last - begin - last = rc.get("__last__") - last = 0 if !last - rescue => e - puts "error #{e.to_s}" - sleep 1 - end - end - - ((last.to_i+1)..1000000000).each{|x| - begin - rc.set("foo#{x}",x) - puts rc.get("foo#{x}") - rc.set("__last__",x) - rescue => e - puts "error #{e.to_s}" - end - sleep 0.1 - } -``` - -Save the above content in a file called testing.rb and run it as shown below: - -```bash - ruby testing.rb -``` - -The above script will insert keys into the Redis cluster. - -![cluster](cluster9.png) - -You can check the real-time logs via the MONITOR command: - -```bash - - 1620718356.267791 [0 127.0.0.1:56056] "set" "foo2124" "2124" - 1620718356.268153 [0 127.0.0.1:56056] "get" "foo2124" - 1620718356.683092 [0 127.0.0.1:56056] "set" "foo2128" "2128" - 1620718356.683403 [0 127.0.0.1:56056] "get" "foo2128" - 1620718357.208191 [0 127.0.0.1:56056] "set" "foo2133" "2133" - 1620718357.208636 [0 127.0.0.1:56056] "get" "foo2133" - 1620718357.625524 [0 127.0.0.1:56056] "set" "foo2137" "2137" - 1620718357.625961 [0 127.0.0.1:56056] "get" "foo2137" - 1620718358.248578 [0 127.0.0.1:56056] "set" "foo2143" "2143" -``` - -Let us perform a manual resharding. Select the "Manual Resharding" option under the Actions tab. -It will ask you to select the destination and source nodes, and it also lets you enter a slot range, as shown below. - -![cluster](cluster10.png) - -Click "Next". - -The resharding process begins instantly. - -![cluster](cluster11.png) - -Finally, you can view the changes under the Cluster Management section as shown below: - -![cluster](cluster14.png) - -## Step 12. Viewing Physical Layout - -This view gives you a representation of your cluster nodes, i.e. it groups all nodes according to the physical server they reside in. - -![cluster](cluster13.png) - -## Step 13. Adding Keys Manually - -Let us try to add a key against the cluster nodes: - -![cluster](cluster19.png) - -```bash - - Connecting... - - Pinging Redis server on 127.0.0.1:30003... - Connected. - Ready to execute commands. - - >> set hello world - - -> Redirected to slot [866] located at 127.0.0.1:30001 -"OK" -``` - -```bash - - >> set lang python - - 127.0.0.1:30001 [master] - "OK" - - 127.0.0.1:30003 [master] - (error) MOVED 3807 127.0.0.1:30001 - - 127.0.0.1:30002 [master] - (error) MOVED 3807 127.0.0.1:30001 -``` - -Once you set up a cluster, its data is sharded across the master nodes. - -```bash - > set a1 100 -``` - -Once you add a key to Redis, its hash slot is calculated. Redis determines the slot where the key will land by taking the CRC16 of the key modulo 16384. - -## Step 14. Cyclic redundancy check (CRC) - -By computing the hash slot for each key, Redis automatically spreads your data across the nodes. - -![cluster](cluster20.png) - -In a Redis cluster, there are 16,384 hash slots available. With three masters, the first master node holds slots 0 to 5500, the second 5501 to 11000, and the third 11001 to 16383. - -## Step 15. Adding a New Node - -Let us add a new node by creating a new directory with its own Redis configuration file: - -```bash - % tree - . 
- ├── 30010 - │ ├── appendonly.aof - │ ├── dump.rdb - │ ├── nodes.conf - │ └── redis.conf - └── redis.conf - - 1 directory, 5 files - - % cat 30010/redis.conf - port 30011 - cluster-enabled yes - cluster-config-file nodes.conf - cluster-node-timeout 5000 - appendonly yes -``` - -It's time to run the new Redis instance. - -````bash - redis-server ./redis.conf - 34168:C 10 May 2021 15:49:04.251 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo - 34168:C 10 May 2021 15:49:04.251 # Redis version=6.1.241, bits=64, commit=00000000, modified=0, pid=34168, just started - 34168:C 10 May 2021 15:49:04.251 # Configuration loaded - 34168:M 10 May 2021 15:49:04.252 * Increased maximum number of open files to 10032 (it was originally set to 2560). - 34168:M 10 May 2021 15:49:04.252 * monotonic clock: POSIX clock_gettime - 34168:M 10 May 2021 15:49:04.253 * No cluster configuration found, I'm d3c15d55a60a4cdf9c1f4de8b0c637dda3500ca0 - _._ - _.-``__ ''-._ - _.-`` `. `_. ''-._ Redis 6.1.241 (00000000/0) 64 bit - .-`` .-```. ```\/ _.,_ ''-._ - ( ' , .-` | `, ) Running in cluster mode - |`-._`-...-` __...-.``-._|'` _.-'| Port: 30010 - | `-._ `._ / _.-' | PID: 34168 - `-._ `-._ `-./ _.-' _.-' - |`-._`-._ `-.__.-' _.-'_.-'| - | `-._`-._ _.-'_.-' | http://redis.io - `-._ `-._`-.__.-'_.-' _.-' - |`-._`-._ `-.__.-' _.-'_.-'| - | `-._`-._ _.-'_.-' | - `-._ `-._`-.__.-'_.-' _.-' - `-._ `-.__.-' _.-' - `-._ _.-' - `-.__.-' - - 34168:M 10 May 2021 15:49:04.254 # Server initialized - 34168:M 10 May 2021 15:49:04.254 * Ready to accept connections -```` - -Enter the host and port details of the new node: - -![cluster](cluster21.png) - -Now, you can view the nodes layout as shown below: - -![cluster](cluster21.png) - -You can also view it via CLI on your local system if you want to verify the new node entry. - -```bash - redis-cli -p 30001 -127.0.0.1:30001> cluster nodes -1a959116fb6c32726b8513668149c8a27dc61613 127.0.0.1:30006@40006 replicas 7ac14c8345df91640bc7174de903f0dd8683a1d2 0 1620642235140 7 connected -5d2ce263fb025d38c2d7626d48422d0e28280aa7 127.0.0.1:30004@40004 replicas 6f0096be6248834c0f3237192020d12ff6496f74 0 1620642235343 1 connected -7ac14c8345df91640bc7174de903f0dd8683a1d2 127.0.0.1:30003@40003 master - 0 1620642235039 7 connected 0-1000 10923-16383 -e0daeb42432323b587b281f26b26b90e9e6f2482 127.0.0.1:30005@40005 replicas 011209ddc3577e8ec15efbcb12e38a405bda20f9 0 1620642235241 2 connected -6f0096be6248834c0f3237192020d12ff6496f74 127.0.0.1:30001@40001 myself,master - 0 1620642235000 1 connected 1001-5460 -011209ddc3577e8ec15efbcb12e38a405bda20f9 127.0.0.1:30002@40002 master - 0 1620642235039 2 connected 5461-10922 -d3c15d55a60a4cdf9c1f4de8b0c637dda3500ca0 127.0.0.1:30010@40010 master - 0 1620642235140 0 connected -127.0.0.1:30001> -``` - -## Step 16. Make Replica Of - -Whenever you add a new node, Redis allows you to rebalance your cluster as shown below: - -![cluster](cluster22.png) - -You can select the master that will be replicas of the specific node: - -![cluster](cluster23.png) - -Choose the right master node of your choice and click "Proceed". - -![cluster](cluster24.png) -![cluster](cluster25.png) - -## Step 17. Deleting a Node - -To delete a node, select “Master-Replica Nodes” option and you will see all the replicas nodes - -![cluster](cluster27.png) -![cluster](cluster28.png) -![cluster](cluster29.png) - -## Step 18. Removing the node from the Cluster - -![cluster](cluster30.png) - -## Step 19. 
Failover - -In order to upgrade the Redis process of one of the master nodes, it is a good idea to fail it over so that it becomes a replica, with minimal impact on availability. - -![cluster](cluster31.png) - -Also, the RedisInsight Cluster Management tool allows you to rebalance your cluster by manually defining the slot coverage, as shown below: - -![cluster](cluster32.png) -![cluster](cluster33.png) - -## Additional Links - -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Visualize Redis database keys using RedisInsight Browser Tool](/explore/redisinsight/browser) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) diff --git a/docs/explore/redisinsight/cluster/redisinsightinstall.png b/docs/explore/redisinsight/cluster/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/cluster/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/index-gettingstarted.mdx b/docs/explore/redisinsight/getting-started/index-gettingstarted.mdx deleted file mode 100644 index 165cd09418e..00000000000 --- a/docs/explore/redisinsight/getting-started/index-gettingstarted.mdx +++ /dev/null @@ -1,508 +0,0 @@ ---- -id: index-gettingstarted -title: Getting Started with RedisInsight -sidebar_label: Getting Started with RedisInsight -slug: /explore/redisinsight/getting-started -authors: [ajeet] --- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight is an intuitive and efficient GUI for Redis, allowing you to interact with your databases and manage your data—with built-in support for most popular Redis modules. It is a 100% free Redis GUI tool that allows you to visualize, monitor, and optimize your data while developing your applications with Redis. - -![my image](redisinsight.gif) - -RedisInsight provides built-in support for the RedisJSON, RediSearch, RedisGraph, Redis Streams, and RedisTimeSeries modules to make it even easier to query, visualize, and interactively manipulate search indexes, graphs, streams, and time-series data. - -A full-featured desktop GUI client, RedisInsight is available for Windows, macOS, and Linux and is fully compatible with Redis Enterprise. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. RedisInsight makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -### Getting Started with RedisInsight - - - - -#### Using macOS - -#### Step 1. Download RedisInsight - -To use RedisInsight on macOS, download it from the RedisInsight page on the Redis website: - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](redisinsightmac.png) - -Fill out the rest of the form and click “Download.” Please note that the package name is the combination of the platform and version as shown here: - -redisinsight-platform-version - -:::info TIP -If you're using Redis Stack, you don't really need to install RedisInsight separately. 
-Redis Stack includes RedisInsight, a visualization tool for understanding and optimizing Redis data. -::: - -#### Step 2. Install RedisInsight - -Click on the RedisInsight executable and install it in your system. - -![My Image](redisinsight2.png) - -Head over to your web browser and go to http://localhost:8001 - -Congratulations! You have successfully installed RedisInsight and are now ready to inspect your Redis data, monitor database health, and perform runtime server configuration with this browser-based management interface for your Redis deployment. - -#### Step 3. Connect to Redis database - -![My Image](redisinsight3.png) - -Assuming that you already have Redis database up and running, select "Connect to a Redis database" -![My Image](redisinsight4.png) - -#### Step 4. Add Redis Database - -Enter the requested details, including Name, Host (endpoint), Port, and Password in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](redisinsight5.png) - -#### Step 5. Run the Redis CLI - -Finally, although RedisInsight is a great GUI, sometimes you want to work directly in the command-line interface (CLI). To do so, click “CLI” in the menu on the left side of the RedisInsight UI: - -![My Image](redisinsight7.png) - -Then paste the appropriate Redis commands in the command section, marked with “>>” as shown below, and press Enter. - -![My Image](redisinsight9.png) - -You can see the output displayed at the top of the screen. If it says “OK,” the command was executed successfully. - - - - - -#### Using Linux - -#### Step 1. Download RedisInsight - -To use RedisInsight on your Linux machine, you can download from the RedisInsight page on the RedisLabs website: - -Open [this](https://redis.com/redis-enterprise/redis-insight/#insight-form) link to open up a form that allows you to select the operating system of your choice. - -![My Image](redisinsight-linux.png) - -Fill out the rest of the form and click “Download.” Please note that the package name is the combination of the platform and version as shown here: - -redisinsight-linux64 - -#### Step 2. Install RedisInsight - -Open a terminal and navigate to the folder containing the downloaded file. - -Make your downloaded file into an executable. - -``` -chmod +x redisinsight-linux64- -``` - -#### Step 3. Start RedisInsight. - -``` -./redisinsight-linux64-version -``` - -To access your RedisInsight GUI, open a web browser and navigate to http://127.0.0.1:8001. - -Congratulations! You have successfully installed RedisInsight and are now ready to inspect your Redis data, monitor database health, and perform runtime server configuration with this browser-based management interface for your Redis deployment. - -![My Image](redisinsight3.png) - -#### Step 4. Connect to Redis database - -Assuming that you already have Redis database up and running, select "Connect to a Redis database" -![My Image](redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](redisinsight5.png) - -#### Step 5. Run the Redis CLI - -Finally, although RedisInsight is a great GUI, sometimes you want to work directly in the command-line interface (CLI). To do so, click “CLI” in the menu on the left side of the RedisInsight UI: - -![My Image](redisinsight7.png) - -Then paste the appropriate Redis commands in the command section, marked with “>>” as shown below, and press Enter. 
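For example, you might paste a couple of simple commands such as the following (the key names here are only placeholders; any valid Redis commands will work):

```
SET greeting "hello"
GET greeting
HSET user:1000 first_name "Ada" city "London"
HGETALL user:1000
```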
- -![My Image](redisinsight9.png) - -You can see the output displayed at the top of the screen. If it says “OK,” the command was executed successfully. - - - - - -#### Using Windows - -#### Step 1. Download RedisInsight - -To install RedisInsight on Windows, there is no need to install any .NET framework. RedisInsight should install and run on a fresh Windows system. - -To use RedisInsight on your Windows machine, you can download from the RedisInsight page on the RedisLabs website: - -Open [this](https://redis.com/redis-enterprise/redis-insight/#insight-form) link to open up a form that allows you to select the operating system of your choice. - -#### Step 2. Install RedisInsight - -![My Image](redisinsight-windows.png) - -Fill out the rest of the form and click “Download.” Please note that the package name is the combination of the platform and version as shown here: - -redisinsight-win-msi - -#### Step 3. Accessing RedisInsight - -After the web server starts, open http://127.0.0.1:8001 and add a Redis database connection. - -Congratulations! You have successfully installed RedisInsight and are now ready to inspect your Redis data, monitor database health, and perform runtime server configuration with this browser-based management interface for your Redis deployment. - -#### Step 4. Connect to a Redis database - -![My Image](redisinsight3.png) - -Assuming that you already have Redis database up and running, select "Connect to a Redis database" -![My Image](redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](redisinsight5.png) - -#### Step 5. Run the Redis CLI - -Finally, although RedisInsight is a great GUI, sometimes you want to work directly in the command-line interface (CLI). To do so, click “CLI” in the menu on the left side of the RedisInsight UI: - -![My Image](redisinsight7.png) - -Then paste the appropriate Redis commands in the command section, marked with “>>” as shown below, and press Enter. - -![My Image](redisinsight9.png) - -You can see the output displayed at the top of the screen. If it says “OK,” the command was executed successfully. - - - - - -#### Using Docker - -#### Step 1. Install Docker - -The first step is to install docker for your operating system. Run the docker version command in a terminal window to make sure that docker is installed correctly. - -Note - On Windows and Mac, install docker version 18.03 or higher. You can run docker version to find out your docker version. - -#### Step 2. Run RedisInsight Docker image - -Next, run the RedisInsight container. The easiest way is to run the following command: - -```bash - docker run -d -v redisinsight:/db -p 8001:8001 redislabs/redisinsight:latest -``` - -#### Step 3. Accessing RedisInsight - -Next, point your browser to http://localhost:8001. - -RedisInsight also provides a health check endpoint at http://localhost:8001/healthcheck/ to monitor the health of the running container. - -Note: Make sure the directory you pass as a volume to the container has necessary permissions for the container to access it. For example, if the previous command returns a permissions error, run the following command: - -``` - chown -R 1001 redisinsight -``` - -In addition, you can add some additional flags to the docker run command: - -You can add the -it flag to see the logs and view the progress. - -On Linux, you can add --network host. 
This makes it easy to work with redis running on your local machine. - -To analyze RDB files stored in S3, you can add the access key and secret access key as environment variables using the -e flag. - -For example: -e AWS_ACCESS_KEY=aws_access_key -e AWS_SECRET_KEY=aws_secret_access_key - -If everything worked, you should see the following output in the terminal: - -```bash - Starting webserver... - Visit http://0.0.0.0:8001 in your web browser. - Press CTRL-C to exit. -``` - -Head over to your web browser and go to http://localhost:8001 - -Congratulations! You have successfully installed RedisInsight and are now ready to inspect your Redis data, monitor database health, and perform runtime server configuration with this browser-based management interface for your Redis deployment. - -#### Step 4. Connect to a Redis database - -![My Image](redisinsight3.png) - -Assuming that you already have Redis database up and running, select "Connect to a Redis database" -![My Image](redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](redisinsight5.png) - -#### Step 5. Run the Redis CLI - -Finally, although RedisInsight is a great GUI, sometimes you want to work directly in the command-line interface (CLI). To do so, click “CLI” in the menu on the left side of the RedisInsight UI: - -![My Image](redisinsight7.png) - -Then paste the appropriate Redis commands in the command section, marked with “>>” as shown below, and press Enter. - -![My Image](redisinsight9.png) - -You can see the output displayed at the top of the screen. If it says “OK,” the command was executed successfully. - - - - - -#### Using Kubernetes - -#### Step 1. Create the RedisInsight deployment and service - -Below is an annotated YAML file that will create a RedisInsight deployment and a service in a k8s cluster. - -#### Step 2. Create a new file redisinsight.yaml with the content below - -``` - # RedisInsight service with name 'redisinsight-service' - apiVersion: v1 - kind: Service - metadata: - name: redisinsight-service # name should not be 'redisinsight' - # since the service creates - # environment variables that - # conflicts with redisinsight - # application's environment - # variables `REDISINSIGHT_HOST` and - # `REDISINSIGHT_PORT` - spec: - type: LoadBalancer - ports: - - port: 80 - targetPort: 8001 - selector: - app: redisinsight ---- - ### RedisInsight deployment with name 'redisinsight' - - apiVersion: apps/v1 - kind: Deployment - metadata: - name: redisinsight #deployment name - labels: - app: redisinsight #deployment label - spec: - replicas: 1 #a single replica pod - selector: - matchLabels: - app: redisinsight #which pods is the deployment managing, as defined by the pod template - template: #pod template - metadata: - labels: - app: redisinsight #label for pod/s - spec: - containers: - - - name: redisinsight #Container name (DNS_LABEL, unique) - image: redislabs/redisinsight:1.7.0 #repo/image - imagePullPolicy: IfNotPresent #Always pull image - volumeMounts: - - name: db #Pod volumes to mount into the container's filesystem. Cannot be updated. - mountPath: /db - ports: - - containerPort: 8001 #exposed conainer port and protocol - protocol: TCP - volumes: - - name: db - emptyDir: {} # node-ephemeral volume https://kubernetes.io/docs/concepts/storage/volumes/#emptydir -``` - -#### Step 3. 
Create the RedisInsight deployment and service - -``` -kubectl apply -f redisinsight.yaml -``` - -#### Step 4. Accessing RedisInsight - -Once the deployment and service are successfully applied and complete, you should be able to access RedisInsight. -This can be accomplished by listing the service we created and using its external IP address to reach RedisInsight. - -``` -$ kubectl get svc redisinsight-service -NAME CLUSTER-IP EXTERNAL-IP PORT(S) AGE -redisinsight-service 80:32143/TCP 1m -``` - -If you are using minikube, run minikube service list to list the services and access RedisInsight at http://minikube-ip:minikube-service-port. - -``` -$ minikube service list -|-------------|----------------------|--------------|---------------------------------------------| -| NAMESPACE | NAME | TARGET PORT | URL | -|-------------|----------------------|--------------|---------------------------------------------| -| default | kubernetes | No node port | | -| default | redisinsight-service | 80 | http://minikube-ip:minikubeservice-port | -| kube-system | kube-dns | No node port | | -|-------------|----------------------|--------------|---------------------------------------------| -``` - -#### Step 5. Create the RedisInsight deployment without a service. - -Below is an annotated YAML file that will create a RedisInsight deployment in a K8s cluster. - -#### Create a new file redisinsight.yaml with the content below - -``` -apiVersion: apps/v1 -kind: Deployment -metadata: - name: redisinsight #deployment name - labels: - app: redisinsight #deployment label -spec: - replicas: 1 #a single replica pod - selector: - matchLabels: - app: redisinsight #which pods the deployment manages, as defined by the pod template - template: #pod template - metadata: - labels: - app: redisinsight #label for pod/s - spec: - containers: - - name: redisinsight #Container name (DNS_LABEL, unique) - image: redislabs/redisinsight:1.7.0 #repo/image - imagePullPolicy: IfNotPresent #pull the image only if it is not already present - env: - # If there's a service named 'redisinsight' that exposes the - # deployment, we manually set `REDISINSIGHT_HOST` and - # `REDISINSIGHT_PORT` to override the service environment - # variables. - - name: REDISINSIGHT_HOST - value: "0.0.0.0" - - name: REDISINSIGHT_PORT - value: "8001" - volumeMounts: - - name: db #Pod volumes to mount into the container's filesystem. Cannot be updated. - mountPath: /db - ports: - - containerPort: 8001 #exposed container port and protocol - protocol: TCP - livenessProbe: - httpGet: - path: /healthcheck/ # exposed RI endpoint for healthcheck - port: 8001 # exposed container port - initialDelaySeconds: 5 # number of seconds to wait after the container starts to perform liveness probe - periodSeconds: 5 # period in seconds after which liveness probe is performed - failureThreshold: 1 # number of liveness probe failures after which container restarts - volumes: - - name: db - emptyDir: {} # node-ephemeral volume https://kubernetes.io/docs/concepts/storage/volumes/#emptydir -``` - -#### Step 6. Create the RedisInsight deployment - -``` -kubectl apply -f redisinsight.yaml -``` - -Note - If the deployment will be exposed by a service whose name is ‘redisinsight’, set REDISINSIGHT_HOST and REDISINSIGHT_PORT environment variables to override the environment variables created by the service. - -Once the deployment has been successfully applied and is complete, you can access RedisInsight. 
This can be accomplished by exposing the deployment as a K8s Service or by using port forwarding, as in the example below: - -``` -kubectl port-forward deployment/redisinsight 8001 -``` - -Open your browser and point to http://localhost:8001 - -Congratulations! You have successfully installed RedisInsight and are now ready to inspect your Redis data, monitor database health, and perform runtime server configuration with this browser-based management interface for your Redis deployment. - -![My Image](redisinsight3.png) - -Assuming that you already have Redis database up and running, select "Connect to a Redis database" -![My Image](redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](redisinsight5.png) - -#### Step 7. Run the Redis CLI - -Finally, although RedisInsight is a great GUI, sometimes you want to work directly in the command-line interface (CLI). To do so, click “CLI” in the menu on the left side of the RedisInsight UI: - -![My Image](redisinsight7.png) - -Then paste the appropriate Redis commands in the command section, marked with “>>” as shown below, and press Enter. - -![My Image](redisinsight9.png) - -You can see the output displayed at the top of the screen. If it says “OK,” the command was executed successfully. - - - - -### RedisInsight Overview (RedisConf'21) - -
- -
- -### Further References - -- [Debugging Redis using Slowlog Configuration under RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Manage Your Redis Cluster using RedisInsight](/explore/redisinsight/cluster) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/getting-started/launchpad.png b/docs/explore/redisinsight/getting-started/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/getting-started/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight-linux.png b/docs/explore/redisinsight/getting-started/redisinsight-linux.png deleted file mode 100644 index 3406cf17c61..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight-linux.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight-windows.png b/docs/explore/redisinsight/getting-started/redisinsight-windows.png deleted file mode 100644 index 0a2a8b70300..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight-windows.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight.gif b/docs/explore/redisinsight/getting-started/redisinsight.gif deleted file mode 100644 index c1a7da6f075..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight.gif and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight2.png b/docs/explore/redisinsight/getting-started/redisinsight2.png deleted file mode 100644 index 5c6ef2fef14..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight2.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight3.png b/docs/explore/redisinsight/getting-started/redisinsight3.png deleted file mode 100644 index 33c9f8d950b..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight3.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight4.png b/docs/explore/redisinsight/getting-started/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight5.png b/docs/explore/redisinsight/getting-started/redisinsight5.png deleted file mode 100644 index bb716253bed..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight5.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight7.png b/docs/explore/redisinsight/getting-started/redisinsight7.png deleted file mode 100644 index dfa2dcc6969..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight7.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight8.png b/docs/explore/redisinsight/getting-started/redisinsight8.png deleted file mode 100644 index f2d04e964a6..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsight8.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsight9.png b/docs/explore/redisinsight/getting-started/redisinsight9.png deleted file mode 100644 index 3a233149fc6..00000000000 Binary files 
a/docs/explore/redisinsight/getting-started/redisinsight9.png and /dev/null differ diff --git a/docs/explore/redisinsight/getting-started/redisinsightmac.png b/docs/explore/redisinsight/getting-started/redisinsightmac.png deleted file mode 100644 index 5cca4c45a0a..00000000000 Binary files a/docs/explore/redisinsight/getting-started/redisinsightmac.png and /dev/null differ diff --git a/docs/explore/redisinsight/index-redisinsight.mdx b/docs/explore/redisinsight/index-redisinsight.mdx deleted file mode 100644 index f9a473eafec..00000000000 --- a/docs/explore/redisinsight/index-redisinsight.mdx +++ /dev/null @@ -1,137 +0,0 @@ ---- -id: index-redisinsight -title: RedisInsight Developer Hub for Redis Interactive Tutorials -sidebar_label: Overview -slug: /explore/redisinsight ---- - -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; - -
diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image1.png b/docs/explore/redisinsight/memoryanalyzer/images/image1.png deleted file mode 100644 index 356a9b1b8e4..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image10.png b/docs/explore/redisinsight/memoryanalyzer/images/image10.png deleted file mode 100644 index 3ac49eb68e1..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image11.png b/docs/explore/redisinsight/memoryanalyzer/images/image11.png deleted file mode 100644 index 51a00624262..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image12.png b/docs/explore/redisinsight/memoryanalyzer/images/image12.png deleted file mode 100644 index 3ac49eb68e1..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image13.png b/docs/explore/redisinsight/memoryanalyzer/images/image13.png deleted file mode 100644 index 536b5b9eeb4..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image14.png b/docs/explore/redisinsight/memoryanalyzer/images/image14.png deleted file mode 100644 index 51a00624262..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image15.png b/docs/explore/redisinsight/memoryanalyzer/images/image15.png deleted file mode 100644 index d9be37a1431..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image2.png b/docs/explore/redisinsight/memoryanalyzer/images/image2.png deleted file mode 100644 index e38426a796c..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image3.png b/docs/explore/redisinsight/memoryanalyzer/images/image3.png deleted file mode 100644 index 1edc241c954..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image4.png b/docs/explore/redisinsight/memoryanalyzer/images/image4.png deleted file mode 100644 index e233424a5ac..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image5.png b/docs/explore/redisinsight/memoryanalyzer/images/image5.png deleted file mode 100644 index 540fd65791d..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image6.png b/docs/explore/redisinsight/memoryanalyzer/images/image6.png deleted file mode 100644 index 19a6f7c56be..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image7.png 
b/docs/explore/redisinsight/memoryanalyzer/images/image7.png deleted file mode 100644 index 34d54ae4896..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image8.png b/docs/explore/redisinsight/memoryanalyzer/images/image8.png deleted file mode 100644 index 3320aa1b31f..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/image9.png b/docs/explore/redisinsight/memoryanalyzer/images/image9.png deleted file mode 100644 index 7e18e9f5150..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/redisinsight4.png b/docs/explore/redisinsight/memoryanalyzer/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/images/redisinsightinstall.png b/docs/explore/redisinsight/memoryanalyzer/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/memoryanalyzer/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/memoryanalyzer/index-memoryanalyzer.mdx b/docs/explore/redisinsight/memoryanalyzer/index-memoryanalyzer.mdx deleted file mode 100644 index f074ee21fc2..00000000000 --- a/docs/explore/redisinsight/memoryanalyzer/index-memoryanalyzer.mdx +++ /dev/null @@ -1,200 +0,0 @@ ---- -id: index-memoryanalyzer -title: Optimize & Analyze Redis using RedisInsight Memory Analyzer Tool -sidebar_label: Optimize & Analyze Redis using RedisInsight Memory Analyzer Tool -slug: /explore/redisinsight/memoryanalyzer -authors: [ajeet] --- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight, a free web-based GUI management interface for Redis, offers several tools to manage and optimize Redis, with the main focus on memory optimization. RedisInsight Memory Analysis helps you analyze your Redis instance, reduce memory usage, and improve application performance. - -Let’s assume that you noticed high memory usage on your Redis instance and you want to analyze a specific cache/key pattern, which may be consuming most of the memory or not using Redis in a very efficient way. RedisInsight is a great tool that can help you analyze memory used by Redis through keys or key patterns, expiry, data types, or the instance’s internal encoding. Analysis can be done in two ways—online and offline mode (discussed later in the tutorial). - -This tutorial demonstrates the following features of RedisInsight: - -- Memory Overview -- Keyspace Summary -- Recommendations -- Memory Analyzer - -Follow the steps below to see how memory analysis can be performed over Redis Database (RDB) dumps using RedisInsight. - -## Step 1: Create a Redis database - -[Follow this link](https://developer.redis.com/create) to create a Redis database - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -:::important - -If you're using the RedisInsight Docker container, you might need to follow the steps below to add a mount point so as to access the RDB dumps. -::: -It is recommended to add a volume mount point using the ”-v” parameter as shown below: - -```bash - mkdir memorytest - docker run -d -v /temp/memorytest:/db -p 8001:8001 redislabs/redisinsight:latest -``` - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" - -![My Image](images/redisinsight4.png) - -## Step 3: Connect to Redis database - -![alt_text](images/image2.png) - -## Step 4: Store Redis user sample datasets - -```bash - git clone https://github.com/redis-developer/redis-datasets - cd redis-datasets/user-database -``` - -Execute the below command to import the user database: - -```bash - redis-cli -h 192.168.1.9 -p 6379 < ./import_users.redis - .. - .. - "5996 Users Created" -``` - -Under the RedisInsight GUI, click on “Memory Analyzer” under Browser. -Before you click on “Analysis,” ensure that you store dump.rdb at the right location. -If you’re on Mac, the Redis dump file is located under /usr/local/var/db/redis - -```bash - tree /usr/local/var/db - /usr/local/var/db - └── redis - └── dump.rdb - - 1 directory, 1 file - -``` - -Copy the dump.rdb file to the /memorytest/ location. You can also verify that the file is available in the container by using the docker exec command. - -```bash - ls - bulk_operation dump.rdb queries.log redisinsight.log - dropbox profiler_logs redisinsight.db rsnaps -``` - -Memory analysis in RedisInsight is done in two different ways: - -- **Online mode** - In this mode, RedisInsight downloads an RDB file from your connected Redis instance and analyzes it to create a temp file with all the keys and metadata required for analysis. -- **Offline mode** - In this mode, RedisInsight analyzes your Redis backup files. These files can either be present on your system or on S3. RedisInsight accepts a list of RDB files given to it and analyzes all the information required from these files instead of downloading it from your Redis instance. - -Choose the offline analysis approach if you have an RDB backup file that you want to analyze. - -![alt_text](images/image4.png) - -Enter the right location of the RDB backup file as shown below: - -![alt_text](images/image5.png) - -Click “Proceed”. - -![alt_text](images/image6.png) - -:::important - -If you are using online memory analysis, you will want to have enough space to store the RDB file for your Redis database. This is usually 10-50% of the Redis instance’s memory usage. - -::: - -## Keyspace Summary - -Keyspace Summary identifies the top key patterns from the set of keys in descending order of memory. This helps you identify which key patterns are consuming most of your memory and what the top keys for that pattern are. You can add your own key patterns in order to identify their memory usage and the top keys for that key pattern. - -Click “Keyspace” and you will see the total memory consumed by each of the top key patterns, as shown below: - -![alt_text](images/image7.png) - -You can click on each of these key patterns (as shown above) and check the memory usage. - -## Recommendations - -RedisInsight provides recommendations on how you can save your memory. 
The recommendations are specially curated according to your Redis instance and are based on industry standards. - -:::important - -Combine Small Strings to Hash - -Small key-value pairs in Redis consume a lot of memory because a top-level key has several overheads. If you do not need an expiry on these keys, you can combine multiple keys into a larger hash. As long as the hash has fewer than 512 elements and the size of each element is within 64 bytes, you will save a significant amount of memory. -Read Instagram's blog post on how they used this technique to save memory. -Once you combine strings into a larger hash, evaluate the hash-max-ziplist-entries and hash-max-ziplist-value settings. You may want to increase them to save even more memory. - -Performance comes with a cost. By converting strings to a hash, we save a lot of memory because the hash stores only the string value and no extra per-key information such as idle time, expiration, object reference count, and encoding. But if we want a key with an expiration value, we can't associate it with a hash structure, as expiration is not available on hash fields. Read More -::: - -Let us add 1 million keys using the redis-benchmark tool. - -Fill 127.0.0.1:6379 with about 1 million keys using only the SET test: - -```bash - redis-benchmark -t set -n 1000000 -r 100000000 - … - 100.000% <= 1011.199 milliseconds (cumulative count 1000000) - -Summary: - throughput summary: 13451.89 requests per second - latency summary (msec): - avg min p50 p95 p99 max - 3.600 0.616 3.007 5.591 7.567 1011.199 - -``` - -Click on Recommendations and you will see the below message: - -:::important - -Avoid unnecessary Long Keys - -You can save some memory by avoiding large keys. In general, you should always prefer descriptive keys. This recommendation only applies if you have millions of very large keys. In a well-written application, switching to shorter keys usually involves updating a few constant strings in the application code. -The trade-off of converting large keys to smaller keys is that large keys are more descriptive than shortened keys, so when reading through your database you may find the keys less relatable. Read More -Key patterns that need to be modified: -key:\* - -::: - -![alt_text](images/image11.png) - -## Memory Analyzer - -Memory Analyzer lets you search for a key or key pattern and get related information along with other stats. You can apply various filters and aggregations using the advanced filters feature. - -When the `analyze-memory` button is clicked, RedisInsight connects to the Redis instance and takes a point-in-time snapshot of the database. [Here’s a link](https://docs.redis.com/latest/ri/using-redisinsight/memory-analysis/#how-memory-analysis-works) that dives deep into the SYNC and DUMP approaches. - -With the 1 million keys created using the SET command, open the Memory Analyzer section and click “Advanced Filters”. - -![alt_text](images/image13.png) - -The advanced filters allow you to choose data type, encoding, and memory, group by data type or encoding, aggregate, etc. 
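If you want to cross-check what Memory Analyzer reports for an individual key from your own code, the `MEMORY USAGE` command returns a per-key size estimate in bytes. A minimal sketch using redis-py; the key name is just an example of the keys generated by the redis-benchmark run above:

```python
import redis

r = redis.Redis(host="127.0.0.1", port=6379)

# MEMORY USAGE reports an estimate (in bytes) of the RAM a single key consumes,
# including internal overhead. samples=0 inspects every element for an exact figure.
print(r.memory_usage("key:000000000001", samples=0))
```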
- -![alt_text](images/image14.png) - -You can check memory usage by key: - -![alt_text](images/image15.png) - -## Additional Links - -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Cluster Management with ease using RedisInsight](/explore/redisinsight/cluster) diff --git a/docs/explore/redisinsight/profiler/images/image1.png b/docs/explore/redisinsight/profiler/images/image1.png deleted file mode 100644 index fcccc281a76..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image2.png b/docs/explore/redisinsight/profiler/images/image2.png deleted file mode 100644 index 09f2b839907..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image3.png b/docs/explore/redisinsight/profiler/images/image3.png deleted file mode 100644 index 4a1ed266065..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image4.png b/docs/explore/redisinsight/profiler/images/image4.png deleted file mode 100644 index 8dac36f66c5..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image5.png b/docs/explore/redisinsight/profiler/images/image5.png deleted file mode 100644 index ef3ebd1283a..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image6.png b/docs/explore/redisinsight/profiler/images/image6.png deleted file mode 100644 index cc78132e6bf..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/image7.png b/docs/explore/redisinsight/profiler/images/image7.png deleted file mode 100644 index efc28b44d1d..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/redisinsight4.png b/docs/explore/redisinsight/profiler/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/images/redisinsightinstall.png b/docs/explore/redisinsight/profiler/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/profiler/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/profiler/index-profiler.mdx b/docs/explore/redisinsight/profiler/index-profiler.mdx deleted file mode 100644 index 2c5aa6ac329..00000000000 --- a/docs/explore/redisinsight/profiler/index-profiler.mdx +++ /dev/null @@ -1,287 +0,0 @@ ---- -id: index-profiler -title: RedisInsight Profiler Tool - Analyze Your Redis Commands Using Redis Monitor Command -sidebar_label: RedisInsight Profiler Tool - Analyze Your Redis Commands Using Redis Monitor Command -slug: /explore/redisinsight/profiler -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from 
'@site/src/theme/RedisCard'; - -The RedisInsight profiler analyzes the Redis commands that are being run on the Redis server in real time. The tool provides you with detailed information about the number of commands processed, commands per second, and the number of connected clients. It also gives information about top prefixes, top keys, and top commands. - -It basically runs the Redis MONITOR command and generates a summarized view. MONITOR is a debugging command that streams back every command processed by the Redis server. It can help in understanding what is happening to the database. This command can be used via both `redis-cli` and `telnet`. All the commands sent to the Redis instance are monitored for the duration of the profiling. The ability to see all the requests processed by the server is useful in order to spot bugs in an application, both when using Redis as a database and as a distributed caching system. - -:::important - -Because MONITOR streams back all commands, its use comes at a cost. Running the MONITOR command is dangerous to the performance of your production server, hence the profiler runs for a maximum of 5 minutes if the user has not stopped it earlier. This is to avoid overloading the Redis server. - -::: - -Follow the instructions below to test drive the RedisInsight profiler tool: - -## Step 1. Create a Redis database with the RedisTimeSeries module enabled - -Visit [https://developer.redis.com/create/rediscloud](https://developer.redis.com/create/rediscloud) and create a Redis database. [Follow these steps to enable the RedisTimeSeries module](https://developer.redis.com/howtos/redistimeseries) on Redis Enterprise Cloud. - -![alt_text](images/image1.png) - -You can use the Redis CLI to connect to the remote Redis Enterprise Cloud database. You can check memory usage with the Redis `INFO` command. - -:::info TIP -RedisInsight allows you to add a Redis Sentinel database too. Refer to the [documentation](https://docs.redis.com/latest/ri/using-redisinsight/add-instance/#add-a-redis-sentinel-database) to learn more. -::: - -## Step 2: Download RedisInsight - -:::note TIP -RedisInsight v2.0 is an open source visual tool that lets you do both GUI- and CLI-based interactions with your Redis database. It is a desktop manager that provides an intuitive and efficient GUI for Redis, allowing you to interact with your databases, monitor, and manage your data. - -[Refer to these tutorials](/explore/redisinsightv2/) to learn more about this latest release. -::: -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3. Cloning the GitHub repo - -We will be using a Python script to fetch sensor data from one of the IoT Edge sensor devices (such as BME680 sensors) and then push the sensor values to the Redis Cloud database. 
- -``` -$ git clone https://github.com/redis-developer/redis-datasets/tree/master/redistimeseries -cd redistimeseries/realtime-sensor-jetson -``` - -``` -import bme680 -import time -import datetime -import csv -import argparse -import redis - - -print("""read-sensor.py - Displays temperature, pressure, humidity, and gas. -Press Ctrl+C to exit! -""") - -try: - sensor = bme680.BME680(bme680.I2C_ADDR_PRIMARY) -except IOError: - sensor = bme680.BME680(bme680.I2C_ADDR_SECONDARY) - -# These calibration data can safely be commented -# out, if desired. - -print('Calibration data:') -for name in dir(sensor.calibration_data): - - if not name.startswith('_'): - value = getattr(sensor.calibration_data, name) - - if isinstance(value, int): - print('{}: {}'.format(name, value)) - -# These oversampling settings can be tweaked to -# change the balance between accuracy and noise in -# the data. - -sensor.set_humidity_oversample(bme680.OS_2X) -sensor.set_pressure_oversample(bme680.OS_4X) -sensor.set_temperature_oversample(bme680.OS_8X) -sensor.set_filter(bme680.FILTER_SIZE_3) -sensor.set_gas_status(bme680.ENABLE_GAS_MEAS) - -print('\n\nInitial reading:') -for name in dir(sensor.data): - value = getattr(sensor.data, name) - - if not name.startswith('_'): - print('{}: {}'.format(name, value)) - -sensor.set_gas_heater_temperature(320) -sensor.set_gas_heater_duration(150) -sensor.select_gas_heater_profile(0) - -# Up to 10 heater profiles can be configured, each -# with their own temperature and duration. -# sensor.set_gas_heater_profile(200, 150, nb_profile=1) -# sensor.select_gas_heater_profile(1) - - -parser = argparse.ArgumentParser() -parser.add_argument("--port", type=int, - help="redis instance port", default=6379) -parser.add_argument( - "--password", type=int, help="redis instance password", default=None -) -parser.add_argument( - "--verbose", help="enable verbose output", action="store_true") -parser.add_argument("--host", type=str, - help="redis instance host", default="127.0.0.1") - - -args = parser.parse_args() - -# redis setup -redis_obj = redis.Redis(host=args.host, port=args.port, password=args.password) -temperature_key = "ts:temperature" -pressure_key = "ts:pressure" -humidity_key = "ts:humidity" - -print('\n\nPolling:') -try: - while True: - if not sensor.get_sensor_data(): - print('Can not access sensor data') - continue - - output = '{0:.2f} C,{1:.2f} hPa,{2:.2f} %RH'.format( - sensor.data.temperature, - sensor.data.pressure, - sensor.data.humidity) - - if not sensor.data.heat_stable: - print('Heat unstable: ' + output) - continue - - print('{0},{1} Ohms'.format( - output, - sensor.data.gas_resistance)) - - date = datetime.datetime.now() - timestamp = int(date.timestamp() * 1000) - - # Create pipeline - pipe = redis_obj.pipeline() - - pipe.execute_command( - "ts.add", temperature_key, timestamp, sensor.data.temperature - ) - - pipe.execute_command( - "ts.add", pressure_key, timestamp, sensor.data.pressure - ) - - pipe.execute_command("ts.add", humidity_key, - timestamp, sensor.data.humidity) - - # Execute pipeline - pipe.execute() - - time.sleep(1) - -except KeyboardInterrupt: - pass - -``` - -The complete walkthrough of this python script is explained [here](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/). 
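To sanity-check the values the script writes, you can read back the latest sample for each series with `TS.GET`. This is a small sketch and not part of the original walkthrough; the host, port, and key names mirror the script above:

```python
import redis

# Use the same endpoint details that you pass to the sensor script.
r = redis.Redis(host="127.0.0.1", port=6379, password=None)

# TS.GET returns the most recent [timestamp, value] pair of a time series key.
for key in ("ts:temperature", "ts:pressure", "ts:humidity"):
    timestamp, value = r.execute_command("TS.GET", key)
    print(f"{key}: {float(value):.2f} at {timestamp}")
```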
- -## Step 4: Execute the sensor script - -Let us execute the script using the command line: - -``` -$ sudo python3 sensorloader2.py --host Endpoint_of_Redis_enterprise_Cloud --port port -``` - -Run the MONITOR command to verify whether sensor values are being fetched. (Don’t run this command in a production environment.) - -``` -redis-17316.c251.east-us-mz.azure.cloud.redislabs.com:17316> monitor -OK -1622212328.833139 [0 122.171.186.213:59471] "monitor" -1622212329.865158 [0 70.167.220.160:50378] "MULTI" -1622212329.865158 [0 70.167.220.160:50378] "ts.add" "ts:temperature" "1622212329847" "35.67" -1622212329.865158 [0 70.167.220.160:50378] "ts.add" "ts:pressure" "1622212329847" "957.52" -1622212329.865158 [0 70.167.220.160:50378] "ts.add" "ts:humidity" "1622212329847" "11.111" -1622212329.865158 [0 70.167.220.160:50378] "EXEC" -1622212330.941178 [0 70.167.220.160:50378] "MULTI" -1622212330.941178 [0 70.167.220.160:50378] "ts.add" "ts:temperature" "1622212330920" "35.68" -1622212330.941178 [0 70.167.220.160:50378] "ts.add" "ts:pressure" "1622212330920" "957.51" -1622212330.941178 [0 70.167.220.160:50378] "ts.add" "ts:humidity" "1622212330920" "11.111" -1622212330.941178 [0 70.167.220.160:50378] "EXEC" - -``` - -## Step 5: Accessing the RedisTimeSeries keys - -[Follow these steps to connect to the database](https://developer.redis.com/explore/redisinsight/getting-started) using RedisInsight. Once you are connected to the RedisInsight GUI, you can verify the 3 RedisTimeSeries keys: - -- ts:temperature -- ts:pressure -- ts:humidity - -![alt_text](images/image2.png) - -## Step 6: Running RedisTimeSeries-specific queries - -![alt_text](images/image3.png) - -Please note that in RedisTimeSeries, only [TS.RANGE](https://oss.redis.com/redistimeseries/commands/#tsrangetsrevrange) and [TS.MRANGE](https://oss.redis.com/redistimeseries/commands/#tsmrangetsmrevrange) are supported as of the current release. In the next release, TS.REVRANGE and TS.MREVRANGE will be supported too. - -![alt_text](images/image4.png) - -## Step 7. Initiate the Profiler - -Click “Start Profiler” while sensor data is continuously being pushed to the Redis database. - -![alt_text](images/image5.png) - -Let the profiler tool run for the next 1-2 minutes. 
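While the profiler is capturing commands, keep issuing queries so there is traffic to summarize. For example, a range query that averages the temperature series into one-minute buckets might look like the sketch below; the bucket size and connection details are illustrative and not part of the original walkthrough:

```python
import redis

# Replace with your Redis Cloud endpoint, port, and password.
r = redis.Redis(host="Endpoint_of_Redis_enterprise_Cloud", port=6379, password="password")

# TS.RANGE over the full series ("-" to "+"), averaged into 60000 ms (one-minute) buckets.
samples = r.execute_command("TS.RANGE", "ts:temperature", "-", "+", "AGGREGATION", "avg", 60000)
for timestamp, value in samples:
    print(timestamp, float(value))
```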
- -![alt_text](images/image6.png) - -Stop the profiler to see the results as shown below: - -![alt_text](images/image7.png) - -Hence, the profiler provides the below statistical details: - -- How many commands were processed -- Number of connected clients -- Rate at which the commands were executed -- Top key patterns (key patterns followed by number of commands) -- Top Keys -- Top Commands & their frequency - -## Additional Links - -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Usage and Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) -- [RedisInsight Release Notes](https://docs.redis.com/latest/ri/release-notes/) -- [Debug Redis using RedisInsight Slow log Debugging Tool](https://developer.redis.com/explore/redisinsight/slowlog) - -## - - diff --git a/docs/explore/redisinsight/profiler/launchpad.png b/docs/explore/redisinsight/profiler/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/profiler/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/.index-redisearch.mdx.swo b/docs/explore/redisinsight/redisearch/.index-redisearch.mdx.swo deleted file mode 100644 index 126bf4d5b07..00000000000 Binary files a/docs/explore/redisinsight/redisearch/.index-redisearch.mdx.swo and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image1.png b/docs/explore/redisinsight/redisearch/images/image1.png deleted file mode 100644 index e96ca1fc0e1..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image10.png b/docs/explore/redisinsight/redisearch/images/image10.png deleted file mode 100644 index 936478f68d0..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image11.png b/docs/explore/redisinsight/redisearch/images/image11.png deleted file mode 100644 index 962dabff13f..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image12.png b/docs/explore/redisinsight/redisearch/images/image12.png deleted file mode 100644 index f821a14e58a..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image13.png b/docs/explore/redisinsight/redisearch/images/image13.png deleted file mode 100644 index ed8e1833753..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image14.png b/docs/explore/redisinsight/redisearch/images/image14.png deleted file mode 100644 index 484da482f9a..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image15.png b/docs/explore/redisinsight/redisearch/images/image15.png deleted file mode 100644 index c1353e12c76..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image16.png b/docs/explore/redisinsight/redisearch/images/image16.png deleted file mode 100644 index a0600a704dc..00000000000 
Binary files a/docs/explore/redisinsight/redisearch/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image17.png b/docs/explore/redisinsight/redisearch/images/image17.png deleted file mode 100644 index 9c411ed6cc3..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image18.png b/docs/explore/redisinsight/redisearch/images/image18.png deleted file mode 100644 index 479564f3b41..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image19.png b/docs/explore/redisinsight/redisearch/images/image19.png deleted file mode 100644 index f531bd42ac6..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image2.png b/docs/explore/redisinsight/redisearch/images/image2.png deleted file mode 100644 index 809ff4d0f60..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image20.png b/docs/explore/redisinsight/redisearch/images/image20.png deleted file mode 100644 index 713b3cf0e19..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image21.png b/docs/explore/redisinsight/redisearch/images/image21.png deleted file mode 100644 index 42c1c28444e..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image21.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image22.png b/docs/explore/redisinsight/redisearch/images/image22.png deleted file mode 100644 index 0e2b9560682..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image22.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image23.png b/docs/explore/redisinsight/redisearch/images/image23.png deleted file mode 100644 index 89e1cc1c0c2..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image23.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image3.png b/docs/explore/redisinsight/redisearch/images/image3.png deleted file mode 100644 index f6ed510852a..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image4.png b/docs/explore/redisinsight/redisearch/images/image4.png deleted file mode 100644 index feb8371a56f..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image5.png b/docs/explore/redisinsight/redisearch/images/image5.png deleted file mode 100644 index 3cade2d1e88..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image6.png b/docs/explore/redisinsight/redisearch/images/image6.png deleted file mode 100644 index 6a101ea6f33..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image7.png b/docs/explore/redisinsight/redisearch/images/image7.png deleted file mode 100644 index 
d6e435d383e..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image8.png b/docs/explore/redisinsight/redisearch/images/image8.png deleted file mode 100644 index 8112bd39d50..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/image9.png b/docs/explore/redisinsight/redisearch/images/image9.png deleted file mode 100644 index 00f6799f93a..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/redisinsight4.png b/docs/explore/redisinsight/redisearch/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/images/redisinsightinstall.png b/docs/explore/redisinsight/redisearch/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/redisearch/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisearch/index-redisearch.mdx b/docs/explore/redisinsight/redisearch/index-redisearch.mdx deleted file mode 100644 index ecfc4e82ddf..00000000000 --- a/docs/explore/redisinsight/redisearch/index-redisearch.mdx +++ /dev/null @@ -1,375 +0,0 @@ ---- -id: index-redisearch -title: Perform Database Search and Analytics using RediSearch Browser Tool -sidebar_label: Perform Database Search and Analytics using RediSearch Browser Tool -slug: /explore/redisinsight/redisearch -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -A full-featured pure desktop GUI client, RedisInsight supports RediSearch. [RediSearch](https://oss.redis.com/redisearch/) is a powerful indexing, querying, and full-text search engine for Redis. It is one of the most mature and feature-rich Redis modules.With RedisInsight, the below functionalities are possible - -- Multi-line for building queries -- Added ability to submit query with ‘ctrl + enter’ in single line mode -- Better handling of long index names in index selector dropdown -- Fixed bug with pagination on queries with whitespace in the query string -- Support Aggregation -- Support Fuzzy logic -- Simple and complex conditions -- Sorting -- Pagination -- Counting - -RediSearch allows you to quickly create indexes on datasets (Hashes), and uses an incremental indexing approach for rapid index creation and deletion. The indexes let you query your data at lightning speed, perform complex aggregations, and filter by properties, numeric ranges, and geographical distance. - -### Step 1. Create Redis database - -[Follow this link to create Redis database using Docker container ](/explore/redismod)that comes with RediSearch module enabled - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. 
After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -![alt_text](images/image1.png) - -We will look at 2 datasets - one is OpenBeerDB and other is Movie datasets. -Let us begin with OpenBeerDB sample dataset. - -### Step 3. OpenBeerDB sample dataset - -To demonstrate RediSearch, we will use OpenbeerDB dataset. The dataset is available publicly for general public under [openbeerdb.com](https://openbeerdb.com) - -![alt_text](images/image2.png) - -Let us clone the repository to access the dataset: - -``` -$ git clone https://github.com/redis-developer/redis-datasets -cd redis-datasets/redisearch/openbeerdb -``` - -### Step 4. Installing prerequisite packages - -``` -$ brew install python3 -$ pip3 install -r requirements.txt -``` - -### Step 5. Importing the data - -``` -$ python3 import.py --url redis://localhost:6379 -Importing categories… -Importing styles... -Importing breweries... -Adding beer data to RediSearch.. - -``` - -### Step 6: Choose “RediSearch” under RedisInsight browser tool - -![alt_text](images/image3.png) - -Run the below query: - -``` -"@abv:[5 6]" -``` - -![alt_text](images/image4.png) - -You can click on “{:} “ to get a JSON view as shown below: - -![alt_text](images/image5.png) - -![alt_text](images/image6.png) - -You can download the data in CSV format. - -![alt_text](images/image7.png) - -![alt_text](images/image8.png) - -#### Query: All beers with ABV higher than 5% but lower than 6% - -The beers are added to the RediSearch index weighted by ABV. So by default, the results will be ordered by ABV highest to lowest. Both ABV and IBU are sortable, so you can order results by either of these fields using sortby in the query - -![alt_text](images/image9.png) - -#### Query: All beers with ABV higher than 5% but lower than 6% within the specified limits - -``` -"@abv:[5 6]" limit 0 100 -``` - -![alt_text](images/image10.png) - -#### Query: Find out Irish Ale and German Ale beers with ABV greater than 9%: - -![alt_text](images/image11.png) - -### Step 7. Using AGGREGATION - -Aggregations are a way to process the results of a search query, group, sort and transform them - and extract analytic insights from them. Much like aggregation queries in other databases and search engines, they can be used to create analytics reports, or perform Faceted Search style queries. - -For example, indexing a web-server's logs, we can create a report for unique users by hour, country or any other breakdown; or create different reports for errors, warnings, etc. - -Let's run the aggregation query - -``` -FT.AGGREGATE "beerIdx" "@abv:[5 6]" limit 0 1060 GROUPBY 1 @breweryid -``` - -![alt_text](images/image12.png) - -Let us look at Movie sample dataset too. - -### Step 8. Create Redis database - -[Follow this link to create Redis database using Docker container ](/explore/redismod)that comes with RediSearch module enabled - -### Step 9. Install RedisInsight - -[Follow this link](/explore/redisinsight/getting-started) to setup RedisInsight locally in your system - -![alt_text](images/image1.png) - -### Step 10. Movie Sample Database - -In this project you will use a simple dataset describing movies, for now, all records are in English. You will learn more about other languages in another tutorial. 
- -A movie is represented by the following attributes: - -- **`movie_id`** : The unique ID of the movie, internal to this database. -- **`title`** : The title of the movie. -- **`plot`** : A summary of the movie. -- **`genre`** : The genre of the movie; for now a movie will only have a single genre. -- **`release_year`** : The year the movie was released, as a numerical value. -- **`rating`** : A numeric value representing the public's rating for this movie. -- **`votes`** : Number of votes. -- **`poster`** : Link to the movie poster. -- **`imdb_id`** : ID of the movie in the [IMDB](https://imdb.com) database. - -#### Key and Data structure - -As a Redis developer, one of the first things to look at when building your application is the structure of the keys and data (data design/data modeling). - -A common way of defining the keys in Redis is to use specific patterns in them. For example, in this application, where the database will probably deal with various business objects (movies, actors, theaters, users, ...), we can use the following pattern: - -- `business_object:key` - -For example: - -- `movie:001` for the movie with the id 001 -- `user:001` for the user with the id 001 - -and for the movie information you should use a Redis [Hash](https://redis.io/topics/data-types#hashes). - -A Redis Hash allows the application to structure all the movie attributes in individual fields; RediSearch will then index those fields based on the index definition. - -### Step 11. Insert Movies - -It is now time to add some data to your database. Let's insert a few movies using `redis-cli` or [RedisInsight](https://redis.com/redis-enterprise/redis-insight/). - -Once you are connected to your Redis instance, run the following commands: - -``` -> HSET movie:11002 title "Star Wars: Episode V - The Empire Strikes Back" plot "After the Rebels are brutally overpowered by the Empire on the ice planet Hoth, Luke Skywalker begins Jedi training with Yoda, while his friends are pursued by Darth Vader and a bounty hunter named Boba Fett all over the galaxy." release_year 1980 genre "Action" rating 8.7 votes 1127635 imdb_id tt0080684 - - -> HSET movie:11003 title "The Godfather" plot "The aging patriarch of an organized crime dynasty transfers control of his clandestine empire to his reluctant son." release_year 1972 genre "Drama" rating 9.2 votes 1563839 imdb_id tt0068646 - - -> HSET movie:11004 title "Heat" plot "A group of professional bank robbers start to feel the heat from police when they unknowingly leave a clue at their latest heist." release_year 1995 genre "Thriller" rating 8.2 votes 559490 imdb_id tt0113277 - - -> HSET "movie:11005" title "Star Wars: Episode VI - Return of the Jedi" genre "Action" votes 906260 rating 8.3 release_year 1983 plot "The Rebels dispatch to Endor to destroy the second Empire's Death Star." imdb_id "tt0086190" - - -``` - -![alt_text](images/image13.png) - -Now it is possible to get information from the hash using the movie ID. For example, if you want to get the title and rating, execute the following command: - -``` ->> HMGET movie:11002 title rating - -1) "Star Wars: Episode V - The Empire Strikes Back" -2) "8.7" - - - -``` - -And you can increment the rating of this movie using: - -``` -HINCRBYFLOAT movie:11002 rating 0.1 -``` - -But how do you get a movie or a list of movies by year of release, rating, or title? - -One option would be to read all the movies, check all fields, and then return only the matching movies; needless to say, this is a really bad idea. 
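To make the problem concrete, here is roughly what that naive scan-and-filter approach would look like with redis-py; this sketch only illustrates the anti-pattern and is not code from the tutorial:

```python
import redis

r = redis.Redis(decode_responses=True)

def movies_by_year_naive(year: str):
    """Walk every movie:* hash and filter client-side -- cost grows with the whole keyspace."""
    results = []
    # SCAN iterates over all matching keys; each hit then costs an extra HGETALL round trip.
    for key in r.scan_iter(match="movie:*"):
        movie = r.hgetall(key)
        if movie.get("release_year") == year:
            results.append(movie["title"])
    return results

print(movies_by_year_naive("1972"))  # e.g. ['The Godfather']
```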
- -Nevertheless this is where Redis developers often create custom secondary indexes using SET/SORTED SET structures that point back to the movie hash. This needs some heavy design and implementation. - -This is where the RediSearch module can help, and why it was created. - -### Step 12. RediSearch & Indexing - -RediSearch greatly simplifies this by offering a simple and automatic way to create secondary indices on Redis Hashes. (more datastructure will eventually come) - -![Secondary Index](https://github.com/RediSearch/redisearch-getting-started/blob/master/docs/images/secondary-index.png?raw=true) - -Using RediSearch if you want to query on a field, you must first index that field. Let's start by indexing the following fields for our movies: - -- Title -- Release Year -- Rating -- Genre - -When creating a index you define: - -- which data you want to index: all _hashes_ with a key starting with `movies` -- which fields in the hashes you want to index using a Schema definition. - -> **_Warning: Do not index all fields_** -> -> Indexes take space in memory, and must be updated when the primary data is updated. So create the index carefully and keep the definition up to date with your needs. - -### Step 13. Create the Index - -``` ->> FT.CREATE idx:movie ON hash PREFIX 1 "movie:" SCHEMA title TEXT SORTABLE release_year NUMERIC SORTABLE rating NUMERIC SORTABLE genre TAG SORTABLE - -"OK" -``` - -![alt_text](images/image14.png) - -The database contains a few movies, and an index, it is now possible to execute some queries. - -#### Query: All the movies that contains the string "`war`" - -![alt_text](images/image15.png) - -#### Query: Limit the list of fields returned by the query using the RETURN parameter - -The FT.SEARCH commands returns a list of results starting with the number of results, then the list of elements (keys & fields). - -As you can see the movie _Star Wars: Episode V - The Empire Strikes Back_ is found, even though you used only the word “war” to match “Wars” in the title. This is because the title has been indexed as text, so the field is [tokenized](https://oss.redis.com/redisearch/Escaping/) and [stemmed](https://oss.redis.com/redisearch/Stemming/). - -Later when looking at the query syntax in more detail you will learn more about the search capabilities. - -It is also possible to limit the list of fields returned by the query using the `RETURN` parameter, let's run the same query, and return only the title and release_year: - -![alt_text](images/image16.png) - -#### Query: All the movies that contains the string "war but NOT the jedi one" - -Adding the string `-jedi` (minus) will ask the query engine not to return values that contain `jedi`. - -``` -_"war -jedi" RETURN 2 title release_year_ -``` - -![alt_text](images/image17.png) - -### Step 14: Fuzzy Search - -All the movies that contains the string "gdfather using fuzzy search" - -![alt_text](images/image18.png) - -#### Query: All Thriller movies - -``` -@genre:{Thriller}" RETURN 2 title release_year -``` - -![alt_text](images/image19.png) - -#### Query: All Thriller or Action movies - -``` -@genre:{Thriller|Action}" RETURN 2 title release_year -``` - -![alt_text](images/image20.png) - -#### Query : All the movies released between 1970 and 1980 (included) - -The FT.SEARCH syntax has two ways to query numeric fields: - -- using the FILTER parameter - -- FILTER release_year 1970 1980 RETURN 2 title release_year - -![alt_text](images/image21.png) - -### Step 15. 
AGGREGATION - -#### Query: Number of movies by year - -``` -"*" GROUPBY 1 @release_year REDUCE COUNT 0 AS nb_of_movies - -1 -``` - -![alt_text](images/image22.png) - -#### Query: Number of movies by year from the most recent to the oldest - -``` -"*" GROUPBY 1 @release_year REDUCE COUNT 0 AS nb_of_movies SORTBY 2 @release_year DESC - -1 -``` - -![alt_text](images/image23.png) - -### Additional Links - -- [RediSearch Project](https://oss.redis.com/redisearch/) -- [RediSearch Tutorial](/howtos/redisearch) -- [Getting Started with Movie Database](/howtos/moviesdatabase/getting-started) -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Visualize Redis database keys using RedisInsight Browser Tool](/explore/redisinsight/browser) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/redisearch/launchpad.png b/docs/explore/redisinsight/redisearch/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/redisearch/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/images/image1.png b/docs/explore/redisinsight/redisgears/images/image1.png deleted file mode 100644 index 5175de17128..00000000000 Binary files a/docs/explore/redisinsight/redisgears/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/images/image2.png b/docs/explore/redisinsight/redisgears/images/image2.png deleted file mode 100644 index 964dacf615c..00000000000 Binary files a/docs/explore/redisinsight/redisgears/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/images/image3.png b/docs/explore/redisinsight/redisgears/images/image3.png deleted file mode 100644 index 520b75cbfdd..00000000000 Binary files a/docs/explore/redisinsight/redisgears/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/images/redisinsight4.png b/docs/explore/redisinsight/redisgears/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/redisgears/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/images/redisinsightinstall.png b/docs/explore/redisinsight/redisgears/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/redisgears/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgears/index-redisgears.mdx b/docs/explore/redisinsight/redisgears/index-redisgears.mdx deleted file mode 100644 index 7d7108c6417..00000000000 --- a/docs/explore/redisinsight/redisgears/index-redisgears.mdx +++ /dev/null @@ -1,112 +0,0 @@ ---- -id: index-redisgears -title: Write Your Serverless Redis function using RedisGears Browser Tool -sidebar_label: Write Your Serverless Redis function using RedisGears Browser Tool -slug: /explore/redisinsight/redisgears -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight has built-in support for Redis Modules like RedisJSON, RediSearch, RedisGraph, Streams, RedisTimeSeries, and RedisGears. RedisGears enables reactive programming at the database level. 
It's like using lambda functions, but with a dramatically lower latency and with much less encoding/decoding overhead. - -Support for [RedisGears](https://oss.redis.com/redisgears/) was first introduced in RedisInsight v1.5.0. RedisInsight allows you to: - -- Explore the latest executed functions and analyze the results or errors. -- Manage registered functions and get an execution summary. -- Code, build, and execute functions. - -RedisGears is a dynamic framework that enables developers to write and execute [functions](https://oss.redis.com/redisgears/functions.html) that implement data flows in Redis, while abstracting away the data’s distribution and deployment. These capabilities enable efficient data processing using multiple models in Redis with infinite programmability, while remaining simple to use in any environment. - -Follow the steps below to get started with the RedisInsight browser tool for RedisGears. - -## Step 1. Create a Redis database - -[Follow this link to run a Redis container with the RedisGears module enabled](/explore/redismod) - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3. Clone the repository - -``` -$ git clone https://github.com/RedisGears/ImdbExample -$ cd ImdbExample -``` - -## Step 4. Download the IMDB data - -Download the data from this link and extract it to the current directory: [https://datasets.imdbws.com/title.basics.tsv.gz](https://datasets.imdbws.com/title.basics.tsv.gz) - -``` -$ wget https://datasets.imdbws.com/title.basics.tsv.gz -$ gunzip title.basics.tsv.gz -``` - -## Step 5. Execute the script - -``` -$ python3 UploadImdb.py -H localhost -P 6379 - -python3 UploadImdb.py -H 192.168.1.9 -P 6379 -/Users/ajeetraina/projects/redis-datasets/redisgears/ImdbExample/UploadImdb.py:27: DeprecationWarning: Pipeline.hmset() is deprecated. Use Pipeline.hset() instead. - pipe.hmset(d['tconst'], d) -done -``` - -## Step 6. Accessing RedisInsight - -Choose “RedisGears” from the left menu. - -![alt_text](images/image1.png) - -## Step 7. 
Add the below script: - -``` -GB('KeysOnlyReader').map(lambda x: execute('hget', x, 'genres')).flatmap(lambda x: x.split(',')).countby().run() -``` - -![alt_text](images/image2.png) - -![alt_text](images/image3.png) - -## Additional References - -- [RedisGears Project](https://oss.redis.com/redisgears/) -- [RedisGears Tutorials](/howtos/redisgears) -- [How to build a Fraud Detection System using RedisGears and RedisBloom](/howtos/frauddetection) -- [Building a Pipeline for Natural Language Processing using RedisGears](/howtos/nlp) - -## - - diff --git a/docs/explore/redisinsight/redisgears/launchpad.png b/docs/explore/redisinsight/redisgears/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/redisgears/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image1.png b/docs/explore/redisinsight/redisgraph/images/image1.png deleted file mode 100644 index 58f318a6689..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image10.png b/docs/explore/redisinsight/redisgraph/images/image10.png deleted file mode 100644 index 389260a774b..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image11.png b/docs/explore/redisinsight/redisgraph/images/image11.png deleted file mode 100644 index ea9e855403d..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image12.png b/docs/explore/redisinsight/redisgraph/images/image12.png deleted file mode 100644 index 7b9e6bacef7..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image13.png b/docs/explore/redisinsight/redisgraph/images/image13.png deleted file mode 100644 index 0b38ac9cf9a..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image14.png b/docs/explore/redisinsight/redisgraph/images/image14.png deleted file mode 100644 index 4333642b22c..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image15.png b/docs/explore/redisinsight/redisgraph/images/image15.png deleted file mode 100644 index 61ca13e655c..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image16.png b/docs/explore/redisinsight/redisgraph/images/image16.png deleted file mode 100644 index f5350ba18d6..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image17.png b/docs/explore/redisinsight/redisgraph/images/image17.png deleted file mode 100644 index ea84f7ff026..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image18.png b/docs/explore/redisinsight/redisgraph/images/image18.png deleted file mode 100644 index 92d7b0e9b4b..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image18.png and /dev/null differ 
diff --git a/docs/explore/redisinsight/redisgraph/images/image2.png b/docs/explore/redisinsight/redisgraph/images/image2.png deleted file mode 100644 index fa85a836bc2..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image3.png b/docs/explore/redisinsight/redisgraph/images/image3.png deleted file mode 100644 index 2262a2409bd..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image4.png b/docs/explore/redisinsight/redisgraph/images/image4.png deleted file mode 100644 index 03694be47b2..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image5.png b/docs/explore/redisinsight/redisgraph/images/image5.png deleted file mode 100644 index c078a6f7390..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image6.png b/docs/explore/redisinsight/redisgraph/images/image6.png deleted file mode 100644 index 0608cbb7d04..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image7.png b/docs/explore/redisinsight/redisgraph/images/image7.png deleted file mode 100644 index 6049feaf113..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image8.png b/docs/explore/redisinsight/redisgraph/images/image8.png deleted file mode 100644 index b8ba39f0719..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/image9.png b/docs/explore/redisinsight/redisgraph/images/image9.png deleted file mode 100644 index 575e9604618..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/redisinsight4.png b/docs/explore/redisinsight/redisgraph/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/images/redisinsightinstall.png b/docs/explore/redisinsight/redisgraph/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisgraph/index-redisgraph.mdx b/docs/explore/redisinsight/redisgraph/index-redisgraph.mdx deleted file mode 100644 index d910c891857..00000000000 --- a/docs/explore/redisinsight/redisgraph/index-redisgraph.mdx +++ /dev/null @@ -1,224 +0,0 @@ ---- -id: index-redisgraph -title: Query, Visualize and Manipulate Graphs using RedisGraph Browser Visualization Tool -sidebar_label: Query, Visualize and Manipulate Graphs using RedisGraph Browser Visualization Tool -slug: /explore/redisinsight/redisgraph -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -If you’re a Redis user who 
prefers to use a Graphical User Interface (GUI) for graph queries, then RedisInsight is the right tool for you. It’s a 100% free, pure desktop Redis GUI that provides easy-to-use browser tools to query, visualize, and interactively manipulate graphs. You can add new graphs, run queries, and explore the results in the GUI tool. - -RedisInsight supports [RedisGraph](https://oss.redis.com/redisgraph/) and allows you to: - -- Build and execute queries -- Navigate your graphs -- Browse, analyze, and export results -- Use keyboard shortcuts to zoom -- Reset the view and center the entire graph with a button -- Zoom via the mouse wheel (double right-click to zoom out) -- Copy commands with a button click -- Persist node display choices between queries - -As a benefit, you get faster turnarounds when building your application using Redis and RedisGraph. - -Follow the steps below to see how your data is connected via the RedisInsight Browser tool. - -## Step 1. Create a Redis database - -[Follow this link to create a Redis database](https://developer.redis.com/howtos/redisgraph) using Redis Enterprise Cloud with the RedisGraph module enabled. - -![alt_text](images/image1.png) - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3: Click “RedisGraph” and then “Add Graph” - -Select RedisGraph from the menu. - -![alt_text](images/image2.png) - -## Step 4. Create a new Graph called “Friends” - -![alt_text](images/image3.png) - -## Step 5. Add new nodes (individuals) and links - -Let us add individuals to the graph. CREATE is used to introduce new nodes and relationships. Run the below Cypher query in the RedisInsight GUI to add a label called Person and a property called “name”. - -``` -CREATE (:Person{name:"Tom" }), (:Person{name:"Alex" }), (:Person{name:"Susan" }), (:Person{name:"Bill" }), (:Person{name:"Jane" }) -``` - -![alt_text](images/image4.png) - -As we can see, one label is added, and it refers to the Person label; it is the same for every node and hence created only once. Overall, 5 nodes are created, and the five “name” properties correspond to the name property added on each node. - -## Step 6: View all the individuals (nodes) - -MATCH describes the relationship between queried entities, using ASCII art to represent the pattern(s) to match against. Nodes are represented by parentheses (), and relationships are represented by brackets []. - -As shown below, we have added a lowercase “p” in front of our label; it is a variable we can refer back to. The query returns all the nodes with the label “Person”. - -``` -MATCH (p:Person) RETURN p -``` - -![alt_text](images/image5.png) - -You can select "Graph View" on the right menu to display the graphical representation as shown below: - -![alt_text](images/image6.png) - -## Step 7. 
Viewing just one individual (node) - -``` -MATCH (p:Person {name:"Tom"}) RETURN p -``` - -![alt_text](images/image7.png) - -## Step 8: Visualize the relationship between the individuals - -Run the below query to build a relationship between two nodes and see how the relationship flows from one node (“Tom”) to another node (“Alex”). - -``` -MATCH (p1:Person {name: "Tom" }), (p2:Person {name: "Alex" }) CREATE (p1)-[:Knows]->(p2) -``` - -The symbol “>” (greater than) shows which way the relationship flows. - -![alt_text](images/image8.png) - -You can view the relationship in the form of a graph as shown below: - -![alt_text](images/image9.png) - -## Step 9. Create and visualize multiple relationships - -Run the below query to create and visualize relationships between multiple individuals: - -``` -MATCH (p1:Person {name: "Tom" }), (p2:Person {name: "Susan" }), (p3:Person {name: "Bill" }) CREATE (p1)-[:Knows]->(p2), (p1)-[:Knows]->(p3) -``` - -![alt_text](images/image10.png) - -## Step 10. Create and visualize the relationship between two individuals (Susan and Bill) - -Let us look at how to generate a graph showing the relationship between two individuals, Susan and Bill: - -``` -MATCH (p1:Person {name: "Susan"}), (p2:Person {name: "Bill"}) CREATE (p1)-[:Knows]->(p2) -``` - -![alt_text](images/image11.png) - -## Step 11. Create and visualize the relationship between two individuals (Bill and Jane) - -``` -MATCH (p1:Person {name: "Bill"}), (p2:Person {name: "Jane"}) CREATE (p1)-[:Knows]->(p2) -``` - -![alt_text](images/image12.png) - -![alt_text](images/image13.png) - -## Step 12. Building a social network - -This can be achieved with a “friend of friends” kind of relationship. Say Tom wanted to connect with Jane: he has two contacts that know Jane, one is Susan and the other is Bill. - -![alt_text](images/image14.png) - -``` -MATCH p = (p1:Person {name: "Tom" })-[:Knows*1..3]-(p2:Person {name: "Jane"}) RETURN p -``` - -In this query, we assign a variable “p” to a graph path. We search for “Tom” as p1 and “Jane” as p2, and we are interested in Knows links with 1 to 3 degrees of separation. - -![alt_text](images/image15.png) - -## Step 13. Cleaning up the Graph - -![alt_text](images/image16.png) - -## Importing the Bulk Graph Data - -Let us try to insert bulk data using Python and then explore it in the form of nodes and relationships. - -## Step 14. Cloning the repository - -``` -$ git clone https://github.com/redis-developer/redis-datasets -cd redis-datasets/redisgraph/datasets/iceandfire -``` - -## Step 15. Execute the script - -``` -$ python3 bulk_insert.py GOT_DEMO -n data/character.csv -n data/house.csv -n data/book.csv -n data/writer.csv -r data/wrote.csv -r data/belongs.csv -h 192.168.1.9 -p 6379 - - - -2124 nodes created with label 'b'character'' -438 nodes created with label 'b'house'' -12 nodes created with label 'b'book'' -3 nodes created with label 'b'writer'' -14 relations created for type 'b'wrote'' -2208 relations created for type 'b'belongs'' -Construction of graph 'GOT_DEMO' complete: 2577 nodes created, 2222 relations created in 0.169954 seconds - - -``` - -## Step 16. 
Run the cypher query - -``` -GRAPH.QUERY GOT_DEMO "MATCH (w:writer)-[wrote]->(b:book) return w,b" -``` - -![alt_text](images/image18.png) - -## Additional Resources - -- [RedisGraph Project](https://oss.redis.com/redisgraph/) -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Visualize Redis database keys using RedisInsight Browser Tool](/explore/redisinsight/browser) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/redisgraph/launchpad.png b/docs/explore/redisinsight/redisgraph/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/redisgraph/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight-linux.png b/docs/explore/redisinsight/redisinsight-linux.png deleted file mode 100644 index 3406cf17c61..00000000000 Binary files a/docs/explore/redisinsight/redisinsight-linux.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight-windows.png b/docs/explore/redisinsight/redisinsight-windows.png deleted file mode 100644 index 0a2a8b70300..00000000000 Binary files a/docs/explore/redisinsight/redisinsight-windows.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight.gif b/docs/explore/redisinsight/redisinsight.gif deleted file mode 100644 index c1a7da6f075..00000000000 Binary files a/docs/explore/redisinsight/redisinsight.gif and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight2.png b/docs/explore/redisinsight/redisinsight2.png deleted file mode 100644 index 5c6ef2fef14..00000000000 Binary files a/docs/explore/redisinsight/redisinsight2.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight3.png b/docs/explore/redisinsight/redisinsight3.png deleted file mode 100644 index 33c9f8d950b..00000000000 Binary files a/docs/explore/redisinsight/redisinsight3.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight4.png b/docs/explore/redisinsight/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight5.png b/docs/explore/redisinsight/redisinsight5.png deleted file mode 100644 index bb716253bed..00000000000 Binary files a/docs/explore/redisinsight/redisinsight5.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight7.png b/docs/explore/redisinsight/redisinsight7.png deleted file mode 100644 index dfa2dcc6969..00000000000 Binary files a/docs/explore/redisinsight/redisinsight7.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight8.png b/docs/explore/redisinsight/redisinsight8.png deleted file mode 100644 index f2d04e964a6..00000000000 Binary files a/docs/explore/redisinsight/redisinsight8.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsight9.png b/docs/explore/redisinsight/redisinsight9.png deleted file mode 100644 index 3a233149fc6..00000000000 Binary files a/docs/explore/redisinsight/redisinsight9.png and /dev/null differ diff --git a/docs/explore/redisinsight/redisinsightmac.png b/docs/explore/redisinsight/redisinsightmac.png deleted file mode 100644 index 5cca4c45a0a..00000000000 Binary files a/docs/explore/redisinsight/redisinsightmac.png and /dev/null differ diff --git 
a/docs/explore/redisinsight/redistimeseries/images/image1.png b/docs/explore/redisinsight/redistimeseries/images/image1.png deleted file mode 100644 index 7530e4bca24..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/redistimeseries/images/image2.png b/docs/explore/redisinsight/redistimeseries/images/image2.png deleted file mode 100644 index 6f0aadbaee8..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/redistimeseries/images/image3.png b/docs/explore/redisinsight/redistimeseries/images/image3.png deleted file mode 100644 index 1a3b743c454..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/redistimeseries/images/redisinsight4.png b/docs/explore/redisinsight/redistimeseries/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/redistimeseries/images/redisinsightinstall.png b/docs/explore/redisinsight/redistimeseries/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/redistimeseries/index-redistimeseries.mdx b/docs/explore/redisinsight/redistimeseries/index-redistimeseries.mdx deleted file mode 100644 index e6be17c91c4..00000000000 --- a/docs/explore/redisinsight/redistimeseries/index-redistimeseries.mdx +++ /dev/null @@ -1,226 +0,0 @@ ---- -id: index-redistimeseries -title: Manage Redis time-series data using RedisTimeSeries Browser Tool -sidebar_label: Manage Redis time-series data using RedisTimeSeries Browser Tool -slug: /explore/redisinsight/redistimeseries -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -If you want to add a time series data structure to your Redis database, check out RedisTimeSeries browser tool that comes with RedisInsight. - -[RedisTimeseries](https://oss.redis.com/redistimeseries/) is a Redis module developed by Redis to enhance your experience managing time-series data with Redis. It simplifies the use of Redis for time-series use cases such as internet of things (IoT) data, stock prices, and telemetry. With RedisTimeSeries, you can ingest and query millions of samples and events at the speed of Redis. Advanced tooling such as downsampling and aggregation ensure a small memory footprint without impacting performance. Use a variety of queries for visualization and monitoring with built-in connectors to popular monitoring tools like Grafana, Prometheus, and Telegraf. - -With RedisInsight browser tool, you can perform the below sets of activities: - -- TS.RANGE & TS.MRANGE are supported -- Charts support milliseconds. -- Ability to configure auto refresh interval. -- Ability to submit query with ‘ctrl + enter’ in single line mode -- Display tabular as well as JSON view - -## Step 1. 
Create Redis database - -[Follow this link to create Redis database using Docker container ](https://developer.redis.com/explore/redismod)that comes with RedisTimeSeries module enabled - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3. Clone the repository - -``` -$ git clone https://github.com/redis-developer/redis-datasets -cd redis-datasets/redistimeseries/AirQualityUCI -``` - -## Step 4. Execute the Python script - -``` -#!/usr/bin/env python3 -# -*- coding: utf-8 -*- - -"""sample module for dataset loading into redistimeseries from csv file -""" - -import argparse -import redis -import csv -import datetime -import logging -from tqdm import tqdm - - -def parse_dataset_row(line): - - result = False - date = None - Time = None - unix_ts = None - carbon_monoxide = None - temperature_c = None - relative_humidity = None - # check if we have 15 fields or more, and all fields have something on it - if len(line) > 14 and sum([len(line[x]) > 0 for x in range(0, 14)]) == 14: - str_date = line[0] - str_time = line[1] - carbon_monoxide = ( - float(line[2].replace(",", ".")) - if (float(line[2].replace(",", ".")) > -200.0) - else None - ) - temperature_c = ( - float(line[12].replace(",", ".")) - if (float(line[12].replace(",", ".")) > -200.0) - else None - ) - relative_humidity = ( - float(line[13].replace(",", ".")) - if (float(line[13].replace(",", ".")) > -200.0) - else None - ) - unix_ts = int( - datetime.datetime.strptime( - "{0} {1}".format(str_date, str_time), "%d/%m/%Y %H.%M.%S" - ).timestamp() - ) - result = True - - return result, unix_ts, carbon_monoxide, temperature_c, relative_humidity - - -parser = argparse.ArgumentParser() -parser.add_argument("--port", type=int, help="redis instance port", default=6379) -parser.add_argument( - "--password", type=int, help="redis instance password", default=None -) -parser.add_argument("--verbose", help="enable verbose output", action="store_true") -parser.add_argument("--host", type=str, help="redis instance host", default="127.0.0.1") -parser.add_argument( - "--csv", - type=str, - help="csv file containing the dataset", - default="./AirQualityUCI/AirQualityUCI.csv", -) -parser.add_argument( - "--csv_delimiter", type=str, help="csv file field delimiter", default=";" -) -args = parser.parse_args() - -log_level = logging.ERROR -if args.verbose is True: - log_level = logging.INFO -logging.basicConfig(level=log_level) - -# redis setup -redis_obj = redis.Redis(host=args.host, port=args.port, password=args.password) -temperature_key = "ts:temperature" -carbon_monoxide_key = "ts:carbon_monoxide" -relative_humidity_key = "ts:relative_humidity" - -with open(args.csv, newline="") as csv_file: - csv_reader = csv.reader(csv_file, delimiter=args.csv_delimiter) - next(csv_reader, None) # skip the headers - for row in tqdm(csv_reader): - ( - result, - unix_ts, - carbon_monoxide, - temperature_c, - relative_humidity, - ) = 
parse_dataset_row(row) - if result is True: - try: - if temperature_c is not None: - redis_obj.execute_command( - "ts.add", temperature_key, unix_ts, temperature_c - ) - logging.info( - "ts.add {0} {1} {2}".format( - temperature_key, unix_ts, temperature_c - ) - ) - if carbon_monoxide is not None: - redis_obj.execute_command( - "ts.add", carbon_monoxide_key, unix_ts, carbon_monoxide - ) - logging.info( - "ts.add {0} {1} {2}".format( - carbon_monoxide_key, unix_ts, carbon_monoxide - ) - ) - if relative_humidity is not None: - redis_obj.execute_command( - "ts.add", relative_humidity_key, unix_ts, relative_humidity - ) - logging.info( - "ts.add {0} {1} {2}".format( - relative_humidity_key, unix_ts, relative_humidity - ) - ) - except redis.RedisError as err: - logging.error(err) - -``` - -## Step 5. Execute the script - -``` -$ python3 dataloader.py -9471it [00:29, 326.33it/s] -``` - -## Step 6. Query a range across one or multiple time-series - -` TS.RANGE ts:carbon_monoxide 1112596200 1112603400` - -![alt_text](images/image1.png) - -## Step 7 . Displaying the JSON view - -![alt_text](images/image2.png) - -## Step 8. Displaying the tabular view - -![alt_text](images/image3.png) - -## Additional Resources - -- [RedisTimeSeries Project](https://oss.redis.com/redistimeseries/) -- [RedisTimeSeries Tutorial](/howtos/redistimeseries) -- [Analyze Your Redis commands using RedisInsight Profiler tool](explore/redisinsight/profiler) -- [How to Manage Real-Time IoT Sensor Data in Redis](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/) - -## - - diff --git a/docs/explore/redisinsight/redistimeseries/launchpad.png b/docs/explore/redisinsight/redistimeseries/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/redistimeseries/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image1.png b/docs/explore/redisinsight/slowlog/images/image1.png deleted file mode 100644 index ceba137ec7d..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image2.png b/docs/explore/redisinsight/slowlog/images/image2.png deleted file mode 100644 index c84b14d3ecd..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image3.png b/docs/explore/redisinsight/slowlog/images/image3.png deleted file mode 100644 index 52c75702c14..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image4.png b/docs/explore/redisinsight/slowlog/images/image4.png deleted file mode 100644 index 88b5cc66350..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image5.png b/docs/explore/redisinsight/slowlog/images/image5.png deleted file mode 100644 index d52eeb025ef..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image6.png b/docs/explore/redisinsight/slowlog/images/image6.png deleted file mode 100644 index 57ee5093ec2..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image7.png b/docs/explore/redisinsight/slowlog/images/image7.png 
deleted file mode 100644 index 5497392aae8..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image8.png b/docs/explore/redisinsight/slowlog/images/image8.png deleted file mode 100644 index 98bcf71b6ef..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/image9.png b/docs/explore/redisinsight/slowlog/images/image9.png deleted file mode 100644 index 2366caeca02..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/redisinsight4.png b/docs/explore/redisinsight/slowlog/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/images/redisinsightinstall.png b/docs/explore/redisinsight/slowlog/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/slowlog/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/slowlog/index-slowlog.mdx b/docs/explore/redisinsight/slowlog/index-slowlog.mdx deleted file mode 100644 index 7150e9e35ed..00000000000 --- a/docs/explore/redisinsight/slowlog/index-slowlog.mdx +++ /dev/null @@ -1,197 +0,0 @@ ---- -id: index-slowlog -title: Debug Redis using RedisInsight Slowlog Debugging Tool -sidebar_label: Debug Redis using RedisInsight Slowlog Debugging Tool -slug: /explore/redisinsight/slowlog -author: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -RedisInsight, a free GUI for Redis, allows you to identify and troubleshoot bottlenecks with the Slowlog analysis tool. If you are experiencing high latency and high CPU usage with Redis operations and looking for a tool for debugging and tracing your Redis database, RedisInsight Slow Log is a perfect tool for you. - -Redis Slow Log is highly effective at showing the actual processing time of each slow command. The Redis slowlog is a log of all commands which exceed a specified run time. Please note that the network latency is not included in the measurement, just the time taken to actually execute the command. Redis Slow Log is a list of slow operations for your Redis instance. - -Follow the below steps to see how Slowlog is leveraged to troubleshoot performance issues. - -## Step 1. Create a Redis database - -Follow [https://developer.redis.com/create](https://developer.redis.com/create) to install and create Redis database - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3. 
Connect to the database using RedisInsight GUI - -![alt_text](images/image1.png) - -## Step 4: Click “Slowlog” and then “Configure Slowlog” - -![alt_text](images/image2.png) - -## Step 5. Configure Slowlog - -There are two configurations related to slowlog query - - -- slowlog-log-slower-than: Used to set the evaluation time of slow query, that is to say, commands that exceed this configuration item will be treated as slow operations and recorded in the slow query log. Its execution unit is microseconds (1 second equals 1000000 microseconds); -- slowlog-max-len: Used to configure the maximum number of records in the slow query log. - -Please note that a negative number disables the slowlog, while a value of zero forces the logging of every command. Slowlog-max-len is the length of the slowlog. The minimum value is zero. When a new command is logged and the slowlog is already at its maximum length, the oldest one is removed from the queue of logged commands in order to make space. The configuration can be done by editing redis.conf or while the server is running using the CONFIG GET and CONFIG SET commands. - -Slowlog will log the last X number(amount) of queries which took more time than Y microseconds to run. You can set this either in redis.conf or at runtime using CONFIG command - -```bash - CONFIG SET slowlog-log-slower-than 500 - CONFIG SET slowlog-max-len 50 -``` - -![alt_text](images/image3.png) - -## Step 6. Prepare a script to add large dataset to Redis database - -To see slowlog in action, let us pick up a large dataset. Create a file called importcities.py and add the below content: - -```python - import csv - import config - from redis import Redis - - # Database Connection - host = config.REDIS_CFG["host"] - port = config.REDIS_CFG["port"] - pwd = config.REDIS_CFG["password"] - redis = Redis(host=host, port=port, password=pwd, charset="utf-8", decode_responses=True) - - # Import Cities - print("Importing ...") - - count = 0 - - with open("data/worldcities.csv", 'r') as cities: - reader = csv.DictReader(cities) - for row in reader: - id = row["id"] - name = row["city_ascii"] - lng = row["lng"] - lat = row["lat"] - country = row["country"] - pop = row["population"] - - print("id = {}, name = {}, lng = {}, lat = {}".format(id, name, lng, lat)) - count += 1 - - redis.hmset("ct:{}".format(id), { "_id" : id, "name" : name, "country" : country, "population" : pop }) - redis.geoadd("idx:cities", lng, lat, id) - redis.hset("idx:city_by_name", name, id) - -``` - -Create a file called config.py as shown below: - -```python - REDIS_CFG = { - "host" : "localhost", - "port" : 6379, - "password" : "" - } - -``` - -Ensure that you provide the right host and port details. - -Execute the script: - -```bash - python3 importcities.py -``` - -You will see the below results: - -```bash - id = 762, name = Labatt Ontario Breweries, lng = -81.2467, lat = 42.9778 - id = 915, name = Ninkasi Brewing, lng = -123.11, lat = 44.0569 - id = 930, name = Oaken Barrel Brewing, lng = -86.0901, lat = 39.615 - Import of 16790 records completed - -``` - -If you want to simulate slowlogs, then consider using KEYS command. It is always recommended NOT TO USE [KEYS](https://redis.io/commands/keys) in your regular application code. If you're looking for a way to find keys in a subset of your keyspace, consider using [SCAN](https://redis.io/commands/scan) or [sets](https://redis.io/topics/data-types#sets). 
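As a quick illustration (not part of the original demo script), here is a minimal SCAN-based sketch that iterates over the `ct:*` city keys created by `importcities.py`, assuming the same local connection settings as `config.py`:

```python
from redis import Redis

# Assumed local connection settings, mirroring config.py above.
redis = Redis(host="localhost", port=6379, decode_responses=True)

# scan_iter wraps the SCAN command and walks the keyspace in small batches,
# so it avoids the long blocking call that KEYS can cause on a large database.
for key in redis.scan_iter(match="ct:*", count=1000):
    print(key)
```

The same pattern is available from the command line with `redis-cli --scan --pattern 'ct:*'`.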
- -The KEYS command may ruin performance when it is executed against large databases - -Let us try to run KEYS \* in RedisInsight CLI and see if it generates slowlog as shown below: - -![alt_text](images/image4.png) - -Run it one more time and you will notice below: - -![alt_text](images/image5.png) - -Try decreasing the execution time(50 ms), and you will notice that the below run query also gets logged into the slowlog - -![alt_text](images/image6.png) - -## Step 7. Configuring the execution time - -Each entry in the slowlog contains four fields: a slowlog entry ID, the Unix timestamp of when the command was run, the execution time in microseconds, and an array with the command itself, along with any arguments. See the example output below: - -In order to retrieve the slowlog queries, you have to use SLOWLOG GET X. Where X is the number of slow queries you want to retrieve. - -![alt_text](images/image7.png) - -As shown above, the result displays a unique id, timestamp, time taken to execute the query in microseconds, and the actual command + parameter executed. It is important to note that the Slow log is transient; there's no persistence for it so in the case of failover, the slowlog is lost. If you are looking to rely on a persistent slowlog, you'll be wanting to reconsider your design choices - -Please note: If I choose “0” it forces the logging of every command while “-1” disabled the slowlog. - -![alt_text](images/image8.png) - -![alt_text](images/image9.png) - -:::important - -In a clustered database, each node can have different values for slowlog. You will need to use the configuration tool in order to configure slowlog for clustered databases. -::: - -## Additional Links - -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Unified Search and Analytics using RediSearch Browser Tool](/explore/redisinsight/redisearch) -- [Managing time-series data using RedisTimeSeries Browser Tool](/explore/redisinsight/redistimeseries) -- [Analyze Your Redis commands using RedisInsight Profiler tool](/explore/redisinsight/profiler) -- [Debugging Redis using RedisInsight Slowlog Tool](/explore/redisinsight/slowlog) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/slowlog/launchpad.png b/docs/explore/redisinsight/slowlog/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/slowlog/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/image16.png b/docs/explore/redisinsight/streams/images/image16.png deleted file mode 100644 index 86a523d911f..00000000000 Binary files a/docs/explore/redisinsight/streams/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/image17.png b/docs/explore/redisinsight/streams/images/image17.png deleted file mode 100644 index 3171127e07b..00000000000 Binary files a/docs/explore/redisinsight/streams/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/image18.png b/docs/explore/redisinsight/streams/images/image18.png deleted file mode 100644 index f0f3c7c634d..00000000000 Binary files a/docs/explore/redisinsight/streams/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/image19.png b/docs/explore/redisinsight/streams/images/image19.png deleted file mode 100644 index 
b789b6682eb..00000000000 Binary files a/docs/explore/redisinsight/streams/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/image20.png b/docs/explore/redisinsight/streams/images/image20.png deleted file mode 100644 index 338722b66e0..00000000000 Binary files a/docs/explore/redisinsight/streams/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/redisinsight4.png b/docs/explore/redisinsight/streams/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsight/streams/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/images/redisinsightinstall.png b/docs/explore/redisinsight/streams/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsight/streams/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsight/streams/index-streams.mdx b/docs/explore/redisinsight/streams/index-streams.mdx deleted file mode 100644 index 4b0423f1b27..00000000000 --- a/docs/explore/redisinsight/streams/index-streams.mdx +++ /dev/null @@ -1,168 +0,0 @@ ---- -id: index-streams -title: Use Redis Streams Consumer Groups with RedisInsight -sidebar_label: Use Redis Streams Consumer Groups with RedisInsight -slug: /explore/redisinsight/streams -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and message queue. Redis cache delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. The Stream is a new data type introduced with Redis 5.0, which models a log data structure in a more abstract way. -A Redis Stream is a Redis data type that represents a log, so you can add new information and message in an append-only mode. -Redis Streams lets you build “Kafka-like” applications, which can: - -- Create applications that publish and consume messages. Nothing extraordinary here, you could already do that with Redis Pub/Sub(Publisher/Subscriber). -- Consume messages that are published even when the client application (consumer) is not running. This is a big difference from Redis Pub/Sub. -- Consume messages starting from a specific point. For example, read the whole history or only new messages. - -In addition, Redis Streams has the concept of a consumer group. Redis Streams consumer groups, like the similar concept in [Apache Kafka](https://kafka.apache.org/), allows client applications to consume messages in a distributed fashion (multiple clients), making it easy to scale and create highly available systems. - -Let’s dive under the covers and see [Redis Streams](https://redis.io/topics/streams-intro) through the lens of RedisInsight. You will see how to use the [Lettuce Java client](https://developer.redis.com/develop/java/#using-lettuce) to publish and consume messages using consumer group.This is the first basic example that uses a single consumer. - -## Prerequisite: - -- [Install OpenJDK](https://openjdk.java.net/install/) -- [Install Apache Maven](https://maven.apache.org/install.html) -- [Install Redis](https://developer.redis.com/create) - -### Step 1. 
Run a Redis server - -Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. You can run a Redis database directly over your local mac os or in a container. If you have Docker installed in your sytem, type the following command: - -```bash - docker run -d -p 6379:6379 redislabs/redismod -``` - -You can connect to Redis server using the `redis-cli` command like this: - -``` - redis-cli -``` - -The above command will make a connection to the Redis server. It will then present a prompt that allows you to run Redis commands. -Please note that you can connect to Redis server using multiple clients. - -## Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link ](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsightinstall.png) - -Run the installer. After the web server starts, open http://YOUR_HOST_IP:8001 and add a Redis database connection. - -Select "Connect to a Redis database" -![My Image](images/redisinsight4.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -## Step 3. Cloning the repository - -```bash - git clone https://github.com/redis-developer/redis-streams-101-java - cd redis-streams-101-java - mvn clean verify -``` - -## Step 4. Run the producer(Post a new message) - -```bash - - mvn exec:java -Dexec.mainClass="com.kanibl.redis.streams.simple.RedisStreams101Producer" -Dexec.args="5" - - Downloaded from central: https://repo.maven.apache.org/maven2/org/sonatype/sisu/sisu-guice/2.1.7/sisu-guice-2.1.7-noaop.jar (472 kB at 450 kB/s) - Downloaded from central: https://repo.maven.apache.org/maven2/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar (26 kB at 25 kB/s) - Downloaded from central: https://repo.maven.apache.org/maven2/commons-codec/commons-codec/1.11/commons-codec-1.11.jar (335 kB at 313 kB/s) - - Sending 5 message(s) - May 18, 2021 1:07:00 PM io.lettuce.core.EpollProvider - INFO: Starting without optional epoll library - May 18, 2021 1:07:00 PM io.lettuce.core.KqueueProvider - INFO: Starting without optional kqueue library - Message 1621343220998-0 : {sensor_ts=1621343220975, loop_info=0, speed=15, direction=270} posted - Message 1621343221009-0 : {sensor_ts=1621343221007, loop_info=1, speed=15, direction=270} posted - Message 1621343221016-0 : {sensor_ts=1621343221011, loop_info=2, speed=15, direction=270} posted - Message 1621343221019-0 : {sensor_ts=1621343221017, loop_info=3, speed=15, direction=270} posted - Message 1621343221023-0 : {sensor_ts=1621343221021, loop_info=4, speed=15, direction=270} posted - - - [INFO] ------------------------------------------------------------------------ - [INFO] BUILD SUCCESS - [INFO] ------------------------------------------------------------------------ - [INFO] Total time: 9.102 s - [INFO] Finished at: 2021-05-18T13:07:01Z - [INFO] ------------------------------------------------------------------------ - -``` - -## Step 5. 
Run the consumer(Consume messages) - -Open a new terminal and run this command: - -```bash - - mvn exec:java -Dexec.main -``` - -The consumer will start and consume the message you just posted, and wait for any new messages. - -## Step 6: Posting the new messages - -In the first terminal, let us post new entries to a Redis stream: - -```bash - mvn exec:java -Dexec.mainClass="com.kanibl.redis.streams.simple.RedisStreams101Producer" -Dexec.args="100" -``` - -Let us try to visualise the latest message using the RedisInsight browser tool. Make sure ‘Stream Data’ is selected and select any one of the streams. For a specified stream, you’ll find a table showing data in that stream along with a timestamp of when each entry was added. - -![alt_text](images/image16.png) - -To see the processing side of the stream select ‘Stream Data”. You will see 105 records under the streaming data. - -![alt_text](images/image17.png) - -Click on “Consumer Group” to see application_1 as promising active consumers. - -![alt_text](images/image18.png) - -RedisInsight also provide you to select fields as shown under “View Columns” section. - -![alt_text](images/image19.png) - -It also displays pending items/messages for the specific streams as shown above. - -## Additional Links - -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Add data to a Redis Stream using the XADD command](https://redis.io/commands/xadd/) -- [XREAD with support for consumer groups using XREADGROUP command](https://redis.io/commands/xreadgroup/) -- [How to use the XREAD command (XREAD count)](https://redis.io/commands/xread/) -- [Remove one or multiple messages from consumer group using XACK command](https://redis.io/commands/xack/) -- [Removing single items from a stream using stream ID](https://redis.io/docs/manual/data-types/streams/#removing-single-items-from-a-stream) - -## - - diff --git a/docs/explore/redisinsight/streams/launchpad.png b/docs/explore/redisinsight/streams/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/streams/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsight/usinghelm/images/image1.png b/docs/explore/redisinsight/usinghelm/images/image1.png deleted file mode 100644 index 3f103bc31e1..00000000000 Binary files a/docs/explore/redisinsight/usinghelm/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsight/usinghelm/images/image2.png b/docs/explore/redisinsight/usinghelm/images/image2.png deleted file mode 100644 index dca5c121216..00000000000 Binary files a/docs/explore/redisinsight/usinghelm/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsight/usinghelm/images/image3.png b/docs/explore/redisinsight/usinghelm/images/image3.png deleted file mode 100644 index 80f66805ce6..00000000000 Binary files a/docs/explore/redisinsight/usinghelm/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsight/usinghelm/images/image_4.png b/docs/explore/redisinsight/usinghelm/images/image_4.png deleted file mode 100644 index 4f3a160bff1..00000000000 Binary files a/docs/explore/redisinsight/usinghelm/images/image_4.png and /dev/null differ diff --git a/docs/explore/redisinsight/usinghelm/index-usinghelm.mdx b/docs/explore/redisinsight/usinghelm/index-usinghelm.mdx deleted file mode 100644 index 
fd304139920..00000000000 --- a/docs/explore/redisinsight/usinghelm/index-usinghelm.mdx +++ /dev/null @@ -1,148 +0,0 @@ ---- -id: index-usinghelm -title: Installing RedisInsight using Helm -sidebar_label: Installing RedisInsight using Helm -slug: /explore/redisinsight/usinghelm -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -Helm is a package manager for Kubernetes. It is the best way to find, share, and use software built for Kubernetes. It is the K8s equivalent of yum or apt. -Helm helps you manage Kubernetes applications — Helm Charts help you define, install, and upgrade even the most complex Kubernetes application. Helm is a graduated project in the CNCF and is maintained by the Helm community. - -### Benefits of Helm: - -- Improves developer productivity -- Makes application deployment easy, standarized and reusable -- Enhances operational readiness -- Reduces the complexity of deployments of microservices -- Speeds up the adaptation of cloud native applications - -It is possible to install RedisInsight using Helm chart. A full-featured desktop GUI client, RedisInsight is an essential tool for Redis developers. It is a lightweight multi-platform management visualization tool that helps you design, develop, and optimize your application capabilities in a single easy-to-use environment. RedisInsight not just makes it easier to interact with your databases and manage your data, but also helps in managing Redis Cluster with ease. - -## Getting Started - -### Step 1. Install the Prerequisites - -Install Docker Desktop for Mac and enable Kubernetes as shown below: - -![alt_text](images/image1.png) - -### Step 2. Install Helm on your Mac system - -```bash - brew install helm -``` - -### Step 3. Verify if helm is installed correctly - -```bash - helm version - version.BuildInfo{Version:"v3.6.1", - GitCommit:"61d8e8c4a6f95540c15c6a65f36a6dd0a45e7a2f", GitTreeState:"dirty", - GoVersion:"go1.16.5"} -``` - -### Step 4. Download RedisInsight Helm Chart - -``` - wget https://docs.redis.com/latest/pkgs/redisinsight-chart-0.1.0.tgz -``` - -### Step 5. Verify if Kubernetes is up and running - -```bash - kubectl get nodes - NAME STATUS ROLES AGE VERSION - docker-desktop Ready master 22d v1.19.7 -``` - -### Step 6. Install RedisInsight using Helm chart - -```bash - helm install redisinsight redisinsight-chart-0.1.0.tgz --set service.type=NodePort - - NAME: redisinsight - LAST DEPLOYED: Sat Jun 26 11:40:11 2021 - NAMESPACE: default - STATUS: deployed - REVISION: 1 - NOTES: - 1. Get the application URL by running these commands: - export NODE_PORT=$(kubectl get --namespace default -o - jsonpath="{.spec.ports[0].nodePort}" services redisinsight-redisinsight-chart) - export NODE_IP=$(kubectl get nodes --namespace default -o - jsonpath="{.items[0].status.addresses[0].address}") - echo http://$NODE_IP:$NODE_PORT -``` - -### Step 7. Get the application URL - -```bash - export NODE_PORT=$(kubectl get --namespace default -o jsonpath="{.spec.ports[0].nodePort}" services redisinsight-redisinsight-chart) - export NODE_IP=$(kubectl get nodes --namespace default -o jsonpath="{.items[0].status.addresses[0].address}") -``` - -### Step 8. Listing the IP address - -```bash - echo http://$NODE_IP:$NODE_PORT - http://192.168.65.4:30269 -``` - -### Step 9. 
Listing the Helm Chart - -```bash - helm list - NAME NAMESPACE REVISION UPDATED STATUS CHART APP VERSION - redisinsight default 1 2021-06-26 11:40:11.82793 +0530 IST deployed redisinsight-chart-0.1.0 -``` - -![images](images/image2.png) - -### Step 10. Listing the Redisinsight Pods - -```bash - kubectl get po - NAME READY STATUS RESTARTS AGE - fortune 2/2 Running 8 22d - redisinsight-redisinsight-chart-857b486d8f-w9xpv 1/1 Running 0 15m -``` - -### Step 11. Accessing RedisInsight - -![images](images/image3.png) -![images](images/image_4.png) - -### References - -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Unified Search and Analytics using RediSearch Browser Tool](/explore/redisinsight/redisearch) -- [Managing time-series data using RedisTimeSeries Browser Tool](/explore/redisinsight/redistimeseries) -- [Analyze Your Redis commands using RedisInsight Profiler tool](/explore/redisinsight/profiler) -- [Debugging Redis using RedisInsight Slowlog Tool](/explore/redisinsight/slowlog) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) - -## - - diff --git a/docs/explore/redisinsight/usinghelm/launchpad.png b/docs/explore/redisinsight/usinghelm/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsight/usinghelm/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/.gitignore b/docs/explore/redisinsightv2/.gitignore deleted file mode 100644 index fca99142337..00000000000 --- a/docs/explore/redisinsightv2/.gitignore +++ /dev/null @@ -1 +0,0 @@ -.swo diff --git a/docs/explore/redisinsightv2/browser/images/add_database.png b/docs/explore/redisinsightv2/browser/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/add_database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image1.png b/docs/explore/redisinsightv2/browser/images/image1.png deleted file mode 100644 index 87763c959c7..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image10.png b/docs/explore/redisinsightv2/browser/images/image10.png deleted file mode 100644 index 3f492b80148..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image11.png b/docs/explore/redisinsightv2/browser/images/image11.png deleted file mode 100644 index 7a64194a579..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image12.png b/docs/explore/redisinsightv2/browser/images/image12.png deleted file mode 100644 index 3f70b390c91..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image13.png b/docs/explore/redisinsightv2/browser/images/image13.png deleted file mode 100644 index 61be03eaf13..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image14.png b/docs/explore/redisinsightv2/browser/images/image14.png deleted file mode 100644 index ee965b171b7..00000000000 Binary files 
a/docs/explore/redisinsightv2/browser/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image15.png b/docs/explore/redisinsightv2/browser/images/image15.png deleted file mode 100644 index 22af2a3abc1..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image15.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image16.png b/docs/explore/redisinsightv2/browser/images/image16.png deleted file mode 100644 index 86a523d911f..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image17.png b/docs/explore/redisinsightv2/browser/images/image17.png deleted file mode 100644 index 3171127e07b..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image18.png b/docs/explore/redisinsightv2/browser/images/image18.png deleted file mode 100644 index f0f3c7c634d..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image19.png b/docs/explore/redisinsightv2/browser/images/image19.png deleted file mode 100644 index b789b6682eb..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image2.png b/docs/explore/redisinsightv2/browser/images/image2.png deleted file mode 100644 index 10ca8f542be..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image20.png b/docs/explore/redisinsightv2/browser/images/image20.png deleted file mode 100644 index 338722b66e0..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image3.png b/docs/explore/redisinsightv2/browser/images/image3.png deleted file mode 100644 index 17ccd420901..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image31.png b/docs/explore/redisinsightv2/browser/images/image31.png deleted file mode 100644 index abd31b1ff0f..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image31.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image4.png b/docs/explore/redisinsightv2/browser/images/image4.png deleted file mode 100644 index 9a77d9b081f..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image41.png b/docs/explore/redisinsightv2/browser/images/image41.png deleted file mode 100644 index 147d375c245..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image41.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image5.png b/docs/explore/redisinsightv2/browser/images/image5.png deleted file mode 100644 index fd9cd4f7d1d..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image6.png b/docs/explore/redisinsightv2/browser/images/image6.png deleted file mode 100644 index 8c2fd68d231..00000000000 Binary files 
a/docs/explore/redisinsightv2/browser/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image61.png b/docs/explore/redisinsightv2/browser/images/image61.png deleted file mode 100644 index 86e53d336f0..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image61.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image7.png b/docs/explore/redisinsightv2/browser/images/image7.png deleted file mode 100644 index 8766b9a5fc9..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image8.png b/docs/explore/redisinsightv2/browser/images/image8.png deleted file mode 100644 index 34d48cb581c..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image9.png b/docs/explore/redisinsightv2/browser/images/image9.png deleted file mode 100644 index 68a163a2300..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_61.png b/docs/explore/redisinsightv2/browser/images/image_61.png deleted file mode 100644 index b7c972ab095..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_61.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser.png b/docs/explore/redisinsightv2/browser/images/image_browser.png deleted file mode 100644 index 7728e380cc7..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser1.png b/docs/explore/redisinsightv2/browser/images/image_browser1.png deleted file mode 100644 index e123142bbc5..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser2.png b/docs/explore/redisinsightv2/browser/images/image_browser2.png deleted file mode 100644 index 9c27924aa51..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser3.png b/docs/explore/redisinsightv2/browser/images/image_browser3.png deleted file mode 100644 index 267ef33df7b..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser31.png b/docs/explore/redisinsightv2/browser/images/image_browser31.png deleted file mode 100644 index a898c66e358..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser31.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_browser4.png b/docs/explore/redisinsightv2/browser/images/image_browser4.png deleted file mode 100644 index d28c238c123..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_browser4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_db.png b/docs/explore/redisinsightv2/browser/images/image_db.png deleted file mode 100644 index eb8b94c4003..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_db.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/image_redisinsight_browser.png 
b/docs/explore/redisinsightv2/browser/images/image_redisinsight_browser.png deleted file mode 100644 index 6096e70260b..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/image_redisinsight_browser.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/redisinsight.png b/docs/explore/redisinsightv2/browser/images/redisinsight.png deleted file mode 100644 index bbd8f5f9790..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/redisinsight.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/redisinsight4.png b/docs/explore/redisinsightv2/browser/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/redisinsightinstall.png b/docs/explore/redisinsightv2/browser/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis1.png b/docs/explore/redisinsightv2/browser/images/testredis1.png deleted file mode 100644 index 88bfdf751b5..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis2.png b/docs/explore/redisinsightv2/browser/images/testredis2.png deleted file mode 100644 index 712038efad5..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis3.png b/docs/explore/redisinsightv2/browser/images/testredis3.png deleted file mode 100644 index c366b3b8ca6..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis4.png b/docs/explore/redisinsightv2/browser/images/testredis4.png deleted file mode 100644 index 15cbdccf451..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis5.png b/docs/explore/redisinsightv2/browser/images/testredis5.png deleted file mode 100644 index 00e05e4c408..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/images/testredis6.png b/docs/explore/redisinsightv2/browser/images/testredis6.png deleted file mode 100644 index dce00ef0632..00000000000 Binary files a/docs/explore/redisinsightv2/browser/images/testredis6.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/browser/index-browser.mdx b/docs/explore/redisinsightv2/browser/index-browser.mdx deleted file mode 100644 index 7b286fd7b26..00000000000 --- a/docs/explore/redisinsightv2/browser/index-browser.mdx +++ /dev/null @@ -1,182 +0,0 @@ ---- -id: index-browser -title: Visualize Redis Database keys using the RedisInsight Browser Tool -sidebar_label: Visualize Redis Database keys using the RedisInsight Browser Tool -slug: /explore/redisinsightv2/browser -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -![My 
Image](images/image_redisinsight_browser.png) - -RedisInsight is a 100% free Redis GUI that allows you to visualize, monitor, and optimize your data while developing your applications with Redis. It provides an intuitive and efficient GUI for Redis, allowing developers like you to interact with your databases and manage your data. RedisInsight v2.0 now incorporates a completely new tech stack based on the popular Electron and Elastic UI frameworks. You can run the application locally along with your favorite IDE, and it remains cross-platform, supported on Linux, Windows, and macOS. - -## What's New in the RedisInsight v2.0 Browser Tool? - -RedisInsight Browser lets you explore keys in your Redis server. You can add, edit and delete a key. You can even update the key expiry and copy the key name to be used in different parts of the application. Below is the list of features available under the browser tool: - -- Browse, filter and visualize key-value Redis data structures -- Visual cues per data type -- Quick view of size and TTL in the main browser view -- Ability to filter by pattern and/or data type -- Ability to change the number of keys to scan through during filtering -- CRUD support for Lists, Hashes, Strings, Sets, Sorted Sets -- Search within the data structure (except for Strings) -- CRUD support for RedisJSON - -In order to understand the capabilities of the browser tool, let us take a simple example and demonstrate each of the browser tool's options: - -### Step 1. Install RedisInsight - -To use RedisInsight on a local Mac, you can install Redis Stack by running the following commands: - -First, tap the Redis Stack Homebrew tap and then run `brew install` as shown below: - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -This will install all Redis and Redis Stack binaries. How you run these binaries depends on whether you already have Redis installed on your system. - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -:::info TIP - -If this is the first time you’ve installed Redis on your system, then all Redis Stack binaries will be installed and accessible on your path. On M1 Macs, this assumes that `/opt/homebrew/bin` is in your path. On Intel-based Macs, `/usr/local/bin` should be in the $PATH. - -To check this, run: - -```bash - echo $PATH -``` - -Then, confirm that the output contains `/opt/homebrew/bin` (M1 Mac) or `/usr/local/bin` (Intel Mac). If these directories are not in the output, see the “Existing Redis installation” instructions below. -::: - -### Step 2. Start Redis Stack Server - -You can now start Redis Stack Server as follows: - -```bash - redis-stack-server -``` - -### Existing Redis installation - -If you have an existing Redis installation on your system, then you’ll need to modify your path to ensure that you’re using the latest Redis Stack binaries. - -Open the file `~/.bashrc` or `~/.zshrc` (depending on your shell), and add the following line. - -```bash - export PATH=/usr/local/Caskroom/redis-stack-server//bin:$PATH -``` - -Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 3. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 4. Enter Redis database details - -Add the local Redis database endpoint and port. 
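Before saving the connection, you can optionally verify that the endpoint is reachable from your machine. A quick sanity check, assuming the default local Redis Stack server on 127.0.0.1:6379:

```bash
 redis-cli -h 127.0.0.1 -p 6379 ping
```

A `PONG` reply confirms the server is running. If your database requires a password, pass it to `redis-cli` with the `-a` option and enter the same password in the RedisInsight form.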
- -![access redisinsight](images/testredis1.png) - -### Step 5: Open "Browser Tool" - -Click on the "Key" icon on the left sidebar to open up the browser tool. - -![alt_text](images/image_browser.png) - -### Step 6: Importing keys - -Let us import a user database (6k keys). This dataset contains users stored as Redis Hashes. - -### - -**Users** - -The user hashes contain the following fields: - -- `user:id` : The key of the hash. -- `first_name` : First Name. -- `last_name` : Last name. -- `email` : email address. -- `gender` : Gender (male/female). -- `ip_address` : IP address. -- `country` : Country Name. -- `country_code` : Country Code. -- `city` : City of the user. -- `longitude` : Longitude of the user. -- `latitude` : Latitude of the user. -- `last_login` : Epoch time of the last login. - -### Step 7: Cloning the repository - -Open up the CLI terminal and run the following command: - -```bash - git clone https://github.com/redis-developer/redis-datasets - cd redis-datasets/user-database -``` - -### Importing the user database: - -```bash - redis-cli -h localhost -p 6379 < ./import_users.redis -``` - -Refresh the keys view by clicking as shown below: - -![alt_text](images/image_browser1.png) - -You can get a real-time view of the data in your Redis database as shown below: - -Select any key in the keys view and the key's value gets displayed in the right hand side that includes Fields and values. - -### Step 8. Modifying a key - -![alt_text](images/image_browser31.png) - -Enter key name, field and value. - -### Step 9: Using CLI - -RedisInsight CLI lets you run commands against a Redis server. You don’t need to remember the syntax - the integrated help shows you all the arguments and validates your command as you type. - -![alt_text](images/image_browser4.png) - -## Further References - -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) - - diff --git a/docs/explore/redisinsightv2/browser/launchpad.png b/docs/explore/redisinsightv2/browser/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsightv2/browser/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/add_database.png b/docs/explore/redisinsightv2/getting-started/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/add_database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image1.png b/docs/explore/redisinsightv2/getting-started/images/image1.png deleted file mode 100644 index 87763c959c7..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image10.png b/docs/explore/redisinsightv2/getting-started/images/image10.png deleted file mode 100644 index a220a050cb0..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image11.png b/docs/explore/redisinsightv2/getting-started/images/image11.png deleted file mode 100644 index 63cbc16b24e..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image11.png and /dev/null differ diff --git 
a/docs/explore/redisinsightv2/getting-started/images/image12.png b/docs/explore/redisinsightv2/getting-started/images/image12.png deleted file mode 100644 index 4ad75db1ceb..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image13.png b/docs/explore/redisinsightv2/getting-started/images/image13.png deleted file mode 100644 index 7400ea91b5d..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image16.png b/docs/explore/redisinsightv2/getting-started/images/image16.png deleted file mode 100644 index 12b61b6ee7a..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image17.png b/docs/explore/redisinsightv2/getting-started/images/image17.png deleted file mode 100644 index 80241a5b410..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image2.png b/docs/explore/redisinsightv2/getting-started/images/image2.png deleted file mode 100644 index 10ca8f542be..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image21.png b/docs/explore/redisinsightv2/getting-started/images/image21.png deleted file mode 100644 index ca31e6d0fce..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image21.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image3.png b/docs/explore/redisinsightv2/getting-started/images/image3.png deleted file mode 100644 index 17ccd420901..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image31.png b/docs/explore/redisinsightv2/getting-started/images/image31.png deleted file mode 100644 index abd31b1ff0f..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image31.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image4.png b/docs/explore/redisinsightv2/getting-started/images/image4.png deleted file mode 100644 index 9a77d9b081f..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image41.png b/docs/explore/redisinsightv2/getting-started/images/image41.png deleted file mode 100644 index 147d375c245..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image41.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image5.png b/docs/explore/redisinsightv2/getting-started/images/image5.png deleted file mode 100644 index 525b9faf4ae..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image6.png b/docs/explore/redisinsightv2/getting-started/images/image6.png deleted file mode 100644 index 6373de1f1a8..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image6.png and /dev/null differ diff --git 
a/docs/explore/redisinsightv2/getting-started/images/image61.png b/docs/explore/redisinsightv2/getting-started/images/image61.png deleted file mode 100644 index 86e53d336f0..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image61.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image7.png b/docs/explore/redisinsightv2/getting-started/images/image7.png deleted file mode 100644 index 96751df56c2..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image8.png b/docs/explore/redisinsightv2/getting-started/images/image8.png deleted file mode 100644 index c27c74bff6e..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image9.png b/docs/explore/redisinsightv2/getting-started/images/image9.png deleted file mode 100644 index 69cdf09283f..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_3.png b/docs/explore/redisinsightv2/getting-started/images/image_3.png deleted file mode 100644 index 2c3b1dcf9e2..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_61.png b/docs/explore/redisinsightv2/getting-started/images/image_61.png deleted file mode 100644 index b7c972ab095..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_61.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_71.png b/docs/explore/redisinsightv2/getting-started/images/image_71.png deleted file mode 100644 index a4f65ea3ec6..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_71.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_appearance.png b/docs/explore/redisinsightv2/getting-started/images/image_appearance.png deleted file mode 100644 index 2171649d91c..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_appearance.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_db.png b/docs/explore/redisinsightv2/getting-started/images/image_db.png deleted file mode 100644 index eb8b94c4003..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_db.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_hash.png b/docs/explore/redisinsightv2/getting-started/images/image_hash.png deleted file mode 100644 index f85379575ba..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_hash.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/image_windows.png b/docs/explore/redisinsightv2/getting-started/images/image_windows.png deleted file mode 100644 index a20c6d7c648..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/image_windows.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/redisinsight.png b/docs/explore/redisinsightv2/getting-started/images/redisinsight.png deleted file mode 100644 index bbd8f5f9790..00000000000 Binary files 
a/docs/explore/redisinsightv2/getting-started/images/redisinsight.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis1.png b/docs/explore/redisinsightv2/getting-started/images/testredis1.png deleted file mode 100644 index 88bfdf751b5..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis2.png b/docs/explore/redisinsightv2/getting-started/images/testredis2.png deleted file mode 100644 index 712038efad5..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis3.png b/docs/explore/redisinsightv2/getting-started/images/testredis3.png deleted file mode 100644 index c366b3b8ca6..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis4.png b/docs/explore/redisinsightv2/getting-started/images/testredis4.png deleted file mode 100644 index 15cbdccf451..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis5.png b/docs/explore/redisinsightv2/getting-started/images/testredis5.png deleted file mode 100644 index 00e05e4c408..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/images/testredis6.png b/docs/explore/redisinsightv2/getting-started/images/testredis6.png deleted file mode 100644 index dce00ef0632..00000000000 Binary files a/docs/explore/redisinsightv2/getting-started/images/testredis6.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/getting-started/index-gettingstarted.mdx b/docs/explore/redisinsightv2/getting-started/index-gettingstarted.mdx deleted file mode 100644 index 55b7ba97cac..00000000000 --- a/docs/explore/redisinsightv2/getting-started/index-gettingstarted.mdx +++ /dev/null @@ -1,356 +0,0 @@ ---- -id: index-gettingstarted -title: Getting Started with RedisInsight -sidebar_label: Getting Started with RedisInsight -slug: /explore/redisinsightv2/getting-started -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -![My Image](images/redisinsight.png) - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -## What's New in RedisInsight v2.0? - -RedisInsight v2.0 is a complete product rewrite based on a new tech stack comprising of [Electron](https://www.electronjs.org/), [Monaco Editor](https://microsoft.github.io/monaco-editor/) and [NodeJS](https://nodejs.org). 
This version carries over a number of must-have and most-used capabilities from previous releases and adds several new differentiating features. You can run the application locally alongside your favorite IDE, and it remains cross-platform, supported on Linux, Windows, and MacOS. - -
- -
- -Starting with the RedisInsight v2.0 release, the code is open source and publicly available on [GitHub](https://github.com/redisinsight/redisinsight). Below is the list of new features introduced with this latest release: - -- Workbench - An advanced command line interface with intelligent command auto-complete and complex data visualizations -- Ability to write and render your own data visualizations within Workbench -- Built-in click-through Redis Guides available -- Support for Light and Dark themes -- Enhanced user experience with Browser - -## Getting Started - - - - -## Using MacOS - -To install RedisInsight on MacOS, the easiest way is to install Redis Stack. -Make sure that you have Homebrew installed before starting on the installation instructions below. - -### Step 1. Install Redis Stack using Homebrew - -First, tap the Redis Stack Homebrew tap and then run `brew install` as shown below: - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -This will install all Redis and Redis Stack binaries. How you run these binaries depends on whether you already have Redis installed on your system. - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -:::info TIP - -If this is the first time you’ve installed Redis on your system, then all Redis Stack binaries will be installed and accessible on your path. On M1 Macs, this assumes that `/opt/homebrew/bin` is in your path. On Intel-based Macs, `/usr/local/bin` should be in the path. - -To check this, run: - -```bash - echo $PATH -``` - -Then, confirm that the output contains `/opt/homebrew/bin` (M1 Mac) or `/usr/local/bin` (Intel Mac). If these directories are not in the output, see the “Existing Redis installation” instructions below. -::: - -### Start Redis Stack Server - -You can now start Redis Stack Server as follows: - -```bash - redis-stack-server -``` - -### Existing Redis installation - -If you have an existing Redis installation on your system, then you’ll need to modify your path to ensure that you’re using the latest Redis Stack binaries. - -Open the file `~/.bashrc` or `~/.zshrc` (depending on your shell), and add the following line, substituting the installed version for `<version>`: - -```bash - export PATH=/usr/local/Caskroom/redis-stack-server/<version>/bin:$PATH
``` - -Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 2. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 3. Enter Redis database details - -Add the local Redis database endpoint and port. - -![access redisinsight](images/testredis1.png) - -### Step 4. Redis for time series - -Redis Stack provides you with a native time series data structure. Let's see how a time series might be useful in our bike shop. - -As we have multiple physical shops alongside our online shop, it could be helpful to have an overview of the sales volume. We will create one time series per shop, tracking the total amount of all sales. In addition, we will mark each time series with the appropriate region label, east or west. This representation will allow us to easily query bike sales performance for a certain time period, per shop, per region, or across all shops.
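To make that last claim concrete, here is a minimal sketch of the kind of query this layout enables. It assumes the per-shop series and `region` labels created in the next step; the sample value (100) and the one-hour bucket (3600000 ms) are arbitrary choices for illustration:

```bash
 TS.ADD bike_sales_1 * 100
 TS.MRANGE - + AGGREGATION sum 3600000 FILTER region=east
```

The first command records a sale of 100 in shop 1 at the current timestamp; the second returns hourly sales totals for every shop labelled `region=east`, without having to name the individual series.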
Click the "Guides" icon (just below the "Key" icon) in the left sidebar and choose "Redis for the time series" for this demonstration. - -![redis for timeseries](images/testredis2.png) - -### Step 5. Create time series per shop - -```bash - TS.CREATE bike_sales_1 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_2 DUPLICATE_POLICY SUM LABELS region east compacted no - TS.CREATE bike_sales_3 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_4 DUPLICATE_POLICY SUM LABELS region west compacted no - TS.CREATE bike_sales_5 DUPLICATE_POLICY SUM LABELS region west compacted no -``` - -As shown in the commands above, we make the shop id (1,2,3,4,5) a part of the time series name. You might also notice the `DUPLICATE_POLICY SUM` argument; this describes what should be done when two events in the same time series share the same timestamp: in this case, it would mean that two sales happened at exactly the same time, so the resulting value should be the sum of the two sales amounts. - -Since the metrics are collected with a millisecond timestamp, we can compact our time series into sales per hour: - -![create time series per shop](images/testredis3.png) - -### Step 6. Running the query - -![execute the query](images/testredis4.png) - -### Step 7. Time series compaction - -RedisTimeSeries supports downsampling with the following aggregations: avg, sum, min, max, range, count, first and last. If you want to keep all of your raw data points indefinitely, your data set grows linearly over time. However, if your use case allows you to have less fine-grained data further back in time, downsampling can be applied. This allows you to keep fewer historical data points by aggregating raw data for a given time window using a given aggregation function. - -#### Example: - -``` - TS.CREATERULE bike_sales_5 bike_sales_5_per_day AGGREGATION sum 86400000 -``` - -![time series compaction](images/testredis6.png) - -### Overview of RedisInsight Workbench - -With the new RedisInsight v2.0, a Workbench has been introduced. Workbench is an advanced command-line interface that lets you run commands against your Redis server. The Workbench editor allows comments, multi-line formatting and multi-command execution. It provides intelligent Redis command auto-completion and syntax highlighting, with support for RediSearch, RedisJSON, RedisGraph, RedisTimeSeries, RedisGears, RedisAI and RedisBloom, and it can render custom data visualizations per Redis command using externally developed plugins. - -You can find Workbench in the left sidebar of the RedisInsight dashboard UI. It displays built-in click-through guides for Redis capabilities. A number of metrics are also always on display within the database workspace and are updated every 5 seconds: CPU, number of keys, commands/sec, network input, network output, total memory, and number of connected clients. - -![My Image](images/image13.png) - -Check out the reference section to learn more about the new RedisInsight v2.0 features. - -### Accessing the CLI - -The new RedisInsight v2.0 comes with a command-line interface with enhanced type-ahead command help. It includes an embedded command helper where you can filter and search for Redis commands. Click the "CLI" option to open the CLI window: - -![My Image](images/image16.png) - -Try executing Redis commands as shown below: - -![My Image](images/image17.png) - - - - - -## Using Linux - -### Step 1. 
Download RedisInsight - -To use RedisInsight on your Linux machine, you can download it directly from the official Redis website: - -Open [this](https://redis.com/redis-enterprise/redis-insight/#insight-form) link to open up a form that allows you to select the operating system of your choice. - -![My Image](images/image21.png) - -Fill out the rest of the form and click “Download”. Please note that the package is based on AppImage. The AppImage file is a compressed image which is temporarily mounted to allow access to the program, but not having to extract the program or modify the underlying system. - -Package Name: RedisInsight-preview-linux.AppImage - -```bash - file RedisInsight-preview-linux.AppImage - RedisInsight-preview-linux.AppImage: ELF 64-bit LSB executable, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/ld-linux-x86-64.so.2, for GNU/Linux 2.6.18, stripped -``` - -### Step 2. Install RedisInsight - -Open a terminal and navigate to the folder containing the downloaded file. - -Make your downloaded file into an executable. - -``` -chmod a+x RedisInsight-preview-linux.AppImage -``` - -### Step 3. Start RedisInsight. - -Run the below command to open up RedisInsight dashboard. - -``` -./RedisInsight-preview-linux.AppImage -``` - -### Step 4. Changing the appearance - -RedisInsight v2.x allows you to specify the color theme of your choice. Click on "Settings" and change the appearance from "Dark Theme" to "Light Theme" as shown in the image below: - -![My Image](images/image_appearance.png) - -### Step 5. Connect to Redis Database - -There are multiple ways you can connect to a Redis database - either by creating a new Redis Enterprise Cloud database or connecting to an existing database. To connect to Redis Enterprise Cloud, choose the "Create a Free on Redis Cloud" option. - -![My Image](images/image31.png) - -Once clicked, it will redirect to [the link](https://redis.com/try-free/?utm_source=redis&utm_medium=app&utm_campaign=redisinsight) where you will need to complete the form. - -![My Image](images/image41.png) - -You can follow [this link](/create/rediscloud) to create a New Redis Enterprise Cloud database. -In case you have an existing Redis database, follow the below steps. - -Assuming that you already have Redis database up and running locally, proceed to the next step to select "ADD REDIS DATABASE" - -### Step 6. Add Redis database - -Enter the requested details, including Host (endpoint), Port, and Alias in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](images/image_61.png) - -Once added, you will see the database added as shown below: - -![My Image](images/image_db.png) - -### Step 7. Adding a New Key - -Select the "Key" icon on the left sidebar of RedisInsight UI and click "+Key" to add a new key. - -![My Image](images/image_71.png) - -Once added, the dashboard UI shows the hash key details. - -![My Image](images/image11.png) - -### Step 8. Accessing the Workbench - -With the new RedisInsight v2.0, a Workbench has been introduced. Workbench is basically an advanced command-line interface that lets you run commands against your Redis server. Workbench editor allows comments, multi-line formatting and multi-command execution. It is an Intelligent Redis command auto-complete and syntax highlighting with support for RediSearch, RedisJSON, RedisGraph, RedisTimeSeries, RedisGears, RedisAI, RedisBloom. It allows rendering custom data visualization per Redis command using externally developed plugins. 
- -You can locate the workbench on the left sidebar of RedisInsight dashboard UI. It displays a built-in click-through guides for Redis capabilities. You can also see a number of metrics always on display within the database workspace. These metrics get updated every 5 seconds. The metrics include CPU, number of keys, commands/sec, network input, network output, total memory, number of connected clients. - -![My Image](images/image13.png) - -### Step 9. Accessing the CLI - -The new RedisInsight v2.0 comes with a command-line interface with enhanced type-ahead command help. It includes an embedded command helper where you can filter and search for Redis commands. Click on "CLI" option to open CLI window: - -![My Image](images/image16.png) - -Try executing Redis commands as shown below: - -![My Image](images/image17.png) - - - - - -## Using Windows - -### Step 1. Download RedisInsight - -To install RedisInsight on Windows, you need to first download the RedisInsight windows bits. - -Open [this](https://redis.com/redis-enterprise/redis-insight/#insight-form) link to open up a form that allows you to select the operating system of your choice. - -![My Image](images/image_windows.png) - -### Step 2. Install RedisInsight - -Once you download the bits, double-click the file 'RedisInsight-preview-win-installer.exe' to install RedisInsight in your Windows desktop. - -### Step 3. Accessing RedisInsight - -Double-click on RedisInsight icon to access RedisInsight. - -### Step 4. Changing the Theme - -RedisInsight v2.x allows you to specify the color theme of your choice. Click on "Settings" and change the appearance from "Dark Theme" to "Light Theme" as shown in the image below: - -![My Image](images/image_appearance.png) - -### Step 5. Add a Redis Database - -Enter the requested details, including Host (endpoint), Port, and Alias in the form, as shown below. You can skip username for now. Then click “ADD REDIS DATABASE”: - -![My Image](images/image_61.png) - -Once added, you will see the database added as shown below: - -![My Image](images/image_db.png) - -### Step 6. Adding a New Key - -Select the "Key" icon on the left sidebar of RedisInsight UI and click "+Key" to add a new key. - -![My Image](images/image_71.png) - -Once added, the dashboard UI shows the hash key details. - -![My Image](images/image11.png) - -### Step 7. Accessing the Workbench - -With the new RedisInsight v2.0, a Workbench has been introduced. Workbench is basically an advanced command-line interface that lets you run commands against your Redis server. Workbench editor allows comments, multi-line formatting and multi-command execution. It is an Intelligent Redis command auto-complete and syntax highlighting with support for RediSearch, RedisJSON, RedisGraph, RedisTimeSeries, RedisGears, RedisAI, RedisBloom. It allows rendering custom data visualization per Redis command using externally developed plugins - -You can locate the workbench on the left sidebar of RedisInsight dashboard UI. It displays a built-in click-through guides for Redis capabilities. You can also see a number of metrics always on display within the database workspace. These metrics get updated every 5 seconds. The metrics include CPU, number of keys, commands/sec, network input, network output, total memory, number of connected clients. - -![My Image](images/image13.png) - -### Step 8. Accessing the CLI - -The new RedisInsight v2.0 comes with a command-line interface with enhanced type-ahead command help. 
It includes an embedded command helper where you can filter and search for Redis commands. Click the "CLI" option to open the CLI window: - -![My Image](images/image16.png) - -Try executing Redis commands as shown below: - -![My Image](images/image17.png) - - - - -RedisInsight allows you to browse, filter and visualize key-value Redis data structures. It supports CRUD operations for Lists, Hashes, Strings, Sets, Sorted Sets, and more. [In our next tutorial](/explore/redisinsightv2/browser), we will explore the browser tool in more detail. - -### References - -- [RedisInsight v2.0 Release Blog](https://redis.com/blog/introducing-redisinsight-2/) -- [RedisInsight v2.0 Release Notes](https://docs.redis.com/latest/ri/release-notes/v2.0.2/) -- [RedisInsight GitHub Repository](https://github.com/redisinsight/redisinsight) diff --git a/docs/explore/redisinsightv2/index-redisinsightv2.mdx b/docs/explore/redisinsightv2/index-redisinsightv2.mdx deleted file mode 100644 index d7976609ec3..00000000000 --- a/docs/explore/redisinsightv2/index-redisinsightv2.mdx +++ /dev/null @@ -1,62 +0,0 @@ ---- -id: index-redisinsightv2 -title: RedisInsight Developer Hub for Redis Interactive Tutorials -sidebar_label: Overview -slug: /explore/redisinsightv2 ---- - -import RedisCard from '@site/src/theme/RedisCard'; -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; - -
diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_1.png b/docs/explore/redisinsightv2/profiler/images/profiler_1.png deleted file mode 100644 index 940cc807ce4..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_10.png b/docs/explore/redisinsightv2/profiler/images/profiler_10.png deleted file mode 100644 index 40a8d5037be..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_10.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_11.png b/docs/explore/redisinsightv2/profiler/images/profiler_11.png deleted file mode 100644 index 87d250121ac..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_11.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_12.png b/docs/explore/redisinsightv2/profiler/images/profiler_12.png deleted file mode 100644 index 64d0dbec013..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_12.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_2.png b/docs/explore/redisinsightv2/profiler/images/profiler_2.png deleted file mode 100644 index 71efb9e2897..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_3.png b/docs/explore/redisinsightv2/profiler/images/profiler_3.png deleted file mode 100644 index 9eec0df179c..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_4.png b/docs/explore/redisinsightv2/profiler/images/profiler_4.png deleted file mode 100644 index 28bb0b3f2a5..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_5.png b/docs/explore/redisinsightv2/profiler/images/profiler_5.png deleted file mode 100644 index dfab4144224..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_6.png b/docs/explore/redisinsightv2/profiler/images/profiler_6.png deleted file mode 100644 index 48689f89498..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_6.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_7.png b/docs/explore/redisinsightv2/profiler/images/profiler_7.png deleted file mode 100644 index dc27e6a5fb1..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_7.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_8.png b/docs/explore/redisinsightv2/profiler/images/profiler_8.png deleted file mode 100644 index 33c6256e557..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_8.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/images/profiler_9.png b/docs/explore/redisinsightv2/profiler/images/profiler_9.png deleted file mode 100644 index bd61084deb2..00000000000 Binary files a/docs/explore/redisinsightv2/profiler/images/profiler_9.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/profiler/index-profiler.mdx b/docs/explore/redisinsightv2/profiler/index-profiler.mdx deleted 
file mode 100644 index eab7257b3c0..00000000000 --- a/docs/explore/redisinsightv2/profiler/index-profiler.mdx +++ /dev/null @@ -1,233 +0,0 @@ ---- -id: index-profiler -title: RedisInsight Profiler Tool - Analyze Your Redis Commands Using Redis Monitor Command -sidebar_label: RedisInsight Profiler Tool - Analyze Your Redis Commands Using Redis Monitor Command -slug: /explore/redisinsightv2/profiler -authors: [ajeet] --- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -![alt_text](images/profiler_1.png) - -Last week [the maintenance release of RedisInsight Preview 2.0](https://docs.redis.com/latest/ri/release-notes/v2.0.2/) (v2.0.4) was introduced by the RedisInsight Team. RedisInsight v2.0 is a complete product rewrite based on a new tech stack composed of [Electron](https://www.electronjs.org/), [Elastic UI](https://elastic.github.io/eui/#/), [Monaco Editor](https://microsoft.github.io/monaco-editor/), and [Node.js](https://nodejs.org/). This newer preview build added a dedicated RedisInsight Profiler UI for the first time. The profiler uses the MONITOR command to analyze every command sent to the Redis instance in real time. - -RedisInsight Profiler analyzes the Redis commands that are being run on the Redis server in real time. The tool provides you with detailed information about the number of commands processed, commands/second, and number of connected clients. It also gives information about top prefixes, top keys, and top commands. - -It basically runs the Redis MONITOR command and generates a summarized view. MONITOR is a debugging command that streams back every command processed by the Redis server. It can help in understanding what is happening to the database. This command can be used both via redis-cli and via telnet. All the commands sent to the Redis instance are monitored for the duration of the profiling. The ability to see all the requests processed by the server is useful in order to spot bugs in an application, both when using Redis as a database and as a distributed caching system. - -Follow the instructions below to test drive the RedisInsight Profiler tool introduced in the RedisInsight v2.0.4 release: - -### Step 1. Create a Redis database with the RedisTimeSeries module enabled - -![alt_text](images/profiler_2.png) - -Visit [https://developer.redis.com/create/rediscloud](https://developer.redis.com/create/rediscloud) and create a Redis database. [Follow these steps](https://developer.redis.com/howtos/redistimeseries) to enable the RedisTimeSeries module on Redis Enterprise Cloud. - -### Step 2. Create database - -Click “Create Database”. Enter a database name and select the RedisTimeSeries module. - -![alt_text](images/profiler_3.png) - -Once the database is created, you will see the endpoint URL that gets generated. Save it for future reference. - -![alt_text](images/profiler_4.png) - -### Step 3. Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![alt_text](images/profiler_5.png) - -Execute the installer. Once it is installed on your computer, click on the RedisInsight icon to open the tool. - -![alt_text](images/profiler_6.png) - -### Step 4. 
Connect to Redis Enterprise Cloud Database - -![alt_text](images/profiler_7.png) - -![alt_text](images/profiler_8.png) - -As the database is empty, you won’t be able to see any key. - -![alt_text](images/profiler_9.png) - -### Step 5. Execute the script - -Below is the script that creates a time series representing sensor temperature measurements. After you create the time series, you can send temperature measurements. Then you can query the data for a time range on some aggregation rule. - -``` -from redistimeseries.client import Client as RedisTimeSeries -import time -import sys -import site -import datetime -import random - -print(' \n '.join(sys.path)) -redis = RedisTimeSeries(host='redis-16169.c212.ap-south-1-1.ec2.cloud.redislabs.com', port=16169, password='XXXX') - -# redis.flushdb() -key = 'temperature' -def create(key): - print('\n Create new time series: %s' % str(key)) - #redis.create(key,retentionSecs=30,labels={'sensor_id' : 2,'area_id' : 32}) - redis.create(key,retention_msecs=30000,labels={'sensor_id' : 2,'area_id' : 32}) - print('') -def store(key, interval): - print("\n Append new value to time series:\n") - begin_time = int(time.time()) - for i in range(interval): - timestamp = int(time.time()) - value = round(random.uniform(0.0,100.0),2) - timestamp_strftime = datetime.datetime.fromtimestamp(timestamp).strftime('%Y-%m-%d %H:%M:%S') - sys.stdout.write(' %s : %.2f \n' % (timestamp_strftime, value)) - sys.stdout.flush() - #redis.add(key,timestamp,value,retentionSecs=30, labels={'sensor_id' : 2,'area_id' : 32}) - redis.add(key,timestamp,value,retention_msecs=30000, labels={'sensor_id' : 2,'area_id' : 32}) - time.sleep(1) - end_time = int(time.time()-1) - return (begin_time, end_time) -def query(key, begin_time, end_time): - begin_time_datetime = datetime.datetime.fromtimestamp(begin_time).strftime('%Y-%m-%d %H:%M:%S') - end_time_datetime = datetime.datetime.fromtimestamp(end_time).strftime('%Y-%m-%d %H:%M:%S') - print("\n Query time series in range:\n\n %s to %s \n" % (begin_time_datetime, end_time_datetime)) - try: - #for record in redis.range(key,begin_time, end_time,bucketSizeSeconds=1): - for record in redis.range(key,begin_time, end_time,bucket_size_msec=1000): - timestamp = datetime.datetime.fromtimestamp(record[0]).strftime('%Y-%m-%d %H:%M:%S') - value = round(float(record[1]),2) - print(' %s : %.2f ' % (timestamp,value)) - except Exception as e: - print("\n Error: %s" % e) - print('') -def print_info(): - print('\n Query time series info:\n') - for key in redis.keys('*'): - print(' key=%s' % (key.decode('utf8'))) - info = redis.info(key) - sensor = info.labels['sensor_id'] - print(" sensor_id=%s " % str(sensor)) - area = info.labels['area_id'] - print(" area_id=%s " % str(area)) - last_time_stamp_seconds = info.__dict__['lastTimeStamp'] - last_time_stamp = datetime.datetime.fromtimestamp(last_time_stamp_seconds).strftime('%Y-%m-%d %H:%M:%S') - print(" last_time_stamp=%s " % str(last_time_stamp)) - - print('') - -def print_loop(loops): - - for i in range(loops): - - if i == 0: - sys.stdout.write(' ') - - sys.stdout.write('.') - sys.stdout.flush() - time.sleep(1) - - print('') - -create(key) -interval = 10 -begin_time, end_time = store(key,interval) -time.sleep(1) -query(key,begin_time,end_time) -query(key,begin_time+4,end_time-5) -print_info() -print('\n Set expire key: %s' % str(key)) -redis.expire(key, (30)) -loops = 30 -print_loop(loops) -query(key,begin_time,end_time) -time.sleep(1) -interval = 1 -create(key) -begin_time, end_time = store(key,interval) 
-time.sleep(1) -query(key,begin_time,end_time) -time.sleep(1) -print('\n Delete key: %s' % str(key)) -redis.delete(key) -time.sleep(1) - -query(key,begin_time,end_time) - -print('') - -``` - -Results: - -``` -Create new time series: temperature - - - Append new value to time series: - - 2022-02-13 17:52:16 : 36.50 - 2022-02-13 17:52:17 : 84.56 - 2022-02-13 17:52:18 : 25.90 - 2022-02-13 17:52:19 : 29.24 - 2022-02-13 17:52:20 : 35.75 - 2022-02-13 17:52:21 : 78.14 - 2022-02-13 17:52:22 : 28.77 - 2022-02-13 17:52:23 : 26.37 - 2022-02-13 17:52:24 : 74.93 - 2022-02-13 17:52:25 : 46.61 - - Query time series in range: - - 2022-02-13 17:52:16 to 2022-02-13 17:52:25 - - 2022-02-13 17:52:16 : 36.50 - 2022-02-13 17:52:17 : 84.56 - 2022-02-13 17:52:18 : 25.90 - 2022-02-13 17:52:19 : 29.24 - 2022-02-13 17:52:20 : 35.75 - 2022-02-13 17:52:21 : 78.14 - 2022-02-13 17:52:22 : 28.77 - 2022-02-13 17:52:23 : 26.37 - 2022-02-13 17:52:24 : 74.93 - 2022-02-13 17:52:25 : 46.61 - - - Query time series in range: - - 2022-02-13 17:52:20 to 2022-02-13 17:52:20 - - 2022-02-13 17:52:20 : 35.75 - - -``` - -### Step 6. Running Profiler - -The new RedisInsight Browser tool allows you to explore keys in your Redis server. You can add, edit, and delete a key. It also helps you to browse, filter, and visualize key-value Redis data structures. - -Open Browser tool and select TS from the drop-down menu as shown below: - -![alt_text](images/profiler_10.png) - -It will display temperature as a key. Choose the “Profiler” option and click on “Start Profiler.” - -![alt_text](images/profiler_11.png) - -Soon you will be able to see the detailed information about the number of commands processed, commands/second, and number of connected clients. It also gives information about top prefixes, top keys, and top commands. 
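If you want to see the raw stream that the Profiler summarizes, you can also run the MONITOR command yourself from redis-cli. This is a minimal sketch using the same endpoint as the script above; replace the password placeholder with your own, and keep in mind that MONITOR adds overhead on a busy server:

```
 redis-cli -h redis-16169.c212.ap-south-1-1.ec2.cloud.redislabs.com -p 16169 -a <password> monitor
```

Press Ctrl+C to stop streaming. The Profiler performs the same capture for the duration of the profiling session and aggregates it into the summary shown below.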
- -![alt_text](images/profiler_12.png) - -### References: - -- [RedisInsight: The Best Redis GUI](https://redis.com/redis-enterprise/redis-insight/) -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) diff --git a/docs/explore/redisinsightv2/redisearch/.gitignore b/docs/explore/redisinsightv2/redisearch/.gitignore deleted file mode 100644 index b4b8eb20d02..00000000000 --- a/docs/explore/redisinsightv2/redisearch/.gitignore +++ /dev/null @@ -1 +0,0 @@ -*.swo diff --git a/docs/explore/redisinsightv2/redisearch/images/database-connection.png b/docs/explore/redisinsightv2/redisearch/images/database-connection.png deleted file mode 100644 index a4c0a2722fa..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/database-connection.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/database.png b/docs/explore/redisinsightv2/redisearch/images/database.png deleted file mode 100644 index 5c2bb348278..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/hash.png b/docs/explore/redisinsightv2/redisearch/images/hash.png deleted file mode 100644 index 4ab24e64a22..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/hash.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image1.png b/docs/explore/redisinsightv2/redisearch/images/image1.png deleted file mode 100644 index e96ca1fc0e1..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image1.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image10.png b/docs/explore/redisinsightv2/redisearch/images/image10.png deleted file mode 100644 index 936478f68d0..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image10.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image11.png b/docs/explore/redisinsightv2/redisearch/images/image11.png deleted file mode 100644 index 962dabff13f..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image11.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image12.png b/docs/explore/redisinsightv2/redisearch/images/image12.png deleted file mode 100644 index f821a14e58a..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image12.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image13.png b/docs/explore/redisinsightv2/redisearch/images/image13.png deleted file mode 100644 index ed8e1833753..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image13.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image14.png b/docs/explore/redisinsightv2/redisearch/images/image14.png deleted file mode 100644 index 484da482f9a..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image14.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image15.png b/docs/explore/redisinsightv2/redisearch/images/image15.png deleted file mode 100644 index c1353e12c76..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image15.png and /dev/null differ diff --git 
a/docs/explore/redisinsightv2/redisearch/images/image16.png b/docs/explore/redisinsightv2/redisearch/images/image16.png deleted file mode 100644 index a0600a704dc..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image16.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image17.png b/docs/explore/redisinsightv2/redisearch/images/image17.png deleted file mode 100644 index 9c411ed6cc3..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image17.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image18.png b/docs/explore/redisinsightv2/redisearch/images/image18.png deleted file mode 100644 index 479564f3b41..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image18.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image19.png b/docs/explore/redisinsightv2/redisearch/images/image19.png deleted file mode 100644 index f531bd42ac6..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image19.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image2.png b/docs/explore/redisinsightv2/redisearch/images/image2.png deleted file mode 100644 index 809ff4d0f60..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image2.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image20.png b/docs/explore/redisinsightv2/redisearch/images/image20.png deleted file mode 100644 index 713b3cf0e19..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image20.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image21.png b/docs/explore/redisinsightv2/redisearch/images/image21.png deleted file mode 100644 index 42c1c28444e..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image21.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image22.png b/docs/explore/redisinsightv2/redisearch/images/image22.png deleted file mode 100644 index 0e2b9560682..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image22.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image23.png b/docs/explore/redisinsightv2/redisearch/images/image23.png deleted file mode 100644 index 89e1cc1c0c2..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image23.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image3.png b/docs/explore/redisinsightv2/redisearch/images/image3.png deleted file mode 100644 index f6ed510852a..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image3.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image4.png b/docs/explore/redisinsightv2/redisearch/images/image4.png deleted file mode 100644 index feb8371a56f..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image5.png b/docs/explore/redisinsightv2/redisearch/images/image5.png deleted file mode 100644 index 3cade2d1e88..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image5.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image6.png b/docs/explore/redisinsightv2/redisearch/images/image6.png deleted file mode 100644 index 6a101ea6f33..00000000000 Binary files 
a/docs/explore/redisinsightv2/redisearch/images/image6.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image7.png b/docs/explore/redisinsightv2/redisearch/images/image7.png deleted file mode 100644 index d6e435d383e..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image7.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image8.png b/docs/explore/redisinsightv2/redisearch/images/image8.png deleted file mode 100644 index 8112bd39d50..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image8.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/image9.png b/docs/explore/redisinsightv2/redisearch/images/image9.png deleted file mode 100644 index 00f6799f93a..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/image9.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/insert_movies.png b/docs/explore/redisinsightv2/redisearch/images/insert_movies.png deleted file mode 100644 index 7821c6c1796..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/insert_movies.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/rating.png b/docs/explore/redisinsightv2/redisearch/images/rating.png deleted file mode 100644 index 64c75ed9733..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/rating.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/redisearch.png b/docs/explore/redisinsightv2/redisearch/images/redisearch.png deleted file mode 100644 index 680bd46d0a3..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/redisearch.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/redisinsight-connection.png b/docs/explore/redisinsightv2/redisearch/images/redisinsight-connection.png deleted file mode 100644 index b51212727ec..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/redisinsight-connection.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/redisinsight-register.png b/docs/explore/redisinsightv2/redisearch/images/redisinsight-register.png deleted file mode 100644 index 5552d7fbe73..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/redisinsight-register.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/redisinsight4.png b/docs/explore/redisinsightv2/redisearch/images/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/redisinsight4.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/images/redisinsightinstall.png b/docs/explore/redisinsightv2/redisearch/images/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/images/redisinsightinstall.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/redisearch/index-redisearch.mdx b/docs/explore/redisinsightv2/redisearch/index-redisearch.mdx deleted file mode 100644 index fbb8feecc8c..00000000000 --- a/docs/explore/redisinsightv2/redisearch/index-redisearch.mdx +++ /dev/null @@ -1,430 +0,0 @@ ---- -id: index-redisearch -title: Perform Database Search and Analytics using the RediSearch Plugin in RedisInsight v2.0 -sidebar_label: Perform Database Search and Analytics using the RediSearch Plugin in 
RedisInsight v2.0 -slug: /explore/redisinsightv2/redisearch -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -A full-featured pure desktop GUI client, RedisInsight supports RediSearch. [RediSearch](https://oss.redis.com/redisearch/) is a powerful indexing, querying, and full-text search engine for Redis. It is one of the most mature and feature-rich Redis modules. With RedisInsight, the following functionalities are possible: - -![MyImage](images/redisearch.png) - -- Multi-line for building queries -- Ability to submit query with ‘ctrl + enter’ in single line mode -- Better handling of long index names in index selector dropdown -- Supports Aggregation -- Supports Fuzzy logic -- Supports simple and complex conditions -- Sorting -- Pagination -- Counting - -RediSearch allows you to quickly create indexes on datasets (stored as Redis Hashes or, with RedisJSON, as JSON documents), and uses an incremental indexing approach for rapid index creation and deletion. The indexes let you query your data at lightning speed, perform complex aggregations, and filter by properties, numeric ranges, and geographical distance. - -### Step 1. Create a Redis Database - -[Follow this link to create a Redis database using a Docker container](/explore/redismod) that comes with the RediSearch module enabled. - -### Step 2: Download RedisInsight - -To install RedisInsight on your local system, you need to first download the software from the Redis website. - -[Click this link](https://redis.com/redis-enterprise/redis-insight/#insight-form) to access a form that allows you to select the operating system of your choice. - -![My Image](images/redisinsight-register.png) - -Run the installer. -Once the installation completes, you should be able to connect to a Redis database. - -Select "Connect to a Redis database". - -![My Image](images/database-connection.png) - -Enter the requested details, including Name, Host (endpoint), Port, and Password. Then click “ADD REDIS DATABASE”. - -![alt_text](images/database.png) - -### Step 3. Movie Sample Database - -In this section, you will use a simple dataset describing movies, for now, all records are in English. You will learn more about other languages in another tutorial. - -A movie is represented by the following attributes: - -- **`movie_id`** : The unique ID of the movie, internal to this database -- **`title`** : The title of the movie. -- **`plot`** : A summary of the movie. -- **`genre`** : The genre of the movie, for now a movie will only have a single genre. -- **`release_year`** : The year the movie was released as a numerical value. -- **`rating`** : A numeric value representing the public's rating for this movie. -- **`votes`** : Number of votes. -- **`poster`** : Link to the movie poster. -- **`imdb_id`** : id of the movie in the [IMDB](https://imdb.com) database. - -#### Key and Data Structure - -As a Redis developer, one of the first things to look at when building your application is to define the structure of the key and data (data design/data modeling). - -A common strategy for Redis is to use specific patterns when naming keys. For example in this application where the database will probably deal with various business objects: movies, actors, theaters, users, ... 
we can use the following pattern: - -- `business_object:key` - -For example: - -- `movie:001` for the movie with the id 001 -- `user:001` the user with the id 001 - -and for the movie's information you should use a Redis [Hash](https://redis.io/topics/data-types#hashes). - -A Redis Hash allows the application to structure all the movie attributes in individual fields; also RediSearch will index the fields based on the index definition. - -### Step 4. Insert Movies - -It is time now to add some data into your database, let's insert a few movies, using `redis-cli` or [RedisInsight](https://redis.com/redis-enterprise/redis-insight/). - -Once you are connected to your Redis instance run the following commands: - -``` -HSET movie:11002 title "Star Wars: Episode V - The Empire Strikes Back" plot "After the Rebels are brutally overpowered by the Empire on the ice planet Hoth, Luke Skywalker begins Jedi training with Yoda, while his friends are pursued by Darth Vader and a bounty hunter named Boba Fett all over the galaxy." release_year 1980 genre "Action" rating 8.7 votes 1127635 imdb_id tt0080684 -``` - -``` -HSET movie:11003 title "The Godfather" plot "The aging patriarch of an organized crime dynasty transfers control of his clandestine empire to his reluctant son." release_year 1972 genre "Drama" rating 9.2 votes 1563839 imdb_id tt0068646 -``` - -``` -HSET movie:11004 title "Heat" plot "A group of professional bank robbers start to feel the heat from police when they unknowingly leave a clue at their latest heist." release_year 1995 genre "Thriller" rating 8.2 votes 559490 imdb_id tt0113277 -``` - -``` -HSET "movie:11005" title "Star Wars: Episode VI - Return of the Jedi" genre "Action" votes 906260 rating 8.3 release_year 1983 plot "The Rebels dispatch to Endor to destroy the second Empire's Death Star." ibmdb_id "tt0086190" -``` - -Now it is possible to get information from the hash using the movie ID. For example if you want to get the title, and rating execute the following command: - -``` ->> HMGET movie:11002 title rating -``` - -#### Result: - -``` -1) "Star Wars: Episode V - The Empire Strikes Back" -2) "8.7" -``` - -#### Increment the Movie Rating - -You can increment the rating of this movie using: - -``` -HINCRBYFLOAT movie:11002 rating 0.1 -``` - -Here's a quick screenshot of the results shown in RedisInsight: - -![MyImage](images/rating.png) - -But how do you get a movie or list of movies by year of release, rating or title? - -One option, would be to read all the movies, check all fields and then return only matching movies; no need to say that this is a really bad idea. Nevertheless this is where Redis developers often create custom secondary indexes using SET/SORTED SET structures that point back to the movie hash. This needs some heavy design and implementation. - -This is where the RediSearch module can help, and why it was created. - -### Step 5. RediSearch & Indexing - -RediSearch greatly simplifies this by offering a simple and automatic way to create secondary indices on Redis Hashes. (more datastructure will eventually come) - -![Secondary Index](https://github.com/RediSearch/redisearch-getting-started/blob/master/docs/images/secondary-index.png?raw=true) - -Using RediSearch if you want to query on a field, you must first index that field. 
Let's start by indexing the following fields for our movies: - -- Title -- Release Year -- Rating -- Genre - -When creating an index you define: - -- which data you want to index: all _hashes_ with a key starting with `movies` -- which fields in the hashes you want to index using a Schema definition. - -> **_Warning: Do not index all fields_** -> -> Indexes take space in memory, and must be updated when the primary data is updated. So create the index carefully and keep the definition up to date with your needs. - -### Step 6. Create the Index - -``` - FT.CREATE idx:movie ON hash PREFIX 1 "movie:" SCHEMA title TEXT SORTABLE release_year NUMERIC SORTABLE rating NUMERIC SORTABLE genre TAG SORTABLE -``` - -The database contains a few movies, and an index, it is now possible to execute some queries. - -#### Query: All the movies that contains the string "`war`" - -``` -FT.SEARCH idx:movie "war" -``` - -#### Result: - -``` -1) 2 -2) "movie:11005" -3) 1) "title" - 2) "Star Wars: Episode VI - Return of the Jedi" - 3) "votes" - 4) "906260" - 5) "plot" - 6) "The Rebels dispatch to Endor to destroy the second Empire's Death Star." - 7) "rating" - 8) "8.3" - 9) "release_year" - 10) "1983" - 11) "ibmdb_id" - 12) "tt0086190" - 13) "genre" - 14) "Action" -4) "movie:11002" -5) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "votes" - 4) "1127635" - 5) "plot" - 6) "After the Rebels are brutally overpowered by the Empire on the ice planet Hoth, Luke Skywalker begins Jedi training with Yoda, while his friends are pursued by Darth Vader and a bounty hunter named Boba Fett all over the galaxy." - 7) "rating" - 8) "8.8" - 9) "release_year" - 10) "1980" - 11) "genre" - 12) "Action" - 13) "imdb_id" - 14) "tt0080684" -> -``` - -#### Query: Limit the list of fields returned by the query using the RETURN parameter - -The `FT.SEARCH` commands returns a list of results starting with the number of results, then the list of elements (keys & fields). - -``` -FT.SEARCH idx:movie "war" RETURN 2 title release_year -``` - -#### Result: - -``` -1) 2 -2) "movie:11005" -3) 1) "title" - 2) "Star Wars: Episode VI - Return of the Jedi" - 3) "release_year" - 4) "1983" -4) "movie:11002" -5) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "release_year" - 4) "1980" -> -``` - -As you can see the movie _Star Wars: Episode V - The Empire Strikes Back_ is found, even though you used only the word “war” to match “Wars” in the title. This is because the title has been indexed as text, so the field is [tokenized](https://oss.redis.com/redisearch/Escaping/) and [stemmed](https://oss.redis.com/redisearch/Stemming/). - -Later when looking at the query syntax in more detail you will learn more about the search capabilities. - -It is also possible to limit the list of fields returned by the query using the `RETURN` parameter, let's run the same query, and return only the title and release_year. - -#### Query: All the movies that contains the string "war" but NOT the "jedi" one - -Adding the string `-Jedi` (minus) will ask the query engine not to return values that contain `jedi`. - -``` -FT.SEARCH idx:movie "war -Jedi" RETURN 2 title release_year -``` - -#### Result: - -``` -1) 1 -2) "movie:11002" -3) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "release_year" - 4) "1980" -``` - -### Step 7. 
Fuzzy Search - -All the movies that contains the string "gdfather using fuzzy search" - -``` -FT.SEARCH "idx:movie" " %gdfather% " RETURN 2 title release_year -``` - -#### Result: - -``` -1) 1 -2) "movie:11003" -3) 1) "title" - 2) "The Godfather" - 3) "release_year" - 4) "1972" -``` - -#### Query: All Thriller movies - -``` -FT.SEARCH "idx:movie" "@genre:{Thriller}" RETURN 2 title release_year -``` - -#### Result: - -``` -1) 1 -2) "movie:11004" -3) 1) "title" - 2) "Heat" - 3) "release_year" - 4) "1995" -``` - -#### Query: All Thriller or Action movies - -``` -FT.SEARCH "idx:movie" "@genre:{Thriller|Action}" RETURN 2 title release_year -``` - -#### Result: - -``` -1) 3 -2) "movie:11004" -3) 1) "title" - 2) "Heat" - 3) "release_year" - 4) "1995" -4) "movie:11005" -5) 1) "title" - 2) "Star Wars: Episode VI - Return of the Jedi" - 3) "release_year" - 4) "1983" -6) "movie:11002" -7) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "release_year" - 4) "1980" -``` - -#### Query : All the movies released between 1970 and 1980 (included) - -The `FT.SEARCH` syntax has two ways to query numeric fields: - -- using the `FILTER` parameter - -``` -FT.SEARCH "idx:movie" "@genre:{Thriller|Action}" FILTER release_year 1970 1980 RETURN 2 title release_year -``` - -#### Result: - -``` -1) 1 -2) "movie:11002" -3) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "release_year" - 4) "1980" -``` - -### Step 8. Aggregation - -#### Query: Number of movies by year - -``` -FT.AGGREGATE "idx:movie" "*" GROUPBY 1 @release_year REDUCE COUNT 0 AS nb_of_movies -``` - -#### Result: - -``` -1) 4 -2) 1) "release_year" - 2) "1983" - 3) "nb_of_movies" - 4) "1" -3) 1) "release_year" - 2) "1995" - 3) "nb_of_movies" - 4) "1" -4) 1) "release_year" - 2) "1980" - 3) "nb_of_movies" - 4) "1" -5) 1) "release_year" - 2) "1972" - 3) "nb_of_movies" - 4) "1" -``` - -#### Query: Number of movies by year from the most recent to the oldest - -``` -FT.AGGREGATE "idx:movie" "*" GROUPBY 1 @release_year REDUCE COUNT 0 AS nb_of_movies SORTBY 2 @release_year DESC -``` - -#### Result: - -``` -1) 4 -2) 1) "release_year" - 2) "1995" - 3) "nb_of_movies" - 4) "1" -3) 1) "release_year" - 2) "1983" - 3) "nb_of_movies" - 4) "1" -4) 1) "release_year" - 2) "1980" - 3) "nb_of_movies" - 4) "1" -5) 1) "release_year" - 2) "1972" - 3) "nb_of_movies" - 4) "1" - -``` - -### Additional Links - -- [RediSearch Project](https://oss.redis.com/redisearch/) -- [RediSearch Tutorial](/howtos/redisearch) -- [Getting Started with Movie Database](/howtos/moviesdatabase/getting-started) -- [Getting Started with RedisInsight v2.0](/explore/redisinsightv2/getting-started) -- [Visualize Redis Database keys using the RedisInsight Browser Tool](/explore/redisinsightv2/browser) - -## - - diff --git a/docs/explore/redisinsightv2/redisearch/launchpad.png b/docs/explore/redisinsightv2/redisearch/launchpad.png deleted file mode 100644 index 66e7a455f63..00000000000 Binary files a/docs/explore/redisinsightv2/redisearch/launchpad.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/add_database.png b/docs/explore/redisinsightv2/windows/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/add_database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/database_creds.png b/docs/explore/redisinsightv2/windows/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files 
a/docs/explore/redisinsightv2/windows/images/database_creds.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/database_details.png b/docs/explore/redisinsightv2/windows/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/database_details.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/details_database.png b/docs/explore/redisinsightv2/windows/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/details_database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/rediscloud_endpoint.png b/docs/explore/redisinsightv2/windows/images/rediscloud_endpoint.png deleted file mode 100644 index c09c1af4a48..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/rediscloud_endpoint.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-add-database.png b/docs/explore/redisinsightv2/windows/images/redisinsight-add-database.png deleted file mode 100644 index eba08f01075..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-add-database.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-db-added.png b/docs/explore/redisinsightv2/windows/images/redisinsight-db-added.png deleted file mode 100644 index e3d51e6dfa3..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-db-added.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-db-select.png b/docs/explore/redisinsightv2/windows/images/redisinsight-db-select.png deleted file mode 100644 index dedf90e33b6..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-db-select.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-eula.png b/docs/explore/redisinsightv2/windows/images/redisinsight-eula.png deleted file mode 100644 index 3104431eece..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-eula.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-form.png b/docs/explore/redisinsightv2/windows/images/redisinsight-form.png deleted file mode 100644 index 20cc0c44ae8..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-form.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-keys-added.png b/docs/explore/redisinsightv2/windows/images/redisinsight-keys-added.png deleted file mode 100644 index e9030f16726..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-keys-added.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-no-keys.png b/docs/explore/redisinsightv2/windows/images/redisinsight-no-keys.png deleted file mode 100644 index 4bba88009cd..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-no-keys.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight-setup.png b/docs/explore/redisinsightv2/windows/images/redisinsight-setup.png deleted file mode 100644 index 7f07feada81..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight-setup.png and /dev/null differ diff --git 
a/docs/explore/redisinsightv2/windows/images/redisinsight_featured.png b/docs/explore/redisinsightv2/windows/images/redisinsight_featured.png deleted file mode 100644 index 871b7e21bcb..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_featured.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight_featured_image.png b/docs/explore/redisinsightv2/windows/images/redisinsight_featured_image.png deleted file mode 100644 index be242b9fdee..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_featured_image.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight_flushdb.png b/docs/explore/redisinsightv2/windows/images/redisinsight_flushdb.png deleted file mode 100644 index 162ab17ed20..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_flushdb.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight_hashes.png b/docs/explore/redisinsightv2/windows/images/redisinsight_hashes.png deleted file mode 100644 index 5f785a0781a..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_hashes.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight_modify_keys.png b/docs/explore/redisinsightv2/windows/images/redisinsight_modify_keys.png deleted file mode 100644 index 65d3a8105eb..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_modify_keys.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/redisinsight_photo.png b/docs/explore/redisinsightv2/windows/images/redisinsight_photo.png deleted file mode 100644 index 42eec3bf3df..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/redisinsight_photo.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/images/select_cloud_vendor.png b/docs/explore/redisinsightv2/windows/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/explore/redisinsightv2/windows/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/explore/redisinsightv2/windows/index-windows.mdx b/docs/explore/redisinsightv2/windows/index-windows.mdx deleted file mode 100644 index 6949202c2c9..00000000000 --- a/docs/explore/redisinsightv2/windows/index-windows.mdx +++ /dev/null @@ -1,177 +0,0 @@ ---- -id: index-windows -title: How to run RedisInsight on Windows -sidebar_label: How to run RedisInsight on Windows -slug: /explore/redisinsightv2/windows -authors: [ajeet] ---- - -RedisInsight is a visual tool that provides capabilities to design, develop and optimize your Redis application. It is a 100% free Redis GUI that allows developers like you to interact with your databases and manage your data. - -RedisInsight v2.0 incorporates a completely new tech stack based on the popular Electron and Elastic UI frameworks. You can run the application locally along with your favorite IDE, and it remains cross-platform, supported on Linux, Windows, and MacOS. RedisInsight Browser lets you explore keys in your Redis server. You can add, edit and delete a key. You can even update the key expiry and copy the key name to be used in different parts of the application. - -## RedisInsight Windows Installer - -The RedisInsight desktop client installer for Windows is just 70 MB in size. It allows you to download and use the RedisInsight GUI locally. 
The desktop client is supported on Windows operating systems and works with all variants of Redis. RedisInsight should install and run on a fresh Windows system. - -:::info INFO -There is no need to install the .NET framework in order to install RedisInsight on Windows. -::: - -## Getting Started - -- Step 1. Create a free Cloud account -- Step 2. Create a database -- Step 3. Verify the database details -- Step 4. Install RedisInsight -- Step 5. Connect to the Redis database -- Step 6. Use Browser Tool -- Step 7. Clone the repository -- Step 8. Import user database keys -- Step 9. Modify a Redis key -- Step 10. Cleaning up - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create a database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Install RedisInsight - -Click on the RedisInsight executable (.exe file) and install it in your system. - -![setup redisinsight](images/redisinsight-setup.png) - -Once the RedisInsight software is installed, click on its icon to open the RedisInsight application. It will display the End-User License Agreement and Privacy Settings. Enable Analytics and Encrypt sensitive information as per your preference. - -![accept redisinsight licence](images/redisinsight-eula.png) - -### Step 5. Connect to the Redis Database - -Enter the requested details, including Host (endpoint), Port, and Alias in the form, as shown below. You can use "default" as the username for now. Then click “ADD REDIS DATABASE”. - -![adding redis database](images/database_creds.png) - -Once added, you will see the database name listed as shown below: - -![listing the redis database](images/database_details.png) - -### Step 6. Use "Browser Tool" - -Click on the "Key" icon on the left sidebar to open up the browser tool. - -![redis database with no keys](images/redisinsight-no-keys.png) - -### Step 5. Overview of User database keys - -Let us import a user database (6k keys). This dataset contains users stored as Redis Hashes. - -### - -**Users** - -The user hashes contain the following fields: - -- `user:id` : The key of the hash. -- `first_name` : First Name. -- `last_name` : Last name. -- `email` : email address. -- `gender` : Gender (male/female). -- `ip_address` : IP address. -- `country` : Country Name. -- `country_code` : Country Code. -- `city` : City of the user. -- `longitude` : Longitude of the user. -- `latitude` : Latitude of the user. -- `last_login` : Epoch time of the last login. - -### Step 6. 
Clone the repository - -Open up the CLI terminal and run the following commands: - -```bash - git clone https://github.com/redis-developer/redis-datasets - cd redis-datasets/user-database -``` - -### Step 7. Import the user database keys - -Open up the CLI terminal and run the following command. - -:::note NOTE -You will need a hostname, port and password to run this for a cloud database. -::: - -```bash - redis-cli -h redis-18386.c110-qa.us-east-1-1.ec2.cloud.redislabs.com -p 18386 -a < ./import_users.redis -``` - -Refresh the keys view by clicking as shown below: - -![listing the keys](images/redisinsight-keys-added.png) - -You can get a real-time view of the data in your Redis database as shown below: - -Select any key in the keys view and the key's value gets displayed in the right hand side that includes fields and values. - -![hash keys listed](images/redisinsight_hashes.png) - -### Step 8. Modify a key - -The RedisInsight browser tool allows you to modify the data instantly. -Select any key and change the values as shown in the following screenshot - -![modify the redis keys](images/redisinsight_modify_keys.png) - -### Step 9. Cleaning up - -Run the following command to clean up all the Redis keys: - -![flushing the database](images/redisinsight_flushdb.png) - -## Further References - -- [How to Install Redis on Windows](/create/windows) -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Explore Redis keys using RedisInsight browser tool](/explore/redisinsight/browser) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) - - diff --git a/docs/explore/redismod/index-redismod.mdx b/docs/explore/redismod/index-redismod.mdx deleted file mode 100644 index d2927a292c8..00000000000 --- a/docs/explore/redismod/index-redismod.mdx +++ /dev/null @@ -1,121 +0,0 @@ ---- -id: index-redismod -title: Redis Modules in a Docker Container -sidebar_label: RedisMod -slug: /explore/redismod -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -This simple container image bundles together the latest stable releases of Redis and select Redis modules. -This image is based on the official image of Redis from Docker. By default, the container starts with Redis' default configuration and all included modules loaded. - -### Modules included in the container - -- [RediSearch](https://redis.com/modules/redis-search/): a full-featured search engine -- [RedisGraph](https://redis.com/modules/redis-graph/): a graph database -- [RedisTimeSeries](https://redis.com/modules/redis-timeseries/): a timeseries database -- [RedisAI](https://redis.com/modules/redis-ai/): a tensor and deep learning model server -- [RedisJSON](https://redis.com/modules/redis-json/): a native JSON data type -- [RedisBloom](https://redis.com/modules/redis-bloom/): native Bloom and Cuckoo Filter data types -- [RedisGears](https://redis.com/modules/redis-gears/): a dynamic execution framework - -### Step 1. Install Docker - -To use RedisMod on a local Mac, the first step is to install Docker for your operating system. -Run the docker version command in a terminal window to make sure that docker is installed correctly. - -```bash - docker version -``` - -It should display Docker Engine Server and Client version successfully. - -### Step 2. Running Redismod Docker container - -```bash - docker run -d -p 6379:6379 redislabs/redismod -``` - -### Step 3. 
Connect to Redis database - -You can either use [redis-cli](/create/homebrew) or use [RedisInsight](/explore/redisinsight/getting-started) to connect to Redis database. -Let's try using redis-cli as shown below: - -```bash - redis-cli -``` - -### Step 4. Verify if all the Redis modules are getting loaded - -```bash - $ redis-cli - 127.0.0.1:6379> info modules - # Modules - module:name=rg,ver=10006,api=1,filters=0,usedby=[],using=[ai],options=[] - module:name=ai,ver=10002,api=1,filters=0,usedby=[rg],using=[],options=[] - module:name=timeseries,ver=10408,api=1,filters=0,usedby=[],using=[],options=[] - module:name=bf,ver=20205,api=1,filters=0,usedby=[],using=[],options=[] - module:name=graph,ver=20402,api=1,filters=0,usedby=[],using=[],options=[] - module:name=ReJSON,ver=10007,api=1,filters=0,usedby=[],using=[],options=[] - module:name=search,ver=20006,api=1,filters=0,usedby=[],using=[],options=[] -``` - -### Step 5. Testing Redis Modules - -Let us test drive RediSearch modules as discussed below in detail. - -#### Insert data into RediSearch - -We are now ready to insert some data. This example uses movies data stored as Redis Hashes, so let’s insert a couple of movies: - -```bash - HSET movies:11002 title "Star Wars: Episode V - The Empire Strikes Back" plot "Luke Skywalker begins Jedi training with Yoda." release_year 1980 genre "Action" - rating 8.7 votes 1127635 -``` - -```bash - HSET movies:11003 title "The Godfather" plot "The aging patriarch of an organized crime dynasty transfers control of his empire to his son." release_year 1972 - genre "Drama" rating 9.2 votes 1563839 -``` - -Your Redis database now contains two Hashes. It is simple to retrieve information using the HMGET command, if you know the key of the movies (movies:11002): - -``` - HMGET movies:11002 title rating -``` - -#### Create an index in RediSearch - -To be able to query the hashes on the field for title, say, or genre, you must first create an index. To create an index, you must define a schema to list the fields and their types that are indexed, and that you can use in your queries. - -Use the FT.CREATE command to create an index, as shown here: - -``` - FT.CREATE idx:movies ON hash PREFIX 1 "movies:" SCHEMA title TEXT SORTABLE release_year NUMERIC SORTABLE rating NUMERIC SORTABLE genre TAG SORTABLE -``` - -#### Search the movies in the RediSearch index - -You can now use the FT.SEARCH to search your database, for example, to search all movies sorted by release year: - -```bash - FT.SEARCH idx:movies * SORTBY release_year ASC RETURN 2 title release_year -``` - -To test drive rest of Redis modules, please visit the links mentioned under "References" section. 
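Before moving on, you can optionally run a quick smoke test of a few of the other bundled modules straight from redis-cli. The key names and values below are purely illustrative, and the commands assume the default redismod container started above (which loads RedisJSON, RedisTimeSeries, and RedisBloom, as shown in the `info modules` output):

```bash
 # RedisJSON: store and read back a small JSON document (illustrative key name)
 JSON.SET user:1 . '{"name":"Ava","city":"Lisbon"}'
 JSON.GET user:1 .

 # RedisTimeSeries: append a sample (auto-creates the series) and read it back
 TS.ADD sensor:1:temperature * 27.5
 TS.RANGE sensor:1:temperature - +

 # RedisBloom: add an item to a Bloom filter and check membership
 BF.ADD newsletter:subscribers ava@example.com
 BF.EXISTS newsletter:subscribers ava@example.com
```

If the modules are loaded correctly, `JSON.SET` replies with `OK`, `TS.ADD` returns the timestamp of the new sample, and `BF.EXISTS` returns `1`.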
- -### References - -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) -- [Redismod GITHUB repository](https://github.com/RedisLabsModules/redismod) -- [Connecting to the database using RedisInsight](https://developer.redis.com/explore/redisinsight/) -- [RedisJSON Tutorial](https://developer.redis.com/howtos/redisjson/) -- [RedisTimeSeries Tutorial](https://developer.redis.com/howtos/redistimeseries) -- [RedisGraph Tutorial](https://developer.redis.com/howtos/redisgraph) -- [RedisBloom Tutorial](https://developer.redis.com/howtos/redisbloom) -- [RedisGears Tutorial](https://developer.redis.com/howtos/redisgears) diff --git a/docs/explore/riot/index-riot.mdx b/docs/explore/riot/index-riot.mdx index f25064b80b9..2666eda1074 100644 --- a/docs/explore/riot/index-riot.mdx +++ b/docs/explore/riot/index-riot.mdx @@ -8,13 +8,14 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + Redis Input/Output Tools (RIOT) is a set of import/export command line utilities for Redis: - RIOT Redis: live replication from any Redis database (including AWS Elasticache) to another Redis database. -- RIOT DB: migrate from an RDBMS to Redis, RediSearch, RedisJSON, ... +- RIOT DB: migrate from an RDBMS to Redis diff --git a/docs/explore/riot/index-riot.mdx.orig b/docs/explore/riot/index-riot.mdx.orig deleted file mode 100644 index 6f865c3f711..00000000000 --- a/docs/explore/riot/index-riot.mdx.orig +++ /dev/null @@ -1,388 +0,0 @@ ---- -id: index-riot -title: RIOT -sidebar_label: RIOT -slug: /riot ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - - -Redis Input/Output Tools (RIOT) is a set of import/export command line utilities for Redis: - -- RIOT DB: migrate from an RDBMS to Redis, RediSearch, RedisJSON, ... -- RIOT File: bulk import/export data from/to files. -- RIOT Gen: generate sample Redis datasets for new feature development and proof of concept. -- RIOT Redis: live replication from any Redis database (including AWS Elasticache) to another Redis database. -- RIOT Stream: import/export messages from/to Kafka topics. - - - -![my image](riottool.svg) - - - - - - - - -Most database migration tools available today are offline in nature. Migrating data from AWS ElastiCache to Redis Enterprise Cloud for example means backing up your Elasticache data to an AWS S3 bucket and importing it into Redis Enterprise Cloud using its UI.This implies some downtime and might result in data loss. -Other available techniques include creating point-in-time snapshots of the source Redis server & applying the changes to the destination servers to keep both servers in sync. -It might sound like a good approach but can be challenging when you have to maintain dozens of scripts to implement the migration strategy. - -RIOT Redis is a migration tool that allows for seamless live replication between two Redis databases. - -## 1. Getting Started - - -Download the https://github.com/redis-developer/riot/releases/latest[latest release] and unzip the archive. - -Launch the `bin/riot-redis` script and follow the usage information provided. - -## 2. 
Build and Run - -``` -git clone https://github.com/redis-developer/riot.git -cd riot/riot-redis -./riot-redis -``` - -## 3. Install via Homebrew (macOS) - -``` -brew install jruaux/tap/riot-redis` -``` - -## Usage - -``` -❯ riot-redis -Usage: {app} [OPTIONS] [COMMAND] - --help Show this help message and exit. - -V, --version Print version information and exit. - -q, --quiet Log errors only - -d, --debug Log in debug mode (includes normal stacktrace) - -i, --info Set log level to info -``` - -You can use --help on any subcommand: - -``` -❯ riot-redis --help - -❯ riot-redis import --help - -❯ riot-redis import .. hset --help -``` - -Redis connection options are the same as redis-cli: - -``` - -h, --hostname= Server hostname (default: 127.0.0.1) - -p, --port= Server port (default: 6379) - -s, --socket= Server socket (overrides hostname and port) - --user= Used to send ACL style 'AUTH username pass'. Needs password. - -a, --pass[=] Password to use when connecting to the server - -u, --uri= Server URI - -o, --timeout= Redis command timeout (default: 60) - -n, --db= Database number (default: 0) - -c, --cluster Enable cluster mode - -t, --tls Establish a secure TLS connection - -l, --latency Show latency metrics - -m, --pool= Max pool connections (default: 8) -``` -Redis URI syntax is described here. - -## 4. Example - -Here is an example of a live replication from a source Redis running on localhost and port 6379, to a target Redis running on localhost and port 6380: - -``` -❯ riot-redis -h source -p 6379 replicate --idle-timeout 500 -h target -p 6380 --live -``` - -## 5. Verification - -Once replication is complete RIOT Redis will perform a verification step to compare values and TTLs between source and target databases. The output looks like this: - -``` -OK:1000 V:0 >:0 <:0 T:0 -``` - -- OK: # identical values - -- V: # mismatched values - -- >: # keys only present in source database - -- <: # keys only present in target database - -- T: # keys with TTL difference greater than tolerance - - -## 6. Architecture - -RIOT Redis implements client-side replication using a producer/consumer approach: - -- the producer is connected to the source Redis (e.g. ElastiCache) and iterates over keys to read their corresponding values - -- the consumer is connected to the target Redis (e.g. Redis Enterprise Cloud) and writes the key/value tuples previously created - -1. Key reader: initiates a SCAN and optionally calls SUBSCRIBE to listen for keyspace notifications (live replication). -2. Value reader: takes the keys and calls DUMP and TTL. -3. Key/Value writer: takes key/value/ttl tuples and calls RESTORE and EXPIRE. - -Note: Live replication makes use of keyspace notifications. Make sure the source Redis database has keyspace notifications enabled using notify-keyspace-events = KA in redis.conf or via CONFIG SET. - -Note: The live replication mechanism does not guarantee data consistency. Redis sends keyspace notifications over pub/sub which does not provide guaranteed delivery. It is possible that RIOT Redis can miss some notifications in case of network failures for example. - - - - - - - - -RIOT DB lets you import/export data from relational databases. - -## 1. Getting Started - -Download the [latest release](https://github.com/redis-developer/riot/releases/latest) and unzip the archive. - -Launch the bin/riot-db script and follow the usage information provided. - -## 2. Build and Run - -``` -❯ git clone https://github.com/redis-developer/riot.git -❯ cd riot/riot-db -❯ ./riot-db -``` - -## 3. 
Install via Homebrew (macOS) - -``` -brew install jruaux/tap/riot-db -``` - -## 4. Usage - -``` -❯ riot-db -Usage: riot-db [OPTIONS] [COMMAND] - --help Show this help message and exit. - -V, --version Print version information and exit. - -q, --quiet Log errors only - -d, --debug Log in debug mode (includes normal stacktrace) - -i, --info Set log level to info -``` -You can use --help on any subcommand: - -``` -❯ riot-db --help -❯ riot-db import --help -❯ riot-db import … hset --help -``` - -Redis connection options are the same as redis-cli: - -``` - -h, --hostname= Server hostname (default: 127.0.0.1) - -p, --port= Server port (default: 6379) - -s, --socket= Server socket (overrides hostname and port) - --user= Used to send ACL style 'AUTH username pass'. Needs password. - -a, --pass[=] Password to use when connecting to the server - -u, --uri= Server URI - -o, --timeout= Redis command timeout (default: 60) - -n, --db= Database number (default: 0) - -c, --cluster Enable cluster mode - -t, --tls Establish a secure TLS connection - -l, --latency Show latency metrics - -m, --pool= Max pool connections (default: 8) - -``` - - -## 5. Drivers - -RIOT DB includes drivers for the most common RDBMSs: - -### Oracle - -``` -jdbc:oracle:thin:@myhost:1521:orcl -``` - -### IBM Db2 - -``` -jdbc:db2://host:port/database -``` - -### MS SQL Server - -``` -jdbc:sqlserver://[serverName[\instanceName][:portNumber]][;property=value[;property=value]] -``` - -### MySQL - -``` -jdbc:mysql://[host]:[port][/database][?properties] -``` - -### PostgreSQL - -``` -jdbc:postgresql://host:port/database -``` - -### SQLite - -``` -jdbc:sqlite:sqlite_database_file_path -``` - -For non-included databases you must install the corresponding JDBC driver under the lib directory and modify the RIOT DB CLASSPATH: - -``` -*nix: bin/riot-db → CLASSPATH=$APP_HOME/lib/myjdbc.jar:$APP_HOME/lib/… -Windows: bin{app}.bat → set CLASSPATH=%APP_HOME%\lib\myjdbc.jar;%APP_HOME%\lib\… -``` - -## 6. Import - -Use the import command to import the result set of a SQL statement. - -### Import from PostgreSQL - -``` -❯ riot-db -h localhost -p 6379 import "SELECT * FROM orders" --url jdbc:postgresql://host:port/database --username appuser --password passwd hset --keyspace order --keys order_id -``` - -You can specify one or many Redis commands as targets of the import: - -### Import into hashes - -``` -❯ riot-db import .. set --keyspace blah --keys id -``` - -### Import into hashes and set TTL on the key - -``` -❯ riot-db import .. hset --keyspace blah --keys id expire --keyspace blah --keys id -``` - -### Import into hashes and set TTL and add to a set named myset - -``` -❯ riot-db import .. hset --keyspace blah --keys id expire --keyspace blah --keys id sadd --keyspace myset --members id -``` - -## 7. Processing - -The following processors can be applied to records in that order: - -- Transforms - -- Regular expressions - -- Filters - -## 6. Transform - - -Produce key/value pairs using [Spring Expression Language](https://docs.spring.io/spring/docs/current/spring-framework-reference/core.html#expressions) (SpEL): field1=, field2=, -For example --spel "field1='foo'" generates a field named field1 with always the same value foo. - -Input fields are accessed by name (e.g. field3=field1+field2). 
- -The processor also exposes the following variables that can be called with the # prefix: - -### redis - -Redis connection to issue any command: - -``` ---spel "name=#redis.hgetall('person1').lastName" -``` - -### date - -Date parser/formatter: - -``` ---spel "epoch=#date.parse(mydate).getTime()" -``` - -### index - -Sequence number of the item being generated: - -``` ---spel "id=#index" -``` - -## Regex - -Extract patterns from source fields using regular expressions: - -``` ---regex "name=(?\w+)\/(?\w+)" -``` - -## Filter - -Keep records that match a SpEL boolean expression. - -``` ---filter "value matches '\\d+'" will only keep records where the value field is a series of digits. -``` - -## 7. Export - -### Export to PostgreSQL - -``` -❯ riot-db export "INSERT INTO mytable (id, field1, field2) VALUES (CAST(:id AS SMALLINT), :field1, :field2)" --url jdbc:postgresql://host:port/database --username appuser --password passwd --scan-match "hash:*" --key-regex "hash:(?.*)" -``` - -### Import from PostgreSQL to JSON strings - -``` -❯ riot-db -h localhost -p 6379 import "SELECT * FROM orders" --url jdbc:postgresql://host:port/database --username appuser --password passwd set --keyspace order --keys order_id -``` -This will produce Redis strings that look like this: - -``` -{ - "order_id": 10248, - "customer_id": "VINET", - "employee_id": 5, - "order_date": "1996-07-04", - "required_date": "1996-08-01", - "shipped_date": "1996-07-16", - "ship_via": 3, - "freight": 32.38, - "ship_name": "Vins et alcools Chevalier", - "ship_address": "59 rue de l'Abbaye", - "ship_city": "Reims", - "ship_postal_code": "51100", - "ship_country": "France" -} -``` - - - - - diff --git a/docs/explore/what-is-redis/index-what-is-redis.mdx b/docs/explore/what-is-redis/index-what-is-redis.mdx deleted file mode 100644 index 33cad865dfd..00000000000 --- a/docs/explore/what-is-redis/index-what-is-redis.mdx +++ /dev/null @@ -1,152 +0,0 @@ ---- -id: index-what-is-redis -title: 'Redis: In-memory database. How it works and Why you should use it' -description: Redis is an open source, in-memory data store that delivers sub-millisecond response times and is used as a primary database, cache, message broker, and queue. -sidebar_label: What is Redis? -slug: /explore/what-is-redis -authors: [will] -keywords: - - Redis - - in-memory database - - caching - - session store - - leaderboard - - pub/sub - - real-time apps - - gaming - - primary database ---- - -# What Is Redis? - -Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. - -Redis is the [most-loved database](https://insights.stackoverflow.com/survey/2021) by developers for five years running. Developers love Redis because of its ease of use, performance, and scalability. There is a Redis client available for use in every popular modern programming language. This, coupled with the performance benefits, makes Redis the most popular choice for caching, session management, gaming, fraud detection, leaderboards, real-time analytics, geospatial indexing, ride-sharing, social media, and streaming applications. - -Redis Enterprise is the only true datastore built for hybrid and multicloud applications. [Get started with Redis Enterprise Cloud](/create/rediscloud). 
- -### Redis on AWS - -You might be familiar with Amazon ElastiCache for Redis. It is a Redis-compatible cache service that is available on AWS. Redis Enterprise Cloud on AWS is a fully-managed Redis Enterprise as a service and supports Redis as a cache and a database. Learn more about [Redis on AWS](/create/aws/redis-on-aws). - -### Redis on other cloud providers - -- [Redis on Google Cloud](/create/gcp) -- [Redis on Azure](/create/azure) -- [Redis on Heroku](/create/heroku) - -## Benefits of Redis - -### Performance - -The primary benefit of Redis is its sub-millisecond queries. Redis runs in-memory, which enables low-latency and high throughput. Running in-memory means requests for data do not require a trip to disk. This leads to an order of magnitude more operations and faster response times. Redis is one of the only databases that supports millions of operations per second. - -### Flexible data structures - -Redis is a multi-model database, and provides several built-in data structures such as: - -- Strings - any text or binary data (512MB max.) -- Hashes - field-value pairs that most commonly represent objects -- Lists - a collection of Strings ordered by when they were added as a linked list. Useful for queues and "latest updates" for social media posts -- Sets - an unordered collection of Strings with the ability to intersect, union, and diff against other Sets -- Sorted Sets - similar to a Redis Set, the Sorted Set is a collection of unique String members. In a Sorted Set, each member is associated with a score that can be used to sort the collection. -- Bitmaps - not necessarily a data type, but a set of bit-oriented operations on the String type -- HyperLogLogs - a probabilistic data structure used in order to count unique things (cardinality of a set) -- Geospatial - a Sorted Set of longitude/latitude/name key-value pairs useful for maps, geosearching, and "nearby" features -- Streams - a data type that models an append only log and which can be used as a durable message queue - -#### Redis modules - -Redis also supports custom data structures called modules that run natively alongside core Redis. These modules can be used to create custom data structures that are not available in the built-in data structures. Examples of custom modules include: - -- [RediSearch](https://redis.com/redis-enterprise/redis-search/) - a real-time search and secondary indexing engine that runs on your Redis dataset and allows you to query data that has just been indexed -- [RedisJSON](https://redis.com/redis-enterprise/redis-json/) - a native JSON data type tailored for fast, efficient, in-memory storage and retrieval of JSON documents at high speed and volume -- [RedisGears](https://redis.com/redis-enterprise/redis-gears/) - a programmable engine for Redis that runs inside Redis, closer to where your data lives, and which allows cluster-wide operations across shards, nodes, data structures, and data models at a sub-millisecond speed -- [RedisAI](https://redis.com/redis-enterprise/redis-ai/) - a machine learning data type that runs inside Redis and allows you to train and predict on your data. Additionally provides a common layer among different formats and platforms, including PyTorch, TensorFlow/TensorRT, and ONNX Runtime -- [RedisGraph](https://redis.com/redis-enterprise/redis-graph/) - a graph data structure that can be used to store and query data in a graph-oriented way. 
Supports the industry-standard Cypher as a query language and incorporates the state-of-the-art SuiteSparse GraphBLAS engine for matrix operations on sparse matrices -- [RedisTimeSeries](https://redis.com/redis-enterprise/redis-time-series/) - a time series data type with capabilities like automatic downsampling, aggregations, labeling and search, compression, and enhanced multi-range queries as well as built-in connectors to popular monitoring tools like Prometheus and Grafana to enable the extraction of data into useful formats for visualization and monitoring -- [RedisBloom](https://redis.com/redis-enterprise/redis-bloom/) - provides Redis with support for additional probabilistic data structures and allows for constant memory space and extremely fast processing while still maintaining a low error rate. Supports Bloom and Cuckoo filters to determine whether an item is present or absent from a collection with a given degree of certainty, Count-min sketch to count the frequency of the different items in sub-linear space, and Top-K to count top k events in a near deterministic manner - -### Simplicity and ease-of-use - -Redis makes complex applications easier to write and maintain. Redis presents a simple command and query structure for working with data versus query languages of traditional databases. When building applications you typically are using object-oriented languages, such as Java, Python, PHP, C, C++, C#, JavaScript, TypeScript, Node.js, Ruby, Go, and many others. The built-in data structures of Redis present a natural way of storing data exactly as you use it in object-oriented languages, minimizing [impedance mismatch](https://redis.com/blog/the-impedance-mismatch-test/). Redis also provides clients for almost every popular language, making it easy to build applications that can run on any platform. - -### Replication and persistence - -Redis offers asynchronous replication where data can be replicated to multiple servers. This allows for improved read performance and faster recovery. Redis Enterprise additionally provides [Active-Active Geo-Distribution](https://redis.com/redis-enterprise/technology/active-active-geo-distribution/) to ensure that data is distributed across multiple servers in a highly available manner for both reads and writes. Redis supports point-in-time backups (known as RDB) that lets you copy Redis data to disk or cloud storage. - -While Redis open source was not necessarily developed with an emphasis on durability and consistency as a default, [Redis Enterprise provides durability and consistency](https://redis.com/redis-enterprise/technology/durable-redis/) for Redis and allows you to use Redis as both a cache and a database. - -### High availability and scalability - -Redis provides a primary-replica architecture as a single node or cluster. This allows you to build highly available, scalable, and reliable applications on top of Redis. You can easily scale Redis up, down, in, and out to meet application demands. - -### Open source - -Redis is open source and available for free, and offers open source clients in many languages. The [Redis modules listed above](#redis-modules) are also available for download or use in Redis Enterprise Cloud. - -## Popular Redis use cases - -### Caching - -Redis is the de facto solution for caching in almost every application. Because Redis can handle millions of queries per second and offers high availability and scalability, it is used as a cache to reduce the load on a relational or NoSQL database. 
This includes database query caching, session caching, page caching, and caching of frequently used objects such as images, files, and application data. Learn more about [Redis caching](https://redis.com/solutions/use-cases/caching/). - -### Session storage - -Redis provides sub-millisecond latency at scale, making it a natural choice to store session data. This includes user profile information, OAuth tokens, credentials, session state, and more. Learn more about [Redis session storage](https://redis.com/solutions/use-cases/session-management/). - -### Fraud detection - -Redis is built to handle real-time AI and machine learning workloads because of its scalability and high write throughput at low latency. Redis is often used as a primary database, enabling deep learning models directly where the data lives. Bloom filters, time series, and other data structures that work natively with Redis enable cost reduction with high-speed statistical analysis. Learn more about [Redis fraud detection](https://redis.com/solutions/use-cases/fraud-detection/). - -### Real-time inventory - -Retailers need to ensure that their real-time inventory systems can survive seasonal peaks, maintain data consistency, and deliver instant results. Redis is a great choice for this use case. It is a highly available and highly scalable database that can handle millions of queries per second. Redis clusters can be configured to replicate across multiple servers in a highly available manner, enabling data consistency between stores. Learn more about [Redis for real-time inventory management](https://redis.com/solutions/use-cases/real-time-inventory/). - -### Claims processing - -Insurance companies need to process claims in real time, and they receive millions of claims daily. Redis provides sub-millisecond latency and can process millions of requests per second. Redis has built-in data types for building scalable, event-driven architectures. Redis Streams can enable ingesting and analyzing large amounts of data in real time. Learn more about [Redis claims processing](https://redis.com/solutions/use-cases/claims-processing/). - -### Gaming leaderboards - -Leaderboards require constant updates and scalability across millions of users. They also require complex mathematical computation, and must be distributed globally. Redis has built-in data types, such as sorted sets, that are useful for manipulating leaderboards. Redis also supports clustering and can be distributed globally. Learn more about [Redis gaming leaderboards](https://redis.com/solutions/use-cases/leaderboards/). - -### Messaging - -Microservices and distributed systems need to be able to communicate with each other. Redis provides a simple, fast, and reliable messaging system that can be used for real-time communication between microservices. Redis Streams can be used to enable real-time analytics and data ingestion. Redis Pub/Sub is a lightweight messaging protocol designed for broadcasting and receiving notifications. Redis Lists and Redis Sorted Sets are two native data structures that are great for implementing message queues. Redis also has client libraries in most programming languages that enable you to use your programming language of choice. Learn more about [Redis messaging](https://redis.com/solutions/use-cases/messaging/). - -### Fast data ingest - -Redis can handle millions of read/write operations per second at sub-millisecond latencies, and it runs on AWS, GCP, Azure, and other cloud platforms. 
This makes Redis a great choice for processing large volumes of data that arrive in bursts, data from multiple sources/formats, data that needs to be filtered and analyzed, and data that is distributed geographically. Learn more about [Redis data ingestion](https://redis.com/solutions/use-cases/fast-data-ingest/). - -## Redis language support - -Redis supports most high-level, popular programming languages and has SDKs built to make it easy to get started. Redis clients are available for the following languages (and more): - -- Python -- JavaScript -- Node.js -- Java -- Go -- C/C++ -- C# -- PHP -- Ruby -- Perl -- Rust - -## Redis vs. Memcached - -Both Redis and Memcached are open source, powerful, in-memory data stores. The main difference between the two is that Redis is a more full-featured database that is built to fit a number of different use cases. Memcached is primarily used for key/value caching. Redis is used for both caching and as a database. - -## How to host Redis - -You can [sign up for Redis Enterprise Cloud](https://redis.com/try-free/) for free, and when you create your subscription you can specify that you want to use AWS, GCP, or Azure. You can also configure the geographic region where you want to host Redis. Redis Enterprise Cloud is a great option when choosing Redis because: - -1. It is a fully managed service that provides a single point of contact for all your Redis clusters. -1. It is the only managed service that provides [Redis modules](https://redis.com/modules/get-started/) that turn Redis into a multi-model database. -1. It is built to scale with enterprise clustering, Redis-on-Flash, and Active-Active geo-distribution using CRDTs. - -## Getting started with Redis - -If you are ready to start building applications using Redis, check out our [tutorials for Redis](/develop) that let you use your programming language of choice! diff --git a/docs/get-involved/devcember/index-devcember.mdx b/docs/get-involved/devcember/index-devcember.mdx index 94894870079..5aec8f82c8a 100644 --- a/docs/get-involved/devcember/index-devcember.mdx +++ b/docs/get-involved/devcember/index-devcember.mdx @@ -9,6 +9,10 @@ pagination_next: null pagination_prev: null --- +import Authors from '@theme/Authors'; + + + ## What's it all About? We're excited to announce DEVcember, a month-long festival of live online events and fun challenges, showcasing Redis and our community! @@ -37,8 +41,8 @@ We'll be on the [Redis channel on Twitch](https://www.twitch.tv/redisinc) and [Y | Sat 11 / Sun 12 Dec | Second weekend hands-on exercise | [Take the challenge on GitHub](https://github.com/redis-developer/devcember/tree/main/challenges/weekend2) | | Mon 13 Dec, 4pm UTC | Sort it out! All about Sorted Sets | | | Tue 14 Dec, 4:45pm UTC | What's the Score? Top K with Redis Bloom! | | -| Wed 15 Dec, 10am UTC | Seek and You May Find… Introducing RediSearch! (Part 1) | | -| Wed 15 Dec, 10am UTC | Seek and You May Find… Introducing RediSearch! (Part 2) | | +| Wed 15 Dec, 10am UTC | Seek and You May Find… Introducing Redis Search! (Part 1) | | +| Wed 15 Dec, 10am UTC | Seek and You May Find… Introducing Redis Search! (Part 2) | | | Thu 16 Dec, 3:45pm UTC | Introducing Redis OM for Node.js | | | Fri 17 Dec, 4pm UTC | Object Mapping and More! 
Redis OM for .NET | | | Sat 18 / Sun 19 Dec | Third weekend hands-on exercise | [Take the challenge on GitHub](https://github.com/redis-developer/devcember/tree/main/challenges/weekend3) | @@ -46,7 +50,7 @@ We'll be on the [Redis channel on Twitch](https://www.twitch.tv/redisinc) and [Y | Tue 21 Dec, 5pm UTC | What's the deal with Pub/Sub? | | | Wed 22 Dec, 5:15pm UTC | Spring into Redis OM! (Redis OM for Java/Spring Framework) | | | Thu 23 Dec, 5pm UTC | Finding your way with Redis Geospatial! | | -| Fri 24 Dec, 9:15am UTC | Herding Cats with RedisJSON | | +| Fri 24 Dec, 9:15am UTC | Herding Cats with Redis JSON | | ## Meet the Team diff --git a/docs/get-involved/hacktoberfest/index-hacktoberfest.mdx b/docs/get-involved/hacktoberfest/index-hacktoberfest.mdx index a96caa89b4a..8ddf2cf9356 100644 --- a/docs/get-involved/hacktoberfest/index-hacktoberfest.mdx +++ b/docs/get-involved/hacktoberfest/index-hacktoberfest.mdx @@ -9,6 +9,12 @@ pagination_next: null pagination_prev: null --- +import Authors from '@theme/Authors'; + + + +#### (Looking for Hacktoberfest 2022? Find us over at [redis.io](https://redis.io/community/hacktoberfest/)!) + [Hacktoberfest](https://hacktoberfest.digitalocean.com/) is a month-long online festival which takes place every year in October. It is sponsored by [DigitalOcean](https://www.digitalocean.com/) and aims to encourage people to get involved in open source projects. Hacktoberfest 2021 has now finished! We've left the below information here so you can see how it worked and check out the recordings of our live streams. @@ -32,4 +32,4 @@ Querying, Indexing, and Full-text Search in Redis
-If you have questions about RediSearch and other module ask them in the [Redis Community Forum](https://forum.redis.com/c/modules/redisearch/58). +If you have questions about Redis Search and other modules, ask them in the [Redis Community Forum](https://forum.redis.com/c/modules/redisearch/58). diff --git a/docs/guides/security/how-to-use-tls-with-redis-enterprise/how-to-use-ssl-tls-with-redis-enterprise.mdx b/docs/guides/security/how-to-use-tls-with-redis-enterprise/how-to-use-ssl-tls-with-redis-enterprise.mdx index c32d054964c..4b64c1d071f 100644 --- a/docs/guides/security/how-to-use-tls-with-redis-enterprise/how-to-use-ssl-tls-with-redis-enterprise.mdx +++ b/docs/guides/security/how-to-use-tls-with-redis-enterprise/how-to-use-ssl-tls-with-redis-enterprise.mdx @@ -3,6 +3,7 @@ id: how-to-use-ssl-tls-with-redis-enterprise title: How to Use SSL/TLS With Redis Enterprise slug: /guide/security/how-to-use-ssl-tls-with-redis-enterprise/ description: Learn how to secure your Redis databases using SSL +authors: [tug] keywords: - java - node.js @@ -11,10 +12,11 @@ keywords: - tls --- -![Header](/img/guides/security/how-to-use-tls-with-redis-enterprise/000_header.jpeg) +import Authors from '@theme/Authors'; + + -- Date: 19-JAN-2021 -- Author: [Tug Grall](https://twitter.com/tgrall) +![Header](/img/guides/security/how-to-use-tls-with-redis-enterprise/000_header.jpeg) In this article, I will explain how to secure your Redis databases using SSL (Secure Sockets Layer). In production, it is a good practice to use SSL to protect the data that are moving between various computers (client applications and Redis servers). Transport Level Security (TLS) guarantees that only allowed applications/computers are connected to the database, and also that data is not viewed or altered by a middle man process. @@ -189,40 +191,46 @@ except Exception as err: print("Error connecting to Redis: {}".format(err)) ``` -More information in the documentation "[Using Redis with Python](https://developer.redis.com/develop/python/)". - ### 4.3 Using Node.JS For [Node Redis](http://redis.js.org/), use the [TLS](https://nodejs.org/api/tls.html) library to configure the client connection: ```javascript -var redis = require('redis'); -var tls = require('tls'); -var fs = require('fs'); +import { createClient } from 'redis'; +import tls from 'tls'; +import fs from 'fs'; -var ssl = { +const ssl = { key: fs.readFileSync( '../certificates/client_key_app_001.pem', - (encoding = 'ascii'), + {encoding: 'ascii'}, ), cert: fs.readFileSync( '../certificates/client_cert_app_001.pem', - (encoding = 'ascii'), + {encoding: 'ascii'}, ), - ca: [fs.readFileSync('../certificates/proxy_cert.pem', (encoding = 'ascii'))], + ca: [fs.readFileSync('../certificates/proxy_cert.pem', {encoding: 'ascii'})], checkServerIdentity: () => { return null; }, }; -var client = redis.createClient(12000, '127.0.0.1', { - password: 'secretdb01', - tls: ssl, +const client = createClient({ + // replace with your connection string + url: 'rediss://localhost:12000', + socket: { + tls: true, + key: ssl.key, + cert: ssl.cert, + ca: ssl.ca, + }, }); client.info('SERVER', function (err, reply) { console.log(reply); }); + +await client.connect(); ``` More information in the documentation "[Using Redis with Node.js](https://developer.redis.com/develop/node/)". 
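One caveat when adapting the snippet above: with node-redis 4, commands return Promises and can only be issued after `connect()` has resolved. Below is a minimal, self-contained sketch of reading server info over the secured connection. It assumes the same certificate paths and the `rediss://localhost:12000` endpoint used above, and it is only an illustration, not the client library's official example:

```javascript
import { createClient } from 'redis';
import fs from 'fs';

const client = createClient({
  // assumed endpoint, matching the example above
  url: 'rediss://localhost:12000',
  socket: {
    tls: true,
    key: fs.readFileSync('../certificates/client_key_app_001.pem'),
    cert: fs.readFileSync('../certificates/client_cert_app_001.pem'),
    ca: [fs.readFileSync('../certificates/proxy_cert.pem')],
    // as in the example above, skip hostname verification for the proxy certificate
    checkServerIdentity: () => undefined,
  },
});

client.on('error', (err) => console.error('Redis client error:', err));

await client.connect();                   // TLS handshake happens here
console.log(await client.info('SERVER')); // commands are awaited once connected
await client.quit();                      // close the connection cleanly
```

Run as an ES module with `node`, this should print the SERVER section of `INFO`, confirming that the TLS handshake and client authentication succeeded.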
diff --git a/docs/guides/security/index-security.mdx b/docs/guides/security/index-security.mdx index c5134e52a8e..aa0c017df03 100644 --- a/docs/guides/security/index-security.mdx +++ b/docs/guides/security/index-security.mdx @@ -12,7 +12,7 @@ In this section of the site you will learn about Security and Redis. - [Redis.io : Security Overview](https://redis.io/topics/security) - [Redis.io : ACL](https://redis.io/topics/acl) (Access Control List) -### Redis Enterprise by Redis +### Redis Cloud by Redis -- [Redis Enterprise Cloud : Security](https://docs.redis.com/latest/rc/security/) +- [Redis Cloud: Security](https://docs.redis.com/latest/rc/security/) - [Tutorial: How to Use SSL/TLS With Redis Enterprise](/guide/security/how-to-use-ssl-tls-with-redis-enterprise/) (`redis-cli`, Java, Python, Node,) diff --git a/docs/howtos/analytics/index-analytics.mdx b/docs/howtos/analytics/index-analytics.mdx index f31f2efcabe..8a2ae2c2787 100644 --- a/docs/howtos/analytics/index-analytics.mdx +++ b/docs/howtos/analytics/index-analytics.mdx @@ -6,6 +6,10 @@ slug: /howtos/analytics authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + Interactive analytics dashboards serve several purposes. They allow you to share data and provide you with all those vital information to make game-changing decisions at a faster pace. Building a real-time dynamic dashboard using a traditional relational database might require a complex set of queries. By using a NoSQL database like Redis, you can build a powerful interactive and dynamic dashboard with a small number of Redis commands. Redis is an open source, in-memory, key-value data store most commonly used as a primary database, cache, message broker, and queue. Redis cache delivers sub-millisecond response times, enabling fast and powerful real-time applications in industries such as gaming, fintech, ad-tech, social media, healthcare, and IoT. @@ -60,7 +64,7 @@ Go to /server folder (cd ./server) and then execute the following command: You may need to preface the docker command with `sudo`. If you don't want to use sudo, create a Unix group called docker and add users to it. When the Docker daemon starts, it creates a Unix socket accessible by members of the docker group. Once the Redis database is up and running, you can connect to it using the `redis-cli` command. -:::info TIP +:::tip - By default, Redis runs on port 6379 but you can change it by specifying an alternative host port in the docker compose file. - You can use a Redis configuration file and mount it as volumes in the docker compose YAML file. diff --git a/docs/howtos/antipatterns/index-antipatterns.mdx b/docs/howtos/antipatterns/index-antipatterns.mdx index 76e0c720b0f..8d224a1367a 100644 --- a/docs/howtos/antipatterns/index-antipatterns.mdx +++ b/docs/howtos/antipatterns/index-antipatterns.mdx @@ -6,6 +6,10 @@ slug: /howtos/antipatterns/ authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + ![antipattern](antipattern.png) Developers don’t just use Redis, they love it. [Stack Overflow’s annual Developer Survey 2021](https://insights.stackoverflow.com/survey/2021#technology-most-loved-dreaded-and-wanted) has ranked Redis as the Most Loved Database platform for the fifth years running! But it is equally important to understand that Redis defaults are not the best for everyone. Millions of developers uses Redis due to its speed and performance, however it is important to make sure that it is being used properly. @@ -16,13 +20,7 @@ Developers don’t just use Redis, they love it. 
[Stack Overflow’s annual Developer Survey 2021](https://insights.stackoverflow.com/survey/2021#technology-most-loved-dreaded-and-wanted) has ranked Redis as the Most Loved Database platform for the fifth years running! But it is equally important to understand that Redis defaults are not the best for everyone. Millions of developers uses Redis due to its speed and performance, however it is important to make sure that it is being used properly. @@ -16,13 +20,7 @@ Developers don’t just use Redis, they love it. With large databases running on a single shard/Redis instance, there are chances that the fail over, backup and recovery all will take longer. Hence, it’s always recommended to keep shards to recommended sizes. General conservative rule of thumb is 25Gb or 25K Ops/Second. -Redis Enterprise recommends to shard if you have more than 25 GB of data and a high number of operations. Another aspect is if you have above 25,000 operations per second, then sharding can improve performance. With less number of operations/second, it can handle up to 50GB of data too. - -### 2. One connection per request - -Opening a connection each operation requires a lot of overhead to build up and tear down the TCP connection. As a developer, you might sometimes create a connection, run a command, and close the connection. While opening and closing connections per command will technically work, it’s far from optimal and needlessly cuts into the performance of Redis as a whole. Hence, it is recommended to use a connection pool (Jedis) or a Redis client that has a reactive design (Lettuce). - -Using the OSS Cluster API, the connection to the nodes are maintained by the client as needed, so you’ll have multiple connections open to different nodes at any given time. With Redis Enterprise, the connection is actually to a proxy, which takes care of the complexity of connections at the cluster level. +Redis Cloud recommends sharding if you have more than 25 GB of data and a high number of operations. Another aspect is that if you have above 25,000 operations per second, sharding can improve performance. With a lower number of operations per second, a single shard can handle up to 50 GB of data. ### Examples #1 - redis-py Let us look at the redis-py that uses a connection pool to manage connections to a Redis server: ``` >>> pool = redis.ConnectionPool(host='localhost', port=6379, db=0) >>> r = redis.Redis(connection_pool=pool) ``` -[Learn more about redis-py](/develop/python/) - -### Example #2 - Lettuce - -Lettuce provides generic connection pool support.Lettuce connections are designed to be thread-safe so one connection can be shared amongst multiple threads and Lettuce connections auto-reconnection by default. While connection pooling is not necessary in most cases it can be helpful in certain use cases. Lettuce provides generic connection pooling support. ```java RedisClient client = RedisClient.create(RedisURI.create(host, port)); GenericObjectPool> pool = ConnectionPoolSupport .createGenericObjectPool(() -> client.connect(), new GenericObjectPoolConfig()); // executing work try (StatefulRedisConnection connection = pool.borrowObject()) { RedisCommands commands = connection.sync(); commands.multi(); commands.set("key", "value"); commands.set("key2", "value2"); commands.exec(); } // terminating pool.close(); client.shutdown(); ``` -[Learn more about Lettuce](/develop/java/?s=lettuce) - -### 3. Connecting directly to Redis instances +### 2. Connecting directly to Redis instances With a large number of clients, a reconnect flood will be able to simply overwhelm a single threaded Redis process and force a failover. Hence, it is recommended that you should use the right tool that allows you to reduce the number of open connections to your Redis server. [Redis Enterprise DMC proxy](https://docs.redis.com/latest/rs/administering/designing-production/networking/multiple-active-proxy/) allows you to reduce the number of connections to your cache server by acting as a proxy. 
There are other 3rd party tool like [Twemproxy](https://github.com/twitter/twemproxy). It is a fast and lightweight proxy server that allows you to reduce the number of open connections to your Redis server. It was built primarily to reduce the number of connections to the caching servers on the backend. This, together with protocol pipelining and sharding enables you to horizontally scale your distributed caching architecture. -### 4. More than one secondary shard (Redis OSS) +### 3. More than one secondary shard (Redis OSS) Redis OSS uses a shard-based quorum. It's advised to use at least 3 copies of the data (2 replica shards per master shard) in order to be protected from split-brain situations. In nutshell, Redis OSS solves the quorum challenge by having an odd number of shards (primary + 2 replicas). -Redis Enterprise solves the quorum challenge with an odd number of nodes. Redis Enterprise avoids a split-brain situation with only 2 copies of the data, which is more cost-efficient. In addition, the so-called ‘quorum-only node' can be used to bring a cluster up to an odd number of nodes if an additional, not necessary data node would be too expensive. +Redis Cloud solves the quorum challenge with an odd number of nodes. Redis Cloud avoids a split-brain situation with only 2 copies of the data, which is more cost-efficient. In addition, the so-called ‘quorum-only node' can be used to bring a cluster up to an odd number of nodes if an additional, not necessary data node would be too expensive. -### 5. Performing single operation +### 4. Performing single operation Performing several operations serially increases connection overhead. Instead, use [Redis Pipelining](https://redis.io/topics/pipelining). Pipelining is the process of sending multiple messages down the pipe without waiting on the reply from each - and (typically) processing the replies later when they come in. Pipelining is completely a client side implementation. It is aimed at solving response latency issues in high network latency environments. So, the lesser the amount of time spent over the network in sending commands and reading responses, the better. This is effectively achieved by buffering. The client may (or may not) buffer the commands at the TCP stack (as mentioned in other answers) before they are sent to the server. Once they are sent to the server, the server executes them and buffers them on the server side. The benefit of the pipelining is a drastically improved protocol performance. The speedup gained by pipelining ranges from a factor of five for connections to localhost up to a factor of at least one hundred over slower internet connections. -### 6. Caching keys without TTL +### 5. Caching keys without TTL Redis functions primarily as a key-value store. It is possible to set timeout values on these keys. Said that, a timeout expiration automatically deletes the key. Additionally, when we use commands that delete or overwrite the contents of the key, it will clear the timeout. Redis TTL command is used to get the remaining time of the key expiry in seconds. TTL returns the remaining time to live of a key that has a timeout. This introspection capability allows a Redis client to check how many seconds a given key will continue to be part of the dataset.Keys will accumulate and end up being evicted. Hence, it is recommended to set TTLs on all caching keys. -### 7. Endless Redis Replication Loop +### 6. 
-### 7. Endless Redis Replication Loop +### 6. Endless Redis Replication Loop When attempting to replicate a very large, active database over a slow or saturated link, replication never finishes because of the continuous updates. Hence, it is recommended to tune the replica and client buffers to allow for slower replication. Check out [this detailed blog](https://redis.com/blog/the-endless-redis-replication-loop-what-why-and-how-to-solve-it/). -### 8. Hot Keys +### 7. Hot Keys Redis can easily become the core of your app’s operational data, holding valuable and frequently accessed information. However, if you centralize access down to a few pieces of data accessed constantly, you create what is known as a hot-key problem. In a Redis cluster, the key determines where in the cluster the data is stored: the data lives in one single, primary location based on the hash of that key. So, when you access a single key over and over again, you’re actually accessing a single node/shard over and over again. Let’s put it another way: if you have a cluster of 99 nodes and a single key that gets a million requests in a second, all million of those requests go to a single node instead of being spread across the other 98 nodes. @@ -100,35 +69,35 @@ Redis even provides tools to find where your hot keys are located. Use redis-cli When possible, the best defence is to avoid the development pattern that creates the situation. Writing the data to multiple keys that reside in different shards will allow you to access the same data more frequently. In a nutshell, the problem is having specific keys that are accessed with every client operation, so it's recommended to shard out hot keys using hashing algorithms. You can set the eviction policy to LFU and run redis-cli --hotkeys to identify them. -### 9. Using Keys command +### 8. Using Keys command In Redis, the KEYS command can be used to perform exhaustive pattern matching on all stored keys. This is not advisable, as running it on an instance with a large number of keys could take a long time to complete and will slow down the Redis instance in the process. In the relational world, this is equivalent to running an unbound query (SELECT...FROM without a WHERE clause). Execute this type of operation with care, and take the necessary measures to ensure that your tenants are not performing a KEYS operation from within their application code. Use SCAN, which spreads the iteration over many calls, not tying up your whole server at one time. -Scaning keyspace by keyname is an extremely slow operation and will run O(N) with N being the number of keys. It is recommended to use RediSearch to return information based on the contents of the data instead of iterating through the key space. +Scanning the keyspace by key name is an extremely slow operation that runs in O(N), with N being the number of keys. It is recommended to use Redis Search to return information based on the contents of the data instead of iterating through the key space. ```bash FT.SEARCH orders "@make: ford @model: explorer" SQL: SELECT * FROM orders WHERE make=ford AND model=explorer ```
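To make the KEYS-versus-SCAN advice above concrete, here is a small redis-py sketch. It is an illustrative addition rather than part of the original article; the `user:*` pattern and batch size are hypothetical.

```python
import redis

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# KEYS 'user:*' would walk every key in one blocking call.
# scan_iter wraps SCAN, fetching keys in small batches so other clients keep being served.
for key in r.scan_iter(match='user:*', count=500):
    print(key)
```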
-### 10. Running Ephemeral Redis as a primary database +### 9. Running Ephemeral Redis as a primary database Redis is often used as a primary storage engine for applications. Unlike using Redis as a cache, using Redis as a primary database requires two extra features to be effective. Any primary database should really be highly available: if a cache goes down, your application is generally in a brown-out state, but if a primary database goes down, your application goes down too. Similarly, if a cache goes down and you restart it empty, that’s no big deal; for a primary database, though, that’s a huge deal. Redis can handle these situations easily, but they generally require a different configuration than running as a cache. Redis as a primary database is great, but you’ve got to support it by turning on the right features. -With Redis open source, you need to set up Redis Sentinel for high availability. In Redis Enterprise, it’s a core feature that you just need to turn on when creating the database. As for durability, both Redis Enterprise and open source Redis provide durability through AOF or snapshotting so your instance(s) start back up the way you left them. +With Redis open source, you need to set up Redis Sentinel for high availability. In Redis Cloud, it’s a core feature that you simply turn on when creating the database. As for durability, both Redis Cloud and open source Redis provide it through AOF or snapshotting, so your instance(s) start back up the way you left them. -### 11. Storing JSON blobs in a string +### 10. Storing JSON blobs in a string -Microservices written in several languages may not marshal/unmarshal JSON in a consistent manner. Application logic will be required to lock/watch a key for atomic updates. JSON manipulation is often a very compute costly operation. Hence, it is recommended to use HASH data structure and also, RedisJSON module. +Microservices written in several languages may not marshal/unmarshal JSON in a consistent manner, and application logic will be required to lock/watch a key for atomic updates. JSON manipulation is often a computationally costly operation. Hence, it is recommended to use the HASH data structure or Redis JSON instead. -### 12. Translating a table or JSON to a HASH without considering query pattern +### 11. Translating a table or JSON to a HASH without considering query pattern The only query mechanism is a SCAN, which requires reading the data structure and limits filtering to the MATCH directive. It is recommended to store the table or JSON as a string, break out the indexes into reverse indexes using a SET or SORTED SET, and point back to the key for the string. Using SELECT command and multiple databases inside one Redis instance: The usage of SELECT and multiple databases inside one Redis instance was mentioned as an anti-pattern by Salvatore (the creator of Redis). It is recommended to use a dedicated Redis instance for each database need. This is especially true in microservice architectures where client applications might step on each other's toes (noisy neighbor, database setup/teardown impact, maintenance, upgrade, ...) -The RedisTimeSeries module provides a direct compete to time series databases. But if the only query is based on ordering, it's unnecessary complexity. Hence, it is recommended to use a SORTED SET with a score of 0 for every value. The values are appended. Or use a timestamp for the score for simple time based queries +The Redis Time Series module competes directly with dedicated time series databases, but if your only query is based on ordering, it adds unnecessary complexity. Hence, it is recommended to use a SORTED SET with a score of 0 for every value (values are appended), or to use a timestamp as the score for simple time-based queries. ### References @@ -145,13 +114,11 @@ The RedisTimeSeries module provides a direct compete to time series databases. 

B target="_blank" rel="noopener" className="link"> - Redis Launchpad -
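As promised in the "Performing single operation" section above, here is a short redis-py pipelining sketch. It is an illustrative addition rather than part of the original article; the endpoint, key names, and counts are hypothetical.

```python
import redis

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# Each command issued on its own costs a full network round trip.
# A pipeline buffers the commands client-side and sends them in one batch.
pipe = r.pipeline(transaction=False)
for i in range(1000):
    pipe.set(f'page:{i}:views', 0)
    pipe.expire(f'page:{i}:views', 3600)  # pair every cache write with a TTL
replies = pipe.execute()  # one round trip; replies come back in command order
print(len(replies))       # 2000 replies, one per queued command
```

The same idea applies to the other clients mentioned in the article; the win comes from batching round trips, not from any redis-py specifics.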
diff --git a/docs/howtos/bert-qa-benchmarking/index-bert-qa-benchmarking-with-redisai-and-redisgears.mdx b/docs/howtos/bert-qa-benchmarking/index-bert-qa-benchmarking-with-redisai-and-redisgears.mdx index 417efccb28b..2633fb35dcb 100644 --- a/docs/howtos/bert-qa-benchmarking/index-bert-qa-benchmarking-with-redisai-and-redisgears.mdx +++ b/docs/howtos/bert-qa-benchmarking/index-bert-qa-benchmarking-with-redisai-and-redisgears.mdx @@ -10,6 +10,8 @@ tags: - community --- +import Authors from '@theme/Authors'; + + + + ## Introduction -In this article, we will explore the challenges and opportunities associated with deploying [large BERT Question Answering Transformer](https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad) models from Hugging Face, using [RedisGears](/howtos/redisgears) and [RedisAI](/howtos/redisai/getting-started) to perform a lot of the heavy lifting while also leveraging the in-memory datastore Redis. +In this article, we will explore the challenges and opportunities associated with deploying [large BERT Question Answering Transformer](https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad) models from Hugging Face, using RedisGears and RedisAI to perform a lot of the heavy lifting while also leveraging the in-memory datastore Redis. ## Why do we need RedisAI? diff --git a/docs/howtos/caching/index-caching.mdx b/docs/howtos/caching/index-caching.mdx index 933dfd9a063..3c5bec94b3b 100644 --- a/docs/howtos/caching/index-caching.mdx +++ b/docs/howtos/caching/index-caching.mdx @@ -6,10 +6,9 @@ slug: /howtos/caching authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ![My Image](cachingapp.png) diff --git a/docs/howtos/caching/index-caching.mdx.rails b/docs/howtos/caching/index-caching.mdx.rails index 0673058fe57..fce384e0e55 100644 --- a/docs/howtos/caching/index-caching.mdx.rails +++ b/docs/howtos/caching/index-caching.mdx.rails @@ -8,7 +8,7 @@ slug: /howtos/caching import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; ![My Image](caching.png) @@ -68,4 +68,4 @@ npm run serve ``` rails s -``` +``` diff --git a/docs/howtos/chatapp/index-chatapp.mdx b/docs/howtos/chatapp/index-chatapp.mdx index 97b2998d73e..312e16b6eaa 100644 --- a/docs/howtos/chatapp/index-chatapp.mdx +++ b/docs/howtos/chatapp/index-chatapp.mdx @@ -6,14 +6,18 @@ slug: /howtos/chatapp authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + A real-time chat app is an online communication channel that allows you to conduct real-time conversations. More and more developers are tapping into the power of Redis because it is extremely fast and supports a variety of rich data structures such as Lists, Sets, Sorted Sets, and Hashes. Redis also comes with Pub/Sub messaging functionality that allows developers to scale the backend by spawning multiple server instances. + +:::info + Please note that this code is open source. You can find the link at the end of this tutorial. +::: +
+
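Since the chat tutorial above leans on Redis Pub/Sub to fan messages out across server instances, here is a minimal redis-py sketch of that pattern. It is an illustrative addition, not code from the tutorial's repository; the channel name, message payload, and endpoint are hypothetical.

```python
import redis

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# Every chat server instance subscribes to the same channel...
pubsub = r.pubsub(ignore_subscribe_messages=True)
pubsub.subscribe('chat:general')

# ...and any instance can publish; Redis delivers the message to all subscribers.
r.publish('chat:general', '{"user": "alice", "text": "hello"}')

for message in pubsub.listen():
    print(message['channel'], message['data'])
    break  # demo only: stop after the first delivered message
```

In such a setup, each instance relays the delivered payload to the clients connected to it, which is what lets the backend scale by simply spawning more instances.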
diff --git a/docs/howtos/ratelimiting/index-ratelimiting.mdx b/docs/howtos/ratelimiting/index-ratelimiting.mdx index 3f34cc3aaad..55cbc6a7235 100644 --- a/docs/howtos/ratelimiting/index-ratelimiting.mdx +++ b/docs/howtos/ratelimiting/index-ratelimiting.mdx @@ -8,8 +8,9 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + Rate limiting is a mechanism that many developers may have to deal with at some point in their life. It’s useful for a variety of purposes like sharing access to limited resources or limiting the number of requests made to an API endpoint and responding with a 429 status code. @@ -281,7 +282,7 @@ Copy config/application.yml.example to config/application.yml ### Step 5. Run Redis Docker container ```bash - docker run -d -p 6379:6379 redislabs/redismod + docker run -d -p 6379:6379 redis/redis-stack:latest ``` ### Step 6. Running the app @@ -376,13 +377,11 @@ You can get permitted_requests_count with this command: target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisai/getting-started/index-gettingstarted.mdx b/docs/howtos/redisai/getting-started/index-gettingstarted.mdx deleted file mode 100644 index 6f7ce8d4a51..00000000000 --- a/docs/howtos/redisai/getting-started/index-gettingstarted.mdx +++ /dev/null @@ -1,249 +0,0 @@ ---- -id: index-gettingstarted -title: RedisAI Tutorial -sidebar_label: RedisAI Tutorial -slug: /howtos/redisai/getting-started -authors: [ajeet] ---- - -RedisAI is a Redis module for executing deep learning/machine learning models and managing their data. It provides tensors as a data type and deep learning model execution on CPUs and GPUs. RedisAI turns Redis Enterprise into a full-fledged deep learning runtime.The RedisAI module is seamlessly plugged into Redis. It is a scalable platform that addresses the unique requirements for both AI training and AI inference in one server. It provides a complete software platform that allows data scientists to easily deploy and manage AI solutions for enterprise applications. - -The platform combines popular open source deep learning frameworks (PyTorch, ONNXRuntime, and TensorFlow), software libraries, and Redis modules like RedisGears, RedisTimeSeries, and more. With RedisAI, AI application developers no longer have to worry about tuning databases for performance. Requiring no added infrastructure, RedisAI lets you run your inference engine where the data lives, decreasing latency. - -Below is an interesting example of Iris (a genus of species of flowering plants with showy flowers) classification based on measurement of width and length of sepal/petals that makes up input tensors and how to load these measurements into RedisAI: - -### Step 1. Installing RedisAI - -```bash - docker run \ - -p 6379:6379 \ - redislabs/redismod \ - --loadmodule /usr/lib/redis/modules/redisai.so \ - ONNX redisai_onnxruntime/redisai_onnxruntime.so -``` - -You will see that ONNX backend getting loaded as shown below in the results. - -```bash - 1:C 09 Jun 2021 12:28:47.985 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo - 1:C 09 Jun 2021 12:28:47.985 # Redis version=6.0.1, bits=64, commit=00000000, modified=0, pid=1, just started - 1:C 09 Jun 2021 12:28:47.985 # Configuration loaded - 1:M 09 Jun 2021 12:28:47.987 * Running mode=standalone, port=6379. 
- 1:M 09 Jun 2021 12:28:47.987 # Server initialized - 1:M 09 Jun 2021 12:28:47.987 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled. - 1:M 09 Jun 2021 12:28:47.989 * Redis version found by RedisAI: 6.0.1 - oss - 1:M 09 Jun 2021 12:28:47.989 * RedisAI version 10003, git_sha=7f808a934dff121e188cb76fdfcc3eb1f9ec7cbf - 1:M 09 Jun 2021 12:28:48.011 * ONNX backend loaded from /usr/lib/redis/modules/backends/redisai_onnxruntime/redisai_onnxruntime.so - 1:M 09 Jun 2021 12:28:48.011 * Module 'ai' loaded from /usr/lib/redis/modules/redisai.so - 1:M 09 Jun 2021 12:28:48.011 * Ready to accept connections -``` - -You can verify if the RedisAI module is loaded or not by running the following command: - -```bash - 127.0.0.1:6379> info modules - # Modules - module:name=ai,ver=10003,api=1,filters=0,usedby=[],using=[],options=[] - - # ai_git - ai_git_sha:7f808a934dff121e188cb76fdfcc3eb1f9ec7cbf - - # ai_load_time_configs - ai_threads_per_queue:1 - ai_inter_op_parallelism:0 - ai_intra_op_parallelism:0 -``` - -### Step 2. Setup Python Environment - -Ensure that Python3.8+ is installed. - -```bash - brew install python -``` - -Create a Python virtual environment and activate it: - -```bash - python3 -m venv venv - . ./venv/bin/activate -``` - -### Step 3. Install PIP - -```bash - pip install --upgrade pip -``` - -### Step 4. Clone the repository - -```bash - git clone https://github.com/redis-developer/redisai-iris -``` - -### Step 5. Install the dependencies - -```bash - pip install -r requirements.txt -``` - -### Step 6. Build the ONNX Model - -RedisAI supports DL/ML identifiers and their respective backend libraries, including: - -- TF: The TensorFlow backend -- TFLITE: The TensorFlow Lite backend -- TORCH: The PyTorch backend -- ONNX: ONNXRuntime backend - -A complete list of supported backends is in the [release notes for each version](https://docs.redis.com/latest/modules/redisai/release-notes/redisai-1.0-release-notes/). - -```bash - python build.py -``` - -### Step 7: Deploy the Model into RedisAI - -A Model is a Deep Learning or Machine Learning frozen graph that was generated by some framework. The RedisAI Model data structure represents a DL/ML model that is stored in the database and can be run. Models, like any other Redis and RedisAI data structures, are identified by keys. A Model’s key is created using the `AI.MODELSET` command and requires the graph payload serialized as a protobuf for input. - -NOTE: This requires redis-cli. If you don't have redis-cli, I've found the easiest way to get it is to download, build, and install Redis itself. Details can be found at [the Redis quickstart page](https://redis.io/topics/quickstart) - -```bash - redis-cli -x AI.MODELSET iris ONNX CPU BLOB < iris.onnx -``` - -### Step 8. Make Some Predictions - -[The `AI.TENSORSET` command](https://oss.redis.com/redisai/commands/#aitensorset) stores a tensor as the value of a key. - -Launch redis-cli: - -```bash - redis-cli -``` - -### Step 9. Set the input tensor - -This will set the key 'iris' to the 2x4 RedisAI tensor (i.e. 2 sets of inputs of 4 values each). 
- -```bash - AI.TENSORSET iris:in FLOAT 2 4 VALUES 5.0 3.4 1.6 0.4 6.0 2.2 5.0 1.5 -``` - -where, - -- iris:in refers to the tensor's key name, -- FLOAT is a tensor's data type -- {5.0 3.4 1.6 0.4} refers to 1st item with 4 features -- {6.0 2.2 5.0 1.5} refers to 2nd item with 4 features - -### Step 10. Display TENSORGET in BLOB format - -The `AI.TENSORGET` command returns a tensor stored as key's value. -The BLOB indicates that data is in binary format and is provided via the subsequent data argument - -```bash - redis-cli AI.TENSORGET iris:in BLOB - "\x00\x00\xa0@\x9a\x99Y@\xcd\xcc\xcc?\xcd\xcc\xcc>\x00\x00\xc0@\xcd\xcc\x0c@\x00\x00\xa0@\x00\x00\xc0?" -``` - -### Step 11. Check the predictions - -```bash - redis-cli AI.TENSORGET iris:in VALUES - 1) "5" - 2) "3.4000000953674316" - 3) "1.6000000238418579" - 4) "0.40000000596046448" - 5) "6" - 6) "2.2000000476837158" - 7) "5" - 8) "1.5" -``` - -### Step 12. Display TENSORGET META information - -The META used with `AI.TENSORGET` returns the tensor's metadata as shown below: - -```bash - redis-cli AI.TENSORGET iris:in META - 1) "dtype" - 2) "FLOAT" - 3) "shape" - 4) 1) (integer) 2 - 2) (integer) 4 - -``` - -### Step 13. Display TENSORGET META information with tensor values - -```bash - redis-cli AI.TENSORGET iris:in META VALUES - 1) "dtype" - 2) "FLOAT" - 3) "shape" - 4) 1) (integer) 2 - 2) (integer) 4 - 5) "values" - 6) 1) "5" - 2) "3.4000000953674316" - 3) "1.6000000238418579" - 4) "0.40000000596046448" - 5) "6" - 6) "2.2000000476837158" - 7) "5" - 8) "1.5" -``` - -### Step 14. Run the model - -Define inputs for the loaded model. - -```bash - redis-cli AI.MODELRUN iris INPUTS iris:in OUTPUTS iris:inferences iris:scores - OK -``` - -### Step 15. Make the prediction - -```bash - redis-cli AI.TENSORGET iris:inferences VALUES META - 1) "dtype" - 2) "INT64" - 3) "shape" - 4) 1) (integer) 2 - 5) "values" - 6) 1) (integer) 0 - 2) (integer) 2 -``` - -### References - -- [Sample IRIS Classification Source Code](https://github.com/redis-developer/redisai-iris) -- [RedisAI - A Server for Machine and Deep Learning Models](https://oss.redis.com/redisai/) - -### Redis University - -#### RedisAI Explained - -
- -
- -#### RedisAI from the Command Line - -
- -
diff --git a/docs/howtos/redisai/index-redisai.mdx b/docs/howtos/redisai/index-redisai.mdx deleted file mode 100644 index 8974e8cfeff..00000000000 --- a/docs/howtos/redisai/index-redisai.mdx +++ /dev/null @@ -1,29 +0,0 @@ ---- -id: index-redisai -title: RedisAI Tutorial -sidebar_label: Overview -slug: /howtos/redisai/ ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links provides you with the available options to get started with RedisTimeSeries - -
- -
- -
- -
- -
-
diff --git a/docs/howtos/redisai/market-basket-analysis/images/image1.png b/docs/howtos/redisai/market-basket-analysis/images/image1.png deleted file mode 100644 index 6bf471ae28d..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image1.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image2.png b/docs/howtos/redisai/market-basket-analysis/images/image2.png deleted file mode 100644 index 634e1dcc880..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image2.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image3.png b/docs/howtos/redisai/market-basket-analysis/images/image3.png deleted file mode 100644 index 9d4a3f93c2b..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image3.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image4.png b/docs/howtos/redisai/market-basket-analysis/images/image4.png deleted file mode 100644 index 67c1e761e09..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image4.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image5.png b/docs/howtos/redisai/market-basket-analysis/images/image5.png deleted file mode 100644 index 157cf54126c..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image5.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image6.png b/docs/howtos/redisai/market-basket-analysis/images/image6.png deleted file mode 100644 index 2668281b0cb..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image6.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_10.png b/docs/howtos/redisai/market-basket-analysis/images/image_10.png deleted file mode 100644 index e3ca766177c..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_10.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_11.png b/docs/howtos/redisai/market-basket-analysis/images/image_11.png deleted file mode 100644 index 2440860eaec..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_11.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_12.png b/docs/howtos/redisai/market-basket-analysis/images/image_12.png deleted file mode 100644 index 2321fdea308..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_12.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_13.png b/docs/howtos/redisai/market-basket-analysis/images/image_13.png deleted file mode 100644 index 1ac3009ed83..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_13.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_14.png b/docs/howtos/redisai/market-basket-analysis/images/image_14.png deleted file mode 100644 index 39254811d67..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_14.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_7.png b/docs/howtos/redisai/market-basket-analysis/images/image_7.png deleted file mode 100644 index 1600b6a7536..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_7.png and /dev/null differ diff --git 
a/docs/howtos/redisai/market-basket-analysis/images/image_8.png b/docs/howtos/redisai/market-basket-analysis/images/image_8.png deleted file mode 100644 index 947f0d4d938..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_8.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/images/image_9.png b/docs/howtos/redisai/market-basket-analysis/images/image_9.png deleted file mode 100644 index deaaf739788..00000000000 Binary files a/docs/howtos/redisai/market-basket-analysis/images/image_9.png and /dev/null differ diff --git a/docs/howtos/redisai/market-basket-analysis/index-market-basket-analysis.mdx b/docs/howtos/redisai/market-basket-analysis/index-market-basket-analysis.mdx deleted file mode 100644 index 583836c0853..00000000000 --- a/docs/howtos/redisai/market-basket-analysis/index-market-basket-analysis.mdx +++ /dev/null @@ -1,224 +0,0 @@ ---- -id: index-market-basket-analysis -title: How E-Commerce Websites Can Improve Market Basket Analysis with RedisAI and RedisGears -sidebar_label: Market-Basket Analysis using RedisAI and RedisGears -slug: /howtos/redisai/market-basket-analysis -authors: [ajeet, christian] ---- - -![Market-Basket](images/image1.png) - -Whenever you shop online, there is a recommendation system (just like a digital salesman) that is busy guiding you toward the most likely product you might purchase. One good example is movie recommendations offered up when you are searching through the leading entertainment services such as Amazon Prime or Netflix. Modern multi-channel retailers are turning to real-time retail data analytics to understand their customer behavior so that they can properly plan and promote products, increase sales, and optimize supply chain performance. - -### What is Market Basket Analysis? - -With the rapid growth of e-commerce data, more and more organizations are discovering ways of using **Market Basket Analysis** (MBA) to gain useful insights into associations and hidden relationships. A predictive version of market basket analysis is gaining popularity across many sectors in an effort to identify sequential purchases. - -Market basket analysis is a technique based on buying a group item. The approach is based on the theory that customers who buy a certain item (or group of items) are more likely to buy another specific item (or group of items). It creates If-Then scenario rules; for example, if item A is purchased, then item B is likely to be purchased. The rules are probabilistic in nature or, in other words, they are derived from the frequencies of co-occurrence in the observations. - - If {A} then likelihood of B is probably more accurate - -Say you are in a mobile retail shop to buy a mobile phone. Based on the analysis, are you more likely to buy a leather phone case or earphones in the same transaction than somebody who did not buy a phone? - -![alt_text](images/image2.png) - -If someone buys a mobile phone, they are more likely to buy accessories like a leather cover, battery pack, wireless headphones, and so on. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
BasketProduct 1Product 2 Product 3
1Mobile PhoneLeather CoverTempered Glass Screen
2Mobile PhoneTempered Glass Screen
3Mobile Phone64GB SD cardBattery Pack
4Leather Cover1 year Damage Protection
5Mobile Phone1 year Damage ProtectionTempered Glass Screen
- -As shown above, there are five baskets containing varying combinations of Mobile Phone, Earphone, Battery Pack, and Leather Cover. Retailers use the results of the Market Basket Analysis to guide product placement in stores as well as cross-category and co-marketing promotions. - -Online fraud, which is soaring due to growth in online transactions, digital identity threats, cybercrime, and customer fraud, [is a $20 billion problem](https://www.marketsandmarkets.com/Market-Reports/fraud-detection-prevention-market-1312.html) that just keeps getting worse and worse, especially for financial services companies. According to a recent report, 79% of retailers have observed an increase in fraud. All of these results are driving fraud detection to be much more important than ever before. - -Interestingly, Market Basket Analysis is also used in fraud detection. It may be possible to identify purchase behavior that can be associated with fraud on the basis of MBA data that contain credit card information. Watch this [video](https://www.youtube.com/watch?v=UBRuYVn4MjQ) for more information on this topic. - -In the following example, we demonstrate how you can quickly run your Market Basket Analysis using RedisAI and RedisGears. This tutorial will allow you to score various baskets based on pre-populated users and profiles. The data was pulled from [Amazon reviews datasets](https://s3.amazonaws.com/amazon-reviews-pds/readme.html) and the model was built using reviews with verified purchases. Individual user profiles were compiled and analyzed to describe baskets that were trained to build the model that is included in this repo. - -### Prerequisites: - -- Docker -- Docker Compose - -### Step 1. Clone the repository - -``` -git clone https://github.com/redis-field-engineering/demo-market-basket-analysis -``` - -### Step 2. Bring up the services - -The below compose file specifies the two services: redis as a backend and Market Basket Analysis as a frontend application. It uses a redismod Docker image that is exposed to port 6379 while the application service uses `maguec/ai_basket_analysis:latest `image that runs a container exposed to port 8080. - -#### Redis service - -The `redis` service uses a public Redis image called redismod that holds all the necessary Redis modules, such as RedisGears and RedisAI. It then binds the container and the host machine to the exposed port, `6379`. - -#### Frontend service - -The `application `service uses an image `maguec/ai_basket_analysis:latest` pulled from the Docker Hub registry. The Docker image uses [app.py](https://github.com/redis-field-engineering/demo-market-basket-analysis/blob/master/app.py) that is just 161 lines of Python code written to be as simple as possible. It then binds the container and the host machine to the exposed port, `8080.` - -``` -version: '3' - - -services: - redis: - image: "redislabs/redismod:edge" - ports: - - "6379:6379" - application: - image: "maguec/ai_basket_analysis:latest" - links: - - "redis:redis" - ports: - - "8080:8080" - environment: - - REDIS_SERVER=redis - - REDIS_PORT=6379 - -``` - -From your project directory, start up your application by running `docker-compose up`. - -``` -docker-compose up -d -``` - -### Step 3. 
List the service status - -``` -docker-compose ps -NAME COMMAND SERVICE STATUS PORTS -demo-market-basket-analysis-application-1 "python3 app.py" application running 0.0.0.0:8080->8080/tcp -demo-market-basket-analysis-redis-1 "redis-server --load…" redis running 0.0.0.0:6379->6379/tcp - -``` - -### Step 4. Access the app - -Open [https://localhost:8080](https://localhost:8080) to access the application. - -![alt_text](images/image3.png) - -Select your preferred user from the list (e.g., pcgamer) - -Once you select the user, click login and that will open the profile page: - -![alt_text](images/image4.png) - -Click on the cart tab and add new items. - -![alt_text](images/image5.png) - -Click “Score” and you will find that all the previously purchased categories will get pre-checked. - -![alt_text](images/image6.png) - -As shown above, red indicates that this category of item has not been purchased by the customer before, while green indicates what has previously been purchased. Let us select a different user and experiment with baskets. - -![alt_text](images/image_7.png) - -Click “Login.” - -![alt_text](images/image_8.png) - -Click “Cart.” - -![alt_text](images/image_9.png) - -Click “Score.” - -![alt_text](images/image_10.png) - -### How does it work? - -![alt_text](images/image_11.png) - -User profiles are stored in Redis as [hash data structures](https://redis.io/commands#hash). After the user adds items to the cart for scoring, the cart is transformed into a tensor scored by [RedisAI](https://redisai.io/), and a confidence score is returned. - -### Scoring - -The scores vary between 0 and 1. A score closer to 1 indicates that the basket of items by category is more likely to mirror broader purchasing patterns. Let’s look at another example for a user profile named "splurgenarrow": - -![alt_text](images/image_12.png) - -If we add Wireless, then this is a highly likely basket with a score of 0.99989. Adding items categorized as "Wireless" results in a basket with a high likelihood of matching the purchasing patterns that we have modeled from previous data. - -![alt_text](images/image_13.png) - -You can use the Redis [MONITOR](https://redis.io/commands/monitor) command in order to stream back every command processed by the Redis server. As shown below, you will find Redis data structures called “Hashes” commands that hold various field-value pairs for representing data objects. - -``` -"GET" "session:53ee3f94-c507-4075-a60f-871546e2f90f" -"EXISTS" "USERLIST" -"SMEMBERS" "USERLIST" -... -"SETEX" -"GET" "session:53ee3f94-c507-4075-a60f-871546e2f90f" -"SETEX" -.. 
-"GET" "session:53ee3f94-c507-4075-a60f-871546e2f90f" -"HGETALL" "user:splurgenarrow" -``` - -But if we add Major_Appliances, it will be an unlikely basket with a score of 0.00917 - -![alt_text](images/image_14.png) - -``` -"GET" "session:53ee3f94-c507-4075-a60f-871546e2f90f" -"AI.TENSORSET" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f" "FLOAT" "1" "43" "VALUES" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "2.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "2.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "1.0" "0.0" "0.0" "0.0" "0.0" "0.0" "0.0" "16.0" "0.0" "0.0" "0.0" "0.0" "0.0" -"AI.MODELRUN" "profile_model" "INPUTS" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f" "OUTPUTS" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f:results" -"AI.TENSORGET" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f:results" "META" "BLOB" -"DEL" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f" -"DEL" "TENSOR:53ee3f94-c507-4075-a60f-871546e2f90f:results" -"HGETALL" "user:splurgenarrow" -"SETEX" - -``` - -### Conclusion - -As digital marketing continues to grow, data-mining techniques such as Market Basket Analysis, powered with Redis Modules like RedisAI and RedisGears, are increasingly necessary for online retailers to better understand customer purchasing patterns. Retailers can use this insight into which items are frequently purchased together in order to optimize product placement, offer special deals, and create new product bundles to increase sales while also being better able to identify online fraud. - -### Further References: - -- [Source code of Market Basket Analysis ](https://github.com/redis-field-engineering/demo-market-basket-analysis) -- [Real-Time Fraud Detection with Azure and Redis Enterprise](https://www.youtube.com/watch?v=UBRuYVn4MjQ) diff --git a/docs/howtos/redisbloom/images/README.md b/docs/howtos/redisbloom/images/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/howtos/redisbloom/images/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/howtos/redisbloom/images/Verify_subscription.png b/docs/howtos/redisbloom/images/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redisbloom/images/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/activate.png b/docs/howtos/redisbloom/images/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/howtos/redisbloom/images/activate.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/add_database.png b/docs/howtos/redisbloom/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redisbloom/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/aws.png b/docs/howtos/redisbloom/images/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 Binary files a/docs/howtos/redisbloom/images/aws.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/choosemodule.png b/docs/howtos/redisbloom/images/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/howtos/redisbloom/images/choosemodule.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/create_database.png b/docs/howtos/redisbloom/images/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/howtos/redisbloom/images/create_database.png and /dev/null differ diff --git 
a/docs/howtos/redisbloom/images/create_subscription.png b/docs/howtos/redisbloom/images/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/howtos/redisbloom/images/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/createdatabase.png b/docs/howtos/redisbloom/images/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/howtos/redisbloom/images/createdatabase.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/database_creds.png b/docs/howtos/redisbloom/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisbloom/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/database_details.png b/docs/howtos/redisbloom/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisbloom/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/deployment.png b/docs/howtos/redisbloom/images/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redisbloom/images/deployment.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/details_database.png b/docs/howtos/redisbloom/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redisbloom/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/final_subscription.png b/docs/howtos/redisbloom/images/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redisbloom/images/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/launch_database.png b/docs/howtos/redisbloom/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redisbloom/images/launch_database.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/plan.png b/docs/howtos/redisbloom/images/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/howtos/redisbloom/images/plan.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/recloud1.png b/docs/howtos/redisbloom/images/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/howtos/redisbloom/images/recloud1.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/recloud2.png b/docs/howtos/redisbloom/images/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/howtos/redisbloom/images/recloud2.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/recloud3.png b/docs/howtos/redisbloom/images/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/howtos/redisbloom/images/recloud3.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/rediscloud_redisbloom.png b/docs/howtos/redisbloom/images/rediscloud_redisbloom.png deleted file mode 100644 index 1b64278c7ba..00000000000 Binary files a/docs/howtos/redisbloom/images/rediscloud_redisbloom.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/region.png b/docs/howtos/redisbloom/images/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/howtos/redisbloom/images/region.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/select_cloud.png 
b/docs/howtos/redisbloom/images/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redisbloom/images/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/select_cloud_vendor.png b/docs/howtos/redisbloom/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisbloom/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/select_subscription.png b/docs/howtos/redisbloom/images/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redisbloom/images/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/subscription.png b/docs/howtos/redisbloom/images/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/howtos/redisbloom/images/subscription.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/try-free.png b/docs/howtos/redisbloom/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redisbloom/images/try-free.png and /dev/null differ diff --git a/docs/howtos/redisbloom/images/tryfree.png b/docs/howtos/redisbloom/images/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redisbloom/images/tryfree.png and /dev/null differ diff --git a/docs/howtos/redisbloom/index-redisbloom.mdx b/docs/howtos/redisbloom/index-redisbloom.mdx deleted file mode 100644 index 9da46031f51..00000000000 --- a/docs/howtos/redisbloom/index-redisbloom.mdx +++ /dev/null @@ -1,146 +0,0 @@ ---- -id: index-redisbloom -title: Probabilistic data structures using Redis Stack -sidebar_label: Probabilistic data structures using Redis Stack -slug: /howtos/redisbloom -authors: [ajeet] ---- - -RedisBloom extends Redis core to support additional probabilistic data structures. It allows for solving computer science problems in a constant memory space with extremely fast processing and a low error rate. It supports scalable Bloom and Cuckoo filters to determine (with a specified degree of certainty) whether an item is present or absent from a collection. - -The RedisBloom module provides four data types: - -- Bloom filter: A probabilistic data structure that can test for presence. A Bloom filter is a data structure designed to tell you, rapidly and memory-efficiently, whether an element is present in a set. Bloom filters typically exhibit better performance and scalability when inserting items (so if you're often adding items to your dataset then Bloom may be ideal). -- Cuckoo filter: An alternative to Bloom filters, Cuckoo filters comes with additional support for deletion of elements from a set. These filters are quicker on check operations. -- Count-min sketch: A count-min sketch is generally used to determine the frequency of events in a stream. You can query the count-min sketch to get an estimate of the frequency of any given event. -- Top-K: The Top-K probabilistic data structure in RedisBloom is a deterministic algorithm that approximates frequencies for the top k items. With Top-K, you’ll be notified in real time whenever elements enter into or are expelled from your Top-K list. If an element add-command enters the list, the dropped element will be returned. - -In this tutorial, you will see how Redis Stack provides Redis with support for low latency and compact probabilistic data structures. - -### Step 1. 
Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Using RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -[Follow this link](/explore/redisinsightv2/getting-started) to install RedisInsight v2 on your local system. -Assuming that you already have RedisInsight v2 installed on your MacOS, you can browse through the Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Getting Started with RedisBloom - -In the next steps you will use some basic RedisBloom commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. (See part 2 of this tutorial to learn more about using the RedisInsight CLI) To interact with RedisBloom, you use the `BF.ADD` and `BF.EXISTS` commands. - -Let’s go ahead and test drive some RedisBloom-specific operations. 
We will create a basic dataset based on unique visitors’ IP addresses, and you will see how to: - -- Create a Bloom filter -- Determine whether or not an item exists in the Bloom filter -- Add one or more items to the Bloom filter -- Determine whether or not a unique visitor’s IP address exists - -Let’s walk through the process step-by-step: - -#### Create a Bloom filter - -Use the BF.ADD command to add a unique visitor IP address to the Bloom filter as shown here: - -``` ->> BF.ADD unique_visitors 10.94.214.120 -(integer) 1 -(1.75s) -``` - -#### Determine whether or not an item exists - -Use the BF.EXISTS command to determine whether or not an item may exist in the Bloom filter: - -``` ->> BF.EXISTS unique_visitors 10.94.214.120 -(integer) 1 -``` - -``` ->> BF.EXISTS unique_visitors 10.94.214.121 -(integer) 0 -(1.46s) -``` - -In the above example, the first command shows the result as “1”, indicating that the item may exist, whereas the second command displays "0", indicating that the item certainly may not exist. - -#### Add one or more items to the Bloom filter - -Use the BF.MADD command to add one or more items to the Bloom filter, creating the filter if it does not yet exist. This command operates identically to BF.ADD, except it allows multiple inputs and returns multiple values: - -``` ->> BF.MADD unique_visitors 10.94.214.100 10.94.214.200 10.94.214.210 10.94.214.212 -1) (integer) 1 -2) (integer) 1 -3) (integer) 1 -4) (integer) 1 -``` - -As shown above, the BF.MADD allows you to add one or more visitors’ IP addresses to the Bloom filter. - -#### Determine whether or not a unique visitor’s IP address exists - -Use BF.MEXISTS to determine if one or more items may exist in the filter or not: - -``` ->> BF.MEXISTS unique_visitors 10.94.214.200 10.94.214.212 -1) (integer) 1 -2) (integer) 1 -``` - -``` - >> BF.MEXISTS unique_visitors 10.94.214.200 10.94.214.213 -1) (integer) 1 -2) (integer) 0 -``` - -In the above example, the first command shows the result as “1” for both the visitors’ IP addresses, indicating that these items do exist. The second command displays "0" for one of the visitor’s IP addresses, indicating that the item certainly does not exist. - -### Next Steps - -- Learn more about RedisBloom in the [Quick Start](https://oss.redis.com/redisbloom/Quick_Start/) tutorial. 
-- [How to build a Fraud Detection System using RedisGears and RedisBloom](https://developer.redis.com/howtos/frauddetection/) -- [How to Use Redis for Content Filtering](https://redis.com/blog/use-redis-content-filtering/) -- [Benefits of RedisBloom](https://redis.com/modules/redis-bloom/) diff --git a/docs/howtos/redisbloom/redisbloom.png b/docs/howtos/redisbloom/redisbloom.png deleted file mode 100644 index 2b68d266476..00000000000 Binary files a/docs/howtos/redisbloom/redisbloom.png and /dev/null differ diff --git a/docs/howtos/redisbloom/redisbloom1.png b/docs/howtos/redisbloom/redisbloom1.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redisbloom/redisbloom1.png and /dev/null differ diff --git a/docs/howtos/redisbloom/tryfree.png b/docs/howtos/redisbloom/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redisbloom/tryfree.png and /dev/null differ diff --git a/docs/howtos/redisbloom/with-dotnet/index.md b/docs/howtos/redisbloom/with-dotnet/index.md deleted file mode 100644 index 6862a60eb97..00000000000 --- a/docs/howtos/redisbloom/with-dotnet/index.md +++ /dev/null @@ -1,176 +0,0 @@ ---- -id: redisbloom-withdotnet -title: Using RedisBloom with .NET -sidebar_label: Using RedisBloom with .NET -slug: /howtos/redisbloom/with-dotnet/redisbloom-withdotnet -authors: [steve] ---- - -Using RedisBloom allows you to efficiently keep track of presence, heavy hitters, and counts on large streams of data. To use RedisBloom in .NET, you should use the [StackExchange.Redis](https://github.com/stackexchange/stackexchange.redis) library. To get started with that package, follow our [getting started guide](https://developer.redis.com/develop/dotnet/). Once you have a reference to an `IDatabase` object, you will need to use the `db.Execute` and `db.ExecuteAsync` methods to run the custom commands you want against redis bloom. - -## Bloom Filters - -Bloom Filters are a powerful data structure that can tell if an item is in a set, think a username on a sign-up form. They're incredibly compact, requiring only 10-20 bits per item you want to add, and extremely quick to add items to, and equally fast to determine if an item is in a set or not. - -### Create a Filter - -You don't need to create a Bloom Filter explicitly as any call of `BF.ADD` to a non-existent key will automatically create a Bloom Filter for you. However, if you want to tell Redis ahead of time how much data the Bloom Filter can expect and the error rate that you want for that data (the number of false positives it will report), You can use the `BF.RESERVE` command: - -```csharp -await db.ExecuteAsync("BF.RESERVE", "bf:username", .01, 10000); -``` - -The above command will reserve a Bloom Filter on the key `bf:username` that expects 10000 records and will have an error rate of 1%. - -### Adding to a Filter - -To add to a Bloom Filter, all you need is to use the `BF.ADD` command: - -```csharp -await db.ExecuteAsync("BF.ADD", "bf:username", "Kermit"); -``` - -The preceding code will add the username `Kermit` to the `bf:username` filter. - -### Check if an Item is in a Filter - -To check if an item has been added to a Bloom Filter yet, you will use the `BF.EXISTS` command: - -```csharp -var exists = await db.ExecuteAsync("BF.EXISTS", "bf:username", "Kermit") == 1; -``` - -After running that command, if the Bloom Filter reports that it contains the item, `exists` will be true; otherwise, `exists` will be false. 
- -## Count-Min Sketch - -You can use Count-Min Sketches to count the number of times an item has been added to a set quickly and compactly. Although, of course, like other probabilistic data structures, it has some margin of error. In this case, it can over count the number of occurrences. The dimensions of the sketch determine the likelihood of this. - -### Creating a Count-Min Sketch - -There are two ways to create a Count-Min Sketch, by probability and by dimension. Creating a Count-Min Sketch by probability will automatically generate a Count-Min Sketch based on the amount of overestimation you want to allow and the likelihood of overestimating a given element. If you want to initialize by dimensions, a Count-Min Sketch will initialize with the provided width and depth. - -```csharp -await db.ExecuteAsync("CMS.INITBYPROB", "cms:views", .1, .01); -``` - -This code will initialize a Count-Min Sketch. The sketch will have an acceptable overcount of 10% and a probability of overcounting of 1%. - -### Adding Items to a Count-Min Sketch - -To add an item to a Count-Min Sketch, you call the `CMS.INCRBY` command, passing in the quantity of the given item you want to add to the sketch. - -```csharp -await db.ExecuteAsync("CMS.INCRBY", "cms:views", "Gangnam Style", 1); -await db.ExecuteAsync("CMS.INCRBY", "cms:views", "Baby Shark", 1); -await db.ExecuteAsync("CMS.INCRBY", "cms:views", "Gangnam Style", 2); -``` - -The above will add three views of Gangnam Style to the sketch and one view of Baby Shark. - -### Querying the Sketch - -To query the number of occurrences of an element in the sketch, you need to use the `CMS.QUERY` command: - -```csharp -var numViewsGangnamStyle = (long)await db.ExecuteAsync("CMS.QUERY", "cms:views", "Gangnam Style"); -var numViewsBabyShark = (long)await db.ExecuteAsync("CMS.QUERY", "cms:views", "Baby Shark"); -Console.WriteLine($"Gangnam Style Views: {numViewsGangnamStyle}"); -Console.WriteLine($"Baby Shark Views: {numViewsBabyShark}"); -``` - -## Cuckoo Filters - -Cuckoo Filters solve a similar problem to Bloom Filters; they allow you to determine if an item has been added to a set yet. However, Cuckoo Filters have slightly different characteristics than Bloom Filters. For example, you may add the same item to a Cuckoo Filter more than once, and they do support delete operations (which introduces the possibility of false negatives in addition to false positives). - -### Creating a Cuckoo Filter - -Similar to a Bloom Filter, a Cuckoo Filter is automatically created by adding an item to a Cuckoo Filter that does not exist. However, you may want to reserve a Cuckoo Filter ahead of time explicitly, so it knows precisely how many items you expect and how to expand. To do this, just run the `CF.RESERVE` command: - -```csharp -await db.ExecuteAsync("CF.RESERVE", "cf:emails", 10000); -``` - -### Adding to a Cuckoo Filter - -To add an item to a Cuckoo Filter, use the `CF.ADD` command: - -```csharp -await db.ExecuteAsync("CF.ADD", "cf:emails", "foo@bar.com"); -await db.ExecuteAsync("CF.ADD", "cf:emails", "James.Bond@mi6.com"); -``` - -The above will add `foo@bar.com` and `James.Bond@mi6.com` to the Cuckoo Filter. - -### Checking Item Presence in a Cuckoo Filter - -To check if an item has been added to a Cuckoo Filter yet, use the `CF.EXISTS` command: - -```csharp -var jamesEmailExists = (int) await db.ExecuteAsync("CF.EXISTS", "cf:emails", "James.Bond@mi6.com") == 1; -var str = jamesEmailExists - ? 
"James.Bond@mi6.com has already been added" - : "James.Bond@mi6.com has not been added"; -Console.WriteLine(str); -``` - -## Top-K - -The Top-K data structure allows you to keep a compact leader board of heavy-hitters. This data structure can be extremely useful when keeping track of the most popular items in an enormous stream of data as it makes it so you don't have to keep track of all of the counts of all of your records. - -### Initializing a Top-K - -To initialize a Top-K, use the `TOPK.RESERVE` command. This command will reserve a Top-K that will keep track of the highest `k` items: - -```csharp -await db.ExecuteAsync("TOPK.RESERVE", "topk:views", 5); -``` - -The above, for example, will keep track of the five most viewed videos sent to the Top-K. - -### Add Items to the Top-K - -Adding Items to a Top-K requires the use of the `TOPK.ADD` command, this command can take however many items you want to insert into it, so if you get a batch of items to send at once, it may make sense to send them all across at the same time. For example, let's say we wanted to send 10,000 updates to the Top-K at the same time from a random set of videos: - -```csharp -var videos = new[] {"Gangnam Style", "Baby Shark", "Despacito", "Uptown Funk", "See You Again", "Hello", "Roar", "Sorry"}; -var rand = new Random(); -var args = new List(10001){"topk:views"}; -for (var i = 0; i < 10000; i++) -{ - args.Add(videos[rand.Next(videos.Length)]); -} - -await db.ExecuteAsync("TOPK.ADD", args.ToArray()); -``` - -This code will send them all across in one shot. You can, of course, chunk the items and send them in batches as well. Regardless, this will add items to your Top-K. - -### List the Top K Items - -To list the items in your Top-K, you need to query the Top-K using the `TOPK.LIST` command: - -```csharp -var topK = (RedisResult[]) await db.ExecuteAsync("TOPK.LIST", "topk:views"); -foreach (var item in topK) -{ - Console.WriteLine(item); -} -``` - -This code will get all the items back for you and print them out. - -### Query if an Item is in the Top-K - -To see if a given item is present in the Top-K, you would use `TOPK.QUERY`, passing in the item you want to check membership of: - -```csharp -var BabySharkInTopK = (int) await db.ExecuteAsync("TOPK.QUERY", "topk:views", "Baby Shark") == 1; -Console.WriteLine(BabySharkInTopK ? "Baby Shark is in the Top 5" : "Baby Shark is Not in the Top 5" ); -``` - -The above code will check if Baby Shark is in the Top 5 for video views from our above example. 
- -## Resources - -- The Code for this Demo can be found in [GitHub](https://github.com/redis-developer/redis-bloom-dotnet-demo) diff --git a/docs/howtos/redisearch/README.md b/docs/howtos/redisearch/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/howtos/redisearch/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/howtos/redisearch/Verify_subscription.png b/docs/howtos/redisearch/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redisearch/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/activate.png b/docs/howtos/redisearch/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/howtos/redisearch/activate.png and /dev/null differ diff --git a/docs/howtos/redisearch/aws.png b/docs/howtos/redisearch/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 Binary files a/docs/howtos/redisearch/aws.png and /dev/null differ diff --git a/docs/howtos/redisearch/choosemodule.png b/docs/howtos/redisearch/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/howtos/redisearch/choosemodule.png and /dev/null differ diff --git a/docs/howtos/redisearch/create_database.png b/docs/howtos/redisearch/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/howtos/redisearch/create_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/create_subscription.png b/docs/howtos/redisearch/create_subscription.png deleted file mode 100644 index 9ba14b5d269..00000000000 Binary files a/docs/howtos/redisearch/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/createdatabase.png b/docs/howtos/redisearch/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/howtos/redisearch/createdatabase.png and /dev/null differ diff --git a/docs/howtos/redisearch/deployment.png b/docs/howtos/redisearch/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redisearch/deployment.png and /dev/null differ diff --git a/docs/howtos/redisearch/final_subscription.png b/docs/howtos/redisearch/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redisearch/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/README.md b/docs/howtos/redisearch/images/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/howtos/redisearch/images/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/howtos/redisearch/images/Verify_subscription.png b/docs/howtos/redisearch/images/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redisearch/images/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/activate.png b/docs/howtos/redisearch/images/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/howtos/redisearch/images/activate.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/add_database.png b/docs/howtos/redisearch/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redisearch/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/aws.png b/docs/howtos/redisearch/images/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 
Binary files a/docs/howtos/redisearch/images/aws.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/choosemodule.png b/docs/howtos/redisearch/images/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/howtos/redisearch/images/choosemodule.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/create_database.png b/docs/howtos/redisearch/images/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/howtos/redisearch/images/create_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/create_subscription.png b/docs/howtos/redisearch/images/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/howtos/redisearch/images/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/createdatabase.png b/docs/howtos/redisearch/images/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/howtos/redisearch/images/createdatabase.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/database_creds.png b/docs/howtos/redisearch/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisearch/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/database_details.png b/docs/howtos/redisearch/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisearch/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/deployment.png b/docs/howtos/redisearch/images/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redisearch/images/deployment.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/details_database.png b/docs/howtos/redisearch/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redisearch/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/final_subscription.png b/docs/howtos/redisearch/images/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redisearch/images/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/launch_database.png b/docs/howtos/redisearch/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redisearch/images/launch_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/plan.png b/docs/howtos/redisearch/images/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/howtos/redisearch/images/plan.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/recloud1.png b/docs/howtos/redisearch/images/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/howtos/redisearch/images/recloud1.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/recloud2.png b/docs/howtos/redisearch/images/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/howtos/redisearch/images/recloud2.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/recloud3.png b/docs/howtos/redisearch/images/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/howtos/redisearch/images/recloud3.png 
and /dev/null differ diff --git a/docs/howtos/redisearch/images/rediscloud_redisearch.png b/docs/howtos/redisearch/images/rediscloud_redisearch.png deleted file mode 100644 index 9d5dbdc9c99..00000000000 Binary files a/docs/howtos/redisearch/images/rediscloud_redisearch.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/region.png b/docs/howtos/redisearch/images/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/howtos/redisearch/images/region.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/select_cloud.png b/docs/howtos/redisearch/images/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redisearch/images/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/select_cloud_vendor.png b/docs/howtos/redisearch/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisearch/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/select_subscription.png b/docs/howtos/redisearch/images/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redisearch/images/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/subscription.png b/docs/howtos/redisearch/images/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/howtos/redisearch/images/subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/try-free.png b/docs/howtos/redisearch/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redisearch/images/try-free.png and /dev/null differ diff --git a/docs/howtos/redisearch/images/tryfree.png b/docs/howtos/redisearch/images/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redisearch/images/tryfree.png and /dev/null differ diff --git a/docs/howtos/redisearch/index-redisearch.mdx b/docs/howtos/redisearch/index-redisearch.mdx deleted file mode 100644 index 0e9ffa33e65..00000000000 --- a/docs/howtos/redisearch/index-redisearch.mdx +++ /dev/null @@ -1,160 +0,0 @@ ---- -id: index-redisearch -title: Full-text search using Redis Stack -sidebar_label: RediSearch Tutorial -slug: /howtos/redisearch -authors: [ajeet] ---- - -RediSearch is a powerful text search and secondary indexing engine, built on top of Redis as a Redis module. Written in C, RediSearch is extremely fast compared to other open-source search engines. It implements multiple data types and commands that fundamentally change what you can do with Redis. RediSearch supports capabilities for search and filtering such as geo-spatial queries, retrieving only IDs (instead of whole documents), and custom document scoring. Aggregations can combine map, filter, and reduce/group-by operations in custom pipelines that run across millions of elements in an instant. - -RediSearch also supports auto-completion with fuzzy prefix matching, and atomic real-time insertion of new documents to a search index. With the latest RediSearch 2.0 release, it’s now easier than ever to create a secondary index on top of your existing data. You can just add RediSearch to your existing Redis database, create an index, and start querying it, without having to migrate your data or use new commands for adding data to the index. 
This drastically lowers the learning curve for new RediSearch users and lets you create indexes on your existing Redis databases—without even having to restart them. - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Using RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -Assuming that you already have RedisInsight v2 installed on your MacOS, you can browse through the Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Getting Started with Redisearch - -To begin, let’s create a basic dataset based on movies information, which we will use to show how to: - -- Insert data -- Create an index -- Query data - -![Redisearch](redisearch12.png) - -#### Insert data into RediSearch - -We are now ready to insert some data. This example uses movies data stored as Redis Hashes, so let’s insert a couple of movies: - -``` -HSET movies:11002 title "Star Wars: Episode V - The Empire Strikes Back" plot "Luke Skywalker begins Jedi training with Yoda." release_year 1980 genre "Action" rating 8.7 votes 1127635 - -(integer) 6 - -> HSET movies:11003 title "The Godfather" plot "The aging patriarch of an organized crime dynasty transfers control of his empire to his son." release_year 1972 genre "Drama" rating 9.2 votes 1563839 - -(integer) 6 -``` - -Your Redis Enterprise Cloud database now contains two Hashes. 
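If you prefer to load this data from application code rather than the CLI, a small redis-py sketch such as the one below produces the same two hashes; the connection details are assumptions, and the keys and fields mirror the HSET commands above.

```python
# Create the same two movie hashes shown above, from Python instead of redis-cli.
# Assumes a database at localhost:6379 with no password; swap in your own
# Redis Cloud endpoint, port and password as needed.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

r.hset("movies:11002", mapping={
    "title": "Star Wars: Episode V - The Empire Strikes Back",
    "plot": "Luke Skywalker begins Jedi training with Yoda.",
    "release_year": 1980,
    "genre": "Action",
    "rating": 8.7,
    "votes": 1127635,
})

r.hset("movies:11003", mapping={
    "title": "The Godfather",
    "plot": "The aging patriarch of an organized crime dynasty "
            "transfers control of his empire to his son.",
    "release_year": 1972,
    "genre": "Drama",
    "rating": 9.2,
    "votes": 1563839,
})
```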
It is simple to retrieve information using the HMGET command, if you know the key of the movies (movies:11002): - -``` -> HMGET movies:11002 title rating - -1) "Star Wars: Episode V - The Empire Strikes Back" -2) "8.7" -``` - -#### Create an index in RediSearch - -To be able to query the hashes on the field for title, say, or genre, you must first create an index. To create an index, you must define a schema to list the fields and their types that are indexed, and that you can use in your queries. - -Use the FT.CREATE command to create an index, as shown here: - -``` -> FT.CREATE idx:movies ON hash PREFIX 1 "movies:" SCHEMA title TEXT SORTABLE release_year NUMERIC SORTABLE rating NUMERIC SORTABLE genre TAG SORTABLE - -OK -``` - -In the command above, we: - -- Create an index named idx:movies -- Used a schema made up of four fields: - title - release_year - rating - genre - -Before running queries on our new index, though, let’s take a closer look at the elements of the FT.CREATE command: - -- idx:movies: the name of the index, which you will use when doing queries -- ON hash: the type of structure to be indexed. (Note that RediSearch 2.0 supports only the Hash structure, but this parameter will allow RediSearch to index other structures in the future.) -- PREFIX 1 “movies:”: the prefix of the keys that should be indexed. This is a list, so since we want to index only movies:\* keys the number is 1. If you want to index movies and TV shows with the same fields, you could use: PREFIX 2 “movies:” “tv_show:” -- SCHEMA …: defines the schema, the fields, and their type to index. As you can see in the command, we are using TEXT, NUMERIC, and TAG, as well as SORTABLE parameters. - -The RediSearch 2.0 engine will scan the database using the PREFIX values, and update the index based on the schema definition. This makes it easy to add an index to an existing application that uses Hashes, there’s no need to change your code. - -#### Search the movies in the RediSearch index - -You can now use the FT.SEARCH to search your database, for example, to search all movies sorted by release year: - -``` -> FT.SEARCH idx:movies * SORTBY release_year ASC RETURN 2 title release_year -1) (integer) 2 -2) "movies:1003" -3) 1) "release_year" - 2) "1972" - 3) "title" - 4) "The Godfather" -4) "movies:1002" -5) 1) "release_year" - 2) "1980" - 3) "title" - 4) "Star Wars: Episode V - The Empire Strikes Back" -``` - -You can also search “action” movies that contain “star” in the index (in our sample index, the term “star” will occur only in the title): - -``` -> FT.SEARCH idx:movies "star @genre:{action}" RETURN 2 title release_year -1) (integer) 1 -2) "movies:1002" -3) 1) "title" - 2) "Star Wars: Episode V - The Empire Strikes Back" - 3) "release_year" - 4) "1980" -``` - -The FT.SEARCH command is the base command to search your database, it has many options and is associated with a powerful and rich query syntax that you can find in the documentation. (Note: You can also use the index to do data aggregation using the FT.AGGREGATE command.) - -### Next Steps - -- Learn more about RediSearch in the [Getting Started with RediSearch 2.0](https://github.com/RediSearch/redisearch-getting-started/) tutorial on GitHub. 
-- [How to list and search Movie database using Redisearch](/howtos/moviesdatabase/getting-started) diff --git a/docs/howtos/redisearch/launch_database.png b/docs/howtos/redisearch/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redisearch/launch_database.png and /dev/null differ diff --git a/docs/howtos/redisearch/plan.png b/docs/howtos/redisearch/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/howtos/redisearch/plan.png and /dev/null differ diff --git a/docs/howtos/redisearch/recloud1.png b/docs/howtos/redisearch/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/howtos/redisearch/recloud1.png and /dev/null differ diff --git a/docs/howtos/redisearch/recloud2.png b/docs/howtos/redisearch/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/howtos/redisearch/recloud2.png and /dev/null differ diff --git a/docs/howtos/redisearch/recloud3.png b/docs/howtos/redisearch/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/howtos/redisearch/recloud3.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch1.png b/docs/howtos/redisearch/redisearch1.png deleted file mode 100644 index 08a87e0ef51..00000000000 Binary files a/docs/howtos/redisearch/redisearch1.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch10.png b/docs/howtos/redisearch/redisearch10.png deleted file mode 100644 index b621fc78951..00000000000 Binary files a/docs/howtos/redisearch/redisearch10.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch11.png b/docs/howtos/redisearch/redisearch11.png deleted file mode 100644 index fd39d98d208..00000000000 Binary files a/docs/howtos/redisearch/redisearch11.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch12.png b/docs/howtos/redisearch/redisearch12.png deleted file mode 100644 index 64f18af1488..00000000000 Binary files a/docs/howtos/redisearch/redisearch12.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch2.png b/docs/howtos/redisearch/redisearch2.png deleted file mode 100644 index 86202909239..00000000000 Binary files a/docs/howtos/redisearch/redisearch2.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch3.png b/docs/howtos/redisearch/redisearch3.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redisearch/redisearch3.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch4.png b/docs/howtos/redisearch/redisearch4.png deleted file mode 100644 index ea5817846ee..00000000000 Binary files a/docs/howtos/redisearch/redisearch4.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch5.png b/docs/howtos/redisearch/redisearch5.png deleted file mode 100644 index 3c1ae9b1840..00000000000 Binary files a/docs/howtos/redisearch/redisearch5.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch6.png b/docs/howtos/redisearch/redisearch6.png deleted file mode 100644 index becf40ab0d2..00000000000 Binary files a/docs/howtos/redisearch/redisearch6.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch7.png b/docs/howtos/redisearch/redisearch7.png deleted file mode 100644 index cedbfe66b59..00000000000 Binary files a/docs/howtos/redisearch/redisearch7.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch8.png b/docs/howtos/redisearch/redisearch8.png deleted file mode 100644 
index 645cf65625f..00000000000 Binary files a/docs/howtos/redisearch/redisearch8.png and /dev/null differ diff --git a/docs/howtos/redisearch/redisearch9.png b/docs/howtos/redisearch/redisearch9.png deleted file mode 100644 index 44368245b37..00000000000 Binary files a/docs/howtos/redisearch/redisearch9.png and /dev/null differ diff --git a/docs/howtos/redisearch/region.png b/docs/howtos/redisearch/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/howtos/redisearch/region.png and /dev/null differ diff --git a/docs/howtos/redisearch/select_cloud.png b/docs/howtos/redisearch/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redisearch/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redisearch/select_subscription.png b/docs/howtos/redisearch/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redisearch/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/subscription.png b/docs/howtos/redisearch/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/howtos/redisearch/subscription.png and /dev/null differ diff --git a/docs/howtos/redisearch/try-free.png b/docs/howtos/redisearch/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redisearch/try-free.png and /dev/null differ diff --git a/docs/howtos/redisearch/tryfree.png b/docs/howtos/redisearch/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redisearch/tryfree.png and /dev/null differ diff --git a/docs/howtos/redisgears/index-redisgears.mdx b/docs/howtos/redisgears/index-redisgears.mdx deleted file mode 100644 index c0407c9912b..00000000000 --- a/docs/howtos/redisgears/index-redisgears.mdx +++ /dev/null @@ -1,66 +0,0 @@ ---- -id: index-redisgears -title: RedisGears Tutorial -sidebar_label: RedisGears Tutorial -slug: /howtos/redisgears -authors: [ajeet] ---- - -[RedisGears](https://redis.com/modules/redis-gears/) is an engine for data processing in Redis. RedisGears supports batch and event-driven processing for Redis data. To use RedisGears, you write functions that describe how your data should be processed. You then submit this code to your Redis deployment for remote execution. - -RedisGears is implemented by a Redis module. To use RedisGears, you’ll need to make sure that your Redis deployment has the module installed. - -### Step 1. Installing RedisGears - -Before you can use RedisGears, you have to install the RedisGears module. We will be using redislabs/redismod Docker image for this demonsration - -```bash - docker run -d -p 6379:6379 redislabs/redismod -``` - -### Step 2. Verifying if RedisGears module is enabled: - -You can directly use `redis-cli` CLI to verify if RedisGears module("rg") is properly loaded or not. - -```bash - redis-cli - redis-cli - 127.0.0.1:6379> info modules - # Modules - .. - module:name=rg,ver=10006,api=1,filters=0,usedby=[],using=[ai],options=[] -``` - -### Step 3. Create a "wordcount" Python script - -To demonstrate RedisGears functionality, we will be performing a unique word count on the existing strings. -We will be writing a RedisGears function to do this. 
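The word count needs some string keys to operate on. If your database is empty, you can seed a few sample values first; the snippet below is a redis-py sketch with made-up key names, chosen so that the counts line up with the output shown in Step 4.

```python
# Seed a few sample strings for the word count (key names are arbitrary).
# Assumes the redislabs/redismod container from Step 1 on localhost:6379.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

r.mset({
    "sentence:1": "hello world",
    "sentence:2": "hello galaxy",
    "sentence:3": "hello universe",
})
```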
- -Open a file called wordcount.py, and add the following code: - -```python - gb = GearsBuilder() - gb.map(lambda x: x['value']) # map each key object to its string value - gb.flatmap(lambda x: x.split()) # split each string into a list of words - gb.countby() # run a count-unique on these words - gb.run() -``` - -### Step 4. Execute the CLI - -```bash - redis-cli rg.pyexecute "`cat wordcount.py`" - 1) 1) "{'key': 'world', 'value': 1}" - 2) "{'key': 'galaxy', 'value': 1}" - 3) "{'key': 'hello', 'value': 3}" - 4) "{'key': 'universe', 'value': 1}" - 2) (empty array) -``` - -The results here show the number of occurences of each word in all of our strings. So, we’ve effectively processed the data in our Redis database all at once, in a batch. - -### References - -- [How to build a Fraud Detection System using RedisGears and RedisBloom](/howtos/frauddetection) -- [Writing Your Serverless function using RedisGears Browser Tool](/explore/redisinsight/redisgears) -- [RedisGears Module](https://redis.com/modules/redis-gears/) diff --git a/docs/howtos/redisgraph/csvtograph/index-csvtograph.mdx b/docs/howtos/redisgraph/csvtograph/index-csvtograph.mdx index 6e2e567abca..dc95d12d285 100644 --- a/docs/howtos/redisgraph/csvtograph/index-csvtograph.mdx +++ b/docs/howtos/redisgraph/csvtograph/index-csvtograph.mdx @@ -6,7 +6,13 @@ slug: /howtos/redisgraph/csvtograph authors: [ajeet] --- -[RedisGraph](https://redis.com/modules/redis-graph/) is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. It shows how your data is connected through multiple visualization integrations including [RedisInsight](/explore/redisinsight/getting-started), Linkurious, and Graphileon. +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + +[RedisGraph](https://redis.com/modules/redis-graph/) is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. It shows how your data is connected through multiple visualization integrations including RedisInsight, Linkurious, and Graphileon. It allows you to query graphs using the industry-standard Cypher query language and you can easily use graph capabilities from application code. ![My Image](redisgraph_preview.png) @@ -17,10 +23,10 @@ If you have a bunch of CSV files that you want to load to RedisGraph database, y Follow the steps below to load CSV data into RedisGraph database: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismod + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. Verify if RedisGraph module is loaded @@ -173,9 +179,7 @@ Next, point your browser to http://localhost:8001. ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. -- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -185,13 +189,11 @@ Next, point your browser to http://localhost:8001. 
target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/explore-python-code/index-explorepythoncode.mdx b/docs/howtos/redisgraph/explore-python-code/index-explorepythoncode.mdx index f1d16c52ea2..332878c8807 100644 --- a/docs/howtos/redisgraph/explore-python-code/index-explorepythoncode.mdx +++ b/docs/howtos/redisgraph/explore-python-code/index-explorepythoncode.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/explore-python-code authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + Pycograph is an open source tool that creates a RedisGraph model of your Python code. The tool lets you to explore your Python codebase with graph queries. With Pycograph, you can query the python code with Cypher. Additionally, it is possible to visualize the graph model using RedisInsight. The project is hosted over https://pycograph.com/ and package is available in [PyPI repository](https://pypi.org/project/pycograph/). It was introduced for the first time by [Reka Horvath](https://pypi.org/user/reka/) during RedisConf 2021. @@ -37,10 +43,10 @@ Let us see how to explore Python code using Pycograph and RedisGraph below: ### Step 3. Start RedisGraph Module -The redislabs/redismod Docker image provides you all the essential Redis modules. +The redis/redis-stack Docker image provides you all the essential Redis modules. ```bash - docker run -d -p 6379:6379 redislabs/redismod + docker run -d -p 6379:6379 redis/redis-stack ``` ### Step 4. Run RedisInsight @@ -120,13 +126,11 @@ Functions called by the Docker Compose top level commands up and run target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/getting-started/index-gettingstarted.mdx b/docs/howtos/redisgraph/getting-started/index-gettingstarted.mdx index 9af93fac350..3d89580cd12 100644 --- a/docs/howtos/redisgraph/getting-started/index-gettingstarted.mdx +++ b/docs/howtos/redisgraph/getting-started/index-gettingstarted.mdx @@ -6,14 +6,21 @@ slug: /howtos/redisgraph/getting-started authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + + RedisGraph is a Redis module that enables enterprises to process any kind of connected data much faster than with traditional relational or existing graph databases. RedisGraph implements a unique data storage and processing solution (with sparse-adjacency matrices and GraphBLAS) to deliver the fastest and most efficient way to store, manage, and process connected data in graphs. With RedisGraph, you can process complex transactions 10 - 600 times faster than with traditional graph solutions while using 50 - 60% less memory resources than other graph databases! ### Step 1. Create a free Cloud account -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. +Create your free Redis Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! +:::tip +For a limited time, use **TIGER200** to get **$200** credits on Redis Cloud and try all the advanced capabilities! 
:tada: [Click here to sign up](https://redis.com/try-free) @@ -23,7 +30,7 @@ For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. -:::info TIP +:::tip If you want to create a custom database with your preferred name and type of Redis, click "Create a custom database" option shown in the image. ::: @@ -38,7 +45,7 @@ You will be provided with Public endpoint URL and "Redis Stack" as the type of d ### Step 4. Install RedisInsight -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. +RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Cloud databases. You can install Redis Stack on your local system to get RedisInsight GUI tool up and running. Ensure that you have `brew` package installed in your Mac system. @@ -61,9 +68,9 @@ Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI ![access redisinsight](images/add_database.png) -### Step 6. Enter Redis Enterprise Cloud details +### Step 6. Enter Redis Cloud details -Add the Redis Enterprise cloud database endpoint, port and password. +Add the Redis Cloud database endpoint, port and password. ![access redisinsight](images/database_creds.png) @@ -231,9 +238,7 @@ Click on the Execute button, and double click on the actors to follow the relati ### Next Steps -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/3) tutorial. -- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -243,13 +248,11 @@ Click on the Execute button, and double click on the actors to follow the relati target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/index-redisgraph.mdx b/docs/howtos/redisgraph/index-redisgraph.mdx index 6e1b4163dda..7c3640ef79a 100644 --- a/docs/howtos/redisgraph/index-redisgraph.mdx +++ b/docs/howtos/redisgraph/index-redisgraph.mdx @@ -5,8 +5,10 @@ sidebar_label: Overview slug: /howtos/redisgraph/ --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + The following links provides you with the available options to get started with RedisGraph
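Whichever option you choose, a quick way to confirm that your database has graph support is to send a trivial Cypher query. The sketch below uses redis-py; the endpoint, password and graph key name are placeholders, so substitute the details of the database you created above.

```python
# Smoke test for a graph-enabled database: create one node and read it back.
# Replace host, port and password with your own endpoint details;
# the graph key name "mygraph" is a placeholder.
import redis

r = redis.Redis(host="your-endpoint.cloud.redislabs.com", port=12345,
                password="your-password", decode_responses=True)

print(r.execute_command("GRAPH.QUERY", "mygraph", "CREATE (:Person {name:'test'})"))
print(r.execute_command("GRAPH.QUERY", "mygraph", "MATCH (n) RETURN n"))
```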
diff --git a/docs/howtos/redisgraph/redisgraph-cheatsheet/index-redisgraph-cheatsheet.mdx b/docs/howtos/redisgraph/redisgraph-cheatsheet/index-redisgraph-cheatsheet.mdx index d9ade4b5b8d..39702250e71 100644 --- a/docs/howtos/redisgraph/redisgraph-cheatsheet/index-redisgraph-cheatsheet.mdx +++ b/docs/howtos/redisgraph/redisgraph-cheatsheet/index-redisgraph-cheatsheet.mdx @@ -4,6 +4,9 @@ title: RedisGRAPH Cheatsheet sidebar_label: RedisGRAPH CheatSheet slug: /howtos/redisgraph/redisgraph-cheatsheet --- +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + | Command | Purpose | Syntax | | :------------------------------------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | diff --git a/docs/howtos/redisgraph/redisgraphmovies/index-redisgraphmovies.mdx b/docs/howtos/redisgraph/redisgraphmovies/index-redisgraphmovies.mdx index 7f5ab60a932..ceb1be1f83c 100644 --- a/docs/howtos/redisgraph/redisgraphmovies/index-redisgraphmovies.mdx +++ b/docs/howtos/redisgraph/redisgraphmovies/index-redisgraphmovies.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/redisgraphmovies authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + IMDb(Internet Movie Database) is the world's most popular and authoritative source for information on movies, TV shows and celebrities. This application is an IMDB clone with basic account authentication and movie recommendation functionality. You will learn the power of RedisGraph and NodeJS to build a simple movie database. ![moviedb](moviedb_frontpage.png) @@ -20,10 +26,10 @@ IMDb(Internet Movie Database) is the world's most popular and authoritative sour - Node - v13.14.0+ - NPM - v7.6.0+ -### Step 2. Run Redismod Docker container +### Step 2. Run Redis Stack Docker container ```bash - docker run -d -p 6379:6379 redislabs/redismod + docker run -d -p 6379:6379 redis/redis-stack ``` Ensure that Docker container is up and running: @@ -31,7 +37,7 @@ Ensure that Docker container is up and running: ```bash docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES - fd5ef30f025a redislabs/redismod "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck + fd5ef30f025a redis/redis-stack "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck ``` ### Step 3. Run RedisInsight Docker container @@ -46,7 +52,7 @@ Ensure that Docker container is up and runnig docker ps CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES 264db1706dcc redislabs/redisinsight:latest "bash ./docker-entry…" About an hour ago Up About an hour 0.0.0.0:8001->8001/tcp angry_shirley - fd5ef30f025a redislabs/redismod "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck + fd5ef30f025a redis/redis-stack "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck ``` ### Step 4. 
Clone the repository @@ -471,5 +477,4 @@ On this page a user can rate the film and view the Actors/directors who particip ### References -- [How to list and search Movies Database using Redisearch](/howtos/moviesdatabase/getting-started) -- [RediSearch 2.0’s New Indexing Capabilities to add search to Movie app](https://redis.com/blog/getting-started-with-redisearch-2-0/) +- [Redis Search Indexing Capabilities to add search to Movie app](https://redis.com/blog/getting-started-with-redisearch-2-0/) diff --git a/docs/howtos/redisgraph/using-dotnet/index-usingdotnet.md b/docs/howtos/redisgraph/using-dotnet/index-usingdotnet.md index feae096d06b..27c837d7b59 100644 --- a/docs/howtos/redisgraph/using-dotnet/index-usingdotnet.md +++ b/docs/howtos/redisgraph/using-dotnet/index-usingdotnet.md @@ -5,7 +5,9 @@ sidebar_label: RedisGraph and .NET slug: /howtos/redisgraph/using-dotnet authors: [steve] --- +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + RedisGraph enables you to store and query graph data in Redis using the [Cypher Query Language](https://opencypher.org/). In this article, we will discuss the usage of RedisGraph with .NET. ## NRedisGraph diff --git a/docs/howtos/redisgraph/using-go/index-usinggo.mdx b/docs/howtos/redisgraph/using-go/index-usinggo.mdx index 67389c760d4..2079284a4cf 100644 --- a/docs/howtos/redisgraph/using-go/index-usinggo.mdx +++ b/docs/howtos/redisgraph/using-go/index-usinggo.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/using-go authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + RedisGraph is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. Show how your data is connected through multiple visualization integrations including RedisInsight, Linkurious, and Graphileon. Query graphs using the industry-standard Cypher query language and easily use graph capabilities from application code. ## RedisGraph Go Client @@ -14,10 +20,10 @@ The `redisgraph-go` is a Golang client for the RedisGraph module. It relies on r Follow the steps below to get started with RedisGraph with Go: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismodCopy + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. Verify if RedisGraph module is loaded @@ -104,9 +110,7 @@ GRAPH.QUERY "social" "MATCH (n) RETURN n" ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. -- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -116,13 +120,11 @@ GRAPH.QUERY "social" "MATCH (n) RETURN n" target="_blank" rel="noopener" className="link"> - Redis Launchpad -
diff --git a/docs/howtos/redisgraph/using-javascript/index-usingjavascript.mdx b/docs/howtos/redisgraph/using-javascript/index-usingjavascript.mdx index 92da79f9b13..f0c33a72b84 100644 --- a/docs/howtos/redisgraph/using-javascript/index-usingjavascript.mdx +++ b/docs/howtos/redisgraph/using-javascript/index-usingjavascript.mdx @@ -6,16 +6,22 @@ slug: /howtos/redisgraph/using-javascript authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + RedisGraph is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. Show how your data is connected through multiple visualization integrations including RedisInsight, Linkurious, and Graphileon. Query graphs using the industry-standard Cypher query language and easily use graph capabilities from application code. ## RedisGraph JavaScript Client Follow the steps below to get started with RedisGraph with Java: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismod + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. Verify if RedisGraph module is loaded @@ -133,9 +139,7 @@ You can display the number of records returned by a query: ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. -- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -145,13 +149,11 @@ You can display the number of records returned by a query: target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/using-python/index-usingpython.mdx b/docs/howtos/redisgraph/using-python/index-usingpython.mdx index a1278bbd49c..c10f174b6cc 100644 --- a/docs/howtos/redisgraph/using-python/index-usingpython.mdx +++ b/docs/howtos/redisgraph/using-python/index-usingpython.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/using-python authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + RedisGraph is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. Show how your data is connected through multiple visualization integrations including RedisInsight, Linkurious, and Graphileon. Query graphs using the industry-standard Cypher query language and easily use graph capabilities from application code. ![My Image](redisgraph_python.png) @@ -16,10 +22,10 @@ The 'redisgraph-py' is a package that allows querying Graph data in a Redis data Follow the steps below to get started with RedisGraph with Python: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismodCopy + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. Verify if RedisGraph module is loaded @@ -168,9 +174,7 @@ MATCH (n) RETURN n ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. 
-- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -180,13 +184,11 @@ MATCH (n) RETURN n target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/using-redisinsight/index-usingredisinsight.mdx b/docs/howtos/redisgraph/using-redisinsight/index-usingredisinsight.mdx index cae5c05cfa7..0e151f07c1b 100644 --- a/docs/howtos/redisgraph/using-redisinsight/index-usingredisinsight.mdx +++ b/docs/howtos/redisgraph/using-redisinsight/index-usingredisinsight.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/using-redisinsight authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + If you’re a Redis user who prefers to use a Graphical User Interface(GUI) for graph queries, then RedisInsight is a right tool for you. It’s 100% free pure desktop Redis GUI that provides easy-to-use browser tools to query, visualize and interactively manipulate graphs. You can add new graphs, run queries and explore the results over the GUI tool. RedisInsight supports [RedisGraph](https://oss.redis.com/redisgraph/) and allows you to: @@ -25,7 +31,7 @@ Follow the below steps to see how your data is connected via the RedisInsight Br ## Step 1. Create Redis database -[Follow this link to create a Redis database](https://developer.redis.com/howtos/redisgraph) using Redis Enterprise Cloud with RedisGraph module enabled +[Follow this link to create a Redis database](https://developer.redis.com/howtos/redisgraph) using Redis Cloud with RedisGraph module enabled ![alt_text](deployment.png) @@ -194,10 +200,6 @@ GRAPH.QUERY GOT_DEMO "MATCH (w:writer)-[wrote]->(b:book) return w,b" ## Additional Resources - [RedisGraph Project](https://oss.redis.com/redisgraph/) -- [Slowlog Configuration using RedisInsight](/explore/redisinsight/slowlog) -- [Memory Analysis using RedisInsight](/explore/redisinsight/memoryanalyzer) -- [Visualize Redis database keys using RedisInsight Browser Tool](/explore/redisinsight/browser) -- [Using Redis Streams with RedisInsight](/explore/redisinsight/streams) ## @@ -207,13 +209,11 @@ GRAPH.QUERY GOT_DEMO "MATCH (w:writer)-[wrote]->(b:book) return w,b" target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/using-ruby/index-usingruby.mdx b/docs/howtos/redisgraph/using-ruby/index-usingruby.mdx index 95ed06008c4..505edfbc21c 100644 --- a/docs/howtos/redisgraph/using-ruby/index-usingruby.mdx +++ b/docs/howtos/redisgraph/using-ruby/index-usingruby.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/using-ruby authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + RedisGraph is the first queryable Property Graph database to use sparse matrices to represent the adjacency matrix in graphs and linear algebra to query the graph. Few of the notable features of RedisGraph includes: @@ -25,10 +31,10 @@ redisgraph-rb is a Ruby gem client for the RedisGraph module. It relies on redis Follow the steps below to get started with RedisGraph with Ruby: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismodCopy + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. 
Verify if RedisGraph module is loaded @@ -102,9 +108,7 @@ Copy the below sample code and save it in a file "test.rb" ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. -- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -114,13 +118,11 @@ Copy the below sample code and save it in a file "test.rb" target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraph/using-rust/index-usingrust.mdx b/docs/howtos/redisgraph/using-rust/index-usingrust.mdx index 602822295c8..4933e6e4b2a 100644 --- a/docs/howtos/redisgraph/using-rust/index-usingrust.mdx +++ b/docs/howtos/redisgraph/using-rust/index-usingrust.mdx @@ -6,6 +6,12 @@ slug: /howtos/redisgraph/using-rust authors: [ajeet] --- +import Authors from '@theme/Authors'; +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + + + + RedisGraph is the first queryable Property Graph database to use sparse matrices to represent the adjacency matrix in graphs and linear algebra to query the graph. RedisGraph is based on a unique approach and architecture that translates Cypher queries to matrix operations executed over a GraphBLAS engine. This new design allows use cases like social graph operation, fraud detection, and real-time recommendation to be executed 10x – 600x faster than any other graph database. Undoubtedly, it is the fastest graph database that processes complex graph operations in real time, 10x – 600x faster than any other graph database. It primariy shows how your data is connected through multiple visualization integrations including RedisInsight, Linkurious, and Graphileon. @@ -25,10 +31,10 @@ redisgraph-rs is an idiomatic Rust client for RedisGraph, the graph database by Follow the steps below to get started with RedisGraph with Rust: -### Step 1. Run RedisMod Docker container +### Step 1. Run Redis Stack Docker container ```bash - docker run -p 6379:6379 --name redislabs/redismod + docker run -p 6379:6379 --name redis/redis-stack ``` ### Step 2. Verify if RedisGraph module is loaded @@ -107,7 +113,7 @@ Copy the below content and save it as "main.rs" under src directory. ### Step 8. Install RedisInsight -[Follow this link to install RedisInsight](/explore/redisinsight/getting-started). For this demo, we will be using RedisInsight Docker container as shown below: +For this demo, we will be using RedisInsight Docker container as shown below: ```bash docker run -d -v redisinsight:/db -p 8001:8001 redislabs/redisinsight:latest @@ -129,9 +135,7 @@ You can use the limit clause to limit the number of records returned by a query: ### References -- [Building Movies database app using RedisGraph and NodeJS](/howtos/redisgraphmovies/) - Learn more about RedisGraph in the [Quickstart](https://oss.redis.com/redisgraph/) tutorial. 
-- [Query, Visualize and Manipulate Graphs using RedisGraph Browser Tool](/explore/redisinsight/redisgraph) ## @@ -141,13 +145,11 @@ You can use the limit clause to limit the number of records returned by a query: target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/redisgraphmovies/a.png b/docs/howtos/redisgraphmovies/a.png deleted file mode 100644 index fdfeed5dd89..00000000000 Binary files a/docs/howtos/redisgraphmovies/a.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/b.png b/docs/howtos/redisgraphmovies/b.png deleted file mode 100644 index 9cf455321e2..00000000000 Binary files a/docs/howtos/redisgraphmovies/b.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/c.png b/docs/howtos/redisgraphmovies/c.png deleted file mode 100644 index c935eb002da..00000000000 Binary files a/docs/howtos/redisgraphmovies/c.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/d.png b/docs/howtos/redisgraphmovies/d.png deleted file mode 100644 index 47dae4e1dab..00000000000 Binary files a/docs/howtos/redisgraphmovies/d.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/e.png b/docs/howtos/redisgraphmovies/e.png deleted file mode 100644 index 2e86fdbf74f..00000000000 Binary files a/docs/howtos/redisgraphmovies/e.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/f.png b/docs/howtos/redisgraphmovies/f.png deleted file mode 100644 index 9bef5c42234..00000000000 Binary files a/docs/howtos/redisgraphmovies/f.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/g.png b/docs/howtos/redisgraphmovies/g.png deleted file mode 100644 index 24390247d5a..00000000000 Binary files a/docs/howtos/redisgraphmovies/g.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/index-redisgraphmovies.mdx b/docs/howtos/redisgraphmovies/index-redisgraphmovies.mdx deleted file mode 100644 index 2406da78741..00000000000 --- a/docs/howtos/redisgraphmovies/index-redisgraphmovies.mdx +++ /dev/null @@ -1,474 +0,0 @@ ---- -id: index-redisgraphmovies -title: Building Movies database app using RedisGraph and NodeJS -sidebar_label: Building Movies database app using RedisGraph and NodeJS -slug: /howtos/redisgraphmovies ---- - -IMDb(Internet Movie Database) is the world's most popular and authoritative source for information on movies, TV shows and celebrities. This application is an IMDB clone with basic account authentication and movie recommendation functionality. You will learn the power of RedisGraph and NodeJS to build a simple movie database. - -![moviedb](moviedb_frontpage.png) - -### Tech Stack - -- Frontend - React -- Backend - Node.js, Redis, RedisGraph - -### Step 1. Install the pre-requisites - -- Node - v13.14.0+ -- NPM - v7.6.0+ - -### Step 2. Run Redismod Docker container - -```bash - docker run -d -p 6379:6379 redislabs/redismod -``` - -Ensure that Docker container is up and running: - -```bash - docker ps - CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES - fd5ef30f025a redislabs/redismod "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck -``` - -### Step 3. 
Run RedisInsight Docker container - -```bash - docker run -d -v redisinsight:/db -p 8001:8001 redislabs/redisinsight:latest -``` - -Ensure that Docker container is up and runnig - -```bash - docker ps - CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES - 264db1706dcc redislabs/redisinsight:latest "bash ./docker-entry…" About an hour ago Up About an hour 0.0.0.0:8001->8001/tcp angry_shirley - fd5ef30f025a redislabs/redismod "redis-server --load…" 2 hours ago Up 2 hours 0.0.0.0:6379->6379/tcp nervous_buck -``` - -### Step 4. Clone the repository - -```bash - git clone https://github.com/redis-developer/basic-redisgraph-movie-demo-app-nodejs -``` - -### Step 5. Setting up environment variables - -Copy .env.sample to .env and add the below details: - -```bash - REDIS_ENDPOINT_URL = "Redis server URI" - REDIS_PASSWORD = "Password to the server" -``` - -### Step 6. Install the dependencies - -```bash - npm install -``` - -### Step 7. Run the backend server - -```bash - node app.js -``` - -### Step 8. Run the client - -```bash - cd client - yarn install - yarn start -``` - -### Step 9. Accessing the Movie app - -Open http://IP:3000 to access the movie app - -![movieapp](moviedb.png) - -### Step 10. Sign up for a new account - -![moviedb](moviedb_signup.png) - -Enter the details to create a new account: - -![movieapp](moviedb_createaccount.png) - -### Step 11. Sign-in to movie app - -![movieapp](moviedb_sign.png) - -### Step 12. Rate the movie - -![movieapp](moviedb_rating.png) - -### Step 13. View the list of rated movie - -![movieapp](moviedb_rated.png) - -### Step 14. View directed movie over RedisInsight - -``` - GRAPH.QUERY "MovieApp" "MATCH (director:Director {tmdbId: \"4945\"})-[:DIRECTED]->(movie:Movie) RETURN DISTINCT movie,director" -``` - -![movieapp](moviedb_directed.png) - -### Step 15. Find movies where actor acted in. - -Run the below query under RedisGraph to find the author acted in a movie - -``` - GRAPH.QUERY "MovieApp" "MATCH (actor:Actor {tmdbId: \"8537\"})-[:ACTED_IN_MOVIE]->(movie:Movie) RETURN DISTINCT movie,actor" -``` - -![movieapp](moviedb_actedin.png) - -### Step 16. Store a user in a database - -``` - CREATE (user:User {id: 32, - username: "user", password: "hashed_password", api_key: "525d40da10be8ec75480"}) - RETURN user -``` - -![movieapp](moviedb_createuser.png) - -### Step 17. Find a user by username - -``` - MATCH (user:User {username: "user"}) RETURN user -``` - -![movieapp](moviedb_createuser1.png) - -### How it works? - -The app consumes the data provided by the Express API and presents it through some views to the end user, including: - -- Home page -- Sign-up and Login pages -- Movie detail page -- Actor and Director detail page -- User detail page - -#### Home page - -![How it works](a.png) - -The home page shows the genres and a brief listing of movies associated with them. 
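The same graph can be queried from application code as well as from RedisInsight. As an example, here is a minimal redis-py sketch that re-runs the queries from Steps 14 and 15; the `MovieApp` graph key comes from those steps, while the connection details assume the local containers from Steps 2 and 3.

```python
# Run the director/actor queries shown in RedisInsight (Steps 14 and 15) from Python.
# Assumes the local Redis container from Step 2 on localhost:6379.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

directed = r.execute_command(
    "GRAPH.QUERY", "MovieApp",
    'MATCH (director:Director {tmdbId: "4945"})-[:DIRECTED]->(movie:Movie) '
    'RETURN DISTINCT movie, director')

acted_in = r.execute_command(
    "GRAPH.QUERY", "MovieApp",
    'MATCH (actor:Actor {tmdbId: "8537"})-[:ACTED_IN_MOVIE]->(movie:Movie) '
    'RETURN DISTINCT movie, actor')

print(directed)
print(acted_in)
```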
- -#### How the data is stored - -#### Add a new genre: - -```Cypher - create (g:Genre{name:"Adventure"}) -``` - -#### Add a movie: - -```Cypher - create (m:Movie { - url: "https://themoviedb.org/movie/862", - id:232, - languages:["English"], - title:"Toy Story", - countries:["USA"], - budget:30000000, - duration:81, - imdbId:"0114709", - imdbRating:8.3, - imdbVotes:591836, - movieId:42, - plot:"...", - poster:"https://image.tmd...", - poster_image:"https://image.tmdb.or...", - released:"1995-11-22", - revenue:373554033, - runtime:$runtime, - tagline:"A cowboy doll is profoundly t...", - tmdbId:"8844", - year:"1995"}) -``` - -#### Set genre to a movie: - -```Cypher - MATCH (g:Genre), (m:Movie) - WHERE g.name = "Adventure" AND m.title = "Toy Story" - CREATE (m)-[:IN_GENRE]->(g) -``` - -#### How the data is accessed - -#### Get genres: - -```Cypher - MATCH (genre:Genre) RETURN genre -``` - -#### Get moves by genre: - -```Cypher - MATCH (movie:Movie)-[:IN_GENRE]->(genre) - WHERE toLower(genre.name) = toLower("Film-Noir") OR id(genre) = toInteger("Film-Noir") - RETURN movie -``` - -#### Code example: Get movies with genre - -```Javascript - const getByGenre = function (session, genreId) { - const query = [ - 'MATCH (movie:Movie)-[:IN_GENRE]->(genre)', - 'WHERE toLower(genre.name) = toLower($genreId) OR id(genre) = toInteger($genreId)', - 'RETURN movie', - ].join('\n'); - - return session - .query(query, { - genreId, - }) - .then((result) => manyMovies(result)); -}; -``` - -#### Sign-up and Login pages - -![moviedb](f.png) -![moviedb](g.png) - -To be able to rate movies a user needs to be logged in: for that a basic JWT-based authentication system is implemented, where user details are stored in the RedisGraph for persistence. - -#### How the data is stored - -#### Store user in the database: - -```Cypher - CREATE (user:User {id: 32, - username: "user", password: "hashed_password", api_key: "525d40da10be8ec75480"}) - RETURN user -``` - -#### How the data is accessed - -#### Find by user name: - -```Cypher - MATCH (user:User {username: "user"}) RETURN user -``` - -#### Code Example: Find user - -```Javascript - const me = function (session, apiKey) { - return session - .query('MATCH (user:User {api_key: $api_key}) RETURN user', { - api_key: apiKey, - }) - .then((foundedUser) => { - if (!foundedUser.hasNext()) { - throw {message: 'invalid authorization key', status: 401}; - } - while (foundedUser.hasNext()) { - const record = foundedUser.next(); - return new User(record.get('user')); - } - }); - }; -``` - -#### Movie detail page - -![How it works](d.png) - -On this page a user can rate the film and view the Actors/directors who participated in the production of the film. 
- -#### How the data is stored - -#### Associate actor with a movie: - -```Cypher - MATCH (m:Movie) WHERE m.title="Jumanji" CREATE (a:Actor :Person{ - bio:"Sample...", - bornIn:"Denver, Colorado, USA", - imdbId:"0000245", - name:"Robin Williams", - poster:"https://image.tmdb.org/t/p/w440_and_...", - tmdbId:"2157", - url:"https://themoviedb.org/person/2157"})-[r:ACTED_IN_MOVIE - {role: "Alan Parrish"}]->(m) -``` - -#### Associate director with a movie: - -```Cypher - MATCH (m:Movie) WHERE m.title="Dead Presidents" CREATE (d:Director :Person{ - bio: "From Wikipedia, the free e...", - bornIn: "Detroit, Michigan, USA", - imdbId: "0400436", - name: "Albert Hughes", - tmdbId: "11447", - url: "https://themoviedb.org/person/11447"})-[r:DIRECTED]->(m) -``` - -#### How the data is accessed - -#### Find movie by id with genre, actors and director: - -```Cypher - MATCH (movie:Movie {tmdbId: $movieId}) - OPTIONAL MATCH (movie)<-[my_rated:RATED]-(me:User {id: "e1e3991f-fe81-439e-a507-aa0647bc0b88"}) - OPTIONAL MATCH (movie)<-[r:ACTED_IN_MOVIE]-(a:Actor) - OPTIONAL MATCH (movie)-[:IN_GENRE]->(genre:Genre) - OPTIONAL MATCH (movie)<-[:DIRECTED]-(d:Director) - WITH DISTINCT movie, my_rated, genre, d, a, r - RETURN DISTINCT movie, - collect(DISTINCT d) AS directors, - collect(DISTINCT a) AS actors, - collect(DISTINCT genre) AS genres -``` - -#### Code Example: Get movie detail - -```Javascript - const getById = function (session, movieId, userId) { - if (!userId) throw {message: 'invalid authorization key', status: 401}; - const query = [ - 'MATCH (movie:Movie {tmdbId: $movieId})\n' + - ' OPTIONAL MATCH (movie)<-[my_rated:RATED]-(me:User {id: $userId})\n' + - ' OPTIONAL MATCH (movie)<-[r:ACTED_IN_MOVIE]-(a:Actor)\n' + - ' OPTIONAL MATCH (movie)-[:IN_GENRE]->(genre:Genre)\n' + - ' OPTIONAL MATCH (movie)<-[:DIRECTED]-(d:Director)\n' + - ' WITH DISTINCT movie, my_rated, genre, d, a, r\n' + - ' RETURN DISTINCT movie,\n' + - ' collect(DISTINCT d) AS directors,\n' + - ' collect(DISTINCT a) AS actors,\n' + - ' collect(DISTINCT genre) AS genres', - ].join(' '); - return session - .query(query, { - movieId: movieId.toString(), - userId: userId.toString(), - }) - .then((result) => { - if (result.hasNext()) { - return _singleMovieWithDetails(result.next()); - } - throw {message: 'movie not found', status: 404}; - }); - }; -``` - -#### Actor and Director detail page - -![How it works](c.png) - -#### How the data is accessed - -#### Find movies where actor acted in: - -```Cypher - MATCH (actor:Actor {tmdbId: "8537"})-[:ACTED_IN_MOVIE]->(movie:Movie) - RETURN DISTINCT movie,actor -``` - -#### Find movies directed by: - -```Cypher - MATCH (director:Director {tmdbId: "4945"})-[:DIRECTED]->(movie:Movie) - RETURN DISTINCT movie,director -``` - -#### Get movies directed by - -```Javascript - const getByDirector = function (session, personId) { - const query = [ - 'MATCH (director:Director {tmdbId: $personId})-[:DIRECTED]->(movie:Movie)', - 'RETURN DISTINCT movie,director', - ].join('\n'); - - return session - .query(query, { - personId, - }) - .then((result) => manyMovies(result)); - }; -``` - -#### User detail page - -![How it works](b.png) - -#### Shows the profile info and movies which were rated by user - -#### How the data is stored - -#### Set rating for a movie: - -```Cypher - MATCH (u:User {id: 42}),(m:Movie {tmdbId: 231}) - MERGE (u)-[r:RATED]->(m) - SET r.rating = "7" - RETURN m -``` - -#### How the data is accessed - -#### Get movies and user ratings: - -```Cypher - MATCH (:User {id: 
"d6b31131-f203-4d5e-b1ff-d13ebc06934d"})-[rated:RATED]->(movie:Movie) - RETURN DISTINCT movie, rated.rating as my_rating -``` - -#### Get rated movies for user - -```Javascript - const getRatedByUser = function (session, userId) { - return session - .query( - 'MATCH (:User {id: $userId})-[rated:RATED]->(movie:Movie) \ - RETURN DISTINCT movie, rated.rating as my_rating', - {userId}, - ) - .then((result) => - result._results.map((r) => new Movie(r.get('movie'), r.get('my_rating'))), - ); - }; -``` - -#### Data types: - -- The data is stored in various keys and various relationships. - - There are 5 types of data - - User - - Director - - Actor - - Genre - - Movie - -#### Each type has its own properties - -- Actor: `id, bio, born , bornIn, imdbId, name, poster, tmdbId, url` -- Genre: `id, name` -- Director: `id, born, bornIn, imdbId, name, tmdbId, url` -- User: `id, username, password, api_key` -- Movie: `id, url, languages, countries, budget, duration, imdbId, imdbRating, indbVotes, movieId, plot, poster, poster_image, released, revenue, runtime, tagline, tmdbId, year` - -#### And there are 4 types of relationship: - -- User-`RATED`->Movie -- Director-`DIRECTED`->Movie -- Actor-`ACTED_IN_MOVIE`->Movie -- Movie-`IN_GENRE`->Genre - -### References - -- [How to list and search Movies Database using Redisearch](/howtos/moviesdatabase/getting-started) -- [RediSearch 2.0’s New Indexing Capabilities to add search to Movie app](https://redis.com/blog/getting-started-with-redisearch-2-0/) diff --git a/docs/howtos/redisgraphmovies/moviedb.png b/docs/howtos/redisgraphmovies/moviedb.png deleted file mode 100644 index 482d5dcf7c2..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb2.png b/docs/howtos/redisgraphmovies/moviedb2.png deleted file mode 100644 index 60598e2008b..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb2.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_actedin.png b/docs/howtos/redisgraphmovies/moviedb_actedin.png deleted file mode 100644 index 99e97b8a6f0..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_actedin.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_associate.png b/docs/howtos/redisgraphmovies/moviedb_associate.png deleted file mode 100644 index d3cd9c0a079..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_associate.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_createaccount.png b/docs/howtos/redisgraphmovies/moviedb_createaccount.png deleted file mode 100644 index e4b60ad344d..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_createaccount.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_createuser.png b/docs/howtos/redisgraphmovies/moviedb_createuser.png deleted file mode 100644 index b9441c27af3..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_createuser.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_createuser1.png b/docs/howtos/redisgraphmovies/moviedb_createuser1.png deleted file mode 100644 index 32449ad7231..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_createuser1.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_directed.png b/docs/howtos/redisgraphmovies/moviedb_directed.png deleted file mode 100644 index 61ba782fd98..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_directed.png and /dev/null differ diff --git 
a/docs/howtos/redisgraphmovies/moviedb_frontpage.png b/docs/howtos/redisgraphmovies/moviedb_frontpage.png deleted file mode 100644 index 2e125c0dfa8..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_frontpage.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_rated.png b/docs/howtos/redisgraphmovies/moviedb_rated.png deleted file mode 100644 index 2bc66840901..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_rated.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_rating.png b/docs/howtos/redisgraphmovies/moviedb_rating.png deleted file mode 100644 index af1694d7996..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_rating.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_sign.png b/docs/howtos/redisgraphmovies/moviedb_sign.png deleted file mode 100644 index 35135fce796..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_sign.png and /dev/null differ diff --git a/docs/howtos/redisgraphmovies/moviedb_signup.png b/docs/howtos/redisgraphmovies/moviedb_signup.png deleted file mode 100644 index 05451cd672f..00000000000 Binary files a/docs/howtos/redisgraphmovies/moviedb_signup.png and /dev/null differ diff --git a/docs/howtos/redisjson/database_creds.png b/docs/howtos/redisjson/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisjson/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/Verify_subscription.png b/docs/howtos/redisjson/getting-started/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redisjson/getting-started/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/add_database.png b/docs/howtos/redisjson/getting-started/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redisjson/getting-started/add_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/create_database.png b/docs/howtos/redisjson/getting-started/create_database.png deleted file mode 100644 index 3af0ef5ab9c..00000000000 Binary files a/docs/howtos/redisjson/getting-started/create_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/create_subscription.png b/docs/howtos/redisjson/getting-started/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/howtos/redisjson/getting-started/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/database_creds.png b/docs/howtos/redisjson/getting-started/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisjson/getting-started/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/database_details.png b/docs/howtos/redisjson/getting-started/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisjson/getting-started/database_details.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/deployment.png b/docs/howtos/redisjson/getting-started/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redisjson/getting-started/deployment.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/details_database.png 
b/docs/howtos/redisjson/getting-started/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redisjson/getting-started/details_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/final_subscription.png b/docs/howtos/redisjson/getting-started/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redisjson/getting-started/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/index-gettingstarted.mdx b/docs/howtos/redisjson/getting-started/index-gettingstarted.mdx deleted file mode 100644 index f9562899024..00000000000 --- a/docs/howtos/redisjson/getting-started/index-gettingstarted.mdx +++ /dev/null @@ -1,290 +0,0 @@ ---- -id: index-gettingstarted -title: Storing and Querying JSON documents using Redis Stack -sidebar_label: Storing and Querying JSON documents -slug: /howtos/redisjson/getting-started -authors: [ajeet] ---- - -Redis Stack is an extension of Redis that adds modern data models and processing engines to provide a complete developer experience. Redis Stack provides a simple and seamless way to access different data models such as full-text search, document store, graph, time series, and probabilistic data structures enabling developers to build any real-time data application. - -In this tutorial, you will see how Redis Stack can help you in storing and querying JSON documents. - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](details_database.png) - -### Step 4. Using RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -[Follow this link](/explore/redisinsightv2/getting-started) to install RedisInsight v2 on your local system. -Assuming that you already have RedisInsight v2 installed on your MacOS, you can browse through the Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. 
Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](database_creds.png) - -### Step 6. Verify the database under RedisInsight dashboard - -![database details](database_details.png) - -### Step 7. Getting Started with RedisJSON - -The following steps use some basic RedisJSON commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. - -To interact with RedisJSON, you will most often use the JSON.SET and JSON.GET commands. Before using RedisJSON, you should familiarize yourself with its commands and syntax as detailed in the documentation: RedisJSON Commands. - -Let’s go ahead and test drive some JSON-specific operations for setting and retrieving a Redis key with a JSON value: - -- Scalar -- Objects (including nested objects) -- Arrays of JSON objects -- JSON nested objects - -#### Scalar - -Under RedisJSON, a key can contain any valid JSON value. It can be scalar, objects or arrays. JSON scalar is basically a string. You will have to use the JSON.SET command to set the JSON value. For new Redis keys the path must be the root, so you will use “.” path in the example below. For existing keys, when the entire path exists, the value that it contains is replaced with the JSON value. Here you will use JSON.SET to set the JSON scalar value to “Hello JSON!” Scalar will contain a string that holds “Hello JSON!” - -Command: - -``` -JSON.SET greetings . ' "Hello JSON!" ' -``` - -Result: - -``` -OK -``` - -Use JSON.GET to return the value at path in JSON serialized form: - -Command: - -``` -JSON.GET greetings -``` - -Result: - -``` -"\"Hello JSON!\"" -``` - -#### Objects - -Let’s look at a JSON object example. A JSON object contains data in the form of a key-value pair. The keys are strings and the values are the JSON types. Keys and values are separated by a colon. Each entry (key-value pair) is separated by a comma. The { (curly brace) represents the JSON object: - -``` -{ - "employee": { - "name": "alpha", - "age": 40, - "married": true - } -} -``` - -Here is the command to insert JSON data into Redis: - -Command: - -``` -JSON.SET employee_profile $ '{ "employee": { "name": "alpha", "age": 40,"married": true } } ' -``` - -:::important - -Please note that the above command works for 2.0+ release of RedisJSON. If you are using the older version of RedisJSON, you can replace "$" with "." -::: - -Result: - -``` -"OK" -``` - -The subcommands below change the reply’s format and are all set to the empty string by default: _ INDENT sets the indentation string for nested levels _. NEWLINE sets the string that’s printed at the end of each line. \* SPACE sets the string that’s put between a key and a value: - -Command: - -``` -JSON.GET employee_profile -``` - -Result: - -``` -"{\"employee\":{\"name\":\"alpha\",\"age\":40,\"married\":true}}" -``` - -#### Retrieving a part of JSON document - -You can also retrieve a part of the JSON document from Redis. In the example below, “.ans” can be passed in the commandline to retrieve the value 4: - -Command: - -``` -JSON.SET object . '{"foo":"bar", "ans":"4" }' -``` - -Result: - -``` -"OK" -``` - -Command: - -``` -JSON.GET object -``` - -Result: - -``` -"{\"foo\":\"bar\",\"ans\":\"4\"}" -``` - -Command: - -``` -JSON.GET object .ans -``` - -Results: - -``` -"\"4\"" -``` - -#### Retrieving the type of JSON data - -JSON.TYPE reports the type of JSON value at path and path defaults to root if not provided. 
If the key or path do not exist, null is returned. - -Command: - -``` -JSON.TYPE employee_profile -``` - -Result: - -``` -"Object" -``` - -#### JSON arrays of objects - -The JSON array represents an ordered list of values. A JSON array can store multiple values, including strings, numbers, or objects. In JSON arrays, values must be separated by a comma. The [ (square bracket) represents the JSON array. Let’s look at a simple JSON array example with four objects: - -``` -{"employees":[ - {"name":"Alpha", "email":"alpha@gmail.com", "age":23}, - {"name":"Beta", "email":"beta@gmail.com", "age":28}, - {"name":"Gamma", "email":"gamma@gmail.com", "age":33}, - {"name":"Theta", "email":"theta@gmail.com", "age":41} -]} -``` - -Command: - -``` -JSON.SET testarray . '{"employees":[ {"name":"Alpha", "email":"alpha@gmail.com", "age":23}, {"name":"Beta", "email":"beta@gmail.com", "age":28}, {"name":"Gamma", "email":"gamma@gmail.com", "age":33}, {"name":"Theta", "email":"theta@gmail.com", "age":41} ]} ' -``` - -Result: - -``` -"OK" -``` - -Command: - -``` -JSON.GET testarray -``` - -Result: - -``` -"{\"employees\":[{\"name\":\"Alpha\",\"email\":\ -alpha@gmail.com - -\",\"age\":23},{\"name\":\"Beta\",\"email\":\"beta@gmail.com.... -``` - -#### JSON nested objects - -A JSON object can also have another object. Here is a simple example of a JSON object having another object nested in it: - -Command: - -``` ->> JSON.SET employee_info . ' { "firstName": "Alpha", "lastName": "K", "age": 23, "address" : { "streetAddress": "110 Fulbourn Road Cambridge", "city": "San Francisco", "state": "California", "postalCode": "94016" } } ' -``` - -Command: - -``` ->> JSON.GET employee_info -``` - -Result: - -``` -"{\"firstName\":\"Alpha\",\"lastName\":\"K\",\"age\":23,\"address\":{\"streetAddress\":\"110 Fulbourn Road Cambridge\",\"city\":\"San Francisco\",\"state\":\"California\",\"postalCode\":\"94016\"}}" -``` - -### Next Steps - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. 
-- [How to build shopping cart app using NodeJS and RedisJSON](/howtos/shoppingcart) -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) - -## - - diff --git a/docs/howtos/redisjson/getting-started/launch_database.png b/docs/howtos/redisjson/getting-started/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redisjson/getting-started/launch_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/redisjson1.png b/docs/howtos/redisjson/getting-started/redisjson1.png deleted file mode 100644 index 6ef5bc0a9af..00000000000 Binary files a/docs/howtos/redisjson/getting-started/redisjson1.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/redisjson3.png b/docs/howtos/redisjson/getting-started/redisjson3.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redisjson/getting-started/redisjson3.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/select_cloud.png b/docs/howtos/redisjson/getting-started/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redisjson/getting-started/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/select_cloud_vendor.png b/docs/howtos/redisjson/getting-started/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisjson/getting-started/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/select_subscription.png b/docs/howtos/redisjson/getting-started/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redisjson/getting-started/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/try-free.png b/docs/howtos/redisjson/getting-started/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redisjson/getting-started/try-free.png and /dev/null differ diff --git a/docs/howtos/redisjson/getting-started/tryfree.png b/docs/howtos/redisjson/getting-started/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redisjson/getting-started/tryfree.png and /dev/null differ diff --git a/docs/howtos/redisjson/index-redisjson.mdx b/docs/howtos/redisjson/index-redisjson.mdx deleted file mode 100644 index 0e5fd2b3758..00000000000 --- a/docs/howtos/redisjson/index-redisjson.mdx +++ /dev/null @@ -1,122 +0,0 @@ ---- -id: index-redisjson -title: RedisJSON Tutorial -sidebar_label: Overview -slug: /howtos/redisjson/ ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links provides you with the available options to get started with RedisJSON - -
diff --git a/docs/howtos/redisjson/json-using-redisearch/index-json-using-redisearch.mdx b/docs/howtos/redisjson/json-using-redisearch/index-json-using-redisearch.mdx deleted file mode 100644 index 90305882d06..00000000000 --- a/docs/howtos/redisjson/json-using-redisearch/index-json-using-redisearch.mdx +++ /dev/null @@ -1,6 +0,0 @@ ---- -id: index-json-using-redisearch -title: How to index JSON document using RediSearch -sidebar_label: Indexing JSON document using RediSearch -slug: /howtos/redisjson/json-using-redisearch ---- diff --git a/docs/howtos/redisjson/json-using-redisearch/jsonindex-document/index-jsonindex-document.mdx b/docs/howtos/redisjson/json-using-redisearch/jsonindex-document/index-jsonindex-document.mdx deleted file mode 100644 index 68460c51a19..00000000000 --- a/docs/howtos/redisjson/json-using-redisearch/jsonindex-document/index-jsonindex-document.mdx +++ /dev/null @@ -1,172 +0,0 @@ ---- -id: index-jsonindex-document -title: How to index JSON documents using RedisJSON & RediSearch -sidebar_label: How to index JSON documents using RedisJSON & RediSearch -slug: /howtos/redisjson/jsonindex-document ---- - -RedisJSON 2.0 Private Preview was announced for the first time during RedisConf 2021. With this newer version, RedisJSON will fully support JSONPath expressions and [Active-Active geo-distribution](https://redis.com/redis-enterprise/technology/active-active-geo-distribution/). The Active-Active implementation is based on [Conflict-free Replicated Data-Types (CRDT)](https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type). - -Prior to v2.2, RediSearch only supported Redis hashes. Going forward, RediSearch will support RedisJSON documents. This opens a powerful new set of document-based indexing use cases. In addition, RediSearch now provides query profiling. This will empower developers to understand and optimize their RediSearch queries, increasing their developer experience. - -RediSearch has been providing indexing and search capabilities on hashes. Under the hood, RedisJSON 2.0 exposes an internal public API. Internal, because this API is exposed to other modules running inside a Redis node. Public, because any module can consume this API. So does RediSearch 2.2 ! In addition to indexing Redis hashes, RediSearch also indexes JSON. To index JSON, you must use the RedisJSON module. - -By exposing its capabilities to other modules, RedisJSON gives RediSearch the ability to index JSON documents so users can now find documents by indexing and querying the content. These combined modules give you a powerful, low latency, JSON-oriented document database! - -### Prerequisites: - -- Redis 6.x or later -- RediSearch 2.2 or later -- RediJSON 2.0 or later - -### Step 1. Run the "preview" tagged Redismod container - -Please note that this publicly available Docker Image is a community preview and doesn't support Active-Active.This Docker image contains Redis together with the main Redis modules, including RediSearch and RedisJSON. 
You'll need the preview tag of the image, which you can access as follows: - -```bash - docker run -p 6379:6379 redislabs/redismod:preview -``` - -```bash - info modules - # Modules - module:name=rg,ver=10006,api=1,filters=0,usedby=[],using=[ai],options=[] - module:name=graph,ver=20406,api=1,filters=0,usedby=[],using=[],options=[] - module:name=timeseries,ver=10410,api=1,filters=0,usedby=[],using=[],options=[] - module:name=bf,ver=20205,api=1,filters=0,usedby=[],using=[],options=[] - module:name=ai,ver=10003,api=1,filters=0,usedby=[rg],using=[],options=[] - module:name=ReJSON,ver=20000,api=1,filters=0,usedby=[search],using=[],options=[] - module:name=search,ver=20200,api=1,filters=0,usedby=[],using=[ReJSON],options=[] -``` - -### Step 2. Create an Index - -Let's start by creating an index. - -We can now specify ON JSON to inform RediSearch that we want to index JSON documents. -Then, on the SCHEMA part, you can provide JSONPath expressions. The result of each JSON Path expression is indexed and associated with a logical name ( attribute ). This attribute (previously called field ) is used in the query part. - -This is the basic syntax for indexing a JSON document: - -Syntax: - -``` - FT.CREATE {index_name} ON JSON SCHEMA {json_path} AS {attribute} {type} -``` - -Command: - -```bash - FT.CREATE userIdx ON JSON SCHEMA $.user.name AS name TEXT $.user.email AS email TAG -``` - -### Step 3. Populate the database with JSON document - -We should first populate the database with a JSON document using the JSON.SET command. In our example we are going to use the following JSON document: - -``` -{ - "user": { - "name": "Paul John", - "email": "paul.john@example.com", - "age": "42", - "country": "London" - } -} -``` - -``` -JSON.SET myuser $ '{ "user":{"name": "Paul John", "email": "paul.john@example.com", "age": "4", "country": "London" }}' - -``` - -Because indexing is synchronous, the document will be visible on the index as soon as the JSON.SET command returns. Any subsequent query matching the indexed content will return the document - -### Step 4. Indexing the database with JSON document - -This new version includes a comprehensive support of JSONPath. It is now possible to use all the expressiveness of JSONPath expressions. - -To create a new index, we use the FT.CREATE command. The schema of the index now accepts JSONPath expressions. The result of the expression is indexed and associated with an attribute (here: title). - -``` -FT.CREATE myIdx ON JSON SCHEMA $.title AS title TEXT -``` - -We can now do a search query and find our JSON document using FT.SEARCH: - -Command: - -```bash - FT.SEARCH userIdx '@name:(John)' -``` - -Result: - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "$" - 2) "{\"user\":{\"name\":\"Paul John\",\"email\":\"paul.john@example.com\",\"age\":\"4\",\"country\":\"London\"}}" -``` - -We just saw that, by default, FT.SEARCH returns the whole document. We can also return only specific attribute (here name) - -```bash - FT.SEARCH userIdx '@name:(John)' RETURN 1 name -``` - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "name" - 2) "Paul John" -``` - -### Step 5. Projecting using JSON Path expressions - -The RETURN parameter also accepts a JSON Path expression which let us extract any part of the JSON document. -In this example, we return the result of the JSON Path expression $.user.hp . 
- -Command: - -```bash - FT.SEARCH userIdx '@name:(John)' RETURN 1 $.user.email -``` - -Result: - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "$.user.email" - 2) "paul.john@example.com" -``` - -Please Note: It is not possible to index JSON object and JSON arrays. - -``` - { - "user": { - "name": "Paul John", - "email": "paul.john@example.com", - "age": "42", - "country": "London", - “address": [ - "Orbital Park", - " Hounslow" - ], - "pincode": "TW4 6JS" - } - } -} -``` - -Command: - -``` - JSON.SET myuser $ '{ "user": { "name": "Paul John", "email": "paul.hojn@example.com", "age" :"40", "country": "London", "address": [ "Orbital Park","Hounslow" ], "pincode": "TW4 6JS" }}' -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University diff --git a/docs/howtos/redisjson/jsonind-document/index-jsonind-document.mdx b/docs/howtos/redisjson/jsonind-document/index-jsonind-document.mdx deleted file mode 100644 index 1282af92636..00000000000 --- a/docs/howtos/redisjson/jsonind-document/index-jsonind-document.mdx +++ /dev/null @@ -1,201 +0,0 @@ ---- -id: index-jsonind-document -title: Indexing JSON document using RediSearch -sidebar_label: Indexing JSON document using RediSearch -slug: /howtos/redisjson/jsonind-document -authors: [ajeet] ---- - -RedisJSON 2.0 Private Preview was announced for the first time during RedisConf 2021. With this newer version, RedisJSON will fully support JSONPath expressions and [Active-Active geo-distribution](https://redis.com/redis-enterprise/technology/active-active-geo-distribution/). The Active-Active implementation is based on [Conflict-free Replicated Data-Types (CRDT)](https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type). - -Prior to v2.2, RediSearch only supported Redis hashes. Going forward, RediSearch will support RedisJSON documents. This opens a powerful new set of document-based indexing use cases. In addition, RediSearch now provides query profiling. This will empower developers to understand and optimize their RediSearch queries, increasing their developer experience. - -RediSearch has been providing indexing and search capabilities on hashes. Under the hood, RedisJSON 2.0 exposes an internal public API. Internal, because this API is exposed to other modules running inside a Redis node. Public, because any module can consume this API. So does RediSearch 2.2 ! In addition to indexing Redis hashes, RediSearch also indexes JSON. To index JSON, you must use the RedisJSON module. - -By exposing its capabilities to other modules, RedisJSON gives RediSearch the ability to index JSON documents so users can now find documents by indexing and querying the content. These combined modules give you a powerful, low latency, JSON-oriented document database! - -### Prerequisite: - -- Redis 6.x or later -- RediSearch 2.2 or later -- RediJSON 2.0 or later - -### Step 1. Run the "preview" tagged Redismod container - -Please note that this publicly available Docker Image is a community preview and doesn't support Active-Active.This Docker image contains Redis together with the main Redis modules, including RediSearch and RedisJSON. 
You'll need the preview tag of the image, which you can access as follows: - -```bash - docker run -p 6379:6379 redislabs/redismod:preview -``` - -```bash - info modules - # Modules - module:name=rg,ver=10006,api=1,filters=0,usedby=[],using=[ai],options=[] - module:name=graph,ver=20406,api=1,filters=0,usedby=[],using=[],options=[] - module:name=timeseries,ver=10410,api=1,filters=0,usedby=[],using=[],options=[] - module:name=bf,ver=20205,api=1,filters=0,usedby=[],using=[],options=[] - module:name=ai,ver=10003,api=1,filters=0,usedby=[rg],using=[],options=[] - module:name=ReJSON,ver=20000,api=1,filters=0,usedby=[search],using=[],options=[] - module:name=search,ver=20200,api=1,filters=0,usedby=[],using=[ReJSON],options=[] -``` - -### Step 2. Create an Index - -Let's start by creating an index. - -We can now specify ON JSON to inform RediSearch that we want to index JSON documents. -Then, on the SCHEMA part, you can provide JSONPath expressions. The result of each JSON Path expression is indexed and associated with a logical name ( attribute ). This attribute (previously called field ) is used in the query part. - -This is the basic syntax for indexing a JSON document: - -Syntax: - -``` - FT.CREATE {index_name} ON JSON SCHEMA {json_path} AS {attribute} {type} -``` - -Command: - -```bash - FT.CREATE userIdx ON JSON SCHEMA $.user.name AS name TEXT $.user.email AS email TAG -``` - -### Step 3. Populate the database with JSON document - -We should first populate the database with a JSON document using the JSON.SET command. In our example we are going to use the following JSON document: - -``` -{ - "user": { - "name": "Paul John", - "email": "paul.john@example.com", - "age": "42", - "country": "London" - } -} -``` - -``` -JSON.SET myuser $ '{ "user":{"name": "Paul John", "email": "paul.john@example.com", "age": "4", "country": "London" }}' - -``` - -Because indexing is synchronous, the document will be visible on the index as soon as the JSON.SET command returns. Any subsequent query matching the indexed content will return the document - -### Step 4. Indexing the database with JSON document - -This new version includes a comprehensive support of JSONPath. It is now possible to use all the expressiveness of JSONPath expressions. - -To create a new index, we use the FT.CREATE command. The schema of the index now accepts JSONPath expressions. The result of the expression is indexed and associated with an attribute (here: title). - -``` -FT.CREATE myIdx ON JSON SCHEMA $.title AS title TEXT -``` - -We can now do a search query and find our JSON document using FT.SEARCH: - -Command: - -```bash - FT.SEARCH userIdx '@name:(John)' -``` - -Result: - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "$" - 2) "{\"user\":{\"name\":\"Paul John\",\"email\":\"paul.john@example.com\",\"age\":\"4\",\"country\":\"London\"}}" -``` - -We just saw that, by default, FT.SEARCH returns the whole document. We can also return only specific attribute (here name) - -```bash - FT.SEARCH userIdx '@name:(John)' RETURN 1 name -``` - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "name" - 2) "Paul John" -``` - -### Step 5. Projecting using JSON Path expressions - -The RETURN parameter also accepts a JSON Path expression which let us extract any part of the JSON document. -In this example, we return the result of the JSON Path expression $.user.hp . 
- -Command: - -```bash - FT.SEARCH userIdx '@name:(John)' RETURN 1 $.user.email -``` - -Result: - -```bash - 1) (integer) 1 - 2) "myuser" - 3) 1) "$.user.email" - 2) "paul.john@example.com" -``` - -Please Note: It is not possible to index JSON object and JSON arrays. -To be indexed, a JSONPath expression must return a single scalar value (string or number). If the JSONPath expression returns an object or an array, it will be ignored. - -Given the following document: - -``` - { - - "name": "Paul John", - “address": [ - "Orbital Park", - " Hounslow" - ], - "pincode": "TW4 6JS" - } -``` - -If we want to index the array under the address key, we have to create two fields: - -Command: - -```bash - FT.CREATE orgIdx ON JSON SCHEMA $.address[0] AS a1 TEXT $.address[1] AS a2 TEXT -``` - -It's time to index the document: - -Command: - -``` - JSON.SET org:1 $ '{ "name": "Home Address", "address": [ "Orbital Park","Hounslow" ], "pincode": "TW4 6JS" }' -``` - -We can now search in the address: - -Command: - -``` - FT.SEARCH orgIdx "Orbital Park" -``` - -Result: - -``` - FT.SEARCH orgIdx "Orbital Park" - 1) (integer) 1 - 2) "org:1" - 3) 1) "$" - 2) "{\"name\":\"Home Address\",\"address\":[\"Orbital Park\",\"Hounslow\"],\"pincode\":\"TW4 6JS\"}" -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [Indexing JSON Documents](https://oss.redis.com/redisearch/master/Indexing_JSON/) -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) diff --git a/docs/howtos/redisjson/redisjson-cheatsheet/.index-redisjson-cheatsheet.mdx.swp b/docs/howtos/redisjson/redisjson-cheatsheet/.index-redisjson-cheatsheet.mdx.swp deleted file mode 100644 index 2903284aaab..00000000000 Binary files a/docs/howtos/redisjson/redisjson-cheatsheet/.index-redisjson-cheatsheet.mdx.swp and /dev/null differ diff --git a/docs/howtos/redisjson/redisjson-cheatsheet/index-redisjson-cheatsheet.mdx b/docs/howtos/redisjson/redisjson-cheatsheet/index-redisjson-cheatsheet.mdx deleted file mode 100644 index f31fec19962..00000000000 --- a/docs/howtos/redisjson/redisjson-cheatsheet/index-redisjson-cheatsheet.mdx +++ /dev/null @@ -1,29 +0,0 @@ ---- -id: index-redisjson-cheatsheet -title: RedisJSON Cheatsheet -sidebar_label: RedisJSON CheatSheet -slug: /howtos/redisjson/redisjson-cheatsheet -authors: [ajeet] ---- - -| Command | Purpose | Syntax | -| :------------------------------------------------------------------------------------------------ | :------------------------------------------------------------------------------------- | :----------------------------------------------------------------------------------------- | --- | -| | Return the value at path in JSON serialized form | JSON.GET <key> | -| | Sets the JSON value at path in key | JSON.SET <key> <path> <json> [NX | XX] | -| | Returns the values at path from multiple key | JSON.MGET <key> [key ...] <path> | -| | Report the type of JSON value at path . | JSON.TYPE <key> [path] | -| | Increments the number value stored at path by number | JSON.NUMINCRBY <key> <path> <number> | -| | Multiplies the number value stored at path by number | JSON.NUMMULTBY <key> <path> <number> | -| | Append the json-string value(s) the string at path | JSON.STRAPPEND <key> [path] <json-string> | -| | Append the json value(s) into the array at path after the last element in it | JSON.ARRAPPEND <key> <path> <json> [json ...] 
| -| | Report the length of the JSON String at path in key | JSON.STRLEN <key> [path] | -| | Report the length of the JSON Array at path in key | JSON.ARRLEN <key> [path] | -| | Insert the json value(s) into the array at path before the index (shifts to the right) | JSON.ARRINSERT <key> <path> <index> <json> [json ...] | -| | Search for the first occurrence of a scalar JSON value in an array | JSON.ARRINDEX <key> <path> <json-scalar> [start [stop]] | -| | Remove and return element from the index in the array | JSON.ARRPOP <key> [path [index]] | -| | Trim an array so that it contains only the specified inclusive range of elements | JSON.ARRTRIM <key> <path> <start> <stop> | -| | Return the keys in the object that's referenced by path | JSON.OBJKEYS <key> [path] | -| | Report the number of keys in the JSON Object at path in key | JSON.OBJLEN <key> [path] | -| | Report information | JSON.DEBUG <subcommand & arguments> | -| | Return the JSON in key in Redis Serialization Protocol (RESP) | JSON.RESP <key> [path] | | -| | An Alias for [JSON.DEL](https://oss.redis.com/redisjson/commands/#jsondel) | JSON.DEL <key> [path] | diff --git a/docs/howtos/redisjson/redisjson1.png b/docs/howtos/redisjson/redisjson1.png deleted file mode 100644 index 6ef5bc0a9af..00000000000 Binary files a/docs/howtos/redisjson/redisjson1.png and /dev/null differ diff --git a/docs/howtos/redisjson/redisjson3.png b/docs/howtos/redisjson/redisjson3.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redisjson/redisjson3.png and /dev/null differ diff --git a/docs/howtos/redisjson/select_cloud_vendor.png b/docs/howtos/redisjson/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisjson/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisjson/shoppingcart/index-shoppingcart.mdx b/docs/howtos/redisjson/shoppingcart/index-shoppingcart.mdx deleted file mode 100644 index aeabff60aa2..00000000000 --- a/docs/howtos/redisjson/shoppingcart/index-shoppingcart.mdx +++ /dev/null @@ -1,803 +0,0 @@ ---- -id: index-shoppingcart -title: How to build a Shopping cart app using NodeJS and RedisJSON -sidebar_label: How to build a Shopping cart app using NodeJS and RedisJSON -slug: /howtos/redisjson/shoppingcart -authors: [ajeet] ---- - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -It’s hard to imagine an online store without a shopping cart. Almost every online store must have the shopping cart functionality to be able to sell products to customers. In order to build a scalable ecommerce platform, you need a powerful framework and a simple storage system. At times, a lot of developers focus on improving the frontend performance of an ecommerce platform to rectify these things. The real bottleneck, however, remains the slow backend load time. -A slow backend load time can have a serious impact on your search engine rankings. A good rule of thumb is that backend load time should take no more than 20% of your total load time. A good backend load time to aim for is 200ms or less. -In this tutorial, you will see how to build a shopping cart application using Node.js, Vue.js, Express and Redis. - -
- -
- -### Content - -- What will you build? -- What do you need? -- Getting started -- Setting up the backend (Node.js Express) -- Setting up the frontend (Vue.js) -- Running the application -- Conclusion - -### What will you build? - -This tutorial will show you how to harness the power of Redis by creating a basic ecommerce shopping cart application with Node.js. Usually, the shopping cart data is stored on the client-side as a cookie. Cookies are small text files stored in a web user's browser directory or data folder. The advantage of doing this is that you wouldn't need to store such temporary data in your database. However, this will require you to send the cookies with every web request, which can slow down the request in case of large cookies. Storing shopping cart data in Redis is a good idea since you can retrieve the items very fast at any time and persist this data if needed. - -![Shopping Cart](shopping2.png) - -### What do you need? - -- Redis compiled with RedisJSON module -- Express 4 backend -- Node 15.5.0 (at least v12.9.0+) -- NPM 7.3.0 (at least v6.14.8+) -- Docker 19.03.X (Optional) -- Docker Compose (Optional) - -Building an ecommerce app with Node.js makes a lot more sense because it ensures the balance between frontend and backend load time due to its asynchronous nature (the ability to handle multiple concurrent users at a time). Node.js helps developers make the best use of event loops and callbacks for I/O operations. Node.js runs single-threaded, non-blocking, asynchronous programming, which is very memory efficient. - -In order to create a shopping cart we need a simple storage system where we can collect products and the cart's total. Node.js provides us with the express-session package, middleware for ExpressJS. We will be using express-session middleware to manage sessions in Node.js The session is stored in the express server itself. - -The default server-side session storage, MemoryStore, is purposely not designed for a production environment. It will leak memory under most conditions, does not scale past a single process, and is meant for debugging and developing. To manage multiple sessions for multiple users, we have to create a global map and put each session object to it. Global variables in NodeJs are memory consuming and can prove to be terrible security holes in production-level projects.This can be solved by using an external session store. We have to store every session in the store so that each one will belong to only a single user. One popular session store is built using Redis. - -We will start by setting up the backend for our application. Let’s create a new directory for our application and initialize a new Node.js application. 
Open up your terminal and type the following: - -### Getting Started - -Clone the repository: - -``` -$ git clone https://github.com/redis-developer/basic-redis-shopping-chart-nodejs -``` - -#### Compiling Redis with RedisJSON module - -You can use the below docker compose file to run Redis server compiled with RedisJSON module: - -``` -version: '3' - -services: - redis: - image: redislabs/rejson:latest - container_name: redis.redisshoppingcart.docker - restart: unless-stopped - environment: - REDIS_PASSWORD: ${REDIS_PASSWORD} - command: redis-server --loadmodule "/usr/lib/redis/modules/rejson.so" --requirepass "$REDIS_PASSWORD" - ports: - - 127.0.0.1:${REDIS_PORT}:6379 - networks: - - global -networks: - global: - external: true -``` - -I assume that you have Docker and Docker Compose up and installed on your local environment. Execute the below compose CLI to bring up Redis server: - -``` -$ docker network create global -$ docker-compose up -d --build -``` - -The `docker-compose ps` shows the list of running Redis services: - -``` -$ docker-compose ps -Name Command State Ports - -redis.redisshoppingcart.docker docker-entrypoint.sh redis ... Up 127.0.0.1:55000->6379/tcp -``` - -### Setting up the backend server - -![Shopping](shoppingcart3.png) - -Node.js is a runtime environment that allows software developers to launch both the frontend and backend of web apps using JavaScript. To save your time, the directory /server/src has already been created for you.This is where we will be creating our modules by adding the following sub-directories - - -- routes -- controller -- middleware -- services - -Routes forward the supported requests (and any information encoded in request URLs) to the appropriate controller functions, whereas controller functions get the requested data from the models, create an HTML page displaying the data, and return it to the user to view in the browser. Services hold your actual business logic. Middleware functions are functions that have access to the request object (req), the response object (res), and the next middleware function in the application’s request-response cycle. - -#### Project directory structure - -``` -% tree -. -├── controllers -│ ├── Cart -│ │ ├── DeleteItemController.js -│ │ ├── EmptyController.js -│ │ ├── IndexController.js -│ │ └── UpdateController.js -│ └── Product -│ ├── IndexController.js -│ └── ResetController.js -├── index.js -├── middleware -│ └── checkSession.js -├── products.json -├── routes -│ ├── cart.js -│ ├── index.js -│ └── products.js -└── services - └── RedisClient.js - -6 directories, 13 files -``` - -Let us first initialize the application server through the index.js shown below: - -``` -// server/src/index.js - - -const express = require('express'); -const redis = require('redis'); -const rejson = require('redis-rejson'); -const session = require('express-session'); -const RedisStore = require('connect-redis')(session); -const path = require('path'); -const bodyParser = require('body-parser'); -const cors = require('cors'); -const RedisClient = require('./services/RedisClient'); - -rejson(redis); - -require('dotenv').config(); - -const { REDIS_ENDPOINT_URI, REDIS_HOST, REDIS_PORT, REDIS_PASSWORD, PORT } = process.env; - -const app = express(); - -app.use( - cors({ - origin(origin, callback) { - callback(null, true); - }, - credentials: true - }) -); - -const redisEndpointUri = REDIS_ENDPOINT_URI - ? 
REDIS_ENDPOINT_URI.replace(/^(redis\:\/\/)/, '') - : `${REDIS_HOST}:${REDIS_PORT}`; - -const redisClient = redis.createClient(`redis://${redisEndpointUri}`, { - password: REDIS_PASSWORD -}); - -const redisClientService = new RedisClient(redisClient); - -app.set('redisClientService', redisClientService); - -app.use( - session({ - store: new RedisStore({ client: redisClient }), - secret: 'someSecret', - resave: false, - saveUninitialized: false, - rolling: true, - cookie: { - maxAge: 3600 * 1000 * 3 - } - }) -); - -app.use(bodyParser.json()); - -app.use('/', express.static(path.join(__dirname, '../../client-dist'))); - -const router = require('./routes')(app); - -app.use('/api', router); - -const port = PORT || 3000; - -app.listen(port, () => { - console.log(`App listening on port ${port}`); -}); -``` - -You'll see that the responsibility of this index.js is to simply set up the server. It initializes all the middleware, sets up the view engine, etc. The last thing to do is set up routes by deferring that responsibility to the index.js within the routes folder. - -As shown above, app.use, app.set, and app.listen are endpoints, for the purposes of this demo, we will need to be able to add and get items from the basket ( Keeping it simple ). -We need to define our basic routes to get all products, get single product details, remove products, and create products. - -#### Routes - -The routes directory is only responsible for defining our routes. Within index.js in this folder, you'll see that its responsibility is to set up our top level routes and delegate their responsibilities to each of their respective route files. Each respective route file will further define any additional subroutes and controller actions for each one. - -The web server skeleton already has a ./routes folder containing routes for the index, products and cart. (as shown under https://github.com/redis-developer/basic-redis-shopping-chart-nodejs/tree/main/server/src/routes) - -``` -// routes/index.js - -const fs = require('fs'); -const express = require('express'); -const router = express.Router(); - -module.exports = app => { - fs.readdirSync(__dirname).forEach(function (route) { - route = route.split('.')[0]; - - if (route === 'index') { - return; - } - - router.use(`/${route}`, require(`./${route}`)(app)); - }); - - return router; -}; -``` - -A route is a section of Express code that associates an HTTP verb (GET, POST, PUT, DELETE, etc.), a URL path/pattern, and a function that is called to handle that pattern. -There are several ways to create routes. For this demo app we're going to use the express.Router middleware as it allows us to group the route handlers for a particular part of a site together and access them using a common route-prefix. -The module requires Express and then uses it to create a Router object. The routes are all set up on the router, which is then exported.The routes are defined either using .get() or .post() methods on the router object. All the paths are defined using strings (we don't use string patterns or regular expressions). Routes that act on some specific resource (e.g. book) use path parameters to get the object id from the URL. -The handler functions are all imported from the controller modules we created in the previous section. - -#### Controllers - -Controllers are responsible for invoking the appropriate action. If a controller's responsibility is to render a view, it will render the appropriate view from the app/views directory. 
- -``` -// controller/Product/IndexController.js - - -const { products } = require('../../products.json'); - -class ProductIndexController { - constructor(redisClientService) { - this.redisClientService = redisClientService; - } - - async index(req, res) { - const productKeys = await this.redisClientService.scan('product:*'); - const productList = []; - - if (productKeys.length) { - for (const key of productKeys) { - const product = await this.redisClientService.jsonGet(key); - - productList.push(JSON.parse(product)); - } - - return res.send(productList); - } - - for (const product of products) { - const { id } = product; - - await this.redisClientService.jsonSet(`product:${id}`, '.', JSON.stringify(product)); - - productList.push(product); - } - - return res.send(productList); - } -} - -module.exports = ProductIndexController; -``` - -#### Services - -Services hold your actual business logic.The service layer carries out the application logic and delegates CRUD operations to a database/persistent storage (Redis in our case). Let us look at each condition and try to understand how the data is stored, modified, and accessed: - -#### How the data is stored: - -The product data is stored in an external JSON file. After the first request, this data is saved in a JSON data type in Redis like: - -``` -JSON.SET product:{productId} . '{ "id": "productId", "name": "Product Name", "price": "375.00", "stock": 10 }'. -``` - -#### Example: - -``` -JSON.SET product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 . '{ "id": "e182115a-63d2-42ce-8fe0-5f696ecdfba6", "name": "Brilliant Watch", "price": "250.00", "stock": 2 }' -``` - -The cart data is stored in a hash like: - -``` -HSET cart:{cartId} product:{productId} {productQuantity}, -``` - -where cartId is a random generated value and stored in the user session. Please note that Redis’s hash management command HSET stores 2 keys-cart and product-as shown in the below example. - -#### Example: - -``` -HSET cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 1 -``` - -#### How the data is modified: - -The product data is modified like - -``` -JSON.SET product:{productId} . '{ "id": "productId", "name": "Product Name", "price": "375.00", "stock": {newStock} }'. -``` - -#### Example: - -``` -JSON.SET product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 . '{ "id": "e182115a-63d2-42ce-8fe0-5f696ecdfba6", "name": "Brilliant Watch", "price": "250.00", "stock": 1 }' -``` - -The cart data is modified like - -``` -HSET cart:{cartId} product:{productId} {newProductQuantity} or HINCRBY cart:{cartId} product:{productId} {incrementBy}. -``` - -#### Example: - -``` -HSET cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 2 - -HINCRBY cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 1 - -HINCRBY cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 -1 -``` - -The product can be removed from the cart like - -``` -HDEL cart:{cartId} product:{productId} -``` - -#### Example: - -``` -HDEL cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 -``` - -The cart can be cleared using - -``` -HGETALL cart:{cartId} and then HDEL cart:{cartId} {productKey} in loop. 
-``` - -#### Example: - -``` -HGETALL cart:77f7fc881edc2f558e683a230eac217d => product:e182115a-63d2-42ce-8fe0-5f696ecdfba6, product:f9a6d214-1c38-47ab-a61c-c99a59438b12, product:1f1321bb-0542-45d0-9601-2a3d007d5842 => HDEL cart:77f7fc881edc2f558e683a230eac217d product:e182115a-63d2-42ce-8fe0-5f696ecdfba6, HDEL cart:77f7fc881edc2f558e683a230eac217d product:f9a6d214-1c38-47ab-a61c-c99a59438b12, HDEL cart:77f7fc881edc2f558e683a230eac217d product:1f1321bb-0542-45d0-9601-2a3d007d5842 -``` - -All carts can be deleted when reset data is requested like: - -``` - SCAN {cursor} MATCH cart:* and then DEL cart:{cartId} in loop. -``` - -#### Example: - -``` - SCAN {cursor} MATCH cart:* => cart:77f7fc881edc2f558e683a230eac217d, cart:217dedc2f558e683a230eac77f7fc881, cart:1ede77f558683a230eac7fc88217dc2f => DEL cart:77f7fc881edc2f558e683a230eac217d, DEL cart:217dedc2f558e683a230eac77f7fc881, DEL cart:1ede77f558683a230eac7fc88217dc2f -``` - -#### How the data is accessed: - -Products: SCAN {cursor} MATCH product:\* to get all product keys and then JSON.GET {productKey} - -#### Example: - -``` -SCAN {cursor} MATCH product:* => product:e182115a-63d2-42ce-8fe0-5f696ecdfba6, product:f9a6d214-1c38-47ab-a61c-c99a59438b12, product:1f1321bb-0542-45d0-9601-2a3d007d5842 -=> JSON.GET product:e182115a-63d2-42ce-8fe0-5f696ecdfba6, JSON.GET product:f9a6d214-1c38-47ab-a61c-c99a59438b1, JSON.GET product:1f1321bb-0542-45d0-9601-2a3d007d5842 -``` - -Cart: HGETALL cart:{cartId} to get quantity of products and JSON.GET product:{productId} to get products data in loop. - -#### Example: - -``` -HGETALL cart:77f7fc881edc2f558e683a230eac217d => product:e182115a-63d2-42ce-8fe0-5f696ecdfba6 (quantity: 1), product:f9a6d214-1c38-47ab-a61c-c99a59438b12 (quantity: 0), product:1f1321bb-0542-45d0-9601-2a3d007d5842 (quantity: 2) => JSON.GET product:e182115a-63d2-42ce-8fe0-5f696ecdfba6, JSON.GET product:f9a6d214-1c38-47ab-a61c-c99a59438b12, JSON.GET product:1f1321bb-0542-45d0-9601-2a3d007d5842 -``` - -HGETALL returns an array of keys and corresponding values from hash data type. -Open up RedisClient.js file using your favourite editor as shown below: - -``` -// services/RedisClient.js - -const { promisify } = require('util'); - -class RedisClient { - constructor(redisClient) { - ['json_get', 'json_set', 'hgetall', 'hset', 'hget', 'hdel', 'hincrby', 'del', 'scan'].forEach( - method => (redisClient[method] = promisify(redisClient[method])) - ); - this.redis = redisClient; - } - - async scan(pattern) { - let matchingKeysCount = 0; - let keys = []; - - const recursiveScan = async (cursor = '0') => { - const [newCursor, matchingKeys] = await this.redis.scan(cursor, 'MATCH', pattern); - cursor = newCursor; - - matchingKeysCount += matchingKeys.length; - keys = keys.concat(matchingKeys); - - if (cursor === '0') { - return keys; - } else { - return await recursiveScan(cursor); - } - }; - - return await recursiveScan(); - } - - jsonGet(key) { - return this.redis.json_get(key); - } - - jsonSet(key, path, json) { - return this.redis.json_set(key, path, json); - } - - hgetall(key) { - return this.redis.hgetall(key); - } - - hset(hash, key, value) { - return this.redis.hset(hash, key, value); - } - - hget(hash, key) { - return this.redis.hget(hash, key); - } - - hdel(hash, key) { - return this.redis.hdel(hash, key); - } - - hincrby(hash, key, incr) { - return this.redis.hincrby(hash, key, incr); - } - - del(key) { - return this.redis.del(key); - } -} - -module.exports = RedisClient; -``` - -#### How does the overall process work? 
The process flow is fairly straightforward. When a request is sent to an endpoint of this shopping cart application, e.g. http://localhost:8081/, it first hits the router for that endpoint. If it is a public endpoint, such as this one, the request is passed on to the controller that handles it. As an analogy, the controller is like a manager, while the service is the worker: the controller manages the incoming HTTP requests, whereas the service receives the request data it needs from the controller in order to perform its tasks.

Next, we create routes for a cart in a module named cart.js. The code first imports the Express application object, uses it to get a Router object and then adds a couple of routes to it using the get() method. Finally, the module exports the Router object.

First, let us define the product model used by our [controllers/Product/IndexController.js file](https://github.com/redis-developer/basic-redis-shopping-chart-nodejs/tree/main/server/src/controllers/Product):

Our product model is as basic as possible: it holds the product id, name, price and stock.

```
{
  "products": [
    {
      "id": "e182115a-63d2-42ce-8fe0-5f696ecdfba6",
      "name": "Brilliant Watch",
      "price": "250.00",
      "stock": 2
    },
    {
      "id": "f9a6d214-1c38-47ab-a61c-c99a59438b12",
      "name": "Old fashion cellphone",
      "price": "24.00",
      "stock": 2
    },
    {
      "id": "1f1321bb-0542-45d0-9601-2a3d007d5842",
      "name": "Modern iPhone",
      "price": "1000.00",
      "stock": 2
    },
    {
      "id": "f5384efc-eadb-4d7b-a131-36516269c218",
      "name": "Beautiful Sunglasses",
      "price": "12.00",
      "stock": 2
    },
    {
      "id": "6d6ca89d-fbc2-4fc2-93d0-6ee46ae97345",
      "name": "Stylish Cup",
      "price": "8.00",
      "stock": 2
    },
    {
      "id": "efe0c7a3-9835-4dfb-87e1-575b7d06701a",
      "name": "Herb caps",
      "price": "12.00",
      "stock": 2
    },
    {
      "id": "x341115a-63d2-42ce-8fe0-5f696ecdfca6",
      "name": "Audiophile Headphones",
      "price": "550.00",
      "stock": 2
    },
    {
      "id": "42860491-9f15-43d4-adeb-0db2cc99174a",
      "name": "Digital Camera",
      "price": "225.00",
      "stock": 2
    },
    {
      "id": "63a3c635-4505-4588-8457-ed04fbb76511",
      "name": "Empty Bluray Disc",
      "price": "5.00",
      "stock": 2
    },
    {
      "id": "97a19842-db31-4537-9241-5053d7c96239",
      "name": "256BG Pendrive",
      "price": "60.00",
      "stock": 2
    }
  ]
}
```

#### Testing the Server

Copy `.env.example` to `.env` and set the environment variables as shown below:

```
REDIS_PORT=6379
REDIS_HOST=127.0.0.1
REDIS_PASSWORD=demo

COMPOSE_PROJECT_NAME=redis-shopping-cart
```

(Note: if you're using Redis Enterprise Cloud instead of localhost, set REDIS_HOST to the database endpoint (without the port), and set REDIS_PORT and REDIS_PASSWORD to the corresponding values from your database details.)

#### Installing the dependencies

```
$ npm install
```

#### Testing the Routes

With the dependencies installed, start the application with `npm run dev`. Once the server is up, it prints `App listening on port 3000`.

```
$ npm run dev
```

```
$ npm run dev

> redis-shopping-cart-backend@1.0.0 dev
> nodemon src/index.js

[nodemon] 2.0.7
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: js,mjs,json
[nodemon] starting `node src/index.js`
App listening on port 3000
```

### Setting up the frontend web client using Vue.js

Now that we have the application's backend running, let us begin developing its frontend.
We will be leveraging Vue.js - a robust but simple JavaScript framework for building our frontend web client. It has one of the lowest barriers to entry of any modern framework while providing all the required features for high performance web applications. - -``` -. -├── README.md -├── babel.config.js -├── node_modules -├── package-lock.json -├── package.json -├── public -├── src -└── vue.config.js -``` - -The files at the root level (babel.config.js, package.json, node_modules) are used to configure the project. -The most interesting part, at least for now, is located in the src directory(directory structure is shown below): - -The main.js file is the main JavaScript file of the application, which will load all common elements and call the App.vue main screen. -The App.vue is a file that contains in the HTML, CSS, and JavaScript for a specific page or template. As an entry point for the application, this part is shared by all screens by default, so it is a good place to write the notification-client piece in this file. -The public/index.html is the static entry point from where the DOM will be loaded. - -#### Directory Structure: - -``` -% tree -. -├── App.vue -├── assets -│ ├── RedisLabs_Illustration.svg -│ └── products -│ ├── 1f1321bb-0542-45d0-9601-2a3d007d5842.jpg -│ ├── 42860491-9f15-43d4-adeb-0db2cc99174a.jpg -│ ├── 63a3c635-4505-4588-8457-ed04fbb76511.jpg -│ ├── 6d6ca89d-fbc2-4fc2-93d0-6ee46ae97345.jpg -│ ├── 97a19842-db31-4537-9241-5053d7c96239.jpg -│ ├── e182115a-63d2-42ce-8fe0-5f696ecdfba6.jpg -│ ├── efe0c7a3-9835-4dfb-87e1-575b7d06701a.jpg -│ ├── f5384efc-eadb-4d7b-a131-36516269c218.jpg -│ ├── f9a6d214-1c38-47ab-a61c-c99a59438b12.jpg -│ └── x341115a-63d2-42ce-8fe0-5f696ecdfca6.jpg -├── components -│ ├── Cart.vue -│ ├── CartItem.vue -│ ├── CartList.vue -│ ├── Info.vue -│ ├── Product.vue -│ ├── ProductList.vue -│ └── ResetDataBtn.vue -├── config -│ └── index.js -├── main.js -├── plugins -│ ├── axios.js -│ └── vuetify.js -├── store -│ ├── index.js -│ └── modules -│ ├── cart.js -│ └── products.js -└── styles - └── styles.scss - -8 directories, 27 files -``` - -In the client directory, under the subdirectory src, open the file App.vue. You will see the below content: - -``` - - - -``` - -This is client-side code. Here API returns, among other things, links to icons suitable for use on Maps. If you follow the flow through, you’ll see the map markers are loading those icons directly using the include URLs. - -### Running/Testing the web client - -``` -$ cd client -$ npm run serve - -> redis-shopping-cart-client@1.0.0 serve -> vue-cli-service serve - - INFO Starting development server... -98% after emitting CopyPlugin - - DONE Compiled successfully in 7733ms 7:15:56 AM - - - App running at: - - Local: http://localhost:8081/ - - Network: http://192.168.43.81:8081/ - - Note that the development build is not optimized. - To create a production build, run npm run build. -``` - -Let us click on the first item “256GB Pendrive” and try to check out this product. Once you add it to the cart, you will see the below output using redis-cli monitor command: - -``` -1613320256.801562 [0 172.22.0.1:64420] "json.get" "product:97a19842-db31-4537-9241-5053d7c96239" -1613320256.803062 [0 172.22.0.1:64420] "hget" -... -1613320256.805950 [0 172.22.0.1:64420] "json.set" "product:97a19842-db31-4537-9241-5053d7c96239" "." 
"{\"id\":\"97a19842-db31-4537-9241-5053d7c96239\",\"name\":\"256BG Pendrive\",\"price\":\"60.00\",\"stock\":1}" -1613320256.807792 [0 172.22.0.1:64420] "set" "sess:Ii9njXZd6zeUViL3tKJimN5zU7Samfze" -... -1613320256.823055 [0 172.22.0.1:64420] "scan" "0" "MATCH" "product:*" -... -1613320263.232527 [0 172.22.0.1:64420] "hgetall" "cart:bdee1606395f69985e8f8e01d3ada8c4" -1613320263.233752 [0 172.22.0.1:64420] "set" "sess:gXk5K9bobvrR790-HFEoi3bQ2kP9YmjV" "{\"cookie\":{\"originalMaxAge\":10800000,\"expires\":\"2021-02-14T19:31:03.233Z\",\"httpOnly\":true,\"path\":\"/\"},\"cartId\":\"bdee1606395f69985e8f8e01d3ada8c4\"}" "EX" "10800" -1613320263.240797 [0 172.22.0.1:64420] "scan" "0" "MATCH" "product:*" -1613320263.241908 [0 172.22.0.1:64420] "scan" "22" "MATCH" "product:*" -… -"{\"cookie\":{\"originalMaxAge\":10800000,\"expires\":\"2021-02-14T19:31:03.254Z\",\"httpOnly\":true,\"path\":\"/\"},\"cartId\":\"4bc231293c5345370f8fab83aff52cf3\"}" "EX" "10800" -``` - -![Shopping Cart](shoppingcart5.png) - -### Conclusion - -Storing shopping cart data in Redis is a good idea because it lets you retrieve the data very fast at any time and persist this data if needed. As compared to cookies that store the entire shopping cart data in session that is bloated and relatively slow in operation, storing the shopping cart data in Redis speeds up the shopping cart’s read and write performance , thereby improving the user experience. - -### Reference - -- [Source code](https://github.com/redis-developer/basic-redis-shopping-chart-nodejs) diff --git a/docs/howtos/redisjson/shoppingcart/shopping2.png b/docs/howtos/redisjson/shoppingcart/shopping2.png deleted file mode 100644 index 11ad10681af..00000000000 Binary files a/docs/howtos/redisjson/shoppingcart/shopping2.png and /dev/null differ diff --git a/docs/howtos/redisjson/shoppingcart/shoppingcart.png b/docs/howtos/redisjson/shoppingcart/shoppingcart.png deleted file mode 100644 index 8e83f9721fe..00000000000 Binary files a/docs/howtos/redisjson/shoppingcart/shoppingcart.png and /dev/null differ diff --git a/docs/howtos/redisjson/shoppingcart/shoppingcart2.png b/docs/howtos/redisjson/shoppingcart/shoppingcart2.png deleted file mode 100644 index b7b95e25fce..00000000000 Binary files a/docs/howtos/redisjson/shoppingcart/shoppingcart2.png and /dev/null differ diff --git a/docs/howtos/redisjson/shoppingcart/shoppingcart3.png b/docs/howtos/redisjson/shoppingcart/shoppingcart3.png deleted file mode 100644 index 177a3ad453d..00000000000 Binary files a/docs/howtos/redisjson/shoppingcart/shoppingcart3.png and /dev/null differ diff --git a/docs/howtos/redisjson/shoppingcart/shoppingcart5.png b/docs/howtos/redisjson/shoppingcart/shoppingcart5.png deleted file mode 100644 index 29d024581b6..00000000000 Binary files a/docs/howtos/redisjson/shoppingcart/shoppingcart5.png and /dev/null differ diff --git a/docs/howtos/redisjson/storing-complex-json-document/index-storing-complex-json-document.mdx b/docs/howtos/redisjson/storing-complex-json-document/index-storing-complex-json-document.mdx deleted file mode 100644 index 195517167e4..00000000000 --- a/docs/howtos/redisjson/storing-complex-json-document/index-storing-complex-json-document.mdx +++ /dev/null @@ -1,134 +0,0 @@ ---- -id: index-storing-complex-json-document -title: Storing and retrieving Nested JSON document -sidebar_label: Storing and retrieving Nested JSON document -slug: /howtos/redisjson/storing-complex-json-document -authors: [ajeet] ---- - -JSON(a.k.a JavaScript Object Notation) is a format for sharing data. 
A JSON object is a key-value data format that is typically rendered in curly braces. When you’re working with JSON, you’ll likely see JSON objects in a .json file, but they can also exist as a JSON object or string within the context of a program. - -Nested JSON is a JSON file with a big portion of its values being other JSON objects. Compared with Simple JSON, Nested JSON provides higher clarity in that it decouples objects into different layers, making it easier to maintain. - -## Example of Nested JSON object - -``` -employee = { - 'name': "Paul", - 'Age': '25', - 'Location': "USA", - 'Address': - { - "longitude": "-113.6335371", - "latitude": "37.1049502", - "postal code": "90266" - } - } -``` - -Follow the below steps to understand how nested JSON objects can be imported into Redis database: - -### Step 1. Run RedisJSON Docker container - -```bash - docker run -p 6379:6379 --name redis-redisjson redislabs/rejson:latest -``` - -### Step 2. Verify if RedisJSON module is loaded - -```bash - redis-cli - 127.0.0.1:6379> info modules - # Modules - module:name=ReJSON,ver=10007,api=1,filters=0,usedby=[],using=[],options=[] - 127.0.0.1:6379> -``` - -### Step 3. Nested JSON - -Below is a python code for nested JSON document: - -```python - import redis - import json - - employee = { - 'name': "Paul", - 'Age': '25', - 'Location': "USA", - 'Address': - { - "longitude": "-113.6335371", - "latitude": "37.1049502", - "postal code": "90266" - } - } - r = redis.StrictRedis() - r.execute_command('JSON.SET', 'record', '.', json.dumps(employee)) - reply = json.loads(r.execute_command('JSON.GET', 'record')) -``` - -Copy the code and save it in a file called `employee.py` - -### Step 4. Load Redis Module - -```bash - pip install rejson -``` - -### Step 5. Execute the python script - -Execute the below script and ensure that it executes successfully. - -```bash - python3 employee.py -``` - -### Step 6. Verify the JSON objects gets added to Redis - -```bash - redis-cli - 127.0.0.1:6379> JSON.GET record - "{\"name\":\"Paul\",\"Age\":\"25\",\"Location\":\"USA\",\"Address\":[{\"longitude\":\"-113.6335371\",\"latitude\":\"37.1049502\",\"postal code\":\"90266\"}]}" -``` - -### Step 7. Fetching the specific fields - -In case you want to fetch specific filed (like address), then the code would look like this: - -```python - import redis - import json - - employee = { - 'name': "Paul", - 'Age': '25', - 'Location': "USA", - 'Address': - { - "longitude": "-113.6335371", - "latitude": "37.1049502", - "postal code": "90266" - } - - } - r = redis.StrictRedis() - r.execute_command('JSON.SET', 'record', '.', json.dumps(employee)) - reply = json.loads(r.execute_command('JSON.GET', 'record', '.Address.longitude')) -``` - -### Step 8. Verifying the results - -```bash - redis-cli - 127.0.0.1:6379> JSON.GET record .Address.longitude - "\"-113.6335371\"" -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. 
-- [How to store and retrieve nested JSON document](/howtos/redisjson/storing-complex-json-document) -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) diff --git a/docs/howtos/redisjson/storing-json-using-nodejs/images/application_architecture.png b/docs/howtos/redisjson/storing-json-using-nodejs/images/application_architecture.png deleted file mode 100644 index a55d4120ccf..00000000000 Binary files a/docs/howtos/redisjson/storing-json-using-nodejs/images/application_architecture.png and /dev/null differ diff --git a/docs/howtos/redisjson/storing-json-using-nodejs/images/configure_insight.png b/docs/howtos/redisjson/storing-json-using-nodejs/images/configure_insight.png deleted file mode 100644 index b07bea6954a..00000000000 Binary files a/docs/howtos/redisjson/storing-json-using-nodejs/images/configure_insight.png and /dev/null differ diff --git a/docs/howtos/redisjson/storing-json-using-nodejs/images/locationdetails_redis_json.png b/docs/howtos/redisjson/storing-json-using-nodejs/images/locationdetails_redis_json.png deleted file mode 100644 index 566027ca8ca..00000000000 Binary files a/docs/howtos/redisjson/storing-json-using-nodejs/images/locationdetails_redis_json.png and /dev/null differ diff --git a/docs/howtos/redisjson/storing-json-using-nodejs/index-storingjson-nodejs.mdx b/docs/howtos/redisjson/storing-json-using-nodejs/index-storingjson-nodejs.mdx deleted file mode 100644 index 1d8bae0cabd..00000000000 --- a/docs/howtos/redisjson/storing-json-using-nodejs/index-storingjson-nodejs.mdx +++ /dev/null @@ -1,440 +0,0 @@ ---- -id: index-storingjson-nodejs -title: How to store and retrieve JSON documents using Node.js -sidebar_label: Storing and retrieving JSON documents using Node.js -slug: /howtos/redisjson/storing-json-using-nodejs -authors: [ajeet, simon] ---- - -Imagine that you're building a social network application where users can "check in" at different locations and give them a star rating, say from 0 for an awful experience through 5 to report that they had the best time ever there! When designing your application, you determined that there's a need to manage data about three main entities: - -- Users -- Locations -- Checkins - -Let's look at what we're storing about each of these entities. As we're using Redis as our only data store, we'll also consider how they map to Redis data types... - -## Users - -We'll represent each user as a flat map of name/value pairs with no nested objects. As we'll see later on, this maps nicely to a Redis Hash. Here's a JSON representation of the schema we'll use to represent each user: - -```json -{ - "id": 99, - "firstName": "Isabella", - "lastName": "Pedersen", - "email": "isabella.pedersen@example.com", - "password": "xxxxxx1", - "numCheckins": 8073, - "lastCheckin": 1544372326893, - "lastSeenAt": 138 -} -``` - -We've given each user an ID and we're storing basic information about them. Also, we’ll encrypt their password using bcrypt when we load the sample data into Redis. - -For each user, we'll keep track of the total number of checkins that they've submitted to the system, and the timestamp and location ID of their most recent checkin so that we know where and when they last used the system. - -## Locations - -For each location that users can check in at, we're going to maintain two types of data. 
The first of these is also a flat map of name/value pairs, containing summary information about the location: - -```json -{ - "id": 138, - "name": "Stacey's Country Bakehouse", - "category": "restaurant", - "location": "-122.195447,37.774636", - "numCheckins": 170, - "numStars": 724, - "averageStars": 4 -} -``` - -We've given each location an ID and a category—we'll use the category to search for locations by type later on. The "location" field stores the coordinates in longitude, latitude format… this is the opposite from the usual latitude, longitude format. We'll see how to use this to perform geospatial searches later when we look at the RediSearch module. - -For each location, we're also storing the total number of checkins that have been recorded there by all of our users, the total number of stars that those checkins gave the location, and an average star rating per checkin for the location. - -The second type of data that we want to maintain for each location is what we'll call "location details". These take the form of more structured JSON documents with nested objects and arrays. Here's an example for location 138, Stacey's Country Bakehouse: - -```json -{ - "id": 138, - "hours": [ - { "day": "Monday", "hours": "8-7" }, - { "day": "Tuesday", "hours": "9-7" }, - { "day": "Wednesday", "hours": "6-8" }, - { "day": "Thursday", "hours": "6-6" }, - { "day": "Friday", "hours": "9-5" }, - { "day": "Saturday", "hours": "8-9" }, - { "day": "Sunday", "hours": "7-7" } - ], - "socials": [ - { - "instagram": "staceyscountrybakehouse", - "facebook": "staceyscountrybakehouse", - "twitter": "staceyscountrybakehouse" - } - ], - "website": "www.staceyscountrybakehouse.com", - "description": "Lorem ipsum....", - "phone": "(316) 157-8620" -} -``` - -We want to build an API that allows us to retrieve all or some of these extra details, and keep the overall structure of the document intact. For that, we'll need the RedisJSON module as we'll see later. - -## Checkins - -Checkins differ from users and locations in that they're not entities that we need to store forever. In our application, checkins consist of a user ID, a location ID, a star rating and a timestamp - we'll use these values to update attributes of our users and locations. - -Each checkin can be thought of as a flat map of name/value pairs, for example: - -```json -{ - "userId": 789, - "locationId": 171, - "starRating": 5 -} -``` - -Here, we see that user 789 visited location 171 ("Hair by Parvinder") and was really impressed with the service. - -We need a way to store checkins for long enough to process them, but not forever. We also need to associate a timestamp with each one, as we'll need that when we process the data. - -Redis provides a Stream data type that's perfect for this - with Redis Streams, we can store maps of name/value pairs and have the Redis server timestamp them for us. Streams are also perfect for the sort of asynchronous processing we want to do with this data. When a user posts a new checkin to our API we want to store that data and respond to the user that we've received it as quickly as possible. Later we can have one or more other parts of the system do further processing with it. Such processing might include updating the total number of checkins and last seen at fields for a user, or calculating a new average star rating for a location. - -## Application Architecture - -We decided to use Node.js with the Express framework and ioredis client to build the application. 
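To make the Stream approach described above concrete, here is a minimal, illustrative sketch of how a checkin could be appended to the `ncc:checkins` Stream with ioredis. The key name matches the sample data loaded later in this tutorial; the route path, port and error handling are assumptions for the example and not the project's actual code:

```javascript
const Redis = require('ioredis');
const express = require('express');

const redis = new Redis(); // assumes Redis is running on localhost:6379
const app = express();
app.use(express.json());

// Accept a checkin and append it to the ncc:checkins Stream.
// Passing '*' asks the Redis server to assign the entry ID,
// which is based on the server's own timestamp.
app.post('/api/checkin', async (req, res) => {
  const { userId, locationId, starRating } = req.body;
  const entryId = await redis.xadd(
    'ncc:checkins', '*',
    'userId', userId,
    'locationId', locationId,
    'starRating', starRating,
  );
  // Respond immediately; further processing happens asynchronously.
  res.status(202).json({ entryId });
});

app.listen(8082);
```

Because the server assigns the timestamped ID, the receiver can acknowledge the request straight away and leave the heavier processing to other parts of the system.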
Rather than have a monolithic codebase, the application has been split out into four components or services. These are: - -- **Authentication Service**: Listens on an HTTP port and handles user authentication using Redis as a shared session store that other services can access. -- **Checkin Receiver**: Listens on an HTTP port and receives checkins as HTTP POST requests from our users. Each checkin is placed in a Redis Stream for later processing. -- **Checkin Processor**: Monitors the checkin Stream in Redis, updating user and location information as it processes each checkin. -- **API Server**: Implements the bulk of the application's API endpoints, including those to retrieve information about users and locations from Redis. - -These components fit together like so: - -![Application Architecture](images/application_architecture.png) - -There's also a data loader component, which we'll use to load some initial sample data into the system. - -### Step 1. Cloning the repository - -```bash - git clone https://github.com/redislabs-training/node-js-crash-course.git - cd node-js-crash-course - npm install -``` - -### Step 2. Running Redis container - -From the node-js-crash-course directory, start Redis using `docker-compose`: - -```bash - $ docker-compose up -d - Creating network "node-js-crash-course_default" with the default driver - Creating rediscrashcourse ... done - $ docker ps -``` - -The output from the docker ps command should show one container running, using the "redislabs/redismod" image. This container runs Redis 6 with the RediSearch, RedisJSON and RedisBloom modules. - -### Step 3. Load the Sample Data into Redis - -Load the course example data using the provided data loader. This is a Node.js application: - -```bash -$ npm run load all -> node src/utils/dataloader.js -- "all" - -Loading user data... -User data loaded with 0 errors. -Loading location data... -Location data loaded with 0 errors. -Loading location details... -Location detail data loaded with 0 errors. -Loading checkin stream entries... -Loaded 5000 checkin stream entries. -Creating consumer group... -Consumer group created. -Dropping any existing indexes, creating new indexes... -Created indexes. -Deleting any previous bloom filter, creating new bloom filter... -Created bloom filter. -``` - -In another terminal window, run the redis-cli executable that's in the Docker container. Then, enter the Redis commands shown at the redis-cli prompt to verify that data loaded successfully: - -```bash -$ docker exec -it rediscrashcourse redis-cli -127.0.0.1:6379> hgetall ncc:locations:106 - 1) "id" - 2) "106" - 3) "name" - 4) "Viva Bubble Tea" - 5) "category" - 6) "cafe" - 7) "location" - 8) "-122.268645,37.764288" - 9) "numCheckins" -10) "886" -11) "numStars" -12) "1073" -13) "averageStars" -14) "1" -127.0.0.1:6379> hgetall ncc:users:12 - 1) "id" - 2) "12" - 3) "firstName" - 4) "Franziska" - 5) "lastName" - 6) "Sieben" - 7) "email" - 8) "franziska.sieben@example.com" - 9) "password" -10) "$2b$05$uV38PUcdFD3Gm6ElMlBkE.lzZutqWVE6R6ro48GsEjcmnioaZZ55C" -11) "numCheckins" -12) "8945" -13) "lastCheckin" -14) "1490641385511" -15) "lastSeenAt" -16) "22" -127.0.0.1:6379> xlen ncc:checkins -(integer) 5000 -``` - -### Step 4. Start and Configure RedisInsight - -If you're using RedisInsight, start it up and it should open in your browser automatically. If not, point your browser at http://localhost:8001. - -If this is your first time using RedisInsight click "I already have a database". 
- -If you already have other Redis databases configured in RedisInsight, click "Add Redis Database". - -Now, click "Connect to a Redis Database Using hostname and port". Configure the database details as shown below, then click "Add Redis Database". - -![Configuring RedisInsight](images/configure_insight.png) - -You should now be able to browse your Redis instance. If you need more guidance on how to connect to Redis from RedisInsight, check out Justin's video below but be sure to use 127.0.0.1 as the host, 6379 as the port and leave the username and password fields blank when configuring your database. - -
- -
- -### Step 5. Start the Application - -Now it's time to start the API Server component of the application and make sure it connects to Redis. This component listens on port 8081. - -If port 8081 is in use on your system, edit this section of the `config.json` file and pick another available port: - -```json -"application": { - "port": 8081 -}, -``` - -Start the server like this: - -```bash -$ npm run dev - -> ./node_modules/nodemon/bin/nodemon.js - -[nodemon] 2.0.7 -[nodemon] to restart at any time, enter `rs` -[nodemon] watching path(s): *.* -[nodemon] watching extensions: js,mjs,json -[nodemon] starting `node src/server.js` -Warning: Environment variable WEATHER_API_KEY is not set! -info: Application listening on port 8081. -``` - -This starts the application using nodemon, which monitors for changes in the source code and will restart the server when a change is detected. This will be useful in the next module where you'll be making some code changes. - -Ignore the warning about `WEATHER_API_KEY` — we'll address this in a later exercise when we look at using Redis as a cache. - -To verify that the server is running correctly and connected to Redis, point your browser at: - -``` -http://localhost:8081/api/location/200 -``` - -You should see the summary information for location 200, Katia's Kitchen: - -```json -{ - "id": "200", - "name": "Katia's Kitchen", - "category": "restaurant", - "location": "-122.2349598,37.7356811", - "numCheckins": "359", - "numStars": "1021", - "averageStars": "3" -} -``` - -Great! Now you're up and running. Let's move on to the next module and see how we're using Redis Hashes in the application. You'll also get to write some code! - -### Step 6. Stopping redis-cli, the Redis Container and the Application - -**Don't do this now, as we’ve only just started!** However, when you do want to shut everything down, here's how to do it... - -To stop running redis-cli, simply enter the quit command at the redis-cli prompt: - -```bash -127.0.0.1:6379> quit -$ -``` - -To stop the Redis Server, make sure you are in the `node-js-crash-course` folder that you checked the application repo out to, then: - -```bash -$ docker-compose down -Stopping rediscrashcourse ... done -Removing rediscrashcourse ... done -Removing network node-js-crash-course_default -``` - -Redis persists data to the "redisdata" folder. If you want to remove this, just delete it: - -```bash -$ rm -rf redisdata -``` - -To stop each of the application's components, press Ctrl+C in the terminal window that the component is running in. For example, to stop the API server: - -```bash -$ npm run dev - -> ./node_modules/nodemon/bin/nodemon.js - -[nodemon] 2.0.7 -[nodemon] to restart at any time, enter `rs` -[nodemon] watching path(s): *.* -[nodemon] watching extensions: js,mjs,json -[nodemon] starting `node src/server.js` -info: Application listening on port 8081. -^C -node-js-crash-course $ -``` - -We used Redis' built-in Hash data type to represent our user and location entities. Hashes are great for this, but they are limited in that they can only contain flat name/value pairs. For our locations, we want to store extra details in a more structured way. 
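To see why, here is a small, illustrative sketch of reading one of those flat location Hashes with ioredis. The `ncc:locations:<id>` key pattern comes from the sample data; the function itself is just an example, not code taken from the project:

```javascript
const Redis = require('ioredis');

const redis = new Redis(); // assumes Redis is running on localhost:6379

async function getLocationSummary(locationId) {
  // HGETALL returns the Hash as a flat object of string fields.
  // That works well for the summary data, but there is no natural
  // place to put nested values such as opening hours or social handles.
  return redis.hgetall(`ncc:locations:${locationId}`);
}

getLocationSummary(200)
  .then(console.log)
  .finally(() => redis.quit());
```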
- -Here's an example of the additional data we want to store about a location: - -```json -{ - "id": 121, - "hours": [ - { "day": "Monday", "hours": "6-7" }, - { "day": "Tuesday", "hours": "6-7" }, - { "day": "Wednesday", "hours": "7-8" }, - { "day": "Thursday", "hours": "6-9" }, - { "day": "Friday", "hours": "8-5" }, - { "day": "Saturday", "hours": "9-6" }, - { "day": "Sunday", "hours": "6-4" } - ], - "socials": [ - { - "instagram": "theginclub", - "facebook": "theginclub", - "twitter": "theginclub" - } - ], - "website": "www.theginclub.com", - "description": "Lorem ipsum...", - "phone": "(318) 251-0608" -} -``` - -We could store this data as serialized JSON in a Redis String, but then our application would have to retrieve and parse the entire document every time it wanted to read some of the data. And we'd have to do the same to update it too. Furthermore, with this approach, update operations aren't atomic and a second client could update the JSON stored at a given key while we're making changes to it in our application code. Then, when we serialize our version of the JSON back into the Redis String, the other client's changes would be lost. - -RedisJSON adds a new JSON data type to Redis, and a query syntax for selecting and updating individual elements in a JSON document atomically on the Redis server. This makes our application code simpler, more efficient, and much more reliable. - -### Step 7. Final exercise - -In this exercise, you'll complete the code for an API route that gets just the object representing a location's opening hours for a given day. Open the file `src/routes/location_routes.js`, and find the route for `/location/:locationId/hours/:day`. The starter code looks like this: - -```javascript -// EXERCISE: Get opening hours for a given day. -router.get( - '/location/:locationId/hours/:day', - [ - param('locationId').isInt({ min: 1 }), - param('day').isInt({ min: 0, max: 6 }), - apiErrorReporter, - ], - async (req, res) => { - /* eslint-disable no-unused-vars */ - const { locationId, day } = req.params; - /* eslint-enable */ - const locationDetailsKey = redis.getKeyName('locationdetails', locationId); - - // TODO: Get the opening hours for a given day from - // the JSON stored at the key held in locationDetailsKey. - // You will need to provide the correct JSON path to the hours - // array and return the element held in the position specified by - // the day variable. Make sure RedisJSON returns only the day - // requested! - const jsonPath = 'TODO'; - - /* eslint-enable no-unused-vars */ - const hoursForDay = JSON.parse( - await redisClient.call('JSON.GET', locationDetailsKey, jsonPath), - ); - /* eslint-disable */ - - // If null response, return empty object. - res.status(200).json(hoursForDay || {}); - }, -); -``` - -You'll need to update the code to provide the correct JSON path, replacing the "TODO" value with a JSON path expression. - -Looking at the JSON stored at key `ncc:locationdetails:121`, we see that the opening hours are stored as an array of objects in a field named `hours`, where day 0 is Monday and day 6 is Sunday: - -![Location Details in RedisInsight](images/locationdetails_redis_json.png) - -So you'll need a JSON path query that gets the right element from the `hours` array depending on the value stored in the variable `day`. - -If you're using redis-cli, you can look at the structure of the JSON document with the following command: - -```bash -json.get ncc:locationdetails:121 . 
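# The document is large, so you can also ask RedisJSON for just part of it,
# e.g. the whole hours array (illustrative; path syntax per the RedisJSON docs):
json.get ncc:locationdetails:121 '.hours'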
-``` - -Make sure your query returns only the day requested, so that you don't have to write Node.js code to filter the value returned from Redis. Use the [RedisJSON path syntax page](https://oss.redis.com/redisjson/path/) to help you formulate the right query. - -To test your code, start the server with: - -```bash -$ npm run dev -``` - -Recall that this will allow you to edit the code and try your changes without restarting the server. - -If you have the correct JSON path in your code, visiting `http://localhost:80801/api/location/121/hours/2` should return: - -```json -{ - "day": "Wednesday", - "hours": "7-8" -} -``` - -## External Resources - -- [Sample Social Network GitHub Repository](https://github.com/redislabs-training/node-js-crash-course) -- [RedisJSON](https://redisjson.io/) -- [How to store and retrieve nested JSON document](/howtos/redisjson/storing-complex-json-document) -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University diff --git a/docs/howtos/redisjson/using-dotnet/index-usingdotnet.mdx b/docs/howtos/redisjson/using-dotnet/index-usingdotnet.mdx deleted file mode 100644 index ae89f313ae7..00000000000 --- a/docs/howtos/redisjson/using-dotnet/index-usingdotnet.mdx +++ /dev/null @@ -1,7 +0,0 @@ ---- -id: index-usingdotnet -title: Importing JSON data into Redis using .Net -sidebar_label: RedisJSON with .Net -slug: /howtos/redisjson/using-dotnet -authors: [ajeet] ---- diff --git a/docs/howtos/redisjson/using-go/index-usinggo.mdx b/docs/howtos/redisjson/using-go/index-usinggo.mdx deleted file mode 100644 index 28669e07541..00000000000 --- a/docs/howtos/redisjson/using-go/index-usinggo.mdx +++ /dev/null @@ -1,131 +0,0 @@ ---- -id: index-usinggo -title: How to cache JSON data in Redis with Go -sidebar_label: RedisJSON and Go -slug: /howtos/redisjson/using-go -author: [ajeet] ---- - -[Go-ReJSON](https://github.com/nitishm/go-rejson) is a Go client for RedisJSON Module. It is a Golang client that support multiple Redis clients such as the print-like Redis-api client [redigo](/develop/golang/) and the type-safe Redis client [go-redis](/develop/golang/). - -Follow the below steps to get started with RedisJSON using Go client. - -### Step 1. Initialise the Go module - -```bash - go mod init github.com/my/repo -``` - -### Step 2. Install Go-redis - -```bash - go get github.com/go-redis/redis/v8 -``` - -### Step 3. Install Go client for RedisJSON - -```bash - go get github.com/nitishm/go-rejson/v4 -``` - -### Step 4. Clone the repository - -```bash - git clone https://github.com/nitishm/go-rejson - cd go-rejson/example -``` - -### Step 5. Build the Go package - -Command: - -```go - go build json_set/json_set.go -``` - -Result: - -```go - go: downloading github.com/go-redis/redis/v8 v8.4.4 - go: downloading github.com/gomodule/redigo v1.8.3 - go: downloading go.opentelemetry.io/otel v0.15.0 - go build: build output "json_set" already exists and is a directory -``` - -### Step 6. 
Run the Go program

Command:

```go
 go run json_set/json_set.go
```

Result:

```go
 Executing Example_JSONSET for Redigo Client
 Success: OK
 Student read from redis : main.Student{Name:main.Name{First:"Mark", Middle:"S", Last:"Pronto"}, Rank:1}

 Executing Example_JSONSET for Redigo Client
 Success: OK
 Student read from redis : main.Student{Name:main.Name{First:"Mark", Middle:"S", Last:"Pronto"}, Rank:1}
```

Command:

```go
 pwd
 go-rejson/examples
```

Command:

```go
 go run json_array/json_array.go
```

Result:

```go
 Executing Example_JSONSET for Redigo Client
 arr: OK
 arr before pop: [one two three four five]
 Length: 5
 Deleted element: five
 arr after pop: [one two three four]
 Length: 4
 Index of "one": 0
 Out of range: -1
 "ten" not found: -1
 no. of elements left: 2
 arr after trimming to [1,2]: [two three]
 no. of elements: 3
 arr after inserting "one": [one two three]

 Executing Example_JSONSET for Redigo Client
 arr: OK
 arr before pop: [one two three four five]
 Length: 5
 Deleted element: five
 arr after pop: [one two three four]
 Length: 4
 Index of "one": 0
 Out of range: -1
 "ten" not found: -1
 no. of elements left: 2
 arr after trimming to [1,2]: [two three]
 no. of elements: 3
 arr after inserting "one": [one two three]
```

### References

- [Go and Redis](/develop/golang/)
- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University
- [RedisJSON and Python](/howtos/redisjson/using-python)
- [How to store and retrieve nested JSON document](/howtos/redisjson/storing-complex-json-document)
- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs)
- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial.
- [How to build shopping cart app using NodeJS and RedisJSON](/howtos/shoppingcart)
- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/)
diff --git a/docs/howtos/redisjson/using-java/index-usingjava.mdx b/docs/howtos/redisjson/using-java/index-usingjava.mdx
deleted file mode 100644
index b7dea167dd7..00000000000
--- a/docs/howtos/redisjson/using-java/index-usingjava.mdx
+++ /dev/null
@@ -1,129 +0,0 @@
---
id: index-usingjava
title: Modeling JSON Documents with Redis and Java
sidebar_label: RedisJSON and Java
slug: /howtos/redisjson/using-java
---

RedisJSON lets you store, index, and query JSON data in Redis. Jedis, a Java driver for Redis, provides full support for RedisJSON as of the 4.0 release.

Follow along with the steps below to get started with Java and RedisJSON.

### 1. Run the RedisMod Docker container

The RedisMod docker container bundles the Redis modules that power RedisJSON. To get a running Redis instance with RedisJSON, run the following bash command:

```bash
 docker run -d -p 6379:6379 redislabs/redismod:latest
```

### 2. Add Jedis as a Dependency

You'll need to add Jedis to your Java project. If you're using [Maven](https://maven.apache.org/install.html), the dependency will look something like this:

```xml
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
    <version>4.0.0</version>
</dependency>
```

Replace the version with your desired version of Jedis, but note that you'll need Jedis 4.0 or greater to get RedisJSON support.

### 3. Connect to Redis

You'll need to initialize your connection to Redis.
This means configuring and creating a `UnifiedJedis` instance: - -```java - HostAndPort config = new HostAndPort(Protocol.DEFAULT_HOST, 6379); - PooledConnectionProvider provider = new PooledConnectionProvider(config); - UnifiedJedis client = new UnifiedJedis(provider); -``` - -## 4. Model Your Domain - -You'll need to represent your data by creating [POJOs](https://en.wikipedia.org/wiki/Plain_old_Java_object). Jedis will then help you serialize these objects to JSON. - -Suppose you're building an online learning platform, and you want to represent -students. Let's create a simple POJO to represent these students: - -```java -private class Student { - private String firstName; - private String lastName; - - public Student(String firstName, String lastName) { - this.firstName = firstName; - this.lastName = lastName; - } - - public String getFirstName() { - return firstName; - } - - public String getLastName() { - return lastName; - } -} -``` - -Now we can create some students and store them in Redis as JSON: - -```java -Student maya = new Student("Maya", "Jayavant"); -client.jsonSet("student:111", maya); - -Student oliwia = new Student("Oliwia", "Jagoda"); -client.jsonSet("student:112", oliwia); -``` - -Notice we pass the `Student` instances to the `jsonSet()` method. Jedis then serializes the objects and stores them in Redis at the specified keys (in this case, "student:111" and "student:112"). - -## Querying and indexing JSON - -If we want to be able to query this JSON, we'll need to create an index. Let's -create an index on the "firstName" and "lastName" fields. To do this: - -1. We define which fields to index ("firstName" and "lastName"). -2. We set up the index definition to recognize JSON and include only those - documents - whose key starts with "student:". -3. Then we actually create the index, called "student-index", by calling `ftCreate ()`. - -```java -Schema schema = new Schema().addTextField("$.firstName", 1.0).addTextField("$" + - ".lastName", 1.0); -IndexDefinition rule = new IndexDefinition(IndexDefinition.Type.JSON) - .setPrefixes(new String[]{"student:"}); -client.ftCreate("student-index", - IndexOptions.defaultOptions().setDefinition(rule), - schema); -``` - -With an index now defined, we can query our JSON. Let's find all students whose -name begins with "maya": - -```java -Query q = new Query("@\\$\\" + ".firstName:maya*"); -SearchResult mayaSearch = client.ftSearch("student-index", q); -``` - -We can then iterate over our search results: - -```java -List docs = mayaSearch.getDocuments(); -for (Document doc : docs) { - System.out.println(doc); -} -``` - -This example just scratches the surface. You can atomically manipulate JSON documents and query them in a variety of ways. See the [RedisJSON docs](https://oss.redis.com/redisjson/), the [RediSearch](https://oss.redis.com/redisearch/) docs, and our course, ["Querying, Indexing, and Full-text Search in Redis"](https://university.redis.com/courses/ru203/), for a lot more examples. - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [RedisJSON and Python](/howtos/redisjson/using-python) -- [How to store and retrieve nested JSON document](/howtos/redisjson/storing-complex-json-document) -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. 
-- [How to build shopping cart app using NodeJS and RedisJSON](/howtos/shoppingcart) -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) diff --git a/docs/howtos/redisjson/using-nodejs/configure_insight.png b/docs/howtos/redisjson/using-nodejs/configure_insight.png deleted file mode 100644 index b07bea6954a..00000000000 Binary files a/docs/howtos/redisjson/using-nodejs/configure_insight.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-nodejs/images/application_architecture.png b/docs/howtos/redisjson/using-nodejs/images/application_architecture.png deleted file mode 100644 index a55d4120ccf..00000000000 Binary files a/docs/howtos/redisjson/using-nodejs/images/application_architecture.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-nodejs/images/configure_insight.png b/docs/howtos/redisjson/using-nodejs/images/configure_insight.png deleted file mode 100644 index b07bea6954a..00000000000 Binary files a/docs/howtos/redisjson/using-nodejs/images/configure_insight.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-nodejs/images/locationdetails_redis_json.png b/docs/howtos/redisjson/using-nodejs/images/locationdetails_redis_json.png deleted file mode 100644 index 566027ca8ca..00000000000 Binary files a/docs/howtos/redisjson/using-nodejs/images/locationdetails_redis_json.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-nodejs/index-usingnodejs.mdx b/docs/howtos/redisjson/using-nodejs/index-usingnodejs.mdx deleted file mode 100644 index a241965180e..00000000000 --- a/docs/howtos/redisjson/using-nodejs/index-usingnodejs.mdx +++ /dev/null @@ -1,129 +0,0 @@ ---- -id: index-usingnodejs -title: How to cache JSON data in Redis with Node.js -sidebar_label: RedisJSON and Node.js -slug: /howtos/redisjson/using-nodejs -authors: [ajeet, simon] ---- - -Node.js has become incredibly popular for both web and mobile application development. Node.js can be installed on MacOS, Linux and Windows systems. The Node Package Manager (npm) enables developers to install packages which are tried and tested libraries that help you to build applications quickly. - -Node.js is a fast runtime, but adding the power, speed and flexibility of Redis can take it to the next level. Redis is best suited to situations that require data to be retrieved and delivered to the client as quickly as possible. - -[RedisJSON](https://redisjson.io) is an add-on module that adds JSON as a native data type to Redis. It enables atomic, in place operations to be performed on JSON documents stored in Redis. - -We'll use the [node-redis](https://npmjs.com/package/redis) client to connect to Redis and leverage the power of RedisJSON. - -### Step 1. Run the redismod Docker Container - -This simple container image bundles together the latest stable releases of Redis and select Redis modules from Redis, Inc. - -```bash -$ docker run -d -p 6379:6379 redislabs/redismod:latest -``` - -### Step 2. Install Node.js - -Download and install the current LTS (Long Term Support) version of Node.js from the [nodejs.org](https://nodejs.org/) website. - -### Step 3. Initialize an npm Project - -Run `npm init` to initialize a new project. Use the default answers to all the questions: - -``` -$ mkdir jsondemo -$ cd jsondemo -$ npm init -``` - -Now edit `package.json` and add the line `"type": "module"`. 
The file should look something like this: - -```json -{ - "name": "jsondemo", - "type": "module", - "version": "1.0.0", - "description": "", - "main": "index.js", - "scripts": { - "test": "echo \"Error: no test specified\" && exit 1" - }, - "author": "", - "license": "ISC" -} -``` - -### Step 4. Install node-redis - -[node-redis](https://npmjs.com/package/redis) is a high performance Node.js Redis client with support for the RedisJSON module. Install it using `npm`: - -```bash -$ npm install redis -``` - -### Step 5. Create a JavaScript File - -Copy the code below into a file called `app.js`: - -```javascript -import { createClient } from 'redis'; - -async function redisJSONDemo() { - try { - const TEST_KEY = 'test_node'; - - const client = createClient(); - await client.connect(); - - // RedisJSON uses JSON Path syntax. '.' is the root. - await client.json.set(TEST_KEY, '.', { node: 4303 }); - const value = await client.json.get(TEST_KEY, { - // JSON Path: .node = the element called 'node' at root level. - path: '.node', - }); - - console.log(`value of node: ${value}`); - - await client.quit(); - } catch (e) { - console.error(e); - } -} - -redisJSONDemo(); -``` - -### Step 6. Run the Application - -Start the application as follows: - -```bash -$ node app.js -``` - -You should see this output: - -```bash -value of node: 4303 -``` - -Using the Redis [`MONITOR`](https://redis.io/commands/monitor) command, you can see the Redis commands that node-redis sent to the Redis server while running the application: - -``` -$ redis-cli -127.0.0.1:6379> monitor -OK - -1637866932.281949 [0 127.0.0.1:61925] "JSON.SET" "test_node" "." "{\"node\":4303}" -1637866932.282842 [0 127.0.0.1:61925] "JSON.GET" "test_node" ".node" -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [RedisJSON and Python](/howtos/redisjson/using-python) -- [How to store and retrieve nested JSON documents](/howtos/redisjson/storing-complex-json-document) -- [Importing JSON data into Redis using Node.js](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://redisjson.io/) in the Quickstart tutorial. 
-- [How to build a shopping cart app using Node.js and RedisJSON](/howtos/shoppingcart) -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) diff --git a/docs/howtos/redisjson/using-python/images/add_database.png b/docs/howtos/redisjson/using-python/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redisjson/using-python/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-python/images/database_creds.png b/docs/howtos/redisjson/using-python/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisjson/using-python/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-python/images/database_details.png b/docs/howtos/redisjson/using-python/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisjson/using-python/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-python/images/details_database.png b/docs/howtos/redisjson/using-python/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redisjson/using-python/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-python/images/select_cloud_vendor.png b/docs/howtos/redisjson/using-python/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisjson/using-python/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-python/index-usingpython.mdx b/docs/howtos/redisjson/using-python/index-usingpython.mdx deleted file mode 100644 index 61fe5655c61..00000000000 --- a/docs/howtos/redisjson/using-python/index-usingpython.mdx +++ /dev/null @@ -1,156 +0,0 @@ ---- -id: index-usingpython -title: How to store JSON documents in Redis with Python -sidebar_label: RedisJSON and Python -slug: /howtos/redisjson/using-python ---- - -[RedisJSON](https://oss.redis.com/redisjson/) is a source-available Redis module that lets you store, manipulate, and query JSON documents in Redis. The standard Redis Python client (v4.0 or greater) supports all of the features of RedisJSON, and in this tutorial, we'll see how to get started with them. - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. 
Using RedisInsight

RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis-based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases.

[Follow this link](/explore/redisinsightv2/getting-started) to install RedisInsight v2 on your local system.
Assuming that you already have RedisInsight v2 installed on macOS, you can browse through the Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool.

### Step 5. Add Redis database

![access redisinsight](images/add_database.png)

### Step 6. Enter Redis Enterprise Cloud details

Add the Redis Enterprise cloud database endpoint, port and password.

![access redisinsight](images/database_creds.png)

### Step 7. Verify the database under RedisInsight dashboard

![database details](images/database_details.png)

## Storing JSON in Redis

Let's consider a simple JSON document structure representing a user:

```
 {
   "name": "Jane",
   "age": 33,
   "location": "Chawton"
 }
```

## Installing the Redis Python client

```
 $ pip3 install redis
Collecting redis
  Downloading redis-4.2.0-py3-none-any.whl (225 kB)
Collecting async-timeout>=4.0.2
  Downloading async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting typing-extensions
  Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
..
 Requirement already satisfied: packaging>=20.4 in /usr/lib/python3.8/site-packages (from redis) (20.4)
Collecting wrapt<2,>=1.10
Installing collected packages: async-timeout, typing-extensions, wrapt, deprecated, redis
  Running setup.py install for wrapt ... done
  Successfully installed async-timeout-4.0.2 deprecated-1.2.13 redis-4.2.0 typing-extensions-4.1.1 wrapt-1.14.0
```

Here's the Python code to store this document in Redis using RedisJSON:

```python
import redis
from redis.commands.json.path import Path

client = redis.Redis(host='localhost', port=6379, db=0)

jane = {
    'name': "Jane",
    'Age': 33,
    'Location': "Chawton"
}

client.json().set('person:1', Path.root_path(), jane)

result = client.json().get('person:1')
print(result)
```

In the code above, we first connect to Redis and store a reference to the connection in the `client` variable.

Next, we create a Python dictionary to represent a person object.

And finally, we store the object in Redis using the `json().set()` method. The first argument, `person:1`, is the name of the key that will reference the JSON. The second argument is a JSON path. We use `Path.root_path()`, as this is a new object. Finally, we pass in the Python dictionary, which will be serialized to JSON.

To retrieve the JSON object, we run `json().get()`, passing in the key. The result is a Python dictionary representing the JSON object stored in Redis.

### Run the code

If you copy the code above into a file called `main.py`, you can run the code like so:

```bash
$ python3 main.py
```

## Verify that the JSON document has been added to Redis

Start `redis-cli` to connect to your Redis instance.
Then run the following command: - -```bash -localhost:6379> json.get person:1 -"{\"name\":\"Jane\",\"Age\":33,\"Location\":\"Chawton\"}" -``` - -## Fetching specific fields from a JSON document - -You can use RedisJSON to fetch specific fields from a document by specifying a path. For example, here's how to return only the `name` field: - -```python -name = client.json().get('person:1', Path('.name')) -print(name) -``` - -This code will print the string "Jane". - -You can execute the same query from the command line: - -```bash -localhost:6379> json.get person:1 '.name' -"\"Jane\"" -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://redisjson.io) in the Quickstart tutorial diff --git a/docs/howtos/redisjson/using-redisinsight/database_details.png b/docs/howtos/redisjson/using-redisinsight/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/database_details.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image b/docs/howtos/redisjson/using-redisinsight/image deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_1.png b/docs/howtos/redisjson/using-redisinsight/image_1_1.png deleted file mode 100644 index 246473443ca..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_1.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_2.png b/docs/howtos/redisjson/using-redisinsight/image_1_2.png deleted file mode 100644 index ca68f427d8b..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_2.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_3.png b/docs/howtos/redisjson/using-redisinsight/image_1_3.png deleted file mode 100644 index 07fec5e2bae..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_3.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_4.png b/docs/howtos/redisjson/using-redisinsight/image_1_4.png deleted file mode 100644 index 2e51dac29b3..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_4.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_5.png b/docs/howtos/redisjson/using-redisinsight/image_1_5.png deleted file mode 100644 index 378b2b7fb00..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_5.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/image_1_6.png b/docs/howtos/redisjson/using-redisinsight/image_1_6.png deleted file mode 100644 index bf462bb6450..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/image_1_6.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/add_database.png b/docs/howtos/redisjson/using-redisinsight/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/database_creds.png b/docs/howtos/redisjson/using-redisinsight/images/database_creds.png deleted file mode 
100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/database_details.png b/docs/howtos/redisjson/using-redisinsight/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/details_database.png b/docs/howtos/redisjson/using-redisinsight/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/json_0.png b/docs/howtos/redisjson/using-redisinsight/images/json_0.png deleted file mode 100644 index eccb549e7fd..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/json_0.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/json_1.png b/docs/howtos/redisjson/using-redisinsight/images/json_1.png deleted file mode 100644 index d4e926ae3aa..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/json_1.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/json_2.png b/docs/howtos/redisjson/using-redisinsight/images/json_2.png deleted file mode 100644 index e957377b83f..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/json_2.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/json_3.png b/docs/howtos/redisjson/using-redisinsight/images/json_3.png deleted file mode 100644 index 45fd7b491a9..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/json_3.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/images/select_cloud_vendor.png b/docs/howtos/redisjson/using-redisinsight/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/index-usingredisinsight.mdx b/docs/howtos/redisjson/using-redisinsight/index-usingredisinsight.mdx deleted file mode 100644 index f457c5a63ec..00000000000 --- a/docs/howtos/redisjson/using-redisinsight/index-usingredisinsight.mdx +++ /dev/null @@ -1,92 +0,0 @@ ---- -id: index-usingredisinsight -title: How to visualize JSON data using RedisInsight -sidebar_label: RedisJSON using RedisInsight -slug: /howtos/redisjson/using-redisinsight -authors: [ajeet] ---- - -RedisInsight provides built-in support for the RedisJSON, RediSearch, RedisGraph, Redis Streams, and RedisTimeSeries modules to make it even easier to query, visualize, and interactively manipulate search indexes, graphs, streams, and time-series data. Support for RedisJSON on Redis Cluster was introduced for the first time in RedisInsight v1.8.0. With RedisInsight, you can visualize and edit your JSON data flawlessly. - -Below steps shows how to get started with RedisJSON using RedisInsight: - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. 
- -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Using RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -[Follow this link](/explore/redisinsightv2/getting-started) to install RedisInsight v2 on your local system. -Assuming that you already have RedisInsight v2 installed on your MacOS, you can browse through the Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Execute JSON queries - -```bash - JSON.SET employee_profile . '{ "employee": { "name": "carol", "age": 40, "married": true } }' -``` - -![My Image](images/json_0.png) - -### Step 8. Accessing RedisInsight Browser Tool - -Select "employee_profile" to display the JSON data - -![My Image](images/json_1.png) - -### Step 9. Add a new key - -![My Image](images/json_2.png) - -### Step 10. Expand the JSON field - -![My Image](images/json_3.png) - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [RedisJSON and Python](/howtos/redisjson/using-python) -- [How to store and retrieve nested JSON document](/howtos/redisjson/storing-complex-json-document) -- [Importing JSON data into Redis using NodeJS](/howtos/redisjson/using-nodejs) -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. 
-- [How to build shopping cart app using NodeJS and RedisJSON](/howtos/shoppingcart) -- [Indexing, Querying, and Full-Text Search of JSON Documents with Redis](https://redis.com/blog/index-and-query-json-docs-with-redis/) diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsight2.png b/docs/howtos/redisjson/using-redisinsight/redisinsight2.png deleted file mode 100644 index 5c6ef2fef14..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsight2.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsight3.png b/docs/howtos/redisjson/using-redisinsight/redisinsight3.png deleted file mode 100644 index 33c9f8d950b..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsight3.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsight4.png b/docs/howtos/redisjson/using-redisinsight/redisinsight4.png deleted file mode 100644 index f00d956f37a..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsight4.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsight5.png b/docs/howtos/redisjson/using-redisinsight/redisinsight5.png deleted file mode 100644 index bb716253bed..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsight5.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsightinstall.png b/docs/howtos/redisjson/using-redisinsight/redisinsightinstall.png deleted file mode 100644 index 99f2c696ea5..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsightinstall.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisinsightmac.png b/docs/howtos/redisjson/using-redisinsight/redisinsightmac.png deleted file mode 100644 index 5cca4c45a0a..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisinsightmac.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisjson1.png b/docs/howtos/redisjson/using-redisinsight/redisjson1.png deleted file mode 100644 index 6ef5bc0a9af..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisjson1.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-redisinsight/redisjson3.png b/docs/howtos/redisjson/using-redisinsight/redisjson3.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redisjson/using-redisinsight/redisjson3.png and /dev/null differ diff --git a/docs/howtos/redisjson/using-ruby/index-usingruby.mdx b/docs/howtos/redisjson/using-ruby/index-usingruby.mdx deleted file mode 100644 index 1fa1628fe7d..00000000000 --- a/docs/howtos/redisjson/using-ruby/index-usingruby.mdx +++ /dev/null @@ -1,81 +0,0 @@ ---- -id: index-usingruby -title: How to cache JSON data in Redis with Ruby -sidebar_label: RedisJSON and Ruby -slug: /howtos/redisjson/using-ruby -authors: [ajeet] ---- - -rejson-rb is a package that allows storing, updating and querying objects as JSON documents in a Redis database that is extended with the RedisJSON module. -The package extends redis-rb's interface with RedisJSON's API, and performs on-the-fly serialization/deserialization of objects to/from JSON. - -### Step 1. Run RedisJSON docker container - -```bash - docker run -d -p 6379:6379 redislabs/redismod:latest -``` - -### Step 2. Install Ruby - -```bash - brew install ruby -``` - -### Step 3. Install RedisJSON Gem - -```bash - gem install rejson-rb -``` - -### Step 4. 
Create a ruby file - -Copy the below content and paste it in a file called 'employee.rb'. - -```ruby - require 'rejson' - - rcl = Redis.new # Get a redis client - - # Get/Set/Delete keys - obj = { - 'id': "42", - 'name': "Paul John", - 'email': "paul.john@gmail.com", - 'address': { - 'city': 'London' - } - } - - rcl.json_set("employee", Rejson::Path.root_path, obj) - - rcl.json_set("employee", Rejson::Path.new(".id"), 43) - - rcl.json_get "employee", Rejson::Path.root_path - - rcl.json_del "employee", ".address.city" -``` - -The above script uses RedisJSON commands to set the objects, alter the id to 43 and then perform the delete operation using 'json_del' - -### Step 5. Execute the script - -```bash - ruby employee.rb -``` - -You can verify what's happening in the background by running the monitor command in Redis CLI shell: - -```bash - 127.0.0.1:6379> monitor - OK - 1627619198.040000 [0 172.17.0.1:57550] "JSON.SET" "employee" "." "{\"id\":\"42\",\"name\":\"Paul John\",\"email\":\"paul.john@gmail.com\",\"address\":{\"city\":\"London\"}}" - 1627619198.040876 [0 172.17.0.1:57550] "JSON.SET" "employee" ".id" "43" -1627619198.042132 [0 172.17.0.1:57550] "JSON.GET" "employee" "." -1627619198.042741 [0 172.17.0.1:57550] "JSON.DEL" "employee" ".address.city" -``` - -### References - -- [RU204: Storing, Querying and Indexing JSON at Speed](https://university.redis.com/courses/ru204/) - a course at Redis University -- [Rate Limiting app in Ruby and Redis](/howtos/ratelimiting/) -- [Ruby and Redis](/develop/ruby/) diff --git a/docs/howtos/redistimeseries/getting-started/README.md b/docs/howtos/redistimeseries/getting-started/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/howtos/redistimeseries/getting-started/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/howtos/redistimeseries/getting-started/Verify_subscription.png b/docs/howtos/redistimeseries/getting-started/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/activate.png b/docs/howtos/redistimeseries/getting-started/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/activate.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/add_database.png b/docs/howtos/redistimeseries/getting-started/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/add_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/aws.png b/docs/howtos/redistimeseries/getting-started/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/aws.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/choosemodule.png b/docs/howtos/redistimeseries/getting-started/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/choosemodule.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/create_database.png b/docs/howtos/redistimeseries/getting-started/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/create_database.png and /dev/null differ diff --git 
a/docs/howtos/redistimeseries/getting-started/create_subscription.png b/docs/howtos/redistimeseries/getting-started/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/createdatabase.png b/docs/howtos/redistimeseries/getting-started/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/createdatabase.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/database_creds.png b/docs/howtos/redistimeseries/getting-started/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/database_creds.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/database_details.png b/docs/howtos/redistimeseries/getting-started/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/database_details.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/deployment.png b/docs/howtos/redistimeseries/getting-started/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/deployment.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/details_database.png b/docs/howtos/redistimeseries/getting-started/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/details_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/final_subscription.png b/docs/howtos/redistimeseries/getting-started/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/README.md b/docs/howtos/redistimeseries/getting-started/images/README.md deleted file mode 100644 index 4148b2f11ab..00000000000 --- a/docs/howtos/redistimeseries/getting-started/images/README.md +++ /dev/null @@ -1 +0,0 @@ -# images diff --git a/docs/howtos/redistimeseries/getting-started/images/Verify_subscription.png b/docs/howtos/redistimeseries/getting-started/images/Verify_subscription.png deleted file mode 100644 index e5911628f69..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/Verify_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/activate.png b/docs/howtos/redistimeseries/getting-started/images/activate.png deleted file mode 100644 index b871b07fd4e..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/activate.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/add_database.png b/docs/howtos/redistimeseries/getting-started/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/aws.png b/docs/howtos/redistimeseries/getting-started/images/aws.png deleted file mode 100644 index 5d49974a6a4..00000000000 Binary files 
a/docs/howtos/redistimeseries/getting-started/images/aws.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/choosemodule.png b/docs/howtos/redistimeseries/getting-started/images/choosemodule.png deleted file mode 100644 index ba5165de56c..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/choosemodule.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/create_database.png b/docs/howtos/redistimeseries/getting-started/images/create_database.png deleted file mode 100644 index 6f68abd90d7..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/create_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/create_subscription.png b/docs/howtos/redistimeseries/getting-started/images/create_subscription.png deleted file mode 100644 index 347fdd15353..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/create_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/createdatabase.png b/docs/howtos/redistimeseries/getting-started/images/createdatabase.png deleted file mode 100644 index a4415e902e9..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/createdatabase.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/database_creds.png b/docs/howtos/redistimeseries/getting-started/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/database_details.png b/docs/howtos/redistimeseries/getting-started/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/deployment.png b/docs/howtos/redistimeseries/getting-started/images/deployment.png deleted file mode 100644 index adb4c49d3d9..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/deployment.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/details_database.png b/docs/howtos/redistimeseries/getting-started/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/final_subscription.png b/docs/howtos/redistimeseries/getting-started/images/final_subscription.png deleted file mode 100644 index 333ce58c396..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/final_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/launch_database.png b/docs/howtos/redistimeseries/getting-started/images/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/launch_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/plan.png b/docs/howtos/redistimeseries/getting-started/images/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/plan.png and 
/dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/recloud1.png b/docs/howtos/redistimeseries/getting-started/images/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/recloud1.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/recloud2.png b/docs/howtos/redistimeseries/getting-started/images/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/recloud2.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/recloud3.png b/docs/howtos/redistimeseries/getting-started/images/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/recloud3.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/rediscloud_redistimeseries.png b/docs/howtos/redistimeseries/getting-started/images/rediscloud_redistimeseries.png deleted file mode 100644 index c3ae0c6063e..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/rediscloud_redistimeseries.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/region.png b/docs/howtos/redistimeseries/getting-started/images/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/region.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/select_cloud.png b/docs/howtos/redistimeseries/getting-started/images/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/select_cloud_vendor.png b/docs/howtos/redistimeseries/getting-started/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/select_subscription.png b/docs/howtos/redistimeseries/getting-started/images/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/subscription.png b/docs/howtos/redistimeseries/getting-started/images/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/images/try-free.png b/docs/howtos/redistimeseries/getting-started/images/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/images/try-free.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/index-redistimeseries.mdx b/docs/howtos/redistimeseries/getting-started/index-redistimeseries.mdx deleted file mode 100644 index 883a623fa3f..00000000000 --- a/docs/howtos/redistimeseries/getting-started/index-redistimeseries.mdx +++ /dev/null @@ -1,227 +0,0 @@ ---- -id: index-gettingstarted -title: Storing and Querying Time Series data using Redis 
Stack -sidebar_label: Storing and Querying Time Series data -slug: /howtos/redistimeseries/getting-started -authors: [ajeet] ---- - -[RedisTimeseries](https://redis.com/modules/redis-timeseries/) is a Redis module developed by Redis Inc. to enhance your experience managing time-series data with Redis. It simplifies the use of Redis for time-series use cases such as internet of things (IoT) data, stock prices, and telemetry. With RedisTimeSeries, you can ingest and query millions of samples and events at the speed of Redis. Advanced tooling such as downsampling and aggregation ensure a small memory footprint without impacting performance. Use a variety of queries for visualization and monitoring with built-in connectors to popular monitoring tools like Grafana, Prometheus, and Telegraf. - -
- -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of Redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4. Install RedisInsight - -RedisInsight is a visual tool that lets you do both GUI- and CLI-based interactions with your Redis database, and so much more when developing your Redis based application. It is a fully-featured pure Desktop GUI client that provides capabilities to design, develop and optimize your Redis application. It works with any cloud provider as long as you run it on a host with network access to your cloud-based Redis server. It makes it easy to discover cloud databases and configure connection details with a single click. It allows you to automatically add Redis Enterprise Software and Redis Enterprise Cloud databases. - -You can install Redis Stack on your local system to get RedisInsight GUI tool up and running. Ensure that you have the `brew` package installed in your Mac system. - -```bash - brew tap redis-stack/redis-stack - brew install --cask redis-stack -``` - -``` - ==> Installing Cask redis-stack-redisinsight - ==> Moving App 'RedisInsight-preview.app' to '/Applications/RedisInsight-preview.app' - 🍺 redis-stack-redisinsight was successfully installed! - ==> Installing Cask redis-stack - 🍺 redis-stack was successfully installed! -``` - -Go to Applications and click "RedisInsight-v2" to bring up the Redis Desktop GUI tool. - -### Step 5. Add Redis database - -![access redisinsight](images/add_database.png) - -### Step 6. Enter Redis Enterprise Cloud details - -Add the Redis Enterprise cloud database endpoint, port and password. - -![access redisinsight](images/database_creds.png) - -### Step 7. Verify the database under RedisInsight dashboard - -![database details](images/database_details.png) - -### Step 8. Getting Started with RedisTimeSeries - -This section will walk you through using some basic RedisTimeseries commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. (See part 2 of this tutorial to learn more about using the RedisInsight CLI.) -Using a basic air-quality dataset, we will show you how to: - -- Create a new time series -- Add a new sample to the list of series -- Query a range across one or multiple time series - -![RedisTimeSeries](redistimeseriesflow.png) - -#### Create a new time series - -Let’s create a time series representing air quality dataset measurements. To interact with RedisTimeSeries you will most often use the TS.RANGE command, but here you will create a time series per measurement using the TS.CREATE command. 
Once created, all the measurements will be sent using TS.ADD. - -The sample command below creates a time series and populates it with three entries: - -``` ->> TS.CREATE ts:carbon_monoxide ->> TS.CREATE ts:relative_humidity ->> TS.CREATE ts:temperature RETENTION 60 LABELS sensor_id 2 area_id 32 -``` - -In the above example, ts:carbon_monoxide, ts:relative_humidity and ts:temperature are key names. We are creating a time series with two labels (sensor_id and area_id are the fields with values 2 and 32 respectively) and a retention window of 60 milliseconds: - -#### Add a new sample data to the time series - -Let’s start to add samples into the keys that will be automatically created using this command: - -``` ->> TS.ADD ts:carbon_monoxide 1112596200 2.4 ->> TS.ADD ts:relative_humidity 1112596200 18.3 ->> TS.ADD ts:temperature 1112599800 28.3 -``` - -``` ->> TS.ADD ts:carbon_monoxide 1112599800 2.1 ->> TS.ADD ts:relative_humidity 1112599800 13.5 ->> TS.ADD ts:temperature 1112603400 28.5 -``` - -``` ->> TS.ADD ts:carbon_monoxide 1112603400 2.2 ->> TS.ADD ts:relative_humidity 1112603400 13.1 ->> TS.ADD ts:temperature 1112607000 28.7 -``` - -#### Querying the sample - -Now that you have sample data in your time series, you’re ready to ask questions such as: - -#### “How do I get the last sample?” - -TS.GET is used to get the last sample. The returned array will contain the last sample timestamp followed by the last sample value, when the time series contains data: - -``` ->> TS.GET ts:temperature -1) (integer) 1112607000 -2) "28.7" -``` - -#### “How do I get the last sample matching the specific filter?” - -TS.MGET is used to get the last samples matching the specific filter: - -``` ->> TS.MGET FILTER area_id=32 -1) 1) "ts:temperature" - 2) (empty list or set) - 3) 1) (integer) 1112607000 - 2) "28.7" -``` - -#### “How do I get the sample with labels matching the specific filter?” - -``` ->> TS.MGET WITHLABELS FILTER area_id=32 -1) 1) "ts:temperature" - 2) 1) 1) "sensor_id" - 2) "2" - 2) 1) "area_id" - 2) "32" - 3) 1) (integer) 1112607000 - 2) "28.7" -``` - -#### Query a range across one or more time series - -TS.RANGE is used to query a range in forward directions while TS.REVRANGE is used to query a range in reverse directions, They let you answer such questions as: - -#### “How do I get the sample for a time range?” - -``` ->> TS.RANGE ts:carbon_monoxide 1112596200 1112603400 -1) 1) (integer) 1112596200 - 2) "2.4" -2) 1) (integer) 1112599800 - 2) "2.1" -3) 1) (integer) 1112603400 - 2) "2.2" -``` - -#### Aggregation - -You can use various aggregation types such as avg, sum, min, max, range, count, first, last etc. The example below example shows how to use “avg” aggregation type to answer such questions as: - -#### “How do I get the sample for a time range on some aggregation rule?” - -``` ->> TS.RANGE ts:carbon_monoxide 1112596200 1112603400 AGGREGATION avg 2 -1) 1) (integer) 1112596200 - 2) "2.4" -2) 1) (integer) 1112599800 - 2) "2.1" -3) 1) (integer) 1112603400 - 2) "2.2" -``` - -### Next Steps - -- Learn more about RedisTimeSeries in the [Quickstart](https://oss.redis.com/redistimeseries/) tutorial. 
-- [Build Your Financial Application on RedisTimeSeries](https://redis.com/blog/build-your-financial-application-on-redistimeseries/) -- [How to Manage Real-Time IoT Sensor Data in Redis](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/) -- [Introduction to RedisTimeSeries - Video](https://www.youtube.com/watch?v=rXynFOrrd-Q) - -## - - diff --git a/docs/howtos/redistimeseries/getting-started/launch_database.png b/docs/howtos/redistimeseries/getting-started/launch_database.png deleted file mode 100644 index 861f20f9dec..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/launch_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/plan.png b/docs/howtos/redistimeseries/getting-started/plan.png deleted file mode 100644 index d481c7540a6..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/plan.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/recloud1.png b/docs/howtos/redistimeseries/getting-started/recloud1.png deleted file mode 100644 index a73de599091..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/recloud1.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/recloud2.png b/docs/howtos/redistimeseries/getting-started/recloud2.png deleted file mode 100644 index 5cb98fc25f1..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/recloud2.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/recloud3.png b/docs/howtos/redistimeseries/getting-started/recloud3.png deleted file mode 100644 index a390a684cc8..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/recloud3.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/redistimeseries.png b/docs/howtos/redistimeseries/getting-started/redistimeseries.png deleted file mode 100644 index eded167bb6c..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/redistimeseries.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/redistimeseries1.png b/docs/howtos/redistimeseries/getting-started/redistimeseries1.png deleted file mode 100644 index 67a951bf8eb..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/redistimeseries1.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/redistimeseriesflow.png b/docs/howtos/redistimeseries/getting-started/redistimeseriesflow.png deleted file mode 100644 index 47022ba7fd5..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/redistimeseriesflow.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/region.png b/docs/howtos/redistimeseries/getting-started/region.png deleted file mode 100644 index ddbe75e4287..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/region.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/select_cloud.png b/docs/howtos/redistimeseries/getting-started/select_cloud.png deleted file mode 100644 index 2784e455de7..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/select_cloud.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/select_cloud_vendor.png b/docs/howtos/redistimeseries/getting-started/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/select_cloud_vendor.png and /dev/null differ diff 
--git a/docs/howtos/redistimeseries/getting-started/select_subscription.png b/docs/howtos/redistimeseries/getting-started/select_subscription.png deleted file mode 100644 index 531615615e6..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/select_subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/subscription.png b/docs/howtos/redistimeseries/getting-started/subscription.png deleted file mode 100644 index b4a61342f3e..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/subscription.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/try-free.png b/docs/howtos/redistimeseries/getting-started/try-free.png deleted file mode 100644 index 11915ea5927..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/try-free.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/getting-started/tryfree.png b/docs/howtos/redistimeseries/getting-started/tryfree.png deleted file mode 100644 index bbd57089df9..00000000000 Binary files a/docs/howtos/redistimeseries/getting-started/tryfree.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/index-redistimeseries.mdx b/docs/howtos/redistimeseries/index-redistimeseries.mdx deleted file mode 100644 index e170c9a5e78..00000000000 --- a/docs/howtos/redistimeseries/index-redistimeseries.mdx +++ /dev/null @@ -1,48 +0,0 @@ ---- -id: index-redistimeseries -title: RedisTimeSeries Tutorial -sidebar_label: Overview -slug: /howtos/redistimeseries/ ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links provides you with the available options to get started with RedisTimeSeries - -
diff --git a/docs/howtos/redistimeseries/using-dotnet/index.md b/docs/howtos/redistimeseries/using-dotnet/index.md deleted file mode 100644 index 86ecd1957c1..00000000000 --- a/docs/howtos/redistimeseries/using-dotnet/index.md +++ /dev/null @@ -1,136 +0,0 @@ ---- -id: index-usingdotnet -title: Processing Time Series data with Redis and .NET -sidebar_label: Using RedisTimeSeries with .NET -slug: /howtos/redistimeseries/using-dotnet -authors: [steve] ---- - -Time Series data can be used to measure anything from remote sensor readings to stock market feeds. Working with time series data in .NET is a snap with Redis and [NRedisTimeSeries](https://github.com/RedisTimeSeries/NRedisTimeSeries). In this tutorial, we'll explore how to use them together. - -## Create your Project - -Start out by creating a project with the command: - -```bash -dotnet new console -n TimeSeriesDemoApp -``` - -Next, inside the `TimeSeriesDemoApp` directory, run the command: - -```bash -dotnet add package NRedisTimeSeries -``` - -## Get a RedisTimeSeries Database - -The next step is to get a RedisTimeSeries database up and running. The easiest way to do that for development purposes is to use Docker: - -``` -docker run -p 6379:63379 redis/redis-stack-server:latest -``` - -If you are well past getting started and want to get something into your production, your best bet is to run it in [Redis Enterprise](/howtos/redistimeseries/getting-started). - -## Connecting to Redis - -Open the `Program.cs` file, in here, create a new ConnectionMultiplexer using a connection string (which will vary based on what deployment you're using). Then, for our basic Docker setup, you'll just run: - -```csharp -var muxer = ConnectionMultiplexer.Connect("localhost"); -var db = muxer.GetDatabase(); -``` - -## Create a Time Series - -Now that you've gotten a handle to Redis, your next step is to initialize a time series. This will be a bit of a toy example. We are going to start off by just creating a time series called `sensor`, we will set its retention period to 1 minute, and we just give it an `id` label of `sensor-1`: - -```csharp -await db.TimeSeriesCreateAsync("sensor", 60000, new List{new TimeSeriesLabel("id", "sensor-1")}); -``` - -## Producer Task - -Next, we'll create a task that will run a consumer in the background. Every second it will send a random integer between 1 and 50 into our time series. - -```csharp -var producerTask = Task.Run(async()=>{ - while(true) - { - await db.TimeSeriesAddAsync("sensor", "*", Random.Shared.Next(50)); - await Task.Delay(1000); - } -}); -``` - -## Consumer Task - -With the Producer created, we'll create a consumer loop that will do the opposite. Every second it will pull the most recent item in the time series off and print it out. - -```csharp -var consumerTask = Task.Run(async()=>{ - while(true) - { - await Task.Delay(1000); - var result = await db.TimeSeriesGetAsync("sensor"); - Console.WriteLine($"{result.Time.Value}: {result.Val}"); - } -}); - -await Task.WhenAll(producerTask, consumerTask); -``` - -## Run the App - -Now that we produce and consume data run the app with `dotnet run`. This will run a continuous loop in the time series as it continually produces and consumes data points. - -## Run Aggregations in the Time Series - -Now what we've done so far is produce a time series of random integer data for our .NET app to consume. What if we wanted to do something a bit more interesting with it, though? Let's say we wanted to calculate a moving average every 5 seconds. 
We can do that with ease using Redis TimeSeries. - -### Create Rules to Store Aggregations - -Let's run min, max, and average every 5 seconds on our Time Series. Redis will do this passively in the background after we set up some keys to store them in and set up the rules. - -```csharp -var aggregations = new TsAggregation[]{TsAggregation.Avg, TsAggregation.Min, TsAggregation.Max}; -foreach(var agg in aggregations) -{ - await db.TimeSeriesCreateAsync($"sensor:{agg}", 60000, new List{new ("type", agg.ToString()), new("aggregation-for", "sensor-1")}); - await(db.TimeSeriesCreateRuleAsync("sensor", new TimeSeriesRule($"sensor:{agg}", 5000, agg))); -} -``` - -### Process Results from Aggregations - -With the rules established, we can consume the relevant time series to get the results. When we were creating the time series for our aggregations, we added a label to all of them: `new TimeSeriesLabel("aggregation-for", "sensor-1")`. We essentially told Redis that this time series would be an aggregation for `sensor-1`. We can then use that label to find just the time series aggregations of `sensor-1`. With this in mind, we can grab all the sensor aggregations in one command to Redis using `MGET`. - -```csharp -var aggregationConsumerTask = Task.Run(async()=> -{ - while(true) - { - await Task.Delay(5000); - var results = await db.TimeSeriesMGetAsync(new List(){"aggregation-for=sensor-1"}, true); - foreach(var result in results) - { - Console.WriteLine($"{result.labels.First(x=>x.Key == "type").Value}: {result.value.Val}"); - } - - } -}); -``` - -With all these sets, you can now just update the `Task.WhenAll` call at the end to include the new consumer task: - -```csharp -await Task.WhenAll(producerTask, consumerTask, aggregationConsumerTask); -``` - -When we run the application with `dotnet run`, you will see that the application will also print out the average, min, and max for the last 5 seconds of the time series, in addition to the regular ticks of the time series. 
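If you want to double-check the compaction rules from outside the .NET app, any Redis client can read the aggregate series directly. The sketch below uses redis-py purely as an illustration (it is not part of the demo project); it assumes the app above is running against localhost:6379 and that the aggregate keys come out as `sensor:Avg`, `sensor:Min`, and `sensor:Max`, which depends on how the `TsAggregation` values stringify.

```python
import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Last sample from every series carrying the label set when the aggregate
# series were created above ("aggregation-for" = "sensor-1").
print(client.ts().mget(["aggregation-for=sensor-1"], with_labels=True))

# Full history of the 5-second averages (key name assumed to be sensor:Avg).
print(client.ts().range("sensor:Avg", "-", "+"))
```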
- -## Resources - -- The Source Code for this demo is located in [GitHub](https://github.com/redis-developer/redis-time-series-demo-dotnet) -- The source code for NRedisTimeSeries is also located in [GitHub](https://github.com/redistimeseries/nredistimeseries) -- More information about Redis Time Series can be found at [redistimeseries.io](https://redistimeseries.io/) diff --git a/docs/howtos/redistimeseries/using-go/images/image1.png b/docs/howtos/redistimeseries/using-go/images/image1.png deleted file mode 100644 index fbb68b3cd01..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image1.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image2.png b/docs/howtos/redistimeseries/using-go/images/image2.png deleted file mode 100644 index 56bf285bc8f..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image2.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image3.png b/docs/howtos/redistimeseries/using-go/images/image3.png deleted file mode 100644 index c8c080fb61c..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image3.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image4.png b/docs/howtos/redistimeseries/using-go/images/image4.png deleted file mode 100644 index f92a1e5a19f..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image4.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image5.png b/docs/howtos/redistimeseries/using-go/images/image5.png deleted file mode 100644 index 8b7d9c38e64..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image5.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image6.png b/docs/howtos/redistimeseries/using-go/images/image6.png deleted file mode 100644 index db28d40a312..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image6.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image7.png b/docs/howtos/redistimeseries/using-go/images/image7.png deleted file mode 100644 index 7ba7e0e6541..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image7.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image8.png b/docs/howtos/redistimeseries/using-go/images/image8.png deleted file mode 100644 index 549eb7c8a70..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image8.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/image9.png b/docs/howtos/redistimeseries/using-go/images/image9.png deleted file mode 100644 index 34823d6ccea..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/image9.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/images/redistimeseries-go.png b/docs/howtos/redistimeseries/using-go/images/redistimeseries-go.png deleted file mode 100644 index 6ff0a8c2c2a..00000000000 Binary files a/docs/howtos/redistimeseries/using-go/images/redistimeseries-go.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-go/index-usinggo.mdx b/docs/howtos/redistimeseries/using-go/index-usinggo.mdx deleted file mode 100644 index a95bf365cb5..00000000000 --- a/docs/howtos/redistimeseries/using-go/index-usinggo.mdx +++ /dev/null @@ -1,122 +0,0 @@ ---- -id: index-usinggo -title: How to collect and process time-series data using Redis and Go -sidebar_label: RedisTimeSeries and Go -slug: 
/howtos/redistimeseries/using-go -authors: [ajeet] ---- - -![My Image](images/redistimeseries-go.png) - -RedisTimeSeries is a Redis module that allows Redis to be used as a fast in-memory time series database designed to collect, manage, and deliver time series data at scale. The RedisTimeSeries module shares the performance and simplicity aspects of Redis. Under the hood, it uses efficient data structures such as Radix tree to index data by timestamp, which makes it extremely fast and efficient to run time-aggregate queries. - -## RedisTimeSeries Go Client - -redistimeseries-go is a package that gives developers easy access to the RedisTimeSeries module. Go client for RedisTimeSeries ([https://github.com/RedisTimeSeries/redistimeseries](https://github.com/RedisTimeSeries/redistimeseries)), based on redigo.Client and ConnPool based on the work of dvirsky and mnunberg on [https://github.com/RediSearch/redisearch-go](https://github.com/RediSearch/redisearch-go) - -Follow the steps below to get started with RedisTimeSeries with Go: - -### Step 1. Create free Redis Enterprise Cloud account - -Create your free [Redis Enterprise Cloud account](https://redis.com/try-free/). Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -![alt_text](images/image2.png) - -![alt_text](images/image3.png) - -### Step 2. Create Your subscription - -Next, you will have to create a Redis Enterprise Cloud subscription. In the Redis Enterprise Cloud menu, click "Create your Subscription". - -![alt_text](images/image4.png) - -### Step 3. Select the right Subscription Plan - -#### Select "Fixed Plan" for low throughout application as for now. - -![alt_text](images/image5.png) - -### Step 4. Select cloud vendor - -For the cloud provider, select your preferred cloud (for demo purpose) - -![alt_text](images/image6.png) - -### Step 5. Click "Create Subscription" - -Finally, click on the "Create Subscription" button. - -![alt_text](images/image7.png) - -You can now verify the subscription as shown below: - -![alt_text](images/image8.png) - -### Step 6. Create database - -Click "Create Database". Enter database name and your preferred module. - -![alt_text](images/image9.png) - -### Step 7.Installing RedisTimeSeries Go client - -``` -$ go get github.com/RedisTimeSeries/redistimeseries-go -``` - -### Step 8. Writing the Go program - -``` - -package main - -import ( - "fmt" - redistimeseries "github.com/RedisTimeSeries/redistimeseries-go" -) - -func main() { - // Connect to localhost with no password - var client = redistimeseries.NewClient("redis-XXXX.c264.ap-south-1-1.ec2.cloud.redislabs.com:port", "add your password here", nil) - var keyname = "mytest" - _, haveit := client.Info(keyname) - if haveit != nil { - client.CreateKeyWithOptions(keyname, redistimeseries.DefaultCreateOptions) - client.CreateKeyWithOptions(keyname+"_avg", redistimeseries.DefaultCreateOptions) - client.CreateRule(keyname, redistimeseries.AvgAggregation, 60, keyname+"_avg") - } - // Add sample with timestamp from server time and value 100 - // TS.ADD mytest * 100 - _, err := client.AddAutoTs(keyname, 100) - if err != nil { - fmt.Println("Error:", err) - } -} - -``` - -### Step 9. Run the Go program - -```bash - go run test.go -``` - -### Step 10. 
Monitor the Redis database - -``` -monitor -OK -1635490098.157530 [0 52.149.144.189:48430] "TS.INFO" "mytest" -1635490098.353530 [0 52.149.144.189:48430] "TS.CREATE" "mytest" -1635490098.553530 [0 52.149.144.189:48430] "TS.CREATE" "mytest_avg" -1635490098.753530 [0 52.149.144.189:48430] "TS.CREATERULE" "mytest" "mytest_avg" "AGGREGATION" "AVG" "60" -1635490098.949529 [0 52.149.144.189:48430] "TS.ADD" "mytest" "*" "100" -``` - -## References - -- [Getting Started with RedisTimeSeries](/howtos/redistimeseries/getting-started/) -- Learn more about RedisTimeSeries in the [Quickstart](https://oss.redis.com/redistimeseries/) tutorial. -- [Build Your Financial Application on RedisTimeSeries](https://redis.com/blog/build-your-financial-application-on-redistimeseries/) -- [How to Manage Real-Time IoT Sensor Data in Redis](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/) -- [Introduction to RedisTimeSeries - Video](https://www.youtube.com/watch?v=rXynFOrrd-Q) diff --git a/docs/howtos/redistimeseries/using-prometheus/index-usingprometheus.mdx b/docs/howtos/redistimeseries/using-prometheus/index-usingprometheus.mdx deleted file mode 100644 index 0d5ffbb25ad..00000000000 --- a/docs/howtos/redistimeseries/using-prometheus/index-usingprometheus.mdx +++ /dev/null @@ -1,206 +0,0 @@ ---- -id: index-usingprometheus -title: How to monitor Redis with Prometheus and Grafana for Real-Time Analytics -sidebar_label: Using RedisTimeSeries with Prometheus and Grafana -slug: /howtos/redistimeseries/using-prometheus -authors: [ajeet] ---- - -![My Image](images/prometheus.png) - -Time-series data is basically a series of data stored in time order and produced continuously over a long period of time. These measurements and events are tracked, monitored, downsampled, and aggregated over time. The events could be, for example, IoT sensor data. Every sensor is a source of time-series data. Each data point in the series stores the source information and other sensor measurements as labels. Data labels from every source may not conform to the same structure or order. - -A time-series database is a database system designed to store and retrieve such data for each point in time. Timestamped data can include data generated at regular intervals as well as data generated at unpredictable intervals. - -### When do you use a time-series database? - -- When your application needs data that accumulates quickly and your other databases aren’t designed to handle that scale. -- For financial or industrial applications. -- When your application needs to perform real-time analysis of billions of records. -- When your application needs to perform online queries at millisecond timescales, and support CPU-efficient ad-hoc queries. - -### Challenges with the existing traditional databases - -You might find numerous solutions that still store time-series data in a relational database, but they’re quite inefficient and come with their own set of drawbacks. A typical time-series database is usually built to only manage time-series data, hence one of the challenges it faces is with use cases that involve some sort of computation on top of time-series data. One good example could be capturing a live video feed in a time-series database. If you want to run an AI model for face recognition, you would have to extract the time-series data, apply some sort of data transformation and then do computation. 
-Relational databases carry the overhead of locking and synchronization that aren’t required for the immutable time-series data. This results in slower-than-required performance for both ingest and queries. When scaling out, it also means investing in additional compute resources. These databases enforce a rigid structure for labels and can’t accommodate unstructured data. They also require scheduled jobs for cleaning up old data. Beyond the time-series use case, these databases are also used for other use cases, which means overuse of running time-series queries may affect other workloads. - -### What is RedisTimeSeries? - -RedisTimeSeries is a purpose-built time-series database that addresses the needs of handling time-series data. It removes the limitations enforced by relational databases and enables you to collect, manage, and deliver time-series data at scale. As an in-memory database, RedisTimeSeries can ingest over 500,000 records per second on a standard node. Our benchmarks show that you can ingest over 11.5 million records per second with a cluster of 16 Redis shards. - -RedisTimeSeries is resource-efficient. With RedisTimeSeries, you can add rules to compact data by downsampling. For example, if you’ve collected more than one billion data points in a day, you could aggregate the data by every minute in order to downsample it, thereby reducing the dataset size to 1,440 data points (24 \* 60 = 1,440). You can also set data retention policies and expire the data by time when you don’t need them anymore. RedisTimeSeries allows you to aggregate data by average, minimum, maximum, sum, count, range, first, and last. You can run over 100,000 aggregation queries per second with sub-millisecond latency. You can also perform reverse lookups on the labels in a specific time range. - -### Notables features of RedisTimeseries includes: - -- High volume inserts, low latency reads -- Query by start time and end-time -- Aggregated queries (Min, Max, Avg, Sum, Range, Count, First, Last, STD.P, STD.S, Var.P, Var.S) for any time bucket -- Configurable maximum retention period -- Downsampling/Compaction - automatically updated aggregate time series -- Secondary index - each time series has labels (field value pairs) which will allows to query by labels - -### Why Prometheus? - -Prometheus is an open-source systems monitoring and alerting toolkit. It collects and stores its metrics as time series data, i.e. metrics information. The metrics are numeric measurements in a time series, meaning changes recorded over time. These metrics are stored with the timestamp at which it was recorded, alongside optional key-value pairs called labels. Metrics play an important role in understanding why your application is working in a certain way. - -### Prometheus remote storage adapter for RedisTimeSeries - -In the RedisTimeSeries organization you can find projects that help you integrate RedisTimeSeries with other tools, including Prometheus and Grafana. The Prometheus remote storage adapter for RedisTimeSeries is available and the project is hosted over at https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter. It’s basically a read/write adapter to use RedisTimeSeries as a backend database. RedisTimeSeries Adapter receives Prometheus metrics via the remote write, and writes to Redis with the RedisTimeSeries module. - -## Getting Started - -### Prerequisite: - -- Install GIT -- Install Docker -- Install Docker Compose - -### Step 1. 
Clone the repository - -``` - git clone https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter -``` - -### Step 2. Examining the Docker Compose File - -This Docker compose defines 4 services - - -1. Prometheus -2. Adapter -3. Grafana -4. Redis - -```yaml - version: '3' - services: - prometheus: - image: "prom/prometheus:v2.8.0" - command: ["--config.file=/prometheus.yml"] - volumes: - - ./prometheus.yaml:/prometheus.yml - ports: - - 9090:9090 - adapter: - image: "redislabs/prometheus-redistimeseries-adapter:master" - command: ["-redis-address", "redis:6379", "-web.listen-address", "0.0.0.0:9201"] - redis: - image: "redislabs/redistimeseries:edge" - ports: - - "6379:6379" - grafana: - build: ./grafana/ - ports: - - "3000:3000" -``` - -#### Prometheus - -The `prometheus` service directly uses an image “prom/prometheus” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, 9090. The prometheus configuration file is accessed by mounting the volume on the host and container. - -#### Storage Adapter - -The `adapter` service uses an image “`redislabs/prometheus-redistimeseries-adapter:master`” that’s pulled from Docker Hub. Sets the default command for the container: `-redis-address", "redis:6379 and listen to the address 0.0.0.0:9201. ` - -#### Redis - -The `Redis` service directly uses an image “`redislabs/redistimeseries:edge`” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, `6379` - -#### Grafana - -The `grafana` service uses an image that’s built from the `Dockerfile` in the current directory. It then binds the container and the host machine to the exposed port, `3000`. - -### Step 3. Run the Docker Compose - -Change directory to compose and execute the below CLI: - -```bash - docker-compose up -d -``` - -```bash - ajeetraina@Ajeets-MacBook-Pro compose % docker-compose ps - NAME COMMAND SERVICE STATUS PORTS - compose-adapter-1 "/adapter/redis-ts-a…" adapter running - compose-grafana-1 "/run.sh" grafana running 0.0.0.0:3000->3000/tcp - compose-prometheus-1 "/bin/prometheus --c…" prometheus running 0.0.0.0:9090->9090/tcp - compose-redis-1 "docker-entrypoint.s…" redis running 0.0.0.0:6379->6379/tcp -``` - -### Step 4. Accessing the Grafana - -Open http://hostIP:3000 to access the Grafana dashboard. The default username and password is admin/admin. - -### Step 5. Add Prometheus Data Source - -In the left sidebar, you will see the “Configuration” option. Select “Data Source” and choose Prometheus. - -![alt_text](images/image1.png) - -Click “Save and Test”. - -### Step 6. Importing Prometheus Data Source - -Click on “Import” for all the Prometheus dashboards. - -![alt_text](images/image2.png) - -### Step 7. Adding Redis Datasource - -Again, click on “Data Sources” and add Redis. - -![alt_text](images/image_3.png) - -Click "Import". - -![alt_text](images/image_4.png) - -### Step 8. Running the Sensor Script - -It’s time to test drive a few demo scripts built by the Redis team. To start with, clone the below repository - -``` - git clone https://github.com/RedisTimeSeries/prometheus-demos -``` - -This repo contains a set of basic demoes showcasing the integration of RedisTimeSeries with Prometheus and Grafana. Let’s pick up a sensor script. - -``` - python3 weather_station/sensors.py -``` - -This script will add random measurements for temperature and humidity for a number of sensors. 
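As a point of reference, the short redis-py sketch below shows what writing such readings straight into RedisTimeSeries can look like. It is only an illustration with made-up key names and labels (not the repository's actual `sensors.py`), but it is a quick way to generate test data directly in Redis.

```python
import random
import time

import redis

client = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Create one series per sensor and metric (ignore the error if they already exist).
for sensor_id in ("1", "2"):
    for metric in ("temperature", "humidity"):
        try:
            client.ts().create(
                f"sensor:{sensor_id}:{metric}",
                labels={"sensor_id": sensor_id, "metric": metric},
            )
        except redis.ResponseError:
            pass

# Append a new random sample to each series every second (stop with Ctrl+C).
while True:
    for sensor_id in ("1", "2"):
        client.ts().add(f"sensor:{sensor_id}:temperature", "*", random.uniform(18.0, 30.0))
        client.ts().add(f"sensor:{sensor_id}:humidity", "*", random.uniform(30.0, 70.0))
    time.sleep(1)
```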
- -Go to “Add Panel” on the top right corner of the Grafana dashboard and start adding temperature and humidity values. - -![alt_text](images/image_7.png) - -### Step 9. Accessing Prometheus Dashboard - -Open up [https://HOSTIP:9090](https://HOSTIP:9090) to access Prometheus dashboard for the sensor values without any further configuration. - -![alt_text](images/image_8.png) - -### Further References: - -- [Prometheus remote storage adapter for RedisTimeSeries](https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter) -- [Remote Storage Integration](https://prometheus.io/docs/prometheus/latest/storage/#remote-storage-integrations) -- [RedisTimeSeries Demos](https://github.com/RedisTimeSeries/prometheus-demos) - -## - - diff --git a/docs/howtos/redistimeseries/using-python/images/add_database.png b/docs/howtos/redistimeseries/using-python/images/add_database.png deleted file mode 100644 index 9ada742a2f2..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/add_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/database_creds.png b/docs/howtos/redistimeseries/using-python/images/database_creds.png deleted file mode 100644 index ef6379e72b3..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/database_creds.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/database_details.png b/docs/howtos/redistimeseries/using-python/images/database_details.png deleted file mode 100644 index 5007a480b06..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/database_details.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/details_database.png b/docs/howtos/redistimeseries/using-python/images/details_database.png deleted file mode 100644 index 3881cf02728..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/details_database.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image1.png b/docs/howtos/redistimeseries/using-python/images/image1.png deleted file mode 100644 index fbb68b3cd01..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image1.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image2.png b/docs/howtos/redistimeseries/using-python/images/image2.png deleted file mode 100644 index 56bf285bc8f..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image2.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image3.png b/docs/howtos/redistimeseries/using-python/images/image3.png deleted file mode 100644 index c8c080fb61c..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image3.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image4.png b/docs/howtos/redistimeseries/using-python/images/image4.png deleted file mode 100644 index f92a1e5a19f..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image4.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image5.png b/docs/howtos/redistimeseries/using-python/images/image5.png deleted file mode 100644 index 8b7d9c38e64..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image5.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image6.png b/docs/howtos/redistimeseries/using-python/images/image6.png deleted file mode 
100644 index db28d40a312..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image6.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image7.png b/docs/howtos/redistimeseries/using-python/images/image7.png deleted file mode 100644 index 7ba7e0e6541..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image7.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image8.png b/docs/howtos/redistimeseries/using-python/images/image8.png deleted file mode 100644 index 549eb7c8a70..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image8.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/image9.png b/docs/howtos/redistimeseries/using-python/images/image9.png deleted file mode 100644 index 34823d6ccea..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/image9.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/redistimeseries-python.png b/docs/howtos/redistimeseries/using-python/images/redistimeseries-python.png deleted file mode 100644 index 4d34d82dfd0..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/redistimeseries-python.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/images/select_cloud_vendor.png b/docs/howtos/redistimeseries/using-python/images/select_cloud_vendor.png deleted file mode 100644 index 2526223c800..00000000000 Binary files a/docs/howtos/redistimeseries/using-python/images/select_cloud_vendor.png and /dev/null differ diff --git a/docs/howtos/redistimeseries/using-python/index-usingpython.mdx b/docs/howtos/redistimeseries/using-python/index-usingpython.mdx deleted file mode 100644 index 51978597489..00000000000 --- a/docs/howtos/redistimeseries/using-python/index-usingpython.mdx +++ /dev/null @@ -1,104 +0,0 @@ ---- -id: index-usingpython -title: How to collect and process time-series data using Redis and Python -sidebar_label: RedisTimeSeries and Python -slug: /howtos/redistimeseries/using-python -authors: [ajeet] ---- - -![My Image](images/redistimeseries-python.png) - -Time series data is a series of data stored in the time order (Example: Stock performance over time). Industries today are collecting and analyzing time-based data more than ever before. Traditional databases that rely on relational or document data models are designed neither for storing and indexing data based on time, nor for running time-bucketed aggregation queries. Time-series databases fill this void by providing a data model that optimizes data indexing and querying by time. - -RedisTimeSeries is a Redis module that allows Redis to be used as a fast in-memory time series database designed to collect, manage, and deliver time series data at scale. The RedisTimeSeries module shares the performance and simplicity aspects of Redis. Under the hood, it uses efficient data structures such as Radix tree to index data by timestamp, which makes it extremely fast and efficient to run time-aggregate queries. - -## Python Client for RedisTimeSeries - -:::note TIP -As of redis-py 4.0.0, the redistimeseries-py library is deprecated. It's features have been merged into redis-py. Please either install it from pypy or the repo. -::: - -Follow the steps below to get started with RedisTimeSeries with Python: - -### Step 1. Create a free Cloud account - -Create your free Redis Enterprise Cloud account. 
Once you click on “Get Started”, you will receive an email with a link to activate your account and complete your signup process. - -:::info TIP -For a limited time, use **TIGER200** to get **$200** credits on Redis Enterprise Cloud and try all the advanced capabilities! - -:tada: [Click here to sign up](https://redis.com/try-free) - -::: - -### Step 2. Create Your database - -Choose your preferred cloud vendor. Select the region and then click "Let's start free" to create your free database automatically. - -:::info TIP -If you want to create a custom database with your preferred name and type of redis, -click "Create a custom database" option shown in the image. -::: - -![create database ](images/select_cloud_vendor.png) - -### Step 3. Verify the database details - -You will be provided with Public endpoint URL and "Redis Stack" as the type of database with the list of modules that comes by default. - -![verify database](images/details_database.png) - -### Step 4.Installation - -``` -$ pip install redis -``` - -### Step 5. Create a script file - -```python - import redis - r = redis.Redis(host='redis-18386.c110.qa.us-east-1-1.ec2.qa-cloud.redislabs.com', port=, password=) - r.ts().create(2, retension_msecs=5) -``` - -Save the above file with a name "ts.py". - -### Step 6. Executing the python script - -```bash - python3 ts.py -``` - -### Step 7. Monitor the Redis database - -``` - 1648389303.557366 [0 20.127.62.215:59768] "TS.CREATE" "2" -``` - -## References - -- [Getting Started with RedisTimeSeries](/howtos/redistimeseries/getting-started/) -- Learn more about RedisTimeSeries in the [Quickstart](https://oss.redis.com/redistimeseries/) tutorial. -- [Build Your Financial Application on RedisTimeSeries](https://redis.com/blog/build-your-financial-application-on-redistimeseries/) -- [How to Manage Real-Time IoT Sensor Data in Redis](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/) -- [Introduction to RedisTimeSeries - Video](https://www.youtube.com/watch?v=rXynFOrrd-Q) - -## - - diff --git a/docs/howtos/security/tls.mdx b/docs/howtos/security/tls.mdx index e3c4a34c629..1fa948262c3 100644 --- a/docs/howtos/security/tls.mdx +++ b/docs/howtos/security/tls.mdx @@ -11,11 +11,6 @@ keywords: - tls --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - ![Header](tls.jpeg) In this article, you will see how to secure your Redis databases using SSL (Secure Sockets Layer). In the production environment, it is always recommended to use SSL to protect the data that moves between various computers (client applications and Redis servers). Transport Level Security (TLS) guarantees that only allowed applications/computers are connected to the database, and also that data is not viewed or altered by a middle man process. @@ -31,7 +26,7 @@ In this article, we will focus on the Two-Way SSL, and using Redis Enterprise. - A Redis Enterprise 6.0.x database, (my database is protected by the password `secretdb01`, and listening on port `12000`) - `redis-cli` to run basic commands -- [Python](/develop/python/), [Node](/develop/node), and [Java](/develop/java) installed if you want to test various languages. +- [Node](/develop/node), and [Java](/develop/java) installed if you want to test various languages. 
## Simple Test

@@ -45,8 +40,7 @@ You can either run Redis server in a Docker container or directly on your machin

```

:::info INFO
-Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack bundles five Redis modules: RedisJSON, RedisSearch, RedisGraph, RedisTimeSeries, and RedisBloom.
-[Learn more](/create/redis-stack)
+Redis Stack unifies and simplifies the developer experience of the leading Redis modules and the capabilities they provide. Redis Stack supports the following in addition to Redis: JSON, Search, Time Series, Triggers and Functions, and Probabilistic data structures.
:::

Let's make sure that the database is available:
@@ -200,8 +194,6 @@ except Exception as err:
    print("Error connecting to Redis: {}".format(err))
```

-More information in the documentation "[Using Redis with Python](/develop/python/)".
-
### Step 5.3 Using Node.JS

For [Node Redis](http://redis.js.org/), use the [TLS](https://nodejs.org/api/tls.html) library to configure the client connection:

@@ -214,13 +206,13 @@ var fs = require('fs');

var ssl = {
  key: fs.readFileSync(
    '../certificates/client_key_app_001.pem',
-    (encoding = 'ascii'),
+    {encoding: 'ascii'},
  ),
  cert: fs.readFileSync(
    '../certificates/client_cert_app_001.pem',
-    (encoding = 'ascii'),
+    {encoding: 'ascii'},
  ),
-  ca: [fs.readFileSync('../certificates/proxy_cert.pem', (encoding = 'ascii'))],
+  ca: [fs.readFileSync('../certificates/proxy_cert.pem', {encoding: 'ascii'})],
  checkServerIdentity: () => {
    return null;
  },
diff --git a/docs/howtos/shoppingcart/index-shoppingcart.mdx b/docs/howtos/shoppingcart/index-shoppingcart.mdx
index e0f73fb700c..c507a7e901d 100644
--- a/docs/howtos/shoppingcart/index-shoppingcart.mdx
+++ b/docs/howtos/shoppingcart/index-shoppingcart.mdx
@@ -6,10 +6,9 @@ slug: /howtos/shoppingcart
authors: [ajeet]
---

-import Tabs from '@theme/Tabs';
-import TabItem from '@theme/TabItem';
-import useBaseUrl from '@docusaurus/useBaseUrl';
-import RedisCard from '@site/src/theme/RedisCard';
+import Authors from '@theme/Authors';
+
+

It’s hard to imagine an online store without a shopping cart. Almost every online store must have the shopping cart functionality to be able to sell products to customers. In order to build a scalable ecommerce platform, you need a powerful framework and a simple storage system. At times, a lot of developers focus on improving the frontend performance of an ecommerce platform to rectify these things. The real bottleneck, however, remains the slow backend load time. A slow backend load time can have a serious impact on your search engine rankings. A good rule of thumb is that backend load time should take no more than 20% of your total load time. A good backend load time to aim for is 200ms or less.

@@ -44,7 +43,7 @@ This tutorial will show you how to harness the power of Redis by creating a basi

### What do you need?
-- Redis compiled with RedisJSON module
+- Redis Stack
- Express 4 backend
- Node 15.5.0 (at least v12.9.0+)
- NPM 7.3.0 (at least v6.14.8+)
@@ -67,21 +66,20 @@ Clone the repository:

$ git clone https://github.com/redis-developer/basic-redis-shopping-chart-nodejs
```

-#### Compiling Redis with RedisJSON module
+#### Running Redis Stack

-You can use the below docker compose file to run Redis server compiled with RedisJSON module:
+You can use the docker compose file below to run the Redis Stack server:

```
version: '3'
services:
  redis:
-    image: redislabs/rejson:latest
+    image: redis/redis-stack:latest
    container_name: redis.redisshoppingcart.docker
    restart: unless-stopped
    environment:
      REDIS_PASSWORD: ${REDIS_PASSWORD}
-    command: redis-server --loadmodule "/usr/lib/redis/modules/rejson.so" --requirepass "$REDIS_PASSWORD"
    ports:
      - 127.0.0.1:${REDIS_PORT}:6379
    networks:
@@ -582,7 +580,11 @@ REDIS_PASSWORD=demo
COMPOSE_PROJECT_NAME=redis-shopping-cart
```

-(Note: In case you’re using Redis Enterprise Cloud instead of localhost, then you need to enter the database endpoint under REDIS_HOST(without port) while rest of the entries like REDIS_PORT and REDIS_PASSWORD are quite obvious)
+:::info
+
+If you’re using Redis Cloud instead of localhost, enter the database endpoint under REDIS_HOST (without the port). The remaining entries, REDIS_PORT and REDIS_PASSWORD, are self-explanatory.
+
+:::

#### Installing the dependencies
diff --git a/docs/howtos/socialnetwork/index-socialnetwork.mdx b/docs/howtos/socialnetwork/index-socialnetwork.mdx
index ae8941c448f..1f6bd2d84c4 100644
--- a/docs/howtos/socialnetwork/index-socialnetwork.mdx
+++ b/docs/howtos/socialnetwork/index-socialnetwork.mdx
@@ -1,16 +1,20 @@
---
id: index-socialnetwork
-title: How to Build a Social Network Application using RediSearch and NodeJS
-sidebar_label: Building a Social Network Application using RediSearch
+title: How to Build a Social Network Application using Redis Stack and NodeJS
+sidebar_label: Building a Social Network Application using Redis Stack
slug: /howtos/socialnetwork/
authors: [julian, manuel]
---

+import Authors from '@theme/Authors';
+
+
+
![image](images/socialnetwork.png)

-In this blog post we’ll build a social network application using RediSearch and NodeJS. This is the idea that we used for our app [Skillmarket](https://www.youtube.com/watch?v=18NPKZy28cQ).
+In this blog post we’ll build a social network application using Redis Stack and NodeJS. This is the idea that we used for our app [Skillmarket](https://www.youtube.com/watch?v=18NPKZy28cQ).

-The goal of the application is to match users with complementary skills. It will allow users to register and provide some information about themselves, like location, areas of expertise and interests. Using RediSearch it will match two users who are geographically close, and have complementary areas of expertise and interests, e.g., one of them knows French and want to learn Guitar and the other knows Guitar and want to learn French.
+The goal of the application is to match users with complementary skills. It will allow users to register and provide some information about themselves, like location, areas of expertise and interests. Using search in Redis Stack, it will match two users who are geographically close and have complementary areas of expertise and interests, e.g., one of them knows French and wants to learn Guitar and the other knows Guitar and wants to learn French.
The full source code of our application can be found in GitHub (note that we used some features like [FT.ADD](https://oss.redis.com/redisearch/Commands/#ftadd) which now are deprecated):

@@ -19,16 +23,16 @@ The full source code of our application can be found in GitHub (note that we use

We will be using a more condensed version of the backend which can be found in the [Skillmarket Blogpost](https://github.com/julianmateu/skillmarket-blogpost) GitHub repo.

-Refer to the [official tutorial](https://github.com/RediSearch/redisearch-getting-started) for more information about RediSearch.
+Refer to the [official tutorial](https://github.com/RediSearch/redisearch-getting-started) for more information about search in Redis Stack.

-## Getting Familiar with RediSearch
+## Getting Familiar with search in Redis Stack

-### Launching RediSearch in a Docker container
+### Launching search in Redis Stack in a Docker container

-Let’s start by launching Redis from the RediSearch image using Docker:
+Let’s start by launching Redis from the Redis Stack image using Docker:

```
-docker run -d --name redis redislabs/redisearch:latest
+docker run -d --name redis redis/redis-stack:latest
```

Here we use the `docker run` command to start the container and pull the image if it is not present. The `-d` flag tells docker to launch the container in the background (detached mode). We provide a name with `--name redis` which will allow us to refer to this container with a friendly name instead of the hash or the random name docker will assign to it.

@@ -55,7 +59,7 @@ We’ll use a Hash as the data structure to store information about our users. T

In a nutshell, you can think of a hash as a key/value store where the key can be any string we want, and the values are a document with several fields. It’s common practise to use the hash to store many different types of objects, so they can be prefixed with their type, so a key would take the form of "object_type:id".

-An index will then be used on this hash data structure, to efficiently search for values of given fields. The following diagram taken from the RediSearch docs exeplifies this with a database for movies:
+An index will then be used on this hash data structure, to efficiently search for values of given fields. The following diagram taken from the search docs exemplifies this with a database for movies:

![alt_text](images/searchindex.png)

@@ -85,9 +89,9 @@ HSET users:3 name "Charles" expertises "spanish, bowling" interests "piano, danc

### Query to match users

-Here we can see the power of the RediSearch index, which allows us to query by [tags](https://oss.redis.com/redisearch/Tags/) (we provide a list of values, such as interests, and it will return any user whose interests match at least one value in the list), and [Geo](https://oss.redis.com/redisearch/Query_Syntax/#geo_filters_in_query) (we can ask for users whose location is at a given radius in km from a point).
+Here we can see the power of the search index, which allows us to query by [tags](https://oss.redis.com/redisearch/Tags/) (we provide a list of values, such as interests, and it will return any user whose interests match at least one value in the list), and [Geo](https://oss.redis.com/redisearch/Query_Syntax/#geo_filters_in_query) (we can ask for users whose location is at a given radius in km from a point).
-To be able to do this, we have to instruct RediSearch to create an index: +To be able to do this, we have to instruct search to create an index: ``` FT.CREATE idx:users ON hash PREFIX 1 "users:" SCHEMA interests TAG expertises TAG location GEO @@ -152,7 +156,11 @@ We can now remove the docker instance and move on to building the web applicatio After understanding how the index works, let’s build a minimal backend API in NodeJS that will allow us to create a user, and query for matching users. -Please note that this is just an example, and we’re not providing proper validation or error handling, nor other features required for the backend (e.g. authentication). +:::note + +This is just an example, and we’re not providing proper validation or error handling, nor other features required for the backend (e.g. authentication). + +::: ### Redis client @@ -168,22 +176,7 @@ const client: RediSearchClient = createClient({ port: Number(REDIS_PORT), host: REDIS_HOST, }); -``` - -Given that the raw client does not include the functions from the rediSearch module, we have to add them by defining a new type and adding the commands (this is what the [redis-redisearch](https://www.npmjs.com/package/redis-redisearch) module does, and there’s also another module named [redisearchclient](https://www.npmjs.com/package/redisearchclient) which also provides more functions instead of providing arguments as strings). -``` -type RediSearchClient = RedisClient & { - ft_create?(args: any): any; - ft_search?(args: any): any; - hgetallAsync?(key: string): Promise - hsetAsync?(key: string, fields: string[]): Promise - ft_createAsync?(index: string, args: string[]): Promise - ft_searchAsync?(index: string, query: string): Promise -}; - -addCommand('ft.create'); -addCommand('ft.search'); ``` All the functions in the library use callbacks, but we can use `promisify` to enable the `async/await` syntax: @@ -424,7 +417,7 @@ docker compose down --volumes --remove-orphans - [Skillmarket Backend](https://github.com/julianmateu/skillmarket-backend) - [Skillmarket Frontend](https://github.com/julianmateu/skillmarket-front) -- [RediSearch Official Tutorial](https://github.com/RediSearch/redisearch-getting-started) +- [Search Tutorial](https://github.com/RediSearch/redisearch-getting-started) ## @@ -434,13 +427,11 @@ docker compose down --volumes --remove-orphans target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/howtos/solutions/caching-architecture/cache-prefetching/images/pattern.jpg b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/pattern.jpg new file mode 100644 index 00000000000..d150b797cc5 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/pattern.jpg differ diff --git a/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-01.png b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-01.png new file mode 100644 index 00000000000..0e95a274975 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-01.png differ diff --git a/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-02.png b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-02.png new file mode 100644 index 00000000000..9e3971f7c3a Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/cache-prefetching/images/redis-json-02.png differ diff --git 
a/docs/howtos/solutions/caching-architecture/cache-prefetching/index-cache-prefetching.mdx b/docs/howtos/solutions/caching-architecture/cache-prefetching/index-cache-prefetching.mdx new file mode 100644 index 00000000000..ad2a435eadc --- /dev/null +++ b/docs/howtos/solutions/caching-architecture/cache-prefetching/index-cache-prefetching.mdx @@ -0,0 +1,160 @@ +--- +id: index-cache-prefetching +title: How to use Redis for Cache Prefetching Strategy +sidebar_label: How to use Redis for Cache Prefetching Strategy +slug: /howtos/solutions/caching-architecture/cache-prefetching +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import CachingMovieAppDesign from '../common-caching/caching-movie-app.mdx'; +import SourceCodeMovieApp from '../common-caching/source-code-movie-app.mdx'; +import RedisInsightLanguageJson from './images/redis-json-01.png'; +import RedisInsightCountryJson from './images/redis-json-02.png'; + + + + + +## What is cache prefetching? + +Cache prefetching is a technique used in database management systems (DBMS) to improve query performance by anticipating and fetching data from the storage subsystem before it is explicitly requested by a query. + +There are three main strategies for cache prefetching: + +1. Sequential prefetching: This approach anticipates that data will be accessed in a sequential manner, such as when scanning a table or index. It prefetches the next set of data blocks or pages in the sequence to ensure they are available in cache when needed. +1. Prefetching based on query patterns: Some database systems can analyze past query patterns to predict which data is likely to be accessed in the future. By analyzing these patterns, the DBMS can prefetch relevant data and have it available in cache when a similar query is executed. +1. Prefetching based on data access patterns: In some cases, data access patterns can be derived from the application logic or schema design. By understanding these patterns, the database system can prefetch data that is likely to be accessed soon. + +This tutorial will cover the third strategy, **prefetching based on data access patterns**. + +Imagine you're building a movie streaming platform. You need to be able to provide your users with a dashboard that allows them to quickly find the movies they want to watch. You have an extensive database filled with movies, and you have them categorized by things like country of origin, genre, language, etc. This data changes infrequently, and is regularly referenced all over your app and by other data. This kind of data that is long-lived and changes infrequently is called "master data." + +One ongoing developer challenge is to swiftly create, read, update, and delete master data. You might store your master data in a system of record like a SQL database or document database, and then use Redis as a cache to speed up lookups for that data. Then, when an application requests master data, instead of coming from the system of record, the master data is served from Redis. This is called the "master data-lookup" pattern. + +From a developer's point of view, "master data lookup" refers to the process by which master data is accessed in business transactions, in application setup, and any other way that software retrieves the information. Examples of master data lookup include fetching data for user interface (UI) elements (such as drop-down dialogs, select values, multi-language labels), fetching constants, user access control, theme, and other product configuration. 
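+
+To make the lookup flow concrete, here is a minimal sketch using the [node-redis](https://github.com/redis/node-redis) client. It is not part of the demo application: the `master:<category>` key name and the `loadFromSystemOfRecord()` helper are hypothetical placeholders, and the demo code later in this tutorial uses Redis OM instead.
+
+```js
+// Minimal master data-lookup sketch. Assumptions: node-redis v4, a Redis build with
+// JSON support (Redis Stack/Redis Cloud), and hypothetical key/category names.
+import { createClient } from 'redis';
+
+const redis = createClient({ url: 'redis://localhost:6379' });
+await redis.connect();
+
+// Hypothetical stand-in for a query against the system of record (MongoDB in this tutorial).
+async function loadFromSystemOfRecord(category) {
+  return [{ category, code: 'EN', label: 'English' }];
+}
+
+// 1. On application startup (or on a schedule), copy master data into Redis.
+async function prefetchMasterData(categories) {
+  for (const category of categories) {
+    const rows = await loadFromSystemOfRecord(category);
+    await redis.json.set(`master:${category}`, '$', rows);
+  }
+}
+
+// 2. At request time, master data is served from Redis, not from the database.
+async function getMasterData(category) {
+  return redis.json.get(`master:${category}`);
+}
+
+await prefetchMasterData(['LANGUAGE', 'COUNTRY']);
+console.log(await getMasterData('LANGUAGE'));
+```
+
+With this in place, UI elements such as the country and language drop-downs can be populated with a single Redis read, while the system of record is only touched during the prefetch step.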
+ +Below you will find a diagram of the data flow for prefetching master data using Redis with MongoDB as the system of record. + +![pattern](./images/pattern.jpg) + +The steps involved in fetching data are as follows: + +1. Read the master data from MongoDB on application startup and store a copy of the data in Redis. This pre-caches the data for fast retrieval. Use a script or a cron job to regularly copy latest master data to Redis. +2. The application requests master data. +3. Instead of MongoDB serving the data, the master data will be served from Redis. + +## Why you should use Redis for cache prefetching + +1. **Serve prefetched data at speed**: By definition, nearly every application requires access to master or other common data. Pre-caching such frequent data with Redis delivers it to users at high speed. +1. **Support massive tables**: Master tables often have millions of records. Searching through them can cause performance bottlenecks. Use Redis to perform real-time search on the large tables to increase performance with sub-millisecond response. +1. **Postpone expensive hardware and software investments**: Defer costly infrastructure enhancements by using Redis. Get the performance and scaling benefits without asking the CFO to write a check. + +:::tip + +If you use **Redis Cloud**, cache prefetching is easier due to its support for JSON and search. You also get additional features such as real-time performance, high scalability, resiliency, and fault tolerance. You can also call upon high-availability features such as Active-Active geo-redundancy. + +::: + +## Cache prefetching in a NodeJS application with Redis and MongoDB + +### Demo application + + + + + +Certain fields used in the demo application serve as master data, including movie language, country, genre, and ratings. They are master data because they are required for almost every application transaction. For example, the pop-up dialog (seen below) that appears when a user who wants to add a new movie clicks the movie application plus the icon. The pop-up includes drop-down menus for both country and language. In this case, Redis stores and provides the values. + +![demo-03](../common-caching/images/demo-03.png) + +### Prefetching data with Redis and MongoDB + +The code snippet below is used to prefetch MongoDB JSON documents and store them in Redis (as JSON) using the [Redis OM for Node.js](https://github.com/Redis/Redis-om-node) library. + +```js +async function insertMasterCategoriesToRedis() { + ... + const _dataArr = await getMasterCategories(); //from MongoDb + const repository = MasterCategoryRepo.getRepository(); + + if (repository && _dataArr && _dataArr.length) { + for (const record of _dataArr) { + const entity = repository.createEntity(record); + entity.categoryTag = [entity.category]; //for tag search + //adds JSON to Redis + await repository.save(entity); + } + } + ... +} + +async function getMasterCategories() { + //fetching data from MongoDb + ... + db.collection("masterCategories").find({ + statusCode: { + $gt: 0, + }, + category: { + $in: ["COUNTRY", "LANGUAGE"], + }, + }); + ... +} +``` + +You can also check [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to verify that JSON data is inserted, as seen below: + +Redis-json +Redis-json + +:::tip + +RedisInsight is the free redis GUI for viewing data in redis. 
[Click here to download.](https://redis.com/redis-enterprise/redis-insight/) + +::: + +### Querying prefetched data from Redis + +Prior to prefetching with Redis, the application searched the static database (MongoDB) to retrieve the movie's country and language values. As more people started using the application, the database became overloaded with queries. The application was slow and unresponsive. To solve this problem, the application was modified to use Redis to store the master data. The code snippet below shows how the application queries Redis for the master data, specifically the country and language values for the dropdown menus: + +```js +*** With Redis *** +*** Redis OM Node query *** +function getMasterCategories() { + ... + masterCategoriesRepository + .search() + .where("statusCode") + .gt(0) + .and("categoryTag") + .containOneOf("COUNTRY", "LANGUAGE"); + ... +} +``` + +## Ready to use Redis for cache prefetching? + +In this tutorial you learned how to use Redis for cache prefetching with a "master data lookup" example. While this is one way Redis is used in an application, it's possible to incrementally adopt Redis wherever needed with other caching strategies/patterns. For more resources on the topic of caching, check out the links below: + +## Additional resources + +- Caching with Redis + - [Write behind caching](/howtos/solutions/caching-architecture/write-behind) + - [Write through caching](/howtos/solutions/caching-architecture/write-through) + - [Query caching in a microservices application](/howtos/solutions/microservices/caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [node-redis](https://github.com/redis/node-redis) and [Redis OM for Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/): To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/caching-architecture/common-caching/caching-movie-app.mdx b/docs/howtos/solutions/caching-architecture/common-caching/caching-movie-app.mdx new file mode 100644 index 00000000000..45ececdcf83 --- /dev/null +++ b/docs/howtos/solutions/caching-architecture/common-caching/caching-movie-app.mdx @@ -0,0 +1,4 @@ +The demo application used in the rest of this tutorial showcases a movie application with basic create, read, update, and delete (CRUD) operations. +![demo-01](./images/demo-01.png) + +The movie application dashboard contains a search section at the top and a list of movie cards in the middle. The floating plus icon displays a pop-up when the user selects it, permitting the user to enter new movie details. The search section has a text search bar and a toggle link between text search and basic (that is, form-based) search. Each movie card has edit and delete icons, which are displayed when a mouse hovers over the card. 
diff --git a/docs/howtos/solutions/caching-architecture/common-caching/images/demo-01.png b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-01.png new file mode 100644 index 00000000000..0cf66d0de79 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-01.png differ diff --git a/docs/howtos/solutions/caching-architecture/common-caching/images/demo-02.png b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-02.png new file mode 100644 index 00000000000..72116e792d5 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-02.png differ diff --git a/docs/howtos/solutions/caching-architecture/common-caching/images/demo-03.png b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-03.png new file mode 100644 index 00000000000..3ba01edb2e7 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/common-caching/images/demo-03.png differ diff --git a/docs/howtos/solutions/caching-architecture/common-caching/redis-gears.mdx b/docs/howtos/solutions/caching-architecture/common-caching/redis-gears.mdx new file mode 100644 index 00000000000..63aa94b09d8 --- /dev/null +++ b/docs/howtos/solutions/caching-architecture/common-caching/redis-gears.mdx @@ -0,0 +1,78 @@ +### What is RedisGears? + +RedisGears is a programmable serverless engine for transaction, batch, and event-driven data processing allowing users to write and run their own functions on data stored in Redis. + +Functions can be implemented in different languages, including Python and C, and can be executed by the RedisGears engine in one of two ways: + +1. **Batch**: triggered by the [Run](https://oss.redislabs.com/redisgears/functions.html#run) action, execution is immediate and on existing data +2. **Event**: triggered by the [Register](https://oss.redislabs.com/redisgears/functions.html#register) action, execution is triggered by new events and on their data + +Some **batch** type operations RedisGears can do: + +- Run an operation on all keys in the KeySpace or keys matching a certain pattern like : + - Prefix all KeyNames with `person:` + - Delete all keys whose value is smaller than zero + - Write all the KeyNames starting with `person:` to a set +- Run a set of operations on all(or matched) keys where the output of one operation is the input of another like + - Find all keys with a prefix `person:` (assume all of them are of type hash) + - Increase user's days_old by 1, then sum them by age group (10-20, 20-30 etc.) + - Add today's stats to the sorted set of every client, calculate last 7 days average and save the computed result in a string + +Some **event** type operations RedisGears can do: + +- RedisGears can also register event listeners that will trigger a function execution every time a watched key is changed like + - Listen for all operations on all keys and keep a list of all KeyNames in the KeySpace + - Listen for DEL operations on keys with a prefix `I-AM-IMPORTANT:` and asynchronously dump them in a "deleted keys" log file + - Listen for all HINCRBY operations on the element score of keys with a prefix `player:` and synchronously update a user's level when the score reaches 1000 + +### How do I use RedisGears? 
+
+Run the Docker container:
+
+```sh
+docker run -d --name redisgears -p 6379:6379 redislabs/redisgears:latest
+```
+
+For a very simple example that lists all keys in your Redis database with a prefix of `person:`, create the following Python script and name it `hello_gears.py`:
+
+```python
+gb = GearsBuilder()
+gb.run('person:*')
+```
+
+Execute your function:
+
+```sh
+docker exec -i redisgears redis-cli RG.PYEXECUTE "`cat hello_gears.py`"
+```
+
+### Using gears-cli
+
+The gears-cli tool provides an easier way to execute RedisGears functions, especially if you need to pass some parameters too.
+
+It's written in Python and can be installed with `pip`:
+
+```sh
+pip install gears-cli
+```
+
+```sh
+gears-cli hello_gears.py REQUIREMENTS rgsync
+```
+
+Usage:
+
+```sh
+gears-cli --help
+usage: gears-cli [-h] [--host HOST] [--port PORT]
+[--requirements REQUIREMENTS] [--password PASSWORD] path [extra_args [extra_args ...]]
+```
+
+### RedisGears references
+
+- [RedisGears docs](https://oss.redis.com/redisgears/)
+- [rgsync docs](https://github.com/RedisGears/rgsync)
+- [Installing RedisGears](https://docs.redis.com/latest/modules/redisgears/installing-redisgears/)
+- [Introduction to RedisGears blog](https://redis.com/blog/introduction-to-redisgears/)
+- [RedisGears GA - RedisConf 2020 video](https://www.youtube.com/watch?v=J4clHQJScZQ&list=PL83Wfqi-zYZFvs80ncPAPHt-CEuimHl6Q&index=3)
+- [Conference talk video by creator of RedisGears](https://www.youtube.com/watch?v=6SGWx5DtoCQ)
+- [Redis Gears sync with MongoDB](https://github.com/RedisGears/rgsync/tree/master/examples/mongo)
diff --git a/docs/howtos/solutions/caching-architecture/common-caching/source-code-movie-app.mdx b/docs/howtos/solutions/caching-architecture/common-caching/source-code-movie-app.mdx
new file mode 100644
index 00000000000..15d9da397ea
--- /dev/null
+++ b/docs/howtos/solutions/caching-architecture/common-caching/source-code-movie-app.mdx
@@ -0,0 +1,9 @@
+:::tip GITHUB CODE
+
+Below are the commands to clone the source code (frontend and backend) for the application used in this tutorial:
+
+git clone https://github.com/redis-developer/ebook-speed-mern-frontend.git
+
+git clone https://github.com/redis-developer/ebook-speed-mern-backend.git
+
+:::
diff --git a/docs/howtos/solutions/caching-architecture/common-caching/write-behind-vs-write-through.mdx b/docs/howtos/solutions/caching-architecture/common-caching/write-behind-vs-write-through.mdx
new file mode 100644
index 00000000000..07ec2172541
--- /dev/null
+++ b/docs/howtos/solutions/caching-architecture/common-caching/write-behind-vs-write-through.mdx
@@ -0,0 +1,6 @@
+There are two related write patterns, and the main differences between them are as follows:
+
+| Write Behind | Write through |
+| ----------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------- |
+| Syncs data asynchronously | Syncs data synchronously/ immediately |
+| Data between the cache and the system of record (database) is **inconsistent for a short time** | Data between the cache and the system of record (database) is always **consistent** |
diff --git a/docs/howtos/solutions/caching-architecture/write-behind/images/mongo-compass.png b/docs/howtos/solutions/caching-architecture/write-behind/images/mongo-compass.png
new file mode 100644
index 00000000000..93bb5fe9037
Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-behind/images/mongo-compass.png differ
diff --git
a/docs/howtos/solutions/caching-architecture/write-behind/images/pattern.jpg b/docs/howtos/solutions/caching-architecture/write-behind/images/pattern.jpg new file mode 100644 index 00000000000..cefcd565fcc Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-behind/images/pattern.jpg differ diff --git a/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight-stream.png b/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight-stream.png new file mode 100644 index 00000000000..d90ec9432ff Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight-stream.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight.png b/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight.png new file mode 100644 index 00000000000..20e90655783 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-behind/images/redis-insight.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-behind/index-write-behind.mdx b/docs/howtos/solutions/caching-architecture/write-behind/index-write-behind.mdx new file mode 100644 index 00000000000..319aa67891f --- /dev/null +++ b/docs/howtos/solutions/caching-architecture/write-behind/index-write-behind.mdx @@ -0,0 +1,211 @@ +--- +id: index-write-behind +title: How to use Redis for Write-behind Caching +sidebar_label: How to use Redis for Write-behind Caching +slug: /howtos/solutions/caching-architecture/write-behind +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import RedisGears from '../common-caching/redis-gears.mdx'; +import CachingMovieAppDesign from '../common-caching/caching-movie-app.mdx'; +import SourceCodeMovieApp from '../common-caching/source-code-movie-app.mdx'; +import WritePatternDifferences from '../common-caching/write-behind-vs-write-through.mdx'; + +import mongoCompassImage from './images/mongo-compass.png'; + + + + + +## What is write-behind caching? + +Imagine you've built a movie streaming app. You used MongoDB as your data store, and as you needed to scale you implemented caching using Redis. This allows you to drastically speed up reads. However, now you are experiencing slowness when writing to MongoDB. + +For example, maybe you want to allow users to continue watching movies where they last left off. This requires you to store the timestamp of where a user is when they decide to pause the movie. With millions of users, this is starting to cause MongoDB to slow down when you have peaks in demand. + +You need a way of flattening the peaks in demand, allowing you to write data quickly and then persist it to MongoDB when the demand dies down. What you need is called the "write-behind pattern." + +The pattern is simple, your application writes data to Redis and then asynchronously data gets written to MongoDB. Write operations are queued up so that the application can move on quickly and the cache can catch up over time. However, this does mean there is a short time when the data between the cache and the system of record is inconsistent. + +Below is a diagram of the write-behind pattern for the application: + +![write-behind-pattern using Redis in a movie streaming application](./images/pattern.jpg) + +The pattern works as follows: + +1. The application reads and writes data to Redis. +1. Redis syncs any changed data to the MongoDB database asynchronously. 
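+
+As a concrete illustration of step 1, here is a minimal sketch of the application writing watch progress to Redis only, using the [node-redis](https://github.com/redis/node-redis) client. The `progress:<userId>:<movieId>` key and its fields are hypothetical; the asynchronous sync to MongoDB in step 2 is handled separately, by the RedisGears recipe shown later in this tutorial.
+
+```js
+// Write-behind from the application's point of view. Assumptions: node-redis v4 and a
+// hypothetical `progress:<userId>:<movieId>` key; MongoDB is never touched on this path.
+import { createClient } from 'redis';
+
+const redis = createClient({ url: 'redis://localhost:6379' });
+await redis.connect();
+
+async function saveWatchProgress(userId, movieId, positionSeconds) {
+  // Returns as soon as Redis acknowledges the write, which keeps requests fast during
+  // peaks in demand; the system of record catches up asynchronously (step 2).
+  await redis.hSet(`progress:${userId}:${movieId}`, {
+    positionSeconds: String(positionSeconds),
+    updatedAt: new Date().toISOString(),
+  });
+}
+
+await saveWatchProgress('user-42', 'movie-7', 4125);
+```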
+
+
+
+Learn more about [Write through pattern](/howtos/solutions/caching-architecture/write-through)
+
+## Why you should use Redis for write-behind caching
+
+Consider Redis with this pattern when you need to:
+
+1. **Flatten peaks in demand**: Under stress, an application may need to write data quickly. If your application needs to perform a large number of write operations at high speed, consider Redis. The programmability capabilities of Redis make sure the data stored in the cache is synced with the database.
+1. **Batch multiple writes**: Sometimes it's expensive to write to a database frequently (for example, logging). In those cases, it can be cost-effective to batch the database writes with Redis so that data syncs at intervals.
+1. **Offload the primary database**: Database load is reduced when heavy writes operate on Redis, so writes can be spread out to improve performance during peak application usage.
+
+## Redis programmability for write-behind caching using RedisGears
+
+:::tip
+
+You can skip reading this section if you are already familiar with RedisGears.
+
+:::
+
+
+
+## Write behind caching in a NodeJS application with Redis and MongoDB
+
+### Demo application
+
+
+
+
+
+
+To demonstrate this pattern using the movie application, imagine that the user opens the pop-up to add a new movie.
+
+![demo-02](../common-caching/images/demo-02.png)
+
+Instead of the application immediately storing the data in MongoDB, the application writes the changes to Redis. In the background, RedisGears automatically synchronizes the data with the MongoDB database.
+
+### Programming Redis using the write-behind pattern
+
+Developers need to load some code (Python in our example) to the Redis server before using the write-behind pattern (which syncs data from Redis to MongoDB). The Redis server has a RedisGears module that interprets the Python code and syncs the data from Redis to MongoDB.
+
+Loading the Python code is easier than it sounds. Simply replace database details in the Python file and then load the file to the Redis server.
+
+Create the Python file (shown below, and [available online](https://github.com/redis-developer/ebook-speed-mern-backend/blob/main/data/write-behind/movies-write-behind.py)). Then update the MongoDB connection details, database, collection, and primary key name to sync.
+
+```python title="movies-write-behind.py"
+# Gears Recipe for a single write behind
+
+# import redis gears & mongo db libs
+from rgsync import RGJSONWriteBehind, RGJSONWriteThrough
+from rgsync.Connectors import MongoConnector, MongoConnection
+
+# change mongodb connection (admin)
+# mongodb://usrAdmin:passwordAdmin@10.10.20.2:27017/dbSpeedMernDemo?authSource=admin
+mongoUrl = 'mongodb://usrAdmin:passwordAdmin@10.10.20.2:27017/admin'
+
+# MongoConnection(user, password, host, authSource?, fullConnectionUrl?)
+connection = MongoConnection('', '', '', '', mongoUrl)
+
+# change MongoDB database
+db = 'dbSpeedMernDemo'
+
+# change MongoDB collection & its primary key
+movieConnector = MongoConnector(connection, db, 'movies', 'movieId')
+
+# change redis keys with prefix that must be synced with mongodb collection
+RGJSONWriteBehind(GB, keysPrefix='MovieEntity',
+                  connector=movieConnector, name='MoviesWriteBehind',
+                  version='99.99.99')
+```
+
+:::tip What is a RedisGears recipe?
+
+A collection of RedisGears functions and any dependencies they may have that implement a high-level functional purpose is called a `recipe`.
+Example : "RGJSONWriteBehind" function in above python code + +::: + +There are two ways to load that Python file into the Redis server: + +1. Using the gears command-line interface (CLI) + +Find more information about the Gears CLI at [gears-cli](https://github.com/RedisGears/gears-cli) and [rgsync](https://github.com/RedisGears/rgsync#running-the-recipe). + +```sh +# install +pip install gears-cli +``` + +```sh +# If python file is located at “/users/tom/movies-write-behind.py” +gears-cli --host --port --password run /users/tom/movies-write-behind.py REQUIREMENTS rgsync pymongo==3.12.0 +``` + +2. Using the RG.PYEXECUTE from the Redis command line. + +Find more information at [RG.PYEXECUTE](https://oss.redis.com/redisgears/commands.html#rgpyexecute). + +```sh +# Via redis cli +RG.PYEXECUTE 'pythonCode' REQUIREMENTS rgsync pymongo==3.12.0 +``` + +The RG.PYEXECUTE command can also be executed from the Node.js code +(Consult [the sample Node file](https://github.com/redis-developer/ebook-speed-mern-backend/blob/main/data/write-behind/wb-main.js) for more details) + +Find more examples at [Redis Gears sync with MongoDB](https://github.com/RedisGears/rgsync/tree/master/examples/mongo). + +### Verifying the write-behind pattern using RedisInsight + +:::tip + +RedisInsight is the free redis GUI for viewing data in redis. [Click here to download.](https://redis.com/redis-enterprise/redis-insight/) + +::: + +The next step is to verify that RedisGears is syncing data between Redis and MongoDB. + +Insert a key starting with the prefix (that's specified in the Python file) using the Redis CLI + +![redis-insight](./images/redis-insight.png) + +Next, confirm that the JSON is inserted in MongoDB too. + +mongo-compass + +You can also check [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to verify that the data is piped in via Streams for its consumers (like RedisGears). + +![redis-insight-stream](./images/redis-insight-stream.png) + +How does all that work with the demo application? Below is a code snipped to insert a movie. Once data is written to Redis, RedisGears automatically synchronizes it to MongoDB. + +```js title="BEFORE (using MongoDB)" +... +//(Node mongo query) +if (movie) { + //insert movie to MongoDB + await db.collection("movies") + .insertOne(movie); +} +... +``` + +```js title="AFTER (using Redis)" +... +//(Redis OM Node query) +if (movie) { + const entity = repository.createEntity(movie); + //insert movie to Redis + await moviesRepository.save(entity); +} +... +``` + +## Ready to use Redis for write-behind caching? + +You now know how to use Redis for write-behind caching. It's possible to incrementally adopt Redis wherever needed with different strategies/patterns. For more resources on the topic of caching, check out the links below: + +## Additional resources + +- Caching with Redis + - [Write through caching](/howtos/solutions/caching-architecture/write-through) + - [Cache prefetching](/howtos/solutions/caching-architecture/cache-prefetching) + - [Query caching](/howtos/solutions/microservices/caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. 
+- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/01-adminer-login.png b/docs/howtos/solutions/caching-architecture/write-through/images/01-adminer-login.png new file mode 100644 index 00000000000..c0947c0a491 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/01-adminer-login.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/02-adminer-table-creation.png b/docs/howtos/solutions/caching-architecture/write-through/images/02-adminer-table-creation.png new file mode 100644 index 00000000000..6bc39e637f8 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/02-adminer-table-creation.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/03-redis-hash-insert.png b/docs/howtos/solutions/caching-architecture/write-through/images/03-redis-hash-insert.png new file mode 100644 index 00000000000..32df456a20f Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/03-redis-hash-insert.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/04-redis-hash-view.png b/docs/howtos/solutions/caching-architecture/write-through/images/04-redis-hash-view.png new file mode 100644 index 00000000000..efd0cc0c43b Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/04-redis-hash-view.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/05-adminer-hash-view.png b/docs/howtos/solutions/caching-architecture/write-through/images/05-adminer-hash-view.png new file mode 100644 index 00000000000..dfef46f2cb5 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/05-adminer-hash-view.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/06-redis-hash-update.png b/docs/howtos/solutions/caching-architecture/write-through/images/06-redis-hash-update.png new file mode 100644 index 00000000000..07fdfe482c4 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/06-redis-hash-update.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/07-redis-hash-updated-view.png b/docs/howtos/solutions/caching-architecture/write-through/images/07-redis-hash-updated-view.png new file mode 100644 index 00000000000..27acf63d3ed Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/07-redis-hash-updated-view.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/08-adminer-updated-hash-view.png b/docs/howtos/solutions/caching-architecture/write-through/images/08-adminer-updated-hash-view.png new file mode 100644 index 00000000000..fab10db3537 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/08-adminer-updated-hash-view.png differ diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/mongo-compass.png b/docs/howtos/solutions/caching-architecture/write-through/images/mongo-compass.png new file mode 100644 index 00000000000..93bb5fe9037 Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/mongo-compass.png differ 
diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/pattern-wt.png b/docs/howtos/solutions/caching-architecture/write-through/images/pattern-wt.png
new file mode 100644
index 00000000000..f38123726fe
Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/pattern-wt.png differ
diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight-stream.png b/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight-stream.png
new file mode 100644
index 00000000000..d90ec9432ff
Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight-stream.png differ
diff --git a/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight.png b/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight.png
new file mode 100644
index 00000000000..20e90655783
Binary files /dev/null and b/docs/howtos/solutions/caching-architecture/write-through/images/redis-insight.png differ
diff --git a/docs/howtos/solutions/caching-architecture/write-through/index-write-through.mdx b/docs/howtos/solutions/caching-architecture/write-through/index-write-through.mdx
new file mode 100644
index 00000000000..e28ee0fde78
--- /dev/null
+++ b/docs/howtos/solutions/caching-architecture/write-through/index-write-through.mdx
@@ -0,0 +1,291 @@
+---
+id: index-write-through
+title: How to use Redis for Write through caching strategy
+sidebar_label: How to use Redis for Write through caching strategy
+slug: /howtos/solutions/caching-architecture/write-through
+authors: [prasan, will]
+---
+
+import Authors from '@theme/Authors';
+import RedisGears from '../common-caching/redis-gears.mdx';
+import CachingMovieAppDesign from '../common-caching/caching-movie-app.mdx';
+import SourceCodeMovieApp from '../common-caching/source-code-movie-app.mdx';
+import WritePatternDifferences from '../common-caching/write-behind-vs-write-through.mdx';
+
+import mongoCompassImage from './images/mongo-compass.png';
+import redisHashView from './images/04-redis-hash-view.png';
+import redisHashUpdatedView from './images/07-redis-hash-updated-view.png';
+
+
+
+## What is write-through caching?
+
+Imagine you've built a movie streaming app. You used PostgreSQL as your data store, and as you needed to scale you implemented caching using Redis. However, you are now finding that updates to user profiles and subscriptions are slow to be reflected in the app.
+
+For example, when a user purchases or modifies a subscription, they expect the change to be reflected on their account immediately, so that the movies and shows included in the new subscription are available to watch right away.
+You need a way of quickly providing strong consistency for user data. In such a situation, what you need is called the "write-through pattern."
+
+With the **write-through** pattern, every time the application writes data to the cache, it also updates the corresponding records in the database. Unlike the [Write behind](/howtos/solutions/caching-architecture/write-behind) pattern, the thread waits until the write to the database has also completed.
+
+Below is a diagram of the write-through pattern for the application:
+
+![write-through-pattern using Redis in a movie streaming application](./images/pattern-wt.png)
+
+The pattern works as follows:
+
+1. The application reads and writes data to Redis.
+1. Redis syncs any changed data to the PostgreSQL database **synchronously/immediately**.
+
+Note: the **Redis server is blocked** until a response is received from the main database.
+
+
+
+
+Learn more about the [write-behind pattern](/howtos/solutions/caching-architecture/write-behind)
+
+## Why you should use Redis for write-through caching
+
+Write-through caching with Redis ensures that the cache of critical data is always up to date with the database, providing **strong consistency** and **improving application performance**.
+
+Consider the following scenarios from different types of applications:
+
+- **E-commerce application**: In an e-commerce application, write-through caching can be used to ensure consistency of product inventory. Whenever a customer purchases a product, the inventory count should be updated immediately to avoid overselling. Redis can be used to cache the inventory count, and every update to the count can be written through to the database. This ensures that the inventory count in the cache is always up-to-date, and customers are not able to purchase items that are out of stock.
+
+- **Banking application**: In a banking application, write-through caching can be used to ensure consistency of account balances. Whenever a transaction is made, the account balance should be updated immediately to avoid overdrafts or other issues. Redis can be used to cache the account balances, and every transaction can be written through to the database. This ensures that the balance in the cache is always up-to-date, and transactions can be processed with strong consistency.
+
+- **Online gaming platform**: Suppose you have an online gaming platform where users can play games against each other. With write-through caching, any changes made to a user's score or game state would be saved to the database and also cached in Redis. This ensures that any subsequent reads for that user's score or game state hit the cache first. This helps to reduce the load on the database and ensures that the game state displayed to users is always up-to-date.
+
+- **Claims processing system**: In an insurance claims processing system, claims data needs to be consistent and up-to-date across different systems and applications. With write-through caching in Redis, new claims data can be written to both the database and the Redis cache. This ensures that different applications always have the most up-to-date information about the claims, making it easier for claims adjusters to access the information they need to process claims more quickly and efficiently.
+
+- **Healthcare applications**: In healthcare applications, patient data needs to be consistent and up-to-date across different systems and applications. With write-through caching in Redis, updated patient data can be written to both the database and the Redis cache, ensuring that different applications always have the latest patient information. This can help improve patient care by providing accurate and timely information to healthcare providers.
+
+- **Social media application**: In a social media application, write-through caching can be used to ensure consistency of user profiles. Whenever a user updates their profile, the changes should be reflected immediately to avoid showing outdated information to other users. Redis can be used to cache the user profiles, and every update can be written through to the database. This ensures that the profile information in the cache is always up-to-date, and users can see accurate information about each other.
+
+## Redis programmability for write-through caching using RedisGears
+
+:::tip
+
+You can skip reading this section if you are already familiar with RedisGears.
+
+:::
+
+
+
+
+### Programming Redis using the write-through pattern
+
+For our sample code, we will demonstrate writing `users` to Redis and then writing through to PostgreSQL. Use the docker-compose.yml file below to set up the required environment:
+
+```yaml title="docker-compose.yml"
+version: '3.9'
+services:
+  redis:
+    container_name: redis
+    image: 'redislabs/redismod:latest'
+    ports:
+      - 6379:6379
+    deploy:
+      replicas: 1
+      restart_policy:
+        condition: on-failure
+  postgres:
+    image: postgres
+    restart: always
+    environment:
+      POSTGRES_USER: root
+      POSTGRES_PASSWORD: password
+      POSTGRES_DB: example
+  adminer:
+    image: adminer
+    restart: always
+    ports:
+      - 8080:8080
+```
+
+To run the docker-compose file, run the following command:
+
+```shell
+$ docker compose up -d
+```
+
+This will create a `Redis` server, a `PostgreSQL` server, and an `Adminer` server. Adminer is a web-based database management tool that allows you to view and edit data in your database.
+
+Next, open your browser to [http://localhost:8080/?pgsql=postgres&username=root&db=example&ns=public&sql=](http://localhost:8080/?pgsql=postgres&username=root&db=example&ns=public&sql=). You will have to enter the password (which is `password` in the example above):
+
+![adminer-login](./images/01-adminer-login.png)
+
+You will then be taken to a SQL command page. Run the following SQL command to create a table:
+
+```sql title="users.sql"
+CREATE TABLE users (
+    id SERIAL PRIMARY KEY,
+    username VARCHAR(255) UNIQUE NOT NULL,
+    email VARCHAR(255) UNIQUE NOT NULL,
+    password_hash VARCHAR(255) NOT NULL,
+    first_name VARCHAR(255),
+    last_name VARCHAR(255),
+    date_of_birth DATE,
+    created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
+);
+```
+
+![adminer-table-creation](./images/02-adminer-table-creation.png)
+
+Before using the write-through pattern (which syncs data from Redis to the system of record), developers need to load some code (Python, in our example) into the Redis server. The Redis server's RedisGears module interprets the Python code and syncs the data from Redis to the system of record.
+
+Now we need to create a RedisGears recipe that writes through to the PostgreSQL database. The following Python code does exactly that:
+
+```python title="write-through.py"
+from rgsync import RGWriteThrough
+from rgsync.Connectors import PostgresConnector, PostgresConnection
+
+'''
+Create Postgres connection object
+'''
+connection = PostgresConnection('root', 'password', 'postgres:5432/example')
+
+'''
+Create Postgres users connector
+'''
+usersConnector = PostgresConnector(connection, 'users', 'id')
+
+usersMappings = {
+    'username': 'username',
+    'email': 'email',
+    'pwhash': 'password_hash',
+    'first': 'first_name',
+    'last': 'last_name',
+    'dob': 'date_of_birth',
+    'created_at': 'created_at',
+    'updated_at': 'updated_at',
+}
+
+RGWriteThrough(GB, keysPrefix='__', mappings=usersMappings,
+               connector=usersConnector, name='UsersWriteThrough', version='99.99.99')
+```
+
+Make sure you create the file "write-through.py", because the next instructions will use it. For the purposes of this example, we are showing how to map Redis hash fields to PostgreSQL table columns.
The `RGWriteThrough` function takes in the `usersMappings` object, where the keys are the Redis hash fields and the values are the PostgreSQL table columns.
+
+:::tip What is a RedisGears recipe?
+
+A collection of RedisGears functions, along with any dependencies they may have, that implements a high-level functional purpose is called a `recipe`.
+Example: the `RGWriteThrough` function in the Python code above.
+
+:::
+
+The Python file has a few dependencies in order to work. Below is the requirements.txt file that contains the dependencies; create it alongside the "write-through.py" file:
+
+```text title="requirements.txt"
+rgsync
+psycopg2-binary
+cryptography
+```
+
+There are two ways (the gears CLI and RG.PYEXECUTE) to load that Python file into the Redis server:
+
+1. Using the gears command-line interface (CLI)
+
+Find more information about the Gears CLI at [gears-cli](https://github.com/RedisGears/gears-cli) and [rgsync](https://github.com/RedisGears/rgsync#running-the-recipe).
+
+```shell
+# install
+pip install gears-cli
+```
+
+To run our write-through recipe using `gears-cli`, we need to run the following command:
+
+```shell
+$ gears-cli run --host localhost --port 6379 write-through.py --requirements requirements.txt
+```
+
+You should get a response that says "OK". That is how you know you have successfully loaded the Python file into the Redis server.
+
+:::tip
+
+If you are on Windows, we recommend you use WSL to install and use gears-cli.
+
+:::
+
+2. Using RG.PYEXECUTE from the Redis command line
+
+```shell
+# Via redis cli
+RG.PYEXECUTE 'pythonCode' REQUIREMENTS rgsync psycopg2-binary cryptography
+```
+
+:::tip
+
+The RG.PYEXECUTE command can also be executed from Node.js code.
+(Consult [the sample Node file](https://github.com/redis-developer/ebook-speed-mern-backend/blob/main/data/write-through/wt-main.js) for more details.)
+
+:::
+
+:::tip
+
+Find more examples in the [Redis Gears GitHub repository](https://github.com/RedisGears/rgsync/tree/master/examples/).
+
+:::
+
+### Verifying the write-through pattern using RedisInsight
+
+:::tip
+
+RedisInsight is the free Redis GUI for viewing data in Redis. [Click here to download.](https://redis.com/redis-enterprise/redis-insight/)
+
+:::
+
+The next step is to verify that RedisGears is syncing data between Redis and PostgreSQL. Note that in our Python file we specified a prefix for the keys. In this case, we specified `__` as the prefix, `users` as the table, and `id` as the unique identifier. This instructs RedisGears to look for the following key format: `__{users:<id>}` (for example, `__{users:1}`). Try running the following command in the Redis command line:
+
+```
+hset __{users:1} username john email john@gmail.com pwhash d1e8a70b5ccab1dc2f56bbf7e99f064a660c08e361a35751b9c483c88943d082 first John last Doe dob 1990-01-01 created_at 2023-04-20 updated_at 2023-04-20
+```
+
+![redis-hash-insert](./images/03-redis-hash-insert.png)
+
+Check [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to verify that the hash value made it into Redis. After RedisGears is done processing the `__{users:1}` key, it will be deleted from Redis and replaced by the `users:1` key. Check RedisInsight to verify that the `users:1` key is in Redis.
+
+redis-hash-view
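+
+You can also issue the same write from application code instead of redis-cli. Below is a small, illustrative sketch using [Node Redis](https://github.com/redis/node-redis); the client setup and the `writeUserThroughCache` function name are ours, while the key and fields mirror the `hset` command above, so the same write-through recipe picks it up.
+
+```typescript
+import { createClient } from 'redis';
+
+async function writeUserThroughCache() {
+  const client = createClient({ url: 'redis://localhost:6379' });
+  await client.connect();
+
+  // Writing to the __{users:1} key triggers the RGWriteThrough recipe, which
+  // creates users:1 in Redis and writes the mapped row to PostgreSQL.
+  await client.hSet('__{users:1}', {
+    username: 'john',
+    email: 'john@gmail.com',
+    pwhash: 'd1e8a70b5ccab1dc2f56bbf7e99f064a660c08e361a35751b9c483c88943d082',
+    first: 'John',
+    last: 'Doe',
+    dob: '1990-01-01',
+    created_at: '2023-04-20',
+    updated_at: '2023-04-20',
+  });
+
+  await client.quit();
+}
+
+writeUserThroughCache().catch(console.error);
+```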
+
+Next, confirm that the user was inserted in PostgreSQL too by opening up the [select page in Adminer](http://localhost:8080/?pgsql=postgres&username=root&db=example&ns=public&select=users). You should see the user inserted in the table.
+
+![adminer-hash-view](./images/05-adminer-hash-view.png)
+
+This is how you can use RedisGears to write through to PostgreSQL. So far we have only added a hash key, but you can also update specific hash fields and the change will be reflected in your PostgreSQL database. Run the following command to update the `username` field:
+
+```
+> hset __{users:1} username bar
+```
+
+![redis-hash-update](./images/06-redis-hash-update.png)
+
+In RedisInsight, verify that the `username` field has been updated:
+
+redis-hash-updated-view
+
+Now go into Adminer and check the `username` field. You should see that it has been updated to `bar`.
+
+![adminer-updated-hash-view](./images/08-adminer-updated-hash-view.png)
+
+## Ready to use Redis for write-through caching?
+
+You now know how to use Redis for write-through caching. It's possible to incrementally adopt Redis wherever needed with different strategies and patterns. For more resources on the topic of caching, check out the links below:
+
+## Additional resources
+
+- Caching with Redis
+  - [Write behind caching](/howtos/solutions/caching-architecture/write-behind)
+  - [Cache prefetching](/howtos/solutions/caching-architecture/cache-prefetching)
+  - [Query caching](/howtos/solutions/microservices/caching)
+- [Redis YouTube channel](https://www.youtube.com/c/Redisinc)
+- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications.
+- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/): To view your Redis data or to play with raw Redis commands in the workbench
+- [Try Redis Cloud for free](https://redis.com/try-free/)
diff --git a/docs/howtos/solutions/fraud-detection/common-fraud/source-code-tip.mdx b/docs/howtos/solutions/fraud-detection/common-fraud/source-code-tip.mdx
new file mode 100644
index 00000000000..73d410d9bd6
--- /dev/null
+++ b/docs/howtos/solutions/fraud-detection/common-fraud/source-code-tip.mdx
@@ -0,0 +1,7 @@
+:::tip GITHUB CODE
+
+Below is a command to clone the source code for the application used in this tutorial:
+
+git clone --branch v3.0.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions
+
+:::
diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-1.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-1.png
new file mode 100644
index 00000000000..a04aad14863
Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-1.png differ
diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-3.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-3.png
new file mode 100644
index 00000000000..c7dc1e3b6bc
Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-1-3.png differ
diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-1.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-1.png
new file mode 100644
index 00000000000..0b80f21713b
Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-1.png differ
diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-3.png
b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-3.png new file mode 100644 index 00000000000..369478bebb9 Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-3.png differ diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-4.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-4.png new file mode 100644 index 00000000000..5d4f6a708d2 Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/code-flow-2-4.png differ diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-1.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-1.png new file mode 100644 index 00000000000..f443939c41d Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-1.png differ diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-2.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-2.png new file mode 100644 index 00000000000..b27a4d0e1bc Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity-code-flow-2.png differ diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity.png b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity.png new file mode 100644 index 00000000000..3fc99bae85e Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/digital-identity-validation/images/digital-identity.png differ diff --git a/docs/howtos/solutions/fraud-detection/digital-identity-validation/index-digital-identity-validation.mdx b/docs/howtos/solutions/fraud-detection/digital-identity-validation/index-digital-identity-validation.mdx new file mode 100644 index 00000000000..13dc7367118 --- /dev/null +++ b/docs/howtos/solutions/fraud-detection/digital-identity-validation/index-digital-identity-validation.mdx @@ -0,0 +1,522 @@ +--- +id: index-digital-identity-validation +title: How to Handle Digital Identity Validation Using Redis +sidebar_label: Redis for Digital Identity Validation +slug: /howtos/solutions/fraud-detection/digital-identity-validation +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import digitalIdentityImg from './images/digital-identity.png'; +import SourceCodeFraudDetection from '../common-fraud/source-code-tip.mdx'; + +import MicroservicesEcommerceDesign from '../../microservices/common-data/microservices-ecommerce.mdx'; +import MicroservicesArchitectureWithRedis from '../../microservices/common-data/microservices-arch-with-redis.mdx'; + + + + + +## Digital identity validation for fraud detection introduction + +As the digital landscape continues to evolve, the need for robust security measures to protect users and organizations becomes ever more critical. Digital identity validation and fraud detection are essential components of any comprehensive security plan. This article will explore the significance of digital identity validation, the challenges faced in this area, and a solution to digital identity validation using redis. 
+
+## Know Your Customer (KYC)
+
+"Know Your Customer" (KYC) regulations refer to a set of policies and procedures that financial institutions and other regulated businesses must follow to verify the identity of their customers. Customer details include name, address, date of birth, and other government-issued identification documents.
+
+As part of KYC, businesses must assess the potential risk posed by each customer and conduct **ongoing monitoring** of their transactions and behaviour to detect any suspicious activity. KYC regulations are enforced by regulatory authorities, and failure to comply can result in financial penalties and reputational damage.
+
+KYC regulations are intended to prevent money laundering, terrorist financing, and other illicit activities. Financial services companies are combating the use of stolen identity information by reducing reliance on static methods for verifying identity (Knowledge-Based Authentication, or KBA) and instead moving to digital identities.
+
+## What is digital identity?
+
+Digital identity refers to the collection of attributes and identifiers that represent an individual online. These may include names, email addresses, phone numbers, usernames, and biometrics, among others. Digital identity validation is the process of verifying that these attributes are accurate and belong to the entity they claim to represent.
+
+Identity validation is crucial because it helps establish trust in digital environments, where face-to-face interaction is often not possible. It ensures that the parties involved in a transaction are who they claim to be, minimizing the risk of fraud, identity theft, and other cyber crimes.
+
+Digital Identity attributes
+
+Digital identities consist of two parts:
+
+- **Static** data: personally identifiable information (PII) such as name, address, and biometrics
+- **Dynamic** data: behavioural and contextual information such as browsing history, device type, and location data. Dynamic digital identities are constantly updated based on the information available from each digital transaction.
+
+Companies must monitor every customer transaction and behaviour, then use stored digital identities to score the **risk** and identify **possible suspicious activity** for a given transaction.
+
+## Why you should use Redis for digital identity validation
+
+The following are the primary requirements of a storage layer for digital identities:
+
+- Must maintain **real-time read latency** to fit within the transaction SLA.
+- Must have a **flexible data model** to store multiple unstructured data types such as behavioural, transactional, location, social/mobile and more.
+
+These two factors make it difficult to use a traditional Relational Database Management System (RDBMS) to manage and validate digital identities in real time. While it is possible to use an RDBMS to store digital identities, it is not the best choice for real-time validation of a flexible data model.
+
+**Redis Cloud**, on the other hand, is optimized for high throughput, low latency, data flexibility, and real-time query performance, easily satisfying the first criterion. With **sub-millisecond latency** and hundreds of millions of operations per second across both read and write operations, it is well-suited for managing dynamic digital identity data. As the volume of data grows, we can expect near-linear scalability and 99.999% uptime with **Active-Active geo-replication**.
+
+Redis Cloud's flexible data model has native support for multiple data types, including **JSON, hashes, streams, graphs and more**. Additionally, it can process complex searches on structured and unstructured data, as well as filtering by numeric properties and geographical distances, making it easier to manage and query large datasets of digital identities.
+
+## Microservices architecture for an e-commerce application
+
+
+
+
+### Storing digital identities
+
+Given we're discussing a microservices application, it makes sense to use a microservice for managing digital identities. Consider the following workflow outlining how digital identities are stored and retrieved from Redis:
+
+![digital-identity-code-flow-01](./images/digital-identity-code-flow-1.png)
+
+:::note
+
+The demo application doesn't have a `login service`. All user sessions are currently authenticated in the `api gateway` service. So `login service` is synonymous with the `api gateway` with respect to the demo app.
+
+:::
+
+The demo application uses Redis Streams for interservice communication. The following outlines the workflow and the responsibilities of each service:
+
+1. `login service`: stores the (user) digital identity as an `INSERT_LOGIN_IDENTITY` stream entry in Redis
+   ![Login Identity in Transaction Stream](./images/code-flow-1-1.png)
+2. `digital identity service`: reads the identity from the `INSERT_LOGIN_IDENTITY` stream
+3. `digital identity service`: stores the identity as JSON in Redis
+   ![Login Identity as JSON](./images/code-flow-1-3.png)
+
+:::note
+
+For demo purposes, we are only using a few characteristics of a user's digital identity, like IP address, browser fingerprint, and session. In a real-world application you should store more characteristics, like location, device type, and prior actions taken, for better risk assessment and identity completeness.
+
+:::
+
+### Validating digital identities
+
+In an e-commerce application, validating digital identities happens at checkout time. You want to make sure the customer is who they say they are before you attempt to process their order. To validate digital identities, we need to calculate the digital identity score, starting in the `orders service`. The following outlines the workflow and the responsibilities of each service:
+
+![digital-identity-code-flow-02](./images/digital-identity-code-flow-2.png)
+
+1. `orders service`: stores the digital identity in a `CALCULATE_IDENTITY_SCORE` Redis stream event to calculate its identity score
+   ![Validation Identity in Transaction Stream](./images/code-flow-2-1.png)
+1. `digital identity service`: reads the identity from the `CALCULATE_IDENTITY_SCORE` stream event
+1. `digital identity service`: stores the identity with its calculated score as JSON
+   ![Validation Identity as JSON](./images/code-flow-2-4.png)
+
+:::tip Caveat
+
+Even though you may receive a score of “1”, this only means the identity matched 100% against the measured properties. We are only measuring digital aspects of the identity, which can be compromised. In a real-world scenario you would want to measure more characteristics, like location, device type, session, etc. This is in addition to other contextual information for a complete [transaction risk score](/howtos/solutions/fraud-detection/transaction-risk-scoring).
+ +::: + +## E-commerce application frontend using Next.js and Tailwind + + + + + +## Building a digital identity validation microservice with redis + +Now, let's go step-by-step through the process of storing, scoring, and validating digital identities using redis with some example code. For demo purposes, we are only using a few characteristics of a user's digital identity like IP address, browser fingerprint, and session. In a real-world application you should store more characteristics like location, device type, and prior actions taken for better risk assessment and identity completeness. + +### Storing digital identities in redis in a microservices architecture + +1. `login service`: stores the (user) digital identity as a `INSERT_LOGIN_IDENTITY` stream entry to redis + +```typescript +//addLoginToTransactionStream +const userId = 'USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3'; //from session +const sessionId = 'SES_94ff24a8-65b5-4795-9227-99906a43884e'; //from session +const persona = 'GRANDFATHER'; //from session + +const entry: ITransactionStreamMessage = { + action: TransactionStreamActions.INSERT_LOGIN_IDENTITY, + logMessage: `[${REDIS_STREAMS.CONSUMERS.IDENTITY}] Digital identity to be stored for the user ${userId}`, + userId, + persona, + sessionId, + + identityBrowserAgent: req.headers['user-agent'], + identityIpAddress: + req.headers['x-forwarded-for']?.toString() || req.socket.remoteAddress, + transactionPipeline: JSON.stringify(TransactionPipelines.LOGIN), +}; + +const nodeRedisClient = getNodeRedisClient(); +const streamKeyName = 'TRANSACTION_STREAM'; +const id = '*'; //* = auto generate +await nodeRedisClient.xAdd(streamKeyName, id, entry); +``` + +2. `digital identity service`: reads the identity from the `INSERT_LOGIN_IDENTITY` stream + +```typescript +interface ListenStreamOptions { + streams: { + streamKeyName: string; + eventHandlers: { + [messageAction: string]: IMessageHandler; + }; + }[]; + groupName: string; + consumerName: string; + maxNoOfEntriesToReadAtTime?: number; +} + +// Below is some code for how you would use redis to listen for the stream events: + +const listenToStreams = async (options: ListenStreamOptions) => { + /* + (A) create consumer group for the stream + (B) read set of messages from the stream + (C) process all messages received + (D) trigger appropriate action callback for each message + (E) acknowledge individual messages after processing + */ + const nodeRedisClient = getNodeRedisClient(); + if (nodeRedisClient) { + const streams = options.streams; + const groupName = options.groupName; + const consumerName = options.consumerName; + const readMaxCount = options.maxNoOfEntriesToReadAtTime || 100; + const idInitialPosition = '0'; //0 = start, $ = end or any specific id + const streamKeyIdArr: { + key: string; + id: string; + }[] = []; + + streams.map(async (stream) => { + LoggerCls.info( + `Creating consumer group ${groupName} in stream ${stream.streamKeyName}`, + ); + + try { + // (A) create consumer group for the stream + await nodeRedisClient.xGroupCreate( + stream.streamKeyName, + groupName, + idInitialPosition, + { + MKSTREAM: true, + }, + ); + } catch (err) { + LoggerCls.error( + `Consumer group ${groupName} already exists in stream ${stream.streamKeyName}!`, + ); + } + + streamKeyIdArr.push({ + key: stream.streamKeyName, + id: '>', // Next entry ID that no consumer in this group has read + }); + }); + + LoggerCls.info(`Starting consumer ${consumerName}.`); + + while (true) { + try { + // (B) read set of messages from different streams + 
const dataArr = await nodeRedisClient.xReadGroup( + commandOptions({ + isolated: true, + }), + groupName, + consumerName, + //can specify multiple streams in array [{key, id}] + streamKeyIdArr, + { + COUNT: readMaxCount, // Read n entries at a time + BLOCK: 5, //block for 0 (infinite) seconds if there are none. + }, + ); + + // dataArr = [ + // { + // name: 'streamName', + // messages: [ + // { + // id: '1642088708425-0', + // message: { + // key1: 'value1', + // }, + // }, + // ], + // }, + // ]; + + //(C) process all messages received + if (dataArr && dataArr.length) { + for (let data of dataArr) { + for (let messageItem of data.messages) { + const streamKeyName = data.name; + + const stream = streams.find( + (s) => s.streamKeyName == streamKeyName, + ); + + if (stream && messageItem.message) { + const streamEventHandlers = stream.eventHandlers; + const messageAction = messageItem.message.action; + const messageHandler = streamEventHandlers[messageAction]; + + if (messageHandler) { + // (D) trigger appropriate action callback for each message + await messageHandler(messageItem.message, messageItem.id); + } + //(E) acknowledge individual messages after processing + nodeRedisClient.xAck(streamKeyName, groupName, messageItem.id); + } + } + } + } else { + // LoggerCls.info('No new stream entries.'); + } + } catch (err) { + LoggerCls.error('xReadGroup error !', err); + } + } + } +}; + +// `listenToStreams` listens for events and calls the appropriate callback to further handle the events. +listenToStreams({ + streams: [ + { + streamKeyName: REDIS_STREAMS.STREAMS.TRANSACTIONS, + eventHandlers: { + [TransactionStreamActions.INSERT_LOGIN_IDENTITY]: insertLoginIdentity, + //... + }, + }, + ], + groupName: REDIS_STREAMS.GROUPS.IDENTITY, + consumerName: REDIS_STREAMS.CONSUMERS.IDENTITY, +}); +``` + +3. `digital identity service`: stores the identity as JSON to redis + +```typescript +const insertLoginIdentity: IMessageHandler = async ( + message: ITransactionStreamMessage, + messageId, +) => { + LoggerCls.info(`Adding digital identity to redis for ${message.userId}`); + + // add login digital identity to redis + const insertedKey = await addDigitalIdentityToRedis(message); + + //... +}; + +const addDigitalIdentityToRedis = async ( + message: ITransactionStreamMessage, +) => { + let insertedKey = ''; + + const userId = message.userId; + const digitalIdentity: IDigitalIdentity = { + action: message.action, + userId: userId, + sessionId: message.sessionId, + + ipAddress: message.identityIpAddress, + browserFingerprint: crypto + .createHash('sha256') + .update(message.identityBrowserAgent) + .digest('hex'), + identityScore: message.identityScore ? message.identityScore : '', + + createdOn: new Date(), + createdBy: userId, + statusCode: DB_ROW_STATUS.ACTIVE, + }; + + const repository = digitalIdentityRepo.getRepository(); + if (repository) { + const entity = repository.createEntity(digitalIdentity); + insertedKey = await repository.save(entity); + } + + return insertedKey; +}; +``` + +### Validating digital identities using redis in a microservices architecture + +1. 
`orders service`: stores the digital identity to be validated in a `CALCULATE_IDENTITY_SCORE` redis stream + +```typescript +//adding Identity To TransactionStream +const userId = 'USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3'; +const sessionId = 'SES_94ff24a8-65b5-4795-9227-99906a43884e'; +let orderDetails = { + orderId: '63f5f8dc3696d145a45775a6', + orderAmount: '1000', + userId: userId, + sessionId: sessionId, + orderStatus: 1, + products: order.products, //array of product details +}; + +const entry: ITransactionStreamMessage = { + action: 'CALCULATE_IDENTITY_SCORE', + logMessage: `Digital identity to be validated/ scored for the user ${userId}`, + userId: userId, + sessionId: sessionId, + orderDetails: orderDetails ? JSON.stringify(orderDetails) : '', + transactionPipeline: JSON.stringify(TransactionPipelines.CHECKOUT), + + identityBrowserAgent: req.headers['user-agent'], + identityIpAddress: + req.headers['x-forwarded-for']?.toString() || req.socket.remoteAddress, +}; + +const nodeRedisClient = getNodeRedisClient(); +const streamKeyName = 'TRANSACTION_STREAM'; +const id = '*'; //* = auto generate +await nodeRedisClient.xAdd(streamKeyName, id, entry); +``` + +2. `Digital identity service` reads the identity from the `CALCULATE_IDENTITY_SCORE` stream + +```typescript +listenToStreams({ + streams: [ + { + streamKeyName: REDIS_STREAMS.STREAMS.TRANSACTIONS, + eventHandlers: { + // ... + [TransactionStreamActions.CALCULATE_IDENTITY_SCORE]: + scoreDigitalIdentity, + }, + }, + ], + groupName: REDIS_STREAMS.GROUPS.IDENTITY, + consumerName: REDIS_STREAMS.CONSUMERS.IDENTITY, +}); + +const scoreDigitalIdentity: IMessageHandler = async ( + message: ITransactionStreamMessage, + messageId, +) => { + LoggerCls.info(`Scoring digital identity for ${message.userId}`); + + //step 1 - calculate score for validation digital identity + const identityScore = await calculateIdentityScore(message); + message.identityScore = identityScore.toString(); + + LoggerCls.info(`Adding digital identity to redis for ${message.userId}`); + //step 2 - add validation digital identity to redis + const insertedKey = await addDigitalIdentityToRedis(message); + + // ... 
+}; + +const calculateIdentityScore = async (message: ITransactionStreamMessage) => { + // Compare the "digital identity" with previously stored "login identities" and determine the identity score + + let identityScore = 0; + const repository = digitalIdentityRepo.getRepository(); + + if (message && message.userId && repository) { + let queryBuilder = repository + .search() + .where('userId') + .eq(message.userId) + .and('action') + .eq('INSERT_LOGIN_IDENTITY') + .and('statusCode') + .eq(DB_ROW_STATUS.ACTIVE); + + //console.log(queryBuilder.query); + const digitalIdentities = await queryBuilder.return.all(); + + if (digitalIdentities && digitalIdentities.length) { + //if browser details matches -> +1 score + const matchBrowserItems = digitalIdentities.filter((_digIdent) => { + let identityBrowserAgentHash = crypto + .createHash('sha256') + .update(message.identityBrowserAgent) + .digest('hex'); + return _digIdent.browserFingerprint == identityBrowserAgentHash; + }); + if (matchBrowserItems.length > 0) { + identityScore += 1; + } + + //if IP address matches -> +1 score + const matchIpAddressItems = digitalIdentities.filter((_digIdent) => { + return _digIdent.ipAddress == message.identityIpAddress; + }); + if (matchIpAddressItems.length > 0) { + identityScore += 1; + } + } + } + + //calculate average score + const noOfIdentityCharacteristics = 2; //2 == browserFingerprint, ipAddress + identityScore = identityScore / noOfIdentityCharacteristics; + return identityScore; // identityScore final value ranges between 0 (no match) and 1 (full match) +}; +``` + +3. `digital identity service`: stores the identity with score as JSON in redis + +```typescript +const addDigitalIdentityToRedis = async ( + message: ITransactionStreamMessage, +) => { + let insertedKey = ''; + + const userId = message.userId; + const digitalIdentity: IDigitalIdentity = { + action: message.action, + userId: userId, + sessionId: message.sessionId, + + ipAddress: message.identityIpAddress, + browserFingerprint: crypto + .createHash('sha256') + .update(message.identityBrowserAgent) + .digest('hex'), + identityScore: message.identityScore ? message.identityScore : '', + + createdOn: new Date(), + createdBy: userId, + statusCode: DB_ROW_STATUS.ACTIVE, //1 + }; + + const repository = digitalIdentityRepo.getRepository(); + if (repository) { + const entity = repository.createEntity(digitalIdentity); + insertedKey = await repository.save(entity); + } + + return insertedKey; +}; +``` + +## Conclusion + +Now you have learned how to use redis to setup ongoing digital identity monitoring and scoring in a microservices application. This is also called "dynamic digital identity monitoring." Dynamic digital identities are constantly updated based on the information available from each digital transaction. By analyzing these transactions, businesses can build a comprehensive and up-to-date digital identity that includes both static and dynamic elements. These identities can then be scored to determine the risk that they pose to the business. + +In addition to increasing security, digital identities can also improve the customer experience. By using the digital footprint left by a user, businesses can offer more personalized services and reduce friction in the authentication process. + +Digital identity systems are typically designed to be interoperable and scalable, allowing for seamless integration with various applications and platforms. 
+ +### Additional Resources + +- Redis Streams + - Explore streams in detail in the [Redis University course on Redis Streams](https://university.redis.com/courses/ru202/) + - Check out our e-book on [Understanding Streams in Redis and Kafka: A Visual Guide](https://redis.com/docs/understanding-streams-in-redis-and-kafka-a-visual-guide/) +- Fraud detection with Redis + - [Transaction Risk Scoring](/howtos/solutions/fraud-detection/transaction-risk-scoring) +- [Microservices with Redis](/howtos/solutions#microservices) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-bloom-filters.png b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-bloom-filters.png new file mode 100644 index 00000000000..9d6531d111e Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-bloom-filters.png differ diff --git a/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-profile-feature.png b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-profile-feature.png new file mode 100644 index 00000000000..10b79a0b6d4 Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-profile-feature.png differ diff --git a/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-transaction-stream.png b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-transaction-stream.png new file mode 100644 index 00000000000..2c6b8bcbe56 Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/redisinsight-transaction-stream.png differ diff --git a/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/transaction-risk-scoring-event-flow.png b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/transaction-risk-scoring-event-flow.png new file mode 100644 index 00000000000..5a3de444852 Binary files /dev/null and b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/images/transaction-risk-scoring-event-flow.png differ diff --git a/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/index-transaction-risk-scoring.mdx b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/index-transaction-risk-scoring.mdx new file mode 100644 index 00000000000..33f7f545de6 --- /dev/null +++ b/docs/howtos/solutions/fraud-detection/transaction-risk-scoring/index-transaction-risk-scoring.mdx @@ -0,0 +1,580 @@ +--- +id: index-transaction-risk-scoring +title: How to use Redis for Transaction risk scoring +sidebar_label: How to use Redis for Transaction risk scoring +slug: /howtos/solutions/fraud-detection/transaction-risk-scoring +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import MicroservicesEcommerceDesign from '../../microservices/common-data/microservices-ecommerce.mdx'; +import MicroservicesArchitectureWithRedis from '../../microservices/common-data/microservices-arch-with-redis.mdx'; 
+import SourceCodeFraudDetection from '../common-fraud/source-code-tip.mdx'; +import RedisInsightTransactionStreamImage from './images/redisinsight-transaction-stream.png'; +import RedisInsightBloomFiltersImage from './images/redisinsight-bloom-filters.png'; +import RedisInsightProfileFeatureImage from './images/redisinsight-profile-feature.png'; + + + + + +## What is transaction risk scoring + +"Transaction risk scoring" is a method of leveraging data science, machine learning, and statistical analysis to continuously monitor transactions and assess the relative risk associated with each transaction. By comparing transactional data to models of known fraud, the risk score can be calculated, and the closer a transaction matches fraudulent behaviour, the higher the risk score. + +The score is typically based on a statistical analysis of historical transaction data to identify patterns and trends associated with fraudulent activity. The score can then be used to trigger alerts or to automatically decline transactions that exceed a certain risk threshold. It can also be used to trigger additional authentication steps for high-risk transactions. Additional steps might include a one-time password (OTP) sent via text, email, or biometric scan. + +:::tip + +Transaction risk scoring is often combined in a single system with other fraud detection methods, such as [**digital identity validation**](/howtos/solutions/fraud-detection/digital-identity-validation). + +::: + +## Why you should use redis for transaction risk scoring + +A risk-based approach must be designed to create a frictionless flow and **avoid slowing down** the transaction experience for legitimate customers while simultaneously preventing fraud. If your risk-based approach is too strict, it will **block legitimate transactions** and frustrate customers. If it is too lenient, it will **allow fraudulent transactions** to go through. + +### How to avoid false positives with rules engines + +Rules-based automated fraud detection systems operate on simple "yes or no" logic to determine whether a given transaction is likely to be fraudulent. An example of a rule would be "block all transactions over $500 from a risky region". With a simple binary decision like this, the system is likely to block a lot of genuine customers. Sophisticated fraudsters easily fool such systems, and the complex nature of fraud means that simple "yes or no" rules may not be enough to assess the risk of each transaction accurately. + +More accurate risk scoring with **AI/ML** addresses these issues. Modern fraud detection systems use machine learning models trained on large volumes of different data sets known as "features"(user profiles, transaction patterns, behavioural attributes and more) to accurately identify fraudulent transactions. These models have been designed to be flexible, so they can adapt to new types of fraud. For example, a neural network can examine suspicious activities like how many pages a customer browses before making an order, whether they are copying and pasting information or typing it in manually and flag the customer for further review. + +The models use historical as well as most recent data to create a risk profile for each customer. By analyzing past behaviour it is possible to create a profile of what is normal for each customer. Any transactions that deviate from this profile can be flagged as suspicious, reducing the likelihood of false positives. 
The models also adapt very quickly to changes in normal behaviour and can rapidly identify patterns of fraudulent transactions.
+
+This is exactly where **Redis Cloud** excels in transaction risk scoring.
+
+### How to use Redis Cloud for transaction risk scoring
+
+People use Redis Cloud as the **in-memory** online feature store for **real-time access** to feature data as part of a transaction risk scoring system. By serving online features with low latency, Redis Cloud enables the risk-scoring models to return results in real time, thereby allowing the whole system to achieve high accuracy and an **instant response** when approving legitimate online transactions.
+
+Another very common use for Redis Cloud in transaction risk scoring is for **transaction filters**. A transaction filter can be implemented as a **Bloom** filter that stores information about user behaviours. It can answer questions like "Have we seen this user purchase at this merchant before?" Or, "Have we seen this user purchase at this merchant in the X to Y price range before?" Being a probabilistic data structure, Redis Bloom filters do sacrifice some accuracy, but in return they have a very low memory footprint and response time.
+
+:::tip
+
+You might ask why not use a Redis Set to answer some of the questions above. Redis Sets store unordered collections of unique strings (members) and are very efficient, with most operations taking O(1) time. However, the `SMEMBERS` command is O(N), where N is the cardinality of the set; it can be very slow for large sets, and large sets also take a lot of memory. This is a problem both for single-instance storage and for geo-replication, since more data takes more time to move. This is why Redis Bloom filters are a better choice for transaction filters: applications undergo millions of transactions every day, and Bloom filters maintain a speedy response time at scale.
+
+:::
+
+## Transaction risk scoring in a microservices architecture for an e-commerce application
+
+
+
+
+### Transaction risk scoring checkout procedure
+
+When a user goes to checkout, the system needs to check the user's digital identity and profile to determine the risk of the transaction. The system can then decide whether to approve the transaction or to trigger additional authentication steps. The following diagram shows the flow of transaction risk scoring in the e-commerce application:
+
+![Transaction risk scoring event flow with Redis streams](./images/transaction-risk-scoring-event-flow.png)
+
+The following steps are performed in the checkout procedure:
+
+1. The customer adds an item to the cart and proceeds to checkout.
+1. The `order service` receives the checkout request and creates an order in the database.
+1. The `order service` publishes a `CALCULATE_IDENTITY_SCORE` event to the `TRANSACTIONS` Redis stream.
+1. The `identity service` subscribes to the `TRANSACTIONS` Redis stream and receives the `CALCULATE_IDENTITY_SCORE` event.
+1. The `identity service` [calculates the identity score](/howtos/solutions/fraud-detection/digital-identity-validation) for the user and publishes a `CALCULATE_PROFILE_SCORE` event to the `TRANSACTIONS` Redis stream.
+1. The `profile service` subscribes to the `TRANSACTIONS` Redis stream and receives the `CALCULATE_PROFILE_SCORE` event.
+1. The `profile service` calculates the profile score by checking the products in the shopping cart against a known profile for the customer.
+1. 
The `profile service` publishes an `ASSESS_RISK` event to the `TRANSACTIONS` Redis stream.
+1. The `order service` subscribes to the `TRANSACTIONS` Redis stream and receives the `ASSESS_RISK` event.
+1. The `order service` determines if there is a likelihood of fraud based on the identity and profile scores. If there is a likelihood of fraud, the `order service` triggers additional authentication steps. If there is no likelihood of fraud, the `order service` approves the order and proceeds to process payments.
+
+## E-commerce application frontend using Next.js and Tailwind
+
+
+
+
+
+## Coding example for transaction risk scoring with Redis
+
+Now that you understand the steps involved in the checkout process for transaction risk scoring, let's look at the code in the `order service` and `profile service` that facilitates this process:
+
+:::note
+
+To see the code for the `identity service`, check out the [digital identity validation](/howtos/solutions/fraud-detection/digital-identity-validation) solution.
+
+:::
+
+### Initiating the checkout process in the order service
+
+When the `order service` receives a checkout request, it creates an order in the database and publishes a `CALCULATE_IDENTITY_SCORE` event to the `TRANSACTIONS` Redis stream. The event contains information about the order as well as the customer, such as the browser fingerprint, IP address, and persona (profile). This data will be used during the transaction by the `identity service` and `profile service` to calculate the identity and profile scores. The `order service` also specifies the transaction pipeline, meaning it determines the order of events so that the `identity service` and `profile service` do not need to be aware of each other. The `order service` ultimately owns the transaction. The sample code below shows the `createOrder` function in the `order service`. The code example is highly simplified. 
For more detail please see the source code linked above: + +```typescript +const TransactionPipelines = { + CHECKOUT: [ + TransactionStreamActions.CALCULATE_IDENTITY_SCORE, + TransactionStreamActions.CALCULATE_PROFILE_SCORE, + TransactionStreamActions.ASSESS_RISK, + TransactionStreamActions.PROCESS_PAYMENT, + TransactionStreamActions.PAYMENT_PROCESSED, + ], +}; + +async function createOrder( + order: IOrder, + browserAgent: string, + ipAddress: string, + sessionId: string, + sessionData: ISessionData, +) { + order = await validateOrder(order); + + const orderId = await addOrderToRedis(order); + order.orderId = orderId; + + await addOrderToMongoDB(order); + + // Log order creation to the LOGS stream + await streamLog({ + action: 'CREATE_ORDER', + message: `[${REDIS_STREAMS.CONSUMERS.ORDERS}] Order created with id ${orderId} for the user ${userId}`, + metadata: { + userId: userId, + persona: sessionData.persona, + sessionId: sessionId, + }, + }); + + let orderAmount = 0; + order.products?.forEach((product) => { + orderAmount += product.productPrice * product.qty; + }); + + const orderDetails: IOrderDetails = { + orderId: orderId, + orderAmount: orderAmount.toFixed(2), + userId: userId, + sessionId: sessionId, + orderStatus: order.orderStatusCode, + products: order.products, + }; + + // Initiate the transaction by adding the order details to the transaction stream and sending the first event + await addMessageToTransactionStream({ + action: TransactionPipelines.CHECKOUT[0], + logMessage: `[${REDIS_STREAMS.CONSUMERS.IDENTITY}] Digital identity to be validated/ scored for the user ${userId}`, + userId: userId, + persona: sessionData.persona, + sessionId: sessionId, + orderDetails: orderDetails ? JSON.stringify(orderDetails) : '', + transactionPipeline: JSON.stringify(TransactionPipelines.CHECKOUT), + + identityBrowserAgent: browserAgent, + identityIpAddress: ipAddress, + }); + + return orderId; +} +``` + +Let's look at the `addMessageToTransactionStream` function in more detail: + +```typescript +async function addMessageToStream(message, streamKeyName) { + try { + const nodeRedisClient = getNodeRedisClient(); + if (nodeRedisClient && message && streamKeyName) { + const id = '*'; //* = auto generate + await nodeRedisClient.xAdd(streamKeyName, id, message); + } + } catch (err) { + LoggerCls.error('addMessageToStream error !', err); + LoggerCls.error(streamKeyName, message); + } +} + +async function addMessageToTransactionStream( + message: ITransactionStreamMessage, +) { + if (message) { + const streamKeyName = REDIS_STREAMS.STREAMS.TRANSACTIONS; + await addMessageToStream(message, streamKeyName); + } +} +``` + +### Checking an order against a known profile in the profile service + +So you can see above, the transaction pipeline follows `CALCULATE_IDENTITY_SCORE` -> `CALCULATE_PROFILE_SCORE` -> `ASSESS_RISK`. Let's now look at how the `profile service` subscribes to the `TRANSACTIONS` Redis stream and receives the `CALCULATE_PROFILE_SCORE` event. When the `profile service` starts, it subscribes to the `TRANSACTIONS` Redis stream and listens for events. + +```typescript +function listen() { + listenToStreams({ + streams: [ + { + streamKeyName: REDIS_STREAMS.STREAMS.TRANSACTIONS, + eventHandlers: { + [TransactionStreamActions.CALCULATE_PROFILE_SCORE]: + calculateProfileScore, + }, + }, + ], + groupName: REDIS_STREAMS.GROUPS.PROFILE, + consumerName: REDIS_STREAMS.CONSUMERS.PROFILE, + }); +} +``` + +A highly simplified version of the `listenToStreams` method looks as follows. 
It takes in a list of streams with an associated object that maps events on the stream to a callback for processing the events. It also takes a stream group and a consumer name. Then it handles the subscription to the stream and calling on the appropriate method when an event comes in: + +```typescript +interface ListenStreamOptions { + streams: { + streamKeyName: string; + eventHandlers: { + [messageAction: string]: IMessageHandler; + }; + }[]; + groupName: string; + consumerName: string; + maxNoOfEntriesToReadAtTime?: number; +} + +const listenToStreams = async (options: ListenStreamOptions) => { + /* + (A) create consumer group for the stream + (B) read set of messages from the stream + (C) process all messages received + (D) trigger appropriate action callback for each message + (E) acknowledge individual messages after processing + */ + + const nodeRedisClient = getNodeRedisClient(); + if (nodeRedisClient) { + const streams = options.streams; + const groupName = options.groupName; + const consumerName = options.consumerName; + const readMaxCount = options.maxNoOfEntriesToReadAtTime || 100; + const idInitialPosition = '0'; //0 = start, $ = end or any specific id + const streamKeyIdArr: { + key: string; + id: string; + }[] = []; + + streams.map(async (stream) => { + LoggerCls.info( + `Creating consumer group ${groupName} in stream ${stream.streamKeyName}`, + ); + + try { + // (A) create consumer group for the stream + await nodeRedisClient.xGroupCreate( + stream.streamKeyName, + groupName, + idInitialPosition, + { + MKSTREAM: true, + }, + ); + } catch (err) { + LoggerCls.error( + `Consumer group ${groupName} already exists in stream ${stream.streamKeyName}!`, + ); //, err + } + + streamKeyIdArr.push({ + key: stream.streamKeyName, + id: '>', // Next entry ID that no consumer in this group has read + }); + }); + + LoggerCls.info(`Starting consumer ${consumerName}.`); + + while (true) { + try { + // (B) read set of messages from different streams + const dataArr = await nodeRedisClient.xReadGroup( + commandOptions({ + isolated: true, + }), + groupName, + consumerName, + //can specify multiple streams in array [{key, id}] + streamKeyIdArr, + { + COUNT: readMaxCount, // Read n entries at a time + BLOCK: 5, //block for 0 (infinite) seconds if there are none. + }, + ); + + // dataArr = [ + // { + // name: 'streamName', + // messages: [ + // { + // id: '1642088708425-0', + // message: { + // key1: 'value1', + // }, + // }, + // ], + // }, + // ]; + + //(C) process all messages received + if (dataArr && dataArr.length) { + for (let data of dataArr) { + for (let messageItem of data.messages) { + const streamKeyName = data.name; + + const stream = streams.find( + (s) => s.streamKeyName == streamKeyName, + ); + + if (stream && messageItem.message) { + const streamEventHandlers = stream.eventHandlers; + const messageAction = messageItem.message.action; + const messageHandler = streamEventHandlers[messageAction]; + + if (messageHandler) { + // (D) trigger appropriate action callback for each message + await messageHandler(messageItem.message, messageItem.id); + } + //(E) acknowledge individual messages after processing + nodeRedisClient.xAck(streamKeyName, groupName, messageItem.id); + } + } + } + } else { + // LoggerCls.info('No new stream entries.'); + } + } catch (err) { + LoggerCls.error('xReadGroup error !', err); + } + } + } +}; +``` + +The `processTransactionStream` method is called when a new event comes in. 
It validates the event, making sure it is the `CALCULATE_PROFILE_SCORE` event, and if it is then it calculates the profile score. It uses a Redis Bloom filter to check if the user has ordered a similar set of products before. It uses a pre-defined persona for the purposes of this demo, but in reality you would build a profile of the user over time. In the demo application, each product has a "master category" and "subcategory". Bloom filters are setup for the master categories as well as the master+subcategories. The scoring logic is highlighted below: + +```typescript {48-62} +async function calculateProfileScore( + message: ITransactionStreamMessage, + messageId, +) { + LoggerCls.info(`Incoming message in Profile Service ${messageId}`); + if (!(message.orderDetails && message.persona)) { + return false; + } + + await streamLog({ + action: TransactionStreamActions.CALCULATE_PROFILE_SCORE, + message: `[${REDIS_STREAMS.CONSUMERS.PROFILE}] Calculating profile score for the user ${message.userId}`, + metadata: message, + }); + + // check profile score + const { products }: IOrderDetails = JSON.parse(message.orderDetails); + const persona = message.persona.toLowerCase(); + let score = 0; + const nodeRedisClient = getNodeRedisClient(); + + if (!nodeRedisClient) { + return false; + } + + const categories = products.reduce((cat, product) => { + const masterCategory = product.productData?.masterCategory?.typeName; + const subCategory = product.productData?.subCategory?.typeName; + + if (masterCategory) { + cat[`${masterCategory}`.toLowerCase()] = true; + + if (subCategory) { + cat[`${masterCategory}:${subCategory}`.toLowerCase()] = true; + } + } + + return cat; + }, {} as Record); + + const categoryKeys = Object.keys(categories); + const checks = categoryKeys.length; + + LoggerCls.info( + `Checking ${checks} categories: ${JSON.stringify(categoryKeys)}`, + ); + + await Promise.all( + categoryKeys.map(async (category) => { + const exists = await nodeRedisClient.bf.exists( + `bfprofile:${category}`.toLowerCase(), + persona, + ); + + if (exists) { + score += 1; + } + }), + ); + + LoggerCls.info(`After ${checks} checks, total score is ${score}`); + score = score / (checks || 1); + + await streamLog({ + action: TransactionStreamActions.CALCULATE_PROFILE_SCORE, + message: `[${REDIS_STREAMS.CONSUMERS.PROFILE}] Profile score for the user ${message.userId} is ${score}`, + metadata: message, + }); + + await nextTransactionStep({ + ...message, + logMessage: `[${REDIS_STREAMS.CONSUMERS.PROFILE}] Requesting next step in transaction risk scoring for the user ${message.userId}`, + profileScore: `${score}`, + }); + + return true; +} +``` + +The `nextTransactionStep` method is called after the profile score has been calculated. It uses the `transactionPipeline` setup in the `order service` to publish the `ASSESS_RISK` event. 
The logic for this is below: + +```typescript +async function nextTransactionStep(message: ITransactionStreamMessage) { + const transactionPipeline: TransactionStreamActions[] = JSON.parse( + message.transactionPipeline, + ); + transactionPipeline.shift(); + + if (transactionPipeline.length <= 0) { + return; + } + + const streamKeyName = REDIS_STREAMS.STREAMS.TRANSACTIONS; + await addMessageToStream( + { + ...message, + action: transactionPipeline[0], + transactionPipeline: JSON.stringify(transactionPipeline), + }, + streamKeyName, + ); +} +``` + +In short, the `nextTransactionStep` method pops the current event off of the `transactionPipeline`, then it publishes the next event in the pipeline, which in this case is the `ASSESS_RISK` event. + +### Finalizing the order with transaction risk scoring in the order service + +The `order service` is responsible for finalizing the order prior to payment. It listens to the `ASSESS_RISK` event, and then checks the calculated scores to determine if there is potential fraud. + +:::note + +The demo application keeps things very simple, and it only sets a "potentialFraud" flag on the order. In the real world, you need to choose not only what scoring makes sense for your application, but also how to handle potential fraud. For example, you may want to request additional information from the customer such as a one-time password. You may also want to send the order to a human for review. It depends on your business and your risk appetite and mitigation strategy. + +::: + +The logic to process and finalize orders in the `order service` is below: + +```typescript {17-34} +async function checkOrderRiskScore(message: ITransactionStreamMessage) { + LoggerCls.info(`Incoming message in Order Service`); + if (!message.orderDetails) { + return false; + } + + const orderDetails: IOrderDetails = JSON.parse(message.orderDetails); + + if (!(orderDetails.orderId && orderDetails.userId)) { + return false; + } + + LoggerCls.info( + `Transaction risk scoring for user ${message.userId} and order ${orderDetails.orderId}`, + ); + + const { identityScore, profileScore } = message; + const identityScoreNumber = Number(identityScore); + const profileScoreNumber = Number(profileScore); + let potentialFraud = false; + + if (identityScoreNumber <= 0 || profileScoreNumber < 0.5) { + LoggerCls.info( + `Transaction risk score is too low for user ${message.userId} and order ${orderDetails.orderId}`, + ); + + await streamLog({ + action: TransactionStreamActions.ASSESS_RISK, + message: `[${REDIS_STREAMS.CONSUMERS.ORDERS}] Order failed fraud checks for orderId ${orderDetails.orderId} and user ${message.userId}`, + metadata: message, + }); + + potentialFraud = true; + } + + orderDetails.orderStatus = ORDER_STATUS.PENDING; + orderDetails.potentialFraud = potentialFraud; + + updateOrderStatusInRedis(orderDetails); + /** + * In real world scenario : can use RDI/ redis gears/ any other database to database sync strategy for REDIS-> Store of record data transfer. 
+ * To keep it simple, adding data to MongoDB manually in the same service + */ + updateOrderStatusInMongoDB(orderDetails); + + message.orderDetails = JSON.stringify(orderDetails); + + await streamLog({ + action: TransactionStreamActions.ASSESS_RISK, + message: `[${REDIS_STREAMS.CONSUMERS.ORDERS}] Order status updated after fraud checks for orderId ${orderDetails.orderId} and user ${message.userId}`, + metadata: message, + }); + + await nextTransactionStep(message); + + return true; +} +``` + +### Visualizing the transaction risk scoring data and event pipeline in RedisInsight + +:::tip + +RedisInsight is the free redis GUI for viewing data in redis. [Click here to download.](https://redis.com/redis-enterprise/redis-insight/) + +::: + +Now that you understand some of the code involved in processing transactions, let's take a look at the data in RedisInsight. First let's look at the `TRANSACTION_STREAM` key, which is where the stream data is held for the checkout transaction: + +RedisInsight transaction risk scoring transaction stream + +You can see the `action` column shows the transaction pipeline discussed earlier. Another thing to look at in RedisInsight is the Bloom filters: + +RedisInsight transaction risk scoring bloom filters + +These filters are pre-populated in the demo application based on a feature store. Redis is also storing the features, which in this case is the profiles of each of the personas. Below is an example of one of the profile features: + +RedisInsight transaction risk scoring feature store + +## Conclusion + +In this post, you learned how to use Redis Streams to build a transaction risk scoring pipeline. You also learned how to use Redis Cloud as a feature store and Redis Bloom filters to calculate a profile score. Every application is unique, so this tutorial is meant to be a starting point for you to build your own transaction risk scoring pipeline. + +### Additional resources + +- Redis Streams + - Explore streams in detail in the [Redis University course on Redis Streams](https://university.redis.com/courses/ru202/) + - Check out our e-book on [Understanding Streams in Redis and Kafka: A Visual Guide](https://redis.com/docs/understanding-streams-in-redis-and-kafka-a-visual-guide/) +- Fraud detection with Redis + - [Digital identity validation](/howtos/solutions/fraud-detection/digital-identity-validation) +- [Microservices with Redis](/howtos/solutions#microservices) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. 
+- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/geo/common-geo/images/01-dashboard-geo-search.png b/docs/howtos/solutions/geo/common-geo/images/01-dashboard-geo-search.png new file mode 100644 index 00000000000..08501390964 Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/01-dashboard-geo-search.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/01-dashboard.png b/docs/howtos/solutions/geo/common-geo/images/01-dashboard.png new file mode 100644 index 00000000000..b6b12dbb9d9 Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/01-dashboard.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/02-cart.png b/docs/howtos/solutions/geo/common-geo/images/02-cart.png new file mode 100644 index 00000000000..9f096691ed5 Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/02-cart.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/05-order-history.png b/docs/howtos/solutions/geo/common-geo/images/05-order-history.png new file mode 100644 index 00000000000..747560cba9f Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/05-order-history.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/06-admin-charts.png b/docs/howtos/solutions/geo/common-geo/images/06-admin-charts.png new file mode 100644 index 00000000000..4bfa493670a Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/06-admin-charts.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/07-admin-top-trending.png b/docs/howtos/solutions/geo/common-geo/images/07-admin-top-trending.png new file mode 100644 index 00000000000..5b8be9e8222 Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/07-admin-top-trending.png differ diff --git a/docs/howtos/solutions/geo/common-geo/images/08-settings.png b/docs/howtos/solutions/geo/common-geo/images/08-settings.png new file mode 100644 index 00000000000..e9cabb5502d Binary files /dev/null and b/docs/howtos/solutions/geo/common-geo/images/08-settings.png differ diff --git a/docs/howtos/solutions/geo/common-geo/microservices-ecommerce-geo.mdx b/docs/howtos/solutions/geo/common-geo/microservices-ecommerce-geo.mdx new file mode 100644 index 00000000000..0a120a96996 --- /dev/null +++ b/docs/howtos/solutions/geo/common-geo/microservices-ecommerce-geo.mdx @@ -0,0 +1,21 @@ +The e-commerce microservices application consists of a frontend, built using [Next.js](https://nextjs.org/) with [TailwindCSS](https://tailwindcss.com/). The application backend uses [Node.js](https://nodejs.org). The data is stored in [Redis](https://redis.com/try-free/) and either MongoDB or PostgreSQL, using [Prisma](https://www.prisma.io/docs/reference/database-reference/supported-databases). Below are screenshots showcasing the frontend of the e-commerce app. + +**Dashboard:** Displays a list of products with different search functionalities, configurable in the settings page. +![Redis Microservices E-commerce App Frontend - Products Page](images/01-dashboard.png) + +**Settings:** Accessible by clicking the gear icon at the top right of the dashboard. Control the search bar, chatbot visibility, and other features here. 
+![Redis Microservices E-commerce App Frontend - Settings Page](images/08-settings.png) + +**Dashboard (Geo Location Search):** Configured for `Geo location search`, the search bar enables location queries.
+_Note:_ In our demo, each zipCode location is mapped with lat long coordinates. +![Redis Microservices E-commerce App Frontend - Semantic Text Search](images/01-dashboard-geo-search.png) + +**Shopping Cart:** Add products to the cart and check out using the "Buy Now" button. +![Redis Microservices E-commerce App Frontend - Shopping Cart](images/02-cart.png) + +**Order History:** Post-purchase, the 'Orders' link in the top navigation bar shows the order status and history. +![Redis Microservices E-commerce App Frontend - Order History Page](images/05-order-history.png) + +**Admin Panel:** Accessible via the 'admin' link in the top navigation. Displays purchase statistics and trending products. +![Redis Microservices E-commerce App Frontend - Admin Page](images/06-admin-charts.png) +![Redis Microservices E-commerce App Frontend - Admin Page](images/07-admin-top-trending.png) diff --git a/docs/howtos/solutions/geo/common-geo/microservices-source-code-geo.mdx b/docs/howtos/solutions/geo/common-geo/microservices-source-code-geo.mdx new file mode 100644 index 00000000000..eae0b77a1ac --- /dev/null +++ b/docs/howtos/solutions/geo/common-geo/microservices-source-code-geo.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +git clone --branch v10.1.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions + +::: diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/collection-products.png b/docs/howtos/solutions/geo/getting-started-geo/images/collection-products.png new file mode 100644 index 00000000000..ca1c93cb033 Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/collection-products.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/collection-store-inventory.png b/docs/howtos/solutions/geo/getting-started-geo/images/collection-store-inventory.png new file mode 100644 index 00000000000..338c9d99a35 Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/collection-store-inventory.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/geo-dashboard.png b/docs/howtos/solutions/geo/getting-started-geo/images/geo-dashboard.png new file mode 100644 index 00000000000..f2677cfcbb1 Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/geo-dashboard.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-1.png b/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-1.png new file mode 100644 index 00000000000..e5eea22fc63 Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-1.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-2.png b/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-2.png new file mode 100644 index 00000000000..ad96cc603fb Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/geo-query-2.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/new-york-state.png b/docs/howtos/solutions/geo/getting-started-geo/images/new-york-state.png new file mode 100644 index 00000000000..59474b38d88 Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/new-york-state.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/images/settings.png b/docs/howtos/solutions/geo/getting-started-geo/images/settings.png new file mode 100644 index 
00000000000..e9cabb5502d Binary files /dev/null and b/docs/howtos/solutions/geo/getting-started-geo/images/settings.png differ diff --git a/docs/howtos/solutions/geo/getting-started-geo/index-geo-search.mdx b/docs/howtos/solutions/geo/getting-started-geo/index-geo-search.mdx new file mode 100644 index 00000000000..a1fd881dbf5 --- /dev/null +++ b/docs/howtos/solutions/geo/getting-started-geo/index-geo-search.mdx @@ -0,0 +1,427 @@ +--- +id: index-geo-search +title: Getting Started With Geo Location Search in Redis +sidebar_label: Getting Started With Geo Location Search in Redis +slug: /howtos/solutions/geo/getting-started +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import InitialMicroservicesArchitecture from '../../microservices/common-data/microservices-arch.mdx'; +import MicroservicesEcommerceGeoDesign from '../common-geo/microservices-ecommerce-geo.mdx'; +import SourceCode from '../common-geo/microservices-source-code-geo.mdx'; + + + +## What you will learn in this tutorial + +In this comprehensive tutorial, you will gain practical knowledge and hands-on experience with Geo Location search capabilities using Redis, particularly focusing on its application within a microservices architecture for an e-commerce platform. Here's what you can expect to learn: + +- **Integrating Geo Location Search with Redis**: Dive deep into the concept of Geo Location search, exploring how Redis can be leveraged to implement real-time location-based search functionalities such as proximity searches, geo-spatial queries, and location-based filtering. + +- **Database Setup and Indexing with Redis**: Learn the steps for setting up your Redis database to support Geo Location search, including how to structure your data collections and index them effectively for fast and efficient querying. + +- **Building and Querying Geo-Spatial Data**: Gain hands-on experience with writing and executing raw Redis queries for geo-spatial data, understanding the syntax and options available for searching within a radius, calculating distances, and sorting results based on geographical proximity. + +- **Developing an API Endpoint for Geo Location Search**: Walk through the process of building a RESTful API endpoint that leverages Redis to perform Geo Location searches, demonstrating how to integrate this functionality into a Node.js backend. + +## Microservices architecture for an e-commerce application + + + + + +## E-commerce application frontend using Next.js and Tailwind + + + +## What is Geo Location search? + +Geo Location search involves querying and processing data based on geographical locations identified by latitude and longitude coordinates. This capability is crucial for a wide range of applications, including location-based services, proximity searches, and spatial analysis. + +It allows systems to store, index, and quickly retrieve data points within geographical contexts, such as finding all users within a certain distance from a specific point, or calculating the distance between two locations ..etc. + +## Why you should use Redis for Geo Location search? + +Redis's **geo-spatial** capabilities enable developers to build location-aware applications that can perform proximity searches, location-based filtering, and spatial analysis with ease. + +Redis's **in-memory** architecture ensures that geo-spatial data is stored and processed in memory, +resulting in **low latency** and high throughput for location-based queries. 
This makes Redis an ideal choice for applications requiring **real-time** location-based search functionalities.
+
+Consider a multi-store shopping scenario where consumers locate a product online, place the order in their browser or mobile device, and pick it up at the nearest store location. This is called “buy-online-pickup-in-store” (BOPIS). Redis enables a **real-time view** of store inventory and a seamless BOPIS shopping experience.
+
+## Database setup
+
+
+
+### Collection details
+
+Our demo application utilizes two primary data collections within Redis to simulate an e-commerce platform's inventory system:
+
+1. `products` collection: This collection stores detailed information about each product, including name, description, category and price.
+
+![collection-products](./images/collection-products.png)
+
+:::tip
+
+Utilize [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to interactively explore your Redis database and execute raw Redis commands in a user-friendly workbench environment.
+:::
+
+2. `storeInventory` collection: This collection maintains the inventory status of products across different store locations. It records the quantity of each product available at various stores, facilitating inventory tracking and management.
+
+![collection-store-inventory](./images/collection-store-inventory.png)
+
+For the purpose of this demo, we simulate an e-commerce operation in various regions across New York (US), with each store location identified by a `storeId` and an associated `stockQty` for products.
+
+![new-york-state](./images/new-york-state.png)
+
+### Indexing data
+
+To enable Geo Location searches within our `storeInventory` collection, it's crucial to index the data appropriately. Redis offers multiple methods for creating indexes: using the Command Line Interface (CLI), or using client libraries such as Redis OM or Node Redis.
+
+1. Using CLI commands
+
+To facilitate geo-spatial queries and other search operations on the `storeInventory` collection, run the following commands:
+
+```sh
+# Remove existing index
+FT.DROPINDEX "storeInventory:storeInventoryId:index"
+
+# Create a new index with geo-spatial and other field capabilities
+FT.CREATE "storeInventory:storeInventoryId:index"
+  ON JSON
+  PREFIX 1 "storeInventory:storeInventoryId:"
+  SCHEMA
+  "$.storeId" AS "storeId" TAG SEPARATOR "|"
+  "$.storeName" AS "storeName" TEXT
+  "$.storeLocation" AS "storeLocation" GEO
+  "$.productId" AS "productId" TAG SEPARATOR "|"
+  "$.productDisplayName" AS "productDisplayName" TEXT
+  "$.stockQty" AS "stockQty" NUMERIC
+  "$.statusCode" AS "statusCode" NUMERIC
+```
+
+2. Using Redis OM
+
+For applications leveraging the Node.js environment, Redis OM provides an elegant, object-mapping approach to interact with Redis.
Below is an implementation example to set up the index using Redis OM: + +```ts title="server/src/common/models/store-inventory-repo.ts" +// Import necessary Redis OM classes +import { + Schema as RedisSchema, + Repository as RedisRepository, + EntityId as RedisEntityId, +} from 'redis-om'; + +import { getNodeRedisClient } from '../utils/redis/redis-wrapper'; + +// Define a prefix for store inventory keys and the schema for indexing +const STORE_INVENTORY_KEY_PREFIX = 'storeInventory:storeInventoryId'; +const schema = new RedisSchema(STORE_INVENTORY_KEY_PREFIX, { + storeId: { type: 'string', indexed: true }, + storeName: { type: 'text', indexed: true }, + storeLocation: { type: 'point', indexed: true }, // Uses longitude,latitude format + + productId: { type: 'string', indexed: true }, + productDisplayName: { type: 'text', indexed: true }, + stockQty: { type: 'number', indexed: true }, + statusCode: { type: 'number', indexed: true }, +}); + +/* + A Repository is the main interface into Redis OM. It gives us the methods to read, write, and remove a specific Entity + */ +const getRepository = () => { + const redisClient = getNodeRedisClient(); + const repository = new RedisRepository(schema, redisClient); + return repository; +}; + +/* +we need to create an index or we won't be able to search. +Redis OM uses hash to see if index needs to be recreated or not +*/ +const createRedisIndex = async () => { + const repository = getRepository(); + await repository.createIndex(); +}; + +export { + getRepository, + createRedisIndex, + RedisEntityId, + STORE_INVENTORY_KEY_PREFIX, +}; +``` + +```ts title="server/src/services/products/src/index.ts" +import * as StoreInventoryRepo from '../../../common/models/store-inventory-repo'; + +app.listen(PORT, async () => { + //... + + // Create index for store inventory on startup + await StoreInventoryRepo.createRedisIndex(); + //... +}); +``` + +## Building Geo Location search with Redis + +### Sample raw query + +Once the data is indexed, you can execute raw Redis queries to perform Geo Location searches and other spatial operations. Here are two sample queries to demonstrate the capabilities: + +1. **Searching Products Within a Radius**: This sample query demonstrates how to find products within a `50-mile` radius of a specific location (`New York City`) with a particular product name (`puma`). It combines geo-spatial search capabilities with text search to filter results based on both location and product name. + +```sh +FT.SEARCH "storeInventory:storeInventoryId:index" "( ( ( (@statusCode:[1 1]) (@stockQty:[(0 +inf]) ) (@storeLocation:[-73.968285 40.785091 50 mi]) ) (@productDisplayName:'puma') )" +``` + +This query leverages the `FT.SEARCH` command to perform a search within the `storeInventory:storeInventoryId:index`. It specifies a circular area defined by the center's longitude and latitude and a radius of `50 miles`. Additionally, it filters products by availability `(@stockQty:[(0 +inf]))` and `@statusCode` indicating an active status `([1 1])`, combined with a match on the product display name containing `puma`. + +![geo-query-1](./images/geo-query-1.png) + +2. **Aggregate Query for Sorted Results**: This aggregate query extends the first example by sorting the results based on the geographical distance from the search location and limiting the results to the first 100. 
+ +```sh +FT.AGGREGATE "storeInventory:storeInventoryId:index" + "( ( ( (@statusCode:[1 1]) (@stockQty:[(0 +inf]) ) (@storeLocation:[-73.968285 40.785091 50 mi]) ) (@productDisplayName:'puma') )" + LOAD 6 "@storeId" "@storeName" "@storeLocation" "@productId" "@productDisplayName" "@stockQty" + APPLY "geodistance(@storeLocation, -73.968285, 40.785091)/1609" + AS "distInMiles" + SORTBY 1 "@distInMiles" + LIMIT 0 100 +``` + +In this query, `FT.AGGREGATE` is used to process and transform search results. The `APPLY` clause calculates the distance between each store location and the specified coordinates, converting the result into miles. The `SORTBY` clause orders the results by this distance, and `LIMIT` caps the output to 100 entries, making the query highly relevant for applications requiring sorted proximity-based search results. + +![geo-query-2](./images/geo-query-2.png) + +### API endpoint + +The `getStoreProductsByGeoFilter` API endpoint enables clients to search for store products based on geographical location and product name, demonstrating a practical application of Redis Geo Location search capabilities. + +**API Request** + +The request payload specifies the product name to search for, the search radius in miles, and the user's current location in latitude and longitude coordinates. + +```json +POST http://localhost:3000/products/getStoreProductsByGeoFilter +{ + "productDisplayName":"puma", + + "searchRadiusInMiles":50, + "userLocation": { + "latitude": 40.785091, + "longitude": -73.968285 + } +} +``` + +**API Response** + +The response returns an array of products matching the search criteria, including detailed information about each product and its distance from the user's location. + +```json +{ + "data": [ + { + "productId": "11000", + "price": 3995, + "productDisplayName": "Puma Men Slick 3HD Yellow Black Watches", + "variantName": "Slick 3HD Yellow", + "brandName": "Puma", + "ageGroup": "Adults-Men", + "gender": "Men", + "displayCategories": "Accessories", + "masterCategory_typeName": "Accessories", + "subCategory_typeName": "Watches", + "styleImages_default_imageURL": "http://host.docker.internal:8080/images/11000.jpg", + "productDescriptors_description_value": "...", + + "stockQty": "5", + "storeId": "11_NY_MELVILLE", + "storeLocation": { + "longitude": -73.41512, + "latitude": 40.79343 + }, + "distInMiles": "46.59194" + } + //... + ], + "error": null +} +``` + +### API implementation + +This section outlines the implementation of the `getStoreProductsByGeoFilter` API, focusing on the `searchStoreInventoryByGeoFilter` function that executes the core search logic. + +1. **Function Overview**: `searchStoreInventoryByGeoFilter` accepts an inventory filter object that includes optional product display name, search radius in miles, and user location. It constructs a query to find store products within the specified radius that match the product name. + +2. **Query Construction**: The function builds a search query using Redis OM's fluent API, specifying conditions for product availability, stock quantity, and proximity to the user's location. It also optionally filters products by name if specified. + +3. **Executing the Query**: The constructed query is executed against Redis using the ft.aggregate method, which allows for complex aggregations and transformations. The query results are processed to calculate the distance in miles from the user's location and sort the results accordingly. + +4. 
**Result Processing**: The function filters out duplicate products across different stores, ensuring unique product listings in the final output. It then formats the store locations into a more readable structure and compiles the final list of products to return. + +```ts +import * as StoreInventoryRepo from '../../../common/models/store-inventory-repo'; + +interface IInventoryBodyFilter { + productDisplayName?: string; + + searchRadiusInMiles?: number; + userLocation?: { + latitude?: number; + longitude?: number; + }; +} + +const searchStoreInventoryByGeoFilter = async ( + _inventoryFilter: IInventoryBodyFilter, +) => { + // (1) --- + const redisClient = getNodeRedisClient(); + const repository = StoreInventoryRepo.getRepository(); + let storeProducts: IStoreInventory[] = []; + const trimmedStoreProducts: IStoreInventory[] = []; // similar item of other stores are removed + const uniqueProductIds = {}; + + if ( + repository && + _inventoryFilter?.userLocation?.latitude && + _inventoryFilter?.userLocation?.longitude + ) { + const lat = _inventoryFilter.userLocation.latitude; + const long = _inventoryFilter.userLocation.longitude; + const radiusInMiles = _inventoryFilter.searchRadiusInMiles || 500; + + // (2) --- Query Construction + let queryBuilder = repository + .search() + .and('statusCode') + .eq(1) + .and('stockQty') + .gt(0) + .and('storeLocation') + .inRadius((circle) => { + return circle.latitude(lat).longitude(long).radius(radiusInMiles).miles; + }); + + if (_inventoryFilter.productDisplayName) { + queryBuilder = queryBuilder + .and('productDisplayName') + .matches(_inventoryFilter.productDisplayName); + } + + console.log(queryBuilder.query); + + /* Sample queryBuilder.query to run on CLI + FT.SEARCH "storeInventory:storeInventoryId:index" "( ( ( (@statusCode:[1 1]) (@stockQty:[(0 +inf]) ) (@storeLocation:[-73.968285 40.785091 50 mi]) ) (@productDisplayName:'puma') )" + */ + + // (3) --- Executing the Query + const indexName = `storeInventory:storeInventoryId:index`; + const aggregator = await redisClient.ft.aggregate( + indexName, + queryBuilder.query, + { + LOAD: [ + '@storeId', + '@storeName', + '@storeLocation', + '@productId', + '@productDisplayName', + '@stockQty', + ], + STEPS: [ + { + type: AggregateSteps.APPLY, + expression: `geodistance(@storeLocation, ${long}, ${lat})/1609`, + AS: 'distInMiles', + }, + { + type: AggregateSteps.SORTBY, + BY: ['@distInMiles', '@productId'], + }, + { + type: AggregateSteps.LIMIT, + from: 0, + size: 1000, + }, + ], + }, + ); + + /* Sample command to run on CLI + FT.AGGREGATE "storeInventory:storeInventoryId:index" + "( ( ( (@statusCode:[1 1]) (@stockQty:[(0 +inf]) ) (@storeLocation:[-73.968285 40.785091 50 mi]) ) (@productDisplayName:'puma') )" + "LOAD" "6" "@storeId" "@storeName" "@storeLocation" "@productId" "@productDisplayName" "@stockQty" + "APPLY" "geodistance(@storeLocation, -73.968285, 40.785091)/1609" + "AS" "distInMiles" + "SORTBY" "1" "@distInMiles" + "LIMIT" "0" "100" + */ + + storeProducts = aggregator.results; + + if (!storeProducts.length) { + // throw `Product not found with in ${radiusInMiles}mi range!`; + } else { + // (4) --- Result Processing + storeProducts.forEach((storeProduct) => { + if ( + storeProduct?.productId && + !uniqueProductIds[storeProduct.productId] + ) { + uniqueProductIds[storeProduct.productId] = true; + + if (typeof storeProduct.storeLocation == 'string') { + const location = storeProduct.storeLocation.split(','); + storeProduct.storeLocation = { + longitude: Number(location[0]), + latitude: 
Number(location[1]), + }; + } + + trimmedStoreProducts.push(storeProduct); + } + }); + } + } else { + throw 'Mandatory fields like userLocation latitude / longitude missing !'; + } + + return { + storeProducts: trimmedStoreProducts, + productIds: Object.keys(uniqueProductIds), + }; +}; +``` + +The implementation demonstrates a practical use case for Redis's Geo Location search capabilities, showcasing how to perform proximity searches combined with other filtering criteria (like product name) and present the results in a user-friendly format. + +### Front end + +Make sure to select `Geo location search` in settings page to enable the feature. + +![settings](./images/settings.png) + +Within the dashboard, users have the option to select a random zip code and search for products (say titled "puma"). The search results are comprehensively displayed, including essential details such as the product name, available stock quantity, the name of the store, and the distance of the store from the user's selected location. + +![geo-dashboard.](./images/geo-dashboard.png) + +## Ready to use Redis for Geo Location search? + +Redis's Geo Location search capabilities offer a powerful and efficient way to perform proximity-based queries and analyses. + +By leveraging Redis's in-memory data store and specialized geo commands, developers can build scalable, high-performance applications that respond quickly to location-based queries. The integration with the JavaScript ecosystem further simplifies the development process, enabling seamless application development and deployment. + +### References + +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/index-solutions.mdx b/docs/howtos/solutions/index-solutions.mdx new file mode 100644 index 00000000000..113d620e43d --- /dev/null +++ b/docs/howtos/solutions/index-solutions.mdx @@ -0,0 +1,231 @@ +--- +id: index-solutions +title: Solution Tutorials +sidebar_label: Overview +slug: /howtos/solutions +--- + +import RedisCard from '@theme/RedisCard'; + +This page provides a listing of dozens of popular app solution tutorials from Redis. + +## Microservices + +Learn how to easily build, test and deploy code for common microservice and caching design patterns across different industries using Redis. + +
+## Fraud detection
+
+## Caching architecture
+
+## Real-time Inventory
+
+## Mobile Banking
+
+## Vectors
+
+## Triggers and Functions
+
+## Geo Location Search
+
diff --git a/docs/howtos/solutions/microservices/api-gateway-caching/images/api-gateway-caching-with-redis-architecture.png b/docs/howtos/solutions/microservices/api-gateway-caching/images/api-gateway-caching-with-redis-architecture.png new file mode 100644 index 00000000000..6d6e383dd67 Binary files /dev/null and b/docs/howtos/solutions/microservices/api-gateway-caching/images/api-gateway-caching-with-redis-architecture.png differ diff --git a/docs/howtos/solutions/microservices/api-gateway-caching/index-api-gateway-caching.mdx b/docs/howtos/solutions/microservices/api-gateway-caching/index-api-gateway-caching.mdx new file mode 100644 index 00000000000..6204dc3e008 --- /dev/null +++ b/docs/howtos/solutions/microservices/api-gateway-caching/index-api-gateway-caching.mdx @@ -0,0 +1,219 @@ +--- +id: index-solutions-api-gateway-caching +title: How to use Redis for API Gateway Caching +sidebar_label: How to use Redis for API Gateway Caching +slug: /howtos/solutions/microservices/api-gateway-caching +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import MicroservicesEcommerceDesign from '../common-data/microservices-ecommerce.mdx'; +import MicroservicesArchitectureWithRedis from '../common-data/microservices-arch-with-redis.mdx'; +import SourceCode from '../common-data/microservices-source-code-tip.mdx'; +import RedisEnterprise from '../common-data/redis-enterprise.mdx'; + + + + + +## What is API gateway caching? + +So you're building a microservices application. But you find yourself struggling with ways to handle authentication that let you reuse code and maximize performance. Typically for authentication you might use sessions, OAuth, authorization tokens, etc. For the purposes of this tutorial, let's assume we're using an authorization token. In a monolithic application, authentication is pretty straightforward: + +When a request comes in: + +1. Decode the `Authorization` header. +1. Validate the credentials. +1. Store the session information on the request object or cache for further use down the line by the application. + +However, you might be puzzled by how to do this with microservices. Ordinarily, in a microservices application an API gateway serves as the single entry point for clients, which routes traffic to the appropriate services. Depending on the nature of the request, those services may or may not require a user to be authenticated. You might think it's a good idea to handle authentication in each respective service. + +While this works, you end up with a fair amount of duplicated code. Plus, it's difficult to understand when and where slowdowns happen and to scale services appropriately, because you repeat some of the same work in each service. A more effective way to handle authentication is to deal with it at the API gateway layer, and then pass the session information down to each service. + +Once you decide to handle authentication at the API gateway layer, you must decide where to store sessions. + +Imagine you're building an e-commerce application that uses MongoDB/ any relational database as the primary data store. You could store sessions in primary database, but think about how many times the application needs to hit primary database to retrieve session information. If you have millions of customers, you don't want to go to database for every single request made to the API. + +This is where Redis comes in. 
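+
+Before looking at the demo application's code, here is a minimal sketch of the idea: an Express middleware at the gateway that resolves a session from Redis using the bearer token. This is an illustration only, not the demo's gateway implementation (that code appears later in this tutorial); the `session:` key prefix, the placeholder user lookup, and the 30-minute TTL are assumptions.
+
+```typescript
+import { createClient } from 'redis';
+import type { NextFunction, Request, Response } from 'express';
+
+// Assumptions for illustration: a REDIS_URL env var and a `session:` key prefix.
+const redis = createClient({ url: process.env.REDIS_URL });
+redis.connect().catch((err) => console.error('Redis connection error', err));
+
+// Resolve the session once at the gateway, then pass it downstream
+// so individual services never have to re-validate credentials themselves.
+export async function resolveSession(
+  req: Request,
+  _res: Response,
+  next: NextFunction,
+) {
+  const token = req.header('Authorization')?.split(/\s+/)[1];
+
+  if (token) {
+    // Cache hit: session data is served straight from Redis.
+    let session = await redis.get(`session:${token}`);
+
+    if (!session) {
+      // Cache miss: validate against your identity provider or primary
+      // database here (placeholder), then cache the result with a TTL.
+      session = JSON.stringify({ userId: 'USR_123' });
+      await redis.set(`session:${token}`, session, { EX: 60 * 30 });
+    }
+
+    // Downstream services read the session from this header.
+    req.headers['x-session'] = session;
+  }
+
+  next();
+}
+```
+
+The demo application's gateway, shown later in this tutorial, follows the same pattern but also proxies each request to the appropriate microservice.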
+ +## Why you should use Redis for API gateway caching + +Redis is an in-memory datastore, which – among other things – makes it a perfect tool for caching session data. Redis allows you to reduce the load on a primary database while speeding up database reads. The rest of this tutorial covers how to accomplish this in the context of an e-commerce application. + +## Microservices architecture for an e-commerce application + + + +The diagram illustrates how the API gateway uses Redis as a cache for session information. The API gateway gets the session from Redis and then passes it on to each microservice. This provides an easy way to handle sessions in a single place, and to permeate them throughout the rest of the microservices. + +![API gateway caching with Redis architecture diagram](./images/api-gateway-caching-with-redis-architecture.png) + +:::tip + +Use a **Redis Cloud Cluster** to get the benefit of linear scaling to ensure API calls perform under peak loads. That also provides 99.999% uptime and Active-Active geo-distribution, which prevents loss of authentication and session data. + +::: + +## E-commerce application frontend using Next.js and Tailwind + + + + + +## API gateway caching in a microservices application with Redis + +What's nice about a microservice architecture is that each service is set up so it can scale independently. Now, seeing as how each service might require authentication, you likely want to obtain session information for most requests. Therefore, it makes sense to use the API gateway to cache and retrieve session information and to subsequently pass the information on to each service. Let's see how you might accomplish this. + +In our sample application, all requests are routed through the API gateway. We use [Express](https://expressjs.com/) to set up the API gateway, and the `Authorization` header to pass the authorization token from the frontend to the API. For every request, the API gateway gets the authorization token and looks it up in Redis. Then it passes it along to the correct microservice. 
+ +This code validates the session: + +```typescript +import { + createProxyMiddleware, + responseInterceptor, +} from 'http-proxy-middleware'; + +//----- +const app: Express = express(); + +app.use(cors()); +app.use(async (req, res, next) => { + const authorizationHeader = req.header('Authorization'); + const sessionInfo = await getSessionInfo(authorizationHeader); //---- (1) + + //add session info to request + if (sessionInfo?.sessionData && sessionInfo?.sessionId) { + req.session = sessionInfo?.sessionData; + req.sessionId = sessionInfo?.sessionId; + } + next(); +}); + +app.use( + '/orders', + createProxyMiddleware({ + // http://localhost:3000/orders/bar -> http://localhost:3001/orders/bar + target: 'http://localhost:3001', + changeOrigin: true, + selfHandleResponse: true, + onProxyReq(proxyReq, req, res) { + // pass session info to microservice + proxyReq.setHeader('x-session', req.session); + }, + onProxyRes: applyAuthToResponse, //---- (2) + }), +); + +app.use( + '/orderHistory', + createProxyMiddleware({ + target: 'http://localhost:3002', + changeOrigin: true, + selfHandleResponse: true, + onProxyReq(proxyReq, req, res) { + // pass session info to microservice + proxyReq.setHeader('x-session', req.session); + }, + onProxyRes: applyAuthToResponse, //---- (2) + }), +); +//----- + +const getSessionInfo = async (authHeader?: string) => { + // (For demo purpose only) random userId and sessionId values are created for first time, then userId is fetched gainst that sessionId for future requests + let sessionId = ''; + let sessionData: string | null = ''; + + if (!!authHeader) { + sessionId = authHeader.split(/\s/)[1]; + } else { + sessionId = 'SES_' + randomUUID(); // generate random new sessionId + } + + const nodeRedisClient = getNodeRedisClient(); + if (nodeRedisClient) { + const exists = await nodeRedisClient.exists(sessionId); + if (!exists) { + await nodeRedisClient.set( + sessionId, + JSON.stringify({ userId: 'USR_' + randomUUID() }), + ); // generate random new userId + } + sessionData = await nodeRedisClient.get(sessionId); + } + + return { + sessionId: sessionId, + sessionData: sessionData, + }; +}; + +const applyAuthToResponse = responseInterceptor( + // adding sessionId to the response so that frontend can store it for future requests + + async (responseBuffer, proxyRes, req, res) => { + // detect json responses + if ( + !!proxyRes.headers['content-type'] && + proxyRes.headers['content-type'].includes('application/json') + ) { + let data = JSON.parse(responseBuffer.toString('utf8')); + + // manipulate JSON data here + if (!!(req as Request).sessionId) { + data = Object.assign({}, data, { auth: (req as Request).sessionId }); + } + + // return manipulated JSON + return JSON.stringify(data); + } + + // return other content-types as-is + return responseBuffer; + }, +); +``` + +:::info + +This example is not meant to represent the best way to handle authentication. Instead, it illustrates what you might do with respect to Redis. You will likely have a different setup for authentication, but the concept of storing a session in Redis is similar. + +::: + +In the code above, we check for the `Authorization` header, otherwise we create a new one and store it in Redis. Then we retrieve the session from Redis. Further down the line we attach the session to the `x-session` header prior to calling the orders service. + +Now let's see how the orders service receives the session. 
+ +```typescript {9} +router.post(API_NAMES.CREATE_ORDER, async (req: Request, res: Response) => { + const body = req.body; + const result: IApiResponseBody = { + data: null, + error: null, + }; + + const sessionData = req.header('x-session'); + const userId = sessionData ? JSON.parse(sessionData).userId : ""; + ... +}); +``` + +The highlighted line above shows how to pull the session out of the `x-session` header and get the `userId`. + +## Ready to use Redis for API gateway caching ? + +That's all there is to it! You now know how to use Redis for API gateway caching. It's not complicated to get started, but this simple practice can help you scale as you build out microservices. + +To learn more about Redis, check out the additional resources below: + +### Additional resources + +- Microservices with Redis + - [CQRS](/howtos/solutions/microservices/cqrs) + - [Interservice communication](/howtos/solutions/microservices/interservice-communication) + - [Query caching](/howtos/solutions/microservices/caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/microservices/caching/images/cache-aside.jpg b/docs/howtos/solutions/microservices/caching/images/cache-aside.jpg new file mode 100644 index 00000000000..ad187c542e9 Binary files /dev/null and b/docs/howtos/solutions/microservices/caching/images/cache-aside.jpg differ diff --git a/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-hit.png b/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-hit.png new file mode 100644 index 00000000000..f013bee5a49 Binary files /dev/null and b/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-hit.png differ diff --git a/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-miss.png b/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-miss.png new file mode 100644 index 00000000000..34ab6faef57 Binary files /dev/null and b/docs/howtos/solutions/microservices/caching/images/redis-cache-aside-cache-miss.png differ diff --git a/docs/howtos/solutions/microservices/caching/index-caching.mdx b/docs/howtos/solutions/microservices/caching/index-caching.mdx new file mode 100644 index 00000000000..a4abc43faf8 --- /dev/null +++ b/docs/howtos/solutions/microservices/caching/index-caching.mdx @@ -0,0 +1,260 @@ +--- +id: index-solutions-caching +title: How to use Redis for Query Caching +sidebar_label: How to use Redis for Query Caching +slug: /howtos/solutions/microservices/caching +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import MicroservicesEcommerceDesign from '../common-data/microservices-ecommerce.mdx'; +import MicroservicesArchitectureWithRedis from '../common-data/microservices-arch-with-redis.mdx'; +import SourceCode from '../common-data/microservices-source-code-tip.mdx'; +import RedisCloud from '../common-data/redis-enterprise.mdx'; + +import cacheMissImage from './images/redis-cache-aside-cache-miss.png'; +import cacheHitImage from './images/redis-cache-aside-cache-hit.png'; + + + + + +## What is query caching? 
+
+Have you ever been in a situation where your database queries are slowing down?
+Query caching is the technique you need to speed up database queries by using different caching methods while keeping costs down! Imagine that you built an e-commerce application. It started small but is growing fast. By now, you have an extensive product catalog and millions of customers.
+
+That's good for business, but a hardship for technology. Your queries to the primary database (MongoDB or PostgreSQL) are beginning to slow down, even though you have already attempted to optimize them. You can squeak out a little extra performance, but it isn't enough to satisfy your customers.
+
+## Why you should use Redis for query caching
+
+Redis is an in-memory datastore, best known for caching. Redis allows you to reduce the load on a primary database while speeding up database reads.
+
+With any e-commerce application, there is one specific type of query that is most often requested. If you guessed that it’s the product search query, you’re correct!
+
+To improve product search in an e-commerce application, you can implement one of the following caching patterns:
+
+- **Cache prefetching**: An entire product catalog can be pre-cached in Redis, and the application can run any product query against Redis just as it would against the primary database.
+- **Cache-aside** pattern: Redis is filled on demand, based on whatever search parameters are requested by the frontend.
+
+:::tip
+
+If you use **Redis Cloud**, cache-aside is easier due to its support for JSON and search. You also get additional features such as real-time performance, high scalability, resiliency, and fault tolerance. You can also call upon high-availability features such as Active-Active geo-redundancy.
+
+:::
+
+This tutorial focuses on the **cache-aside** pattern. The goal of this design pattern is to set up **optimal** caching (load-as-you-go) for better read operations. With caching, you might be familiar with a "cache miss," where you do not find data in the cache, and a "cache hit," where you can find data in the cache. Let's look at how the cache-aside pattern works with Redis for both a "cache miss" and a "cache hit."
+
+### Cache-aside with Redis (cache miss)
+
+Cache miss when using the cache-aside pattern with Redis
+
+This diagram illustrates the steps taken in the cache-aside pattern when there is a "cache miss." To understand how this works, consider the following process:
+
+1. An application requests data from the backend.
+1. The backend checks to find out if the data is available in Redis.
+1. Data is not found (a cache miss), so the data is fetched from the database.
+1. The data returned from the database is subsequently stored in Redis.
+1. The data is then returned to the application.
+
+### Cache-aside with Redis (cache hit)
+
+Now that you have seen what a "cache miss" looks like, let's cover a "cache hit." Here is the same diagram, but with the "cache hit" steps highlighted in green.
+
+Cache hit when using the cache-aside pattern with Redis
+
+1. An application requests data from the backend.
+1. The backend checks to find out if the data is available in Redis.
+1. The data is then returned to the application.
+
+The cache-aside pattern is useful when you need to:
+
+1. **Query data frequently**: When you have a large volume of reads (as is the case in an e-commerce application), the cache-aside pattern gives you an immediate performance gain for subsequent data requests.
+1.
**Fill the cache on demand**: The cache-aside pattern fills the cache as data is requested rather than pre-caching, thus saving on space and cost. This is useful when it isn't clear what data will need to be cached. +1. **Be cost-conscious**: Since cache size is directly related to the cost of cache storage in the cloud, the smaller the size, the less you pay. + +:::tip + +If you use **Redis Cloud** and a database that uses a JDBC driver, you can take advantage of [**Redis Smart Cache**](https://redis.com/blog/redis-smart-cache/), which lets you add caching to an application without changing the code. [**Click here to learn more!**](https://github.com/redis-field-engineering/redis-smart-cache) + +::: + +## Microservices architecture for an e-commerce application + + + +## E-commerce application frontend using Next.js and Tailwind + + + + + +## Caching in a microservices application with Redis and primary database (MongoDB/ Postgressql) + +In our sample application, the products service publishes an API for filtering products. Here's what a call to the API looks like: + +### Get products by filter request + +```json title="docs/api/get-products-by-filter.md" +// POST http://localhost:3000/products/getProductsByFilter +{ + "productDisplayName": "puma" +} +``` + +### Get products by filter response (cache miss) + +```json {25} +{ + "data": [ + { + "productId": "11000", + "price": 3995, + "productDisplayName": "Puma Men Slick 3HD Yellow Black Watches", + "variantName": "Slick 3HD Yellow", + "brandName": "Puma", + "ageGroup": "Adults-Men", + "gender": "Men", + "displayCategories": "Accessories", + "masterCategory_typeName": "Accessories", + "subCategory_typeName": "Watches", + "styleImages_default_imageURL": "http://host.docker.internal:8080/images/11000.jpg", + "productDescriptors_description_value": "

Stylish and comfortable, ...", + "createdOn": "2023-07-13T14:07:38.020Z", + "createdBy": "ADMIN", + "lastUpdatedOn": "2023-07-13T14:07:38.020Z", + "lastUpdatedBy": null, + "statusCode": 1 + } + //... + ], + "error": null, + "isFromCache": false +} +``` + +### Get products by filter response (cache hit) + +```json {6} +{ + "data": [ + //...same data as above + ], + "error": null, + "isFromCache": true // now the data comes from the cache rather DB +} +``` + +### Implementing cache-aside with Redis and primary database (MongoDB/ Postgressql) + +The following code shows the function used to search for products in primary database: + +```typescript title="server/src/services/products/src/service-impl.ts" +async function getProductsByFilter(productFilter: Product) { + const prisma = getPrismaClient(); + + const whereQuery: Prisma.ProductWhereInput = { + statusCode: DB_ROW_STATUS.ACTIVE, + }; + + if (productFilter && productFilter.productDisplayName) { + whereQuery.productDisplayName = { + contains: productFilter.productDisplayName, + mode: 'insensitive', + }; + } + + const products: Product[] = await prisma.product.findMany({ + where: whereQuery, + }); + + return products; +} +``` + +You simply make a call to primary database (MongoDB/ Postgressql) to find products based on a filter on the product's `displayName` property. You can set up multiple columns for better fuzzy searching, but we simplified it for the purposes of this tutorial. + +Using primary database directly without Redis works for a while, but eventually it slows down. That's why you might use Redis, to speed things up. The cache-aside pattern helps you balance performance with cost. + +The basic decision tree for cache-aside is as follows. + +When the frontend requests products: + +1. Form a hash with the contents of the request (i.e., the search parameters). +1. Check Redis to see if a value exists for the hash. +1. Is there a cache hit? If data is found for the hash, it is returned; the process stops here. +1. Is there a cache miss? When data is not found, it is read out of primary database and subsequently stored in Redis prior to being returned. + +Here’s the code used to implement the decision tree: + +```typescript title="server/src/services/products/src/routes.ts" +const getHashKey = (_filter: Document) => { + let retKey = ''; + if (_filter) { + const text = JSON.stringify(_filter); + retKey = crypto.createHash('sha256').update(text).digest('hex'); + } + return 'CACHE_ASIDE_' + retKey; +}; + +router.post(API.GET_PRODUCTS_BY_FILTER, async (req: Request, res: Response) => { + const body = req.body; + // using node-redis + const redis = getNodeRedisClient(); + + //get data from redis + const hashKey = getHashKey(req.body); + const cachedData = await redis.get(hashKey); + const docArr = cachedData ? JSON.parse(cachedData) : []; + + if (docArr && docArr.length) { + result.data = docArr; + result.isFromCache = true; + } else { + // get data from primary database + const dbData = await getProductsByFilter(body); //method shown earlier + + if (body && body.productDisplayName && dbData.length) { + // set data in redis (no need to wait) + redis.set(hashKey, JSON.stringify(dbData), { + EX: 60, // cache expiration in seconds + }); + } + + result.data = dbData; + } + + res.send(result); +}); +``` + +:::tip + +You need to decide what expiry or time to live (TTL) works best for your particular use case. + +::: + +## Ready to use Redis for query caching? 
+ +You now know how to use Redis for caching with one of the most common caching patterns (cache-aside). It's possible to incrementally adopt Redis wherever needed with different strategies/patterns. For more resources on the topic of microservices, check out the links below: + +### Additional resources + +- Microservices with Redis + - [CQRS](/howtos/solutions/microservices/cqrs) + - [Interservice communication](/howtos/solutions/microservices/interservice-communication) + - [API gateway caching](/howtos/solutions/microservices/api-gateway-caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/microservices/common-data/images/design-cart-2.png b/docs/howtos/solutions/microservices/common-data/images/design-cart-2.png new file mode 100644 index 00000000000..173812accd9 Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/design-cart-2.png differ diff --git a/docs/howtos/solutions/microservices/common-data/images/design-cart.png b/docs/howtos/solutions/microservices/common-data/images/design-cart.png new file mode 100644 index 00000000000..e97caf2e5b4 Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/design-cart.png differ diff --git a/docs/howtos/solutions/microservices/common-data/images/design-dashboard.png b/docs/howtos/solutions/microservices/common-data/images/design-dashboard.png new file mode 100644 index 00000000000..8f353798de0 Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/design-dashboard.png differ diff --git a/docs/howtos/solutions/microservices/common-data/images/design-order-history.png b/docs/howtos/solutions/microservices/common-data/images/design-order-history.png new file mode 100644 index 00000000000..1c1ba3bc0d9 Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/design-order-history.png differ diff --git a/docs/howtos/solutions/microservices/common-data/images/initial-microservices-arch.png b/docs/howtos/solutions/microservices/common-data/images/initial-microservices-arch.png new file mode 100644 index 00000000000..c17df575d3b Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/initial-microservices-arch.png differ diff --git a/docs/howtos/solutions/microservices/common-data/images/redis-microservices-arch.png b/docs/howtos/solutions/microservices/common-data/images/redis-microservices-arch.png new file mode 100644 index 00000000000..ae3d2b8cc15 Binary files /dev/null and b/docs/howtos/solutions/microservices/common-data/images/redis-microservices-arch.png differ diff --git a/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis-old.mdx b/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis-old.mdx new file mode 100644 index 00000000000..fe4df59d0ed --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis-old.mdx @@ -0,0 +1,10 @@ +The e-commerce microservices application discussed in the rest of this tutorial uses the following architecture: + +1. 
`products service`: handles querying products from the database and returning them to the frontend +1. `orders service`: handles validating and creating orders +1. `order history service`: handles querying a customer's order history +1. `payments service`: handles processing orders for payment +1. `digital identity service`: handles storing digital identity and calculating identity score +1. `api gateway`: unifies services under a single endpoint +1. `mongodb`: serves as the primary database, storing orders, order history, products, etc. +1. `redis`: serves as the **stream processor** and caching database diff --git a/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis.mdx b/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis.mdx new file mode 100644 index 00000000000..282a8952980 --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-arch-with-redis.mdx @@ -0,0 +1,16 @@ +The e-commerce microservices application discussed in the rest of this tutorial uses the following architecture: + +1. `products service`: handles querying products from the database and returning them to the frontend +1. `orders service`: handles validating and creating orders +1. `order history service`: handles querying a customer's order history +1. `payments service`: handles processing orders for payment +1. `digital identity service`: handles storing digital identity and calculating identity score +1. `api gateway`: unifies services under a single endpoint +1. `mongodb/ postgresql`: serves as the primary database, storing orders, order history, products, etc. +1. `redis`: serves as the **stream processor** and caching database + +:::info + +You don't need to use MongoDB/ Postgresql as your primary database in the demo application; you can use other [prisma supported databases](https://www.prisma.io/docs/reference/database-reference/supported-databases) as well. This is just an example. + +::: diff --git a/docs/howtos/solutions/microservices/common-data/microservices-arch.mdx b/docs/howtos/solutions/microservices/common-data/microservices-arch.mdx new file mode 100644 index 00000000000..cf1e99c84a3 --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-arch.mdx @@ -0,0 +1,14 @@ +Lets take a look at the architecture of the demo application: + +1. `products service`: handles querying products from the database and returning them to the frontend +1. `orders service`: handles validating and creating orders +1. `order history service`: handles querying a customer's order history +1. `payments service`: handles processing orders for payment +1. `api gateway`: unifies the services under a single endpoint +1. `mongodb/ postgresql`: serves as the write-optimized database for storing orders, order history, products, etc. + +:::info + +You don't need to use MongoDB/ Postgresql as your write-optimized database in the demo application; you can use other [prisma supported databases](https://www.prisma.io/docs/reference/database-reference/supported-databases) as well. This is just an example. 
+ +::: diff --git a/docs/howtos/solutions/microservices/common-data/microservices-ecommerce-old.mdx b/docs/howtos/solutions/microservices/common-data/microservices-ecommerce-old.mdx new file mode 100644 index 00000000000..88f06afcf31 --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-ecommerce-old.mdx @@ -0,0 +1,13 @@ +The e-commerce microservices application consists of a frontend, built using [Next.js](https://nextjs.org/) with [TailwindCSS](https://tailwindcss.com/). The application backend uses [Node.js](https://nodejs.org). The data is stored in +[Redis](https://redis.com/try-free/) and MongoDB. Below you will find screenshots of the frontend of the e-commerce app: + +- `Dashboard`: Shows the list of products with search functionality + + ![redis microservices e-commerce app frontend products page](images/design-dashboard.png) + +- `Shopping Cart`: Add products to the cart, then check out using the "Buy Now" button + ![redis microservices e-commerce app frontend shopping cart](images/design-cart-2.png) + +- `Order history`: Once an order is placed, the `Orders` link in the top navigation bar shows the order status and history + + ![redis microservices e-commerce app frontend order history page](images/design-order-history.png) diff --git a/docs/howtos/solutions/microservices/common-data/microservices-ecommerce.mdx b/docs/howtos/solutions/microservices/common-data/microservices-ecommerce.mdx new file mode 100644 index 00000000000..62ef1b49cda --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-ecommerce.mdx @@ -0,0 +1,13 @@ +The e-commerce microservices application consists of a frontend, built using [Next.js](https://nextjs.org/) with [TailwindCSS](https://tailwindcss.com/). The application backend uses [Node.js](https://nodejs.org). The data is stored in +[Redis](https://redis.com/try-free/) and MongoDB/ Postgressql using [Prisma](https://www.prisma.io/docs/reference/database-reference/supported-databases). 
Below you will find screenshots of the frontend of the e-commerce app: + +- `Dashboard`: Shows the list of products with search functionality + +  ![redis microservices e-commerce app frontend products page](images/design-dashboard.png) + +- `Shopping Cart`: Add products to the cart, then check out using the "Buy Now" button +  ![redis microservices e-commerce app frontend shopping cart](images/design-cart-2.png) + +- `Order history`: Once an order is placed, the `Orders` link in the top navigation bar shows the order status and history + +  ![redis microservices e-commerce app frontend order history page](images/design-order-history.png) diff --git a/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip-old.mdx b/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip-old.mdx new file mode 100644 index 00000000000..1de0c9f6156 --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip-old.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to clone the source code for the application used in this tutorial: + +git clone --branch v1.0.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions + +::: diff --git a/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip.mdx b/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip.mdx new file mode 100644 index 00000000000..c3a5c22dc4a --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/microservices-source-code-tip.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to clone the source code for the application used in this tutorial: + +git clone --branch v4.2.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions + +::: diff --git a/docs/howtos/solutions/microservices/common-data/redis-enterprise.mdx b/docs/howtos/solutions/microservices/common-data/redis-enterprise.mdx new file mode 100644 index 00000000000..90bf6b099e0 --- /dev/null +++ b/docs/howtos/solutions/microservices/common-data/redis-enterprise.mdx @@ -0,0 +1,13 @@ +You can use **Redis Cloud** as a multi-model primary database. Redis Cloud is a fully managed, highly available, secure, and real-time data platform. It can store data in both RAM and **Flash**. It also supports **Active-Active** (multi-zone read and write replicas) on different cloud vendors, providing extreme high availability and scalability. Active-Active offers global scalability while maintaining local speed for database reads and writes. + +Redis Cloud has many built-in modular capabilities, making it a unified, real-time data platform. Redis Cloud is far more than a document database. + +- **JSON**: Persists JSON documents +- **Search**: Indexes and searches JSON documents +- **Probabilistic data structures**: Provides Bloom filters and other probabilistic data structures +- **Time Series**: Supports time series data structures +- **Triggers and Functions**: Syncs data to external databases via different patterns (write-behind/write-through) or executes custom logic. + +Use [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to view your Redis data or to play with raw Redis commands in the workbench. + +If you're interested in diving deeper, try [Redis Cloud](https://redis.com/try-free) today for free!
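+To make the **JSON** and **Search** capabilities above concrete, here is a minimal Node Redis sketch (the key name, index name, and field values are illustrative examples only, not code from the demo application):
+
+```typescript
+// Minimal sketch: persist a JSON document, index it, and query it.
+// Assumes a local Redis Stack / Redis Cloud database on the default port.
+import { createClient, SchemaFieldTypes } from 'redis';
+
+const client = createClient({ url: 'redis://localhost:6379' });
+await client.connect();
+
+// JSON: persist a document
+await client.json.set('Order:1', '$', { orderId: '1', userId: 'alice', total: 4950 });
+
+// Search: index JSON documents stored under the Order: prefix by userId
+// (this call fails if the index already exists)
+await client.ft.create(
+  'idx:orders',
+  { '$.userId': { type: SchemaFieldTypes.TAG, AS: 'userId' } },
+  { ON: 'JSON', PREFIX: 'Order:' },
+);
+
+// ...then query the documents like a search engine
+const results = await client.ft.search('idx:orders', '@userId:{alice}');
+console.log(results.total, results.documents);
+```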
diff --git a/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-cdc.png b/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-cdc.png new file mode 100644 index 00000000000..94cb112fed3 Binary files /dev/null and b/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-cdc.png differ diff --git a/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-redis.png b/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-redis.png new file mode 100644 index 00000000000..7f821d03dba Binary files /dev/null and b/docs/howtos/solutions/microservices/cqrs/images/cqrs-architecture-with-redis.png differ diff --git a/docs/howtos/solutions/microservices/cqrs/images/cqrs-pattern.png b/docs/howtos/solutions/microservices/cqrs/images/cqrs-pattern.png new file mode 100644 index 00000000000..4ce74c90ec3 Binary files /dev/null and b/docs/howtos/solutions/microservices/cqrs/images/cqrs-pattern.png differ diff --git a/docs/howtos/solutions/microservices/cqrs/images/redis-1-order-with-products.png b/docs/howtos/solutions/microservices/cqrs/images/redis-1-order-with-products.png new file mode 100644 index 00000000000..dea925a1238 Binary files /dev/null and b/docs/howtos/solutions/microservices/cqrs/images/redis-1-order-with-products.png differ diff --git a/docs/howtos/solutions/microservices/cqrs/index-cqrs.mdx b/docs/howtos/solutions/microservices/cqrs/index-cqrs.mdx new file mode 100644 index 00000000000..5edc3ca0327 --- /dev/null +++ b/docs/howtos/solutions/microservices/cqrs/index-cqrs.mdx @@ -0,0 +1,344 @@ +--- +id: index-solutions-cqrs +title: How to Build an E-Commerce App Using Redis with the CQRS Pattern +sidebar_label: How to Build an E-Commerce App Using Redis with the CQRS Pattern +slug: /howtos/solutions/microservices/cqrs +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import MicroservicesEcommerceDesign from '../common-data/microservices-ecommerce.mdx'; +import InitialMicroservicesArchitecture from '../common-data/microservices-arch.mdx'; +import SourceCode from '../common-data/microservices-source-code-tip.mdx'; +import RedisCloud from '../common-data/redis-enterprise.mdx'; + +import cqrsPattern from './images/cqrs-pattern.png'; +import cqrsArchitectureWithCdc from './images/cqrs-architecture-with-cdc.png'; +import imageSampleOrder from './images/redis-1-order-with-products.png'; + + + + + +## What is command and query responsibility segregation (CQRS)? + +Command Query Responsibility Segregation (CQRS) is a critical pattern within a microservice architecture. It decouples reads (queries) and writes (commands), which permits read and write workloads to work independently. + +Commands(write) focus on higher durability and consistency, while queries(read) focus on performance. This enables a microservice to write data to a slower system of record disk-based database, while pre-fetching and caching that data in a cache for real-time reads. + +The idea is simple: you separate commands such as "Order this product" (a write operation) from queries such as "Show me my order history" (a read operation). CQRS applications are often messaging-based and rely on [eventual consistency](https://en.wikipedia.org/wiki/Eventual_consistency). 
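+To make the separation concrete, here is a minimal TypeScript sketch (the `primaryDb` parameter and the function names are hypothetical placeholders, not code from the demo application): the command handler writes to the durable system of record and projects the result into Redis, while the query handler reads only from Redis.
+
+```typescript
+// Illustrative CQRS sketch only; not the demo application's actual code.
+import { createClient } from 'redis';
+
+type Order = {
+  orderId: string;
+  userId: string;
+  products: { productId: string; qty: number; productPrice: number }[];
+};
+
+const redis = createClient({ url: 'redis://localhost:6379' });
+await redis.connect();
+
+// Command side ("Order this product"): write to the system of record first,
+// then project the new state into Redis so reads never hit the write database.
+async function handleCreateOrderCommand(
+  order: Order,
+  primaryDb: { insertOrder(o: Order): Promise<void> }, // stand-in for MongoDB/PostgreSQL access
+) {
+  await primaryDb.insertOrder(order); // durable write
+  await redis.json.set(`Order:${order.orderId}`, '$', order as any); // read model (cast keeps the sketch brief)
+}
+
+// Query side ("Show me my order history"): served entirely from Redis,
+// so it scales independently of the write path and stays fast under read-heavy load.
+async function handleOrderHistoryQuery(orderIds: string[]) {
+  return Promise.all(orderIds.map((id) => redis.json.get(`Order:${id}`)));
+}
+```
+
+In the demo application described below, the same split shows up as the `orders service` (command side) and the `order history service` (query side).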
+ +The sample data architecture that follows demonstrates how to use Redis with CQRS: + +CQRS architecture with Redis + +The architecture illustrated in the diagram uses the Change Data Capture pattern (noted as "Integrated CDC") to track the changed state on the command database and to replicate it to the query database (Redis). This is a common pattern used with CQRS. + +Implementing CDC requires: + +1. Taking the data snapshot from the system of record +1. Performing an ETL operation finalized to load the data on the target cache database +1. Setting up a mechanism to continuously stream the changes in the system of record to the cache + +:::tip + +While you can implement your own CDC mechanism with Redis using Triggers and Functions, Redis Cloud comes with its own integrated CDC mechanism to solve this problem for you. + +::: + +## Why you might use CQRS + +> _To improve application performance, scale your read and write operations separately._ + +Consider the following scenario: You have an e-commerce application that allows a customer to populate a shopping cart with products. The site has a "Buy Now" button to facilitate ordering those products. When first starting out, you might set up and populate a product database (perhaps a SQL database). Then you might write a backend API to handle the processes of creating an order, creating an invoice, processing payments, handling fulfillment, and updating the customer's order history… all in one go. + +This method of synchronous order processing seemed like a good idea. But you soon find out that your database slows down as you gain more customers and have a higher sales volume. In reality, most applications have significantly more reads than writes. You should scale those operations separately. + +You decide that you need to process orders quickly so the customer doesn't have to wait. Then, when you have time, you can create an invoice, process payment, handle fulfillment, etc. + +So you decide to separate each of these steps. Using a microservices approach with CQRS allows you to scale your reads and writes independently as well as aid in decoupling your microservices. With a CQRS model, a single service is responsible for handling an entire command from end to end. One service should not depend on another service in order to complete a command. + +## Microservices CQRS architecture for an e-commerce application + + + +## Using CQRS in a microservices architecture + +Note that in the current architecture all the services use the same underlying database. Even though you’re technically separating reads and writes, you can't scale the write-optimized database independently. This is where Redis comes in. If you put Redis in front of your write-optimized database, you can use it for reads while writing to the write-optimized database. The benefit of Redis is that it’s fast for reads and writes, which is why it’s the best choice for caching and CQRS. + +:::info + +For the purposes of this tutorial, we’re not highlighting how communication is coordinated between our services, such as how new orders are processed for payment. That process uses Redis Streams, and is outlined in our [interservice communication guide](/howtos/solutions/microservices/interservice-communication). + +::: + +:::tip + +When your e-commerce application eventually needs to scale across the globe, Redis Cloud provides Active-Active geo-distribution for reads and writes at local latencies as well as availability of 99.999% uptime. 
+ +::: + +Let's look at some sample code that helps facilitate the CQRS pattern with Redis and Primary database (MongoDB/ Postgressql). + +## E-commerce application frontend using Next.js and Tailwind + + + + + +## Building a CQRS microservices application with Redis and Primary database (MongoDB/ Postgressql) + +Let's look at the sample code for the `order service` and see the `CreateOrder` command (a write operation). Then we look at the `order history service` to see the `ViewOrderHistory` command (a read operation). + +### Create order command API + +The code that follows shows an example API request and response to create an order. + +#### Create order request + +```json title="docs/api/create-order.md" +// POST http://api-gateway/orders/createOrder +{ + "products": [ + { + "productId": "11002", + "qty": 1, + "productPrice": 4950 + }, + { + "productId": "11012", + "qty": 2, + "productPrice": 1195 + } + ] +} +``` + +#### Create order response + +```json +{ + "data": "d4075f43-c262-4027-ad25-7b1bc8c490b6", //orderId + "error": null +} +``` + +When you make a request, it goes through the API gateway to the `orders service`. Ultimately, it ends up calling a `createOrder` function which looks as follows: + +```typescript title="server/src/services/orders/src/service-impl.ts" +const createOrder = async ( + order: IOrder, + //... +) => { + if (!order) { + throw 'Order data is mandatory!'; + } + + const userId = order.userId || USERS.DEFAULT; // Used as a shortcut, in a real app you would use customer session data to fetch user details + const orderId = uuidv4(); + + order.orderId = orderId; + order.orderStatusCode = ORDER_STATUS.CREATED; + order.userId = userId; + order.createdBy = userId; + order.statusCode = DB_ROW_STATUS.ACTIVE; + order.potentialFraud = false; + + order = await validateOrder(order); + + const products = await getProductDetails(order); + addProductDataToOrders(order, products); + + await addOrderToRedis(order); + + await addOrderToPrismaDB(order); + + //... + + return orderId; +}; +``` + +:::info + +For tutorial simplicity, we add data to both primary database and Redis in the same service (double-write). As mentioned earlier, a common pattern is to have your services write to one database, and then separately use a CDC mechanism to update the other database. For example, you could write directly to Redis, then use **RedisGears** to handle synchronizing Redis and primary database in the background. For the purposes of this tutorial, we don't outline exactly how you might handle synchronization, but instead focus on how the data is stored and accessed in Redis. + +::: + +:::tip + +If you're using **Redis Cloud**, you can take advantage of the **integrated CDC** mechanism to avoid having to roll your own. + +::: + +Note that in the previous code block we call the `addOrderToRedis` function to store orders in Redis. We use [Redis OM for Node.js](https://github.com/redis/redis-om-node) to store the order entities in Redis. 
This is what that function looks like: + +```typescript title="server/src/services/orders/src/service-impl.ts" +import { Schema, Repository } from 'redis-om'; +import { getNodeRedisClient } from '../utils/redis/redis-wrapper'; + +//Redis Om schema for Order +const schema = new Schema('Order', { + orderId: { type: 'string', indexed: true }, + + orderStatusCode: { type: 'number', indexed: true }, + potentialFraud: { type: 'boolean', indexed: false }, + userId: { type: 'string', indexed: true }, + + createdOn: { type: 'date', indexed: false }, + createdBy: { type: 'string', indexed: true }, + lastUpdatedOn: { type: 'date', indexed: false }, + lastUpdatedBy: { type: 'string', indexed: false }, + statusCode: { type: 'number', indexed: true }, +}); + +//Redis OM repository for Order (to read, write and remove orders) +const getOrderRepository = () => { + const redisClient = getNodeRedisClient(); + const repository = new Repository(schema, redisClient); + return repository; +}; + +//Redis indexes data for search +const createRedisIndex = async () => { + const repository = getRepository(); + await repository.createIndex(); +}; + +const addOrderToRedis = async (order: OrderWithIncludes) => { + if (order) { + const repository = getOrderRepository(); + //insert Order in to Redis + await repository.save(order.orderId, order); + } +}; +``` + +Sample **Order** view using [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) + +sample order + +:::tip +Download [**RedisInsight**](https://redis.com/redis-enterprise/redis-insight/) to view your Redis data or to play with raw Redis commands in the workbench. +::: + +### Order history API + +The code that follows shows an example API request and response to get a customer's order history. + +#### Order history request + +```json title="docs/api/view-order-history.md" +// GET http://api-gateway/orderHistory/viewOrderHistory +``` + +#### Order history response + +```json +{ + "data": [ + { + "orderId": "d4075f43-c262-4027-ad25-7b1bc8c490b6", + "userId": "USR_22fcf2ee-465f-4341-89c2-c9d16b1f711b", + "orderStatusCode": 4, + "products": [ + { + "productId": "11002", + "qty": 1, + "productPrice": 4950, + "productData": { + "productId": "11002", + "price": 4950, + "productDisplayName": "Puma Men Race Black Watch", + "variantName": "Race 85", + "brandName": "Puma", + "ageGroup": "Adults-Men", + "gender": "Men", + "displayCategories": "Accessories", + "masterCategory_typeName": "Accessories", + "subCategory_typeName": "Watches", + "styleImages_default_imageURL": "http://host.docker.internal:8080/images/11002.jpg", + "productDescriptors_description_value": "
This watch from puma comes in a heavy duty design. The assymentric dial and chunky..." + } + }, + { + "productId": "11012", + "qty": 2, + "productPrice": 1195, + "productData": { + "productId": "11012", + "price": 1195, + "productDisplayName": "Wrangler Women Frill Check Multi Tops", + "variantName": "FRILL CHECK", + "brandName": "Wrangler", + "ageGroup": "Adults-Women", + "gender": "Women", + "displayCategories": "Sale and Clearance,Casual Wear", + "masterCategory_typeName": "Apparel", + "subCategory_typeName": "Topwear", + "styleImages_default_imageURL": "http://host.docker.internal:8080/images/11012.jpg", + "productDescriptors_description_value": "Composition:
Navy blue, red, yellow and white checked top made of 100% cotton, with a jabot collar, buttoned ..." + } + } + ], + "createdBy": "USR_22fcf2ee-465f-4341-89c2-c9d16b1f711b", + "lastUpdatedOn": "2023-07-13T14:11:49.997Z", + "lastUpdatedBy": "USR_22fcf2ee-465f-4341-89c2-c9d16b1f711b" + } + ], + "error": null +} +``` + +When you make a request, it goes through the API gateway to the `order history service`. Ultimately, it ends up calling a `viewOrderHistory` function, which looks as follows: + +```typescript title="server/src/services/order-history/src/service-impl.ts" +const viewOrderHistory = async (userId: string) => { + const repository = OrderRepo.getRepository(); + let orders: Partial[] = []; + const queryBuilder = repository + .search() + .where('createdBy') + .eq(userId) + .and('orderStatusCode') + .gte(ORDER_STATUS.CREATED) //returns CREATED and PAYMENT_SUCCESS + .and('statusCode') + .eq(DB_ROW_STATUS.ACTIVE); + + console.log(queryBuilder.query); + orders = []>await queryBuilder.return.all(); +}; +``` + +:::info + +Note that the `order history service` only needs to go to Redis for all orders. This is because we handle storage and synchronization between Redis and primary database within the `orders service`. + +::: + +You might be used to using Redis as a cache and both storing and retrieving stringified JSON values or perhaps hashed values. However, look closely at the code above. In it, we store orders as **JSON** documents, and then use [Redis OM](https://github.com/redis/redis-om-node) to search for the orders that belong to a specific user. Redis operates like a search engine, here, with the ability to speed up queries and scale independently from the primary database (which in this case is MongoDB/ Postgressql). + +## Ready to use Redis with the CQRS pattern? + +Hopefully, this tutorial has helped you visualize how to use Redis with the CQRS pattern. It can help to reduce the load on your primary database while still allowing you to store and search JSON documents. For additional resources related to this topic, check out the links below: + +### Additional resources + +- Microservices with Redis + - [Interservice communication](/howtos/solutions/microservices/interservice-communication) + - [Query caching](/howtos/solutions/microservices/caching) + - [API gateway caching](/howtos/solutions/microservices/api-gateway-caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. 
+- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/microservices/interservice-communication/images/02-orders-stream.png b/docs/howtos/solutions/microservices/interservice-communication/images/02-orders-stream.png new file mode 100644 index 00000000000..45a2a43ddb8 Binary files /dev/null and b/docs/howtos/solutions/microservices/interservice-communication/images/02-orders-stream.png differ diff --git a/docs/howtos/solutions/microservices/interservice-communication/images/04-payments-stream.png b/docs/howtos/solutions/microservices/interservice-communication/images/04-payments-stream.png new file mode 100644 index 00000000000..4c410fc7697 Binary files /dev/null and b/docs/howtos/solutions/microservices/interservice-communication/images/04-payments-stream.png differ diff --git a/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-event-flow-diagram.png b/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-event-flow-diagram.png new file mode 100644 index 00000000000..c89ea49dcc5 Binary files /dev/null and b/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-event-flow-diagram.png differ diff --git a/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-redis-streams.png b/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-redis-streams.png new file mode 100644 index 00000000000..608f5889003 Binary files /dev/null and b/docs/howtos/solutions/microservices/interservice-communication/images/interservice-communication-redis-streams.png differ diff --git a/docs/howtos/solutions/microservices/interservice-communication/index-interservice-communication.mdx b/docs/howtos/solutions/microservices/interservice-communication/index-interservice-communication.mdx new file mode 100644 index 00000000000..7b81b848fbf --- /dev/null +++ b/docs/howtos/solutions/microservices/interservice-communication/index-interservice-communication.mdx @@ -0,0 +1,403 @@ +--- +id: index-solutions-interservice-communication +title: Microservices Communication with Redis Streams +sidebar_label: Microservices Communication with Redis Streams +slug: /howtos/solutions/microservices/interservice-communication +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import MicroservicesEcommerceDesign from '../common-data/microservices-ecommerce-old.mdx'; +import MicroservicesArchitectureWithRedis from '../common-data/microservices-arch-with-redis-old.mdx'; +import SourceCode from '../common-data/microservices-source-code-tip-old.mdx'; +import RedisCloud from '../common-data/redis-enterprise.mdx'; + + + + + +## What is interservice communication? + +When building a microservices application, people use a variety of options for communication between services. Among them: + +1. **Publish/Subscribe** model: In the pub/sub model (fire and forget), a publisher produces messages, and subscribers that are _active at the time_ consume those messages. Inactive subscribers cannot receive the messages at a later point in time. +1. 
**Streaming**: Most microservices applications use an event-streaming solution because of: + - **Message persistence**: Unlike the pub/sub model, messages stored in streams can be read by multiple consumers at any time; they fan out. So consumers can read messages at a later point in time, even if they were not active when the message was originally appended to the stream. + - **Inherent replayability**: Even if a subscriber crashes during the message processing, it can re-read the exact same unacknowledged message from the stream. For example, say the crashed subscriber never comes back online; the consumer group feature allows consumers to process unacknowledged messages of other consumers after a specified time. + - **Separation of concerns**: Producers can produce messages to stream at **high speed** separately and consumers can process messages at their own speed separately. This separation of concerns solves both the "fast producer -> slow consumer" and "slow producer -> fast consumer" problem, allowing for the scaling of those services independently. + +In an event-driven microservices architecture, you might have some services that publish an API, and other services that are simply producers and consumers of events with no external API. + +## Why you should use Redis for interservice communication + +Consider the following scenario: You have an e-commerce application with an architecture that is broken down into different microservices including as `create an order`, `create an invoice`, `process a payment`, `fulfill and order`, and so on. Microservices allow you to separate these commands into different services to scale independently, enabling more customers to get their orders processed quickly and simultaneously, which results in a better user experience, higher sales volume, and less-cranky customer service staff. + +When you use microservices, you need a tool for interservice communication. Initially, you might consider using a product like **Kafka** for streaming, but setting it up is rather complicated. What many people don't know about Redis is that it supports streams in the same way Kafka does. Given that you are likely already using Redis for caching, it makes sense to also use it for stream processing. To reduce the complexity of application architecture and maintenance, **Redis** is a great option for interservice communication. Here we break down the process of using Redis with streams for interservice communication. + +## Microservices architecture for an e-commerce application + + + +This diagram illustrates how Redis Streams is used as the message broker between the `orders service` and the `payments service`: + +![Microservices interservice communication with Redis streams message broker](images/interservice-communication-redis-streams.png) + +:::tip + +Redis Streams is more cost-effective than using Kafka or other similar technologies. +With sub-millisecond latency and a lightweight Streams log data structure, Redis is easier to deploy, develop, and operate. +::: + +## Using Redis for interservice communication in an event-driven architecture + +The following event flow diagram illustrates how the `orders service` and `payments service` communicate through Redis with streams: + +![Microservices streaming with Redis event flow diagram](images/interservice-communication-event-flow-diagram.png) + +Let's outline the streams and events used below: + +1. The `orders service` inserts order data into the database. 
+ + ```json + //sample order data + { + "orderId": "01GTP3K2TZQQCQ0T2G43DSSMTD", + "products": [ + { + "productId": 11000, + "qty": 3, + "productPrice": 3995, + "productData": { + "productDisplayName": "Puma Men Slick 3HD Yellow Black Watches", + "variantName": "Slick 3HD Yellow", + "brandName": "Puma", + "ageGroup": "Adults-Men", + "gender": "Men" + //... + } + }, + { + "productId": 11001, + "qty": 2, + "productPrice": 5450, + "productData": { + "productDisplayName": "Puma Men Top Fluctuation Red Black Watches", + "variantName": "Top Fluctuation Red", + "brandName": "Puma", + "ageGroup": "Adults-Men", + "gender": "Men" + //... + } + } + ], + "userId": "USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3", + "orderStatusCode": 1, //order created + "createdBy": "USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3", + "statusCode": 1 + } + ``` + +2. The `orders service` also appends minimal data (orderId, orderAmount, and userId) to the `ORDERS_STREAM` to signal new order creation (i.e., it acts as `PRODUCER` of the `ORDERS_STREAM`). + + ![orders-stream](./images/02-orders-stream.png) + +3. The `payments service` listens to the `ORDERS_STREAM` and processes payments for new orders, then inserts payment data into the database (i.e, it acts as the `CONSUMER` of the `ORDERS_STREAM`). + + ```json + //sample payment data + { + "paymentId": "6403212956a976300afbaac1", + "orderId": "01GTP3K2TZQQCQ0T2G43DSSMTD", + "orderAmount": 22885, + "paidAmount": 22885, + "orderStatusCode": 3, //payment successful + "userId": "USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3", + "createdOn": { + "$date": { + "$numberLong": "1677926697841" + } + }, + "createdBy": "USR_4e7acc44-e91e-4c5c-9112-bdd99d799dd3", + "statusCode": 1 + } + ``` + +4. The `payments service` appends minimal data (orderId, paymentId, orderStatusCode, and userId) to the `PAYMENTS_STREAM` to signal a new payment (i.e., it acts as the `PRODUCER` of the `PAYMENTS_STREAM`). + ![payments-stream](./images/04-payments-stream.png) + +5. The `orders service` listens to the `PAYMENTS_STREAM` and updates the orderStatus and paymentId for orders in the database accordingly as the order payment is fulfilled (i.e., it acts as the `CONSUMER` of the `PAYMENTS_STREAM`). + +```json +{ + //order collection update + "orderId": "01GTP3K2TZQQCQ0T2G43DSSMTD", + "paymentId": "6403212956a976300afbaac1", + "orderStatusCode": 3 //payment success + //... +} +``` + +## E-commerce application frontend using Next.js and Tailwind + + + + + +## Building an interservice communication application with Redis + +We use Redis to broker the events sent between the `orders service` and the `payments service`. + +### Producer 1 (orders service) + +Let's look at some of the code in the orders service to understand how it works: + +1. Orders are created. +2. After order creation, the `orders service` appends minimal data to the `ORDERS_STREAM` to signal new order creation. + +```typescript title="server/src/services/orders/src/service-impl.ts" +const addOrderIdToStream = async ( + orderId: string, + orderAmount: number, + userId: string, +) => { + const nodeRedisClient = getNodeRedisClient(); + if (orderId && nodeRedisClient) { + const streamKeyName = 'ORDERS_STREAM'; + const entry = { + orderId: orderId, + orderAmount: orderAmount.toFixed(2), + userId: userId, + }; + const id = '*'; //* = auto generate + //xAdd adds entry to specified stream + await nodeRedisClient.xAdd(streamKeyName, id, entry); + } +}; +``` + +### Consumer 1 (payments service) + +3. 
The `payments service` listens to the `ORDERS_STREAM` + +```typescript title="server/src/services/payments/src/service-impl.ts" +// Below is some code for how you would use Redis to listen for the stream events: + +async function listenToStream( + onMessage: (message: any, messageId: string) => Promise, +) { + // using node-redis + const redis = getNodeRedisClient(); + const streamKeyName = 'ORDERS_STREAM'; //stream name + const groupName = 'ORDERS_CON_GROUP'; // listening consumer group name (custom) + const consumerName = 'PAYMENTS_CON'; // listening consumer name (custom) + const readMaxCount = 100; + + // Check if the stream group already exists + if (!(await redis.exists(streamKeyName))) { + const idPosition = '0'; //0 = start, $ = end or any specific id + await nodeRedisClient.xGroupCreate(streamKeyName, groupName, idPosition, { + MKSTREAM: true, + }); + } + + // setup a loop to listen for stream events + while (true) { + // read set of messages from different streams + const dataArr = await nodeRedisClient.xReadGroup( + commandOptions({ + isolated: true, + }), + groupName, + consumerName, + [ + { + // you can specify multiple streams in array + key: streamKeyName, + id: '>', // Next entry ID that no consumer in this group has read + }, + ], + { + COUNT: readMaxCount, // Read n entries at a time + BLOCK: 0, // block for 0 (infinite) seconds if there are none. + }, + ); + + for (let data of dataArr) { + for (let messageItem of data.messages) { + // process the message received (in our case, perform payment) + await onMessage(messageItem.message, messageItem.id); + + // acknowledge individual messages after processing + nodeRedisClient.xAck(streamKeyName, groupName, messageItem.id); + } + } + } +} + +// `listenToStream` listens for events and calls the `onMessage` callback to further handle the events. +listenToStream({ + onMessage: processPaymentForNewOrders, +}); + +const processPaymentForNewOrders: IMessageHandler = async ( + message, + messageId, +) => { + /* + message = { + orderId: "", + orderAmount: "", + userId: "", + } + */ + // process payment for new orderId and insert "payments" data to database +}; +``` + +:::note + +There are a few important things to note here: + +1. Make sure the stream group doesn't exist prior to creating it. +1. Use `isolated: true,` in order to use the blocking version of `XREADGROUP` in [isolated execution](https://github.com/redis/node-redis/blob/master/docs/isolated-execution.md) mode. +1. Acknowledge individual messages after you process them to remove the messages from the pending orders queue and to avoid processing them more than once. + +::: + +### Producer 2 (payments service) + +4. The `payments service` appends minimal data to `PAYMENTS_STREAM` to signal that a payment has been fulfilled. + +```typescript title="server/src/services/payments/src/service-impl.ts" +const addPaymentIdToStream = async ( + orderId: string, + paymentId: string, + orderStatus: number, + userId: string, +) => { + const nodeRedisClient = getNodeRedisClient(); + if (orderId && nodeRedisClient) { + const streamKeyName = 'PAYMENTS_STREAM'; + const entry = { + orderId: orderId, + paymentId: paymentId, + orderStatusCode: orderStatus.toString(), + userId: userId, + }; + const id = '*'; //* = auto generate + //xAdd adds entry to specified stream + await nodeRedisClient.xAdd(streamKeyName, id, entry); + } +}; +``` + +### Consumer 2 (orders service) + +5. The `orders service` listens to the `PAYMENTS_STREAM` and updates the order when payments are fulfilled. 
+ +```typescript title="server/src/services/orders/src/service-impl.ts" +//Below is some code for how you would use Redis to listen for the stream events: + +async function listenToStream( + onMessage: (message: any, messageId: string) => Promise, +) { + // using node-redis + const redis = getNodeRedisClient(); + const streamKeyName = 'PAYMENTS_STREAM'; //stream name + const groupName = 'PAYMENTS_CON_GROUP'; //listening consumer group name (custom) + const consumerName = 'ORDERS_CON'; //listening consumer name (custom) + const readMaxCount = 100; + + // Check if the stream group already exists + if (!(await redis.exists(streamKeyName))) { + const idPosition = '0'; //0 = start, $ = end or any specific id + await nodeRedisClient.xGroupCreate(streamKeyName, groupName, idPosition, { + MKSTREAM: true, + }); + } + + // setup a loop to listen for stream events + while (true) { + // read set of messages from different streams + const dataArr = await nodeRedisClient.xReadGroup( + commandOptions({ + isolated: true, + }), + groupName, + consumerName, + [ + { + // you can specify multiple streams in array + key: streamKeyName, + id: '>', // Next entry ID that no consumer in this group has read + }, + ], + { + COUNT: readMaxCount, // Read n entries at a time + BLOCK: 0, // block for 0 (infinite) seconds if there are none. + }, + ); + + for (let data of dataArr) { + for (let messageItem of data.messages) { + //process the message received (in our case, updateOrderStatus) + await onMessage(messageItem.message, messageItem.id); + + // acknowledge individual messages after processing + nodeRedisClient.xAck(streamKeyName, groupName, messageItem.id); + } + } + } +} + +// `listenToStream` listens for events and calls the `onMessage` callback to further handle the events. +listenToStream({ + onMessage: updateOrderStatus, +}); + +const updateOrderStatus: IMessageHandler = async (message, messageId) => { + /* + message = { + orderId: "", + paymentId: "", + orderStatusCode:"", + userId: "", + } + */ + // updates orderStatus and paymentId in database accordingly for the order which has fulfilled payment + // updateOrderStatusInRedis(orderId,paymentId,orderStatusCode,userId) + // updateOrderStatusInMongoDB(orderId,paymentId,orderStatusCode,userId) +}; +``` + +:::tip + +It's a best practice to validate all incoming messages to make sure you can work with them. + +::: + +For the purposes of our application, we make a call to update the order status in both Redis and primary database in the same service (For simplicity, we are not using any synchronization technique between databases rather focusing on how the data is stored and accessed in Redis). Another common pattern is to have your services write to one database, and then separately use a CDC mechanism to update the other database. For example, you could write directly to Redis, then use **Triggers and Functions** to handle synchronizing Redis and primary database in the background. + +:::tip + +If you use **Redis Cloud**, you will find that Redis Streams is available on the same multi-tenant data platform you already use for caching. Redis Cloud also has high availability, message persistence, support for multiple clients, and resiliency with primary/secondary data replication… all built in. + +::: + +## Ready to use Redis for streaming? + +That's all there is to it! You now know how to use Redis for streaming as both a producer and a consumer. Hopefully, you can draw some inspiration from this tutorial and apply it to your own event streaming application. 
For more on this topic, check out the additional resources below: + +### Additional resources + +- Redis Streams + - Explore streams in detail in the [Redis University course on Redis Streams](https://university.redis.com/courses/ru202/) + - Check out our e-book on [Understanding Streams in Redis and Kafka: A Visual Guide](https://redis.com/docs/understanding-streams-in-redis-and-kafka-a-visual-guide/) +- Microservices with Redis + - [CQRS](/howtos/solutions/microservices/cqrs) + - [Query caching](/howtos/solutions/microservices/caching) + - [API gateway caching](/howtos/solutions/microservices/api-gateway-caching) +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis om Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) : To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/images/01-ui-balance-over-time.png b/docs/howtos/solutions/mobile-banking/account-dashboard/images/01-ui-balance-over-time.png new file mode 100644 index 00000000000..2c7ac23ae76 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/account-dashboard/images/01-ui-balance-over-time.png differ diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/images/02-ui-biggest-spenders.png b/docs/howtos/solutions/mobile-banking/account-dashboard/images/02-ui-biggest-spenders.png new file mode 100644 index 00000000000..cba80472155 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/account-dashboard/images/02-ui-biggest-spenders.png differ diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/images/03-ui-search-transactions.png b/docs/howtos/solutions/mobile-banking/account-dashboard/images/03-ui-search-transactions.png new file mode 100644 index 00000000000..22f06e6bd47 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/account-dashboard/images/03-ui-search-transactions.png differ diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/images/04-ui-recent-transactions.png b/docs/howtos/solutions/mobile-banking/account-dashboard/images/04-ui-recent-transactions.png new file mode 100644 index 00000000000..c23d9d2d2b4 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/account-dashboard/images/04-ui-recent-transactions.png differ diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/images/dashboard.png b/docs/howtos/solutions/mobile-banking/account-dashboard/images/dashboard.png new file mode 100644 index 00000000000..e04021654d8 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/account-dashboard/images/dashboard.png differ diff --git a/docs/howtos/solutions/mobile-banking/account-dashboard/index-account-dashboard.mdx b/docs/howtos/solutions/mobile-banking/account-dashboard/index-account-dashboard.mdx new file mode 100644 index 00000000000..ec1dd6ebd5b --- /dev/null +++ b/docs/howtos/solutions/mobile-banking/account-dashboard/index-account-dashboard.mdx @@ -0,0 +1,219 @@ +--- +id: index-mb-account-dashboard +title: Mobile Banking Account Dashboard Using Redis +sidebar_label: Mobile Banking Account Dashboard Using Redis +slug: /howtos/solutions/mobile-banking/account-dashboard +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; + +import 
GeneralAdditionalResources from '../common-mb/additional-resources.mdx'; +import MobileBankingSourceCode from '../common-mb/source-code-tip.mdx'; +import MobileBankingDataSeeding from '../common-mb/data-seeding.mdx'; + + + + + +## What is a mobile banking account dashboard? + +An account dashboard is a page in a mobile banking app that instantly renders account highlights to users. A customer can click on any of the accounts on the dashboard to see the real-time account details, such as latest transactions, mortgage amount they have left to pay, checking and savings, etc. + +An account dashboard makes a customer's finances easily visible in one place. It reduces financial complexity for the customer and fosters customer loyalty. + +The following diagram is an example data architecture for an account dashboard: + +![dashboard](./images/dashboard.png) + +1. Banks store information in a number of separate databases that support individual banking products +2. Key customer account details (balances, recent transactions) across the bank's product portfolio are prefetched into Redis Cloud using Redis Data Integration (RDI) +3. Redis Cloud powers customers' account dashboards, enabling mobile banking users to view balances and other high-priority information immediately upon login + +## Why you should use Redis for account dashboards in mobile banking + +- **Resilience**: Redis Cloud provides resilience with 99.999% uptime and Active-Active Geo Distribution to prevent loss of critical user profile data + +- **Scalability**: Redis Cloud provides < 1ms performance at incredibly high scale to ensure apps perform under peak loads + +- **JSON Support**: Provides the ability to create and store account information as JSON documents with the < 1ms speed of Redis + +- **Querying and Indexing**: Redis Cloud can quickly identify and store data from multiple different databases and index data to make it readily searchable + +:::note + +Redis Stack supports the **JSON** data type and allows you to index and query JSON and [**more**](https://redis.io/docs/stack/). So your Redis data is not limited to simple key-value stringified data. + +::: + +## Building an account dashboard with Redis + + + +Download the above source code and run the following command to start the demo application: + +```sh +docker compose up -d +``` + +Once Docker is up and running, open the [http://localhost:8080/](http://localhost:8080/) URL in a browser to view the application. + +### Data seeding + + + +### Balance over time + +**Dashboard widget** + +![Chart](./images/01-ui-balance-over-time.png) + +**API endpoint** + +| | | +| ------------- | --------------------------------- | +| Endpoint | `/transaction/balance` | +| Code Location | `/routers/transaction-router.js` | +| Parameters | none | +| Return value | `[{x: timestamp, y: value}, ...]` | + +The balance endpoint leverages [**Time Series**](https://redis.io/docs/stack/timeseries/). It returns the range of all values from the time series object `balance_ts`. The resulting range is converted to an array of objects, each with an `x` property holding the timestamp and a `y` property holding the associated value. This endpoint supplies the time series chart with coordinates to plot a visualization of the balance over time.
+ +```js title="app/routers/transaction-router.js" +const BALANCE_TS = 'balance_ts'; + +/* fetch transactions up to sometime ago */ +transactionRouter.get('/balance', async (req, res) => { + //time series range + const balance = await redis.ts.range( + BALANCE_TS, + Date.now() - 1000 * 60 * 5, //from + Date.now(), //to + ); + + let balancePayload = balance.map((entry) => { + return { + x: entry.timestamp, + y: entry.value, + }; + }); + + res.send(balancePayload); +}); +``` + +### Biggest spenders + +**Dashboard widget** + +![Chart](./images/02-ui-biggest-spenders.png) + +**API end point** + +| | | +| ------------- | -------------------------------- | +| Endpoint | `/transaction//biggestspenders` | +| Code Location | `/routers/transaction-router.js` | +| Parameters | none | +| Return value | `{labels:[...], series:[...] }` | + +The biggest spenders endpoint leverages [**sorted sets**](https://redis.io/docs/manual/patterns/indexes/) as a secondary index, It retrieves all members of the sorted set `bigspenders` that have scores greater than zero. The top five or fewer are returned to provide the UI pie chart with data. The labels array contains the names of the biggest spenders and the series array contains the numeric values associated with each member name. + +```js title="app/routers/transaction-router.js" +const SORTED_SET_KEY = 'bigspenders'; + +/* fetch top 5 biggest spenders */ +transactionRouter.get('/biggestspenders', async (req, res) => { + const range = await redis.zRangeByScoreWithScores( + SORTED_SET_KEY, + 0, + Infinity, + ); + let series = []; + let labels = []; + + range.slice(0, 5).forEach((spender) => { + series.push(parseFloat(spender.score.toFixed(2))); + labels.push(spender.value); + }); + + res.send({ series, labels }); +}); +``` + +### Search existing transactions + +**Dashboard widget** + +![Search transactions](./images/03-ui-search-transactions.png) + +**API end point** + +| | | +| ---------------- | -------------------------------- | +| Endpoint | `/transaction/search` | +| Code Location | `/routers/transaction-router.js` | +| Query Parameters | term | +| Return value | array of results matching term | + +The search endpoint leverages [**Search and Query**](https://redis.io/docs/stack/search/), It receives a `term` query parameter from the UI. A [Redis om Node](https://github.com/redis/redis-om-node) query for the fields `description`, `fromAccountName`, and `accountType` will trigger and return results. + +```js title="app/routers/transaction-router.js" +transactionRouter.get('/search', async (req, res) => { + const term = req.query.term; + + let results; + + if (term.length >= 3) { + results = await bankRepo + .search() + .where('description') + .matches(term) + .or('fromAccountName') + .matches(term) + .or('transactionType') + .equals(term) + .return.all({ pageSize: 1000 }); + } + res.send(results); +}); +``` + +### Get recent transactions + +**Dashboard widget** + +![View recent transactions](./images/04-ui-recent-transactions.png) + +**API end point** + +| | | +| ------------- | -------------------------------- | +| Endpoint | `/transaction/transactions` | +| Code Location | `/routers/transaction-router.js` | +| Parameters | none | +| Return value | array of results | + +Even the transactions endpoint leverages [**Search and Query**](https://redis.io/docs/stack/search/). A [Redis om Node](https://github.com/redis/redis-om-node) query will trigger and return ten most recent transactions. 
+ +```js title="app/routers/transaction-router.js" +/* return ten most recent transactions */ +transactionRouter.get('/transactions', async (req, res) => { + const transactions = await bankRepo + .search() + .sortBy('transactionDate', 'DESC') + .return.all({ pageSize: 10 }); + + res.send(transactions.slice(0, 10)); +}); +``` + +## Ready to use Redis in account dashboard? + +Hopefully, this tutorial has helped you visualize how to use Redis for account dashboard, specifically in the context of mobile banking. For additional resources related to this topic, check out the links below: + +### Additional resources + +- [Mobile Banking Session Management](/howtos/solutions/mobile-banking/session-management) + + diff --git a/docs/howtos/solutions/mobile-banking/common-mb/additional-resources.mdx b/docs/howtos/solutions/mobile-banking/common-mb/additional-resources.mdx new file mode 100644 index 00000000000..a12a7c8e84c --- /dev/null +++ b/docs/howtos/solutions/mobile-banking/common-mb/additional-resources.mdx @@ -0,0 +1,4 @@ +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc) +- Clients like [Node Redis](https://github.com/redis/node-redis) and [Redis OM Node](https://github.com/redis/redis-om-node) help you to use Redis in Node.js applications. +- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/): To view your Redis data or to play with raw Redis commands in the workbench +- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/mobile-banking/common-mb/data-seeding.mdx b/docs/howtos/solutions/mobile-banking/common-mb/data-seeding.mdx new file mode 100644 index 00000000000..d6a4d3de8ce --- /dev/null +++ b/docs/howtos/solutions/mobile-banking/common-mb/data-seeding.mdx @@ -0,0 +1,76 @@ +This application leverages **Redis core data structures, JSON, TimeSeries, Search and Query features**. The data seeded is later used to show a searchable transaction overview with realtime updates as well as a personal finance management overview with realtime balance and biggest spenders updates. + +On application startup in `app/server.js`, a cron is scheduled to create random bank transactions at regular intervals and seed those transactions in to Redis. + +```js title="app/server.js" +//cron job to trigger createBankTransaction() at regular intervals + +cron.schedule('*/10 * * * * *', async () => { + const userName = process.env.REDIS_USERNAME; + + createBankTransaction(userName); + + //... +}); +``` + +- The transaction generator creates a randomized banking debit or credit which will reflect on a (default) starting user balance of $100,000.00 +- The **transaction data** is saved as a JSON document within Redis. +- To capture **balance over time**, the `balanceAfter` value is recorded in a TimeSeries with the key `balance_ts` for every transaction. +- To track **biggest spenders**, an associated **`fromAccountName`** member within the sorted set `bigspenders` is incremented by the transaction amount. Note that this amount can be positive or negative. 
+ +```js title="app/transactions/transactionsGenerator.js" +let balance = 100000.0; +const BALANCE_TS = 'balance_ts'; +const SORTED_SET_KEY = 'bigspenders'; + +export const createBankTransaction = async () => { + //to create random bank transaction + let vendorsList = source.source; //app/transactions/transaction_sources.js + const random = Math.floor(Math.random() * 9999999999); + + const vendor = vendorsList[random % vendorsList.length]; //random vendor from the list + + const amount = createTransactionAmount(vendor.fromAccountName, random); + const transaction = { + id: random * random, + fromAccount: Math.floor((random / 2) * 3).toString(), + fromAccountName: vendor.fromAccountName, + toAccount: '1580783161', + toAccountName: 'bob', + amount: amount, + description: vendor.description, + transactionDate: new Date(), + transactionType: vendor.type, + balanceAfter: balance, + }; + + //redis json feature + const bankTransaction = await bankTransactionRepository.save(transaction); + console.log('Created bankTransaction!'); + // ... +}; + +const createTransactionAmount = (vendor, random) => { + let amount = createAmount(); //random amount + balance += amount; + balance = parseFloat(balance.toFixed(2)); + + //redis time series feature + redis.ts.add(BALANCE_TS, '*', balance, { DUPLICATE_POLICY: 'first' }); + //redis sorted set as secondary index + redis.zIncrBy(SORTED_SET_KEY, amount * -1, vendor); + + return amount; +}; +``` + +Sample `bankTransaction` data view using [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) + +![bank transaction data](./images/bank-transaction-data.png) + +![bank transaction json](./images/bank-transaction-json.png) + +:::tip +Download [**RedisInsight**](https://redis.com/redis-enterprise/redis-insight/) to view your Redis data or to play with raw Redis commands in the workbench. 
+::: diff --git a/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-data.png b/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-data.png new file mode 100644 index 00000000000..4e392dc160b Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-data.png differ diff --git a/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-json.png b/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-json.png new file mode 100644 index 00000000000..2c68ffc0d6d Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/common-mb/images/bank-transaction-json.png differ diff --git a/docs/howtos/solutions/mobile-banking/common-mb/source-code-tip.mdx b/docs/howtos/solutions/mobile-banking/common-mb/source-code-tip.mdx new file mode 100644 index 00000000000..9c810983140 --- /dev/null +++ b/docs/howtos/solutions/mobile-banking/common-mb/source-code-tip.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +git clone --branch v1.2.0 https://github.com/redis-developer/mobile-banking-solutions + +::: diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/auth.png b/docs/howtos/solutions/mobile-banking/session-management/images/auth.png new file mode 100644 index 00000000000..59f3a985466 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/auth.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-data.png b/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-data.png new file mode 100644 index 00000000000..4e392dc160b Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-data.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-json.png b/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-json.png new file mode 100644 index 00000000000..2c68ffc0d6d Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/bank-transaction-json.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/browser-cookie-entry.png b/docs/howtos/solutions/mobile-banking/session-management/images/browser-cookie-entry.png new file mode 100644 index 00000000000..671e70f25bd Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/browser-cookie-entry.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard-balance-widget.png b/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard-balance-widget.png new file mode 100644 index 00000000000..27f142eb359 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard-balance-widget.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard.png b/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard.png new file mode 100644 index 00000000000..ca17959c167 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/demo-dashboard.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/demo-login.png b/docs/howtos/solutions/mobile-banking/session-management/images/demo-login.png new file 
mode 100644 index 00000000000..ec6174cf470 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/demo-login.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update.png b/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update.png new file mode 100644 index 00000000000..bb46e790a6a Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update2.png b/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update2.png new file mode 100644 index 00000000000..a936c4abcfa Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/session-balance-update2.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/images/session-entry.png b/docs/howtos/solutions/mobile-banking/session-management/images/session-entry.png new file mode 100644 index 00000000000..d16e65e5723 Binary files /dev/null and b/docs/howtos/solutions/mobile-banking/session-management/images/session-entry.png differ diff --git a/docs/howtos/solutions/mobile-banking/session-management/index-session-management.mdx b/docs/howtos/solutions/mobile-banking/session-management/index-session-management.mdx new file mode 100644 index 00000000000..e8c3db1ca65 --- /dev/null +++ b/docs/howtos/solutions/mobile-banking/session-management/index-session-management.mdx @@ -0,0 +1,199 @@ +---
+id: index-mb-session-management
+title: Mobile Banking Authentication and Session Storage Using Redis
+sidebar_label: Mobile Banking Authentication and Session Storage Using Redis
+slug: /howtos/solutions/mobile-banking/session-management
+authors: [prasan, will]
+---
+
+import Authors from '@theme/Authors';
+
+import GeneralAdditionalResources from '../common-mb/additional-resources.mdx';
+import MobileBankingSourceCode from '../common-mb/source-code-tip.mdx';
+import MobileBankingDataSeeding from '../common-mb/data-seeding.mdx';
+
+import MobileBankingDashboardBalance from './images/demo-dashboard-balance-widget.png';
+
+
+
+
+
+## What is authentication and session storage for mobile banking?
+
+After a user has successfully entered their login credentials, mobile banking apps use a `token` and `sessionId` created by the server to represent the user's identity. The `token` is stored in Redis for the duration of the user session and is also sent in the login response to the banking application client (mobile/browser). The client application then sends the `token` with every request to the server, and the server validates it before processing the request.
+
+![auth](./images/auth.png)
+
+:::note
+
+Redis Stack supports the **JSON** data type and allows you to index and query JSON and [**more**](https://redis.io/docs/stack/). So your session store is not limited to simple key-value stringified data.
+
+:::
+
+The session store houses critical information related to each user as they navigate an application for the duration of their session. Mobile banking session data may include, but is not limited to, the following information (an illustrative session document follows the list):
+
+- User's profile information, such as name, date of birth, email address, etc.
+- User's permissions, such as `user`, `admin`, `supervisor`, `super-admin`, etc.
+- Other app-related data, like recent transaction(s), balance, etc.
+- Session expiration, such as one hour from now, one week from now, etc.
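+
+For illustration only, a single session entry stored by this demo might look roughly like the following JSON document. The `userid` and `currentBalanceAmount` fields mirror the values set later in this tutorial; the exact shape written by `connect-redis-stack` may differ:
+
+```json
+{
+  "cookie": { "path": "/", "httpOnly": true, "originalMaxAge": null },
+  "userid": "bob",
+  "currentBalanceAmount": { "x": 1680000000000, "y": 100045.27 }
+}
+```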
+
+## Why you should use Redis for mobile banking session management
+
+- **Resilience**: Redis Cloud offers incredible resilience with **99.999% uptime**. After all, authentication token stores must provide round-the-clock availability. This ensures that users get uninterrupted, 24/7 access to their applications.
+
+- **Scalability**: Token stores need to be highly scalable so that they don't become a bottleneck when a **high volume of users** authenticate at once. Redis Cloud provides **< 1ms latency** at incredibly high throughput (up to **100MM ops/second**), which makes authentication and session data access much faster!
+
+- **Integration with common libraries and platforms**: Since open source Redis is integrated into most session management libraries and platforms, Redis Cloud can integrate seamlessly when upgrading from open source Redis (for example, integration with the `express-session` and [`connect-redis-stack`](https://www.npmjs.com/package/connect-redis-stack) libraries is demonstrated in this tutorial).
+
+:::tip
+
+Read our ebook that answers the question: [**Are JSON Web Tokens (JWT) Safe?**](https://redis.com/docs/json-web-tokens-jwts-are-not-safe/) It discusses when and how to safely use JWTs, with battle-tested solutions for session management.
+
+:::
+
+## Building session management with Redis
+
+
+
+Download the above source code and run the following command to start the demo application:
+
+```sh
+docker compose up
+```
+
+Once Docker is up and running, open [http://localhost:8080/](http://localhost:8080/) in your browser to view the application.
+
+### Data seeding
+
+
+
+### Session configuration
+
+Redis is integrated into many session management libraries. This demo uses the [connect-redis-stack](https://www.npmjs.com/package/connect-redis-stack) library, which provides Redis session storage for your [express-session](https://www.npmjs.com/package/express-session) application.
+
+The following code illustrates configuring the Redis session store with `express-session`:
+
+```js title="app/server.js"
+import session from 'express-session';
+import { RedisStackStore } from 'connect-redis-stack';
+
+/* configure your session store */
+const store = new RedisStackStore({
+  client: redis, //redis client
+  prefix: 'redisBank:', //redis key prefix
+  ttlInSeconds: 3600, //session expiry time
+});
+
+const app = express();
+
+// ...
+
+app.use(
+  session({
+    store: store, //using redis store for session
+    resave: false,
+    saveUninitialized: false,
+    secret: '5UP3r 53Cr37', //from env file
+  }),
+);
+
+//...
+app.listen(8080, () => console.log('Listening on port 8080'));
+```
+
+### Login API (Session id generation)
+
+![login page](./images/demo-login.png)
+
+Let's look at the `/perform_login` API code, which is triggered by clicking the Login button on the [login page](http://localhost:8080/).
+
+Since [connect-redis-stack](https://www.npmjs.com/package/connect-redis-stack) is an Express middleware, a session is automatically created at the start of the request and updated at the end of the HTTP (API) response if the `req.session` variable is altered.
+
+```js
+app.post('/perform_login', (req, res) => {
+  let session = req.session;
+  console.log(session);
+  /*
+    Session {
+      cookie: { path: '/', _expires: null, originalMaxAge: null, httpOnly: true }
+    }
+  */
+  //hardcoded user for demo
+  if (req.body.username == 'bob' && req.body.password == 'foobared') {
+    //on successful login (for bob user)
+    session = req.session;
+    session.userid = req.body.username; //create session data
+    res.redirect('/index.html');
+  } else {
+    res.redirect('/auth-login.html');
+  }
+});
+```
+
+In the code above, the `session.userid` variable is assigned a value on successful login (for the "bob" user), so a session is created in Redis with the assigned data, and only the Redis key (sessionId) is stored in the client cookie.
+
+- Dashboard page after successful login
+  ![dashboard](./images/demo-dashboard.png)
+
+- Session entry in Redis
+  ![session entry](./images/session-entry.png)
+
+- Open the developer tools on the Dashboard page to check the client cookie `connect.sid` (containing only the sessionId)
+  ![browser cookie entry](./images/browser-cookie-entry.png)
+
+On every subsequent API request from the client, the [connect-redis-stack](https://www.npmjs.com/package/connect-redis-stack) library loads the session details from Redis into the `req.session` variable based on the client cookie (sessionId).
+
+### Balance API (Session storage)
+
+Consider the `/transaction/balance` API code below, which demonstrates session storage.
+
+To update session data, modify the `req.session` variable.
+Let's add more session data, such as the user's current balance amount.
+
+```js title="app/routers/transaction-router.js"
+/* fetch balance history for the last five minutes - /transaction/balance */
+transactionRouter.get('/balance', async (req, res) => {
+  const balance = await redis.ts.range(
+    BALANCE_TS,
+    Date.now() - 1000 * 60 * 5,
+    Date.now(),
+  );
+
+  let balancePayload = balance.map((entry) => {
+    return {
+      x: entry.timestamp,
+      y: entry.value,
+    };
+  });
+
+  let session = req.session;
+  if (session.userid && balancePayload.length) {
+    //adding latest BalanceAmount to session
+    session.currentBalanceAmount = balancePayload[balancePayload.length - 1]; //updating session data
+  }
+
+  res.send(balancePayload);
+});
+```
+
+- Updated session entry in Redis with the `currentBalanceAmount` field ('x' denoting the timestamp and 'y' denoting the balance amount at that timestamp)
+  ![session update](./images/session-balance-update2.png)
+
+- Verify the latest balance amount in the Dashboard UI
+  dashboard balance
+
+## Ready to use Redis in session management?
+
+Hopefully, this tutorial has helped you visualize how to use Redis for better session management, specifically in the context of mobile banking. For additional resources related to this topic, check out the links below:
+
+### Additional resources
+
+- [Are JSON Web Tokens (JWT) Safe?](https://redis.com/docs/json-web-tokens-jwts-are-not-safe/)
+
+
diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-many-skus.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-many-skus.mdx new file mode 100644 index 00000000000..9de46b156ea --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-many-skus.mdx @@ -0,0 +1,93 @@ +The code that follows shows an example API request and response for the decrementManySKUs activity.
+ +**decrementManySKUs API Request** + +```json +POST http://localhost:3000/api/decrementManySKUs +[{ + "sku":1019688, + "quantity":4 +},{ + "sku":1003622, + "quantity":2 +},{ + "sku":1006702, + "quantity":2 +}] +``` + +**decrementManySKUs API Response** + +```json +{ + "data": [ + { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 28 //previous value 32 + }, + { + "sku": 1003622, + "name": "Aquarius - Fender Stratocaster 1,000-Piece Jigsaw Puzzle - Black/Red/White/Yellow/Green/Orange/Blue", + "type": "HardGood", + "totalQuantity": 8 //previous value 10 + }, + { + "sku": 1006702, + "name": "Clash of the Titans [DVD] [2010]", + "type": "Movie", + "totalQuantity": 8 //previous value 10 + } + ], + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling a `decrementManySKUs` function which looks as follows: + +```typescript title="src/inventory-service.ts" + static async decrementManySKUs(_productsFilter: IProductBodyFilter[]): Promise { + /** + decrement quantity of specific Products. + + :param _productWithIds: Product list with Id + :return: Product list + */ + let retItems: IProduct[] = []; + + if (_productsFilter && _productsFilter.length) { + //validation only + const promArr: Promise[] = []; + for (let p of _productsFilter) { + if (p.sku) { + //validating if all products in stock + const promObj = InventoryServiceCls.validateQuantityOnDecrementSKU(p.sku, p.quantity); + promArr.push(promObj) + } + } + await Promise.all(promArr); + + //decrement only + const promArr2: Promise[] = []; + for (let p of _productsFilter) { + if (p.sku && p.quantity) { + const isDecrement = true; //increments with negative value + const isReturnProduct = false; + const promObj2 = InventoryServiceCls.incrementSKU(p.sku, p.quantity, isDecrement, isReturnProduct); + promArr2.push(promObj2) + } + } + await Promise.all(promArr2); + + + //retrieve updated products + retItems = await InventoryServiceCls.retrieveManySKUs(_productsFilter); + } + else { + throw `Input params failed !`; + } + + return retItems; + } +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-sku.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-sku.mdx new file mode 100644 index 00000000000..59f50f1db49 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/decrement-sku.mdx @@ -0,0 +1,73 @@ +The code that follows shows an example API request and response for decrementSKU activity. + +**decrementSKU API Request** + +```json +POST http://localhost:3000/api/decrementSKU +{ + "sku":1019688, + "quantity":4 +} +``` + +**decrementSKU API Response** + +```json +{ + "data": { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 16 //previous value 20 + }, + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling a `decrementSKU` function which looks as follows: + +```typescript title="src/inventory-service.ts" +static async decrementSKU(_productId: number, _decrQuantity: number): Promise { + /** + decrement quantity of a Product. 
+ + :param _productId: Product Id + :param _decrQuantity: new decrement quantity + :return: Product with Quantity + */ + let retItem: IProduct = {}; + + //validating if product in stock + let isValid = await InventoryServiceCls.validateQuantityOnDecrementSKU(_productId, _decrQuantity); + + if (isValid) { + const isDecrement = true; //increments with negative value + const isReturnProduct = true; + retItem = await InventoryServiceCls.incrementSKU(_productId, _decrQuantity, isDecrement, isReturnProduct); + } + + return retItem; + } + + static async validateQuantityOnDecrementSKU(_productId: number, _decrQuantity?: number): Promise { + let isValid = false; + + if (!_decrQuantity) { + _decrQuantity = 1; + } + + if (_productId) { + const product = await InventoryServiceCls.retrieveSKU(_productId); + if (product && product.totalQuantity && product.totalQuantity > 0 + && (product.totalQuantity - _decrQuantity >= 0)) { + + isValid = true; + } + else { + throw `For product with Id ${_productId}, available quantity(${product.totalQuantity}) is lesser than decrement quantity(${_decrQuantity})`; + } + + } + return isValid; + } +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/increment-sku.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/increment-sku.mdx new file mode 100644 index 00000000000..54be0949655 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/increment-sku.mdx @@ -0,0 +1,66 @@ +The code that follows shows an example API request and response for incrementSKU activity. + +**incrementSKU API Request** + +```json +POST http://localhost:3000/api/incrementSKU +{ + "sku":1019688, + "quantity":2 +} +``` + +**incrementSKU API Response** + +```json +{ + "data": { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 12 //previous value 10 + }, + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling a `incrementSKU` function which looks as follows: + +```typescript title="src/inventory-service.ts" +static async incrementSKU(_productId: number, _incrQuantity: number, _isDecrement: boolean, _isReturnProduct: boolean): Promise { + /** + increment quantity of a Product. 
+ + :param _productId: Product Id + :param _incrQuantity: new increment quantity + :return: Product with Quantity + */ + + const redisOmClient = getRedisOmClient(); + let retItem: IProduct = {}; + + if (!_incrQuantity) { + _incrQuantity = 1; + } + if (_isDecrement) { + _incrQuantity = _incrQuantity * -1; + } + if (redisOmClient && _productId && _incrQuantity) { + + const updateKey = `${ProductRepo.PRODUCT_KEY_PREFIX}:${_productId}`; + + //increment json number field by specific (positive/ negative) value + await redisOmClient.redis?.json.numIncrBy(updateKey, '$.totalQuantity', _incrQuantity); + + if (_isReturnProduct) { + retItem = await InventoryServiceCls.retrieveSKU(_productId); + } + + } + else { + throw `Input params failed !`; + } + + return retItem; + } +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-many-skus.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-many-skus.mdx new file mode 100644 index 00000000000..629e8d3082a --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-many-skus.mdx @@ -0,0 +1,97 @@ +The code that follows shows an example API request and response for retrieveManySKUs activity. + +**retrieveManySKUs API Request** + +```json +POST http://localhost:3000/api/retrieveManySKUs +[{ + "sku":1019688 +},{ + "sku":1003622 +},{ + "sku":1006702 +}] +``` + +**retrieveManySKUs API Response** + +```json +{ + "data": [ + { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 24 + }, + { + "sku": 1003622, + "name": "Aquarius - Fender Stratocaster 1,000-Piece Jigsaw Puzzle - Black/Red/White/Yellow/Green/Orange/Blue", + "type": "HardGood", + "totalQuantity": 10 + }, + { + "sku": 1006702, + "name": "Clash of the Titans [DVD] [2010]", + "type": "Movie", + "totalQuantity": 10 + } + ], + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling a `retrieveManySKUs` function which looks as follows: + +```typescript title="src/inventory-service.ts" +static async retrieveManySKUs(_productWithIds: IProductBodyFilter[]): Promise { + /** + Get current Quantity of specific Products. 
+ + :param _productWithIds: Product list with Id + :return: Product list + */ + const repository = ProductRepo.getRepository(); + let retItems: IProduct[] = []; + + if (repository && _productWithIds && _productWithIds.length) { + + //string id array + const idArr = _productWithIds.map((product) => { + return product.sku?.toString() || "" + }); + + //fetch products by IDs (using redis om library) + const result = await repository.fetch(...idArr); + + let productsArr: IProduct[] = []; + + if (idArr.length == 1) { + productsArr = [result]; + } + else { + productsArr = result; + } + + if (productsArr && productsArr.length) { + + retItems = productsArr.map((product) => { + return { + sku: product.sku, + name: product.name, + type: product.type, + totalQuantity: product.totalQuantity + } + }); + } + else { + throw `No products found !`; + } + } + else { + throw `Input params failed !`; + } + + return retItems; + } +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-sku.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-sku.mdx new file mode 100644 index 00000000000..eef88d0d2bc --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/retrieve-sku.mdx @@ -0,0 +1,62 @@ +The code that follows shows an example API request and response for retrieveSKU activity. + +**retrieveSKU API Request** + +```json +GET http://localhost:3000/api/retrieveSKU?sku=1019688 +``` + +**retrieveSKU API Response** + +```json +{ + "data": { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 10 + }, + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling a `retrieveSKU` function which looks as follows: + +**code** + +```typescript title="src/inventory-service.ts" + +static async retrieveSKU(_productId: number): Promise { + /** + Get current Quantity of a Product. + + :param _productId: Product Id + :return: Product with Quantity + */ + const repository = ProductRepo.getRepository(); + let retItem: IProduct = {}; + + if (repository && _productId) { + //fetch product by ID (using redis om library) + const product = await repository.fetch(_productId.toString()); + + if (product) { + retItem = { + sku: product.sku, + name: product.name, + type: product.type, + totalQuantity: product.totalQuantity + } + } + else { + throw `Product with Id ${_productId} not found`; + } + } + else { + throw `Input params failed !`; + } + + return retItem; +} + +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/api/update-sku.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/update-sku.mdx new file mode 100644 index 00000000000..e6f3dabd035 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/api/update-sku.mdx @@ -0,0 +1,69 @@ +The code that follows shows an example API request and response for updateSKU activity. + +**updateSKU API Request** + +```json +POST http://localhost:3000/api/updateSKU +{ + "sku":1019688, + "quantity":25 +} +``` + +**updateSKU API Response** + +```json +{ + "data": { + "sku": 1019688, + "name": "5-Year Protection Plan - Geek Squad", + "type": "BlackTie", + "totalQuantity": 25 //updated value + }, + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. 
Ultimately, it ends up calling a `updateSKU` function which looks as follows: + +```typescript title="src/inventory-service.ts" + static async updateSKU(_productId: number, _quantity: number): Promise { + /** + Set Quantity of a Product. + + :param _productId: Product Id + :param _quantity: new quantity + :return: Product with Quantity + */ + const repository = ProductRepo.getRepository(); + let retItem: IProduct = {}; + + if (repository && _productId && _quantity >= 0) { + //fetch product by ID (using redis om library) + const product = await repository.fetch(_productId.toString()); + + if (product) { + //update the product fields + product.totalQuantity = _quantity; + + // save the modified product + const savedItem = await repository.save(product); + + retItem = { + sku: savedItem.sku, + name: savedItem.name, + type: savedItem.type, + totalQuantity: savedItem.totalQuantity + } + } + else { + throw `Product with Id ${_productId} not found`; + } + } + else { + throw `Input params failed !`; + } + + return retItem; + } +``` diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp.png b/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp.png new file mode 100644 index 00000000000..6e3e3d43016 Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp.png differ diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp2.png b/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp2.png new file mode 100644 index 00000000000..916fccdbf1e Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/available-to-promise/images/atp2.png differ diff --git a/docs/howtos/solutions/real-time-inventory/available-to-promise/index-rti-available-to-promise.mdx b/docs/howtos/solutions/real-time-inventory/available-to-promise/index-rti-available-to-promise.mdx new file mode 100644 index 00000000000..05e91a05906 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/available-to-promise/index-rti-available-to-promise.mdx @@ -0,0 +1,119 @@ +--- +id: index-rti-available-to-promise +title: Available to Promise in Real-time Inventory Using Redis +sidebar_label: Available to Promise in Real-time Inventory +slug: /howtos/solutions/real-time-inventory/available-to-promise +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; + +import RetrieveSKU from './api/retrieve-sku.mdx'; +import UpdateSKU from './api/update-sku.mdx'; +import IncrementSKU from './api/increment-sku.mdx'; +import DecrementSKU from './api/decrement-sku.mdx'; +import RetrieveManySKUs from './api/retrieve-many-skus.mdx'; +import DecrementManySKUs from './api/decrement-many-skus.mdx'; + +import RealTimeInventorySourceCode from '../common-rti/source-code-tip.mdx'; +import RealTimeInventoryChallenges from '../common-rti/rti-challenges.mdx'; +import RealTimeInventoryCustomerProofs from '../common-rti/customer-proofs.mdx'; +import GeneralAdditionalResources from '../common-rti/additional-resources.mdx'; + + + + + +## What is available-to-promise (ATP)? + +The major requirement in a **retail inventory system** is presenting an accurate, real-time view of inventory to shoppers and store associates enabling buy-online-pickup-in-store (BOPIS). Optimizing fulfillment from multiple inventory locations. + +**Available to promise (ATP)** is the projected amount of inventory left available to sell, not including allocated inventory. 
It allows businesses to control distribution to their customers and predict inventory. The ATP model helps retailers keep inventory costs down, such as ordering, carrying, and stock-out costs. ATP is helpful as long as consumer buying forecasts remain correct. Implementing ATP processes effectively can mean the difference, for retailers, between sustained growth and an inventory that repeatedly runs out of customers' favorite products, missing sales opportunities and harming the customer experience.
+
+### How to calculate available-to-promise
+
+Calculating available-to-promise is a relatively simple undertaking. Complete the following formula for an accurate breakdown of available-to-promise capabilities:
+
+```
+Available-to-promise = QuantityOnHand + Supply - Demand
+```
+
+This formula includes the following elements:
+
+- QuantityOnHand: the total number of products that are immediately available to a company
+- Supply: the total stock of a product available for sale
+- Demand: the amount of a specific product that consumers are willing to purchase
+
+For example, a retailer with 100 units on hand, 40 more units arriving from a supplier, and 30 units already committed to customer orders has an ATP of 100 + 40 - 30 = 110 units.
+
+## Current challenges in real-time inventory
+
+
+
+## Why you should use Redis for available-to-promise
+
+- **Increased inventory visibility**: Redis Cloud provides highly scalable, real-time inventory synchronization between stores, providing views into what stock is available to promise. Customers want to buy from a retailer who can check stock across multiple locations and provide real-time views on what's available locally.
+
+- **Enhanced customer experience**: Sub-millisecond latency means online customers can easily get real-time views of shopping carts, pricing, and in-stock availability. Redis Cloud's built-in search engine delivers full-text and aggregated, faceted search of inventory in real time, scaling performance to instantly search inventories with millions of product types, helping customers fill their shopping carts faster and keeping them engaged and loyal.
+
+- **Cost efficiency at scale**: Redis Cloud offers real-time, bi-directional consistency between stores and data integration capabilities with enterprise systems without the complexity and costs of managing message brokers, auditing, and reconciliation.
+
+## Real-time inventory with Redis
+
+![atp](./images/atp2.png)
+
+Using Redis, the system delivers real-time synchronization of inventory across stores, in-transit stock, and warehouses. It provides retailers with the most accurate, timely data on inventory across their entire store network, and it gives consumers a positive customer experience when searching for and locating inventory.
+
+Redis Data Integration (RDI) capabilities enable accurate real-time inventory management and system-of-record synchronization. Redis advanced inventory search and query capabilities provide accurate available-inventory information to multichannel and omnichannel customers and store associates.
+
+This solution increases inventory turnover ratios, resulting in lower inventory costs and higher revenue and profits. It also reduces the impact of customer searches on Systems of Record and Inventory Management Systems (IMS).
+
+### Customer proof points
+
+
+
+## Building a real-time inventory service with Redis
+
+
+
+Managing inventory for a **SKU (stock keeping unit)** involves activities such as the following (a minimal command-level sketch follows the list):
+
+1. RetrieveSKU: Fetch the current quantity of a product
+1. UpdateSKU: Update the latest quantity of a product
+1. IncrementSKU: Increment the quantity by a specific value (say, when more products are procured)
+1. DecrementSKU: Decrement the quantity by a specific value (say, after order fulfillment of the product)
+1. RetrieveManySKUs: Fetch the current quantity of **multiple** products (say, to verify products are in stock before payment)
+1. DecrementManySKUs: Decrement the quantity of **multiple** products (say, after an order fulfillment with multiple products)
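+
+Under the hood, each product is stored as a JSON document in Redis and quantity changes are applied atomically, as the service code in the sections below shows. As a rough sketch, a decrement boils down to a single command like the following; the key name here is hypothetical and depends on the key prefix configured in the application:
+
+```sh
+# atomically decrease the available quantity of a SKU by 2
+redis-cli JSON.NUMINCRBY products:productId:1019688 $.totalQuantity -2
+```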
+
+### RetrieveSKU
+
+
+
+### UpdateSKU
+
+
+
+### IncrementSKU
+
+
+
+### DecrementSKU
+
+
+
+### RetrieveManySKUs
+
+
+
+### DecrementManySKUs
+
+
+
+## Ready to use Redis in a real-time inventory system?
+
+Hopefully, this tutorial has helped you visualize how to use Redis in a real-time inventory system for product availability across stores in different locations. For additional resources related to this topic, check out the links below:
+
+### Additional resources
+
+- Real-time inventory with Redis
+  - [Real-time Local Inventory Search](/howtos/solutions/real-time-inventory/local-inventory-search)
+- General
+
diff --git a/docs/howtos/solutions/real-time-inventory/common-rti/additional-resources.mdx b/docs/howtos/solutions/real-time-inventory/common-rti/additional-resources.mdx new file mode 100644 index 00000000000..9fe1b25724a --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/common-rti/additional-resources.mdx @@ -0,0 +1,4 @@ +- [Redis YouTube channel](https://www.youtube.com/c/Redisinc)
+- Client libraries like [Node Redis](https://github.com/redis/node-redis) and [Redis OM Node](https://github.com/redis/redis-om-node) help you use Redis in Node.js applications.
+- [RedisInsight](https://redis.com/redis-enterprise/redis-insight/): View your Redis data or play with raw Redis commands in the workbench
+- [Try Redis Cloud for free](https://redis.com/try-free/) diff --git a/docs/howtos/solutions/real-time-inventory/common-rti/customer-proofs.mdx b/docs/howtos/solutions/real-time-inventory/common-rti/customer-proofs.mdx new file mode 100644 index 00000000000..2f2e82fe363 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/common-rti/customer-proofs.mdx @@ -0,0 +1 @@ +- Redis Cloud on Google Cloud Platform enables [**Ulta Beauty to build a “digital store of the future”**](https://redis.com/customers/ulta-beauty/) diff --git a/docs/howtos/solutions/real-time-inventory/common-rti/rti-challenges.mdx b/docs/howtos/solutions/real-time-inventory/common-rti/rti-challenges.mdx new file mode 100644 index 00000000000..8bba0b0e7d0 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/common-rti/rti-challenges.mdx @@ -0,0 +1,9 @@ +- **Over and under-stocking**: When adopting a multi-channel business model (online and in-store), a lack of inventory visibility results in over- and under-stocking of inventory in different regions and stores.
+
+- Consumers **seek convenience**: The ability to search across regional store locations and pick up merchandise **immediately** rather than wait for shipping is a key differentiator for retailers.
+
+- Consumers **seek speed**: All retailers, even small or family-run, must compete against the **customer experience** of large online retailers like Alibaba, FlipKart, Shopee, and Amazon.
+
+- **High inventory costs**: Retailers seek to lower inventory costs by eliminating missed sales from out-of-stock scenarios, which also leads to higher “inventory turnover ratios.”
+
+- **Brand value**: Inaccurate store inventory counts lead to frustrated customers and lower sales. The operational pain will **impact the status quo**.
diff --git a/docs/howtos/solutions/real-time-inventory/common-rti/source-code-tip.mdx b/docs/howtos/solutions/real-time-inventory/common-rti/source-code-tip.mdx new file mode 100644 index 00000000000..8a58f8671a3 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/common-rti/source-code-tip.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +git clone https://github.com/redis-developer/redis-real-time-inventory-solutions + +::: diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search-with-distance.mdx b/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search-with-distance.mdx new file mode 100644 index 00000000000..721d7fa6e46 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search-with-distance.mdx @@ -0,0 +1,141 @@ +The code that follows shows an example API request and response for `inventorySearchWithDistance` API: + +**inventorySearchWithDistance API Request** + +```json title="POST http://localhost:3000/api/inventorySearchWithDistance" +{ + "sku": 1019688, + "searchRadiusInKm": 500, + "userLocation": { + "latitude": 42.88023, + "longitude": -78.878738 + } +} +``` + +**inventorySearchWithDistance API Response** + +```json title="inventorySearchWithDistance API Response" +{ + "data": [ + { + "storeId": "02_NY_ROCHESTER", + "storeLocation": { + "longitude": -77.608849, + "latitude": 43.156578 + }, + "sku": "1019688", + "quantity": "38", + "distInKm": "107.74513" + }, + { + "storeId": "05_NY_WATERTOWN", + "storeLocation": { + "longitude": -75.910759, + "latitude": 43.974785 + }, + "sku": "1019688", + "quantity": "31", + "distInKm": "268.86249" + }, + { + "storeId": "10_NY_POUGHKEEPSIE", + "storeLocation": { + "longitude": -73.923912, + "latitude": 41.70829 + }, + "sku": "1019688", + "quantity": "45", + "distInKm": "427.90787" + } + ], + "error": null +} +``` + +When you make a request, it goes through the API gateway to the inventory service. Ultimately, it ends up calling an `inventorySearchWithDistance` function which looks as follows: + +```typescript title="src/inventory-service.ts" +/** + * Search Product in stores within search radius, Also sort results by distance from current user location to store. 
+ * + * :param _inventoryFilter: Product Id (sku), searchRadiusInKm and current userLocation + * :return: Inventory product list + */ +static async inventorySearchWithDistance(_inventoryFilter: IInventoryBodyFilter): Promise { + const nodeRedisClient = getNodeRedisClient(); + + const repository = StoresInventoryRepo.getRepository(); + let retItems: IStoresInventory[] = []; + + if (nodeRedisClient && repository && _inventoryFilter?.sku + && _inventoryFilter?.userLocation?.latitude + && _inventoryFilter?.userLocation?.longitude) { + + const lat = _inventoryFilter.userLocation.latitude; + const long = _inventoryFilter.userLocation.longitude; + const radiusInKm = _inventoryFilter.searchRadiusInKm || 1000; + + const queryBuilder = repository.search() + .where('sku') + .eq(_inventoryFilter.sku) + .and('quantity') + .gt(0) + .and('storeLocation') + .inRadius((circle) => { + return circle + .latitude(lat) + .longitude(long) + .radius(radiusInKm) + .kilometers + }); + + console.log(queryBuilder.query); + /* Sample queryBuilder query + ( ( (@sku:[1019688 1019688]) (@quantity:[(0 +inf]) ) (@storeLocation:[-78.878738 42.88023 500 km]) ) + */ + + const indexName = `${StoresInventoryRepo.STORES_INVENTORY_KEY_PREFIX}:index`; + const aggregator = await nodeRedisClient.ft.aggregate( + indexName, + queryBuilder.query, + { + LOAD: ["@storeId", "@storeLocation", "@sku", "@quantity"], + STEPS: [{ + type: AggregateSteps.APPLY, + expression: `geodistance(@storeLocation, ${long}, ${lat})/1000`, + AS: 'distInKm' + }, { + type: AggregateSteps.SORTBY, + BY: "@distInKm" + }] + }); + + /* Sample command to run query directly on CLI + FT.AGGREGATE StoresInventory:index '( ( (@sku:[1019688 1019688]) (@quantity:[(0 +inf]) ) (@storeLocation:[-78.878738 42.88023 500 km]) )' LOAD 4 @storeId @storeLocation @sku @quantity APPLY "geodistance(@storeLocation,-78.878738,42.88043)/1000" AS distInKm SORTBY 1 @distInKm + */ + + retItems = aggregator.results; + + if (!retItems.length) { + throw `Product not found with in ${radiusInKm}km range!`; + } + else { + retItems = retItems.map((item) => { + if (typeof item.storeLocation == "string") { + const location = item.storeLocation.split(","); + item.storeLocation = { + longitude: Number(location[0]), + latitude: Number(location[1]), + } + } + return item; + }) + } + } + else { + throw `Input params failed !`; + } + return retItems; +} +``` diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search.mdx b/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search.mdx new file mode 100644 index 00000000000..b064b50643d --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/local-inventory-search/api/inventory-search.mdx @@ -0,0 +1,111 @@ +The code that follows shows an example API request and response for the `inventorySearch` API: + +**inventorySearch API Request** + +```json title="POST http://localhost:3000/api/inventorySearch" +{ + "sku":1019688, + "searchRadiusInKm":500, + "userLocation": { + "latitude": 42.880230, + "longitude": -78.878738 + } +} +``` + +**inventorySearch API Response** + +```json +{ + "data": [ + { + "storeId": "02_NY_ROCHESTER", + "storeLocation": { + "longitude": -77.608849, + "latitude": 43.156578 + }, + "sku": 1019688, + "quantity": 38 + }, + { + "storeId": "05_NY_WATERTOWN", + "storeLocation": { + "longitude": -75.910759, + "latitude": 43.974785 + }, + "sku": 1019688, + "quantity": 31 + }, + { + "storeId": "10_NY_POUGHKEEPSIE", + "storeLocation": { + "longitude": -73.923912, + 
"latitude": 41.70829 + }, + "sku": 1019688, + "quantity": 45 + } + ], + "error": null +} +``` + +When you make a request, it goes through the API gateway to the `inventory service`. Ultimately, it ends up calling an `inventorySearch` function which looks as follows: + +```typescript title="src/inventory-service.ts" +/** + * Search Product in stores within search radius. + * + * :param _inventoryFilter: Product Id (sku), searchRadiusInKm and current userLocation + * :return: Inventory product list + */ +static async inventorySearch(_inventoryFilter: IInventoryBodyFilter): Promise { + const nodeRedisClient = getNodeRedisClient(); + + const repository = StoresInventoryRepo.getRepository(); + let retItems: IStoresInventory[] = []; + + if (nodeRedisClient && repository && _inventoryFilter?.sku + && _inventoryFilter?.userLocation?.latitude + && _inventoryFilter?.userLocation?.longitude) { + + const lat = _inventoryFilter.userLocation.latitude; + const long = _inventoryFilter.userLocation.longitude; + const radiusInKm = _inventoryFilter.searchRadiusInKm || 1000; + + const queryBuilder = repository.search() + .where('sku') + .eq(_inventoryFilter.sku) + .and('quantity') + .gt(0) + .and('storeLocation') + .inRadius((circle) => { + return circle + .latitude(lat) + .longitude(long) + .radius(radiusInKm) + .kilometers + }); + + console.log(queryBuilder.query); + /* Sample queryBuilder query + ( ( (@sku:[1019688 1019688]) (@quantity:[(0 +inf]) ) (@storeLocation:[-78.878738 42.88023 500 km]) ) + */ + + retItems = await queryBuilder.return.all(); + + /* Sample command to run query directly on CLI + FT.SEARCH StoresInventory:index '( ( (@sku:[1019688 1019688]) (@quantity:[(0 +inf]) ) (@storeLocation:[-78.878738 42.88023 500 km]) )' + */ + + + if (!retItems.length) { + throw `Product not found with in ${radiusInKm}km range!`; + } + } + else { + throw `Input params failed !`; + } + return retItems; +} +``` diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/inventory-data.png b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/inventory-data.png new file mode 100644 index 00000000000..cb31b292b8a Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/inventory-data.png differ diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/local-search.png b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/local-search.png new file mode 100644 index 00000000000..2c98a3f0a4f Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/local-search.png differ diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/newyork-state.png b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/newyork-state.png new file mode 100644 index 00000000000..59474b38d88 Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/newyork-state.png differ diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/product-data.png b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/product-data.png new file mode 100644 index 00000000000..5908e8c75d6 Binary files /dev/null and b/docs/howtos/solutions/real-time-inventory/local-inventory-search/images/product-data.png differ diff --git a/docs/howtos/solutions/real-time-inventory/local-inventory-search/index-rti-local-inventory-search.mdx 
b/docs/howtos/solutions/real-time-inventory/local-inventory-search/index-rti-local-inventory-search.mdx new file mode 100644 index 00000000000..7dd52113a91 --- /dev/null +++ b/docs/howtos/solutions/real-time-inventory/local-inventory-search/index-rti-local-inventory-search.mdx @@ -0,0 +1,122 @@ +---
+id: index-rti-local-inventory-search
+title: Real-time Local Inventory Search Using Redis
+sidebar_label: Real-time Local Inventory Search Using Redis
+slug: /howtos/solutions/real-time-inventory/local-inventory-search
+authors: [prasan, will]
+---
+
+import Authors from '@theme/Authors';
+
+import InventorySearch from './api/inventory-search.mdx';
+import InventorySearchWithDistance from './api/inventory-search-with-distance.mdx';
+
+import RealTimeInventorySourceCode from '../common-rti/source-code-tip.mdx';
+import RealTimeInventoryChallenges from '../common-rti/rti-challenges.mdx';
+import RealTimeInventoryCustomerProofs from '../common-rti/customer-proofs.mdx';
+import GeneralAdditionalResources from '../common-rti/additional-resources.mdx';
+
+import newYorkState from './images/newyork-state.png';
+
+
+
+## What is real-time local inventory search?
+
+**Real-time local inventory search** is a method of utilizing advanced product search capabilities across a group of stores or warehouses in a region or geographic area, by which a retailer can enhance the customer experience with a localized view of inventory while fulfilling orders from the closest store possible.
+
+Geospatial search of merchandise local to the consumer helps sell stock faster, lowers inventory levels, and thus increases the inventory turnover ratio.
+Consumers locate a product online, place the order in their browser or mobile device, and pick it up at the nearest store location. This is called “buy-online-pickup-in-store” (BOPIS).
+
+## Current challenges in real-time inventory
+
+
+
+## Why you should use Redis for local inventory search
+
+- **Accurate location/regional inventory search**: Redis Cloud geospatial search capabilities enable retailers to provide local inventories by store location across geographies and regions based on a consumer's location. This enables a real-time view of store inventory and a seamless BOPIS shopping experience.
+
+- **Consistent and accurate inventory view across multichannel and omnichannel experiences**: Accurate inventory information no matter what channel the shopper is using, whether in-store, kiosk, online, or mobile. Redis Cloud provides a single source of truth for inventory information across all channels.
+
+- **Real-time search performance at scale**: Redis Cloud's real-time search and query engine allows retailers to provide instant application and inventory search responses and scale performance effortlessly during peak periods.
+
+## Real-time local inventory search with Redis
+
+![local-search](./images/local-search.png)
+
+Redis provides geospatial search capabilities across a group of stores or warehouses in a region or geographic area, allowing a retailer to quickly show the available inventory local to the customer.
+
+Redis Cloud processes event streams, keeping store inventories up to date in real time. This enhances the customer experience with localized, accurate search of inventory while fulfilling orders from the nearest and fewest stores possible.
+
+This solution lowers days sales of inventory (DSI), selling inventory faster and carrying less inventory for increased revenue generation and profits over a shorter time period.
+
+It also reduces fulfillment costs to home and local stores, enhancing a retailer's ability to fulfill orders with the lowest delivery and shipping costs.
+
+:::tip Customer proof points
+
+
+
+:::
+
+## Building a real-time local inventory search with Redis
+
+
+
+### Setting up the data
+
+Once the application source code is downloaded, run the following commands to populate data in Redis:
+
+```shell
+# install packages
+npm install
+
+# Seed data to Redis
+npm run seed
+```
+
+The demo uses two collections:
+
+- **Product collection**: Stores product details like `productId`, `name`, `price`, `image`, and other details
+  ![product data](./images/product-data.png)
+
+:::tip
+Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to view your Redis data or to play with raw Redis commands in the workbench.
+:::
+
+- **StoresInventory collection**: Stores the product quantity available at different local stores (a sample document is shown after the image below).
+
+For demo purposes, we are using the regions below in New York, US as store locations. Products are mapped to these store locations with a `storeId` and `quantity`.
+
+Regions in New York State
+
+![inventory data](./images/inventory-data.png)
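+
+For reference, an individual inventory document in this collection holds the store, its coordinates, the product SKU, and the quantity on hand. The sample below is illustrative; it mirrors the shape of the results returned by the search APIs later in this tutorial:
+
+```json
+{
+  "storeId": "02_NY_ROCHESTER",
+  "storeLocation": { "longitude": -77.608849, "latitude": 43.156578 },
+  "sku": 1019688,
+  "quantity": 38
+}
+```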
+
+Let's build the following APIs to demonstrate geospatial search using Redis:
+
+- **InventorySearch API**: Search for products in local stores within a search radius.
+- **InventorySearchWithDistance API**: Search for products in local stores within a search radius and sort the results by distance from the current user location to the store.
+
+### InventorySearch API
+
+
+
+### InventorySearchWithDistance API
+
+
+
+## Ready to use Redis for real-time local inventory search?
+
+Hopefully, this tutorial has helped you visualize how to use Redis for real-time local inventory search across different regional stores.
For additional resources related to this topic, check out the links below: + +### Additional resources + +- Real time inventory with Redis + - [Available to Promise in Real-time Inventory](/howtos/solutions/real-time-inventory/available-to-promise) +- General + diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard-semantic-text.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard-semantic-text.png new file mode 100644 index 00000000000..658a3c7af9a Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard-semantic-text.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard.png new file mode 100644 index 00000000000..cbb383335e0 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/01-dashboard.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/02-cart.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/02-cart.png new file mode 100644 index 00000000000..9f096691ed5 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/02-cart.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/05-order-history.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/05-order-history.png new file mode 100644 index 00000000000..747560cba9f Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/05-order-history.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/06-admin-charts.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/06-admin-charts.png new file mode 100644 index 00000000000..4bfa493670a Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/06-admin-charts.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/07-admin-top-trending.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/07-admin-top-trending.png new file mode 100644 index 00000000000..5b8be9e8222 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/07-admin-top-trending.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/images/08-settings.png b/docs/howtos/solutions/triggers-and-functions/common-tf/images/08-settings.png new file mode 100644 index 00000000000..ddd70498a6a Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/common-tf/images/08-settings.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-ecommerce-tf.mdx b/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-ecommerce-tf.mdx new file mode 100644 index 00000000000..08f2343ceda --- /dev/null +++ b/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-ecommerce-tf.mdx @@ -0,0 +1,20 @@ +The e-commerce microservices application consists of a frontend, built using [Next.js](https://nextjs.org/) with [TailwindCSS](https://tailwindcss.com/). The application backend uses [Node.js](https://nodejs.org). The data is stored in [Redis](https://redis.com/try-free/) and either MongoDB or PostgreSQL, using [Prisma](https://www.prisma.io/docs/reference/database-reference/supported-databases). Below are screenshots showcasing the frontend of the e-commerce app. 
+ +**Dashboard:** Displays a list of products with different search functionalities, configurable in the settings page. +![Redis Microservices E-commerce App Frontend - Products Page](images/01-dashboard.png) + +**Settings:** Accessible by clicking the gear icon at the top right of the dashboard. Control the search bar, chatbot visibility, and other features here. +![Redis Microservices E-commerce App Frontend - Settings Page](images/08-settings.png) + +**Dashboard (Semantic Text Search):** Configured for semantic text search, the search bar enables natural language queries. Example: "pure cotton blue shirts." +![Redis Microservices E-commerce App Frontend - Semantic Text Search](images/01-dashboard-semantic-text.png) + +**Shopping Cart:** Add products to the cart and check out using the "Buy Now" button. +![Redis Microservices E-commerce App Frontend - Shopping Cart](images/02-cart.png) + +**Order History:** Post-purchase, the 'Orders' link in the top navigation bar shows the order status and history. +![Redis Microservices E-commerce App Frontend - Order History Page](images/05-order-history.png) + +**Admin Panel:** Accessible via the 'admin' link in the top navigation. Displays purchase statistics and trending products. +![Redis Microservices E-commerce App Frontend - Admin Page](images/06-admin-charts.png) +![Redis Microservices E-commerce App Frontend - Admin Page](images/07-admin-top-trending.png) diff --git a/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-source-code-tf.mdx b/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-source-code-tf.mdx new file mode 100644 index 00000000000..7aa733bf44c --- /dev/null +++ b/docs/howtos/solutions/triggers-and-functions/common-tf/microservices-source-code-tf.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +git clone --branch v9.2.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions + +::: diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/add-triggers-redis-insight.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/add-triggers-redis-insight.png new file mode 100644 index 00000000000..604142753bc Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/add-triggers-redis-insight.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/how-triggers-work.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/how-triggers-work.png new file mode 100644 index 00000000000..6b45a83022f Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/how-triggers-work.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-test-ri.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-test-ri.png new file mode 100644 index 00000000000..8c19f3c633b Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-test-ri.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-verify-ri.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-verify-ri.png new file mode 100644 index 00000000000..18452cd64bc Binary files /dev/null and 
b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/key-space-trigger-verify-ri.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-test-ri.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-test-ri.png new file mode 100644 index 00000000000..21e5276126a Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-test-ri.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-verify-ri.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-verify-ri.png new file mode 100644 index 00000000000..307ecaf6630 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/on-demand-trigger-verify-ri.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-test-ri.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-test-ri.png new file mode 100644 index 00000000000..86f6d5b2d37 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-test-ri.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-charts.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-charts.png new file mode 100644 index 00000000000..368c564ff4f Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-charts.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-top-trending.png b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-top-trending.png new file mode 100644 index 00000000000..441124b66f0 Binary files /dev/null and b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/images/stream-trigger-verify-admin-top-trending.png differ diff --git a/docs/howtos/solutions/triggers-and-functions/getting-started-tf/index-triggers-and-functions.mdx b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/index-triggers-and-functions.mdx new file mode 100644 index 00000000000..87fc28fb9d0 --- /dev/null +++ b/docs/howtos/solutions/triggers-and-functions/getting-started-tf/index-triggers-and-functions.mdx @@ -0,0 +1,633 @@ +--- +id: index-triggers-and-functions +title: Getting Started With Triggers and Functions in Redis +sidebar_label: Getting Started With Triggers and Functions in Redis +slug: /howtos/solutions/triggers-and-functions/getting-started +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import InitialMicroservicesArchitecture from '../../microservices/common-data/microservices-arch.mdx'; +import MicroservicesEcommerceTFDesign from '../common-tf/microservices-ecommerce-tf.mdx'; +import SourceCode from '../common-tf/microservices-source-code-tf.mdx'; + + + +## What you will learn in this tutorial + +In this comprehensive tutorial on Redis 7.2's **Triggers and Functions**, you'll gain insights and practical skills in the following areas: + +- **Understanding Redis Triggers and Functions**: Grasp the fundamentals of Redis's new programmability features, including how to use JavaScript code for data-driven triggers and 
functions. +- **Application Scenarios**: Explore real-world applications in an e-commerce context, such as inventory management and sales statistics calculations. +- **Types of Triggers**: Learn the distinction and use cases for `On-demand` Triggers, `KeySpace` Triggers, and `Stream` Triggers. +- **Hands-on Implementation**: Get practical experience by creating and deploying various triggers and functions in a simulated e-commerce environment. + +## Microservices architecture for an e-commerce application + + + + + +## E-commerce application frontend using Next.js and Tailwind + + + +## What are triggers and functions ? + +Triggers and functions represent a revolutionary step in Redis's programmability, introduced in Redis 7.2. This feature empowers developers to program, store, and execute **JavaScript code** in response to data changes directly within the Redis database, similar to stored procedures or triggers in traditional SQL databases. + +This capability lets developers define events (called `triggers`) to execute `functions` closer to the data. That is, developers define business logic that executes in response to database events or commands. That speeds up the code and related interactions, because there is no wait to bring code from clients into the database. + +![how-triggers-work](./images/how-triggers-work.png) + +### Advantages + +Incorporating triggers and functions into Redis capitalizes on its renowned real-time performance and simplicity: + +- **Reduced Latency**: By processing tasks directly within Redis, you minimize network overhead, saving time and computational resources. +- **Real-time event processing**: Triggers are executed in **real-time** and keep the **atomicity** of the command. This removes potential data inconsistencies that are introduced by applying async logic through application code. +- **JavaScript**: Use the most known language by professional developers. This lowers the learning curve compared to the lesser-known Lua Functions in earlier Redis Gears. +- **Compatibility**: Seamlessly integrate with existing Redis Stack capabilities and data structures. + +### Types of triggers and functions + +Triggers and functions in Redis can be categorized into three types, based on their activation methods: + +- **On-demand Triggers**: These are explicitly invoked by calling them directly. +- **KeySpace Triggers**: Triggered by operations on keys, such as creation, update, or deletion. +- **Stream Triggers**: Activated when new entries are added to a Redis stream. + +### Sample product data + +To illustrate the application of triggers and functions, let's consider a simplified e-commerce dataset. This dataset includes detailed product information, which we will use throughout our tutorial. + +```ts title="database/fashion-dataset/001/products/*.json" +const products = [ + { + productId: '11000', + price: 3995, + productDisplayName: 'Puma Men Slick 3HD Yellow Black Watches', + variantName: 'Slick 3HD Yellow', + brandName: 'Puma', + ageGroup: 'Adults-Men', + gender: 'Men', + displayCategories: 'Accessories', + masterCategory_typeName: 'Accessories', + subCategory_typeName: 'Watches', + styleImages_default_imageURL: + 'http://host.docker.internal:8080/images/11000.jpg', + productDescriptors_description_value: 'Stylish and comfortable, ...', + stockQty: 25, + }, + //... +]; +``` + +## OnDemand trigger + +On-demand triggers in Redis are JavaScript functions that are explicitly invoked to perform specific tasks. 
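Before the e-commerce scenario below, here is a minimal sketch of the basic shape of an on-demand function. The library name `HelloTriggers`, the function name `hello`, and the file name `hello.js` are placeholders for illustration only; the pattern (a `#!js` header plus `redis.registerAsyncFunction`) is the same one used by the `resetInventory` function that follows.

```js
#!js name=HelloTriggers api_version=1.0

// An on-demand function runs only when it is invoked explicitly,
// for example: TFCALLASYNC HelloTriggers.hello 0
redis.registerAsyncFunction('hello', async function (client) {
  return 'Hello from an on-demand trigger!';
});
```

Once such a library is loaded (for example with `redis-cli -x TFUNCTION LOAD < ./hello.js`), the function can be invoked with `TFCALLASYNC HelloTriggers.hello 0`.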
+ +### Application Scenario: Resetting Inventory + +In our e-commerce demo, consider a feature where we need to reset the stock quantity of all products. We'll implement this by clicking a `RESET STOCK QTY` button in the UI dashboard, triggering the `resetInventory` function. + +### Creating the function + +Let's craft a function named `resetInventory` under the namespace `OnDemandTriggers`. This function will reset the inventory (stock quantity) of all products to 25. + +```js title="database/src/triggers/on-demand-trigger.js" +#!js name=OnDemandTriggers api_version=1.0 + +redis.registerAsyncFunction('resetInventory', async function (client) { + let cursor = '0'; + const DEFAULT_PRODUCT_QTY = 25; + + redis.log('resetInventory'); + do { + client.block((client) => { + //scan all the product keys in the database + let res = client.call('scan', cursor, 'match', 'products:productId:*'); + cursor = res[0]; + let keys = res[1]; + // loop through all the product keys and set the stockQty to 25 + keys.forEach((key) => { + if (!key.match('index:hash')) { + client.call( + 'JSON.SET', + key, + '$.stockQty', + DEFAULT_PRODUCT_QTY.toString(), + ); + } + }); + }); + } while (cursor != '0'); + + return 'resetInventory completed !'; +}); +``` + +### Adding the function to Redis + +We can add functions to Redis using various methods: + +1. Using redis-cli + +```sh +redis-cli -x TFUNCTION LOAD < ./on-demand-trigger.js +# or if you want to replace the function +redis-cli -x TFUNCTION LOAD REPLACE . < ./on-demand-trigger.js +``` + +2. Using code + +```ts title="database/src/triggers.ts" +import type { NodeRedisClientType } from './config.js'; +import * as path from 'path'; +import * as fs from 'fs/promises'; + +async function addTriggerToRedis( + fileRelativePath: string, + redisClient: NodeRedisClientType, +) { + const filePath = path.join(__dirname, fileRelativePath); + const fileData = await fs.readFile(filePath); + let jsCode = fileData.toString(); + jsCode = jsCode.replace(/\r?\n/g, '\n'); + + try { + const result = await redisClient.sendCommand([ + 'TFUNCTION', + 'LOAD', + 'REPLACE', + jsCode, + ]); + console.log(`addTriggersToRedis ${fileRelativePath}`, result); + } catch (err) { + console.log(err); + } +} +``` + +```ts +addTriggerToRedis('triggers/on-demand-trigger.js', redisClient); +``` + +3. Using RedisInsight + +Navigate to the `Triggers and Functions` section in RedisInsight, then to `Libraries`, and use create library to paste and save your function. + +![add-triggers-redis-insight](./images/add-triggers-redis-insight.png) + +### Testing the function + +1. Using redis-cli + +```sh +redis-cli TFCALLASYNC OnDemandTriggers.resetInventory 0 +``` + +2. Using code + +Clicking on the 'RESET STOCK QTY' button triggers the triggerResetInventory API. + +```json +POST http://localhost:3000/products/triggerResetInventory +{ +} +``` + +This invokes the `resetInventory` function: + +```ts title="server/src/services/products/src/service-impl.ts" +const triggerResetInventory = async () => { + const redisClient = getNodeRedisClient(); + + //@ts-ignore + const result = await redisClient.sendCommand( + ['TFCALLASYNC', 'OnDemandTriggers.resetInventory', '0'], + { + isolated: true, + }, + ); + console.log(`triggerResetInventory : `, result); + + return result; +}; +``` + +3. Using RedisInsight + +Test the command in RedisInsight's workbench and view the results. 
+ +![on-demand-trigger-test-ri](./images/on-demand-trigger-test-ri.png) + +### Verifying data integrity + +Post-execution, check whether the `stockQty` for each product is reset to the default value. + +![on-demand-trigger-verify-ri](./images/on-demand-trigger-verify-ri.png) + +## KeySpace trigger + +A KeySpace trigger allows you to execute custom logic whenever a set of keys matching a specific pattern is added/ modified in the Redis database. It provides a way to react to changes in the data and perform actions based on those changes. + +### Application Scenario : Managing product stock quantity + +In our e-commerce demo, let's address a common need: decreasing product stock quantity upon placing an order. We'll achieve this using a `KeySpace trigger` that listens to `orders:orderId` keys and updates the product stock quantities accordingly. + +### Creating the function + +We'll develop `updateProductStockQty` under the `KeySpaceTriggers` namespace. This function will be responsible for adjusting stock quantities based on order details. + +```js title="database/src/triggers/key-space-trigger.js" +#!js name=KeySpaceTriggers api_version=1.0 +redis.registerKeySpaceTrigger( + 'updateProductStockQty', + 'orders:orderId:', // Keys starting with this prefix are monitored + function (client, data) { + const errors = []; + + try { + if ( + client && + data?.event == 'json.set' && + data?.key != 'orders:orderId:index:hash' + ) { + const orderId = data.key; + // get the order details from the orderId key + let result = client.call('JSON.GET', orderId); + result = result ? JSON.parse(result) : ''; + const order = Array.isArray(result) ? result[0] : result; + + if (order?.products?.length && !order.triggerProcessed) { + try { + //create a log stream to log the trigger events and errors + client.call( + 'XGROUP', + 'CREATE', + 'TRIGGER_LOGS_STREAM', + 'TRIGGER_LOGS_GROUP', + '$', + 'MKSTREAM', + ); + } catch (streamConErr) { + // if log stream already exists + } + + // reduce stockQty for each product in the order + for (const product of order.products) { + let decreaseQtyBy = (-1 * product.qty).toString(); + client.call( + 'JSON.NUMINCRBY', + `products:productId:${product.productId}`, + '.stockQty', + decreaseQtyBy, + ); + + // add log entry + client.call( + 'XADD', + 'TRIGGER_LOGS_STREAM', + '*', + 'message', + `For productId ${product.productId}, stockQty ${decreaseQtyBy}`, + 'orderId', + orderId, + 'function', + 'updateProductStockQty', + ); + } + + // set triggerProcessed flag to avoid duplicate processing + client.call('JSON.SET', orderId, '.triggerProcessed', '1'); + } + } + } catch (generalErr) { + generalErr = JSON.stringify( + generalErr, + Object.getOwnPropertyNames(generalErr), + ); + errors.push(generalErr); + } + + if (errors.length) { + //log error + client.call( + 'XADD', + 'TRIGGER_LOGS_STREAM', + '*', + 'message', + JSON.stringify(errors), + 'orderId', + data.key, + 'function', + 'updateProductStockQty', + ); + } + }, +); +``` + +In this script, we listen to changes in the `orders:orderId:` keys. Upon detecting a new order, the function retrieves the order details and accordingly decreases the stock quantity for each product in the order. + +### Adding the function to Redis + +We can add functions to Redis using various methods: + +1. Using redis-cli + +```sh +redis-cli -x TFUNCTION LOAD < ./key-space-trigger.js +# or if you want to replace the function +redis-cli -x TFUNCTION LOAD REPLACE . < ./key-space-trigger.js +``` + +2. 
Using code + +```ts title="database/src/triggers.ts" +import type { NodeRedisClientType } from './config.js'; +import * as path from 'path'; +import * as fs from 'fs/promises'; + +async function addTriggerToRedis( + fileRelativePath: string, + redisClient: NodeRedisClientType, +) { + const filePath = path.join(__dirname, fileRelativePath); + const fileData = await fs.readFile(filePath); + let jsCode = fileData.toString(); + jsCode = jsCode.replace(/\r?\n/g, '\n'); + + try { + const result = await redisClient.sendCommand([ + 'TFUNCTION', + 'LOAD', + 'REPLACE', + jsCode, + ]); + console.log(`addTriggersToRedis ${fileRelativePath}`, result); + } catch (err) { + console.log(err); + } +} +``` + +```ts +addTriggerToRedis('triggers/key-space-trigger.js', redisClient); +``` + +3. Using RedisInsight + +Navigate to the `Triggers and Functions` section in RedisInsight, then to `Libraries`, and use create library to paste and save your function. + +![add-triggers-redis-insight](./images/add-triggers-redis-insight.png) + +### Testing the function + +In our demo, placing an order through the `Buy Now` button triggers the `createOrder` API, which in turn creates a new `orders:orderId:` key, activating the `updateProductStockQty` function. + +Sample createOrder API request: + +```json +POST http://localhost:3000/orders/createOrder +{ + "products": [ + { + "productId": "11002", + "qty": 1, + "productPrice": 4950, + }, + { + "productId": "11012", + "qty": 2, + "productPrice": 1195, + } + ] +} +``` + +A sample order creation command in Redis: + +```sh +"JSON.SET" "orders:orderId:24b38a47-2b7d-4c5d-ba25-b74749e34c65" "$" "{"products":[{"productId":"10381","qty":1,"productPrice":2499,"productData":{}},{"productId":"11030","qty":1,"productPrice":1099,"productData":{}}],"userId":"USR_f0f00a86-7131-40e1-9d89-765b4cc1927f","orderId":"24b38a47-2b7d-4c5d-ba25-b74749e34c65","orderStatusCode":1}" +``` + +The creation of this new key triggers `updateProductStockQty`, leading to the adjustment of stock quantities. + +Monitor the trigger's activity in the `TRIGGER_LOGS_STREAM` for logs and potential errors. + +![key-space-trigger-test-ri](./images/key-space-trigger-test-ri.png) + +### Verifying data integrity + +After the function execution, verify the decreased `stockQty` for each involved product. + +![key-space-trigger-verify-ri](./images/key-space-trigger-verify-ri.png) + +## Stream trigger + +A stream trigger allows you to listen to a Redis stream and execute a function whenever new data is added to the stream. It is commonly used for real-time data processing and event-driven architectures. + +### Application Scenario : Calculating sales statistics + +In our e-commerce demo, let's consider a feature where we need to calculate sales statistics for the products. We'll implement this using a `Stream trigger` that listens to `TRANSACTION_STREAM` and updates the sales statistics accordingly. + +### Creating the function + +We'll develop `calculateStats` under the `StreamTriggers` namespace. This function will be responsible for calculating sales statistics based on the order details. 
+ +```js title="database/src/triggers/stream-trigger.js" +#!js name=StreamTriggers api_version=1.0 + +redis.registerStreamTrigger( + 'calculateStats', // trigger name + 'TRANSACTION_STREAM', // Detects new data added to the stream + function (client, data) { + var streamEntry = {}; + for (let i = 0; i < data.record?.length; i++) { + streamEntry[data.record[i][0]] = data.record[i][1]; + } + + streamEntry.transactionPipeline = JSON.parse( + streamEntry.transactionPipeline, + ); + streamEntry.orderDetails = JSON.parse(streamEntry.orderDetails); + + if ( + streamEntry.transactionPipeline?.length == 1 && + streamEntry.transactionPipeline[0] == 'PAYMENT_PROCESSED' && + streamEntry.orderDetails + ) { + //log + client.call( + 'XADD', + 'TRIGGER_LOGS_STREAM', + '*', + 'message', + `${streamEntry.transactionPipeline}`, + 'orderId', + `orders:orderId:${streamEntry.orderDetails.orderId}`, + 'function', + 'calculateStats', + ); + + const orderAmount = parseInt(streamEntry.orderDetails.orderAmount); //remove decimal + const products = streamEntry.orderDetails.products; + + // sales + client.call('INCRBY', 'statsTotalPurchaseAmount', orderAmount.toString()); + + for (let product of products) { + const totalProductAmount = + parseInt(product.qty) * parseInt(product.productPrice); + + // trending products + client.call( + 'ZINCRBY', + 'statsProductPurchaseQtySet', + product.qty.toString(), + product.productId, + ); + + // category wise purchase interest + const category = ( + product.productData.masterCategory_typeName + + ':' + + product.productData.subCategory_typeName + ).toLowerCase(); + client.call( + 'ZINCRBY', + 'statsCategoryPurchaseAmountSet', + totalProductAmount.toString(), + category, + ); + + // largest brand purchases + const brand = product.productData.brandName; + client.call( + 'ZINCRBY', + 'statsBrandPurchaseAmountSet', + totalProductAmount.toString(), + brand, + ); + } + } + }, + { + isStreamTrimmed: false, //whether the stream should be trimmed automatically after the data is processed by the consumer. + window: 1, + }, +); +``` + +In above `calculateStats` function, we are listening to `TRANSACTION_STREAM` and updating different sales statistics like + +- `statsTotalPurchaseAmount` variable stores total purchase amount +- `statsProductPurchaseQtySet` is a sorted set which tracks trending products based on highest purchase quantity +- `statsCategoryPurchaseAmountSet` is a sorted set which tracks category wise purchase interest +- `statsBrandPurchaseAmountSet` is a sorted set which tracks largest brand purchases + +### Adding the function to Redis + +We can add functions to Redis using various methods: + +1. Using redis-cli + +```sh +redis-cli -x TFUNCTION LOAD < ./stream-trigger.js +# or if you want to replace the function +redis-cli -x TFUNCTION LOAD REPLACE . < ./stream-trigger.js +``` + +2. 
Using code + +```ts title="database/src/triggers.ts" +import type { NodeRedisClientType } from './config.js'; +import * as path from 'path'; +import * as fs from 'fs/promises'; + +async function addTriggerToRedis( + fileRelativePath: string, + redisClient: NodeRedisClientType, +) { + const filePath = path.join(__dirname, fileRelativePath); + const fileData = await fs.readFile(filePath); + let jsCode = fileData.toString(); + jsCode = jsCode.replace(/\r?\n/g, '\n'); + + try { + const result = await redisClient.sendCommand([ + 'TFUNCTION', + 'LOAD', + 'REPLACE', + jsCode, + ]); + console.log(`addTriggersToRedis ${fileRelativePath}`, result); + } catch (err) { + console.log(err); + } +} +``` + +```ts +addTriggerToRedis('triggers/stream-trigger.js', redisClient); +``` + +3. Using RedisInsight + +Navigate to the `Triggers and Functions` section in RedisInsight, then to `Libraries`, and use create library to paste and save your function. + +![add-triggers-redis-insight](./images/add-triggers-redis-insight.png) + +### Testing the function + +In our demo, placing an order through the `Buy Now` button creates a new order involving different transaction steps like transaction risk assessment, payment fulfillment etc. All these steps along with order details are logged in `TRANSACTION_STREAM`. + +Sample code to add details to a stream: + +```ts +const addMessageToTransactionStream = async (message) => { + if (message) { + const streamKeyName = 'TRANSACTION_STREAM'; + try { + const nodeRedisClient = getNodeRedisClient(); + if (nodeRedisClient && message) { + const id = '*'; //* = auto generate + await nodeRedisClient.xAdd(streamKeyName, id, message); + } + } catch (err) { + console.error('addMessageToTransactionStream error !', err); + } + } +}; +``` + +A sample command to add details to the stream: + +```sh +"XADD" "TRANSACTION_STREAM" "*" "action" "PAYMENT_PROCESSED" "userId" "USR_f0f00a86-7131" "orderDetails" "{'orderId':'bc438c5d-117e-41bd-97fa-943c03be0b1c','products':[],'paymentId':'clrrl8yp50007pf851m7f92u2'}" "transactionPipeline" "['PAYMENT_PROCESSED']" +``` + +The `calculateStats` function listens to `TRANSACTION_STREAM` stream for `PAYMENT_PROCESSED` action and updates the sales statistics accordingly. + +Check different stats variable values in RedisInsight which were used in trigger function `calculateStats`. +![stream-trigger-test-ri](./images/stream-trigger-test-ri.png) + +### Verifying data integrity + +After the function execution, verify the updated admin dashboard. + +**Admin Panel:** Accessible via the 'admin' link in the top navigation. Check purchase statistics and trending products in UI. +![E-commerce App Frontend - Admin Page](./images/stream-trigger-verify-admin-charts.png) +![E-commerce App Frontend - Admin Page](./images/stream-trigger-verify-admin-top-trending.png) + +## Ready to use Redis triggers and functions + +We've covered key concepts like `On-demand`, `KeySpace`, and `Stream` triggers, and applied them in real e-commerce scenarios. These advanced functionalities of Redis open up a myriad of possibilities for data processing and automation, allowing you to build applications that are not only faster but also more intelligent. + +As you continue to explore Redis and its evolving ecosystem, remember that these `triggers and functions` are just the beginning. Redis offers a rich set of features that can be combined in creative ways to solve complex problems and deliver high-performance solutions. 
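As a quick sanity check while experimenting, you can list the trigger libraries currently loaded on your Redis instance directly from redis-cli; this is a general-purpose command rather than anything specific to this demo:

```sh
# List the loaded trigger libraries (e.g. OnDemandTriggers, KeySpaceTriggers, StreamTriggers)
redis-cli TFUNCTION LIST
```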
+ +### References + +- [Triggers and Functions quick start](https://redis.io/docs/interact/programmability/triggers-and-functions/quick_start_ri/) diff --git a/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-image.png b/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-image.png new file mode 100644 index 00000000000..82907b0fd46 Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-image.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-text.png b/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-text.png new file mode 100644 index 00000000000..6e0bc20e139 Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/01-dashboard-semantic-text.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/01-dashboard.png b/docs/howtos/solutions/vector/common-ai/images/01-dashboard.png new file mode 100644 index 00000000000..46ea83abb61 Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/01-dashboard.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/02-ai-bot.png b/docs/howtos/solutions/vector/common-ai/images/02-ai-bot.png new file mode 100644 index 00000000000..d5fb913aa01 Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/02-ai-bot.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/03-ai-bot-product.png b/docs/howtos/solutions/vector/common-ai/images/03-ai-bot-product.png new file mode 100644 index 00000000000..0cccb7bc23a Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/03-ai-bot-product.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/04-ai-product-shopping-cart.png b/docs/howtos/solutions/vector/common-ai/images/04-ai-product-shopping-cart.png new file mode 100644 index 00000000000..8bd09fd547e Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/04-ai-product-shopping-cart.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/05-order-history.png b/docs/howtos/solutions/vector/common-ai/images/05-order-history.png new file mode 100644 index 00000000000..4c0d7059449 Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/05-order-history.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/06-admin-charts.png b/docs/howtos/solutions/vector/common-ai/images/06-admin-charts.png new file mode 100644 index 00000000000..4bfa493670a Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/06-admin-charts.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/07-admin-top-trending.png b/docs/howtos/solutions/vector/common-ai/images/07-admin-top-trending.png new file mode 100644 index 00000000000..4f73d08869d Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/07-admin-top-trending.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/08-settings-ai.png b/docs/howtos/solutions/vector/common-ai/images/08-settings-ai.png new file mode 100644 index 00000000000..56a4496f09e Binary files /dev/null and b/docs/howtos/solutions/vector/common-ai/images/08-settings-ai.png differ diff --git a/docs/howtos/solutions/vector/common-ai/images/08-settings-image-summary.png b/docs/howtos/solutions/vector/common-ai/images/08-settings-image-summary.png new file mode 100644 index 00000000000..56a4496f09e Binary files /dev/null and 
b/docs/howtos/solutions/vector/common-ai/images/08-settings-image-summary.png differ diff --git a/docs/howtos/solutions/vector/common-ai/microservices-ecommerce-ai.mdx b/docs/howtos/solutions/vector/common-ai/microservices-ecommerce-ai.mdx new file mode 100644 index 00000000000..5f316f38ec7 --- /dev/null +++ b/docs/howtos/solutions/vector/common-ai/microservices-ecommerce-ai.mdx @@ -0,0 +1,29 @@ +The e-commerce microservices application consists of a frontend, built using [Next.js](https://nextjs.org/) with [TailwindCSS](https://tailwindcss.com/). The application backend uses [Node.js](https://nodejs.org). The data is stored in [Redis](https://redis.com/try-free/) and either MongoDB or PostgreSQL, using [Prisma](https://www.prisma.io/docs/reference/database-reference/supported-databases). Below are screenshots showcasing the frontend of the e-commerce app. + +**Dashboard:** Displays a list of products with different search functionalities, configurable in the settings page. +![Redis Microservices E-commerce App Frontend - Products Page](images/01-dashboard.png) + +**Settings:** Accessible by clicking the gear icon at the top right of the dashboard. Control the search bar, chatbot visibility, and other features here. +![Redis Microservices E-commerce App Frontend - Settings Page](images/08-settings-ai.png) + +**Dashboard (Semantic Text Search):** Configured for semantic text search, the search bar enables natural language queries. Example: "pure cotton blue shirts." +![Redis Microservices E-commerce App Frontend - Semantic Text Search](images/01-dashboard-semantic-text.png) + +**Dashboard (Semantic Image-Based Queries):** Configured for semantic image summary search, the search bar allows for image-based queries. Example: "Left chest nike logo." +![Redis Microservices E-commerce App Frontend - Semantic Image Search](images/01-dashboard-semantic-image.png) + +**Chat Bot:** Located at the bottom right corner of the page, assisting in product searches and detailed views. +![Redis Microservices E-commerce App Frontend - Chat Bot](images/02-ai-bot.png) + +Selecting a product in the chat displays its details on the dashboard. +![Redis Microservices E-commerce App Frontend - Product Details](images/03-ai-bot-product.png) + +**Shopping Cart:** Add products to the cart and check out using the "Buy Now" button. +![Redis Microservices E-commerce App Frontend - Shopping Cart](images/04-ai-product-shopping-cart.png) + +**Order History:** Post-purchase, the 'Orders' link in the top navigation bar shows the order status and history. +![Redis Microservices E-commerce App Frontend - Order History Page](images/05-order-history.png) + +**Admin Panel:** Accessible via the 'admin' link in the top navigation. Displays purchase statistics and trending products. 
+![Redis Microservices E-commerce App Frontend - Admin Page](images/06-admin-charts.png) +![Redis Microservices E-commerce App Frontend - Admin Page](images/07-admin-top-trending.png) diff --git a/docs/howtos/solutions/vector/common-ai/microservices-source-code-ai.mdx b/docs/howtos/solutions/vector/common-ai/microservices-source-code-ai.mdx new file mode 100644 index 00000000000..7aa733bf44c --- /dev/null +++ b/docs/howtos/solutions/vector/common-ai/microservices-source-code-ai.mdx @@ -0,0 +1,7 @@ +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +git clone --branch v9.2.0 https://github.com/redis-developer/redis-microservices-ecommerce-solutions + +::: diff --git a/docs/howtos/solutions/vector/gen-ai-chatbot/images/chat-bot-flow.png b/docs/howtos/solutions/vector/gen-ai-chatbot/images/chat-bot-flow.png new file mode 100644 index 00000000000..8e4e67ac59a Binary files /dev/null and b/docs/howtos/solutions/vector/gen-ai-chatbot/images/chat-bot-flow.png differ diff --git a/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-ai-products.png b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-ai-products.png new file mode 100644 index 00000000000..2607f2a3927 Binary files /dev/null and b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-ai-products.png differ diff --git a/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-history.png b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-history.png new file mode 100644 index 00000000000..32fc0f8a65d Binary files /dev/null and b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-history.png differ diff --git a/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-log.png b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-log.png new file mode 100644 index 00000000000..ee2c3318216 Binary files /dev/null and b/docs/howtos/solutions/vector/gen-ai-chatbot/images/redis-insight-chat-log.png differ diff --git a/docs/howtos/solutions/vector/gen-ai-chatbot/index-gen-ai-chatbot.mdx b/docs/howtos/solutions/vector/gen-ai-chatbot/index-gen-ai-chatbot.mdx new file mode 100644 index 00000000000..ee9ff9d32f9 --- /dev/null +++ b/docs/howtos/solutions/vector/gen-ai-chatbot/index-gen-ai-chatbot.mdx @@ -0,0 +1,547 @@ +--- +id: index-solutions-gen-ai-chatbot +title: How to Build a GenAI Chatbot Using LangChain and Redis +sidebar_label: How to Build a GenAI Chatbot Using LangChain and Redis +slug: /howtos/solutions/vector/gen-ai-chatbot +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import InitialMicroservicesArchitecture from '../../microservices/common-data/microservices-arch.mdx'; +import MicroservicesEcommerceAIDesign from '../common-ai/microservices-ecommerce-ai.mdx'; +import SourceCode from '../common-ai/microservices-source-code-ai.mdx'; + + + +## What you will learn in this tutorial + +In this tutorial, you'll learn how to build a GenAI chatbot using `LangChain` and `Redis`. You'll also learn how to use `OpenAI's` language model to generate responses to user queries and how to use Redis to store and retrieve data. + +Here's what's covered: + +- **E-Commerce App** : A sample e-commerce application where users can search for products and ask questions about them, add them to their cart, and purchase them. +- **Chatbot Architecture** : The architecture of the chatbot, including the flow diagram, sample user prompt and it's AI response. 
+- **Database setup** : Generating OpenAI embeddings for products and storing them in Redis. +- **Setting up the chatbot API** : Creating a chatbot API that uses OpenAI and Redis to answer user questions and recommend products. + +## Terminology + +**Generative AI**, also known as **GenAI**, is a category of artificial intelligence that specializes in creating new content based on pre-existing data. It can generate a wide array of content types, including text, images, videos, sounds, code, 3D designs, and other media formats. Unlike traditional AI models that focus on analyzing and interpreting existing data, GenAI models learn from existing data and then use their knowledge to generate something entirely new. + +**[LangChain](https://js.langchain.com)** is an innovative library for building language model applications. It offers a structured way to combine different components like language models (e.g., OpenAI's models), storage solutions (like Redis), and custom logic. This modular approach facilitates the creation of sophisticated AI applications, including chatbots. + +**[OpenAI](https://openai.com/)** provides advanced language models like GPT-3, which have revolutionized the field with their ability to understand and generate human-like text. These models form the backbone of many modern AI applications, including chatbots. + +## Microservices architecture for an e-commerce application + + + + + +## E-commerce application frontend using Next.js and Tailwind + + + +## Chatbot architecture + +### Flow diagram + +![flow diagram](./images/chat-bot-flow.png) + +1> **Create Standalone Question**: Create a standalone question using `OpenAI's` language model. + +A standalone question is just a question reduced to the minimum number of words needed to express the request for information. + +```js +//Example +userQuestion = + "I'm thinking of buying one of your T-shirts but I need to know what your returns policy is as some T-shirts just don't fit me and I don't want to waste money."; + +//semanticMeaning of above question +standAloneQuestion = 'What is your return policy?'; +``` + +2> **Create Embeddings for Question**: Once the question is created, `OpenAI's` language model generates an embedding for the question. + +3> **Find Nearest Match in Redis Vector Store**: The embedding is then used to query `Redis` vector store. The system searches for the nearest match to the question embedding among stored vectors + +4> **Get Answer**: With the user initial question, the nearest match from the vector store, and the conversation memory, `OpenAI's` language model generates an answer. This answer is then provided to the user. + +Note : The system maintains a conversation memory, which tracks the ongoing conversation's context. This memory is crucial for ensuring the continuity and relevance of the conversation. + +5> **User Receives Answer**: The answer is sent back to the user, completing the interaction cycle. The conversation memory is updated with this latest exchange to inform future responses. 
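To make the conversation memory concrete, here is a minimal node-redis sketch of keeping per-session chat history in a plain Redis list. The function name `rememberExchange` and its parameters are illustrative placeholders; the tutorial's own services wrap the same idea in helper functions around a `chatHistory:<sessionId>` list, as shown later in the API implementation.

```ts
import type { RedisClientType } from 'redis';

// Minimal sketch: per-session conversation memory kept as a Redis list.
// Key pattern matches the `chatHistory:<sessionId>` lists used later in this tutorial.
async function rememberExchange(
  client: RedisClientType,
  sessionId: string,
  userMessage: string,
  answer: string,
): Promise<string> {
  const chatHistoryKey = `chatHistory:${sessionId}`;

  // Append both sides of the exchange as plain strings.
  await client.rPush(chatHistoryKey, `userMessage: ${userMessage}`);
  await client.rPush(chatHistoryKey, `openAIMessage(You): ${answer}`);

  // Return the full history, ready to be interpolated into the next prompt.
  const history = await client.lRange(chatHistoryKey, 0, -1);
  return history.join('\n\n');
}
```

Because the history is just a list of strings, it can be joined and dropped straight into the prompt templates used when creating the standalone question and the final answer.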
+ +### Sample user prompt and AI response + +Say, OriginalQuestion of user is as follows: + +`I am looking for a watch, Can you recommend anything for formal occasions with price under 50 dollars?` + +Converted standaloneQuestion by openAI is as follows: + +`What watches do you recommend for formal occasions with a price under $50?` + +After vector similarity search on **Redis**, we get the following similarProducts: + +```ts +similarProducts = [ + { + pageContent: ` Product details are as follows: + productId: 11005. + productDisplayName: Puma Men Visor 3HD Black Watch. + price: 5495 ...`, + metadata: { productId: '11005' }, + }, + { + pageContent: ` Product details are as follows: + productId: 11006. + productDisplayName: Puma Men Race Luminous Black Chronograph Watch. + price: 7795 ... `, + metadata: { productId: '11006' }, + }, +]; +``` + +The final openAI response with above context and earlier chat history (if any) is as follows: + +```ts +answer = `I recommend two watches for formal occasions with a price under $50. + +First, we have the Puma Men Visor 3HD Black Watch priced at $54.95. This watch features a heavy-duty design with a stylish dial and chunky casing, giving it a tough appearance - perfect for navigating the urban jungle. It has a square dial shape and a 32 mm case diameter. The watch comes with a 2-year warranty and is water-resistant up to 50 meters. + +Second, we have the Puma Men Race Luminous Black Chronograph Watch priced at $77.95. This watch also features a heavy-duty design with a stylish dial and chunky casing. It has a round dial shape and a 40 mm case diameter. The watch comes with a 2-year warranty and is water-resistant up to 50 meters. + +Both these watches from Puma are perfect for formal occasions and are priced under $50. I hope this helps, and please let me know if you have any other questions!`; +``` + +## Database setup + +:::info +Sign up for an [OpenAI account](https://platform.openai.com/) to get your API key to be used in the demo (add OPEN_AI_API_KEY variable in .env file). You can also refer to the [OpenAI API documentation](https://platform.openai.com/docs/api-reference/introduction) for more information. +::: + + + +### Sample data + +For the purposes of this tutorial, let's consider a simplified e-commerce context. The `products` JSON provided offers a glimpse into AI search functionalities we'll be operating on. + +```ts title="database/fashion-dataset/001/products/*.json" +const products = [ + { + productId: '11000', + price: 3995, + productDisplayName: 'Puma Men Slick 3HD Yellow Black Watches', + variantName: 'Slick 3HD Yellow', + brandName: 'Puma', + ageGroup: 'Adults-Men', + gender: 'Men', + displayCategories: 'Accessories', + masterCategory_typeName: 'Accessories', + subCategory_typeName: 'Watches', + styleImages_default_imageURL: + 'http://host.docker.internal:8080/images/11000.jpg', + productDescriptors_description_value: + '

Stylish and comfortable, ...', + stockQty: 25, + }, + //... +]; +``` + +### OpenAI embeddings seeding + +Below is the sample code to seed `products` data as openAI embeddings into Redis. + +```ts title="database/src/open-ai.ts" +import { Document } from 'langchain/document'; +import { OpenAIEmbeddings } from 'langchain/embeddings/openai'; +import { RedisVectorStore } from 'langchain/vectorstores/redis'; + +/** + * Adds OpenAI embeddings to Redis for the given products. + * + * @param _products - An array of (ecommerce) products. + * @param _redisClient - The Redis client used to connect to the Redis server. + * @param _openAIApiKey - The API key for accessing the OpenAI service. + */ +const addOpenAIEmbeddingsToRedis = async ( + _products, + _redisClient, + _openAIApiKey, +) => { + if (_products?.length > 0 && _redisClient && _openAIApiKey) { + // Check if the data is already seeded + const existingKeys = await _redisClient.keys('openAIProducts:*'); + if (existingKeys.length > 0) { + console.log('seeding openAIEmbeddings skipped !'); + return; + } + + const vectorDocs: Document[] = []; + // Create a document for each product + for (let product of _products) { + let doc = new Document({ + metadata: { + productId: product.productId, + }, + pageContent: ` Product details are as follows: + productId: ${product.productId}. + + productDisplayName: ${product.productDisplayName}. + + price: ${product.price}. + + variantName: ${product.variantName}. + + brandName: ${product.brandName}. + + ageGroup: ${product.ageGroup}. + + gender: ${product.gender}. + + productColors: ${product.productColors} + + Category: ${product.displayCategories}, ${product.masterCategory_typeName} - ${product.subCategory_typeName} + + productDescription: ${product.productDescriptors_description_value}`, + }); + + vectorDocs.push(doc); + } + + // Create a new OpenAIEmbeddings instance + const embeddings = new OpenAIEmbeddings({ + openAIApiKey: _openAIApiKey, + }); + // Add the documents to the RedisVectorStore + const vectorStore = await RedisVectorStore.fromDocuments( + vectorDocs, + embeddings, + { + redisClient: _redisClient, + indexName: 'openAIProductsIdx', + keyPrefix: 'openAIProducts:', + }, + ); + console.log('seeding OpenAIEmbeddings completed'); + } +}; +``` + +You can observe openAIProducts JSON in RedisInsight: + +![Redis Insight AI products](./images/redis-insight-ai-products.png) + +:::tip + +Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to visually explore your Redis data or to engage with raw Redis commands in the workbench. + +::: + +## Setting up the chatbot API + +Once products data is seeded as openAI embeddings into Redis, we can create a `chatbot` API to answer user questions and recommend products. + +### API end point + +The code that follows shows an example API request and response for the `chatBot` API: + +**Request** + +```json +POST http://localhost:3000/products/chatBot +{ + "userMessage":"I am looking for a watch, Can you recommend anything for formal occasions with price under 50 dollars?" +} +``` + +**Response** + +```json +{ + "data": "I recommend two watches for formal occasions with a price under $50. + + First, we have the Puma Men Visor 3HD Black Watch priced at $54.95. This watch features a heavy-duty design with a stylish dial and chunky casing, giving it a tough appearance - perfect for navigating the urban jungle. It has a square dial shape and a 32 mm case diameter. The watch comes with a 2-year warranty and is water-resistant up to 50 meters. 
+ + Second, we have the Puma Men Race Luminous Black Chronograph Watch priced at $77.95. This watch also features a heavy-duty design with a stylish dial and chunky casing. It has a round dial shape and a 40 mm case diameter. The watch comes with a 2-year warranty and is water-resistant up to 50 meters. + + Both these watches from Puma are perfect for formal occasions and are priced under $50. I hope this helps, and please let me know if you have any other questions!", + + "error": null, + "auth": "SES_54f211db-50a7-45df-8067-c3dc4272beb2" +} +``` + +### API implementation + +When you make a request, it goes through the API gateway to the `products` service. Ultimately, it ends up calling an `chatBotMessage` function which looks as follows: + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +import { + ChatOpenAI, + ChatOpenAICallOptions, +} from 'langchain/chat_models/openai'; +import { PromptTemplate } from 'langchain/prompts'; +import { OpenAIEmbeddings } from 'langchain/embeddings/openai'; +import { RedisVectorStore } from 'langchain/vectorstores/redis'; +import { StringOutputParser } from 'langchain/schema/output_parser'; +import { Document } from 'langchain/document'; + +let llm: ChatOpenAI; + +const chatBotMessage = async ( + _userMessage: string, + _sessionId: string, + _openAIApiKey: string, +) => { + const CHAT_BOT_LOG = 'CHAT_BOT_LOG_STREAM'; + const redisWrapperInst = getRedis(); + + // Add user message to chat history + const chatHistoryName = 'chatHistory:' + _sessionId; + redisWrapperInst.addItemToList( + chatHistoryName, + 'userMessage: ' + _userMessage, + ); + // add log + addMessageToStream( + { name: 'originalQuestion', comments: _userMessage }, + CHAT_BOT_LOG, + ); + + // (1) Create a standalone question + const standaloneQuestion = await convertToStandAloneQuestion( + _userMessage, + _sessionId, + _openAIApiKey, + ); + // add log + addMessageToStream( + { name: 'standaloneQuestion', comments: standaloneQuestion }, + CHAT_BOT_LOG, + ); + + // (2) Get similar products from Redis + const similarProducts = await getSimilarProductsByVSS( + standaloneQuestion, + _openAIApiKey, + ); + if (similarProducts?.length) { + // add log + addMessageToStream( + { name: 'similarProducts', comments: JSON.stringify(similarProducts) }, + CHAT_BOT_LOG, + ); + } + + // Combine the product details into a single document + const productDetails = combineVectorDocuments(similarProducts); + console.log('productDetails:', productDetails); + + // (3) Get answer from OpenAI + const answer = await convertToAnswer( + _userMessage, + standaloneQuestion, + productDetails, + _sessionId, + _openAIApiKey, + ); + // add log + addMessageToStream({ name: 'answer', comments: answer }, CHAT_BOT_LOG); + + // Add answer to chat history + redisWrapperInst.addItemToList( + chatHistoryName, + 'openAIMessage(You): ' + answer, + ); + + return answer; +}; +``` + +Below function converts the userMessage to standaloneQuestion using `openAI` + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +// (1) Create a standalone question +const convertToStandAloneQuestion = async ( + _userQuestion: string, + _sessionId: string, + _openAIApiKey: string, +) => { + const llm = getOpenAIInstance(_openAIApiKey); + + const chatHistory = await getChatHistory(_sessionId); + + const standaloneQuestionTemplate = `Given some conversation history (if any) and a question, convert it to a standalone question. 
+ *********************************************************** + conversation history: + ${chatHistory} + *********************************************************** + question: {question} + standalone question:`; + + const standaloneQuestionPrompt = PromptTemplate.fromTemplate( + standaloneQuestionTemplate, + ); + + const chain = standaloneQuestionPrompt + .pipe(llm) + .pipe(new StringOutputParser()); + + const response = await chain.invoke({ + question: _userQuestion, + }); + + return response; +}; +const getOpenAIInstance = (_openAIApiKey: string) => { + if (!llm) { + llm = new ChatOpenAI({ + openAIApiKey: _openAIApiKey, + }); + } + return llm; +}; + +const getChatHistory = async (_sessionId: string, _separator?: string) => { + let chatHistory = ''; + if (!_separator) { + _separator = '\n\n'; + } + if (_sessionId) { + const redisWrapperInst = getRedis(); + const chatHistoryName = 'chatHistory:' + _sessionId; + const items = await redisWrapperInst.getAllItemsFromList(chatHistoryName); + + if (items?.length) { + chatHistory = items.join(_separator); + } + } + return chatHistory; +}; +const combineVectorDocuments = ( + _vectorDocs: Document[], + _separator?: string, +) => { + if (!_separator) { + _separator = '\n\n --------------------- \n\n'; + } + return _vectorDocs.map((doc) => doc.pageContent).join(_separator); +}; +``` + +Below function uses `Redis` to find similar products for the standaloneQuestion + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +// (2) Get similar products from Redis +const getSimilarProductsByVSS = async ( + _standAloneQuestion: string, + _openAIApiKey: string, +) => { + const client = getNodeRedisClient(); + + const embeddings = new OpenAIEmbeddings({ + openAIApiKey: _openAIApiKey, + }); + const vectorStore = new RedisVectorStore(embeddings, { + redisClient: client, + indexName: 'openAIProductsIdx', + keyPrefix: 'openAIProducts:', + }); + + const KNN = 3; + /* Simple standalone search in the vector DB */ + const vectorDocs = await vectorStore.similaritySearch( + _standAloneQuestion, + KNN, + ); + + return vectorDocs; +}; +``` + +Below function uses `openAI` to convert the standaloneQuestion, similar products from Redis and other context to a human understandable answer. + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +// (3) Get answer from OpenAI +const convertToAnswer = async ( + _originalQuestion: string, + _standAloneQuestion: string, + _productDetails: string, + _sessionId: string, + _openAIApiKey: string, +) => { + const llm = getOpenAIInstance(_openAIApiKey); + + const chatHistory = await getChatHistory(_sessionId); + + const answerTemplate = ` + Please assume the persona of a retail shopping assistant for this conversation. + Use a friendly tone, and assume the target audience are normal people looking for a product in a ecommerce website. + + *********************************************************** + ${ + chatHistory + ? 
` + Conversation history between user and you is : + ${chatHistory} + ` + : '' + } + *********************************************************** + OriginalQuestion of user is : {originalQuestion} + *********************************************************** + converted stand alone question is : {standAloneQuestion} + *********************************************************** + resulting details of products for the stand alone question are : + {productDetails} + Note : Different product details are separated by "---------------------" (if any) + *********************************************************** + Answer the question based on the context provided and the conversation history. + + If you don't know the answer, please direct the questioner to email help@redis.com. Don't try to suggest any product out of context as it may not be in the store. + + Let the answer include product display name, price and optional other details based on question asked. + + Let the product display name be a link like productDisplayName + so that user can click on it and go to the product page with help of productId. + + answer: `; + + const answerPrompt = PromptTemplate.fromTemplate(answerTemplate); + const chain = answerPrompt.pipe(llm).pipe(new StringOutputParser()); + + const response = await chain.invoke({ + originalQuestion: _originalQuestion, + standAloneQuestion: _standAloneQuestion, + productDetails: _productDetails, + }); + + return response; +}; +``` + +You can observe chat history and intermediate chat logs in RedisInsight: + +![Redis Insight chat history](./images/redis-insight-chat-history.png) + +![Redis Insight chat log](./images/redis-insight-chat-log.png) + +:::tip + +Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to visually explore your Redis data or to engage with raw Redis commands in the workbench. + +::: + +## Ready to use Redis for genAI chatbot? + +Building a GenAI chatbot using LangChain and Redis involves integrating advanced AI models with efficient storage solutions. This tutorial covers the fundamental steps and code needed to develop a chatbot capable of handling e-commerce queries. 
With these tools, you can create a responsive, intelligent chatbot for a variety of applications + +## Further reading + +- [Perform vector similarity search using Redis](/howtos/solutions/vector/getting-started-vector) + +- [LangChain JS](https://js.langchain.com/docs/get_started/quickstart) + - [Learn LangChain](https://scrimba.com/learn/langchain) +- [LangChain redis integration](https://js.langchain.com/docs/integrations/vectorstores/redis) diff --git a/docs/howtos/solutions/vector/getting-started-vector/chart/common.js b/docs/howtos/solutions/vector/getting-started-vector/chart/common.js new file mode 100644 index 00000000000..c5f990909c5 --- /dev/null +++ b/docs/howtos/solutions/vector/getting-started-vector/chart/common.js @@ -0,0 +1,30 @@ +// Sample data +const products = [ + { + name: 'Puma Men Race Black Watch', + price: 150, + quality: 5, + popularity: 8, + }, + { + name: 'Puma Men Top Fluctuation Red Black Watch', + price: 180, + quality: 7, + popularity: 6, + }, + { + name: 'Inkfruit Women Behind Cream Tshirt', + price: 5, + quality: 9, + popularity: 7, + }, +]; + +const dataWithAttributes = products.map((product) => ({ + x: product.price, + y: product.quality, + label: product.name, +})); + +const product1Point = { x: products[0].price, y: products[0].quality }; +const product2Point = { x: products[1].price, y: products[1].quality }; diff --git a/docs/howtos/solutions/vector/getting-started-vector/chart/cosine.js b/docs/howtos/solutions/vector/getting-started-vector/chart/cosine.js new file mode 100644 index 00000000000..2035affa807 --- /dev/null +++ b/docs/howtos/solutions/vector/getting-started-vector/chart/cosine.js @@ -0,0 +1,95 @@ +/* eslint-disable @typescript-eslint/no-unsafe-return */ +/* eslint-disable @typescript-eslint/no-unsafe-call */ +/* eslint-disable @typescript-eslint/no-unsafe-argument */ + +const ctxCosine = document + .getElementById('productChartCosine') + .getContext('2d'); + +function cosineSimilarity(point1, point2) { + let dotProduct = point1.x * point2.x + point1.y * point2.y; + let magnitudePoint1 = Math.sqrt( + Math.pow(point1.x, 2) + Math.pow(point1.y, 2), + ); + let magnitudePoint2 = Math.sqrt( + Math.pow(point2.x, 2) + Math.pow(point2.y, 2), + ); + return dotProduct / (magnitudePoint1 * magnitudePoint2); +} + +const cosineSim = cosineSimilarity(product1Point, product2Point); + +const scatterChartCosine = new Chart(ctxCosine, { + type: 'scatter', + data: { + datasets: [ + { + label: 'Products', + data: dataWithAttributes, + pointBackgroundColor: ['black', 'red', 'bisque'], + pointRadius: 5, + }, + { + label: 'Vector for Product-1', + data: [{ x: 0, y: 0 }, product1Point], + showLine: true, + fill: false, + borderColor: 'black', + pointRadius: [0, 5], + lineTension: 0, + }, + { + label: 'Vector for Product-2', + data: [{ x: 0, y: 0 }, product2Point], + showLine: true, + fill: false, + borderColor: 'red', + pointRadius: [0, 5], + lineTension: 0, + }, + ], + }, + options: { + responsive: true, + plugins: { + legend: { + position: 'top', + }, + title: { + display: true, + text: `Cosine Similarity between Product-1 and Product-2 is ${cosineSim}`, + }, + }, + scales: { + x: { + type: 'linear', + position: 'bottom', + title: { + display: true, + text: 'Price ($)', + }, + ticks: { + beginAtZero: true, + }, + min: 0, // Ensure it starts from 0 + }, + y: { + title: { + display: true, + text: 'Quality (1-10)', + }, + ticks: { + beginAtZero: true, + }, + min: 0, // Ensure it starts from 0 + }, + }, + tooltips: { + callbacks: { + title: function (tooltipItem, 
data) { + return data.datasets[0].data[tooltipItem[0].index].label; + }, + }, + }, + }, +}); diff --git a/docs/howtos/solutions/vector/getting-started-vector/chart/euclidean.js b/docs/howtos/solutions/vector/getting-started-vector/chart/euclidean.js new file mode 100644 index 00000000000..15fce10c435 --- /dev/null +++ b/docs/howtos/solutions/vector/getting-started-vector/chart/euclidean.js @@ -0,0 +1,78 @@ +/* eslint-disable @typescript-eslint/no-unsafe-call */ +// eslint-disable-next-line @typescript-eslint/no-unsafe-call +const ctx = document.getElementById('productChartEuclidean').getContext('2d'); + +function euclideanDistance(point1, point2) { + return Math.sqrt( + Math.pow(point1.x - point2.x, 2) + Math.pow(point1.y - point2.y, 2), + ); +} + +const distance = euclideanDistance(product1Point, product2Point); + +const scatterChart = new Chart(ctx, { + type: 'scatter', + data: { + datasets: [ + { + label: 'Products', + data: dataWithAttributes, + pointBackgroundColor: ['black', 'red', 'bisque'], + pointRadius: 5, + }, + { + label: `Euclidean Distance: ${distance.toFixed(2)}`, + data: [product1Point, product2Point], + showLine: true, + fill: false, + borderColor: 'green', + pointRadius: 0, + lineTension: 0, + }, + ], + }, + options: { + responsive: true, + plugins: { + legend: { + position: 'top', + }, + title: { + display: true, + text: `Euclidean Distance between Product-1 and Product-2`, + }, + }, + scales: { + x: { + type: 'linear', + position: 'bottom', + title: { + display: true, + text: 'Price ($)', + }, + ticks: { + beginAtZero: true, + }, + min: 0, // Ensure it starts from 0 + }, + y: { + title: { + display: true, + text: 'Quality (1-10)', + }, + ticks: { + beginAtZero: true, + }, + min: 0, // Ensure it starts from 0 + }, + }, + tooltips: { + callbacks: { + title: function (tooltipItem, data) { + // eslint-disable-next-line @typescript-eslint/no-unsafe-return + return data.datasets[0].data[tooltipItem[0].index].label; + }, + }, + }, + }, +}); diff --git a/docs/howtos/solutions/vector/getting-started-vector/chart/index.html b/docs/howtos/solutions/vector/getting-started-vector/chart/index.html new file mode 100644 index 00000000000..c089045aa21 --- /dev/null +++ b/docs/howtos/solutions/vector/getting-started-vector/chart/index.html @@ -0,0 +1,28 @@ + + + + + Product Attributes Visualization + + + + + +

+ +
+
+
+
+ +
+ + + + + + diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/11001.jpg b/docs/howtos/solutions/vector/getting-started-vector/images/11001.jpg new file mode 100644 index 00000000000..6366a8dda81 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/11001.jpg differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/cosine-chart.png b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-chart.png new file mode 100644 index 00000000000..073ef9709b9 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-chart.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/cosine-formula.png b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-formula.png new file mode 100644 index 00000000000..51b7ab52438 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-formula.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/cosine-sample.png b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-sample.png new file mode 100644 index 00000000000..cdcbb0beef4 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/cosine-sample.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-chart.png b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-chart.png new file mode 100644 index 00000000000..7e8d13b29f0 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-chart.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-formula.png b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-formula.png new file mode 100644 index 00000000000..8cbaa051809 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-formula.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-sample.png b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-sample.png new file mode 100644 index 00000000000..10c4103d325 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/euclidean-distance-sample.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/ip-formula.png b/docs/howtos/solutions/vector/getting-started-vector/images/ip-formula.png new file mode 100644 index 00000000000..2f1912d7c99 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/ip-formula.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/ip-sample.png b/docs/howtos/solutions/vector/getting-started-vector/images/ip-sample.png new file mode 100644 index 00000000000..5c29c8ba6e3 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/ip-sample.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/images/products-data-gui.png b/docs/howtos/solutions/vector/getting-started-vector/images/products-data-gui.png new file mode 100644 index 00000000000..df6021442c9 Binary files /dev/null and b/docs/howtos/solutions/vector/getting-started-vector/images/products-data-gui.png differ diff --git a/docs/howtos/solutions/vector/getting-started-vector/index-getting-started-vector.mdx 
b/docs/howtos/solutions/vector/getting-started-vector/index-getting-started-vector.mdx new file mode 100644 index 00000000000..a0fa5e800d2 --- /dev/null +++ b/docs/howtos/solutions/vector/getting-started-vector/index-getting-started-vector.mdx @@ -0,0 +1,811 @@ +--- +id: index-getting-started-vector +title: How to Perform Vector Similarity Search Using Redis in NodeJS +sidebar_label: How to Perform Vector Similarity Search Using Redis in NodeJS +slug: /howtos/solutions/vector/getting-started-vector +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import SampleWatchImage from './images/11001.jpg'; +import EuclideanDistanceFormulaImage from './images/euclidean-distance-formula.png'; +import EuclideanDistanceSampleImage from './images/euclidean-distance-sample.png'; +import CosineFormulaImage from './images/cosine-formula.png'; +import CosineSampleImage from './images/cosine-sample.png'; +import IpFormulaImage from './images/ip-formula.png'; +import IpSampleImage from './images/ip-sample.png'; + + + +## What you Will learn in this tutorial + +This tutorial is a comprehensive guide to leveraging Redis for vector similarity search in a NodeJS environment. Aimed at software developers with expertise in the NodeJS/ JavaScript ecosystem, this tutorial will provide you with the knowledge and techniques required for advanced vector operations. Here's what's covered: + +- [Foundational Concepts](./#vectors-introduction): + + - [About Vectors](./#what-is-a-vector-in-machine-learning): Delve into the foundational concept of vectors in machine learning. + - [Vector Databases](./#what-is-a-vector-database): Understand specialized databases designed to handle vector data efficiently. + - [Vector Similarity](./#what-is-vector-similarity): Grasp the concept and significance of comparing vectors. Discover some use cases where vector similarity plays a pivotal role, from recommendation systems to content retrieval. + +- [Vector Generation](./#generating-vectors): + + - [Textual Content](./#sentence-text-vector): Learn techniques to generate vectors from textual data. + - [Imagery Content](./#image-vector): Understand how images can be represented as vectors and how they're processed. + +- [Redis Database Setup](./#database-setup): + + - [Data Seeding](./#sample-data-seeding): Get hands-on with populating your Redis database with vector data. + - [Index Creation](./#create-vector-index): Understand the process of indexing vector fields in Redis, optimizing for both accuracy and performance. + +- Advanced Vector Queries in Redis: + + - [KNN (k-Nearest Neighbors) Queries](./#what-is-vector-knn-query): Dive into the concept of KNN and its implementation in Redis to retrieve vectors most similar to a query vector. + - [Range Queries](./#what-is-vector-range-query): Discover how to retrieve vectors within a certain distance or range from a target vector. + +- [Vector Similarity Calculations](./#how-to-calculate-vector-similarity): (Optionally) if you want to understand the math behind vector similarity search + + - [Euclidean Distance](./#euclidean-distance-l2-norm): Understand the L2 norm method for calculating similarity. + - [Cosine Similarity](./#cosine-similarity): Dive into the angular differences and its importance in vector space. + - [Inner Product](./#inner-product): Learn about another essential metric in understanding vector similarities. + +- [Additional Resources](./#further-reading): Take your learning further with other resources related to vectors in Redis. 
+
+## Vectors introduction
+
+### What is a vector in machine learning?
+
+In the context of machine learning, a vector is a mathematical representation of data. It is an ordered list of numbers that encodes the features or attributes of a piece of data.
+
+Vectors can be thought of as points in a multi-dimensional space where each dimension corresponds to a feature.
+**For example**, consider a simple dataset about ecommerce `products`. Each product might have features such as `price`, `quality`, and `popularity`.
+
+| Id  | Product                                  | Price ($) | Quality (1 - 10) | Popularity (1 - 10) |
+| --- | ---------------------------------------- | --------- | ---------------- | ------------------- |
+| 1   | Puma Men Race Black Watch                | 150       | 5                | 8                   |
+| 2   | Puma Men Top Fluctuation Red Black Watch | 180       | 7                | 6                   |
+| 3   | Inkfruit Women Behind Cream Tshirt       | 5         | 9                | 7                   |
+
+Now, product 1 (`Puma Men Race Black Watch`) might be represented as the vector `[150, 5, 8]`.
+
+In a more complex scenario, like natural language processing (NLP), words or entire sentences can be converted into dense vectors (often referred to as embeddings) that capture the semantic meaning of the text. Vectors play a foundational role in many machine learning algorithms, particularly those that involve distance measurements, such as clustering and classification algorithms.
+
+### What is a vector database?
+
+A vector database is a specialized system optimized for storing and searching vectors. Designed explicitly for efficiency, these databases play a crucial role in powering vector search applications, including recommendation systems, image search, and textual content retrieval. Often referred to as vector stores, vector indexes, or vector search engines, these databases employ vector similarity algorithms to identify vectors that closely match a given query vector.
+
+:::tip
+
+[**Redis Cloud**](https://redis.com/try-free) is a popular choice for vector databases, as it offers a rich set of data structures and commands that are well-suited for vector storage and search. Redis Cloud allows you to index vectors and perform vector similarity search in a few different ways outlined further in this tutorial. It also maintains a high level of performance and scalability.
+
+:::
+
+### What is vector similarity?
+
+Vector similarity is a measure that quantifies how alike two vectors are, typically by evaluating the `distance` or `angle` between them in a multi-dimensional space.
+When vectors represent data points, such as texts or images, the similarity score can indicate how similar the underlying data points are in terms of their features or content.
+
+**Use cases for vector similarity:**
+
+- **Recommendation Systems**: If you have vectors representing user preferences or item profiles, you can quickly find items that are most similar to a user's preference vector.
+- **Image Search**: Store vectors representing image features, and then retrieve images most similar to a given image's vector.
+- **Textual Content Retrieval**: Store vectors representing textual content (e.g., articles, product descriptions) and find the most relevant texts for a given query vector.
+
+:::tip CALCULATING VECTOR SIMILARITY
+
+If you're interested in learning more about the mathematics behind vector similarity, scroll down to the [**How to calculate vector similarity?**](#how-to-calculate-vector-similarity) section.
+
+:::
+
+## Generating vectors
+
+In our scenario, the focus is on generating sentence (product description) and image (product image) embeddings, or vectors. There is an abundance of AI model repositories, like GitHub, where pre-trained AI models are maintained and shared.
+
+For sentence embeddings, we'll employ a model from the [Hugging Face Model Hub](https://huggingface.co/models), and for image embeddings, one from [TensorFlow Hub](https://tfhub.dev/) will be leveraged for variety.
+
+:::tip GITHUB CODE
+
+Below is the command to clone the source code used in this tutorial:
+
+git clone https://github.com/redis-developer/redis-vector-nodejs-solutions.git
+:::
+
+### Sentence/ text vector
+
+To generate sentence embeddings, we'll make use of a Hugging Face model titled [Xenova/all-distilroberta-v1](https://huggingface.co/Xenova/all-distilroberta-v1). It's a version of [sentence-transformers/all-distilroberta-v1](https://huggingface.co/sentence-transformers/all-distilroberta-v1) that is compatible with transformers.js and ships with ONNX weights.
+
+:::info
+
+[Hugging Face Transformers](https://huggingface.co/docs/transformers.js/index)
+is a renowned open-source tool for Natural Language Processing (NLP) tasks.
+It simplifies the use of cutting-edge NLP models.
+
+The transformers.js library is essentially the JavaScript version of Hugging Face's popular Python library.
+
+:::
+
+:::info
+
+[ONNX (Open Neural Network eXchange)](https://onnx.ai) is an open standard
+that defines a common set of operators and a common file format to represent deep
+learning models in a wide variety of frameworks, including PyTorch and TensorFlow.
+
+:::
+
+Below, you'll find a Node.js code snippet that illustrates how to create vector embeddings for any provided `sentence`:
+
+```sh
+npm install @xenova/transformers
+```
+
+```js title="src/text-vector-gen.ts"
+import * as transformers from '@xenova/transformers';
+
+async function generateSentenceEmbeddings(_sentence: string): Promise<number[]> {
+  let modelName = 'Xenova/all-distilroberta-v1';
+  let pipe = await transformers.pipeline('feature-extraction', modelName);
+
+  let vectorOutput = await pipe(_sentence, {
+    pooling: 'mean',
+    normalize: true,
+  });
+
+  const embeddings: number[] = Object.values(vectorOutput?.data);
+  return embeddings;
+}
+
+export { generateSentenceEmbeddings };
+```
+
+Here's a glimpse of the vector output for a sample text:
+
+```js title="sample output"
+const embeddings = await generateSentenceEmbeddings('I Love Redis !');
+console.log(embeddings);
+/*
+ 768 dim vector output
+ embeddings = [
+  -0.005076113156974316, -0.006047232076525688, -0.03189406543970108,
+  -0.019677048549056053, 0.05152582749724388, -0.035989608615636826,
+  -0.009754283353686333, 0.002385444939136505, -0.04979122802615166,
+  ....]
+*/
+```
+
+### Image vector
+
+To obtain image embeddings, we'll leverage the [mobilenet](https://github.com/tensorflow/tfjs-models/tree/master/mobilenet) model from TensorFlow.
+ +Below, you'll find a Node.js code snippet that illustrates how to create vector embeddings for any provided `image`: + +```sh +npm i @tensorflow/tfjs @tensorflow/tfjs-node @tensorflow-models/mobilenet jpeg-js +``` + +```js title="src/image-vector-gen.ts" +import * as tf from '@tensorflow/tfjs-node'; +import * as mobilenet from '@tensorflow-models/mobilenet'; + +import * as jpeg from 'jpeg-js'; + +import * as path from 'path'; +import { fileURLToPath } from 'url'; +import * as fs from 'fs/promises'; + +const __filename = fileURLToPath(import.meta.url); +const __dirname = path.dirname(__filename); + +async function decodeImage(imagePath) { + imagePath = path.join(__dirname, imagePath); + + const buffer = await fs.readFile(imagePath); + const rawImageData = jpeg.decode(buffer); + const imageTensor = tf.browser.fromPixels(rawImageData); + return imageTensor; +} + +async function generateImageEmbeddings(imagePath: string) { + const imageTensor = await decodeImage(imagePath); + + // Load MobileNet model + const model = await mobilenet.load(); + + // Classify and predict what the image is + const prediction = await model.classify(imageTensor); + console.log(`${imagePath} prediction`, prediction); + + // Preprocess the image and get the intermediate activation. + const activation = model.infer(imageTensor, true); + + // Convert the tensor to a regular array. + const vectorOutput = await activation.data(); + + imageTensor.dispose(); // Clean up tensor + + return vectorOutput; //DIM 1024 +} +``` + +
+
+:::tip Image classification model
+
+We are using the [mobilenet model](https://github.com/tensorflow/tfjs-models/tree/master/mobilenet), which is trained only on a small [set of image classes](https://github.com/tensorflow/tfjs-examples/blob/master/mobilenet/imagenet_classes.js). The choice of an image classification model depends on various factors, such as dataset size, dataset diversity, computational resources, and the specific requirements of your application. There are various alternative image classification models, such as EfficientNet, ResNets, and Vision Transformers (ViT), that you can select based on your needs.
+:::
+
+Below is an illustration of the vector output for a sample watch image:
+
+_Sample e-commerce watch image (images/11001.jpg)_
+
+ +```js title="sample output" +//watch image +const imageEmbeddings = await generateImageEmbeddings('images/11001.jpg'); +console.log(imageEmbeddings); +/* + 1024 dim vector output + imageEmbeddings = [ + 0.013823275454342365, 0.33256298303604126, 0, + 2.2764432430267334, 0.14010703563690186, 0.972867488861084, + 1.2307443618774414, 2.254523992538452, 0.44696325063705444, + ....] + + images/11001.jpg (mobilenet model) prediction [ + { className: 'digital watch', probability: 0.28117117285728455 }, + { className: 'spotlight, spot', probability: 0.15369531512260437 }, + { className: 'digital clock', probability: 0.15267866849899292 } +] +*/ +``` + +## Database setup + +:::tip GITHUB CODE + +Below is a command to the clone the source code used in this tutorial + +git clone https://github.com/redis-developer/redis-vector-nodejs-solutions.git +::: + +### Sample Data seeding + +For the purposes of this tutorial, let's consider a simplified e-commerce context. The `products` JSON provided offers a glimpse into vector search functionalities we'll be discussing. + +```js title="src/data.ts" +const products = [ + { + _id: '1', + price: 4950, + productDisplayName: 'Puma Men Race Black Watch', + brandName: 'Puma', + ageGroup: 'Adults-Men', + gender: 'Men', + masterCategory: 'Accessories', + subCategory: 'Watches', + imageURL: 'images/11002.jpg', + productDescription: + '

This watch from puma comes in a heavy duty design. The asymmetric dial and chunky casing gives this watch a tough appearance perfect for navigating the urban jungle.

Dial shape
: Round
Case diameter: 32 cm
Warranty: 2 Years

Stainless steel case with a fixed bezel for added durability, style and comfort
Leather straps with a tang clasp for comfort and style
Black dial with cat logo on the 12 hour mark
Date aperture at the 3 hour mark
Analog time display
Solid case back made of stainless steel for enhanced durability
Water resistant upto 100 metres

', + }, + { + _id: '2', + price: 5450, + productDisplayName: 'Puma Men Top Fluctuation Red Black Watches', + brandName: 'Puma', + ageGroup: 'Adults-Men', + gender: 'Men', + masterCategory: 'Accessories', + subCategory: 'Watches', + imageURL: 'images/11001.jpg', + productDescription: + '

This watch from puma comes in a clean sleek design. This active watch is perfect for urban wear and can serve you well in the gym or a night of clubbing.

Case diameter
: 40 mm<

', + }, + + { + _id: '3', + price: 499, + productDisplayName: 'Inkfruit Women Behind Cream Tshirts', + brandName: 'Inkfruit', + ageGroup: 'Adults-Women', + gender: 'Women', + masterCategory: 'Apparel', + subCategory: 'Topwear', + imageURL: 'images/11008.jpg', + productDescription: + '

Composition
Yellow round neck t-shirt made of 100% cotton, has short sleeves and graphic print on the front

Fitting
Comfort

Wash care
Hand wash separately in cool water at 30 degrees
Do not scrub
Do not bleach
Turn inside out and dry flat in shade
Warm iron on reverse
Do not iron on print

Flaunt your pretty, long legs in style with this inkfruit t-shirt. The graphic print oozes sensuality, while the cotton fabric keeps you fresh and comfortable all day. Team this with a short denim skirt and high-heeled sandals and get behind the wheel in style.

Model statistics
The model wears size M in t-shirts
Height: 5\'7", Chest: 33"

', + }, +]; +``` + +Below is the sample code to seed `products` data as JSON in Redis. The data also includes vectors of both product descriptions and images. + +```js title="src/index.ts" +async function addProductWithEmbeddings(_products) { + const nodeRedisClient = getNodeRedisClient(); + + if (_products && _products.length) { + for (let product of _products) { + console.log( + `generating description embeddings for product ${product._id}`, + ); + const sentenceEmbedding = await generateSentenceEmbeddings( + product.productDescription, + ); + product['productDescriptionEmbeddings'] = sentenceEmbedding; + + console.log(`generating image embeddings for product ${product._id}`); + const imageEmbedding = await generateImageEmbeddings(product.imageURL); + product['productImageEmbeddings'] = imageEmbedding; + + await nodeRedisClient.json.set(`products:${product._id}`, '$', { + ...product, + }); + console.log(`product ${product._id} added to redis`); + } + } +} +``` + +You can observe products JSON data in RedisInsight: + +![products data in RedisInsight](./images/products-data-gui.png) + +:::tip + +Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to visually explore your Redis data or to engage with raw Redis commands in the workbench. + +::: + +### Create vector index + +For searches to be conducted on JSON fields in Redis, they must be indexed. The methodology below highlights the process of indexing different types of fields. This encompasses vector fields such as `productDescriptionEmbeddings` and `productImageEmbeddings`. + +```ts title="src/redis-index.ts" +import { + createClient, + SchemaFieldTypes, + VectorAlgorithms, + RediSearchSchema, +} from 'redis'; + +const PRODUCTS_KEY_PREFIX = 'products'; +const PRODUCTS_INDEX_KEY = 'idx:products'; +const REDIS_URI = 'redis://localhost:6379'; +let nodeRedisClient = null; + +const getNodeRedisClient = async () => { + if (!nodeRedisClient) { + nodeRedisClient = createClient({ url: REDIS_URI }); + await nodeRedisClient.connect(); + } + return nodeRedisClient; +}; + +const createRedisIndex = async () => { + /* (RAW COMMAND) + FT.CREATE idx:products + ON JSON + PREFIX 1 "products:" + SCHEMA + "$.productDisplayName" as productDisplayName TEXT NOSTEM SORTABLE + "$.brandName" as brandName TEXT NOSTEM SORTABLE + "$.price" as price NUMERIC SORTABLE + "$.masterCategory" as "masterCategory" TAG + "$.subCategory" as subCategory TAG + "$.productDescriptionEmbeddings" as productDescriptionEmbeddings VECTOR "FLAT" 10 + "TYPE" FLOAT32 + "DIM" 768 + "DISTANCE_METRIC" "L2" + "INITIAL_CAP" 111 + "BLOCK_SIZE" 111 + "$.productDescription" as productDescription TEXT NOSTEM SORTABLE + "$.imageURL" as imageURL TEXT NOSTEM + "$.productImageEmbeddings" as productImageEmbeddings VECTOR "HNSW" 8 + "TYPE" FLOAT32 + "DIM" 1024 + "DISTANCE_METRIC" "COSINE" + "INITIAL_CAP" 111 + + */ + const nodeRedisClient = await getNodeRedisClient(); + + const schema: RediSearchSchema = { + '$.productDisplayName': { + type: SchemaFieldTypes.TEXT, + NOSTEM: true, + SORTABLE: true, + AS: 'productDisplayName', + }, + '$.brandName': { + type: SchemaFieldTypes.TEXT, + NOSTEM: true, + SORTABLE: true, + AS: 'brandName', + }, + '$.price': { + type: SchemaFieldTypes.NUMERIC, + SORTABLE: true, + AS: 'price', + }, + '$.masterCategory': { + type: SchemaFieldTypes.TAG, + AS: 'masterCategory', + }, + '$.subCategory': { + type: SchemaFieldTypes.TAG, + AS: 'subCategory', + }, + '$.productDescriptionEmbeddings': { + type: SchemaFieldTypes.VECTOR, + TYPE: 'FLOAT32', + ALGORITHM: 
VectorAlgorithms.FLAT, + DIM: 768, + DISTANCE_METRIC: 'L2', + INITIAL_CAP: 111, + BLOCK_SIZE: 111, + AS: 'productDescriptionEmbeddings', + }, + '$.productDescription': { + type: SchemaFieldTypes.TEXT, + NOSTEM: true, + SORTABLE: true, + AS: 'productDescription', + }, + '$.imageURL': { + type: SchemaFieldTypes.TEXT, + NOSTEM: true, + AS: 'imageURL', + }, + '$.productImageEmbeddings': { + type: SchemaFieldTypes.VECTOR, + TYPE: 'FLOAT32', + ALGORITHM: VectorAlgorithms.HNSW, //Hierarchical Navigable Small World graphs + DIM: 1024, + DISTANCE_METRIC: 'COSINE', + INITIAL_CAP: 111, + AS: 'productImageEmbeddings', + }, + }; + console.log(`index ${PRODUCTS_INDEX_KEY} created`); + + try { + await nodeRedisClient.ft.dropIndex(PRODUCTS_INDEX_KEY); + } catch (indexErr) { + console.error(indexErr); + } + await nodeRedisClient.ft.create(PRODUCTS_INDEX_KEY, schema, { + ON: 'JSON', + PREFIX: PRODUCTS_KEY_PREFIX, + }); +}; +``` + +:::info FLAT VS HNSW indexing + +FLAT: When vectors are indexed in a "FLAT" structure, they're stored in their original form without any added hierarchy. A search against a FLAT index will require the algorithm to scan each vector linearly to find the most similar matches. While this is accurate, it's computationally intensive and slower, making it ideal for smaller datasets. + +HNSW (Hierarchical Navigable Small World): HNSW is a graph-centric method tailored for indexing high-dimensional data. With larger datasets, linear comparisons against every vector in the index become time-consuming. HNSW employs a probabilistic approach, ensuring faster search results but with a slight trade-off in accuracy. + +::: + +:::info INITIAL_CAP and BLOCK_SIZE parameters + +Both INITIAL_CAP and BLOCK_SIZE are configuration parameters that control how vectors are stored and indexed. + +INITIAL_CAP defines the initial capacity of the vector index. It helps in pre-allocating space for the index. + +BLOCK_SIZE defines the size of each block of the vector index. As more vectors are added, Redis will allocate memory in chunks, with each chunk being the size of the BLOCK_SIZE. It helps in optimizing the memory allocations during index growth. + +::: + +## What is vector KNN query? + +KNN, or k-Nearest Neighbors, is an algorithm used in both classification and regression tasks, but when referring to "KNN Search," we're typically discussing the task of finding the "k" points in a dataset that are closest (most similar) to a given query point. In the context of vector search, this means identifying the "k" vectors in our database that are most similar to a given query vector, usually based on some distance metric like cosine similarity or Euclidean distance. + +### Vector KNN query with Redis + +Redis allows you to index and then search for vectors [using the KNN approach](https://redis.io/docs/interact/search-and-query/search/vectors/). + +Below, you'll find a Node.js code snippet that illustrates how to perform `KNN query` for any provided `search text`: + +```ts title="src/knn-query.ts" +const float32Buffer = (arr) => { + const floatArray = new Float32Array(arr); + const float32Buffer = Buffer.from(floatArray.buffer); + return float32Buffer; +}; +const queryProductDescriptionEmbeddingsByKNN = async ( + _searchTxt, + _resultCount, +) => { + //A KNN query will give us the top n documents that best match the query vector. 
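+  // Note: the query vector is passed to Redis as a binary blob of FLOAT32 values
+  // (built with the float32Buffer helper above), and vector search queries
+  // require query DIALECT 2; both appear in the PARAMS and DIALECT options below.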
+ + /* sample raw query + + FT.SEARCH idx:products + "*=>[KNN 5 @productDescriptionEmbeddings $searchBlob AS score]" + RETURN 4 score brandName productDisplayName imageURL + SORTBY score + PARAMS 2 searchBlob "6\xf7\..." + DIALECT 2 + + */ + //https://redis.io/docs/interact/search-and-query/query/ + + console.log(`queryProductDescriptionEmbeddingsByKNN started`); + let results = {}; + if (_searchTxt) { + _resultCount = _resultCount ?? 5; + + const nodeRedisClient = getNodeRedisClient(); + const searchTxtVectorArr = await generateSentenceEmbeddings(_searchTxt); + + const searchQuery = `*=>[KNN ${_resultCount} @productDescriptionEmbeddings $searchBlob AS score]`; + + results = await nodeRedisClient.ft.search(PRODUCTS_INDEX_KEY, searchQuery, { + PARAMS: { + searchBlob: float32Buffer(searchTxtVectorArr), + }, + RETURN: ['score', 'brandName', 'productDisplayName', 'imageURL'], + SORTBY: { + BY: 'score', + // DIRECTION: "DESC" + }, + DIALECT: 2, + }); + } else { + throw 'Search text cannot be empty'; + } + + return results; +}; +``` + +Please find output for a KNN query in Redis **(A lower score or distance in the output signifies a higher degree of similarity.)** + +```js title="sample output" +const result = await queryProductDescriptionEmbeddingsByKNN( + 'Puma watch with cat', //search text + 3, //max number of results expected +); +console.log(JSON.stringify(result, null, 4)); + +/* +{ + "total": 3, + "documents": [ + { + "id": "products:1", + "value": { + "score": "0.762174725533", + "brandName": "Puma", + "productDisplayName": "Puma Men Race Black Watch", + "imageURL": "images/11002.jpg" + } + }, + { + "id": "products:2", + "value": { + "score": "0.825711071491", + "brandName": "Puma", + "productDisplayName": "Puma Men Top Fluctuation Red Black Watches", + "imageURL": "images/11001.jpg" + } + }, + { + "id": "products:3", + "value": { + "score": "1.79949247837", + "brandName": "Inkfruit", + "productDisplayName": "Inkfruit Women Behind Cream Tshirts", + "imageURL": "images/11008.jpg" + } + } + ] +} +*/ +``` + +:::note +KNN queries can be combined with standard Redis search functionalities using [Hybrid queries](https://redis.io/docs/interact/search-and-query/search/vectors/#hybrid-knn-queries). +::: + +## What is vector range query? + +Range queries retrieve data that falls within a specified range of values. +For vectors, a "range query" typically refers to retrieving all vectors within a certain distance of a target vector. The "range" in this context is a radius in the vector space. + +### Vector range query with Redis + +Below, you'll find a Node.js code snippet that illustrates how to perform vector `range query` for any range (radius/ distance)provided: + +```js title="src/range-query.ts" +const queryProductDescriptionEmbeddingsByRange = async (_searchTxt, _range) => { + /* sample raw query + + FT.SEARCH idx:products + "@productDescriptionEmbeddings:[VECTOR_RANGE $searchRange $searchBlob]=>{$YIELD_DISTANCE_AS: score}" + RETURN 4 score brandName productDisplayName imageURL + SORTBY score + PARAMS 4 searchRange 0.685 searchBlob "A=\xe1\xbb\x8a\xad\x...." + DIALECT 2 + */ + + console.log(`queryProductDescriptionEmbeddingsByRange started`); + let results = {}; + if (_searchTxt) { + _range = _range ?? 
1.0; + + const nodeRedisClient = getNodeRedisClient(); + + const searchTxtVectorArr = await generateSentenceEmbeddings(_searchTxt); + + const searchQuery = + '@productDescriptionEmbeddings:[VECTOR_RANGE $searchRange $searchBlob]=>{$YIELD_DISTANCE_AS: score}'; + + results = await nodeRedisClient.ft.search(PRODUCTS_INDEX_KEY, searchQuery, { + PARAMS: { + searchBlob: float32Buffer(searchTxtVectorArr), + searchRange: _range, + }, + RETURN: ['score', 'brandName', 'productDisplayName', 'imageURL'], + SORTBY: { + BY: 'score', + // DIRECTION: "DESC" + }, + DIALECT: 2, + }); + } else { + throw 'Search text cannot be empty'; + } + + return results; +}; +``` + +Please find output for a range query in Redis + +```js title="sample output" +const result2 = await queryProductDescriptionEmbeddingsByRange( + 'Puma watch with cat', //search text + 1.0, //with in range or distance +); +console.log(JSON.stringify(result2, null, 4)); +/* +{ + "total": 2, + "documents": [ + { + "id": "products:1", + "value": { + "score": "0.762174725533", + "brandName": "Puma", + "productDisplayName": "Puma Men Race Black Watch", + "imageURL": "images/11002.jpg" + } + }, + { + "id": "products:2", + "value": { + "score": "0.825711071491", + "brandName": "Puma", + "productDisplayName": "Puma Men Top Fluctuation Red Black Watches", + "imageURL": "images/11001.jpg" + } + } + ] +} +*/ +``` + +## Image vs text vector query + +:::info Image vs text vector query +The syntax for vector KNN/ range queries is consistent, regardless of whether you're working with image vectors or text vectors. Just as there's a method for text vector queries named `queryProductDescriptionEmbeddingsByKNN`, there's a corresponding method for images titled `queryProductImageEmbeddingsByKNN` in the code base. +::: + +:::tip GITHUB CODE + +Below is a command to the clone the source code used in this tutorial + +git clone https://github.com/redis-developer/redis-vector-nodejs-solutions.git +::: + +Hopefully this tutorial has helped you visualize how to use Redis for vector similarity search. + +--- + +(Optional) If you want to also understand the math behind vector similarity search , then read following + +## How to calculate vector similarity? + +Several techniques are available to assess vector similarity, with some of the most prevalent ones being: + +### Euclidean Distance (L2 norm) + +**Euclidean Distance (L2 norm)** calculates the linear distance between two points within a multi-dimensional space. Lower values indicate closer proximity, and hence higher similarity. + +EuclideanDistanceFormulaImage + +For illustration purposes, let's assess `product 1` and `product 2` from the earlier ecommerce dataset and determine the `Euclidean Distance` considering all features. + +EuclideanDistanceSampleImage + +As an example, we will use a 2D chart made with [chart.js](https://www.chartjs.org/) comparing the `Price vs. Quality` features of our products, focusing solely on these two attributes to compute the `Euclidean Distance`. + +![chart](./images/euclidean-distance-chart.png) + +### Cosine Similarity + +**Cosine Similarity** measures the cosine of the angle between two vectors. The cosine similarity value ranges between -1 and 1. A value closer to 1 implies a smaller angle and higher similarity, while a value closer to -1 implies a larger angle and lower similarity. Cosine similarity is particularly popular in NLP when dealing with text vectors. 
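+
+To make these metrics concrete, here is a minimal JavaScript sketch (an illustrative example, not part of the tutorial's source code) that computes both the Euclidean distance and the cosine similarity for product 1 and product 2, using the full `[price, quality, popularity]` vectors from the products table earlier in this tutorial:
+
+```js
+// Product vectors from the earlier table: [price, quality, popularity]
+const product1 = [150, 5, 8];
+const product2 = [180, 7, 6];
+
+// Euclidean distance (L2 norm): straight-line distance between the two points
+function euclideanDistance(a, b) {
+  return Math.sqrt(a.reduce((sum, ai, i) => sum + (ai - b[i]) ** 2, 0));
+}
+
+// Cosine similarity: dot product divided by the product of the vector magnitudes
+function cosineSimilarity(a, b) {
+  const dot = a.reduce((sum, ai, i) => sum + ai * b[i], 0);
+  const magA = Math.sqrt(a.reduce((sum, ai) => sum + ai * ai, 0));
+  const magB = Math.sqrt(b.reduce((sum, bi) => sum + bi * bi, 0));
+  return dot / (magA * magB);
+}
+
+console.log(euclideanDistance(product1, product2)); // ≈ 30.13 (lower = more similar)
+console.log(cosineSimilarity(product1, product2)); // ≈ 0.9998 (closer to 1 = more similar)
+```
+
+The `dot` value computed inside `cosineSimilarity` is also the inner product covered in the next section.
+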
+ +CosineFormulaImage + +:::note +If two vectors are pointing in the same direction, the cosine of the angle between them is 1. If they're orthogonal, the cosine is 0, and if they're pointing in opposite directions, the cosine is -1. +::: + +Again, consider `product 1` and `product 2` from the previous dataset and calculate the `Cosine Distance` for all features. + +![sample](./images/cosine-sample.png) + +Using [chart.js](https://www.chartjs.org/), we've crafted a 2D chart of `Price vs. Quality` features. It visualizes the `Cosine Similarity` solely based on these attributes. + +![chart](./images/cosine-chart.png) + +### Inner Product + +**Inner Product (dot product)** The inner product (or dot product) isn't a distance metric in the traditional sense but can be used to calculate similarity, especially when vectors are normalized (have a magnitude of 1). It's the sum of the products of the corresponding entries of the two sequences of numbers. + +IpFormulaImage + +:::note +The inner product can be thought of as a measure of how much two vectors "align" +in a given vector space. Higher values indicate higher similarity. However, the raw +values can be large for long vectors; hence, normalization is recommended for better +interpretation. If the vectors are normalized, their dot product will be `1 if they are identical` and `0 if they are orthogonal` (uncorrelated). +::: + +Considering our `product 1` and `product 2`, let's compute the `Inner Product` across all features. + +![sample](./images/ip-sample.png) + +:::tip +Vectors can also be stored in databases in **binary formats** to save space. In practical applications, it's crucial to strike a balance between the dimensionality of the vectors (which impacts storage and computational costs) and the quality or granularity of the information they capture. 
+::: + +## Further reading + +- [Vector search in Redis 7.2](https://redis.com/blog/introducing-redis-7-2/) + +- [Redis VSS getting started](https://github.com/redis-developer/redis-ai-resources/tree/main/python-recipes/vector-search) +- [Redis vector use cases](https://redis.com/solutions/use-cases/vector-database/) +- [Vector query docs](https://redis.io/docs/interact/search-and-query/search/vectors/) diff --git a/docs/howtos/solutions/vector/image-summary-search/images/01-ui-settings.png b/docs/howtos/solutions/vector/image-summary-search/images/01-ui-settings.png new file mode 100644 index 00000000000..e8460e19d8e Binary files /dev/null and b/docs/howtos/solutions/vector/image-summary-search/images/01-ui-settings.png differ diff --git a/docs/howtos/solutions/vector/image-summary-search/images/02-ui-image-search.png b/docs/howtos/solutions/vector/image-summary-search/images/02-ui-image-search.png new file mode 100644 index 00000000000..82907b0fd46 Binary files /dev/null and b/docs/howtos/solutions/vector/image-summary-search/images/02-ui-image-search.png differ diff --git a/docs/howtos/solutions/vector/image-summary-search/images/03-ui-toggle-image-summary.jpg b/docs/howtos/solutions/vector/image-summary-search/images/03-ui-toggle-image-summary.jpg new file mode 100644 index 00000000000..84a384e42ea Binary files /dev/null and b/docs/howtos/solutions/vector/image-summary-search/images/03-ui-toggle-image-summary.jpg differ diff --git a/docs/howtos/solutions/vector/image-summary-search/images/product-img.webp b/docs/howtos/solutions/vector/image-summary-search/images/product-img.webp new file mode 100644 index 00000000000..5c2de3f1498 Binary files /dev/null and b/docs/howtos/solutions/vector/image-summary-search/images/product-img.webp differ diff --git a/docs/howtos/solutions/vector/image-summary-search/images/redis-insight-ai-image.png b/docs/howtos/solutions/vector/image-summary-search/images/redis-insight-ai-image.png new file mode 100644 index 00000000000..c665bbb6d35 Binary files /dev/null and b/docs/howtos/solutions/vector/image-summary-search/images/redis-insight-ai-image.png differ diff --git a/docs/howtos/solutions/vector/image-summary-search/index-image-summary-search.mdx b/docs/howtos/solutions/vector/image-summary-search/index-image-summary-search.mdx new file mode 100644 index 00000000000..931e4da7078 --- /dev/null +++ b/docs/howtos/solutions/vector/image-summary-search/index-image-summary-search.mdx @@ -0,0 +1,458 @@ +--- +id: index-image-summary-search +title: Semantic Image Based Queries Using LangChain (OpenAI) and Redis +sidebar_label: Semantic Image Based Queries Using LangChain (OpenAI) and Redis +slug: /howtos/solutions/vector/image-summary-search +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import InitialMicroservicesArchitecture from '../../microservices/common-data/microservices-arch.mdx'; +import MicroservicesEcommerceAIDesign from '../common-ai/microservices-ecommerce-ai.mdx'; +import sampleProductImage from './images/product-img.webp'; +import SourceCode from '../common-ai/microservices-source-code-ai.mdx'; + + + +## What you will learn in this tutorial + +This tutorial demonstrates how to perform semantic search on product images using LangChain (OpenAI) and Redis. 
Specifically, we'll cover the following topics: + +- **E-Commerce Application Context** : Consider a sample e-commerce application scenario where customers can utilize image-based queries for product searches, add items to their shopping cart, and complete purchases, thereby highlighting a real-world application of semantic search. + +- **Database setup** : This involves generating descriptive summaries for product images, creating semantic embeddings for generated summaries and efficiently storing them in Redis. + +- **Setting up the search API** : This API is designed to process user queries in the context of image content. It integrates the capabilities of OpenAI for semantic analysis with Redis for efficient data retrieval and storage. + +## Terminology + +**[LangChain](https://js.langchain.com)** is an innovative library for building language model applications. It offers a structured way to combine different components like language models (e.g., OpenAI's models), storage solutions (like Redis), and custom logic. This modular approach facilitates the creation of sophisticated AI applications. + +**[OpenAI](https://openai.com/)** provides advanced language models like GPT-3, which have revolutionized the field with their ability to understand and generate human-like text. These models form the backbone of many modern AI applications including semantic text/ image search and chatbots. + +## Microservices architecture for an e-commerce application + + + + + +## E-commerce application frontend using Next.js and Tailwind + + + +## Database setup + +:::info +Sign up for an [OpenAI account](https://platform.openai.com/) to get your API key to be used in the demo (add OPEN_AI_API_KEY variable in .env file). You can also refer to the [OpenAI API documentation](https://platform.openai.com/docs/api-reference/introduction) for more information. +::: + + + +### Sample data + +In this tutorial, we'll use a simplified e-commerce dataset. Specifically, our JSON structure includes `product` details and a key named `styleImages_default_imageURL`, which links to an image of the product. This image will be the focus of our AI-driven semantic search. + +```ts title="database/fashion-dataset/001/products/*.json" +const products = [ + { + productId: '11000', + price: 3995, + productDisplayName: 'Puma Men Slick 3HD Yellow Black Watches', + variantName: 'Slick 3HD Yellow', + brandName: 'Puma', + // Additional product details... + styleImages_default_imageURL: + 'http://host.docker.internal:8080/images/11000.jpg', + // Other properties... + }, + // Additional products... +]; +``` + +### Generating OpenAI image summary + +The following code segment outlines the process of generating a text summary for a product image using OpenAI's capabilities. We'll first convert the image URL to a base64 string using `fetchImageAndConvertToBase64` function and then utilize OpenAI to generate a summary of the image using `getOpenAIImageSummary` function. 
+ +```ts title="database/src/open-ai-image.ts" +import { + ChatOpenAI, + ChatOpenAICallOptions, +} from 'langchain/chat_models/openai'; +import { HumanMessage } from 'langchain/schema'; +import { Document } from 'langchain/document'; +import { OpenAIEmbeddings } from 'langchain/embeddings/openai'; +import { RedisVectorStore } from 'langchain/vectorstores/redis'; + +let llm: ChatOpenAI; + +// Instantiates the LangChain ChatOpenAI instance +const getOpenAIVisionInstance = (_openAIApiKey: string) => { + //OpenAI supports images with text in input messages with their gpt-4-vision-preview. + if (!llm) { + llm = new ChatOpenAI({ + openAIApiKey: _openAIApiKey, + modelName: 'gpt-4-vision-preview', + maxTokens: 1024, + }); + } + return llm; +}; + +const fetchImageAndConvertToBase64 = async (_imageURL: string) => { + let base64Image = ''; + try { + const response = await axios.get(_imageURL, { + responseType: 'arraybuffer', + }); + // Convert image to Base64 + base64Image = Buffer.from(response.data, 'binary').toString('base64'); + } catch (error) { + console.error( + `Error fetching or converting the image: ${_imageURL}`, + error, + ); + } + return base64Image; +}; + +// Generates an OpenAI summary for a given base64 image string +const getOpenAIImageSummary = async ( + _openAIApiKey: string, + _base64Image: string, + _product: Prisma.ProductCreateInput, +) => { + /* + Reference : https://js.langchain.com/docs/integrations/chat/openai#multimodal-messages + + - This function utilizes OpenAI's multimodal capabilities to generate a summary from the image. + - It constructs a prompt that combines the product description with the image. + - OpenAI's vision model then processes this prompt to generate a detailed summary. + + */ + let imageSummary = ''; + + try { + if (_openAIApiKey && _base64Image && _product) { + const llmInst = getOpenAIVisionInstance(_openAIApiKey); + + const text = `Below are the product details and image of an e-commerce product for reference. Please conduct and provide a comprehensive analysis of the product depicted in the image . + + Product Details: + ${_product.productDescriptors_description_value} + + Image: + `; + // Constructing a multimodal message combining text and image + const imagePromptMessage = new HumanMessage({ + content: [ + { + type: 'text', + text: text, + }, + { + type: 'image_url', + image_url: { + url: `data:image/jpeg;base64,${_base64Image}`, + detail: 'high', // low, high (if you want more detail) + }, + }, + ], + }); + + // Invoking the LangChain ChatOpenAI model with the constructed message + const response = await llmInst.invoke([imagePromptMessage]); + if (response?.content) { + imageSummary = response.content; + } + } + } catch (err) { + console.log( + `Error generating OpenAIImageSummary for product id ${_product.productId}`, + err, + ); + } + return imageSummary; +}; +``` + +### Sample image & OpenAI summary + +The following section demonstrates the result of the above process. We'll use the image of a Puma T-shirt and generate a summary using OpenAI's capabilities. + +Sample Product Image + +Comprehensive summary generated by the OpenAI model is as follows: + +```txt +This product is a black round neck T-shirt featuring a design consistent with the Puma brand aesthetic, which includes their iconic leaping cat logo in a contrasting yellow color placed prominently across the chest area. The T-shirt is made from 100% cotton, suggesting it is likely to be breathable and soft to the touch. 
It has a classic short-sleeve design with a ribbed neckline for added texture and durability. There is also mention of a vented hem, which may offer additional comfort and mobility. + +The T-shirt is described to have a 'comfort' fit, which typically means it is designed to be neither too tight nor too loose, allowing for ease of movement without being baggy. This could be ideal for casual wear or active use. + +Care instructions are also comprehensive, advising a gentle machine wash with similar colors in cool water at 30 degrees Celsius, indicating it is relatively easy to care for. However, one should avoid bleaching, tumble drying, and dry cleaning it, but a warm iron is permissible. + +Looking at the image provided: + +- The T-shirt appears to fit the model well, in accordance with the described 'comfort' fit. +- The color contrast between the T-shirt and the graphic gives the garment a modern, sporty look. +- The model is paired with denim jeans, showcasing the T-shirt's versatility for casual occasions. However, the product description suggests it can be part of an athletic ensemble when combined with Puma shorts and shoes. +- Considering the model's statistics, prospective buyers could infer how this T-shirt might fit on a person with similar measurements. + +Overall, the T-shirt is positioned as a versatile item suitable for both lifestyle and sporting activities, with a strong brand identity through the graphic, and is likely comfortable and easy to maintain based on the product details provided. +``` + +### Seeding Image summary embeddings + +The `addImageSummaryEmbeddingsToRedis` function plays a critical role in integrating AI-generated image summaries with Redis. This process involves two main steps: + +1. **Generating Vector Documents**: Utilizing the `getImageSummaryVectorDocuments` function, we transform image summaries into vector documents. This transformation is crucial as it converts textual summaries into a format suitable for Redis storage. + +1. **Seeding Embeddings into Redis**: The `seedImageSummaryEmbeddings` function is then employed to store these vector documents into Redis. This step is essential for enabling efficient retrieval and search capabilities within the Redis database. 
+ +```ts +// Function to generate vector documents from image summaries +const getImageSummaryVectorDocuments = async ( + _products: Prisma.ProductCreateInput[], + _openAIApiKey: string, +) => { + const vectorDocs: Document[] = []; + + if (_products?.length > 0) { + let count = 1; + for (let product of _products) { + if (product) { + let imageURL = product.styleImages_default_imageURL; //cdn url + const imageData = await fetchImageAndConvertToBase64(imageURL); + imageSummary = await getOpenAIImageSummary( + _openAIApiKey, + imageData, + product, + ); + console.log( + `openAI imageSummary #${count++} generated for product id: ${ + product.productId + }`, + ); + + if (imageSummary) { + let doc = new Document({ + metadata: { + productId: product.productId, + imageURL: imageURL, + }, + pageContent: imageSummary, + }); + vectorDocs.push(doc); + } + } + } + } + return vectorDocs; +}; + +// Seeding vector documents into Redis +const seedImageSummaryEmbeddings = async ( + vectorDocs: Document[], + _redisClient: NodeRedisClientType, + _openAIApiKey: string, +) => { + if (vectorDocs?.length && _redisClient && _openAIApiKey) { + const embeddings = new OpenAIEmbeddings({ + openAIApiKey: _openAIApiKey, + }); + const vectorStore = await RedisVectorStore.fromDocuments( + vectorDocs, + embeddings, + { + redisClient: _redisClient, + indexName: 'openAIProductImgIdx', + keyPrefix: 'openAIProductImgText:', + }, + ); + console.log('seeding imageSummaryEmbeddings completed'); + } +}; + +const addImageSummaryEmbeddingsToRedis = async ( + _products: Prisma.ProductCreateInput[], + _redisClient: NodeRedisClientType, + _openAIApiKey: string, +) => { + const vectorDocs = await getImageSummaryVectorDocuments( + _products, + _openAIApiKey, + ); + + await seedImageSummaryEmbeddings(vectorDocs, _redisClient, _openAIApiKey); +}; +``` + +The image below shows the JSON structure of **openAI image summary** within RedisInsight. +![Redis Insight AI products](./images/redis-insight-ai-image.png) + +:::tip + +Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to visually explore your Redis data or to engage with raw Redis commands in the workbench. + +::: + +## Setting up the search API + +### API end point + +This section covers the API request and response structure for `getProductsByVSSImageSummary`, which is essential for retrieving products based on semantic search using image summaries. + +**Request Format** + +The example request format for the API is as follows: + +```json +POST http://localhost:3000/products/getProductsByVSSImageSummary +{ + "searchText":"Left chest nike logo", + + //optional + "maxProductCount": 4, // 2 (default) + "similarityScoreLimit":0.2, // 0.2 (default) +} +``` + +**Response Structure** + +The response from the API is a JSON object containing an array of product details that match the semantic search criteria: + +```json +{ + "data": [ + { + "productId": "10017", + "price": 3995, + "productDisplayName": "Nike Women As The Windru Blue Jackets", + "brandName": "Nike", + "styleImages_default_imageURL": "http://host.docker.internal:8080/products/01/10017/product-img.webp", + "productDescriptors_description_value": " Blue and White jacket made of 100% polyester, with an interior pocket ...", + "stockQty": 25, + "similarityScore": 0.163541972637, + "imageSummary": "The product in the image is a blue and white jacket featuring a design consistent with the provided description. ..." + } + // Additional products... 
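+      // Note: lower similarityScore values indicate closer semantic matches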
+ ], + "error": null, + "auth": "SES_fd57d7f4-3deb-418f-9a95-6749cd06e348" +} +``` + +### API implementation + +The backend implementation of this API involves following steps: + +1. `getProductsByVSSImageSummary` function handles the API Request. +1. `getSimilarProductsScoreByVSSImageSummary` function performs semantic search on image summaries. It integrates with `OpenAI's` semantic analysis capabilities to interpret the searchText and identify relevant products from `Redis` vector store. + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +const getSimilarProductsScoreByVSSImageSummary = async ( + _params: IParamsGetProductsByVSS, +) => { + let { + standAloneQuestion, + openAIApiKey, + + //optional + KNN, + scoreLimit, + } = _params; + + let vectorDocs: Document[] = []; + const client = getNodeRedisClient(); + + KNN = KNN || 2; + scoreLimit = scoreLimit || 1; + + const embeddings = new OpenAIEmbeddings({ + openAIApiKey: openAIApiKey, + }); + + // create vector store + const vectorStore = new RedisVectorStore(embeddings, { + redisClient: client, + indexName: 'openAIProductImgIdx', + keyPrefix: 'openAIProductImgText:', + }); + + // search for similar products + const vectorDocsWithScore = await vectorStore.similaritySearchWithScore( + standAloneQuestion, + KNN, + ); + + // filter by scoreLimit + for (let [doc, score] of vectorDocsWithScore) { + if (score <= scoreLimit) { + doc['similarityScore'] = score; + vectorDocs.push(doc); + } + } + + return vectorDocs; +}; +``` + +```ts title="server/src/services/products/src/service-impl.ts" +const getProductsByVSSImageSummary = async ( + productsVSSFilter: IProductsVSSBodyFilter, +) => { + let { searchText, maxProductCount, similarityScoreLimit } = productsVSSFilter; + let products: IProduct[] = []; + + const openAIApiKey = process.env.OPEN_AI_API_KEY || ''; + maxProductCount = maxProductCount || 2; + similarityScoreLimit = similarityScoreLimit || 0.2; + + //VSS search + const vectorDocs = await getSimilarProductsScoreByVSSImageSummary({ + standAloneQuestion: searchText, + openAIApiKey: openAIApiKey, + KNN: maxProductCount, + scoreLimit: similarityScoreLimit, + }); + + if (vectorDocs?.length) { + const productIds = vectorDocs.map((doc) => doc?.metadata?.productId); + + //get product with details + products = await getProductByIds(productIds, true); + } + + //... + + return products; +}; +``` + +### Frontend UI + +- **Settings configuration**: Initially, ensure that the `Semantic image summary search` option is enabled in the settings page. + ![settings page](./images/01-ui-settings.png) + +- **Performing a search**: On the dashboard page, users can conduct searches using image-based queries. For example, if the query is `Left chest nike logo`, the search results will display products like a Nike jacket, characterized by a logo on its left chest, reflecting the query. + + ![search page](./images/02-ui-image-search.png) + +- **Viewing image summaries**: Users can click on any product image to view the corresponding image summary generated by OpenAI. This feature offers an insightful glimpse into how AI interprets and summarizes visual content. + ![toggle image summary](./images/03-ui-toggle-image-summary.jpg) + +## Ready to use Redis for semantic image based queries? + +Performing semantic search on image summaries is a powerful tool for e-commerce applications. It allows users to search for products based on their descriptions or images, enabling a more intuitive and efficient shopping experience. 
This tutorial has demonstrated how to integrate OpenAI's semantic analysis capabilities with Redis to create a robust search engine for e-commerce applications. + +## Further reading + +- [Perform vector similarity search using Redis](/howtos/solutions/vector/getting-started-vector) + +- [LangChain JS](https://js.langchain.com/docs/get_started/quickstart) + - [Learn LangChain](https://scrimba.com/learn/langchain) +- [LangChain redis integration](https://js.langchain.com/docs/integrations/vectorstores/redis) diff --git a/docs/howtos/solutions/vector/semantic-text-search/images/01-ui-settings2.png b/docs/howtos/solutions/vector/semantic-text-search/images/01-ui-settings2.png new file mode 100644 index 00000000000..927cbe19337 Binary files /dev/null and b/docs/howtos/solutions/vector/semantic-text-search/images/01-ui-settings2.png differ diff --git a/docs/howtos/solutions/vector/semantic-text-search/images/02-ui-image-search2.png b/docs/howtos/solutions/vector/semantic-text-search/images/02-ui-image-search2.png new file mode 100644 index 00000000000..6e0bc20e139 Binary files /dev/null and b/docs/howtos/solutions/vector/semantic-text-search/images/02-ui-image-search2.png differ diff --git a/docs/howtos/solutions/vector/semantic-text-search/images/redis-insight-ai-product-details.png b/docs/howtos/solutions/vector/semantic-text-search/images/redis-insight-ai-product-details.png new file mode 100644 index 00000000000..51cd041a39f Binary files /dev/null and b/docs/howtos/solutions/vector/semantic-text-search/images/redis-insight-ai-product-details.png differ diff --git a/docs/howtos/solutions/vector/semantic-text-search/index-semantic-text-search.mdx b/docs/howtos/solutions/vector/semantic-text-search/index-semantic-text-search.mdx new file mode 100644 index 00000000000..9cb55b3f264 --- /dev/null +++ b/docs/howtos/solutions/vector/semantic-text-search/index-semantic-text-search.mdx @@ -0,0 +1,340 @@ +--- +id: index-semantic-text-search +title: Semantic Text Search Using LangChain (OpenAI) and Redis +sidebar_label: Semantic Text Search Using LangChain (OpenAI) and Redis +slug: /howtos/solutions/vector/semantic-text-search +authors: [prasan, will] +--- + +import Authors from '@theme/Authors'; +import InitialMicroservicesArchitecture from '../../microservices/common-data/microservices-arch.mdx'; +import MicroservicesEcommerceAIDesign from '../common-ai/microservices-ecommerce-ai.mdx'; +import SourceCode from '../common-ai/microservices-source-code-ai.mdx'; + + + +## What you will learn in this tutorial + +This tutorial explores the implementation of semantic text search in product descriptions using LangChain (OpenAI) and Redis. The focus areas include: + +- **Contextualizing E-Commerce**: Dive into an e-commerce scenario where semantic text search empowers users to find products through detailed textual queries. +- **Database Implementation**: Learn to create and store semantic embeddings from product descriptions in Redis for efficient search capabilities. +- **Search API Development**: Understand how to build an API that leverages OpenAI for semantic analysis of text and Redis for data management. + +## Terminology + +- **[LangChain](https://js.langchain.com)**: A versatile library for developing language model applications, combining language models, storage systems, and custom logic. +- **[OpenAI](https://openai.com/)**: A provider of cutting-edge language models like GPT-3, essential for applications in semantic search and conversational AI. 
+ +## Microservices architecture for an e-commerce application + + + + + +## E-commerce application frontend using Next.js and Tailwind + + + +## Database setup + +:::info +Sign up for an [OpenAI account](https://platform.openai.com/) to get your API key to be used in the demo (add OPEN_AI_API_KEY variable in .env file). You can also refer to the [OpenAI API documentation](https://platform.openai.com/docs/api-reference/introduction) for more information. +::: + + + +### Sample data + +Consider a simplified e-commerce dataset featuring product details for semantic search. + +```ts title="database/fashion-dataset/001/products/*.json" +const products = [ + { + productId: '11000', + price: 3995, + productDisplayName: 'Puma Men Slick 3HD Yellow Black Watches', + variantName: 'Slick 3HD Yellow', + brandName: 'Puma', + ageGroup: 'Adults-Men', + gender: 'Men', + displayCategories: 'Accessories', + masterCategory_typeName: 'Accessories', + subCategory_typeName: 'Watches', + styleImages_default_imageURL: + 'http://host.docker.internal:8080/images/11000.jpg', + productDescriptors_description_value: 'Stylish and comfortable, ...', + stockQty: 25, + }, + //... +]; +``` + +### Seeding product details embeddings + +Implement the `addEmbeddingsToRedis` function to integrate AI-generated product description embeddings with Redis. + +This process involves two main steps: + +1. **Generating Vector Documents**: Utilizing the `convertToVectorDocuments` function, we transform product details into vector documents. This transformation is crucial as it converts product details into a format suitable for Redis storage. + +1. **Seeding Embeddings into Redis**: The `seedOpenAIEmbeddings` function is then employed to store these vector documents into Redis. This step is essential for enabling efficient retrieval and search capabilities within the Redis database. + +```ts +import { Document } from 'langchain/document'; +import { OpenAIEmbeddings } from 'langchain/embeddings/openai'; +import { RedisVectorStore } from 'langchain/vectorstores/redis'; + +const convertToVectorDocuments = async ( + _products: Prisma.ProductCreateInput[], +) => { + const vectorDocs: Document[] = []; + + if (_products?.length > 0) { + for (let product of _products) { + let doc = new Document({ + metadata: { + productId: product.productId, + }, + pageContent: ` Product details are as follows: + productId: ${product.productId}. + + productDisplayName: ${product.productDisplayName}. + + price: ${product.price}. + + variantName: ${product.variantName}. + + brandName: ${product.brandName}. + + ageGroup: ${product.ageGroup}. + + gender: ${product.gender}. 
+ + productColors: ${product.productColors} + + Category: ${product.displayCategories}, ${product.masterCategory_typeName} - ${product.subCategory_typeName} + + productDescription: ${product.productDescriptors_description_value}`, + }); + + vectorDocs.push(doc); + } + } + return vectorDocs; +}; + +const seedOpenAIEmbeddings = async ( + vectorDocs: Document[], + _redisClient: NodeRedisClientType, + _openAIApiKey: string, +) => { + if (vectorDocs?.length && _redisClient && _openAIApiKey) { + console.log('openAIEmbeddings started !'); + + const embeddings = new OpenAIEmbeddings({ + openAIApiKey: _openAIApiKey, + }); + const vectorStore = await RedisVectorStore.fromDocuments( + vectorDocs, + embeddings, + { + redisClient: _redisClient, + indexName: 'openAIProductsIdx', + keyPrefix: 'openAIProducts:', + }, + ); + console.log('OpenAIEmbeddings completed'); + } +}; + +const addEmbeddingsToRedis = async ( + _products: Prisma.ProductCreateInput[], + _redisClient: NodeRedisClientType, + _openAIApiKey: string, + _huggingFaceApiKey?: string, +) => { + if (_products?.length > 0 && _redisClient && _openAIApiKey) { + const vectorDocs = await convertToVectorDocuments(_products); + + await seedOpenAIEmbeddings(vectorDocs, _redisClient, _openAIApiKey); + } +}; +``` + +Examine the structured `openAI product details` within Redis using RedisInsight. + +![Redis Insight AI products](./images/redis-insight-ai-product-details.png) + +:::tip + +Download [RedisInsight](https://redis.com/redis-enterprise/redis-insight/) to visually explore your Redis data or to engage with raw Redis commands in the workbench. + +::: + +## Setting up the search API + +### API end point + +This section covers the API request and response structure for `getProductsByVSSText`, which is essential for retrieving products based on semantic text search. + +**Request Format** + +The example request format for the API is as follows: + +```json +POST http://localhost:3000/products/getProductsByVSSText +{ + "searchText":"pure cotton blue shirts", + + //optional + "maxProductCount": 4, // 2 (default) + "similarityScoreLimit":0.2, // 0.2 (default) +} +``` + +**Response Structure** + +The response from the API is a JSON object containing an array of product details that match the semantic search criteria: + +```json +{ + "data": [ + { + "productId": "11031", + "price": 1099, + "productDisplayName": "Jealous 21 Women Check Blue Tops", + "productDescriptors_description_value": "Composition : Green and navy blue checked round neck blouson tunic top made of 100% cotton, has a full buttoned placket, three fourth sleeves with buttoned cuffs and a belt below the waist

Fitting
Regular

Wash care
Machine/hand wash separately in mild detergent
Do not bleach or wring
Dry in shade
Medium iron

If you're in the mood to have some checked fun, this blouson tunic top from jealous 21 will fulfil your heart's desire with élan. The cotton fabric promises comfort, while the smart checks guarantee unparalleled attention. Pair this top with leggings and ballerinas for a cute, neat look.

Model statistics
The model wears size M in tops
Height: 5'7\"; Chest: 33\"; Waist: 25\"

", + "stockQty": 25, + "productColors": "Blue,Green", + "similarityScore": 0.168704152107 + //... + } + ], + "error": null, + "auth": "SES_fd57d7f4-3deb-418f-9a95-6749cd06e348" +} +``` + +### API implementation + +The backend implementation of this API involves following steps: + +1. `getProductsByVSSText` function handles the API Request. +1. `getSimilarProductsScoreByVSS` function performs semantic search on product details. It integrates with `OpenAI's` semantic analysis capabilities to interpret the searchText and identify relevant products from `Redis` vector store. + +```ts title="server/src/services/products/src/open-ai-prompt.ts" +const getSimilarProductsScoreByVSS = async ( + _params: IParamsGetProductsByVSS, +) => { + let { + standAloneQuestion, + openAIApiKey, + + //optional + KNN, + scoreLimit, + } = _params; + + let vectorDocs: Document[] = []; + const client = getNodeRedisClient(); + + KNN = KNN || 2; + scoreLimit = scoreLimit || 1; + + let embeddings = new OpenAIEmbeddings({ + openAIApiKey: openAIApiKey, + }); + let indexName = 'openAIProductsIdx'; + let keyPrefix = 'openAIProducts:'; + + if (embeddings) { + // create vector store + const vectorStore = new RedisVectorStore(embeddings, { + redisClient: client, + indexName: indexName, + keyPrefix: keyPrefix, + }); + + // search for similar products + const vectorDocsWithScore = await vectorStore.similaritySearchWithScore( + standAloneQuestion, + KNN, + ); + + // filter by scoreLimit + for (let [doc, score] of vectorDocsWithScore) { + if (score <= scoreLimit) { + doc['similarityScore'] = score; + vectorDocs.push(doc); + } + } + } + + return vectorDocs; +}; +``` + +```ts title="server/src/services/products/src/service-impl.ts" +const getProductsByVSSText = async ( + productsVSSFilter: IProductsVSSBodyFilter, +) => { + let { searchText, maxProductCount, similarityScoreLimit } = productsVSSFilter; + let products: IProduct[] = []; + + const openAIApiKey = process.env.OPEN_AI_API_KEY || ''; + maxProductCount = maxProductCount || 2; + similarityScoreLimit = similarityScoreLimit || 0.2; + + if (!openAIApiKey) { + throw new Error('Please provide openAI API key in .env file'); + } + + if (!searchText) { + throw new Error('Please provide search text'); + } + + //VSS search + const vectorDocs = await getSimilarProductsScoreByVSS({ + standAloneQuestion: searchText, + openAIApiKey: openAIApiKey, + KNN: maxProductCount, + scoreLimit: similarityScoreLimit, + }); + + if (vectorDocs?.length) { + const productIds = vectorDocs.map((doc) => doc?.metadata?.productId); + + //get product with details + products = await getProductByIds(productIds, true); + } + + //... + + return products; +}; +``` + +### Frontend UI + +- **Settings configuration**: Enable `Semantic text search` in the settings page + ![settings page](./images/01-ui-settings2.png) + +- **Performing a search**: Use textual queries on the dashboard. + ![search page](./images/02-ui-image-search2.png) + +- Note: Users can click on the product description within the product card + to view the complete details. + +## Ready to use Redis for semantic text search? + +Discover the power of semantic text search for enhancing the e-commerce experience. This tutorial guides you through integrating OpenAI's semantic capabilities with Redis for a dynamic product search engine. 
+ +## Further reading + +- [Perform vector similarity search using Redis](/howtos/solutions/vector/getting-started-vector) +- [Semantic Image Based Queries Using LangChain (OpenAI) and Redis](/howtos/solutions/vector/image-summary-search) + +- [LangChain JS](https://js.langchain.com/docs/get_started/quickstart) + - [Learn LangChain](https://scrimba.com/learn/langchain) +- [LangChain redis integration](https://js.langchain.com/docs/integrations/vectorstores/redis) diff --git a/docs/howtos/solutions/vector/video-qa/images/ask-question.png b/docs/howtos/solutions/vector/video-qa/images/ask-question.png new file mode 100644 index 00000000000..a1e0ad2a692 Binary files /dev/null and b/docs/howtos/solutions/vector/video-qa/images/ask-question.png differ diff --git a/docs/howtos/solutions/vector/video-qa/images/redisinsight-keys.png b/docs/howtos/solutions/vector/video-qa/images/redisinsight-keys.png new file mode 100644 index 00000000000..8753a66dcfb Binary files /dev/null and b/docs/howtos/solutions/vector/video-qa/images/redisinsight-keys.png differ diff --git a/docs/howtos/solutions/vector/video-qa/images/upload-videos.png b/docs/howtos/solutions/vector/video-qa/images/upload-videos.png new file mode 100644 index 00000000000..44694350934 Binary files /dev/null and b/docs/howtos/solutions/vector/video-qa/images/upload-videos.png differ diff --git a/docs/howtos/solutions/vector/video-qa/images/video-qa-existing-answer.png b/docs/howtos/solutions/vector/video-qa/images/video-qa-existing-answer.png new file mode 100644 index 00000000000..29f3375352e Binary files /dev/null and b/docs/howtos/solutions/vector/video-qa/images/video-qa-existing-answer.png differ diff --git a/docs/howtos/solutions/vector/video-qa/images/video-qa-supporting-videos.png b/docs/howtos/solutions/vector/video-qa/images/video-qa-supporting-videos.png new file mode 100644 index 00000000000..cc1161e5c6b Binary files /dev/null and b/docs/howtos/solutions/vector/video-qa/images/video-qa-supporting-videos.png differ diff --git a/docs/howtos/solutions/vector/video-qa/index-video-qa.mdx b/docs/howtos/solutions/vector/video-qa/index-video-qa.mdx new file mode 100644 index 00000000000..e86854eeaac --- /dev/null +++ b/docs/howtos/solutions/vector/video-qa/index-video-qa.mdx @@ -0,0 +1,849 @@ +--- +id: index-video-qa +title: Building an AI-Powered Video Q&A Application with Redis and LangChain +sidebar_label: Building an AI-Powered Video Q&A Application with Redis and LangChain +slug: /howtos/solutions/vector/ai-qa-videos-langchain-redis-openai-google +authors: [will, prasan] +--- + +import Authors from '@theme/Authors'; + + + +## What you will learn in this tutorial + +This tutorial focuses on building a Q&A answer engine for video content. It will cover the following topics: + +1. How to use `OpenAI`, `Google Gemini`, and `LangChain` to summarize video content and generate vector embeddings +1. How to use `Redis` to store and search vector embeddings +1. How to use `Redis` as a semantic vector search cache + +:::tip GITHUB CODE + +Below is a command to the clone the source code for the application used in this tutorial + +```bash +git clone https://github.com/redis-developer/video-qa-semantic-vector-caching +``` + +::: + +## Introduction + +Before we dive into the details of this tutorial, let's go over a few concepts that are important to understand when building generative AI applications. + +1. **Generative AI** is a rapidly evolving field that focuses on creating content, whether it's text, images, or even video. 
It leverages deep learning techniques to generate new, unique outputs based on learned patterns and data. +1. **Retrieval-Augmented Generation (RAG)** combines generative models with external knowledge sources to provide more accurate and informed responses. This technique is particularly useful in applications where context-specific information is critical. +1. [**LangChain**](https://www.langchain.com/) is a powerful library that facilitates the development of applications involving language models. It simplifies tasks such as summarization, question answering, and interaction with generative models like ChatGPT or Google Gemini. +1. **Google Gemini** and **OpenAI/ChatGPT** are generative models that can be used to generate text based on a given prompt. They are useful for applications that require a large amount of text generation, such as summarization or question answering. +1. **Semantic vector search** is a technique that uses vector embeddings to find similar items in a database. It is typically combined with RAG to provide more accurate responses to user queries. +1. **Redis** is an in-memory database that can be used to store and search vector embeddings. It is particularly useful for applications that require fast, real-time responses. + +Our application leverages these technologies to create a unique Q&A platform based on video content. Users can upload YouTube video URLs or IDs, and the application utilizes generative AI to summarize these videos, formulate potential questions, and create a searchable database. This database can then be queried to find answers to user-submitted questions, drawing directly from the video content. + +## High-level overview of the AI video Q&A application with Redis + +Here's how our application uses AI and semantic vector search to answer user questions based on video content: + +1. **Uploading videos**: Users can upload YouTube videos either via links (e.g. `https://www.youtube.com/watch?v=LaiQFZ5bXaM`) or video IDs (e.g. `LaiQFZ5bXaM`). The application processes these inputs to retrieve necessary video information. For the purposes of this tutorial, the app is pre-seeded with a collection of videos from the [Redis YouTube channel](https://www.youtube.com/@Redisinc). However, when you run the application you can adjust it to cover your own set of videos. + +![Upload videos screenshot](./images/upload-videos.png). + +2. **Video processing and AI interaction**: Using the [Youtube Data API](https://developers.google.com/youtube/v3), the application obtains video `titles`, `descriptions`, and `thumbnails`. It also uses [SearchAPI.io](https://searchapi.io) to retrieve video transcripts. These transcripts are then passed to a large language model (LLM) - either Google Gemini or OpenAI's ChatGPT - for summarization and sample question generation. The LLM also generates vector embeddings for these summaries. + +An example summary and sample questions generated by the LLM are shown below: + +```text title="https://www.youtube.com/watch?v=LaiQFZ5bXaM" +Summary: +The video provides a walkthrough of building a real-time stock tracking application +using Redis Stack, demonstrating its capability to handle multiple data models and +act as a message broker in a single integrated database. The application maintains +a watch list of stock symbols, along with real-time trading information and a chart +updated with live data from the Alpaca API. 
The presenter uses Redis Stack features +such as sets, JSON documents, time series, Pub/Sub, and Top-K filter to store and +manage different types of data. An architecture diagram is provided, explaining the +interconnection between the front end, API service, and streaming service within +the application. Code snippets highlight key aspects of the API and streaming +service written in Python, highlighting the use of Redis Bloom, Redis JSON, Redis +Time Series, and Redis Search for managing data. The video concludes with a +demonstration of how data structures are visualized and managed in RedisInsight, +emphasizing how Redis Stack can simplify the building of a complex real-time +application by replacing multiple traditional technologies with one solution. + +Example Questions and Answers: + +Q1: What is Redis Stack and what role does it play in the application? +Q2: How is the stock watch list stored and managed within the application? +Q3: What type of data does the application store using time series capabilities of +Redis Stack? +Q4: Can you explain the use of the Top-K filter in the application? +Q5: What methods are used to update the front end with real-time information in +the application? +Q6: How does the application sync the watch list with the streaming service? +Q7: What frontend technologies are mentioned for building the UI of the application? +Q8: How does Redis Insight help in managing the application data? +``` + +3. **Data storage with Redis**: All generated data, including video summaries, potential questions, and vector embeddings, are stored in Redis. The app utilizes Redis's diverse data types for efficient data handling, caching, and quick retrieval. + +![RedisInsight keys](./images/redisinsight-keys.png) + +4. **Search and answer retrieval**: The frontend, built with Next.js, allows users to ask questions. The application then searches the Redis database using semantic vector similarity to find relevant video content. It further uses the LLM to formulate answers, prioritizing information from video transcripts. + +![Asking a question](./images/ask-question.png) + +5. **Presentation of results**: The app displays the most relevant videos along with the AI-generated answers, offering a comprehensive and interactive user experience. It also displays cached results from previous queries using semantic vector caching for faster response times. + +![Existing answers](./images/video-qa-existing-answer.png) + +## Setting Up the Environment + +To get started with our AI-powered video Q&A application, you'll first need to set up your development environment. We'll follow the instructions outlined in the project's `README.md` file. + +### Requirements + +- [Node.js](https://nodejs.org/) +- [Docker](https://www.docker.com/) +- [SearchAPI.io API Key](https://www.searchapi.io/) + - This is used to retrieve video transcripts and free for up to 100 requests. The application will cache the results to help avoid exceeding the free tier. +- [Google API Key](https://console.cloud.google.com/apis/credentials) + - You must have the following APIs enabled: + - YouTube Data API v3 + - Generative Language API + - This is used to retrieve video information and prompt the Google Gemini model. This is not free. +- [OpenAI API Key](https://platform.openai.com/api-keys) + - This is used to prompt the OpenAI ChatGPT model. This is not free. + +### Setting Up Redis + +Redis is used as our database to store and retrieve data efficiently. 
You can start quickly with a cloud-hosted Redis instance by signing up at [redis.com/try-free](https://redis.com/try-free). This is ideal for both development and testing purposes. You can easily store the data for this application within the limitations of the Redis free tier.

### Cloning the Repository

First, clone the repository containing our project:

```bash
git clone https://github.com/redis-developer/video-qa-semantic-vector-caching
```

### Installing Dependencies

After setting up your Node.js environment, you'll need to install the necessary packages. Navigate to the root of your project directory and run the following command:

```bash
npm install
```

This command will install all the dependencies listed in the `package.json` file, ensuring you have everything needed to run the application.

### Configuration

Before running the application, make sure to configure the environment variables. There is a script to automatically generate the `.env` files for you. Run the following command:

```bash
npm run setup
```

This will generate the following files:

1. `app/.env` - This file contains the environment variables for the Next.js application.
1. `app/.env.docker` - This file contains overrides for the environment variables when running in Docker.
1. `services/video-search/.env` - This file contains the environment variables for the video search service.
1. `services/video-search/.env.docker` - This file contains overrides for the environment variables when running in Docker.

By default, you should not need to touch the environment files in the `app` directory. However, you will need to configure the environment files in the `services/video-search` directory.

The `services/video-search/.env` file looks like this:

```bash
USE=

REDIS_URL=
SEARCHAPI_API_KEY=
YOUTUBE_TRANSCRIPT_PREFIX=
YOUTUBE_VIDEO_INFO_PREFIX=

GOOGLE_API_KEY=
GOOGLE_EMBEDDING_MODEL=
GOOGLE_SUMMARY_MODEL=

OPENAI_API_KEY=
OPENAI_ORGANIZATION=
OPENAI_EMBEDDING_MODEL=
OPENAI_SUMMARY_MODEL=
```

For the Gemini models, you can use the following values if you are not sure what to choose:

```bash
GOOGLE_EMBEDDING_MODEL=embedding-001
GOOGLE_SUMMARY_MODEL=gemini-pro
```

For the OpenAI models, you can use the following values if you are not sure what to choose:

```bash
OPENAI_EMBEDDING_MODEL=text-embedding-ada-002
OPENAI_SUMMARY_MODEL=gpt-4-1106-preview
```

> NOTE: Depending on your OpenAI tier you may have to use a different summary model. `gpt-3.5` models will be okay.

The `_PREFIX` environment variables are used to prefix the keys in Redis. This is useful if you want to use the same Redis instance for multiple applications. They have the following defaults:

```bash
YOUTUBE_TRANSCRIPT_PREFIX=transcripts:
YOUTUBE_VIDEO_INFO_PREFIX=yt-videos:
```

If you're satisfied with the defaults, you can delete these values from the `.env` file.

Lastly, the `services/video-search/.env.docker` file contains overrides for the Redis URL when used in Docker. By default, this app sets up a local Redis instance in Docker. If you are using a cloud instance, you can simply add the URL to your `.env` file and delete the override in the `.env.docker` file.

## Running the application

After installing and configuring the application, run the following command to build the Docker images and run the containers:

```bash
npm run dev
```

This command builds the app and the video service, and deploys them to Docker.
It is all set up for hot reloading, so if you make changes to the code, the servers will restart automatically.

Once the containers are up and running, the application will be accessible via your web browser:

- **Client**: Available at [http://localhost](http://localhost) (Port 80).
- **Video search service**: Accessible at [http://localhost:8000](http://localhost:8000/api/healthcheck).

This setup allows you to interact with the client-side application through your browser and make requests to the video search service hosted on a separate port.

The video search service doesn't publish a client application. Instead, it exposes a REST API that can be used to interact with the service. You can validate that it is running by checking Docker or by visiting the following URL:

- [http://localhost:8000/api/healthcheck](http://localhost:8000/api/healthcheck)

You should be up and running now! The rest of this tutorial is focused on how the application works and how to use it, with code examples.

## How to build a video Q&A application with Redis and LangChain

### Video uploading and processing

#### Handling video uploads and retrieving video transcripts and metadata

The backend is set up to handle YouTube video links or IDs. The relevant code snippet from the project demonstrates how these inputs are processed.

```typescript title="services/video-search/src/transcripts/load.ts"
export type VideoDocument = Document<{
  id: string;
  link: string;
  title: string;
  description: string;
  thumbnail: string;
}>;

export async function load(videos: string[] = config.youtube.VIDEOS) {
  // Parse the video URLs to get a list of video IDs
  const videosToLoad: string[] = videos.map(parseVideoUrl).filter((video) => {
    return typeof video === 'string';
  }) as string[];

  // Get video title, description, and thumbnail from YouTube API v3
  const videoInfo = await getVideoInfo(videosToLoad);

  // Get video transcripts from SearchAPI.io, join the video info
  const transcripts = await mapAsyncInOrder(videosToLoad, async (video) => {
    return await getTranscript(video, videoInfo[video]);
  });

  // Return the videos as documents with metadata, and pageContent being the transcript
  return transcripts.filter(
    (transcript) => typeof transcript !== 'undefined',
  ) as VideoDocument[];
}
```

In the same file you will see two caches:

```typescript title="services/video-search/src/transcripts/load.ts"
const cache = cacheAside(config.youtube.TRANSCRIPT_PREFIX);
const videoCache = jsonCacheAside(config.youtube.VIDEO_INFO_PREFIX);
```

These caches are used to store the transcripts (as a `string`) and video metadata (as `JSON`) in Redis. The `cache` functions are helper functions that use Redis to store and retrieve data. They look like this:

```typescript title="services/video-search/src/db.ts"
// Cache-aside helper for plain string values
export function cacheAside(prefix: string) {
  return {
    get: async (key: string) => {
      return await client.get(`${prefix}${key}`);
    },
    set: async (key: string, value: string) => {
      return await client.set(`${prefix}${key}`, value);
    },
  };
}

// Cache-aside helper for JSON values, typed through the generic parameter T
export function jsonCacheAside<T>(prefix: string) {
  return {
    get: async (key: string): Promise<T> => {
      return client.json.get(`${prefix}${key}`) as T;
    },
    set: async (key: string, value: RedisJSON) => {
      return await client.json.set(`${prefix}${key}`, '$', value);
    },
  };
}
```

You will see these functions used elsewhere in the app.
They are used to prevent unnecessary API calls, in this case to SearchAPI.io and the YouTube API. + +#### Summarizing video content with LangChain, Redis, Google Gemini, and OpenAI ChatGPT + +After obtaining the video transcripts and metadata, the transcripts are then summarized using LangChain and the LLMs, both Gemini and ChatGPT. There are a few interesting pieces of code to understand here: + +1. The `prompt` used to ask the LLM to summarize the video transcript and generate sample questions +1. The `refinement chain` used to obtain the summarized video and sample questions +1. The `vector embedding chain` that uses the LLM to generate text embeddings and store them in Redis + +The LLM `summary prompt` is split into two parts. This is done to allow analyzing videos where the transcript length is larger than the LLM's accepted context. + +```typescript title="services/video-search/src/api/templates/video.ts" +import { PromptTemplate } from 'langchain/prompts'; + +const summaryTemplate = ` +You are an expert in summarizing YouTube videos. +Your goal is to create a summary of a video. +Below you find the transcript of a video: +-------- +{text} +-------- + +The transcript of the video will also be used as the basis for a question and answer bot. +Provide some examples questions and answers that could be asked about the video. Make these questions very specific. + +Total output will be a summary of the video and a list of example questions the user could ask of the video. + +SUMMARY AND QUESTIONS: +`; + +export const SUMMARY_PROMPT = PromptTemplate.fromTemplate(summaryTemplate); + +const summaryRefineTemplate = ` +You are an expert in summarizing YouTube videos. +Your goal is to create a summary of a video. +We have provided an existing summary up to a certain point: {existing_answer} + +Below you find the transcript of a video: +-------- +{text} +-------- + +Given the new context, refine the summary and example questions. +The transcript of the video will also be used as the basis for a question and answer bot. +Provide some examples questions and answers that could be asked about the video. Make +these questions very specific. +If the context isn't useful, return the original summary and questions. +Total output will be a summary of the video and a list of example questions the user could ask of the video. + +SUMMARY AND QUESTIONS: +`; + +export const SUMMARY_REFINE_PROMPT = PromptTemplate.fromTemplate( + summaryRefineTemplate, +); +``` + +The `summary prompts` are used to create a `refinement chain` with LangChain. LangChain will automatically handle splitting the video transcript document(s) and calling the LLM accordingly. 
+ +```typescript title="services/video-search/src/api/prompt.ts" {1-5,30-35} +const videoSummarizeChain = loadSummarizationChain(llm, { + type: 'refine', + questionPrompt: SUMMARY_PROMPT, + refinePrompt: SUMMARY_REFINE_PROMPT, +}); + +const summaryCache = cacheAside(`${prefix}-${config.redis.SUMMARY_PREFIX}`); + +async function summarizeVideos(videos: VideoDocument[]) { + const summarizedDocs: VideoDocument[] = []; + + for (const video of videos) { + log.debug(`Summarizing ${video.metadata.link}`, { + ...video.metadata, + location: `${prefix}.summarize.docs`, + }); + const existingSummary = await summaryCache.get(video.metadata.id); + + if (typeof existingSummary === 'string') { + summarizedDocs.push( + new Document({ + metadata: video.metadata, + pageContent: existingSummary, + }), + ); + + continue; + } + + const splitter = new TokenTextSplitter({ + chunkSize: 10000, + chunkOverlap: 250, + }); + const docsSummary = await splitter.splitDocuments([video]); + const summary = await videoSummarizeChain.run(docsSummary); + + log.debug(`Summarized ${video.metadata.link}:\n ${summary}`, { + summary, + location: `${prefix}.summarize.docs`, + }); + await summaryCache.set(video.metadata.id, summary); + + summarizedDocs.push( + new Document({ + metadata: video.metadata, + pageContent: summary, + }), + ); + } + + return summarizedDocs; +} +``` + +Notice the `summaryCache` is used to first ask Redis if the video has already been summarized. If it has, it will return the summary and skip the LLM. This is a great example of how Redis can be used to cache data and avoid unnecessary API calls. Below is an example video summary with questions. + +```text title="https://www.youtube.com/watch?v=LaiQFZ5bXaM" +Summary: +The video provides a walkthrough of building a real-time stock tracking application +using Redis Stack, demonstrating its capability to handle multiple data models and +act as a message broker in a single integrated database. The application maintains +a watch list of stock symbols, along with real-time trading information and a chart +updated with live data from the Alpaca API. The presenter uses Redis Stack features +such as sets, JSON documents, time series, Pub/Sub, and Top-K filter to store and +manage different types of data. An architecture diagram is provided, explaining the +interconnection between the front end, API service, and streaming service within +the application. Code snippets highlight key aspects of the API and streaming +service written in Python, highlighting the use of Redis Bloom, Redis JSON, Redis +Time Series, and Redis Search for managing data. The video concludes with a +demonstration of how data structures are visualized and managed in RedisInsight, +emphasizing how Redis Stack can simplify the building of a complex real-time +application by replacing multiple traditional technologies with one solution. + +Example Questions and Answers: + +Q1: What is Redis Stack and what role does it play in the application? +Q2: How is the stock watch list stored and managed within the application? +Q3: What type of data does the application store using time series capabilities of +Redis Stack? +Q4: Can you explain the use of the Top-K filter in the application? +Q5: What methods are used to update the front end with real-time information in +the application? +Q6: How does the application sync the watch list with the streaming service? +Q7: What frontend technologies are mentioned for building the UI of the application? 
+Q8: How does Redis Insight help in managing the application data? +``` + +The `vector embedding chain` is used to generate vector embeddings for the video summaries. This is done by asking the LLM to generate text embeddings for the summary. The `vector embedding chain` is defined as follows: + +```typescript title="services/video-search/src/api/store.ts" +const vectorStore = new RedisVectorStore(embeddings, { + redisClient: client, + indexName: `${prefix}-${config.redis.VIDEO_INDEX_NAME}`, + keyPrefix: `${prefix}-${config.redis.VIDEO_PREFIX}`, + indexOptions: { + ALGORITHM: VectorAlgorithms.HNSW, + DISTANCE_METRIC: 'IP', + }, +}); +``` + +The vector store uses the `RedisVectorStore` class from LangChain. This class is a wrapper around Redis that allows you to store and search vector embeddings. We are using the `HNSW` algorithm and the `IP` distance metric. For more information on the supported algorithms and distance metrics, see the [Redis vector store documentation](https://redis.io/docs/interact/search-and-query/advanced-concepts/vectors/). We pass the `embeddings` object to the `RedisVectorStore` constructor. This object is defined as follows: + +```typescript title="services/video-search/src/api/llms/google.ts" +new GoogleGenerativeAIEmbeddings({ + apiKey: config.google.API_KEY, + modelName: modelName ?? config.google.EMBEDDING_MODEL, + taskType: TaskType.SEMANTIC_SIMILARITY, +}); +``` + +Or for OpenAI: + +```typescript title="services/video-search/src/api/llms/openai.ts" +new OpenAIEmbeddings({ + openAIApiKey: config.openai.API_KEY, + modelName: modelName ?? config.openai.EMBEDDING_MODEL, + configuration: { + organization: config.openai.ORGANIZATION, + }, +}); +``` + +The `embeddings` object is used to generate vector embeddings for the video summaries. These embeddings are then stored in Redis using the `vectorStore`. + +```typescript title="services/video-search/src/api/store.ts" {28} +async function storeVideoVectors(documents: VideoDocument[]) { + log.debug('Storing documents...', { + location: `${prefix}.store.store`, + }); + const newDocuments: VideoDocument[] = []; + + await Promise.all( + documents.map(async (doc) => { + const exists = await client.sIsMember( + `${prefix}-${config.redis.VECTOR_SET}`, + doc.metadata.id, + ); + + if (!exists) { + newDocuments.push(doc); + } + }), + ); + + log.debug(`Found ${newDocuments.length} new documents`, { + location: `${prefix}.store.store`, + }); + + if (newDocuments.length === 0) { + return; + } + + await vectorStore.addDocuments(newDocuments); + + await Promise.all( + newDocuments.map(async (doc) => { + await client.sAdd( + `${prefix}-${config.redis.VECTOR_SET}`, + doc.metadata.id, + ); + }), + ); +} +``` + +Notice that we first check if we have already generated a vector using the Redis Set `VECTOR_SET`. If we have, we skip the LLM and use the existing vector. This avoids unnecessary API calls and can speed things up. + +### Redis vector search functionality and AI integration for video Q&A + +One of the key features of our application is the ability to search through video content using AI-generated queries. This section will cover how the backend handles search requests and interacts with the AI models. + +#### Converting questions into vectors + +When a user submits a question through the frontend, the backend performs the following steps to obtain the answer to the question as well as supporting videos: + +1. We generate a semantically similar question to the one being asked. This helps to find the most relevant videos. +1. 
We then use the `vectorStore` to search for the most relevant videos based on the semantic question. +1. If we don't find any relevant videos, we search with the original question. +1. Once we find videos, we call the LLM to answer the question. +1. Finally, we return the answer and supporting videos to the user. + +To answer a question, we first generate a semantically similar question to the one being asked. This is done using the `QUESTION_PROMPT` defined below: + +```typescript title="services/video-search/src/api/templates/questions.ts" +import { PromptTemplate } from 'langchain/prompts'; + +const questionTemplate = ` +You are an expert in summarizing questions. +Your goal is to reduce a question down to its simplest form while still retaining the semantic meaning. +Below you find the question: +-------- +{question} +-------- + +Total output will be a semantically similar question that will be used to search an existing dataset. + +SEMANTIC QUESTION: +`; + +export const QUESTION_PROMPT = PromptTemplate.fromTemplate(questionTemplate); +``` + +Using this prompt, we generate the `semantic question` and use it to search for videos. We may also need to search using the original `question` if we don't find any videos with the `semantic question`. This is done using the `ORIGINAL_QUESTION_PROMPT` defined below: + +```typescript title="services/video-search/src/api/search.ts" {12-14,22,27,37,44,46-52} +async function getVideos(question: string) { + log.debug( + `Performing similarity search for videos that answer: ${question}`, + { + question, + location: `${prefix}.search.search`, + }, + ); + + const KNN = config.searches.KNN; + /* Simple standalone search in the vector DB */ + return await (vectorStore.similaritySearch(question, KNN) as Promise< + VideoDocument[] + >); +} + +async function searchVideos(question: string) { + log.debug(`Original question: ${question}`, { + location: `${prefix}.search.search`, + }); + + const semanticQuestion = await prompt.getSemanticQuestion(question); + + log.debug(`Semantic question: ${semanticQuestion}`, { + location: `${prefix}.search.search`, + }); + let videos = await getVideos(semanticQuestion); + + if (videos.length === 0) { + log.debug( + 'No videos found for semantic question, trying with original question', + { + location: `${prefix}.search.search`, + }, + ); + + videos = await getVideos(question); + } + + log.debug(`Found ${videos.length} videos`, { + location: `${prefix}.search.search`, + }); + + const answerDocument = await prompt.answerQuestion(question, videos); + + return [ + { + ...answerDocument.metadata, + question: answerDocument.pageContent, + isOriginal: true, + }, + ]; +} +``` + +The code above shows the whole process for getting answers from the LLM and returning them to the user. Once relevant videos are identified, the backend uses either Google Gemini or OpenAI's ChatGPT to generate answers. These answers are formulated based on the video transcripts stored in Redis, ensuring they are contextually relevant to the user's query. The `ANSWER_PROMPT` used to ask the LLM for answers is as follows: + +```typescript title="services/video-search/src/api/templates/answers.ts" +import { PromptTemplate } from 'langchain/prompts'; + +const answerTemplate = ` +You are an expert in answering questions about Redis and Redis Stack. +Your goal is to take a question and some relevant information extracted from videos and return the answer to the question. 
+ +- Try to mostly use the provided video info, but if you can't find the answer there you can use other resources. +- Make sure your answer is related to Redis. All questions are about Redis. For example, if a question is asking about strings, it is asking about Redis strings. +- The answer should be formatted as a reference document using markdown. Make all headings and links bold, and add new paragraphs around any code blocks. +- Your answer should include as much detail as possible and be no shorter than 500 words. + +Here is some extracted video information relevant to the question: {data} + +Below you find the question: +-------- +{question} +-------- + +Total output will be the answer to the question. + +ANSWER: +`; + +export const ANSWER_PROMPT = PromptTemplate.fromTemplate(answerTemplate); +``` + +That's it! The backend will now return the answer and supporting videos to the user. + +## Going further with semantic answer caching + +The application we've built in this tutorial is a great starting point for exploring the possibilities of AI-powered video Q&A. However, there are many ways to improve the application and make it more efficient. One such improvement is to use Redis as a semantic vector cache. + +Note in the previous section, we discussed making a call to the LLM to answer every question. There is a performance bottleneck during this step, because LLM response times vary, but can take several seconds. What if there was a way we could prevent unnecessary calls to the LLM? This is where `semantic vector caching` comes in. + +### What is semantic vector caching? + +Semantic vector caching happens when you take the results of a call to an LLM and cache them alongside the vector embedding for the prompt. In the case of our application, we could generate vector embeddings for the questions and store them in Redis with the answer from the LLM. This would allow us to avoid calling the LLM for similar questions that have already been answered. + +You might ask why store the question as a vector? Why not just store the question as a string? The answer is that storing the question as a vector allows us to perform semantic vector similarity searches. So rather than relying on someone asking the exact same question, we can determine an acceptable similarity score and return answers for similar questions. + +### How to implement semantic vector caching in Redis + +If you're already familiar with storing vectors in Redis, which we have covered in this tutorial, semantic vector caching is an extension of that and operates in essentially the same way. The only difference is that we are storing the question as a vector, rather than the video summary. We are also using the [cache aside](https://www.youtube.com/watch?v=AJhTduDOVCs) pattern. The process is as follows: + +1. When a user asks a question, we perform a vector similarity search for existing answers to the question. +1. If we find an answer, we return it to the user. Thus, avoiding a call to the LLM. +1. If we don't find an answer, we call the LLM to generate an answer. +1. We then store the question as a vector in Redis, along with the answer from the LLM. + +In order to store the question vectors we need to create a new vector store. This will create an index specifically for the question and answer vector. 
The code looks like this: + +```typescript title="services/video-search/src/api/store.ts" {6-7} +const answerVectorStore = new RedisVectorStore(embeddings, { + redisClient: client, + indexName: `${prefix}-${config.redis.ANSWER_INDEX_NAME}`, + keyPrefix: `${prefix}-${config.redis.ANSWER_PREFIX}`, + indexOptions: { + ALGORITHM: VectorAlgorithms.FLAT, + DISTANCE_METRIC: 'L2', + }, +}); +``` + +The `answerVectorStore` looks nearly identical to the `vectorStore` we defined earlier, but it uses a different [algorithm and distance metric](https://redis.io/docs/interact/search-and-query/advanced-concepts/vectors/). This algorithm is better suited for similarity searches for our questions. + +The following code demonstrates how to use the `answerVectorStore` to check if a similar question has already been answered. + +```typescript title="services/video-search/src/api/search.ts" {16-19} +async function checkAnswerCache(question: string) { + const haveAnswers = await answerVectorStore.checkIndexExists(); + + if (!(haveAnswers && config.searches.answerCache)) { + return; + } + + log.debug(`Searching for closest answer to question: ${question}`, { + location: `${prefix}.search.getAnswer`, + question, + }); + + /** + * Scores will be between 0 and 1, where 0 is most accurate and 1 is least accurate + */ + let results = (await answerVectorStore.similaritySearchWithScore( + question, + config.searches.KNN, + )) as Array<[AnswerDocument, number]>; + + if (Array.isArray(results) && results.length > 0) { + // Filter out results with too high similarity score + results = results.filter( + (result) => result[1] <= config.searches.maxSimilarityScore, + ); + + const inaccurateResults = results.filter( + (result) => result[1] > config.searches.maxSimilarityScore, + ); + + if (Array.isArray(inaccurateResults) && inaccurateResults.length > 0) { + log.debug( + `Rejected ${inaccurateResults.length} similar answers that have a score > ${config.searches.maxSimilarityScore}`, + { + location: `${prefix}.search.getAnswer`, + scores: inaccurateResults.map((result) => result[1]), + }, + ); + } + } + + if (Array.isArray(results) && results.length > 0) { + log.debug( + `Accepted ${results.length} similar answers that have a score <= ${config.searches.maxSimilarityScore}`, + { + location: `${prefix}.search.getAnswer`, + scores: results.map((result) => result[1]), + }, + ); + + return results.map((result) => { + return { + ...result[0].metadata, + question: result[0].pageContent, + isOriginal: false, + }; + }); + } +} +``` + +The `similaritySearchWithScore` will find similar questions to the one being asked. It ranks them from `0` to `1`, where `0` is most similar or "closest". We then filter out any results that are too similar, as defined by the `maxSimilarityScore` environment variable. If we find any results, we return them to the user. Using a max score is crucial here, because we don't want to return inaccurate results. + +To complete this process, we need to apply the `cache aside` pattern and store the question as a vector in Redis. 
This is done as follows: + +```typescript title="services/video-search/src/api/search.ts" {3,9-15,23-29,50-52} +async function searchVideos( + question: string, + { useCache = config.searches.answerCache }: VideoSearchOptions = {}, +) { + log.debug(`Original question: ${question}`, { + location: `${prefix}.search.search`, + }); + + if (useCache) { + const existingAnswer = await checkAnswerCache(question); + + if (typeof existingAnswer !== 'undefined') { + return existingAnswer; + } + } + + const semanticQuestion = await prompt.getSemanticQuestion(question); + + log.debug(`Semantic question: ${semanticQuestion}`, { + location: `${prefix}.search.search`, + }); + + if (useCache) { + const existingAnswer = await checkAnswerCache(semanticQuestion); + + if (typeof existingAnswer !== 'undefined') { + return existingAnswer; + } + } + + let videos = await getVideos(semanticQuestion); + + if (videos.length === 0) { + log.debug( + 'No videos found for semantic question, trying with original question', + { + location: `${prefix}.search.search`, + }, + ); + + videos = await getVideos(question); + } + + log.debug(`Found ${videos.length} videos`, { + location: `${prefix}.search.search`, + }); + + const answerDocument = await prompt.answerQuestion(question, videos); + + if (config.searches.answerCache) { + await answerVectorStore.addDocuments([answerDocument]); + } + + return [ + { + ...answerDocument.metadata, + question: answerDocument.pageContent, + isOriginal: true, + }, + ]; +} +``` + +When a question is asked, we first check the answer cache. We check both the question and the generated semantic question. If we find an answer, we return it to the user. If we don't find an answer, we call the LLM to generate an answer. We then store the question as a vector in Redis, along with the answer from the LLM. It may look like we're doing more work here than we were without the cache, but keep in mind the LLM is the bottleneck. By doing this, we are avoiding unnecessary calls to the LLM. + +Below are a couple screenshots from the application to see what it looks like when you find an existing answer to a question: + +![Existing video answer](./images/video-qa-existing-answer.png) + +![Supporting videos](./images/video-qa-supporting-videos.png) + +## Conclusion + +In this tutorial, we've explored how to build an AI-powered video Q&A application using Redis, LangChain, and various other technologies. We've covered setting up the environment, processing video uploads, and implementing search functionality. You also saw how to use Redis as a `vector store` and `semantic vector cache`. + +> NOTE: Not included in this tutorial is an overview of the frontend `Next.js` app. However, you can find the code in the [GitHub repository](https://github.com/redis-developer/video-qa-semantic-vector-caching) in the `app` directory. + +### Key takeaways + +- Generative AI can be leveraged to create powerful applications without writing a ton of code. +- Redis is highly versatile and efficient in handling AI-generated data and vectors. +- LangChain makes it easy to integrate AI models with vector stores. + +Remember, Redis offers an easy start with cloud-hosted instances, which you can sign up for at [redis.com/try-free](https://redis.com/try-free). This makes experimenting with AI and Redis more accessible than ever. + +We hope this tutorial inspires you to explore the exciting possibilities of combining AI with powerful databases like Redis to create innovative applications. 
+ +## Further reading + +- [Perform vector similarity search using Redis](/howtos/solutions/vector/getting-started-vector) +- [Building a generative AI chatbot using Redis](/howtos/solutions/vector/gen-ai-chatbot) +- [LangChain JS](https://js.langchain.com/docs/get_started/quickstart) + - [Learn LangChain](https://scrimba.com/learn/langchain) +- [LangChain Redis integration](https://js.langchain.com/docs/integrations/vectorstores/redis) diff --git a/docs/modules/redisai/index-redisai.mdx b/docs/modules/redisai/index-redisai.mdx deleted file mode 100644 index f44b89a19bf..00000000000 --- a/docs/modules/redisai/index-redisai.mdx +++ /dev/null @@ -1,6 +0,0 @@ ---- -id: index-redisai -title: Redisai -sidebar_label: RedisAI -slug: /modules/redisai ---- diff --git a/docs/modules/redisbloom/index-redisbloom.mdx b/docs/modules/redisbloom/index-redisbloom.mdx index 81708007323..5e25e4515b0 100644 --- a/docs/modules/redisbloom/index-redisbloom.mdx +++ b/docs/modules/redisbloom/index-redisbloom.mdx @@ -1,38 +1,38 @@ --- id: index-redisbloom -title: RedisBloom -sidebar_label: RedisBloom +title: Probabilistic Data Structures +sidebar_label: Probabilistic Data slug: /modules/redisbloom --- -RedisBloom extends Redis core to support additional probabilistic data structures. It allows for solving computer science problems in a constant memory space with extremely fast processing and a low error rate. It supports scalable Bloom and Cuckoo filters to determine (with a specified degree of certainty) whether an item is present or absent from a collection. +Redis Stack provides additional probabilistic data structures. It allows for solving computer science problems in a constant memory space with extremely fast processing and a low error rate. It supports scalable Bloom and Cuckoo filters to determine (with a specified degree of certainty) whether an item is present or absent from a collection. -The RedisBloom module provides four data types: +The four probabilistic data types: - Bloom filter: A probabilistic data structure that can test for presence. A Bloom filter is a data structure designed to tell you, rapidly and memory-efficiently, whether an element is present in a set. Bloom filters typically exhibit better performance and scalability when inserting items (so if you're often adding items to your dataset then Bloom may be ideal). - Cuckoo filter: An alternative to Bloom filters, Cuckoo filters comes with additional support for deletion of elements from a set. These filters are quicker on check operations. - Count-min sketch: A count-min sketch is generally used to determine the frequency of events in a stream. You can query the count-min sketch to get an estimate of the frequency of any given event. -- Top-K: The Top-k probabilistic data structure in RedisBloom is a deterministic algorithm that approximates frequencies for the top k items. With Top-K, you’ll be notified in real time whenever elements enter into or are expelled from your Top-K list. If an element add-command enters the list, the dropped element will be returned. +- Top-K: The Top-k probabilistic data structure is a deterministic algorithm that approximates frequencies for the top k items. With Top-K, you’ll be notified in real time whenever elements enter into or are expelled from your Top-K list. If an element add-command enters the list, the dropped element will be returned. ### Step 1. 
Register and subscribe -Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Enterprise Cloud +Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Cloud ![Redisbloom](redisbloom1.png) -### Step 2. Create a database with RedisBloom Module +### Step 2. Create a database with Redis Stack ![Redisbloom](redisbloom.png) ### Step 3. Connect to a database -Follow [this](explore/redisinsight) link to know how to connect to a database +Follow [this](https://redis.io/docs/ui/insight/) link to know how to connect to a database -### Step 4. Getting Started with RedisBloom +### Step 4. Getting Started with Probabilistic Data Structures -In the next steps you will use some basic RedisBloom commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. (See part 2 of this tutorial to learn more about using the RedisInsight CLI.) To interact with RedisBloom, you use the BF.ADD and BF.EXISTS commands. +In the next steps you will use some basic commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. (See part 2 of this tutorial to learn more about using the RedisInsight CLI.) To interact with Redis, you use the BF.ADD and BF.EXISTS commands. -Let’s go ahead and test drive some RedisBloom-specific operations. We will create a basic dataset based on unique visitors’ IP addresses, and you will see how to: +Let’s go ahead and test drive some probabilistic-specific operations. We will create a basic dataset based on unique visitors’ IP addresses, and you will see how to: - Create a Bloom filter - Determine whether or not an item exists in the Bloom filter @@ -102,4 +102,4 @@ In the above example, the first command shows the result as “1” for both the ### Next Steps -- Learn more about RedisBloom in the [Quick Start](https://oss.redis.com/redisbloom/Quick_Start/) tutorial. +- Learn more about Probabilistic data in the [Quick Start](/howtos/quick-start) tutorial. diff --git a/docs/modules/redisearch/index-redisearch.mdx b/docs/modules/redisearch/index-redisearch.mdx index 992bd92ce65..f665acfe0cd 100644 --- a/docs/modules/redisearch/index-redisearch.mdx +++ b/docs/modules/redisearch/index-redisearch.mdx @@ -1,17 +1,17 @@ --- id: index-redisearch -title: RediSearch -sidebar_label: RediSearch +title: Redis Search +sidebar_label: Redis Search slug: /modules/redisearch --- -RediSearch is a powerful text search and secondary indexing engine, built on top of Redis as a Redis module. Written in C, RediSearch is extremely fast compared to other open-source search engines. It implements multiple data types and commands that fundamentally change what you can do with Redis. RediSearch supports capabilities for search and filtering such as geo-spatial queries, retrieving only IDs (instead of whole documents), and custom document scoring. Aggregations can combine map, filter, and reduce/group-by operations in custom pipelines that run across millions of elements in an instant. +Redis Search is a powerful text search and secondary indexing engine, built on top of Redis as a Redis module. Written in C, Redis Search is extremely fast compared to other open-source search engines. It implements multiple data types and commands that fundamentally change what you can do with Redis. Redis Search supports capabilities for search and filtering such as geo-spatial queries, retrieving only IDs (instead of whole documents), and custom document scoring. 
Aggregations can combine map, filter, and reduce/group-by operations in custom pipelines that run across millions of elements in an instant. -RediSearch also supports auto-completion with fuzzy prefix matching, and atomic real-time insertion of new documents to a search index.With the latest RediSearch 2.0 release, it’s now easier than ever to create a secondary index on top of your existing data. You can just add RediSearch to your existing Redis database, create an index, and start querying it, without having to migrate your data or use new commands for adding data to the index. This drastically lowers the learning curve for new RediSearch users and lets you create indexes on your existing Redis databases—without even having to restart them. +Redis Search also supports auto-completion with fuzzy prefix matching, and atomic real-time insertion of new documents to a search index.With the latest Redis Search 2.0 release, it’s now easier than ever to create a secondary index on top of your existing data. You can just add Redis Search to your existing Redis database, create an index, and start querying it, without having to migrate your data or use new commands for adding data to the index. This drastically lowers the learning curve for new Redis Search users and lets you create indexes on your existing Redis databases—without even having to restart them. ### Step 1. Register and subscribe -Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Enterprise Cloud +Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Cloud ![Redisearch](redisearch3.png) @@ -21,7 +21,7 @@ Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis ### Step 3. Connect to a database -Follow [this](explore/redisinsight) link to know how to connect to a database +Follow [this](https://redis.io/docs/ui/insight/) link to know how to connect to a database ### Step 4. Getting Started with Redisearch @@ -33,7 +33,7 @@ To begin, let’s create a basic dataset based on movies information, which we w ![Redisearch](redisearch12.png) -#### Insert data into RediSearch +#### Insert data into Redis Search We are now ready to insert some data. This example uses movies data stored as Redis Hashes, so let’s insert a couple of movies: @@ -47,7 +47,7 @@ HSET movies:11002 title "Star Wars: Episode V - The Empire Strikes Back" plot "L (integer) 6 ``` -Your Redis Enterprise Cloud database now contains two Hashes. It is simple to retrieve information using the HMGET command, if you know the key of the movies (movies:11002): +Your Redis Cloud database now contains two Hashes. It is simple to retrieve information using the HMGET command, if you know the key of the movies (movies:11002): ``` > HMGET movies:11002 title rating @@ -56,7 +56,7 @@ Your Redis Enterprise Cloud database now contains two Hashes. It is simple to re 2) "8.7" ``` -#### Create an index in RediSearch +#### Create an index in Redis Search To be able to query the hashes on the field for title, say, or genre, you must first create an index. To create an index, you must define a schema to list the fields and their types that are indexed, and that you can use in your queries. @@ -80,13 +80,13 @@ In the command above, we: Before running queries on our new index, though, let’s take a closer look at the elements of the FT.CREATE command: - idx:movies: the name of the index, which you will use when doing queries -- ON hash: the type of structure to be indexed. 
(Note that RediSearch 2.0 supports only the Hash structure, but this parameter will allow RediSearch to index other structures in the future.) +- ON hash: the type of structure to be indexed. (Note that Redis Search 2.0 supports only the Hash structure, but this parameter will allow Redis Search to index other structures in the future.) - PREFIX 1 “movies:”: the prefix of the keys that should be indexed. This is a list, so since we want to index only movies:\* keys the number is 1. If you want to index movies and TV shows with the same fields, you could use: PREFIX 2 “movies:” “tv_show:” - SCHEMA …: defines the schema, the fields, and their type to index. As you can see in the command, we are using TEXT, NUMERIC, and TAG, as well as SORTABLE parameters. -The RediSearch 2.0 engine will scan the database using the PREFIX values, and update the index based on the schema definition. This makes it easy to add an index to an existing application that uses Hashes, there’s no need to change your code. +The Redis Search 2.0 engine will scan the database using the PREFIX values, and update the index based on the schema definition. This makes it easy to add an index to an existing application that uses Hashes, there’s no need to change your code. -#### Search the movies in the RediSearch index +#### Search the movies in the Redis Search index You can now use the FT.SEARCH to search your database, for example, to search all movies sorted by release year: @@ -117,9 +117,14 @@ You can also search “action” movies that contain “star” in the index (in 4) "1980" ``` -The FT.SEARCH command is the base command to search your database, it has many options and is associated with a powerful and rich query syntax that you can find in the documentation. (Note: You can also use the index to do data aggregation using the FT.AGGREGATE command.) +The FT.SEARCH command is the base command to search your database, it has many options and is associated with a powerful and rich query syntax that you can find in the documentation. + +:::tip + +You can also use the index to do data aggregation using the FT.AGGREGATE command. + +::: ### Next Steps -- Learn more about RediSearch in the [Getting Started with RediSearch 2.0](https://github.com/RediSearch/redisearch-getting-started/) tutorial on GitHub. -- [How to list and search Movie database using Redisearch](/howtos/moviesdatabase/getting-started) +- Learn more about Redis Search in the [Getting Started with Redis Search 2.0](https://github.com/RediSearch/redisearch-getting-started/) tutorial on GitHub. 
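
As a follow-up to the FT.AGGREGATE tip above, here is a minimal sketch of an aggregation against the same idx:movies index. It assumes the index and movie hashes created earlier, with `genre` indexed as a TAG field; it groups the indexed movies by genre and counts how many movies fall into each one:

```
> FT.AGGREGATE idx:movies "*" GROUPBY 1 @genre REDUCE COUNT 0 AS nb_of_movies
```

The reply lists each genre alongside its movie count, which you can extend with additional REDUCE functions or SORTBY clauses as described in the documentation.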
diff --git a/docs/modules/redisgears/index-redisgears.mdx b/docs/modules/redisgears/index-redisgears.mdx index e19b9847927..36836d82fd0 100644 --- a/docs/modules/redisgears/index-redisgears.mdx +++ b/docs/modules/redisgears/index-redisgears.mdx @@ -1,6 +1,6 @@ --- id: index-redisgears -title: RedisGears -sidebar_label: RedisGears +title: Triggers and Functions +sidebar_label: Triggers and Functions slug: /modules/redisgears --- diff --git a/docs/modules/redisgraph/index-redisgraph.mdx b/docs/modules/redisgraph/index-redisgraph.mdx index 2c19a4733d6..25f80d3e408 100644 --- a/docs/modules/redisgraph/index-redisgraph.mdx +++ b/docs/modules/redisgraph/index-redisgraph.mdx @@ -4,12 +4,14 @@ title: RedisGraph sidebar_label: RedisGraph slug: /modules/redisgraph --- +import GraphEol from '@site/docs/common/_graph-eol.mdx'; + RedisGraph is a Redis module that enables enterprises to process any kind of connected data much faster than with traditional relational or existing graph databases. RedisGraph implements a unique data storage and processing solution (with sparse-adjacency matrices and GraphBLAS) to deliver the fastest and most efficient way to store, manage, and process connected data in graphs. With RedisGraph, you can process complex transactions 10 - 600 times faster than with traditional graph solutions while using 50 - 60% less memory resources than other graph databases! ### Step 1. Register and subscribe -Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Enterprise Cloud +Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Cloud ![RedisGraph](redisgraph1.png) @@ -19,7 +21,7 @@ Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis ### Step 3. Connect to a database -Follow [this](explore/redisinsight) link to know how to connect to a database +Follow [this](https://redis.io/docs/ui/insight/) link to know how to connect to a database ### Step 4. Getting Started with RedisGraph diff --git a/docs/modules/redisjson/index-redisjson.mdx b/docs/modules/redisjson/index-redisjson.mdx index c72cd6b7bce..045d8245748 100644 --- a/docs/modules/redisjson/index-redisjson.mdx +++ b/docs/modules/redisjson/index-redisjson.mdx @@ -1,31 +1,31 @@ --- id: index-redisjson -title: RedisJSON -sidebar_label: RedisJSON +title: Redis JSON +sidebar_label: Redis JSON slug: /modules/redisjson --- -The RedisJSON module provides in-memory manipulation of JSON documents at high velocity and volume. With RedisJSON, you can natively store document data in a hierarchical, tree-like format to scale and query documents efficiently, significantly improving performance over storing and manipulating JSON with Lua scripts and core Redis data structures. +Redis Stack provides in-memory manipulation of JSON documents at high velocity and volume. With Redis Stack, you can natively store document data in a hierarchical, tree-like format to scale and query documents efficiently, significantly improving performance over storing and manipulating JSON with Lua scripts and core Redis data structures. ### Step 1. Register and subscribe -Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Enterprise Cloud +Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Cloud -![RedisJSON](redisjson3.png) +![Redis JSON](redisjson3.png) -### Step 2. Create a database with RedisJSON Module +### Step 2. Create a database with Redis JSON Module -![RedisJSON](redisjson1.png) +![Redis JSON](redisjson1.png) ### Step 3. 
Connect to a database -Follow [this](explore/redisinsight) link to know how to connect to a database +Follow [this](https://redis.io/docs/ui/insight/) link to know how to connect to a database -### Step 4. Getting Started with RedisJSON +### Step 4. Getting Started with Redis JSON -The following steps use some basic RedisJSON commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. +The following steps use some basic Redis JSON commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. -To interact with RedisJSON, you will most often use the JSON.SET and JSON.GET commands. Before using RedisJSON, you should familiarize yourself with its commands and syntax as detailed in the documentation: RedisJSON Commands. +To interact with Redis JSON, you will most often use the JSON.SET and JSON.GET commands. Before using Redis JSON, you should familiarize yourself with its commands and syntax as detailed in the documentation: Redis JSON Commands. Let’s go ahead and test drive some JSON-specific operations for setting and retrieving a Redis key with a JSON value: @@ -36,7 +36,7 @@ Let’s go ahead and test drive some JSON-specific operations for setting and re #### Scalar -Under RedisJSON, a key can contain any valid JSON value. It can be scalar, objects or arrays. JSON scalar is basically a string. You will have to use the JSON.SET command to set the JSON value. For new Redis keys the path must be the root, so you will use “.” path in the example below. For existing keys, when the entire path exists, the value that it contains is replaced with the JSON value. Here you will use JSON.SET to set the JSON scalar value to “Hello JSON!” Scalar will contain a string that holds “Hello JSON!” +Under Redis JSON, a key can contain any valid JSON value. It can be a scalar, an object, or an array. A JSON scalar is basically a string. You will have to use the JSON.SET command to set the JSON value. For new Redis keys the path must be the root, so you will use the “.” path in the example below. For existing keys, when the entire path exists, the value that it contains is replaced with the JSON value. Here you will use JSON.SET to set the JSON scalar value to “Hello JSON!” The scalar will contain a string that holds “Hello JSON!” ``` >> JSON.SET scalar . ' "Hello JSON!" ' @@ -144,5 +144,5 @@ A JSON object can also have another object. Here is a simple example of a JSON o ### Next Steps -- Learn more about [RedisJSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. -- [How to build shopping cart app using NodeJS and RedisJSON](/howtos/shoppingcart) +- Learn more about [Redis JSON](https://oss.redis.com/redisjson/) in the Quickstart tutorial. +- [How to build shopping cart app using NodeJS and Redis JSON](/howtos/shoppingcart) diff --git a/docs/modules/redistimeseries/index-redistimeseries.mdx b/docs/modules/redistimeseries/index-redistimeseries.mdx index 16f3cb241ba..bde683ffa9b 100644 --- a/docs/modules/redistimeseries/index-redistimeseries.mdx +++ b/docs/modules/redistimeseries/index-redistimeseries.mdx @@ -1,27 +1,27 @@ --- id: index-redistimeseries -title: RedisTimeSeries -sidebar_label: RedisTimeSeries +title: Redis Time Series +sidebar_label: Redis Time Series slug: /modules/redistimeseries --- -RedisTimeseries is a Redis module that enhances your experience managing time-series data with Redis.
It simplifies the use of Redis for time-series use cases such as internet of things (IoT) data, stock prices, and telemetry. With RedisTimeSeries, you can ingest and query millions of samples and events at the speed of Redis. Advanced tooling such as downsampling and aggregation ensure a small memory footprint without impacting performance. Use a variety of queries for visualization and monitoring with built-in connectors to popular monitoring tools like Grafana, Prometheus, and Telegraf. +Redis Time Series is a Redis module that enhances your experience managing time-series data with Redis. It simplifies the use of Redis for time-series use cases such as internet of things (IoT) data, stock prices, and telemetry. With Redis Time Series, you can ingest and query millions of samples and events at the speed of Redis. Advanced tooling such as downsampling and aggregation ensures a small memory footprint without impacting performance. Use a variety of queries for visualization and monitoring with built-in connectors to popular monitoring tools like Grafana, Prometheus, and Telegraf. ### Step 1. Register and subscribe -Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Enterprise Cloud +Follow [this link to register](/create/cloud/rediscloud) and subscribe to Redis Cloud ![Redistimeseries](redistimeseries.png) -### Step 2. Create a database with RedisTimeSeries Module +### Step 2. Create a database with Redis Time Series Module ![Redistimeseries](redistimeseries1.png) ### Step 3. Connect to a database -Follow [this](explore/redisinsight) link to know how to connect to a database +Follow [this](https://redis.io/docs/ui/insight/) link to know how to connect to a database -### Step 4. Getting Started with RedisTimeSeries +### Step 4. Getting Started with Redis Time Series This section will walk you through using some basic RedisTimeseries commands. You can run them from the Redis command-line interface (redis-cli) or use the CLI available in RedisInsight. (See part 2 of this tutorial to learn more about using the RedisInsight CLI.) Using a basic air-quality dataset, we will show you how to: @@ -30,11 +30,11 @@ Using a basic air-quality dataset, we will show you how to: - Add a new sample to the list of series - Query a range across one or multiple time series -![RedisTimeSeries](redistimeseriesflow.png) +![Redis Time Series](redistimeseriesflow.png) #### Create a new time series -Let’s create a time series representing air quality dataset measurements. To interact with RedisTimeSeries you will most often use the TS.RANGE command, but here you will create a time series per measurement using the TS.CREATE command. Once created, all the measurements will be sent using TS.ADD. +Let’s create a time series representing air quality dataset measurements. To interact with Redis Time Series you will most often use the TS.RANGE command, but here you will create a time series per measurement using the TS.CREATE command. Once created, all the measurements will be sent using TS.ADD. The sample command below creates a time series and populates it with three entries: @@ -141,4 +141,4 @@ You can use various aggregation types such as avg, sum, min, max, range, count, ### Next Steps -- Learn more about RedisTimeSeries in the [Quickstart](https://oss.redis.com/redistimeseries/) tutorial. +- Learn more about Redis Time Series in the [Quickstart](https://oss.redis.com/redistimeseries/) tutorial.
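To make the create/add/range flow above concrete, here is a minimal sketch using an illustrative key name (ts:carbon_monoxide) and millisecond timestamps; the final command averages the samples into one-minute buckets:

```
TS.CREATE ts:carbon_monoxide
TS.ADD ts:carbon_monoxide 1112596200001 2.4
TS.ADD ts:carbon_monoxide 1112596260001 2.1
TS.ADD ts:carbon_monoxide 1112596320001 2.2
TS.RANGE ts:carbon_monoxide - + AGGREGATION avg 60000
```

TS.RANGE accepts - and + as shorthand for the earliest and latest timestamps in the series, so the query above aggregates over the whole series.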
diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview.jpg b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview.jpg similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview.jpg rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview.jpg diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview1.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview1.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argo_preview1.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argo_preview1.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_1.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_1.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_1.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_1.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_10.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_10.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_10.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_10.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_11.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_11.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_11.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_11.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_12.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_12.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_12.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_12.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_13.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_13.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_13.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_13.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_14.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_14.png similarity index 100% rename from 
docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_14.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_14.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_15.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_15.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_15.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_15.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_16.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_16.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_16.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_16.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_17.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_17.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_17.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_17.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_18.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_18.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_18.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_18.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_2.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_2.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_2.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_2.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_21.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_21.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_21.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_21.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_3.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_3.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_3.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_3.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_4.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_4.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_4.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_4.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_5.png 
b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_5.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_5.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_5.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_6.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_6.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_6.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_6.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_7.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_7.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_7.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_7.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_8.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_8.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_8.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_8.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_9.png b/docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_9.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/argocd/images/argocd_9.png rename to docs/operate/_continuous-integration-continuous-deployment/argocd/images/argocd_9.png diff --git a/docs/operate/continuous-integration-continuous-deployment/argocd/index-argocd.mdx b/docs/operate/_continuous-integration-continuous-deployment/argocd/index-argocd.mdx similarity index 97% rename from docs/operate/continuous-integration-continuous-deployment/argocd/index-argocd.mdx rename to docs/operate/_continuous-integration-continuous-deployment/argocd/index-argocd.mdx index 2b43733d025..7853b343b4b 100644 --- a/docs/operate/continuous-integration-continuous-deployment/argocd/index-argocd.mdx +++ b/docs/operate/_continuous-integration-continuous-deployment/argocd/index-argocd.mdx @@ -6,6 +6,10 @@ slug: /operate/continuous-integration-continuous-deployment/argocd authors: [ajeet, talon] --- +import Authors from '@theme/Authors'; + + + ## What is an Argo CD? Argo CD is a combination of the two terms “Argo” and “CD,” [Argo](https://argoproj.github.io/) being an open source container-native workflow engine for Kubernetes. It is a [CNCF-hosted project](https://www.cncf.io/blog/2020/04/07/toc-welcomes-argo-into-the-cncf-incubator/) that provides an easy way to combine all three modes of computing—services, workflows, and event-based—all of which are very useful for creating jobs and applications on Kubernetes. It is an engine that makes it easy to specify, schedule, and coordinate the running of complex workflows and applications on Kubernetes. The CD in the name refers to [continuous delivery](https://en.wikipedia.org/wiki/Continuous_delivery), which is an extension of continuous integration (CI) since it automatically deploys all code changes to a testing and/or production environment after the build stage. 
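Because Argo CD drives deployments from declarative definitions stored in Git, a minimal sketch of the Application resource it reconciles may help make that concrete; the repository URL, path, and namespaces below are placeholders rather than values from this tutorial:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: example-app
  namespace: argocd          # namespace where Argo CD itself runs
spec:
  project: default
  source:
    repoURL: https://github.com/example/example-repo.git  # placeholder Git repository
    targetRevision: HEAD
    path: k8s                                             # directory holding the manifests
  destination:
    server: https://kubernetes.default.svc                # deploy into the same cluster
    namespace: demo
  syncPolicy:
    automated:
      prune: true      # delete resources that were removed from Git
      selfHeal: true   # revert drift detected in the cluster
```

Argo CD watches the Git repository referenced here and keeps the cluster state in sync with whatever the manifests in that path declare.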
@@ -18,7 +22,11 @@ Argo CD empowers organizations to declaratively build and run cloud native appli Built specifically to make the continuous deployment process to a Kubernetes cluster simpler and more efficient, Argo CD solves multiple challenges, such as the need to set up and install additional tools outside of Jenkins for a complete CI/CD process to Kubernetes, the need to configure access control to Kubernetes in and out (including cloud platforms), the need to have visibility of deployment status once a new app gets pushed to a Kubernetes cluster, and more. -Please note that Argo CD isn’t just deployed inside of Kubernetes but should be looked at as an extension of Kubernetes as it uses existing Kubernetes resources and functionalities like etcd and controllers to store data and monitor real-time updates of application state. +:::note + +Argo CD isn’t just deployed inside of Kubernetes but should be looked at as an extension of Kubernetes as it uses existing Kubernetes resources and functionalities like etcd and controllers to store data and monitor real-time updates of application state. + +::: ## How does Argo CD work? diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-auth.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-auth.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-auth.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-auth.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-config.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-config.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-config.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-config.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-config2.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-config2.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-config2.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-config2.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-demoapp.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-demoapp.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-demoapp.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-demoapp.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-login.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-login.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-login.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-login.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-merge.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-merge.png similarity index 100% 
rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-merge.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-merge.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-proj.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-proj.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-proj.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-proj.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-redis.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-redis.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circleci-redis.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circleci-redis.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/circlecidiagram.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/circlecidiagram.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/circlecidiagram.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/circlecidiagram.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-envir.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-envir.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-envir.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-envir.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-setup.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-setup.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-setup.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-setup.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-trigger.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-trigger.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/heroku-trigger.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/heroku-trigger.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/images/rate-limiting-example.png b/docs/operate/_continuous-integration-continuous-deployment/circleci/images/rate-limiting-example.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/circleci/images/rate-limiting-example.png rename to docs/operate/_continuous-integration-continuous-deployment/circleci/images/rate-limiting-example.png diff --git a/docs/operate/continuous-integration-continuous-deployment/circleci/index-circleci.mdx b/docs/operate/_continuous-integration-continuous-deployment/circleci/index-circleci.mdx similarity index 99% rename from 
docs/operate/continuous-integration-continuous-deployment/circleci/index-circleci.mdx rename to docs/operate/_continuous-integration-continuous-deployment/circleci/index-circleci.mdx index 8450948ce89..6a6157ff692 100644 --- a/docs/operate/continuous-integration-continuous-deployment/circleci/index-circleci.mdx +++ b/docs/operate/_continuous-integration-continuous-deployment/circleci/index-circleci.mdx @@ -6,6 +6,10 @@ slug: /operate/continuous-integration-continuous-deployment/circleci authors: [talon, ajeet] --- +import Authors from '@theme/Authors'; + + + ## **What is CircleCI?** ![CircleCI and Redis logos](images/circleci-redis.png) diff --git a/docs/operate/_continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx b/docs/operate/_continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx new file mode 100644 index 00000000000..8cca1b9982d --- /dev/null +++ b/docs/operate/_continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx @@ -0,0 +1,10 @@ +--- +id: index-continuous-integration-continuous-deployment +title: Continuous Integration/Deployment +sidebar_label: Overview +slug: /operate/continuous-integration-continuous-deployment +--- + +import RedisCard from '@theme/RedisCard'; + +The following links show you the different ways to embed Redis into your continuous integration and continuous deployment process. diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image1.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image1.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image1.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image1.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image10.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image10.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image10.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image10.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image11.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image11.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image11.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image11.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image12.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image12.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image12.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image12.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image13.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image13.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image13.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image13.png diff --git 
a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image14.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image14.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image14.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image14.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image15.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image15.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image15.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image15.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image16.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image16.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image16.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image16.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image17.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image17.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image17.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image17.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image18.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image18.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image18.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image18.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image19.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image19.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image19.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image19.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image2.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image2.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image2.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image2.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image20.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image20.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image20.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image20.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image3.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image3.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image3.png rename to 
docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image3.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image4.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image4.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image4.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image4.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image5.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image5.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image5.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image5.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image6.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image6.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image6.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image6.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image7.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image7.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image7.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image7.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image8.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image8.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image8.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image8.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/image9.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image9.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/image9.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/image9.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/images/status.png b/docs/operate/_continuous-integration-continuous-deployment/jenkins/images/status.png similarity index 100% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/images/status.png rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/images/status.png diff --git a/docs/operate/continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx b/docs/operate/_continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx similarity index 99% rename from docs/operate/continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx rename to docs/operate/_continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx index e2e85f75e56..5567345ef20 100644 --- a/docs/operate/continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx +++ b/docs/operate/_continuous-integration-continuous-deployment/jenkins/index-jenkins.mdx @@ -6,6 +6,10 @@ slug: 
/operate/continuous-integration-continuous-deployment/jenkins authors: [ajeet, matthew] --- +import Authors from '@theme/Authors'; + + + [Jenkins](https://www.jenkins.io/) is currently [the most popular CI tool](https://cd.foundation/announcement/2019/08/14/jenkins-celebrates-15-years/), with ~15M users. It is an open source automation server which enables developers to reliably build, test, and deploy their software. It was forked in 2011 from a project called Hudson after a [dispute with Oracle](https://www.infoq.com/news/2011/01/jenkins/), and is used for [Continuous Integration and Continuous Delivery (CI/CD)](https://stackoverflow.com/questions/28608015/continuous-integration-vs-continuous-delivery-vs-continuous-deployment) and test automation. Jenkins is based on Java and provides over [1700 plugins](https://plugins.jenkins.io/) to automate your developer workflow and save a lot of your time in executing your repetitive tasks. ![image](images/image1.png) diff --git a/docs/operate/_index-operate.mdx b/docs/operate/_index-operate.mdx new file mode 100644 index 00000000000..676998d6b9d --- /dev/null +++ b/docs/operate/_index-operate.mdx @@ -0,0 +1,33 @@ +--- +id: index-operate +title: Operate Your Redis Database +sidebar_label: Overview +slug: /operate +--- + +import RedisCard from '@theme/RedisCard'; + +The following links demonstrate various ways to provision Redis and accelerate app deployment using Devops. + + + docusaurus mascot + + +# Explore by Category + +
+
+ +
+
+ +
+
diff --git a/docs/operate/continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx b/docs/operate/continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx deleted file mode 100644 index 097a4459f96..00000000000 --- a/docs/operate/continuous-integration-continuous-deployment/index-continuous-integration-continuous-deployment.mdx +++ /dev/null @@ -1,36 +0,0 @@ ---- -id: index-continuous-integration-continuous-deployment -title: Continuous Integration/Deployment -sidebar_label: Overview -slug: /operate/continuous-integration-continuous-deployment ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links show you the different ways to embed Redis into your continuous integration and continuous deployment process. - -
-
- -
-
- -
-
-
-
- -
-
diff --git a/docs/operate/index-operate.mdx b/docs/operate/index-operate.mdx deleted file mode 100644 index 689ce421eb8..00000000000 --- a/docs/operate/index-operate.mdx +++ /dev/null @@ -1,55 +0,0 @@ ---- -id: index-operate -title: Operate Your Redis Database -sidebar_label: Overview -slug: /operate ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links demonstrate various ways to provision Redis and accelerate app deployment using Devops. - - - docusaurus mascot - - -# Explore by Category - -
-
- -
-
- -
-
- -
-
- -
-
- -
-
diff --git a/docs/operate/index-operate.mdx.orig b/docs/operate/index-operate.mdx.orig deleted file mode 100644 index 6cb8714398c..00000000000 --- a/docs/operate/index-operate.mdx.orig +++ /dev/null @@ -1,77 +0,0 @@ ---- -id: index-operate -title: Operating Your Redis Database -sidebar_label: Overview -slug: /operate ---- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; - -The following links provide information about how to operate your Redis database using DevOps tools. - - -
- -
-
- -## Top 5 Reasons why DevOps Teams Love Redis Enterprise -[---> Read the blog](https://redis.com/blog/why-devops-teams-love-redis-enterprise/) -
-
- -
-
- -## DevOps Tools -![DevOps logo](/img/logos/devopslogo.jpg) - -
-
-
- -# Explore by Category - -
-
- -
-
- -
-
- -
-
- -
-
- -
-
- - - diff --git a/docs/operate/observability/_index-observability.mdx b/docs/operate/observability/_index-observability.mdx new file mode 100644 index 00000000000..ba7fbafca09 --- /dev/null +++ b/docs/operate/observability/_index-observability.mdx @@ -0,0 +1,27 @@ +--- +id: index-observability +title: Observability +sidebar_label: Overview +slug: /operate/observability +--- + +import RedisCard from '@theme/RedisCard'; + +The following links demonstrate different ways in which you can observe key indicators critical to operating Redis. + +
+
+ +
+
+ +
+
diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image1.png b/docs/operate/observability/_prometheus/images/image1.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image1.png rename to docs/operate/observability/_prometheus/images/image1.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image2.png b/docs/operate/observability/_prometheus/images/image2.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image2.png rename to docs/operate/observability/_prometheus/images/image2.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image_3.png b/docs/operate/observability/_prometheus/images/image_3.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image_3.png rename to docs/operate/observability/_prometheus/images/image_3.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image_4.png b/docs/operate/observability/_prometheus/images/image_4.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image_4.png rename to docs/operate/observability/_prometheus/images/image_4.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image_6.png b/docs/operate/observability/_prometheus/images/image_6.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image_6.png rename to docs/operate/observability/_prometheus/images/image_6.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image_7.png b/docs/operate/observability/_prometheus/images/image_7.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image_7.png rename to docs/operate/observability/_prometheus/images/image_7.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/image_8.png b/docs/operate/observability/_prometheus/images/image_8.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/image_8.png rename to docs/operate/observability/_prometheus/images/image_8.png diff --git a/docs/howtos/redistimeseries/using-prometheus/images/prometheus.png b/docs/operate/observability/_prometheus/images/prometheus.png similarity index 100% rename from docs/howtos/redistimeseries/using-prometheus/images/prometheus.png rename to docs/operate/observability/_prometheus/images/prometheus.png diff --git a/docs/operate/observability/_prometheus/index-prometheus.mdx b/docs/operate/observability/_prometheus/index-prometheus.mdx new file mode 100644 index 00000000000..fde70dc064d --- /dev/null +++ b/docs/operate/observability/_prometheus/index-prometheus.mdx @@ -0,0 +1,217 @@ +--- +id: index-prometheus +title: How to monitor Redis with Prometheus and Grafana for Real-Time Analytics +sidebar_label: Using Time Series data model in Redis Stack along with Prometheus and Grafana +slug: /operate/observability/prometheus +authors: [ajeet] +--- + +import Authors from '@theme/Authors'; + + + +![My Image](images/prometheus.png) + +Time-series data is basically a series of data stored in time order and produced continuously over a long period of time. These measurements and events are tracked, monitored, downsampled, and aggregated over time. The events could be, for example, IoT sensor data. Every sensor is a source of time-series data. Each data point in the series stores the source information and other sensor measurements as labels. 
Data labels from every source may not conform to the same structure or order. + +A time-series database is a database system designed to store and retrieve such data for each point in time. Timestamped data can include data generated at regular intervals as well as data generated at unpredictable intervals. + +### When do you use a time-series database? + +- When your application needs data that accumulates quickly and your other databases aren’t designed to handle that scale. +- For financial or industrial applications. +- When your application needs to perform real-time analysis of billions of records. +- When your application needs to perform online queries at millisecond timescales, and support CPU-efficient ad-hoc queries. + +### Challenges with the existing traditional databases + +You might find numerous solutions that still store time-series data in a relational database, but they’re quite inefficient and come with their own set of drawbacks. A typical time-series database is usually built to only manage time-series data, hence one of the challenges it faces is with use cases that involve some sort of computation on top of time-series data. One good example could be capturing a live video feed in a time-series database. If you want to run an AI model for face recognition, you would have to extract the time-series data, apply some sort of data transformation and then do computation. +Relational databases carry the overhead of locking and synchronization that aren’t required for the immutable time-series data. This results in slower-than-required performance for both ingest and queries. When scaling out, it also means investing in additional compute resources. These databases enforce a rigid structure for labels and can’t accommodate unstructured data. They also require scheduled jobs for cleaning up old data. Beyond the time-series use case, these databases are also used for other use cases, which means overuse of running time-series queries may affect other workloads. + +### What is Redis Stack? + +[Redis Stack](https://redis.io/docs/stack/about/) extends the core capabilities of Redis OSS and provides a complete developer experience for debugging and more. In addition to all of the features of Redis OSS, Redis Stack supports: + +- Queryable JSON documents +- Querying across hashes and JSON documents +- Time series data support (ingestion & querying), including full-text search +- Probabilistic data structures + +### What is the Time Series data model in Redis Stack? + +[Redis Stack](https://redis.io/docs/stack/about/) supports a time-series data model that addresses the needs of handling time-series data. It removes the limitations enforced by relational databases and enables you to collect, manage, and deliver time-series data at scale. As an in-memory database, Redis can ingest over 500,000 records per second on a standard node. Our benchmarks show that you can ingest over 11.5 million records per second with a cluster of 16 Redis shards. + +Time Series support in Redis Stack is resource-efficient. With Redis, you can add rules to compact data by downsampling. For example, if you’ve collected more than one billion data points in a day, you could aggregate the data by every minute in order to downsample it, thereby reducing the dataset size to 1,440 data points (24 \* 60 = 1,440). You can also set data retention policies and expire the data by time when you don’t need it anymore. Redis allows you to aggregate data by average, minimum, maximum, sum, count, range, first, and last.
You can run over 100,000 aggregation queries per second with sub-millisecond latency. You can also perform reverse lookups on the labels in a specific time range. + +### Notable features related to Time Series in Redis Stack include: + +- High volume inserts, low latency reads +- Query by start time and end-time +- Aggregated queries (Min, Max, Avg, Sum, Range, Count, First, Last, STD.P, STD.S, Var.P, Var.S) for any time bucket +- Configurable maximum retention period +- Downsampling/Compaction - automatically updated aggregate time series +- Secondary index - each time series has labels (field value pairs) which allow you to query by labels + +### Why Prometheus? + +Prometheus is an open-source systems monitoring and alerting toolkit. It collects and stores its metrics as time series data, i.e. metrics information. The metrics are numeric measurements in a time series, meaning changes recorded over time. These metrics are stored with the timestamp at which they were recorded, alongside optional key-value pairs called labels. Metrics play an important role in understanding why your application is working in a certain way. + +### Prometheus remote storage adapter for Time Series data model of Redis Stack + +In the [Time Series Database over Redis](https://github.com/RedisTimeSeries) organization you can find projects that help you integrate Time Series data model of Redis Stack with other tools, including Prometheus and Grafana. The Prometheus remote storage adapter for Redis is hosted [on GitHub here](https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter). It’s basically a read/write adapter to use Redis Stack as a backend database. This time series adapter receives Prometheus metrics via remote write and writes them to Redis. + +## Getting Started + +### Prerequisites: + +- Install Git +- Install Docker +- Install Docker Compose + +### Step 1. Clone the repository + +``` + git clone https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter +``` + +### Step 2. Examining the Docker Compose File + +This Docker Compose file defines 4 services: + +1. Prometheus +2. Adapter +3. Grafana +4. Redis + +```yaml + version: '3' + services: + prometheus: + image: "prom/prometheus:v2.8.0" + command: ["--config.file=/prometheus.yml"] + volumes: + - ./prometheus.yaml:/prometheus.yml + ports: + - 9090:9090 + adapter: + image: "redislabs/prometheus-redistimeseries-adapter:master" + command: ["-redis-address", "redis:6379", "-web.listen-address", "0.0.0.0:9201"] + redis: + image: "redislabs/redistimeseries:edge" + ports: + - "6379:6379" + grafana: + build: ./grafana/ + ports: + - "3000:3000" +``` + +#### Prometheus + +The `prometheus` service directly uses an image “prom/prometheus” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, 9090. The Prometheus configuration file is accessed by mounting the volume on the host and container. + +#### Storage Adapter + +The `adapter` service uses an image “`redislabs/prometheus-redistimeseries-adapter:master`” that’s pulled from Docker Hub. It sets the default command for the container (`-redis-address redis:6379 -web.listen-address 0.0.0.0:9201`), so the adapter connects to Redis and listens on the address 0.0.0.0:9201. + +#### Redis + +The `Redis` service directly uses an image “`redislabs/redistimeseries:edge`” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, `6379`. + +#### Grafana + +The `grafana` service uses an image that’s built from the `Dockerfile` in the current directory.
It then binds the container and the host machine to the exposed port, `3000`. + +### Step 3. Run the Docker Compose + +Change directory to compose and execute the following command: + +```bash + docker-compose up -d +``` + +```bash + docker-compose ps + NAME COMMAND SERVICE STATUS PORTS + compose-adapter-1 "/adapter/redis-ts-a…" adapter running + compose-grafana-1 "/run.sh" grafana running 0.0.0.0:3000->3000/tcp + compose-prometheus-1 "/bin/prometheus --c…" prometheus running 0.0.0.0:9090->9090/tcp + compose-redis-1 "docker-entrypoint.s…" redis running 0.0.0.0:6379->6379/tcp +``` + +### Step 4. Accessing Grafana + +Open `http://hostIP:3000` to access the Grafana dashboard. The default username and password is admin/admin. + +### Step 5. Add Prometheus Data Source + +In the left sidebar, you will see the “Configuration” option. Select “Data Source” and choose Prometheus. + +![Adding the Prometheus data source](images/image1.png) + +Click “Save and Test”. + +### Step 6. Importing Prometheus Data Source + +Click on “Import” for all the Prometheus dashboards. + +![Importing the Prometheus data source](images/image2.png) + +### Step 7. Adding Redis Data Source + +Again, click on “Data Sources” and add Redis. + +![Adding the Redis data source](images/image_3.png) + +Click "Import". + +![Importing the Redis data source](images/image_4.png) + +### Step 8. Running the Sensor Script + +It’s time to test drive a few demo scripts built by the Redis team. To start with, clone the following repository: + +``` + git clone https://github.com/RedisTimeSeries/prometheus-demos +``` + +This repo contains a set of basic demos showcasing the integration of Time Series data model of Redis Stack with Prometheus and Grafana. Let’s pick up a sensor script. + +``` + python3 weather_station/sensors.py +``` + +This script will add random measurements for temperature and humidity for a number of sensors. + +Go to “Add Panel” on the top right corner of the Grafana dashboard and start adding temperature and humidity values. + +![alt_text](images/image_7.png) + +### Step 9. Accessing Prometheus Dashboard + +Open up `https://HOSTIP:9090` to access the Prometheus dashboard for the sensor values without any further configuration. 
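Behind the scenes, the samples reach Redis because the prometheus.yml mounted in Step 2 points Prometheus remote reads and writes at the adapter service. A minimal sketch of that file, assuming the service names from the compose file above and the adapter's conventional /write and /read endpoints (the scrape job shown is only illustrative):

```yaml
global:
  scrape_interval: 15s

# Send every sample to the Redis adapter and read historical data back through it.
remote_write:
  - url: 'http://adapter:9201/write'
remote_read:
  - url: 'http://adapter:9201/read'

# Illustrative scrape job: Prometheus scraping its own metrics endpoint.
scrape_configs:
  - job_name: 'prometheus'
    static_configs:
      - targets: ['localhost:9090']
```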
+ +![Accessing the Prometheus dashboard](images/image_8.png) + +### Further References: + +- [Prometheus remote storage adapter for Time Series with Redis Stack](https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter) +- [Remote Storage Integration](https://prometheus.io/docs/prometheus/latest/storage/#remote-storage-integrations) +- [Time Series Demos with Redis Stack](https://github.com/RedisTimeSeries/prometheus-demos) + +## + + diff --git a/docs/explore/redisexplorer/cluster_databases_dashboard.png b/docs/operate/observability/_redisexplorer/cluster_databases_dashboard.png similarity index 100% rename from docs/explore/redisexplorer/cluster_databases_dashboard.png rename to docs/operate/observability/_redisexplorer/cluster_databases_dashboard.png diff --git a/docs/explore/redisexplorer/cluster_nodes.png b/docs/operate/observability/_redisexplorer/cluster_nodes.png similarity index 100% rename from docs/explore/redisexplorer/cluster_nodes.png rename to docs/operate/observability/_redisexplorer/cluster_nodes.png diff --git a/docs/explore/redisexplorer/cluster_nodes_dashboard.png b/docs/operate/observability/_redisexplorer/cluster_nodes_dashboard.png similarity index 100% rename from docs/explore/redisexplorer/cluster_nodes_dashboard.png rename to docs/operate/observability/_redisexplorer/cluster_nodes_dashboard.png diff --git a/docs/explore/redisexplorer/cluster_overview.png b/docs/operate/observability/_redisexplorer/cluster_overview.png similarity index 100% rename from docs/explore/redisexplorer/cluster_overview.png rename to docs/operate/observability/_redisexplorer/cluster_overview.png diff --git a/docs/explore/redisexplorer/cluster_overview_dashboard.png b/docs/operate/observability/_redisexplorer/cluster_overview_dashboard.png similarity index 100% rename from docs/explore/redisexplorer/cluster_overview_dashboard.png rename to docs/operate/observability/_redisexplorer/cluster_overview_dashboard.png diff --git a/docs/explore/redisexplorer/datasource.png b/docs/operate/observability/_redisexplorer/datasource.png similarity index 100% rename from docs/explore/redisexplorer/datasource.png rename to docs/operate/observability/_redisexplorer/datasource.png diff --git a/docs/explore/redisexplorer/enterprise_cluster_dashboard.png b/docs/operate/observability/_redisexplorer/enterprise_cluster_dashboard.png similarity index 100% rename from docs/explore/redisexplorer/enterprise_cluster_dashboard.png rename to docs/operate/observability/_redisexplorer/enterprise_cluster_dashboard.png diff --git a/docs/explore/redisexplorer/explorer_options.png b/docs/operate/observability/_redisexplorer/explorer_options.png similarity index 100% rename from docs/explore/redisexplorer/explorer_options.png rename to docs/operate/observability/_redisexplorer/explorer_options.png diff --git a/docs/explore/redisexplorer/grafana.png b/docs/operate/observability/_redisexplorer/grafana.png similarity index 100% rename from docs/explore/redisexplorer/grafana.png rename to docs/operate/observability/_redisexplorer/grafana.png diff --git a/docs/operate/observability/_redisexplorer/index-redisexplorer.mdx b/docs/operate/observability/_redisexplorer/index-redisexplorer.mdx new file mode 100644 index 00000000000..ed5fe3a2dd1 --- /dev/null +++ b/docs/operate/observability/_redisexplorer/index-redisexplorer.mdx @@ -0,0 +1,130 @@ +--- +id: index-redisexplorer +title: How to create Grafana Dashboards for Redis Enterprise cluster in 5 Minutes +sidebar_label: Grafana Dashboards for Redis Enterprise Cluster +slug: 
/operate/observability/redisexplorer
+authors: [ajeet]
+---
+
+import Authors from '@theme/Authors';
+
+<Authors frontMatter={frontMatter} />
+
+A Redis Enterprise cluster is a set of nodes, typically two or more, that provides database services. Clusters are inherently multi-tenant, and a single cluster can manage multiple databases, each accessed through its own endpoint. Redis Enterprise software provides a REST API to retrieve information about clusters, databases, nodes, and metrics.
+
+Redis Explorer is a Grafana plugin, available from Grafana Labs, that adds support for Redis Enterprise software. It connects to Redis Enterprise clusters through this REST API and provides application pages for adding Redis Data Sources for managed databases, as well as dashboards for inspecting the cluster configuration.
+
+![The Redis Explorer plugin](redisexplorer.png)
+
+Redis Explorer provides the following dashboards in Grafana:
+
+#### Enterprise Clusters Dashboard
+
+The Enterprise Clusters dashboard provides basic information about the cluster and its license, and displays the most important metrics.
+
+![Enterprise Cluster Dashboard](enterprise_cluster_dashboard.png)
+
+#### Cluster Overview Dashboard
+
+The Cluster Overview dashboard provides the most important information and metrics for the selected cluster.
+
+![The Cluster Overview Dashboard](cluster_overview_dashboard.png)
+
+#### Cluster Nodes Dashboard
+
+The Cluster Nodes dashboard provides information and metrics for each node participating in the cluster.
+
+![The Cluster Nodes dashboard](cluster_nodes_dashboard.png)
+
+#### Cluster Databases Dashboard
+
+The Cluster Databases dashboard provides information and metrics for each database managed by the cluster.
+
+![The Cluster databases dashboard](cluster_databases_dashboard.png)
+
+### Getting Started
+
+### Prerequisites
+
+- Grafana 8.0+ is required for Redis Explorer 2.X.
+- Grafana 7.1+ is required for Redis Explorer 1.X.
+- Docker
+- Redis Enterprise Cluster
+
+### Step 1. Set up a Redis Enterprise Cluster
+
+[Follow these steps](/create/docker/) to set up Redis Enterprise cluster nodes.
+
+![Set up Redis Enterprise](tryfree1.png)
+
+![Redis Enterprise Cluster](redis_enterprise_cluster.png)
+
+### Step 2. Install Grafana
+
+```bash
+ brew install grafana
+```
+
+### Step 3. Install redis-explorer-app
+
+Use the grafana-cli tool to install the plugin from the command line.
+The Redis Application plugin and Redis Data Source will be installed automatically as dependencies.
+
+```bash
+ grafana-cli plugins install redis-explorer-app
+```
+
+### Step 4. Using Docker
+
+Alternatively, you can run the Redis Explorer plugin using Docker:
+
+```bash
+ docker run -p 3000:3000 --name=grafana -e "GF_INSTALL_PLUGINS=redis-explorer-app" grafana/grafana
+```
+
+Open `http://IP:3000` to access Grafana. The default username/password is admin/admin.
+
+### Step 5. Log in to Grafana
+
+![Logging into Grafana](grafana.png)
+
+### Step 6. Choose Redis Explorer in the sidebar
+
+Once you add the data source, you should be able to choose the right option:
+
+![Explorer Options](explorer_options.png)
+
+### Step 7. Getting the Redis Enterprise Cluster Overview
+
+![Redis Enterprise Cluster Overview](cluster_overview.png)
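+
+If a dashboard comes up empty, it is worth confirming that the cluster REST API (which Redis Explorer queries) is reachable from the Grafana host. The sketch below is a minimal check, assuming the default REST API port 9443 and the admin credentials you created in Step 1; replace `CLUSTER_IP`, the user name, and the password with your own values:
+
+```bash
+ # Retrieve basic cluster information (the same REST API the Redis Explorer plugin uses)
+ curl -k -u "admin@example.com:password" https://CLUSTER_IP:9443/v1/cluster
+
+ # List the databases managed by the cluster
+ curl -k -u "admin@example.com:password" https://CLUSTER_IP:9443/v1/bdbs
+```
+
+### Step 8. 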
Displaying the Redis Enterprise Cluster Nodes + +![Redis Enterprise Cluster Nodes](cluster_nodes.png) + +### Further References + +- [Redis Explorer plugin for Grafana](https://grafana.com/grafana/plugins/redis-explorer-app/) +- [Redis Plugins for Grafana Quickstart Guide](https://redisgrafana.github.io/quickstart/) +- [Introducing the Redis Data Source Plug-in for Grafana](https://redis.com/blog/how-to-use-the-new-redis-data-source-for-grafana-plug-in/) +- [How to Use the New Redis Data Source for Grafana Plug-in](https://redis.com/blog/how-to-use-the-new-redis-data-source-for-grafana-plug-in/) +- [3 Real-Life Apps Built with Redis Data Source for Grafana](https://redis.com/blog/3-real-life-apps-built-with-redis-data-source-for-grafana/) +- [How to Manage Real-Time IoT Sensor Data in Redis](https://redis.com/blog/how-to-manage-real-time-iot-sensor-data-in-redis/) +- [Real-time observability with Redis and Grafana](https://grafana.com/go/observabilitycon/real-time-observability-with-redis-and-grafana/) + +## + + diff --git a/docs/explore/redisexplorer/redis_enterprise_cluster.png b/docs/operate/observability/_redisexplorer/redis_enterprise_cluster.png similarity index 100% rename from docs/explore/redisexplorer/redis_enterprise_cluster.png rename to docs/operate/observability/_redisexplorer/redis_enterprise_cluster.png diff --git a/docs/explore/redisexplorer/redisexplorer.png b/docs/operate/observability/_redisexplorer/redisexplorer.png similarity index 100% rename from docs/explore/redisexplorer/redisexplorer.png rename to docs/operate/observability/_redisexplorer/redisexplorer.png diff --git a/docs/explore/redisexplorer/tryfree1.png b/docs/operate/observability/_redisexplorer/tryfree1.png similarity index 100% rename from docs/explore/redisexplorer/tryfree1.png rename to docs/operate/observability/_redisexplorer/tryfree1.png diff --git a/docs/operate/observability/datadog/index-datadog.mdx b/docs/operate/observability/datadog/index-datadog.mdx index ab870943463..c0767209d88 100644 --- a/docs/operate/observability/datadog/index-datadog.mdx +++ b/docs/operate/observability/datadog/index-datadog.mdx @@ -6,6 +6,10 @@ slug: /operate/observability/datadog authors: [ajeet, christian] --- +import Authors from '@theme/Authors'; + + + ![Datadog](images/datadog-redis.png) Devops and SRE practitioners are already keenly aware of the importance of system reliability, as it’s one of the shared goals in every high performing organization. Defining clear reliability targets based on solid data is crucial for productive collaboration between developers and SREs. This need spans the entire infrastructure from application to backend database services. diff --git a/docs/operate/observability/index-observability.mdx b/docs/operate/observability/index-observability.mdx deleted file mode 100644 index f017b2c8ca7..00000000000 --- a/docs/operate/observability/index-observability.mdx +++ /dev/null @@ -1,43 +0,0 @@ ---- -id: index-observability -title: Observability -sidebar_label: Overview -slug: /operate/observability ---- - -import RedisCard from '@site/src/theme/RedisCard'; - -The following links demonstrate different ways in which you can observe key indicators critical to operating Redis. - -
-
- -
-
- -
-
- -
-
-
-
- -
-
diff --git a/docs/operate/observability/prometheus/images/image1.png b/docs/operate/observability/prometheus/images/image1.png deleted file mode 100644 index dc183d7d99c..00000000000 Binary files a/docs/operate/observability/prometheus/images/image1.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image2.png b/docs/operate/observability/prometheus/images/image2.png deleted file mode 100644 index e51a5aba351..00000000000 Binary files a/docs/operate/observability/prometheus/images/image2.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image_3.png b/docs/operate/observability/prometheus/images/image_3.png deleted file mode 100644 index 7fd95157ebe..00000000000 Binary files a/docs/operate/observability/prometheus/images/image_3.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image_4.png b/docs/operate/observability/prometheus/images/image_4.png deleted file mode 100644 index 9579b227571..00000000000 Binary files a/docs/operate/observability/prometheus/images/image_4.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image_6.png b/docs/operate/observability/prometheus/images/image_6.png deleted file mode 100644 index d92dfaff4dc..00000000000 Binary files a/docs/operate/observability/prometheus/images/image_6.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image_7.png b/docs/operate/observability/prometheus/images/image_7.png deleted file mode 100644 index 562d2f124f4..00000000000 Binary files a/docs/operate/observability/prometheus/images/image_7.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/image_8.png b/docs/operate/observability/prometheus/images/image_8.png deleted file mode 100644 index 423880f6e0b..00000000000 Binary files a/docs/operate/observability/prometheus/images/image_8.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/images/prometheus.png b/docs/operate/observability/prometheus/images/prometheus.png deleted file mode 100644 index 22f3c6928fd..00000000000 Binary files a/docs/operate/observability/prometheus/images/prometheus.png and /dev/null differ diff --git a/docs/operate/observability/prometheus/index-prometheus.mdx b/docs/operate/observability/prometheus/index-prometheus.mdx deleted file mode 100644 index ab7480c3ee1..00000000000 --- a/docs/operate/observability/prometheus/index-prometheus.mdx +++ /dev/null @@ -1,206 +0,0 @@ ---- -id: index-prometheus -title: How to monitor Redis with Prometheus and Grafana for Real-Time Analytics -sidebar_label: Using RedisTimeSeries with Prometheus and Grafana -slug: /operate/observability/prometheus -authors: [ajeet] ---- - -![My Image](images/prometheus.png) - -Time-series data is basically a series of data stored in time order and produced continuously over a long period of time. These measurements and events are tracked, monitored, downsampled, and aggregated over time. The events could be, for example, IoT sensor data. Every sensor is a source of time-series data. Each data point in the series stores the source information and other sensor measurements as labels. Data labels from every source may not conform to the same structure or order. - -A time-series database is a database system designed to store and retrieve such data for each point in time. Timestamped data can include data generated at regular intervals as well as data generated at unpredictable intervals. 
- -### When do you use a time-series database? - -- When your application needs data that accumulates quickly and your other databases aren’t designed to handle that scale. -- For financial or industrial applications. -- When your application needs to perform real-time analysis of billions of records. -- When your application needs to perform online queries at millisecond timescales, and support CPU-efficient ad-hoc queries. - -### Challenges with the existing traditional databases - -You might find numerous solutions that still store time-series data in a relational database, but they’re quite inefficient and come with their own set of drawbacks. A typical time-series database is usually built to only manage time-series data, hence one of the challenges it faces is with use cases that involve some sort of computation on top of time-series data. One good example could be capturing a live video feed in a time-series database. If you want to run an AI model for face recognition, you would have to extract the time-series data, apply some sort of data transformation and then do computation. -Relational databases carry the overhead of locking and synchronization that aren’t required for the immutable time-series data. This results in slower-than-required performance for both ingest and queries. When scaling out, it also means investing in additional compute resources. These databases enforce a rigid structure for labels and can’t accommodate unstructured data. They also require scheduled jobs for cleaning up old data. Beyond the time-series use case, these databases are also used for other use cases, which means overuse of running time-series queries may affect other workloads. - -### What is RedisTimeSeries? - -RedisTimeSeries is a purpose-built time-series database that addresses the needs of handling time-series data. It removes the limitations enforced by relational databases and enables you to collect, manage, and deliver time-series data at scale. As an in-memory database, RedisTimeSeries can ingest over 500,000 records per second on a standard node. Our benchmarks show that you can ingest over 11.5 million records per second with a cluster of 16 Redis shards. - -RedisTimeSeries is resource-efficient. With RedisTimeSeries, you can add rules to compact data by downsampling. For example, if you’ve collected more than one billion data points in a day, you could aggregate the data by every minute in order to downsample it, thereby reducing the dataset size to 1,440 data points (24 \* 60 = 1,440). You can also set data retention policies and expire the data by time when you don’t need them anymore. RedisTimeSeries allows you to aggregate data by average, minimum, maximum, sum, count, range, first, and last. You can run over 100,000 aggregation queries per second with sub-millisecond latency. You can also perform reverse lookups on the labels in a specific time range. - -### Notables features of RedisTimeseries includes: - -- High volume inserts, low latency reads -- Query by start time and end-time -- Aggregated queries (Min, Max, Avg, Sum, Range, Count, First, Last, STD.P, STD.S, Var.P, Var.S) for any time bucket -- Configurable maximum retention period -- Downsampling/Compaction - automatically updated aggregate time series -- Secondary index - each time series has labels (field value pairs) which will allows to query by labels - -### Why Prometheus? - -Prometheus is an open-source systems monitoring and alerting toolkit. It collects and stores its metrics as time series data, i.e. 
metrics information. The metrics are numeric measurements in a time series, meaning changes recorded over time. These metrics are stored with the timestamp at which it was recorded, alongside optional key-value pairs called labels. Metrics play an important role in understanding why your application is working in a certain way. - -### Prometheus remote storage adapter for RedisTimeSeries - -In the RedisTimeSeries organization you can find projects that help you integrate RedisTimeSeries with other tools, including Prometheus and Grafana. The Prometheus remote storage adapter for RedisTimeSeries is available and the project is hosted [on GitHub here](https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter.) It’s basically a read/write adapter to use RedisTimeSeries as a backend database. RedisTimeSeries Adapter receives Prometheus metrics via the remote write, and writes to Redis with the RedisTimeSeries module. - -## Getting Started - -### Prerequisites: - -- Install GIT -- Install Docker -- Install Docker Compose - -### Step 1. Clone the repository - -``` - git clone https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter -``` - -### Step 2. Examining the Docker Compose File - -This Docker compose defines 4 services - - -1. Prometheus -2. Adapter -3. Grafana -4. Redis - -```yaml - version: '3' - services: - prometheus: - image: "prom/prometheus:v2.8.0" - command: ["--config.file=/prometheus.yml"] - volumes: - - ./prometheus.yaml:/prometheus.yml - ports: - - 9090:9090 - adapter: - image: "redislabs/prometheus-redistimeseries-adapter:master" - command: ["-redis-address", "redis:6379", "-web.listen-address", "0.0.0.0:9201"] - redis: - image: "redislabs/redistimeseries:edge" - ports: - - "6379:6379" - grafana: - build: ./grafana/ - ports: - - "3000:3000" -``` - -#### Prometheus - -The `prometheus` service directly uses an image “prom/prometheus” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, 9090. The Prometheus configuration file is accessed by mounting the volume on the host and container. - -#### Storage Adapter - -The `adapter` service uses an image “`redislabs/prometheus-redistimeseries-adapter:master`” that’s pulled from Docker Hub. Sets the default command for the container: `-redis-address", "redis:6379 and listen to the address 0.0.0.0:9201. ` - -#### Redis - -The `Redis` service directly uses an image “`redislabs/redistimeseries:edge`” that’s pulled from Docker Hub. It then binds the container and the host machine to the exposed port, `6379` - -#### Grafana - -The `grafana` service uses an image that’s built from the `Dockerfile` in the current directory. It then binds the container and the host machine to the exposed port, `3000`. - -### Step 3. Run the Docker Compose - -Change directory to compose and execute the following command: - -```bash - docker-compose up -d -``` - -```bash - docker-compose ps - NAME COMMAND SERVICE STATUS PORTS - compose-adapter-1 "/adapter/redis-ts-a…" adapter running - compose-grafana-1 "/run.sh" grafana running 0.0.0.0:3000->3000/tcp - compose-prometheus-1 "/bin/prometheus --c…" prometheus running 0.0.0.0:9090->9090/tcp - compose-redis-1 "docker-entrypoint.s…" redis running 0.0.0.0:6379->6379/tcp -``` - -### Step 4. Accessing Grafana - -Open `http://hostIP:3000` to access the Grafana dashboard. The default username and password is admin/admin. - -### Step 5. Add Prometheus Data Source - -In the left sidebar, you will see the “Configuration” option. 
Select “Data Source” and choose Prometheus. - -![Adding the Prometheus data source](images/image1.png) - -Click “Save and Test”. - -### Step 6. Importing Prometheus Data Source - -Click on “Import” for all the Prometheus dashboards. - -![Importing the Prometheus data source](images/image2.png) - -### Step 7. Adding Redis Data Source - -Again, click on “Data Sources” and add Redis. - -![Adding the Redis data source](images/image_3.png) - -Click "Import". - -![Importing the Redis data source](images/image_4.png) - -### Step 8. Running the Sensor Script - -It’s time to test drive a few demo scripts built by the Redis team. To start with, clone the following repository: - -``` - git clone https://github.com/RedisTimeSeries/prometheus-demos -``` - -This repo contains a set of basic demoes showcasing the integration of RedisTimeSeries with Prometheus and Grafana. Let’s pick up a sensor script. - -``` - python3 weather_station/sensors.py -``` - -This script will add random measurements for temperature and humidity for a number of sensors. - -Go to “Add Panel” on the top right corner of the Grafana dashboard and start adding temperature and humidity values. - -![alt_text](images/image_7.png) - -### Step 9. Accessing Prometheus Dashboard - -Open up `https://HOSTIP:9090` to access the Prometheus dashboard for the sensor values without any further configuration. - -![Accessing the Prometheus dashboard](images/image_8.png) - -### Further References: - -- [Prometheus remote storage adapter for RedisTimeSeries](https://github.com/RedisTimeSeries/prometheus-redistimeseries-adapter) -- [Remote Storage Integration](https://prometheus.io/docs/prometheus/latest/storage/#remote-storage-integrations) -- [RedisTimeSeries Demos](https://github.com/RedisTimeSeries/prometheus-demos) - -## - - diff --git a/docs/operate/observability/redisdatasource/index-redisdatasource.mdx b/docs/operate/observability/redisdatasource/index-redisdatasource.mdx index 711e869ade0..ac3fbf55402 100644 --- a/docs/operate/observability/redisdatasource/index-redisdatasource.mdx +++ b/docs/operate/observability/redisdatasource/index-redisdatasource.mdx @@ -8,8 +8,9 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + The Redis Data Source for Grafana is a plug-in that allows users to connect to the Redis database and build dashboards in Grafana to easily monitor Redis and application data. It provides an out-of-the-box predefined dashboard, but also lets you build customized dashboards tuned to your specific needs. @@ -22,10 +23,10 @@ The Redis Data Source for Grafana is a plug-in that allows users to connect to t - Redis Cluster and Sentinel supported since version 1.2. 
- Data Source supports: - - [RedisTimeSeries](https://oss.redis.com/redistimeseries/): `TS.GET`, `TS.INFO`, `TS.MRANGE`, `TS.QUERYINDEX`, `TS.RANGE` - - [RedisGears](https://oss.redis.com/redisgears/): `RG.DUMPREGISTRATIONS`, `RG.PYEXECUTE`, `RG.PYSTATS` - - [RedisSearch](https://oss.redis.com/redisearch/): `FT.INFO` - - [RedisGraph](https://oss.redis.com/redisgraph/): `GRAPH.QUERY`, `GRAPH.SLOWLOG` + - [Redis Time Series](https://oss.redis.com/redistimeseries/): `TS.GET`, `TS.INFO`, `TS.MRANGE`, `TS.QUERYINDEX`, `TS.RANGE` + - [Triggers and Functions](https://oss.redis.com/redisgears/): `RG.DUMPREGISTRATIONS`, `RG.PYEXECUTE`, `RG.PYSTATS` + - [Search](https://oss.redis.com/redisearch/): `FT.INFO` + - [Graph](https://oss.redis.com/redisgraph/): `GRAPH.QUERY`, `GRAPH.SLOWLOG` - - - Redis Launchpad - - - diff --git a/docs/operate/observability/redisexplorer/redis_enterprise_cluster.png b/docs/operate/observability/redisexplorer/redis_enterprise_cluster.png deleted file mode 100644 index 87b885e8d18..00000000000 Binary files a/docs/operate/observability/redisexplorer/redis_enterprise_cluster.png and /dev/null differ diff --git a/docs/operate/observability/redisexplorer/redisexplorer.png b/docs/operate/observability/redisexplorer/redisexplorer.png deleted file mode 100644 index 54f58130121..00000000000 Binary files a/docs/operate/observability/redisexplorer/redisexplorer.png and /dev/null differ diff --git a/docs/operate/observability/redisexplorer/tryfree1.png b/docs/operate/observability/redisexplorer/tryfree1.png deleted file mode 100644 index f16da8a6b15..00000000000 Binary files a/docs/operate/observability/redisexplorer/tryfree1.png and /dev/null differ diff --git a/docs/operate/orchestration/docker/index-docker.mdx b/docs/operate/orchestration/docker/index-docker.mdx index 619f6f44c8b..6eafa748e02 100644 --- a/docs/operate/orchestration/docker/index-docker.mdx +++ b/docs/operate/orchestration/docker/index-docker.mdx @@ -8,8 +8,9 @@ authors: [ajeet] import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + - Redis Launchpad - diff --git a/docs/operate/orchestration/index-orchestration.mdx b/docs/operate/orchestration/index-orchestration.mdx index 2f2a162c4f7..ef36fcd7b94 100644 --- a/docs/operate/orchestration/index-orchestration.mdx +++ b/docs/operate/orchestration/index-orchestration.mdx @@ -5,7 +5,7 @@ sidebar_label: Overview slug: /operate/orchestration --- -import RedisCard from '@site/src/theme/RedisCard'; +import RedisCard from '@theme/RedisCard'; The following links show you with the various ways to connect your containerized workloads to Redis. diff --git a/docs/operate/orchestration/kubernetes-gke/index-kubernetes-gke.mdx b/docs/operate/orchestration/kubernetes-gke/index-kubernetes-gke.mdx index c61d1cd42e5..fba7151c82a 100644 --- a/docs/operate/orchestration/kubernetes-gke/index-kubernetes-gke.mdx +++ b/docs/operate/orchestration/kubernetes-gke/index-kubernetes-gke.mdx @@ -6,10 +6,9 @@ slug: /operate/orchestration/kubernetes/kubernetes-gke authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ### Step 1. 
Prerequisites @@ -150,13 +149,11 @@ Open `https://localhost:8443` in the browser to see the Redis Enterprise Softwar target="_blank" rel="noopener" className="link"> - Redis Launchpad - diff --git a/docs/operate/orchestration/kubernetes-operator/index-kubernetes-operator.mdx b/docs/operate/orchestration/kubernetes-operator/index-kubernetes-operator.mdx index 0623f5db493..9bcebd0039c 100644 --- a/docs/operate/orchestration/kubernetes-operator/index-kubernetes-operator.mdx +++ b/docs/operate/orchestration/kubernetes-operator/index-kubernetes-operator.mdx @@ -6,10 +6,9 @@ slug: /operate/orchestration/kubernetes-operator authors: [ajeet] --- -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; -import useBaseUrl from '@docusaurus/useBaseUrl'; -import RedisCard from '@site/src/theme/RedisCard'; +import Authors from '@theme/Authors'; + + ![My Image](images/image1.png) diff --git a/docs/operate/orchestration/nodejs-nginx-redis/index-nodejs-nginx-redis.mdx b/docs/operate/orchestration/nodejs-nginx-redis/index-nodejs-nginx-redis.mdx index f1bd469e136..69c1797bb62 100644 --- a/docs/operate/orchestration/nodejs-nginx-redis/index-nodejs-nginx-redis.mdx +++ b/docs/operate/orchestration/nodejs-nginx-redis/index-nodejs-nginx-redis.mdx @@ -6,6 +6,10 @@ slug: /operate/docker/nodejs-nginx-redis authors: [ajeet] --- +import Authors from '@theme/Authors'; + + + Thanks to [Node.js](https://nodejs.dev/) - Millions of frontend developers that write JavaScript for the browser are now able to write the server-side code in addition to the client-side code without the need to learn a completely different language. Node.js is a free, open-sourced, cross-platform JavaScript run-time environment. It is capable to handle thousands of concurrent connections with a single server without introducing the burden of managing thread concurrency, which could be a significant source of bugs. ![Nginx-node](docker_nginx.png) diff --git a/docs/operate/provisioning/azure-cache-terraform-private/index-azure-cache-terraform-private.mdx b/docs/operate/provisioning/_azure-cache-terraform-private/index-azure-cache-terraform-private.mdx similarity index 82% rename from docs/operate/provisioning/azure-cache-terraform-private/index-azure-cache-terraform-private.mdx rename to docs/operate/provisioning/_azure-cache-terraform-private/index-azure-cache-terraform-private.mdx index a7743cdc68b..28d0806ee84 100644 --- a/docs/operate/provisioning/azure-cache-terraform-private/index-azure-cache-terraform-private.mdx +++ b/docs/operate/provisioning/_azure-cache-terraform-private/index-azure-cache-terraform-private.mdx @@ -1,13 +1,13 @@ --- id: index-azure-cache-terraform-private -title: Azure Cache for Redis Enterprise using Terraform with Private Link -sidebar_label: Azure Cache for Redis Enterprise using Terraform with Private Link +title: Azure Cache for Redis Cloud using Terraform with Private Link +sidebar_label: Azure Cache for Redis Cloud using Terraform with Private Link slug: /operate/provisioning/azure-cache-terraform-private --- Azure Private Link for Azure Cache for Redis provides private connectivity from a virtual network to your cache instance. This means that you can now use Azure Private Link to connect to an Azure Cache for Redis instance from your virtual network via a private endpoint, which is assigned a private IP address in a subnet within the virtual network. It simplifies the network architecture and secures the connection between endpoints in Azure by eliminating data exposure to the public internet. 
Private Link carries traffic privately, reducing your exposure to threats and helps you meet compliance standards. -Azure Resource Manager (a.k.a AzureRM) is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You can use management features, like access control, locks, and tags, to secure and organize your resources after deployment. The "azurerm_redis_enterprise_cluster" is a resource that manages a Redis Enterprise cluster. This is a template to get started with the 'azurerm_redis_enterprise_cluster' resource available in the 'azurerm' provider with Terraform. +Azure Resource Manager (a.k.a AzureRM) is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account. You can use management features, like access control, locks, and tags, to secure and organize your resources after deployment. The "azurerm_redis_enterprise_cluster" is a resource that manages a Redis Cloud cluster. This is a template to get started with the 'azurerm_redis_enterprise_cluster' resource available in the 'azurerm' provider with Terraform. ### Prerequisites @@ -104,7 +104,7 @@ terraform output redisgeek_config allowfullscreen> -##### 2. Do More with Azure Cache for Redis, Enterprise Tiers +##### 2. Do More with Azure Cache for Redis, Cloud Tiers
-##### Do More with Azure Cache for Redis, Enterprise Tiers +##### Do More with Azure Cache for Redis, Cloud Tiers
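+
+For reference, provisioning with this template follows the standard Terraform workflow. The sketch below is a minimal example, assuming the working directory contains the template and that it defines the `redisgeek_config` output referenced earlier; adjust names to your own configuration:
+
+```bash
+ # Initialize the working directory and download the azurerm provider
+ terraform init
+
+ # Review the planned changes, then apply them
+ terraform plan -out=tfplan
+ terraform apply tfplan
+
+ # Print the connection details exposed by the template's output
+ terraform output redisgeek_config
+```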