Installing & Connecting to a Redis Docker Container Locally

I found myself needing to locally run a Python application that required a Redis connection. I just needed to run the application and didn't want to tinker with Docker Compose or write a Dockerfile for it. It has been a while since I worked with Docker, so I figured I would document the process.

Prerequisites

  • If you do not have Docker installed locally, download and install it.
  • You technically do not need an application to run against the Docker container to test the setup, but it is helpful for the final step.

Installing and Running the Redis Container

Redis has an official image on Docker Hub. It will work fine as is for my purposes.

To pull the Redis container image, simply run the following in the terminal.

docker pull redis
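
To confirm the image downloaded successfully, you can list your local Redis images:

docker images redis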

Once pulled, the image can be run via:

docker run -p 127.0.0.1:6379:6379/tcp --name container-redis-test -d redis

Inside the container, Redis will run using its default configuration, which includes listening on port 6379. The (-p) argument tells Docker to publish the container's internal port 6379 to port 6379 on the host, bound to 127.0.0.1 so it is only reachable locally. The (-d) argument runs the container detached, in the background.

Run the following to ensure your container is running:

docker ps
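
The output should show container-redis-test up and running, with the port mapping from the run command. To check just the published ports, you can also run:

docker port container-redis-test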

Note: You can also view the container inside the Docker Desktop application, where you can stop it, delete it, and so on.
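
The same actions are available from the terminal if you prefer:

docker stop container-redis-test
docker rm container-redis-test

The first command stops the container, the second deletes it; a stopped container can be started again with docker start container-redis-test.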

Connecting to Redis from within the Container

First, I need to connect to the container. This can be achieved by running:

docker exec -it container-redis-test sh

This gives me a shell prompt. To confirm Redis itself is running within the container, I'll use the redis-cli tool that comes with the Docker Hub image. At the prompt, I can run:

# redis-cli

This gives me a new prompt for the redis-cli. Now, I can run Redis commands:

127.0.0.1:6379> set mykey myvalue
OK
127.0.0.1:6379> get mykey
"myvalue"

The container appears to be running fine. Next, I will ensure I can connect to it from a Python application.

Note: The Redis Docker Hub image has Redis authentication turned off by default. As such, you do not need to enter a password. If you intend to use this for more than local testing, you should enable a password using the Redis config.
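
For example, one way to enable a password with the official image is to pass the requirepass directive to redis-server when starting the container (the password below is just a placeholder):

docker run -p 127.0.0.1:6379:6379/tcp --name container-redis-test -d redis redis-server --requirepass mypassword

Clients then have to authenticate, either with the AUTH command in redis-cli or the password argument in redis-py.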

Connecting to Redis via the Local Application

The Python application I am trying to run is designed to work within Google's VPC. As such, it does not require a password. The app uses the redis-py client.

To test the setup, I added the following FastAPI handler:

import os
import redis
...
@app.get('/redis')
def index():
    # Read the Redis connection details from the environment, defaulting to
    # the locally published container (localhost:6379).
    redis_host = os.environ.get('REDIS_HOST', 'localhost')
    redis_port = int(os.environ.get('REDIS_PORT', 6379))
    redis_client = redis.StrictRedis(host=redis_host, port=redis_port)

    # Increment a counter key on each request; Redis creates it on first use.
    value = redis_client.incr('counter', 1)
    return 'Value is {}'.format(value)

The command to run the application is uvicorn main:app --reload. As such, I can easily run the application with the Redis environment variables set inline:

REDIS_HOST=127.0.0.1 REDIS_PORT=6379 uvicorn main:app --reload

When I load the application in the browser and hit the /redis endpoint, I see the response:

"Value is 1"

With each browser reload, the value goes up:

"Value is 2"

Excellent!
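
You can also exercise the endpoint from the terminal with curl (assuming uvicorn's default port of 8000) and confirm the counter key is actually being stored in Redis by reading it back through the container:

curl http://127.0.0.1:8000/redis
docker exec -it container-redis-test redis-cli get counter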

Conclusion

With minimal tinkering, I was able to run an existing Python application locally against a Redis instance running inside a Docker container. This is great. However, I need to remember to shut down the container and start a new one each time I need to run the application. In a future post, I will document using Docker Compose to automatically start and stop the Redis container when running the application.

If you found this information useful, you might also check out my articles on Dockerizing Node applications and pushing custom container images to Google Container Registry using Google Cloud Build.

Image Credit: Photo by Pixabay from Pexels