Reputation: 1
I am trying to set up a clustered Redis environment. The setup is as follows (see the linked diagram):
The issue I am having is getting Sentinel S3 to make the remote connection to the master. In fact, I have tried deploying just one master and attempting the remote Sentinel connection, but this fails too. Please see the simplified config files for the master and the Sentinel.
local
redis-server /path/to/local/redis.conf
local/redis.conf
bind 127.0.0.1 192.168.20.37
port 6379
dir .
remote
redis-sentinel /path/to/remote/sentinel.conf
remote/sentinel.conf
bind 127.0.0.1 192.168.20.140
port 16379
sentinel monitor redis-cluster 192.168.20.37 6379 2
After starting Sentinel on the remote image, it fails to connect to the master. See the following output:
# oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
# Redis version=4.0.8, bits=64, commit=00000000, modified=0, pid=15095, just started
# Configuration loaded
* Increased maximum number of open files to 10032 (it was originally set to 1024).
* Running mode=sentinel, port=16379.
# Sentinel ID is 66c95f52fbc72b6a33009c36d9ac6b4e91988b81
# +monitor master mymaster 192.168.20.37 6379 quorum 2
# +sdown master mymaster 192.168.20.37 6379
If I run Sentinel from the local image (with the IPs adjusted), it works as expected. There are no firewalls and no NAT involved. I should also note that I can successfully make remote client connections to the master.
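To rule out basic network problems when debugging a setup like this, it can help to confirm that the master's TCP port is actually reachable from the remote host. A minimal sketch (the IP and port below are the ones from the question; any other values are placeholders):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Check reachability of the Redis master from the remote host.
    print(can_connect("192.168.20.37", 6379))
```

If this returns True from the Sentinel host but Sentinel still reports +sdown, the problem is more likely in the bind configuration than in the network path.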
Any suggestions for this seemingly 'simple' setup?
Upvotes: 0
Views: 1026
Reputation: 1
I was able to resolve the issue by binding Sentinel to 0.0.0.0. It appears that when you specify localhost plus another IP, Sentinel ends up accepting connections only on localhost.
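For reference, a minimal sketch of the working sentinel.conf after the change (port and monitor line taken from the question above):

```
bind 0.0.0.0
port 16379
sentinel monitor redis-cluster 192.168.20.37 6379 2
```

Note that 0.0.0.0 makes Sentinel listen on all interfaces, so on anything other than a trusted lab network you would want to restrict access with a firewall.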
Upvotes: 0