Reputation: 4352
I am trying to connect to an Elasticsearch node from Python with SSL.
I'm using the basic code for that:
from elasticsearch import Elasticsearch
from ssl import create_default_context

context = create_default_context(cafile="path/to/cafile.pem")
es = Elasticsearch(
    "https://elasticsearch.url:port",
    ssl_context=context,
    http_auth=('elastic', 'yourpassword'),
)
From: https://github.com/elastic/elasticsearch-py
I need to supply the cafile.pem and http_auth parameters. On the server where my Python code is running, the SSL connection is already set up, so I can run basic queries against Elasticsearch. It was set up using the keys in the ~/.ssh directory: id_rsa and id_rsa.pub.
So now I am wondering: should I supply the id_rsa.pub key in place of path/to/cafile.pem? If so, I would need to change the permissions of the ~/.ssh folder, which does not seem like a good idea from a security perspective. I am also not sure that .pub is the same format as .pem; do I need to convert it first? And should http_auth simply be omitted, since I do not use any password when I run simple queries from the terminal?
How should I set up access from Python to ES with SSL according to best practices?
I tried both the .pub file and a .pem generated from it (https://serverfault.com/questions/706336/how-to-get-a-pem-file-from-ssh-key-pair), but both failed in create_default_context with an unknown error raised in context.load_verify_locations(cafile, capath, cadata).
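For reference, this minimal snippet reproduces the failure (the path is a placeholder for my actual key):

import ssl

# An SSH public key is not an X.509 certificate in PEM format, which is
# what load_verify_locations expects, so loading it as a CA file raises
# ssl.SSLError.
try:
    ssl.create_default_context(cafile="/home/me/.ssh/id_rsa.pub")
except ssl.SSLError as exc:
    print(exc)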
Upvotes: 7
Views: 21300
Reputation: 21
We are using Elasticsearch 8.12.0 deployed via docker-compose with self-signed certs:
from elasticsearch import Elasticsearch

es = Elasticsearch(
    "https://hostname:9200/",
    ca_certs="path/to/ca.crt",
    basic_auth=("username", "password"),
)

# print(es.info())
indices = es.indices.get(index="_all")
for index in indices:
    print(index)
Upvotes: 2
Reputation: 29600
For self-signed certificates, use:
from elastic_transport import NodeConfig
from elasticsearch import AsyncElasticsearch

client = AsyncElasticsearch(
    hosts=[
        NodeConfig(
            scheme="https",
            host="<host URL>",
            port=443,
            verify_certs=False,
            ca_certs=None,
            ssl_show_warn=False,
        )
    ],
    http_auth=("username", "password"),
    verify_certs=False,
    ca_certs="/path/to/cafile.pem",   # PEM format
    client_cert="/path/to/tls.cert",  # PEM format
    client_key="/path/to/tls.key",    # PEM format
)

await client.info()  # inside an async function (or an asyncio REPL)
Explanation:
verify_certs=False stops the underlying Python SSL module from verifying the self-signed certs, while still properly sending the request upstream to the server. For certificates that are not self-signed, you should try enabling verify_certs=True. (The example uses AsyncElasticsearch, but if you need the sync Elasticsearch version, it should be directly compatible, as all the parameters are the same. See: https://elasticsearch-py.readthedocs.io/en/v8.8.1/async.html#getting-started-with-async)

"So, now I am wondering whether I should supply id_rsa.pub key in place of path/to/cafile.pem, and if yes, then I would need to change permissions of ~/.ssh folder which seems like not a good idea from security perspective."

These SSH keys are likely not related to Elasticsearch; they are for allowing you to connect (via SSH) to the nodes running your Elasticsearch instance.
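For reference, the sync version with the same parameters might look like this (host, credentials, and file paths are placeholders):

from elasticsearch import Elasticsearch

# Sync client with the same TLS parameters as the async example above.
client = Elasticsearch(
    "https://<host URL>:443",
    http_auth=("username", "password"),
    verify_certs=False,
    ssl_show_warn=False,
    ca_certs="/path/to/cafile.pem",   # PEM format
    client_cert="/path/to/tls.cert",  # PEM format
    client_key="/path/to/tls.key",    # PEM format
)
print(client.info())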
Upvotes: 1
Reputation: 344
Elasticsearch Docker image & Python 2.7. I copied the SSL certificate file to the root of the project, made sure it is readable (ownership and group ownership allow read access), and put the login and password into constants.
es = Elasticsearch(
    hosts=["https://localhost:9200"],
    http_auth=(USR_LOGIN, USR_PASS),
    use_ssl=True,
    verify_certs=True,
    ca_certs="./http_ca.crt",
)
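A quick way to confirm the connection works (a small sketch; assumes the constants above hold valid credentials):

# ping() returns True when the client can reach the cluster over the
# configured SSL connection.
if es.ping():
    print("Connected to Elasticsearch over SSL")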
Upvotes: 4
Reputation: 4352
The answer for my particular case turned out to be very simple. I found it here:
https://elasticsearch-py.readthedocs.io/en/master/
es = Elasticsearch(['https://user:secret@localhost:443'])
Just specified the https URL inside and it worked out right away.
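If you prefer to keep the credentials out of the URL, the same connection can be written like this (a sketch with placeholder values, using http_auth as in the other answers):

from elasticsearch import Elasticsearch

# Same https connection, but with user and password passed separately
# instead of being embedded in the URL.
es = Elasticsearch(
    ['https://localhost:443'],
    http_auth=('user', 'secret'),
)
print(es.info())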
Upvotes: 7