Reputation: 533
MySQL and Elasticsearch are hosted on AWS, and Logstash is running on an EC2 instance. I am not using a VPC. I can connect to MySQL locally or from my EC2 instance.
Modifying the question a bit. Here is the new Logstash file on my EC2 instance. My EC2 instance does not have an SSL certificate; is that a problem?
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://aws.xxxxx.us-east-1.rds.amazonaws.com:3306/stuffed?user=admin&password=pword"
    jdbc_user => "admin"
    jdbc_password => "pword"
    schedule => "* * * * *"
    jdbc_validate_connection => true
    jdbc_driver_library => "mysql-connector-java-8.0.19.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT * from foo"
    type => "foo"
    tags => ["foo"]
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://aws.xxxxx.us-east-1.rds.amazonaws.com:3306/stuffed?user=admin&password=pword"
    jdbc_user => "admin"
    jdbc_password => "pword"
    schedule => "* * * * *"
    jdbc_validate_connection => true
    jdbc_driver_library => "mysql-connector-java-8.0.19.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT * from cat"
    type => "cat"
    tags => ["cat"]
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://aws.xxxxx.us-east-1.rds.amazonaws.com:3306/stuffed?user=admin&password=pword"
    jdbc_user => "admin"
    jdbc_password => "pword"
    schedule => "* * * * *"
    jdbc_validate_connection => true
    jdbc_driver_library => "mysql-connector-java-8.0.19.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT * from rest"
    type => "rest"
    tags => ["rest"]
  }
}
output {
  stdout { codec => json_lines }
  if "foo" in [tags] {
    amazon_es {
      hosts => ["https://es1.stuffed-es-mysql.us-east-1.es.amazonaws.com"]
      index => "foo"
      region => "us-east-1"
      aws_access_key_id => "id"
      aws_secret_access_key => "key"
      document_type => "foo-%{+YYYY.MM.dd}"
    }
  }
  if "cat" in [tags] {
    amazon_es {
      hosts => ["https://es1.stuffed-es-mysql.us-east-1.es.amazonaws.com"]
      index => "cat"
      region => "us-east-1"
      aws_access_key_id => "id"
      aws_secret_access_key => "key"
      document_type => "cat-%{+YYYY.MM.dd}"
    }
  }
  if "rest" in [tags] {
    amazon_es {
      hosts => ["https://es1.stuffed-es-mysql.us-east-1.es.amazonaws.com"]
      index => "rest"
      region => "us-east-1"
      aws_access_key_id => "id"
      aws_secret_access_key => "key"
      document_type => "rest-%{+YYYY.MM.dd}"
    }
  }
}
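As a side note, the three jdbc blocks repeat the same connection settings, and the password is hardcoded twice (in the connection string and in jdbc_password). A sketch of one input block using Logstash's ${VAR} environment-variable substitution instead; MYSQL_USER and MYSQL_PWD are assumed variable names, not part of the original config:

jdbc {
  # credentials come from the environment; export MYSQL_USER and MYSQL_PWD
  # in the shell (or service unit) that starts Logstash
  jdbc_connection_string => "jdbc:mysql://aws.xxxxx.us-east-1.rds.amazonaws.com:3306/stuffed"
  jdbc_user => "${MYSQL_USER}"
  jdbc_password => "${MYSQL_PWD}"
  schedule => "* * * * *"
  jdbc_validate_connection => true
  jdbc_driver_library => "mysql-connector-java-8.0.19.jar"
  jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
  statement => "SELECT * from foo"
  type => "foo"
  tags => ["foo"]
}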
Now the issue I'm getting is a 403 Forbidden error.
I did create a user in AWS with the AmazonESFullAccess (AWS managed) policy. I'm not sure what else to do anymore. I am not using a VPC; I was trying to avoid that, so I want to stick with public access.
But I get an error:
error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://user:[email protected]:9200/]
and the Logstash output for this instance is:
elasticsearch {
  hosts => ["https://es-mysql.us-east-1.es.amazonaws.com/"]
  index => "category"
  user => "user"
  password => "password"
  document_type => "cat-%{+YYYY.MM.dd}"
}
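One thing that may be worth ruling out: the unreachable host in the error above ends in :9200, which is the elasticsearch output's default port when none is specified, while AWS-hosted Elasticsearch domains serve HTTPS on 443. A sketch of the same output with the port pinned explicitly (hostname and credentials as in the original):

elasticsearch {
  # AWS ES endpoints listen on 443, not the Elasticsearch default of 9200
  hosts => ["https://es-mysql.us-east-1.es.amazonaws.com:443"]
  index => "category"
  user => "user"
  password => "password"
  document_type => "cat-%{+YYYY.MM.dd}"
}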
Obviously this isn't the preferred method, but I'm really just trying to set up a dev/personal environment.
Also, I am able to log in to Kibana on this instance.
For the first Elasticsearch service, the access policy is:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111:user/root"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:11111111:domain/stuffed-es-mysql/*"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:166216189490:domain/stuffed-es-mysql/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": [
            "11.11.11.111",
            "111.111.1.111",
            "111.111.1.1",
            "1.11.111.111"
          ]
        }
      }
    }
  ]
}
For the second Elasticsearch service I created:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:xxxxxxxxx:domain/es-mysql/*"
    }
  ]
}
Upvotes: 2
Views: 1939
So I resolved the two issues.
Problem one: I was still having a problem with my Logstash input connection to my RDS instance (which wasn't the problem posted, but I thought I'd share anyway).
Problem:
The RDS MySQL instance was the wrong version. It was set to the value recommended by AWS (5.7), and I needed the latest 8.0.*. This was causing a problem with JDBC not working correctly.
Solution:
Upgrade the RDS MySQL instance to 8.0.17. Now my Logstash was able to read the input from the MySQL RDS instance.
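To confirm from the Logstash host that the upgrade actually took effect, the server version can be checked with a plain query (hostname and credentials as in the config above):

SELECT VERSION();
-- e.g. run from the EC2 shell as:
--   mysql -h aws.xxxxx.us-east-1.rds.amazonaws.com -u admin -p -e "SELECT VERSION();"
-- and verify it reports 8.0.17 rather than 5.7.x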
Problem two: the output of my Logstash was not working.
Problem:
Receiving a 403 Forbidden error.
Solution:
Removed the HTTPS requirement when setting up the ES service. This most likely worked for me because my EC2 instance does not have an SSL certificate.
The solution to problem two was questioned earlier in my post. Being new to Logstash setup on EC2 and to the Elasticsearch Service (AWS), I was not going with my instincts. But I removed the HTTPS requirement when setting up the ES service. Not the greatest idea, but this is a dev environment, and it fixed the issue. Why? Because my EC2 instance is not SSL certified, and one of the common causes of a 403 error is a source sending requests to an SSL-certified destination without the proper certificate.
I think this is a common problem for people who want to run something on a budget, and it's overlooked. Most people want to jump straight to your access policy, IP policy, or security group, and that's understandable.
Upvotes: 1