Reputation: 1383
I have an EC2-instance running docker using docker-compose.
I would like the logs that one of the applications writes to a file to be sent asynchronously to CloudWatch. Ideally there would be a separate container that shares the logs directory with the application container, runs something like tail -f,
and sends the output to CloudWatch.
I am no expert, but I imagine Filebeat does something similar; I don't know whether it can be configured to send to CloudWatch.
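The sidecar idea described above can be sketched in a few lines of Python: poll the shared log file and hand each new line to a sender. This is only a stdlib sketch of the tail -f part; the CloudWatch call it would feed (boto3's put_log_events, which also requires sequence-token and batch handling) is left as a comment, since a ready-made agent does that work for you.

```python
import time

def follow(path, poll_interval=1.0, stop=lambda: False):
    """Yield lines appended to `path`, like `tail -f` (polling sketch)."""
    with open(path, "r") as f:
        while not stop():
            line = f.readline()
            if line:
                yield line.rstrip("\n")
            else:
                # Nothing new yet; wait for the application to write more.
                time.sleep(poll_interval)

# A real sender would batch lines and call CloudWatch Logs, e.g. with boto3:
#   logs = boto3.client("logs", region_name="eu-central-1")
#   logs.put_log_events(logGroupName=..., logStreamName=...,
#                       logEvents=[{"timestamp": ms, "message": line}])
# (Illustrative only; in practice an agent container handles this.)
```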
Upvotes: 0
Views: 1068
Reputation: 1383
So I ended up using the iconara/awslogs Docker image.
This is my service in my docker-compose.yml:

cloudwatch:
  image: iconara/awslogs:latest
  command: "--region eu-central-1 --config-file /etc/awslogs/app.conf"
  volumes:
    - ./environment/cloudwatch/conf/awscli.conf:/etc/awslogs/app.conf
    - ./environment/cloudwatch/state:/var/lib/awslogs/
    - ./logs:/app-logs/general/
    - ./project/storage/logs:/app-logs/laravel/
    - ~/.aws:/root/.aws
I am mounting the credentials so I can use this for development on macOS; that is probably not needed on an EC2 instance, depending on your setup. I also mounted the state file so that the same logs won't be pushed twice if the container is restarted.
This is my ./environment/cloudwatch/conf/awscli.conf:
[general]
state_file = /var/lib/awslogs/agent-state
use_gzip_http_content_encoding = true
[/app-logs/logs/laravel.log]
datetime_format = %Y-%m-%d %H:%M:%S
file = /app-logs/laravel/laravel.log
buffer_duration = 5000
#log_stream_name = {instance_id}
log_stream_name = development
initial_position = start_of_file
log_group_name = /app-logs/laravel/laravel.log
multi_line_start_pattern = {datetime_format}
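For reference, multi_line_start_pattern = {datetime_format} tells the agent that only lines beginning with a timestamp matching datetime_format start a new log event; any other line (a stack trace, for example) is appended to the previous event. A minimal sketch of that grouping logic, assuming %Y-%m-%d %H:%M:%S translates to the regex shown:

```python
import re

# Regex equivalent of datetime_format = %Y-%m-%d %H:%M:%S (assumption:
# the agent derives a similar line-start anchor from the format string).
START = re.compile(r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def group_events(lines):
    """Group raw lines into multi-line events: a line matching START
    begins a new event; any other line continues the current one."""
    events, current = [], []
    for line in lines:
        if START.match(line):
            if current:
                events.append("\n".join(current))
            current = [line]
        elif current:
            current.append(line)
    if current:
        events.append("\n".join(current))
    return events
```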
Upvotes: 2