Docker Community Forums

Share and learn in the Docker community.

How to put apache log to CloudWatch

Hi.
I am trying to send Apache logs to CloudWatch, but it isn’t working.
There is no error information, so I can’t resolve this by myself.
My environment is as follows.

[ec2-user@ip-xx-x-x-xx httpd]$ sudo cat /etc/awslogs/awscli.conf
[plugins]
cwlogs = cwlogs
[default]
region = ap-northeast-1
aws_access_key_id = xxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx6MpI4

[general]
state_file = /var/awslogs/state/agent-state

[/var/log/messages]
file = /var/log/messages
log_group_name = /var/log/messages
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S

[/var/log/httpd_stg2]
file = /var/log/httpd/error_*
buffer_duration = 5000
log_stream_name = {instance_id}
log_group_name = /var/log/httpd_stg2
[ec2-user@ip-xx-x-x-xx httpd]$ cat /etc/default/docker
export AWS_REGION=ap-northeast-1
export AWS_ACCESS_KEY_ID=xxxxxxxxxxxxxxxxxxxxxxxx
export AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxx
[ec2-user@ip-xx-x-x-xx httpd]$ cat /etc/init/docker.override
env AWS_REGION=ap-northeast-1
env AWS_ACCESS_KEY_ID=xxxxxxxxx
env AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx6MpI4
sudo docker run --name suitecrm_stg --privileged -h httpd -d -p 8889:80 -p 14739:22 --log-driver=awslogs \
> --log-opt awslogs-region=ap-northeast-1 \
> --log-opt awslogs-group=/var/log/httpd_stg2 \
> --log-opt awslogs-stream=b8fa3c77b52a7a4895982f6d2f9e6d1392cf30ecaaaf982af36499d97fcfda2a \
> b6cb73e9f038 /sbin/init
8b285b5e8f91f049ef3e980a7624ea739e2d2756ac46290c199eebbecf5e1b03

The above command succeeded.
When I browsed the web page, I could see the Apache errors inside the Docker container.

[Tue Apr 18 14:36:21.155075 2017] [:error] [pid 62] [client dddddd:54399] PHP   6. LDAPAuthenticate->__construct() /var/www/SuiteCRM/modules/Users/authentication/AuthenticationController.php:98, referer: http://xx.xx.x.x.x:8889/index.php?module=Home&action=index
[Tue Apr 18 14:36:21.155082 2017] [:error] [pid 62] [client dddddd:54399] PHP   7. SugarAuthenticate->__construct() /var/www/SuiteCRM/modules/Users/authentication/LDAPAuthenticate/LDAPAuthenticate.php:61, referer: http://xx.xx.x.x.x:8889/index.php?module=Home&action=index
[Tue Apr 18 14:36:21.155089 2017] [:error] [pid 62] [client dddddd:54399] PHP   8. require_once() /var/www/SuiteCRM/modules/Users/authentication/SugarAuthenticate/SugarAuthenticate.php:66, referer: http://xx.xx.x.x.x:8889/index.php?module=Home&action=index

However, the logs never appeared in CloudWatch.

What is causing this problem?

Hi Shiratsu,
I haven’t used plugins for log redirection, so my solution will be slightly different. One thing you can do is modify your app (Apache in this case) to output its logs to standard out. There are a couple of ways to do this, but I think the symlink method should work fine (https://serverfault.com/questions/711168/writing-apache2-logs-to-stdout-stderr).

Once that is done, create an IAM role that grants the EC2 instance running your container(s) the ability to create logs in CloudWatch. At the time I tested all of this, creating CloudWatch log groups on the fly wasn’t an option, so you will have to go in and create the log group for the containers to stream to.

Once you have created and assigned the IAM role to the EC2 instance and have the log group created, you can launch your container with the --log-driver=awslogs option and your Apache logs should start streaming to CloudWatch.
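The symlink trick from the linked answer can be sanity-checked outside Docker first. This is a minimal sketch (the paths here are throwaway examples, not the container’s real log paths):

```shell
#!/bin/sh
# Demonstrate the /proc/self/fd symlink trick that sends "log files"
# to whatever stdout/stderr the writing process has open.
set -e
logdir=$(mktemp -d)

ln -sf /proc/self/fd/1 "$logdir/access.log"   # access log -> stdout
ln -sf /proc/self/fd/2 "$logdir/error.log"    # error log  -> stderr

# Any process appending to access.log now writes to its own stdout,
# which is exactly what `docker logs` and the awslogs driver capture.
echo 'GET / 200' >> "$logdir/access.log"

readlink "$logdir/access.log"   # prints: /proc/self/fd/1
```

Inside an image, the same two `ln -sf` lines go in the Dockerfile (the official httpd image does something equivalent), so Apache keeps writing to its usual file paths while Docker sees the output on stdout/stderr.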

Thank you for replying.
I understood most of it, but I am still unsure because I didn’t understand everything completely, so I would like to confirm my understanding.

As I understand it, I have to do the following things.

First, set up symlinks for the access log and error log:

RUN ln -sf /proc/self/fd/1 /var/log/apache2/access.log && \
    ln -sf /proc/self/fd/2 /var/log/apache2/error.log

Next, add the permission to create logs in CloudWatch.
Actually, I could already put a log into CloudWatch from the command line:

aws logs put-log-events --log-group-name "apache_error_stg" --log-stream-name "xxxxxxxxxxxxxxxx" --log-events timestamp=xxxxxx,message='Hello CloudWatch' --sequence-token 49569706334742091875879266998614607044570927078983076050

So I think I don’t need to add any extra permission for creating logs.
Is that correct?

Next, I have to create the log group from the AWS console.

Finally, assign the IAM role to the EC2 instance.
As I said before, I could already put a log via the command line, so I think I don’t need to add any extra permission for creating logs.
Is that correct?

Edit - I just noticed that you are exporting your AWS access and secret keys as environment variables. I completely missed that on my first read-through. That should work in terms of permissions, but I would still go ahead and create the role and assign it to the EC2 instance, since I know that method works.

So, you are able to create logs in CloudWatch via the command line because the API key stored in your credentials file (on your local machine) is associated with an account that has those permissions. By creating an IAM role and assigning it to your EC2 instance, you give the EC2 instance the permission to write logs as well. So you definitely want to perform the role-creation steps. (As a side note, you may be able to copy your API key onto the EC2 instance and configure it to use that to communicate with CloudWatch, but that is a bit more complicated and honestly a bit less secure than the IAM role route.)
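For reference, the "assign the role to the instance" step can also be done from the CLI. This is only a sketch under assumptions (the role name, profile name, and instance ID below are placeholders, and the calls require live AWS credentials); an IAM role reaches an EC2 instance through an instance profile, which the console creates for you implicitly:

```shell
# Placeholder names/IDs; shown for reference only.
aws iam create-instance-profile --instance-profile-name CloudWatchLogProfile
aws iam add-role-to-instance-profile \
    --instance-profile-name CloudWatchLogProfile \
    --role-name CloudWatchLogRole
# Attach the profile to the running instance (or select the role at launch time).
aws ec2 associate-iam-instance-profile \
    --instance-id i-0123456789abcdef0 \
    --iam-instance-profile Name=CloudWatchLogProfile
```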

Once you have the role configured and assigned to the EC2 instance, the next step is to create the log group in CloudWatch that you are going to have your container(s) stream to, and then finally you launch your container(s) with the appropriate CloudWatch options (--log-driver and --log-opt).

So, for example, if I create a CloudWatch log group called mytest-web-logs, I would configure my container to send to it via the command below:

docker run -d -p 80:80 --network=mytestsite_network --name mytest-web --log-driver=awslogs --log-opt awslogs-group=mytest-web-logs mycompany/mytest-web:latest

Let me know if the above helps. If not, I can write up step-by-step instructions (with screenshots) for the whole process.

It just dawned on me that you posted your question in the Docker for AWS sub-forum. Are you trying to do this in a Docker Swarm (via the Docker for AWS template)? If so, streaming to CloudWatch should already be taken care of for you, and all you need to do is configure Apache to log to stdout. Otherwise, the steps I noted above will work for getting logs into CloudWatch for containers not running in a Docker for AWS environment.

I tried to attach the policy; however, I couldn’t.

[ec2-user@ip-10-x-x-xx ~]$ cat CloudWatchLogRole.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Effect": "Allow",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      }
    }
  ]
}
[ec2-user@ip-10-x-x-xx ~]$ aws iam create-role       --role-name CloudWatchLogRole       --assume-role-policy-document file://~/CloudWatchLogRole.json

An error occurred (MalformedPolicyDocument) when calling the CreateRole operation: AssumeRole policy may only specify STS AssumeRole actions.
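That MalformedPolicyDocument error happens because `create-role --assume-role-policy-document` expects only the trust policy (who may assume the role, i.e. an sts:AssumeRole statement); the logs:* permissions belong in a separate policy attached afterwards with `put-role-policy`. A sketch of the split, keeping the same role name (the file names are examples, and the aws calls themselves need live credentials, so they are left as comments):

```shell
#!/bin/sh
# Sketch: split the single document above into a trust policy (for create-role)
# and a permissions policy (attached afterwards with put-role-policy).
set -e
tmp=$(mktemp -d)

cat > "$tmp/trust-policy.json" <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

cat > "$tmp/logs-policy.json" <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
EOF

# Check that both documents parse as JSON before handing them to the CLI.
python3 -m json.tool "$tmp/trust-policy.json" > /dev/null
python3 -m json.tool "$tmp/logs-policy.json" > /dev/null

# Then:
# aws iam create-role --role-name CloudWatchLogRole \
#     --assume-role-policy-document "file://$tmp/trust-policy.json"
# aws iam put-role-policy --role-name CloudWatchLogRole \
#     --policy-name CloudWatchLogPolicy \
#     --policy-document "file://$tmp/logs-policy.json"
```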

Could you describe the procedure step by step?

I’m so sorry.

No need to apologize. Docker is still a fairly young product, so it is going to take some time to hammer out all the proper steps for things. I will definitely type up the how-to doc, as it could help other people down the road as well. I have a sick baby at home, so I more than likely won’t get it written until this weekend. In the meantime, let’s continue to troubleshoot this together.

I noticed the AssumeRole error you get when trying to create the role. Any chance you can use the console to create the policy first (it has an option to validate your syntax prior to creation)? Basically, create the policy like the one I pasted here (yours should work as well), then assign that policy to a role, and finally assign that role to the EC2 instance.

Let me know if that gets you any further.

Thank you, kevinmcgarry.
Yesterday, I attached the policy.

The host EC2 instance for Docker now has this policy.