babis21

Reputation: 1890

Docker image with R (rocker/r-base) and Python does not work when running on EC2, but locally it is fine

I have started using Docker recently. It seems pretty exciting; the fact that you can build an app once and run it on any machine sounds amazing!

The truth is that I have experienced something else. I have an R image as my base (rocker/r-base) and I want to install Python on it, so I can run a Flask app on the image which will expose an endpoint that, once called, runs an R script and logs some results.

The whole thing is part of a docker-compose setup (there are 3-4 more services in the compose file, each doing a separate job).

I can build the image and run it locally with docker-compose without any problems.

The weird part starts when I try to deploy the image to an AWS EC2 instance. So far, I have managed to run the other images on AWS. But for this specific image, the build fails with some library dependency errors, and after fixing those, the container fails at runtime. To get into a few more details:

It seems that Rscript is not installed in my container on AWS! This could be because of the manual installs I did, in which I used --allow-downgrades so that the proper versions would be installed. I suspect that these downgrades perhaps removed some libraries?
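One way to narrow this down would be a throwaway sanity check (my own debugging sketch, not part of the original Dockerfile) placed right after the apt-get layer, so the build fails on the machine where Rscript disappears:

```dockerfile
# Debugging sketch: fail the build immediately if the downgrades removed R
RUN which Rscript && Rscript --version
# Optionally, inspect what apt removed in earlier layers (log may not exist)
RUN grep -i remove /var/log/apt/history.log || true
```

If the first RUN fails on EC2 but passes locally, that would confirm the apt step is the point of divergence.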

Anyway, I can't really figure out what is going wrong, and it sounds terrible to me. I thought that once you get something running locally with Docker, you can run it anywhere, but it seems that either this is NOT the case, or I am missing something VERY BAD.

BTW, my local and production (AWS) machines use the same versions of docker (1.12.3, build 6b644ec) and docker-compose (1.7.0, build 0d7bf73), and I have enabled the i386 architecture on both machines (just in case that was the issue; both are actually using amd64).

I'm not sure whether to expect an answer from this post; so far I have not found similar situations while googling, but if anyone has faced such a situation before, speak freely! :)

Cheers, Babis

UPDATE:

You are absolutely right, Yaron. I'm attaching some more info. First, the docker-compose.yml:

version: '2'
services:
  recommend.service:    # This service is fine both locally and on AWS
    build:
      context: .
      dockerfile: ./docker/recommend.service.Dockerfile
    ports:
     - "8081:8081"
    restart: always
    env_file: .env
  cltv.service:     # This is the service that fails to build and run on AWS. Locally it is fine
    build:
      context: .
      dockerfile: ./docker/cltv.service.Dockerfile
    ports:
     - "8082:8082"
    restart: always
    env_file: .env
  rabbit:       # Works everywhere
    image: rabbitmq
    hostname: smart.rabbitmq
    restart: always
    volumes:
      - rabbit.data:/var/lib/rabbitmq
  celery:       # Works everywhere
    build:
      context: .
      dockerfile: ./docker/smart.celery.Dockerfile
    depends_on:
     - rabbit
    restart: always
    env_file: .env
volumes:
  rabbit.data:
    driver: local

And the ./docker/cltv.service.Dockerfile:

## Adapted from zamora/r-devtools
## Start with the official rocker image (lightweight Debian)
FROM rocker/r-base:latest

MAINTAINER Babis <[email protected]>

ENV DEBIAN_FRONTEND noninteractive

RUN mkdir -p /usr/src/cltv_app
WORKDIR      /usr/src/cltv_app

COPY requirements.txt           /usr/src/cltv_app/requirements.txt
COPY requirements-cltv.txt      /usr/src/cltv_app/requirements-cltv.txt
COPY cltv/                      /usr/src/cltv_app/cltv/
COPY services/                  /usr/src/cltv_app/services/

# Install external dependencies.
# The pinned libcurl3 was added because of a dependency issue while building
# on the server; locally I didn't need it.
RUN apt-get update -qq \
 && apt-get install -y --no-install-recommends --allow-downgrades \
 libcurl3=7.50.1-1 \
 libcurl4-openssl-dev \
 libssl-dev \
 libsqlite3-dev \
 libxml2-dev \
 qpdf \
 vim \
 libgsl-dev \
 && apt-get clean \
 && rm -rf /var/lib/apt/lists/ \
 && rm -rf /tmp/downloaded_packages/ /tmp/*.rds

# Install devtools and testthat
RUN install2.r --error \
    devtools \
    testthat \
    gsl

# Install some required libraries
RUN Rscript -e 'devtools::install_github("mplatzer/BTYDplus", dependencies=TRUE)'

## Note: I added the lines below in order to get a successful build after the dependency errors. On my local machine, the image builds and runs without them!
#RUN apt-get update
#RUN apt-get install -y --allow-downgrades libkrb5support0=1.14.3+dfsg-2
#RUN apt-get install -y libkrb5-3=1.14.3+dfsg-2
#RUN apt-get install -y libk5crypto3=1.14.3+dfsg-2
#RUN apt-get install -y libgssapi-krb5-2=1.14.3+dfsg-2
#RUN apt-get update
#RUN apt-get install -y krb5-multidev

# Install python and postgres required packages
RUN apt-get update
RUN apt-get install -y python3.4 python3-dev libpq-dev python-pip

RUN pip install --no-cache-dir -r requirements.txt
RUN pip install --no-cache-dir -r requirements-cltv.txt
RUN pip install -e cltv/.

EXPOSE 8082

# Set a big timeout
CMD ["/usr/local/bin/gunicorn", "--log-config", "cltv/logging.conf",  "cltv.wsgi:app", "--bind", "0.0.0.0:8082", "--timeout", "86400"]
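One thing worth double-checking in the Dockerfile above: on Debian, the python-pip package installs pip for Python 2, so the plain pip install lines target Python 2 even though python3.4 is installed alongside it. If the app is meant to run under Python 3, a variant like this (a sketch, not the original file) would keep the interpreter and pip aligned:

```dockerfile
# Install Python 3 together with its matching pip, so pip3 targets python3
RUN apt-get update \
 && apt-get install -y --no-install-recommends \
    python3 python3-dev python3-pip libpq-dev \
 && rm -rf /var/lib/apt/lists/*

RUN pip3 install --no-cache-dir -r requirements.txt
RUN pip3 install --no-cache-dir -r requirements-cltv.txt
```

Whether this matters here depends on which interpreter gunicorn was installed under; it is just something that can silently differ between environments.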

And I trigger the whole process with:

docker-compose up --build -d

Thanks; let me know if you need to see anything else in the setup!

Upvotes: 2

Views: 2240

Answers (1)

babis21

Reputation: 1890

The reason was that rocker/r-base had conflicts with some Python packages, so I could not use Python with this R image.

I ended up with a docker-inside-docker solution, with Ubuntu as the base. I installed the required Python libraries in the Ubuntu image, installed Docker on it, and inside this image used docker to build the rocker/r-base image. Then I used docker run from inside the Ubuntu container, and managed to do my work :)
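A minimal sketch of that shape (file names and paths are illustrative, not the original setup): an outer Ubuntu image carrying the Python side and the docker CLI, which then builds and runs the inner R image:

```dockerfile
# Outer image: Ubuntu with Python and the Docker CLI
FROM ubuntu:16.04

RUN apt-get update \
 && apt-get install -y --no-install-recommends \
    python3 python3-pip docker.io \
 && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY . /app
RUN pip3 install --no-cache-dir -r requirements.txt
# The Flask/gunicorn side lives here; R work is delegated to the inner
# rocker/r-base image via `docker build` / `docker run` at runtime.
```

The usual way to wire the inner docker CLI to a daemon is to mount the host's socket when starting the outer container, e.g. `docker run -v /var/run/docker.sock:/var/run/docker.sock ...` (the socket mount is my assumption of the common pattern; the answer does not say how the inner daemon was provided).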

Upvotes: 2
