Configuring Bitbucket Pipelines with Docker to Connect to AWS

I am trying to set up Bitbucket Pipelines for deployment to ECS, as described here: https://confluence.atlassian.com/bitbucket/deploy-to-amazon-ecs-892623902.html

Those instructions describe how to push to a Docker host, but I want to push an image to the Amazon ECR image repository. I have added AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID to my Bitbucket environment variables, and I can run this command locally without issue (with the keys defined in ~/.aws/credentials). However, in the pipeline I keep getting the "no basic auth credentials" error. I'm wondering if it somehow doesn't pick up the variables. The docs here: http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html say that:

The AWS CLI uses the provider chain to look up AWS credentials in a variety of places, including system or user environment variables and local AWS configuration files.

So I'm not sure why it doesn't work. The relevant part of my bitbucket-pipelines.yml is below (I haven't included anything extra):

      - export IMAGE_NAME=$AWS_REPO_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/my/repo-name:$BITBUCKET_COMMIT
      # build the Docker image (this will use the Dockerfile in the root of the repo)
      - docker build -t $IMAGE_NAME .
      # authenticate with the AWS repo (this gets and runs the docker login command)
      - eval $(aws ecr get-login --region $AWS_DEFAULT_REGION)
      # push the new Docker image to the repo
      - docker push $IMAGE_NAME
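
One thing I thought of doing is checking, inside the pipeline step itself, whether the provider chain can actually see the credentials before the login runs. Something like the sanity checks below (just a debugging sketch, not part of my real configuration) should make that obvious:

      # debugging only: confirm the CLI can see the variables and credentials
      - echo "Region is: $AWS_DEFAULT_REGION"
      - aws sts get-caller-identity
      # print the generated docker login command instead of eval'ing it
      - aws ecr get-login --region $AWS_DEFAULT_REGION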


Is there a way to specify which credentials aws ecr get-login should use? I even tried this, but it doesn't work:

      - mkdir -p ~/.aws
      - echo -e "[default]\n" > ~/.aws/credentials
      - echo -e "aws_access_key_id = $AWS_ACCESS_KEY_ID\n" >> ~/.aws/credentials
      - echo -e "aws_secret_access_key = $AWS_SECRET_ACCESS_KEY\n" >> ~/.aws/credentials


Thanks.


2 answers


Try the following:

bitbucket-pipelines.yml

pipelines:
  custom:
    example-image-builder:
      - step:
          image: python:3
          script:
            # where the application repo will be cloned, and where the image will be pushed
            - export CLONE_ROOT=${BITBUCKET_CLONE_DIR}/../example
            - export IMAGE_LOCATION=<ENTER IMAGE LOCATION HERE>
            - export BUILD_CONTEXT=${BITBUCKET_CLONE_DIR}/build/example-image-builder/dockerfile
            # install the AWS CLI in the build container
            - pip install awscli
            # fetch a read-only deploy key from S3 and clone the application repo with it
            - aws s3 cp s3://example-deployment-bucket/deploy-keys/bitbucket-read-key .
            - chmod 0400 bitbucket-read-key
            - ssh-agent bash -c 'ssh-add bitbucket-read-key; git clone --depth 1 git@bitbucket.org:example.git -b master ${CLONE_ROOT}'
            - cp ${CLONE_ROOT}/requirements.txt ${BUILD_CONTEXT}/requirements.txt
            # authenticate Docker against ECR, then build and push the image
            - eval $(aws ecr get-login --region us-east-1 --no-include-email)
            - docker build --no-cache --file=${BUILD_CONTEXT}/dockerfile --build-arg AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID} --build-arg AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY} --tag=${IMAGE_LOCATION} ${BUILD_CONTEXT}
            - docker push ${IMAGE_LOCATION}

options:
  docker: true




dockerfile

FROM python:3
MAINTAINER Me <me@me.me>
COPY requirements.txt requirements.txt
ENV DEBIAN_FRONTEND noninteractive
ARG AWS_ACCESS_KEY_ID
ARG AWS_SECRET_ACCESS_KEY
RUN apt-get update && apt-get -y install stuff
ENTRYPOINT ["/bin/bash"]


I'm short on time, so I've included more than just a direct answer to your question, but it should be a good enough template to work from. Ask in the comments if there is any line you don't understand and I will edit the answer.



I had the same problem. The error is mostly caused by an old version of awscli: you need to use a Docker image with a more recent awscli. For my project I am using linkmobility/maven-awscli.

  • You need to set the environment variables in Bitbucket.

  • Make two small changes in your pipeline (as sketched below): use an image with a recent awscli, e.g.

      image: Docker-Image-With-awscli

    and log in to ECR with:

      - eval $(aws ecr get-login --no-include-email --region ${AWS_DEFAULT_REGION})
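
Putting the two together, a minimal sketch of the adjusted bitbucket-pipelines.yml would look roughly like this (the image and variable names are only examples; the repository variables are assumed to be set in Bitbucket):

image: linkmobility/maven-awscli

pipelines:
  default:
    - step:
        script:
          - export IMAGE_NAME=$AWS_REPO_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/my/repo-name:$BITBUCKET_COMMIT
          - docker build -t $IMAGE_NAME .
          - eval $(aws ecr get-login --no-include-email --region ${AWS_DEFAULT_REGION})
          - docker push $IMAGE_NAME

options:
  docker: true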