
I have the following .gitlab-ci.yml:

image: docker:19.03.13

variables:
  DOCKER_TLS_CERTDIR: "/certs"

services:
  - docker:19.03.13-dind

build-django_and_fastapi:
  before_script:
    - echo "$DOCKER_REGISTRY_PASS" | docker login $DOCKER_REGISTRY --username $DOCKER_REGISTRY_USER --password-stdin
  stage: build
  script:
    - mkdir test
    - cd test
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git
    - cd ..
    - docker build ./test

I get the error: /bin/sh: eval: line xxx: git: not found

How do I add the git package to the docker:19.03.13 image?

  • Why do you need the other project? What is so important that you need it for building, and why do you only build within that directory? There is no clear reason why you need git for that, or why you need the other project. Commented Oct 17, 2021 at 17:00
  • Whatever it is. I want to clone something and do something. I need git for that. Commented Oct 17, 2021 at 17:02
  • Sorry to be so inappropriate, but there is a clear misunderstanding of the technologies you are using, and there are always other paths to reach a goal. You are trying to force your path instead of letting other people guide you through a safer and maybe easier passage. I can explain to you how to create a docker image with git to be used in your job, but maybe that is not what you need and it just makes your life more complicated. What you maybe need is just multiple docker images you can depend on, or maybe you are misunderstanding fundamentals of GitLab CI. I just want to help you. Commented Oct 17, 2021 at 17:06
  • I have two GitLab repos. Now I am creating a single docker image from both of them. I have created a third repo with only a Dockerfile (which copies the cloned repos) and I will trigger the pipeline manually. Commented Oct 17, 2021 at 17:10
  • You see, there are a lot of different options to tackle this problem, and imho a few of them are easier to handle. I hope my contribution is helpful for you and gives you guidance. Commented Oct 17, 2021 at 18:07

3 Answers


To summarize from the comments: the goal is to build a docker image out of the content of two other repositories. Therefore we want to use git and docker in the same build stage - or at least that is the attempt. I outline several options here which can be used to achieve this.

Option 1: Migrating the fetching logic into the Dockerfile

Instead of messing around with a build image, I would migrate this logic into the Dockerfile. Handling such things within the Dockerfile is usually easier, and even if it adds an extra layer for installing and removing git, it is still faster than trying to maintain a build image containing both docker and git.

Whether this works depends on your Dockerfile and the Docker base image you are using; with debian/alpine/etc. it is quite easy to achieve, since they ship their own package managers.
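A minimal sketch of this approach, assuming an Alpine base image and that the job token is handed in as a build argument (the base image, target path, and repo URL placeholder are assumptions, not the asker's actual setup):

```dockerfile
# assumption: built with  docker build --build-arg CI_JOB_TOKEN=$CI_JOB_TOKEN .
FROM alpine:3.14
ARG CI_JOB_TOKEN

# install git only for the clone, then remove it again in the same RUN
# so the final layer stays small
RUN apk add --no-cache git \
 && git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git /src/yyyy \
 && apk del git
```

Be aware that build arguments are recorded in the image history, so for private repositories you may prefer BuildKit secrets over a plain ARG.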

Option 2: Building a build image containing both docker and git

This is the option I favor least. There is always some hassle with properly setting up docker or additionally installing git. But I will outline the process here too.

What you need in this case is your own Docker image where you either:

  • take the docker image and install git
  • take a git image and install docker
  • build a fresh image from the ground up
  • (you can always try to figure out which package manager the image uses, and install git in the script block)

This adds complexity and is more effort than Option 1, and it offers less safety than Option 3.
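As a sketch, the first bullet (docker image plus git) could look like this; the resulting image would then be pushed to your own registry and referenced in the job's image: keyword:

```dockerfile
# custom build image: docker CLI plus git
FROM docker:19.03.13
RUN apk add --no-cache git
```

The docker image is Alpine-based, which is why apk is the right package manager here.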

Option 3: Using the API instead of git (my recommended way)

Instead of using git to fetch the content, there is also the API: https://docs.gitlab.com/ee/api/repositories.html#get-file-archive

It allows you to download a specific ref as a zip/tar etc., which is easier to use than a git checkout. This can also be combined with Option 1, as it allows easy fetching of the content via curl.

This option also has the benefit of not downloading the git history, just the current state, which might improve build times.
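A sketch of such a fetch in the job's script block; the project ID and the ref are placeholders you would have to fill in:

```yaml
script:
  # download the archive of the given ref via the repository API, using the
  # job token for authentication, then unpack it into the build context
  - curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --output yyyy.tar.gz "https://gitlab.com/api/v4/projects/<project-id>/repository/archive.tar.gz?sha=main"
  - mkdir test
  - tar -xzf yyyy.tar.gz -C test --strip-components=1
  - docker build ./test
```

Note that curl and tar are already available in the docker image, so no extra package installation is needed.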

Option 4: Multiple build jobs

Instead of trying to merge the docker build and the git checkout into one job, you can split them into two: a first job with git, fetching the repositories, and a second one for the docker build.

Important here is the artifacts directive, with which you can define which files are available to the next stage/job. Take a look at https://docs.gitlab.com/ee/ci/yaml/#artifacts which is a good resource regarding that directive.
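A sketch of such a split pipeline; the job names, the alpine/git fetch image, and the directory layout are assumptions:

```yaml
stages:
  - fetch
  - build

fetch-sources:
  stage: fetch
  image: alpine/git            # small image that already ships git
  script:
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git test/yyyy
  artifacts:
    paths:
      - test/                  # hand the checkout over to the next stage

build-image:
  stage: build
  image: docker:19.03.13
  services:
    - docker:19.03.13-dind
  script:
    - docker build ./test
```

The artifacts from fetch-sources are downloaded automatically before build-image runs, so the docker build sees the cloned sources without needing git itself.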

Option 5: Using git submodules

Instead of doing the checkout manually, the other repositories could also be added as git submodules. These can be seen as subdirectories which point to other git repositories. There is some special checkout behaviour attached, but with a closer look into submodules you should figure this out quite easily.

Be aware that you also need to set GIT_SUBMODULE_STRATEGY so the submodules will be fetched: https://docs.gitlab.com/ee/ci/runners/configure_runners.html#git-submodule-strategy
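Sketched CI configuration for this, assuming the submodule has already been added to the repository (e.g. with git submodule add https://gitlab.com/xxxx/yyyy.git test/yyyy):

```yaml
variables:
  GIT_SUBMODULE_STRATEGY: recursive   # make the runner fetch submodules on checkout
```

With this variable set, the runner checks out the submodules alongside the main repository, so no explicit git clone is needed in the script block.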


3 Comments

I forgot an option, I will add it - added multiple build jobs and the artifacts directive
Noteworthy is also the possibility of using git submodules - I will add it
For option 2 there is an official image with git preinstalled: docker:git

I had to install git inside the docker image:

image: docker:19.03.13

variables:
  DOCKER_TLS_CERTDIR: "/certs"

services:
  - docker:19.03.13-dind

build-django_and_fastapi:
  stage: build
  script:
    - docker login -u gitlab-ci-token -p $CI_JOB_TOKEN registry.gitlab.com
    - apk update
    - apk add git
    - mkdir test
    - cd test
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/xxxx/yyyy.git
    - cd ..
    - docker build ./test

So in the script I have added apk add git.

Regarding the approach of making an image from the source code of multiple repos:

I prefer to prepare the full context folder for the Dockerfile and then build it.

So in the script I do

script:
  - make folders for the source code
  - clone the source code into those folders
  - build the image using docker build

The reason I do this is that we can take advantage of the cache while building the image. So if 10 steps go well, then the next time I build the image it uses the cached layers and starts from the 11th step.

Image building will need some editing of the Dockerfile until we get the image right, so caching of the layers is very helpful.

If I try to git clone inside the Dockerfile, it may not take advantage of the cache.

Of course on my local PC I can use the cache mechanism, but in the GitLab docker executor I am not sure how I can use it.
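To illustrate the caching point, a sketch of a cache-friendly Dockerfile layout for a pre-cloned context (the base image, paths, and requirements.txt are assumptions): copying the dependency manifest and installing it before copying the full source keeps the expensive layers cached across source changes.

```dockerfile
FROM python:3.9-alpine

WORKDIR /app

# copy only the dependency manifest first; this layer (and the install
# below) stays cached as long as requirements.txt does not change
COPY yyyy/requirements.txt .
RUN pip install -r requirements.txt

# the frequently changing source comes last, so edits here only
# invalidate the final layers
COPY yyyy/ .
```

By contrast, a RUN git clone step always produces an unpredictable layer, which is one reason cloning outside the Dockerfile caches better.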

2 Comments

Please consider adding the .git folders of all the repos to the .dockerignore file - if you don't need the git history within the docker image.
Take a look at blog.callr.tech/… for loading an existing image up front and some other best practices to build docker images

This image / services combination worked for me:

default:
  image: docker:19-git
  services:
    - docker:19-dind
  before_script:
    - docker info
