To summarize from the comments: the goal is to build a Docker image out of the content of two other repositories. Therefore git and docker are needed in the same build stage - or at least that is the attempt. I outline several options here that can be used to achieve this.
Option 1: Migrating the fetching logic into the Dockerfile
Instead of messing around with a build image, I would migrate this logic into the Dockerfile. Handling such things within the Dockerfile is usually easier, and even if installing and later removing git adds another layer, it is still faster than trying to maintain a build image containing both docker and git.
How easy this is depends on your Dockerfile and the Docker base image you are using; with debian/alpine/etc. it is quite easy to achieve, since they ship with their own package managers.
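A minimal sketch of how that could look on an alpine base image - the repository URLs and target paths are placeholders, not taken from your setup:

```dockerfile
FROM alpine:3.19

# Install git only for the fetch, then strip the .git directories and
# remove git again in the same RUN, so the layer stays small.
# The URLs below are placeholders for your two repositories.
RUN apk add --no-cache git \
    && git clone --depth 1 https://gitlab.example.com/group/repo-a.git /src/repo-a \
    && git clone --depth 1 https://gitlab.example.com/group/repo-b.git /src/repo-b \
    && rm -rf /src/repo-a/.git /src/repo-b/.git \
    && apk del git
```

Doing the clone and the cleanup in a single RUN instruction matters here: splitting them would leave git and the history baked into an intermediate layer.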
Option 2: Building a dedicated build image containing docker and git
This option is the one I favor least. There is always an issue with properly setting up docker or with installing git on top. But I will outline the process here too.
What you need in this case is your own Docker image where you either:
- take a docker image and install git
- take a git image and install docker
- build a fresh image from the ground up
- (you can always try to figure out which package manager is used and install git in the script block)
But it adds complexity, is more effort than Option 1, and offers less safety than Option 3.
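If you go this route anyway, the smallest variant is usually the first one: take the official docker image and add git on top. The tag below is just an example:

```dockerfile
# The official docker images are alpine-based, so apk is available.
FROM docker:24-cli

# Add git on top of the docker client tooling.
RUN apk add --no-cache git
```

You would then push this image to your registry and reference it via `image:` in the build job.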
Option 3: Use the API instead of git (my recommended way)
Instead of using git to fetch the content, there is also the API: https://docs.gitlab.com/ee/api/repositories.html#get-file-archive
It allows you to download a specific ref as a zip/tar/etc. archive, which can be easier to consume than a git checkout. This can also be combined with Option 1, as it allows easy fetching of the content via curl.
This option also has the benefit of not downloading the git history, only the current state, which might improve build times.
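A rough sketch of a job using the archive endpoint - the project IDs, the ref, the image name, and the `CURL_TOKEN` variable are all assumptions you would replace with your own values:

```yaml
build:
  stage: build
  image: docker:24
  script:
    # Fetch the current state of both repositories as tar.gz archives
    # via the repository archive API - no git required.
    # Project IDs (123/456), the ref and CURL_TOKEN are placeholders.
    - 'curl --header "PRIVATE-TOKEN: $CURL_TOKEN" -o repo-a.tar.gz "https://gitlab.example.com/api/v4/projects/123/repository/archive.tar.gz?sha=main"'
    - 'curl --header "PRIVATE-TOKEN: $CURL_TOKEN" -o repo-b.tar.gz "https://gitlab.example.com/api/v4/projects/456/repository/archive.tar.gz?sha=main"'
    - tar -xzf repo-a.tar.gz
    - tar -xzf repo-b.tar.gz
    - docker build -t my-image .
```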
Option 4: Multiple build steps
Instead of trying to merge the docker build and the git checkout, you can split them into two jobs: the first one with git, fetching the repositories, and a second one for the docker build.
Important to note here is the artifacts directive, with which you can define which files are available in the next stage/job. Take a look at https://docs.gitlab.com/ee/ci/yaml/#artifacts, which is a good resource regarding that directive.
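A sketch of such a two-job pipeline - job names, images, and repository URLs are placeholders:

```yaml
stages:
  - fetch
  - build

fetch-sources:
  stage: fetch
  image: alpine/git
  script:
    # Repository URLs are placeholders.
    - git clone --depth 1 https://gitlab.example.com/group/repo-a.git repo-a
    - git clone --depth 1 https://gitlab.example.com/group/repo-b.git repo-b
  artifacts:
    paths:
      - repo-a/
      - repo-b/

build-image:
  stage: build
  image: docker:24
  script:
    # The artifacts from fetch-sources are restored into the workspace
    # here, so the Dockerfile can COPY from repo-a/ and repo-b/.
    - docker build -t my-image .
```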
Option 5: Using git submodules
Instead of doing the checkout manually, the other repositories could also be added as git submodules. These can be seen as subdirectories which point to other git repositories. There is special checkout behaviour attached, but with a closer look into submodules you should figure this out quite easily.
Be aware that you also need to set GIT_SUBMODULE_STRATEGY so the submodules will actually be fetched: https://docs.gitlab.com/ee/ci/runners/configure_runners.html#git-submodule-strategy
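The relevant pieces could look like this - submodule names, paths, and URLs are placeholders:

```yaml
# .gitmodules in your repository would contain something like:
#   [submodule "repo-a"]
#     path = repo-a
#     url = https://gitlab.example.com/group/repo-a.git

build:
  variables:
    # Tell the runner to fetch the submodules during checkout.
    GIT_SUBMODULE_STRATEGY: recursive
  image: docker:24
  script:
    # repo-a/ is now populated and can be used by the docker build.
    - docker build -t my-image .
```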