
I have multiple repositories in Bitbucket that need to use the same bitbucket-pipelines.yml (which contains the import of an exported bitbucket-pipelines.yml file). Instead of manually updating this file in every repository, I want to centralize it in a separate repository and have all other repositories import it dynamically.

How can I further centralize the imported Bitbucket pipeline in a single repository and share it across multiple repositories without manual duplication?

I tried using git subtree and git submodule. However, both create a folder instead of placing the file directly in the root of the target repository. Is there a way to share or link bitbucket-pipelines.yml at the root of multiple repositories?

What’s the best approach to achieve this in Bitbucket?

1 Answer


Sharing Bitbucket Pipelines configurations between repositories is something that has been desired and requested for a long time.

Up until August 2023, it was straight up impossible (as you can also read in this SO post).

Then, in March 2023, Bitbucket stated in the community forum that they were working on this feature and tracking it in this Jira issue: BCLOUD-14078: Sharing pipeline yml files.

Finally, since August 2023 it's possible, but on the Premium plan only (feature "Scale CI workflows"):
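In broad strokes, the central repository marks its pipeline definitions as exportable, and consumers reference them by repository, branch and definition name. Here's a minimal sketch; the repository name shared-pipelines and the definition name shared-default are placeholders, so check the Bitbucket docs for the exact syntax on your plan:

```yaml
# bitbucket-pipelines.yml in the central repo (assumed name: shared-pipelines)
export: true

definitions:
  pipelines:
    shared-default:
      - step:
          name: Build
          script:
            - echo "running the shared pipeline"
```

Each consuming repository then keeps only a stub file that imports the shared definition:

```yaml
# bitbucket-pipelines.yml in each consuming repo
pipelines:
  default:
    import: shared-pipelines:master:shared-default
```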

In my opinion, it being available only on Premium plans (still the case as of March 2025) is rather disappointing, since other CI/CD platforms like GitLab or GitHub have been providing this feature free of charge, for everyone, for years already.

It's also very limited in functionality compared to GitLab and GitHub: you can only customize the included pipeline via CI variables and secrets (as written in the Q&A section of that blog post). You can't override specific parts of the configuration (so far).


Further ideas on your workaround & more (dirty ideas section)

  • You might get a git submodule solution working by packing it in a pre-commit Git hook: pull in the submodule, move the file out of the folder, clean everything up, and stage the file (see the hook sketch after this list). ChatGPT can draft an implementation proposal for you, for example. Add the hook to the repo for version control - if you've got a Node.js project, Husky is a great package.

  • You could have a push-based solution, i.e. instead of pulling code in with Git submodules, you have a pipeline in the repo of your shared pipeline file that pushes the shared file to many other repos by doing a lot of git clone, cp ... and git push (see the pipeline sketch after this list). That's dirty, too, because now your shared repo needs to know about (= "depends on") all others that use it. To avoid unnecessarily triggering other pipelines, use [skip ci] in the commit message there.
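For the submodule workaround, a pre-commit hook could look roughly like this; the submodule path shared-pipelines is an assumption, and you'd place the script at .git/hooks/pre-commit (or wire it up via Husky):

```sh
#!/bin/sh
# Hypothetical pre-commit hook: copy the shared pipeline file out of the
# submodule into the repo root and stage it, so every commit carries the
# current version. "shared-pipelines" is an assumed submodule path.
set -e

# Make sure the submodule is initialized and at its latest remote state
git submodule update --init --remote shared-pipelines

# Bitbucket only picks up the file at the repository root
cp shared-pipelines/bitbucket-pipelines.yml bitbucket-pipelines.yml

# Stage the copy so it is included in the commit being made
git add bitbucket-pipelines.yml
```

For the push-based idea, the shared repo's own pipeline could fan the file out like this; the workspace name myworkspace, the repo list in TARGET_REPOS, and the app-password credentials are all assumptions you'd replace with your own:

```yaml
# Hypothetical bitbucket-pipelines.yml in the shared repo. TARGET_REPOS,
# BB_USER and BB_APP_PASSWORD are repository variables you define yourself.
pipelines:
  branches:
    master:
      - step:
          name: Fan out shared pipeline file
          script:
            - git config --global user.email "pipelines-bot@example.com"
            - git config --global user.name "Pipelines Bot"
            - |
              for repo in $TARGET_REPOS; do
                git clone "https://$BB_USER:$BB_APP_PASSWORD@bitbucket.org/myworkspace/$repo.git"
                cp bitbucket-pipelines.yml "$repo/bitbucket-pipelines.yml"
                cd "$repo"
                git add bitbucket-pipelines.yml
                # [skip ci] prevents this commit from triggering the consumer's pipeline
                git commit -m "Sync shared pipeline [skip ci]" || echo "already up to date"
                git push
                cd ..
              done
```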


4 Comments

Export/import is already implemented (as suggested in 'Share Pipeline Workflow Configuration'). My question is how to further centralize the imported Bitbucket pipeline in a single repository and share it across multiple repositories without manual duplication.
If you already export your single pipeline YML file in a central location and import it in other repos, that's considered to be centralized with no duplication. How do you envision "further centralization"?
The goal is to maintain a single source of truth for a specific file in a central Bitbucket repository and have it automatically available in the root directory of multiple other repositories. This would ensure that any updates to the file in the central repository immediately reflect in all consuming repositories, eliminating the need for manual duplication or synchronization. Is there an approach within Bitbucket Cloud to achieve this, such as submodules without nesting, or an alternative method?
What you do with exporting the pipeline config (from a single source of truth) to multiple other repos is already a solution without manual duplication or synchronization. That's best practice as you have it then. If you imagine not even having the bitbucket-pipelines.yml that imports the shared config: that would be considered bad and obscure, because there'd be pipelines running and no one could trace where they come from. And technically, Bitbucket needs to see such a file in the repo to know that you want to run a pipeline at all.

