Recursive Pipeline Element
Support for projects that depend on other projects is central to the overall BuildStream design and still needs to be implemented.
The specifics have not been finally decided, but it should work roughly as follows:
- A new pipeline element is added to the Element plugins.
- The pipeline element imports an external project via its sources.
- When it is time to build the pipeline element, it stages the external BuildStream project into a sandbox directory. No sandbox features are used to run anything; the directory only serves as a staging area holding the exact revision of the project it wants to depend on.
- After staging, the pipeline element will instantiate a Pipeline object (BuildStream internal) by loading that project. This pipeline will share the same Context object the main pipeline was created with, which indicates things like where the artifact cache is found, where sources are mirrored, and what target architecture we're building for.
- We will add this new pipeline to the running Scheduler, so that there remains only one Scheduler per BuildStream process, and elements from multiple pipelines may be built interleaved in whatever order the scheduler decides.
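The steps above can be sketched in miniature. None of the classes below are real BuildStream API; they are hypothetical stand-ins that only model the shape of the design, namely one shared Context and one shared Scheduler across the main and nested pipelines:

```python
# Hypothetical sketch of the pipeline element flow. Context, Scheduler,
# Pipeline and PipelineElement are stand-ins, not BuildStream API.
from dataclasses import dataclass, field


@dataclass
class Context:
    # Process-wide settings shared by every pipeline.
    artifactdir: str
    arch: str


@dataclass
class Scheduler:
    # A single scheduler for the whole buildstream process; elements
    # from any pipeline may be interleaved on its queue.
    queue: list = field(default_factory=list)

    def enqueue(self, element):
        self.queue.append(element)


class Pipeline:
    # Stand-in for the internal Pipeline object, loaded from a staged
    # project directory and sharing the parent's Context.
    def __init__(self, project_dir, context):
        self.project_dir = project_dir
        self.context = context
        # Stand-in for loading the project's elements from disk.
        self.elements = ["base.bst", "app.bst"]


class PipelineElement:
    # Hypothetical plugin: stages its sources to obtain a checkout of
    # the external project, then loads it as a nested pipeline.
    def __init__(self, sources):
        self.sources = sources

    def stage_sources(self, directory):
        # In the design above this stages the external project's exact
        # revision into a plain staging directory; no sandbox features
        # are needed to run anything, only to hold files.
        return directory

    def build(self, context, scheduler, staging_dir):
        project_dir = self.stage_sources(staging_dir)
        nested = Pipeline(project_dir, context)  # shares the Context
        for element in nested.elements:
            scheduler.enqueue(element)           # one shared Scheduler
        return nested


context = Context(artifactdir="/tmp/artifacts", arch="x86_64")
scheduler = Scheduler()
element = PipelineElement(sources=["git:external-project"])
nested = element.build(context, scheduler, "/tmp/staging")
assert nested.context is context  # nested pipeline reuses the Context
```

This is a sketch under stated assumptions, not an implementation; the key invariant it demonstrates is that loading a nested project never creates a second Context or Scheduler.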
Some things to consider are what happens when you stage a pipeline element, and what the output of a pipeline element is. Ideally the pipeline element creates an artifact for each element it builds, and its own output is like a stack: only a statement that the given cache keys are present and can be staged. When staging a pipeline element, it would then stage the individual artifacts in the correct deterministic order for a depending element to build on top of. A simpler (and more expensive, disk wise) option would be to compound the complete pipeline into a separate artifact, one single chunk containing everything which was built.
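The stack-like option can be illustrated with a toy sketch. The `cache_key()` digest and the dict-based artifact cache below are stand-ins for illustration, not BuildStream API; the point is that the pipeline element's output is just an ordered list of cache keys, and staging replays those artifacts in that deterministic order:

```python
# Toy model of the "stack-like" pipeline element output: build records
# an ordered list of cache keys; staging replays them in that order.
import hashlib


def cache_key(name: str, contents: str) -> str:
    # Stand-in for BuildStream's cache key computation: a stable
    # digest of the element name and its inputs.
    return hashlib.sha256((name + "\0" + contents).encode()).hexdigest()[:12]


def build_pipeline(elements, artifact_cache):
    # "Build" each element, publish its artifact to the cache, and
    # record its key. The returned key list is the entire output of
    # the pipeline element; no compound artifact is created.
    keys = []
    for name, contents in elements:
        key = cache_key(name, contents)
        artifact_cache[key] = contents
        keys.append(key)
    return keys


def stage_pipeline(keys, artifact_cache):
    # Stage the individual artifacts in the recorded order, so a
    # depending element always sees the same deterministic result.
    return [artifact_cache[key] for key in keys]


cache = {}
elements = [("base.bst", "base files"), ("app.bst", "app files")]
keys = build_pipeline(elements, cache)

# Rebuilding the same inputs yields identical keys (deterministic),
# and staging replays the artifacts in build order.
assert keys == build_pipeline(elements, cache)
assert stage_pipeline(keys, cache) == ["base files", "app files"]
```

The trade-off against the compound-artifact option is visible here: the stack approach stores each element's artifact once and shares it across pipelines, while a compound artifact would duplicate all of that content into one extra cached chunk.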