Hello @ArunRaaman,
Welcome to the MS Q&A platform.
Your understanding is correct. When you publish a pipeline in Azure Data Factory, all the existing pipelines also get published.
Regarding your second question, Data Factory doesn't allow cherry-picking of commits or selective publishing of resources. A publish always includes all changes made in the data factory. For the rare occasions when you do need to deploy an individual change, consider using a hotfix workflow. For more information, see Hotfix production environment.
Regarding your third question, yes, you can use the Git collaboration feature of ADF to manage your pipelines in feature branches and control which changes merge into the main branch.
In general, all development and testing work is done in feature branches and merged into the main (collaboration) branch when ready for deployment. Publishing picks up only the changes in the main branch and will not affect any changes in other feature branches.
You can create a pull request in the Git repository for the ADF instance to merge from a feature branch to the main branch in ADF. This will allow you to review and approve the changes before they are merged into the main branch. You can also configure branch policies to enforce certain rules, such as requiring approval from a specific user or group before a pull request can be merged.
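The branch workflow above can be illustrated locally with plain git. This is a minimal sketch using a throwaway repository and hypothetical pipeline JSON file names; in a real ADF setup, the feature branch is created from the ADF UI or your Git provider, and the merge into main happens through a pull request rather than a local `git merge`.

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Initialize a repo with a main (collaboration) branch
git init -q
git checkout -q -b main
git config user.email "demo@example.com"
git config user.name "Demo"
echo '{"name": "Pipeline1"}' > pipeline1.json
git add . && git commit -qm "Initial pipeline"

# Develop on a feature branch (hypothetical branch and file names)
git checkout -q -b feature/add-copy-pipeline
echo '{"name": "CopyPipeline"}' > copy-pipeline.json
git add . && git commit -qm "Add copy pipeline"

# When ready, merge into main (in ADF, this is done via a pull request,
# where branch policies can require reviews before the merge completes)
git checkout -q main
git merge -q --no-ff -m "Merge feature branch" feature/add-copy-pipeline
```

After the merge, only the main branch carries the new pipeline; publishing from ADF would then deploy it, while work in other feature branches remains untouched.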
Reference documents:
https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/data-factory/continuous-integration-delivery.md
https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/data-factory/continuous-integration-delivery-hotfix-environment.md
https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/data-factory/tutorial-control-flow-portal.md
I hope this helps. Please let me know if you have any further questions.
If this answers your question, please consider accepting it by clicking Accept answer, and up-voting it to help other community members find answers to similar questions.