I'm trying to install a PyPI package that is served through an alternative index, namely our own Azure DevOps Artifacts feed.
When I use this requirements.txt:
azure-common==1.1.25
azure-core==1.9.0
azure-functions==1.4.0
azure-identity==1.4.0
azure-keyvault-secrets==4.2.0
azure-storage-blob==12.6.0
azure-storage-file-datalake==12.2.0
google-api-core
google-api-python-client
google-auth
google-auth-httplib2
googleapis-common-protos
oauth2client==4.1.3
pyodbc==4.0.30
pandas==1.1.3
pyarrow==1.0.1
pyspark
ipython
everything installs fine (confirmed by listing the installed distributions with the following code):
import pkg_resources

# Print every distribution visible in the current environment
for d in pkg_resources.working_set:
    print(d)
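(An equivalent check with the standard library's importlib.metadata on Python 3.8+ would look roughly like this; I use the pkg_resources version above, so this variant is untested on the pool:)
from importlib.metadata import distributions

# List name and version of every installed distribution
for dist in distributions():
    print(dist.metadata["Name"], dist.version)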
When I add a package from the alternative index to the end of the requirements.txt, none of the packages defined in the file get installed (I anonymized the alternative index URL for security reasons):
azure-common==1.1.25
azure-core==1.9.0
azure-functions==1.4.0
azure-identity==1.4.0
azure-keyvault-secrets==4.2.0
azure-storage-blob==12.6.0
azure-storage-file-datalake==12.2.0
google-api-core
google-api-python-client
google-auth
google-auth-httplib2
googleapis-common-protos
oauth2client==4.1.3
pyodbc==4.0.30
pandas==1.1.3
pyarrow==1.0.1
pyspark
ipython
-i https://<PAT_NAME>:<PAT_TOKEN>@pkgs.dev.azure.com/<ORG>/<TEAM>/_packaging/Sparkhouse/pypi/simple/
sparkhouse
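For what it's worth, pip documents -i/--index-url in a requirements file as a global option that applies to the whole install, and the usual convention is to put it at the top of the file; I don't know whether Synapse parses the file the same way, but a variant I could try would be:
-i https://<PAT_NAME>:<PAT_TOKEN>@pkgs.dev.azure.com/<ORG>/<TEAM>/_packaging/Sparkhouse/pypi/simple/
azure-common==1.1.25
# ... the rest of the pinned packages exactly as above ...
sparkhouse
Note that -i replaces pypi.org entirely, so this only works if the feed has a PyPI upstream enabled; --extra-index-url would keep pypi.org as a fallback instead.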
When I use the same requirements file for Azure Functions or locally, all of the packages install successfully.
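(The local install that succeeds is roughly the following; adding -v shows which index each package resolves from:)
python -m pip install -r requirements.txt -v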
Why does the installation fail as soon as the alternative index line is present? And are there any logs of the package installation available anywhere for the Spark pool in Azure Synapse, so I can check what is going wrong?