Hello @Benitez Gaucho
Thanks for reaching out to us. If you are asking about Llama 2 on Azure, the answer is yes.
Llama 2 is now available in the model catalog in Azure Machine Learning.
At Microsoft Inspire, Microsoft and Meta expanded their AI partnership and announced support for the Llama 2 family of models on Azure and Windows. Llama 2 is the next generation of large language models (LLMs) developed and released by Meta. It is pretrained on 2 trillion tokens of public data and is designed to enable developers and organizations to build generative AI-powered tools and experiences. With this partnership, Microsoft is excited to be Meta’s preferred partner as Meta releases Llama 2 to commercial customers for the first time.
The model catalog, currently in public preview in Azure Machine Learning, is your hub for foundation models, empowering users to easily discover, customize, and operationalize large foundation models at scale. Native support for Llama 2 in the Azure Machine Learning model catalog lets users work with these models without having to manage any infrastructure or environment dependencies. It provides out-of-the-box support for model fine-tuning and evaluation, including a selection of optimizer libraries such as DeepSpeed and ORT (ONNX Runtime), which speed up fine-tuning, along with LoRA (Low-Rank Adaptation of Large Language Models), which greatly reduces the memory and compute requirements for fine-tuning. Deployments of Llama 2 models in Azure come standard with Azure AI Content Safety integration, offering a built-in, layered approach to safety that follows responsible AI best practices.
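As a minimal sketch of using the catalog programmatically, the azure-ai-ml (SDK v2) client can be pointed at the shared registry that hosts the Llama 2 models instead of at a workspace. The registry name `azureml-meta` and model name `Llama-2-7b` below are the ones referenced in the announcement; treat exact names, versions, and availability as assumptions and check the catalog UI for what is current.

```python
# Sketch (assumptions noted in comments): fetch a Llama 2 model entry from
# the Azure Machine Learning model catalog using the azure-ai-ml SDK v2.

REGISTRY_NAME = "azureml-meta"  # shared registry hosting the Llama 2 family (assumed name)
MODEL_NAME = "Llama-2-7b"       # other variants: Llama-2-13b, Llama-2-70b, -chat versions

def get_llama_model():
    """Return the latest catalog entry for the chosen Llama 2 model."""
    # Imports are local so the module can be inspected without the SDK installed.
    from azure.ai.ml import MLClient
    from azure.identity import DefaultAzureCredential

    # Point the client at the registry rather than a workspace, so no
    # workspace-level configuration is needed just to browse the catalog.
    registry_client = MLClient(
        credential=DefaultAzureCredential(),
        registry_name=REGISTRY_NAME,
    )
    # label="latest" resolves to the newest registered version of the model.
    return registry_client.models.get(name=MODEL_NAME, label="latest")
```

From the returned model entry you can proceed to create an online endpoint and deployment in your own workspace, which is the same flow the catalog UI's "Deploy" button automates.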
Please see the screenshot below for where to find it -
For more information, please refer to this blog - https://techcommunity.microsoft.com/t5/ai-machine-learning-blog/introducing-llama-2-on-azure/ba-p/3881233 - and please explore the feature and let me know if you have any questions.
Regards,
Yutong
-Please kindly accept the answer and vote 'Yes' if you found it helpful, to support the community. Thanks a lot.