Hi Saeed,
Welcome to Microsoft Learn Q&A!
I'll help you troubleshoot and confirm that your setup is correct:
- Set Up Azure Blob Storage:
- Create a storage account in Azure.
- Create a container within the storage account.
- Generate a Shared Access Signature (SAS) token or use Azure Active Directory (AAD) for authentication.
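For the authentication step, here is a minimal sketch of both options; the account URL, container name, and SAS token are placeholders, and the AAD option assumes the `azure-identity` package is installed:

```python
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential  # AAD option: pip install azure-identity

account_url = "https://<your-storage-account>.blob.core.windows.net"

# Option 1: authenticate with a SAS token
blob_service_client = BlobServiceClient(account_url, credential="<your-sas-token>")

# Option 2: authenticate with Azure Active Directory (the identity needs a data-plane
# role on the account, e.g. Storage Blob Data Contributor)
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())

container_client = blob_service_client.get_container_client("your_container")
```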
- Upload Data from Assistant 1:
- Use the Azure Storage SDK for Python (`azure-storage-blob`) to upload the DataFrame.
- Convert the DataFrame to a format like CSV or Parquet before uploading.

```python
from azure.storage.blob import BlobServiceClient
import pandas as pd

# Serialize the DataFrame to CSV in memory
df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
csv_data = df.to_csv(index=False)

# Upload the CSV text to the container as a blob
blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
blob_client.upload_blob(csv_data, overwrite=True)
```
- Download Data in Assistant 2: Use the same SDK to download the DataFrame.
```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
downloaded_blob = blob_client.download_blob()

# Wrap the downloaded text in StringIO so pandas treats it as a buffer rather than a file path
df = pd.read_csv(io.StringIO(downloaded_blob.content_as_text()))
```
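Since Parquet was mentioned as an alternative format, here is a sketch of the same round trip using Parquet instead of CSV; it assumes `pyarrow` (or `fastparquet`) is installed, and the connection string, container, and blob names are placeholders as before:

```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.parquet")

# Upload: serialize the DataFrame to Parquet bytes in memory, then upload them
df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
buffer = io.BytesIO()
df.to_parquet(buffer, index=False)
blob_client.upload_blob(buffer.getvalue(), overwrite=True)

# Download: read the blob bytes back into a DataFrame
data = blob_client.download_blob().readall()
df = pd.read_parquet(io.BytesIO(data))
```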
Another option is to create a REST API to handle data exchange between the assistants:
- Create an API: Use a framework like Flask or FastAPI to create API endpoints that accept and return data.

```python
from flask import Flask, request, jsonify
import pandas as pd

app = Flask(__name__)

@app.route('/upload', methods=['POST'])
def upload():
    data = request.json
    df = pd.DataFrame(data)
    # Save df to a shared location or process it
    return jsonify({"status": "success"})

@app.route('/download', methods=['GET'])
def download():
    # Load df from a shared location
    df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
    return df.to_json(orient='split')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```
- Assistant 1 Uploads Data: Send a POST request to the `/upload` endpoint with the DataFrame.

```python
import requests
import pandas as pd

df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
# orient='list' gives a simple {column: [values]} shape that pd.DataFrame() can rebuild directly
requests.post('http://your-api-endpoint/upload', json=df.to_dict(orient='list'))
```
- Assistant 2 Downloads Data: Send a GET request to the `/download` endpoint to retrieve the DataFrame.

```python
import io

import requests
import pandas as pd

response = requests.get('http://your-api-endpoint/download')
# The response body is the 'split'-oriented JSON string produced by to_json
df = pd.read_json(io.StringIO(response.text), orient='split')
```
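If you prefer FastAPI, a minimal equivalent sketch is below (run it with `uvicorn your_module:app --host 0.0.0.0 --port 5000`; the in-memory `store` dict is just a stand-in for whatever shared location you use, and the same client snippets above work unchanged):

```python
from fastapi import FastAPI, Response
import pandas as pd

app = FastAPI()
store = {}  # stand-in for a shared location (database, blob storage, etc.)

@app.post('/upload')
def upload(data: dict):
    store['df'] = pd.DataFrame(data)
    return {"status": "success"}

@app.get('/download')
def download():
    df = store.get('df', pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]}))
    # Return the same 'split'-oriented JSON body as the Flask version
    return Response(content=df.to_json(orient='split'), media_type='application/json')
```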
If the assistants are within the same network or can establish a secure connection, you can use direct data transfer methods like SFTP or SCP.
Steps to Use Direct Data Transfer:
- Set Up an SFTP Server:
- Configure an SFTP server accessible by both assistants.
- Assistant 1 Uploads Data:
- Use an SFTP client library like `paramiko` to upload the DataFrame.

```python
import paramiko
import pandas as pd

# Write the DataFrame to a local CSV file first
df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
df.to_csv('df.csv', index=False)

# Open an SFTP session and upload the file
transport = paramiko.Transport(('sftp_server', 22))
transport.connect(username='your_username', password='your_password')
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put('df.csv', '/remote/path/df.csv')
sftp.close()
transport.close()
```
- Assistant 2 Downloads Data:
- Use the same library to download the DataFrame, as in the sketch below.
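A minimal sketch for the download side, assuming the same placeholder host, credentials, and remote path as in the upload example:

```python
import paramiko
import pandas as pd

# Connect to the same SFTP server and pull the file Assistant 1 uploaded
transport = paramiko.Transport(('sftp_server', 22))
transport.connect(username='your_username', password='your_password')
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get('/remote/path/df.csv', 'df.csv')
sftp.close()
transport.close()

df = pd.read_csv('df.csv')
```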
Conclusion
Each method has its pros and cons, and the best approach depends on your specific requirements, such as security, ease of implementation, and network constraints. Azure Blob Storage is a robust solution for cloud-based data sharing, while API endpoints offer flexibility and direct data transfer methods provide simplicity for within-network scenarios.
If you continue to face issues with Azure Blob Storage, ensure that the connection string, container name, and blob name are correctly specified, and that the necessary permissions are granted.
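As a quick sanity check, you can try listing the blobs in the container; this sketch uses the same placeholder connection string and container name as the examples above:

```python
from azure.storage.blob import BlobServiceClient

blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
container_client = blob_service_client.get_container_client("your_container")

# If this raises an authentication or authorization error, recheck the connection
# string and the SAS permissions or role assignments on the container.
for blob in container_client.list_blobs():
    print(blob.name)
```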
Feel free to reach out if you need further assistance!
Best regards,
[Your Name]
Azure AI Services Support