Request for Guidance on Sharing Data Between Assistants in Different Sandboxes

Saeed Misaghian 0 Reputation points
2025-02-27T21:04:13.01+00:00

Hi Support Team,

I am working with two assistants (`AssistantAgent` from `autogen_agentchat.agents`), each with its own agent and access to various tools such as code interpreters and functions.

However, I'm facing a challenge when it comes to sharing data (such as a DataFrame) between the two assistants.

The flow is as follows:

  1. Assistant 1 generates a DataFrame (let's call it df).
  2. Assistant 2 needs access to the same df in order to perform further operations, but since the assistants operate in different sandboxes, Assistant 2's agents cannot access the data generated by Assistant 1.

I tried using Azure Blob Storage as a potential solution to upload and download the data between the assistants, but the agents are failing to upload the data into the blob storage despite having defined an upload function.
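
For context, the upload tool is registered with Assistant 1 roughly along these lines (a simplified illustration rather than my exact code; the model client, connection string, and container name are placeholders):

    from azure.storage.blob import BlobServiceClient
    from autogen_agentchat.agents import AssistantAgent

    def upload_dataframe_csv(csv_data: str, blob_name: str) -> str:
        """Tool the agent can call to push CSV text into Azure Blob Storage."""
        blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
        blob_client = blob_service_client.get_blob_client(container="your_container", blob=blob_name)
        blob_client.upload_blob(csv_data, overwrite=True)
        return f"uploaded {blob_name}"

    assistant_1 = AssistantAgent(
        name="assistant_1",
        model_client=model_client,  # the chat completion client configured elsewhere
        tools=[upload_dataframe_csv],
    )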

Could you please provide guidance on the following:

  1. What is the best method to share data securely between assistants in different sandboxes?
  2. Are there alternative data exchange mechanisms I could implement (e.g., via shared cloud storage, APIs, or direct data transfer)?

Thanks,
Saeed
Azure AI services
1 answer

  1. Amira Bedhiafi 33,866 Reputation points Volunteer Moderator
    2025-03-02T23:08:27.6633333+00:00

    Hi Saeed,

    Welcome to Microsoft Learn Q&A!

    Let me help you troubleshoot and check whether your setup is correct:

    1. Set Up Azure Blob Storage:
      • Create a storage account in Azure.
      • Create a container within the storage account.
      • Generate a Shared Access Signature (SAS) token or use Azure Active Directory (AAD) for authentication (a sketch of the AAD approach follows the download example below).
    2. Upload Data from Assistant 1:
      • Use the Azure Storage SDK for Python (azure-storage-blob) to upload the DataFrame.
      • Convert the DataFrame to a format like CSV or Parquet before uploading.
         
         from azure.storage.blob import BlobServiceClient
         import pandas as pd

         df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

         csv_data = df.to_csv(index=False)

         blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
         blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
         blob_client.upload_blob(csv_data, overwrite=True)

    3. Download Data in Assistant 2: Use the same SDK to download the DataFrame.

         from azure.storage.blob import BlobServiceClient
         import io
         import pandas as pd

         blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
         blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
         downloaded_blob = blob_client.download_blob()

         # content_as_text() returns a plain string, so wrap it in StringIO for read_csv
         df = pd.read_csv(io.StringIO(downloaded_blob.content_as_text()))

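    If you prefer the AAD option mentioned in step 1 over a connection string, a minimal sketch looks like this (assuming the azure-identity package is installed and the identity running the code has the Storage Blob Data Contributor role on the account; the account URL and container name are placeholders):

         from azure.identity import DefaultAzureCredential
         from azure.storage.blob import BlobServiceClient

         # DefaultAzureCredential resolves to a managed identity, environment
         # variables, or a developer sign-in, depending on where the code runs.
         credential = DefaultAzureCredential()
         blob_service_client = BlobServiceClient(
             account_url="https://your_account.blob.core.windows.net",
             credential=credential,
         )
         blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
         blob_client.upload_blob("col1,col2\n1,3\n2,4\n", overwrite=True)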

    Alternatively, you can create a REST API to handle data exchange between the assistants:

    1. Create an API: Use a framework like Flask or FastAPI to create an API endpoint that accepts and returns data.
         
         from flask import Flask, request, jsonify
         import pandas as pd

         app = Flask(__name__)

         @app.route('/upload', methods=['POST'])
         def upload():
             data = request.json
             df = pd.DataFrame(data)
             # Save df to a shared location or process it
             return jsonify({"status": "success"})

         @app.route('/download', methods=['GET'])
         def download():
             # Load df from a shared location
             df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
             return df.to_json(orient='split')

         if __name__ == '__main__':
             app.run(host='0.0.0.0', port=5000)

    2. Assistant 1 Uploads Data: Send a POST request to the /upload endpoint with the DataFrame.
         
         import requests
         import pandas as pd

         df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

         # orient='list' sends a simple column -> values mapping as the JSON payload
         requests.post('http://your-api-endpoint/upload', json=df.to_dict(orient='list'))

    3. Assistant 2 Downloads Data: Send a GET request to the /download endpoint to retrieve the DataFrame.
         
         import io
         import requests
         import pandas as pd

         response = requests.get('http://your-api-endpoint/download')

         # The endpoint returns the string produced by df.to_json(orient='split')
         df = pd.read_json(io.StringIO(response.text), orient='split')

      
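    Note that these endpoints accept and serve data without any authentication. If a shared secret is acceptable for your scenario (an assumption; for anything production-grade, prefer Azure AD tokens behind HTTPS), a lightweight option is to require an API key header on each call:

         from functools import wraps
         from flask import Flask, request, jsonify, abort

         app = Flask(__name__)
         API_KEY = "your_shared_secret"  # placeholder; load from configuration or Key Vault

         def require_api_key(view):
             @wraps(view)
             def wrapped(*args, **kwargs):
                 # Reject callers that do not present the expected key.
                 if request.headers.get("X-Api-Key") != API_KEY:
                     abort(401)
                 return view(*args, **kwargs)
             return wrapped

         @app.route('/upload', methods=['POST'])
         @require_api_key
         def upload():
             # ... same body as the /upload handler shown earlier ...
             return jsonify({"status": "success"})

    The assistants would then pass headers={'X-Api-Key': 'your_shared_secret'} with their requests.post and requests.get calls.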

    If the assistants are within the same network or can establish a secure connection, you can use direct data transfer methods like SFTP or SCP.

    Steps to Use Direct Data Transfer:

    1. Set Up an SFTP Server:
      • Configure an SFTP server accessible by both assistants.
    2. Assistant 1 Uploads Data:
      • Use an SFTP client library like paramiko to upload the DataFrame.
         
         import paramiko
         import pandas as pd

         df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})
         df.to_csv('df.csv', index=False)

         transport = paramiko.Transport(('sftp_server', 22))
         transport.connect(username='your_username', password='your_password')
         sftp = paramiko.SFTPClient.from_transport(transport)
         sftp.put('df.csv', '/remote/path/df.csv')
         sftp.close()
         transport.close()

      
    3. Assistant 2 Downloads Data:
      • Use the same library to download the DataFrame.

         import paramiko
         import pandas as pd

         transport = paramiko.Transport(('sftp_server', 22))
         transport.connect(username='your_username', password='your_password')
         sftp = paramiko.SFTPClient.from_transport(transport)
         sftp.get('/remote/path/df.csv', 'df.csv')
         sftp.close()
         transport.close()

         df = pd.read_csv('df.csv')

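    The examples above use password authentication for brevity; paramiko also supports key-based authentication if you want to avoid sharing passwords between sandboxes (a sketch; the key path is a placeholder):

         import paramiko

         # Authenticate with a private key instead of a password.
         key = paramiko.RSAKey.from_private_key_file('/path/to/private_key')
         transport = paramiko.Transport(('sftp_server', 22))
         transport.connect(username='your_username', pkey=key)
         sftp = paramiko.SFTPClient.from_transport(transport)
         sftp.put('df.csv', '/remote/path/df.csv')
         sftp.close()
         transport.close()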

    Conclusion

    Each method has its pros and cons, and the best approach depends on your specific requirements, such as security, ease of implementation, and network constraints. Azure Blob Storage is a robust solution for cloud-based data sharing, API endpoints offer more flexibility, and direct data transfer is the simplest option when both assistants sit within the same network.

    If you continue to face issues with Azure Blob Storage, ensure that the connection string, container name, and blob name are correctly specified, and that the necessary permissions are granted.
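
    To see exactly why the agents' uploads are failing, it can help to run the upload once outside the agent loop and surface the service error explicitly, for example (connection string, container, and blob name are placeholders):

         from azure.core.exceptions import AzureError
         from azure.storage.blob import BlobServiceClient

         try:
             blob_service_client = BlobServiceClient.from_connection_string("your_connection_string")
             blob_client = blob_service_client.get_blob_client(container="your_container", blob="df.csv")
             blob_client.upload_blob("col1,col2\n1,3\n2,4\n", overwrite=True)
             print("upload succeeded")
         except AzureError as exc:
             # Typical causes: a malformed connection string or expired SAS token,
             # a container that does not exist, or missing data-plane permissions.
             print(f"upload failed: {exc}")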

    Feel free to reach out if you need further assistance!

    Best regards,

    Amira

