Thanks for reaching out. You need to register your storage account as a datastore in your Azure ML workspace. Then write the dataframe to a local file and upload that file to the datastore, as shown below (see this post as well):
import os
from azureml.core import Workspace

subscription_id = 'id'
resource_group = 'resource group'
workspace_name = 'workspace name'
ws = Workspace(subscription_id, resource_group, workspace_name)

# Write the dataframe to a local file (e.g. csv, parquet);
# make sure the source directory exists first
os.makedirs('data', exist_ok=True)
local_path = 'data/prepared.csv'
df.to_csv(local_path, index=False)

# Get the workspace's default datastore to upload the prepared data
datastore = ws.get_default_datastore()

# Upload the local files from src_dir to the target_path in the datastore
datastore.upload(src_dir='data', target_path='data', overwrite=True)
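Before uploading, it can help to sanity-check that the local write step round-trips cleanly. This is a minimal local sketch with a toy dataframe standing in for your prepared data (no Azure connection needed; the column names are just placeholders):

```python
import os
import pandas as pd

# Toy dataframe standing in for the prepared data (hypothetical columns)
df = pd.DataFrame({'a': [1, 2], 'b': ['x', 'y']})

# Ensure the source directory exists before writing
os.makedirs('data', exist_ok=True)
local_path = 'data/prepared.csv'
df.to_csv(local_path, index=False)

# Read the file back to confirm it is well-formed before uploading
roundtrip = pd.read_csv(local_path)
print(roundtrip.equals(df))  # True
```

Writing with `index=False` avoids an extra unnamed index column appearing when the file is later read back from the datastore.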
If you continue to see errors, please share your code and the exact error message so we can investigate further. Thanks!