Hi,
I'm using the Azure SDK for .NET (Azure.Storage.Files.DataLake) to manipulate files on a Data Lake Storage Gen2 account.
Within an Azure Function, I would like to append some data to a CSV file stored on the data lake.
I came up with the method below, which should work according to the documentation (or I did not fully understand it).
The problem is that the data is never 'flushed' to the file: after the call, the file still has its original content.
I can't figure out what's going on here, I'm afraid :-(
Any tips?
Regards,
Sven Peeters
PS: I must append the data incrementally, otherwise memory consumption can become an issue here.
public void AddFileContents(string fullPath, string content, string leaseId = null)
{
    DataLakeFileClient dataLakeFileClient = GetFileSystemClient().GetFileClient(fullPath);
    dataLakeFileClient.CreateIfNotExists();

    // Append the new bytes at the current end of the file.
    long currentLength = dataLakeFileClient.GetProperties().Value.ContentLength;
    byte[] byteArray = Encoding.UTF8.GetBytes(content);
    using (MemoryStream mStream = new MemoryStream(byteArray))
    {
        dataLakeFileClient.Append(mStream, currentLength, leaseId: leaseId);
    }

    // Flush at the offset where the append started.
    dataLakeFileClient.Flush(
        position: currentLength,
        close: true,
        conditions: new DataLakeRequestConditions() { LeaseId = leaseId });
}
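PPS: rereading the docs, I wonder whether Flush's position should be the length of the file *after* the appended data rather than the offset where the append started. Here is a sketch of that variant, in case it helps pinpoint my misunderstanding (untested on my side; GetFileSystemClient() is the same helper used above):

```csharp
using System.IO;
using System.Text;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;

public void AddFileContentsVariant(string fullPath, string content, string leaseId = null)
{
    DataLakeFileClient fileClient = GetFileSystemClient().GetFileClient(fullPath);
    fileClient.CreateIfNotExists();

    // Offset where the new data should land: the current end of the file.
    long offset = fileClient.GetProperties().Value.ContentLength;
    byte[] bytes = Encoding.UTF8.GetBytes(content);

    using (MemoryStream stream = new MemoryStream(bytes))
    {
        // Stage the new bytes at the current end of the file.
        fileClient.Append(stream, offset, leaseId: leaseId);
    }

    // My alternate reading of the docs: position = total length of the file
    // once this data is committed, i.e. old length + bytes just appended.
    fileClient.Flush(
        position: offset + bytes.Length,
        close: true,
        conditions: new DataLakeRequestConditions() { LeaseId = leaseId });
}
```

The only difference from the method above is the position passed to Flush (offset + bytes.Length instead of offset).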