Hello @Tarun Vashisth, and thank you for reaching out. The short answer is yes, both operations should be possible.
I'm not sure whether you are using a custom transform or a built-in transform, but it is possible with both. You can accomplish it as follows:
First step:
Create a Transform whose preset includes an AudioOverlay filter, as below:
{
  "@odata.type": "#Microsoft.Media.StandardEncoderPreset",
  "filters": {
    "rotation": "Auto",
    "overlays": [
      {
        "@odata.type": "#Microsoft.Media.AudioOverlay",
        "inputLabel": "audioOverlay",
        "start": "PT5S",
        "end": "PT15S",
        "fadeInDuration": "PT5S",
        "fadeOutDuration": "PT5S",
        "audioGainLevel": 0.5
      }
    ]
  },
  "codecs": [
    {
      "@odata.type": "#Microsoft.Media.AacAudio",
      "channels": 2,
      "samplingRate": 48000,
      "bitrate": 128000,
      "profile": "AacLc"
    },
    {
      "@odata.type": "#Microsoft.Media.H264Video",
      "keyFrameInterval": "PT2S",
      "stretchMode": "AutoSize",
      "syncMode": "Auto",
      "sceneChangeDetection": false,
      "rateControlMode": "ABR",
      "complexity": "Balanced",
      "layers": [
        {
          "width": "1280",
          "height": "720",
          "label": "1280x720",
          "bitrate": 4500000,
          "maxBitrate": 4500000,
          "bFrames": 3,
          "frameRate": "0",
          "slices": 0,
          "adaptiveBFrame": true,
          "profile": "Auto",
          "level": "auto",
          "bufferWindow": "PT5S",
          "referenceFrames": 3,
          "crf": 23,
          "entropyMode": "Cabac"
        }
      ]
    }
  ],
  "formats": [
    {
      "@odata.type": "#Microsoft.Media.Mp4Format",
      "filenamePattern": "Video-{Basename}-{Label}-{Bitrate}{Extension}",
      "outputFiles": []
    }
  ]
}
You can control the overlay with the following properties:
- fadeInDuration & fadeOutDuration: control the fade-in and fade-out transition periods
- start & end: control when the overlay audio starts and ends, relative to the input video timeline
- audioGainLevel: controls the volume of the overlay audio; the value is between 0 and 1.0
For more information, you can check our REST API documentation here: https://learn.microsoft.com/en-us/rest/api/media/jobs/create?tabs=HTTP#audiooverlay
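If you prefer to build the same preset in code rather than raw JSON, here is a minimal sketch using the Microsoft.Azure.Management.Media .NET SDK (the same SDK used in the AMS v3 samples). The method name and the client, resourceGroup, accountName, and transformName parameters are placeholders of my own, not values from your setup, and I have trimmed the H264 layer down to the essentials:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

// Sketch of a helper that creates a transform equivalent to the JSON preset above.
// "client" is assumed to be an already-authenticated IAzureMediaServicesClient.
static async Task<Transform> CreateAudioOverlayTransformAsync(
    IAzureMediaServicesClient client, string resourceGroup, string accountName, string transformName)
{
    var preset = new StandardEncoderPreset
    {
        Filters = new Filters
        {
            Overlays = new List<Overlay>
            {
                new AudioOverlay
                {
                    InputLabel = "audioOverlay",              // must match the label on the overlay job input
                    Start = TimeSpan.FromSeconds(5),
                    End = TimeSpan.FromSeconds(15),
                    FadeInDuration = TimeSpan.FromSeconds(5),
                    FadeOutDuration = TimeSpan.FromSeconds(5),
                    AudioGainLevel = 0.5
                }
            }
        },
        Codecs = new List<Codec>
        {
            new AacAudio(channels: 2, samplingRate: 48000, bitrate: 128000, profile: AacAudioProfile.AacLc),
            new H264Video
            {
                KeyFrameInterval = TimeSpan.FromSeconds(2),
                Layers = new List<H264Layer>
                {
                    new H264Layer(bitrate: 4500000) { Width = "1280", Height = "720", Label = "1280x720" }
                }
            }
        },
        Formats = new List<Format>
        {
            new Mp4Format(filenamePattern: "Video-{Basename}-{Label}-{Bitrate}{Extension}")
        }
    };

    return await client.Transforms.CreateOrUpdateAsync(
        resourceGroup, accountName, transformName,
        new List<TransformOutput> { new TransformOutput(preset) });
}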
Second step:
When you submit the job, you need to pass the overlay audio asset as an additional input. We do not have a sample specifically for audio overlay, but there is an image overlay sample here, which you can use as a starting point: https://github.com/Azure-Samples/media-services-v3-dotnet/blob/main/VideoEncoding/Encoding_OverlayImage/Program.cs. The most important part is below:
List<JobInput> jobInputs = new List<JobInput>()
{
    new JobInputAsset(assetName: inputAssetName),                      // the video to be encoded
    new JobInputAsset(assetName: overlayAssetName, label: OverlayLabel) // the overlay asset; label must match the transform's inputLabel
};
Notice that you submit two assets as input: the first input asset is the video to be encoded, and the second one is the audio overlay. The label on the second input must match the inputLabel you set in your transform ("audioOverlay" in the example above).
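For completeness, here is a minimal sketch of submitting such a job with the same .NET SDK. It reuses the jobInputs list from the snippet above; resourceGroup, accountName, transformName, jobName, and outputAssetName are placeholder names I've introduced for illustration, and "client" is again an authenticated IAzureMediaServicesClient:

// Sketch: submit a job whose inputs are the video asset plus the labelled audio overlay asset.
// Assumes the output asset has already been created.
Job job = await client.Jobs.CreateAsync(
    resourceGroup,
    accountName,
    transformName,
    jobName,
    new Job
    {
        Input = new JobInputs(inputs: jobInputs),   // the two JobInputAsset entries shown above
        Outputs = new List<JobOutput>
        {
            new JobOutputAsset(outputAssetName)
        }
    });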
If you call the REST API directly instead, the request payload will look something like this:
PUT https://management.azure.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/contosoresources/providers/Microsoft.Media/mediaServices/mediaservices/transforms/createdTransform/jobs/job1?api-version=2021-06-01
{
  "properties": {
    "input": {
      "@odata.type": "#Microsoft.Media.JobInputs",
      "inputs": [
        {
          "@odata.type": "#Microsoft.Media.JobInputAsset",
          "assetName": "Ignite"
        },
        {
          "@odata.type": "#Microsoft.Media.JobInputAsset",
          "assetName": "AudioSampleMP3-1m26s",
          "label": "audioOverlay"
        }
      ]
    },
    "outputs": [
      {
        "@odata.type": "#Microsoft.Media.JobOutputAsset",
        "assetName": "AudioFadeInFadeOut-20230126133224"
      }
    ]
  }
}
For more information about job creation, you can check the documentation here: https://learn.microsoft.com/en-us/rest/api/media/jobs/create?tabs=HTTP