Scheduling Live Events

JN 1 Reputation point

I want to create a list of live events happening in the future, each with a live output and a streaming locator so that I can distribute them to viewers.
But as far as I can see, Live Outputs are active from the time they are created and until deleted, and they cannot be created if the Live Event (i.e. ingest point) is not running and receiving streams. I also want to use the same Live Event so I do not need to configure a new ingest point for each event.
Must I create the Live Output just-in-time when the live stream should start? What is the best practice in such a case?

Azure Media Services
A group of Azure services that includes encoding, format conversion, on-demand streaming, content protection, and live streaming services.

3 answers

  1. John Deutscher (MSFT) 2,126 Reputation points

    We recently introduced a new, lower-cost StandBy state for Live Events so that you can hold a small hot pool of live events in your application that are "ready to go". You can then hand one out to the application, update any settings needed with an Update call, and transition it to the "Running" state, where it will be ready to receive RTMP ingest data.

    I would create the LiveOutput ahead of time, creating the streaming locator before even moving to Running. That way you have the LiveOutput streaming URL ready to go before ingest even begins.

    See a similar flow in this Node.js sample (it does not use standby yet...) -

    Does that sound like it could help your specific scenario?
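    A rough sketch of that flow in Node.js is below, using method names from the @azure/arm-mediaservices SDK. The resource group, account, and entity names are placeholders, and the SDK calls require real credentials, so treat this as an outline rather than a finished implementation:

```javascript
// Pure helper: the LiveOutput body links the output to a pre-created
// asset and sets the rolling archive window (ISO 8601 duration).
function buildLiveOutputParameters(assetName, archiveWindowMinutes) {
  return {
    assetName,
    archiveWindowLength: `PT${archiveWindowMinutes}M`,
  };
}

// Illustrative sequence (not executed here): hold the event in StandBy,
// prepare everything needed for playback, then start it just in time.
async function prepareAndStart(client, rg, account, liveEventName, assetName) {
  // 1) Keep the event in the lower-cost StandBy state ahead of time.
  await client.liveEvents.beginAllocateAndWait(rg, account, liveEventName);

  // 2) Create the asset, LiveOutput, and streaming locator before moving
  //    to Running, so the streaming URL is known before ingest begins.
  await client.assets.createOrUpdate(rg, account, assetName, {});
  await client.liveOutputs.beginCreateAndWait(
    rg, account, liveEventName, `${assetName}-output`,
    buildLiveOutputParameters(assetName, 60)
  );
  await client.streamingLocators.create(rg, account, `${assetName}-locator`, {
    assetName,
    streamingPolicyName: "Predefined_ClearStreamingOnly",
  });

  // 3) Transition StandBy -> Running when the event should go live.
  await client.liveEvents.beginStartAndWait(rg, account, liveEventName);
}
```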


  2. JN 1 Reputation point

    While this might be a good flow, it will not work for my scenario, as it would require reconfiguring the live encoders with a new ingest point at the beginning of each live event.
    The process I am now trying to get working (will update) is as follows. I would be happy to hear your thoughts:
    * One Live Event for each encoder. Starting and stopping them will affect cost but not the technical infrastructure (so if the interval between broadcasts is small, we will keep them running).
    * Pre-create assets and streaming locators for each broadcast. The streaming locators will already go into the front end.
    * At the time of the broadcast (or 10 minutes prior), we will create the live output and link it to the appropriate asset.
    * At the end of the broadcast, we will delete the live output, with the asset and streaming locator remaining for catch-up.
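    The per-broadcast lifecycle in the bullets above could be sketched roughly like this. The 10-minute lead time and all names are placeholders, and the SDK calls mirror the @azure/arm-mediaservices client without being executed here:

```javascript
// Pure helper: when to create the LiveOutput, given the scheduled
// broadcast start and a lead time (e.g. 10 minutes).
function liveOutputCreateTime(broadcastStart, leadMinutes) {
  return new Date(broadcastStart.getTime() - leadMinutes * 60 * 1000);
}

// Illustrative per-broadcast lifecycle against an already-running
// Live Event (not executed here; requires real credentials).
async function runBroadcast(client, rg, account, liveEventName, assetName) {
  // The asset and streaming locator were pre-created, so the front end
  // already has the playback URL. Shortly before air time, attach a
  // LiveOutput to the running Live Event:
  await client.liveOutputs.beginCreateAndWait(
    rg, account, liveEventName, `${assetName}-output`,
    { assetName, archiveWindowLength: "PT60M" }
  );

  // ... broadcast runs ...

  // Afterwards, delete only the LiveOutput; the asset and streaming
  // locator stay behind for catch-up viewing.
  await client.liveOutputs.beginDeleteAndWait(
    rg, account, liveEventName, `${assetName}-output`
  );
}
```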



  3. John Deutscher (MSFT) 2,126 Reputation points

    It should not be necessary to configure a new ingest point at the beginning of each live event.

    You can force the ingest URL to be static and deterministic each time.

    1) Set useStaticHostname to true
    2) Always set an accessToken on the LiveEventInput - set it to a static GUID value like

    accessToken: "acf7b6ef-8a37-425f-b8fc-51c2d6a5a86a", // Use this value when you want to make sure the ingest URL is static and always the same. If omitted, the service will generate a random GUID value.

    I made comments on this in the sample code here -
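    Put together, the Live Event creation body with those two settings might look like this. The GUID is the example value from above; the location and protocol are placeholders:

```javascript
// Illustrative LiveEvent creation body. With useStaticHostname and a
// fixed accessToken, the RTMP ingest URL stays the same across
// stop/start cycles, so encoders never need reconfiguring.
function buildLiveEventParameters(location) {
  return {
    location,
    useStaticHostname: true, // keep the ingest hostname stable
    input: {
      streamingProtocol: "RTMP",
      // A fixed access token makes the ingest URL deterministic; if
      // omitted, the service generates a random GUID value each time.
      accessToken: "acf7b6ef-8a37-425f-b8fc-51c2d6a5a86a",
    },
  };
}
```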

    The rest of your workflow looks fine, but note that you can also pre-create LiveOutputs ahead of time if needed. The 10 minutes is just arbitrary, as you noted.

    The only other comment is that if you clean up your Asset and Streaming Locators, you force a cache bust on the CDN for live-to-VOD delivery. If you do not clean those up, and instead continue to use the same Streaming Locator for the VOD asset, your viewers gain the benefit of content that is already in their local CDN POP cache. It also avoids creating a second VOD asset and locator, which would double the egress out to the CDN.
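    The trade-off can be summarized in a tiny (hypothetical) helper: keeping the original streaming locator for VOD preserves what the CDN edge caches already hold, while a full cleanup forces a cache bust and a second round of CDN egress:

```javascript
// Which resources to delete after a broadcast ends. Keeping the asset
// and locator means live-to-VOD playback reuses warm CDN POP caches;
// deleting them forces a cache bust and a new VOD asset/locator.
function resourcesToDelete(keepForVod) {
  return keepForVod
    ? ["liveOutput"]
    : ["liveOutput", "asset", "streamingLocator"];
}
```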
