Quickstart: Add 1:1 video calling to your app

Get started with Azure Communication Services by using the Communication Services Calling SDK to add 1:1 video calling to your app. You'll learn how to start and answer a video call by using the Azure Communication Services Calling SDK for JavaScript.

Sample code

If you'd like to skip ahead to the end, you can download this quickstart as a sample on GitHub.

Note

You can use the Azure Communication Services UI Library to start outgoing calls to an Azure Communication Services user. The UI Library lets developers add a VoIP-enabled call client to their application with only a couple of lines of code.

Prerequisites

Setting up

Create a new Node.js application

Open your terminal or command window, create a new directory for your app, and navigate to it.

mkdir calling-quickstart && cd calling-quickstart

Install the package

Use the npm install command to install the Azure Communication Services Calling SDK for JavaScript.

Important

This quickstart uses Azure Communication Services Calling SDK version 1.4.4.

npm install @azure/communication-common --save
npm install @azure/communication-calling@1.4.4 --save

Set up the app framework

This quickstart uses webpack to bundle the application assets. Run the following command to install the webpack, webpack-cli, and webpack-dev-server npm packages and list them as development dependencies in your package.json:

npm install webpack@4.42.0 webpack-cli@3.3.11 webpack-dev-server@3.10.3 --save-dev

Create an index.html file in the root directory of your project. You'll use this file to configure a basic layout that lets the user place a 1:1 video call. Here's the code:

<!-- index.html -->
<!DOCTYPE html>
<html>
    <head>
        <title>Azure Communication Services - Calling Web SDK</title>
        <link rel="stylesheet" type="text/css" href="styles.css"/>
    </head>
    <body>
        <h4>Azure Communication Services - Calling Web SDK</h4>
        <input id="user-access-token"
            type="text"
            placeholder="User access token"
            style="margin-bottom:1em; width: 500px;"/>
        <button id="initialize-call-agent" type="button">Initialize Call Agent</button>
        <br>
        <br>
        <input id="callee-acs-user-id"
            type="text"
            placeholder="Enter callee's Azure Communication Services user identity in format: '8:acs:resourceId_userId'"
            style="margin-bottom:1em; width: 500px; display: block;"/>
        <button id="start-call-button" type="button" disabled="true">Start Call</button>
        <button id="hangup-call-button" type="button" disabled="true">Hang up Call</button>
        <button id="accept-call-button" type="button" disabled="true">Accept Call</button>
        <button id="start-video-button" type="button" disabled="true">Start Video</button>
        <button id="stop-video-button" type="button" disabled="true">Stop Video</button>
        <br>
        <br>
        <div id="connectedLabel" style="color: #13bb13;" hidden>Call is connected!</div>
        <br>
        <div id="remoteVideosGallery" style="width: 40%;" hidden>Remote participants' video streams:</div>
        <br>
        <div id="localVideoContainer" style="width: 30%;" hidden>Local video stream:</div>
        <!-- points to the bundle generated from client.js -->
        <script src="./bundle.js"></script>
    </body>
</html>

The following classes and interfaces handle some of the major features of the Azure Communication Services Calling SDK:

Name Description
CallClient The main entry point to the Calling SDK.
AzureCommunicationTokenCredential Implements the CommunicationTokenCredential interface, which is used to instantiate callAgent.
CallAgent Used to start and manage calls.
DeviceManager Used to manage media devices.
Call Used to represent a call.
LocalVideoStream Used to create a local video stream for a camera device on the local system.
RemoteParticipant Used to represent a remote participant in the call.
RemoteVideoStream Used to represent a remote video stream from a remote participant.

Create a file in the root directory of your project called client.js to contain the application logic for this quickstart. Add the following code to client.js:

// Make sure to install the necessary dependencies
const { CallClient, VideoStreamRenderer, LocalVideoStream } = require('@azure/communication-calling');
const { AzureCommunicationTokenCredential } = require('@azure/communication-common');
const { AzureLogger, setLogLevel } = require("@azure/logger");
// Set the log level and output
setLogLevel('verbose');
AzureLogger.log = (...args) => {
    console.log(...args);
};

// Calling web sdk objects
let callAgent;
let deviceManager;
let call;
let incomingCall;
let localVideoStream;
let localVideoStreamRenderer;

// UI widgets
let userAccessToken = document.getElementById('user-access-token');
let calleeAcsUserId = document.getElementById('callee-acs-user-id');
let initializeCallAgentButton = document.getElementById('initialize-call-agent');
let startCallButton = document.getElementById('start-call-button');
let hangUpCallButton = document.getElementById('hangup-call-button');
let acceptCallButton = document.getElementById('accept-call-button');
let startVideoButton = document.getElementById('start-video-button');
let stopVideoButton = document.getElementById('stop-video-button');
let connectedLabel = document.getElementById('connectedLabel');
let remoteVideosGallery = document.getElementById('remoteVideosGallery');
let localVideoContainer = document.getElementById('localVideoContainer');

/**
 * Using the CallClient, initialize a CallAgent instance with a CommunicationTokenCredential, which enables us to make outgoing calls and receive incoming calls. 
 * You can then use the CallClient.getDeviceManager() API instance to get the DeviceManager.
 */
initializeCallAgentButton.onclick = async () => {
    try {
        const callClient = new CallClient(); 
        tokenCredential = new AzureCommunicationTokenCredential(userAccessToken.value.trim());
        callAgent = await callClient.createCallAgent(tokenCredential)
        // Set up a camera device to use.
        deviceManager = await callClient.getDeviceManager();
        await deviceManager.askDevicePermission({ video: true });
        await deviceManager.askDevicePermission({ audio: true });
        // Listen for an incoming call to accept.
        callAgent.on('incomingCall', async (args) => {
            try {
                incomingCall = args.incomingCall;
                acceptCallButton.disabled = false;
                startCallButton.disabled = true;
            } catch (error) {
                console.error(error);
            }
        });

        startCallButton.disabled = false;
        initializeCallAgentButton.disabled = true;
    } catch(error) {
        console.error(error);
    }
}

/**
 * Place a 1:1 outgoing video call to a user
 * Add an event listener to initiate a call when the `startCallButton` is clicked:
 * First you have to enumerate local cameras using the deviceManager `getCameras` API.
 * In this quickstart we're using the first camera in the collection. Once the desired camera is selected, a
 * LocalVideoStream instance will be constructed and passed within `videoOptions` as an item within the
 * localVideoStream array to the call method. Once your call connects it will automatically start sending a video stream to the other participant. 
 */
startCallButton.onclick = async () => {
    try {
        const localVideoStream = await createLocalVideoStream();
        const videoOptions = localVideoStream ? { localVideoStreams: [localVideoStream] } : undefined;
        call = callAgent.startCall([{ communicationUserId: calleeAcsUserId.value.trim() }], { videoOptions });
        // Subscribe to the call's properties and events.
        subscribeToCall(call);
    } catch (error) {
        console.error(error);
    }
}

/**
 * Accepting an incoming call with video
 * Add an event listener to accept a call when the `acceptCallButton` is clicked:
 * After subscribing to the `CallAgent.on('incomingCall')` event, you can accept the incoming call.
 * You can pass the local video stream which you want to use to accept the call with.
 */
acceptCallButton.onclick = async () => {
    try {
        const localVideoStream = await createLocalVideoStream();
        const videoOptions = localVideoStream ? { localVideoStreams: [localVideoStream] } : undefined;
        call = await incomingCall.accept({ videoOptions });
        // Subscribe to the call's properties and events.
        subscribeToCall(call);
    } catch (error) {
        console.error(error);
    }
}

/**
 * Subscribe to a call obj.
 * Listen for property changes and collection updates.
 */
subscribeToCall = (call) => {
    try {
        // Inspect the initial call.id value.
        console.log(`Call Id: ${call.id}`);
        //Subscribe to call's 'idChanged' event for value changes.
        call.on('idChanged', () => {
            console.log(`Call Id changed: ${call.id}`); 
        });

        // Inspect the initial call.state value.
        console.log(`Call state: ${call.state}`);
        // Subscribe to call's 'stateChanged' event for value changes.
        call.on('stateChanged', async () => {
            console.log(`Call state changed: ${call.state}`);
            if(call.state === 'Connected') {
                connectedLabel.hidden = false;
                acceptCallButton.disabled = true;
                startCallButton.disabled = true;
                hangUpCallButton.disabled = false;
                startVideoButton.disabled = false;
                stopVideoButton.disabled = false;
                remoteVideosGallery.hidden = false;
            } else if (call.state === 'Disconnected') {
                connectedLabel.hidden = true;
                startCallButton.disabled = false;
                hangUpCallButton.disabled = true;
                startVideoButton.disabled = true;
                stopVideoButton.disabled = true;
                console.log(`Call ended, call end reason={code=${call.callEndReason.code}, subCode=${call.callEndReason.subCode}}`);
            }   
        });

        call.localVideoStreams.forEach(async (lvs) => {
            localVideoStream = lvs;
            await displayLocalVideoStream();
        });
        call.on('localVideoStreamsUpdated', e => {
            e.added.forEach(async (lvs) => {
                localVideoStream = lvs;
                await displayLocalVideoStream();
            });
            e.removed.forEach(lvs => {
               removeLocalVideoStream();
            });
        });
        
        // Inspect the call's current remote participants and subscribe to them.
        call.remoteParticipants.forEach(remoteParticipant => {
            subscribeToRemoteParticipant(remoteParticipant);
        });
        // Subscribe to the call's 'remoteParticipantsUpdated' event to be
        // notified when new participants are added to the call or removed from the call.
        call.on('remoteParticipantsUpdated', e => {
            // Subscribe to new remote participants that are added to the call.
            e.added.forEach(remoteParticipant => {
                subscribeToRemoteParticipant(remoteParticipant)
            });
            // Unsubscribe from participants that are removed from the call
            e.removed.forEach(remoteParticipant => {
                console.log('Remote participant removed from the call.');
            });
        });
    } catch (error) {
        console.error(error);
    }
}

/**
 * Subscribe to a remote participant obj.
 * Listen for property changes and collection updates.
 */
subscribeToRemoteParticipant = (remoteParticipant) => {
    try {
        // Inspect the initial remoteParticipant.state value.
        console.log(`Remote participant state: ${remoteParticipant.state}`);
        // Subscribe to remoteParticipant's 'stateChanged' event for value changes.
        remoteParticipant.on('stateChanged', () => {
            console.log(`Remote participant state changed: ${remoteParticipant.state}`);
        });

        // Inspect the remoteParticipants's current videoStreams and subscribe to them.
        remoteParticipant.videoStreams.forEach(remoteVideoStream => {
            subscribeToRemoteVideoStream(remoteVideoStream)
        });
        // Subscribe to the remoteParticipant's 'videoStreamsUpdated' event to be
        // notified when the remoteParticipant adds new videoStreams and removes video streams.
        remoteParticipant.on('videoStreamsUpdated', e => {
            // Subscribe to new remote participant's video streams that were added.
            e.added.forEach(remoteVideoStream => {
                subscribeToRemoteVideoStream(remoteVideoStream)
            });
            // Unsubscribe from remote participant's video streams that were removed.
            e.removed.forEach(remoteVideoStream => {
                console.log('Remote participant video stream was removed.');
            })
        });
    } catch (error) {
        console.error(error);
    }
}

/**
 * Subscribe to a remote participant's remote video stream obj.
 * You have to subscribe to the 'isAvailableChanged' event to render the remoteVideoStream. If the 'isAvailable' property
 * changes to 'true', a remote participant is sending a stream. Whenever the availability of a remote stream changes,
 * you can choose to destroy the whole 'Renderer' or a specific 'RendererView', or keep them, but keeping them results in displaying a blank video frame.
 */
subscribeToRemoteVideoStream = async (remoteVideoStream) => {
    let renderer = new VideoStreamRenderer(remoteVideoStream);
    let view;
    let remoteVideoContainer = document.createElement('div');
    remoteVideoContainer.className = 'remote-video-container';

    /**
     * isReceiving API is currently a @beta feature.
     * To use this api, please use 'beta' version of Azure Communication Services Calling Web SDK.
     * Create a CSS class to style your loading spinner.
     *
    let loadingSpinner = document.createElement('div');
    loadingSpinner.className = 'loading-spinner';
    remoteVideoStream.on('isReceivingChanged', () => {
        try {
            if (remoteVideoStream.isAvailable) {
                const isReceiving = remoteVideoStream.isReceiving;
                const isLoadingSpinnerActive = remoteVideoContainer.contains(loadingSpinner);
                if (!isReceiving && !isLoadingSpinnerActive) {
                    remoteVideoContainer.appendChild(loadingSpinner);
                } else if (isReceiving && isLoadingSpinnerActive) {
                    remoteVideoContainer.removeChild(loadingSpinner);
                }
            }
        } catch (e) {
            console.error(e);
        }
    });
    */

    const createView = async () => {
        // Create a renderer view for the remote video stream.
        view = await renderer.createView();
        // Attach the renderer view to the UI.
        remoteVideoContainer.appendChild(view.target);
        remoteVideosGallery.appendChild(remoteVideoContainer);
    }

    // Remote participant has switched video on/off
    remoteVideoStream.on('isAvailableChanged', async () => {
        try {
            if (remoteVideoStream.isAvailable) {
                await createView();
            } else {
                view.dispose();
                remoteVideosGallery.removeChild(remoteVideoContainer);
            }
        } catch (e) {
            console.error(e);
        }
    });

    // Remote participant has video on initially.
    if (remoteVideoStream.isAvailable) {
        try {
            await createView();
        } catch (e) {
            console.error(e);
        }
    }
}

/**
 * Start your local video stream.
 * This will send your local video stream to remote participants so they can view it.
 */
startVideoButton.onclick = async () => {
    try {
        const localVideoStream = await createLocalVideoStream();
        await call.startVideo(localVideoStream);
    } catch (error) {
        console.error(error);
    }
}

/**
 * Stop your local video stream.
 * This will stop your local video stream from being sent to remote participants.
 */
stopVideoButton.onclick = async () => {
    try {
        await call.stopVideo(localVideoStream);
    } catch (error) {
        console.error(error);
    }
}

/**
 * To render a LocalVideoStream, you need to create a new instance of VideoStreamRenderer, and then
 * create a new VideoStreamRendererView instance using the asynchronous createView() method.
 * You may then attach view.target to any UI element. 
 */
createLocalVideoStream = async () => {
    const camera = (await deviceManager.getCameras())[0];
    if (camera) {
        return new LocalVideoStream(camera);
    } else {
        console.error(`No camera device found on the system`);
    }
}

/**
 * Display your local video stream preview in your UI
 */
displayLocalVideoStream = async () => {
    try {
        localVideoStreamRenderer = new VideoStreamRenderer(localVideoStream);
        const view = await localVideoStreamRenderer.createView();
        localVideoContainer.hidden = false;
        localVideoContainer.appendChild(view.target);
    } catch (error) {
        console.error(error);
    } 
}

/**
 * Remove your local video stream preview from your UI
 */
removeLocalVideoStream = async() => {
    try {
        localVideoStreamRenderer.dispose();
        localVideoContainer.hidden = true;
    } catch (error) {
        console.error(error);
    } 
}

/**
 * End current call
 */
hangUpCallButton.addEventListener("click", async () => {
    // end the current call
    await call.hangUp();
});

Create a file in the root directory of your project called styles.css to contain the application styles for this quickstart. Add the following code to styles.css:

/**
 * CSS for styling the loading spinner over the remote video stream
 */
.remote-video-container {
    position: relative;
}
.loading-spinner {
    border: 12px solid #f3f3f3;
    border-radius: 50%;
    border-top: 12px solid #ca5010;
    width: 100px;
    height: 100px;
    -webkit-animation: spin 2s linear infinite; /* Safari */
    animation: spin 2s linear infinite;
    position: absolute;
    margin: auto;
    top: 0;
    bottom: 0;
    left: 0;
    right: 0;
    transform: translate(-50%, -50%);
}
@keyframes spin {
    0% { transform: rotate(0deg); }
    100% { transform: rotate(360deg); }
}
/* Safari */
@-webkit-keyframes spin {
    0% { -webkit-transform: rotate(0deg); }
    100% { -webkit-transform: rotate(360deg); }
}

Run the code

Use webpack-dev-server to build and run your app. Run the following command to bundle the application host on a local web server:

npx webpack-dev-server --entry ./client.js --output bundle.js --debug --devtool inline-source-map

Open your browser and navigate to http://localhost:8080/ in two tabs. You should see the following screen:

1:1 video calling page - a

On the first tab, enter a valid user access token. On the other tab, enter a different valid user access token.

If you don't have tokens available to use, see the user access token documentation.

On both tabs, select the Initialize Call Agent button. You should see the following screen:

1:1 video calling page - b

On the first tab, enter the Azure Communication Services user identity of the second tab, and then select the Start Call button. The first tab starts an outgoing call to the second tab, and the second tab's Accept Call button becomes enabled: 1:1 video calling page - c

From the second tab, select the Accept Call button. The call is answered and connected. You should see the following screen: 1:1 video calling page - d

Both tabs are now in a 1:1 video call. Both tabs can hear each other's audio and see each other's video stream.

Get started with Azure Communication Services by using the Communication Services calling client library to add video calling to your app. You'll learn how to include 1:1 video calling and how to create or join group calls, and how to start, answer, and join video calls by using the Azure Communication Services Calling SDK for Android.

If you'd like to start with the sample code, you can download the sample app.

Prerequisites

Create an Android app with an empty activity

In Android Studio, select Start a new Android Studio project.

Screenshot that shows the Start a new Android Studio project button selected in Android Studio.

Under Phone and Tablet, select the Empty Activity project template.

Screenshot that shows the Empty Activity option selected on the project template screen.

For Minimum SDK, select API 26: Android 8.0 (Oreo) or later. See the SDK support versions.

Screenshot that shows the API option selected.

Install the package

Locate your project-level build.gradle and add mavenCentral() to the list of repositories under buildscript and allprojects:

buildscript {
    repositories {
    ...
        mavenCentral()
    ...
    }
}
allprojects {
    repositories {
    ...
        mavenCentral()
    ...
    }
}

Then, in your module-level build.gradle, add the following lines to the dependencies and android sections:

android {
    ...
    packagingOptions {
        pickFirst  'META-INF/*'
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    ...
    implementation 'com.azure.android:azure-communication-calling:2.0.0'
    ...
}

Add permissions to the application manifest

To request the permissions required to make a call, you must first declare the permissions in the application manifest (app/src/main/AndroidManifest.xml). Replace the contents of the file with the following code:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.contoso.acsquickstart">

    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <!--Our Calling SDK depends on the Apache HTTP SDK.
When targeting Android SDK 28+, this library needs to be explicitly referenced.
See https://developer.android.com/about/versions/pie/android-9.0-changes-28#apache-p-->
        <uses-library android:name="org.apache.http.legacy" android:required="false"/>
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>
    

Set up the layout for the app

You need a text input for the callee ID or group call ID, a button for placing the call, and another button for hanging up the call.

You also need two buttons to turn the local video on and off, and two containers to hold the local and remote video streams. You can add these controls through the designer or by editing the layout XML.

Go to app/src/main/res/layout/activity_main.xml and replace the contents of the file with the following code:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">
    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="vertical">

        <EditText
            android:id="@+id/call_id"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:ems="10"
            android:gravity="center"
            android:hint="Callee ID"
            android:inputType="textPersonName"
            app:layout_constraintBottom_toTopOf="@+id/call_button"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintStart_toStartOf="parent"
            app:layout_constraintTop_toTopOf="parent"
            app:layout_constraintVertical_bias="0.064" />
        <LinearLayout
            android:layout_width="match_parent"
            android:layout_height="wrap_content">
            <Button
                android:id="@+id/call_button"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginBottom="16dp"
                android:gravity="center"
                android:text="Call"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintStart_toStartOf="parent" />
            <Button
                android:id="@+id/show_preview"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginBottom="16dp"
                android:gravity="center"
                android:text="Show Video"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintStart_toStartOf="parent" />
            <Button
                android:id="@+id/hide_preview"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginBottom="16dp"
                android:gravity="center"
                android:text="Hide Video"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintStart_toStartOf="parent" />
            <Button
                android:id="@+id/hang_up"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginBottom="16dp"
                android:gravity="center"
                android:text="Hang Up"
                app:layout_constraintBottom_toBottomOf="parent"
                app:layout_constraintEnd_toEndOf="parent"
                app:layout_constraintStart_toStartOf="parent" />
        </LinearLayout>
        <ScrollView
            android:layout_width="match_parent"
            android:layout_height="wrap_content">
            <GridLayout
                android:id="@+id/remotevideocontainer"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:columnCount="2"
                android:rowCount="2"
                android:padding="10dp"></GridLayout>
        </ScrollView>
    </LinearLayout>
    <FrameLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent">
        <LinearLayout
            android:id="@+id/localvideocontainer"
            android:layout_width="180dp"
            android:layout_height="300dp"
            android:layout_gravity="right|bottom"
            android:orientation="vertical"
            android:padding="10dp">
            <Button
                android:id="@+id/switch_source"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:gravity="center"
                android:text="Switch Source"
                android:visibility="invisible" />
        </LinearLayout>

    </FrameLayout>
</androidx.constraintlayout.widget.ConstraintLayout>

Create the main activity scaffolding and bindings

With the layout created, you can add the bindings and the basic scaffolding of the activity. The activity handles requesting runtime permissions, creating the call agent, and placing the call when the button is pressed.

The onCreate method is overridden to invoke getAllPermissions and createAgent, and to add the bindings for the call button. This event occurs only once, when the activity is created. For more information on onCreate, see the guide Understand the activity lifecycle.

Go to the MainActivity.java file and replace the contents with the following code:

package com.example.videocallingquickstart;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;

import android.Manifest;
import android.content.pm.PackageManager;
import android.media.AudioManager;
import android.os.Bundle;
import android.util.DisplayMetrics;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.GridLayout;
import android.widget.Toast;
import android.widget.LinearLayout;
import android.content.Context;
import com.azure.android.communication.calling.CallState;
import com.azure.android.communication.calling.CallingCommunicationException;
import com.azure.android.communication.calling.ParticipantsUpdatedListener;
import com.azure.android.communication.calling.PropertyChangedEvent;
import com.azure.android.communication.calling.PropertyChangedListener;
import com.azure.android.communication.calling.StartCallOptions;
import com.azure.android.communication.calling.VideoDeviceInfo;
import com.azure.android.communication.common.CommunicationIdentifier;
import com.azure.android.communication.common.CommunicationTokenCredential;
import com.azure.android.communication.calling.CallAgent;
import com.azure.android.communication.calling.CallClient;
import com.azure.android.communication.calling.DeviceManager;
import com.azure.android.communication.calling.VideoOptions;
import com.azure.android.communication.calling.LocalVideoStream;
import com.azure.android.communication.calling.VideoStreamRenderer;
import com.azure.android.communication.calling.VideoStreamRendererView;
import com.azure.android.communication.calling.CreateViewOptions;
import com.azure.android.communication.calling.ScalingMode;
import com.azure.android.communication.calling.IncomingCall;
import com.azure.android.communication.calling.Call;
import com.azure.android.communication.calling.AcceptCallOptions;
import com.azure.android.communication.calling.ParticipantsUpdatedEvent;
import com.azure.android.communication.calling.RemoteParticipant;
import com.azure.android.communication.calling.RemoteVideoStream;
import com.azure.android.communication.calling.RemoteVideoStreamsEvent;
import com.azure.android.communication.calling.RendererListener;
import com.azure.android.communication.common.CommunicationUserIdentifier;
import com.azure.android.communication.common.MicrosoftTeamsUserIdentifier;
import com.azure.android.communication.common.PhoneNumberIdentifier;
import com.azure.android.communication.common.UnknownIdentifier;

import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;

public class MainActivity extends AppCompatActivity {

    private CallAgent callAgent;
    private VideoDeviceInfo currentCamera;
    private LocalVideoStream currentVideoStream;
    private DeviceManager deviceManager;
    private IncomingCall incomingCall;
    private Call call;
    VideoStreamRenderer previewRenderer;
    VideoStreamRendererView preview;
    final Map<Integer, StreamData> streamData = new HashMap<>();
    private boolean renderRemoteVideo = true;
    private ParticipantsUpdatedListener remoteParticipantUpdatedListener;
    private PropertyChangedListener onStateChangedListener;

    final HashSet<String> joinedParticipants = new HashSet<>();

    Button switchSourceButton;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        getAllPermissions();
        createAgent();

        handleIncomingCall();

        Button callButton = findViewById(R.id.call_button);
        callButton.setOnClickListener(l -> startCall());
        Button hangupButton = findViewById(R.id.hang_up);
        hangupButton.setOnClickListener(l -> hangUp());
        Button startVideo = findViewById(R.id.show_preview);
        startVideo.setOnClickListener(l -> turnOnLocalVideo());
        Button stopVideo = findViewById(R.id.hide_preview);
        stopVideo.setOnClickListener(l -> turnOffLocalVideo());

        switchSourceButton = findViewById(R.id.switch_source);
        switchSourceButton.setOnClickListener(l -> switchSource());

        setVolumeControlStream(AudioManager.STREAM_VOICE_CALL);
    }
	
	/**
     * Request each required permission if the app doesn't already have it.
     */
    private void getAllPermissions() {
        // See section on requesting permissions
    }

    /**
      * Create the call agent for placing calls
      */
    private void createAgent() {
        // See section on creating the call agent
    }
	
	/**
     * Handle incoming calls
     */
    private void handleIncomingCall() {
        // See section on answering incoming call
    }
	
	/**
     * Place a call to the callee id provided in `callee_id` text input.
     */
    private void startCall() {
        // See section on starting the call
    }

    /**
     * End calls
     */
    private void hangUp() {
        // See section on ending the call
    }
	
	/**
     * Mid-call operations
     */
    public void turnOnLocalVideo() {
        // See section
    }
	
    public void turnOffLocalVideo() {
        // See section
    }
	
	/**
     * Change the active camera for the next available
     */
	public void switchSource() {
		// See section
    }
}

Request permissions at runtime

For Android 6.0 and later (API level 23) and targetSdkVersion 23 or later, permissions are granted at runtime instead of when the app is installed. To support this, getAllPermissions can be implemented to call ActivityCompat.checkSelfPermission and ActivityCompat.requestPermissions for each required permission.

/**
 * Request each required permission if the app doesn't already have it.
 */
private void getAllPermissions() {
    String[] requiredPermissions = new String[]{Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_PHONE_STATE};
    ArrayList<String> permissionsToAskFor = new ArrayList<>();
    for (String permission : requiredPermissions) {
        if (ActivityCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
            permissionsToAskFor.add(permission);
        }
    }
    if (!permissionsToAskFor.isEmpty()) {
        ActivityCompat.requestPermissions(this, permissionsToAskFor.toArray(new String[0]), 1);
    }
}

Note

When you're designing your app, consider when these permissions should be requested. Permissions should be requested as they're needed, not ahead of time. For more information, see the Android permissions guide.
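
As an illustration of requesting a permission only at the moment it's needed, here's a minimal sketch that asks for the camera permission the first time the user turns on video. The helper name and request code constant are hypothetical and aren't part of the quickstart code:

// Hypothetical helper: request CAMERA only when video is first needed.
private static final int CAMERA_PERMISSION_REQUEST_CODE = 2;

private void ensureCameraPermissionThenStartVideo() {
    if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        // Prompt the user now and continue in onRequestPermissionsResult once granted.
        ActivityCompat.requestPermissions(
                this, new String[]{Manifest.permission.CAMERA}, CAMERA_PERMISSION_REQUEST_CODE);
        return;
    }
    turnOnLocalVideo();
}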

Object model

The following classes and interfaces handle some of the major features of the Azure Communication Services Calling SDK (a short example follows the table):

Name Description
CallClient The main entry point to the Calling SDK.
CallAgent Used to start and manage calls.
CommunicationTokenCredential Used as the token credential to instantiate the CallAgent.
CommunicationIdentifier Used to represent the different types of participants that can be part of a call.
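
For example, a callee can be represented as a Communication Services user or as a phone number through concrete CommunicationIdentifier subtypes. The identity values below are placeholders, not real IDs:

// Placeholder values: substitute identities issued for your own resource.
CommunicationIdentifier acsUser = new CommunicationUserIdentifier("<ACS_USER_ID>");
CommunicationIdentifier pstnUser = new PhoneNumberIdentifier("+14255550123");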

Create an agent from a user access token

You need a user token to create an authenticated call agent. Generally, this token is generated from a service with authentication specific to the application. For more information on user access tokens, see User access tokens.

For the quickstart, replace <User_Access_Token> with a user access token generated for your Azure Communication Services resource.

/**
 * Create the call agent for placing calls
 */
private void createAgent() {
    Context context = this.getApplicationContext();
    String userToken = "<USER_ACCESS_TOKEN>";
    try {
        CommunicationTokenCredential credential = new CommunicationTokenCredential(userToken);
        CallClient callClient = new CallClient();
        deviceManager = callClient.getDeviceManager(context).get();
        callAgent = callClient.createCallAgent(getApplicationContext(), credential).get(); 
    } catch (Exception ex) {
        Toast.makeText(context, "Failed to create call agent.", Toast.LENGTH_SHORT).show();
    }
}

Start a video call by using the call agent

You can place calls by using the call agent. All you need to provide is a list of callee IDs and the call options.

To place a call with video, you have to enumerate local cameras by using the deviceManager getCameras API. After you select a desired camera, use it to construct a LocalVideoStream instance. Pass it within videoOptions as an item in the localVideoStream array to the startCall method. When the call connects, it automatically starts sending a video stream from the selected camera to the other participant.

private void startCall() {
    Context context = this.getApplicationContext();
    EditText callIdView = findViewById(R.id.call_id);
    String callId = callIdView.getText().toString();
    ArrayList<CommunicationIdentifier> participants = new ArrayList<CommunicationIdentifier>();
    List<VideoDeviceInfo> cameras = deviceManager.getCameras();


    StartCallOptions options = new StartCallOptions();
    if(!cameras.isEmpty()) {
        currentCamera = getNextAvailableCamera(null);
        currentVideoStream = new LocalVideoStream(currentCamera, context);
        LocalVideoStream[] videoStreams = new LocalVideoStream[1];
        videoStreams[0] = currentVideoStream;
        VideoOptions videoOptions = new VideoOptions(videoStreams);
        options.setVideoOptions(videoOptions);
        showPreview(currentVideoStream);
    }
    participants.add(new CommunicationUserIdentifier(callId));

    call = callAgent.startCall(
            context,
            participants,
            options);

    //Subscribe to events on updates of call state and remote participants
    remoteParticipantUpdatedListener = this::handleRemoteParticipantsUpdate;
    onStateChangedListener = this::handleCallOnStateChanged;
    call.addOnRemoteParticipantsUpdatedListener(remoteParticipantUpdatedListener);
    call.addOnStateChangedListener(onStateChangedListener);
}

In this quickstart, you rely on the getNextAvailableCamera function to pick the camera that the call uses. The function takes the enumeration of cameras as input and iterates through the list to get the next available camera. If the argument is null, the function picks the first device in the list. If no camera is available when you select Start Call, an audio call starts instead. However, if the remote participant answers with video, you can still see their remote video stream.

private VideoDeviceInfo getNextAvailableCamera(VideoDeviceInfo camera) {
    List<VideoDeviceInfo> cameras = deviceManager.getCameras();
    int currentIndex = 0;
    if (camera == null) {
        return cameras.isEmpty() ? null : cameras.get(0);
    }

    for (int i = 0; i < cameras.size(); i++) {
        if (camera.getId().equals(cameras.get(i).getId())) {
            currentIndex = i;
            break;
        }
    }
    int newIndex = (currentIndex + 1) % cameras.size();
    return cameras.get(newIndex);
}

After you construct a LocalVideoStream instance, you can create a renderer to display it in the UI.

private void showPreview(LocalVideoStream stream) {
    previewRenderer = new VideoStreamRenderer(stream, this);
    LinearLayout layout = findViewById(R.id.localvideocontainer);
    preview = previewRenderer.createView(new CreateViewOptions(ScalingMode.FIT));
    preview.setTag(0);
    runOnUiThread(() -> {
        layout.addView(preview);
        switchSourceButton.setVisibility(View.VISIBLE);
    });
}

To allow users to switch the local video source, use switchSource. This method picks the next available camera and sets it as the local stream.

public void switchSource() {
    if (currentVideoStream != null) {
        try {
            currentCamera = getNextAvailableCamera(currentCamera);
            currentVideoStream.switchSource(currentCamera).get();
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        }
    }
}

Answer an incoming call

You can get an incoming call by subscribing to addOnIncomingCallListener on the callAgent.

private void handleIncomingCall() {
    callAgent.addOnIncomingCallListener((incomingCall) -> {
        this.incomingCall = incomingCall;
        Executors.newCachedThreadPool().submit(this::answerIncomingCall);
    });
}

To accept a call with the video camera on, enumerate local cameras by using the deviceManager getCameras API. Pick a camera and construct a LocalVideoStream instance. Pass it into acceptCallOptions before calling the accept method on the call object.

private void answerIncomingCall() {
    Context context = this.getApplicationContext();
    if (incomingCall == null){
        return;
    }
    AcceptCallOptions acceptCallOptions = new AcceptCallOptions();
    List<VideoDeviceInfo> cameras = deviceManager.getCameras();
    if(!cameras.isEmpty()) {
        currentCamera = getNextAvailableCamera(null);
        currentVideoStream = new LocalVideoStream(currentCamera, context);
        LocalVideoStream[] videoStreams = new LocalVideoStream[1];
        videoStreams[0] = currentVideoStream;
        VideoOptions videoOptions = new VideoOptions(videoStreams);
        acceptCallOptions.setVideoOptions(videoOptions);
        showPreview(currentVideoStream);
    }
    try {
        call = incomingCall.accept(context, acceptCallOptions).get();
    } catch (InterruptedException e) {
        e.printStackTrace();
    } catch (ExecutionException e) {
        e.printStackTrace();
    }

    //Subscribe to events on updates of call state and remote participants
    remoteParticipantUpdatedListener = this::handleRemoteParticipantsUpdate;
    onStateChangedListener = this::handleCallOnStateChanged;
    call.addOnRemoteParticipantsUpdatedListener(remoteParticipantUpdatedListener);
    call.addOnStateChangedListener(onStateChangedListener);
}

Remote participants and remote video streams

When you start a call or answer an incoming call, you need to subscribe to the addOnRemoteParticipantsUpdatedListener event to handle remote participants.

remoteParticipantUpdatedListener = this::handleRemoteParticipantsUpdate;
call.addOnRemoteParticipantsUpdatedListener(remoteParticipantUpdatedListener);

When you use event listeners that are defined within the same class, bind the listener to a variable. Pass the variable in as an argument to the add and remove listener methods.

If you try to pass the listener in directly as an argument, you'll lose the reference to that listener. Java creates new instances of these listeners instead of referencing previously created ones, so you can't remove the previous instances because you don't have a reference to them.
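
For example, here's a minimal sketch of the pattern using the state listener from this quickstart. Where the cleanup happens is up to your app, and the removal line assumes the SDK's matching remove method:

// Store the listener in a field so the exact same instance can be removed later.
onStateChangedListener = this::handleCallOnStateChanged;
call.addOnStateChangedListener(onStateChangedListener);

// Later, during cleanup, remove the instance that was added.
call.removeOnStateChangedListener(onStateChangedListener);

// Passing this::handleCallOnStateChanged directly to both calls would create two
// different listener objects, so the remove call wouldn't match the one that was added.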

Remote video stream updates

For 1:1 calls, you must handle added participants. When the remote participant is removed, the call ends. For added participants, you can subscribe to addOnVideoStreamsUpdatedListener to handle video stream updates.

public void handleRemoteParticipantsUpdate(ParticipantsUpdatedEvent args) {
    handleAddedParticipants(args.getAddedParticipants());
}

private void handleAddedParticipants(List<RemoteParticipant> participants) {
    for (RemoteParticipant remoteParticipant : participants) {
        if(!joinedParticipants.contains(getId(remoteParticipant))) {
            joinedParticipants.add(getId(remoteParticipant));

            if (renderRemoteVideo) {
                for (RemoteVideoStream stream : remoteParticipant.getVideoStreams()) {
                    StreamData data = new StreamData(stream, null, null);
                    streamData.put(stream.getId(), data);
                    startRenderingVideo(data);
                }
            }
            remoteParticipant.addOnVideoStreamsUpdatedListener(videoStreamsEventArgs -> videoStreamsUpdated(videoStreamsEventArgs));
        }
    }
}

private void videoStreamsUpdated(RemoteVideoStreamsEvent videoStreamsEventArgs) {
    for(RemoteVideoStream stream : videoStreamsEventArgs.getAddedRemoteVideoStreams()) {
        StreamData data = new StreamData(stream, null, null);
        streamData.put(stream.getId(), data);
        if (renderRemoteVideo) {
            startRenderingVideo(data);
        }
    }

    for(RemoteVideoStream stream : videoStreamsEventArgs.getRemovedRemoteVideoStreams()) {
        stopRenderingVideo(stream);
    }
}


public String getId(final RemoteParticipant remoteParticipant) {
    final CommunicationIdentifier identifier = remoteParticipant.getIdentifier();
    if (identifier instanceof PhoneNumberIdentifier) {
        return ((PhoneNumberIdentifier) identifier).getPhoneNumber();
    } else if (identifier instanceof MicrosoftTeamsUserIdentifier) {
        return ((MicrosoftTeamsUserIdentifier) identifier).getUserId();
    } else if (identifier instanceof CommunicationUserIdentifier) {
        return ((CommunicationUserIdentifier) identifier).getId();
    } else {
        return ((UnknownIdentifier) identifier).getId();
    }
}

Render remote videos

Create a renderer for a remote video stream and attach it to the view to start rendering the remote view. Dispose of the view to stop rendering it.

void startRenderingVideo(StreamData data){
    if (data.renderer != null) {
        return;
    }
    GridLayout layout = ((GridLayout)findViewById(R.id.remotevideocontainer));
    data.renderer = new VideoStreamRenderer(data.stream, this);
    data.renderer.addRendererListener(new RendererListener() {
        @Override
        public void onFirstFrameRendered() {
            String text = data.renderer.getSize().toString();
            Log.i("MainActivity", "Video rendering at: " + text);
        }

        @Override
        public void onRendererFailedToStart() {
            String text = "Video failed to render";
            Log.i("MainActivity", text);
        }
    });
    data.rendererView = data.renderer.createView(new CreateViewOptions(ScalingMode.FIT));
    data.rendererView.setTag(data.stream.getId());
    runOnUiThread(() -> {
        GridLayout.LayoutParams params = new GridLayout.LayoutParams(layout.getLayoutParams());
        DisplayMetrics displayMetrics = new DisplayMetrics();
        getWindowManager().getDefaultDisplay().getMetrics(displayMetrics);
        params.height = (int)(displayMetrics.heightPixels / 2.5);
        params.width = displayMetrics.widthPixels / 2;
        layout.addView(data.rendererView, params);
    });
}

void stopRenderingVideo(RemoteVideoStream stream) {
    StreamData data = streamData.get(stream.getId());
    if (data == null || data.renderer == null) {
        return;
    }
    runOnUiThread(() -> {
        GridLayout layout = findViewById(R.id.remotevideocontainer);
        for(int i = 0; i < layout.getChildCount(); ++ i) {
            View childView =  layout.getChildAt(i);
            if ((int)childView.getTag() == data.stream.getId()) {
                layout.removeViewAt(i);
            }
        }
    });
    data.rendererView = null;
    // Dispose renderer
    data.renderer.dispose();
    data.renderer = null;
}

static class StreamData {
    RemoteVideoStream stream;
    VideoStreamRenderer renderer;
    VideoStreamRendererView rendererView;
    StreamData(RemoteVideoStream stream, VideoStreamRenderer renderer, VideoStreamRendererView rendererView) {
        this.stream = stream;
        this.renderer = renderer;
        this.rendererView = rendererView;
    }
}

Call state update

The state of a call can change from connected to disconnected. When the call is connected, you handle the remote participants, and when the call is disconnected, you dispose of the previewRenderer to stop the local video.

private void handleCallOnStateChanged(PropertyChangedEvent args) {
    if (call.getState() == CallState.CONNECTED) {
        runOnUiThread(() -> Toast.makeText(this, "Call is CONNECTED", Toast.LENGTH_SHORT).show());
        handleCallState();
    }
    if (call.getState() == CallState.DISCONNECTED) {
        runOnUiThread(() -> Toast.makeText(this, "Call is DISCONNECTED", Toast.LENGTH_SHORT).show());
        if (previewRenderer != null) {
            previewRenderer.dispose();
        }
        switchSourceButton.setVisibility(View.INVISIBLE);
    }
}

End a call

Call the hangUp() function on the call instance to end a call. Dispose of the previewRenderer to stop the local video.

private void hangUp() {
    try {
        call.hangUp().get();
        switchSourceButton.setVisibility(View.INVISIBLE);
    } catch (ExecutionException | InterruptedException e) {
        e.printStackTrace();
    }
    if (previewRenderer != null) {
        previewRenderer.dispose();
    }
}

Hide and show local video

While a call is in progress, you can stop rendering and streaming the local video by using turnOffLocalVideo(). This method removes the view that wraps the local rendering and disposes of the current stream. To resume streaming and render the local preview again, use turnOnLocalVideo(). This method shows the video preview and starts streaming.

public void turnOnLocalVideo() {
    List<VideoDeviceInfo> cameras = deviceManager.getCameras();
    if(!cameras.isEmpty()) {
        try {
            currentVideoStream = new LocalVideoStream(currentCamera, this);
            showPreview(currentVideoStream);
            call.startVideo(this, currentVideoStream).get();
            switchSourceButton.setVisibility(View.VISIBLE);
        } catch (CallingCommunicationException acsException) {
            acsException.printStackTrace();
        } catch (ExecutionException | InterruptedException e) {
            e.printStackTrace();
        }
    }
}

public void turnOffLocalVideo() {
    try {
        LinearLayout container = findViewById(R.id.localvideocontainer);
        for (int i = 0; i < container.getChildCount(); ++i) {
            Object tag = container.getChildAt(i).getTag();
            if (tag != null && (int)tag == 0) {
                container.removeViewAt(i);
            }
        }
        switchSourceButton.setVisibility(View.INVISIBLE);
        previewRenderer.dispose();
        previewRenderer = null;
        call.stopVideo(this, currentVideoStream).get();
    } catch (CallingCommunicationException acsException) {
        acsException.printStackTrace();
    } catch (ExecutionException | InterruptedException e) {
        e.printStackTrace();
    }
}

Run the code

You can now launch the app by using the Run app button on the Android Studio toolbar.

Completed 1:1 calling application
Screenshot that shows the completed application. Screenshot that shows the application on a call.

Add group call functionality

Now you can update the app to let the user choose between 1:1 calls and group calls.

Update the layout

Use radio buttons to select whether the SDK creates a 1:1 call or joins a group call. The radio buttons go at the top, so the first section of app/src/main/res/layout/activity_main.xml ends up looking like this:

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">
    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="vertical">

        <RadioGroup
            android:layout_width="match_parent"
            android:layout_height="wrap_content">

            <RadioButton
                android:id="@+id/one_to_one_call"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:text="One to one call" />

            <RadioButton
                android:id="@+id/group_call"
                android:layout_width="match_parent"
                android:layout_height="wrap_content"
                android:text="Group call" />

        </RadioGroup>

        <EditText
            android:id="@+id/call_id"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:ems="10"
            android:gravity="center"
            android:hint="Callee ID"
            android:inputType="textPersonName"
            app:layout_constraintBottom_toTopOf="@+id/call_button"
            app:layout_constraintEnd_toEndOf="parent"
            app:layout_constraintStart_toStartOf="parent"
            app:layout_constraintTop_toTopOf="parent"
            app:layout_constraintVertical_bias="0.064" />
.
.
.
</androidx.constraintlayout.widget.ConstraintLayout>

Update MainActivity.java

Now you can update the elements and the logic to decide when to create a 1:1 call and when to join a group call. The first part of the code needs updates to add the dependencies, elements, and other configuration.

Dependencies:

import android.widget.RadioButton;
import com.azure.android.communication.calling.GroupCallLocator;
import com.azure.android.communication.calling.JoinCallOptions;
import java.util.UUID;

Global elements:

RadioButton oneToOneCall, groupCall;

Update onCreate()

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    getAllPermissions();
    createAgent();

    handleIncomingCall();

    Button callButton = findViewById(R.id.call_button);
    callButton.setOnClickListener(l -> startCall());
    Button hangupButton = findViewById(R.id.hang_up);
    hangupButton.setOnClickListener(l -> hangUp());
    Button startVideo = findViewById(R.id.show_preview);
    startVideo.setOnClickListener(l -> turnOnLocalVideo());
    Button stopVideo = findViewById(R.id.hide_preview);
    stopVideo.setOnClickListener(l -> turnOffLocalVideo());

    switchSourceButton = findViewById(R.id.switch_source);
    switchSourceButton.setOnClickListener(l -> switchSource());

    setVolumeControlStream(AudioManager.STREAM_VOICE_CALL);

    oneToOneCall = findViewById(R.id.one_to_one_call);
    oneToOneCall.setOnClickListener(this::onCallTypeSelected);
    oneToOneCall.setChecked(true);
    groupCall = findViewById(R.id.group_call);
    groupCall.setOnClickListener(this::onCallTypeSelected);

}

Update startCall()

private void startCall() {
    Context context = this.getApplicationContext();
    EditText callIdView = findViewById(R.id.call_id);
    String callId = callIdView.getText().toString();
    ArrayList<CommunicationIdentifier> participants = new ArrayList<CommunicationIdentifier>();
    List<VideoDeviceInfo> cameras = deviceManager.getCameras();

    if (oneToOneCall.isChecked()) {
        StartCallOptions options = new StartCallOptions();
        if (!cameras.isEmpty()) {
            currentCamera = getNextAvailableCamera(null);
            currentVideoStream = new LocalVideoStream(currentCamera, context);
            LocalVideoStream[] videoStreams = new LocalVideoStream[1];
            videoStreams[0] = currentVideoStream;
            VideoOptions videoOptions = new VideoOptions(videoStreams);
            options.setVideoOptions(videoOptions);
            showPreview(currentVideoStream);
        }
        participants.add(new CommunicationUserIdentifier(callId));

        call = callAgent.startCall(
                context,
                participants,
                options);
    } else {
        JoinCallOptions options = new JoinCallOptions();
        if (!cameras.isEmpty()) {
            currentCamera = getNextAvailableCamera(null);
            currentVideoStream = new LocalVideoStream(currentCamera, context);
            LocalVideoStream[] videoStreams = new LocalVideoStream[1];
            videoStreams[0] = currentVideoStream;
            VideoOptions videoOptions = new VideoOptions(videoStreams);
            options.setVideoOptions(videoOptions);
            showPreview(currentVideoStream);
        }
        GroupCallLocator groupCallLocator = new GroupCallLocator(UUID.fromString(callId));

        call = callAgent.join(
                context,
                groupCallLocator,
                options);
    }

    //Subscribe to events on updates of call state and remote participants
    remoteParticipantUpdatedListener = this::handleRemoteParticipantsUpdate;
    onStateChangedListener = this::handleCallOnStateChanged;
    call.addOnRemoteParticipantsUpdatedListener(remoteParticipantUpdatedListener);
    call.addOnStateChangedListener(onStateChangedListener);
}

Add onCallTypeSelected()

public void onCallTypeSelected(View view) {
    boolean checked = ((RadioButton) view).isChecked();
    EditText callIdView = findViewById(R.id.call_id);

    switch(view.getId()) {
        case R.id.one_to_one_call:
            if (checked){
                callIdView.setHint("Callee id");
            }
            break;
        case R.id.group_call:
            if (checked){
                callIdView.setHint("Group Call GUID");
            }
            break;
    }
}

Run the upgraded app

At this point, you can launch the app by using the Run app button on the Android Studio toolbar.

Updated screens for group calling
Screenshot that shows the updated application. Screenshot that shows the application on a group call.

Get started with Azure Communication Services by using the Communication Services Calling SDK to add one-on-one video calling to your app. You'll learn how to start and answer a video call by using the Azure Communication Services Calling SDK for iOS.

Sample code

If you'd like to skip ahead to the end, you can download this quickstart as a sample on GitHub.

Prerequisites

  • An Azure account with an active subscription. Create an account for free.

  • A Mac running Xcode, along with a valid developer certificate installed in your Keychain.

  • A deployed Communication Services resource. Create a Communication Services resource. You need to record your connection string for this quickstart.

  • A user access token for your Azure Communication Services resource. You can also use the Azure CLI and run the following command with your connection string to create a user and an access token.

    az communication identity token issue --scope voip --connection-string "yourConnectionString"
    

    For more information, see Use the Azure CLI to Create and Manage Access Tokens.

Setting up

Create the Xcode project

In Xcode, create a new iOS project and select the Single View App template. This tutorial uses the SwiftUI framework, so you should set Language to Swift and User Interface to SwiftUI. You're not going to create tests during this quickstart, so feel free to clear the Include Tests check box.

Screenshot that shows the New Project window within Xcode.

Install CocoaPods

Use this guide to install CocoaPods on your Mac.

Install the package and dependencies by using CocoaPods

  1. To create a Podfile for your application, open the terminal, navigate to the project folder, and run pod init.

  2. Add the following code to the Podfile and save it. See SDK support versions.

platform :ios, '13.0'
use_frameworks!

target 'VideoCallingQuickstart' do
  pod 'AzureCommunicationCalling', '~> 1.0.0'
end
  3. Run pod install.

  4. Open the .xcworkspace file with Xcode.

Request access to the microphone and camera

To access the device's microphone and camera, you need to update your app's information property list with NSMicrophoneUsageDescription and NSCameraUsageDescription. Set the associated values to strings that will be included in the dialog the system uses to request access from the user.

Right-click the Info.plist entry of the project tree, and then select Open As > Source Code. Add the following lines to the top-level <dict> section, and then save the file.

<key>NSMicrophoneUsageDescription</key>
<string>Need microphone access for VOIP calling.</string>
<key>NSCameraUsageDescription</key>
<string>Need camera access for video calling</string>

Set up the app framework

Open your project's ContentView.swift file and add an import declaration to the top of the file to import the AzureCommunicationCalling library and AVFoundation. AVFoundation is used to capture audio permission from code.

import AzureCommunicationCalling
import AVFoundation

Object model

The following classes and interfaces handle some of the major features of the Azure Communication Services Calling SDK for iOS.

Name Description
CallClient The main entry point to the Calling SDK.
CallAgent Used to start and manage calls.
CommunicationTokenCredential Used as the token credential to instantiate the CallAgent.
CommunicationIdentifier Used to represent the identity of the user, which can be one of the following options: CommunicationUserIdentifier, PhoneNumberIdentifier, or CallingApplication.

Create the call agent

Replace the implementation of the ContentView struct with some simple UI controls that let a user initiate and end a call. We'll attach business logic to these controls in this quickstart.

struct ContentView: View {
    @State var callee: String = ""
    @State var callClient: CallClient?
    @State var callAgent: CallAgent?
    @State var call: Call?
    @State var deviceManager: DeviceManager?
    @State var localVideoStream:[LocalVideoStream]?
    @State var incomingCall: IncomingCall?
    @State var sendingVideo:Bool = false
    @State var errorMessage:String = "Unknown"

    @State var remoteVideoStreamData:[Int32:RemoteVideoStreamData] = [:]
    @State var previewRenderer:VideoStreamRenderer? = nil
    @State var previewView:RendererView? = nil
    @State var remoteRenderer:VideoStreamRenderer? = nil
    @State var remoteViews:[RendererView] = []
    @State var remoteParticipant: RemoteParticipant?
    @State var remoteVideoSize:String = "Unknown"
    @State var isIncomingCall:Bool = false
    
    @State var callObserver:CallObserver?
    @State var remoteParticipantObserver:RemoteParticipantObserver?

    var body: some View {
        NavigationView {
            ZStack{
                Form {
                    Section {
                        TextField("Who would you like to call?", text: $callee)
                        Button(action: startCall) {
                            Text("Start Call")
                        }.disabled(callAgent == nil)
                        Button(action: endCall) {
                            Text("End Call")
                        }.disabled(call == nil)
                        Button(action: toggleLocalVideo) {
                            HStack {
                                Text(sendingVideo ? "Turn Off Video" : "Turn On Video")
                            }
                        }
                    }
                }
                // Show incoming call banner
                if (isIncomingCall) {
                    HStack() {
                        VStack {
                            Text("Incoming call")
                                .padding(10)
                                .frame(maxWidth: .infinity, alignment: .topLeading)
                        }
                        Button(action: answerIncomingCall) {
                            HStack {
                                Text("Answer")
                            }
                            .frame(width:80)
                            .padding(.vertical, 10)
                            .background(Color(.green))
                        }
                        Button(action: declineIncomingCall) {
                            HStack {
                                Text("Decline")
                            }
                            .frame(width:80)
                            .padding(.vertical, 10)
                            .background(Color(.red))
                        }
                    }
                    .frame(maxWidth: .infinity, alignment: .topLeading)
                    .padding(10)
                    .background(Color.gray)
                }
                ZStack{
                    VStack{
                        ForEach(remoteViews, id:\.self) { renderer in
                            ZStack{
                                VStack{
                                    RemoteVideoView(view: renderer)
                                        .frame(maxWidth: .infinity, maxHeight: .infinity)
                                        .background(Color(.lightGray))
                                }
                            }
                            Button(action: endCall) {
                                Text("End Call")
                            }.disabled(call == nil)
                            Button(action: toggleLocalVideo) {
                                HStack {
                                    Text(sendingVideo ? "Turn Off Video" : "Turn On Video")
                                }
                            }
                        }
                        
                    }.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
                    VStack{
                        if(sendingVideo)
                        {
                            VStack{
                                PreviewVideoStream(view: previewView!)
                                    .frame(width: 135, height: 240)
                                    .background(Color(.lightGray))
                            }
                        }
                    }.frame(maxWidth:.infinity, maxHeight:.infinity,alignment: .bottomTrailing)
                }
            }
            .navigationBarTitle("Video Calling Quickstart")
        }.onAppear{
            // Authenticate the client
            
            // Initialize the CallAgent and access Device Manager
            
            // Ask for permissions
        }
    }
}

//Functions and Observers

struct PreviewVideoStream: UIViewRepresentable {
    let view:RendererView
    func makeUIView(context: Context) -> UIView {
        return view
    }
    func updateUIView(_ uiView: UIView, context: Context) {}
}

struct RemoteVideoView: UIViewRepresentable {
    let view:RendererView
    func makeUIView(context: Context) -> UIView {
        return view
    }
    func updateUIView(_ uiView: UIView, context: Context) {}
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

Authenticate the client

To initialize a CallAgent instance, you need a user access token that lets it make and receive calls. If you don't have a token available, see the user access tokens documentation.

Once you have a token, add the following code to the onAppear callback in ContentView.swift. You need to replace <USER ACCESS TOKEN> with a valid user access token for your resource.

var userCredential: CommunicationTokenCredential?
do {
    userCredential = try CommunicationTokenCredential(token: "<USER ACCESS TOKEN>")
} catch {
    print("ERROR: It was not possible to create user credential.")
    return
}

Initialize the CallAgent and access the Device Manager

To create a CallAgent instance from a CallClient, use the callClient.createCallAgent method, which asynchronously returns a CallAgent object after it's initialized. DeviceManager lets you enumerate local devices that can be used in a call to transmit audio and video streams. It also lets you request permission from the user to access the microphone and camera.

self.callClient = CallClient()
self.callClient?.createCallAgent(userCredential: userCredential!) { (agent, error) in
    if error != nil {
        print("ERROR: It was not possible to create a call agent.")
        return
    } else {
        self.callAgent = agent
        print("Call agent successfully created.")
        self.callAgent!.delegate = incomingCallHandler
        self.callClient?.getDeviceManager { (deviceManager, error) in
            if (error == nil) {
                print("Got device manager instance")
                self.deviceManager = deviceManager
            } else {
                print("Failed to get device manager instance")
            }
        }
    }
}

Request permissions

We need to add the following code to the onAppear callback to request permissions for audio and video.

AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
    if granted {
        AVCaptureDevice.requestAccess(for: .video) { (videoGranted) in
            /* NO OPERATION */
        }
    }
}

Display local video

Before starting a call, you can manage settings related to video. In this quickstart, we introduce an implementation for toggling local video before or during a call.

First we need to access local cameras with deviceManager. Once the desired camera is selected, we can construct a LocalVideoStream from it, create a VideoStreamRenderer, and then attach the renderer to previewView. During a call, we can use startVideo or stopVideo to start or stop sending the LocalVideoStream to remote participants. This function also works when handling incoming calls.

func toggleLocalVideo() {
    // toggling video before call starts
    if (call == nil)
    {
        if(!sendingVideo)
        {
            self.callClient = CallClient()
            self.callClient?.getDeviceManager { (deviceManager, error) in
                if (error == nil) {
                    print("Got device manager instance")
                    self.deviceManager = deviceManager
                } else {
                    print("Failed to get device manager instance")
                }
            }
            guard let deviceManager = deviceManager else {
                return
            }
            let camera = deviceManager.cameras.first
            let scalingMode = ScalingMode.fit
            if (self.localVideoStream == nil) {
                self.localVideoStream = [LocalVideoStream]()
            }
            localVideoStream!.append(LocalVideoStream(camera: camera!))
            previewRenderer = try! VideoStreamRenderer(localVideoStream: localVideoStream!.first!)
            previewView = try! previewRenderer!.createView(withOptions: CreateViewOptions(scalingMode:scalingMode))
            self.sendingVideo = true
        }
        else{
            self.sendingVideo = false
            self.previewView = nil
            self.previewRenderer!.dispose()
            self.previewRenderer = nil
        }
    }
    // toggle local video during the call
    else{
        if (sendingVideo) {
            call!.stopVideo(stream: localVideoStream!.first!) { (error) in
                if (error != nil) {
                    print("cannot stop video")
                }
                else {
                    self.sendingVideo = false
                    self.previewView = nil
                    self.previewRenderer!.dispose()
                    self.previewRenderer = nil
                }
            }
        }
        else {
            guard let deviceManager = deviceManager else {
                return
            }
            let camera = deviceManager.cameras.first
            let scalingMode = ScalingMode.fit
            if (self.localVideoStream == nil) {
                self.localVideoStream = [LocalVideoStream]()
            }
            localVideoStream!.append(LocalVideoStream(camera: camera!))
            previewRenderer = try! VideoStreamRenderer(localVideoStream: localVideoStream!.first!)
            previewView = try! previewRenderer!.createView(withOptions: CreateViewOptions(scalingMode:scalingMode))
            call!.startVideo(stream:(localVideoStream?.first)!) { (error) in
                if (error != nil) {
                    print("cannot start video")
                }
                else {
                    self.sendingVideo = true
                }
            }
        }
    }
}

Place an outgoing call

The startCall method is set as the action performed when the Start Call button is tapped. In this quickstart, outgoing calls are audio-only by default. To start a call with video, we need to set VideoOptions with a LocalVideoStream and pass it through startCallOptions to set the initial options for the call.

func startCall() {
    let startCallOptions = StartCallOptions()
    if (sendingVideo) {
        if (self.localVideoStream == nil) {
            self.localVideoStream = [LocalVideoStream]()
        }
        let videoOptions = VideoOptions(localVideoStreams: localVideoStream!)
        startCallOptions.videoOptions = videoOptions
    }
    let callees: [CommunicationIdentifier] = [CommunicationUserIdentifier(self.callee)]
    self.callAgent?.startCall(participants: callees, options: startCallOptions) { (call, error) in
        setCallAndObserver(call: call, error: error)
    }
}

CallObserver and RemoteParticipantObserver are used to manage mid-call events and remote participants. We set up the observers in the setCallAndObserver function.

func setCallAndObserver(call: Call!, error: Error?) {
    if (error == nil) {
        self.call = call
        self.callObserver = CallObserver(self)
        self.call!.delegate = self.callObserver
        self.remoteParticipantObserver = RemoteParticipantObserver(self)
    } else {
        print("Failed to get call object")
    }
}

Answer an incoming call

To answer an incoming call, implement an IncomingCallHandler that displays an incoming call banner so the call can be answered or declined. Put the following implementation in IncomingCallHandler.swift.

final class IncomingCallHandler: NSObject, CallAgentDelegate, IncomingCallDelegate {
    public var contentView: ContentView?
    private var incomingCall: IncomingCall?

    private static var instance: IncomingCallHandler?
    static func getOrCreateInstance() -> IncomingCallHandler {
        if let c = instance {
            return c
        }
        instance = IncomingCallHandler()
        return instance!
    }

    private override init() {}
    
    public func callAgent(_ callAgent: CallAgent, didRecieveIncomingCall incomingCall: IncomingCall) {
        self.incomingCall = incomingCall
        self.incomingCall?.delegate = self
        contentView?.showIncomingCallBanner(self.incomingCall!)
    }
    
    public func callAgent(_ callAgent: CallAgent, didUpdateCalls args: CallsUpdatedEventArgs) {
        if let removedCall = args.removedCalls.first {
            contentView?.callRemoved(removedCall)
            self.incomingCall = nil
        }
    }
}

We need to add the following code to the onAppear callback in ContentView.swift to create an instance of IncomingCallHandler:

let incomingCallHandler = IncomingCallHandler.getOrCreateInstance()
incomingCallHandler.contentView = self

After the CallAgent is successfully created, set it as the delegate of the CallAgent:

self.callAgent!.delegate = incomingCallHandler
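
At this point, every piece that belongs in the onAppear callback has been introduced: the credential, the incoming call handler, the call agent and device manager, and the permission requests. As a rough sketch only, assembling the snippets above in one possible order (with <USER ACCESS TOKEN> still a placeholder), the callback might look like this:

.onAppear {
    // Authenticate the client
    var userCredential: CommunicationTokenCredential?
    do {
        userCredential = try CommunicationTokenCredential(token: "<USER ACCESS TOKEN>")
    } catch {
        print("ERROR: It was not possible to create user credential.")
        return
    }

    // Create the incoming call handler first so the call agent delegate can reference it
    let incomingCallHandler = IncomingCallHandler.getOrCreateInstance()
    incomingCallHandler.contentView = self

    // Initialize the CallAgent and access the Device Manager
    self.callClient = CallClient()
    self.callClient?.createCallAgent(userCredential: userCredential!) { (agent, error) in
        if error != nil {
            print("ERROR: It was not possible to create a call agent.")
            return
        }
        self.callAgent = agent
        self.callAgent!.delegate = incomingCallHandler
        self.callClient?.getDeviceManager { (deviceManager, error) in
            if (error == nil) {
                self.deviceManager = deviceManager
            } else {
                print("Failed to get device manager instance")
            }
        }
    }

    // Ask for permissions
    AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
        if granted {
            AVCaptureDevice.requestAccess(for: .video) { (videoGranted) in
                /* NO OPERATION */
            }
        }
    }
}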

Once there's an incoming call, the IncomingCallHandler calls the showIncomingCallBanner function to display the answer and decline buttons.

func showIncomingCallBanner(_ incomingCall: IncomingCall?) {
    isIncomingCall = true
    self.incomingCall = incomingCall
}

The actions attached to answer and decline are implemented with the following code. To answer the call with video, we need to turn on the local video and set the options of AcceptCallOptions with localVideoStream.

func answerIncomingCall() {
    isIncomingCall = false
    let options = AcceptCallOptions()
    if (self.incomingCall != nil) {
        guard let deviceManager = deviceManager else {
            return
        }        
        if (self.localVideoStream == nil) {
            self.localVideoStream = [LocalVideoStream]()
        }
        if(sendingVideo)
        {
            let camera = deviceManager.cameras.first
            localVideoStream!.append(LocalVideoStream(camera: camera!))
            let videoOptions = VideoOptions(localVideoStreams: localVideoStream!)
            options.videoOptions = videoOptions
        }
        self.incomingCall!.accept(options: options) { (call, error) in
            setCallAndObserver(call: call, error: error)
        }
    }
}

func declineIncomingCall(){
    self.incomingCall!.reject { (error) in }
    isIncomingCall = false
}

Remote participant video streams

We can create a RemoteVideoStreamData class to handle rendering the video streams of remote participants.

public class RemoteVideoStreamData : NSObject, RendererDelegate {
    public func videoStreamRenderer(didFailToStart renderer: VideoStreamRenderer) {
        owner.errorMessage = "Renderer failed to start"
    }
    
    private var owner:ContentView
    let stream:RemoteVideoStream
    var renderer:VideoStreamRenderer? {
        didSet {
            if renderer != nil {
                renderer!.delegate = self
            }
        }
    }
    
    var views:[RendererView] = []
    init(view:ContentView, stream:RemoteVideoStream) {
        owner = view
        self.stream = stream
    }
    
    public func videoStreamRenderer(didRenderFirstFrame renderer: VideoStreamRenderer) {
        let size:StreamSize = renderer.size
        owner.remoteVideoSize = String(size.width) + " X " + String(size.height)
    }
}

Subscribe to events

We can implement a CallObserver class to subscribe to a collection of events, so we're notified when values change during the call.

public class CallObserver: NSObject, CallDelegate, IncomingCallDelegate {
    private var owner: ContentView
    init(_ view:ContentView) {
            owner = view
    }
        
    public func call(_ call: Call, didChangeState args: PropertyChangedEventArgs) {
        if(call.state == CallState.connected) {
            initialCallParticipant()
        }
    }

    // render remote video streams when remote participant changes
    public func call(_ call: Call, didUpdateRemoteParticipant args: ParticipantsUpdatedEventArgs) {
        for participant in args.addedParticipants {
            participant.delegate = owner.remoteParticipantObserver
            for stream in participant.videoStreams {
                if !owner.remoteVideoStreamData.isEmpty {
                    return
                }
                let data:RemoteVideoStreamData = RemoteVideoStreamData(view: owner, stream: stream)
                let scalingMode = ScalingMode.fit
                data.renderer = try! VideoStreamRenderer(remoteVideoStream: stream)
                let view:RendererView = try! data.renderer!.createView(withOptions: CreateViewOptions(scalingMode:scalingMode))
                data.views.append(view)
                self.owner.remoteViews.append(view)
                owner.remoteVideoStreamData[stream.id] = data
            }
            owner.remoteParticipant = participant
        }
    }

    // Handle remote video streams when the call is connected
    public func initialCallParticipant() {
        for participant in owner.call!.remoteParticipants {
            participant.delegate = owner.remoteParticipantObserver
            for stream in participant.videoStreams {
                renderRemoteStream(stream)
            }
            owner.remoteParticipant = participant
        }
    }
    
    //create render for RemoteVideoStream and attach it to view
    public func renderRemoteStream(_ stream: RemoteVideoStream!) {
        if !owner.remoteVideoStreamData.isEmpty {
            return
        }
        let data:RemoteVideoStreamData = RemoteVideoStreamData(view: owner, stream: stream)
        let scalingMode = ScalingMode.fit
        data.renderer = try! VideoStreamRenderer(remoteVideoStream: stream)
        let view:RendererView = try! data.renderer!.createView(withOptions: CreateViewOptions(scalingMode:scalingMode))
        self.owner.remoteViews.append(view)
        owner.remoteVideoStreamData[stream.id] = data
    }
}

Remote participant management

All remote participants are represented by the RemoteParticipant type and are available through the remoteParticipants collection on a call instance.

We can implement a RemoteParticipantObserver class to subscribe to updates on the remote video streams of remote participants.

public class RemoteParticipantObserver : NSObject, RemoteParticipantDelegate {
    private var owner:ContentView
    init(_ view:ContentView) {
        owner = view
    }

    public func renderRemoteStream(_ stream: RemoteVideoStream!) {
        let data:RemoteVideoStreamData = RemoteVideoStreamData(view: owner, stream: stream)
        let scalingMode = ScalingMode.fit
        data.renderer = try! VideoStreamRenderer(remoteVideoStream: stream)
        let view:RendererView = try! data.renderer!.createView(withOptions: CreateViewOptions(scalingMode:scalingMode))
        self.owner.remoteViews.append(view)
        owner.remoteVideoStreamData[stream.id] = data
    }
    
    // render RemoteVideoStream when remote participant turns on the video, dispose the renderer when remote video is off
    public func remoteParticipant(_ remoteParticipant: RemoteParticipant, didUpdateVideoStreams args: RemoteVideoStreamsEventArgs) {
        for stream in args.addedRemoteVideoStreams {
            renderRemoteStream(stream)
        }
        for stream in args.removedRemoteVideoStreams {
            for data in owner.remoteVideoStreamData.values {
                data.renderer?.dispose()
            }
            owner.remoteViews.removeAll()
        }
    }
}

Run the code

You can build and run your app on the iOS simulator by selecting Product > Run or by using the (⌘-R) keyboard shortcut.

Important

The functionality described in this article is currently in public preview. This preview version is provided without a service-level agreement, and we don't recommend it for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see Supplemental Terms of Use for Microsoft Azure Previews.

In this quickstart, you'll learn how to start a 1:1 video call by using the Azure Communication Services Calling SDK for Windows.

UWP sample code

Prerequisites

To complete this tutorial, you'll need the following prerequisites:

Setting up

Create the project

In Visual Studio, create a new project with the Blank App (Universal Windows) template to set up a single-page Universal Windows Platform (UWP) app.

Screenshot showing the New UWP Project window in Visual Studio.

Install the package

Right-click your project and go to Manage Nuget Packages to install Azure.Communication.Calling version 1.0.0-beta.33 or later. Make sure Include Prerelease is checked.
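
If you prefer the NuGet Package Manager Console over the UI, the equivalent might look like the following; the version shown is only an example of a prerelease that satisfies the requirement above:

Install-Package Azure.Communication.Calling -Version 1.0.0-beta.33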

Request access

Go to Package.appxmanifest and select Capabilities. Check Internet (Client & Server) for inbound and outbound access to the internet. Check Microphone to access the audio feed of the microphone. Check WebCam to access the camera of the device.

Add the following code to your Package.appxmanifest by right-clicking it and choosing View Code.

<Extensions>
    <Extension Category="windows.activatableClass.inProcessServer">
        <InProcessServer>
            <Path>RtmMvrUap.dll</Path>
            <ActivatableClass ActivatableClassId="VideoN.VideoSchemeHandler" ThreadingModel="both" />
        </InProcessServer>
    </Extension>
</Extensions>

Set up the app framework

We need to configure a basic layout to attach our logic. To place an outbound call, we need a TextBox for the callee's user ID. We also need a Start Call button and a Hang Up button. We also need to preview the local video and render the remote video of the other participant, so we need two elements to display the video streams.

Open your project's MainPage.xaml and replace its contents with the following implementation.

<Page
    x:Class="CallingQuickstart.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="using:CallingQuickstart"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d"
    Background="{ThemeResource ApplicationPageBackgroundThemeBrush}" Width="800" Height="600">
    <Grid>
        <Grid.RowDefinitions>
            <RowDefinition Height="60*"/>
            <RowDefinition Height="200*"/>
            <RowDefinition Height="60*"/>
        </Grid.RowDefinitions>
        <TextBox x:Name="CalleeTextBox" Text="Who would you like to call?" TextWrapping="Wrap" VerticalAlignment="Center" Grid.Row="0" Height="40" Margin="10,10,10,10" />
        <Grid Grid.Row="1">
            <Grid.RowDefinitions>
                <RowDefinition/>
            </Grid.RowDefinitions>
            <Grid.ColumnDefinitions>
                <ColumnDefinition Width="*"/>
                <ColumnDefinition Width="*"/>
            </Grid.ColumnDefinitions>
            <MediaElement x:Name="LocalVideo" HorizontalAlignment="Center" Stretch="UniformToFill" Grid.Column="0" VerticalAlignment="Center"/>
            <MediaElement x:Name="RemoteVideo" HorizontalAlignment="Center" Stretch="UniformToFill" Grid.Column="1" VerticalAlignment="Center"/>
        </Grid>
        <StackPanel Grid.Row="2" Orientation="Horizontal">
            <Button x:Name="CallButton" Content="Start Call" Click="CallButton_Click" VerticalAlignment="Center" Margin="10,0,0,0" Height="40" Width="200"/>
            <Button x:Name="HangupButton" Content="Hang Up" Click="HangupButton_Click" VerticalAlignment="Center" Margin="10,0,0,0" Height="40" Width="200"/>
            <TextBlock x:Name="State" Text="Status" TextWrapping="Wrap" VerticalAlignment="Center" Margin="40,0,0,0" Height="40" Width="200"/>
        </StackPanel>
    </Grid>
</Page>

Open App.xaml.cs (right-click and choose View Code) and add this line to the top:

using CallingQuickstart;

Open MainPage.xaml.cs (right-click and choose View Code) and replace its contents with the following implementation:

using Azure.Communication.Calling;
using Azure.WinRT.Communication;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Windows.Foundation;
using Windows.UI.ViewManagement;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;

namespace CallingQuickstart
{
    public sealed partial class MainPage : Page
    {
        CallAgent callAgent;
        Call call;
        DeviceManager deviceManager;
        Dictionary<string, RemoteParticipant> remoteParticipantDictionary = new Dictionary<string, RemoteParticipant>();

        public MainPage()
        {
            this.InitializeComponent();
            Task.Run(() => this.InitCallAgentAndDeviceManagerAsync()).Wait();
        }

        private async Task InitCallAgentAndDeviceManagerAsync()
        {
            // Initialize call agent and Device Manager
        }

        private async void Agent_OnIncomingCallAsync(object sender, IncomingCall incomingCall)
        {
            // Accept an incoming call
        }

        private async void CallButton_Click(object sender, RoutedEventArgs e)
        {
            // Start a call with video
        }

        private async void HangupButton_Click(object sender, RoutedEventArgs e)
        {
            // End the current call
        }

        private async void Call_OnStateChangedAsync(object sender, PropertyChangedEventArgs args)
        {
            var state = (sender as Call)?.State;
            await this.Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () => {
                State.Text = state.ToString();
            });
        }
    }
}

Object model

The following classes and interfaces handle some of the major features of the Azure Communication Services Calling SDK:

Name Description
CallClient The CallClient is the main entry point to the Calling client library.
CallAgent The CallAgent is used to start and join calls.
Call The Call is used to manage placed or joined calls.
CommunicationTokenCredential The CommunicationTokenCredential is used as the token credential to instantiate the CallAgent.
CommunicationUserIdentifier The CommunicationUserIdentifier is used to represent the identity of the user, which can be one of the following options: CommunicationUserIdentifier, PhoneNumberIdentifier, or CallingApplication.

Authenticate the client

To initialize a CallAgent, you need a user access token. Generally this token is generated from a service with authentication specific to the application. For more information on user access tokens, see the user access tokens guide.

For the quickstart, replace <AUTHENTICATION_TOKEN> with a user access token generated for your Azure Communication Services resource.

Once you have a token, initialize a CallAgent instance with it, which lets us make and receive calls. To access the cameras on the device, we also need to get a Device Manager instance.

Add the following code to the InitCallAgentAndDeviceManagerAsync function.

var callClient = new CallClient();
this.deviceManager = await callClient.GetDeviceManager();

var tokenCredential = new CommunicationTokenCredential("<AUTHENTICATION_TOKEN>");
var callAgentOptions = new CallAgentOptions()
{
    DisplayName = "<DISPLAY_NAME>"
};

this.callAgent = await callClient.CreateCallAgent(tokenCredential, callAgentOptions);
this.callAgent.OnCallsUpdated += Agent_OnCallsUpdatedAsync;
this.callAgent.OnIncomingCall += Agent_OnIncomingCallAsync;

Start a call with video

Add the implementation to CallButton_Click to start a call with video. We need to enumerate the cameras with the device manager instance and construct a LocalVideoStream. We need to set the VideoOptions with the LocalVideoStream and pass it through startCallOptions to set the initial options for the call. By attaching the LocalVideoStream to a MediaElement, we can see the preview of the local video.

var startCallOptions = new StartCallOptions();

if ((LocalVideo.Source == null) && (this.deviceManager.Cameras?.Count > 0))
{
    var videoDeviceInfo = this.deviceManager.Cameras?.FirstOrDefault();
    if (videoDeviceInfo != null)
    {
        var localVideoStream = new LocalVideoStream(videoDeviceInfo);

        var localUri = await localVideoStream.MediaUriAsync();

        await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
        {
            LocalVideo.Source = localUri;
            LocalVideo.Play();
        });

        startCallOptions.VideoOptions = new VideoOptions(new[] { localVideoStream });
    }
}

var callees = new ICommunicationIdentifier[1] { new CommunicationUserIdentifier(CalleeTextBox.Text.Trim()) };

this.call = await this.callAgent.StartCallAsync(callees, startCallOptions);
this.call.OnRemoteParticipantsUpdated += Call_OnRemoteParticipantsUpdatedAsync;
this.call.OnStateChanged += Call_OnStateChangedAsync;

Accept an incoming call

Add the implementation to Agent_OnIncomingCallAsync to answer an incoming call with video, and pass the LocalVideoStream to acceptCallOptions.

var acceptCallOptions = new AcceptCallOptions();

if (this.deviceManager.Cameras?.Count > 0)
{
    var videoDeviceInfo = this.deviceManager.Cameras?.FirstOrDefault();
    if (videoDeviceInfo != null)
    {
        var localVideoStream = new LocalVideoStream(videoDeviceInfo);

        var localUri = await localVideoStream.MediaUriAsync();

        await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
        {
            LocalVideo.Source = localUri;
            LocalVideo.Play();
        });

        acceptCallOptions.VideoOptions = new VideoOptions(new[] { localVideoStream });
    }
}

call = await incomingCall.AcceptAsync(acceptCallOptions);

Remote participants and remote video streams

All remote participants are available through the RemoteParticipants collection on a call instance. Once the call is connected, we can access the remote participants of the call and handle the remote video streams.


private async void Call_OnVideoStreamsUpdatedAsync(object sender, RemoteVideoStreamsEventArgs args)
{
    foreach (var remoteVideoStream in args.AddedRemoteVideoStreams)
    {
        await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
        {
            RemoteVideo.Source = await remoteVideoStream.Start();
        });
    }

    foreach (var remoteVideoStream in args.RemovedRemoteVideoStreams)
    {
        remoteVideoStream.Stop();
    }
}

private async void Agent_OnCallsUpdatedAsync(object sender, CallsUpdatedEventArgs args)
{
    foreach (var call in args.AddedCalls)
    {
        foreach (var remoteParticipant in call.RemoteParticipants)
        {
            var remoteParticipantMRI = remoteParticipant.Identifier.ToString();
            this.remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
            await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
            remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
        }
    }
}

private async void Call_OnRemoteParticipantsUpdatedAsync(object sender, ParticipantsUpdatedEventArgs args)
{
    foreach (var remoteParticipant in args.AddedParticipants)
    {
        String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
        this.remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
        await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
        remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
    }

    foreach (var remoteParticipant in args.RemovedParticipants)
    {
        String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
        this.remoteParticipantDictionary.Remove(remoteParticipantMRI);
    }
}

Render remote videos

For each remote video stream, attach it to the MediaElement.

private async Task AddVideoStreamsAsync(IReadOnlyList<RemoteVideoStream> remoteVideoStreams)
{
    foreach (var remoteVideoStream in remoteVideoStreams)
    {
        var remoteUri = await remoteVideoStream.Start();

        await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
        {
            RemoteVideo.Source = remoteUri;
            RemoteVideo.Play();
        });
    }
}

Call state update

We need to clean up the video renderers when the call is disconnected, and handle the case when the remote participants initially join the call.

private async void Call_OnStateChangedAsync(object sender, PropertyChangedEventArgs args)
{
    switch (((Call)sender).State)
    {
        case CallState.Disconnected:
            await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
            {
                LocalVideo.Source = null;
                RemoteVideo.Source = null;
            });
            break;

        case CallState.Connected:
            foreach (var remoteParticipant in call.RemoteParticipants)
            {
                String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
                remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
                await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
                remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
            }
            break;

        default:
            break;
    }
}

End a call

End the current call when the Hang Up button is clicked. Add the implementation to HangupButton_Click to end the call, and stop handling the participant-updated and call-state events.

this.call.OnRemoteParticipantsUpdated -= Call_OnRemoteParticipantsUpdatedAsync;
this.call.OnStateChanged -= Call_OnStateChangedAsync;
await this.call.HangUpAsync(new HangUpOptions());

Run the code

You can build and run the code in Visual Studio. For solution platforms, we support ARM64, x64, and x86.

You can make an outbound video call by providing a user ID in the text field and clicking the Start Call button.

Note: Calling 8:echo123 stops the video stream because the echo bot doesn't support video streaming.

For more information about user IDs (identities), see the user access tokens guide.

WinUI 3 sample code

Prerequisites

To complete this tutorial, you'll need the following prerequisites:

Setting up

Create the project

In Visual Studio, create a new project with the Blank App, Packaged (WinUI 3 in Desktop) template to set up a single-page WinUI 3 app.

Screenshot showing the New WinUI Project window in Visual Studio.

Install the package

Right-click your project and go to Manage Nuget Packages to install Azure.Communication.Calling version 1.0.0-beta.33 or later. Make sure Include Prerelease is checked.

Request access

Screenshot showing requesting access to the internet and microphone in Visual Studio.

Add the following code to your app.manifest:

<file name="RtmMvrMf.dll">
    <activatableClass name="VideoN.VideoSchemeHandler" threadingModel="both" xmlns="urn:schemas-microsoft-com:winrt.v1" />
</file>

Set up the app framework

We need to configure a basic layout to attach our logic. To place an outbound call, we need a TextBox for the callee's user ID. We also need a Start Call button and a Hang Up button. We also need to preview the local video and render the remote video of the other participant, so we need two elements to display the video streams.

Open your project's MainWindow.xaml and replace its contents with the following implementation.

<Window
    x:Class="CallingQuickstart.MainWindow"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="using:CallingQuickstart"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d">
    <Grid>
        <Grid.RowDefinitions>
            <RowDefinition Height="60*"/>
            <RowDefinition Height="200*"/>
            <RowDefinition Height="60*"/>
        </Grid.RowDefinitions>
        <TextBox x:Name="CalleeTextBox" Text="Who would you like to call?" TextWrapping="Wrap" VerticalAlignment="Center" Height="40" Margin="10,10,10,10" />
        <Grid Grid.Row="1">
            <Grid.RowDefinitions>
                <RowDefinition/>
            </Grid.RowDefinitions>
            <Grid.ColumnDefinitions>
                <ColumnDefinition Width="*"/>
                <ColumnDefinition Width="*"/>
            </Grid.ColumnDefinitions>
            <MediaPlayerElement x:Name="LocalVideo" HorizontalAlignment="Center" Stretch="UniformToFill" Grid.Column="0" VerticalAlignment="Center"/>
            <MediaPlayerElement x:Name="RemoteVideo" HorizontalAlignment="Center" Stretch="UniformToFill" Grid.Column="1" VerticalAlignment="Center"/>
        </Grid>
        <StackPanel Grid.Row="2" Orientation="Horizontal">
            <Button x:Name="CallButton" Content="Start Call" Click="CallButton_Click" VerticalAlignment="Center" Margin="10,0,0,0" Height="40" Width="200"/>
            <Button x:Name="HangupButton" Content="Hang Up" Click="HangupButton_Click" VerticalAlignment="Center" Margin="10,0,0,0" Height="40" Width="200"/>
            <TextBlock x:Name="State" Text="Status" TextWrapping="Wrap" VerticalAlignment="Center" Margin="40,0,0,0" Height="40" Width="200"/>
        </StackPanel>
    </Grid>
</Window>

Open App.xaml.cs (right-click and choose View Code) and add this line to the top:

using CallingQuickstart;

Open MainWindow.xaml.cs (right-click and choose View Code) and replace its contents with the following implementation:

using Azure.Communication.Calling;
using Azure.WinRT.Communication;
using Microsoft.UI.Xaml;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Windows.Media.Core;

namespace CallingQuickstart
{
    public sealed partial class MainWindow : Window
    {
        CallAgent callAgent;
        Call call;
        DeviceManager deviceManager;
        Dictionary<string, RemoteParticipant> remoteParticipantDictionary = new Dictionary<string, RemoteParticipant>();

        public MainWindow()
        {
            this.InitializeComponent();
            Task.Run(() => this.InitCallAgentAndDeviceManagerAsync()).Wait();
        }

        private async Task InitCallAgentAndDeviceManagerAsync()
        {
            // Initialize call agent and Device Manager
        }

        private async void Agent_OnIncomingCallAsync(object sender, IncomingCall incomingCall)
        {
            // Accept an incoming call
        }

        private async void CallButton_Click(object sender, RoutedEventArgs e)
        {
            // Start a call with video
        }

        private async void HangupButton_Click(object sender, RoutedEventArgs e)
        {
            // End the current call
        }

        private async void Call_OnStateChangedAsync(object sender, PropertyChangedEventArgs args)
        {
            var state = (sender as Call)?.State;
            this.DispatcherQueue.TryEnqueue(() => {
                State.Text = state.ToString();
            });
        }
    }
}

Object model

The following classes and interfaces handle some of the major features of the Azure Communication Services Calling SDK:

Name Description
CallClient The CallClient is the main entry point to the Calling client library.
CallAgent The CallAgent is used to start and join calls.
Call The Call is used to manage placed or joined calls.
CommunicationTokenCredential The CommunicationTokenCredential is used as the token credential to instantiate the CallAgent.
CommunicationUserIdentifier The CommunicationUserIdentifier is used to represent the identity of the user, which can be one of the following options: CommunicationUserIdentifier, PhoneNumberIdentifier, or CallingApplication.

Authenticate the client

To initialize a CallAgent, you need a user access token. Generally this token is generated from a service with authentication specific to the application. For more information on user access tokens, see the user access tokens guide.

For the quickstart, replace <AUTHENTICATION_TOKEN> with a user access token generated for your Azure Communication Services resource.

Once you have a token, initialize a CallAgent instance with it, which lets us make and receive calls. To access the cameras on the device, we also need to get a Device Manager instance.

Add the following code to the InitCallAgentAndDeviceManagerAsync function.

var callClient = new CallClient();
this.deviceManager = await callClient.GetDeviceManager();

var tokenCredential = new CommunicationTokenCredential("<AUTHENTICATION_TOKEN>");
var callAgentOptions = new CallAgentOptions()
{
    DisplayName = "<DISPLAY_NAME>"
};

this.callAgent = await callClient.CreateCallAgent(tokenCredential, callAgentOptions);
this.callAgent.OnCallsUpdated += Agent_OnCallsUpdatedAsync;
this.callAgent.OnIncomingCall += Agent_OnIncomingCallAsync;

Start a call with video

Add the implementation to CallButton_Click to start a call with video. We need to enumerate the cameras with the device manager instance and construct a LocalVideoStream. We need to set the VideoOptions with the LocalVideoStream and pass it through startCallOptions to set the initial options for the call. By attaching the LocalVideoStream to a MediaPlayerElement, we can see the preview of the local video.

var startCallOptions = new StartCallOptions();

if (this.deviceManager.Cameras?.Count > 0)
{
    var videoDeviceInfo = this.deviceManager.Cameras?.FirstOrDefault();
    if (videoDeviceInfo != null)
    {
        var localVideoStream = new LocalVideoStream(videoDeviceInfo);

        var localUri = await localVideoStream.MediaUriAsync();

        this.DispatcherQueue.TryEnqueue(() => {
            LocalVideo.Source = MediaSource.CreateFromUri(localUri);
            LocalVideo.MediaPlayer.Play();
        });

        startCallOptions.VideoOptions = new VideoOptions(new[] { localVideoStream });
    }
}

var callees = new ICommunicationIdentifier[1]
{
    new CommunicationUserIdentifier(CalleeTextBox.Text.Trim())
};

this.call = await this.callAgent.StartCallAsync(callees, startCallOptions);
this.call.OnRemoteParticipantsUpdated += Call_OnRemoteParticipantsUpdatedAsync;
this.call.OnStateChanged += Call_OnStateChangedAsync;

Accept an incoming call

Add the implementation to Agent_OnIncomingCallAsync to answer an incoming call with video, and pass the LocalVideoStream to acceptCallOptions.

var acceptCallOptions = new AcceptCallOptions();

if (this.deviceManager.Cameras?.Count > 0)
{
    var videoDeviceInfo = this.deviceManager.Cameras?.FirstOrDefault();
    if (videoDeviceInfo != null)
    {
        var localVideoStream = new LocalVideoStream(videoDeviceInfo);

        var localUri = await localVideoStream.MediaUriAsync();

        this.DispatcherQueue.TryEnqueue(() => {
            LocalVideo.Source = MediaSource.CreateFromUri(localUri);
            LocalVideo.MediaPlayer.Play();
        });

        acceptCallOptions.VideoOptions = new VideoOptions(new[] { localVideoStream });
    }
}

call = await incomingCall.AcceptAsync(acceptCallOptions);

Remote participants and remote video streams

All remote participants are available through the RemoteParticipants collection on a call instance. Once the call is connected, we can access the remote participants of the call and handle the remote video streams.

private async void Call_OnVideoStreamsUpdatedAsync(object sender, RemoteVideoStreamsEventArgs args)
{
    foreach (var remoteVideoStream in args.AddedRemoteVideoStreams)
    {
        this.DispatcherQueue.TryEnqueue(async () => {
            RemoteVideo.Source = MediaSource.CreateFromUri(await remoteVideoStream.Start());
            RemoteVideo.MediaPlayer.Play();
        });
    }

    foreach (var remoteVideoStream in args.RemovedRemoteVideoStreams)
    {
        remoteVideoStream.Stop();
    }
}

private async void Agent_OnCallsUpdatedAsync(object sender, CallsUpdatedEventArgs args)
{
    foreach (var call in args.AddedCalls)
    {
        foreach (var remoteParticipant in call.RemoteParticipants)
        {
            var remoteParticipantMRI = remoteParticipant.Identifier.ToString();
            this.remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
            await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
            remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
        }
    }
}

private async void Call_OnRemoteParticipantsUpdatedAsync(object sender, ParticipantsUpdatedEventArgs args)
{
    foreach (var remoteParticipant in args.AddedParticipants)
    {
        String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
        this.remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
        await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
        remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
    }

    foreach (var remoteParticipant in args.RemovedParticipants)
    {
        String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
        this.remoteParticipantDictionary.Remove(remoteParticipantMRI);
    }
}

Render remote videos

For each remote video stream, attach it to the MediaPlayerElement.

private async Task AddVideoStreamsAsync(IReadOnlyList<RemoteVideoStream> remoteVideoStreams)
{
    foreach (var remoteVideoStream in remoteVideoStreams)
    {
        var remoteUri = await remoteVideoStream.Start();

        this.DispatcherQueue.TryEnqueue(() => {
            RemoteVideo.Source = MediaSource.CreateFromUri(remoteUri);
            RemoteVideo.MediaPlayer.Play();
        });
    }
}

Call state update

We need to clean up the video renderers when the call is disconnected, and handle the case when the remote participants initially join the call.

private async void Call_OnStateChangedAsync(object sender, PropertyChangedEventArgs args)
{
    switch (((Call)sender).State)
    {
        case CallState.Disconnected:
            this.DispatcherQueue.TryEnqueue(() =>
            {
                LocalVideo.Source = null;
                RemoteVideo.Source = null;
            });
            break;

        case CallState.Connected:
            foreach (var remoteParticipant in call.RemoteParticipants)
            {
                String remoteParticipantMRI = remoteParticipant.Identifier.ToString();
                remoteParticipantDictionary.TryAdd(remoteParticipantMRI, remoteParticipant);
                await AddVideoStreamsAsync(remoteParticipant.VideoStreams);
                remoteParticipant.OnVideoStreamsUpdated += Call_OnVideoStreamsUpdatedAsync;
            }
            break;

        default:
            break;
    }
}

End a call

End the current call when the Hang Up button is clicked. Add the implementation to HangupButton_Click to end the call, and stop handling the participant-updated and call-state events.

this.call.OnRemoteParticipantsUpdated -= Call_OnRemoteParticipantsUpdatedAsync;
this.call.OnStateChanged -= Call_OnStateChangedAsync;
await this.call.HangUpAsync(new HangUpOptions());

Run the code

You can build and run the code in Visual Studio. For solution platforms, we support ARM64, x86, and x64.

You can make an outbound video call by providing a user ID in the text field and clicking the Start Call button.

Note: Calling 8:echo123 stops the video stream because the echo bot doesn't support video streaming.

For more information about user IDs (identities), see the user access tokens guide.

Clean up resources

If you want to clean up and remove a Communication Services subscription, you can delete the resource or resource group. Deleting the resource group also deletes any other resources associated with it. Learn more about cleaning up resources.
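
For example, if you manage your resources with the Azure CLI, deleting the resource group (and everything in it) might look like the following sketch, where <resourceGroupName> is a placeholder for your own group name:

az group delete --name <resourceGroupName>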

Next steps

For more information, see the following articles: