3 - Uploading Video into Microsoft Azure Media Services
In order to manage, encode, and stream your videos, you must first upload your content into Microsoft Azure Media Services. Once uploaded, your content is stored in the cloud for further processing and streaming.
When deciding which media content to upload and store as an asset, there are restrictions that apply. Each asset should contain only a single, unique instance of media content, such as a single TV episode or ad. An asset should not contain multiple edits of a file, because this complicates submitting encoding jobs, and streaming and securing the delivery of the asset later in the workflow. For example, storing both the trailer and the feature-length movie within a single asset would be incorrect usage: you may want the trailer to have wide viewership, but restrict viewing of the movie.
This chapter describes how the Contoso developers incorporated Media Services' uploading functionality into their web service and Windows Store client application. It summarizes the decisions that they made in order to support their business requirements, and how they designed the code that performs the upload process.
For more information about the Contoso web service see "Appendix A – The Contoso Web Service."
Chapter 2, "The Azure Media Services Video-on-Demand Scenario," describes the primary business functions of the video application.
Uploading content
Media Services is an OData-based REST service that exposes objects as entities that can be queried in the same way as other OData entities. Media Services is built on OData v3, which means that you can submit HTTP request bodies in atom+pub or verbose JSON, and receive your responses in the same formats. For more information about ingesting assets using the REST API see "Ingesting Assets with the Media Services REST API" and "Ingesting Assets in Bulk with the REST API."
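As an illustration of the REST approach, the following sketch creates an empty asset by posting a JSON body to the Assets entity set. This is a minimal sketch rather than the Contoso code: it assumes an ACS token has already been obtained from the authorization server, and the x-ms-version value and the account-specific endpoint returned by the initial redirect may differ for your account.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class RestIngestSketch
{
    // Creates an empty asset through the REST API. The acsToken parameter is
    // assumed to have been obtained from the ACS authorization server.
    public static async Task<string> CreateAssetAsync(string acsToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", acsToken);
            client.DefaultRequestHeaders.Add("x-ms-version", "2.11");
            client.DefaultRequestHeaders.Add("DataServiceVersion", "3.0");
            client.DefaultRequestHeaders.Add("MaxDataServiceVersion", "3.0");

            var body = new StringContent("{ \"Name\": \"NewRestAsset\" }", Encoding.UTF8);
            body.Headers.ContentType =
                MediaTypeHeaderValue.Parse("application/json;odata=verbose");

            // In practice the first request to media.windows.net returns a redirect
            // to an account-specific endpoint, which should be used for subsequent calls.
            var response = await client.PostAsync(
                "https://media.windows.net/API/Assets", body);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }
}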
Uploading content with the Media Services SDK for .NET
The Media Services SDK for .NET is a wrapper around the REST APIs. The SDK provides a simple way to accomplish the tasks that are exposed by the REST API.
Uploading content with the Azure Management Portal
Video, audio, and images can be uploaded to a Media Services account through the Azure Management Portal. However, this approach limits uploads to the formats that are supported by the Azure Media Encoder. For more information see "Supported Codecs and File Types for Microsoft Azure Media Services."
There are several limitations to consider when uploading content through the Management Portal:
- You can't upload multiple files in a single upload.
- You can't upload a file larger than 200MB. However, there's no file size limit if you're uploading from an existing storage account.
- You can't upload all the file formats that are supported by Media Services. You can only upload files with the following extensions: .asf, .avi, .m2ts, .m2v, .mp4, .mpeg, .mpg, .mts, .ts, .wmv, .3gp, .3g2, .3gp2, .mod, .dv, .vob, .ismv, .m4a.
If you need to upload content into Media Services at high speed you can take advantage of high speed ingest technology offered by third-party providers. For more information see "Uploading Large Sets of Files with High Speed."
Managing assets across multiple storage accounts within Azure Media Services
Media Services accounts can be associated with one or more storage accounts, with each storage account being limited to 200TB. Attaching multiple storage accounts to a Media Services account provides the following benefits:
- Load balancing assets across multiple storage accounts.
- Scaling Media Services for large amounts of storage and processing.
- Isolating file storage from streaming or DRM protected file storage.
For more information see "Managing Media Services Assets across Multiple Storage Accounts."
Ingesting content with the Media Services SDK for .NET
To get content into Media Services you must first create an asset and add files to it, and then upload the asset. This process is known as ingesting content.
The content object in Media Services is an IAsset, which is a collection of metadata about a set of media files. Each IAsset contains one or more IAssetFile objects. There are two main approaches to ingesting assets into Media Services:
- Create an Asset, upload your content to Media Services, and then generate AssetFiles and associate them with the Asset.
- Bulk ingest a set of files by preparing a manifest that describes the asset and its associated files. Then use the upload method of your choice to upload the associated files to the manifest's blob container. Once a file is uploaded to the blob container Media Services completes the asset creation based on the configuration of the asset in the manifest.
The first approach is the preferred approach when working with a small set of media files, and is the approach adopted by the Contoso development team. For more information about ingesting assets in bulk see "Ingesting Assets in Bulk with the Media Services SDK for .NET."
Markus says: To create an asset you must first have a reference to the Media Services server context.
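The following minimal sketch, which is not the Contoso code, shows the first approach end to end using the Media Services SDK for .NET. It assumes that the Media Services account name and key are available; the asset name and file path are illustrative.
using System;
using Microsoft.WindowsAzure.MediaServices.Client;

public static class SimpleIngestSketch
{
    public static IAsset UploadSingleFile(string accountName, string accountKey)
    {
        // Create the server context from the Media Services account credentials.
        var credentials = new MediaServicesCredentials(accountName, accountKey);
        var context = new CloudMediaContext(credentials);

        // Create an empty asset, add an AssetFile to it, and upload the media file.
        IAsset asset = context.Assets.Create(
            "SampleAsset_" + Guid.NewGuid(), AssetCreationOptions.None);
        IAssetFile assetFile = asset.AssetFiles.Create("sample.mp4");
        assetFile.Upload(@"C:\media\sample.mp4");

        return asset;
    }
}
The Upload method handles the transfer to blob storage itself. The Contoso web service, described later in this chapter, instead creates the access policy and SAS locator explicitly so that the client application can upload directly to blob storage.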
Supported input formats for Azure Media Services
Various video, audio, and image file types can be uploaded to a Media Services account; there is no restriction on the types or formats of files that you can upload using the Media Services SDK. However, the Azure Management Portal restricts uploads to the formats that are supported by the Azure Media Encoder.
Content encoded with the following video codecs may be imported into Media Services for processing by Azure Media Encoder:
- H.264 (Baseline, Main, and High Profiles)
- MPEG-1
- MPEG-2 (Simple and Main Profile)
- MPEG-4 v2 (Simple Visual Profile and Advanced Simple Profile)
- VC-1 (Simple, Main, and Advanced Profiles)
- Windows Media Video (Simple, Main, and Advanced Profiles)
- DV (DVC, DVHD, DVSD, DVSL)
The following video file formats are supported for import:
| File format | File extension |
|---|---|
| 3GPP, 3GPP2 | .3gp, .3g2, .3gp2 |
| Advanced Systems Format (ASF) | .asf |
| Advanced Video Coding High Definition (AVCHD) | .mts, .m2ts |
| Audio-Video Interleaved (AVI) | .avi |
| Digital camcorder MPEG-2 (MOD) | .mod |
| Digital video (DV) camera file | .dv |
| DVD transport stream (TS) file | .ts |
| DVD video object (VOB) file | .vob |
| Expression Encoder Screen Capture Codec file | .xesc |
| MP4 | .mp4 |
| MPEG-1 System Stream | .mpeg, .mpg |
| MPEG-2 video file | .m2v |
| Smooth Streaming File Format (PIFF 1.3) | .ismv |
| Windows Media Video (WMV) | .wmv |
Content encoded with the following audio codecs may be imported into Media Services for processing by Azure Media Encoder:
- AC-3 (Dolby Digital audio)
- AAC (AAC-LC, HE-AAC v1 with AAC-LC core, and HE-AAC v2 with AAC-LC core)
- MP3
- Windows Media Audio (Standard, Professional, and Lossless)
The following audio file formats are supported for import:
| File format | File extension |
|---|---|
| AC-3 (Dolby Digital) audio | .ac3 |
| Audio Interchange File Format (AIFF) | .aiff |
| Broadcast Wave Format | .bwf |
| MP3 (MPEG-1 Audio Layer 3) | .mp3 |
| MP4 audio | .m4a |
| MPEG-4 audio book | .m4b |
| WAVE file | .wav |
| Windows Media Audio | .wma |
The following image file formats are supported for import:
| File format | File extensions |
|---|---|
| Bitmap | .bmp |
| GIF, Animated GIF | .gif |
| JPEG | .jpeg, .jpg |
| PNG | .png |
| TIFF | .tif |
| WPF Canvas XAML | .xaml |
For more information on the codecs and file container formats that are supported by Azure Media Encoder see "Supported input formats" and "Introduction to encoding."
Securing media for upload into Azure Media Services
Media Services allows you to secure your media from the time it leaves your computer. All media files in Media Services are associated with an Asset object. When you create an Asset for your media by calling the Create (or CreateAsync) method of the Assets collection, you must specify an encryption option by using one of the AssetCreationOptions enumeration values. Each file added to the Asset will then use the asset creation options that were specified when the asset was created. The AssetCreationOptions enumeration specifies four values:
- AssetCreationOptions.None
- AssetCreationOptions.StorageEncrypted
- AssetCreationOptions.CommonEncryptionProtected
- AssetCreationOptions.EnvelopeEncryptionProtected
Media can be uploaded without any protection by specifying AssetCreationOptions.None. This is not recommended as the content will not be protected during the upload, or in storage. However, media could be uploaded over an SSL connection to protect the transmission process, prior to it being stored unprotected in Azure Storage.
If you have unencrypted media that you wish to encrypt prior to upload you should specify AssetCreationOptions.StorageEncrypted when creating the asset. This encrypts media locally prior to uploading it to Azure storage where it will be stored encrypted.
Assets protected with storage encryption will be automatically decrypted and placed in an encrypted file system prior to encoding. In addition, any storage-encrypted content must be decrypted before being streamed.
If you have pre-encoded Smooth Streaming content that is already protected with PlayReady Digital Rights Management (DRM) you should specify AssetCreationOptions.CommonEncryptionProtected when creating the asset. This enumeration value specifies that an asset's files are protected using a common encryption method, so your content is already protected in transit and in storage.
If you have pre-encoded HLS content with AES encryption you should specify AssetCreationOptions.EnvelopeEncryptionProtected when creating the asset. This enumeration value specifies that an asset's files are protected using an envelope encryption method, such as AES-CBC, so your content is already protected in transit and in storage.
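As a brief sketch of specifying an encryption option, the following example creates a storage-encrypted asset with the Media Services SDK for .NET. This is illustrative rather than Contoso code: the context parameter is assumed to be an existing CloudMediaContext, and the asset name and file path are hypothetical.
using System;
using Microsoft.WindowsAzure.MediaServices.Client;

public static class SecureIngestSketch
{
    public static IAsset UploadStorageEncrypted(CloudMediaContext context)
    {
        // StorageEncrypted causes the SDK to encrypt the file locally before it is
        // uploaded, and the content remains encrypted at rest in Azure Storage.
        IAsset asset = context.Assets.Create(
            "ProtectedAsset_" + Guid.NewGuid(), AssetCreationOptions.StorageEncrypted);

        IAssetFile assetFile = asset.AssetFiles.Create("movie.mp4");
        assetFile.Upload(@"C:\media\movie.mp4");

        return asset;
    }
}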
Bharath says: Media Services only provides on-disk storage encryption, not over-the-wire encryption like a Digital Rights Management (DRM) solution.
The following figure summarizes how media can be protected during the upload process.
The options for protecting media when at rest and in transit
Markus says: The Contoso video application does not secure the web service with Secure Sockets Layer (SSL), so a malicious client could impersonate the application and send malicious data. In your own application you should protect any sensitive data that you need to transfer between the application and a web service by using SSL.
For more information about securing your media at rest and in transit, see "Securing Your Media."
Connecting to Azure Media Services
Before you can start programming against Media Services you need to create a Media Services account in a new or existing Azure subscription. For more information see "How to Create a Media Services Account."
At the end of the Media Services account setup process you will have obtained the following connection values:
- Media Services account name.
- Media Services account key.
These values are used to make programmatic connections to Media Services. You must then set up a Visual Studio project for development with the Media Services SDK for .NET. For more information see "Setup for Development on the Media Services SDK for .NET."
Media Services controls access to its services through an OAuth protocol that requires an Access Control Service (ACS) token that is received from an authorization server.
To start programming against Media Services you must create a CloudMediaContext instance that represents the server context. The CloudMediaContext includes references to important collections including jobs, assets, files, access policies, and locators. One of the CloudMediaContext constructor overloads takes a MediaServicesCredentials object as a parameter, and this enables the reuse of ACS tokens between multiple contexts. The following code example shows how the MediaServicesCredentials object is created.
Note
You can choose not to deal with ACS tokens, and leave the Media Services SDK to manage them for you. However, this can lead to unnecessary token requests, which can create performance issues on both the client and the server.
private static readonly Lazy<MediaServicesCredentials> Credentials =
    new Lazy<MediaServicesCredentials>(() =>
    {
        var credentials = new MediaServicesCredentials(
            CloudConfiguration.GetConfigurationSetting("ContosoAccountName"),
            CloudConfiguration.GetConfigurationSetting("ContosoAccountKey"));
        credentials.RefreshToken();
        return credentials;
    });
The Credentials object is cached in memory as a static class variable that uses lazy initialization to defer the creation of the object until it is first used. This object contains an ACS token that can be reused if it hasn't expired. If it has expired, it will automatically be refreshed by the Media Services SDK using the credentials given to the MediaServicesCredentials constructor. The cached object can then be passed to the CloudMediaContext constructor in the constructor of the EncodingService class.
public EncodingService(IVideoRepository videoRepository,
    IJobRepository jobRepository)
{
    ...
    this.context = new CloudMediaContext(EncodingService.Credentials.Value);
}
When the CloudMediaContext instance is created the Credentials object will be created. Using lazy initialization to do this reduces the likelihood of the MediaServicesCredentials object having to refresh its ACS token due to expiration. For more information about lazy initialization see "Lazy Initialization."
Bharath says: If you don't cache your Media Services credentials in a multi-tenant application, performance issues will occur as a result of thread contention.
For better scalability and performance, the EncodingService constructor uses the constructor overload of the CloudMediaContext class that takes a MediaServicesCredentials object.
The Contoso developers store connection values, including the Media Services account name and account key, in configuration. The values in the <ConfigurationSettings> element are the required values obtained during the Media Services account setup process.
<ConfigurationSettings>
  ...
  <Setting name="ContosoAccountName" value="Media_Services_Account_Name" />
  <Setting name="ContosoAccountKey" value="Media_Services_Account_Key" />
</ConfigurationSettings>
Configuration files can be encrypted by using the Windows Encrypting File System (EFS). Or you can create a custom solution for encrypting selected portions of a configuration file by using protected configuration. For more information see "Encrypting Configuration Information Using Protected Configuration."
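As an illustration of the protected configuration approach, the following sketch encrypts the appSettings section of a web application's configuration file using the Windows data protection (DPAPI) provider. Note that this applies to Web.config or App.config files rather than to the cloud service configuration shown above; the section name and provider choice are illustrative.
using System.Configuration;
using System.Web.Configuration;

public static class ConfigProtectionSketch
{
    public static void ProtectAppSettings()
    {
        // Open the root Web.config file for the current web application.
        Configuration config = WebConfigurationManager.OpenWebConfiguration("~");

        ConfigurationSection section = config.GetSection("appSettings");
        if (section != null && !section.SectionInformation.IsProtected)
        {
            // Encrypt the section in place using the DPAPI protected configuration
            // provider, so that the settings are stored encrypted on disk.
            section.SectionInformation.ProtectSection(
                "DataProtectionConfigurationProvider");
            config.Save();
        }
    }
}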
Upload process in the Contoso Azure Media Services applications
The following figure shows a high-level overview of the Contoso media upload process.
A high-level overview of the Contoso media upload process
Client apps communicate with the Contoso web service through a REST web interface, which allows them to upload media assets. When a new video is uploaded a new asset is created by Media Services, and the asset is uploaded to Azure Storage before the asset's details are published to the Content Management System (CMS).
This process can be decomposed into the following steps for uploading content into Media Services:
- Create a new empty Asset.
- Create an AccessPolicy instance that defines the permissions and duration of access to the asset.
- Create a Locator instance that will provide access to the asset.
- Upload the file that's associated with the Asset into blob storage.
- Publish the Asset.
- Save the Asset details to the CMS.
- Generate an AssetFile for the Asset.
- Add the Asset to the encoding pipeline.
Markus says: The Contoso web service does not contain an authentication mechanism. In your own application you should implement a secure authentication mechanism so that it's possible to link videos to the users who uploaded them.
The following figure shows the interaction of the classes in the Contoso Windows Store Video application that implement uploading a video for processing by Media Services.
The interaction of the classes that upload a video to Media Services
For information on how the upload process works in the Contoso video web application, see "Appendix C – Understanding the Contoso Video Applications."
The upload process is managed by the VideoService class in the Contoso.UILogic project. In the OnInitialize method in the App class, the VideoService class is registered as a type mapping against the IVideoService interface with the Unity dependency injection container. Then, when a view model class such as the NewVideoPageViewModel class accepts an IVideoService type, the Unity container will resolve the type and return an instance of the VideoService class.
The upload process is invoked when a user selects the Create button on the NewVideoPage. Through a binding this executes the CreateVideo method in the NewVideoPageViewModel class. This creates a new Video object and initializes the object with data from the file chosen for upload. Then, the upload process begins by calling the UploadFileAsync method in the VideoService class. A CancellationToken is passed to the UploadFileAsync method so that the upload process can be cancelled by the user if required.
private async void CreateVideo()
{
    ...
    await this.videoService.UploadFileAsync(this.videoFile, video,
        this.cancellationTokenSource.Token);
    ...
}
The UploadFileAsync method manages the upload process in the client by invoking methods of the HttpService and AzureFileUploader classes.
public async Task UploadFileAsync(VideoFile file, Video video, CancellationToken cancellationToken)
{
    // Ask the web service to create a new asset and return a SAS locator for it.
    var requestUri = new Uri(string.Format("{0}/{1}/?filename={2}",
        this.videosBaseUrl, "generateasset", file.Name));
    var responseContent =
        await this.httpService.GetAsync(requestUri, cancellationToken);
    var videoUploadInfo =
        JsonConvert.DeserializeObject<VideoUpload>(responseContent);

    // Upload the video file directly to blob storage using the SAS locator.
    await this.azureFileUploader.UploadVideoFileToBlobStorage(file,
        videoUploadInfo.SasLocator, cancellationToken);

    // Publish the video metadata, including the asset ID, to the web service.
    video.AssetId = videoUploadInfo.AssetId;
    var videoUpload = JsonConvert.SerializeObject(video);
    var uploadVideoUri = new Uri(string.Format("{0}/{1}", this.videosBaseUrl,
        "publish"));
    await this.httpService.PostAsync(uploadVideoUri, videoUpload,
        cancellationToken);
}
This method creates a Uri that specifies that the GenerateAsset method will be called on the web service, with the filename of the file to be uploaded being passed as a parameter. The HttpService class, which implements the IHttpService interface, is used to make the call to the web service.
public async Task<string> GetAsync(Uri requestUri)
{
    return await this.GetAsync(requestUri, CancellationToken.None);
}

public async Task<string> GetAsync(Uri requestUri, CancellationToken cancellationToken)
{
    using (var httpClient = new HttpClient())
    {
        var response =
            await httpClient.GetAsync(requestUri).AsTask(cancellationToken);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
This method asynchronously retrieves data from the web service by using the HttpClient class to send HTTP requests and receive HTTP responses from a URI. The call to HttpClient.GetAsync sends a GET request to the specified URI as an asynchronous operation, and returns a Task of type HttpResponseMessage that represents the asynchronous operation. The Task is cancellable, and will complete after the content from the response is read. For more information about the HttpClient class see "Connecting to an HTTP server using Windows.Web.Http.HttpClient."
When the UploadFileAsync method calls HttpService.GetAsync, this calls the GenerateAsset method in the VideosController class in the Contoso.Api project.
public async Task<HttpResponseMessage> GenerateAsset(string filename)
{
    ...
    var videoAsset = await encodingService.GenerateSasLocator(filename);
    var result = new VideoAssetDTO();
    Mapper.Map(videoAsset, result);
    return Request.CreateResponse(HttpStatusCode.Created, result);
    ...
}
This method calls the asynchronous GenerateSasLocator method in the EncodingService class. The EncodingService class is registered as a type mapping against the IEncodingService interface with the Unity dependency injection container. When the VideosController class accepts an IEncodingService type, the Unity container will resolve the type and return an instance of the EncodingService class.
When a VideoAsset object is returned from the GenerateSasLocator method, a VideoAssetDTO object is created and the returned VideoAsset is mapped onto it. The VideoAssetDTO is then returned in an HttpResponseMessage to the HttpService.GetAsync method.
The GenerateSasLocator method uses Media Services types to return a new VideoAsset that contains a new asset id to represent the asset being uploaded, and a shared access signature locator to access the asset.
public async Task<VideoAsset> GenerateSasLocator(string filename)
{
    var duration = int.Parse(
        CloudConfiguration.GetConfigurationSetting("SasLocatorTimeout"));

    // Create a new, empty asset with no encryption applied.
    IAsset asset = await this.context.Assets.CreateAsync(
        "NewAsset_" + Guid.NewGuid() + "_" + filename, AssetCreationOptions.None,
        CancellationToken.None).ConfigureAwait(false);

    // Create an access policy that grants write access for the configured duration.
    IAccessPolicy writePolicy = await this.context.AccessPolicies.CreateAsync(
        "writePolicy", TimeSpan.FromMinutes(duration), AccessPermissions.Write)
        .ConfigureAwait(false);

    // Create a SAS locator that provides an upload entry point for the asset.
    ILocator destinationLocator = await this.context.Locators.CreateLocatorAsync(
        LocatorType.Sas, asset, writePolicy).ConfigureAwait(false);

    var blobUri = new UriBuilder(destinationLocator.Path);
    blobUri.Path += "/" + filename;

    // return the new VideoAsset
    return new VideoAsset()
        { SasLocator = blobUri.Uri.AbsoluteUri, AssetId = asset.Id };
}
The method creates a new asset using AssetCreationOptions.None, which specifies that no encryption is applied while the asset is in transit or at rest. An access policy named writePolicy is then created that grants write access to the asset for the duration specified by the SasLocatorTimeout configuration setting (30 minutes). A shared access signature locator is then created using the access policy; the locator provides an entry point that can be used to access the files contained in the asset. Finally, a Uri to which the video file will be uploaded in blob storage is created, before the VideoAsset object is created and returned.
Bharath says: The maximum number of assets allowed in a Media Services account is 1,000,000.
Back in the UploadFileAsync method in the VideoService class, the UploadVideoFileToBlobStorage method of the AzureFileUploader class is used to upload the file to blob storage using the shared access signature locator returned by the GenerateSasLocator method.
public async Task UploadVideoFileToBlobStorage(VideoFile file, string sasLocator, CancellationToken cancellationToken)
{
    // Build a blob reference from the SAS locator; the query string contains
    // the shared access signature credentials.
    var blobUri = new Uri(sasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);
    var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(
        UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)),
        sasCredentials);

    // Obtain the StorageFile either directly from its path, or from the
    // FutureAccessList if a token was stored when the file was picked.
    StorageFile storageFile = null;
    if (string.IsNullOrEmpty(file.FutureAccessToken))
    {
        storageFile = await StorageFile.GetFileFromPathAsync(file.Path)
            .AsTask(cancellationToken);
    }
    else
    {
        storageFile = await StorageApplicationPermissions.FutureAccessList
            .GetFileAsync(file.FutureAccessToken).AsTask(cancellationToken);
    }
    cancellationToken.ThrowIfCancellationRequested();
    await blob.UploadFromFileAsync(storageFile);
}
This method uploads the video file to blob storage using a URI specified by the shared access signature locator. The file is uploaded by using the UploadFromFileAsync method of the CloudBlockBlob class.
The final step of the upload process is to publish the file for processing by the encoding pipeline. To do this the UploadFileAsync method creates a Uri that specifies that the Publish method will be called on the web service, and passes the serialized Video object, which includes the asset ID, in the request body. The PostAsync method of the HttpService class is used to make the call to the web service.
public async Task<string> PostAsync(Uri requestUri, string stringifyJsonPostData, CancellationToken cancellationToken)
{
    using (var httpClient = new HttpClient())
    {
        var postData = new HttpStringContent(stringifyJsonPostData,
            UnicodeEncoding.Utf8, "application/json");
        var response = await httpClient.PostAsync(requestUri,
            postData).AsTask(cancellationToken);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}
This method asynchronously sends data to the web service by using the HttpClient class to send HTTP requests and receive HTTP responses from a URI. The call to HttpClient.PostAsync sends a POST request to the specified URI as an asynchronous operation, passing data that represents a Video instance that contains the metadata for the content to be published, and returns a Task of type HttpResponseMessage that represents the asynchronous operation. The returned Task will complete after the content from the response is read.
When the UploadFileAsync method calls HttpService.PostAsync, this calls the Publish method in the VideosController class in the Contoso.Api project.
public async Task<HttpResponseMessage> Publish(VideoSaveDTO video)
{
    ...
    var newVideo = MapSaveDTO(video);
    newVideo.EncodingStatus = EncodingState.NotStarted;
    ...
    var videoDetail = await videoService.Save(newVideo);
    ...
    await encodingService.PublishAsset(videoPublish);
    return Request.CreateResponse(HttpStatusCode.Created);
    ...
}
The method first uses the MapSaveDTO method to convert the VideoSaveDTO object to a VideoDetail object, before setting its EncodingStatus to EncodingState.NotStarted. The Save method of the VideoService class is then called, followed by the PublishAsset method of the EncodingService class. In turn, the Save method in the VideoService class calls the SaveVideo method in the VideoRepository class, which creates a VideoEntity object to persist the video details to the CMS database in Azure Storage. The PublishAsset method in the EncodingService is shown in the following code example.
public async Task PublishAsset(VideoPublish video)
{
    var inputAsset = this.context.Assets.Where(
        a => a.Id == video.AssetId).SingleOrDefault();
    if (inputAsset != null)
    {
        // Generate AssetFile metadata for the file that was uploaded to blob storage.
        if (inputAsset.AssetFiles.Count() == 0)
        {
            await inputAsset.GenerateFromStorageAsync().ConfigureAwait(false);
        }

        // Queue a message so that the worker role can begin encoding the asset.
        var videoEncodingMessage = new EncodeVideoMessage()
        {
            AssetId = video.AssetId,
            VideoId = video.VideoId,
            IncludeThumbnails = true,
            Resolution = video.Resolution
        };
        ...
        IAzureQueue<EncodeVideoMessage> queue =
            new AzureQueue<EncodeVideoMessage>(
                Microsoft.WindowsAzure.CloudStorageAccount.Parse(
                    CloudConfiguration.GetConfigurationSetting(
                        "WorkerRoleConnectionString")),
                CloudConfiguration.GetConfigurationSetting("ContosoEncodingQueueName"),
                TimeSpan.FromSeconds(300));
        queue.AddMessage(videoEncodingMessage);
    }
}
This method performs two tasks. The first task is to generate an AssetFile for the Asset, using the GenerateFromStorageAsync extension method, and associate it with the Asset. It is important to note that the AssetFile instance and the media file are two distinct objects. The AssetFile instance contains metadata about the media file, whereas the media file contains the actual media content. The second task is to create an EncodeVideoMessage instance and add it to the AzureQueue instance to begin the encoding process. For more information about the encoding process see "Chapter 4 – Encoding and Processing Video."
Summary
This chapter has described how the Contoso developers incorporated Media Services' uploading functionality into their web service and Windows Store client application. It summarized the decisions that they made in order to support their business requirements, and how they designed the code that performs the upload process.
In this chapter, you saw how to connect to Media Services and ingest content with the Media Service SDK for .NET. The chapter also discussed how to secure your content during the upload process, both when it's in transit and at rest.
The following chapter discusses the next step in the Media Services workflow – encoding and processing uploaded media.
More information
- The page, "Ingesting Assets with the Media Services REST API" describing how to ingest assets into Media Services using the REST API, is available on MSDN.
- You can find the page, "Ingesting Assets in Bulk with the REST API" describing how to use the REST API to ingest assets into Media Services in bulk, on MSDN.
- For information about high speed ingest technology, see "Uploading Media" on MSDN.
- You can find the page, "Managing Media Services Assets across Multiple Storage Accounts" on MSDN.
- For information about ingesting assets in bulk, see the page "Ingesting Assets in Bulk with the Media Services SDK for .NET" on MSDN.
- For information about creating a Media Services account and associate it with a storage account, see "How to Create a Media Services Account" on MSDN.
- The page, "Setup for Development on the Media Services SDK for .NET" describes how to set up a Visual Studio project for Media Services development, is available on MSDN.
- For more information about lazy initialization, see "Lazy Initialization" on MSDN.
- For a detailed description of how to encrypt configuration information see "Encrypting Configuration Information Using Protected Configuration" on MSDN.
- For information about how to connect to a web service from a Windows Store application, see "Connecting to an HTTP server using Windows.Web.Http.HttpClient" on MSDN.