question

SarumantheWhite-1709 asked · SarumantheWhite-1709 commented

MediaCapture: MJPG stream converted to NV12

I have a UWP (C++/WinRT) application running on HoloLens 2 that reads frames from an external, USB-attached camera. I read the frames via winrt::Windows::Media::Capture::MediaCapture:
(1) I obtain a winrt::Windows::Media::Capture::Frames::MediaFrameSource for the camera (stream type MediaStreamType::VideoRecord, source kind MediaFrameSourceKind::Color),
(2) select a format from the source's SupportedFormats() for a particular resolution, framerate, etc.,
(3) get a MediaFrameReader from that source,
(4) subscribe to the MediaFrameReader's FrameArrived event and call StartAsync (a sketch of this setup follows below).
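For reference, a trimmed-down C++/WinRT sketch of this setup (assuming an already initialized MediaCapture; the coroutine name and the omitted resolution/framerate filtering are simplified, not exact code):

```cpp
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Media.Capture.h>
#include <winrt/Windows.Media.Capture.Frames.h>

using namespace winrt;
using namespace Windows::Foundation;
using namespace Windows::Media::Capture;
using namespace Windows::Media::Capture::Frames;

IAsyncAction StartMjpgReaderAsync(MediaCapture mediaCapture)
{
    // (1) Find the color / video-record frame source of the external camera.
    MediaFrameSource source{ nullptr };
    for (auto const& pair : mediaCapture.FrameSources())
    {
        auto info = pair.Value().Info();
        if (info.MediaStreamType() == MediaStreamType::VideoRecord &&
            info.SourceKind() == MediaFrameSourceKind::Color)
        {
            source = pair.Value();
            break;
        }
    }
    if (!source) co_return;

    // (2) Select an MJPG entry from SupportedFormats() (resolution/framerate checks omitted).
    for (auto const& format : source.SupportedFormats())
    {
        if (format.Subtype() == L"MJPG")
        {
            co_await source.SetFormatAsync(format);
            break;
        }
    }

    // (3) Create a reader for that source.
    MediaFrameReader reader = co_await mediaCapture.CreateFrameReaderAsync(source);

    // (4) Subscribe to FrameArrived and start reading.
    reader.FrameArrived([](MediaFrameReader const& sender, MediaFrameArrivedEventArgs const&)
    {
        if (auto frame = sender.TryAcquireLatestFrame())
        {
            // As described below, the delivered frame reports NV12 here, not MJPG.
        }
    });
    co_await reader.StartAsync();

    // In real code, keep 'reader' alive (e.g. as a class member) for as long as frames are needed.
}
```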

This works fine: the FrameArrived handler fires at the expected rate and delivers frames. My issue is the following:
- when I query SupportedFormats() on the MediaFrameSource, it lists formats with the MJPG subtype (the camera supports this);
- when I select one of these MJPG formats, the FrameArrived event still fires, but the frame is internally converted to NV12 (see the check sketched below).
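For illustration, checking the subtype of the delivered frames inside the handler looks roughly like this (a sketch using the same namespaces as above; the handler name is arbitrary):

```cpp
void OnFrameArrived(MediaFrameReader const& sender, MediaFrameArrivedEventArgs const&)
{
    if (auto frame = sender.TryAcquireLatestFrame())
    {
        // Even after SetFormatAsync was given an MJPG format, this reports L"NV12".
        winrt::hstring subtype = frame.Format().Subtype();

        if (auto video = frame.VideoMediaFrame())
        {
            if (auto bitmap = video.SoftwareBitmap())
            {
                // The SoftwareBitmap pixel format likewise shows the decoded form
                // (BitmapPixelFormat::Nv12) rather than the compressed bitstream.
                auto pixelFormat = bitmap.BitmapPixelFormat();
            }
        }
    }
}
```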

I need to disable this internal conversion to NV12 and retrieve the MJPG frames as generated by the camera.

I have also tried Media Foundation, from which I can get the MJPG frames successfully, but only on desktop (roughly as sketched below). On HoloLens the API is limited for some reason, leaving me unable to obtain an IMFMediaSource for the webcam, so MF is probably not an option on HoloLens 2. Encoding the frames myself from NV12 to JPEG is also not an option, as HoloLens 2 does not have enough computational resources for that conversion.
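A condensed sketch of the desktop Media Foundation path (error handling, device-array cleanup and COM/MF setup omitted; link against mfplat, mfreadwrite and mfuuid):

```cpp
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Assumes COM is initialized on this thread and MFStartup(MF_VERSION) has been called.
bool ReadOneMjpgSample()
{
    // Enumerate video capture devices and activate the first one as an IMFMediaSource.
    // Obtaining this IMFMediaSource is the step that is not available to the app on HoloLens 2.
    ComPtr<IMFAttributes> attrs;
    MFCreateAttributes(&attrs, 1);
    attrs->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
                   MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);

    IMFActivate** devices = nullptr;
    UINT32 count = 0;
    MFEnumDeviceSources(attrs.Get(), &devices, &count);
    if (count == 0) return false;

    ComPtr<IMFMediaSource> source;
    devices[0]->ActivateObject(IID_PPV_ARGS(&source));

    ComPtr<IMFSourceReader> reader;
    MFCreateSourceReaderFromMediaSource(source.Get(), nullptr, &reader);

    // Walk the native media types on the video stream and select the first MJPG one.
    for (DWORD i = 0; ; ++i)
    {
        ComPtr<IMFMediaType> type;
        if (FAILED(reader->GetNativeMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, i, &type)))
            break;
        GUID subtype{};
        type->GetGUID(MF_MT_SUBTYPE, &subtype);
        if (subtype == MFVideoFormat_MJPG)
        {
            reader->SetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, nullptr, type.Get());
            break;
        }
    }

    // Pull one sample; its buffer holds the JPEG bitstream as produced by the camera.
    DWORD streamIndex = 0, flags = 0;
    LONGLONG timestamp = 0;
    ComPtr<IMFSample> sample;
    reader->ReadSample(MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0,
                       &streamIndex, &flags, &timestamp, &sample);
    return sample.Get() != nullptr;
}
```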

Any ideas what else I can do to receive MJPG from the webcam?


windows-uwp

@SarumantheWhite-1709 Could you please test your application on a normal Windows 10 portable device (such as a laptop)? If the error doesn't occur there, the issue might be related to the device.

@AryaDing-MSFT On desktop (Win10 20H2, x86-64) the behavior is different. If I choose the NV12 or YUY2 pixel format, everything works as expected. On the other hand, if I choose MJPG (which is still listed in SupportedFormats()), the MediaCapture.Failed event fires just after I call MediaFrameReader.StartAsync(), and MediaFrameReader.FrameArrived is never invoked.
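(For completeness, the error details in this situation can be captured by hooking MediaCapture.Failed before calling StartAsync; a sketch with an arbitrary logging sink:)

```cpp
#include <sstream>
#include <string_view>
#include <windows.h>
#include <winrt/Windows.Media.Capture.h>

void HookFailed(winrt::Windows::Media::Capture::MediaCapture const& mediaCapture)
{
    using winrt::Windows::Media::Capture::MediaCapture;
    using winrt::Windows::Media::Capture::MediaCaptureFailedEventArgs;

    // Logs the code and message the capture pipeline reports when it fails.
    mediaCapture.Failed([](MediaCapture const&, MediaCaptureFailedEventArgs const& args)
    {
        std::wostringstream msg;
        msg << L"MediaCapture.Failed: 0x" << std::hex << args.Code()
            << L" " << std::wstring_view{ args.Message() } << L'\n';
        ::OutputDebugStringW(msg.str().c_str());
    });
}
```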

@AryaDing-MSFT I can provide a UWP console application for testing if that would help.

@SarumantheWhite-1709 Does your camera support the MJPG type? You need to make sure that your camera supports MJPG before you can use that format. If it does, you could configure it using the sample code.
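For example, a quick way to confirm what a source advertises before selecting a format is to dump its supported formats (a sketch; assumes the MediaFrameSource from the question):

```cpp
#include <sstream>
#include <string_view>
#include <windows.h>
#include <winrt/Windows.Media.Capture.Frames.h>

// Lists subtype, resolution and frame rate of everything the source advertises,
// so MJPG support can be confirmed before calling SetFormatAsync.
void DumpSupportedFormats(winrt::Windows::Media::Capture::Frames::MediaFrameSource const& source)
{
    for (auto const& format : source.SupportedFormats())
    {
        auto fr = format.FrameRate();
        std::wostringstream line;
        line << std::wstring_view{ format.Subtype() } << L" "
             << format.VideoFormat().Width() << L"x" << format.VideoFormat().Height()
             << L" @ " << (fr.Denominator() ? fr.Numerator() / fr.Denominator() : 0) << L" fps\n";
        ::OutputDebugStringW(line.str().c_str());
    }
}
```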


@AryaDing-MSFT When I connect the same camera to a desktop and access it through Media Foundation, the MJPG subtype is listed and I am able to retrieve valid JPEG frames from it (using a modified mfroundtrip example by sipsorcery). I test with two cameras (a Microsoft LifeCam VX-2000 and a no-name Chinese one), both of which advertise MJPG in SupportedFormats(), and both behave the same.

1 Answer

RobCaplan answered · SarumantheWhite-1709 commented

Hi Saruman,

HoloLens 2 development isn't (yet!) covered on the Q&A forums. For help with HoloLens 2 programming you can open a support ticket at https://aka.ms/mrsupport to work one-on-one with a HoloLens 2 programming specialist.

External cameras aren't supported on the HoloLens 2, so the behavior from one is undefined. See the supported USB classes at https://docs.microsoft.com/en-us/hololens/hololens-connect-devices#hololens-2-connect-usb-c-devices

Even with the onboard camera, I'd expect only the uncompressed & YUV formats (see https://docs.microsoft.com/en-us/windows/win32/medfound/video-subtype-guids) to be available at the MediaFrameReader level, not encoded formats such as MJPG.

You can get MJPG output by capturing to a custom sink. The pipeline will encode it before calling your sink.
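(A hedged sketch of the wiring only: the custom sink itself, here the placeholder myMjpegSink, must implement IMediaExtension and is the substantial part of this approach, and it is not shown. Whether the pipeline accepts an MJPG video profile built this way is scenario-specific.)

```cpp
// Fragment (inside a C++/WinRT coroutine). mediaCapture is an initialized MediaCapture;
// myMjpegSink stands in for a custom sink object implementing IMediaExtension.
using namespace winrt::Windows::Media::MediaProperties;

MediaEncodingProfile profile;
profile.Video(VideoEncodingProperties::CreateUncompressed(
    MediaEncodingSubtypes::Mjpg(), 1280, 720));   // subtype "MJPG"; resolution is illustrative

co_await mediaCapture.StartRecordToCustomSinkAsync(profile, myMjpegSink);
```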

Media capture is very complicated and there may be a better way to get to your end goal. If you'd like to discuss this with a HoloLens 2 media specialist please open a case at https://aka.ms/mrsupport


--Rob

Hi Rob,

Thanks for your reply. I agree that supporting compressed formats in an API that reads individual frames may be pointless. Still, JPEG would make sense to me, as there is no inter-frame prediction.

I'll go with the custom sink approach, but I'd like some clarification on "The pipeline will encode it before calling your sink." If the external camera can provide MJPG frames itself (it has the hardware for it, like the Microsoft LifeCam VX-2000), will the framework pass the JPEG data through as the camera creates it, or will it decode and re-encode the JPEG?

Hi Saruman,

I bounced this off of a couple of media specialists, and it's complicated. If you need help with this please open a case. Best practices will vary greatly based on your specific scenario.

That said, you'll probably get MJPG pulled directly off of the camera if the camera supports MJPG.

Their general recommendation is not to use MJPG: if you just need the frames off the camera, use NV12; if you need to move the video off the system, use H.264. This is what all the on-board systems do.
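(For the H.264 route, a sketch of the stock MP4 recording path, as a coroutine fragment with an already initialized MediaCapture; file name, target folder and quality are illustrative, and the Videos Library target assumes the corresponding app capability.)

```cpp
// Fragment (inside a C++/WinRT coroutine). Records the camera to an H.264/MP4 file
// using the built-in profile; stop when done.
using namespace winrt::Windows::Media::MediaProperties;
using namespace winrt::Windows::Storage;

MediaEncodingProfile profile = MediaEncodingProfile::CreateMp4(VideoEncodingQuality::HD720p);
StorageFile file = co_await KnownFolders::VideosLibrary().CreateFileAsync(
    L"capture.mp4", CreationCollisionOption::GenerateUniqueName);

co_await mediaCapture.StartRecordToStorageFileAsync(profile, file);
// ... record for as long as needed ...
co_await mediaCapture.StopRecordAsync();
```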

Note that there's no conversion from MJPG to NV12 going on in any step here - when delivering NV12 we're pulling NV12 off of the camera, not MJPG.

--Rob

Hi Rob,
thanks, this helps.
