Hello,
Modern GPUs expose 10-bit (and even 12-bit) output because the entire video pipeline, from the frame buffer through the display engine and scaler to the cable, can carry more than 8 bits per color channel. NVIDIA's control panel simply lists whatever the display reports via EDID (Extended Display Identification Data) as "deep color" capability. Cards like the NVIDIA Quadro line and recent GeForce/Titan models support 30-bit (10 bpc) output over HDMI 1.3 or later and over DisplayPort, and the monitor will accept that signal even if its actual panel is only 8 bpc + FRC.
Now, 8 bpc + FRC (Frame Rate Control) is just temporal dithering inside the monitor: it rapidly alternates between the two nearest 8-bit values on successive frames so the eye averages them into the intermediate shade. To the GPU it looks like a true 10 bpc display: the GPU sends full 10-bit values, and the monitor's internal scaler applies FRC to emulate the extra gradations.
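To make the idea concrete, here is a minimal sketch of that temporal dithering. It is a hypothetical illustration (a real FRC engine uses spatial patterns and more frames, and `frc_frames` is my own name, not any vendor's API): the two low bits of a 10-bit value decide how often the panel shows the next-higher 8-bit level, so the average over a few frames lands on the in-between shade.

```python
# Sketch of how an 8 bpc + FRC panel can approximate a 10-bit value.
# Hypothetical example only, not any vendor's actual algorithm.

def frc_frames(value_10bit, num_frames=4):
    """Return the 8-bit levels shown on successive frames so that
    their temporal average approximates the 10-bit target."""
    base = value_10bit >> 2          # nearest lower 8-bit level
    remainder = value_10bit & 0b11   # the 2 extra bits FRC must emulate
    # Show the higher level on `remainder` out of every 4 frames.
    return [min(base + 1, 255) if i < remainder else base
            for i in range(num_frames)]

frames = frc_frames(513)             # a 10-bit shade between 128 and 129
avg = sum(frames) / len(frames)
print(frames, avg)                   # [129, 128, 128, 128] 128.25
```

The average, 128.25, is exactly 513/4, i.e. the 10-bit shade expressed in 8-bit units, which is why the eye perceives a gradation the panel cannot show natively.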
When you choose "10 bit" in the control panel, the GPU sends 10 bpc data end-to-end. The monitor's scaler then uses all 10 bits to drive its FRC engine. If you instead force an 8 bpc output, the GPU truncates (or quantizes) your colors to 8 bits first, and the monitor can only dither that reduced data, which lowers the precision of the simulated tones. In contrast, with a 10 bpc link the monitor's internal processing often runs at higher precision (e.g. 12 bits) to preserve the full 10-bit accuracy before dithering.
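The cost of truncating first is easy to see: dropping the two low bits maps four distinct 10-bit codes onto each 8-bit code, so shades the panel's FRC could have distinguished become identical before they ever reach the monitor. A small sketch (function name is mine, for illustration):

```python
# Two distinct 10-bit shades collapse to the same 8-bit code when the
# GPU truncates before the link; the monitor can no longer tell them apart.

def truncate_to_8bpc(v10):
    return v10 >> 2                  # drop the two least-significant bits

a, b = 600, 602                      # two distinct 10-bit gray levels
print(truncate_to_8bpc(a), truncate_to_8bpc(b))   # 150 150 -- identical
```

This is the quantization step behind visible banding: neighboring gradations in a smooth ramp merge into one flat band.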
For HDR workflows (where standards like HDR10 encode in 10 bpc) the extra bit depth on the link is essential to avoid banding and preserve tonal detail. Even on an 8 bpc panel with FRC, feeding it a 10 bpc stream ensures the monitor's dithering algorithm can faithfully render the wider HDR gradations in a single pass, rather than compounding two truncation steps.
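The "single pass versus compounding two truncation steps" point can be sketched numerically. Under the same simplified FRC model as above (again a hypothetical illustration, with names of my own choosing): dithering directly from the 10-bit value reproduces the shade exactly on temporal average, while truncating to 8 bits first loses the fractional part for good.

```python
# One-pass vs two-pass: dither from the full 10-bit value, or truncate
# to 8 bpc first and dither what is left. Illustrative sketch only.

def temporal_avg_from_10bit(v10, n=4):
    """Temporal average an idealized 4-frame FRC engine achieves."""
    base, rem = v10 >> 2, v10 & 0b11
    return (base * (n - rem) + (base + 1) * rem) / n

v = 601                              # true shade = 601 / 4 = 150.25 in 8-bit units
print(temporal_avg_from_10bit(v))    # 150.25 -- one-pass FRC hits it exactly
print(v >> 2)                        # 150    -- truncate-first has already lost 0.25
```

With an 8 bpc link the monitor receives only 150 and has nothing left to dither toward, which is why feeding the full 10 bpc stream to an FRC panel preserves more tonal detail.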
For SDR or non-critical tasks, the visual difference between true 10 bpc panels and well-implemented 8 bpc + FRC is minimal. Many professional 8 bpc + FRC monitors even win awards for color quality, and some users find them indistinguishable from native 10 bpc displays. If you don't see banding at 8 bpc or your workflow isn't HDR-centric, you can stick with 8 bpc (possibly saving a bit of interface bandwidth). Otherwise, leaving the GPU at 10 bpc output is generally the best way to maximize color fidelity on any FRC-equipped display.