
My PC "refuses" to use dedicated GPU...

Anonymous
2021-06-10T04:48:03+00:00

Context:

I've been searching for a solution to this for about two and a half years now. My PC model is an Acer Aspire 5600U. I know it's a little bit on the older side, since we purchased it back in 2013, but aside from gaming, for things like school, homework, and watching videos it's still in a very healthy state, so I believe it still has at least a couple of years of useful life until we inevitably have to get a newer computer. It has two graphics cards:

-The integrated one, an Intel HD Graphics 4000

-The dedicated one, an NVIDIA GeForce 630M

There was a time, roughly 2013-2016, when my PC was able to run some early-2010s games perfectly on low to medium settings, for example Call of Duty: Black Ops II, Battlefield 3, and CS:GO, plus some older games like Minecraft and GTA San Andreas. Even by the end of 2016, I downloaded GTA V and, although I had to lower all settings to the minimum, it performed just fine and was playable.

The problem:

Since early 2017, I noticed the computer had started to struggle running Black Ops II on the settings I had always used, so I had to start lowering them to be able to play normally with no lag; not that big of a deal, since I don't really care all that much about graphics, I just want a game to be playable, especially online. The same thing happened with CS:GO, and I had to lower all settings in that game as well. Then, with Minecraft, I noticed a similar trend, and when I pressed F3 to see how the game was performing, I noticed it was using the integrated graphics card (the Intel) instead of the NVIDIA like it always had (or at least that's what I believe; before this problem started, I never really paid attention to any of this).

So, from early 2017 to early 2019, my fix was to disable the Intel GPU in Device Manager whenever I wanted to play a game, and then launch the game; the performance difference between the Intel and the NVIDIA was very noticeable, and in the case of Minecraft I was able to confirm (with the F3 menu) that it was indeed using the NVIDIA GPU. Like I said, this was my workaround for my suspicion that the PC wasn't using the dedicated GPU under normal conditions.

However...

After some time, more or less by early 2019, this wasn't working anymore. Now, if I did this, Minecraft and basically every other game ran at extremely low FPS; I don't think it was even 1 FPS, it looked more like a PowerPoint presentation. The odd thing was that Minecraft still showed it was using the NVIDIA GPU, but I don't think that was really the case, given how horribly the game was performing. Then, when I enabled the Intel GPU again, games ran not as well as when they were using the NVIDIA properly, but definitely not as badly as with the Intel disabled.

What I've tried already:

Since then, I've dug through the internet looking for a solution, and I think I have tried almost everything, but apparently none of it has worked. I don't even know where to start:

  • I updated the drivers for both GPUs
  • I uninstalled and reinstalled both GPUs (from Device Manager; since my PC is an all-in-one, I cannot physically swap the GPUs)
  • Switched the global default GPU in the NVIDIA Control Panel, as well as for each specific program
  • Selected "Run with graphics processor" -> "High-performance NVIDIA processor" from the right-click menu when launching a game
  • Went into the BIOS, but as far as I can tell there was no option whatsoever for choosing which GPU to use, or at least not an obvious one
  • In PC Settings -> Display -> Graphics Settings, I also changed, program by program, which GPU they should use; the curious thing is that in this section the NVIDIA GPU doesn't even show up

In my opinion, this doesn't even make sense, since the NVIDIA GPU shows up perfectly well in Device Manager, and of course the NVIDIA Control Panel also shows it.
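For reference, on Windows 10 version 1803 and later, the per-app choice made on that Graphics Settings page is stored as a registry value under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences, so the preference can also be inspected or set directly. A minimal Python sketch, where the game path is just a placeholder:

    # Minimal sketch: write the same per-app GPU preference that the
    # Settings > Display > Graphics Settings page stores.
    # Assumes Windows 10 1803+; the game path below is hypothetical.
    import winreg

    KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"
    GAME_EXE = r"C:\Games\SomeGame\game.exe"  # placeholder path

    # "GpuPreference=1;" = power saving (iGPU),
    # "GpuPreference=2;" = high performance (dGPU)
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ, "GpuPreference=2;")
        print("Set high-performance preference for", GAME_EXE)

Whether a GPU that the Settings page itself refuses to list will honor this value is exactly the open question here, but it at least shows what that page writes behind the scenes.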

There are only two possible solutions I know of that I haven't tried yet: rolling back to an older version, whether for one or both GPU drivers, or for the Windows 10 version itself. But at this point, I'm kind of tired and hopeless after two years with this problem.

Now, I suspect this might be a hardware problem, but I'm not too sure about that, because if that were the case, I believe my PC wouldn't even detect the NVIDIA GPU in Device Manager.

This is a very frustrating problem, because even though I'm perfectly aware that this isn't a gaming PC, I KNOW it can give a little bit more gaming-wise, and I really think this is the main reason why it's not as good at running games as it was at least four years ago.

So, I would be very thankful if anybody can help me with this case...

Thanks in advance, if you need any more details about my PC, just ask for them.

Windows for home | Windows 10 | Devices and drivers

Locked question. This question was migrated from the Microsoft Support Community.


1 answer

  1. Anonymous
    2021-06-10T05:30:29+00:00

    Hi Endazer. I'm Greg, an Independent Advisor.

    For my first ten years helping in forums, I had a hobby of making old laptops run as long as possible. The trick is to start from a gold-standard clean install, a method I wrote up and popularized among consumers over those same ten years. Even with the very best install and upkeep, however, I never had one run well for more than 7-8 years, or at least not as fast as I require, which is an instantaneous response on everything. So while they all may have run 10 or more years, it wasn't worth running them after 7-8, and they were given away to people who thought they were fine.

    All that said, the Intel HD 4000 was barely capable in the early days of Windows 10 and almost certainly isn't now. But that doesn't stop us from trying everything!

    Does your laptop have switchable graphics, which requires the different settings explained here?

    https://www.microcenter.com/tech_center/article...

    https://www.intel.com/content/www/us/en/support...

    http://www.geforce.com/hardware/technology/opti...

    If not, then I'd find out which adapter can be made to work best with the very best driver you can find for it, and then disable the other one in the BIOS or in Device Manager.
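    Before disabling anything, it can help to see what Windows itself reports about the two adapters. A small sketch, assuming Python is installed, that shells out to PowerShell's Get-PnpDevice:

        # Sketch: list display-class devices with their current status,
        # e.g. to confirm the NVIDIA adapter is present and error-free
        # before disabling the other one.
        import subprocess

        result = subprocess.run(
            ["powershell", "-NoProfile", "-Command",
             "Get-PnpDevice -Class Display | "
             "Format-List FriendlyName, Status, InstanceId"],
            capture_output=True, text=True,
        )
        print(result.stdout)

    A device listed there can then be disabled or re-enabled from an elevated PowerShell prompt with Disable-PnpDevice / Enable-PnpDevice and the InstanceId shown.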

    The Display driver is so important here that merely updating it in such an end-of-life-extension scenario is not enough. So let's try everything possible to get it working best for each adapter to see what they have left in them:

    First make sure you have updated the Display driver from the PC or Display adapter maker's Support Downloads web page for your exact model number, HP Serial Number or Dell Service Tag - from the sticker on the PC.

    If necessary, first remove the old driver using DDU https://www.wagnardsoft.com/content/ddu-guide-t... (not necessary with the Microsoft Basic driver, which is a placeholder used when nothing else is available) and/or install in Safe Mode with Networking (so you have internet), or plain Safe Mode, accessed by one of these methods: https://www.digitalcitizen.life/4-ways-boot-saf...
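    One of the methods in that second link can also be scripted: setting the safeboot flag with bcdedit makes the next restart boot into Safe Mode with Networking. A sketch (run as administrator, and remember to remove the flag afterwards or the PC will keep booting into Safe Mode):

        # Sketch: flag the current boot entry for Safe Mode with
        # Networking on the next restart (requires an elevated prompt).
        # Undo later with: bcdedit /deletevalue {current} safeboot
        import subprocess

        subprocess.run(
            ["bcdedit", "/set", "{current}", "safeboot", "network"],
            check=True,
        )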

    While you're there, check also for newer chipset, BIOS/UEFI firmware (very important), network, sound, USB3 and other drivers, comparing them to the ones presently installed in Device Manager, reached by right-clicking the Start Menu.
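    To compare what's presently installed against what the vendor pages offer, something like this sketch prints the driver version and date for each display adapter, read from the Win32_VideoController WMI class:

        # Sketch: print name/version/date of each display adapter's
        # driver so they can be compared with the vendor download pages.
        import subprocess

        result = subprocess.run(
            ["powershell", "-NoProfile", "-Command",
             "Get-CimInstance Win32_VideoController | "
             "Format-List Name, DriverVersion, DriverDate"],
            capture_output=True, text=True,
        )
        print(result.stdout)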

    If this doesn't give you the latest or ideal driver for what you need, compare it to the driver offered by the Intel driver update Utility here: http://www.intel.com/content/www/us/en/support/...

    or the Nvidia Update utility here: http://www.nvidia.com/Download/Scan.aspx?lang=e...

    or the AMD auto-detect utility here: http://support.amd.com/us/gpudownload/windows/P...

    For display issues, one fix that has been working is to roll back or uninstall the driver on the display device's Driver tab, then restart the PC to reinstall the driver.

    You can also try older drivers in Device Manager > Display device > Driver tab > Update Driver > Browse > Let Me Pick.
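    To see which driver packages are actually staged on the system before picking one, pnputil can enumerate them (Windows 10 1607 or later). A rough Python sketch that filters for display-class entries; the blank-line split is a heuristic for pnputil's output format:

        # Sketch: list third-party driver packages in the driver store
        # and keep only entries whose class is Display.
        import subprocess

        result = subprocess.run(
            ["pnputil", "/enum-drivers"],
            capture_output=True, text=True,
        )
        for entry in result.stdout.split("\n\n"):
            if "Display" in entry:
                print(entry)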

    Then you will know you've tried everything in addition to Windows Update drivers.

    Adjust the screen resolution until it fits and looks best at Settings > System > Display.

    Then watch Windows Update, which may push a new monitor driver based on the display driver you now have installed.

    Feel free to ask any questions. Based on the results you post back I may have other suggestions if necessary.

    ______________________________________________

    Standard Disclaimer: There are links to non-Microsoft websites. The pages appear to be providing accurate, safe information. Watch out for ads on the sites that may advertise products frequently classified as PUPs (Potentially Unwanted Programs). Thoroughly research any product advertised on the sites before you decide to download and install it.

    3 people found this answer helpful.