If your system has multiple GPUs, such as an integrated Intel/AMD GPU and a dedicated Nvidia card, you can set the Nvidia graphics card as the default in Windows. Here’s how to set the preferred GPU.
Most CPUs, whether from Intel or AMD, have their own built-in graphics processing unit. This GPU built into the CPU is called an integrated graphics processing unit, or iGPU for short. The built-in GPU is very basic and is mainly intended for everyday use, like hooking up a monitor and doing your daily tasks without additional graphics hardware. However, for more demanding tasks like gaming and content creation, you need a good dedicated GPU. The good thing is, buying and installing a GPU is pretty easy.
When you add a dedicated GPU to your system, Windows will automatically use it for all the graphics heavy lifting. In fact, it constantly switches between the integrated GPU and the dedicated GPU depending on the task. Though Windows handles the switching between dGPU and iGPU most of the time, it doesn’t always get it right. This is especially true for poorly coded or unoptimized applications. In those cases, you can force Windows to use the Nvidia GPU for all the graphics heavy lifting.
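Before forcing a preference, it helps to confirm that Windows actually sees both GPUs. Here is a minimal sketch in Python, assuming PowerShell is available on the PATH; the WMI class Win32_VideoController is standard, but everything else is just for illustration.

```python
import subprocess

# Ask WMI (via PowerShell) for the names of all video controllers Windows knows about.
# On a typical dual-GPU machine this should print both the integrated GPU and the Nvidia card.
result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name"],
    capture_output=True, text=True, check=True,
)

for name in result.stdout.splitlines():
    if name.strip():
        print(name.strip())
```

If only one GPU shows up, check that the dedicated card is seated properly and that its driver is installed before going any further.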
So, without further ado, let me show you the steps to set the Nvidia graphics card as the default in Windows 10.
Use PC Settings App (Windows 10 Only)
The Settings app in Windows 10 allows you to set Nvidia as the preferred GPU for any application of your choice.
Note: For this method to work, you should be using Windows 10 v1903 or v1909. Here’s how to check your Windows 10 version.
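If you’d rather check the version programmatically, here is a minimal sketch in Python. The build-to-version mapping in the comments is based on commonly published build numbers (18362 for v1903, 18363 for v1909), not something from this guide.

```python
import sys

# getwindowsversion() reports the running Windows build number.
# Build 18362 corresponds to Windows 10 v1903 and 18363 to v1909.
build = sys.getwindowsversion().build
print(f"Windows build: {build}")
print("v1903 or newer" if build >= 18362 else "older than v1903")
```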
1. Open the Settings app by pressing the keyboard shortcut Win + I. You can also search for Settings in the Start menu.
2. In the Settings app, go to the “System → Display” page. On the right panel, scroll down and click on the “Graphics Settings” link.
3. Now, select the app type from the drop-down menu. If you want to set the preferred GPU for a regular Win32 application, select the “Classic app” option. If the app is downloaded from the Microsoft Store, select the “Universal app” option.
4. After selecting the app type, click on the “Browse” button, find the application, select it, and click on the “Open” button. Next, click on the “Add” button.
5. Now, click on the “Options” button.
6. From the pop-up window, select the “High performance” radio option and click on the “Save” button.
That is it. You are done setting Nvidia as the default graphics card in Windows 10.
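Under the hood, the Graphics Settings page stores these per-app preferences in the registry under HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences. If you prefer scripting the change, here is a minimal sketch in Python using the standard winreg module. The path C:\Games\game.exe is only a placeholder, and this key reflects the behavior of recent Windows 10 builds; if your build behaves differently, stick with the Settings UI (and back up the registry before experimenting).

```python
import winreg

# Full path of the executable you want to pin to the high-performance GPU.
# This path is only an example; replace it with your own application.
exe_path = r"C:\Games\game.exe"

# The Graphics Settings page writes one string value per executable here.
# "GpuPreference=2;" means high performance, "GpuPreference=1;" power saving,
# "GpuPreference=0;" lets Windows decide.
key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")

print(f"Set high-performance GPU preference for {exe_path}")
```

After running this, the application should appear on the Graphics Settings page with the “High performance” option already selected.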
Use the Nvidia Control Panel
Nvidia Control Panel makes it quite easy to set the preferred GPU in Windows 10. The good thing about this method is that it gives you granular control over the graphics settings. This method is particularly suitable for old games that have compatibility issues with certain graphics settings. If you are not sure about the settings you are changing, I recommend you stick with the first method.
1. First, download and install the Nvidia Control Panel if it is not already installed.
2. Now, open the Nvidia Control Panel by searching for it in the Start menu.
3. In the Nvidia Control Panel, select “Manage 3D Settings” under the 3D Settings section on the left panel. On the right panel, select the “Program Settings” tab.
4. Select the target program from the first drop-down menu. Next, in the second drop-down menu, select “High-performance NVIDIA processor” as the preferred graphics processor. Optionally, go through the individual options in the settings section below and adjust them as needed.
5. Finally, click on the “Apply” button to save changes.
That is it.
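If you want to verify that the program is actually landing on the Nvidia GPU, you can watch the card’s utilization while it runs. Here is a minimal sketch in Python, assuming the nvidia-smi utility that ships with the Nvidia driver is on your PATH; the query flags used are standard nvidia-smi options.

```python
import subprocess
import time

# Poll the Nvidia GPU's name and utilization a few times while the program is running.
# Utilization climbing well above idle is a good sign the dedicated GPU is doing the work.
for _ in range(5):
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())
    time.sleep(2)
```

You can also check the “GPU engine” column in Task Manager’s Details tab for the same information.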
I hope that helps. If you are stuck or need some help, comment below and I will try to help as much as possible.
And how can I use Nvidia when I play a game through a client?
There’s no option for clients, only for executable files.
Good article. I have an elementary question. I’m upgrading from Win-7 to a new Win-10 machine. I’m reading that my current Samsung Syncmaster may not work with the upgrade. Here’s the question: might I use an Nvidia graphics card in the new box so that my current monitor will function? Or is there another suggestion?
Thanks
Crady Adams
Vinton, VA
RC————@aol.com
Hi Adams,
If your Samsung Syncmaster monitor has ports (like HDMI) that are compatible with the Nvidia graphics card, then it should work. For instance, if your monitor has HDMI, DVI, or VGA ports, make sure the graphics card also has one or more of those ports. If needed, you can also buy port conversion adapters.
Thanks for your prompt and helpful reply. But please ponder one more issue. I’ve been trawling the internet and have found questions about the compatibility of my Samsung Syncmaster (P277H/P2770FH) monitor with Windows-10. It seems you are suggesting it’s not a Windows-10 thing but a cable/port issue. I’m debating between a plain vanilla HP or Dell computer and wonder whether my monitor would attach directly to an existing port on the new computer, or whether it should attach to an add-on Nvidia card/port. Can you enlighten me? Thanks
In general, if the monitor has a compatible video input port, then Windows 10 should work, provided it can install a generic driver for that specific monitor. If Windows cannot find a generic driver and you cannot install the driver provided by the monitor manufacturer, it can cause compatibility issues. The easiest way to check whether your monitor works with Windows 10 is to connect it to a friend’s or family member’s Windows 10 machine.