Using the dedicated graphics card on an external monitor











I have just bought a Lenovo Yoga 730 15" with an Intel graphics card and a GeForce 1050 graphics card. It runs games perfectly fine on the internal monitor, but as soon as I switch to an external monitor the frame rate drops sharply. I don't have exact numbers, but it goes from smooth to unplayable with the same settings.



Both monitors are 4K 60 Hz. I'm connecting the external monitor through Thunderbolt 3 with an HDMI 2.0 adapter (rated for 4K 60 Hz).
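(As a rough sanity check, using nominal numbers, the cable itself shouldn't be the bottleneck: 4K at 60 Hz with 8-bit RGB is about 3840 × 2160 × 60 × 24 ≈ 12 Gbit/s of pixel data, which fits within HDMI 2.0's 18 Gbit/s even allowing for blanking intervals and encoding overhead.)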



According to Windows, both the internal and the external monitor use the Intel card. Since games run fine on the internal monitor, I assume the physical connections go through the Intel card even when the GeForce card is doing the rendering.
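For reference, a quick way to list the adapters Windows has enumerated is a PowerShell query (a minimal sketch using the stock CIM cmdlets; the exact names reported depend on the driver):

    # List every video adapter Windows has enumerated, with its driver version
    Get-CimInstance Win32_VideoController |
        Select-Object Name, Status, DriverVersion

On an Optimus-style dual-GPU laptop like this one, both the Intel and the Nvidia adapter should appear in the list, even though Windows attributes both displays to the Intel card.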



How do I force the GeForce card to drive the external monitor?



(I'm not sure whether this is the right site for this question; please point me to the correct one if not!)







windows-10 graphics-card lenovo-laptop external-display






asked Nov 29 at 0:19 by Thomas

  • Have you tried disabling the internal graphics card in UEFI? The setting could be named something like IGP/IGPX or similar. Some manufacturers only include this option in their high-end gaming lineup.
    – Nordlys Jeger
    Nov 29 at 18:50










  • In UEFI it's only possible to disable the GeForce card; I can switch between "Intel only" and "Switchable". However, I've noticed that the GeForce card is in fact in use: if I check Task Manager while running games, only the GeForce card is doing work, even when I choose to use only the Intel card in the Nvidia Control Panel. This makes my problem even stranger, since the game should get the same fps when the same card is doing the rendering.
    – Thomas
    Nov 30 at 18:48
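    (To reproduce that check outside Task Manager, the driver's own nvidia-smi tool can poll the card. This is a sketch assuming a standard driver install, which places the tool in C:\Windows\System32:

        # Print GPU utilization and memory use once per second while the game runs
        nvidia-smi --query-gpu=utilization.gpu,memory.used --format=csv -l 1

    If utilization stays high on the external monitor too, the GeForce card really is doing the rendering and the slowdown is elsewhere in the pipeline.)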

















