Does it make sense to activate the internal GPU? i9 11900K advantages / disadvantages

pitderbeste
Level 8
Hi Guys,

Does it make sense to activate the internal GPU?

Are there more advantages or more disadvantages as a result?

Regards,
PiT

Silent_Scone
Super Moderator
Are you using the IGPU? If not, it can be disabled. Having it enabled whilst dormant will likely not have an adverse impact on performance, though.
13900KS / 8000 CAS36 / ROG APEX Z790 / ROG TUF RTX 4090

Silent Scone@ROG wrote:
Are you using the IGPU? If not, it can be disabled. Having it enabled whilst dormant will likely not have an adverse impact on performance, though.


If I may ask: I'm using a Z590-E and I can't find any option in the BIOS to disable the iGPU. Shouldn't it be available in every BIOS?

BigJohnny
Level 13
I'd leave it enabled. It's not going to tax your system unused, and if your GPU takes a dump you can just plug into the iGPU and keep rolling, instead of digging in the dark trying to enable it in the BIOS on a black screen.

BigJohnny wrote:
Id leave it enabled.


Its also one more piece that can be broken.

After 26 years of computer experience, I have never needed the onboard graphics.
I would disable it; it will always ask for additional drivers.

BigR2021 wrote:
Its also one more piece that can be broken.

After 26 Years of computer experience, I have never needed the onboard graphics.
I would disable it, it will always ask for additional drivers.


Just FYI, onboard graphics haven't been around for 26 years, at least not in the mainstream. There was limited use in the 90s in corporate dumb terminals, like the SPARC enhancement chipset from Weitek that wasn't on the CPU die. It didn't hit the mainstream on-die until 2011 with Intel's Sandy Bridge, with AMD's Llano APUs arriving the same year (AMD had bought ATI back in 2006). Unless my math is horrendously bad, that was about a decade ago.

If it breaks, then the CPU or mobo is fried and you have bigger fish to fry than the iGPU. In my machines that have onboard graphics I leave it enabled, and it never asks for updates, EVER. You can choose to leave driver updates out of Windows Update, which I've always done. Every few months I check driver versions and download from manufacturers' sites if necessary, but as a general rule, if it isn't broke, don't mess with it. If your PCIe GPU craps out and the iGPU is disabled, there isn't a thing you can do but pull the PCIe GPU, clear CMOS, lose all your settings, and hope that it's enabled by default. If it's not, then you are dead in the water until you get a replacement PCIe GPU. If you have it enabled, it's as simple as moving a cable, or if you have a monitor with dual inputs, run HDMI to the iGPU and just switch inputs on the monitor.

The vast majority of laptops use an iGPU; phones, iPads, etc. as well. They all seem to work just fine.

The iGPU is the dominant GPU used in PCs, and it is in 100% of all game consoles, 100% of all tablets and smartphones, and around 60% of all cars (with displays). So yeah, everything using an iGPU is broken. Pffffft.

https://www.electronicdesign.com/technologies/embedded-revolution/article/21135920/jon-peddie-resear...

If you have it enabled and are not using it, it does nothing and will not cause additional power draw or heat. If you have driver updates enabled in Windows, the worst thing it will do is update the drivers for it along with the Windows updates.
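Several replies here turn on whether the OS can even see the iGPU while a discrete card is installed. As a minimal sketch, here is one way to sort the video controllers the OS reports into integrated vs. discrete. The name-matching hints and the sample list are illustrative assumptions, not a definitive detection method; on Windows you would feed it the controller names reported by `Get-CimInstance Win32_VideoController`.

```python
# Sketch: split the video controllers the OS reports into integrated vs.
# discrete, to check whether the iGPU is currently exposed (i.e. enabled
# in the BIOS). The hint substrings below are illustrative, not complete.

INTEGRATED_HINTS = ("intel(r) uhd", "intel(r) hd graphics", "iris xe", "iris")

def classify_gpus(controller_names):
    """Return (integrated, discrete) lists of controller names."""
    integrated, discrete = [], []
    for name in controller_names:
        lowered = name.lower()
        if any(hint in lowered for hint in INTEGRATED_HINTS):
            integrated.append(name)
        else:
            discrete.append(name)
    return integrated, discrete

# Example input, hard-coded here rather than read from a real machine:
sample = ["Intel(R) UHD Graphics 750", "NVIDIA GeForce RTX 3080"]
igpu, dgpu = classify_gpus(sample)
print(igpu)  # non-empty only if the BIOS exposes the iGPU to the OS
print(dgpu)
```

If the integrated list comes back empty while the CPU has an iGPU, the BIOS most likely has it disabled.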

BigR2021 wrote:
Its also one more piece that can be broken.

After 26 Years of computer experience, I have never needed the onboard graphics.
I would disable it, it will always ask for additional drivers.

I haven't seen any onboard graphics for a decade;
we are talking about the iGPU that is on the CPU here.
If your GPU does not work or is not detected, it will turn on,
and there is no real harm in leaving it on.

BigJohnny wrote:
Just FYI, onboard graphics haven't been around for 26 years, at least not in the mainstream. There was limited use in the 90s in corporate dumb terminals, like the SPARC enhancement chipset from Weitek that wasn't on the CPU die. It didn't hit the mainstream on-die until 2011 with Intel's Sandy Bridge, with AMD's Llano APUs arriving the same year (AMD had bought ATI back in 2006). Unless my math is horrendously bad, that was about a decade ago.

If it breaks, then the CPU or MOBO is fried and you have bigger fish to fry than the IGPU. In my machines that have onboard graphics I leave it enabled and it never asks for updates, EVER. You can choose to leave driver updates out of windows update which I've always done. Every few months I check driver versions and download from manufactures sites if necessary but as a general rule if is isn't broke don't mess with it. If your PCIE GPU craps there isn't a thing you can do but pull the PCIE GPU, clear CMOS and lose all your settings and hope that by default its enabled. If its not then you are dead in the water until you get a replacement PCIE GPU. If you have it enabled its as simple as moving a cable or if you have a monitor with dual input run HDMI to the IGPU and just switch inputs on the monitor.

The vast majority of laptops use IGPU, Phones, Ipads etc as well. They all seem to work just fine.

The iGPU is the dominant GPU used in PCs, and it is in 100% of all game consoles, 100% of all tablets and smartphones, and around 60% of all cars (with displays), So yeah, everything using IGPU is broken. pffffft

https://www.electronicdesign.com/technologies/embedded-revolution/article/21135920/jon-peddie-resear...

If you have it enabled and not using it it does nothing and will not cause additional power draw or heat. If you have driver updates enabled in windows the worst thing it will do is update the drivers for it with the windows updates.


Onboard graphics have in fact been around for a long time;
what you are referring to is the iGPU, which has been around since Sandy Bridge (the first-gen Core "Clarkdale" chips actually had it on-package in 2010).

Actual onboard graphics were incorporated on the motherboard and were usually not that good (most of the time these were budget boards).
I recall that it was one of the parts on a motherboard that usually failed.
It could also be turned off in the BIOS, and good luck getting it on again if your GPU failed.
I have personally seen these on boards with Socket 7 / Socket 423 / Socket 478 / Socket AM2 / Socket AM3, just to name a few.

However, since the iGPU there is no need for onboard graphics to exist (since the GPU is now baked into the CPU).
The iGPU is also a lot more reliable (if it breaks, your CPU is also dead).
The only other issue I found would be if the display output connector is broken on the motherboard (but then the iGPU still works).
ASUS Maximus 13 Extreme
Intel i9 11900K
Corsair Dominator Platinum RGB
Phanteks Enthoo Primo
Samsung 980 Pro 1TB
2 X Samsung 980 Pro 2TB
Corsair AX1200I
EK-Velocity D-RGB - Nickel + Plexi
EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump)
XSPC RX480 + XSPC RS280
4 X Corsair SP120 PWM OEM Fans
2 X Noctua NF-A14 industrialPPC-2000 IP67 PWM
2 X Noctua NF-A14 PWM chromax.black.swap
2 X Noctua NF-S12A PWM chromax.black.swap
1 X Noctua NF-A15 HS-PWM chromax.black.swap

DjRavix wrote:
Onboard graphics have in fact been around for a long time;
what you are referring to is the iGPU, which has been around since Sandy Bridge (the first-gen Core "Clarkdale" chips actually had it on-package in 2010).

Actual onboard graphics were incorporated on the motherboard and were usually not that good (most of the time these were budget boards).
I recall that it was one of the parts on a motherboard that usually failed.
It could also be turned off in the BIOS, and good luck getting it on again if your GPU failed.
I have personally seen these on boards with Socket 7 / Socket 423 / Socket 478 / Socket AM2 / Socket AM3, just to name a few.

However, since the iGPU there is no need for onboard graphics to exist (since the GPU is now baked into the CPU).
The iGPU is also a lot more reliable (if it breaks, your CPU is also dead).
The only other issue I found would be if the display output connector is broken on the motherboard (but then the iGPU still works).


That's pretty much what I said. The first distinction was that chipset-based graphics were introduced in the 90s and the iGPU didn't make its debut until 2011, and yes, that's what I'm referring to as not being around for 26 years, only about a decade.

As for having it disabled, I can't speak for every board, but two that I have with iGPU capabilities will not turn it on if it's disabled in the BIOS, so you are flying blind. If it is enabled and you pull the PCIe GPU, it will work; just plug the cable into the HDMI port on the I/O backplane. I've already been down that road on two ASUS boards, a Z270 and a Z370, an Extreme and a Formula, and they both reacted the same way. If it's enabled, you can also use that port for a small display in the case. If it's disabled and there's no PCIe card, you are in the dark until you clear the CMOS, as it is enabled by default on my two boards. My wife uses one and my son uses the other. I'm on HEDT, which doesn't have an iGPU, but I ran both of those machines at one point, a 6700K and then a 7700K.

I haven't been building or repairing my own PC systems for six years now.
I leave that to the technician with the spare parts.
If the graphics card breaks, it can be replaced.
I'm not gambling with parts that may or may not work.
If it's disabled, you know it does not work.

The advantage of the 11900K's iGPU is that it includes the newest AV1 hardware decoder.

AV1 is said to be more efficient for video streaming than the current VP9 standard, and YouTube is said to be adopting mainly AV1 for future video compression. You can test the limits of current video decoders by streaming a 4K 60 fps or an 8K 24/60 fps YouTube video; check out the "Stats for nerds" overlay.

AV1 hardware decoding is currently only available on the new AMD Radeon 6000 series (RDNA 2), NVIDIA RTX 3000 series GPUs, and Intel Tiger Lake, Rocket Lake, and Alder Lake chips with an iGPU.
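The hardware support described above can be written down as a small lookup table. This is a sketch of the post's claims (plus the linked Wikipedia hardware list) as a snapshot circa 2021; the exact model strings are illustrative, and the table is neither exhaustive nor current.

```python
# AV1 hardware-decode support, per the generations named in the post.
# Snapshot circa 2021; model strings here are illustrative labels.
AV1_DECODE_SUPPORT = {
    "AMD Radeon RX 6000 (RDNA 2)": True,
    "NVIDIA GeForce RTX 3000 series": True,
    "Intel Tiger Lake iGPU (Xe)": True,
    "Intel Rocket Lake iGPU (UHD 750)": True,
    "Intel Alder Lake iGPU": True,
    # Older generations decode VP9 but not AV1 in hardware:
    "NVIDIA GeForce GTX 1000 series": False,
    "Intel UHD 630 (9th/10th gen)": False,
}

def supports_av1_decode(gpu: str) -> bool:
    """True if the named GPU decodes AV1 in hardware; unknown names -> False."""
    return AV1_DECODE_SUPPORT.get(gpu, False)

print(supports_av1_decode("Intel Rocket Lake iGPU (UHD 750)"))  # True
```

On an unsupported GPU the browser falls back to software AV1 decoding, which is what the "Stats for nerds" dropped-frames counter will expose at 4K/8K.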

I think your iGPU remains inactive unless you restart your computer with the HDMI or DisplayPort cable plugged into your motherboard. If your system is already running and you then plug into the motherboard's HDMI or DisplayPort output, you should not receive a signal. I think this indicates that the iGPU is sleeping and will only activate on the next restart, when it receives an input signal.

Similarly, you cannot use the iGPU to decode AV1 streams unless you plug a monitor into your motherboard's HDMI/DisplayPort output.

https://en.wikipedia.org/wiki/AV1#Hardware
https://en.wikipedia.org/wiki/Intel_Quick_Sync_Video#Hardware_decoding_and_encoding