3070 laptop wattage question.

Ragnaraz690
Level 11
Forgive me if this isn't the right place for this.

Recently on another forum I saw that Lenovo Legion owners managed to reflash their 140W 3070 with a 150W VBIOS; I can't remember if it was another Legion VBIOS or possibly an XMG VBIOS. To clarify, this was a 3070 VBIOS, not a 3070 Ti one. So this raises the question: why does the Ti model get those extra 10W over the stock version when the stock 3070 can easily take that extra 10W?

I know the Ti has more cores and such, but that makes little difference when the power budget is split between the GPU core and its VRAM. I ask because if the A15 can house and cool both the 3070 and the Ti model on the same cooling system, it stands to reason the stock 3070 could comfortably use that extra 10W too. I know 10W isn't a massive jump, but it could give the VRAM a little more stable OC room, or a little more headroom on the core itself. It's kind of questionable when you look at it that way. I would love to know the reasoning, or even to request a 150W VBIOS for 3070 laptops if possible. It's not like it will challenge the Ti model, due to physical differences, so there's no real reason not to.
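A rough sketch of the power-budget arithmetic behind the question. The VRAM draw figure here is a hypothetical assumption, just to illustrate how a fixed budget splits between core and memory; real numbers vary by board and workload:

```python
# Illustrative only: how a shared power limit (TGP) splits between core and VRAM.
# VRAM_DRAW below is a hypothetical assumption, not a measured value.

def core_budget(tgp_watts: float, vram_watts: float) -> float:
    """Power left for the GPU core once VRAM draw is subtracted from the TGP."""
    return tgp_watts - vram_watts

VRAM_DRAW = 20.0  # hypothetical GDDR6 draw under load, in watts

stock = core_budget(140.0, VRAM_DRAW)    # watts available to the core at 140W TGP
flashed = core_budget(150.0, VRAM_DRAW)  # watts available to the core at 150W TGP

# The extra 10W is a slightly larger relative gain for the core than for
# the package as a whole, since the VRAM share stays roughly fixed:
print(f"package gain: {(150.0 - 140.0) / 140.0:.1%}")   # 7.1%
print(f"core gain:    {(flashed - stock) / stock:.1%}")  # 8.3%
```

Under this (assumed) fixed VRAM draw, the whole extra 10W lands on the core, which is why even a small TGP bump can translate into noticeable OC headroom.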

xeromist
Moderator
Disclaimer: I don't work in the industry, so all of the following is secondhand or speculation.

Often, power and voltage differences come down to binning. Nvidia takes higher-quality silicon and sells it as the Ti version (a higher bin). The non-Ti version will have different binning in terms of expected clocks and required voltages (i.e. it might require more voltage and run hotter to reach the same clocks). Individual samples might work just fine, but as a whole Nvidia only guarantees them to work within a certain spec. Now, individual manufacturers could take it upon themselves to do additional engineering, validation, and binning to ensure the silicon can work at higher speeds. This is where you see things like a "Strix" version of the same silicon with higher performance. That costs money, so those products cost more.
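Binning can be pictured as sorting dies by the voltage each one needs to hit a target clock. This is a toy model with made-up thresholds and a made-up voltage distribution, just to show the sorting logic, not real Nvidia specs:

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical bin cutoffs, in millivolts needed to reach a target clock:
TI_BIN_MAX_MV = 850   # best dies: hit the target clock at <= 850 mV
NON_TI_MAX_MV = 950   # the rest that still qualify at <= 950 mV

# Simulate 1000 dies with normally distributed required voltage (made-up figures).
dies_mv = [random.gauss(880, 40) for _ in range(1000)]

ti_bin  = [v for v in dies_mv if v <= TI_BIN_MAX_MV]
non_ti  = [v for v in dies_mv if TI_BIN_MAX_MV < v <= NON_TI_MAX_MV]
rejects = [v for v in dies_mv if v > NON_TI_MAX_MV]

print(len(ti_bin), len(non_ti), len(rejects))
```

The point of the model: a die in the non-Ti bin might individually run fine at Ti-like power, but the only thing guaranteed is that it met the looser cutoff, which is why the spec sheet is conservative.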

On the flip side, manufacturers can buy silicon from Nvidia, run it at rated clocks and power, and keep RMA rates low without the cost of extra validations.

Or sometimes it's just product differentiation to encourage people to consider a more expensive model. Margins are often razor thin on the low end, so artificial segmentation allows companies to stay in business. An annoying but necessary evil.

At any rate, manufacturers will almost never increase power and thermal limits after a product has been purchased. There is only a very minor reputational benefit vs an unknown risk of increased failures and RMAs for products that have not been tested to work that way.

BTW, even if you KNOW the silicon can run at the higher wattage with no ill effects, it can still cause the fans to run at higher RPMs for longer, which increases the failure rate of the fans by some non-zero amount. On a higher-margin product the manufacturer might decide the extra margin covers those additional RMA costs; on a lower-margin product it might not. So the power target is about more than just silicon quality.
A bus station is where a bus stops. A train station is where a train stops. On my desk, I have a work station…