TiN wrote: You saw it first here
h2. Intro

This is not another GTX 1080 Ti review, but an extreme overclocking modding guide. There is not much benefit in doing all this for watercooling or aircooling, but it is required when the GPU is frozen to a chilly -196°C. A fair bit of extra voltage might help reach higher performance marvels.
[url]https://xdevs.com/doc/xDevs.com/ocg_1080ti/psm_face.jpg[/url]
[url]https://xdevs.com/doc/xDevs.com/ocg_1080ti/psm_back.jpg[/url]
This guide is an evolution of my usual "uncork" series, following the good traditions set by previous guides:

* Reference GTX 1080/1070 - Uncorking Pascal
* Removing power limits on any NV card
* Using Raspberry Pi to control your VGA VRM
* EVGA GTX 980 Ti K|NGP|N OC Guide
* Reference GTX 780/780Ti - Big KEPLER
* EVGA GTX 680 Classified - Uncorking guide
* Reference GTX 680 - Uncorking KEPLER

Let's study the plain reference EVGA GTX 1080 Ti, which is also called the "Founders Edition". This is the baseline card which we all get to know first, before going into any specific custom board designs.
As usual, you are on your own here, even when doing a little simple mod. If your card/system bursts into flames or simply stops working, do not try to RMA it or expect support from the manufacturer. Anything below this paragraph is *NOT* covered by any kind of warranty and is provided *AS IS* for education purposes only, without support from the manufacturer or NVIDIA. So yes, "don't try this at home". Any RMA attempt of a soldered card is easily diagnosed today and will be rejected.

It is interesting to see NVIDIA return to nerfing memory bus width by leaving memory chips unpopulated on a fully routed 384-bit memory bus on the PCB. That happened because the PCB is reused from the Titan X (Pascal) card. There is not much point in respinning a new PCB design, since both the Titan X Pascal and the GTX 1080 Ti use the same package and are essentially the same thing, just a different configuration of active blocks. This is not a new trick; we saw it years ago on cards like the GeForce 8800 GTS and GeForce 9600 GSO, among others. The photo below shows an old 8800 GTS card, which I had some 10 years ago. That card had a 320-bit bus, derived from the full 384-bit G80-based GeForce 8800 GTX design.
Image 3: Back to the future, EVGA GeForce 8800 GTS ACS3 KO with removed memory chips
And no, it's not unlockable even if you put BGA memory chips on the PCB. This is hinted at by the -K1 code on the GPU die, which indicates a disabled memory controller. The same *Kx* marking was present on the G80 above too.
h2. STEP 1 - Prep your bench

You will need a set of common tools for successful modding of the reference GTX 1080 Ti card, similar to any other VGA:
* The card itself. An EVGA GTX 1080 Ti FE is used here to play with.
* 1 x 5 KOhm multi-turn VR (trim pot) for the GPU vmod.
* 2 x 50 KOhm multi-turn VRs (trim pots) for the memory and PLL mods.
* 3 x 10 Ohm chip resistors, 0805 size.
* 25-45W soldering iron for trimpot mods, 80-120W for EPOWER.
* Roll of 28-32AWG insulated wire
* Soldering FLUX
* Kingpincooling TEK9 FAT
* DMM (I use a fancy handheld Fluke 87V and a high-performance Keithley 2002, but any $10 DMM can fit the needs of this guide)
* Low-ESR capacitors (2.5 or 4V rated, 680-820uF, etc.) if you like to juice things up, but not required.
The design is fairly similar to previous refresh x80 cards from NVIDIA, such as the GTX 780 Ti/980 Ti and the Titan series. We have the GPU in the center with eleven Micron GDDR5X memory ICs around it. The DVI interface is now gone to give space for a larger exhaust footprint. The GPU bears the GP102-350-Kx-A1 marking, which tells you it's the larger *102*-series processor (to remind you, the GTX 1080 is based on GP104, the Titan X Pascal uses GP102-400, and the pro Tesla P100 with HBM uses GP100), silicon revision A1. It was manufactured by TSMC in Taiwan. The GPU area is similar to previous generation processors, but it has many more transistor gates thanks to the much smaller 16nm FinFET process. There are some 0201 decoupling capacitors around the GPU die to help with power delivery. No hidden jumpers, traces or test points are visible elsewhere on the package, so nothing to worry about. The SLI and PCIe connectors look exactly the same as before, no magic there. For high 5K/8K resolutions and multi-monitor surround you might want to use the new SLI HB bridges, which connect both SLI fingers in a 2-way SLI setup.
h2. STEP P - Power limit overrides

Gamers and users often mistakenly refer to the 6-pin or 8-pin MiniFit.JR connectors as 75W or 150W capable inputs. Nothing could be further from the truth. These power levels are nothing but a way for NV to determine how capable the board hardware is in terms of its power delivery system. It's an imaginary target number and has nothing to do with the actual power taken from the connector, nor the power input capability. Software and the NV BIOS will manage GPU clocks and reduce voltages if the measured power hits the programmed BIOS limit (which can be, and usually is, a different value than 75/150W per connector!).
If you intend to do serious overclocking and benchmarking, it may be necessary to trick the power monitoring circuitry into reporting a lower power reading, so you don't run into power throttling. Also, to make sure we are not at any physical limit of the power connector itself, check the Molex 26-01-3116 specifications, which rate the contacts from 13A *per contact* (16AWG wire in the small connector) down to 8.5A per contact (18AWG wire).
This means that with a common 18AWG cable, the 6-pin connector is specified for 17A of current (3 contacts for +12V power, 2 contacts for GND return, one contact for detect), while the 8-pin is specified for 25.5A (3 contacts for +12V power, 3 contacts for GND return and 2 contacts for detection). That is 204W at the +12.0V level for the 6-pin, or 306W for the 8-pin.
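To make the arithmetic explicit, here is a quick back-of-the-envelope sketch using the per-contact ratings quoted above, assuming the smaller of the +12V and GND contact groups limits the total current:

[code]
# Rough connector headroom from Molex Mini-Fit Jr. per-contact ratings.
# Assumes 18AWG wiring (8.5A per contact) and that whichever side has
# fewer current-carrying contacts (+12V or GND) limits the total current.

RATING_18AWG = 8.5   # A per contact with 18AWG wire
V_RAIL = 12.0        # nominal +12V rail

connectors = {
    "6-pin": {"12V": 3, "GND": 2},   # plus 1 detect contact
    "8-pin": {"12V": 3, "GND": 3},   # plus 2 detect contacts
}

for name, pins in connectors.items():
    amps = RATING_18AWG * min(pins["12V"], pins["GND"])
    print(f"{name}: {amps:.1f} A -> {amps * V_RAIL:.0f} W at +12 V")

# 6-pin: 17.0 A -> 204 W at +12 V
# 8-pin: 25.5 A -> 306 W at +12 V
[/code]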
Now, if somebody tells you that a 6-pin can't provide more than 75W, you know they don't understand the topic very well. It's not the connector itself or the cable that limits the power, but active regulation by the GPU/BIOS/driver according to the detected cables and preprogrammed limits. So how is the actual power measured?
Just like the GTX 1080 FE, this card uses the already familiar Texas Instruments INA3221 sensor IC, a triple-channel monitor able to measure voltage, current and power on the +12VDC input rails using just a few external components and current shunts. A current shunt is a special type of resistor which generates a small but measurable voltage that is proportional to the amount of current flowing through it. Thus the card can detect power consumption in real time and adjust its clock speed and performance automatically to keep power within the specified envelope.
You can find the current shunts marked *RS1, RS2, RS3* on the PCB, with a black resistive element in the center and often an R002, R005 or 2M0/5M0 mark on top. Sometimes there are no marks, but the look is always very similar: a large rectangular flat part with two or four wide metal terminals. Usually these shunts are located very close to the input power connectors, before the main VRM circuitry. If we manage to reduce the voltage signal from these shunts (which is linearly dependent on current), then the reported power will be reduced as well. Don't do anything to the shunts themselves, just understand what they are for. Common "modifications" like applying liquid metal paste or using a pencil are a bad idea, as they do not provide a reliable and stable resistance change. Pascal cards also have protection against under-reporting, meaning the GPU will be stuck in a low power state if the reported power is zero or close to it.
Adding a 10 Ohm 0805-size resistor on top of every larger ceramic capacitor next to the INA3221 *U26* will reduce the reported power levels by roughly 3x. These chip resistors can be bought in a usual electronics shop, or online at retailers like Digikey or Mouser. In the worst case you can still use a regular through-hole resistor, but it's not as convenient and you risk damaging the caps or ripping traces off the board if you apply physical force to the joints.
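Why roughly 3x? My understanding (an assumption, not traced from this board) is that the added resistor sits across the INA3221 sense-input filter capacitor and forms a voltage divider with the series filter resistors feeding the sense pins. A sketch with assumed values:

[code]
# Assumed mechanism: the added 10 Ohm resistor across the sense filter cap
# divides the shunt signal against the series filter resistors on the
# INA3221 sense lines. The ~10 Ohm per-leg filter value below is inferred
# from the ~3x reduction, NOT measured on this PCB.

R_FILTER_PER_LEG = 10.0   # Ohm, assumed series resistor on each sense line
R_MOD = 10.0              # Ohm, the added 0805 resistor across the cap

attenuation = R_MOD / (R_MOD + 2 * R_FILTER_PER_LEG)
print(f"sense signal scaled to {attenuation:.2f} of original "
      f"-> reported power ~{1 / attenuation:.1f}x lower")
# sense signal scaled to 0.33 of original -> reported power ~3.0x lower
[/code]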
I don't show an actual photo of this mod, as it's nearly the same as in the [url=/guide/pascal_oc/]GTX 1080[/url] case. This modification is also very similar on all other NVIDIA cards, with a bit more detail covered here and here before.
h2. STEP G - GPU voltage trim-pot mod

The first and usually most worthwhile modification is the one to adjust GPU core voltage. Often (but not always!) extra voltage can help to get better stability and higher clocks, provided no temperature or power limitations are hit. The maximum voltage available through software-only control on reference cards is very limited, hence you may need a hardware modification to achieve higher voltages.
Prepare the VRs, in this case one 5 KOhm nominal multi-turn variable resistor (potentiometer). I like to use the very common blue square 3296 type. Many electronic component retailers have these available for sale, and it should be easy to find suitable parts at your local store.
Now remove the two resistors marked with a red box on the photo below to enable adjustment of the feedback sense.
Glue the 5 KOhm trimmer onto the PCB as shown on the photo below, to make sure it does not rip off or short anything later on. Sometimes I also use 2-position jumper pins to add the ability to disable the mod and get stock voltage on the fly. That can be useful for troubleshooting later. Two AWG32-28 wires will be needed to connect the trimmer to specific points.
Example location of all three Vcore, Vmem and Vpex trimmers
The middle pin connects to the top pad of the removed resistor. The other trimmer pin connects to ground. Make sure you set the resistor to its maximum value (resistance measured by DMM across both wires, not connected to anything), which should be around 5 KOhm. It is not a good idea to boot the card right at the maximum voltage! Solder the wires to the tiny spots near the GPU voltage controller (the uP9511P IC on the bottom side of the board).
Connection points of trimmer to raise GPU voltage
Since the traces and solder points on the PCB are very thin and easy to peel off, make sure you fix the wire to the PCB in a few places so it does not move. I found a few drops of cyanoacrylate superglue on areas free from components to be good enough. This will ensure the modification is safely secured and will not rip tiny components apart so easily.
You can easily go as high as 1.5+V with this kind of modification, so it is also suitable for extreme overclocking. The 7-phase VRM can provide plenty of power for the card, even when the GPU is pushed over 2400MHz.
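If you want a feel for how the trimmer setting maps to output voltage, here is a generic feedback-divider sketch. The reference and resistor values are made up for illustration (the actual uP9511P feedback network is not documented here), but the trend is the point: less trimmer resistance means more output voltage, which is why you start at maximum.

[code]
# Generic voltage-feedback mod behaviour: the regulator holds its feedback
# node at a fixed reference, so loading the divider's lower leg with a
# trimmer to ground forces the output up. Illustrative values only, NOT the
# real uP9511P network on this card.

V_REF    = 0.6      # V, assumed internal reference
R_TOP    = 1000.0   # Ohm, upper feedback resistor (illustrative)
R_BOTTOM = 1200.0   # Ohm, lower feedback resistor (illustrative)

def vout(trimmer_ohm):
    # trimmer to ground acts in parallel with the lower feedback resistor
    r_low = (R_BOTTOM * trimmer_ohm) / (R_BOTTOM + trimmer_ohm)
    return V_REF * (1 + R_TOP / r_low)

for r in (5000, 2000, 1000):   # dialling the 5 KOhm trimmer down
    print(f"trimmer {r:>4} Ohm -> Vout ~ {vout(r):.2f} V")
# the bump over stock is smallest at maximum resistance, so boot there first
[/code]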
p(#side_note). Note that GPU die resistance on Pascal GPUs is very low, unlike many previous GPU generations. I measured it at around 70-150 mOhm (0.07-0.15 Ohm), depending on the GPU sample. The measurement was done using the four-wire resistance method with an expensive high-accuracy Keithley Model 2002 DMM.
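Why four-wire? At these levels, ordinary test leads and probe contact can swamp the reading. A quick illustration with an assumed lead resistance:

[code]
# Why a four-wire (Kelvin) measurement is needed for a ~0.1 Ohm target:
# in a plain two-wire reading, lead and contact resistance add directly to
# the result. The lead resistance below is an assumed, illustrative value.

R_DIE   = 0.10   # Ohm, typical Pascal die resistance from the note above
R_LEADS = 0.25   # Ohm, assumed combined test-lead + contact resistance

two_wire = R_DIE + R_LEADS
error_pct = (two_wire - R_DIE) / R_DIE * 100
print(f"two-wire reads {two_wire:.2f} Ohm ({error_pct:.0f}% high); "
      "four-wire sensing removes the lead drop from the measurement")
[/code]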
If you plan to use a waterblock or a Kingpincooling TEK9 LN2 container as the GPU cooler, make sure you have a good 120mm fan over the VRM area to ensure good airflow and acceptable temperatures (best under 70°C) when overclocking. The 1080 Ti's PG611 reference PCB has all seven phases fully implemented for GPU power, with two PowerPhase arrays on each phase.
h2. STEP M - Memory voltage trim-pot mod

The memory VRM uses a single phase, with the same PowerPhase dual NFETs as the Vcore VRM, and a fixed 1.35V output for the GDDR5X memory chips.
Modified memory circuit to raise voltage
To raise the voltage, all you need to do is connect a 50 KOhm trimmer from the marked point to GND.
h2. STEP L - PLL voltage trim-pot mod

The PLL regulator is common and known from previous GPUs, so nothing new here. The PWM chip used is a uPI UP1628Q in a tiny DFN package. The modification to raise its voltage is straightforward and involves connecting a 50 KOhm trimmer between GND and the point shown on the photo above.
Connected trimmer wire and monitoring point
Monitoring of the PLL output voltage can be done by soldering a wire to the location of the three resistors near the bottom-left GPU fan-sink mounting hole.
The end result might look like this:

With the trim pots wired up. You can see the blue wire connecting ground at the unused power connector.
h2. STEP S - BONUS
Extreme overclockers often like to put some brutal bodges on their cards, so here are the locations to add more capacitors:
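As a rough idea of what the extra capacitance buys you, here is a small sketch of stacking low-ESR caps in parallel. The per-cap ESR is an assumed illustrative figure, not taken from this card's bill of materials:

[code]
# Stacking extra low-ESR capacitors on an output rail: capacitances add,
# while equivalent series resistance drops as 1/n. Illustrative values only.

C_EACH   = 820e-6   # F, 820 uF per added capacitor
ESR_EACH = 0.007    # Ohm, assumed 7 mOhm ESR per capacitor

for n in (1, 2, 4):
    print(f"{n} cap(s): C = {n * C_EACH * 1e6:.0f} uF, "
          f"ESR = {ESR_EACH / n * 1000:.1f} mOhm")
[/code]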
8-point mount of LN2 container on GTX 1080 Ti and probe points
It is best to have the full 8-point bracket to ensure good and even mounting pressure across all GPU sides. It's compatible with all TEK9 FAT and TEK9 SLIM containers and can be bought separately from the Kingpincooling.com shop.

Check again that everything is good: no shorts on the card, no damaged parts, no solder blobs or dirt. Make sure all trimpots are set to their proper values.
Assemble your cooling solution onto the card, test the voltages and give overclocking a spin using EVGA Precision XOC 16.
As usual, any feedback and questions are appreciated. Feel free to share a link to this guide, but keep links and references intact, as the guide is likely to be updated in the future.
The latest version of this guide is also mirrored at the source:
http://forum.kingpincooling.com/showthread.php?t=3961