Taking a leaf out of Intel's book, NVIDIA is implementing a frequency-boosting feature called GPU Boost, and it needs some explainin'. NVIDIA's previous GPUs have run at one set speed, determined by the class of card, with, in recent times, the shader-core operating at twice this base clock. Any GPU engineer will tell you that a modern GPU has multiple clock domains, but for our purposes, let's consider the GTX 580 to operate with a general clock of 772MHz and a shader-core speed of 1,544MHz. Now, the GTX 680's core and shader-core clock is 1,006MHz, but this is not the frequency it will operate at during most gaming periods. The GTX 680 has a dedicated microprocessor that amalgamates a slew of data - temperature and actual power-draw, to name but two - and determines if the GPU can run faster. Just like Intel, NVIDIA wants to boost performance when there is TDP headroom to do so, given conducive temperatures, etc.

You see, not all games stress a GPU to the same degree; we know this from looking at the power-draw figures for our own games. Titles such as Battlefield 3 don't exact a huge toll, with the GPU functioning at around 75 per cent of maximum power. Switch to 3DMark 11, however, and the benchmark is significantly more stressful - enough to hit the card's TDP limit. The point here is that the card's performance can be increased in many games without breaking through what NVIDIA believes to be a safe limit, which, by default, is the 195W TDP.
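To make the idea concrete, here's a toy sketch of how a GPU Boost-style controller could turn headroom into extra clock speed. This is purely illustrative: the step size, step count, temperature limit and the headroom heuristic are all our own assumptions, not NVIDIA's actual logic, which runs on the card's dedicated microprocessor and isn't publicly documented. Only the 1,006MHz base clock and 195W TDP come from the card's specifications.

```python
# Toy model of a GPU Boost-style decision, for illustration only.
# BOOST_STEP_MHZ, MAX_BOOST_STEPS and the temperature limit are
# hypothetical values we've chosen; the real controller's behaviour
# is not publicly specified.

BASE_CLOCK_MHZ = 1006   # GTX 680 base (core/shader) clock
TDP_LIMIT_W = 195       # GTX 680 default power limit
BOOST_STEP_MHZ = 13     # assumed boost granularity
MAX_BOOST_STEPS = 8     # assumed cap on the boost range

def boost_clock(power_draw_w: float, temperature_c: float,
                temp_limit_c: float = 98.0) -> int:
    """Return a clock in MHz: raise it while there is power and
    thermal headroom, otherwise hold the base clock."""
    if power_draw_w >= TDP_LIMIT_W or temperature_c >= temp_limit_c:
        return BASE_CLOCK_MHZ
    # Convert spare watts into boost steps (toy heuristic:
    # one step per 5W of headroom, capped).
    headroom_w = TDP_LIMIT_W - power_draw_w
    steps = min(MAX_BOOST_STEPS, int(headroom_w // 5))
    return BASE_CLOCK_MHZ + steps * BOOST_STEP_MHZ
```

With a Battlefield 3-like load at roughly 75 per cent of the power limit, the model boosts well above base; a 3DMark 11-like load that pins the card at its 195W TDP gets the base clock and nothing more.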