As technology progresses and components such as modern Central Processing Units (CPUs) continue to climb in speed, heat production is a topic that should not be overlooked. Although manufacturers constantly refine their fabrication processes, which brings down the power a CPU requires, heat output remains a real concern.
The heat produced by a CPU is essentially wasted power, dissipated as energy travels through its circuitry. As computers evolve (roughly doubling in speed every 18 months, per Moore's Law), so do the components and technologies used to design and produce them. These improvements help reduce both inefficient power loss and overall power usage.
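Why do lower voltages and finer processes reduce wasted power? A common back-of-the-envelope model is the CMOS dynamic power formula, P ≈ C·V²·f, where C is the effective switched capacitance, V the supply voltage, and f the clock frequency. The sketch below uses illustrative numbers (not measurements of any real chip) to show how strongly supply voltage affects power draw:

```python
def dynamic_power(capacitance_f: float, voltage_v: float, frequency_hz: float) -> float:
    """Approximate CMOS dynamic power in watts: P = C * V^2 * f.

    All inputs are illustrative assumptions, not data for a specific CPU.
    """
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Hypothetical chip: 20 nF effective switched capacitance, 3 GHz clock.
p_high = dynamic_power(20e-9, 1.2, 3e9)  # at 1.2 V
p_low = dynamic_power(20e-9, 1.0, 3e9)   # at 1.0 V

print(f"1.2 V: {p_high:.1f} W")
print(f"1.0 V: {p_low:.1f} W")
```

Because voltage enters the formula squared, even a modest drop from 1.2 V to 1.0 V cuts dynamic power (and thus heat) by roughly 30% at the same clock speed, which is why process shrinks that allow lower voltages matter so much.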
In practice, however, cooling down current chips is only a byproduct of these new technologies. The manufacturers' main goal is preparing for future higher-frequency chips and adding features to new cores. As time trickles by, the lowest-end chip will once again be a high-frequency part, producing more heat than the original lower-clocked models.
If you think that just because you didn't buy the top-of-the-line CPU you're not producing a lot...