
Power dissipation on chip vs. overall power dissipation


I was reading this PDF on computation from MIT's OpenCourseWare, and on p. 8, through some rather opaque mathematical manipulation, they basically reach the conclusion that the cooler you run your chips, the less heat they dissipate and the higher the maximum integration density can be. This seems intuitively logical to me, but their last statement piqued my interest:

At lower temperatures, the power dissipation on chip is decreased, but the overall power dissipation actually increases due to the requirement for refrigeration.
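(For context, one standard result with exactly this temperature dependence is Landauer's bound: erasing one bit of information dissipates at least $kT \ln 2$, which at $T = 300\ \text{K}$ is about $2.9 \times 10^{-21}\ \text{J}$ per bit. I assume something like this underlies their p. 8 calculation, though I may be misreading it.)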

For their actual calculation they used room temperature, so I'm assuming they mean that if we try to cool the chip to any temperature below room temperature, it would be self-defeating because the cooling itself would increase the overall power dissipation.
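If I'm reading the statement correctly, the point is the Carnot limit on refrigeration (the numbers below are my own illustration, not from the PDF). To pump heat $Q_c$ out of a chip at temperature $T_c$ into a room at temperature $T_h$, even an ideal refrigerator needs work

$$W = Q_c\left(\frac{T_h}{T_c} - 1\right),$$

so the total heat dumped into the room is $Q_c + W = Q_c \, T_h / T_c > Q_c$. For example, holding a 100 W chip at $T_c = 150\ \text{K}$ in a $T_h = 300\ \text{K}$ room takes at least another 100 W of work, for 200 W of overall dissipation, and real coolers fall well short of the Carnot bound.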

But I don't understand this argument. You could say the same about refrigerators. Why does it matter that the overall power dissipation increases? If you cool the chips, you can increase the integration density, which leads to more powerful computers. Why does it matter that the heat in other parts of the system increases? I might be wrong about this, but isn't that how refrigerators work too? It's just the laws of thermodynamics.

So an answer to this question would either clarify the above statement or confirm that it's misguided. Why shouldn't we cool chips to low temperatures in order to increase the maximum integration density?

