They don't get used to being overclocked... that's just silly.
By the same token, could one say that running a 100-watt light bulb at 60 watts will eventually make the bulb unable to output 100 watts?
And why stop at while power is applied? Heck, if you leave a CPU untouched for years, shouldn't it get used to having no power running through it? What metric is used to determine "when" the chip's "power memory" is being affected?