It doesn't make a bit of difference: the compiled code is the same, unless the variable is then reused in a for, like this:
int i=0;
for(int i=0;i<5;i++)
{
    //code
}
Except in that case the inner definition of i simply shadows the "outer" definition of i (and in many languages you get a duplicate-definition error instead, so you can't do it anyway).
The example is silly because the if is redundant as well.
Neither one is easier to read, and there is no reason to prefer one over the other, even in the apparently magical, whimsical fairyland of "production", where the requirements somehow change to demand that code adhere to redundant rulesets based on a single person's subjective view.