C# - Dividing by 2 vs Multiplying by 0.5


Consider the following:

void foo(int start, int end) {
    int mid = (start + end) / 2;
}

void bar(int start, int end) {
    int mid = (start + end) * 0.5;
}

Why does foo compile while bar does not? Dividing by 2 keeps the result as an int, while multiplying by 0.5 produces a double that is never implicitly cast back:

Cannot implicitly convert type 'double' to 'int'. An explicit conversion exists (are you missing a cast?)

What was the C# language designers' reasoning behind this?

The / operator performs integer division when both operands are integers (5 / 3 = 1). To get a floating-point division, at least one of the operands must be a floating-point type (float or double). The reason is that there are cases where an application wants access to the quotient or the remainder of a division (for the remainder, use %). Also, integer division is faster than a floating-point one.
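A minimal sketch of those three cases (integer quotient, remainder via %, and floating-point division once one operand is a double):

using System;

class DivisionDemo
{
    static void Main()
    {
        int quotient = 5 / 3;        // both operands are int: integer division, result 1
        int remainder = 5 % 3;       // % yields the remainder, result 2
        double ratio = 5 / 3.0;      // one operand is double: floating-point division, ~1.6667

        Console.WriteLine($"{quotient} {remainder} {ratio}");
    }
}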

On the other hand, multiplying by a float or double always gives a floating-point result. To store it in an integer type, you have to cast it yourself. Floating-point values have a different representation in memory, and the conversion can lead to a loss of precision.
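For example, here is one way the bar method from the question could be made to compile (a sketch, adapted to a static method; the explicit cast simply truncates the fractional part):

class MidpointDemo
{
    // bar from the question, adapted with an explicit cast so it compiles.
    static int Bar(int start, int end)
    {
        // (start + end) * 0.5 is a double; the cast truncates it back to an int.
        return (int)((start + end) * 0.5);
    }

    static void Main()
    {
        System.Console.WriteLine(Bar(2, 7));   // prints 4, same as (2 + 7) / 2
    }
}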

It is the same in most programming languages: most of them have both integer division and floating-point division, more often than not using the same operator. Statically typed languages require an explicit cast from floating-point to integral types.

