@ Rudi/Matt
Take a look at this function:
# Throwing dice. value = chance, max - limits
# if dice(60): means probability of 60%
def dice(value, max=100, show=True):
    number = random.randrange(0, max + 1)
    if number <= value:
        result = True
    else:
        result = False
    if config.developer and show:
        notify(u"Resulted in %d from %d, limit - %d, and result - %s." % (number, max, value, str(result)))
    return result
This is from Alkion (written by Roman, not me), but it seems flawed: random.randrange(0, max+1) actually picks from 101 numbers (0 through 100), not 100. Can any of you confirm this? I ran it on pure Python 2.7 and it seems to agree with me.
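If you want to see it without pulling in Alkion's config/notify, something like this works on plain Python (a sampling sketch rather than a proof, but with a million draws you will all but certainly hit every possible value):

import random

# Sample the same expression the function uses with max=100 and
# record every distinct value that comes back.
seen = set(random.randrange(0, 100 + 1) for _ in range(1000000))
print("%d distinct values, min %d, max %d" % (len(seen), min(seen), max(seen)))
# Expect: 101 distinct values, min 0, max 100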
Code should be:
# Throwing dice. value = chance, max - limits
# if dice(60): means probability of 60%
def dice(value, max=100, show=True):
    number = random.randrange(1, max + 1)
    if number <= value:
        result = True
    else:
        result = False
    if config.developer and show:
        notify(u"Resulted in %d from %d, limit - %d, and result - %s." % (number, max, value, str(result)))
    return result
Otherwise dice(1) actually means roughly a 2% chance (2/101, to be exact), because number can be anything in the range 0 to 100 (BUT INCLUDING 0 as well, and 0 always passes number <= value), and the same applies to every other throw. What do you think?
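You don't even need randomness for the exact numbers; just enumerate what randrange can return in each version (my own back-of-the-envelope check, not code from Alkion):

# Old version: randrange(0, 101) -> 0..100, i.e. 101 possible outcomes.
# New version: randrange(1, 101) -> 1..100, i.e. 100 possible outcomes.
old_outcomes = range(0, 101)
new_outcomes = range(1, 101)

# dice(1) succeeds when number <= 1.
hits_old = sum(1 for n in old_outcomes if n <= 1)  # hits on 0 and 1
hits_new = sum(1 for n in new_outcomes if n <= 1)  # hits on 1 only

print("old: %d/%d = %.2f%%" % (hits_old, len(old_outcomes), 100.0 * hits_old / len(old_outcomes)))
print("new: %d/%d = %.2f%%" % (hits_new, len(new_outcomes), 100.0 * hits_new / len(new_outcomes)))
# old: 2/101 = 1.98%
# new: 1/100 = 1.00%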