What does counts[d1 + d2] += 1 really do? Here is my code:
// requires: import java.util.Random;
// called as: simulate(10)
public static void simulate(int rolls) {
    Random rand = new Random();
    int[] counts = new int[13];              // indices 0-12; only 2-12 are ever used
    for (int k = 0; k < rolls; k++) {
        int d1 = rand.nextInt(6) + 1;        // first die, 1-6
        int d2 = rand.nextInt(6) + 1;        // second die, 1-6
        System.out.println(d1 + "+" + d2 + "=" + (d1 + d2));
        counts[d1 + d2] += 1;                // tally this sum
    }
    for (int k = 2; k <= 12; k++) {
        System.out.println(k + "'s=\t" + counts[k] + "\t" + 100.0 * counts[k] / rolls);
    }
}
d1 and d2 are two random numbers, each in the range 1-6, so their sum d1 + d2 is always between 2 and 12. That sum is used directly as an index into counts (which has 13 slots, indices 0-12, so slots 0 and 1 are simply never touched). counts[d1 + d2] += 1 is shorthand for counts[d1 + d2] = counts[d1 + d2] + 1: it increments the tally stored at that index, so after the loop counts[k] holds how many times the sum k was rolled.
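To make the indexing concrete, here is a minimal, self-contained sketch (the class name TallyDemo and the hard-coded sums are made up for illustration) showing that the += line just reads the slot, adds 1, and writes it back:

public class TallyDemo {
    public static void main(String[] args) {
        int[] counts = new int[13];   // every slot starts at 0

        // pretend three rolls produced these sums
        int[] sums = {7, 4, 7};

        for (int sum : sums) {
            // counts[sum] += 1 is equivalent to:
            counts[sum] = counts[sum] + 1;
        }

        System.out.println(counts[4]); // 1 -> one roll summed to 4
        System.out.println(counts[7]); // 2 -> two rolls summed to 7
        System.out.println(counts[2]); // 0 -> that sum never came up
    }
}

In other words, the array is acting as a simple frequency table, which is why the second loop in your code can print counts[k] and the percentage for every possible sum from 2 to 12.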