Your factory has a machine for drilling holes in a sheet metal part. The holes it drills have a mean diameter of 10mm with a standard deviation of 0.1mm.
What is the probability that any single hole will have diameter between 9.9mm and 10.1mm?
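The interval 9.9mm to 10.1mm spans one standard deviation on either side of the mean. Assuming the hole diameters are normally distributed (the problem gives only a mean and standard deviation; normality is an assumption), the probability can be computed with Python's standard library:

```python
from statistics import NormalDist

# Hole diameters modeled as N(mean=10 mm, sd=0.1 mm); normality is assumed
holes = NormalDist(mu=10, sigma=0.1)

# P(9.9 < X < 10.1) = P(-1 < Z < 1): one standard deviation either side
p = holes.cdf(10.1) - holes.cdf(9.9)
print(round(p, 4))  # 0.6827, the familiar "68%" of the empirical rule
```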
If you perform quality testing on samples of 100 parts coming off this machine, what do you expect the mean, standard deviation, and shape of the distribution of the average hole diameter in the quality samples to be? What justifies these assumptions?
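By the Central Limit Theorem, the sample mean of n = 100 diameters has mean 10mm, standard error 0.1/√100 = 0.01mm, and an approximately normal shape regardless of the underlying diameter distribution at this sample size. A simulation sketch illustrating this (sample sizes and trial count are illustrative choices):

```python
import random
import statistics

random.seed(42)  # reproducible illustration

n = 100          # parts per quality sample
trials = 10_000  # number of quality samples simulated

# Simulate the average diameter of each quality sample,
# drawing individual holes from N(10, 0.1)
sample_means = [
    statistics.fmean(random.gauss(10, 0.1) for _ in range(n))
    for _ in range(trials)
]

print(statistics.fmean(sample_means))  # close to 10 mm
print(statistics.stdev(sample_means))  # close to 0.1 / sqrt(100) = 0.01 mm
```

Plotting a histogram of `sample_means` would show the approximately normal (bell-curve) shape the CLT predicts.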
What is the probability that a sample of 64 holes will have an average diameter between 9.99mm and 10.01mm?
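For n = 64, the standard error of the sample mean is 0.1/√64 = 0.0125mm, so the bounds 9.99mm and 10.01mm sit 0.8 standard errors from the mean. A sketch of the calculation (again relying on the CLT for the shape of the sampling distribution):

```python
from math import sqrt
from statistics import NormalDist

n = 64
se = 0.1 / sqrt(n)                  # standard error = 0.0125 mm
xbar = NormalDist(mu=10, sigma=se)  # CLT: sampling distribution of the mean

# P(9.99 < mean < 10.01) = P(-0.8 < Z < 0.8)
p = xbar.cdf(10.01) - xbar.cdf(9.99)
print(round(p, 4))  # 0.5763
```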