Suppose it is known that the IQ scores of a certain population of adults are approximately normally distributed with a standard deviation of 15. A simple random sample of 25 adults drawn from this population had a mean IQ score of 105.
a) Is there evidence at the 5% significance level that the average IQ in this population is not equal to 100?
Please also explain how you got the critical value.
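In case it helps, here is a minimal sketch of how this could be worked out in Python (assuming SciPy is available). Because the population standard deviation σ = 15 is known, this is a two-sided one-sample z-test, and the critical value is the 97.5th percentile of the standard normal, since α/2 = 0.025 falls in each tail.

```python
from scipy.stats import norm

# Known values from the question
sigma = 15      # population standard deviation
n = 25          # sample size
xbar = 105      # sample mean
mu0 = 100       # hypothesized mean under H0
alpha = 0.05    # significance level

# Test statistic: z = (xbar - mu0) / (sigma / sqrt(n))
z = (xbar - mu0) / (sigma / n**0.5)   # = 5 / 3 ≈ 1.667

# Critical value: the point leaving alpha/2 = 0.025 in the upper tail
# of the standard normal, i.e. its 97.5th percentile
z_crit = norm.ppf(1 - alpha / 2)      # ≈ 1.96

# Two-sided p-value
p = 2 * (1 - norm.cdf(abs(z)))        # ≈ 0.096

print(f"z = {z:.3f}, critical value = ±{z_crit:.3f}, p = {p:.4f}")
# Since |z| ≈ 1.667 < 1.96 (equivalently p ≈ 0.096 > 0.05),
# we fail to reject H0 at the 5% level.
```

So under these assumptions there is not enough evidence at the 5% level to conclude the average IQ differs from 100.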
Thanks!!!