Best Answer

Suppose the degree of confidence is (1 - α) × 100%.

The upper confidence limit for the mean is

xbar + t(α/2, n-1) · S/√n = 502

Here xbar = 487, S = 48 and n = 100. Since n - 1 = 99 is large, t(α/2, 99) ≈ z(α/2), the standard normal critical value.

Thus 487 + z(α/2) · 48/√100 = 502,

or z(α/2) = 15 / 4.8 = 3.125.

From a normal probability table, P(Z > 3.125) = 0.0009, so α = 2 × 0.0009 = 0.0018 and 1 - α = 0.9982.

∴ We can assert with 99.82% confidence that the true mean weekly salary lies between Rs 472 and Rs 502.
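As a quick numerical check, here is a minimal Python sketch (standard library only; the variable names are mine) that recovers the same confidence level from the sample statistics, using the same normal approximation the answer makes:

from math import erf, sqrt

# Sample statistics from the problem above.
xbar, s, n = 487.0, 48.0, 100
upper = 502.0               # stated upper confidence limit

se = s / sqrt(n)            # standard error: 48 / 10 = 4.8
z = (upper - xbar) / se     # implied critical value: 15 / 4.8 = 3.125

# Two-sided confidence level: P(|Z| < z) = erf(z / sqrt(2)).
confidence = erf(z / sqrt(2))
print(f"z = {z:.3f}, confidence = {confidence:.2%}")  # about 99.82%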

Wiki User
12y ago

Q: A random sample of 100 teachers in a large metropolitan area revealed a mean weekly salary of Rs 487 with standard deviation Rs 48. With what degree of confidence can you assert that the average weekly salary lies between Rs 472 and Rs 502?
Related questions

What happens to the confidence interval as the standard deviation of a distribution increases?

The standard deviation appears in the numerator of the margin-of-error calculation, so as the standard deviation increases, the margin of error increases and the confidence interval gets wider.
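For illustration, a small Python sketch (the helper name and the 95% critical value 1.96 are assumptions for the example, not from the answer above) showing the margin of error growing in direct proportion to the standard deviation:

from math import sqrt

def margin_of_error(s, n, z=1.96):
    # Margin of error for a mean at roughly 95% confidence (z = 1.96).
    return z * s / sqrt(n)

n = 100
for s in (10, 20, 40):
    print(s, round(margin_of_error(s, n), 2))
# Doubling s doubles the margin: 1.96, 3.92, 7.84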


Is it true that the larger the standard deviation the wider the confidence interval?

Yes. With the sample size and confidence level held fixed, a larger standard deviation produces a larger margin of error and therefore a wider confidence interval.


What is the standard error in a confidence interval?

The standard error is the standard deviation associated with a statistic and its sampling distribution; a confidence interval is constructed as the statistic plus or minus a multiple of its standard error.


What happens to a confidence interval if the sample size and the population standard deviation increase simultaneously?

An increase in sample size narrows the confidence interval, while an increase in standard deviation widens it. Because the width is proportional to the standard deviation divided by the square root of the sample size, the net effect depends on which grows faster: if the square root of the sample size grows faster than the standard deviation, the interval narrows; otherwise it widens.
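A short sketch of that interplay, assuming an approximate 95% z-interval whose full width is 2 · z · s/√n (the numbers are illustrative only):

from math import sqrt

def ci_width(s, n, z=1.96):
    # Full width of an approximate 95% z-interval: 2 * z * s / sqrt(n).
    return 2 * z * s / sqrt(n)

print(round(ci_width(s=10, n=100), 2))  # 3.92
print(round(ci_width(s=20, n=400), 2))  # 3.92 -- doubling s while
                                        # quadrupling n leaves the
                                        # width unchanged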


What happens to the confidence interval as the standard deviation of a distribution decreases?

The interval becomes narrower: the margin of error shrinks in direct proportion to the standard deviation.


The percentage that is one standard deviation away from mean?

For normally distributed data, about 68.27% of values lie within one standard deviation (1σ) of the mean; about 31.73% lie outside, a ratio of roughly 1 in 3.15.
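These coverage figures can be reproduced from the standard normal identity P(|Z| < k) = erf(k/√2); a minimal Python sketch:

from math import erf, sqrt

# P(|Z| < k) for a standard normal variable, via the error function.
for k in (1, 2, 3):
    inside = erf(k / sqrt(2))
    print(f"{k} sigma: {inside:.4%} inside, {1 - inside:.4%} outside")
# 1 sigma: ~68.27% inside, ~31.73% outside (about 1 in 3.15)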


How do sample size confidence level and standard deviation affect the margin of error?

Increasing the sample size decreases the margin of error (it shrinks in proportion to the square root of the sample size); increasing the confidence level or the standard deviation increases it.


What effect will increasing only the population standard deviation have on the width of the confidence interval?

It will make it wider.


How do you calculate the parameter to a 99.9 confidence interval using mean and standard deviation?

Did you mean, "How do you calculate a 99.9% confidence interval for a parameter using the mean and the standard deviation?"? The parameter is the population mean μ. Let xbar and s denote the sample mean and the sample standard deviation. For a large sample, a 99.9% confidence interval for μ runs from xbar - 3.29 s / √n to xbar + 3.29 s / √n, where n is the sample size. The critical value 3.29 comes from a normal probability table (it cuts off 0.05% in each tail); for a small sample, use the t distribution with n - 1 degrees of freedom instead.
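A minimal Python sketch of this formula (the function name is mine, and the example numbers are illustrative, borrowed from the main answer above):

from math import sqrt

def ci_999(xbar, s, n, z=3.2905):
    # Two-sided 99.9% z-interval; z(0.0005) is about 3.2905.
    half = z * s / sqrt(n)
    return xbar - half, xbar + half

low, high = ci_999(xbar=487, s=48, n=100)
print(round(low, 1), round(high, 1))  # about 471.2 and 502.8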


As standard deviation increases what happens to the sample size in order to achieve a specified level of confidence?

It increases. For a fixed margin of error E and confidence level, the required sample size is n = (z · σ / E)², so the sample size grows with the square of the standard deviation; see the sketch below.
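A small sketch of that relationship, assuming the usual large-sample formula n = (z · σ / E)² with a 95% critical value (names and numbers are illustrative):

from math import ceil

def required_n(sigma, margin, z=1.96):
    # Smallest n with z * sigma / sqrt(n) <= margin (~95% level).
    return ceil((z * sigma / margin) ** 2)

print(required_n(sigma=10, margin=2))  # 97
print(required_n(sigma=20, margin=2))  # 385 -- doubling sigma roughly
                                       # quadruples the required n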


The t distribution is used to construct confidence intervals for the population mean when the population standard deviation is unknown?

True. When the population standard deviation is unknown and is estimated by the sample standard deviation, the t distribution with n - 1 degrees of freedom is the appropriate basis for the interval, especially for small samples.


If the standard deviation is doubled what will be the effect on the confidence interval?

The width of the confidence interval doubles, because the margin of error is directly proportional to the standard deviation.