It goes up.
You probably mean the confidence interval. When you construct a confidence interval, it has a percentage coverage that is based on assumptions about the population distribution. If the population distribution is skewed, there is reason to believe that (a) the statistics upon which the interval is based (namely the mean and standard deviation) may well be biased, and (b) the interval will not cover the population value as accurately or symmetrically as expected.
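A minimal simulation sketch of that coverage problem, assuming a right-skewed lognormal population and a nominal 95% t-interval (the population, sample size, and number of repetitions are all made up for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_mean = np.exp(0.5)        # mean of a lognormal(0, 1) population
n, reps, covered = 20, 10_000, 0

for _ in range(reps):
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=n)
    m, se = sample.mean(), sample.std(ddof=1) / np.sqrt(n)
    t_crit = stats.t.ppf(0.975, df=n - 1)
    lo, hi = m - t_crit * se, m + t_crit * se
    covered += lo <= true_mean <= hi

print(covered / reps)   # typically noticeably below the nominal 0.95 for skewed data
```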
Confidence intervals may be calculated for any statistic, but the most common statistics for which CIs are computed are the mean, a proportion, and the standard deviation. I have included a link which contains a worked example for the confidence interval of a mean.
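In case the link is unavailable, here is a minimal sketch of a 95% confidence interval for a mean, using a small made-up sample:

```python
import numpy as np
from scipy import stats

data = np.array([4.2, 5.1, 4.8, 5.6, 4.9, 5.3, 4.7, 5.0])  # hypothetical sample
n = len(data)
mean = data.mean()
se = data.std(ddof=1) / np.sqrt(n)        # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)     # two-sided 95% critical value

print(mean - t_crit * se, mean + t_crit * se)  # lower and upper limits
```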
The expected value is one of the parameters that characterizes a probability distribution. If a hypothesized expected value falls outside the confidence interval, then the result is significant at the corresponding level.
Why a confidence interval is useful.
The parameters of the underlying distribution, plus the standard error of observation.
The standard deviation is used in the numerator of the margin of error calculation. As the standard deviation increases, the margin of error increases, and therefore the confidence interval gets wider.
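A quick sketch of that effect, holding the sample size and confidence level fixed and varying only the standard deviation (the numbers are made up):

```python
import math

def margin_of_error(sd, n, z=1.96):
    """Margin of error for a z-based interval: z * sd / sqrt(n)."""
    return z * sd / math.sqrt(n)

for sd in (5, 10, 20):
    # doubling the standard deviation doubles the full interval width
    print(sd, round(2 * margin_of_error(sd, n=100), 3))
```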
The confidence interval is not directly related to the mean.
No.
Increasing the sample size will narrow the confidence interval, while increasing the standard deviation will widen it. The width is not a linear function of either, so the overall effect requires a calculation based on the values before and after the changes: the width is roughly proportional to the standard deviation divided by the square root of the sample size. If the sample size grows faster, in that sense, than the standard deviation, the confidence interval will narrow; conversely, if the standard deviation grows faster than the square root of the sample size, the confidence interval will widen.
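Since the width is roughly proportional to s/sqrt(n), a small sketch comparing before-and-after values makes the competing effects concrete (the numbers are purely illustrative):

```python
import math

def width(sd, n, z=1.96):
    """Full width of a z-based confidence interval for a mean."""
    return 2 * z * sd / math.sqrt(n)

# Standard deviation doubles but the sample size quadruples: width is unchanged.
print(width(sd=10, n=100), width(sd=20, n=400))

# Standard deviation doubles while the sample size only doubles: width grows.
print(width(sd=10, n=100), width(sd=20, n=200))
```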
It will make it wider.
Never!
Yes.
The formula for the margin of error is Z* × (standard deviation) / sqrt(N), so as N increases, the margin of error decreases. Here N went from 100 to 5000, an increase of 4,900, so the margin of error decreases. Since the confidence interval is the mean plus or minus the margin of error, a smaller margin of error means a narrower confidence interval.
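A short sketch of that calculation, assuming a standard deviation of 10 and Z* = 1.96 (both made up, since the question gives only the sample sizes):

```python
import math

def margin_of_error(z, sd, n):
    """Margin of error: Z* times the standard deviation over sqrt(N)."""
    return z * sd / math.sqrt(n)

for n in (100, 5000):
    print(n, round(margin_of_error(1.96, 10, n), 3))
# The margin shrinks from about 1.96 to about 0.277, so the interval narrows.
```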
The confidence interval will widen. How much it widens depends on whether the underlying probability model is additive or multiplicative.
A confidence interval uses the entire data series, through its mean and standard deviation, to fix a band around a parameter estimated from the present data, whereas a prediction interval is a band for an independent, future value.
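A minimal sketch of the contrast, using a made-up sample: the confidence interval bounds the mean, while the prediction interval bounds a single new observation and is therefore wider.

```python
import numpy as np
from scipy import stats

data = np.array([12.1, 11.8, 12.5, 12.0, 12.3, 11.9, 12.4, 12.2])  # hypothetical
n, mean, sd = len(data), data.mean(), data.std(ddof=1)
t_crit = stats.t.ppf(0.975, df=n - 1)

ci_half = t_crit * sd / np.sqrt(n)          # half-width for the mean
pi_half = t_crit * sd * np.sqrt(1 + 1 / n)  # half-width for one future value

print("confidence interval:", mean - ci_half, mean + ci_half)
print("prediction interval:", mean - pi_half, mean + pi_half)
```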