There is no relationship in the direction you've asked. I suspect you framed the question backwards, but to answer it literally: none. A confidence interval can in no way, shape, or form influence a standard deviation. There is, however, a relationship running the other way: the standard deviation helps determine the width of a confidence interval.
It goes up: other things being equal, a larger standard deviation produces a wider confidence interval.
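As a rough illustration of that scaling (my own sketch, not part of the original answer), the half-width of a z-based interval is proportional to the standard deviation. The function name `ci_half_width` and the sample values below are made up for the example, and scipy is assumed to be available:

```python
import math

from scipy import stats

def ci_half_width(s: float, n: int, confidence: float = 0.95) -> float:
    """Half-width of a z-based confidence interval for a mean.

    s, n, and confidence are illustrative parameters.
    """
    z = stats.norm.ppf(1 - (1 - confidence) / 2)  # two-sided critical value
    return z * s / math.sqrt(n)

# Doubling the standard deviation doubles the interval's half-width.
print(ci_half_width(s=10, n=100))  # ~1.96
print(ci_half_width(s=20, n=100))  # ~3.92
```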
Confidence intervals may be calculated for any statistic, but the most common statistics for which CIs are computed are the mean, a proportion, and the standard deviation. I have included a link with a worked example of a confidence interval for a mean.
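The linked worked example isn't reproduced here, but a minimal sketch of a t-based confidence interval for a mean might look like the following; the sample data are hypothetical and scipy is assumed to be available:

```python
import numpy as np
from scipy import stats

# Hypothetical sample data; any numeric sample would do.
sample = np.array([12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9])

n = sample.size
xbar = sample.mean()
s = sample.std(ddof=1)              # sample standard deviation
t = stats.t.ppf(0.975, df=n - 1)    # two-sided 95% critical value
half_width = t * s / np.sqrt(n)

print(f"95% CI for the mean: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```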
You probably mean the confidence interval. When you construct a confidence interval, its stated coverage percentage rests on assumptions about the population distribution. If the population distribution is skewed, there is reason to believe that (a) the statistics on which the interval is based (namely the mean and standard deviation) may be biased, and (b) the confidence interval will not cover the population value as accurately or as symmetrically as expected.
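To see point (b) concretely, here is a small Monte Carlo sketch (my own illustration, not from the answer above) that estimates the actual coverage of a nominal 95% t-interval when sampling from a skewed lognormal population; the sample size and trial count are arbitrary choices:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Strongly skewed population: lognormal(0, 1), whose true mean is exp(1/2).
true_mean = np.exp(0.5)
n, trials, covered = 10, 20_000, 0

for _ in range(trials):
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=n)
    xbar, s = sample.mean(), sample.std(ddof=1)
    t = stats.t.ppf(0.975, df=n - 1)
    lo, hi = xbar - t * s / np.sqrt(n), xbar + t * s / np.sqrt(n)
    covered += lo <= true_mean <= hi

print(f"Empirical coverage: {covered / trials:.3f}  (nominal 0.95)")
```

With a sample this small from a population this skewed, the empirical coverage comes out noticeably below the nominal 95%, which is exactly the failure mode the answer describes.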
Did you mean, "How do you calculate the 99.9% confidence interval for a parameter using the mean and the standard deviation?" The parameter here is the population mean μ. Let xbar and s denote the sample mean and the sample standard deviation. The two-sided 99.9% confidence interval for μ runs from xbar - 3.29 s / √n to xbar + 3.29 s / √n, where n is the sample size. The value 3.29 is the 99.95th percentile of the standard Normal distribution (leaving 0.05% in each tail), read from a Normal probability table.
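A minimal Python sketch of this calculation might look like the following; the helper name `ci_999` and the input values are illustrative, and scipy is assumed to be available:

```python
import math

from scipy import stats

def ci_999(xbar: float, s: float, n: int) -> tuple[float, float]:
    """Two-sided 99.9% z-interval for the population mean (illustrative helper)."""
    z = stats.norm.ppf(0.9995)  # ~3.2905: leaves 0.05% in each tail
    half = z * s / math.sqrt(n)
    return xbar - half, xbar + half

print(ci_999(xbar=50.0, s=8.0, n=64))  # ~(46.71, 53.29)
```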