If multiple inputs to the simulation are related to each other, then it is necessary to determine the relationship between them.
Consider the simulation of an (M, N) inventory system, where random variables such as lead time and demand are related.
Let $X_1$ represent lead time and $X_2$ represent demand, which are normally distributed.
The dependence between $X_1$ and $X_2$ can be modeled using the bivariate normal distribution, whose parameters $\mu_1$, $\mu_2$, $\sigma_1$ and $\sigma_2$ are estimated using the previously suggested estimators, while $\rho$ is estimated as described below.
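As an illustration (not from the original text), the sketch below generates correlated (lead time, demand) pairs from a bivariate normal distribution using the conditional decomposition $X_2 \mid X_1 = x_1 \sim N\big(\mu_2 + \rho\frac{\sigma_2}{\sigma_1}(x_1-\mu_1),\ \sigma_2^2(1-\rho^2)\big)$; the parameter values are purely hypothetical placeholders.

```python
import numpy as np

# Hypothetical bivariate normal parameters for (lead time, demand); not from the text.
mu1, sigma1 = 6.0, 1.5     # lead time: mean and standard deviation
mu2, sigma2 = 120.0, 20.0  # demand: mean and standard deviation
rho = 0.7                  # correlation between lead time and demand

rng = np.random.default_rng(42)

def bivariate_normal_pair():
    """Generate one correlated (lead time, demand) pair."""
    x1 = rng.normal(mu1, sigma1)                            # X1 ~ N(mu1, sigma1^2)
    cond_mean = mu2 + rho * (sigma2 / sigma1) * (x1 - mu1)  # E[X2 | X1 = x1]
    cond_sd = sigma2 * np.sqrt(1.0 - rho**2)                # SD[X2 | X1 = x1]
    x2 = rng.normal(cond_mean, cond_sd)                     # X2 | X1 = x1
    return x1, x2

lead_time, demand = bivariate_normal_pair()
print(lead_time, demand)
```

Equivalently, `rng.multivariate_normal` could be called with the full mean vector and covariance matrix; the conditional form above just makes the role of $\rho$ explicit.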
For estimation of $\rho$:
- Consider $n$ independent and identically distributed pairs $(X_{11}, X_{21}), \ldots, (X_{1n}, X_{2n})$
The sample covariance is given by:
$\begin{aligned} \hat{\operatorname{cov}}(X_1, X_2) &= \frac{1}{n-1} \sum_{j=1}^n (X_{1j} - \overline{X}_1)(X_{2j} - \overline{X}_2) \\ &= \frac{1}{n-1} \bigg[ \sum_{j=1}^n X_{1j}X_{2j} - n\,\overline{X}_1\,\overline{X}_2 \bigg] \end{aligned}$
where $\overline{X}_1$ and $\overline{X}_2$ are the sample means of $X_1$ and $X_2$, respectively.
- Correlation is given by
$$\hat{\rho} = \frac{\hat{\operatorname{cov}}(X_1,X_2)}{\hat{\sigma}_1 \hat{\sigma}_2}$$
where $\hat{\sigma}_1$ and $\hat{\sigma}_2$ are the sample standard deviations. If the estimated correlation $\hat{\rho}$ is close to zero, $X_1$ and $X_2$ can be treated as independent.
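A minimal sketch of these estimators, assuming the paired observations are held in two NumPy arrays (the data here are hypothetical):

```python
import numpy as np

# Hypothetical paired observations of lead time (x1) and demand (x2).
x1 = np.array([5.2, 6.8, 4.9, 7.1, 6.0, 5.5])
x2 = np.array([110., 135., 102., 140., 121., 118.])

n = len(x1)
x1_bar, x2_bar = x1.mean(), x2.mean()

# Sample covariance: (1/(n-1)) * sum((X1j - X1bar)(X2j - X2bar))
cov_hat = np.sum((x1 - x1_bar) * (x2 - x2_bar)) / (n - 1)

# Sample standard deviations (ddof=1 gives the n-1 denominator).
s1, s2 = x1.std(ddof=1), x2.std(ddof=1)

# Estimated correlation rho_hat = cov_hat / (s1 * s2)
rho_hat = cov_hat / (s1 * s2)
print(cov_hat, rho_hat)
```

The same estimate can be obtained directly with `np.corrcoef(x1, x2)[0, 1]`.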
Time Series Input Models
Time series models are generally used for prediction and for estimating parameters from historical data; they are especially widely used in financial forecasting. Time series models typically forecast the variable of interest by implicitly extrapolating past patterns in the data into the future.
A time series is a stretch of values on the same scale indexed by a time parameter. Consider a sequence of random variables $X_1, X_2, X_3, \ldots$
Let these random variables be identically distributed (i.e., all having the same mean and variance) but possibly dependent; such a sequence of random variables is called a time series.
This sequence has a lag-$h$ autocovariance $\operatorname{cov}(X_t, X_{t+h})$ and a lag-$h$ autocorrelation $\operatorname{corr}(X_t, X_{t+h})$.
If the lag-$h$ autocovariance and autocorrelation depend only on $h$ and not on $t$, then the series is covariance stationary, and the lag-$h$ autocorrelation is denoted by: $$\rho_h = \operatorname{corr}(X_t, X_{t+h})$$
Autocorrelation measures the dependence between random variables that are separated by $h-1$ others in the time series.
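As an illustration (not part of the original text), a standard estimator of the lag-$h$ autocorrelation for a covariance-stationary series might look like this sketch:

```python
import numpy as np

def lag_h_autocorrelation(x, h):
    """Estimate the lag-h autocorrelation of a covariance-stationary series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x_bar = x.mean()
    # Lag-h autocovariance: average product of deviations h steps apart.
    autocov_h = np.sum((x[:n - h] - x_bar) * (x[h:] - x_bar)) / n
    # Lag-0 autocovariance is the (biased) sample variance.
    autocov_0 = np.sum((x - x_bar) ** 2) / n
    return autocov_h / autocov_0

# Hypothetical series, e.g. successive inter-arrival times.
series = [1.2, 0.9, 1.4, 1.1, 1.3, 0.8, 1.5, 1.0]
print(lag_h_autocorrelation(series, h=1))
```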
Consider again a sequence of random variables $X_1, X_2, X_3, \ldots$ that are identically distributed, possibly dependent, and covariance stationary. Such a process can be represented using time series models, namely AR(1) and EAR(1).
AR(1) model
Consider the following time series model: $$X_t = \mu + \phi(X_{t-1}-\mu) + \epsilon_t \quad \text{for } t = 2, 3, \ldots$$
Here $\epsilon_2, \epsilon_3, \ldots$ are independent and identically distributed normal random variables with mean 0 and variance $\sigma_\epsilon^2$, and $-1 \lt \phi \lt 1$.
If the initial value $X_1$ is chosen appropriately, then $X_1, X_2, \ldots$ are all normally distributed with mean $\mu$,
variance $\frac{\sigma_{\epsilon}^2}{1-\phi^2}$, and lag-$h$ autocorrelation
$\rho_h = \phi^h$ for $h = 1, 2, \ldots$
This time series model is called the autoregressive order-1 model, or AR(1).
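A minimal generation sketch for the AR(1) model, assuming illustrative values for $\mu$, $\phi$ and $\sigma_\epsilon$; the initialization draws $X_1$ from $N\big(\mu,\ \sigma_\epsilon^2/(1-\phi^2)\big)$, consistent with the properties above:

```python
import numpy as np

def generate_ar1(n, mu, phi, sigma_eps, seed=None):
    """Generate n values from an AR(1) process X_t = mu + phi*(X_{t-1} - mu) + eps_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    # Stationary initialization: X_1 ~ Normal(mu, sigma_eps^2 / (1 - phi^2)).
    x[0] = rng.normal(mu, sigma_eps / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        eps = rng.normal(0.0, sigma_eps)          # eps_t ~ Normal(0, sigma_eps^2)
        x[t] = mu + phi * (x[t - 1] - mu) + eps   # AR(1) recursion
    return x

# Hypothetical parameters, e.g. for an autocorrelated demand stream.
print(generate_ar1(n=10, mu=100.0, phi=0.8, sigma_eps=5.0, seed=1))
```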
EAR(1) model
Consider the following time series model:
$$X_t = \left\{ \begin{array}{ll} \phi X_{t-1}, & \text{with probability } \phi \\ \phi X_{t-1} + \epsilon_t, & \text{with probability } 1-\phi \end{array} \right. \quad \text{for } t = 2, 3, \ldots$$
Here $\epsilon_2, \epsilon_3, \ldots$ are independent and identically distributed exponential random variables with mean $\frac{1}{\lambda}$, and $0 \lt \phi \lt 1$.
Algorithm for the EAR(1) time series model:
- Step 1: Generate $X_1$ from the exponential distribution with mean $\frac{1}{\lambda}$. Set $t = 2$.
- Step 2: Generate $U$ from the uniform distribution on $[0, 1]$. If $U \le \phi$, then set $$X_t = \phi X_{t-1}$$ Else, generate $\epsilon_t$ from the exponential distribution with mean $\frac{1}{\lambda}$ and set $$X_t = \phi X_{t-1} + \epsilon_t$$
- Step 3: Set $t = t + 1$ and go to Step 2.
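A runnable sketch of this algorithm in Python, with hypothetical values for the rate $\lambda$ and the parameter $\phi$:

```python
import numpy as np

def generate_ear1(n, lam, phi, seed=None):
    """Generate n values from an EAR(1) process with exponential marginals (mean 1/lam)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    # Step 1: X_1 ~ Exponential with mean 1/lambda.
    x[0] = rng.exponential(1.0 / lam)
    for t in range(1, n):
        # Step 2: with probability phi keep phi*X_{t-1}; otherwise add an exponential innovation.
        if rng.uniform() <= phi:
            x[t] = phi * x[t - 1]
        else:
            x[t] = phi * x[t - 1] + rng.exponential(1.0 / lam)
        # Step 3: advance t (handled by the loop) and repeat Step 2.
    return x

# Hypothetical parameters, e.g. autocorrelated service times with mean 1/lam = 2.0.
print(generate_ear1(n=10, lam=0.5, phi=0.7, seed=1))
```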