First, let's define white noise. A white noise process has zero mean, a constant finite variance, and its ${{i}^{th}}$ and ${{j}^{th}}$ observations are uncorrelated for $i\ne j$. Hence white noise must have the following properties.

$E({{e}_{t}})=0\to (i)$

$E(e_{t}^{2})={{\sigma }^{2}}\to (ii)$

$E({{e}_{i}}{{e}_{j}})=0$ for $i\ne j\to (iii)$

If this white noise follows a normal distribution, then it is known as normal or Gaussian white noise, written as ${{e}_{t}}\sim N(0,{{\sigma }^{2}})$; such a series is also termed a normally distributed random variable.

Further, $(iii)$ can be replaced with the stronger assumption of independence, which gives what is known as independent white noise; if it also follows a normal distribution, it is known as Gaussian independent white noise, or a normally distributed independent random variable. One note: independence implies uncorrelatedness, but not vice versa.
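To see why independence implies uncorrelatedness, note that for independent observations the expectation of the product factorizes, and the zero-mean property $(i)$ does the rest:

$$E({{e}_{i}}{{e}_{j}})=E({{e}_{i}})\,E({{e}_{j}})=0\cdot 0=0\quad \text{for }i\ne j,$$

so property $(iii)$ holds automatically. The converse fails: for example, if $X\sim N(0,1)$ and $Y={{X}^{2}}$, then $E(XY)=E({{X}^{3}})=0$, so $X$ and $Y$ are uncorrelated, yet $Y$ is completely determined by $X$.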

Let's generate 200 normally distributed random variables with zero mean and unit variance. To make your results consistent with mine, we need to set a random seed; let's use a seed of 1. If you don't know about random seeds, see my previous blog (here).

rm(list = ls())

set.seed(1)

n = 200 #For 200 random numbers

e = rnorm(n, mean = 0, sd = 1) # Creating 200 normally distributed random numbers with zero mean and unit variance

plot(e, type = "l", main = "200 normally distributed random numbers with zero mean and unit variance")

hist(e, main = "Histogram of Gaussian white noise with zero mean and unit variance")

Here is the histogram:

shapiro.test(e) #Testing if e is normal or not

library(tseries)

jarque.bera.test(e) # Jarque-Bera test for normality

t.test(e, mu = 0) #Testing if mean of e is zero or not

acf(e, main = "ACF of white noise")

library(forecast) # Not strictly needed here; Box.test() is in base R

Box.test(e, lag=10, type = "Box-Pierce")

Box.test(e,lag=10, fitdf=0, type="Ljung-Box")

# Tests for stationarity

kpss.test(e) # Null hypothesis: the data are stationary (no unit root)

adf.test(e) # Null hypothesis: the data have a unit root (non-stationary)
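As a convenience (not part of the original walkthrough), the p-values from these diagnostics can be collected into a single named vector for easier reading. This sketch repeats the simulation so it runs on its own, and sticks to the base-R tests; the tseries tests are omitted here:

```r
# Reproduce the white noise series from above
set.seed(1)
e <- rnorm(200, mean = 0, sd = 1)

# Gather p-values from the base-R diagnostics into one named vector
pvals <- c(
  shapiro    = shapiro.test(e)$p.value,                           # normality
  mean_zero  = t.test(e, mu = 0)$p.value,                         # zero mean
  box_pierce = Box.test(e, lag = 10, type = "Box-Pierce")$p.value, # autocorrelation
  ljung_box  = Box.test(e, lag = 10, type = "Ljung-Box")$p.value   # autocorrelation
)

round(pvals, 3)
```

Large p-values across the board are consistent with `e` being Gaussian white noise: we fail to reject normality, zero mean, and no autocorrelation.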

See the video for more explanation.
