Expected Value

The expected value of a random variable $V$ is the average of all its possible outcomes, each weighted by its probability of occurring. In situations involving uncertainty, a variable $V$ can take on $n$ different values depending on the state of the world. In probability theory, the expected value is also known as the mean or mathematical expectation.

Each possible value $v_j$ of the variable is assigned a probability $p_j$ representing the likelihood of that outcome. When all possible outcomes are considered, the sum of their probabilities equals 1:

$$ \sum_{j=1}^n p_j = 1 $$

The general formula for the expected value of a discrete random variable is:

$$ \mathbb{E}[V] = \sum_{j=1}^n v_j \cdot p_j $$
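
As a quick numerical illustration, here is a minimal Python sketch of this formula; the outcome values and probabilities below are hypothetical:

```python
# Expected value of a discrete random variable:
# E[V] = sum of v_j * p_j over all outcomes.
values = [10, 20, 30]      # hypothetical outcomes v_j
probs = [0.2, 0.5, 0.3]    # their probabilities p_j

# Sanity check: the probabilities of all outcomes must sum to 1.
assert abs(sum(probs) - 1.0) < 1e-9

expected_value = sum(v * p for v, p in zip(values, probs))
print(expected_value)  # 10*0.2 + 20*0.5 + 30*0.3 = 21.0
```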

The expected value represents the long-run average outcome of a random process if it were repeated an infinite number of times. It doesn’t necessarily correspond to any single observed outcome, but rather serves as a summary measure of the variable’s overall tendency.

For instance, consider a coin toss with two outcomes: heads or tails. Suppose you win $100$ if it lands on heads and nothing if it lands on tails, and each outcome has a probability of $0.5$. The expected value of this game is:

$$ \mathbb{E}[V] = 100 \cdot 0.5 + 0 \cdot 0.5 = 50 $$

This means that, over many repeated plays, your average winnings per toss converge to 50.
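
A short simulation makes this convergence concrete. The following sketch, using Python's standard `random` module, plays the game many times and reports the average payoff:

```python
import random

# Simulate the coin-toss game: win 100 on heads, 0 on tails.
trials = 1_000_000
total = sum(100 if random.random() < 0.5 else 0 for _ in range(trials))

# The sample mean approaches the expected value of 50 as trials grow.
print(total / trials)
```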

For continuous random variables, the expected value is defined using integration:

$$ \mathbb{E}[X] = \int_{-\infty}^{\infty} x \cdot f(x) \, dx $$

where $f(x)$ is the probability density function of the variable $X$.
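
As a numerical check of this definition, the sketch below integrates $x \cdot f(x)$ for an exponential density $f(x) = \lambda e^{-\lambda x}$, whose expected value is known to be $1/\lambda$; the rate parameter is hypothetical, and SciPy is assumed to be available:

```python
import math
from scipy.integrate import quad

lam = 2.0  # hypothetical rate parameter

# Exponential probability density: f(x) = lam * exp(-lam * x) for x >= 0.
def f(x):
    return lam * math.exp(-lam * x)

# E[X] = integral of x * f(x) dx over the support [0, infinity).
expected, _err = quad(lambda x: x * f(x), 0, math.inf)
print(expected)  # approximately 1/lam = 0.5
```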

The expected value is a fundamental concept in statistics, economics, decision theory, and finance. It provides a rational basis for evaluating outcomes when dealing with uncertainty, guiding informed decision-making under risk.

https://www.okpedia.com/expected-value

