Mathematical Expectation

In probability theory the expected value (or mathematical expectation, or mean) of a discrete random variable is the sum, over each possible outcome of the experiment, of the probability of that outcome multiplied by the outcome value (or payoff). Thus, it represents the average amount one "expects" as the outcome of the random trial when identical odds are repeated many times. Note that the value itself may not be expected in the general sense; the "expected value" itself may be unlikely or even impossible. For example, the expected value from the roll of an ordinary six-sided die is

E(X) = 1 \cdot \tfrac{1}{6} + 2 \cdot \tfrac{1}{6} + 3 \cdot \tfrac{1}{6} + 4 \cdot \tfrac{1}{6} + 5 \cdot \tfrac{1}{6} + 6 \cdot \tfrac{1}{6} = 3.5,

which is not among the possible outcomes.

A common application of expected value is to gambling. For example, an American roulette wheel has 38 places where the ball may land, all equally likely. A winning bet on a single number pays 35-to-1, meaning that the original stake is not lost, and 35 times that amount is won, so you receive 36 times what you've bet. Considering all 38 possible outcomes, the expected value of the profit resulting from a one-dollar bet on a single number is the sum of what you may lose times the odds of losing and what you will win times the odds of winning:

E(\text{profit}) = -1 \cdot \tfrac{37}{38} + 35 \cdot \tfrac{1}{38} = -\tfrac{1}{19} \approx -0.0526.

The change in your financial holdings is −$1 when you lose, and $35 when you win. Thus one may expect, on average, to lose about five cents for every dollar bet, and the expected value of a one-dollar bet is about −$0.053. In gambling, an event for which the expected value of the winnings equals the stake (that is, for which the bettor's expected profit is zero) is called a "fair game."

Mathematical definition

In general, if X is a random variable defined on a probability space (Ω, F, P), where Ω is the sample space, F is a σ-algebra of events, and P is a probability measure, then the expected value of X (denoted E(X), or sometimes ⟨X⟩ or EX) is defined as

E(X) = \int_\Omega X \, dP,

where the Lebesgue integral is employed. Note that not all random variables have an expected value, since the integral may not exist (e.g., the Cauchy distribution). Two variables with the same probability distribution will have the same expected value, if it is defined.

If X is a discrete random variable taking values x_1, x_2, \ldots with probabilities p_1, p_2, \ldots, the integral reduces to the sum used in the gambling example above:

E(X) = \sum_i p_i x_i.

If the probability distribution of X admits a probability density function f(x), then the expected value can be computed as

E(X) = \int_{-\infty}^{\infty} x f(x) \, dx.

It follows directly from the discrete case definition that if X is a constant random variable, i.e. X = b for some fixed real number b, then the expected value of X is also b. The expected value of an arbitrary function of X, g(X), with respect to the probability density function f(x) is given by

E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x) \, dx.

Conventional terminology

When one speaks of the "expected price", "expected height", etc., one means the expected value of a random variable that is a price, a height, etc. When one speaks of the "expected number of attempts needed to get one successful attempt," one might conservatively approximate it as the reciprocal of the probability of success for such an attempt; cf. the expected value of the geometric distribution.
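Before moving on to the formal properties, the discrete definition can be made concrete in code. The following is a minimal Python sketch (the function name expected_value and the variable names are illustrative choices, not from any standard library) that reproduces the die and roulette calculations above from (value, probability) pairs.

```python
from fractions import Fraction

def expected_value(outcomes):
    """Expected value of a discrete random variable, given as a
    list of (value, probability) pairs whose probabilities sum to 1."""
    return sum(value * prob for value, prob in outcomes)

# A fair six-sided die: each face 1..6 has probability 1/6.
die = [(face, Fraction(1, 6)) for face in range(1, 7)]
print(expected_value(die))       # 7/2, i.e. 3.5

# American roulette, $1 on a single number: lose $1 with
# probability 37/38, win $35 with probability 1/38.
roulette = [(-1, Fraction(37, 38)), (35, Fraction(1, 38))]
print(expected_value(roulette))  # -1/19, about -$0.0526
```

Using exact fractions rather than floats keeps the results identical to the hand calculations.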
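The continuous definition can be illustrated the same way by numerical integration of x f(x). This sketch assumes, purely for illustration, an exponential density with rate lam = 2 (whose mean is known to be 1/lam) and a simple trapezoidal rule; it is a numerical approximation, not an exact evaluation of the integral.

```python
import math

def expectation_from_density(f, lo, hi, steps=100_000):
    """Approximate E(X) = integral of x * f(x) dx over [lo, hi] with the
    trapezoidal rule; f should be negligible outside the interval."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * x * f(x)
    return total * h

lam = 2.0  # illustrative rate parameter

def f(x):
    # Exponential density: f(x) = lam * exp(-lam * x) for x >= 0.
    return lam * math.exp(-lam * x)

print(expectation_from_density(f, 0.0, 50.0))  # close to 1/lam = 0.5
```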
Properties

Constants

The expected value of a constant is equal to the constant itself; i.e., if c is a constant, then E(c) = c.

Monotonicity

If X and Y are random variables such that X ≤ Y almost surely, then E(X) ≤ E(Y).

Linearity

The expected value operator (or expectation operator) E is linear in the sense that

E(X + c) = E(X) + c,
E(X + Y) = E(X) + E(Y),
E(aX) = a E(X).

Combining the results from the previous three equations, we can see that

E(aX + bY) = a E(X) + b E(Y)

for any two random variables X and Y (which need to be defined on the same probability space) and any real numbers a and b.

Iterated expectation

Iterated expectation for discrete random variables

For any two discrete random variables X, Y one may define the conditional expectation

E(X \mid Y = y) = \sum_x x \, P(X = x \mid Y = y),

which means that E(X | Y) is a function of Y. Then the expectation of X satisfies

E(X) = \sum_y E(X \mid Y = y) \, P(Y = y).

Hence, the following equation holds:

E(X) = E(E(X \mid Y)).

The right-hand side of this equation is referred to as the iterated expectation and is also sometimes called the tower rule. This proposition is treated in the law of total expectation.

Iterated expectation for continuous random variables

In the continuous case, the results are completely analogous. The definition of conditional expectation uses inequalities, density functions, and integrals in place of equalities, mass functions, and summations, respectively. However, the main result still holds:

E(X) = E(E(X \mid Y)).

Inequality

If a random variable X is always less than or equal to another random variable Y, the expectation of X is less than or equal to that of Y: if X ≤ Y, then E(X) ≤ E(Y). In particular, since X ≤ |X| and −X ≤ |X|, the absolute value of the expectation of a random variable is less than or equal to the expectation of its absolute value:

|E(X)| \leq E(|X|).

Representation

The following formula holds for any nonnegative real-valued random variable X (such that E(X^α) < ∞) and positive real number α:

E(X^\alpha) = \alpha \int_0^\infty t^{\alpha - 1} P(X > t) \, dt.

In particular, for α = 1 this reduces to

E(X) = \int_0^\infty P(X > t) \, dt.

Non-multiplicativity

In general, the expected value operator is not multiplicative, i.e. E(XY) is not necessarily equal to E(X) E(Y). If multiplicativity does occur, the variables X and Y are said to be uncorrelated (independent variables are a notable case of uncorrelated variables). The lack of multiplicativity gives rise to the study of covariance and correlation.

Functional non-invariance

In general, the expectation operator and functions of random variables do not commute; that is,

E(g(X)) \neq g(E(X)) in general.

A notable inequality concerning this topic is Jensen's inequality, involving expected values of convex (or concave) functions.

Uses and applications of the expected value

The expected values of the powers of X are called the moments of X; the moments about the mean of X are expected values of powers of X − E(X). The moments of some random variables can be used to specify their distributions, via their moment generating functions.

To empirically estimate the expected value of a random variable, one repeatedly measures observations of the variable and computes the arithmetic mean of the results. If the expected value exists, this procedure estimates the true expected value in an unbiased manner and has the property of minimizing the sum of the squares of the residuals (the sum of the squared differences between the observations and the estimate). The law of large numbers demonstrates (under fairly mild conditions) that, as the size of the sample gets larger, the variance of this estimate gets smaller.

In classical mechanics, the center of mass is an analogous concept to expectation. For example, suppose X is a discrete random variable with values x_i and corresponding probabilities p_i. Now consider a weightless rod on which are placed weights, at locations x_i along the rod and having masses p_i (whose sum is one). The point at which the rod balances is E(X).
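The empirical-estimation procedure described above can be demonstrated against the roulette example from the introduction. The following Python sketch (the sample sizes and the fixed seed are arbitrary illustrative choices) simulates repeated $1 single-number bets and shows the running arithmetic mean approaching the theoretical value of about −$0.0526.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def roulette_profit():
    """Profit of a $1 bet on one number in American roulette:
    +35 with probability 1/38, otherwise -1."""
    return 35 if random.randrange(38) == 0 else -1

for n in (100, 10_000, 1_000_000):
    mean = sum(roulette_profit() for _ in range(n)) / n
    print(f"n = {n:>9}: sample mean = {mean:+.4f}")

print(f"theoretical E = {-1 / 19:+.4f}")  # about -0.0526
```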
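The iterated-expectation (tower rule) identity from earlier in this section can likewise be checked by brute force. The joint distribution below is made up solely for illustration; the only point is that E(E(X | Y)) reproduces E(X) exactly.

```python
from collections import defaultdict
from fractions import Fraction

# An arbitrary joint distribution P(X = x, Y = y); probabilities sum to 1.
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(1, 2),
}

# Direct expectation: E(X) = sum over (x, y) of x * P(X = x, Y = y).
e_x = sum(x * p for (x, _y), p in joint.items())

# Iterated expectation: E(E(X | Y)) = sum over y of E(X | Y = y) * P(Y = y).
p_y = defaultdict(Fraction)   # marginal P(Y = y)
num = defaultdict(Fraction)   # sum over x of x * P(X = x, Y = y)
for (x, y), p in joint.items():
    p_y[y] += p
    num[y] += x * p
e_iterated = sum((num[y] / p_y[y]) * p_y[y] for y in p_y)

print(e_x, e_iterated)  # both 5/8, illustrating the tower rule
```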
Expected values can also be used to compute the variance, by means of the computational formula for the variance:

Var(X) = E(X^2) - (E(X))^2.

A very important application of the expectation value is in the field of quantum mechanics. The expectation value of a quantum mechanical operator \hat{A} operating on a quantum state vector |ψ⟩ is written as ⟨Â⟩ = ⟨ψ|Â|ψ⟩. The uncertainty in Â can be calculated using the formula (ΔA)² = ⟨Â²⟩ − ⟨Â⟩².

Expectation of matrices

If X is an m × n matrix, then the expected value of the matrix is defined as the matrix of expected values:

(E(X))_{i,j} = E(X_{i,j}).

This is utilized in covariance matrices.

Computation

It is often useful to update a computed expected value as new data comes in. This can be done as follows, where new_value is the count-th value and the previous estimate is used to compute the new one:

new_estimate = old_estimate + (new_value - old_estimate) / count.

Formula for non-negative integral values

When a random variable takes only values in {0, 1, 2, 3, ...} we can use the following formula for computing its expectation:

E(X) = \sum_{i=1}^{\infty} P(X \geq i).

For example, suppose we toss a coin where the probability of heads is p. How many tosses can we expect until the first heads? Let X be this number. Note that we are counting only the tails and not the heads which ends the experiment; in particular, we can have X = 0. The expectation of X may be computed by

E(X) = \sum_{i=1}^{\infty} P(X \geq i) = \sum_{i=1}^{\infty} (1 - p)^i = \frac{1 - p}{p}.

This is because the number of tosses is at least i exactly when the first i tosses yielded tails, an event of probability (1 − p)^i. This matches the expectation of a random variable with a geometric distribution. We used the formula for a geometric progression:

\sum_{i=1}^{\infty} r^i = \frac{r}{1 - r}.
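The running-mean update in the Computation section translates directly into code. Here is a minimal Python sketch (the class name RunningMean is an illustrative choice) showing that the incremental update agrees with the ordinary arithmetic mean.

```python
class RunningMean:
    """Incrementally updated arithmetic mean, using
    new_estimate = old_estimate + (new_value - old_estimate) / count."""

    def __init__(self):
        self.count = 0
        self.estimate = 0.0

    def update(self, new_value):
        self.count += 1
        self.estimate += (new_value - self.estimate) / self.count
        return self.estimate

rm = RunningMean()
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
for value in data:
    rm.update(value)

print(rm.estimate, sum(data) / len(data))  # both 5.0
```

One advantage of this form of the update is that it never needs to store the observations themselves, only the current count and estimate.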
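Finally, the tail-sum formula for non-negative integer-valued variables can be sanity-checked numerically against the coin-toss example. The sketch below truncates the infinite sum at 1,000 terms (an arbitrary cutoff; the tail beyond it is negligible for the chosen p) and compares the result against the closed form (1 − p)/p.

```python
p = 0.3  # probability of heads on each toss (illustrative value)

# P(X >= i) = probability that the first i tosses are all tails = (1 - p) ** i.
tail_sum = sum((1 - p) ** i for i in range(1, 1001))

print(tail_sum)      # approximately 2.3333
print((1 - p) / p)   # exact value: 7/3
```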