Chapter 3: Two-Dimensional Random Variables Mind Map
This is a mind map of Chapter 3 on two-dimensional random variables, covering random vectors and joint distributions, marginal distribution functions, marginal distribution laws and conditional distribution laws, etc.
Edited at 2023-11-15 23:24:53
Chapter 3: Two-Dimensional Random Variables
3.1 Random vectors and joint distribution
The definition of two-dimensional random variables and the properties of distribution functions
Definition 1: Suppose the sample space of experiment E is S={e}, and X=X(e) and Y=Y(e) are two random variables defined on S. The vector (X, Y) composed of these two random variables is called a two-dimensional random variable or a two-dimensional random vector.
Definition 2: Let (X, Y) be a two-dimensional random variable. For any real numbers x, y, the binary function F(x, y) = P{X ≤ x, Y ≤ y} is called the distribution function of the two-dimensional random variable (X, Y), or the joint distribution function of the random variables X and Y.
Definition 3: Suppose the sample space of experiment E is S = {e}, and Xi = Xi(e), i = 1, 2, …, n, are random variables defined on S. The ordered tuple (X1, X2, …, Xn) formed by these n random variables is called an n-dimensional random variable or n-dimensional random vector. Let (X1, X2, …, Xn) be an n-dimensional random variable. For any real numbers x1, x2, …, xn, the n-ary function F(x1, x2, …, xn) = P{X1 ≤ x1, X2 ≤ x2, …, Xn ≤ xn} is called the distribution function of the n-dimensional random variable, or the joint distribution function of the n random variables X1, X2, …, Xn.
Properties of distribution function F(x,y):
Domain: F(x, y) is defined for every pair of real numbers (x, y), i.e. on the whole plane R².
Range: 0 ≤ F(x, y) ≤ 1.
Special values: F(-∞, y) = 0 for every y, F(x, -∞) = 0 for every x, F(-∞, -∞) = 0, F(+∞, +∞) = 1.
F(x, y) is non-decreasing in x and in y, that is: for fixed y, x1 < x2 implies F(x1, y) ≤ F(x2, y); for fixed x, y1 < y2 implies F(x, y1) ≤ F(x, y2).
F(x, y) is right-continuous in x and in y, that is: F(x + 0, y) = F(x, y) and F(x, y + 0) = F(x, y).
For any real numbers x1 < x2, y1 < y2, we have F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1) = P{x1 < X ≤ x2, y1 < Y ≤ y2} ≥ 0.
Conversely: any binary function F(x, y) satisfying the above properties must be the distribution function of some two-dimensional random variable.
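A minimal numerical sketch of the last property, assuming a hypothetical joint distribution function joint_cdf (two independent Exp(1) components, chosen only for illustration): the rectangle probability P{x1 < X ≤ x2, y1 < Y ≤ y2} is obtained from F by the inclusion-exclusion expression above.

```python
import math

def joint_cdf(x, y):
    """Hypothetical joint CDF: two independent Exp(1) components,
    F(x, y) = (1 - e^{-x})(1 - e^{-y}) for x, y >= 0, else 0."""
    fx = 1 - math.exp(-x) if x >= 0 else 0.0
    fy = 1 - math.exp(-y) if y >= 0 else 0.0
    return fx * fy

def rect_prob(F, x1, x2, y1, y2):
    """P{x1 < X <= x2, y1 < Y <= y2} = F(x2,y2) - F(x2,y1) - F(x1,y2) + F(x1,y1)."""
    return F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1)

print(rect_prob(joint_cdf, 0.0, 1.0, 0.0, 2.0))  # ≈ (1 - e^-1)(1 - e^-2)
```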
Two-dimensional discrete random variable
Definition: If the two-dimensional random variable (X, Y) takes only finitely many or countably many pairs of values, then (X, Y) is called a two-dimensional discrete random variable.
Let the possible values of (X, Y) be (xi, yj), i, j = 1, 2, …. Then P{X = xi, Y = yj} = pij, i, j = 1, 2, …, is called the (probability) distribution law of the two-dimensional discrete random variable (X, Y), or the joint (probability) distribution law of X and Y.
The distribution law can be expressed by: (1) a formula; (2) a table.
Basic properties of the distribution law of a two-dimensional discrete random variable (X, Y): pij ≥ 0 for all i, j, and Σi Σj pij = 1.
Theorem: Suppose the distribution law of (X, Y) is P{X = xi, Y = yj} = pij, i, j = 1, 2, …. Then the probability that the random point (X, Y) falls in any region D of the plane is P{(X, Y) ∈ D} = Σ pij, where the sum is taken over all i, j such that (xi, yj) ∈ D.
In particular, taking D = {(u, v) : u ≤ x, v ≤ y} gives the joint distribution function F(x, y) = Σ_{xi ≤ x} Σ_{yj ≤ y} pij.
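A small sketch of how the theorem is applied, under an assumed (hypothetical) joint distribution table p: the probability of a region D is just the sum of the pij whose points fall in D.

```python
import numpy as np

# Hypothetical joint distribution law of (X, Y): rows are values of X, columns of Y.
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.20, 0.10]])   # p[i, j] = P{X = x_i, Y = y_j}

assert np.all(p >= 0) and np.isclose(p.sum(), 1.0)  # basic properties of the law

# P{(X, Y) in D} for D = {(x, y) : x + y <= 1}: sum p_ij over all (x_i, y_j) in D
in_D = (x_vals[:, None] + y_vals[None, :]) <= 1
print(p[in_D].sum())  # 0.10 + 0.20 + 0.25 = 0.55
```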
Two-dimensional continuous random variable
Definition: Suppose the distribution function of the two-dimensional random variable (X, Y) is F(x, y). If there exists a non-negative integrable function f(x, y) such that for any real numbers x, y, F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f(u, v) dv du, then (X, Y) is called a two-dimensional continuous random variable, and f(x, y) is called the probability density of the two-dimensional continuous random variable (X, Y), or the joint probability density of the random variables X and Y.
The probability density f(x, y) of (X, Y) has the following basic properties: (1) f(x, y) ≥ 0; (2) ∫_{-∞}^{+∞} ∫_{-∞}^{+∞} f(x, y) dx dy = F(+∞, +∞) = 1.
Conversely, if a binary function f(x, y) satisfies the above two basic properties, then it must be the probability density of some two-dimensional random variable (X, Y).
If the probability density f(x, y) is continuous at the point (x, y), then ∂²F(x, y)/∂x∂y = f(x, y).
Calculate probability using probability density
Theorem: Suppose the probability density of (X, Y) is f(x, y). Then: (1) for any region D of the plane, P{(X, Y) ∈ D} = ∫∫_D f(x, y) dx dy; (2) in particular, P{x1 < X ≤ x2, y1 < Y ≤ y2} = ∫_{x1}^{x2} ∫_{y1}^{y2} f(x, y) dy dx.
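As a hedged illustration of the theorem, the sketch below numerically integrates an assumed joint density (independent Exp(1) components, chosen only for the example) over the triangular region D = {x ≥ 0, y ≥ 0, x + y ≤ 1} with scipy.integrate.dblquad.

```python
import math
from scipy.integrate import dblquad

def f(y, x):
    # Hypothetical joint density: f(x, y) = e^{-(x+y)} for x, y >= 0, else 0
    # (scipy's dblquad passes the inner variable y as the first argument).
    return math.exp(-(x + y)) if x >= 0 and y >= 0 else 0.0

# P{(X, Y) in D} with D = {(x, y) : x >= 0, y >= 0, x + y <= 1}
prob, _ = dblquad(f, 0.0, 1.0, lambda x: 0.0, lambda x: 1.0 - x)
print(prob)  # exact value is 1 - 2/e ≈ 0.2642
```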
Commonly used two-dimensional continuous random variables
Uniform distribution: If the probability density of (X, Y) is f(x, y) = 1/A for (x, y) ∈ D and f(x, y) = 0 otherwise, where A is the area of the bounded region D, then (X, Y) is said to follow the uniform distribution on D, denoted (X, Y) ~ U(D).
Two-dimensional normal distribution: If the probability density of (X, Y) is f(x, y) = [1/(2π σ1 σ2 √(1 - ρ²))] · exp{ -1/(2(1 - ρ²)) · [ (x - μ1)²/σ1² - 2ρ(x - μ1)(y - μ2)/(σ1 σ2) + (y - μ2)²/σ2² ] }, where σ1 > 0, σ2 > 0, -1 < ρ < 1, then (X, Y) is said to follow the two-dimensional normal distribution with parameters μ1, μ2, σ1², σ2², ρ, denoted (X, Y) ~ N(μ1, μ2; σ1², σ2²; ρ).
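The following sketch, with illustrative parameter values that are not from the text, draws samples from a two-dimensional normal distribution by building the covariance matrix from σ1, σ2, ρ, and checks the sample mean and correlation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bivariate normal N(mu1, mu2; sigma1^2, sigma2^2; rho) via its covariance matrix
# (illustrative parameter values, not from the text).
mu1, mu2, s1, s2, rho = 0.0, 1.0, 1.0, 2.0, 0.5
mean = [mu1, mu2]
cov = [[s1**2,     rho*s1*s2],
       [rho*s1*s2, s2**2    ]]
xy = rng.multivariate_normal(mean, cov, size=100_000)

print(xy.mean(axis=0))            # ≈ (mu1, mu2)
print(np.corrcoef(xy.T)[0, 1])    # ≈ rho
```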
3.2 Marginal distribution function
Definition: Let F(x, y) be the distribution function of the two-dimensional random variable (X, Y) (the joint distribution function of the components X and Y).
Distribution function of the component X: FX(x) = P{X ≤ x} = F(x, +∞) is called the marginal distribution function of (X, Y) with respect to X;
Distribution function of the component Y: FY(y) = P{Y ≤ y} = F(+∞, y) is called the marginal distribution function of (X, Y) with respect to Y.
If the joint distribution function F(x, y) is known, the marginal distribution functions FX(x) and FY(y) can be calculated from it; however, the joint distribution function F(x, y) cannot in general be determined from the individual distribution functions FX(x) and FY(y) alone.
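A small sketch of the marginal-distribution-function formulas, reusing the hypothetical joint CDF from the earlier example; a large constant stands in for +∞.

```python
import math

def joint_cdf(x, y):
    # Same hypothetical joint CDF as above: independent Exp(1) components.
    fx = 1 - math.exp(-x) if x >= 0 else 0.0
    fy = 1 - math.exp(-y) if y >= 0 else 0.0
    return fx * fy

BIG = 1e9  # stands in for +infinity

def F_X(x):
    # Marginal distribution function of X: F_X(x) = F(x, +inf)
    return joint_cdf(x, BIG)

def F_Y(y):
    # Marginal distribution function of Y: F_Y(y) = F(+inf, y)
    return joint_cdf(BIG, y)

print(F_X(1.0), 1 - math.exp(-1.0))  # both ≈ 0.6321
```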
3.3 Marginal distribution law and conditional distribution law
Definition: For a two-dimensional discrete random variable (X, Y), the components X and Y are both discrete random variables. The distribution law of X is called the marginal distribution law of (X, Y) with respect to X; the distribution law of Y is called the marginal distribution law of (X, Y) with respect to Y.
Calculation formula of marginal distribution law
Theorem: Suppose the distribution law of the two-dimensional discrete random variable (X, Y) is P{X = xi, Y = yj} = pij, i, j = 1, 2, …. Then the marginal distribution law of (X, Y) with respect to X is P{X = xi} = Σj pij = pi·, i = 1, 2, …;
the marginal distribution law of (X, Y) with respect to Y is P{Y = yj} = Σi pij = p·j, j = 1, 2, ….
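Continuing the hypothetical joint table from the earlier sketch, the marginal distribution laws are just the row sums and column sums:

```python
import numpy as np

# Same hypothetical joint table as above: p[i, j] = P{X = x_i, Y = y_j}
p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.20, 0.10]])

p_x = p.sum(axis=1)   # marginal law of X: p_i. = sum_j p_ij  -> [0.30, 0.40, 0.30]
p_y = p.sum(axis=0)   # marginal law of Y: p_.j = sum_i p_ij  -> [0.55, 0.45]
print(p_x, p_y)
```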
Conditional distribution law and calculation formula
Under the condition that one component is known to take a certain value, the distribution law of the other component is called the conditional distribution law.
Definition: If P{Y = yj} > 0, then P{X = xi | Y = yj} = P{X = xi, Y = yj} / P{Y = yj} = pij / p·j, i = 1, 2, …, is called the conditional distribution law of X given Y = yj; similarly, if P{X = xi} > 0, then P{Y = yj | X = xi} = pij / pi·, j = 1, 2, …, is called the conditional distribution law of Y given X = xi.
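A short sketch of the conditional distribution law for the same hypothetical table: fix a column (a value of Y) and divide by its column sum.

```python
import numpy as np

p = np.array([[0.10, 0.20],
              [0.25, 0.15],
              [0.20, 0.10]])

j = 0                                # condition on Y = y_0
p_y_j = p[:, j].sum()                # P{Y = y_0} = p_.0
cond_X_given_Y = p[:, j] / p_y_j     # P{X = x_i | Y = y_0} = p_i0 / p_.0
print(cond_X_given_Y, cond_X_given_Y.sum())  # conditional law sums to 1
```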
3.4 Marginal probability density and conditional probability density
Marginal probability density
Suppose the probability density of the two-dimensional continuous random variable (X, Y) is f(x, y). The probability density of the component X, denoted fX(x), is called the marginal probability density of (X, Y) with respect to X; the probability density of the component Y, denoted fY(y), is called the marginal probability density of (X, Y) with respect to Y.
Suppose the probability density of the two-dimensional continuous random variable (X, Y) is f(x, y). Then FX(x) = F(x, +∞) = ∫_{-∞}^{x} [ ∫_{-∞}^{+∞} f(u, y) dy ] du and FY(y) = F(+∞, y) = ∫_{-∞}^{y} [ ∫_{-∞}^{+∞} f(x, v) dx ] dv.
This shows: (1) the component X is a continuous random variable; (2) the probability density of X, i.e. the marginal probability density of (X, Y) with respect to X, is fX(x) = ∫_{-∞}^{+∞} f(x, y) dy.
This shows: (1) the component Y is a continuous random variable; (2) the probability density of Y, i.e. the marginal probability density of (X, Y) with respect to Y, is fY(y) = ∫_{-∞}^{+∞} f(x, y) dx.
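A hedged sketch of the marginal-density formula, using an assumed joint density f(x, y) = x + y on the unit square (not from the text) and numerical integration for fX(x):

```python
from scipy.integrate import quad

def f(x, y):
    # Hypothetical joint density on the unit square: f(x, y) = x + y for 0 <= x, y <= 1
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_X(x):
    # Marginal density of X: f_X(x) = ∫ f(x, y) dy  (here the support in y is [0, 1])
    val, _ = quad(lambda y: f(x, y), 0.0, 1.0)
    return val

print(f_X(0.3))  # exact value is 0.3 + 0.5 = 0.8
```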
Conditional distribution function
Definition: For a fixed y with fY(y) > 0, the function F_{X|Y}(x | y) = lim_{ε→0⁺} P{X ≤ x | y - ε < Y ≤ y + ε} = ∫_{-∞}^{x} f(u, y) / fY(y) du is called the conditional distribution function of X given Y = y; the conditional distribution function F_{Y|X}(y | x) of Y given X = x is defined similarly.
Conditional probability density
Calculation formulas: if fY(y) > 0, the conditional probability density of X given Y = y is f_{X|Y}(x | y) = f(x, y) / fY(y); if fX(x) > 0, the conditional probability density of Y given X = x is f_{Y|X}(y | x) = f(x, y) / fX(x). Accordingly, F_{X|Y}(x | y) = ∫_{-∞}^{x} f_{X|Y}(u | y) du.
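The same assumed density illustrates the calculation formula f_{X|Y}(x | y) = f(x, y) / fY(y); for each fixed y the conditional density integrates to 1 in x.

```python
from scipy.integrate import quad

def f(x, y):
    # Same hypothetical joint density as above: f(x, y) = x + y on the unit square
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f_Y(y):
    val, _ = quad(lambda x: f(x, y), 0.0, 1.0)
    return val                      # equals y + 1/2 for 0 <= y <= 1

def f_X_given_Y(x, y):
    # Conditional density f_{X|Y}(x | y) = f(x, y) / f_Y(y), valid where f_Y(y) > 0
    return f(x, y) / f_Y(y)

# The conditional density integrates to 1 in x for each fixed y:
print(quad(lambda x: f_X_given_Y(x, 0.4), 0.0, 1.0)[0])  # ≈ 1.0
```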
3.5 Mutually independent random variables
Definition: Let X and Y be two random variables. If for any real numbers x, y, P{X ≤ x, Y ≤ y} = P{X ≤ x}·P{Y ≤ y}, i.e. F(x, y) = FX(x)·FY(y), then X and Y are said to be mutually independent, or independent for short.
Theorem on the mutual independence of X and Y:
Criterion for the mutual independence of discrete random variables
Theorem: The discrete random variables X and Y are mutually independent if and only if P{X = xi, Y = yj} = P{X = xi}·P{Y = yj}, i.e. pij = pi· · p·j, for all i, j.
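A small numerical check of the discrete criterion, on a hypothetical table chosen so that it factorizes:

```python
import numpy as np

# Hypothetical joint table; independence holds iff p_ij = p_i. * p_.j for all i, j
p = np.array([[0.12, 0.18],
              [0.28, 0.42]])

p_x = p.sum(axis=1, keepdims=True)   # p_i.
p_y = p.sum(axis=0, keepdims=True)   # p_.j
print(np.allclose(p, p_x * p_y))     # True: this table factorizes, so X and Y are independent
```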
Criterion for the mutual independence of continuous random variables
Theorem: The continuous random variables X and Y are mutually independent if and only if f(x, y) = fX(x)·fY(y) for almost every (x, y) in the plane (in particular, at every point where f, fX, fY are all continuous).
Mutual independence of finitely many or countably many random variables
Definition: The random variables X1, X2, …, Xn are said to be mutually independent if for any real numbers x1, x2, …, xn, F(x1, x2, …, xn) = F_{X1}(x1)·F_{X2}(x2)⋯F_{Xn}(xn). A countable family of random variables is mutually independent if every finite subfamily of them is mutually independent.
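A final hedged sketch of the continuous criterion, with an assumed factorizable density (independent Exp(1) components): f(x, y) agrees with fX(x)·fY(y) at sample points.

```python
import numpy as np
from scipy.integrate import quad

def f(x, y):
    # Hypothetical joint density: independent Exp(1) components, f(x, y) = e^{-(x+y)} for x, y >= 0
    return np.exp(-(x + y)) if x >= 0 and y >= 0 else 0.0

def f_X(x):
    return quad(lambda y: f(x, y), 0.0, np.inf)[0]

def f_Y(y):
    return quad(lambda x: f(x, y), 0.0, np.inf)[0]

# Independence criterion: f(x, y) == f_X(x) * f_Y(y) at points of continuity
pts = [(0.2, 1.0), (1.5, 0.3), (2.0, 2.0)]
print(all(np.isclose(f(x, y), f_X(x) * f_Y(y)) for x, y in pts))  # True
```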