Simple problems in probability theory. Basic formula

      1. Formulas for calculating the probability of events

    1.3.1. Sequence of independent trials (the Bernoulli scheme)

    Suppose that some experiment can be repeated under identical conditions. Let the experiment be performed n times, i.e., consider a sequence of n trials.

    Definition. A sequence of n trials is called mutually independent if any event related to a given trial is independent of any events related to the other trials.

    Suppose that in a single trial some event A occurs with probability p or fails to occur with probability q = 1 − p.

    Definition. A sequence of n trials forms a Bernoulli scheme if the following conditions are met:

    1) the n trials are mutually independent;

    2) the probability of the event A does not change from trial to trial and does not depend on the results of the other trials.

    The event A is called a "success" of the trial, and the opposite event a "failure." Consider the event

    B_{n,m} = (exactly m "successes" occur in n trials).

    The probability of this event is given by the Bernoulli formula

    p(B_{n,m}) = C_n^m p^m q^(n−m),  m = 0, 1, 2, …, n,  (1.6)

    where C_n^m is the number of combinations of n elements taken m at a time:

    C_n^m = n! / (m! (n − m)!).

    Example 1.16. A die is tossed three times. Find:

    a) the probability that a six appears exactly twice;

    b) the probability that a six appears no more than twice.

    Solution. We regard a trial as a "success" when the face with six points appears on the die.

    a) The total number of trials is n = 3, the number of "successes" is m = 2. The probability of "success" is p = 1/6, and the probability of "failure" is q = 1 − 1/6 = 5/6.

    Then, by the Bernoulli formula, the probability that in three throws of the die the face with six points appears exactly twice is

    p(B_{3,2}) = C_3^2 (1/6)^2 (5/6) = 3 · (1/36) · (5/6) = 15/216 ≈ 0.07.

    b) Let B denote the event that a six appears no more than twice. Then B can be represented as the sum of three incompatible events,

    B = B_{3,0} + B_{3,1} + B_{3,2},

    where B_{3,0} is the event that the face of interest never appears, B_{3,1} that it appears exactly once, and B_{3,2} that it appears exactly twice.

    Using the Bernoulli formula (1.6) we find

    p(B) = p(B_{3,0}) + p(B_{3,1}) + p(B_{3,2}) = (5/6)^3 + 3 · (1/6)(5/6)^2 + 3 · (1/6)^2 (5/6) = (125 + 75 + 15)/216 = 215/216 ≈ 0.995.
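    A minimal Python sketch of the Bernoulli formula (1.6), applied here to Example 1.16 (the helper name bernoulli is just an illustrative choice):

```python
from math import comb

def bernoulli(n: int, m: int, p: float) -> float:
    """Bernoulli formula (1.6): C(n, m) * p^m * (1 - p)^(n - m)."""
    return comb(n, m) * p**m * (1 - p)**(n - m)

p = 1 / 6                                          # probability of a six in one throw
print(bernoulli(3, 2, p))                          # a) exactly two sixes: 15/216 ~ 0.069
print(sum(bernoulli(3, m, p) for m in range(3)))   # b) at most two sixes: 215/216 ~ 0.995
```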

    1.3.2. Conditional probability of an event

    Conditional probability reflects the influence of one event on the probability of another. Changing the conditions under which an experiment is carried out also affects the probability of occurrence of the event of interest.

    Definition. Let A and B be some events with p(B) > 0. The conditional probability of the event A, given that the event B has already occurred, is the ratio of the probability of the joint occurrence of A and B to the probability of B. The conditional probability is denoted p(A|B). By definition,

    p(A|B) = p(AB) / p(B).  (1.7)

    Example 1.17. Two dice are tossed. The space of elementary events consists of ordered pairs of numbers:

    (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)

    (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)

    (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)

    (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)

    (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)

    (6,1) (6,2) (6,3) (6,4) (6,5) (6,6).

    In Example 1.16 it was determined that the events A = (the number of points on the first die is > 4) and C = (the sum of the points is 8) are dependent. Let us write out the outcomes favourable to A:

    (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)

    (6,1) (6,2) (6,3) (6,4) (6,5) (6,6).

    This can be interpreted as follows. Suppose it is known that the first roll produced more than 4 points. Then the throw of the second die can lead to one of the 12 outcomes that make up the event A, and only two of them, (5,3) and (6,2), also belong to C. Hence the conditional probability is p(C|A) = 2/12 = 1/6. Thus, the information that the event A has occurred changed the probability of the event C.
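    The conditional probability in Example 1.17 can also be checked by directly enumerating the 36 equally likely outcomes; the Python sketch below is purely illustrative:

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs

A = [w for w in outcomes if w[0] > 4]             # more than 4 points on the first die
C = [w for w in outcomes if sum(w) == 8]          # the sum of points equals 8

p_C_given_A = sum(w in C for w in A) / len(A)     # definition (1.7) with equally likely outcomes
print(p_C_given_A)                                # 2/12 ~ 0.167
print(len(C) / len(outcomes))                     # unconditional p(C) = 5/36 ~ 0.139
```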

    Multiplication theorem

    The probability of the joint occurrence of the events A 1, A 2, …, A n is given by the formula

    p(A 1 A 2 … A n) = p(A 1) p(A 2|A 1) … p(A n|A 1 A 2 … A n-1).  (1.8)

    For the product of two events it follows that

    p(AB) = p(A|B) p(B) = p(B|A) p(A).  (1.9)

    Example 1.18. In a batch of 25 products, 5 products are defective. 3 items are selected at random in succession. Determine the probability that all selected products are defective.

    Solution. Let's denote the events:

    A 1 = (first product is defective),

    A 2 = (second product is defective),

    A 3 = (third product is defective),

    A = (all products are defective).

    The event A is the product of the three events: A = A 1 A 2 A 3.

    From the multiplication theorem (1.8) we get

    p(A) = p(A 1 A 2 A 3) = p(A 1) p(A 2|A 1) p(A 3|A 1 A 2).

    By the classical definition of probability, p(A 1) is the ratio of the number of defective products to the total number of products:

    p(A 1) = 5/25 = 1/5;

    p(A 2|A 1) is the ratio of the number of defective products remaining after one has been removed to the total number of remaining products:

    p(A 2|A 1) = 4/24 = 1/6;

    p(A 3|A 1 A 2) is the ratio of the number of defective products remaining after two defective ones have been removed to the total number of remaining products:

    p(A 3|A 1 A 2) = 3/23.

    Then the probability of the event A equals

    p(A) = (5/25) · (4/24) · (3/23) = 1/230 ≈ 0.004.
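    A short Python check of Example 1.18, multiplying the conditional probabilities as in formula (1.8) and cross-checking with a combinatorial count (illustrative only):

```python
from math import comb

# Chain rule (1.8): multiply the conditional probabilities of drawing a defective item each time.
p = (5 / 25) * (4 / 24) * (3 / 23)
print(p)                          # 1/230 ~ 0.0043

# Cross-check by counting: choose 3 of the 5 defective items among C(25, 3) equally likely samples.
print(comb(5, 3) / comb(25, 3))   # the same value
```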

    Everything in the world happens deterministically or by chance...
    Aristotle

    Probability: Basic Rules

    Probability theory calculates the probabilities of various events. Fundamental to probability theory is the concept of a random event.

    For example, you throw a coin, it randomly lands on a head or a tail. You don't know in advance which side the coin will land on. You enter into an insurance contract; you do not know in advance whether payments will be made or not.

    In actuarial calculations, you need to be able to estimate the probability of various events, so probability theory plays a key role. No other branch of mathematics can deal with the probabilities of events.

    Let's take a closer look at tossing a coin. There are two mutually exclusive outcomes: heads or tails. The outcome of the toss is random, since the observer cannot analyse and take into account all the factors that influence the result. What is the probability of heads? Most people will answer ½, but why?

    Formally, let A denote the event that heads comes up. Let the coin be tossed n times. Then the frequency of the event A can be defined as the proportion of tosses that result in heads:

    n(A) / n,  (1)

    where n is the total number of tosses and n(A) is the number of times heads comes up.

    Relation (1) is called the frequency of the event A in a long series of trials.

    It turns out that in different series of trials the corresponding frequency, for large n, clusters around some constant value P(A). This quantity is called the probability of the event A and is denoted by the letter P, an abbreviation of the word probability.

    Formally, we have:

    n(A)/n → P(A) as n → ∞.  (2)

    This statement is called the law of large numbers.

    If the coin is fair (symmetric), then the probability of heads equals the probability of tails and is ½.

    Let A and B be some events, for example, the occurrence or non-occurrence of an insured event. The union of two events is the event consisting of the occurrence of the event A, or the event B, or both events together. The intersection of two events A and B is the event consisting of the occurrence of both the event A and the event B.

    The basic rules of the calculus of event probabilities are as follows:

    1. The probability of any event lies between zero and one:

    0 ≤ P(A) ≤ 1.

    2. Let A and B be two events; then

    P(A ∪ B) = P(A) + P(B) − P(A ∩ B).  (3)

    It reads as follows: the probability of the union of two events equals the sum of the probabilities of these events minus the probability of their intersection. If the events are incompatible (non-overlapping), then the probability of the union (sum) of the two events equals the sum of their probabilities. This law is called the addition law of probabilities.

    We say that an event is certain if its probability equals 1. When analysing various phenomena, the question arises of how the occurrence of an event B affects the occurrence of an event A. For this, the conditional probability is introduced:

    P(A|B) = P(A ∩ B) / P(B).  (4)

    It reads as follows: the probability of A occurring, given that B has occurred, equals the probability of the intersection of A and B divided by the probability of the event B.
    Formula (4) assumes that the probability of the event B is greater than zero.

    Formula (4) can also be written as:

    P(A ∩ B) = P(A|B) P(B).  (5)

    This is the multiplication formula for probabilities.

    The conditional probability is also called the a posteriori probability of the event A, i.e. the probability of A after the occurrence of B.

    The unconditional probability is then called the a priori probability. There are several other important formulas that are used intensively in actuarial calculations.

    Total Probability Formula

    Suppose that an experiment is carried out, about the conditions of which one can make mutually exclusive assumptions (hypotheses) H 1, H 2, …, H n.

    We assume that either H 1, or H 2, …, or H n holds. The probabilities of these hypotheses are known and equal P(H 1), P(H 2), …, P(H n).

    Then the total probability formula holds:

    P(A) = P(A|H 1) P(H 1) + P(A|H 2) P(H 2) + … + P(A|H n) P(H n).  (6)

    The probability of the occurrence of the event A equals the sum, over all hypotheses, of the probability of A occurring under a given hypothesis multiplied by the probability of that hypothesis.

    Bayes formula

    Bayes' formula allows the probabilities of the hypotheses to be recalculated in the light of the new information provided by the occurrence of the event A:

    P(H i|A) = P(A|H i) P(H i) / P(A),  i = 1, …, n,

    where P(A) is given by the total probability formula (6).

    In a certain sense, Bayes' formula is the inverse of the total probability formula.

    Consider the following practical problem.

    Problem 1

    Suppose there has been a plane crash and experts are investigating its causes. Four possible causes of the disaster are known in advance: H 1, H 2, H 3 or H 4. According to the available statistics, these causes have the following probabilities:



    When examining the crash site, traces of fuel ignition were found (event A); according to the statistics, the probability of this event under each of the causes is as follows:




    Question: what is the most likely cause of the disaster?

    Let us calculate the probabilities of the causes given that the event A has occurred:



    From this it can be seen that the most likely cause is the first one, since its probability is the largest.

    Problem 2

    Consider an airplane landing at an airfield.

    When landing, the weather conditions may be as follows: no low clouds (H 1) or low clouds (H 2). In the first case the probability of a safe landing is P1; in the second case it is P2. Clearly P1 > P2.

    The devices that provide a blind landing have a probability of failure-free operation P. If there is low cloud cover and the blind-landing devices have failed, the probability of a successful landing is P3, with P3 < P2. It is known that for the given airfield the proportion of days in a year with low clouds equals c.

    Find the probability of the plane landing safely.

    We need to find the probability of a safe landing, P(A).

    In the low-cloud case there are two mutually exclusive possibilities: the blind-landing devices are working, or the blind-landing devices have failed, so we have:

    P(A | low clouds) = P · P2 + (1 − P) · P3.

    Hence, according to the total probability formula:

    P(A) = (1 − c) · P1 + c · (P · P2 + (1 − P) · P3).
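    A Python sketch of this total probability computation; the numeric values for P1, P2, P3, the device reliability P and the share of cloudy days are made up purely for illustration, since the problem keeps them symbolic:

```python
# All numbers below are made-up illustrative inputs, not data from the problem.
P1 = 0.999       # safe landing with no low clouds
P2 = 0.95        # safe landing with low clouds, blind-landing devices working
P3 = 0.80        # safe landing with low clouds, blind-landing devices failed
P_dev = 0.99     # probability that the blind-landing devices work
cloudy = 0.3     # share of days in the year with low clouds

# Total probability over the hypotheses "no low clouds" / "low clouds";
# in the cloudy case we again condition on whether the devices work.
p_safe = (1 - cloudy) * P1 + cloudy * (P_dev * P2 + (1 - P_dev) * P3)
print(p_safe)    # ~0.984 for these made-up inputs
```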

    Problem 3

    An insurance company provides life insurance. 10% of those insured by this company are smokers. If the insured person does not smoke, the probability of his death during the year is 0.01. If he is a smoker, then this probability is 0.05.

    What is the proportion of smokers among those insured who died during the year?

    Possible answers: (A) 5%, (B) 20%, (C) 36%, (D) 56%, (E) 90%.

    Solution

    Let us introduce the events: S = (the insured person is a smoker), N = (the insured person is a non-smoker), D = (the insured person dies during the year).

    The conditions of the problem mean that P(S) = 0.1, P(D|S) = 0.05 and P(D|N) = 0.01.

    In addition, since the events S and N form a complete group of pairwise incompatible events, P(N) = 1 − 0.1 = 0.9.
    The probability we are interested in is P(S|D).

    Using Bayes' formula, we have:

    P(S|D) = P(D|S) P(S) / (P(D|S) P(S) + P(D|N) P(N)) = 0.05 · 0.1 / (0.05 · 0.1 + 0.01 · 0.9) = 0.005 / 0.014 ≈ 0.36,

    therefore the correct option is (C).

    Problem 4

    The insurance company sells life insurance contracts in three categories: standard, preferred and ultra-privileged.

    50% of all insured are standard, 40% are preferred and 10% are ultra-privileged.

    The probability of death within a year for a standard insured is 0.010, for a privileged one - 0.005, and for an ultra-privileged one - 0.001.

    What is the probability that the deceased insured is ultra-privileged?

    Solution

    Let us introduce the following events: A = (the insured is standard), B = (the insured is preferred), C = (the insured is ultra-privileged), D = (the insured dies within the year).

    In terms of these events, the probability we are interested in is P(C|D). By the conditions of the problem, P(A) = 0.50, P(B) = 0.40, P(C) = 0.10 and P(D|A) = 0.010, P(D|B) = 0.005, P(D|C) = 0.001.

    Since the events A, B, C form a complete group of pairwise incompatible events, using Bayes' formula we have:

    P(C|D) = P(D|C) P(C) / (P(D|A) P(A) + P(D|B) P(B) + P(D|C) P(C)) = 0.001 · 0.1 / (0.010 · 0.5 + 0.005 · 0.4 + 0.001 · 0.1) = 0.0001 / 0.0071 ≈ 0.014.
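    Both Problem 3 and Problem 4 can be checked with a small Bayes helper; the Python sketch below (the function name posteriors is ours) recomputes the posterior probabilities of the hypotheses:

```python
def posteriors(priors, likelihoods):
    """Bayes' formula: posterior probability of each hypothesis given the observed event."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)                     # total probability of the event
    return [j / total for j in joint]

# Problem 3: hypotheses (smoker, non-smoker), event = death during the year.
print(posteriors([0.1, 0.9], [0.05, 0.01]))                # [~0.357, ~0.643] -> about 36% smokers

# Problem 4: hypotheses (standard, preferred, ultra-privileged), event = death during the year.
print(posteriors([0.5, 0.4, 0.1], [0.010, 0.005, 0.001]))  # ultra-privileged posterior ~0.014
```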

    Random variables and their characteristics

    Let ξ be some random variable, for example, the damage from a fire or the amount of insurance payments.
    A random variable is completely characterised by its distribution function.

    Definition. The function F(x) = P(ξ < x) is called the distribution function of the random variable ξ.

    Definition. If there exists a function f such that for an arbitrary a

    F(a) = ∫_{−∞}^{a} f(x) dx,

    then the random variable ξ is said to have the probability density function f(x).

    Definition. Let 0 < α < 1. For a continuous distribution function F, a theoretical α-quantile is a solution x_α of the equation F(x_α) = α.

    This solution may not be unique.

    The quantile of level ½ is called the theoretical median, and the quantiles of levels ¼ and ¾ are called the lower and upper quartiles, respectively.

    In actuarial applications an important role is played by Chebyshev's inequality:

    P(|ξ| ≥ t) ≤ E|ξ| / t  for any t > 0.

    E is the symbol of mathematical expectation.

    It reads as follows: the probability that |ξ| is greater than or equal to t does not exceed the mathematical expectation of |ξ| divided by t.
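    A small Python sketch illustrating empirical quantiles and the inequality above on simulated data; the exponential claim-size model and all numbers are assumptions made purely for illustration:

```python
import random

random.seed(0)
# Simulate claim sizes from an exponential distribution (an assumed model, mean ~100).
xs = [random.expovariate(1 / 100) for _ in range(100_000)]

xs.sort()
median = xs[len(xs) // 2]                        # empirical quantile of level 1/2
q1, q3 = xs[len(xs) // 4], xs[3 * len(xs) // 4]  # lower and upper quartiles
print(median, q1, q3)

# Check the inequality P(|xi| >= t) <= E|xi| / t on the sample for t = 500.
t = 500
lhs = sum(x >= t for x in xs) / len(xs)
rhs = sum(xs) / len(xs) / t
print(lhs, rhs, lhs <= rhs)                      # the bound holds (and is quite loose here)
```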

    Lifetime as a random variable

    The uncertainty of the moment of death is a major risk factor in life insurance.

    Nothing definite can be said about the moment of death of an individual. However, if we are dealing with a large homogeneous group of people and are not interested in the fate of individual people from this group, then we are within the framework of probability theory as the science of mass random phenomena that have the property of frequency stability.

    Accordingly, we can speak of lifetime as a random variable T.

    Survival function

    Probability theory describes the stochastic nature of any random variable T by its distribution function F(x), which is defined as the probability that the random variable T is less than the number x:

    F(x) = P(T < x).

    In actuarial mathematics it is customary to work not with the distribution function but with the complementary distribution function s(x) = 1 − F(x) = P(T ≥ x). In terms of longevity, this is the probability that a person will live to at least age x years; it is called the survival function:

    s(x) = P(T ≥ x).

    The survival function has the following properties: s(0) = 1, s(x) does not increase, and s(x) → 0 as x → ∞.

    In life tables it is usually assumed that there is some age limit (the limiting age) ω and, accordingly, s(x) = 0 for x > ω.

    When describing mortality by analytical laws, it is usually assumed that life time is unlimited, but the type and parameters of the laws are selected so that the probability of life beyond a certain age is negligible.

    The survival function has a simple statistical meaning.

    Suppose that we observe a group of newborns of initial size l 0, for whom we can record the moments of death.

    Let us denote the number of living representatives of this group at age x by L(x). Then:

    s(x) = E L(x) / l 0.

    The symbol E here and below is used to denote mathematical expectation.

    So the survival function equals the average proportion of those who survive to age x out of some fixed group of newborns.

    In actuarial mathematics one often works not with the survival function itself but with the quantity just introduced, E L(x) = l 0 · s(x) (with the initial group size fixed).

    The survival function can be reconstructed from the density:

    s(x) = ∫_x^∞ f(t) dt.

    Lifespan Characteristics

    From a practical point of view, the following characteristics are important:

    1. The mean lifetime

    E T = ∫_0^∞ x f(x) dx = ∫_0^∞ s(x) dx;

    2. The variance of the lifetime

    Var T = E(T − E T)^2 = E T^2 − (E T)^2,

    where

    E T^2 = ∫_0^∞ x^2 f(x) dx.
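    As an illustration only, the Python sketch below evaluates these two characteristics by numerical integration under an assumed exponential survival model s(x) = exp(−x/70); the model and its parameter are not from the text:

```python
import math

def s(x):                     # assumed survival function (illustrative model only)
    return math.exp(-x / 70.0)

def f(x):                     # corresponding density f(x) = -s'(x)
    return s(x) / 70.0

# Numerical integration on a grid up to a practical upper limit.
dx, limit = 0.01, 1000.0
grid = [i * dx for i in range(int(limit / dx))]
mean = sum(x * f(x) for x in grid) * dx           # E[T] = integral of x f(x) dx
second = sum(x * x * f(x) for x in grid) * dx     # E[T^2]
var = second - mean ** 2                          # Var T = E[T^2] - (E[T])^2
print(mean, var)              # for this model: E[T] = 70, Var T = 70^2 = 4900
```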

    Let's not think about the lofty things for a long time - let's start right away with the definition.

    The Bernoulli scheme is a setup in which n identical independent experiments are performed, in each of which the event of interest A may occur, and the probability of this event P(A) = p is known. We need to determine the probability that in n trials the event A occurs exactly k times.

    The problems that can be solved using the Bernoulli scheme are extremely varied: from simple ones (such as "find the probability that the shooter hits 1 time out of 10") to quite hard ones (for example, problems with percentages or playing cards). In practice, this scheme is often used to solve problems related to quality control of products and the reliability of various mechanisms, all the characteristics of which must be known before work begins.

    Let's return to the definition. Since we are talking about independent trials, and in each trial the probability of event A is the same, only two outcomes are possible:

    1. A is the occurrence of event A with probability p;
    2. “not A” - event A did not appear, which happens with probability q = 1 − p.

    The most important condition, without which Bernoulli’s scheme loses its meaning, is constancy. No matter how many experiments we conduct, we are interested in the same event A, which occurs with the same probability p.

    By the way, not all problems in probability theory reduce to constant conditions. Any competent higher mathematics tutor will tell you this. Even something as simple as taking coloured balls out of a box is not an experiment with constant conditions: after another ball is taken out, the ratio of colours in the box has changed, and consequently the probabilities have changed too.

    If the conditions are constant, we can accurately determine the probability that event A will occur exactly k times out of n possible. Let us formulate this fact in the form of a theorem:

    Bernoulli's theorem. Let the probability of occurrence of the event A in each experiment be constant and equal to p. Then the probability that the event A appears exactly k times in n independent trials is calculated by the formula:

    P_n(k) = C_n^k · p^k · q^(n−k),

    where C_n^k is the number of combinations and q = 1 − p.

    This formula is called Bernoulli's formula. It is interesting to note that the problems given below can be completely solved without using this formula. For example, you can apply the formulas for adding probabilities. However, the amount of computation will be simply unrealistic.

    Task. The probability of producing a defective product on a machine is 0.2. Determine the probability that in a batch of ten parts produced on this machine exactly k parts will be without defects. Solve the problem for k = 0, 1, 10.

    According to the condition, we are interested in the event A of releasing products without defects, which happens each time with probability p = 1 − 0.2 = 0.8. We need to determine the probability that this event will occur k times. Event A is contrasted with the event “not A”, i.e. release of a defective product.

    Thus, we have: n = 10; p = 0.8; q = 0.2.

    So we find the probability that all the parts in the batch are defective (k = 0), that exactly one part is without defects (k = 1), and that there are no defective parts at all (k = 10):

    P_10(0) = C_10^0 · 0.8^0 · 0.2^10 = 0.2^10 ≈ 0.0000001,

    P_10(1) = C_10^1 · 0.8^1 · 0.2^9 = 10 · 0.8 · 0.2^9 ≈ 0.000004,

    P_10(10) = C_10^10 · 0.8^10 · 0.2^0 = 0.8^10 ≈ 0.107.
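    These three values can be verified with a few lines of Python; the helper name bernoulli_formula is ours:

```python
from math import comb

def bernoulli_formula(n: int, k: int, p: float) -> float:
    """P_n(k) = C(n, k) * p^k * q^(n - k), with q = 1 - p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in (0, 1, 10):                    # n = 10 parts, p = 0.8 (a part has no defects)
    print(k, bernoulli_formula(10, k, 0.8))
# 0 -> ~1.0e-07, 1 -> ~4.1e-06, 10 -> ~0.107
```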

    Task. A coin is tossed 6 times. Heads and tails are equally likely. Find the probability that:

    1. heads appears three times;
    2. heads appears once;
    3. heads appears at least twice.

    So we are interested in the event A that heads comes up. The probability of this event is p = 0.5. Event A is contrasted with the event "not A", when tails comes up, which happens with probability q = 1 − 0.5 = 0.5. We need to determine the probability that heads appears k times.

    Thus, we have: n = 6; p = 0.5; q = 0.5.

    Let us determine the probability that heads comes up three times, i.e. k = 3:

    P_6(3) = C_6^3 · 0.5^3 · 0.5^3 = 20/64 ≈ 0.313.

    Now let us determine the probability that heads comes up only once, i.e. k = 1:

    P_6(1) = C_6^1 · 0.5^1 · 0.5^5 = 6/64 ≈ 0.094.

    It remains to determine with what probability heads appears at least twice. The main catch is in the phrase "at least." It turns out that any k except 0 and 1 suits us, i.e. we need to find the value of the sum X = P_6(2) + P_6(3) + ... + P_6(6).

    Note that this sum is also equal to 1 − P_6(0) − P_6(1), i.e. from all possible options it is enough to "cut out" those when heads came up once (k = 1) or did not come up at all (k = 0). Since we already know P_6(1), it remains to find P_6(0):

    P_6(0) = C_6^0 · 0.5^0 · 0.5^6 = 1/64 ≈ 0.016,

    so X = 1 − 1/64 − 6/64 = 57/64 ≈ 0.891.

    Task. The probability that the TV has hidden defects is 0.2. 20 TVs arrived at the warehouse. Which event is more likely: that in this batch there are two TV sets with hidden defects or three?

    Event of interest A is the presence of a latent defect. There are n = 20 TVs in total, the probability of a hidden defect is p = 0.2. Accordingly, the probability of receiving a TV without a hidden defect is q = 1 − 0.2 = 0.8.

    We obtain the starting conditions for the Bernoulli scheme: n = 20; p = 0.2; q = 0.8.

    Let’s find the probability of getting two “defective” TVs (k = 2) and three (k = 3):

    \[
    \begin{array}{l}
    P_{20}(2) = C_{20}^{2}\,p^{2}q^{18} = \dfrac{20!}{2!\,18!} \cdot 0.2^{2} \cdot 0.8^{18} \approx 0.137,\\[4pt]
    P_{20}(3) = C_{20}^{3}\,p^{3}q^{17} = \dfrac{20!}{3!\,17!} \cdot 0.2^{3} \cdot 0.8^{17} \approx 0.205.
    \end{array}
    \]

    Clearly P_20(3) > P_20(2), i.e. the probability of receiving three television sets with hidden defects is greater than the probability of receiving only two such sets. Moreover, the difference is noticeable.

    A quick note about factorials. Many people experience a vague feeling of discomfort when they see the entry “0!” (read “zero factorial”). So, 0! = 1 by definition.

    P. S. And the biggest probability in the last task is to get four TVs with hidden defects. Calculate for yourself and see for yourself.
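    A short Python check of this task, which also confirms the P. S.: for n = 20 and p = 0.2 the most probable number of defective sets is indeed k = 4 (illustrative sketch):

```python
from math import comb

def P(n, k, p):
    """Bernoulli formula P_n(k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

probs = {k: P(20, k, 0.2) for k in range(21)}
print(probs[2], probs[3])            # ~0.137 and ~0.205
print(max(probs, key=probs.get))     # k = 4 is the most probable count
print(probs[4])                      # ~0.218
```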

    So, let's talk about a topic that interests a lot of people. In this article I will answer the question of how to calculate the probability of an event. I will give formulas for such a calculation and several examples to make it clearer how this is done.

    What is probability

    Let's start with the fact that the probability that a particular event will occur is a measure of confidence in the eventual occurrence of some outcome. For such calculations the classical probability formula is used, which allows you to determine how likely the event you are interested in is: P = n/m, where n is the number of favourable outcomes and m is the total number of equally possible outcomes. The letters may differ from source to source, but this does not affect the essence.

    Examples of probability

    Using a simple example, let's analyse this formula and apply it. Suppose the experiment is a throw of a fair die, and we need to calculate the probability of rolling 2 points. There is exactly one favourable outcome, the face with 2 points, so n = 1. The total number of possible outcomes of one throw is six (1, 2, 3, 4, 5 or 6 points), so m = 6 equally possible cases. Using the formula, we make a simple calculation, P = 1/6, and we find that the probability of rolling 2 points is 1/6, i.e. rather low.

    Let's also look at an example with coloured balls in a box: 50 white, 40 black and 30 green. You need to determine the probability of drawing a green ball. Since there are 30 balls of this colour, there are n = 30 favourable outcomes, while the total number of outcomes is m = 120 (the total number of balls). Using the formula, we calculate that the probability of drawing a green ball is P = 30/120 = 0.25, i.e. 25%. In the same way you can calculate the probability of drawing a ball of a different colour (for black it is about 33%, for white about 42%).
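    The same classical-probability arithmetic in a short Python sketch (the dictionary of ball counts is just a convenient representation):

```python
# Classical probability P = n / m: favourable outcomes over all equally possible outcomes.
balls = {"white": 50, "black": 40, "green": 30}
m = sum(balls.values())                 # total number of balls, m = 120

for colour, n in balls.items():
    print(colour, n / m)                # white ~0.417, black ~0.333, green 0.25
```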

    The need to operate with probabilities arises when the probabilities of some events are known and it is necessary to calculate the probabilities of other events associated with them.

    Addition of probabilities is used when you need to calculate the probability of a combination or logical sum of random events.

    The sum of the events A and B is denoted A + B or A ∪ B. The sum of two events is an event that occurs if and only if at least one of the events occurs. This means that A + B is an event that occurs if and only if, during the observation, the event A occurred, or the event B occurred, or A and B occurred simultaneously.

    If the events A and B are mutually incompatible and their probabilities are given, then the probability that one of these events occurs as a result of a single trial is calculated using the addition of probabilities.

    Probability addition theorem. The probability that one of two mutually incompatible events occurs equals the sum of the probabilities of these events:

    P(A + B) = P(A) + P(B).

    For example, two shots are fired while hunting. Event A is hitting a duck with the first shot, event B is a hit with the second shot, and event (A + B) is a hit with the first or the second shot, or with both shots. So, if two events A and B are incompatible, then A + B is the occurrence of at least one of these events.

    Example 1. There are 30 balls of the same size in a box: 10 red, 5 blue and 15 white. Calculate the probability that a colored (not white) ball will be picked up without looking.

    Solution. Let the event A be "a red ball is drawn" and the event B be "a blue ball is drawn." Then the event A + B is "a coloured (not white) ball is drawn." Let us find the probability of the event A:

    P(A) = 10/30 = 1/3,

    and of the event B:

    P(B) = 5/30 = 1/6.

    The events A and B are mutually incompatible, since if one ball is drawn it cannot be of two different colours. Therefore, we use the addition of probabilities:

    P(A + B) = P(A) + P(B) = 1/3 + 1/6 = 1/2 = 0.5.

    The addition theorem for several incompatible events. If the events A 1, A 2, …, A n constitute a complete set of events, then the sum of their probabilities equals 1:

    P(A 1) + P(A 2) + … + P(A n) = 1.

    The sum of the probabilities of opposite events is also equal to 1:

    P(A) + P(Ā) = 1.

    Opposite events form a complete set of events, and the probability of a complete set of events is 1.

    The probabilities of opposite events are usually denoted by the small letters p and q. In particular,

    p + q = 1,

    from which the following formulas for the probabilities of opposite events follow:

    p = 1 − q,  q = 1 − p.

    Example 2. The target in a shooting range is divided into 3 zones. The probability that a certain shooter hits the first zone is 0.15, the second zone 0.23, and the third zone 0.17. Find the probability that the shooter hits the target and the probability that the shooter misses the target.

    Solution. Let us find the probability that the shooter hits the target:

    P = 0.15 + 0.23 + 0.17 = 0.55.

    Let us find the probability that the shooter misses the target:

    P = 1 − 0.55 = 0.45.

    For more complex problems, in which you need to use both addition and multiplication of probabilities, see the page "Various problems involving addition and multiplication of probabilities".

    Addition of probabilities of mutually compatible (joint) events

    Two random events are called joint (compatible) if the occurrence of one does not exclude the occurrence of the other in the same observation. For example, when a die is thrown, let the event A be the rolling of the number 4 and the event B the rolling of an even number. Since 4 is an even number, the two events are compatible. In practice there are problems of calculating the probability of the occurrence of one of two mutually compatible events.

    Probability addition theorem for joint events. The probability that one of two joint events occurs equals the sum of the probabilities of these events minus the probability of their joint occurrence, that is, minus the probability of their product. The formula for the probabilities of joint events has the following form:

    P(A + B) = P(A) + P(B) − P(AB).

    Since the events A and B are compatible, the event A + B occurs if one of three incompatible events occurs: AB̄, ĀB or AB. According to the addition theorem for incompatible events,

    P(A + B) = P(AB̄) + P(ĀB) + P(AB).  (5)

    The event A occurs if one of two incompatible events occurs: AB̄ or AB. However, the probability of the occurrence of one event out of several incompatible events equals the sum of the probabilities of all these events:

    P(A) = P(AB̄) + P(AB), whence P(AB̄) = P(A) − P(AB).  (6)

    Likewise:

    P(B) = P(ĀB) + P(AB), whence P(ĀB) = P(B) − P(AB).  (7)

    Substituting expressions (6) and (7) into expression (5), we obtain the probability formula for joint events:

    P(A + B) = P(A) + P(B) − P(AB).  (8)

    When using formula (8), it should be taken into account that the events A and B can be:

    • mutually independent;
    • mutually dependent.

    The probability formula for mutually independent events:

    P(A + B) = P(A) + P(B) − P(A) P(B).

    The probability formula for mutually dependent events:

    P(A + B) = P(A) + P(B) − P(A) P(B|A).

    If the events A and B are incompatible, then their joint occurrence is impossible and thus P(AB) = 0; the probability formula for incompatible events is then:

    P(A + B) = P(A) + P(B).
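    A quick Python sanity check of the addition rule for joint events, using the die example from this section (A = "a 4 is rolled", B = "an even number is rolled"):

```python
omega = set(range(1, 7))          # equally likely outcomes of one die throw
A = {4}                           # "a 4 is rolled"
B = {2, 4, 6}                     # "an even number is rolled"

pA = len(A) / len(omega)
pB = len(B) / len(omega)
pAB = len(A & B) / len(omega)     # joint occurrence; here A is a subset of B
print(pA + pB - pAB)              # addition rule: 1/6 + 3/6 - 1/6 = 0.5
print(len(A | B) / len(omega))    # direct count of the union gives the same 0.5
```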

    Example 3. In auto racing, driving the first car gives one known chance of winning and driving the second car a different known chance. Find:

    • the probability that both cars will win;
    • the probability that at least one car will win;

    1) The probability that the first car wins does not depend on the result of the second car, so the events A (the first car wins) and B (the second car wins) are independent. Let us find the probability that both cars win: P(AB) = P(A) P(B).

    2) Find the probability that at least one of the two cars wins: P(A + B) = P(A) + P(B) − P(A) P(B).

    For more complex problems, in which you need to use both addition and multiplication of probabilities, see the page "Various problems involving addition and multiplication of probabilities".

    Solve the addition of probabilities problem yourself, and then look at the solution

    Example 4. Two coins are tossed. Event A is heads on the first coin; event B is heads on the second coin. Find the probability of the event C = A + B.

    Multiplying Probabilities

    Probability multiplication is used when the probability of a logical product of events must be calculated.

    In this case, random events must be independent. Two events are said to be mutually independent if the occurrence of one event does not affect the probability of the occurrence of the second event.

    Probability multiplication theorem for independent events. The probability of the simultaneous occurrence of two independent events A and B equals the product of the probabilities of these events and is calculated by the formula:

    P(AB) = P(A) P(B).

    Example 5. A coin is tossed three times in a row. Find the probability that heads comes up all three times.

    Solution. The probability that heads comes up on the first toss of the coin is ½, as it is on the second and on the third toss. Let us find the probability that heads comes up all three times:

    P = ½ · ½ · ½ = ⅛ = 0.125.

    Solve probability multiplication problems on your own and then look at the solution

    Example 6. There is a box of nine new tennis balls. To play, three balls are taken, and after the game they are put back. When choosing balls, played balls are not distinguished from unplayed balls. What is the probability that after three games there will be no unplayed balls left in the box?

    Example 7. The 32 letters of the Russian alphabet are written on cut-out alphabet cards. Five cards are drawn at random one after another and placed on the table in order of appearance. Find the probability that the letters form the word «конец» ("end").

    Example 8. From a full deck of cards (52 sheets), four cards are taken out at once. Find the probability that all four of these cards will be of different suits.

    Example 9. The same task as in example 8, but each card after being removed is returned to the deck.

    More complex problems, in which you need to use both addition and multiplication of probabilities, as well as calculate the product of several events, can be found on the page "Various problems involving addition and multiplication of probabilities".

    The probability that at least one of several mutually independent events occurs can be calculated by subtracting from 1 the product of the probabilities of the opposite events, that is, by the formula:

    P(A 1 + A 2 + … + A n) = 1 − q 1 q 2 … q n,

    where q i = 1 − P(A i).

    Example 10. Cargo is delivered by three modes of transport: river, rail and road transport. The probability that the cargo will be delivered by river transport is 0.82, by rail 0.87, by road transport 0.90. Find the probability that the cargo will be delivered by at least one of the three modes of transport.
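    One way to evaluate this formula for Example 10 is sketched below in Python (the variable names are ours):

```python
# P(at least one) = 1 - q1 * q2 * q3, where q_i are the probabilities of the opposite events.
p_river, p_rail, p_road = 0.82, 0.87, 0.90
p_at_least_one = 1 - (1 - p_river) * (1 - p_rail) * (1 - p_road)
print(p_at_least_one)             # 1 - 0.18 * 0.13 * 0.10 = 0.99766
```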
