Lesson in computer science on the topic “Measuring information. Content approach” (grade 10)

Lesson #4

Subject:

Measuring information. Content approach

Lesson type:

lesson introducing new material

Goals:

  • Master the content approach to measuring information;
  • Explain which events are called equally probable;
  • Teach students to find the probability of an event and the amount of information in a message that one of several equally probable events has occurred;
  • Form a general idea of the modern scientific picture of the world;
  • Develop the communicative qualities of a developing personality.

Equipment:

  • PC;
  • Interactive board;
  • MS PowerPoint

Course of the lesson:

I. Organizational moment (2 min.)

Greeting. Announcement of the new topic.

II. Reviewing prior knowledge (3 min.)

Checking homework.

III. Theoretical part (25 min.)

From the name of the approach to measuring information, we can conclude that the amount of information depends on its content.

Ex. 1.

Let's evaluate the amount of information in the following messages in terms of “a lot” or “a little”:

  1. The capital of Russia is Moscow
  2. The sum of the squares of the legs is equal to the square of the hypotenuse
  3. Diffraction of light is a set of phenomena that are caused by the wave nature of light and are observed during its propagation in a medium with pronounced optical inhomogeneity
  4. The Eiffel Tower is 300 meters high and weighs 9,000 tons.

Students should be asked to clarify their answers by asking leading questions about whether the message contains new and understandable information.


A message conveys more information if it contains new and understandable information. Such a message is called informative.

It is necessary to distinguish between the concepts information and informativeness.

— Does the biology textbook for grade 10 contain information? (Yes)

- For whom will it be informative - for a 10th or 1st grade student? (For a 10th grade student it will be informative, because it contains new and understandable information, but for a 1st grade student it will not be informative, because the information is not clear to him)

Conclusion: the amount of information depends on the information content.

The amount of information is zero if it is uninformative from the point of view of a particular person. The amount of information in an informative message is greater than zero.

However, informativeness by itself does not determine the amount of information precisely. Based on informativeness alone, you can only judge whether a message carries a lot of information or a little.

Let's consider the concept of information content from the other side. If some message is informative, then it adds to knowledge or reduces the uncertainty of our knowledge. In other words, a message contains information if it leads to a reduction in the uncertainty of our knowledge.

Let's look at an example.

We toss a coin and try to guess which side up it will land. One of two results is possible: the coin ends up in the “heads” position or in the “tails” position. Each of these two events is equally probable, i.e. neither has an advantage over the other.

Before tossing the coin, we do not know exactly how it will land. The event cannot be predicted, i.e. before the toss there is uncertainty in our knowledge (one event out of two is possible). After the toss comes complete certainty of knowledge, because we receive a visual message about the position of the coin. This visual message reduces the uncertainty of our knowledge by half, because one of two equally probable events has occurred.

If we throw a six-sided die, then we also do not know before the throw which face it will land on. In this case, one result out of six equally probable ones is possible. The uncertainty of knowledge equals six, because exactly six equally probable events can occur. When, after throwing the die, we receive a visual message about the result, the uncertainty of our knowledge is reduced by a factor of six.

Ex. 2.

One more example. 30 tickets have been prepared for the exam.

  • What is the number of events that can happen when a ticket is drawn? (30)
  • Are these events equally likely or not? (Equally likely)
  • What is the uncertainty of the student's knowledge before he draws the ticket? (30)
  • How many times will the uncertainty of knowledge decrease after the student has drawn the ticket? (30 times)
  • Does this indicator depend on the number of the drawn ticket? (No, because the events are equally probable)

From all the examples given, we can draw the following conclusion: the greater the initial number of possible equally probable events, the greater the number of times the uncertainty of our knowledge decreases, and the greater the amount of information the message about the results of the experiment will contain.

In order for the amount of information to be positive, we must receive a message reporting that one of at least two equally probable events has occurred. The amount of information contained in the message that one of two equally probable events has occurred is taken as the unit of information and equals 1 bit.

And one more definition of bit:

1 bit is the amount of information that reduces the uncertainty of knowledge by half.

And now a task: on an exam, a student can receive one of four grades: 5 (“excellent”), 4 (“good”), 3 (“satisfactory”), 2 (“unsatisfactory”). Imagine that your friend went to take an exam. He studies very unevenly and can receive any grade from “2” to “5” with equal probability. You are worried about him and waiting for the exam result. Finally he comes, and to your question “Well, what did you get?” he answers: “Four!”

Question: How many bits of information are there in his answer?

If it is difficult to answer this question right away, let's approach the answer gradually. We will guess the grade by asking questions that can only be answered “yes” or “no”.

We will pose questions so that each answer reduces the number of possible results by half and, therefore, brings 1 bit of information.

First question:

— Is the grade higher than a “three”? - Yes.

After this answer, the number of options decreased by half. Only "4" and "5" remained. 1 bit of information received.

Second question:

— Did you get a “five”?

- No.

One option out of the remaining two has been selected: the grade is “four”. 1 more bit of information received. In total we have 2 bits.

A message about one of four equally probable results of some event carries 2 bits of information.

Let's look at one more particular problem, and then we'll get a general rule.

The bookcase has eight shelves. The book can be placed on any of them. How much information does the message contain about where the book is?

We will proceed in the same way as in the previous task. A search method in which half of the options are discarded at each step is called the bisection method.

Let's apply the bisection method to the shelving problem. Asking questions:

  • Is the book above the fourth shelf? - Yes.
  • Is the book above the sixth shelf? - No.
  • Is the book on the sixth shelf? - No.
  • Well, now everything is clear! The book is on the fifth shelf!

Each answer reduced the uncertainty by half. A total of three questions were asked, which means that 3 bits of information have been obtained. And if we had simply been told right away that the book is on the fifth shelf, the same 3 bits of information would have been conveyed by that message.

Note that searching for a value using the halving method is the most rational. In this way, you can always guess any of the eight options in three questions. If, for example, the search was carried out sequentially: “Is the book on the first shelf?” - "No". - “On the second shelf?” - “No,” etc., then we would have learned about the fifth shelf after five questions, and about the eighth - after eight.
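
If a computer is at hand during the lesson, the two search strategies can also be compared with a short program. This is only a sketch; the function names (questions_by_halving, questions_sequentially) are illustrative and not part of the textbook:

    # A sketch of the halving (bisection) search among n equally probable shelves,
    # counting how many yes/no questions are needed to locate the book.
    def questions_by_halving(target, n):
        low, high = 1, n
        questions = 0
        while low < high:
            mid = (low + high) // 2
            questions += 1              # ask: "Is the book above shelf `mid`?"
            if target > mid:            # "yes" - keep the upper half
                low = mid + 1
            else:                       # "no" - keep the lower half
                high = mid
        return questions

    # Sequential search: "Is it on shelf 1?", "Is it on shelf 2?", and so on.
    def questions_sequentially(target):
        return target

    print(questions_by_halving(5, 8))   # 3 questions, i.e. 3 bits
    print(questions_sequentially(5))    # 5 questions
    print(questions_sequentially(8))    # 8 questions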

Now let's try to get a formula that calculates the amount of information contained in a message that one of many equally probable results of some event took place.

Let us denote by the letter N the number of possible results of an event or, as we also called it, the uncertainty of knowledge. The letter i will denote the amount of information in the message about one of the N results.

In the coin example: N = 2, i = 1 bit.
In the example with grades: N = 4, i = 2 bits.
In the example with the bookcase: N = 8, i = 3 bits.

It is easy to see that the relationship between these quantities is expressed by the following formula:

2^i = N

Indeed: 2^1 = 2; 2^2 = 4; 2^3 = 8.
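
For a classroom with a computer, this relationship can also be checked with a few lines of Python (a minimal sketch; only the standard math module is assumed):

    # Check the main formula 2^i = N for the three examples above.
    import math

    for n in (2, 4, 8):
        i = math.log2(n)             # amount of information in bits
        print(n, i, 2 ** i == n)     # 2 1.0 True / 4 2.0 True / 8 3.0 True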

You are already familiar with this formula from the basic computer science course, and we will meet it more than once. Its significance is so great that we call it the main formula of computer science. If the value of N is known and i is unknown, then this formula becomes an equation for determining i. In mathematics such an equation is called an exponential equation.

Suppose the bookcase has not 8 but 16 shelves. To answer the question of how much information is contained in the message about the location of the book, you need to solve the equation:

2^i = 16.

Since 16 = 2^4, then i = 4 bits.

If the value of N is equal to an integer power of two (4, 8, 16, 32, 64, etc.), then the exponential equation is easy to solve in your head, since i will be an integer. What, for example, is the amount of information in the message about the result of throwing a die, which has six sides and, therefore, N = 6? One can guess that the solution to the equation

2^i = 6

will be a fractional number lying between 2 and 3, since 2^2 = 4 < 6 and 2^3 = 8 > 6. How can you find out this number more precisely?

Your mathematical knowledge is not yet sufficient to solve this equation; you will learn how in the 11th grade mathematics course. For now, we will simply say that the solution of the equation for N = 6 is i = 2.58496 bits, accurate to five decimal places.
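
Those who want to check this value can do so on a computer: the base-2 logarithm solves the equation 2^i = N for i (a minimal sketch, assuming standard Python):

    import math

    i = math.log2(6)        # solves 2^i = 6
    print(round(i, 5))      # 2.58496
    print(2 ** i)           # approximately 6.0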

IV. Consolidation of knowledge (10 min.)

  1. “Are you getting off at the next stop?” - they asked the man on the bus. “No,” he replied. How much information does the answer contain? Solution:

    a person can only answer “Yes” or “No”, i.e. choose one answer from two possible ones. Therefore N = 2, which means i = 1 bit.

  2. How much information does a message contain that reduces the degree of knowledge uncertainty by 8 times? Solution:

    since the uncertainty of knowledge decreases by 8 times, it was equal to 8, i.e. there were 8 equally probable events. A message that one of them has occurred carries 3 bits of information (8 = 2^3)

  3. A group of schoolchildren came to the pool, which had 4 swimming lanes. The coach announced that the group would swim in lane number 3. How much information did the students receive from this message? Solution:

    out of 4 lanes you need to select one, i.e. N = 4. So according to the formula i = 2 bits, because 4 = 2^2

  4. There are 16 cubes in the box. All cubes are different colors. How much information is conveyed by the message that a red cube was taken from the box? Solution:

    out of 16 equally probable events you need to choose one. Therefore N = 16, and so i = 4 bits (16 = 2^4)

  5. When guessing an integer within a certain range, 8 bits of information were obtained. How many numbers does this range contain? Solution:

    N = 2^8 = 256
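
The answers above can be verified on a computer with the same formula, i = log2(N); the following sketch is offered only for the teacher's convenience and is not part of the assignment:

    import math

    print(math.log2(2))     # problem 1: 1.0 bit
    print(math.log2(8))     # problem 2: 3.0 bits
    print(math.log2(4))     # problem 3: 2.0 bits
    print(math.log2(16))    # problem 4: 4.0 bits
    print(2 ** 8)           # problem 5: 256 numbers in the range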

V. Lesson summary (2 min.)

Work in class is assessed and grades are announced.

VI. Homework (3 min.)

§4, answer the questions and complete the tasks at the end of the paragraph.
