Report on the topic: Processors and their characteristics (computer science, briefly)


Clock frequency

Speed is, of course, the first indicator we pay attention to. When we talk about processor speed, we usually mean its clock frequency. This value, measured in megahertz (MHz), shows how many clock cycles the processor performs per second. The clock frequency is often indicated by a number in the processor's name (for example, Pentium 4-2400, that is, a Pentium 4 processor with a clock frequency of 2400 MHz, or 2.4 GHz).
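To make the relationship between frequency and time more concrete, here is a minimal Python sketch (the helper name cycle_time_ns is just for illustration) that converts a clock frequency into the duration of one clock cycle, using the Pentium 4-2400 from the example above:

```python
def cycle_time_ns(frequency_mhz: float) -> float:
    """Duration of one clock cycle in nanoseconds for a given frequency in MHz."""
    frequency_hz = frequency_mhz * 1_000_000   # 1 MHz = 10^6 cycles per second
    return 1 / frequency_hz * 1_000_000_000    # seconds -> nanoseconds

# The Pentium 4-2400 from the example: 2400 MHz = 2.4 GHz
print(cycle_time_ns(2400))  # ~0.42 ns per clock cycle
```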

Clock frequency is undoubtedly the most important indicator of processor speed, but far from the only one. How else can we explain the strange fact that a Celeron, an Athlon and a Pentium 4 running at the same clock frequency work at different speeds?

This is where new factors come into play.

Bit capacity (word size)

Bit capacity is the maximum number of bits of information that can be processed and transmitted by the processor simultaneously.

Until recently, all processors were 32-bit; this bit depth was reached about ten years ago. For a long time the bit depth was not increased, because existing software was written for the old 32-bit platform, and since buyers look primarily at clock frequency, manufacturers simply saw no need for such a transition. AMD released the first 64-bit desktop x86 processor, the Athlon 64, in 2003.

Intel held out until 2005: all of its Pentium 4 processors remained 32-bit. Only when the new Pentium 4 6xx series models appeared on the market in the middle of that year did the first ones gain built-in support for 64-bit instructions.
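One way to feel the difference between 32 and 64 bits is to look at the largest unsigned integer that fits into one machine word; a small illustrative Python sketch (not tied to any particular processor):

```python
# Largest unsigned integer that fits into one machine word
for bits in (32, 64):
    print(f"{bits}-bit word: values from 0 to {2**bits - 1:,}")

# 32-bit word: values from 0 to 4,294,967,295
# 64-bit word: values from 0 to 18,446,744,073,709,551,615
```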

Core type and production technology

The core is the processor die itself, the part that actually is the "processor". In modern models the die is small; the finished processor is much larger because of its packaging and wiring. The die can be seen, for example, on Athlon processors, where it is left exposed; on the Pentium 4 the entire upper surface is hidden under a heat spreader (which also serves a protective function, since the die itself is not that strong).

Processors based on different cores are, in effect, different processors: they may differ in cache size, bus frequency, manufacturing technology and so on. In most cases, the newer the core, the better the processor. The Pentium 4 is an example: it has two cores, Willamette and Northwood. The first was produced on a 0.18 µm process and worked only with a 400 MHz bus; the lowest models ran at 1.3 GHz, and the maximum frequency for the core was slightly above 2.2 GHz. Northwood was released later: it was made on a 0.13 µm process, supported both 400 and 533 MHz buses, and had a larger cache. The move to the new core significantly increased performance and the maximum operating frequency. Junior Northwood models overclock well, but their overclocking potential essentially comes from the finer manufacturing process.

Differences between Pentium and Celeron, Athlon and Duron processors

A Celeron is a budget (stripped-down) version of the corresponding mainstream processor (more powerful, but also much more expensive) whose core it is built on. Celeron processors have two to four times less L2 cache, and their system bus frequency is lower than that of their "parents". Duron processors, compared with the Athlon, have four times less cache and a slower 200 MHz system bus (266 MHz for Applebred), although there are also "full-fledged" Athlons with a 200 MHz FSB.

In the near future, Durons based on the Morgan core will disappear from sale entirely: their production was curtailed quite a while ago. They are being replaced by Durons on the Applebred core, which are nothing more than Athlon XP Thoroughbreds with a trimmed cache. Cache-reduced Bartons have also appeared; that core is called Thorton. The main characteristics of the processors can be found in the table at the end of the abstract.

In some tasks there is almost no difference between a regular and a cut-down processor, while in others the lag is quite serious. On average, compared with an uncut processor of the same frequency, the lag is 10-30%. On the other hand, stripped-down processors tend to overclock better thanks to their smaller cache, and they are cheaper. In short, if the price difference between a normal and a stripped-down processor is significant, the stripped-down one is worth taking. It should be noted, though, that Celeron processors compare very poorly with full-fledged Pentium 4s: the lag in some situations reaches 50%. This does not apply to Celeron D processors, whose 256 KB of L2 cache (versus 128 KB in regular Celerons) makes the lag far less dramatic.
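For convenience, the cache and lag figures quoted in this paragraph can be collected in one place; a small Python sketch using only the numbers stated above:

```python
# L2 cache sizes mentioned in the text, in KB
l2_cache_kb = {"Celeron": 128, "Celeron D": 256}

# Typical lag behind the full ("uncut") processor at the same frequency
average_lag_range = (0.10, 0.30)     # 10-30% on average
celeron_vs_p4_worst_case = 0.50      # up to 50% in some situations

ratio = l2_cache_kb["Celeron D"] // l2_cache_kb["Celeron"]
print(f"Celeron D has {ratio}x the L2 cache of a regular Celeron")
```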

AMD processors

First of all, Athlon XP (and Athlon 64) processors carry a rating instead of a frequency: a 2000+ processor, for example, actually runs at 1667 MHz, but in terms of performance it corresponds to an Athlon (Thunderbird) at 2000 MHz.

Heat has recently been considered their main drawback, but the latest models (on the Thoroughbred, Barton and similar cores) are comparable in heat dissipation to the Pentium 4, and the most recent Intel models at the time of writing (the P4 Extreme Edition) sometimes run much hotter. In terms of reliability these processors are also no longer much inferior to the P4: although they cannot skip cycles (run "idle") when overheating, they have acquired a built-in thermal sensor (it appeared in the Palomino core, though very few motherboards can actually read it). The Athlon XP on the Barton core also gained a BusDisconnect-like function that "disconnects" the processor from the bus during idle cycles, but it is virtually powerless against overheating under heavy load, where all the "responsibility" falls on the motherboard's thermal control. Although the "strength" of the die (the maximum permissible mechanical pressure) has increased, the reduced core area means it has effectively stayed the same, so the probability of burning or damaging the die, while smaller, still exists. On the Athlon 64 the die is finally hidden under a heat spreader, so damaging it would be extremely difficult.

The "glitches" often attributed to AMD are usually the result of missing or incorrectly installed universal drivers for VIA chipsets (the VIA 4-in-1 Service Pack) or chipset drivers from other manufacturers (AMD, SiS, ALi).

Athlon XP and Pentium 4 processors perform very differently in different applications. In complex mathematical calculations (3D modeling, specialized mathematical packages), archiving and MPEG-4 encoding, the P4 often "beats" the Athlon XP, but a number of programs, mostly games, run better on the Athlon XP. For the average user (who plays games), games are the benchmark to focus on: re-encoding takes a long time in any case, whereas games need all their calculations done as quickly as possible. Athlon XP Barton processors with a 400 MHz bus and the fundamentally new hybrid (32-bit and 64-bit "in one bottle") K8 processors have already been released.
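The rating idea can be illustrated with a tiny calculation based solely on the data point above (2000+ corresponds to 1667 MHz); the resulting "per-clock advantage" is an illustration, not an official AMD formula:

```python
# Athlon XP 2000+ runs at 1667 MHz but performs like a 2000 MHz Thunderbird
rating = 2000          # model rating ("performance" expressed in Thunderbird MHz)
real_frequency = 1667  # actual clock frequency, MHz

per_clock_advantage = rating / real_frequency
print(f"Per-clock advantage over Thunderbird: about {per_clock_advantage:.2f}x")  # ~1.20x
```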

How good are VIA C3 processors?

Their only advantage is low heat generation: their power dissipation is 5-20 W versus 40-60 W (on average) for the Athlon XP and P4. The C3 is compatible with the (according to Intel) outdated Socket 370, although not with all boards; the new Nehemiah core, for example, requires Tualatin support on the board. In terms of speed they lag very far behind (up to 50%, sometimes even more) processors of similar frequency from Intel and AMD, and even improvements such as SSE support did not help them much. There are almost no such processors on sale, and I don't regret it at all :). If you need a quiet machine (such a processor often needs only a heatsink) and speed is not important, you can take one. In theory they should overclock quite well (the manufacturing process is quite advanced), but in practice they do not, which comes down to their small "safety margin" and an inefficient core design.

Brief report on the topic: Computer science as a science

What kind of science is computer science?

There have been three main turning points in the history of mankind: the transition from gathering and hunting to growing plants and domesticating animals, the industrial revolution with its automation of production, and the information revolution. The last of these is led by computer science. So what kind of science is it?

What is computer science?

The main task of this science is working with information by means of computer technology. Computer science studies how to store, distribute and process data, looks for new ways of "communicating" with computers, and develops tools for understanding various processes. Everyone knows computer science first-hand, because the Internet is its brainchild: the websites on it were created thanks to the programming languages it has developed. Every day this science solves more and more problems of all kinds, from helping launch spacecraft by performing thousands of calculations per minute to ensuring the simple operation of an ATM so that an ordinary person can withdraw their salary.

Sections of computer science

There are three main branches of computer science, differing in their tasks and in their approach to solving problems. The discipline is closely related to other fundamental sciences, such as physics and mathematics; the latter plays a major role in the understanding and development of computer science.

The theoretical component is mainly the theory of algorithms, which asks what can in principle be automated, how, and what resources it would require. It deals with more abstract mathematics, far from direct practical application; this branch also covers formal languages, automata theory, computability and so on. Applied computer science solves more concrete problems and tries to put the theorists' ideas into practice: researchers in this area develop artificial intelligence and the computers themselves that will process data, analyze their efficiency, build mathematical models, and so on. Natural computer science addresses different problems: it works not with artificial intelligence or theoretical algorithms, but with what already exists in nature, for example the human or rat brain, or the DNA of living organisms. Computer science is one of the most promising sciences for humanity, which is why more and more young people choose this field of study.

Hyper-Threading

This technology is designed to increase the efficiency of the processor. According to Intel's estimates, most of the time only about 30% of all execution units in the processor are busy, so the idea arose of somehow putting the remaining 70% to work (as you already know, the Pentium 4, which uses this technology, does not exactly suffer from an excess of performance per megahertz). The essence of Hyper-Threading is that while one program "thread" is executing, idle execution units can switch to executing another "thread"; in effect, one physical processor is divided into two virtual ones.

Situations are also possible where trying to run several "threads" at once noticeably reduces performance. For example, because the L2 cache is quite small, active threads will compete for it, and this fight may lead to the cache being constantly flushed and reloaded, so speed drops. A processor with Hyper-Threading support alone is not enough; the motherboard (chipset) must support it as well. It is also important to remember that operating systems currently lack proper support for this technology and, most importantly, that applications need to be recompiled, and in some cases have their algorithms changed, to take full advantage of Hyper-Threading. Tests confirm this: often there is no speed increase, and sometimes there is even a slight drop in performance, although there are already a number of applications in which optimization for HT gives a large speed-up. We will see what happens next.
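A quick way to see the "one physical, two virtual processors" idea is to ask the operating system how many logical processors it can schedule work on; a minimal Python sketch (the output naturally depends on the machine it runs on):

```python
import os

# With Hyper-Threading enabled, the operating system sees each physical core
# as two logical processors, so this number is doubled on an HT-capable CPU.
logical_cpus = os.cpu_count()
print(f"Logical processors visible to the OS: {logical_cpus}")

# On a single-core Pentium 4 with HT the OS would report 2 logical processors,
# even though there is only one set of execution units being shared.
```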

Description and purpose of processors

Definition 1
The central processing unit (CPU) is the main component of a computer: it performs the arithmetic and logical operations specified by a program, controls the computation process and coordinates the operation of all the PC's devices.

The more powerful the processor, the faster the PC runs.

Comment

The central processing unit is often called simply the processor or CPU (Central Processing Unit), and less often a crystal, a stone, or the host processor.

Modern processors are microprocessors.

The microprocessor takes the form of an integrated circuit: a thin rectangular plate of crystalline silicon, a few square millimeters in area, on which circuits with billions of transistors and the channels connecting them are laid out. The die is placed in a plastic or ceramic package and connected by gold wires to metal pins so that it can be attached to the PC motherboard.

Figure 1. Intel 4004 microprocessor (1971)


Figure 2. Intel Pentium IV microprocessor (2001). Left – top view, right – bottom view

The CPU is designed to automatically execute a program.

Recently, new processors of the K8 family appeared, and "in response" Intel released the P4 Extreme Edition (EE). What can be said about them?

The P4 EE is essentially a server version of the P4 (a Xeon on the Gallatin core "packed" into mPGA478); it has all the advantages of a regular P4 with an 800 MHz FSB, plus 2 MB of L3 cache. :) The Athlon 64 supports 32/64-bit computing and has 1 MB of L2 cache, SSE2 support, a built-in memory controller (initially single-channel, later dual-channel DDR400) and a real FSB frequency of 200 MHz. Note that the FSB frequency in Athlon 64 systems is largely a formality: it is simply the reference clock from which the operating frequencies of the CPU and other system components are derived. The Athlon 64 FX is derived from the Opteron server processor and differs from the Athlon 64 in having a dual-channel registered (buffered) DDR400 controller.

The general trend is this: the Athlon 64 3200+ loses to the 3200 MHz P4 by about 5% on average, but considering that its actual frequency is about 2 GHz, a 2 GHz processor turns out to be more than a match for a 3.2 GHz one. The current top processors, the P4 EE and the Athlon 64 FX, are on a par if the test results are averaged. And if you compare the Athlon 64 3200+ with the regular Athlon XP 3200+, the former is almost always (mp3 encoding being the exception) 10-40% faster.

Now a little about 64-bit support. At the moment the Athlon 64 has practically no use for it: there are almost no real applications suitable for ordinary users. Microsoft is about to release a 64-bit OS aimed at ordinary users, and the existing 64-bit Linux does not fit that role. Most unpleasantly, applications will also have to be reworked to use all the "power" of the new processors.
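The remark that a roughly 2 GHz processor keeps up with a 3.2 GHz one is easy to quantify; a minimal Python sketch using only the figures quoted above (the ~5% average lag and the ~2 GHz real clock):

```python
p4_frequency = 3200        # MHz
athlon64_frequency = 2000  # MHz (approximate real clock of the Athlon 64 3200+)
average_lag = 0.05         # Athlon 64 3200+ trails the P4 3200 by about 5%

# Relative performance per MHz, taking the P4 as the baseline (1.0)
athlon_per_mhz = (1 - average_lag) / athlon64_frequency * p4_frequency
print(f"Athlon 64 does roughly {athlon_per_mhz:.1f}x the work per clock of the P4")  # ~1.5x
```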
