Graph Theory in Chemistry




E. Babaev, Candidate of Chemical Sciences.

      When people speak about the mathematization of science, they most often mean only the purely pragmatic use of computational methods, forgetting A. A. Lyubishchev's apt remark that mathematics is not so much the servant as the queen of all sciences. It is the level of mathematization that places a science among the exact sciences, if by this we mean not the use of precise quantitative estimates but a high level of abstraction, the freedom to operate with concepts belonging to non-numerical mathematics.
      Among the methods of such qualitative mathematics that have found effective application in chemistry, the main role belongs to sets, groups, algebras, topological constructions and, first of all, to graphs, the most general way of representing chemical structures.

Take, for example, four points arbitrarily located on a plane or in space and connect them with three lines. No matter how these points (called vertices) are placed and no matter how they are connected by dashes (called edges), we obtain only two possible graph structures that differ in the mutual arrangement of their connections: one graph resembling the letters "П" or "I", and another resembling the letters "T", "E" or "U". If instead of four abstract points we take four carbon atoms, and instead of dashes the chemical bonds between them, these two graphs correspond to the two possible isomers of butane: the normal and the iso structure.
      What is the reason for the growing interest of chemists in graph theory, this bizarre, but very simple language of dots and dashes?
      A graph has the remarkable property that it remains unchanged under any deformation of the structure that does not break the links between its elements. The graph can be distorted until it loses all symmetry in the usual sense; nevertheless it retains its symmetry in the topological sense, determined by the sameness and interchangeability of its terminal vertices. Given this hidden symmetry, one can, for example, predict the number of different isomeric amines obtained from the structures of butane and isobutane by replacing carbon atoms with nitrogen atoms. Graphs thus make it possible to use simple physical considerations to understand "structure-property" regularities.
      Another, somewhat unexpected idea is to express the structural properties of graphs by numbers (for example, their degree of branching). Intuitively we feel that isobutane is more branched than normal butane; quantitatively this can be expressed, say, by the fact that the structural fragment of propane is repeated three times in the isobutane molecule and only twice in normal butane. Such a structural number (the Wiener topological index) correlates surprisingly well with characteristics of saturated hydrocarbons such as the boiling point or the heat of combustion. Recently a kind of fashion has arisen for inventing new topological indices; there are already more than twenty of them, and their alluring simplicity makes this Pythagorean approach ever more popular*.
      The use of graph theory in chemistry is not limited to the structure of molecules. Back in the thirties, A. A. Balandin, one of the forerunners of modern mathematical chemistry, proclaimed the principle of isomorphic substitution, according to which the same graph carries uniform information about the properties of the most heterogeneous structured objects; it is only important to define clearly which elements are chosen as vertices and which relations between them are expressed by edges. So, besides atoms and bonds, one may choose as vertices and edges phases and components, isomers and reactions, macromolecules and the interactions between them. One can notice a deep topological relationship between the Gibbs phase rule, Horiuchi's stoichiometric rule, and the rational classification of organic compounds by their degree of unsaturation. Graphs are used successfully to describe interactions between elementary particles, the fusion of crystals, cell division... In this sense graph theory serves as a visual, almost universal language of interdisciplinary communication.

The development of each scientific idea traditionally passes through the stages: article, review, monograph, textbook. The cluster of ideas called mathematical chemistry has already passed the review stage, although it has not yet attained the status of an academic discipline. Owing to the diversity of directions, collections are now the main form of publication in this area; several such collections appeared in 1987-1988.
      The first collection, "Chemical Applications of Topology and Graph Theory" edited by R. King (Moscow, Mir, 1987), contains translations of the reports of an international symposium attended by chemists and mathematicians from many countries. The book gives a complete picture of the colorful palette of approaches that have emerged at the intersection of graph theory and chemistry. It touches on a very wide range of issues, from the algebraic structure of quantum chemistry and stereochemistry and the magic rules of electron counting to the structure of polymers and the theory of solutions. Organic chemists will undoubtedly be attracted by a new strategy for the synthesis of molecular knots such as the trefoil, and by the experimental realization of the idea of a Möbius-strip molecule. Of particular interest are the review articles on the use of the above-mentioned topological indices for estimating and predicting a wide variety of properties, up to the biological activity of molecules.
      The translation of this book is also useful in that the issues raised in it may help to remove a number of debatable problems in the field of methodology of chemical science. Thus, the rejection by some chemists in the 50s of the mathematical symbolism of resonance formulas was replaced in the 70s by the rejection by individual physicists of the very concept of chemical structure. In the framework of mathematical chemistry, such contradictions can be eliminated, for example, with the help of a combinatorial-topological description of both classical and quantum-chemical systems.
      Although the works of Soviet scientists are not represented in this collection, it is gratifying to note the increased interest in the problems of mathematical chemistry in domestic science. An example is the first workshop "Molecular Graphs in Chemical Research" (Odessa, 1987), which brought together about a hundred specialists from all over the country. Compared with foreign studies, domestic work is distinguished by a more pronounced applied character, a focus on problems of computer synthesis and the creation of various data banks. Despite the high level of the reports, the meeting noted an unacceptable lag in the training of specialists in mathematical chemistry. Only at Moscow and Novosibirsk universities are occasional courses given on some of its individual topics. At the same time, it is time to raise seriously the question: what kind of mathematics should chemistry students study? After all, even in the university mathematics programs of chemistry departments, such subjects as group theory, combinatorial methods, graph theory and topology are practically absent; in turn, university mathematicians do not study chemistry at all. Besides the problem of education, the issue of scientific communication is acute: an all-Union journal on mathematical chemistry, published at least once a year, is needed. The journal "MATCH" (Mathematical Chemistry) has been published abroad for many years, while our publications are scattered among collections and various periodicals.

Until recently, the Soviet reader could become acquainted with mathematical chemistry only through V. I. Sokolov's book "Introduction to Theoretical Stereochemistry" (Moscow: Nauka, 1979) and the book by I. S. ... (1977). Partially filling this gap, the Siberian branch of the Nauka publishing house issued last year the book "Application of Graph Theory in Chemistry" (edited by N. S. Zefirov and S. I. Kuchanov). The book consists of three sections: the first deals with the use of graph theory in structural chemistry; the second deals with reaction graphs; the third shows how graphs help solve many traditional problems in the chemical physics of polymers. Of course, this book is not yet a textbook (many of the ideas discussed are the authors' original results); nevertheless, the first part of the collection can be fully recommended for an initial acquaintance with the subject.
      Another collection, the proceedings of the seminar of the Faculty of Chemistry of Moscow State University "Principles of Symmetry and Consistency in Chemistry" (edited by N. F. Stepanov), was released in 1987. The main topics of the collection are group-theoretic, graph-theoretic and system-theoretic methods in chemistry. The range of questions discussed is unconventional, and the answers to them are even less standard. The reader will learn, for example, about the reasons for the three-dimensionality of space, about a possible mechanism for the origin of dissymmetry in living nature, about principles for constructing a periodic system of molecules, about the symmetry planes of chemical reactions, about describing molecular shapes without using geometric parameters, and much more. Unfortunately, the book can be found only in scientific libraries, since it was not widely available for sale.
      Since we are talking about the principles of symmetry and consistency in science, one cannot fail to mention another unusual book, "System. Symmetry. Harmony" (Moscow: Mysl, 1988). It is devoted to one of the versions of the so-called general systems theory (GST) proposed and developed by Yu. A. Urmantsev. The starting points of Urmantsev's GST are the concepts of system and chaos, polymorphism and isomorphism, symmetry and asymmetry, as well as harmony and disharmony.
      It would seem that Urmantsev's theory should attract the closest attention of chemists, if only because in it the traditional chemical concepts of composition, isomerism and dissymmetry are raised to the rank of system-wide concepts. In the book one can find striking symmetry analogies, for example between the isomerism of leaves and of molecular structures**. Of course, reading the book in places requires a certain professional detachment, say when it comes to chemical-musical parallels or the rationale for a mirror-symmetric system of elements. Nevertheless, the book is permeated by a central idea: to find a universal language expressing the unity of the universe, akin, perhaps, to the Castalian language of Hermann Hesse's Glass Bead Game.
Speaking of the mathematical constructions of modern chemistry, one cannot pass over the wonderful book by A. F. Bochkov and V. A. Smith, "Organic Synthesis" (Moscow: Nauka, 1987). Although its authors are "pure" chemists, a number of ideas discussed in the book are very close to the problems raised above. Without dwelling on the brilliant form of presentation and the depth of content of this book, after reading which one wants to do organic synthesis, we emphasize only two points. First, considering organic chemistry through the prism of its contribution to world science and culture, the authors draw a clear parallel between chemistry and mathematics as universal sciences that draw the objects and problems of their research from within themselves. In other words, to the traditional status of mathematics as the queen and the servant of chemistry one can add a peculiar role of a sister. Second, in convincing the reader that organic synthesis is an exact science, the authors appeal both to the accuracy and rigor of structural chemistry itself and to the perfection of the logic of chemical ideas.
      If experimenters say so, can there be any doubt that the hour of mathematical chemistry has struck?

________________________
  * See "Chemistry and Life", 1988, No. 7, p.22.
** See "Chemistry and Life", 1989, No. 2.

Over the past decades, the concepts of topology and graph theory have become widespread in theoretical chemistry. They are useful in the search for quantitative "structure-property" and "structure-activity" relations, as well as in solving graph-theoretic and combinatorial-algebraic problems that arise in collecting, storing and processing information on the structure and properties of substances.

Graphs serve, first of all, as a means of representing molecules. In the topological description of a molecule, it is depicted as a molecular graph (MG), where the vertices correspond to atoms and the edges to chemical bonds (the graph-theoretic model of a molecule). Usually only the skeletal atoms are considered in this representation; for hydrocarbons, for example, the hydrogen atoms are "erased".

The valence of the chemical elements imposes certain restrictions on the vertex degrees. In alkane trees (connected graphs without cycles) the vertex degrees r cannot exceed four (r = 1, 2, 3, 4).

Graphs can be specified in matrix form, which is convenient when working with them on a computer.

The vertex adjacency matrix of a simple graph is a square matrix A = [a_ij] with elements a_ij = 1 if vertices i and j are connected by an edge and a_ij = 0 otherwise. The distance matrix is a square matrix D = [d_ij] whose elements d_ij are defined as the minimum number of edges (the shortest distance) between vertices i and j. Sometimes edge adjacency and edge distance matrices (A_e and D_e) are also used.

The form of the matrices A and D (A_e and D_e) depends on how the vertices (or edges) are numbered, which is inconvenient in handling them. To characterize a graph independently of the numbering, graph invariants are therefore used: topological indices (TI).
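A minimal Python sketch of the two matrices just defined (this is my own illustration, not code from the article; the example molecule, the vertex numbering and all function names are assumptions made here). It builds A for the hydrogen-suppressed skeleton of 2-methylbutane (isopentane) and obtains D with the Floyd-Warshall shortest-path algorithm.

```python
# Hydrogen-suppressed molecular graph of 2-methylbutane (isopentane):
# chain 0-1-2-3 with a methyl branch (vertex 4) attached to vertex 1.

INF = float("inf")

def adjacency_matrix(n, edges):
    """Square matrix A with a_ij = 1 if vertices i and j share an edge, else 0."""
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = A[j][i] = 1
    return A

def distance_matrix(A):
    """Square matrix D with d_ij = length of the shortest path between i and j."""
    n = len(A)
    D = [[0 if i == j else (1 if A[i][j] else INF) for j in range(n)]
         for i in range(n)]
    for k in range(n):                      # Floyd-Warshall relaxation
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

edges = [(0, 1), (1, 2), (2, 3), (1, 4)]
A = adjacency_matrix(5, edges)
D = distance_matrix(A)
for row in D:
    print(row)      # note: no row of A sums to more than 4, as required for alkanes
```

Renumbering the vertices permutes the rows and columns of A and D, which is exactly the inconvenience that the invariant indices below are meant to remove.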

Examples of such invariants are:

- the number of paths of length one, p1 = m = n - 1 (the number of edges; for an alkane tree with n vertices it equals n - 1);

- the number of paths of length two, p2;

- the number of triples of adjacent edges (edges with a common vertex), p3;

- the Wiener number W, defined as the half-sum of the elements of the distance matrix of the graph under consideration, W = (1/2) Σ_ij d_ij;

and so on.
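As an illustration of the indices just listed, here is a short sketch (my own, not the authors' code; the vertex numbering of the two example graphs is an arbitrary assumption) that computes p1, p2, p3 and W for the hydrogen-suppressed graphs of n-butane and isobutane.

```python
from itertools import combinations
from math import comb

def shortest_paths(n, edges):
    """All-pairs shortest path lengths (Floyd-Warshall)."""
    INF = float("inf")
    D = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j in edges:
        D[i][j] = D[j][i] = 1
    for k in range(n):
        for i in range(n):
            for j in range(n):
                D[i][j] = min(D[i][j], D[i][k] + D[k][j])
    return D

def topological_indices(n, edges):
    deg = [0] * n
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    p1 = len(edges)                        # paths of length one (= n - 1 for a tree)
    p2 = sum(comb(d, 2) for d in deg)      # two edges meeting at a common vertex
    p3 = sum(comb(d, 3) for d in deg)      # three edges meeting at a common vertex
    D = shortest_paths(n, edges)
    W = sum(D[i][j] for i, j in combinations(range(n), 2))   # half-sum of D
    return p1, p2, p3, W

print("n-butane :", topological_indices(4, [(0, 1), (1, 2), (2, 3)]))   # W = 10
print("isobutane:", topological_indices(4, [(0, 1), (0, 2), (0, 3)]))   # W = 9
```

The more branched isomer has the smaller Wiener number (9 versus 10) and the larger p2 and p3, in line with the use of these indices as measures of branching.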

The methodology for studying the "structure-property" relationship through topological indices in the graph-theoretic approach includes the following steps.

Selection of objects of study (training sample) and analysis of the state of numerical data on the property P for a given range of compounds.

Selection of TIs taking into account their discriminating ability, correlation ability with properties, etc.

The study of graphical dependencies "Property P - TI of the molecule graph", for example, P on n - the number of skeletal atoms, P on W - the Wiener number, etc.

Establishment of a functional (analytical) dependence P = f(TI), for example

P = a(TI) + b,

P = a ln(TI) + b,

P = a1(TI)1 + a2(TI)2 + ... + an(TI)n + c,

and so on. Here a, b, c, a1, ..., an are parameters to be determined (not to be confused with the parameters of additive schemes); a numerical sketch of this step is given after the list.

Numerical calculation of P and comparison of the calculated values with experimental ones.

Prediction of the properties of compounds that have not yet been studied or even obtained (outside of this sample).
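The following least-squares sketch illustrates the two preceding steps (fitting P = f(TI) and comparing calculated with experimental values). It is my own illustration, not the authors' calculation: the choice of descriptors (n and W), the linear form P = a1·n + a2·W + c, the use of numpy, and the boiling points (approximate handbook values quoted from memory, to be checked against a reference) are all assumptions made here.

```python
import numpy as np

#            name          n   W    bp, deg C (approximate)
data = [("n-butane",       4, 10,  -0.5),
        ("isobutane",      4,  9, -11.7),
        ("n-pentane",      5, 20,  36.1),
        ("isopentane",     5, 18,  27.8),
        ("neopentane",     5, 16,   9.5),
        ("n-hexane",       6, 35,  68.7)]

X = np.array([[n, W, 1.0] for _, n, W, _ in data])   # columns: n, W, intercept
y = np.array([bp for *_, bp in data])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)          # P = a1*n + a2*W + c
a1, a2, c = coef
print(f"P = {a1:.2f}*n + {a2:.2f}*W + {c:.2f}")

for name, n, W, bp in data:                           # compare with experiment
    print(f"{name:11s} calc {a1*n + a2*W + c:6.1f}   exp {bp:6.1f}")
```

A real study would of course use a larger training sample, test several functional forms, and validate the model on compounds outside the sample, as the last step of the list requires.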

Topological indices are also used in the construction of additive calculation and forecasting schemes. They can be used in the development of new drugs, in assessing the carcinogenic activity of certain chemicals, in predicting the relative stability of new (not yet synthesized) compounds, etc.

However, it should be remembered that the choice of TI is often arbitrary; the indices may fail to reflect important structural features of molecules or may duplicate information obtained from other indices, and the calculation schemes may lack a solid theoretical foundation and be difficult to interpret physicochemically.

The team of the Department of Physical Chemistry of TvSU has been conducting a computational-theoretical study on the problem “Relationship between the properties of substances and the structure of molecules: mathematical (computer) modeling” for many years. The focus is on the targeted search for new structures, algorithms for solving a number of graph-theoretical and combinatorial problems that arise in the course of collecting and processing information about the structure and properties of substances, creating expert information retrieval systems and databases, developing quantitative methods of calculation and forecasting.

We have constructed additive schemes and found analytical dependences of the form P = f(TI) for a number of organic and other molecules. Using the formulas obtained, numerical calculations of the physicochemical properties of the compounds under consideration were performed.

Bibliography

  1. Vinogradova M.G., Papulov Yu.G., Smolyakov V.M. Quantitative "structure-property" correlations of alkanes. Additive calculation schemes. Tver, 1999. 96 p.
  2. Chemical Applications of Topology and Graph Theory / Ed. R. King. Moscow: Mir, 1987. 560 p.
  3. Application of Graph Theory in Chemistry / Ed. N.S. Zefirov, S.I. Kuchanov. Novosibirsk: Nauka, 1988. 306 p.
  4. Stankevich M.I., Stankevich I.V., Zefirov N.S. Topological indices in organic chemistry // Uspekhi Khimii. 1988. V. 57, No. 3. P. 337-366.
  5. Vinogradova M.G., Saltykova M.N. Graph-theoretic approach in studying the relationship between the structure and properties of alkylsilanes // Fundamental Research. 2009. No. 1. P. 17-19.
  6. Vinogradova M.G., Saltykova M.N., Efremova A.O., Malchevskaya O.A. The relationship between the structure and properties of alkylsilanes // Successes of Modern Natural Sciences. 2010. No. 1. P. 136-137.
  7. Vinogradova M.G., Saltykova M.N., Efremova A.O. "Structure-property" correlations of alkylsilanes: a graph-theoretic approach // Successes of Modern Natural Sciences. 2010. No. 3. P. 141-142.

Bibliographic link

Vinogradova M.G. GRAPH THEORY IN CHEMISTRY // International Journal of Applied and Fundamental Research. - 2010. - No. 12. - P. 140-142;
URL: http://dev.applied-research.ru/ru/article/view?id=1031 (date of access: 12/17/2019).
Dissertation for the degree of Doctor of Chemical Sciences: Bonchev, Danail Georgiev
  • Specialty (HAC RF): 02.00.03
  • Number of pages: 410

Half a century ago Paul Dirac expressed the opinion that, in principle, all of chemistry is contained in the laws of quantum mechanics, but that in practical calculations insurmountable mathematical difficulties arise. Electronic computing has helped to narrow the gap between the possibilities of the quantum-mechanical approach and its implementation. Yet calculations for molecules with a large number of electrons remain complex and insufficiently precise, and so far only a few molecular properties can be calculated in this way. On the other hand, organic chemistry contains important structural problems that have not been fully resolved, above all the problem of the relationship between the structure and the properties of molecules. In theoretical chemistry there stands the question of quantifying the basic structural characteristics of molecules, their branching and cyclicity. This question is essential, since a quantitative analysis of the general regularities in the structure of branched and cyclic molecules can to a large extent be carried over to their other properties. In this way it would be possible to predict the ordering of a group of isomeric compounds according to the values of properties such as stability, reactivity, spectral and thermodynamic characteristics, and so on. This could facilitate the prediction of the properties of not yet synthesized classes of compounds and the search for structures with predetermined properties. Despite significant efforts, the question of the rational coding of chemical information for the purpose of its efficient storage and use by computers also remains open; an optimal solution of this question would influence the improvement of the classification and nomenclature of both organic compounds and the mechanisms of chemical reactions. The theory of the Periodic system of chemical elements likewise raises the question of a holistic and quantitative interpretation of the periodicity of the properties of the elements on the basis of quantities that reflect the electronic structure better than the ordinal number of the element.

As a result, the last decades have seen the development of new theoretical methods in chemistry, united under the name of mathematical chemistry. The main place in it is occupied by topological methods, which reflect the most general structural and geometric properties of molecules. One branch of topology, graph theory, offers the chemist a convenient mathematical language for describing molecules, since structural formulas are essentially chemical graphs. The advantages offered by graph theory in chemical research rest on the possibility of applying its mathematical apparatus directly, without a computer, which is important for experimental chemists. Graph theory makes it possible to penetrate quite simply into the structural characteristics of molecules. The results obtained are general, can be formulated as theorems or rules, and can thus be applied to any similar chemical (and non-chemical) objects.

After the publication of Shannon's and Wiener's fundamental works on information theory and cybernetics, interest in information-theoretic methods of research has steadily increased. The original meaning of the term "information" is associated with facts, messages and their transmission. This concept quickly left the confines of communication theory and cybernetics and penetrated various sciences of animate and inanimate nature, of society and of cognition. The development of the information-theoretic approach in science is more than a formal transfer of the cybernetic category of information to other areas of knowledge. The informational approach is not just a translation from less common languages into a metalanguage: it offers a different view of systems and phenomena and allows new results to be obtained. By expanding the connections between different scientific disciplines, this method makes it possible to find useful analogies and common patterns between them. In its development, modern science tends toward an ever greater degree of generalization, toward unity. In this respect, information theory is one of the most promising areas.

An important place in this process is occupied by the application of information theory in chemistry and the other natural sciences: physics, biology, and so on. In these sciences the methods of information theory are used to study and describe those properties of objects and processes that are connected with the structure, orderliness and organization of systems. The usefulness of the information approach in chemistry lies primarily in the fact that it offers new possibilities for the quantitative analysis of various aspects of chemical structures: atoms, molecules, crystals, etc. In these cases the concepts of "structural information" and of the "information content" of atoms and molecules are used.

In view of the foregoing, the main goal of the dissertation is to show the fruitfulness of the graph-theoretic and information-theoretic approach to structural problems in chemistry, from atoms and molecules to polymers and crystals. Achieving this goal involves, as separate steps:

1. Definition of a system of quantities (information and topological indices) for the quantitative characterization of atoms, molecules, polymers and crystals.

2. Development on this basis of a new, more general approach to the question of the correlation between their properties and their geometric and electronic structure; prediction of the properties of some organic compounds, polymers and not yet synthesized transactinide elements; creation of methods for modeling the growth of crystals and crystal vacancies.

3. Generalized topological characterization of molecules by expressing the essence of their branching and cyclicity in a series of mathematically proven structural rules, and study of how these rules are reflected in various molecular properties.

4. Creation of new, effective methods for coding chemical compounds and mechanisms of chemical reactions in connection with the improvement of their classification and nomenclature, and especially in connection with the use of computers for processing chemical information.

CHAPTER 2. RESEARCH METHODS

2.1. THE INFORMATION-THEORETIC METHOD

2.1.1. Introduction

Information is one of the most fundamental concepts in modern science, a concept no less general than the concepts of matter and energy. This view finds justification in the very definitions of information. According to Wiener, "information is neither matter nor energy".

Ashby views information as a "measure of diversity in a given system". According to Glushkov, "information is a measure of the inhomogeneity of the distribution of matter and energy in space and time." On this basis it is increasingly recognized today that, in addition to their material and energy nature, objects and phenomena in nature and technology also have informational properties. Some forecasts go further, predicting that the focus of scientific research will increasingly shift toward the informational nature of processes, which will become the main object of research in the 21st century. These forecasts rest essentially on the possibility of the optimal control of systems and processes through information, which is in fact the main function of information in cybernetics. In the future these ideas may lead to the creation of technologies in which every atom and molecule is controlled by information, a possibility so far realized only in living nature.

The emergence of information theory is usually dated to 1948, when Claude Shannon published his fundamental work. The idea of information as a quantity related to entropy, however, is much older. Back in 1894 Boltzmann established that any information obtained about a given system is associated with a decrease in the number of its possible states, so that an increase in entropy means a "loss of information". In 1929 Szilard developed this idea for the general case of information in physics. Later Brillouin generalized the idea of the relationship between entropy and information in his negentropy principle, in a form that also covers the informational side of phenomena. Questions about the connection between information theory and thermodynamics, and in particular about the relationship between entropy and information, are still the subject of much attention (a detailed list of publications in this area is given in review [58]). Among the latest developments of the issue, Kobozev's work on the thermodynamics of thinking should be noted in particular, in which the thesis of the anti-entropic nature of thought processes is substantiated.

Information theory, which emerged as a "special theory of communication", quickly outgrew its original limits and found application in various fields of science and technology: in chemistry, biology, medicine, linguistics, psychology, aesthetics, and so on. The role of information in biology was recognized first of all. Important questions were resolved concerning the storage, processing and transmission of information in living organisms, including the coding of genetic information, the assessment of the possibility of the spontaneous origin of life on Earth, the formulation of the basic laws of biological thermodynamics, the analysis of problems of bioenergetics, and others. The information content of objects has been used as a quantitative criterion of evolution. The question has also been raised of the informational nature of nutrition processes.

Information theory is still slowly penetrating into chemistry and physics, although some progress has been made in this area in recent years. The question was raised about the possible existence of an information balance of chemical reactions. An assessment was made of the information capacity of bioorganic molecules and, on this basis, a new classification of these compounds was proposed, and the specificity of chemical reactions was also assessed.

Levine, Bernstein and others have applied information theory to molecular dynamics to describe the behavior of molecular systems far from equilibrium. The essence of this approach is the concept of "surprisal", the deviation from what is expected on the basis of the microcanonical distribution. Various applications have been proposed, including the study of laser performance, the determination of the branching ratio of competing reaction paths (taking the path corresponding to the maximum of the Shannon function as the most probable), and so on.

Daudel et al. proposed partitioning the space occupied by a molecular system into a number of mutually exclusive subspaces called loges. The best loges, containing localized groups of electrons, are found by minimizing the information function. Sears et al. found a connection between quantum-mechanical kinetic energies and informational quantities. As a consequence of this result, the variational principle of quantum mechanics can be formulated as a principle of minimum information.

Kobozev et al. related the selectivity and activity of catalysts to their information content. They also formulated optimal information conditions for the characterization and prediction of catalytic properties. The formation and growth of crystals have been considered as an information process. Rakov subjected the treatment of catalyst surfaces with various chemical agents to informational analysis.

In modern analytical chemistry, there is an increasing trend towards optimal experimentation in order to obtain maximum information from a minimum number of experiments.

These new ideas are based on information theory, game theory and systems theory. Other authors have applied information theory to minimize analysis error and time, to achieve higher selectivity, to evaluate the effectiveness of analytical methods, and so on. Research of this kind also includes physical methods in analytical chemistry, including gas chromatography, atomic emission spectral analysis, etc.

Information-theoretic methods have also proved useful in geochemistry for characterizing the frequency distribution of chemical compounds in geochemical systems, for assessing their degree of complexity and for classifying these systems.

In engineering chemistry, information analysis can be used to solve such problems of chemical-technological systems as the choice of optimal operating conditions, the establishment of control requirements, and so on.

Examples of the successful application of information theory in chemistry indicate once again that systems in nature and technology also have an informational nature. They also show that the informational approach acts as a universal language for describing systems, and in particular chemical structures of any type, with which it associates a certain information function and a numerical measure. This expands the field of possible applications of information theory in chemistry.

The usefulness of the information approach in chemistry is primarily that it offers the possibility of quantitative analysis of various aspects of chemical structures. The degree of complexity of these structures, their organization and specificity can be compared on a single quantitative scale. This allows one to study some of the most general properties of chemical structures, such as their branching and cyclicity, to investigate and compare the degree of organization in various classes of chemical compounds, the specificity of biologically active substances and catalysts, and allows one to approach the question of the degree of similarity and difference between two chemical objects.

The informational approach is also well suited to solving particular classification problems. In such cases it is possible to derive general information equations for the main groupings of the objects of classification (groups and periods in the Periodic system of chemical elements, homologous series of chemical compounds, series of isomeric compounds, etc.).

The high discriminating ability of information methods with respect to complex structures (isomers, isotopes, etc.) can be used in the computer processing and storage of chemical information. These methods are useful not only in choosing between different structures, but also between alternative hypotheses and approximations, which is of interest for quantum chemistry. The possibilities of putting forward new hypotheses on the basis of information theory are, however, more limited, since this theory describes the relationships between variables but not the behavior of any one of them.

The relationship between structure and properties is another area of successful application of the information-theoretic approach in chemistry. The effectiveness of this approach is shown in the dissertation for qualitatively different structural levels in chemistry: the electron shells of atoms, molecules, polymers, crystals and even atomic nuclei. It can be realized in both a qualitative and a quantitative aspect. In the first case, various structural rules can be defined on an informational basis, reflecting the mutual influence of two or more structural factors. It is also possible to obtain quantitative correlations between information indices and properties. In principle, information indices should give better correlations than other indices, since they reflect the features of chemical structures more fully. Successful correlations are possible not only with quantities directly related to entropy, but also with quantities such as binding energy, whose relationship with information is far from obvious. Moreover, the properties involved are not only those of a separate molecule or atom but also those of their large aggregates, i.e. properties that depend on the interaction between molecules and atoms and not only on their internal structure. In addition, processes in chemistry can also be the subject of information analysis, on the basis of changes in information indices during interactions.

Some limitations of the informational approach should also be kept in mind. Quantitative measures of information are relative, not absolute. They are statistical characteristics and refer to aggregates, not to their individual elements. Information indices can be defined for various properties of atoms and molecules, but the relationship between them is often complex and implicit.

On the other hand, the existence of several information indices for a single structure can cause mixed feelings. It should be remembered, however, that any of these indices is legitimate; the right question is which of these quantities are useful, and to what extent.

In this chapter, information-theoretic indices characterizing the electronic structure of atoms are introduced for the first time, as well as new information indices for the symmetry, topology and electronic structure of molecules. The application of these structural characteristics is discussed in Chapter III and in Sections IV.2 and V.1.

2.1.2. Necessary information from information theory

Information theory offers quantitative methods for studying the acquisition, storage, transmission, transformation and use of information. The central place in these methods is occupied by the quantitative measurement of information. Defining the concept of the amount of information requires abandoning the widespread but vague idea of information as a collection of facts, data and knowledge.

The concept of the amount of information is closely related to the concept of entropy as a measure of uncertainty. In 1928 Hartley characterized the uncertainty of an experiment with n different outcomes by the number log n. In Shannon's statistical theory of information, published in 1948, the amount of information is defined through the concept of probability. This concept is used to describe situations in which there is uncertainty associated with the choice of one or more elements (outcomes) from a certain set. Following Shannon, the measure of the uncertainty of an individual outcome X_i of an experiment X with probability p(X_i) is -log p(X_i). A measure of the average uncertainty of the complete experiment X with possible outcomes X_1, X_2, ..., X_n and probabilities p(X_1), p(X_2), ..., p(X_n), respectively, is the quantity

H(X) = - Σ_i p(X_i) log p(X_i)    (2.1)

In statistical information theory H(X) is called the entropy of the probability distribution. These probabilities, in the case of n different outcomes, form a finite probabilistic scheme.

The concept of probability can be defined more generally from the point of view of set theory. Let a finite set A be partitioned by some equivalence relation x into m classes A_1, A_2, ..., A_m, which are pairwise disjoint subsets. The set of equivalence classes

A/x = {A_1, A_2, ..., A_m}    (2.2)

is called the quotient (factor) set of A with respect to x.

Kolmogorov's probability function p is subject to three conditions (non-negativity, normalization to unity, and additivity over disjoint subsets).

The number series p(X_1), p(X_2), ..., p(X_n) is called the distribution of the partition, and the Shannon function H(X) from Eq. (2.1) expresses the entropy of the partition X.

It should be borne in mind that the concept of entropy in information theory is more general than thermodynamic entropy. The latter, considered as a measure of disorder in atomic-molecular motion, is a special case of the more general concept of entropy as a measure of any disorder, uncertainty or diversity.

The amount of information is expressed by the amount of removed uncertainty. The average entropy of an event with many possible outcomes is thus equal to the average amount of information needed to select any outcome X_i from the set X_1, X_2, ..., X_n, and is determined by the Shannon formula (cf. Eq. 2.1):

I(X) = -K Σ_i p(X_i) log p(X_i) = H(X)    (2.4)

Here K is a positive constant that determines the units in which information is measured. With K = 1 and decimal logarithms, entropy (and, correspondingly, information) is measured in decimal units. The most convenient system of measurement is based on binary units (bits): K = 1 and the logarithm in Eq. (2.4) is taken to base two (denoted lb for short). One binary unit of information (1 bit) is the amount of information obtained when the result of a choice between two equally probable possibilities becomes known. For conversion to units of thermodynamic entropy, the conversion factor is the Boltzmann constant (1.38·10^-23 J/K) divided by log_2 e.

It can be proved that the choice of a logarithmic function for the amount of information is not accidental: it is the only function that satisfies the conditions of additivity and non-negativity of information.

Both the individual and the average information are always positive. This property follows from the fact that the probability is always at most one and that the constant in Eq. (2.4) is taken positive. Since p(X_i) ≤ 1,

-log p(X_i) ≥ 0,    (2.5)

and this inequality is preserved after averaging.

The average amount of information for a given event (experiment) X reaches its maximum for the uniform distribution of probabilities p(X_1) = p(X_2) = ... = p(X_n), i.e. for p(X_i) = 1/n for every i:

I(X)_max = log n.    (2.6)
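A small numerical check (my own illustration, not part of the dissertation; the two distributions are arbitrary examples) of Eq. (2.4) with K = 1 and of the maximum-entropy statement above: the entropy of a distribution over n outcomes never exceeds log2 n, and the bound is reached by the uniform distribution.

```python
from math import log2

def shannon_entropy(p):
    """I(X) = -sum_i p_i * log2(p_i), in bits (Eq. 2.4 with K = 1)."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p_skewed  = [0.5, 0.25, 0.125, 0.125]
p_uniform = [0.25] * 4

print(shannon_entropy(p_skewed))    # 1.75 bits
print(shannon_entropy(p_uniform))   # 2.0 bits = log2(4), the maximum (Eq. 2.6)
```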

For a pair of dependent random events X and Y, the average amount of information is likewise expressed by a Shannon formula:

I(X,Y) = - Σ_i Σ_j p(X_i, Y_j) log p(X_i, Y_j)    (2.7)

Equation (2.7) can be generalized for any finite sets, regardless of the nature of their elements:

I(X,Y) = - Σ_i Σ_j p(X_i ∩ Y_j) log p(X_i ∩ Y_j)    (2.8)

where A/x and A/y are two quotient sets of A with respect to two different equivalence relations x and y, and A/xy is the quotient set of their intersections.

Writing the joint probability in Eq. (2.7) as the product of the unconditional and the conditional probability, p(X_i, Y_j) = p(X_i)·p(Y_j/X_i), and representing the logarithm as a sum, one obtains

I(X,Y) = I(X) + I(Y/X)    (2.9)

where I(Y/X) is the average amount of conditional information contained in Y relative to X:

I(Y/X) = - Σ_i Σ_j p(X_i, Y_j) log p(Y_j/X_i)    (2.10)

Defining the function

I(X;Y) = I(Y) - I(Y/X)    (2.11)

and substituting it into Eq. (2.9),

I(X,Y) = I(X) + I(Y) - I(X;Y)    (2.12)

one sees that I(X;Y) expresses the deviation of the information about the complex event (X, Y) from the additivity of the information about the individual events X and Y. Therefore I(X;Y) is a measure of the degree of statistical dependence (connection) between X and Y; this relation between X and Y is symmetrical.

In the general case, the following inequalities hold between the measure of statistical dependence of X and Y and the average amounts of unconditional information about X or Y:

0 ≤ I(X;Y) ≤ I(X),    0 ≤ I(X;Y) ≤ I(Y).    (2.13), (2.14)

Equality is attained when the second term in Eq. (2.11) is zero, i.e. when to each i there corresponds a j for which p(Y_j/X_i) = 1.

If the quantities X and Y are independent, i.e. if I(X;Y) = 0 in Eq. (2.12), then

I(X,Y) = I(X) + I(Y).    (2.15)

This equation expresses the property of additivity of the amount of information and generalizes to any number of independent random variables:

I(X_1, X_2, ..., X_n) = Σ_i I(X_i)    (2.16)
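The following sketch (mine, not from the dissertation; the 2x2 joint distribution is an arbitrary illustrative choice) checks Eqs. (2.7), (2.9), (2.10), (2.12) and (2.15) numerically on a small example.

```python
from math import log2

def H(p):
    """Shannon information of a distribution, in bits (Eq. 2.4 with K = 1)."""
    return -sum(v * log2(v) for v in p if v > 0)

def cond_info(joint, pX):
    """Eq. (2.10): I(Y/X) = -sum_ij p(X_i, Y_j) * log2 p(Y_j / X_i)."""
    total = 0.0
    for i, row in enumerate(joint):
        for pij in row:
            if pij > 0:
                total -= pij * log2(pij / pX[i])
    return total

# A dependent pair: joint probabilities p(X_i, Y_j) for a 2x2 example.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
pX = [sum(row) for row in joint]                  # marginal distribution of X
pY = [sum(col) for col in zip(*joint)]            # marginal distribution of Y

I_joint = H([p for row in joint for p in row])    # I(X,Y), Eq. (2.7)
print(I_joint, H(pX) + cond_info(joint, pX))      # equal: Eq. (2.9)
print(H(pX) + H(pY) - I_joint)                    # I(X;Y) > 0: X and Y are dependent

# The same marginals, but independent: the mutual term vanishes, Eq. (2.15).
indep = [[pX[i] * pY[j] for j in range(2)] for i in range(2)]
print(H([p for row in indep for p in row]), H(pX) + H(pY))   # equal
```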

Non-probabilistic approaches to the quantitative definition of information are also known. Ingarden and Urbanik proposed an axiomatic definition of information without probabilities, as a function on finite Boolean rings. Of considerable interest are the epsilon-entropy (combinatorial approach) proposed by Kolmogorov and especially the algorithmic definition of the amount of information. According to Kolmogorov, the amount of information contained in one object (set) relative to another object (set) is the minimum length of a program p, written as a sequence of zeros and ones, that allows a one-to-one transformation of the first object into the second:

H(X/Y) = min l(p)    (2.17)

Kolmogorov's algorithmic approach offers new logical foundations of information theory based on the concepts of complexity and sequence, so that the concepts of "entropy" and "amount of information" turn out to be applicable to individual objects.

Non-probabilistic methods in information theory broaden the content of the concept of the amount of information from the amount of removed uncertainty to the amount of removed uniformity, or to the amount of diversity, in accordance with Ashby's interpretation. Any set of probabilities normalized to unity can be regarded as corresponding to a certain set of elements possessing diversity. Diversity is understood as a characteristic of the elements of a set consisting in their difference, their non-coincidence with respect to some equivalence relation. This can be a set of different elements, relations, or properties of objects. The smallest unit of information, the bit, expresses in this approach the minimum difference, i.e. the difference between two objects that are not identical, that differ in some property.

In this respect, information-theoretic methods are applicable to the definition of so-called structural information: the amount of information contained in the structure of a given system. A structure here means any finite set whose elements are distributed over subsets (equivalence classes) according to a certain equivalence relation.

Let this structure contain N elements distributed, according to some equivalence criterion, over k subsets of equivalent elements containing N_1, N_2, ..., N_k elements. This distribution corresponds to a finite probabilistic scheme with probabilities p_1, p_2, ..., p_k:

p_i = N_i / N    (2.18)

where p_i is the probability that a randomly chosen element falls into the i-th subset, which contains N_i elements. The entropy H of the probability distribution of the elements of this structure, determined by Eq. (2.4), can be considered as a measure of the average amount of information, Ī, contained in one element of the structure:

Ī = - Σ_i (N_i/N) log_2 (N_i/N),  bits per element    (2.19)

The total information content of the structure follows from Eq. (2.19):

I = N·Ī = N log_2 N - Σ_i N_i log_2 N_i,  bits    (2.20)

There is no consensus in the literature on how to name the quantities defined by Eqs. (2.19) and (2.20). Some authors prefer to call them the mean and the total information content, respectively. Thus, according to Mowshowitz, Ī is not a measure of entropy in the sense in which this term is used in information theory, since it does not express the average uncertainty of a structure of N elements within an ensemble of all possible structures having the same number of elements; Ī is rather the information content of the structure under consideration with respect to the system of transformations that leave the system invariant. On another interpretation, I from Eq. (2.4) measures the amount of information obtained after the experiment, whereas before it H(X) is a measure of the entropy associated with the uncertainty of the experiment. In our view, the "experiment" that reduces the uncertainty of chemical structures (atoms, molecules, etc.) is the very process of formation of these structures from their unbound elements. The information is here in a bound form, it is contained in the structure, and this is why the term "information content" of the structure is often used.

The concept of structural information based on this interpretation of Eqs. (2.19) and (2.20) agrees well with Ashby's idea of the amount of information as the amount of diversity. When a system consists of identical elements, there is no diversity in it; in this case, in Eqs. (2.19) and (2.20), N_1 = N, p_1 = 1, and Ī = I = 0.

With the maximum diversity of elements in the structure every N_i = 1, and the information content of the structure is maximal:

Ī_max = log_2 N  bits per element,    I_max = N log_2 N  bits.

2.1.3. Information-theoretical indices for characterizing the electronic structure of atoms of chemical elements




MUNICIPAL AUTONOMOUS GENERAL EDUCATIONAL INSTITUTION, SECONDARY SCHOOL No. 2

Prepared by Vladislav Legkokonets, student of grade 10A

Practical application of Graph Theory

Supervisor: L. I. Noskova, mathematics teacher

Stanitsa Bryukhovetskaya

2011

1. Introduction

2. The history of the emergence of graph theory

3. Basic definitions and theorems of graph theory

4. Problems solved with the help of graphs

4.1 Famous problems

4.2 Some interesting problems

5. Application of graphs in various areas of people's lives

6. Problem solving

7. Conclusion

8. List of references

9. Appendix

Introduction

Born from the solution of puzzles and entertaining games, graph theory has now become a simple, accessible and powerful tool for tackling questions from a wide range of problems. Graphs are literally ubiquitous: in the form of graphs one can interpret, for example, road maps and electrical circuits, geographical maps and molecules of chemical compounds, connections between people and groups of people. Over the past four decades graph theory has become one of the most rapidly developing branches of mathematics. This is driven by the demands of a rapidly expanding field of application: graph theory is used in the design of integrated circuits and control circuits, in the study of automata, logical circuits and program flowcharts, in economics and statistics, in chemistry and biology, and in scheduling theory. The relevance of the topic is therefore due, on the one hand, to the popularity of graphs and the related research methods and, on the other, to the lack of a developed, integral system for applying them.

The solution of many everyday problems requires long calculations, and sometimes even these calculations do not bring success. This constitutes the research problem. The question arises: can a simple, rational, short and elegant solution be found for them? Is it easier to solve problems if one uses graphs? This determined the topic of my research: "Practical Application of Graph Theory".

The aim of the research was to learn how to solve practical problems quickly with the help of graphs.

Research hypothesis: the graph method is very important and is widely used in various fields of science and human life.

Research objectives:

1. To study the literature and Internet resources on this issue.

2. Check the effectiveness of the graph method in solving practical problems.

3. Make a conclusion.

The practical significance of the study is that its results will undoubtedly interest many people. Has none of you ever tried to build your family tree? And how should this be done correctly? The head of a transport company has to solve the problem of using vehicles more profitably when carrying goods from one point to several settlements. Every student has encountered logical "pouring" puzzles. It turns out that all of these are easily solved with the help of graphs.

The following methods are used in the work: observation, search, selection, analysis.

The history of the emergence of graph theory

The mathematician Leonhard Euler (1707-1783) is considered the founder of graph theory. The history of the emergence of this theory can be traced through the correspondence of the great scientist. Here is a translation of a Latin passage taken from Euler's letter to the Italian mathematician and engineer Marinoni, sent from St. Petersburg on March 13, 1736.

“Once I was given a problem about an island located in the city of Königsberg and surrounded by a river across which seven bridges are thrown.

[Appendix Fig.1] The question is whether one can go around them continuously, passing only once over each bridge. I was then informed that no one had yet been able to do this, but that no one had proved it impossible. This question, although banal, seemed to me worthy of attention, because neither geometry, nor algebra, nor combinatorial art is sufficient for its solution. After much deliberation I found an easy rule, based on a quite convincing proof, by which in all problems of this kind one can immediately determine whether such a walk can be made over any number of bridges arranged in any way or not. The Königsberg bridges are located so that they can be represented in the following figure [Appendix Fig.2], where A denotes the island, and B, C and D the parts of the mainland separated from each other by the branches of the river.”

Regarding the method he discovered for solving problems of this kind, Euler wrote:

"This solution, by its nature, seems to have little to do with mathematics, and it is not clear to me why this solution should be expected from a mathematician rather than from any other person, for this solution is supported by reason alone, and there is no need to involve to find this solution, any laws inherent in mathematics. So, I do not know how it turns out that questions that have very little to do with mathematics are more likely to be resolved by mathematicians than by others. "

So is it possible to get around the Königsberg bridges by passing only once through each of these bridges? To find the answer, let's continue Euler's letter to Marinoni:

"The question is to determine whether it is possible to get around all these seven bridges, passing through each only once, or not. My rule leads to the following solution to this question. First of all, you need to look at how many sections are separated by water - such , which have no other transition from one to another, except through the bridge.In this example, there are four such sections - A, B, C, D. Next, you need to distinguish whether the number of bridges leading to these individual sections is even or odd. So, in our case, five bridges lead to section A, and three bridges to the rest, i.e. The number of bridges leading to individual sections is odd, and this one is already enough to solve the problem.When this is determined, we apply the following rule : if the number of bridges leading to each individual section were even, then the detour in question would be possible, and at the same time it would be possible to start this detour from any section. would be odd, for only one be n if it cannot be even, then even then the transition could take place, as prescribed, but only the beginning of the detour must certainly be taken from one of those two sections to which an odd number of bridges leads. If, finally, there were more than two sections to which an odd number of bridges leads, then such a movement is generally impossible ... if other, more serious problems could be cited here, this method could be even more useful and should not be neglected " .

Basic definitions and theorems of graph theory

Graph theory is a mathematical discipline created by the efforts of mathematicians, and so its exposition requires rigorous definitions. Let us therefore proceed to an organized introduction of the basic concepts of this theory.

Definition 1. A graph is a collection of a finite number of points, called the vertices of the graph, together with lines pairwise connecting some of these vertices, called the edges or arcs of the graph.

This definition can be formulated differently: a graph is a non-empty set of points (vertices) and segments (edges), both ends of which belong to the given set of points.

In the future, we will denote the vertices of the graph in Latin letters A, B, C, D. Sometimes the graph as a whole will be denoted by a single capital letter.

Definition 2. The vertices of the graph that do not belong to any edge are called isolated.

Definition 3. A graph consisting only of isolated vertices is called a null graph.

Notation: O_n is a graph with n vertices and no edges.

Definition 4. A graph in which each pair of vertices is connected by an edge is called complete.

Notation: U_n is a graph consisting of n vertices and the edges connecting all possible pairs of these vertices. Such a graph can be represented as an n-gon in which all the diagonals are drawn.

Definition 5. The degree of a vertex is the number of edges that the vertex belongs to.

Definition 6. A graph in which all vertices have the same degree k is called a homogeneous (regular) graph of degree k.

Definition 7. The complement of a given graph is a graph consisting of all the edges and their ends that must be added to the original graph in order to obtain a complete graph.

Definition 8. A graph that can be represented in a plane in such a way that its edges intersect only at the vertices is called planar.

Definition 9. A polygon of a planar graph that does not contain any vertices or edges of the graph inside is called its face.

The concepts of a plane graph and graph faces are used in solving problems for the "correct" coloring of various maps.

Definition 10. A path from A to X is a sequence of edges leading from A to X such that every two adjacent edges have a common vertex and no edge occurs more than once.

Definition 11. A cycle is a path where the start and end points are the same.

Definition 12. A simple cycle is a cycle that does not pass through any of the vertices of the graph more than once.

Definition 13. The length of a path (in particular, of a cycle) is the number of edges of this path.

Definition 14. Two vertices A and B in a graph are called connected (disconnected) if there exists (does not exist) a path leading from A to B in it.

Definition 15. A graph is called connected if every two of its vertices are connected; if the graph contains at least one pair of disconnected vertices, then the graph is called disconnected.

Definition 16. A tree is a connected graph that does not contain cycles.

A three-dimensional model of a graph-tree is, for example, a real tree with its intricately branched crown; the river and its tributaries also form a tree, but already flat - on the surface of the earth.

Definition 17. A disconnected graph consisting solely of trees is called a forest.

Definition 18. A tree all n vertices of which are numbered from 1 to n is called a tree with numbered (labelled) vertices.

So, we have considered the main definitions of graph theory, without which it would be impossible to prove theorems, and, consequently, to solve problems.
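To make these definitions more tangible, here is a small illustrative sketch in Python (my own addition; the four-vertex chain in it is an arbitrary example) showing Definitions 4, 5 and 7: the complete graph, the degree of a vertex, and the complement.

```python
from itertools import combinations

# An arbitrary example graph on four vertices: the chain A - B - C - D.
vertices = {"A", "B", "C", "D"}
edges = {frozenset(e) for e in [("A", "B"), ("B", "C"), ("C", "D")]}

def degree(v, edges):
    """Definition 5: the number of edges to which the vertex belongs."""
    return sum(1 for e in edges if v in e)

def complete_graph(vertices):
    """Definition 4: every pair of vertices is connected by an edge."""
    return {frozenset(p) for p in combinations(vertices, 2)}

def complement(vertices, edges):
    """Definition 7: the edges that must be added to obtain the complete graph."""
    return complete_graph(vertices) - edges

print(degree("B", edges))               # 2
print(len(complete_graph(vertices)))    # 6 edges in the complete graph on 4 vertices
print(len(complement(vertices, edges))) # 3 edges are missing from the chain
```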

Problems solved using graphs

Famous problems

The traveling salesman problem

The traveling salesman problem is one of the famous problems of combinatorics. It was posed in 1934, and the best mathematicians have broken their teeth on it.

The problem statement is the following.
The traveling salesman (traveling merchant) must leave the first city, visit each of the cities 2, 3, ..., n exactly once in some order, and return to the first city. The distances between the cities are known. In what order should the cities be visited so that the closed path (tour) of the traveling salesman is the shortest?

Method for solving the traveling salesman problem

Greedy algorithm: “go to the nearest city that you have not yet visited.”
This algorithm is called “greedy” because in the last steps one has to pay dearly for greed.
Consider, for example, the network in the figure [app fig.3] representing a narrow rhombus. Let the salesman start from city 1. The “go to the nearest city” algorithm takes him to city 2, then 3, then 4; on the last step he has to pay for his greed, returning along the long diagonal of the rhombus. The result is not the shortest but the longest tour.
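Since fig. 3 is not reproduced here, the sketch below (mine) uses made-up coordinates of a narrow rhombus, with cities 1 and 4 at the far corners; it only illustrates how the greedy "nearest city" rule produces a longer tour than exhaustive search over all orders.

```python
import math
from itertools import permutations

# Hypothetical coordinates of a narrow rhombus (a stand-in for fig. 3): 1 and 4 are the far corners.
cities = {1: (5.0, 0.0), 2: (0.0, 1.0), 3: (0.0, -1.0), 4: (-5.0, 0.0)}

def dist(a, b):
    return math.dist(cities[a], cities[b])

def tour_length(route):
    return sum(dist(x, y) for x, y in zip(route, route[1:]))

def greedy_tour(start=1):
    """Always go to the nearest city not yet visited, then return to the start."""
    route, unvisited = [start], set(cities) - {start}
    while unvisited:
        route.append(min(unvisited, key=lambda c: dist(route[-1], c)))
        unvisited.remove(route[-1])
    return route + [start]

greedy = greedy_tour()
best = min(([1, *p, 1] for p in permutations([2, 3, 4])), key=tour_length)
print(greedy, round(tour_length(greedy), 1))  # the greedy tour ends with the long diagonal back to 1
print(best, round(tour_length(best), 1))      # the shortest tour goes around the perimeter
```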

The problem of the Königsberg bridges.

The task is formulated as follows.
The city of Königsberg is located on the banks of the Pregel River and on two islands. The different parts of the city were connected by seven bridges. On Sundays the townspeople took walks around the city. The question is whether it is possible to take a walk in such a way that, having left home, you return to it after passing exactly once over each bridge.
The bridges over the Pregel are located as in the picture
[Appendix Fig.1].

Consider a graph corresponding to the bridge scheme [appendix fig.2].

To answer the question of the problem, it is enough to find out whether the graph is Eulerian: a closed walk passing over every bridge exactly once exists only if every vertex of the graph has an even degree. In our graph all four vertices have an odd degree, so it is impossible, while walking around the city, to cross every bridge exactly once and return to the starting point.

Some interesting problems

1. "Routes".

Task 1

As you remember, the hunter for dead souls Chichikov visited famous landowners once each. He visited them in the following order: Manilov, Korobochka, Nozdrev, Sobakevich, Plyushkin, Tentetnikov, General Betrishchev, Petukh, Konstanzholgo, Colonel Koshkarev. A diagram was found on which Chichikov sketched the relative position of the estates and country roads connecting them. Determine which estate belongs to whom, if Chichikov did not pass any of the roads more than once [appendix fig.4].

Solution:

According to the road map it can be seen that Chichikov began his journey at estate E and ended it at estate O. We notice that only two roads lead to estates B and C, so Chichikov had to travel along these roads; let us mark them with bold lines. The sections of the route passing through A are then determined: AC and AB. Chichikov did not travel along the roads AE, AK and AM, so we cross them out. We mark ED with a bold line and cross out DK; we cross out MO and MN, mark MF with a bold line, cross out FO, and mark FH, NK and KO with bold lines. In this way we find the only route possible under the given condition, and we get: estate E belongs to Manilov, D to Korobochka, C to Nozdrev, A to Sobakevich, B to Plyushkin, M to Tentetnikov, F to Betrishchev, N to Petukh, K to Konstanzholgo, O to Koshkarev [app fig.5].

Task 2

The figure shows a map of the area [appendix fig.6].

You can only move in the direction of the arrows, and each point can be visited no more than once. In how many ways can you get from point 1 to point 9? Which route is the shortest and which is the longest?

Solution:

Sequentially "stratify" the scheme into a tree, starting from vertex 1 [app fig.7]. Let's get a tree. The number of possible ways to get from 1 to 9 is equal to the number of "hanging" vertices of the tree (there are 14 of them). Obviously the shortest path is 1-5-9; the longest is 1-2-3-6-5-7-8-9.

2 "Groups, dating"

Task 1

Participants of the music festival, having met, exchanged envelopes with addresses. Prove that:

a) an even number of envelopes were sent in total;

b) the number of participants who exchanged envelopes an odd number of times is even.

Solution: let the festival participants A1, A2, ..., An be the vertices of the graph, and let the edges connect the pairs of vertices representing participants who exchanged envelopes [Appendix Fig.8].

a) The degree of each vertex Ai shows the number of envelopes that participant Ai gave to his friends. The total number of envelopes handed over, N, is equal to the sum of the degrees of all the vertices of the graph: N = deg(A1) + deg(A2) + ... + deg(An). At the same time N = 2p, where p is the number of edges of the graph, so N is even. Therefore, an even number of envelopes were handed over.

b) In the equality N = deg(A1) + deg(A2) + ... + deg(An) the sum of the odd terms must be even, and this is possible only if the number of odd terms is even. This means that the number of participants who exchanged envelopes an odd number of times is even.
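The same argument is easy to check numerically. The sketch below (mine, on a made-up exchange graph) confirms that the sum of the degrees is always twice the number of edges, so the number of odd-degree vertices is necessarily even.

```python
# A made-up exchange graph: vertices are participants, an edge means two people exchanged envelopes.
participants = ["A1", "A2", "A3", "A4", "A5"]
exchanges = [("A1", "A2"), ("A1", "A3"), ("A2", "A3"), ("A3", "A4"), ("A4", "A5")]

degrees = {p: sum(p in e for e in exchanges) for p in participants}
total_envelopes = sum(degrees.values())        # N = deg(A1) + ... + deg(An)
print(total_envelopes == 2 * len(exchanges))   # part (a): N = 2p, so N is even      -> True
odd = [p for p, d in degrees.items() if d % 2 == 1]
print(len(odd) % 2 == 0)                       # part (b): evenly many odd degrees   -> True
```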

Task 2

Once Andrei, Boris, Volodya, Dasha and Galya agreed to go to the cinema in the evening. They decided to agree on the choice of the cinema and the session by phone. It was also decided that if it was not possible to phone someone, then the trip to the cinema would be cancelled. In the evening, not everyone gathered at the cinema, and therefore the visit to the cinema fell through. The next day, they began to find out who called whom. It turned out that Andrey called Boris and Volodya, Volodya called Boris and Dasha, Boris called Andrey and Dasha, Dasha called Andrey and Volodya, and Galya called Andrey, Volodya and Boris. Who was unable to phone and therefore did not come to the meeting?

Solution:

Let us draw five points and label them A, B, V, D, G, the first letters of the names. Let us connect the points that correspond to the names of those who phoned each other.

[app fig.9]

It can be seen from the picture that each of Andrey, Boris and Volodya phoned all the others; that is why these three came to the cinema. But Galya and Dasha did not manage to phone each other (the points G and D are not connected by a segment), and therefore, in accordance with the agreement, they did not come to the cinema.
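The same reasoning can be checked mechanically: build the graph of calls from the problem statement and look for a pair of vertices not joined by an edge. A small sketch (my own):

```python
from itertools import combinations

# Who called whom (A = Andrey, B = Boris, V = Volodya, D = Dasha, G = Galya).
calls = {"A": {"B", "V"}, "V": {"B", "D"}, "B": {"A", "D"}, "D": {"A", "V"}, "G": {"A", "V", "B"}}

# Two people are joined by an edge if either of them called the other.
edges = {frozenset((caller, other)) for caller, others in calls.items() for other in others}

missing = [pair for pair in combinations(sorted(calls), 2) if frozenset(pair) not in edges]
print(missing)   # [('D', 'G')] -- Dasha and Galya never spoke, so the agreement fell through
```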

The use of graphs in various areas of people's lives

In addition to the examples given, graphs are widely used in construction, electrical engineering, management, logistics, geography, mechanical engineering, sociology, programming, the automation of technological processes and production, psychology, and advertising.

In almost any field of science and technology you meet graphs. Graphs are remarkable mathematical objects with which one can solve mathematical, economic and logical problems and various puzzles, and simplify the conditions of problems in physics, chemistry, electronics and automation. Many mathematical facts are conveniently formulated in the language of graphs. Graph theory is part of many sciences and is one of the most beautiful and visual mathematical theories. Recently it has found more and more applications in applied questions; there has even emerged computer chemistry, a relatively young field of chemistry based on the application of graph theory.

Molecular graphs, used in stereochemistry and structural topology, in the chemistry of clusters, polymers, etc., are undirected graphs that display the structure of molecules [app fig.10]. The vertices of these graphs correspond to atoms and the edges to the chemical bonds between them.

Molecular graphs and trees [app fig.10]: a, b are the multigraphs of ethylene and formaldehyde, respectively; the remaining graphs are the molecular isomers of pentane (trees 4 and 5 are isomorphic to tree 2).

In the stereochemistry of organic compounds, molecular trees are most often used: the main (skeleton) trees of molecular graphs, which contain only the vertices corresponding to carbon atoms. Constructing such trees and establishing their isomorphism make it possible to determine molecular structures and to find the total number of isomers of alkanes, alkenes and alkynes.
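As a small illustration of a molecular multigraph (my own sketch, not taken from fig. 10), ethylene CH2=CH2 can be stored as an edge list in which the double bond between the two carbon atoms is counted twice; the degree of every vertex then equals the valence of the corresponding atom.

```python
from collections import Counter

# Molecular multigraph of ethylene, CH2=CH2: the double bond C1=C2 appears as two parallel edges.
bonds = [
    ("C1", "C2"), ("C1", "C2"),          # the double bond
    ("C1", "H1"), ("C1", "H2"),
    ("C2", "H3"), ("C2", "H4"),
]

degree = Counter(atom for bond in bonds for atom in bond)
print(degree)   # each carbon has degree 4 and each hydrogen degree 1, i.e. the usual valences
```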

Protein networks

Protein networks are groups of physically interacting proteins that function in a cell jointly and in a coordinated manner, controlling interconnected processes in the organism [app fig.11].

The graph of a hierarchical system is called a tree. A distinctive feature of a tree is that there is exactly one path between any two of its vertices. A tree contains no cycles or loops.

Usually, a tree representing a hierarchical system has one main vertex, which is called the root of the tree. Each vertex of the tree (except the root) has only one ancestor - the object designated by it belongs to one top-level class. Any vertex of the tree can generate several descendants - vertices corresponding to lower-level classes.

For each pair of tree vertices there is a unique path connecting them. This property is used when finding all the ancestors, for example in the male line, of any person whose genealogy is represented as a family tree, which is a "tree" in the sense of graph theory.
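In code, the uniqueness of the path towards the root means that the chain of ancestors is obtained simply by following the single parent link from vertex to vertex. The sketch below (with made-up names) illustrates this.

```python
# A made-up fragment of a family tree stored as child -> parent
# (every vertex except the root has exactly one parent).
parent = {"Ivan": "Pyotr", "Pyotr": "Nikolai", "Nikolai": "Aleksandr", "Aleksandr": None}

def ancestors(person):
    """Follow the unique path from a vertex towards the root, collecting every ancestor on the way."""
    chain = []
    while parent.get(person) is not None:
        person = parent[person]
        chain.append(person)
    return chain

print(ancestors("Ivan"))   # ['Pyotr', 'Nikolai', 'Aleksandr']
```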

An example of my family tree [appendix fig.12].

One more example. The figure shows the biblical family tree [appendix fig.13].

Problem solving

1. Transport task. Suppose there is a warehouse with raw materials in the city of Krasnodar, from which the materials must be delivered to Krymsk, Temryuk, Slavyansk-on-Kuban and Timashevsk in a single run, spending as little time and fuel as possible and returning to Krasnodar.

Solution:

First, let us draw a graph of all possible routes [app fig.14], taking into account the real roads between these settlements and the distances between them. To solve the problem we then build another graph, a tree [app fig.15].

For the convenience of the solution, we denote the cities with numbers: Krasnodar - 1, Krymsk - 2, Temryuk - 3, Slavyansk - 4, Timashevsk - 5.

This gives 24 possible routes, but we need only the shortest ones. Of all the routes, only two satisfy this requirement, each 350 km long.
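The 24 variants mentioned above are simply the 4! orders in which the four towns can be visited, so the same answer can be obtained by exhaustive search. In the sketch below (mine) the road distances are placeholders, since the real ones are read off the map in fig. 14; the printed length is therefore illustrative rather than the 350 km of the actual problem.

```python
from itertools import permutations

cities = {1: "Krasnodar", 2: "Krymsk", 3: "Temryuk", 4: "Slavyansk-on-Kuban", 5: "Timashevsk"}

# Placeholder road distances in km; in the real problem they are taken from the map in fig. 14.
road = {
    frozenset((1, 2)): 87, frozenset((1, 3)): 150, frozenset((1, 4)): 78, frozenset((1, 5)): 60,
    frozenset((2, 3)): 62, frozenset((2, 4)): 45, frozenset((2, 5)): 120,
    frozenset((3, 4)): 70, frozenset((3, 5)): 140, frozenset((4, 5)): 85,
}

def run_length(order):
    route = (1,) + order + (1,)                      # leave Krasnodar, visit every town once, return
    return sum(road[frozenset(leg)] for leg in zip(route, route[1:]))

orders = list(permutations([2, 3, 4, 5]))            # all 24 possible orders
best = min(orders, key=run_length)
print(" -> ".join(cities[c] for c in (1,) + best + (1,)), run_length(best), "km")
```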

In the same way it is possible, and in my opinion necessary, to calculate real deliveries from one settlement to another.

2. A logic problem about pouring. There are 8 liters of water in a bucket, and there are two pans with capacities of 5 and 3 liters. It is required to pour 4 liters of water into the five-liter pan and to leave 4 liters in the bucket, i.e. to divide the water equally between the bucket and the large pan.

Solution:

The situation at any moment can be described by three numbers: the amount of water in the bucket, in the 5-liter pan and in the 3-liter pan [appendix fig.16].

As a result, we get two solutions: one in 7 moves, the other in 8 moves.
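The "three numbers" description turns this problem into a path search in a graph whose vertices are states and whose edges are single pourings. The breadth-first sketch below (my own) finds a shortest sequence of pourings and confirms that 7 pourings suffice.

```python
from collections import deque

CAPACITY = (8, 5, 3)                 # bucket, 5-liter pan, 3-liter pan
START, GOAL = (8, 0, 0), (4, 4, 0)   # start with a full bucket; finish with 4 + 4 liters

def pourings(state):
    """All states reachable by one pouring: pour from i into j until i is empty or j is full."""
    result = []
    for i in range(3):
        for j in range(3):
            if i != j and state[i] > 0 and state[j] < CAPACITY[j]:
                amount = min(state[i], CAPACITY[j] - state[j])
                nxt = list(state)
                nxt[i] -= amount
                nxt[j] += amount
                result.append(tuple(nxt))
    return result

# Breadth-first search over the state graph finds a shortest sequence of pourings.
queue, seen = deque([[START]]), {START}
while queue:
    path = queue.popleft()
    if path[-1] == GOAL:
        break
    for nxt in pourings(path[-1]):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(path + [nxt])

print(len(path) - 1, "pourings:", path)   # 7 pourings, ending in the state (4, 4, 0)
```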

Conclusion

So, in order to learn how to solve problems, one needs to understand what they are, how they are arranged, what components they consist of, and what tools are used to solve them.

While solving practical problems with the help of graph theory, it became clear that every step, every stage of the solution, requires creativity.

At the first stage one must be able to analyse and encode the condition of the problem. The second stage is the schematic notation, that is, the geometric representation of the graph, and here the element of creativity is especially important, because it is far from easy to find the correspondence between the elements of the condition and the corresponding elements of the graph.

When solving a transport problem or a problem of compiling a family tree, I concluded that the graph method is certainly interesting, beautiful and visual.

I was convinced that graphs are widely used in economics, management, and technology. Graph theory is also used in programming. This was not discussed in this paper, but I think that this is only a matter of time.

In this work, mathematical graphs and their areas of application are considered, and several problems are solved with the help of graphs. Knowledge of the basics of graph theory is necessary in various areas connected with production management and business (for example, network construction schedules or mail delivery schedules). In addition, while working on the paper I mastered working on a computer in the WORD text editor. Thus, the objectives of the work have been fulfilled.

So, from all of the above, the practical value of graph theory irrefutably follows, the proof of which was the goal of this work.

Literature

1. Berge K. Graph Theory and Its Applications. Moscow: IIL, 1962.

2. Kemeny J., Snell J., Thompson J. Introduction to Finite Mathematics. Moscow: IIL, 1963.

3. Ore O. Graphs and Their Application. Moscow: Mir, 1965.

4. Harary F. Graph Theory. Moscow: Mir, 1973.

5. Zykov A.A. Theory of Finite Graphs. Novosibirsk: Nauka, 1969.

6. Berezina L.Yu. Graphs and Their Application. Moscow: Prosveshchenie (Education), 1979. 144 p.

7. Soros Educational Journal, No. 11, 1996 (article "Flat Graphs").

8. Gardner M. Mathematical Leisure. Moscow: Mir, 1972 (chapter 35).

9. Olekhnik S.N., Nesterenko Yu.V., Potapov M.K. Old Entertaining Problems. Moscow: Nauka, 1988 (part 2, section 8; appendix 4).

Appendix

Fig. 1 - Fig. 16 (the illustrations referenced in the text).