Optimal control. General notes on optimal systems


Optimal systems are systems in which a given quality of operation is achieved through maximum use of the capabilities of the object; in other words, they are systems in which the object operates at the limit of its capabilities.

An optimal control system is a control system that has been selected, by one means or another, to have the best qualities.

The functioning of a control system is evaluated according to an optimality criterion. The task of the theory of optimal control systems is to determine, in general form, the laws of control of an object. From these laws one can judge what can and cannot be achieved under real conditions. The classical formulation of the problem is to determine the optimal control algorithm given a priori information about the control object (a mathematical description, including the restrictions imposed on any coordinates of the system).

Let us consider a first-order aperiodic (lag) element

W(p) = K/(Tp + 1), (1)

with the control action bounded by

|u| ≤ A, (2)

for which it is required to ensure the minimum time of transition of the output y from the initial state y(0) to the final state y_k. The transition (step) function of such a system at K = 1 looks like this:

Fig. 1.1. Transition function of the system at u = const.

Let us consider the situation when the maximum admissible control action is applied to the input of the object.

Fig. 1.2. Transition function of the system at u = A = const.

t_1 is the minimum possible time of transition of y from the zero state to the final state for the given object.
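The time t_1 follows in closed form from (1) and (2): with u = A held constant, y(t) = KA(1 − e^(−t/T)), so t_1 = −T·ln(1 − y_k/(KA)). A minimal sketch (the numeric values K, T, A, y_k below are illustrative assumptions, not taken from the text):

```python
import math

def min_transition_time(K: float, T: float, A: float, y_k: float) -> float:
    """Time for y(t) = K*A*(1 - exp(-t/T)) to first reach y_k under u = A.

    Solving K*A*(1 - exp(-t1/T)) = y_k for t1 gives the closed form below.
    Requires y_k < K*A, otherwise the level is never reached.
    """
    if y_k >= K * A:
        raise ValueError("y_k unreachable with |u| <= A")
    return -T * math.log(1.0 - y_k / (K * A))

# Illustrative numbers: a larger control bound A shortens the transition.
t1_bounded = min_transition_time(K=1.0, T=1.0, A=10.0, y_k=1.0)   # ~0.105
t1_plain   = min_transition_time(K=1.0, T=1.0, A=1.001, y_k=1.0)  # ~6.9
print(t1_bounded, t1_plain)
```

Note that as A grows, t_1 shrinks toward zero, which is exactly why the object is driven "at the limit of its capabilities".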

To obtain such a transition, two control laws are possible:

    program control

u(t) = { A,    t < t_1,
       { y_k,  t ≥ t_1;    (3)

    a feedback-type control law

u = { A,    y < y_k,
    { y_k,  y ≥ y_k.    (4)

The second law is preferable: it implements feedback and therefore continues to provide correct control in the presence of disturbances.
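The advantage of the bounded maximal action is easy to check numerically. The sketch below (a minimal Euler integration of T·dy/dt = −y + K·u; the values T = 1, K = 1, A = 10, y_k = 1 and the 1% tolerance are illustrative assumptions) compares the feedback law (4) with simply applying the constant control u = y_k:

```python
def settle_time(control, T=1.0, K=1.0, y_k=1.0, dt=1e-3, t_max=20.0):
    """Euler-integrate T*dy/dt = -y + K*u and return the first time
    y comes within 1% of y_k. `control` maps the current y to u."""
    y, t = 0.0, 0.0
    while t < t_max:
        if abs(y - y_k) <= 0.01 * y_k:
            return t
        u = control(y)
        y += dt * (-y + K * u) / T
        t += dt
    return t_max

A, y_k = 10.0, 1.0
t_feedback = settle_time(lambda y: A if y < y_k else y_k)  # law (4)
t_constant = settle_time(lambda y: y_k)                    # u = y_k throughout
print(t_feedback, t_constant)
```

With these numbers the feedback law reaches the 1% band in roughly 0.1 s, versus roughly 4.6 s for the constant control.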

Fig. 1.3. Block diagram of a system with a feedback-type control law.

The control goal is expressed by the requirements imposed on the control system:

    restrictions on the output parameters, for example tolerances on manufactured products or errors in stabilization of the controlled variable;

    extremal requirements (maximum power or efficiency, minimum energy loss);

    certain quality indicators (e.g. the content of harmful components in the final product).

Strict formalization of the control goal is very difficult due to the presence of subsystems

When formalizing the criterion, it is necessary to take into account factors of a higher level that influence the behavior of the control system. For example, in mineral extraction one may seek the maximum output of product; but if the quality deteriorates at the same time, the specified quality must also be taken into account.

Thus, when choosing a formalized (mathematical) expression of the optimality criterion, the following must be taken into account:

1) the optimality criterion must reflect economic indicators or values associated with them;

2) for a specific control system, only one criterion is taken into account (if the problem is multi-criteria, a global criterion is formed as a function of the particular criteria);

3) the criterion must depend on the control actions, otherwise it is useless;

4) the criterion function should have a convenient form; it is desirable that the criterion have a single extremum;

5) the information required to compute the criterion should not be redundant; this simplifies the system of measuring devices and increases the reliability of the system as a whole.

Test tasks for self-control

1. Control is:

A) achieving selected goals in practical activity

B) achieving selected goals in scientific activity

C) achieving selected goals in reality

D) achieving selected goals in theoretical activity

E) achieving selected goals in psychological activity

2. How many problems can be stated in control theory?

3. The essence of the control task is:

A) controlling an object in the course of its operation without our direct participation in the process

B) controlling an object in the course of its operation with our direct participation in the process

C) controlling an object in the course of its operation using sensors

4. The essence of the self-control task is:

A) controlling an object in the course of its operation without our direct participation in the process

B) controlling an object in the course of its operation using sensors

C) controlling an object in the course of its operation using a program

D) controlling an object in the course of its operation using a computer

E) all answers are correct

5. Based on the selected optimality criterion, one constructs:

A) an objective function

B) a dependence of parameters

C) an objective function representing the dependence of the optimality criterion on the parameters influencing its value

D) a dependence of the parameters influencing its value

E) all answers are correct

Automatic systems that provide the best technical or technical-economic quality indicators under the given real operating conditions and restrictions are called optimal systems.
Optimal systems are divided into two classes:
- systems with “hard” settings, in which incomplete information does not interfere with achieving the control goal;
- adaptive systems in which incomplete information does not allow achieving the control goal without automatic adaptation of the system under conditions of uncertainty.
The goal of optimization is expressed mathematically as the requirement to ensure the minimum or maximum of some quality indicator, called the optimality criterion or objective function. The main quality criteria of automatic systems are: the cost of development, manufacture and operation of the system; the quality of operation (accuracy and speed); reliability; energy consumption; weight; volume, etc.

The quality of functioning is described by functional dependences of the form

I = ∫_{t_0}^{t_k} f_0(x, u, f_d, t) dt,

where u are the control coordinates; x are the phase coordinates; f_d are the disturbances; and t_0 and t_k are the beginning and end of the process.
When developing optimal ACS, it is necessary to take into account the restrictions imposed on the system, which are of two types:
- natural, determined by the operating principle of the object: for example, the speed of a hydraulic servomotor cannot exceed its speed with the dampers fully open, the speed of a motor cannot exceed the synchronous speed, etc.;
- artificial (conditional), introduced deliberately: for example, limits on the current of a DC motor for normal commutation and heating, or on the acceleration of an elevator for passenger comfort, etc.
Optimality criteria can be scalar, if represented by a single particular criterion, and vector (multi-criteria), if represented by a set of particular criteria.
The time of the transient process can be taken as the optimality criterion, i.e. I = ∫_0^{t_tr} dt = t_tr, and an automatic control system is time-optimal if the minimum of this integral is ensured subject to the restrictions. The integral estimates of transient quality known in automatic control theory are also used, for example the quadratic one, I = ∫_0^∞ e^2(t) dt. As the optimality criterion for systems under random influences, the mean squared error of the system is used. When control is supplied from sources of limited power, a functional characterizing the energy consumed for control is taken, I = ∫ u(t) i(t) dt, where u(t) and i(t) are the voltage and current of the control circuit. Sometimes the maximum profit of a complex ACS of a technological process is taken as the optimality criterion: I = Σ g_i P_i − S, where g_i is the price of a product, P_i is the productivity, and S is the costs.
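Such integral criteria are easy to evaluate from a simulated transient. A minimal sketch (the first-order plant and all numbers are illustrative assumptions): it computes the quadratic integral estimate I = ∫ e^2(t) dt and a 5% settling time for a unit step response:

```python
def simulate_step(T=1.0, K=1.0, dt=1e-3, t_end=10.0):
    """Euler simulation of T*dy/dt = -y + K*u for a unit step u = 1."""
    y, ys = 0.0, []
    for _ in range(int(t_end / dt)):
        ys.append(y)
        y += dt * (-y + K * 1.0) / T
    return ys, dt

ys, dt = simulate_step()
# Quadratic integral estimate I = ∫ e^2 dt with e = y_final - y (here y_final = 1).
I_quad = sum((1.0 - y) ** 2 for y in ys) * dt
# Settling time: first moment at which the error enters the 5% band
# (the response is monotone here, so it never leaves the band again).
t_settle = next(i * dt for i, y in enumerate(ys) if abs(1.0 - y) <= 0.05)
print(I_quad, t_settle)
```

For this plant the analytic values are I = T/2 = 0.5 and t_settle = T·ln 20 ≈ 3.0, which the simulation reproduces to within the discretization error.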
Compared with less rigorous methods for designing closed-loop control systems, the advantages of optimization theory are as follows:
1) the design procedure is clearer, since it includes all essential aspects of quality in a single design indicator;
2) the designer can evidently expect to obtain the best result with respect to the chosen quality indicator; for the problem under consideration, the region of restrictions is indicated;
3) incompatibility of a number of quality requirements can be detected;
4) the procedure directly includes prediction, since the quality indicator is evaluated over the future values of the variables during the control time;
5) the resulting control system will be adaptive if the design indicator is reformulated during operation and the controller parameters are recalculated at the same time;
6) determining optimal non-stationary processes introduces no additional difficulties;
7) nonlinear objects are treated directly as well, although the complexity of the calculations increases.



The difficulties inherent in optimization theory are as follows:
1) turning diverse design requirements into a mathematically meaningful quality indicator is not an easy task; trial and error may be required;
2) existing optimal control algorithms for nonlinear systems require complex calculation programs and, in some cases, a large amount of machine time;
3) the quality of the resulting control system is very sensitive to erroneous assumptions of various kinds and to changes in the parameters of the control object.

The optimization problem is solved in three stages:
1) construction of mathematical models of the physical process and of the quality requirements (the mathematical model of the quality requirements is the quality indicator of the system);
2) calculation of the optimal control actions;
3) synthesis of a controller that generates the optimal control signals.

Figure 10.1 shows the classification of optimal systems.

In a broad sense, the word “optimal” means the best in the sense of some criterion of efficiency. With this interpretation, any scientifically based system is optimal, since when choosing a system it is implied that it is in some respect better than other systems. The criteria by which the choice is made (optimality criteria) may be different. These criteria may be the quality of the dynamics of control processes, system reliability, energy consumption, its weight and dimensions, cost, etc., or a combination of these criteria with certain weighting coefficients.

Below, the term “optimal” is used in a narrow sense, when the automatic control system is assessed only by the quality of dynamic processes, and the criterion (measure) of this quality is the integral quality indicator. This description of quality criteria makes it possible to use the well-developed mathematical apparatus of the calculus of variations to find optimal control.

Next, two classes of systems are considered: program control systems, in which the control action does not use information about the current state of the object, and automatic regulation systems (systems for stabilizing programmed motion), which operate on the feedback principle.

Variational problems that arise when constructing optimal program and stabilizing control systems are formulated in the first chapter. The second chapter outlines the mathematical theory of optimal control (the maximum principle of L. S. Pontryagin and the dynamic programming method of R. Bellman). This theory is the foundation for building optimal systems and provides a great deal of information about the structure of optimal control. An illustration of the latter is time-optimal control, which is the subject of the third chapter. At the same time, the practical use of the theory runs into computational difficulties: the mathematical theory of optimal control reduces the construction of an optimal control to the solution of a boundary value problem for differential equations (ordinary or partial).

The difficulties of numerically solving boundary value problems lead to the fact that the construction of optimal controls for each class of control objects is an independent creative task, the solution of which requires taking into account the specific features of the object, the experience and intuition of the developer.

These circumstances prompted the search for classes of objects for which, when constructing optimal control, the boundary value problem is easily solved numerically. Such control objects turned out to be objects described by linear differential equations. These results, obtained by A. M. Letov and R. Kalman, formed the basis of a new direction in the synthesis of optimal stabilization systems, called analytical design of regulators.

Analytical design of regulators, widely used in the design of modern complex stabilization systems, is the subject of the fourth and fifth chapters.

In the general case, an automatic control system consists of a control object OU with an operating parameter Y, a controller R, and a programmer (setpoint generator) P (Fig. 6.3), which generates a command action (program) to achieve the control goal subject to the qualitative and quantitative requirements. The programmer takes into account the totality of external information (signal I).

Fig. 6.3. Optimal control structure.

The task of creating an optimal system is to synthesize, for a given control object, a controller and a programmer that achieve the required control goal in the best way.
In the theory of automatic control, two related problems are considered: the synthesis of an optimal programmer and the synthesis of an optimal controller. Mathematically, they are formulated in the same way and solved by the same methods. At the same time, the tasks have specific features that at a certain stage require a differentiated approach.

A system with an optimal programmer (optimal program control) is called optimal with respect to the control mode. A system with an optimal controller is called optimal with respect to transients. An automatic control system is called optimal if both the controller and the programmer are optimal.
In some cases the programmer is assumed given, and only the optimal controller needs to be determined.

The problem of synthesizing optimal systems is formulated as a variational problem or a mathematical programming problem. In this case, in addition to the transfer function of the control object, one specifies restrictions on the control actions and on the operating parameters of the control object, boundary conditions, and an optimality criterion. The boundary conditions determine the state of the object at the initial and final moments of time. The optimality criterion, a numerical indicator of the quality of the system, is usually specified in the form of a functional

J = J[u(t), y(t)],

where u(t) are the control actions and y(t) are the parameters of the control object.

The optimal control problem is formulated as follows: given a control object, restrictions and boundary conditions, find a control (programmer or controller) at which the optimality criterion takes on a minimum (or maximum) value.
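As a toy instance of this formulation (the plant dy/dt = −y + u, the bound u ≤ 2, the switch-time parametrization, and the criterion J = ∫ (y − 1)^2 dt are all illustrative assumptions), the control can be reduced to a single parameter, the switch time t_s, and the criterion minimized by direct search:

```python
def cost(t_s, dt=1e-3, t_end=5.0):
    """J = ∫ (y - 1)^2 dt for dy/dt = -y + u, with u = 2 until t_s, then u = 1."""
    y, J, t = 0.0, 0.0, 0.0
    while t < t_end:
        u = 2.0 if t < t_s else 1.0
        J += (y - 1.0) ** 2 * dt
        y += dt * (-y + u)
        t += dt
    return J

# Crude grid search over the switch time (a stand-in for a real
# mathematical-programming solver).
candidates = [i * 0.01 for i in range(201)]   # t_s in [0, 2]
best_ts = min(candidates, key=cost)
print(best_ts, cost(best_ts), cost(0.0))
```

The best switch time found this way lies near t_s = ln 2 ≈ 0.69, the moment at which the output driven by u = 2 first reaches y = 1; switching there beats applying u = 1 from the start.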


Optimal control

Optimal control is the task of designing a system that provides, for a given control object or process, a control law or a control sequence of actions delivering the maximum or minimum of a given set of system quality criteria.

To solve the optimal control problem, a mathematical model of the controlled object or process is constructed, describing its behavior over time under the influence of control actions and of its own current state. The mathematical model for the optimal control problem includes: formulation of the control goal, expressed through a control quality criterion; determination of the differential or difference equations describing the possible motions of the control object; and determination of restrictions on the resources used, in the form of equations or inequalities.

The most widely used methods in the design of control systems are the calculus of variations, Pontryagin's maximum principle, and Bellman's dynamic programming.

Sometimes (for example, when controlling complex objects such as a blast furnace in metallurgy, or when analyzing economic information) the initial data and knowledge about the controlled object in the statement of the optimal control problem contain uncertain or fuzzy information that cannot be processed by traditional quantitative methods. In such cases one can use optimal control algorithms based on the mathematical theory of fuzzy sets (fuzzy control). The concepts and knowledge used are converted into fuzzy form, fuzzy inference rules for the decisions to be made are defined, and the fuzzy decisions are then converted back into physical control variables.
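A minimal sketch of the fuzzify-infer-defuzzify pipeline just described (the triangular membership functions, the three rules, and all numbers are illustrative assumptions, not a production fuzzy controller):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_control(error):
    """Three rules: negative error -> push up, zero -> hold, positive -> push down.
    Defuzzification by weighted average of the rule conclusions (singletons)."""
    mu_neg = tri(error, -2.0, -1.0, 0.0)
    mu_zero = tri(error, -1.0, 0.0, 1.0)
    mu_pos = tri(error, 0.0, 1.0, 2.0)
    weights = [(mu_neg, +1.0), (mu_zero, 0.0), (mu_pos, -1.0)]
    total = sum(mu for mu, _ in weights)
    if total == 0.0:
        return 0.0
    return sum(mu * u for mu, u in weights) / total

print(fuzzy_control(-0.5), fuzzy_control(0.0), fuzzy_control(0.5))
```

Negative error yields a positive corrective action, positive error a negative one, and the output varies smoothly between the rule conclusions.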

Optimal control problem

Let us formulate the optimal control problem:

minimize the functional

J = ∫_{t0}^{t1} f_0(x(t), u(t), t) dt

subject to the dynamics

dx/dt = f(x(t), u(t), t),   x(t0) = x0;

here x is the state vector, u is the control, and t0 and t1 are the initial and final moments of time.

The optimal control problem consists in finding state and control functions x(t), u(t) for t ∈ [t0, t1] that minimize the functional.

Calculus of variations

Let us consider this optimal control problem as a Lagrange problem of the calculus of variations. To find necessary conditions for an extremum, we apply the Euler-Lagrange theorem. The Lagrange function has the form

Λ = ∫_{t0}^{t1} L dt + l(x(t0), x(t1)),

where the term l(x(t0), x(t1)) accounts for the boundary conditions, and the Lagrangian has the form

L = λ_0 f_0(x, u, t) + λ^T(t) (dx/dt − f(x, u, t)),

where λ_0 and the n-dimensional vector λ(t) are Lagrange multipliers.

The necessary conditions for an extremum, according to this theorem, have the form:

∂L/∂u = 0 (stationarity in u),   (3)
d/dt (∂L/∂(dx/dt)) − ∂L/∂x = 0 (stationarity in x, the Euler equation),   (4)
dx/dt = f(x, u, t) (stationarity in λ).   (5)

Necessary conditions (3)-(5) form the basis for determining optimal trajectories. Having written these equations, we obtain a two-point boundary value problem, in which part of the boundary conditions is specified at the initial moment of time and the rest at the final moment. Methods for solving such problems are discussed in detail in the book.
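A minimal illustration of such a two-point boundary value problem (the toy problem itself is an assumption: minimize ∫ u^2/2 dt for dx/dt = u with x(0) = 0 and x(1) = 1). Stationarity gives u = λ with a constant costate λ, whose initial value is unknown; a simple shooting method recovers it:

```python
def shoot(lam0, dt=1e-3):
    """Integrate the state equation dx/dt = u with u = lam (the costate is
    constant here, since dlam/dt = 0) and return the terminal state x(1)."""
    x, lam, t = 0.0, lam0, 0.0
    while t < 1.0:
        u = lam          # stationarity: dH/du = lam - u = 0
        x += dt * u
        t += dt
    return x

# Bisection on the unknown initial costate so that x(1) hits the target 1.0.
lo, hi = 0.0, 5.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if shoot(mid) < 1.0:
        lo = mid
    else:
        hi = mid
lam_star = 0.5 * (lo + hi)
print(lam_star)   # analytic answer: lam = 1, i.e. u(t) = 1
```

In real problems the integration is a full forward pass of the state and costate equations, and the shooting adjusts the unknown initial costates until the terminal boundary conditions are satisfied.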

Pontryagin's maximum principle

The need for Pontryagin's maximum principle arises when, within the admissible range of the control variable, it is nowhere possible to satisfy the necessary condition (3), namely ∂H/∂u = 0.

In this case, condition (3) is replaced by condition (6):

H(x*(t), u*(t), λ(t), t) = max over u ∈ U of H(x*(t), u, λ(t), t).   (6)

In this case, according to Pontryagin's maximum principle, the value of the optimal control is equal to the value of the control at one of the ends of the admissible range. The Pontryagin equations are written in terms of the Hamilton function H, defined by the relation H = λ^T f(x, u, t) − λ_0 f_0(x, u, t). It follows that the Hamilton function H is related to the Lagrange function L as follows: L = λ^T dx/dt − H. Substituting L from the last equation into equations (3)-(5), we obtain the necessary conditions expressed in terms of the Hamilton function:

∂H/∂u = 0,
dλ^T/dt = −∂H/∂x,
dx/dt = ∂H/∂λ.

The necessary conditions written in this form are called the Pontryagin equations. Pontryagin's maximum principle is discussed in more detail in the book.

Where is it used?

The maximum principle is especially important in control systems of maximum speed of response and minimum energy consumption, where relay-type controls are used that take extreme rather than intermediate values within the admissible control interval.
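The textbook example of such relay control is the time-optimal double integrator d^2x/dt^2 = u with |u| ≤ 1 (an assumed illustration, not from the text): the maximum principle yields bang-bang control with at most one switch. For x(0) = 1, v(0) = 0 the optimal control is u = −1 for t < 1 and u = +1 for 1 ≤ t < 2, bringing the state to the origin at t = 2:

```python
def simulate_bang_bang(dt=1e-4):
    """Euler simulation of dx/dt = v, dv/dt = u with the known
    one-switch time-optimal control for x(0) = 1, v(0) = 0."""
    x, v, t = 1.0, 0.0, 0.0
    while t < 2.0:
        u = -1.0 if t < 1.0 else +1.0   # relay control, single switch at t = 1
        x += dt * v
        v += dt * u
        t += dt
    return x, v

x_end, v_end = simulate_bang_bang()
print(x_end, v_end)   # both close to 0
```

The control never takes an intermediate value: it sits at one end of the admissible interval, jumps once, and the state arrives at the origin in the minimum possible time.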

History

For the development of the theory of optimal control, L. S. Pontryagin and his collaborators V. G. Boltyansky, R. V. Gamkrelidze, and E. F. Mishchenko were awarded the Lenin Prize in 1962.

Dynamic programming method

The dynamic programming method is based on Bellman's principle of optimality, which is formulated as follows: an optimal control strategy has the property that, whatever the initial state and the control at the beginning of the process, the subsequent controls must constitute an optimal control strategy with respect to the state obtained after the initial stage of the process. The dynamic programming method is described in more detail in the book.
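Bellman's principle translates directly into a backward recursion over stages. A minimal sketch (the scalar system x_{k+1} = x_k + u_k, the small integer state grid, and the stage cost x^2 + u^2 are illustrative assumptions):

```python
# Finite-horizon dynamic programming by backward value iteration.
STATES = range(-5, 6)          # small integer grid for the scalar state
CONTROLS = (-1, 0, 1)
N = 10                         # horizon

def clamp(x):
    return max(-5, min(5, x))

# V[k][x] = minimal cost-to-go from state x at stage k; policy[k][x] = best u.
V = {N: {x: 0.0 for x in STATES}}
policy = {}
for k in range(N - 1, -1, -1):
    V[k], policy[k] = {}, {}
    for x in STATES:
        # Bellman equation: V_k(x) = min_u [ x^2 + u^2 + V_{k+1}(x + u) ]
        best_u, best_cost = min(
            ((u, x * x + u * u + V[k + 1][clamp(x + u)]) for u in CONTROLS),
            key=lambda p: p[1],
        )
        V[k][x], policy[k][x] = best_cost, best_u

print(V[0][0], policy[0][3], policy[0][-3])
```

Each stage's value function is built only from the next stage's, exactly as the principle of optimality prescribes; the resulting policy pushes the state toward zero from either side.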


Literature

  1. Rastrigin L. A. Modern Principles of Control of Complex Objects. Moscow: Sov. Radio, 1980. 232 pp.
  2. Alekseev V. M., Tikhomirov V. M., Fomin S. V. Optimal Control. Moscow: Nauka, 1979. 223 pp.



Wikimedia Foundation. 2010.

See what “Optimal control” is in other dictionaries:

    Optimal control - control that provides the most favorable value of a certain optimality criterion (OC), characterizing the effectiveness of control under the given restrictions. Various technical or economic… (Dictionary-Reference Book of Terms of Normative and Technical Documentation)

    optimal control - control whose purpose is to ensure an extreme value of the control quality indicator. [Collection of Recommended Terms. Issue 107. Control Theory. USSR Academy of Sciences. Committee of Scientific and Technical Terminology. 1984] (Technical Translator's Guide)

    Optimal control - 1. A basic concept of the mathematical theory of optimal processes (belonging to the branch of mathematics of the same name); it means the selection of control parameters that would provide the best, from the point of… (Economic-Mathematical Dictionary)

    Optimal control - allows, under given conditions (often contradictory), to achieve a goal in the best possible way, e.g. in the minimum time, with the greatest economic effect, with maximum accuracy… (Big Encyclopedic Dictionary)

    Optimal control - in aviation, a branch of flight dynamics devoted to the development and use of optimization methods for determining the laws of motion control of an aircraft and its trajectories that provide the maximum or minimum of a selected criterion… (Encyclopedia of Technology)

    Optimal control - a branch of mathematics that studies non-classical variational problems. The objects that technology deals with are usually equipped with "rudders"; with their help a person controls the motion. Mathematically, the behavior of such an object is described… (Great Soviet Encyclopedia)

    Optimal control - allows, under given conditions (often contradictory), to achieve a goal in the best possible way, for example in the minimum time, with the greatest economic effect, with maximum accuracy… (Encyclopedic Dictionary)
