Complex random functions and their characteristics. Numerical characteristics of a random function


In all previous paragraphs of this chapter it was assumed that the control and disturbing influences are definite functions of time. However, automatic control systems operating in real conditions are characterized by the fact that these influences are random and fundamentally unpredictable.

Consider, for example, the operation of a tracking system that steers a radar antenna. For this system the control action is the position of the target, while the disturbing influences include wind loads on the antenna, deviations of the beam from the direction to the target due to refraction in the atmosphere, intrinsic noise in the amplification path of the system, interference from power supplies, and so on. All these processes are caused by many interacting factors and are so complex that they cannot be represented by any prescribed function of time. The same can be said of the control action. In practice it cannot be treated as a typical signal such as a step, a linearly increasing function, a sinusoid, or any other regular signal. In reality the target maneuvers, so its position at any subsequent moment cannot be predicted exactly. The maneuvering is accompanied by a constant wandering of the reflecting point over the target's body.

Thus, control signals and disturbances in real conditions are random processes. A random, or stochastic, process is a function of time that is a random variable for each value of the argument. If some independent variable other than time is used, the term random function is applied. When the conditions of a random process are reproduced repeatedly, the process takes on different specific values each time. These values, as functions of time, are called realizations of the random process. A typical view of several realizations of the stochastic process of error in the angular coordinate of a target tracked by a radar station is presented in Fig. XIII.14.

Mathematical description of a random process. For a fixed value of the argument, a random process is a random variable, whose complete description is given by the distribution function

$$F_1(x; t) = P\{X(t) < x\},$$

i.e., the probability that at the given moment t the random variable X(t) takes a value less than x. As is known from probability theory, instead of the distribution function it is often more convenient to use the probability density, which is its derivative (in the generalized sense):

$$w_1(x; t) = \frac{\partial F_1(x; t)}{\partial x}.$$

If we fix two moments of time t1 and t2, then the values of the random process form a system of two random variables, or a two-dimensional random vector. For its full description one needs to know the two-dimensional distribution function

$$F_2(x_1, x_2; t_1, t_2) = P\{X(t_1) < x_1,\; X(t_2) < x_2\}$$

or the two-dimensional density

$$w_2(x_1, x_2; t_1, t_2) = \frac{\partial^2 F_2(x_1, x_2; t_1, t_2)}{\partial x_1\,\partial x_2},$$

both of which depend on the two parameters t1 and t2.

Fig. XIII.14. Stochastic process of error in measuring the angular coordinate of a target tracked by a radar station

For a more detailed description of a random process at arbitrary moments of time t1, t2, …, tn, distribution functions and densities of higher orders are introduced in the same way. Thus, a complete statistical description of a random function (process) is given by the sequence of its distribution functions

$$F_1(x_1; t_1),\; F_2(x_1, x_2; t_1, t_2),\; \dots,\; F_n(x_1, \dots, x_n; t_1, \dots, t_n),\; \dots$$

or by the sequence of their derivatives

$$w_1(x_1; t_1),\; w_2(x_1, x_2; t_1, t_2),\; \dots,\; w_n(x_1, \dots, x_n; t_1, \dots, t_n),\; \dots$$

Each member of these sequences has the usual properties of distribution functions or densities, respectively. In addition, each subsequent member of the sequence determines all the previous ones. For example, setting x2 = ∞, we obtain

$$F_2(x_1, \infty; t_1, t_2) = F_1(x_1; t_1).$$

Similar formulas hold for any other moments of time. This condition is called the consistency condition for the family of distribution functions. The symmetry condition also holds:

$$F_2(x_1, x_2; t_1, t_2) = F_2(x_2, x_1; t_2, t_1),$$

and similarly for the higher orders.

In general, higher order densities or distribution functions are not determined by lower order densities or functions.

However, it is often useful to consider the so-called absolutely random process, whose values are independent in the aggregate for any t1, t2, …, tn. For such a process the distribution density of any order is determined by the one-dimensional density:

$$w_n(x_1, \dots, x_n;\; t_1, \dots, t_n) = \prod_{i=1}^{n} w_1(x_i; t_i).$$

Such a process is a mathematical idealization, since for sufficiently close values of the argument the values of any real process are close and therefore dependent. The opposite extreme case is a degenerate, or singular, process determined by one or several random variables; for example,

$$X(t) = A \sin(\omega t + \varphi),$$

where A is a random variable and ω, φ are known constants. Such a process becomes fully known if its value can be measured at a single moment of time. In the more general case a singular random process is characterized by a set of random variables, for example,

$$X(t) = \sum_{k=1}^{n} A_k f_k(t),$$

where A1, …, An are random variables and fk(t) are ordinary (deterministic) functions of time.
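As an illustration, here is a minimal numerical sketch of a singular process of the reconstructed form X(t) = A sin(ωt + φ); the distribution of A and all parameter values are illustrative assumptions, not taken from the source.

```python
# Sketch: realizations of a singular (degenerate) random process
# X(t) = A*sin(w*t + phi), with A random and w, phi known constants.
# The normal distribution of A and all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
w, phi = 2.0, 0.3                        # known constants
A = rng.normal(1.0, 0.2, size=5)         # one random amplitude per realization

realizations = A[:, None] * np.sin(w * t[None, :] + phi)

# Measuring one realization at a single instant t0 with sin(w*t0 + phi) != 0
# recovers A, and with it the entire realization -- which is what makes the
# process singular (fully known after one measurement).
```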

Fig. XIII.15. Possible realizations of two random functions: a - with high-frequency components; b - with low-frequency components

Moment functions. In practical problems one usually uses simpler characteristics of random processes, the moment functions. The first-order moment, or mathematical expectation, of the process is the expression

$$m_x(t) = M[X(t)] = \int_{-\infty}^{\infty} x\, w_1(x; t)\, dx.$$

Considered as a function of t, it is the average value about which all realizations of the random process are grouped (Fig. XIII.15).

The mathematical expectations of higher powers of the process are called initial moments of order k:

$$M[X^k(t)] = \int_{-\infty}^{\infty} x^k\, w_1(x; t)\, dx.$$

A random function with zero mean is called centered; the centered process is

$$\mathring{X}(t) = X(t) - m_x(t).$$

The central moment of order k of a process is the mathematical expectation of the k-th power of the centered process, M[X̊^k(t)]. The measure of dispersion of the values of a random process about its mathematical expectation is given by the central moment of second order, more often called the dispersion:

$$D_x(t) = M[\mathring{X}^2(t)] = \int_{-\infty}^{\infty} \big(x - m_x(t)\big)^2\, w_1(x; t)\, dx.$$

However, characteristics of a random process based on the one-dimensional density do not reflect how the realizations change in time. For example, two processes with the same one-dimensional density (Fig. XIII.15, a and b) may differ in the rate of change of their realizations, that is, in the degree of relation between two values taken in one realization at different moments of time. To describe the internal time structure of random processes, the correlation function is used:

$$K_x(t_1, t_2) = M[\mathring{X}(t_1)\,\mathring{X}(t_2)] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} \big(x_1 - m_x(t_1)\big)\big(x_2 - m_x(t_2)\big)\, w_2(x_1, x_2; t_1, t_2)\, dx_1\, dx_2.$$

This function is often called the autocorrelation or covariance function; it plays a major role in the theory of random processes.

It is easy to show that the correlation function is symmetric in its arguments and that for t1 = t2 = t its value equals the variance of the random process. Indeed,

$$K_x(t, t) = M[\mathring{X}^2(t)] = D_x(t).$$

To characterize the accuracy of automatic regulation systems it is convenient to use the uncentered correlation function

$$R_x(t_1, t_2) = M[X(t_1)\,X(t_2)],$$

also called the second initial moment of the process.

The connection between K_x and R_x is established by the following transformation:

$$R_x(t_1, t_2) = M\big[(\mathring{X}(t_1) + m_x(t_1))(\mathring{X}(t_2) + m_x(t_2))\big] = K_x(t_1, t_2) + m_x(t_1)\, m_x(t_2).$$

For t1 = t2 = t the mean square of the process is

$$R_x(t, t) = D_x(t) + m_x^2(t).$$
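The moment functions above can be estimated directly from an ensemble of realizations. The following numpy sketch (the test process and all parameters are illustrative assumptions, not from the source) estimates m_x(t), K_x(t1, t2) and R_x(t1, t2) and checks the relation R_x = K_x + m_x m_x:

```python
# Sketch: ensemble ("across the process") estimates of the moment functions.
# `xs` holds n realizations row-wise; the test process is arbitrary.
import numpy as np

rng = np.random.default_rng(1)
n, m = 2000, 200
t = np.linspace(0.0, 5.0, m)
xs = rng.normal(1.0, 0.5, (n, 1)) * np.sin(t) + 0.3 * rng.standard_normal((n, m))

m_x = xs.mean(axis=0)            # estimate of m_x(t)
xc = xs - m_x                    # centered realizations
K = xc.T @ xc / n                # K_x(t1, t2) as an m-by-m matrix
R = xs.T @ xs / n                # uncentered R_x(t1, t2)
D = np.diag(K)                   # dispersion D_x(t) = K_x(t, t)

# R_x(t1, t2) = K_x(t1, t2) + m_x(t1) m_x(t2) holds exactly for these estimates
assert np.allclose(R, K + np.outer(m_x, m_x))
```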

In automatic control systems several random disturbing or control signals, independent or interrelated, often act simultaneously. A measure of the relation between two random processes X(t) and Y(t) is the mutual correlation function

$$K_{xy}(t_1, t_2) = M[\mathring{X}(t_1)\,\mathring{Y}(t_2)] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} \big(x - m_x(t_1)\big)\big(y - m_y(t_2)\big)\, w(x, y; t_1, t_2)\, dx\, dy,$$

where w(x, y; t1, t2) is the joint probability density of X(t1) and Y(t2); for independent processes K_xy(t1, t2) = 0.

The cross-correlation function satisfies the equality

$$K_{xy}(t_1, t_2) = K_{yx}(t_2, t_1).$$

The theory of random processes that uses only moments of the first and second orders is called correlation theory. It was created by the fundamental works of A. N. Kolmogorov, A. Ya. Khinchin, and N. Wiener. The Soviet scientists V. S. Pugachev, V. V. Solodovnikov, and others made a great contribution to its development.

Stationary random processes. Among the various random processes one distinguishes a group of processes whose statistical properties do not change under a shift in time. Such processes are called stationary. Examining the set of realizations of the random process shown in Fig. XIII.14, one may assume that in this case the time origin can be chosen arbitrarily, i.e. the process is stationary. By contrast, Fig. XIII.15 evidently shows examples of non-stationary processes.

The study of systems in which the random processes are stationary is much simpler than that of systems with non-stationary processes. Since the processes in many control systems can be considered approximately stationary, the theory of stationary random processes is of great practical importance.

By the definition of a stationary random process, its mathematical expectation must remain constant when the argument is shifted by any interval τ:

$$m_x(t + \tau) = m_x(t) = m_x = \text{const},$$

and the correlation function must satisfy the relation

$$K_x(t_1 + \tau,\; t_2 + \tau) = K_x(t_1, t_2).$$

Setting τ = −t1, we find that the correlation function of a stationary process depends only on the difference of the readings, τ = t2 − t1:

$$K_x(t_1, t_2) = K_x(0, t_2 - t_1) = K_x(\tau).$$

Ergodic properties of random processes. If we have a set, or, as one says, an ensemble of realizations, then the mathematical expectation and the correlation function are obtained by averaging over the ensemble of realizations of the random process, that is, "across" the process in one or, respectively, two of its sections. It is also of interest to average a realization x(t) of a stationary process over time along the t axis on an interval [0, T], defining this operation in the natural way:

$$\bar{x}_T = \frac{1}{T}\int_0^T x(t)\, dt.$$

This quantity differs from realization to realization and is itself random. It can be shown that for a stationary process its mathematical expectation equals m_x. At the same time the dispersion of this quantity, as direct calculation shows, is

$$D[\bar{x}_T] = \frac{2}{T}\int_0^T \left(1 - \frac{\tau}{T}\right) K_x(\tau)\, d\tau.$$

If this dispersion tends to zero as T → ∞, the time average converges to m_x, and the process is said to be ergodic with respect to the mathematical expectation.

Fig. XIII.16. Block diagram of a correlator

The conditions for a process to be ergodic with respect to the correlation function, formulated by V. S. Pugachev, involve higher moments of the random process and are not given here.

The ergodic properties of random processes make it possible to replace averaging over a set of realizations, which is rarely feasible in practice, by time averaging over a single realization with T large.

Not all stationary processes possess ergodic properties. For example, a process all of whose realizations are constant in time (each equal to a random variable) is easily seen to be non-ergodic. The physical meaning of ergodicity thus consists in a "good mixing" of the realizations of the random process. Since this occurs in almost all applications, in what follows we shall assume the processes under consideration to be ergodic.

For such processes the average value and the correlation function can be determined experimentally using special devices called correlators. The principle of operation of a correlator is clear from Fig. XIII.16.

If a single signal x(t) is fed to the input of the correlator, then for a sufficiently large integration time T its output gives the time average of the process, approximately coinciding with its mathematical expectation. If the signal delayed by τ, x(t − τ), is fed to the second input, the output gives the second initial moment R_x(τ), from which it is easy to determine the correlation function.
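A minimal software analogue of the correlator of Fig. XIII.16 can be sketched as follows: delay, multiply, and average over a long time T. The exponentially correlated test signal and all parameter values are assumptions for illustration; the sketch relies on the ergodicity assumed above.

```python
# Sketch of the correlator principle (Fig. XIII.16): delay -> multiply ->
# time-average. The exponentially correlated test signal is illustrative.
import numpy as np

rng = np.random.default_rng(2)
dt, T = 0.01, 2000.0
N = int(T / dt)
alpha, sigma = 1.0, 1.0                      # correlation time 1/alpha
a = np.exp(-alpha * dt)
b = sigma * np.sqrt(1.0 - a * a)
x = np.empty(N)
x[0] = 0.0
for k in range(1, N):                        # discretized test process
    x[k] = a * x[k - 1] + b * rng.standard_normal()

mean_est = x.mean()                          # zero delay: time average ~ m_x

def R_hat(lag):                              # delayed product, time-averaged
    return np.mean(x[: N - lag] * x[lag:])

taus = np.arange(1, 300)
R = np.array([R_hat(k) for k in taus])       # second initial moment R_x(tau)
K = R - mean_est**2                          # correlation function K_x(tau)
# For this signal K_x(tau) should be close to sigma^2 * exp(-alpha * tau * dt).
```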

1. CONCEPT OF RANDOM FUNCTION

For a long time, applied probability theory was limited to the concept of random variables. Their use makes it possible to perform static calculations that take random factors into account. However, mechanical systems are also subject to a variety of dynamic, that is, time-varying, influences of a random nature. These include, in particular, vibration and shock effects during the movement of vehicles, aerodynamic forces caused by atmospheric turbulence, seismic forces, and loads caused by random deviations from the nominal operating conditions of machines.

Random dynamic phenomena are also encountered in the analysis of economic trends (for example, changes in stock or currency prices). Operation under random disturbances is typical of control systems for various dynamic objects.

To analyze such phenomena the concept of a random function is used. A random function X(t) of an argument t is a function whose value for every t is a random variable. If the argument takes discrete values t1, t2, …, tk, one speaks of a random sequence X1, X2, …, Xk, where Xi = X(ti).

In many practical problems the non-random argument t has the meaning of time; the random function is then called a random process, and the random sequence a time series. However, the argument of a random function can have a different meaning. For example, one can speak of the terrain Z(x, y), where the arguments are the position coordinates x and y and the role of the random function is played by the height z above sea level. In what follows, bearing in mind the applications of random functions to the study of dynamical systems, we shall for definiteness speak of random processes.

Suppose that in studying a random process X(t) we perform n independent experiments and obtain the realizations

x1(t), x2(t), …, xn(t),

representing n deterministic functions. The resulting family of curves characterizes, to a certain extent, the properties of the random process. Thus, Fig. 1.1a shows realizations of a random process with a constant average level and a constant spread of values around the average; Fig. 1.1b shows realizations of a random process with a constant average and changing spread; and Fig. 1.1c shows realizations of a random process with time-varying mean and spread.



Fig. 1.1. Typical realizations of random processes

Fig. 1.2 shows realizations of two random processes that have the same average level and spread but differ in smoothness. The realizations of the random process in Fig. 1.2a are high-frequency in character, those in Fig. 1.2b low-frequency.

Fig. 1.2. High-frequency and low-frequency random processes

Thus, X(t) can also be considered as the set of all its possible realizations, which is governed by certain probabilistic laws. As with random variables, a comprehensive description of these laws is provided by distribution functions or densities. A random process is considered specified if all multidimensional distribution laws of the random variables X(t1), X(t2), …, X(tn) are given for any values t1, t2, …, tn from the domain of the argument t. One speaks, in particular, of the one-dimensional distribution density p(x, t), the two-dimensional density p(x1, x2; t1, t2), and so on.

To simplify the analysis, in most cases one restricts attention to moment characteristics, most often the moments of the first and second orders. To characterize the average level of a random process, the mathematical expectation is used:

$$m_x(t) = M[X(t)]. \qquad (1.1)$$

To characterize the amplitude of deviations of the random process from the average level, the dispersion is used:

$$D_x(t) = M\big[(X(t) - m_x(t))^2\big]. \qquad (1.2)$$

To characterize the variability (smoothness) of a random process, the correlation (autocorrelation) function is used:

$$K_x(t_1, t_2) = M\big[(X(t_1) - m_x(t_1))(X(t_2) - m_x(t_2))\big]. \qquad (1.3)$$

As follows from (1.3), the correlation function is the covariance of the random variables X(t1) and X(t2). As is known from the probability theory course, covariance characterizes the interdependence of X(t1) and X(t2).
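The link between smoothness and the correlation function (1.3) can be seen numerically. Below is an illustrative sketch (the AR(1)-type test processes and all parameters are assumptions) that generates a high-frequency and a low-frequency process with equal mean and dispersion, in the spirit of Fig. 1.2, and compares their correlation at a fixed lag:

```python
# Sketch for Fig. 1.2: equal mean and dispersion, different smoothness --
# the difference shows up only in the correlation function (1.3).
import numpy as np

def ar1(alpha, dt, n_steps, rng):
    """Discretized process with K(tau) ~ exp(-alpha*|tau|), unit dispersion."""
    a = np.exp(-alpha * dt)
    x = np.zeros(n_steps)
    for k in range(1, n_steps):
        x[k] = a * x[k - 1] + np.sqrt(1.0 - a * a) * rng.standard_normal()
    return x

rng = np.random.default_rng(3)
dt, n = 0.01, 100_000
fast = ar1(20.0, dt, n, rng)     # high-frequency realizations (Fig. 1.2a)
slow = ar1(0.5, dt, n, rng)      # low-frequency realizations (Fig. 1.2b)

lag = int(0.2 / dt)              # correlation at tau = 0.2
for name, x in (("fast", fast), ("slow", slow)):
    k_tau = np.mean(x[:-lag] * x[lag:]) - x.mean() ** 2
    print(name, "K(0.2) ~", round(float(k_tau), 3))
# The slow process remains strongly correlated at tau = 0.2 (K ~ 0.9);
# the fast one has almost decorrelated (K ~ 0.02).
```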

Within the framework of the correlation theory of random functions, which operates only with first- and second-order moments, many technical problems can be solved. In particular, one can determine a priori and conditional probabilities of a random process exceeding specified boundaries. At the same time, some practically important problems cannot be solved by means of correlation theory and require the use of multidimensional distribution densities. Such problems include, for example, calculating the average time a random process spends above or below a given boundary.

2. TYPES OF RANDOM PROCESSES

2.1. Quasi-deterministic random processes


A random function (r.f.) is a function that, as a result of an experiment, may take one or another specific form, unknown in advance. Usually the argument of a random function is time; in that case the r.f. is called a random process (r.p.).

An r.f. of a continuously varying argument t is a random variable whose distribution depends not only on the argument t = t1 but also on the particular values this quantity took at other values of the argument t = t2. These random variables are correlated with one another, and the closer the values of the arguments, the stronger the correlation. In the limit, as the interval between two values of the argument tends to zero, the correlation coefficient tends to one:

$$r_{X(t_1),\, X(t_1 + \Delta t_1)} \to 1 \quad \text{as } \Delta t_1 \to 0,$$

i.e., X(t1) and X(t1 + Δt1) as Δt1 → 0 are related by a linear relationship.

As a result of one experiment an r.f. takes an innumerable (in general, uncountable) set of values, one for each value of the argument or for each combination of argument values. For every moment of time this function has one completely definite value. The result of measuring a continuously changing quantity is such a random variable, which in each particular experiment is a certain function of time.

An r.f. can also be regarded as an infinite set of random variables depending on one or several continuously varying parameters t. To each given value of the parameter t corresponds one random variable Xt. Together, all the Xt determine the r.f. X(t). These random variables are correlated with one another, the more strongly the closer they are to each other.

An elementary r.f. is the product of an ordinary random variable X and a non-random function φ(t): X(t) = X·φ(t); in such an r.f. it is not the form that is random, but only its scale.
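For the elementary r.f. the moment functions follow immediately from the definitions; a short derivation (standard, written here with the assumed notation M[X] = m, D[X] = D):

```latex
% Moment functions of the elementary random function X(t) = X\,\varphi(t),
% where M[X] = m and D[X] = D:
\begin{aligned}
m_x(t)        &= M[X\,\varphi(t)] = m\,\varphi(t),\\
K_x(t_1, t_2) &= M\big[(X - m)\,\varphi(t_1)\,(X - m)\,\varphi(t_2)\big]
               = D\,\varphi(t_1)\,\varphi(t_2),\\
D_x(t)        &= K_x(t, t) = D\,\varphi^2(t).
\end{aligned}
```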

A centered r.f. has a mathematical expectation (m.o.) equal to zero. Below, p(x, t1) denotes the distribution density of the random variable X (the value of the r.f. X(t)) taken at an arbitrary value t1 of the argument t.

A realization of the r.f. X(t) obtained in the first experiment is described by an equation x = f1(t); in the second, by x = f2(t). In general f1(t) and f2(t) are different functions, but the closer t1 is to t2 (t1 → t2), the more nearly identical and linearly related the corresponding values are.

The one-dimensional probability density of an r.f., p(x, t), depends on x and on the parameter t. The two-dimensional probability density p(x1, x2; t1, t2) is the joint distribution law of the values X(t1) and X(t2) of the r.f. X(t) for two arbitrary values t1 and t2 of the argument t. In the general case the function X(t) is characterized by a large number of n-dimensional distribution laws

$$p(x_1, x_2, \dots, x_n;\; t_1, t_2, \dots, t_n). \qquad (66.5)$$

The m.o. of an r.f. X(t) is the non-random function m_x(t) which for each value of the argument t equals the m.o. of the ordinate of the r.f. at that t:

$$m_x(t) = M[X(t)] = \int_{-\infty}^{\infty} x\, p(x, t)\, dx,$$

where the density p(x, t) depends on x and on t.

Likewise, the dispersion is a non-random function:

$$D_x(t) = M\big[(X(t) - m_x(t))^2\big].$$

The degree of dependence of the random variables for different values of the argument is characterized by the autocorrelation function.

The autocorrelation function of an r.f. X(t) is the function K_x(ti, tj) which for each pair of values ti, tj equals the correlation moment of the corresponding ordinates of the r.f. (for i = j the correlation function (c.f.) turns into the dispersion of the r.f.):

$$K_x(t_1, t_2) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} \big(x_1 - m_x(t_1)\big)\big(x_2 - m_x(t_2)\big)\, p(x_1, x_2; t_1, t_2)\, dx_1\, dx_2,$$

where p(x1, x2; t1, t2) is the joint distribution density of two random variables (values of the r.f.) taken at two arbitrary values t1 and t2 of the argument t. For t1 = t2 = t we obtain the dispersion D(t).

The autocorrelation function is thus the m.o. of the product of the deviations of two ordinates of the r.f., taken at the arguments t1 and t2, from the ordinates of the non-random m.o. function taken at the same arguments.

The autocorrelation function characterizes the degree of variability of the r.f. as the argument changes. The figure shows that the dependence between the values of the r.f. corresponding to two given values of the argument t is weaker in the first case.

Fig. Correlated random functions

If two r.f.'s X(t) and Y(t) forming a system are not independent, then their mutual correlation function is not identically zero:

$$K_{xy}(t_1, t_2) = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} \big(x - m_x(t_1)\big)\big(y - m_y(t_2)\big)\, p(x, y; t_1, t_2)\, dx\, dy,$$

where p(x, y; t1, t2) is the joint distribution density of two random variables (values of the two r.f.'s X(t) and Y(t)) taken at two arbitrary arguments (t1 is the argument of X(t), t2 the argument of Y(t)).

If X(t) and Y(t) are independent, then K_xy(t1, t2) = 0. A system of n r.f.'s X1(t), X2(t), …, Xn(t) is characterized by n m.o.'s, n autocorrelation functions, and a further n(n−1)/2 mutual correlation functions.

The mutual correlation function of two r.f.'s X(t) and Y(t) (which characterizes the stochastic relation between them) is the non-random function of two arguments ti and tj which for each pair of values ti, tj equals the correlation moment of the corresponding sections of the two r.f.'s. It establishes the relation between two values of the two functions (each value being a random variable) at the two arguments t1 and t2.

Of particular importance are stationary random functions, whose probabilistic characteristics do not change under any shift of the argument. The m.o. of a stationary r.f. is constant (i.e. not a function of t), and the correlation function depends only on the difference of the argument values, τ = tj − ti:

$$K_x(t_i, t_j) = K_x(\tau).$$

This is an even function (symmetric about the ordinate axis): K_x(−τ) = K_x(τ).

For large values of the time interval τ = t2 − t1, the deviation of the ordinate of the r.f. from its m.o. at the moment t2 becomes practically independent of the value of this deviation at the moment t1. In this case the function K_x(τ), which gives the correlation moment between X(t1) and X(t2), tends to zero as |τ| → ∞.

Many stationary r.f.'s possess the ergodic property: as the observation interval increases without bound, the average observed value of the stationary r.f. approaches its m.o. with probability 1. Observing a stationary r.f. at different values of t over a sufficiently long interval in one experiment is then equivalent to observing its values at the same value of t in a number of experiments.

Sometimes it is necessary to determine the characteristics of a transformed r.f. from the characteristics of the original r.f. Thus, if

$$Y(t) = \frac{dX(t)}{dt}, \qquad Z(t) = \int_0^t X(s)\, ds,$$

then

$$m_y(t) = \frac{dm_x(t)}{dt}, \qquad m_z(t) = \int_0^t m_x(s)\, ds, \qquad (70.5)$$

i.e., the m.o. of the integral (derivative) of an r.f. equals the integral (derivative) of its m.o. (y(t) is the rate of change of the r.f. X(t); m_y(t) is the rate of change of the m.o.).

Integration or differentiation of an r.f. again yields an r.f. If X(t) is normally distributed, then Z(t) and Y(t) are also normally distributed. If X(t) is a stationary r.f., then Z(t) is no longer stationary, since it depends on t.

Examples of correlation functions of stationary processes (as functions of τ):

1) K(τ) = σ²e^(−α|τ|) (from (2) as β → 0);

2) K(τ) = σ²e^(−α|τ|) cos βτ;

3) K(τ) = σ²e^(−α|τ|)(cos βτ + (α/β) sin β|τ|);

4) K(τ) = σ²e^(−ατ²) cos βτ;

5) K(τ) = σ²e^(−α|τ|)(1 + α|τ|) (from (3) as β → 0);

6) K(τ) = σ²e^(−ατ²) (from (4) as β → 0).

In the graphs, α = 1, β = 5, σ = 1.

Here α characterizes the rate of decrease of the correlation between ordinates of the r.f. as the difference τ between their arguments grows.

The ratio α/β characterizes the "degree of irregularity of the process." For small α/β the ordinates of the process are strongly correlated and a realization of the process resembles a sinusoid; for large α/β the realizations are irregular. The correlation function of the derivative Y(t) = Ẋ(t) is

$$K_y(t_1, t_2) = \frac{\partial^2 K_x(t_1, t_2)}{\partial t_1\, \partial t_2}. \qquad (71.5)$$

For a stationary function, formula (71.5) takes the form

$$K_y(\tau) = -\frac{d^2 K_x(\tau)}{d\tau^2}.$$

The mutual correlation function of an r.f. and its derivative is

$$K_{x\dot{x}}(\tau) = \frac{dK_x(\tau)}{d\tau}.$$

For a differentiable stationary process, the ordinate of the r.f. and that of its derivative taken at the same moment of time are uncorrelated random variables (and for a normal process, independent).
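These derivative relations can be checked symbolically. The sketch below uses the reconstructed example 5, K(τ) = σ²e^(−ατ)(1 + ατ) for τ ≥ 0 (treat the specific form as an assumption), and verifies that the function and its derivative are uncorrelated at τ = 0:

```python
# Sketch: symbolic check of K_y(tau) = -K''(tau) and K_{x x'}(0) = 0 for the
# differentiable correlation function K(tau) = s^2 e^{-a tau} (1 + a tau),
# tau >= 0 (the reconstructed example 5; treat the form as an assumption).
import sympy as sp

tau, a, s = sp.symbols('tau a s', positive=True)
K = s**2 * sp.exp(-a * tau) * (1 + a * tau)   # branch for tau >= 0

K_xdx = sp.simplify(sp.diff(K, tau))          # K_{x x'}(tau) = K'(tau)
K_y = sp.simplify(-sp.diff(K, tau, 2))        # K_y(tau) = -K''(tau)

print(K_xdx)                 # -a**2*s**2*tau*exp(-a*tau): vanishes at tau = 0
print(K_y)                   # a**2*s**2*(1 - a*tau)*exp(-a*tau)
assert K_xdx.subs(tau, 0) == 0
```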

When an r.f. is multiplied by a deterministic function we obtain the r.f. Z(t) = a(t)X(t), whose correlation function equals

$$K_z(t_1, t_2) = a(t_1)\, a(t_2)\, K_x(t_1, t_2), \qquad (72.5)$$

where a(t) is the deterministic function.

The sum of two r.f.'s is also an r.f., Z(t) = X(t) + Y(t), and its correlation function, when X(t) and Y(t) are correlated, is

$$K_z(t_1, t_2) = K_x(t_1, t_2) + K_y(t_1, t_2) + 2K_{xy}(t_1, t_2), \qquad (73.5)$$

where K_xy(t1, t2) — see (68.5) — is the mutual correlation function of the two dependent r.f.'s X(t) and Y(t).

If X(t) and Y(t) are independent, then K_xy(t1, t2) = 0. The m.o. of the r.f. Z(t) is m_z(t) = m_x(t) + m_y(t).
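Formulas (72.5) and (73.5) are easy to confirm numerically. The following sketch (the elementary test processes and all parameters are illustrative assumptions) compares sample correlation matrices on a small grid of time points:

```python
# Sketch: sample-based check of (72.5) and (73.5) on a grid of time points.
# The elementary test processes and all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
t = np.array([0.5, 1.0, 2.0])
X = rng.standard_normal((n, 1)) * np.sin(t)            # K_x = sin(t1) sin(t2)
Y = rng.standard_normal((n, 1)) * np.cos(t) + 0.5 * X  # correlated with X

def cov(U, V):
    Uc, Vc = U - U.mean(0), V - V.mean(0)
    return Uc.T @ Vc / len(U)

a = 1.0 + 0.1 * t**2                                   # deterministic a(t)
Z1 = a * X                                             # (72.5): K_z = a a' K_x
assert np.allclose(cov(Z1, Z1), np.outer(a, a) * cov(X, X))

Z2 = X + Y                                             # (73.5), general form:
rhs = cov(X, X) + cov(Y, Y) + cov(X, Y) + cov(Y, X)    # K_xy + K_yx = 2 K_xy
assert np.allclose(cov(Z2, Z2), rhs)                   # when K_xy is symmetric
```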

Main goals

Two main types of problems can be distinguished whose solution requires the theory of random functions.

Direct problem (analysis): given the parameters of a device and the probabilistic characteristics (mathematical expectations, correlation functions, distribution laws) of the function (signal, process) arriving at its "input", determine the characteristics at the "output" of the device (these are used to judge the "quality" of the device's operation).

Inverse problem (synthesis): given the probabilistic characteristics of the "input" and "output" functions, design an optimal device (find its parameters) that converts the given input function into an output function with the given characteristics. The solution of this problem requires, in addition to the apparatus of random functions, other disciplines, and is not considered in this book.

Definition of a random function

A random function is a function of a non-random argument t which for each fixed value of the argument is a random variable. Random functions of the argument t are denoted by capital letters X(t), Y(t), etc.

For example, if U is a random variable, then the function X(t) = t²U is random. Indeed, for each fixed value of the argument this function is a random variable: for t1 = 2 we obtain the random variable X1 = 4U; at t2 = 1.5, the random variable X2 = 2.25U, etc.

For brevity of further presentation, we introduce the concept of a section.

A section of a random function is the random variable corresponding to a fixed value of the argument of the random function. For example, for the random function X(t) = t²U given above, with the argument values t1 = 2 and t2 = 1.5 the random variables X1 = 4U and X2 = 2.25U were obtained; these are sections of the given random function.

So, a random function can be considered as a set of random variables {X(t)} depending on the parameter t. Another interpretation of a random function is possible if we introduce the concept of its realization.

A realization (trajectory, sample function) of a random function X(t) is the non-random function of the argument t which the random function may turn out to equal as a result of a test.

Thus, if a random function is observed in an experiment, then in reality one of its possible realizations is observed; obviously, when the experiment is repeated, a different realization will generally be observed.

Realizations of the function X(t) are denoted by lowercase letters x1(t), x2(t), etc., where the index indicates the test number. For example, if X(t) = U sin t, where U is a continuous random variable that took the possible value u1 = 3 in the first test and u2 = 4.6 in the second, then the realizations of X(t) are, respectively, the non-random functions x1(t) = 3 sin t and x2(t) = 4.6 sin t.

So, a random function can also be considered as the set of its possible realizations.

A random (stochastic) process is a random function of the argument t interpreted as time. For example, if an airplane is supposed to fly at a given constant speed, then in reality, owing to the influence of random factors (temperature fluctuations, changes in wind strength, etc.) which cannot be taken into account in advance, its speed varies. In this example the speed of the aircraft is a random function of a continuously varying argument (time), i.e. the speed is a random process.

Note that if the argument of a random function changes discretely, then the corresponding values of the random function (random variables) form a random sequence.

The argument of a random function need not be time. For example, if the diameter of a weaving thread is measured along its length, then owing to random factors the diameter varies. In this example the diameter is a random function of a continuously varying argument (the distance along the thread).

Obviously, a random function generally cannot be specified analytically (by a formula). In special cases, however, if the form of the random function is known and its defining parameters are random variables, it can be given analytically. For example, the following are random functions:

X(t) = sin Ωt, where Ω is a random variable;

X(t) = U sin t, where U is a random variable;

X(t) = U sin Ωt, where Ω and U are random variables.
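A minimal sketch of drawing realizations of these three analytically specified random functions; the distributions chosen below for U and Ω are illustrative assumptions:

```python
# Sketch: drawing realizations of X1(t) = sin(W t), X2(t) = U sin t,
# X3(t) = U sin(W t). The distributions of U and W are illustrative.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 10.0, 400)

for trial in range(3):                  # each trial produces one realization
    W = rng.uniform(0.5, 2.0)           # random frequency Omega
    U = rng.normal(1.0, 0.3)            # random amplitude U
    x1 = np.sin(W * t)                  # realization of X1
    x2 = U * np.sin(t)                  # realization of X2
    x3 = U * np.sin(W * t)              # realization of X3
    # once W and U are drawn, each x_i is an ordinary non-random function
```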
