ISSN 1991-2927
 


'Automation of Control Processes' № 1 (59) 2020

Contents
MATHEMATICAL MODELING

Aleksandr Kupriianovich Ivanov, Doctor of Sciences in Engineering; graduated from the Faculty of Physics at Irkutsk State University, completed his postgraduate studies at Bauman Moscow Technical School and his doctoral studies at Ulyanovsk State Technical University; Chief Staff Scientist of FRPC JSC ‘RPA ‘Mars’; an author of monographs, tutorials, and articles in the field of the mathematical modeling of hierarchical real-time computer-aided control systems.

Aleksandr Leonidovich Savkin, Candidate of Military Sciences, Associate Professor; graduated from the Ulyanovsk Higher Military Command School of Communications and the Marshal Budjonny Military Academy of the Signal Corps; completed his postgraduate studies at the Military Academy of Communications; Head of the Science and Engineering Support Department of FRPC JSC ‘RPA ‘Mars’; an author of scientific works, manuals, and articles in the field of the development and modeling of communication control systems and automated information systems.

The decision-making model in management bodies (59_1.pdf)

The modeling of decision making is based on an analogy between the intellectual activity of officials in management bodies and an inventor’s work, described in terms of catastrophe theory. The formation of a decision is represented as motion in a potential field of cusp geometry. The potential function has two minima: the first corresponds to the initial data, the second to the decision reached. To make the transition, a potential barrier must be overcome by carrying out the required research and accumulating knowledge. The activity of officials is divided into two parts. The first stage is a multistage data transformation using known algorithms, and the second is a creative process for identifying the decision variables. Analytical solutions for three and four stages are obtained. The second part is modeled as a chain reaction. The analogy with chain reactions rests on the assumption that the transformation of initial data creates information resources that stimulate intellectual conclusions, with an avalanche-like growth of new knowledge.

Management bodies, decision making, mathematical model, catastrophe theory.
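The abstract above describes decision formation as motion in a cusp-geometry potential field with two minima separated by a barrier. The following minimal sketch is only an illustration of that picture, not the authors' model: it uses the canonical cusp potential V(x) = x^4/4 + a*x^2/2 + b*x with invented control parameters a and b and locates its stationary points (the two minima and the barrier between them).

```python
import numpy as np

def cusp_potential(x, a, b):
    """Canonical cusp-catastrophe potential V(x) = x^4/4 + a*x^2/2 + b*x."""
    return 0.25 * x**4 + 0.5 * a * x**2 + b * x

def stationary_points(a, b):
    """Real roots of V'(x) = x^3 + a*x + b = 0 (the minima and the barrier)."""
    roots = np.roots([1.0, 0.0, a, b])
    return np.sort(roots[np.isclose(roots.imag, 0.0)].real)

# Illustrative control parameters: a < 0 and small |b| give two minima
# (initial data vs. decision reached) separated by a potential barrier.
a, b = -2.0, 0.3
for x in stationary_points(a, b):
    print(f"stationary point x = {x:+.3f}, V(x) = {cusp_potential(x, a, b):+.3f}")
```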


Aleksei Vladimirovich Golubkov, graduated from Ilya Ulyanov State Pedagogical University with the Master’s degree in the methodology of mathematical education; Postgraduate Student at the Department of Higher Mathematics of the Faculty of Physics, Mathematics, and Technology Education of UlSPU; an author of articles and certificates of software registration in the field of mathematical modeling and software engineering. e-mail: kr8589@gmail.com

A solution to the problem of the detection of changes in object motion mode with a limited size of the Kalman filter bank (59_2.pdf)

The paper presents a solution to the problem of detecting changes in the motion mode of an object moving along a complex trajectory. It is assumed that a complex trajectory can be represented by a sequence of segments of finite length, on each of which the object either moves uniformly in a straight line or makes a circular turn to the right or to the left at constant velocity. A hybrid stochastic model is used to simulate the complex trajectory. The task is to detect a change in the motion mode as soon as possible in order to calculate optimal estimates of the object’s motion parameters in real time. The solution is based on a sequential decision rule for choosing the current motion mode at an unknown time instant, with a limited size of the bank of competing Kalman filters. The algorithm for the a priori estimation of the size of the competing filter bank is implemented in MATLAB, and numerical experiments are carried out. The developed algorithm for estimating the size of the Kalman filter bank is used to solve the problem of early detection of changes in the motion mode of an object moving along a complex trajectory.

Stochastic discrete linear systems, hybrid stochastic model, a bank of Kalman filters, sequential decision making rule.
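As a rough illustration of the competing-filter idea in the abstract above (not the authors' hybrid model or their sequential rule), the sketch below runs two scalar Kalman filters with different assumed process noise on the same measurements and picks, at each step, the mode whose filter gives the larger innovation likelihood; the mode names and noise values are hypothetical.

```python
import numpy as np

class KalmanFilter1D:
    """Scalar random-walk Kalman filter; stands in for one motion-mode model."""
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q, self.r, self.x, self.p = q, r, x0, p0

    def step(self, z):
        self.p += self.q                      # predict
        s = self.p + self.r                   # innovation covariance
        k = self.p / s                        # Kalman gain
        innovation = z - self.x
        self.x += k * innovation              # update
        self.p *= (1.0 - k)
        # Gaussian likelihood of the innovation, used to score this mode.
        return np.exp(-0.5 * innovation**2 / s) / np.sqrt(2.0 * np.pi * s)

# A bank of two hypothetical modes: slowly varying vs. rapidly maneuvering.
bank = {"steady": KalmanFilter1D(q=0.01, r=1.0),
        "maneuver": KalmanFilter1D(q=1.0, r=1.0)}

rng = np.random.default_rng(0)
truth = np.concatenate([np.zeros(30), np.linspace(0.0, 15.0, 30)])  # mode change at k = 30
for k, z in enumerate(truth + rng.normal(0.0, 1.0, truth.size)):
    likelihoods = {name: f.step(z) for name, f in bank.items()}
    mode = max(likelihoods, key=likelihoods.get)
    if k in (10, 29, 40, 59):
        print(f"k = {k:2d}  selected mode: {mode}")
```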


Vladimir Nikolaevich Kliachkin, Doctor of Sciences in Engineering, Professor; graduated from the Faculty of Mechanics of Ulyanovsk Polytechnic Institute; Professor at the Department of Applied Mathematics and Informatics of Ulyanovsk State Technical University; an author of scientific publications in the field of reliability and statistical methods. e-mail: v_kl@mail.ru

Anastasiia Valerevna Alekseeva, Postgraduate Student at the Department of Applied Mathematics and Informatics of UlSTU; graduated from the Faculty of Information Systems and Technologies of UlSTU; a standardization engineer at the Ulyanovsk Design Bureau of Instrumentation, JSC; an author of scientific publications in the field of statistical methods of quality control. e-mail: age-89@mail.ru

Study of the efficiency of statistical control of hydro-unit vibrations (59_3.pdf)

To diagnose the technical condition of a hydraulic unit, vibration monitoring is carried out, since the vibration level largely determines the quality of operation of the unit. When assessing the stability of vibrations, methods of statistical process control can be used. The set of monitored indicators includes both independent and correlated indicators; when monitoring correlated indicators, multivariate control methods are used. Monitoring of the process mean level is based on the Hotelling algorithm, and the generalized dispersion algorithm is used for the analysis of multivariate scattering. The methodology and test results for analyzing the effectiveness of the generalized dispersion algorithm for controlling multidimensional vibration scattering are considered. First, a set of samples is generated that is statistically identical to the studied vibration process of the hydraulic unit, that is, with the mean vector and covariance matrix corresponding to the training sample obtained from the real process. Based on the results of these statistical tests, regression dependences of the average run length on the characteristics of the process disturbance were obtained, on the basis of which the quality of vibration diagnostics can be estimated.

Vibration stability, multivariate scattering, generalized dispersion, average run length.
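The abstract above relies on two standard multivariate statistics: the Hotelling statistic for the mean level and the generalized variance (the determinant of the covariance matrix) for scattering. The sketch below only illustrates how these statistics are computed for monitoring subgroups against a training sample; the data are synthetic, and the control limits and average-run-length experiments of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training sample of three correlated vibration indicators (in-control process).
train = rng.multivariate_normal(mean=[0.0, 0.0, 0.0],
                                cov=[[1.0, 0.3, 0.1],
                                     [0.3, 1.0, 0.2],
                                     [0.1, 0.2, 1.0]], size=500)
mu0 = train.mean(axis=0)
sigma0 = np.cov(train, rowvar=False)
sigma0_inv = np.linalg.inv(sigma0)

def hotelling_t2(subgroup):
    """Hotelling statistic of the subgroup mean against the training parameters."""
    d = subgroup.mean(axis=0) - mu0
    return subgroup.shape[0] * d @ sigma0_inv @ d

def generalized_variance(subgroup):
    """Determinant of the subgroup covariance matrix (multivariate scattering)."""
    return np.linalg.det(np.cov(subgroup, rowvar=False))

# Monitoring: new subgroups of n = 10 observations each.
for i in range(3):
    subgroup = rng.multivariate_normal(mu0, sigma0, size=10)
    print(f"subgroup {i}: T2 = {hotelling_t2(subgroup):5.2f}, |S| = {generalized_variance(subgroup):5.2f}")
```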


INFORMATION SYSTEMS

Damir Maratovich Valeev, graduated from Ulyanovsk State University; Postgraduate Student at the Department of Telecommunication Technologies and Networks of Ulyanovsk State University; an author of articles in the field of cryptography and steganography. e-mail: damirivaleev@gmail.com

Aleksei Arkadevich Smagin, Doctor of Sciences in Engineering, Professor; graduated from the Radioengineering Faculty of Ulyanovsk Polytechnic Institute; Head of the Department of Telecommunication Technologies and Networks of Ulyanovsk State University; an author of articles, inventions, and manuals in the field of the development of information systems of different purposes. e-mail: smaginaa1@mail.ru

Determination of the payload capacity of a container obtained by a steganographic technique not causing distortion (59_4.pdf)

This article discusses the parameters and characteristics of a steganographic algorithm that does not distort the container, such as the versatility of application to different types of information, ease of implementation, security, and payload capacity. The purpose of this work is to determine the dependence of the payload of a hidden communication channel on the sizes of the embedded data and the container. To achieve this goal, the tasks were to determine the payload at different sizes of the container and the data, and to establish whether the secret data could be embedded completely. To solve these problems, two experiments were conducted with different ratios of the sizes of the embedded secret and the container, and the ratio of data size to container size required for efficient embedding was determined. A set-theoretic representation of the stegosystem model for the algorithm is described, and the scheme of operation of this algorithm is shown. The payload of the container obtained using the proposed algorithm is compared with that of existing steganographic methods on the Lenna and Baboon test images. The experimental results show that the proposed algorithm is more effective than most existing methods in terms of embedding capacity.

Steganography, embedding, key, payload, algorithm, secret, container.


Denis Aleksandrovich Korochentsev, Candidate of Sciences in Engineering; graduated from the Rostov Military Institute of Missile Troops; Acting Head of the Department of Cybersecurity of Information Systems of Don State Technical University; an author of publications and certificates of computer program registration in the field of the development of artificial intelligence systems, information security, and cryptographic protection of information. e-mail: mytelefon@mail.ru

Anna Sergeevna Pavlenko, Second-Year Master’s Student at the Department of Cybersecurity of Information Systems of Don State Technical University; an author of publications and certificates of computer program registration in the field of development of artificial intelligence systems, information security, neural networks, and fuzzy logic. e-mail: anutka-pavlenko@yandex.ru

Development of a neuro-fuzzy model of information security by identifying unmasking features of the object of protection (59_5.pdf)

The article deals with an information security model developed by the authors that is based on an adaptive neuro-fuzzy system providing timely detection of unmasking features of the object of protection. A number of unmasking features of the object of study with fuzzy values are used as input variables of the considered model. The model combines fuzzy inference with the ability to train an artificial neural network. The set of fuzzy inference rules is generated directly from experimental data. The output of the system is the value of an integral index in the interval [0, 1] that describes the information content of the characteristic structure of the object under study, accompanied by recommendations for reducing that information content. The correctness of the model was evaluated by comparing experimental and predicted (forecast) results.

Unmasking feature, information security, fuzzy logic, neural network, model.
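A minimal sketch of the fuzzy-inference half of such a model, under heavy assumptions: two hypothetical unmasking features scaled to [0, 1], Gaussian membership functions, and fixed zero-order Sugeno rule consequents standing in for the values the authors' network would learn from data. It only shows how fuzzy rule firing yields an integral index in [0, 1].

```python
import numpy as np

def gauss_mf(x, c, s):
    """Gaussian membership function with center c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def integral_index(f1, f2):
    """Zero-order Sugeno inference: weighted average of rule consequents in [0, 1].
    The rule consequents below are illustrative constants, not trained values."""
    rules = [
        (gauss_mf(f1, 0.2, 0.15) * gauss_mf(f2, 0.2, 0.15), 0.1),  # both features weak
        (gauss_mf(f1, 0.8, 0.15) * gauss_mf(f2, 0.2, 0.15), 0.5),
        (gauss_mf(f1, 0.2, 0.15) * gauss_mf(f2, 0.8, 0.15), 0.5),
        (gauss_mf(f1, 0.8, 0.15) * gauss_mf(f2, 0.8, 0.15), 0.9),  # both features strong
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([y for _, y in rules])
    return float(weights @ outputs / weights.sum())

print(integral_index(0.25, 0.30))   # low information content of the object's features
print(integral_index(0.85, 0.75))   # high information content
```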


Dmitrii Vladimirovich Kurganov, Candidate of Sciences in Physics and Mathematics; graduated from Samara University; Associate Professor at the Department of Oil and Gas Engineering of Samara State Technical University; an author of articles and patents in the field of the modeling of oil and gas field development processes. e-mail: Dmitri.Kourganov@inbox.ru

The calculation of the bottomhole treatment success probability using machine learning techniques (59_6.pdf)

Machine learning is one of the most widely used branches of science and engineering nowadays, and the availability of information in electronic form is an important condition for implementing it. Over the long history of oil field exploitation, a significant database related to well operation and the production stimulation techniques applied has been created and accumulated. The article deals with one of the machine-learning methods for predicting the outcome of geological and engineering operations carried out in producing oil wells. In particular, using a database containing the results of hydrochloric-acid and mud-acid jobs performed in the oil fields of the Ural-Volga region, and using decision-tree models, the probability of successful geological and engineering operations is calculated, and recommendations on the selection of factors allowing these operations to be optimized are given. Openness, the average number of permeable intervals, reservoir temperature, the actual water cut of the well, and fluid properties are parameters influencing the efficiency of geological and engineering operations. In many cases, it is difficult to predict their future influence, especially when other factors are present. Existing analytical models cannot fully describe the variety of factors in the processes running in the bottomhole zone, especially in the context of nonlinear filtration and the physical and chemical interaction between formation fluids and injected solutions. The described technique allows any number and any combination of key factors to be used, and the most important of them to be detected, including but not limited to the parameters described. The application of decision-tree models is an intuitive approach that allows the splitting attribute at each level to be selected algorithmically; the article also describes in detail the algorithm for calculating the splitting attribute. The decision-tree methodology can be used for solving other problems in the oil-producing industry where significant practical experience has been accumulated.

Hydrochloric acid treatment, mud acid treatment, bottomhole treatment, yield, crude oil, oil well, probability, machine learning, decision tree, modeling.
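The sketch below illustrates the decision-tree idea from the abstract above on synthetic data: the factor list follows the abstract (permeable intervals, reservoir temperature, water cut, fluid properties), but the values, the success rule used to label the data, and the tree settings are all invented for the example and are not the paper's dataset or model.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
n = 400

# Synthetic stand-ins for the factors named in the abstract (units are illustrative).
X = np.column_stack([
    rng.integers(1, 8, n),            # number of permeable intervals
    rng.uniform(20, 90, n),           # reservoir temperature, deg C
    rng.uniform(0, 100, n),           # water cut, %
    rng.uniform(1, 50, n),            # fluid viscosity, mPa*s
])
# Synthetic "success" label loosely favouring low water cut and moderate temperature.
p = 1.0 / (1.0 + np.exp(0.04 * (X[:, 2] - 50) - 0.02 * (60 - X[:, 1])))
y = rng.random(n) < p

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

print("test accuracy:", tree.score(X_test, y_test))
# Probability of a successful treatment for one hypothetical candidate well.
candidate = [[3, 55.0, 30.0, 12.0]]
print("success probability:", tree.predict_proba(candidate)[0, 1])
```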


Aleksei Andreevich Sapunkov, graduated from the Faculty of Information Systems and Technologies of Ulyanovsk State Technical University; Assistant Professor at the Department of Information Systems of UlSTU; an author of articles in the field of analyzing and forecasting time series. e-mail: sapalks@gmail.com

Tatiana Vasilevna Afanaseva, Doctor of Sciences in Engineering, Associate Professor; graduated from the Radioengineering Faculty of Ulyanovsk Polytechnic Institute; Professor of the Department of Information Systems of UlSTU; an author of articles in the field of analyzing and forecasting time series. e-mail: tv.afanasjeva@gmail.com

A decision support methodology for prioritizing user requests for software modifications (59_7.pdf)

In this paper, existing approaches are analyzed and a decision support methodology for prioritizing user requests for software modifications received through a technical support service is described. This task is relevant for iteratively developed software, since at each iteration a stream of requests for software modification is received from end users. The aim of the proposed methodology is to automate the process of evaluating and prioritizing (ranking) requests in order to convert them into requirements. A distinctive feature of the methodology is the inclusion in the request assessment of information about the sources of requests, as well as point and temporal estimates. To analyze changes in the number of requests of each type, it is proposed to forecast them using fuzzy time series models. The proposed methodology will reduce the time spent by managers and software developers on analyzing problems and deciding how to fix them. The article provides a formal description of the stages of the proposed methodology and considers an example of its application as a means of supporting decision-making on the inclusion of high-priority requests in the list of requirements for software development. Finally, conclusions are drawn on the limits of applicability of the proposed methodology.

Intellectual decision support, prioritization of requirements, system analysis, developing software products, forecasting, fuzzy forecasting models, time series.
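A minimal sketch of the ranking step only, under assumed weights: each request gets a score combining a source weight, a point severity estimate, and a temporal component (request age). The `Request` fields, the weights, and the `priority` formula are hypothetical; the fuzzy time-series forecasting part of the methodology is not shown.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Request:
    """A user request for software modification (fields are illustrative)."""
    text: str
    source_weight: float   # e.g. key customer > regular user
    severity: float        # point estimate in [0, 1] given by the analyst
    received: date

def priority(req: Request, today: date, age_scale: float = 30.0) -> float:
    """Hypothetical ranking score: source and severity, boosted by request age."""
    age_days = (today - req.received).days
    temporal = min(age_days / age_scale, 1.0)      # older requests creep up the list
    return 0.5 * req.severity + 0.3 * req.source_weight + 0.2 * temporal

requests = [
    Request("export to PDF fails", 0.9, 0.8, date(2020, 1, 10)),
    Request("rename a report column", 0.4, 0.2, date(2020, 2, 20)),
    Request("slow search on large projects", 0.7, 0.6, date(2019, 12, 1)),
]
today = date(2020, 3, 1)
for r in sorted(requests, key=lambda r: priority(r, today), reverse=True):
    print(f"{priority(r, today):.2f}  {r.text}")
```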


Vsevolod Valentinovich Tipikin, Candidate of Sciences in Engineering; graduated from the Faculty of Physics and Mathematics of Ulyanovsk State University; Deputy Chief Designer at FRPC JSC ‘RPA ‘Mars’; an author of articles in the field of automation of information system design. e-mail: mars@mv.ru

Maria Anatolevna Skibina, Candidate of Sciences in Physics and Mathematics; graduated from the Faculty of Mechanics and Mathematics of Ulyanovsk State University; Leading Engineer of FRPC JSC ‘RPA ‘Mars’; an author of articles in the field of the applicability of genetic algorithms to stochastic systems optimization. e-mail: mars@mv.ru

Data mining in document automation system (59_8.pdf)

The paper reviews document automation systems and determines the prospects of using intellectual tools. It considers the document automation system developed by FRPC JSC ‘RPA ‘Mars’ and implemented in its products. It is proposed to improve the system functionality by adding a data mining subsystem. The subsystem is intended to accumulate data during operation and to analyze them. The paper specifies the data set defining the document development process, i.e., the planned and actual development time and the quantity and quality of finished documents. Data mining implies the determination of correlation and regression dependencies between attributes. Based on the results of data mining, management officials are able to organize the document development process effectively. The authors give an example of a regression dependence between the time taken to develop a document of one type and its size. Random values of performance indicators are generated during simulation modeling.

Document automation, data mining, regression dependencies.
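A small sketch of the regression and correlation step mentioned in the abstract above, on synthetic records: development time is regressed on document size, and the correlation coefficient between the two attributes is reported. The numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical accumulated records: document size (pages) vs. actual development time (hours).
pages = rng.integers(5, 120, size=60)
hours = 2.0 + 0.45 * pages + rng.normal(0.0, 4.0, size=pages.size)

# Least-squares regression line and the correlation coefficient between the attributes.
slope, intercept = np.polyfit(pages, hours, deg=1)
corr = np.corrcoef(pages, hours)[0, 1]

print(f"hours ~= {intercept:.1f} + {slope:.2f} * pages   (r = {corr:.2f})")
print(f"predicted time for an 80-page document: {intercept + slope * 80:.1f} h")
```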


COMPUTER-AIDED ENGINEERING

Aleksei Mikhailovich Namestnikov, Doctor of Sciences in Engineering, Associate Professor; graduated from the Radioengineering Faculty of Ulyanovsk State Technical University; Professor of the Department of Information Systems of UlSTU; an author of more than 80 scientific papers in the field of computer-aided design and intelligent systems. e-mail: nam@ulstu.ru

Gleb Iurevich Guskov, Candidate of Sciences in Engineering; graduated from the Faculty of Information Systems and Technologies of UlSTU; Associate Professor of the Department of Information Systems of UlSTU; an author of scientific papers in the field of ontology modeling and time-series data mining. e-mail: g.guskov@ulstu.ru

Aleksei Aleksandrovich Filippov, Candidate of Sciences in Engineering; graduated from the Faculty of Information Systems and Technologies of Ulyanovsk State Technical University; Associate Professor of the Department of Information Systems of UlSTU; an author of articles in the field of intelligent data storage and processing systems. e-mail: al.filippov@ulstu.ru

Intelligent analysis of software systems projects based on the ontological approach (59_9.pdf)

This article presents a new effective model, algorithms, and methods for representing the subject area of the process of developing software systems. The subject area is presented in the form of fragments of the knowledge base of the design process support system. The knowledge base is formed by analyzing class diagrams in UML notation. The proposed approaches can reduce the duration of the design process and increase the quality of the resulting software system through the reuse of successful design solutions from other projects. The search for successful design solutions is carried out using the developed metric for determining the similarity of software system projects. The metric allows the design pattern match in the source code of a project to be calculated.

Knowledge base, software system, UML, class diagram, project solution.
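As a rough stand-in for the similarity metric described above (the authors' actual metric is not reproduced), the sketch represents a class diagram as a set of (class, relationship, class) triples and scores two projects by the Jaccard similarity of their triple sets; the diagram fragments are hypothetical.

```python
def class_diagram_triples(diagram):
    """Represent a class diagram as a set of (class, relation, class) triples."""
    return {(src, rel, dst) for src, rel, dst in diagram}

def project_similarity(diagram_a, diagram_b):
    """Jaccard similarity of the two triple sets; 1.0 means identical structure."""
    a, b = class_diagram_triples(diagram_a), class_diagram_triples(diagram_b)
    return len(a & b) / len(a | b) if (a or b) else 1.0

# Two hypothetical fragments of class diagrams (class, relationship, class).
observer_like = [("Subject", "aggregates", "Observer"),
                 ("ConcreteObserver", "inherits", "Observer")]
candidate = [("EventBus", "aggregates", "Observer"),
             ("ConcreteObserver", "inherits", "Observer"),
             ("Subject", "aggregates", "Observer")]

print(f"similarity: {project_similarity(observer_like, candidate):.2f}")
```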


Aleksei Aleksandrovich Filippov, Candidate of Sciences in Engineering; graduated from the Faculty of Information Systems and Technologies of Ulyanovsk State Technical University; Associate Professor of the Department of Information Systems of UlSTU; an author of articles in the field of intelligent data storage and processing systems. e-mail: al.filippov@ulstu.ru

The creation of the knowledge base of design organization for automating the architecting process of software systems (59_10.pdf)

This article deals with an approach to automating the architecting process of software systems based on the experience of previous projects. The architecture of a software system is the understanding and presentation of the software system in the form of an architecture description. The architecture description consists of many design artifacts, which are formed during the whole life cycle of the system. By extracting design experience units from previous projects, it is possible to improve the quality and reduce the duration of the design and implementation stages when developing a new software system. A design experience unit is a successful design or architectural solution used in previous projects. The article describes the logical model of the knowledge base of the design organization that allows storing and searching for design experience units. The knowledge base model also allows linking a design experience unit with the project where it was obtained, with the stage of the life cycle model where it was formed, with the concepts of the subject area, and with the design artifact from which it was extracted.

Architecting, ontology, software system, design, implementation, knowledge base, design experience.
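A minimal sketch of the logical links the abstract enumerates: a design experience unit tied to its project, life-cycle stage, subject-area concepts, and source artifact, with a concept-based search over the store. The class and field names are illustrative, not the article's ontology.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignExperienceUnit:
    """A reusable design/architectural solution with the links described in the abstract:
    the project it came from, the life-cycle stage, subject-area concepts, and the
    design artifact it was extracted from (all names are illustrative)."""
    name: str
    project: str
    lifecycle_stage: str
    concepts: List[str] = field(default_factory=list)
    source_artifact: str = ""

class KnowledgeBase:
    """Minimal in-memory store with a concept-based search over experience units."""
    def __init__(self):
        self.units: List[DesignExperienceUnit] = []

    def add(self, unit: DesignExperienceUnit) -> None:
        self.units.append(unit)

    def find_by_concept(self, concept: str) -> List[DesignExperienceUnit]:
        return [u for u in self.units if concept in u.concepts]

kb = KnowledgeBase()
kb.add(DesignExperienceUnit("message-queue integration", "Project A", "design",
                            ["integration", "reliability"], "deployment diagram"))
for unit in kb.find_by_concept("integration"):
    print(unit.name, "<-", unit.project)
```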


Nadezhda Glebovna Yarushkina, Doctor of Sciences in Engineering, Professor; graduated from Ulyanovsk Polytechnic Institute; Interim Rector of Ulyanovsk State Technical University, Head of the Department of Information Systems at UlSTU; an author of more than 300 scientific papers in the field of soft computing, fuzzy logic, and hybrid systems. e-mail: jng@ulstu.ru

Aleksei Sergeevich Zhelepov, Postgraduate Student at the Department of Information Systems of UlSTU; graduated from the Faculty of Information Systems and Technologies of UlSTU; a winner of the iVolga-2014 Youth Forum, a coordinator of the “Stachka” IT-Conference, an author of articles in the field of data mining. e-mail: a.zhelepov@gmail.com

A prototype of a system for searching and selecting the ‘staffed’ teams of IT professionals based on data from project repositories (59_11.pdf)

The growth of the information technology sphere leads to a serious shortage of qualified personnel. Thus, many companies are changing well-established workflow models, e.g., giving employees the ability to work remotely, which opens the opportunity of hiring globally. For companies focused on the development of product solutions, there is a trend of searching for “played” project teams, i.e., specialists who have successfully worked together for a long time. However, the search for such teams leaves a definite imprint on the activities of the HR department of the company: it becomes necessary to analyze the activities and technical work not of individual employees, but of the potential team as a whole. The article describes a prototype of an automated system whose main task is to search for and select teams of specialists based on data from open source code repositories and related artifacts. The article details the composition of the system architecture, the algorithm for selecting the main project team, the group activity metrics identified during the study, the formulas for calculating the values of these metrics, and their application to the problem of analyzing a project repository.

Repository, remote development team, teamwork metrics, search, filtering.
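The sketch below illustrates one possible group-activity metric over repository data: counting how often two contributors touched the same files. The commit log and the metric itself are invented for the example; the article's metrics and team-selection algorithm are not reproduced.

```python
from collections import Counter
from itertools import combinations

# Hypothetical commit log extracted from an open repository: (commit_id, author, files).
commits = [
    ("c1", "alice", {"core/api.py", "core/db.py"}),
    ("c2", "bob",   {"core/api.py"}),
    ("c3", "alice", {"ui/form.js"}),
    ("c4", "carol", {"core/db.py", "ui/form.js"}),
    ("c5", "bob",   {"core/db.py"}),
]

def collaboration_matrix(commits):
    """Count how often two contributors touched the same file: a simple
    group-activity metric standing in for the article's formulas."""
    touched = {}
    for _, author, files in commits:
        for f in files:
            touched.setdefault(f, set()).add(author)
    pairs = Counter()
    for authors in touched.values():
        for a, b in combinations(sorted(authors), 2):
            pairs[(a, b)] += 1
    return pairs

for pair, weight in collaboration_matrix(commits).most_common():
    print(pair, weight)
```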


ELECTRICAL ENGINEERING AND ELECTRONICS

Gennadii Nikolaevich Abramov, Doctor of Sciences in Engineering, Professor of the Department of Industrial Electronics of Togliatti State University (TlSU); Honored Worker of Higher Professional Education in Russia; an author of monographs, study guides, patents, and scientific publications in the field of analog-to-digital and digital-to-analog conversions of monopulse electrical signals (MIES). e-mail: yuran_a@mail.ru

Iurii Gennadevich Abramov, Master in Engineering and Technology of Electronic Devices and Hardware; graduated from Togliatti State University; a system administrator of Labyrinth Volga, LLC; an author of scientific publications and patents in the field of analog-to-digital conversions of MIES parameters. e-mail: yuran_a@mail.ru

Improved efficiency of nonius-pulse time-to-digital converters (59_12.pdf)

The article deals with two methods, each implemented in two versions, for reducing the additional conversion time of nonius-pulse time-to-digital converters (TDC). In both methods, the conversion process is organized according to the interpolation scheme; it contains the conversion scales “roughly”, “exactly”, and “more precisely” and two (start and stop) recirculation generators. The whole number of periods Tstr filling the duration of the converted time interval (TI) is initially determined “roughly” in the course of the interpolation conversion by the start recirculation generator RGstr, which is single-phase (in the first versions) or multiphase (in the second versions). Then the nonius-pulse method digitizes “exactly”, with a discreteness smaller than Tstr, that part of the converted TI enclosed between the last recorded pulse of RGstr and the end of the converted TI. Further, the direct-coding method encodes “more precisely”, with its own conversion discreteness, the remaining part of the TI of shorter duration. In both methods, a single-phase stop recirculation generator RGstp is used. The hardware implementation of the first method is based on a single-phase RGstr and a separate direct-coding (DC) module, which performs only the “more precisely” conversion and does not participate in the “roughly” and “exactly” conversions. In the second method, the DC TDC, through its multi-tap delay line, is introduced directly into RGstr, transforming it into a multiphase generator that forms the single-phase reading scales “roughly” and “exactly” and the multiphase scale “more precisely”; this reduces the instability of the conversion discreteness “exactly” and “more precisely”. The use of a multiphase RGstr also makes it possible to eliminate the problem of matching (joining) all three conversion scales.

Vernier-pulse time-to-digital converters, single- and multiphase recirculation generators, direct-coding analog-to-digital converters, pulse-counting sequence, conversion resolution.
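As a purely arithmetic illustration of how an interpolation TDC of this kind reassembles a measured interval from its three scales (a coarse count of start-generator periods, a vernier count with discreteness equal to the period difference of the start and stop generators, and a direct-coding fine code), the sketch below uses invented period values; it does not reflect the specific discreteness figures or circuits of the article.

```python
def reconstruct_interval(n_coarse, n_vernier, fine_code,
                         t_str=10.0, t_stp=9.5, t_fine=0.1):
    """Generic interpolation-TDC arithmetic (times in ns; all values illustrative).

    - 'roughly'        : n_coarse whole periods t_str of the start generator RGstr;
    - 'exactly'        : n_vernier vernier steps of discreteness t_str - t_stp;
    - 'more precisely' : fine_code steps of the direct-coding resolution t_fine.
    """
    t_vernier = t_str - t_stp            # vernier discreteness, smaller than t_str
    return n_coarse * t_str + n_vernier * t_vernier + fine_code * t_fine

print(reconstruct_interval(n_coarse=12, n_vernier=7, fine_code=3))  # 123.8 ns
```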


Aleksei Aleksandrovich Zadorozhnii, graduated from the Faculty of Information Systems and Technologies of Ulyanovsk State Technical University with the specialty in Instrument Engineering; Postgraduate Student of the Faculty of Information Systems and Technologies of UlSTU; Head of a thematic integrated team in Ulyanovsk Instrument Manufacturing Design Bureau, JSC; an author of articles in the field of mathematical modeling and algorithms for air data systems. e-mail: alezador@gmail.com

Determining the reliability of the indicated speed parameter based on the dynamic object characteristics obtained during flight tests (59_13.pdf)

The article presents the results of calculation and modeling of an algorithm for determining the reliability of the indicated speed parameter based on the dynamic characteristics of the object. The problem statement is formulated as follows: it is necessary to conduct a flight experiment and subsequent processing of its results in order to determine the permissible range of speed changes with a known set of data on the engine mode, pitch values, and the measured angle α. The problem must be solved using the regular on-board equipment of the object, without additional trajectory measurement systems or satellite system data. The optimization and definition of the law were performed using MATLAB. Analysis of the results of introducing this algorithm showed that the reliability determination can be performed with limited statistics of the obtained dynamic characteristics of the object. When using data from standard aircraft systems (inertial system, engine control system, aerodynamic angle sensor), as well as a large number of flights in the anticipated operating conditions, a decrease in the confidence determination thresholds to ±60 km/h can be expected.

Air data system, air data modeling, instrument speed, reliability.


Viktor Vladimirovich Podoltsev, graduated from the A.F. Mozhaysky Military Space Academy in 2009; Postgraduate Student at Astrakhan State Technical University; research interests are in the field of information processing, synchronization, and noise immunity. e-mail: pvv_001@mail.ru

Iskandar Maratovich Azhmukhamedov, Doctor of Sciences in Engineering, Professor; graduated from Kazan State University named after V.I. Ulyanov-Lenin; Head of the Information Security Department of the Astrakhan State University; research interests are in the field of information processing, synchronization, noise immunity. e-mail: aim_agtu@mail.ru

Evaluation of the efficiency of synchronizing pseudorandom sequences on the basis of a majority algorithm with increasing destructive errors (59_14.pdf)

Despite the substantial time that has passed, the problem of pseudo-random sequence (PRS) synchronization still remains topical. First, this is due to the wide commercialization of spread-spectrum technologies, which started in the late 1970s [1] with the introduction of mobile telephone systems, and to their continued development aimed at increased efficiency [2]. Second, the urgency of the problem is connected with the implementation of advanced multiple-access methods, which greatly increase the performance of medium access control (MAC) protocols using PRS. Thus, the use of PRS-oriented MAC protocols in information processing systems is a topical and promising direction of information technology development in the context of the country’s informatization and transition to a digital economy. The paper aims at evaluating the effectiveness of PRS synchronization based on a majority algorithm under an increasing number of destructive errors and at justifying the objectives of further research. The paper deals with the following methods: Ward’s method (the ‘valid interval’ method) and the synchronization method based on majority processing of PRS segments. It is shown that, in comparison with Ward’s method, the majority method improves noise immunity by correcting errors within the valid interval and can easily be implemented at the MAC sublayer of access to the transmission medium; the urgency of such techniques at the sublevel of MAC protocols is thereby justified. The article also formulates the goals and objectives of further study: evaluation of the time characteristics of the majority-based synchronization method; evaluation of its probabilistic characteristics when the information processing system is synchronized by a short PRS segment; and modification of the method, based on the obtained results, to improve its efficiency under an increased probability of destructive errors in information processing and control systems using PRS-oriented multiple-access MAC protocols.

Information processing, average search time, probability of a destructive error, methods of pseudo-random sequence synchronization, method of majority information processing, Ward’s method.
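A minimal sketch of the majority step only, under assumptions: a PRS generated by a simple LFSR (the taps are illustrative, not a maximal-length choice), several noisy copies of the same segment with independent bit errors, and a bitwise majority vote over the copies. It shows the error-correcting effect the abstract credits to the majority method, not Ward's method or the full synchronization procedure.

```python
import numpy as np

rng = np.random.default_rng(5)

def lfsr_prs(taps, seed, length):
    """Binary PRS from a Fibonacci LFSR; `taps` are feedback bit positions."""
    state = list(seed)
    out = []
    for _ in range(length):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return np.array(out, dtype=int)

def majority_combine(segments):
    """Bitwise majority vote over repeated noisy copies of the same PRS segment."""
    return (np.sum(segments, axis=0) > len(segments) / 2).astype(int)

reference = lfsr_prs(taps=[0, 3], seed=[1, 0, 0, 1, 0, 1, 1], length=63)
p_error = 0.2
noisy_copies = np.array([(reference ^ (rng.random(reference.size) < p_error))
                         for _ in range(5)])
restored = majority_combine(noisy_copies)

print("errors in a single copy:", int(np.sum(noisy_copies[0] != reference)))
print("errors after majority vote:", int(np.sum(restored != reference)))
```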

