The irony of so-called big data is that it is easier to obtain than to use. Any accounting automation (even partial) creates points for the input and accumulation of all types of information. Databases gradually grow, gaining more tables, file archives, piles of email correspondence, reports, and numerous accounting documents.
Why is big data so difficult to use?
Most companies record all essential aspects of their work, piling up data of all kinds: marketing, production, sales, and financial statistics, as well as HR documents. Corporate executives should have access to a detailed analysis of all processes in the enterprise. They should, but usually they do not.
The main systemic problem with big data is inconsistency. Information can be very heterogeneous, multi-layered, and diverse. At this point, it becomes impossible to do without the help and the power of artificial intelligence (AI).
The "traditional" use cases and applications of artificial intelligence in business include statistical process control; failure mode and effects analysis; measurement systems analysis; pricing and inventory management; and total productive maintenance (M&R, maintenance and repair).
Requirements for customers
An AI system can be implemented by any company that meets the following requirements:
- tools/sensors/devices that enable the automatic collection of data on equipment operation or production processes (the infrastructure part);
- MES, ERP, Level 1 plant control systems, Level 2 plant control systems (SCADA), or other systems that collect and consolidate equipment operation or process data (data sources);
- at least a one-year history of equipment operation or process data (a three-year history is preferred).
It is also important that the executives recognize the need for a new approach to working with information and management systems in general.
Tasks and goals of Softline as an integrator
The task of Softline as an integrator is to scale the best knowledge and algorithms in the field of machine-learning systems, making them ever smarter. We test the created algorithms on various platforms for each client, selecting the most appropriate solution in each specific case. Our goal is to find a solution that best addresses the client's tasks and integrates easily with its production processes.
Softline's product portfolio includes a number of AI and predictive analytics solutions and platforms: Microsoft, Deductor, Prognoz, and others.
Use cases of AI platforms
Objective: to meet the needs and expectations of stakeholders in terms of quality, deadlines, production and supply volumes, and the planned production cost.
Technologies: collection and analysis of statistical data on the production system (the level of standardization and compliance with standards); evaluation of four production factors: equipment, production methods, the human factor, and procurement with quality control of incoming materials and raw materials; statistical process control (SPC); cost management at the level of production units/workplaces.
Features: big data is collected from all enterprise accounting systems: ERP, MES, and other systems in which management accounting is performed. This use case involves collecting information on production costs, quality, performance, etc. Product quality and cost planning are closely related, and statistical process control increases the reliability of production, thus helping to obtain a guaranteed result. For example, it can be used to calculate the production volume following the just-in-time concept. All production chains are monitored, and product quality costs are tracked in two categories: the cost of ensuring conformance and the cost of eliminating nonconformities. As a result, the company can calculate the total cost of each batch and even of individual production units.
Result: the production cost can be reduced by 20-30% due to the elimination of losses in production and more efficient quality management.
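The batch-cost roll-up described above can be sketched in a few lines. This is a hypothetical illustration, not Softline's implementation: the `Batch` class, its field names, and the sample figures are all assumptions made for the example, splitting quality costs into the two categories named in the text (conformance assurance vs. eliminating nonconformities).

```python
# Hypothetical sketch: rolling up the total cost of a production batch,
# with quality costs split into cost of conformance (inspection, SPC,
# audits) and cost of nonconformity (scrap, rework, returns).
from dataclasses import dataclass, field

@dataclass
class Batch:
    batch_id: str
    base_production_cost: float  # materials + labor + overhead
    conformance_costs: list = field(default_factory=list)
    nonconformity_costs: list = field(default_factory=list)

    def cost_of_quality(self) -> float:
        return sum(self.conformance_costs) + sum(self.nonconformity_costs)

    def total_cost(self) -> float:
        return self.base_production_cost + self.cost_of_quality()

# Illustrative numbers only.
batch = Batch("B-001", 10_000.0,
              conformance_costs=[250.0, 120.0],   # incoming inspection, SPC
              nonconformity_costs=[430.0])        # rework
print(batch.total_cost())  # 10800.0
```

Tracking the two quality-cost categories separately is what lets the analysis distinguish money spent preventing defects from money lost to them.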
Statistical Process Control (the most popular tool is the Shewhart control chart)
Objective: to keep process parameters stable and ensure that components (products) comply with the established requirements.
Technologies: collection and analysis of statistical data on the production system, integration of systems and approaches into a single enterprise information system.
Features: the developed digital enterprise model is very flexible and can be customized to the situation. The statistical process control methods comply with standards and GOSTs. In Russia, the practical implementation of this approach is only an emerging trend, while large foreign enterprises have been using these technologies for more than 65 years.
Result: a set of solutions that lets you manage enterprise processes effectively, monitor every deviation, and thereby improve the processes within the existing constraints.
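To make the Shewhart chart concrete, here is a minimal sketch of an individuals (I) control chart, the textbook form of the tool mentioned in the heading. The function names and sample readings are invented for the example; the 2.66 multiplier is the standard I-chart constant (3/d2 for subgroups of size 2).

```python
# Minimal Shewhart individuals (I) control chart: a point outside
# mean +/- 2.66 * average moving range signals an out-of-control process.
from statistics import mean

def control_limits(samples):
    """Return (LCL, center line, UCL) for an individuals chart."""
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    center = mean(samples)
    mr_bar = mean(moving_ranges)
    # 2.66 = 3 / d2 for moving ranges of size 2 (standard constant)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

def out_of_control(samples):
    """Return the readings that fall outside the control limits."""
    lcl, _, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]

# Hypothetical process readings; the seventh point is a clear excursion.
readings = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 15.5, 10.0]
print(out_of_control(readings))  # [15.5]
```

In practice the limits are computed from a stable baseline period and then applied to new readings, rather than recomputed over data that already contains the excursion; the sketch merges the two steps for brevity.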
Total productive maintenance (M&R, maintenance and repair)
Objective: to increase the efficiency of equipment operation by preventing and eliminating losses throughout the entire lifecycle.
Technologies: collection and analysis of statistical data on the equipment performance (wear, operation modes); building digital models of equipment units; predicting possible failures and mean time between failures.
Features: Softline offers a methodology for calculating OEE (Overall Equipment Effectiveness). At most enterprises, OEE is about 50% (this mostly refers to CNC machines; for general-purpose equipment it is even lower), which means the losses due to inefficient equipment operation, maintenance, and repair are enormous.
Result: up to 2x productivity growth, reduction of expenses on repair and spare parts (30% on average).
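The OEE figure cited above follows the standard decomposition into availability, performance, and quality. A minimal sketch, with illustrative shift numbers (the parameter names and values are assumptions, not data from Softline's methodology):

```python
# Standard OEE decomposition: OEE = availability * performance * quality.
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time                    # uptime share
    performance = (ideal_cycle_time * total_count) / run_time # speed share
    quality = good_count / total_count                        # good-part share
    return availability * performance * quality

# Hypothetical 8-hour shift (480 min) with 60 min of downtime,
# an ideal cycle time of 1 min/part, 378 parts made, 360 of them good.
value = oee(planned_time=480, run_time=420, ideal_cycle_time=1.0,
            total_count=378, good_count=360)
print(round(value, 3))  # 0.75
```

Each factor isolates one loss category (downtime, slow cycles, defects), which is what makes OEE useful for deciding where the "enormous losses" mentioned above actually come from.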
Case Study No.1.
Task: Commissioned by a large Russian cable plant, Softline created a neural network that uses historical data to predict the demand for manufactured products and assess the likelihood that a potential order will actually be placed.
In production enterprises, opportunity management is a very labor- and time-consuming process. The conversion to sales is not very high, so specialists need to understand as early as possible whether a deal will be concluded. The task was to forecast the demand for the regular product range and the volume of raw materials needed in all categories, in order to further optimize the production cycle and the procurement process.
Solution: The pilot project was implemented using several machine-learning algorithms; a neural network trained on the prepared sample showed the best results. Such a system can learn without human intervention and is able to make more and more accurate predictions over time. In effect, it can be called artificial intelligence.
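The case study does not disclose the network's architecture or features. As an illustration of the forecasting task itself, here is the trivial baseline any such model would have to beat: a trailing moving-average forecast of monthly demand per product. All names and figures are invented for the example.

```python
# Naive demand-forecast baseline: predict the next period as the mean
# of the last `window` periods of history.
from statistics import mean

def moving_average_forecast(history, window=3):
    """Forecast the next period from the trailing `window` periods."""
    if len(history) < window:
        raise ValueError("not enough history for the chosen window")
    return mean(history[-window:])

# Hypothetical monthly demand for one cable product, in units.
monthly_demand = [120, 132, 128, 140, 138, 145]
print(moving_average_forecast(monthly_demand))  # 141
```

A trained model earns its keep only if its error on held-out months is consistently lower than such a baseline's, which is the kind of comparison a pilot project is meant to establish.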
Results: The pilot project results show that the trained models can be used to predict the basic sales volume and consumption of raw materials in the medium term, as well as manage the stock reserves of finished products.
The efforts of sales managers were redistributed so that they could prioritize the orders most likely to be confirmed. As a result, the enterprise saved labor costs and improved product quality two- to threefold.
In the future, it is planned to include additional influencing factors in the model, including factors from external sources. It will also be necessary to study the impact of production constraints on the conversion from demand to orders and, if possible, account for them in the sales-funnel prediction.
Based on the results of the pilot project, recommendations were developed on using the order-placement probability model together with sales funnel data to optimize the sales planning processes.
Case Study No.2.
Task: Commissioned by one of the largest European chemical companies, Softline created a neural network that defines a process stabilization algorithm and calculates the economic effect of various input parameters of the production process.
Solution: The pilot project included creating a self-learning algorithm for assessing the current production process. It revealed that, at the time, the process was not statistically controlled and enterprise resources were not utilized optimally, which caused direct financial losses. Simulation modeling showed that the algorithm could save up to 122 tons of steam per day.
Results: the pilot project has shown that the created models can be applied at the enterprise level to solve the following tasks:
- Improving the quality of finished products, analyzing the types and consequences of potential failures;
- Minimizing equipment downtime, analyzing equipment failures, and planning scheduled repairs (ensuring the collection and aggregation of information on the state of fixed assets, building a regulated baseline of time between failures of the main assets, and forming a plan for the current repair of fixed assets);
- Eliminating the human factor (minimizing the number of errors) and controlling the costs of raw materials (cross-functional collection of information on all order processing routes to calculate the order cost and carry out a plan/actual deviation analysis).
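The time-between-failures baseline mentioned in the list above starts with a simple calculation: mean time between failures (MTBF) per asset from its failure log. A minimal sketch, with invented timestamps:

```python
# Compute MTBF (mean time between failures), in hours, from a list of
# failure timestamps for a single asset.
from datetime import datetime
from statistics import mean

def mtbf_hours(failure_times):
    """Average gap between consecutive failures, in hours."""
    ts = sorted(failure_times)
    if len(ts) < 2:
        raise ValueError("need at least two failures to compute MTBF")
    gaps = [(b - a).total_seconds() / 3600 for a, b in zip(ts, ts[1:])]
    return mean(gaps)

# Hypothetical failure log: gaps of 10 and 15 days -> 240 h and 360 h.
failures = [datetime(2020, 1, 1), datetime(2020, 1, 11), datetime(2020, 1, 26)]
print(mtbf_hours(failures))  # 300.0
```

In a real system these baselines would be maintained per asset class and fed into the repair-planning model, with operating hours rather than calendar time where the data allows it.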
The project development plans include adding more influencing factors to the model (including external factors) and improving the self-learning algorithm.