"Digitization requires a vision, but not a master plan", Composites for Europe
07 September 2020

Composites for Europe is one of the leading European trade fairs and forums for innovations in composites, technologies, and applications. Ahead of the fair, Franz was interviewed about nebumind, its value for the composites industry, and our view of the degree of digitalization of the manufacturing industry in Germany.
Franz Engel is CEO and founder of the start-up nebumind, which helps production companies analyze the quality of the components they produce, understand the factors that influence quality, and optimize production parameters. In this interview, he explains the role these offerings play for the composites industry and how he assesses the degree of digitization of the industry in Germany.
The short version: we help production companies analyze the quality of the components they produce, understand the factors that influence quality, and optimize production parameters.
The detailed version: during my time in aviation, I spent a lot of time in production facilities and was mainly concerned with process automation and "in-process sensor technology" in the CFRP area. At some point I came to the conclusion that what is very often needed is not more sensors but intelligent evaluation of the production data that already exists. The problem is that most engineers are not well versed in data analytics, and most data scientists know too little about manufacturing technologies and materials. There is a language barrier that few companies I know have overcome so far. Solutions are therefore needed that offer both sides a common basis.
What we offer customers is a software solution that prepares manufacturing data so that it can be intuitively understood and analyzed with simple means. We specialize in component quality and process stability. The core of our data preparation is 3D structuring and visualization. Our assumption is that the vast majority of component quality problems can be traced to specific positions on the component. We therefore structure the data so that all information about a given location can be retrieved quickly, e.g. how fast the AFP machine deposited the fibers at the coordinates X, Y, Z, and with which compaction force and which temperature of the heat source. Our software relates everything to the specific location where an employee may have found a defect or deviation.

Data visualization in 3D is a simple concept that is clear even to people who have not studied computer science. We also refer to these data sets as digital component twins, although our twins have nothing to do with CAD or simulation; they exclusively represent manufacturing data. If you find a defect at the top left of the component, you want to know the production data at exactly this position. In a first step, you compare the data with components of the same type at the same position to find out whether any of the recorded process parameters deviate from the "normal".

Furthermore, this structuring of the data makes it well suited as a basis for machine learning: for most approaches, no further effort needs to go into data preprocessing. We are currently evaluating this approach with a pilot customer who is already using our software, and with Fraunhofer IGCV and DLR ZLP in Augsburg. At the IGCV, we have even set up a permanent edge computer that can be used for customer projects at any time to record digital component twins for the AFP process.
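To make the location-based structuring concrete, here is a minimal sketch of a position-indexed store in Python. This is an illustration, not nebumind's actual implementation: the voxel-grid indexing, the field names (speed, force, temp), and the units are assumptions made for the example.

```python
from collections import defaultdict

class ComponentTwin:
    """Toy 'digital component twin': process data indexed by position.

    Hypothetical sketch -- coordinates are snapped to a coarse voxel
    grid so that all records near a location can be retrieved quickly.
    """

    def __init__(self, voxel_size=5.0):  # grid resolution in mm (assumed)
        self.voxel_size = voxel_size
        self.voxels = defaultdict(list)

    def _key(self, x, y, z):
        s = self.voxel_size
        return (int(x // s), int(y // s), int(z // s))

    def record(self, x, y, z, **params):
        """Store process parameters (e.g. speed, force, temp) at (x, y, z)."""
        self.voxels[self._key(x, y, z)].append({"pos": (x, y, z), **params})

    def at(self, x, y, z):
        """Return all records in the voxel containing (x, y, z)."""
        return self.voxels[self._key(x, y, z)]

# Record AFP data while laying fibers, then query a defect position later.
twin = ComponentTwin(voxel_size=5.0)
twin.record(102.0, 48.5, 10.2, speed=0.8, force=450, temp=210)
twin.record(103.5, 49.0, 10.2, speed=0.8, force=430, temp=215)

for rec in twin.at(101.0, 47.0, 10.0):  # defect found near this position
    print(rec)
```

The point of the grid key is that a query by position becomes a single dictionary lookup rather than a scan over all recorded samples, which is what makes "give me everything at this spot" cheap.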
Most other software solutions on the market focus on business intelligence and thus on areas of production that do not require explicit knowledge of production processes: most of them focus on analyses of logistical processes, for example where the bottlenecks in production are, whether the material flow is stable, or the OEE of the production facilities. For this type of analysis, it makes sense to work with time-series databases. Queries against such databases refer to points in time, e.g. was my machine running all day, was my material at the right machine at the right time, and so on. These solutions are, however, not suitable for questions about a specific location in the part. Databases are always optimized for specific applications, and our solution is optimized for quality and process-stability analyses. This means that even for small use cases, our solution would already be 10,000 times faster than solutions from the field of business intelligence. From our point of view, it is important that the engineer can perform "fast" analyses of component quality in order to follow up on various assumptions. It also makes our solution performant enough to monitor quality during operation, i.e. "in-process".
For the first time, production teams have a tool for systematic data analysis that allows experts to perform quality analyses independent of individual processes. This will dramatically improve communication between silo experts: not only can different parameters from a single process step be examined, but data from different process steps along the entire process chain can be retrieved and analyzed. As a result, the CT expert, for example, can sit down with the AFP expert and superimpose the data from both production steps to search for deviations or correlations. Once these deviations have been found and it is clear where their cause lies (e.g. an unfavorable combination of speed, compaction force and temperature in a ramp with a certain gradient), such parameter combinations can also be monitored online, or even used before production to check whether a "good" component will be produced with a given machine program.
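The "compare against components of the same type at the same position" step could be sketched as follows. This is an illustrative example, not nebumind's method; the z-score criterion, the threshold value, and the sample data are all assumptions.

```python
import statistics

def find_deviations(readings, reference, threshold=3.0):
    """Flag positions where a parameter deviates from the 'normal'.

    readings:  {position: value} for the component under inspection
    reference: {position: [values]} from 'good' components of the same type
    Positions whose z-score exceeds `threshold` are returned. (Illustrative
    criterion; a real system would likely use a more robust statistic.)
    """
    flagged = {}
    for pos, value in readings.items():
        ref = reference.get(pos)
        if not ref or len(ref) < 2:
            continue  # no basis for comparison at this position
        mean = statistics.mean(ref)
        stdev = statistics.stdev(ref)
        if stdev > 0 and abs(value - mean) / stdev > threshold:
            flagged[pos] = {"value": value, "normal": mean}
    return flagged

# Compaction force at two positions vs. five previously built components.
reference = {(10, 20, 0): [450, 455, 448, 452, 451],
             (10, 25, 0): [450, 449, 452, 451, 450]}
readings = {(10, 20, 0): 452,   # within the usual spread
            (10, 25, 0): 400}   # clearly off -> candidate cause of a defect
print(find_deviations(readings, reference))
```

The same position-keyed comparison works across process steps: because both the CT data and the AFP data are keyed by location, superimposing them is a join on position.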
It depends greatly on the area. Areas that were already digitized are becoming more and more so (e.g. digital production orders, tracking of components, etc.). ERP systems were already in use in many areas and are now being expanded step by step. One area that, in my opinion, is lagging far behind is the digitization of the manufacturing processes themselves. Companies have turned to detailed master plans far too early, without first gaining the necessary experience and building up knowledge. Many have decided that all machines must have a standard interface so that one can get the data and start analyzing, incidentally without clearly defining what one actually wants to analyze. But this one standard interface does not exist. And the requirements for the interfaces actually follow from what you want to analyze; the interfaces significantly determine the data quality. If I want to determine my OEE, I only need to know every few seconds whether my machine is still running and roughly what it is doing. But if I want to analyze component quality, it is not enough to know the laser power of the thermoplastic fiber placement every few seconds. Robots can move at speeds of more than one meter per second; if I only get the production values once per second, the machine has laid down a meter of material about which I have practically no information. In the area of actual production, a lot of knowledge and understanding regarding digitization still needs to be built up.
As mentioned above, we perceive that much is being done in areas that were already digitized, and these efforts seem to have increased during the pandemic. The other areas appear to be more inhibited by it. I suspect that in the phase most companies are currently in, you need personnel on site to connect the machines, and you need to exchange ideas in multidisciplinary teams to work out what you actually want to achieve: where needed, an answer to the question of what comes after calculating the OEE.
I think you definitely need a vision of where you want to go with digitization, but initially without a detailed master plan. Then you have to start with small projects to gain experience and develop experts. Many of the necessary steps are neither fancy nor AI-driven: someone has to go to a machine with a cable, connect it, and see how and what data comes out. This person has to know the subject and has to know what requirements the data quality must meet. Without this person, the machine-learning department, if it exists, has nothing to analyze, or only data of insufficient quality, because the saying "garbage in, garbage out" still holds. Data quality cannot be increased retroactively.