Industry 4.0


Philip Bayard Crosby coined his “do it right the first time” philosophy back in the 1960s, emphasizing error prevention over error detection. Today – half a century later – we read daily about product recalls in all walks of life: defective airbags in vehicles, tearing climbing ropes, carnival costumes with toxic dyes, baby food with glass splinters. In Germany alone, an incredible 1.9 million cars were recalled in 2014! Time and cost pressures cannot always be blamed for these events. Often it is precisely the learning from mistakes that is not consistently practiced.

Innovation cycles – the time from concept through production to delivery – are getting shorter and shorter, and products are becoming ever more customized (ultimately down to a lot size of one). Excellent performance, high quality and a favorable price are taken for granted by the customer – and all this under ever-increasing requirements regarding compliance, certificates, the environment, corporate social responsibility and more.


IT companies promise solutions for everything. A few years ago MES was the panacea; since 2013 at the latest it has been Industry 4.0. Self-proclaimed experts hold forth in platitudes, quote texts from the literature and have only one recommendation: further increase IT investments in manufacturing – and the number of their consulting days, which as a rule produces only costs, not results. This creates mistrust and reservations about IT solutions for production optimization, even though significant improvements could be achieved through the pragmatic use of existing solutions.
Stefan Bratzel, professor of automotive economics in Bergisch Gladbach, says with regard to the large number of recalled cars that, according to his findings, manufacturers’ quality management systems have in many places not yet been adapted to the new global product development and production processes.
And this is precisely where the problem lies: what is still frequently overlooked today is the absolutely necessary dovetailing of production and quality. The isolated coexistence of CAQ and MES systems inevitably leads to efficiency losses and repeat errors, and prevents learning from mistakes.
The task for companies is therefore to meet these requirements: errors must not occur in the first place, production must be adaptable and, just in case, everything must be traceable in the end. MES software must adapt to the needs of its users (without programming, if possible), use existing standards and technologies, and sensibly integrate existing, functioning systems so that the information generated throughout the value chain can be meaningfully analyzed, aggregated, filtered and beneficially distributed.
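What “aggregating and filtering information from the value chain” can mean is easy to sketch in code. The following is a minimal, illustrative example – the stations, parts and field names are invented, and a real MES would of course draw on live machine and inspection data rather than a hard-coded list:

```python
# Illustrative sketch: raw records generated along the value chain are
# aggregated into a condensed key figure (here: scrap rate per station)
# before being distributed. All data and field names are invented.

from collections import defaultdict

measurements = [
    {"station": "milling",  "part": "A1", "ok": True},
    {"station": "milling",  "part": "A2", "ok": False},
    {"station": "assembly", "part": "A1", "ok": True},
]

def scrap_rate_by_station(records):
    """Aggregate raw inspection records into a per-station scrap rate."""
    totals = defaultdict(lambda: [0, 0])   # station -> [defective, total]
    for r in records:
        totals[r["station"]][1] += 1
        if not r["ok"]:
            totals[r["station"]][0] += 1
    return {station: bad / n for station, (bad, n) in totals.items()}

print(scrap_rate_by_station(measurements))
# {'milling': 0.5, 'assembly': 0.0}
```

The point of the sketch is the direction of flow: many raw records in, one compact, decision-relevant figure out.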


An example from daily life makes this clear: the whole of production has to behave like a cook preparing a good meal. You take the right raw materials and check that they are fresh and in order. Then you follow a tried-and-true recipe, make sure nothing burns, taste the food and arrange it on a plate at the end. In other words, you check the quality of the ingredients, control the production, monitor the process and finally deliver the finished product to the customer. If you also pay attention to the sustainable and ecological origin of the ingredients, you fulfill not only environmental requirements but also your corporate social responsibility and the expectations of the customer. If it still tastes good at the end, everything has worked perfectly.
In this example, adaptability is also imperative, namely when the guest has special requests or even intolerances that must be taken into account individually during preparation.
If we transfer these thoughts to the manufacturing industry, we find that all information from the different areas must be used.

Only when quality planning, production planning, quality assurance, production control and all integrated functions interact does networked knowledge emerge – and with it the ability to react to events by supplying systems and people, even across company boundaries, with the right information for alerting and decision-making. Three issues play a crucial role in this:


Users at all levels – from workers to managers – must be empowered to make decisions. Everyone needs to feel they can influence and drive the processes rather than be driven by them, which helps to reduce prejudice and mistrust. Given the ever-increasing complexity, it is imperative that people are not overwhelmed with excessive – and, worse, meaningless – information, and that the IT systems in use can be operated intuitively and according to requirements.
Involving all other organizational units, such as IT and the works council, as well as customers and suppliers – coupled with transparent communication of the upcoming tasks and goals – facilitates implementation, since sensitive issues such as IT security, data transparency and efficiency must also be addressed and resolved.

Data is constantly being generated in production and quality. In most cases it must be available to intelligent monitoring and analysis systems at the moment it occurs, so that imminent errors can be detected before they happen and new, unknown errors can be recognized immediately. This requires specifications for the monitoring (which, if consistently implemented, have already been defined in the FMEA), and the interrelationships must be known as well. Simple data collection in the spirit of “the more, the better” does not achieve the goal. On the contrary: the data to be collected must be deliberately selected, and in every case one must ask what benefit it brings and what impact changes in the data have.
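The idea of detecting an imminent error before it occurs can be sketched very simply: each measurement is checked at the moment it arises, not only against the specification limits (as defined, for example, in the control plan derived from the FMEA) but also against a tighter warning zone. The limits, names and warning fraction below are illustrative assumptions, not a prescribed method:

```python
# Minimal sketch of in-process monitoring: a value still inside the
# specification but drifting toward a limit raises a warning, so the
# imminent error can be addressed before it occurs. Numbers are invented.

from dataclasses import dataclass

@dataclass
class SpecLimits:
    lower: float
    upper: float
    warn_fraction: float = 0.8  # fraction of the tolerance used before warning

def evaluate(value: float, spec: SpecLimits) -> str:
    """Classify a measurement as 'ok', 'warning' (imminent error) or 'error'."""
    if not (spec.lower <= value <= spec.upper):
        return "error"                       # specification violated
    center = (spec.lower + spec.upper) / 2
    half_tolerance = (spec.upper - spec.lower) / 2
    if abs(value - center) > spec.warn_fraction * half_tolerance:
        return "warning"                     # in spec, but drifting to a limit
    return "ok"

spec = SpecLimits(lower=9.8, upper=10.2)     # hypothetical bore diameter in mm
print(evaluate(10.00, spec))  # ok
print(evaluate(10.17, spec))  # warning
print(evaluate(10.30, spec))  # error
```

Real statistical process control uses richer run rules, but even this reduced form shows why the data must be evaluated as it occurs: the “warning” state only has value while there is still time to react.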
Once the data – and the key figures and evaluations derived from it – is available, the question becomes who must receive which information. The relevant information at the right time, in the right place and via the appropriate medium provides the necessary basis for decision-making in each case.
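Deciding who receives which information via which medium amounts to a routing table. The roles, channels and event classes below are invented purely for illustration; in practice such rules would live in the configuration of the MES or alerting system:

```python
# Hypothetical routing table mapping an event class to the people who
# must be informed and the medium used. All roles and channels are
# illustrative assumptions.

ROUTING = {
    "warning": [("line_operator", "terminal"), ("shift_lead", "sms")],
    "error":   [("shift_lead", "sms"), ("quality_manager", "email")],
}

def route(event: str) -> list[str]:
    """Return delivery instructions for an event class (empty if unknown)."""
    return [f"notify {role} via {medium}"
            for role, medium in ROUTING.get(event, [])]

print(route("error"))
# ['notify shift_lead via sms', 'notify quality_manager via email']
```

The design choice worth noting: the rules are data, not code, so the right-information-to-the-right-person mapping can be changed without programming – in line with the requirement stated above.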

The technologies for collecting the data and exchanging it easily are all in place today: communication protocols, databases, and quasi-standard data formats. However, there are hurdles to overcome here as well:

  • The necessary open IT architectures are not in place everywhere. There are still countless IT systems that are encapsulated or where no one in the company knows how to access the data anymore because the systems are hopelessly outdated, were once developed by a working student a long time ago, or contain Excel macros that are protected with an unknown password.
  • Not every IT provider is willing to cooperate, which makes meaningful data collection much more difficult. Sometimes it is not even a lack of willingness to cooperate, but simply a grossly exaggerated presentation of the complexity – and thus the cost – of integration, often used as a deterrent.
  • However, integration is not just about technology: interaction and integration must take place first and foremost between people, across temporal and spatial separation, language barriers, cultural and technological differences.
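The “quasi-standard data formats” mentioned above need not be complicated. As a minimal illustration, a process value can be serialized as plain JSON, which any reasonably open system can produce and consume over whatever transport is available; the machine name and field names here are invented:

```python
# Sketch of exchanging a single process value in an open, quasi-standard
# format (plain JSON). Asset id and field names are illustrative.

import json
from datetime import datetime, timezone

record = {
    "machine": "press-07",                    # hypothetical asset id
    "characteristic": "bore_diameter_mm",
    "value": 10.17,
    "timestamp": datetime(2014, 5, 1, 12, 0,
                          tzinfo=timezone.utc).isoformat(),
}

payload = json.dumps(record)    # text any other system can parse
restored = json.loads(payload)  # round trip on the receiving side
print(restored["characteristic"], restored["value"])
# bore_diameter_mm 10.17
```

None of the hurdles listed above are technical in this sense – the encapsulated legacy systems and unwilling providers, not the formats, are what make integration hard.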