Artificial intelligence (AI) has the potential to be a transformative tool for the packaging and processing industry, but the technology includes a few significant drawbacks to consider.
That’s according to PMMI Business Intelligence’s 2024 report, “The AI Advantage in Equipment: Boosting Performance and Bridging Skills Gaps,” where Business Intelligence researchers interviewed industry stakeholders to understand current applications, ambitions, and attitudes on AI in packaging and processing.
Cybersecurity concerns
AI models require data for training, and how much customer data is required depends on the technology.
Predictive maintenance, for example, requires continuous monitoring of end users' data to function, and data ownership varies between solutions. Digital twin simulations, however, can perform their function with just the machine parameters.
Navigating cybersecurity issues is always a challenge, but as data security and technology improve, legislation changes, and public trust in these tools increases, it may be possible to overcome many of these concerns.
Status of data collection/historians in companies
A key issue that can slow down the integration of AI within a company is the state of its data collection infrastructure.
To function optimally, AI models require the collection and storage of relevant data, and the maturity of this infrastructure can vary substantially between CPG end users.
Pushback from older employees
As with most new technologies, reluctance to utilize AI tools can be generational.
This has been reported to be especially true with AI assistants, where some users tend to prefer completing tasks the way they have always done them.
“I think there’s an element of distrust about [AI]. But there’s absolutely an element of newness and not great clarity on what it is and what it can do,” says one interviewed representative of a food and beverage CPG.
Job security
Some interviewees mentioned job security as a common concern with AI integration.
Utilizing AI for machine vision has allowed more complex processes to be automated and has placed some jobs at risk of being replaced.
Other AI applications, such as AI assistants or connected worker platforms, are less likely to threaten job security. These tools do not replace the staff completing tasks but rather increase the speed at which tasks are completed.
Data hallucination
Large language models (LLMs) can occasionally provide incorrect information, which is known as a data hallucination.
Understandably, this makes users reluctant to trust the outputs of AI assistants. Mistakes in applications such as maintenance could potentially cause machine downtime or even serious injury.
However, this concern can be mitigated by verifying the AI's outputs against the machine manual or by having a highly skilled employee review them.
SOURCE: PMMI Business Intelligence: 2024 The AI Advantage in Equipment: Boosting Performance and Bridging Skills Gaps
For more insights from PMMI’s Business Intelligence team, find reports, including “2024 Transforming Packaging and Processing Operations” and “2023 Achieving Vertical Startups” at pmmi.org/business-intelligence.