Two years have passed since OpenAI launched its generative AI (GenAI) tool ChatGPT, and with many rivals having since emerged on the market, GenAI technology is beginning to be deployed across industries, including the engineering sector. But concerns remain about its viability and appropriateness.

The engineering sector accounts for nearly a fifth of the UK's total workforce and in 2022 generated £646bn for the UK economy. Engineering is experiencing a surge following a slump during the Covid-19 pandemic.

But there are concerns that the number of experienced engineers taking early retirement could lead to critical skills being lost. Larger engineering companies, such as Rolls-Royce and BAE Systems, are using skills academies to train new staff and the government is promoting apprenticeships.

However, some companies are considering using artificial intelligence (AI) to help bridge the skills shortage by enabling experienced engineers to use their time more effectively.

During the summer of 2024, Professional Engineering, the magazine of the Institution of Mechanical Engineers (IMechE), conducted a survey on the use and challenges of AI within the sector.

Naturally, given the IMechE's focus on mechanical engineering, it concentrated on that specific discipline, but its report on the findings offers insight into the engineering sector as a whole.

Although the response was smaller than hoped for, 125 IMechE members completed the survey. Over 40% of respondents said the companies they worked for were using AI tools, with over 20% indicating their companies were planning to do so.

One of the reasons for the comparatively swift deployment of generative AI in the past two years is that some of the tools are relatively easy to access and do not require specialist hardware. For example, all that is needed to access ChatGPT is an internet browser.

“There's a huge opportunity to utilize this technology in engineering, but it also comes with some considerable risks,” says Alan King, head of global membership development strategy at the IMechE.

“There will need to be safeguards put in place, because the potential for things to go wrong is magnified in a profession like engineering.”

Engineering is well-regulated, with various rules, standards and regulations that need to be followed. These include government legislation, guidance documents published by the Health and Safety Executive (HSE), standards (such as the British Standards) and various good practice guidelines. All of these could act as guidelines for how AI is used in the sector.

AI in the workplace

According to the survey, 58% of the companies using AI have introduced the tools into their engineering teams, while the other 42% use AI only in different parts of the business. The most common AI tool is the large language model (LLM), with nearly 60% of businesses using one.

Meanwhile, nearly a third of companies use machine learning and productivity tools, such as Microsoft 365 Copilot, to assist in their work.

Generative design tools, such as those used in simulations to optimize designs or identify potential faults, are less common, with fewer than a fifth of organizations using them. Computer vision and neural networks are rarer still, with just over a tenth using them.

Nearly a third of the survey's respondents use AI tools for written tasks, such as emails and pitches. Meanwhile, approximately a quarter of respondents use AI for data analysis. However, AI's use in data analysis is expected to grow, as nearly 60% of respondents indicated they would accept AI assistance with such tasks.

The tasks engineers would most like AI to take on are simulation and productivity improvement, with AI tools for design optimization, predictive maintenance and research following close behind. It is worth noting that nearly two-thirds of respondents believe AI tools will automate mundane and repetitive tasks, making engineers more productive and enabling them to focus on complex or creative work.

“In the short term, AI is going to be operating mostly as a co-pilot for engineers. What we'll see with AI is the ability to start utilizing this technology to automate mundane tasks that might have been time-consuming, allowing engineers to move on to more interesting activities,” says King. “There is a big opportunity here, but we have to be careful so that we don't lose the human-based knowledge.”

Concerns remain

There is concern among respondents (37%) that widespread adoption of AI will result in engineering roles being replaced by AI tools, and just over a quarter believe engineers themselves would be replaced. Likewise, over 40% of respondents do not believe that the number of engineers employed would stay the same.

There is also the concern (66% of respondents) that widespread adoption of AI tools will lead to reduced project oversight. This is partly due to AI tools being akin to a black box, where there is insufficient transparency to understand how AI derived a solution.

“The AI world can be a bit like the Wild West, but in an engineering context, that doesn't work. You've got to have systems that are reliable, provide the right answers, are safe, and behave in an ethical way,” says King.

“If we look at the sort of framework that we've used for years, especially in areas like aerospace or nuclear engineering, there are very strict rules and guidance. We almost have to take some of that learning and apply it as safeguarding principles to any AI systems that we're introducing.”

The lack of insight into an AI's design methodology, together with the inability to properly interrogate its output, could cause problems with the verification of designs. With a growing number of solutions generated by AI systems, it will become even more vital for skilled engineers to interrogate these designs to ensure they are suitable and appropriate.
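As a purely illustrative sketch of what such interrogation might look like in practice, the snippet below checks an AI-proposed design against hypothetical limits. The parameter names, limit values and the check_design helper are assumptions invented for illustration; they are not drawn from the survey, from any standard, or from any real code base.

```python
# Illustrative only: a rule-based sanity check an engineer might run against
# an AI-generated design proposal. All limits and field names are hypothetical
# placeholders, not values from any real standard.

HYPOTHETICAL_LIMITS = {
    "max_stress_mpa": 250.0,     # assumed allowable stress
    "min_safety_factor": 1.5,    # assumed minimum safety factor
    "max_deflection_mm": 12.0,   # assumed serviceability limit
}

def check_design(proposal: dict) -> list[str]:
    """Return human-readable findings for an AI-proposed design."""
    findings = []
    if proposal["stress_mpa"] > HYPOTHETICAL_LIMITS["max_stress_mpa"]:
        findings.append("Stress exceeds the assumed allowable value - reject or revise.")
    if proposal["safety_factor"] < HYPOTHETICAL_LIMITS["min_safety_factor"]:
        findings.append("Safety factor below the assumed minimum - needs engineer review.")
    if proposal["deflection_mm"] > HYPOTHETICAL_LIMITS["max_deflection_mm"]:
        findings.append("Deflection exceeds the assumed serviceability limit.")
    return findings or ["No findings - still requires sign-off by a qualified engineer."]

# Example: a made-up proposal, as it might come back from a generative design tool
ai_proposal = {"stress_mpa": 270.0, "safety_factor": 1.4, "deflection_mm": 9.0}
for finding in check_design(ai_proposal):
    print(finding)
```

The specific checks are beside the point; the idea is that AI output is treated as an input to an engineer-owned verification step rather than as a final answer.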

Over half of respondents also raised concerns about the potential security risks of AI tools, while nearly 50% are concerned about potential historical bias in the data. Overall, nearly 55% of respondents are not comfortable with AI being used to make critical decisions in engineering.

Companies using publicly accessible LLMs, such as ChatGPT, are especially at risk. Not only could they be exposing themselves to poor datasets and misinformation when importing AI-generated content into their networks, but they are also potentially leaking sensitive information.
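As a hedged illustration of one possible mitigation, the sketch below strips obvious identifiers from text before it is pasted into a public chatbot. The regex patterns and the redact helper are hypothetical examples; a real organisation would need policy, training and tooling well beyond this.

```python
import re

# Illustrative only: crude redaction of obvious identifiers before text is
# shared with a public LLM. The patterns are hypothetical examples and would
# not be sufficient on their own in a real organisation.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "UK_PHONE": re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b"),
    "PART_NO": re.compile(r"\b[A-Z]{2,4}-\d{4,6}\b"),  # assumed internal part-number format
}

def redact(text: str) -> str:
    """Replace anything matching the patterns with a placeholder tag."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

draft = "Contact j.smith@example.com about stress results for part TRB-10421."
print(redact(draft))
# -> Contact [EMAIL REDACTED] about stress results for part [PART_NO REDACTED].
```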

There is a strong feeling among the respondents that regulatory oversight is needed to ensure AI is deployed and used appropriately in engineering. However, given the speed of technological development in AI tools and the comparatively slow legislative processes, this is easier said than done.

Some AI regulations are being developed, such as the European Union's Artificial Intelligence Act, but there is a significant risk that legislation could rapidly become obsolete.

“AI developers are applying reinforcement learning with human feedback – when they see the models do something, they'll say whether or not they think the models behaved in the right way. That's based on their perceptions and biases, but somebody who is sitting in the Middle East or Russia might have a very different view about how the model should have responded,” says King.

“You've also got to look at the data they're training the LLM on, which is usually scraped off the internet and often in English. If you're only training on English-language websites, there's a chance that it's biased toward Western cultures.”
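To make King's point concrete, here is a small, purely hypothetical sketch of how preference feedback shapes outcomes: the same pair of model answers "wins" or "loses" depending on which pool of annotators supplies the votes. The annotator pools, votes and answers are invented for illustration and do not come from the survey or any real reinforcement learning pipeline.

```python
from collections import Counter

# Illustrative only: two hypothetical annotator pools judge the same pair of
# model answers. The preferred answer - and therefore the behaviour reinforced
# in an RLHF-style loop - depends entirely on who supplies the feedback.
answers = {"A": "Direct, informal reply", "B": "Formal, cautious reply"}

votes_by_pool = {
    "pool_1": ["A", "A", "B", "A", "A"],   # invented preferences
    "pool_2": ["B", "B", "B", "A", "B"],   # invented preferences
}

for pool, votes in votes_by_pool.items():
    winner, count = Counter(votes).most_common(1)[0]
    share = count / len(votes)
    print(f"{pool}: prefers answer {winner} ({answers[winner]!r}) with {share:.0%} of votes")
```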

The future of AI in engineering

The roll-out of AI tools in engineering is already well underway, but carries potential pitfalls.

This thinking was clearly articulated by one survey respondent, who noted: “A computer should be able to more easily and more quickly identify patterns and check against known problems. On the other hand, human nature will encourage people to blindly believe the results of any AI task, which could be a problem.”

Companies can also learn from the previous deployment of new technologies to identify potential risks. A key element is that different countries have different engineering regulations and guidance documents.

As such, an AI tool developed for one region may be incompatible with the regulations of another, or at least require retraining, before it can be deployed there.

“My one hope for engineering is that it doesn't try to use AI as a way to save money, but as a way to accelerate performance,” says King. “In the long term, AI creates an inflection point for us all. Because we're able to develop systems and products faster and better, you should then see an acceleration of that technology like we never have before. It should open up huge breakthroughs.”

Although AI tools have clear benefits for the automation of mundane and repetitive tasks, engineers will still need to learn new skills to fully engage with AI, ensure safety and maximize the benefits.

There will be a need for engineers who are trained in coding and prompt engineering to work with AI systems, while critical thinking will become an essential skill for interrogating AI-generated solutions.
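As an illustrative sketch of what "prompt engineering" can mean in this context, the template below asks a model to state its assumptions, name any standards it relies on, and flag uncertainty, so that its output is easier to interrogate. The wording and the build_prompt helper are assumptions for illustration, not a recommended or validated prompt.

```python
# Illustrative only: a structured prompt template intended to make an LLM's
# answer easier for an engineer to interrogate. The wording is a hypothetical
# example, not a validated or recommended prompt.
TEMPLATE = """You are assisting a chartered mechanical engineer.
Task: {task}

Respond with:
1. Your assumptions, listed explicitly.
2. Any standards or guidance you are relying on, by name.
3. Your proposed approach.
4. What you are uncertain about and what an engineer must verify before use.
"""

def build_prompt(task: str) -> str:
    """Fill the template with a specific engineering task description."""
    return TEMPLATE.format(task=task.strip())

print(build_prompt("Suggest a bolt preload check procedure for a flanged joint."))
```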
