A seafarer mental health centre in Greece is launching what it claims is the world’s first artificial intelligence model applied to mental health screenings for seafarers.

IMEQ Center managing director Alexandra Kaloulis told TradeWinds that the predictive model, which uses machine learning algorithms, can identify psychological health challenges more accurately than screenings reviewed by psychologists alone, in a workforce where isolation is a major challenge.

The company began developing an AI model several months ago, working with software engineers and Harvard University-trained data scientists.

In an interview, Kaloulis claimed that the model can predict mental health fitness for working at sea with 98% accuracy, which is better than screening by human psychologists alone.
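To put the claim in general terms, a figure like 98% typically refers to accuracy on cases the model has not seen before. The sketch below, with invented data, features and model choice, shows how such a number might be measured; it does not reflect IMEQ's actual system.

```python
# Hypothetical sketch: measuring a "fit / needs review" screening model's
# accuracy on a held-out test set. Everything here is illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))          # e.g. 12 questionnaire scores per seafarer
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in label: 1 = fit, 0 = needs review

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Accuracy on unseen cases: the kind of figure behind claims like "98% accurate".
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
```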

Kaloulis, whose family is involved in the maritime sector, started the IMEQ Center in 2016 with a team of psychologists. Viewing seafaring as a high-risk occupation in which mental health challenges could impact safety, she saw a need for pre-screening assessments for mental health.

She told TradeWinds that, after spending two years developing its tests and its software platform, IMEQ did not take off at first in shipping. But the enquiries started flowing in after the Covid-19 pandemic struck and shed light on the challenges that seafarers were experiencing due to newfound isolation in a career already characterised by months at sea.

Seafarers take a series of tests on the platform. Drawing on psychologists' knowledge, the platform looks for symptoms, of depression for example, that can trigger additional testing.
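As a rough illustration of that flow, a screening platform might compare questionnaire scores against cut-offs set by psychologists and flag elevated scales for follow-up. The scale names and thresholds below are invented for illustration.

```python
# Hypothetical follow-up rule: scores above a psychologist-set threshold
# flag the candidate for additional testing. Cut-offs are invented.
FOLLOW_UP_THRESHOLDS = {"depression": 14, "anxiety": 15, "sleep_problems": 12}

def flag_for_further_testing(scores: dict[str, int]) -> list[str]:
    """Return the scales whose scores meet or exceed their follow-up threshold."""
    return [scale for scale, cutoff in FOLLOW_UP_THRESHOLDS.items()
            if scores.get(scale, 0) >= cutoff]

candidate = {"depression": 16, "anxiety": 8, "sleep_problems": 5}
print(flag_for_further_testing(candidate))  # ['depression'] -> schedule extra assessment
```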

Kaloulis said the addition of the AI model not only makes the platform more accurate but can also be a money saver.

Alexandra Kaloulis is managing director of the IMEQ Center. The centre provides psychological services to the shipping sector. Photo: IMEQ

But does AI present ethical concerns when used to pre-screen applicants for employment, and is there a risk of mental health discrimination?

James Liounis, a Harvard-trained data scientist involved in developing the predictive model, said AI is not inherently ethical or unethical, although applying it can raise ethical concerns about bias, privacy and job displacement, among other issues.

“In our case, AI is ethical as we only consider mental health and not discriminative factors, and we’ve done a lot of work regarding performance where we closely monitor wrongly classified individuals,” he wrote in response to TradeWinds’ questions.
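Monitoring wrongly classified individuals usually means tracking false positives and false negatives separately so each can be reviewed by a person. A minimal sketch of that kind of check, with synthetic labels, follows; it is not IMEQ's code.

```python
# Sketch: track which individuals the model classifies wrongly, so false
# "fit" and false "unfit" calls can each be queued for human review.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]   # 1 = assessed fit by psychologists
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]   # model's predictions

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"false 'fit' calls: {fp}, false 'unfit' calls: {fn}")

# Indices of the wrongly classified individuals, for psychologist review.
misclassified = [i for i, (t, p) in enumerate(zip(y_true, y_pred)) if t != p]
print("cases for review:", misclassified)
```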

That does not mean that humans are not involved, and Kaloulis said assessments are double-checked before they are sent to an employer.

“AI predictive models can offer valuable insights into mental health assessment, but they’re not necessarily better than humans,” Liounis said.

“They can process large amounts of data quickly, but human judgment and empathy play crucial roles in understanding nuances of mental health. AI shouldn’t be used by itself, but can have a positive added value when used by a psychologist.”

The IMEQ Center focuses on the mental health of seafarers. It is based in Athens. Photo: IMEQ Center

The proportion of seafarers found to be unfit for work at sea varies depending on the phase of the hiring process when assessments are carried out — the percentage is higher during the initial phases and lower when assessments occur closer to the end of the hiring process, Kaloulis said.

She said mental health pre-screening is not about excluding a seafarer from a particular job. When a seafarer is determined to be unfit in a pre-screening assessment, it triggers further evaluation.

“We do promote to the companies [we work with to adopt a] mentality and philosophy that, if there is an issue with a person, you don’t go ahead and make a hiring decision or promoting or retaining decision based on that,” she said. “We help the person, that’s why we offer that phase number two: consultation and therapy.”

Asked whether AI could serve other purposes related to the mental health of seafarers, Kaloulis said the technology could, for example, provide support through chatbots that seafarers can access at sea, as a form of therapy.

“You could get mental health support from an AI, which is anonymous,” she said. “People will open up.”

She said seafarers working on ships may be reluctant to share their problems among their small group of coworkers, and seeking help can carry stigma.

“The next phase here is to create an application that the seafarers can have where they can share their mood [or] they can have a group where they can share their thoughts,” she said, which could help mariners who spend their days between work and their cabins. “Isolation is a big thing.”

And she said AI can be used for predictive analytics: analysing historical data to identify patterns that indicate when seafarers might be at higher risk of experiencing mental health issues.
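In broad strokes, that kind of predictive analytics means fitting a model to historical records and scoring current crew for elevated risk. The features, labels and model in the sketch below are invented assumptions, not a description of any system IMEQ has built.

```python
# Hedged sketch of risk prediction from historical data; all values invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Columns: months since last shore leave, average sleep hours, prior screening flags
X_hist = np.column_stack([
    rng.integers(0, 10, 500),
    rng.normal(6.5, 1.0, 500),
    rng.integers(0, 3, 500),
])
# Invented label: 1 = a mental health issue was later recorded
y_hist = ((X_hist[:, 0] > 6) & (X_hist[:, 1] < 6)).astype(int)

risk_model = RandomForestClassifier(random_state=0).fit(X_hist, y_hist)

current_crew = np.array([[8, 5.2, 1], [2, 7.1, 0]])
# Higher probability = earlier outreach and support, not a hiring decision.
print(risk_model.predict_proba(current_crew)[:, 1])
```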