The Norwegian naval frigate Helge Ingstad was declared a total loss after colliding with a tanker on 8 November 2018. But the biggest mistake was not the collision itself.
Knowingly, and in conflict with the official findings of the Norwegian Safety Investigation Authority (NSIA), the attorney general took the frigate's officer of the watch to court.
The verdict last week was a two-month suspended sentence for the officer, even though the accident involving the 113,700-dwt Sola TS (built 2017) had been shown to be a result of a complex combination of multiple causes.
This legal process demonstrates a criminalisation of something very human — committing an honest mistake. The side effect of criminalisation is an increased reluctance to admit mistakes, and its impact could extend far beyond the Norwegian navy to influence the entire maritime industry and society.
The result is a world with more undetected failures and mistakes that could escalate into serious incidents. Yet surely it is the duty of the courts to make society safer and better.
In most disasters, someone knew about the mistakes before the accident occurred. The wisest course, then, is to lower the threshold for voicing concerns and highlighting mistakes, so that issues are addressed before it is too late.
We have to acknowledge that we all make mistakes — it’s simply human, and we cannot make humans failure-free. But by admitting mistakes, we can address failures before things go wrong. This makes it natural to share our own mistakes and try to understand those of others, precisely because we can all learn from them. This mindset is the opposite of the criminalisation of honest mistakes.
There have been maritime investigations in which officers on watch have watched a football match while on duty, made intentionally risky manoeuvres or been drunk. Such behaviour can be labelled negligence. But it is very different from what happened on board the Helge Ingstad.
The first thing the sentenced officer did when he arrived on the bridge was to question a bright light on the horizon. He discussed it with the officer of the previous watch. Both men and two lookouts concluded that the light was an object on land. Only the helmsman thought it was another vessel, but he kept his suspicion to himself. Eight minutes later, the vessels collided.
They tried to do their best; unfortunately, they all failed.
We all tend to believe we are good at reporting mistakes and shortcomings. But the truth is, we opt to remain silent when the stakes are high — and statistics show that we are becoming less willing to speak up.
One reason behind this trend is the failure to listen to those who do voice concerns. A prerequisite for listening to someone who speaks up is recognising the difficulty in doing so. This requires openness around mistakes, trust, organisation, maturity, will and time. Criminalising mistakes destroys this openness.
The problems revealed in the NSIA inquiry are not mitigated by imprisoning the officer of the watch. Instead of looking for someone to blame, we should be asking ourselves, as an industry, what we can learn from this incident and others like it.
It’s not enough to understand why and how an accident happens. Real learning means implementing interventions to ensure similar things don’t happen again.
However, the challenge of doing so across time zones, ships, rotations and national cultures must be acknowledged by all stakeholders, especially regulators and prosecuting authorities. Implementation requires trust, mindfulness and openness.
Encouraging an open mindset and learning from mistakes is the best way to address failures and reduce the risk of accidents.
Torkel Soma is chief scientific officer at Oslo-based safety culture organisation SAYFR