Ship & Boat International eNews: September/October 2022
Search and rescue missions in Sweden could soon be enhanced by the use of artificial intelligence (AI), particularly when it comes to transcribing unclear or garbled distress calls over VHF. For the past two years, the Swedish Maritime Administration (SMA), which is responsible for sea rescues within the country, has worked with tech companies Tenfifty and Maranics to trial an AI-managed transcription service and assess its viability for future roll-out. In early 2022, the partners launched a full-scale test of the emergency call detection system, and the SMA will make a formal evaluation this autumn.
The SMA’s interest in AI dates back to 2009, when Tobias Nicander, rescue leader of the SMA’s Joint Rescue Co-ordination Centre (JRCC) in Gothenburg, first floated the idea of adopting some form of assistance for personnel monitoring the emergency channel. “Calls can sometimes be difficult to interpret because the signals are weak and the messages are incoherent,” the SMA says. “There is always the risk that the operator is already working on an alarm when a new case arises. There is, thus, a small risk of missing a call.”
While Nicander did not have the requisite technology available at the time, AI has come on in leaps and bounds in the past 13 years. For the two-year trial, Tenfifty built the AI engine that transcribes radio input into text, while Maranics provided the user interface for the SMA operators. All radio traffic on the emergency channel that reaches Sweden is transcribed by AI and then relayed to operators at the JRCC “within a second”, the SMA says. Nicander adds: “We have also defined keywords that should be clearly marked and remain on our screens until we say that we have seen them”. These keywords include ‘Mayday’, ‘SOS’, ‘help’ and ‘sinking’. “We also receive documentation of what has been said on the channel, which can be helpful in improving the process in the future.”
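The acknowledge-to-dismiss behaviour Nicander describes, where a flagged keyword stays on screen until an operator confirms seeing it, can be sketched in a few lines. This is an illustrative sketch only: `AlertBoard`, its method names, and the in-memory store are assumptions, not the Maranics interface.

```python
from dataclasses import dataclass
import itertools


@dataclass
class Alert:
    id: int
    transcript: str
    keyword: str
    acknowledged: bool = False


class AlertBoard:
    """Holds keyword alerts until an operator explicitly acknowledges them."""

    def __init__(self) -> None:
        self._next_id = itertools.count(1)
        self._alerts: dict[int, Alert] = {}

    def raise_alert(self, transcript: str, keyword: str) -> int:
        alert_id = next(self._next_id)
        self._alerts[alert_id] = Alert(alert_id, transcript, keyword)
        return alert_id

    def acknowledge(self, alert_id: int) -> None:
        self._alerts[alert_id].acknowledged = True

    def pending(self) -> list[Alert]:
        # Unacknowledged alerts stay visible, regardless of age.
        return [a for a in self._alerts.values() if not a.acknowledged]
```

The key design point is that alerts are never timed out or overwritten: only an explicit acknowledgement removes them from the pending view, which is what keeps a distress keyword from being lost while an operator is busy with another case.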
The system’s AI architecture analyses short segments of the audio and creates speech representations of them, which the system’s deep neural network then converts into text.
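The first stage of that pipeline, splitting the incoming audio into short segments and mapping each one to a representation, can be sketched as follows. Everything here is illustrative: the frame sizes are typical speech-processing values rather than the SMA system’s parameters, and the per-frame energy is a crude stand-in for the learned embedding a real encoder would produce.

```python
def frame_audio(samples: list[float], frame_len: int, hop: int) -> list[list[float]]:
    """Split a waveform into short, overlapping segments.

    At a 16 kHz sample rate, frame_len=400 and hop=160 would give the
    25 ms windows with a 10 ms hop common in speech models (illustrative
    values, not the SMA system's actual parameters).
    """
    return [samples[i:i + frame_len]
            for i in range(0, len(samples) - frame_len + 1, hop)]


def speech_representations(samples: list[float], frame_len: int = 400,
                           hop: int = 160) -> list[float]:
    """Map each audio segment to a representation.

    A per-frame energy value stands in here for the learned embedding a
    real encoder would output; in the system described above, a deep
    neural network then converts the representation sequence into text.
    """
    return [sum(x * x for x in frame) / len(frame)
            for frame in frame_audio(samples, frame_len, hop)]
```

The final decoding step, representations to text, is precisely the part handled by the deep neural network, so it is not reproduced here.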
The SMA tells Ship & Boat International: “The model we use is trained on a custom dataset created in collaboration between Tenfifty and the SMA, where real audio calls have been transcribed by the operators so that the system can learn how to interpret the domain-specific challenges present in VHF transmissions. The system uses an algorithm that allows some fuzziness in the spelling of the keywords so that, even if the text is not perfectly spelled, the keywords are still recognised. The AI server is a dedicated high-performance server installed within the SMA’s intranet: this helps ensure very short transcription times, as well as high integrity of information.”
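The “fuzziness in the spelling” the SMA describes is commonly implemented with an edit-distance tolerance, so that ‘maydey’ or ‘sinkin’ still triggers the keyword. The sketch below uses Levenshtein distance as an assumed mechanism; the SMA has not disclosed its actual algorithm, and the keyword list and threshold are illustrative.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]


# Keywords from the article; the distance threshold is an assumption.
KEYWORDS = ["mayday", "sos", "help", "sinking"]


def find_keywords(transcript: str, max_distance: int = 1) -> set[str]:
    """Return keywords matched in a transcript, tolerating small misspellings."""
    hits = set()
    for word in transcript.lower().split():
        for kw in KEYWORDS:
            if levenshtein(word, kw) <= max_distance:
                hits.add(kw)
    return hits
```

One practical caveat with this approach: very short keywords such as ‘SOS’ match many ordinary words within one edit, so a real system would likely scale the allowed distance with keyword length.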
These short transcription times could produce speedier response times, to increase the likelihood of successful rescues. As another bonus, the JRCC operators may experience less stress during their shifts, thanks to the reduced likelihood of them missing a distress call.
The first day of the full-scale test saw the system alert the JRCC to two Mayday calls placed in Germany and Denmark: proof for the SMA that the AI can “interpret even weak signals” placed over significant distances. The question to be resolved in autumn is whether the SMA adopts this AI assistance full-time – a decision that will largely hinge on financial considerations. “We believe it must be a ‘need-to-have’ and not just a ‘nice-to-have’ system,” the SMA explains.