Interprefy, the remote simultaneous interpretation
provider, has launched its new AI-driven live captioning technology for
multilingual meetings, webinars, and conferences.
The new technology provides automated live translation services for organisations
looking to host international events in the language of each delegate's choice. The
user experience will be akin to the multi-language captioning options found in video
streaming services, such as Netflix, but for live events, such as conferences.
Powered by Automated Speech Recognition (ASR) and Machine Translation (MT), the
solution transcribes and translates live speech to provide users with closed
captions in real time. Delegates can simply turn captioning on and off in their
preferred meeting language and on the platform of their choice, such as
Microsoft Teams or ON24.
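The two-stage pipeline the article describes can be sketched as follows. This is a minimal illustration, not Interprefy's implementation: the `recognise` and `translate` functions are hypothetical stand-ins for streaming ASR and MT services, and the phrase table is a toy example.

```python
# Minimal sketch of an ASR -> MT live-captioning pipeline.
# recognise() and translate() are hypothetical stubs standing in for
# real streaming speech-recognition and machine-translation services.

from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Caption:
    language: str
    text: str


def recognise(audio_chunk: bytes) -> str:
    """Hypothetical ASR stub: pretend each audio chunk decodes to text."""
    return audio_chunk.decode("utf-8")


def translate(text: str, target_language: str) -> str:
    """Hypothetical MT stub: a toy phrase table instead of a real model."""
    phrase_table = {("Welcome everyone", "de"): "Willkommen zusammen"}
    return phrase_table.get((text, target_language), text)


def caption_stream(chunks: Iterable[bytes], target_language: str) -> Iterator[Caption]:
    """Transcribe each chunk of live speech, then translate it into the
    delegate's preferred language, yielding captions as they arrive."""
    for chunk in chunks:
        transcript = recognise(chunk)
        yield Caption(target_language, translate(transcript, target_language))


# Example: a German-speaking delegate turns captions on.
captions = list(caption_stream([b"Welcome everyone"], "de"))
```

The key design point the article implies is that transcription and translation run per speech segment, so captions appear incrementally rather than after the talk ends.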
Captions should help people who don't understand the speaker's language, the
deaf and hard of hearing, and individuals joining from noisy offices or public
spaces.
Interprefy CEO Oddmund Braaten said: "Our new captions offering is a huge milestone on our
path towards delivering inclusive meetings without language barriers.
Complementing our real-time spoken and sign language interpretation offering,
event organisers can now select and combine the language solutions that best
fit their unique needs. Better yet, it is available to all businesses, no
matter the size, platform, or budget.”
AMI editor James
Lancaster is a familiar face in the meetings industry and international
association community. Since joining AMI in 2010, he has gained a reputation
for asking difficult questions and getting lost in convention centres. Proofer,
podcaster, and panellist, in his spare time James likes to walk,
read, listen to music, and drink beer.