Reputation: 61
I have a project that uses Azure Media Services to broadcast video streams, and when a broadcast ends it feeds the generated Asset to a Job to extract insights from it.
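For reference, the job is created roughly like this (a minimal sketch using the azure-mgmt-media Python SDK; the account, transform, and asset names are placeholders and the preset options are an approximation of my setup, not the exact code I run):

# Minimal sketch: create an analyzer Transform once, then submit a Job per broadcast.
# All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.media import AzureMediaServices
from azure.mgmt.media.models import (
    Transform,
    TransformOutput,
    VideoAnalyzerPreset,
    Job,
    JobInputAsset,
    JobOutputAsset,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
ACCOUNT_NAME = "<media-services-account>"
TRANSFORM_NAME = "BroadcastInsights"  # placeholder transform name

client = AzureMediaServices(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One-time setup: a Transform that runs the video analyzer over an asset.
client.transforms.create_or_update(
    RESOURCE_GROUP,
    ACCOUNT_NAME,
    TRANSFORM_NAME,
    Transform(outputs=[
        TransformOutput(preset=VideoAnalyzerPreset(audio_language="en-US"))
    ]),
)

# Per broadcast: feed the archived asset into the Transform as a Job.
job = client.jobs.create(
    RESOURCE_GROUP,
    ACCOUNT_NAME,
    TRANSFORM_NAME,
    "analyze-broadcast-job",  # placeholder job name
    Job(
        input=JobInputAsset(asset_name="broadcast-archive-asset"),
        outputs=[JobOutputAsset(asset_name="broadcast-insights-asset")],
    ),
)
print(job.state)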
The problem is that all the insight data is generated correctly, but the transcription (speech-to-text) only covers the first 10 minutes.
I can see thumbnails analyzed more than 30 minutes into the video:
{
  "id": "509c12e0-e6b8-4e09-bd6a-6ddb4f12417a",
  "fileName": "FaceInstanceThumbnail_509c12e0-e6b8-4e09-bd6a-6ddb4f12417a.jpg",
  "instances": [
    {
      "adjustedStart": "0:37:32.831",
      "adjustedEnd": "0:37:32.864",
      "start": "0:37:32.831",
      "end": "0:37:32.864"
    }
  ]
},
but the transcript always stops generating data at the 10-minute mark:
{
  "id": 83,
  "text": "Many things are also to do you Bonnie?",
  "confidence": 0.8468,
  "speakerId": 3,
  "language": "en-US",
  "instances": [
    {
      "adjustedStart": "0:09:55.36",
      "adjustedEnd": "0:09:58.39",
      "start": "0:09:55.36",
      "end": "0:09:58.39"
    }
  ]
}
The broadcast plays back correctly in the AMS player embedded in my site, and if I download the broadcast files and upload them to videoindexer.ai with a trial account, the transcript is generated correctly for the whole video.
Video Indexer is an option, but I would like to avoid linking another service and API into the app, so any help overcoming the 10-minute limit is welcome.
Upvotes: 0
Views: 142
Reputation: 2512
Could you please file a support ticket on this through the Azure Portal? That way we can track it down. You will also want to attach a sample file to the support ticket.
Upvotes: 0