Reputation: 4859
This is a very puzzling issue that I've been around the block on several times here without coming to a solution. My earlier question was getting too complicated to absorb all my updates and stay readable, so let's start fresh here.
The basic issue is that an audio clip is played from a web page that has some kind of audio player on it. When the page appears, the browser does not know the duration of the clip, and the first time through, whatever UI the player has fails to show progress through the file, even though the clip plays audibly. Once the player has finished the clip, it has also learned the duration, and on a subsequent replay the UI does show the progress.
This is the current state of play:
1) The original question assumed that jQuery jPlayer had something to do with the problem, but subsequent tests show that it does not. Removing jPlayer and instead using HTML5 audio tags shows exactly the same issue.
2) Originally I thought that the problem might lie with the SoX conversion of a .vox file to .wav, but a go-round on the SoX mailing list convinced me that this was not the case.
3) It's been suggested (primarily by Gyrocode.com) that the problem may lie with using .wav files, but I can reproduce the exact same problem with .ogg and other formats.
The only difference I can see between the samples Gyrocode has shown and what I am doing is that my code initially puts in a dummy source and then changes it dynamically before showing the audio player to the user (inside a modal dialog).
Below in skeletal form is the code.
HTML:
<div class="form-group">
<label class="col-sm-4 control-label">Play Audio</label>
<div class="col-sm-8">
<div style="box-sizing:content-box;" class="container" id="player-container">
<audio controls src="NONE" id="audio-player" preload="metadata"></audio>
</div>
</div>
</div>
JS:
/**
 * JavaScript functions for populating data on the transcripts page.
 */
var oParams = new Object();
// temporarily hardcoded
oParams.userId = "sc1478";
oParams.audioId = "";
var oAudioList;

function showModalTranscribeDialog(id) {
    console.log('You clicked on the row for audio id ' + id);
    oParams.audioId = id;
    $('#playTranscribeModal').modal('show');
}

// loads the data table
function fnLoadAudio() {
    console.log('in fnLoadAudio');
    var url = "x/audio/" + oParams.userId + "/";
    console.log("searching " + url);
    oAudioList = $('#audiolist').DataTable({
        "pageLength" : 10,
        "processing" : true,
        "serverSide" : false,
        "cache" : false,
        "ajax" : {
            "url" : url,
            "dataSrc" : "payload.listOfAudio"
        },
        "columns" : [{
            "data" : "audioId",
            "title" : "Audio ID",
            "width" : "6%"
        }, ...
        ],
        "columnDefs" : [
            ...
        ],
    });
    // clicking on a row of the table brings up the modal dialog for playing
    // and transcribing
    $('#audiolist tbody').on('click', 'tr', function () {
        var id = $('td', this).eq(0).text();
        showModalTranscribeDialog(id);
    });
    console.log('done with fnLoadAudio');
}
...
// load clip into audio player
function fnLoadPlayer() {
    var alerted = false;
    var media_url_wav = 'audio/clip/wav/' + oParams.audioId + '/';
    console.log("player will play " + media_url_wav);
    $("#audio-player").attr("src", media_url_wav);
}

$(document).ready(function() {
    console.log('in document ready !!');
    $('#dynamic').html('<table cellpadding="0" cellspacing="0" border="0" class="display" id="audiolist"></table>');
    fnLoadAudio();
    // set the ID of the selected audio in the dialog when the playTranscribe
    // modal dialog is shown
    $('#playTranscribeModal').on('show.bs.modal', function (e) {
        ...
        fnLoadPlayer();
    });
    ...
});
Here is what the code shows:
There is a jQuery DataTable on the page, and the page also contains a modal dialog that pops up when a row of the table is clicked, for further data gathering from the user. This dialog contains the audio player. The audio is played when the dialog is shown and the user clicks the play button.
The data table is loaded when the document is ready. The ready event handler calls fnLoadAudio(), which fills in the data table and sets a click handler for when a row of the table is clicked. In that event, the function showModalTranscribeDialog() is called. This first uses the row data to get the id of the audio to be played, and then shows the dialog. When the dialog is shown, another event handler uses the id to build the correct URL for the audio, and only then sets it into the src attribute of the audio element in the dialog.
The point this is illustrating is that the src attribute of the audio element is not populated until the dialog appears, although the audio element has been in the dialog (but hidden) all the while, with some other src value.
My theory is that it is this way of setting the src that is causing the problem, which will not occur when the audio's src is hardcoded on the page in a conventional manner.
And my question is whether there is a better way to dynamically set the source for the audio player, given the page structure as described.
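To make the moving parts easier to see, the whole sequence condenses to something like this (a sketch only; the ids, the event and the URL pattern are the ones from the code above, and the commented-out load() call is just a variation I am wondering about, not something the current code does):

// Condensed version of the flow above (sketch): a row click stores the clip
// id and opens the modal; the modal's show handler then rewrites the
// player's src just before the dialog becomes visible.
$('#playTranscribeModal').on('show.bs.modal', function () {
    var url = 'audio/clip/wav/' + oParams.audioId + '/';
    $('#audio-player').attr('src', url);
    // document.getElementById('audio-player').load(); // possible variation
});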
Upvotes: 4
Views: 6315
Reputation: 1
For me, everything worked after I set a Content-Length response header.
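In case it helps, something along these lines (a bare-bones Node sketch just to show the header being set; your server stack, MIME type and file path will differ):

// Sketch: stream the clip, but tell the browser its total size up front so
// it can estimate the duration before playback finishes.
const http = require('http');
const fs = require('fs');

http.createServer(function (req, res) {
    const path = '/path/to/clip.wav';       // hypothetical clip location
    const size = fs.statSync(path).size;    // total length in bytes
    res.writeHead(200, {
        'Content-Type': 'audio/wav',
        'Content-Length': size
    });
    fs.createReadStream(path).pipe(res);    // still streamed, but sized
}).listen(8080);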
Upvotes: 0
Reputation: 117
This may be common knowledge now, but I couldn't find any reference to it in the docs. I just ran into this same problem, and it turned out that:
<audio controls src="www.example.com/my-audio.mp3"></audio>
would indeed not load the duration (when my-audio.mp3 is a stream).
HOWEVER: Doing
<audio controls>
    <source src="www.example.com/my-audio.mp3" />
</audio>
does load the duration, as well as Download and Playback speed controls (on Chrome v106+). There does not seem to be any documentation on this particular difference of using the source tag.
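If the URL has to be assigned dynamically, as in the question, the same idea should presumably translate into swapping the nested source element and calling load() (untested sketch, assuming the player markup is changed to use a nested source tag):

// Sketch: point the nested <source> at the new clip and ask the element to
// re-evaluate its sources (assumes <audio id="audio-player"><source/></audio>).
var player = document.getElementById('audio-player');
var sourceTag = player.querySelector('source');
sourceTag.src = 'www.example.com/my-audio.mp3';  // new clip URL
player.load();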
Upvotes: 2
Reputation: 4859
The answer is simply that duration is NOT (usually) part of the metadata! And when the duration IS known before playing, it's because the source was delivered as a file, not as a stream. My app uses API calls that supply streams.
Duh!
Some formats (such as .mp3) may include a duration tag in the metadata, but this is not mandatory and perhaps not even a good thing.
I made a test page with two audio tags on it within my webapp.
<!DOCTYPE html>
<html>
<head>
    <meta charset="ISO-8859-1">
    <title>Audio Test Page</title>
</head>
<body>
    <h1>Audio Test Page</h1>
    <br/>
    <audio controls src="audio/clip/wav/103659/" id="audio-test-player" preload="auto"></audio>
    <br/>
    <audio controls src="audiotest/test.wav" id="audio-test-player2" preload="auto"></audio>
    <br/>
</body>
</html>
The first does what the regular app does: it makes an API call (the same API, in fact) that returns the file as a stream, reading it as an InputStream and copying it into the response's OutputStream.
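In Node terms (my real backend is Java, so this is only an illustration of the shape of the response), the first endpoint effectively does this:

// Illustration only: pipe the clip through without ever stating its size.
const http = require('http');
const fs = require('fs');

http.createServer(function (req, res) {
    res.setHeader('Content-Type', 'audio/wav');
    // No Content-Length: the response goes out chunked, so the browser has
    // no way to infer the file size (and hence the duration) up front.
    fs.createReadStream('/var/tmp/test.wav').pipe(res);
}).listen(8080);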
The second loads the same file as a static resource that I temporarily placed in the webapp.
The first fails to load the duration, just like the app does (even with preload="auto").
The second one loads the duration (even with preload="metadata").
It seems that when the source is delivered as a file, the browser has an opportunity to see the file's size and apply some logic to predetermine the duration. This is the same logic that the soxi utility employs to guess the duration, as detailed here:
$ soxi /var/tmp/test.wav
Input File : '/var/tmp/test.wav'
Channels : 1
Sample Rate : 8000
Precision : 16-bit
Duration : 00:00:10.48 = 83840 samples ~ 786 CDDA sectors
Sample Encoding: 16-bit Signed Integer PCM
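To make the arithmetic concrete, here is the same estimate spelled out (a sketch using the figures above; 167680 data bytes is simply 83840 samples times 2 bytes):

// Back-of-the-envelope duration calculation for the clip above (sketch).
var sampleRate = 8000;       // Hz, from the soxi output
var channels = 1;
var bytesPerSample = 2;      // 16-bit signed PCM
var dataBytes = 167680;      // 83840 samples x 1 channel x 2 bytes

var samples = dataBytes / (channels * bytesPerSample);   // 83840
var durationSeconds = samples / sampleRate;              // 10.48
console.log(durationSeconds.toFixed(2) + ' s');          // "10.48 s"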
But, when the source is delivered as a stream, the file size is not known and this calculation cannot take place.
A possible solution might be to include the calculated duration in the metadata, either in a standard or a custom tag, and skin some player to use it; but the basic problem was one of expectations.
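One hypothetical shape for that workaround (the X-Audio-Duration header name, the HEAD request and the clip-duration element are all inventions for illustration; nothing here is implemented):

// Hypothetical sketch: the server sends a pre-computed duration in a custom
// header, and the page displays it instead of waiting for the browser to
// work it out.
fetch('audio/clip/wav/' + oParams.audioId + '/', { method: 'HEAD' })
    .then(function (response) {
        var seconds = parseFloat(response.headers.get('X-Audio-Duration'));
        if (!isNaN(seconds)) {
            document.getElementById('clip-duration').textContent =
                seconds.toFixed(2) + ' s';
        }
    });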
Duration is not normally part of metadata, and cannot be calculated by the browser on streamed delivery of the source.
Thus we come to the not completely satisfactory, but finally understood, answer to this annoying problem, which has spanned more than one question here.
My use case is a bit non-standard, and probably not what most users of the sound APIs are trying to achieve.
Upvotes: 5
Reputation: 807
I didn't quite follow all of what's going on in your unrelated view code, but I didn't see anywhere that you listen for a 'playing' event on your media. In my code, I have to wait for the first 'playing' event before the duration is valid.
// assuming _audio is a reference to an Audio element
_audio.addEventListener('playing', function() {
    console.log(_audio.duration); // duration is a property, not a method
}, false);
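If the duration is all you are after, you could equally watch for the moment the browser learns it; on a streamed source that may still not happen until playback has essentially finished (sketch):

// Sketch: log the duration as soon as it becomes known (it may be NaN or
// Infinity until then, hence the isFinite check).
_audio.addEventListener('durationchange', function () {
    if (isFinite(_audio.duration)) {
        console.log('duration now known: ' + _audio.duration + ' s');
    }
}, false);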
Upvotes: 0