Rob Allsopp

Reputation: 3518

Multi-tempo/meter js DAW

Has anyone implemented a JavaScript audio DAW with multiple tempo and meter change capabilities, like most of the desktop DAWs (Pro Tools, Sonar, and the like)? As far as I can tell, claw, openDAW, and web audio editor don't do this. Drawing a grid meter, converting between samples and MBT time, and rendering waveforms is easy when the tempo and meter do not change during the project, but when they do it gets quite a bit more complicated. I'm looking for any information on how to accomplish something like this. I'm aware that the source for Audacity is available, but I'd love to not have to dig through an enormous pile of code in a language I'm not an expert in to figure this out.

Upvotes: 0

Views: 633

Answers (2)

rihan

Reputation: 121

Web-based DAW solutions exist. Web-based DAWs are usually built as SaaS (Software as a Service) applications; they are lightweight and contain the basic, fundamental DAW features. For designing rich client applications (RCAs) you should take a look at GWT and Vaadin.

I recommend GWT because it is mature, has reusable components, and is AJAX-driven. The MusicRadar site also lists nine different browser-based audio workstations, and you can refer to Popcorn Maker, which is entirely JavaScript. You can get some inspiration from those to get started.

Upvotes: 1

LetterEh

Reputation: 26696

You're missing the last step, which will make it easier.

All measures are relative to fractions of minutes, based on the time-signature and tempo.

The math gets a little more complex now that you can't just plot 4/4 or 6/8 across the board and be done with it. What you're looking at is running an actual timeline (whether drawn on screen or not), and then figuring out where each measure starts and ends, based on either the running sum of the track's current length (in minutes/seconds), or the left-most take's x-coordinate (starting point) plus its duration...

or based on the running total of each measure's length in seconds, up to the current beat you care about.

var measure = { beats : 4, denomination : 4, tempo : 80 };

Given those three data-points, you should be able to say:

var measure_length = SECONDS_PER_MINUTE / measure.tempo * measure.beats;

Of course, that's currently in seconds. To get it in ms, you'd just use MS_PER_MINUTE, or whichever other ratio of minutes you'd want to measure by.

current_position + measure_length === start_of_next_measure;
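To make that concrete, here is a minimal runnable sketch of the two formulas above, assuming `SECONDS_PER_MINUTE = 60` and that the tempo's beat unit matches the measure's denomination:

```javascript
var SECONDS_PER_MINUTE = 60;

// Length of one measure in seconds: seconds-per-beat times beats-per-measure.
function measureLengthSec(measure) {
    return SECONDS_PER_MINUTE / measure.tempo * measure.beats;
}

var measure = { beats: 4, denomination: 4, tempo: 80 };

var measure_length = measureLengthSec(measure); // 60 / 80 * 4 = 3 seconds

var current_position = 0;
var start_of_next_measure = current_position + measure_length; // 3 seconds in
```

A 6/8 measure at 120 BPM works the same way: 60 / 120 * 6 = 3 seconds, so tempo and signature changes just change the per-measure length as you accumulate.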

You've now separated out each dimension required to allow you to calculate each measure on the fly.

Positioning each measure on the track, so that it matches up with where it belongs on the timeline, is as simple as keeping a running tally of where X (the left edge of the measure) is, in ms (really in screen-space and project-coordinates, but ms can work fine for now).

var current_position     =   0,
    current_tempo        = 120,
    current_beats        =   4,
    current_denomination =   4,
    measures = [ ];

measures.forEach(function (measure) {
    if (measure.tempo !== current_tempo) {
        /* draw tempo-change */
        current_tempo = measure.tempo;
    }
    if (measure.beats        !== current_beats ||
        measure.denomination !== current_denomination) {
        /* draw time-signature */
        current_beats        = measure.beats;
        current_denomination = measure.denomination;
    }
    draw_measure(measure, current_position);
    /* advance by this measure's length in ms */
    current_position += MS_PER_MINUTE / measure.tempo * measure.beats;
});

Drawing samples just requires figuring out where you're starting from, and then sticking to some resolution (MS/MS*4/Seconds).

The added benefit of separating out the calculation of the time is that you can change the resolution of your rendering on the fly, by changing which time-scale you're comparing against (ms/sec/min/etc), so long as you re-render the whole thing, after scaling.
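As a hypothetical sketch of that separation (the `pxPerMs` zoom factor and function names are illustrative, not from the answer), keeping time in project units and converting only at draw time means zooming is a pure re-render:

```javascript
// Map project-time (ms) to screen-space x, given a zoom factor,
// and back again. Change pxPerMs, then re-render everything.
function timeToX(ms, pxPerMs) {
    return ms * pxPerMs;
}

function xToTime(x, pxPerMs) {
    return x / pxPerMs;
}
```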

The rabbit hole goes deeper (for instance, actual audio tracks don't really care about measures/beats, though quantization-processes do), so to write a non-destructive, non-linear DAW, you can just set start-time and duration properties on views into your audio-buffer (or views into view-buffers of your audio buffer).
Those views would be the non-destructive windows that you can resize and drag around your track.
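One way to sketch such a view (the object shape and helper names here are illustrative; only the Web Audio call `source.start(when, offset, duration)` mentioned in the comment is a real API):

```javascript
// A "clip" is a window into an audio buffer: nothing is copied or cut;
// only the start/offset/duration properties change on resize/drag.
function makeClip(bufferDurationSec) {
    return {
        trackStartSec:   0,                 // where the clip sits on the timeline
        bufferOffsetSec: 0,                 // where in the buffer playback begins
        durationSec:     bufferDurationSec  // how much of the buffer is audible
    };
}

// Trimming the left edge moves the window without touching the audio data.
function trimLeft(clip, seconds) {
    clip.trackStartSec   += seconds;
    clip.bufferOffsetSec += seconds;
    clip.durationSec     -= seconds;
    return clip;
}

// At play time you would hand these straight to Web Audio:
// source.start(ctx.currentTime + clip.trackStartSec,
//              clip.bufferOffsetSec, clip.durationSec);
```

Undo/redo then falls out for free, since each edit is just a change to three numbers.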

Then there's just the logic of figuring out snaps -- what your screen-space is, versus project-space, and when you click on a track's clip, which measure, et cetera, you're in, to do audio-snapping on resize/move.
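A minimal snapping sketch, assuming you keep a table of measure-start times as you accumulate them in the loop above (`measureStartsMs` is a hypothetical name):

```javascript
// Snap a project-time (ms) to the nearest measure boundary.
// measureStartsMs is a sorted array of each measure's start time in ms.
function snapToMeasure(timeMs, measureStartsMs) {
    var best = measureStartsMs[0];
    for (var i = 1; i < measureStartsMs.length; i++) {
        if (Math.abs(measureStartsMs[i] - timeMs) < Math.abs(best - timeMs)) {
            best = measureStartsMs[i];
        }
    }
    return best;
}
```

The same lookup, run against the click's x-coordinate converted to project-time, also tells you which measure a clip edge lands in during a resize or move.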

Of course, to do a 1:1 recreation of ProTools in JS in the browser would not fly (gigs of RAM for one browser tab won't do, media capture API is still insufficient for multi-tracking, disk-writes are much, much more difficult in browser than in C++, in your OS of choice, et cetera), but this should at least give you enough to run with.

Let me know if I'm missing something.

Upvotes: 0
