Reputation: 843
I thought the parse rules of APL were straightforward: on seeing a term ⍺ f ⍵, have the function f receive arguments ⍺, ⍵. Indeed, for the reduction operator / in APL, we can think of:
+/(1 2 3 4) ⍝ REDUCE(+, (1 2 3 4))
10
However, this logic is thrown out the window in the case of:
2+/(1 2 3 4) ⍝ REDUCE(2, +, (1 2 3 4)) ?
(3 5 7)
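The two behaviors above can be modeled in Python. This is a sketch of my own (not an APL implementation, and `reduce_slash` is a hypothetical name): with no left argument, / is a plain fold; with a left argument n, it folds each window of n consecutive items, which is how 2+/ yields the pairwise sums.

```python
from functools import reduce

# Model of APL's / (reduce) operator -- my own sketch, not real APL:
#   f/ Y    -> plain reduction over Y
#   X f/ Y  -> X-wise (windowed) reduction over Y
def reduce_slash(f, ys, n=None):
    if n is None:
        # +/ 1 2 3 4  ->  1+2+3+4
        return reduce(f, ys)
    # n-wise reduce: fold f over each window of n consecutive items
    return [reduce(f, ys[i:i + n]) for i in range(len(ys) - n + 1)]

print(reduce_slash(lambda a, b: a + b, [1, 2, 3, 4]))       # 10
print(reduce_slash(lambda a, b: a + b, [1, 2, 3, 4], n=2))  # [3, 5, 7]
```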
That is, it seems that the parse of this needs to look two behind in the intermediate parse tree --- one behind to retrieve the operator +, and two behind to retrieve the number 2.
This significantly complicates my mental model of how to read and parse APL expressions --- am I missing something here? Alternatively, if this is actually how this works, are there other APL operators that "look behind" more than 1 sub-expression?
Upvotes: 1
Views: 222
Reputation: 7616
As an introduction to APL, the simple rules about functions and arrays are adequate, but once you throw operators (and especially dyadic operators) into the mix, things get a bit more complicated. While the function/array rules still apply, a function can now be derived using one or more operators. In fact, you can end up looking far to the left to find out where a function "begins".
Consider e.g. the function *∘*∘*∘*, f(a,b) = a^(e^(e^(e^b))), in the context 2*∘*∘*∘*3:
Scanning the expression from the right:

3: this is our array
*: maybe we'll apply this function monadically to 3, but it depends…
∘: nope: this is a dyadic operator which "grabs" the * on its right to derive a new function
*: maybe this is ∘'s left operand, but it depends…
∘: nope: this is a dyadic operator which "grabs" the * on its right to derive a new function
*: maybe this is ∘'s left operand, but it depends…
etc.
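The derivation above can be sketched in Python under some modeling assumptions of my own (this is not APL itself): each APL function is represented as a (monadic, dyadic) pair, * is exp/power, and f∘g applies g monadically to the right argument before f, i.e. a f∘g b is a f (g b).

```python
import math

# Assumption: model each APL function as a (monadic, dyadic) pair.
# * is exp when monadic, power when dyadic.
star = (math.exp, lambda a, b: a ** b)

def jot(f, g):
    """Model of f∘g: monadically y -> f(g y); dyadically (a,b) -> a f (g b),
    where g is always applied monadically."""
    f_mon, f_dy = f
    g_mon, _g_dy = g
    return (lambda y: f_mon(g_mon(y)),
            lambda a, b: f_dy(a, g_mon(b)))

# Built right to left, as in the walkthrough: *∘*∘*∘* is *∘(*∘(*∘*))
tower = jot(star, jot(star, jot(star, star)))

# 2 tower ¯1 should equal 2 ** e^(e^(e^-1)); a small right argument
# keeps the exponent tower from overflowing a float.
print(tower[1](2, -1))
```

Note that the derived function has to be assembled before any array argument is consumed, which mirrors the "it depends…" steps: nothing can be applied until the whole *∘*∘*∘* chain is resolved.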
This is in fact how I parse APL in my head. However, a more precise overall approach is to use a binding-strengths table, as per the documentation and the model implementation.
Upvotes: 2