Reputation: 30028
Let's say I want to compose functions such as processResult and sendResult, but I can't just chain them, because processResult might need to call sendResult 0, 1, 2 or n times per call of processResult.
What is the proper way to do this in C++11?
I thought of two solutions:
1) Give the first function a std::function parameter (and assign sendResult to it), so it can call it whenever it needs to.
2) (I don't like this one; it seems too complicated.) A thread-safe queue, with the two functions running in two threads...
As requested, an example:
input: 1, 2, 3
calls of functions:
processResult(1)
// nothing
processResult(2)
// calls:
sendResult(10)
sendResult(20)
sendResult(50)
processResult(3)
// calls:
sendResult(55)
Upvotes: 1
Views: 336
Reputation: 275405
The composable programming, or stack-based programming, model may make sense. Each function takes a stack of arguments and returns the same. These stacks can be vectors or tuples or generators/generic ranges. Some means for determining the number of input parameters may be useful.
In this case, your generator makes a 'local' stack, then you throw that stack at the consumer until done.
If you want these operations to be interleaved, you can have the producer function return a generator function or a pair of generating iterators that lazily produce your elements, or you can pass it an output stack whose push method ferries each element to the consumer function (i.e., pass the consumer function to the producer).
The nice thing about the generator solution is that it does not fill your execution stack with crud, and you have better control over what happens next. The downside is that the producer state needs to be explicitly stored, instead of living on the stack, and that the producer must be modified reasonably heavily. With lambdas this is not that bad, as you can store the next iteration of the loop as a closure, but it is still tricky.
Here is a trivial generator:
using boost::optional; // or std::optional in C++17
using boost::none;

template<typename T>
using Generator = std::function< optional<T>() >;

Generator<int> xrange( int min, int max, int step = 1 ) {
  int next = min;
  return [=]() mutable -> optional<int>
  {
    if (next > max) return none;
    int retval = next;
    next += step;
    return {retval};
  };
}
If you prefer iterators, turning a Generator<T> into a generating iterator is something you only have to write once, and it works for every Generator<T>. And writing Generator<>-based code is easier than writing generating-iterator-based code. Plus, Generators and Pipes are easy to chain:
template<typename T, typename U>
using Pipe = std::function< optional<T>(U) >;

template<typename A, typename B, typename C>
Generator<A> Compose( Pipe<A, B> second, Generator<C> first )
{
  return [=]() mutable -> optional<A> {
    optional<B> b = first();
    if (!b) return none;
    return second(std::move(*b));
  };
}

template<typename A, typename B, typename C, typename D>
Pipe<A, D> Compose( Pipe<A, B> second, Pipe<C, D> first )
{
  return [=](D d) mutable -> optional<A> {
    optional<C> c = first(std::move(d));
    if (!c) return none;
    return second(std::move(*c)); // requires C to be convertible to B
  };
}
// until C++14, when we get auto deduction of non-lambda return types:
#define RETURNS(x) -> decltype(x) { return (x); }

template<typename A, typename B, typename C>
auto operator|( Generator<A> first, Pipe<B,C> second )
RETURNS( Compose(second, first) )

template<typename A, typename B, typename C, typename D>
auto operator|( Pipe<A, B> first, Pipe<C,D> second )
RETURNS( Compose( second, first ) )
which we then abuse like this:
struct empty {}; // easier to pass through pipes than void
template<typename T>
void sendEverything( Pipe<empty, T> sender, Generator<T> producer ) {
Generator<empty> composed = producer | sender;
while (composed()) {}
}
and the producer happily produces data, each of which is sent on to the sender; then the producer is called again. The sender can even abort the sequence by returning none.
A bit more advanced work and we'd be able to have pipes that represent one-to-many and many-to-one relationships.
(code not yet tested, so probably contains compiler errors)
template<typename Out, typename In>
using OneToManyPipe = Pipe< Generator<Out>, In >;

template<typename Out, typename In>
using ManyToOnePipe = Pipe< Out, Generator<In> >;

template<typename Out, typename In>
using ManyToManyPipe = Pipe< Generator<Out>, Generator<In> >;

template<typename Out, typename A, typename B>
Generator<Out> Compose( OneToManyPipe< Out, A > second, Generator<B> first )
{
  auto sub_gen = [=]() mutable -> optional<Generator<Out>> {
    optional<B> b = first();
    if (!b) return none;
    return second(std::move(*b));
  };
  optional<Generator<Out>> sub = Generator<Out>{
    []() -> optional<Out> { return none; }
  };
  return [sub_gen, sub]() mutable -> optional<Out> {
    for(;;) {
      if (!sub)
        return none;
      optional<Out> retval = (*sub)();
      if (retval)
        return retval;
      sub = sub_gen();
    }
  };
}
template<typename Out, typename A, typename B, typename C>
OneToManyPipe<Out, C> Compose( OneToManyPipe< Out, A > second, OneToManyPipe<B, C> first );
// etc
// etc
Probably boost already does this somewhere. :)
A downside to this approach is that eventually the overloaded operators become ambiguous. In particular, the difference between a OneToMany pipe and a OneToOne pipe is that the second is a subtype of the first. I suppose the optional<T> would make the OneToMany "more specialized".
This does mean that any std::function<optional<T>()> is treated like a generator, which is not right. Probably a struct generator_finished {}; variant<generator_finished, T> is a better approach than an optional, because using generator_finished in a variant of your return value when you aren't a generator seems impolite.
Upvotes: 4
Reputation: 25927
This does not actually look like function composition to me, because if it were, you would be able to simply chain them. From what I read, you require that processResult should be able to call some external methods to further process the data. In that case, I would think about these two solutions:
1) Pass a std::function to processResult. This allows you to pass more kinds of objects: a function, a functor object or even a lambda into that method.
2) Change these two methods into so-called processor classes. You would be able to use them more flexibly than if they were functions or methods, so this one should suit your needs better if the complexity of your program increases.
class IDataSender
{
public:
    virtual ~IDataSender() { }
    virtual void Send(Data data) = 0;
};

class VectorSender : public IDataSender
{
private:
    std::vector<Data> & target;
public:
    VectorSender(std::vector<Data> & newTarget)
        : target(newTarget)
    {
    }

    // Implementation of IDataSender
    void Send(Data data) override
    {
        target.push_back(data);
    }
};

class IDataProcessor
{
public:
    virtual ~IDataProcessor() { }
    virtual void Process() = 0;
};

class Processor : public IDataProcessor
{
private:
    IDataSender & sender;
public:
    Processor(IDataSender & newSender)
        : sender(newSender)
    {
    }

    // Implementation of IDataProcessor
    void Process() override
    {
        // Do some processing...
        // (sendData and someData come from that processing logic)
        if (sendData)
            sender.Send(someData);
    }
};
In the previous example, the Sender may also receive another class performing the actual sending, so you are able to "chain" more dependent objects.
Much depends on your program's architecture, and I'm afraid that no one will be able to help you further without more detailed information about it.
Upvotes: 0
Reputation: 438
Maybe you could just put your sendResult in a class. Define an object of that class inside the class of processResult. Then you can call sendResult as many times as you want from processResult using this object.
Upvotes: 0
Reputation: 2793
If you simply need to call one function from the other as many times as you want, you can call processResult inside sendResult (and vice versa, provided you forward-declare what's needed).
int processResult() { /* blah blah */ }

void sendResult() {
    while (needed)
        processResult();
}
Upvotes: 0