ducin


twisted deferred: client connects to a server, async single-threaded

We've got one async single-threaded twisted/python process. There is an X-server that listens on port X, and another server, Y-server, that listens on port Y. Y-server is also a client of X-server (executing a Y-request involves passing a request to the X-server).

They should both run asynchronously in one thread. It should work like the following:

I was trying to implement such a thing, but I failed, probably because I was not using Deferreds. As far as I understand Deferreds, their job is to divide the above sequence into smaller chunks so that these parts can be handled by both X and Y concurrently.

What I need is to understand a scheme of how such communication should work. Pseudo-code would do...


Below is a shortened version of my failed attempt.

First, the main service, which consists of a protocol class and a factory class:

from datetime import datetime
import json

from twisted.internet import protocol
from twisted.python import log

class PyCached(protocol.Protocol):
    def __init__(self, factory, verbose):
        self.factory = factory
        self.verbose = verbose
    def dataReceived(self, data):
        log.msg(data)
        if self.verbose:
            print 'received: %s' % (data,)
        request = json.loads(data)
        if self.verbose:
            print 'request: %s' % (request,)
        command = "handle_%s" % (request.pop('command'),)
        if self.verbose:
            print 'command: %s\n' % (command,)
        result = getattr(self.factory, command)(**request)
        self.transport.write(result + "\n")

class PyCachedFactory(protocol.Factory):
    def __init__(self, verbose=False):
        self.clear()
        self.start_time = datetime.now()
        self.verbose = verbose
        log.msg('starts on %s, verbose=%s' % (self.start_time, self.verbose))

    def buildProtocol(self, addr):
        # Build the protocol defined above for each incoming connection.
        return PyCached(self, self.verbose)

    # many many more commands performed by factory

There is also the HTTP access server:

from twisted.web.resource import Resource
from twisted.python import log
from twisted.web.server import Site
from client import PyCachedClient

class PyCachedCommand(Resource):
    isLeaf = True

    def getServiceClient(self):
        client = PyCachedClient()
        client.connect(*self.service_address)
        return client

    def render_GET(self, request):
        '''
        Renders service status as plain text.
        '''
        log.msg('GET')
        request.setHeader('Content-Type', 'text/plain')
        try:
            client = self.getServiceClient()
            status = client.status()
            client.close()
            return "PyCached is up since %0.2f seconds" % (status['uptime'],)
        except Exception:
            return "PyCached is down."

    def render_POST(self, request):
        '''
        Executes a pycached request and returns the response.
        '''
        log.msg('POST %s' % (str(request.args)))
        client = self.getServiceClient()
        kwargs = {k: v[0] for k,v in request.args.iteritems()}
        command_name = kwargs.pop('command')
        command = getattr(client, command_name)
        result = str(command(**kwargs))
        client.close()
        request.setHeader('Content-Type', 'text/plain')
        return result

class PyCachedSite(Site):
    '''
    Performs all operations for PyCached HTTP access.
    '''
    def __init__(self, service_address, **kwargs):
        resource = PyCachedCommand()
        resource.service_address = service_address
        Site.__init__(self, resource, **kwargs)

The HTTP server uses the main service's client, which is implemented with plain sockets - and this is probably where the problem lies, since these client socket calls are blocking:

import json
import socket

class PyCachedClient(object):
    def __init__(self):
        self.s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

    def connect(self, host, port):
        try:
            self.s.connect((host, port))
        except socket.error:
            raise RuntimeError('Something went wrong with PyCached.')

    def close(self):
        self.s.close()

    def _receive(self):
        received = self.s.recv(1024)
        decoded = json.loads(received.rstrip('\n'))
        return decoded.get('value')

    def _send(self, command, options=None):
        request = {'command': command}
        request.update(options or {})
        self.s.sendall(json.dumps(request))

    def version(self):
        self._send('version')
        return self._receive()

    # many many more commands similar to version
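For contrast, a non-blocking client would return a placeholder for the eventual result instead of waiting on `recv()`. Here is a minimal sketch of that pattern, using a hand-rolled stand-in for Twisted's Deferred; `MiniDeferred`, `AsyncPyCachedClient`, and `data_received` are illustrative names, not part of the codebase:

```python
class MiniDeferred(object):
    """Tiny stand-in for twisted.internet.defer.Deferred (illustration only)."""
    def __init__(self):
        self._callbacks = []
        self._fired = False
        self._result = None

    def addCallback(self, fn):
        # Run immediately if the result already arrived; otherwise queue it.
        if self._fired:
            self._result = fn(self._result)
        else:
            self._callbacks.append(fn)
        return self

    def callback(self, result):
        # Fire the result through all queued callbacks.
        self._fired = True
        self._result = result
        for fn in self._callbacks:
            self._result = fn(self._result)

class AsyncPyCachedClient(object):
    """Hypothetical async client: version() returns a deferred immediately."""
    def __init__(self):
        self._pending = []

    def version(self):
        d = MiniDeferred()
        # A real implementation would write the JSON request to the transport
        # here; the event loop would later fire d from dataReceived.
        self._pending.append(d)
        return d

    def data_received(self, value):
        # Called by the event loop when the server's reply arrives.
        self._pending.pop(0).callback(value)

results = []
client = AsyncPyCachedClient()
client.version().addCallback(results.append)  # no blocking recv() here
assert results == []                          # reply has not arrived yet
client.data_received('1.0')                   # event loop delivers it later
assert results == ['1.0']
```

The key difference from the blocking client above: `version()` returns at once, so the single reactor thread stays free to service the main server while the reply is in flight.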

Finally, everything is run from a twistd/TAC file, so it all resides in a single thread:

from twisted.application import internet, service
from server.service import PyCachedFactory
from server.http import PyCachedSite

application = service.Application('pycached')
# pycached core service
pycachedService = internet.TCPServer(8001, PyCachedFactory())
pycachedService.setServiceParent(application)
# pycached http access
addr = ('localhost', 8001)
pycachedHttp = internet.TCPServer(8002, PyCachedSite(addr))
pycachedHttp.setServiceParent(application)

When I telnet to port 8001 (the main service) and send e.g. {"command":"version"}, everything is OK. But when I query the HTTP server, everything blocks: the client socket call blocks the reactor thread, so the main service never gets a chance to respond.

Upvotes: 1

Views: 566

Answers (1)

monoid


A Deferred is just a tool for managing callbacks. The X-client has to provide some way to add a callback that runs when the result is received, and you should send the Y-response from that callback. The Deferred itself is just an implementation detail.
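Concretely, the scheme might look like the following simulation with plain callbacks; `FakeXClient`, `send_command`, and `handle_y_request` are invented names standing in for the asker's client and Y-server handler. (In real Twisted, `render_POST` would return `twisted.web.server.NOT_DONE_YET` and call `request.write`/`request.finish` from the callback.)

```python
class FakeXClient(object):
    """Simulated async X-client: records requests, fires callbacks on reply."""
    def __init__(self):
        self._pending = []

    def send_command(self, command, on_result):
        # In Twisted this would write the request to the X-server transport;
        # on_result would be invoked from the protocol's dataReceived.
        self._pending.append((command, on_result))

    def deliver(self, result):
        # Simulates the reactor handing us the X-server's reply.
        command, on_result = self._pending.pop(0)
        on_result(result)

def handle_y_request(x_client, write_y_response):
    """Y-server handler: forwards to X and returns immediately.

    The Y-response is written later, from the X-result callback, so the
    single reactor thread is never blocked waiting for X.
    """
    x_client.send_command('version', on_result=write_y_response)

y_responses = []
x = FakeXClient()
handle_y_request(x, y_responses.append)   # handler returns at once
assert y_responses == []                  # Y-response not sent yet
x.deliver('1.0')                          # X replies; callback sends Y-response
assert y_responses == ['1.0']
```

Wrapping `on_result` in a Deferred buys you chaining and error handling on top of this, but the essential move is the same: register the callback and return, never block.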

Upvotes: 1
