I want to run machine learning algorithms written in Python on data from a Ruby on Rails app's database. After some research I came across sockets, so I created a Ruby server and a Python client and am running them in two separate command prompt terminals.
Here is the Ruby server code:
require "socket"
server = TCPServer.open(2000)
loop {
client = server.accept
client.puts(Time.now.ctime)
client.puts "Closing the connection. Bye!"
client.close
}
Here is the Python client code:
import socket
s = socket.socket()
host = "localhost"
port = 2000
s.connect((host, port))
When I run them, the client connects but prints nothing and exits immediately. I do not understand where the problem is. Kindly assist.
Given the insightful answers to my question above, the Ruby server and Python client code should be as below.
For the Ruby server:
require "socket" # Get sockets from stdlib
server = TCPServer.open("127.0.0.1" , 2000) # Socket to listen on port 2000
loop { # Server runs forever
client = server.accept # Wait for a client to connect
client.puts(Time.now.ctime) # Send the time to the client
client.puts "Closing the connection. Bye!"
client.close # Disconnect from the client
}
For the Python client:
import socket  # Import socket module
s = socket.socket()  # Create a socket object
host = "127.0.0.1"  # The server's address
port = 2000  # The port the server listens on
s.connect((host, port))
print(s.recv(1024).decode())  # Read and print the server's reply
s.close()  # Close the socket when done
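Since the server sends two lines before disconnecting, a single recv call may return only part of the reply. Below is a minimal sketch of a more robust client, assuming Python 3 and the same 127.0.0.1:2000 address, that reads until the server closes the connection:
import socket  # Import socket module
# Connect to the server; create_connection takes a (host, port) tuple
with socket.create_connection(("127.0.0.1", 2000)) as s:
    chunks = []
    while True:
        data = s.recv(1024)  # recv returns b"" once the server closes
        if not data:
            break
        chunks.append(data)
print(b"".join(chunks).decode())  # Print everything the server sent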
The open() method of Ruby's TCPServer class takes up to two arguments: the host name to bind to and the port, i.e.
TCPServer.open(hostname, port)
When the host name is omitted, as in the original question's TCPServer.open(2000), the server binds to all available interfaces rather than only the loopback address.
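A minimal Ruby sketch illustrating both forms (each server is closed before the next binds the same port):
require "socket"
# Two-argument form: bind to the loopback address only
server = TCPServer.open("127.0.0.1", 2000)
server.close
# One-argument form: omit the host to bind to all interfaces
server = TCPServer.open(2000)
server.close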