Reputation: 280
I am creating a short Python script which requires me to send text back and forth between two computers via sockets. When I test my application locally on the same computer using telnet, it requires me to pass the 'UTF-8' encoding to the bytes() function, as such:
connection.sendall(bytes("Unknown command", 'UTF-8'))
All works well locally, but when I try to test my application remotely on a Raspberry Pi and connect to it from my computer through telnet, I get the following error:
TypeError: str() takes at most 1 argument (2 given)
After a bit of research I found that if I remove the UTF-8 encoding argument, it works remotely, as such:
connection.sendall(bytes("Unknown command"))
But then this causes an error when I test locally. It says I must use an encoding. Why is this happening?
Upvotes: 1
Views: 109
Reputation: 180391
You are using two different versions of Python: Python 2 vs. Python 3. In Python 2, bytes is simply an alias for str, so its constructor takes at most one argument; in Python 3, bytes is a separate type, and converting from str requires an encoding.
Python 2:
In [1]: bytes("Unknown command", 'UTF-8')
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-1-ddc681192da0> in <module>()
----> 1 bytes("Unknown command", 'UTF-8')
TypeError: str() takes at most 1 argument (2 given)
Python 3:
In [1]: bytes("Unknown command", 'UTF-8')
Out[1]: b'Unknown command'
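If you need the same script to run under both interpreters, one option is to call str.encode() instead of the bytes() constructor: it returns a bytes object on Python 3 and a byte string on Python 2, and sendall() accepts either. A minimal sketch (the host and port here are hypothetical, just for illustration):
import socket

HOST, PORT = "127.0.0.1", 5000   # hypothetical address for illustration

# "...".encode('UTF-8') behaves the same on both interpreters:
# Python 3 returns a bytes object, Python 2 returns a byte string,
# and socket.sendall() accepts either.
connection = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
connection.connect((HOST, PORT))
try:
    connection.sendall("Unknown command".encode('UTF-8'))
finally:
    connection.close()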
Upvotes: 3