I'm writing a little TCP application. When the app starts up, I try to connect to a remote server. If that connection is not successful within a couple of seconds, I want to catch that, throw an error, and then start the process over again.
Here is my basic code for the connection.
var net = require('net');

// port and host are defined elsewhere; see below.
var connected = false;
var socket = new net.Socket();

// If we haven't connected within 2 seconds, give up and retry.
socket.setTimeout(2000, function () {
    console.log('timeout');
    console.log(connected);
    if (connected) return;
    // retry
});

socket.on('error', function (err) {
    console.log(err);
});

socket.on('connect', function () {
    console.log('connected');
    connected = true;
});

socket.connect(port, host);
The variables port and host refer to a host that exists but is not listening on the specified port.
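For context, the full flow I'm trying to build looks roughly like this. This is just a sketch for the post, not code from my app: connectWithRetry and RETRY_DELAY are names and values I made up here.

var net = require('net');

var RETRY_DELAY = 1000; // made-up delay between attempts

function connectWithRetry(port, host) {
    var socket = new net.Socket();
    var retried = false;

    function retry() {
        if (retried) return; // schedule at most one retry per attempt
        retried = true;
        socket.destroy();    // abort this attempt
        setTimeout(function () {
            connectWithRetry(port, host); // start the process over
        }, RETRY_DELAY);
    }

    // Treat the attempt as failed if nothing happens within 2 seconds.
    socket.setTimeout(2000, retry);

    socket.on('error', function (err) {
        console.log(err);
        retry();
    });

    socket.on('connect', function () {
        socket.setTimeout(0); // connected, so disable the idle timeout
        console.log('connected');
    });

    socket.connect(port, host);
}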
When I try this locally, after 2 seconds I see the timeout message printed to the console. When I try it in production, I see BOTH the connect and timeout messages printed to the console, with the connect message first. If I change the timeout to 5 seconds, or 10 seconds, or whatever, the same thing happens: nothing is printed until the timeout is reached, then the connect and timeout messages show up at virtually the same time, always with connect first.
I even logged the connected variable in the timeout handler and sure enough, it's set to true.
First of all, I can't figure out what would be making the connect event fire first. It's clearly not actually connecting, because the connect message doesn't print until the timeout is reached, no matter how long the timeout is.
Second of all, if the connection IS successful, then the timeout shouldn't fire until 2 seconds later (or however long the timeout is set for), since as I understand it setTimeout starts an idle timer on the socket.
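To sanity-check that mental model: against a server that IS listening, I'd expect something like the following to print connected immediately and timeout two seconds later (TEST_PORT is a value I made up for this check).

var net = require('net');

var TEST_PORT = 4000; // made-up port for a local sanity check

// Listen locally, then connect to ourselves: 'connect' should fire
// right away, and 'timeout' only after 2 seconds of inactivity.
var server = net.createServer().listen(TEST_PORT, function () {
    var socket = new net.Socket();

    socket.setTimeout(2000, function () {
        console.log('timeout'); // expected ~2s after connect
        socket.destroy();
        server.close();
    });

    socket.on('connect', function () {
        console.log('connected'); // expected immediately
    });

    socket.connect(TEST_PORT, '127.0.0.1');
});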
Any ideas? Could it be something with the specific server setup? I'm totally lost.