java - Why does setting SO_TIMEOUT cause final read of SocketInputStream to return immediately?


I'm working on a test harness that writes bytes to a socket server and reads the response. I had a problem where the last read of the socket's InputStream paused for 20 seconds. I fixed that, but I don't understand why the fix worked.

The following method is given a java.net.SocketInputStream. The call to read(byte[], int, int) was pausing for 20 seconds on the final read, the one that returns -1, indicating end-of-stream.

    private String getResponse(InputStream in) throws IOException {
        StringBuffer buffer = new StringBuffer();
        ByteArrayOutputStream bout = new ByteArrayOutputStream();
        byte[] data = new byte[1024];
        int bytesRead = 0;
        while (bytesRead >= 0) {
            bytesRead = in.read(data, 0, 1024);    // paused here on last read
            if (bytesRead > 0) {
                bout.write(data, 0, bytesRead);
            }
            buffer.append(new String(data));
        }
        return buffer.toString();
    }

I was able to make the pause go away by setting SO_TIMEOUT on the socket. It doesn't seem to matter what value it's set to. With socket.setSoTimeout(60000), the problem read in the method above returns immediately at end-of-stream.

What's happening here? Why does setting SO_TIMEOUT, even to a high value, cause the final read on SocketInputStream to return immediately?

This sounds implausible. Setting a socket timeout shouldn't have that effect.

I think the real explanation is that you changed something else at the same time, and that is what fixed the pauses. (If I had to guess, the server is now closing the socket when it wasn't doing so before.)
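To illustrate why SO_TIMEOUT can't be the fix: what it actually does is make a blocked read() throw SocketTimeoutException when no data arrives in time; it never turns a blocked read into an end-of-stream return of -1. A minimal sketch (my own assumed localhost setup, not your harness) of a server that accepts but stays silent:

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class TimeoutDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // ephemeral port
            // Server thread: accept the connection, then neither write nor close.
            Thread t = new Thread(() -> {
                try (Socket s = server.accept()) {
                    Thread.sleep(2000); // hold the connection open, silent
                } catch (Exception ignored) {
                }
            });
            t.start();

            try (Socket client = new Socket("localhost", server.getLocalPort())) {
                client.setSoTimeout(200); // 200 ms read timeout
                InputStream in = client.getInputStream();
                try {
                    in.read(new byte[1024]); // blocks: no data is coming
                    System.out.println("read returned");
                } catch (SocketTimeoutException e) {
                    System.out.println("SocketTimeoutException, not end-of-stream");
                }
            }
            t.join();
        }
    }
}
```

The read does not return -1 when the timeout expires; it throws. So a timeout cannot explain a read that now *returns* promptly at end-of-stream.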

If that doesn't help, you need to provide an SSCCE that other people can run to observe the effect, and tell us what platform you are using.
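Something like the following sketch (assumed server behavior, not your actual harness) shows the effect I'm guessing at: once the server closes its end of the connection, the client's read() returns -1 immediately, whether or not SO_TIMEOUT is set.

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class CloseDemo {
    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0); // ephemeral port
        // Server thread: write a reply, then close (via try-with-resources).
        Thread t = new Thread(() -> {
            try (Socket s = server.accept()) {
                OutputStream out = s.getOutputStream();
                out.write("hello".getBytes());
                out.flush();
            } catch (Exception e) {
                e.printStackTrace();
            }
            // closing the socket here is what signals end-of-stream to the client
        });
        t.start();

        try (Socket client = new Socket("localhost", server.getLocalPort())) {
            client.setSoTimeout(60000); // irrelevant once the peer has closed
            InputStream in = client.getInputStream();
            byte[] buf = new byte[1024];
            int n;
            while ((n = in.read(buf, 0, buf.length)) >= 0) {
                System.out.println("read " + n + " bytes");
            }
            System.out.println("end-of-stream");
        }
        t.join();
        server.close();
    }
}
```

With the server's close in place, the read loop ends promptly; comment out the close (e.g. keep the socket open and sleep instead) and the final read blocks until the connection is torn down some other way.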

