java - DatagramChannel.receive() seems to buffer packets
I have a server sending UDP packets of 40 bytes every ~50 milliseconds to a client. The client receives these packets on time most of the time. However, every ~1-3 seconds no packets arrive for ~500 ms, and then the following packets arrive with ~0 ms delay between them, as if the packets had been buffered somewhere.
The following code is called in a separate thread whose sole purpose is receiving packets:
public synchronized int readUDP(DatagramChannel channel) throws IOException {
    InetSocketAddress sourceAddress = (InetSocketAddress) channel.receive(in);
    bytesRead = in.position();
    in.flip();
    // read data out of the buffer...
}
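To make the setup reproducible, here is a minimal self-contained sketch of the same measurement over loopback. The class name, port, and packet count are my own choices, not from the original post; it sends 40-byte datagrams every ~50 ms and prints the inter-arrival deltas the way the post measures them:

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;

public class UdpTiming {
    // Sends `count` 40-byte datagrams every ~50 ms over loopback and
    // prints the inter-arrival delta seen by a blocking receive() loop.
    // Returns the number of datagrams received.
    public static int run(int port, int count) throws Exception {
        InetSocketAddress addr = new InetSocketAddress("127.0.0.1", port);
        DatagramChannel receiver = DatagramChannel.open().bind(addr);

        Thread sender = new Thread(() -> {
            try (DatagramChannel ch = DatagramChannel.open()) {
                ByteBuffer out = ByteBuffer.allocate(40);
                for (int i = 0; i < count; i++) {
                    out.clear();
                    out.putInt(i);    // sequence number
                    out.position(40); // pad payload to 40 bytes
                    out.flip();
                    ch.send(out, addr);
                    Thread.sleep(50);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        sender.start();

        ByteBuffer in = ByteBuffer.allocate(1024);
        long last = System.nanoTime();
        int received = 0;
        for (int i = 0; i < count; i++) {
            in.clear();
            receiver.receive(in); // blocking mode, as in the question
            long now = System.nanoTime();
            System.out.println("time: " + (now - last) / 1_000_000
                    + " bytes: " + in.position());
            last = now;
            received++;
        }
        sender.join();
        receiver.close();
        return received;
    }

    public static void main(String[] args) throws Exception {
        run(9876, 10);
    }
}
```

On an idle loopback this should print deltas near 50 ms; a stall in the printed deltas followed by a burst of 0 ms values would reproduce the glitch described above.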
Here are my notes from debugging:
- The server runs on the same computer as the client, and communication is one-way (server -> client). Traffic goes from 127.0.0.1 to 127.0.0.1, so there is no network interference.
- Using RawCap I have captured the packets and their timestamps; no oddities are visible, and the delta times between packets in the capture are ~50 ms.
- The DatagramChannel is in blocking mode.
- The client is simultaneously connected to the server over TCP, but no packets are actively sent on that channel.
- The buffer the packets are stored in is a ByteBuffer with a capacity of 1024 bytes.
- The position of the buffer before calling channel.receive(in) is 0.
- The position of the buffer after calling channel.receive(in) is 40.
- The time between a packet being read and execution arriving at the same code section again is ~0 ms. Therefore, I conclude that channel.receive(in) itself sometimes takes ~500 ms.
- The function readUDP is called from only one place.
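The burst of ~0 ms arrivals after the stall is consistent with datagrams queuing in the kernel socket receive buffer while the thread is not reading. One diagnostic worth running is to check what SO_RCVBUF the OS actually grants, so the buffer is at least ruled out as the bottleneck. A sketch, assuming nothing about the original code (class name and the 1 MiB request size are arbitrary):

```java
import java.net.StandardSocketOptions;
import java.nio.channels.DatagramChannel;

public class RcvBufCheck {
    // Requests a larger kernel receive buffer and reports what the OS granted.
    // The OS may cap or adjust the requested size, so always read it back.
    public static int requestedRcvBuf(int bytes) throws Exception {
        try (DatagramChannel ch = DatagramChannel.open()) {
            int before = ch.getOption(StandardSocketOptions.SO_RCVBUF);
            ch.setOption(StandardSocketOptions.SO_RCVBUF, bytes);
            int after = ch.getOption(StandardSocketOptions.SO_RCVBUF);
            System.out.println("SO_RCVBUF before=" + before + " after=" + after);
            return after;
        }
    }

    public static void main(String[] args) throws Exception {
        requestedRcvBuf(1 << 20); // ask for 1 MiB (hypothetical size)
    }
}
```

Note that even a tiny default buffer easily holds a dozen 40-byte datagrams, so if the granted size looks sane, the suspicion shifts to the receiving thread not being scheduled for ~500 ms (GC pause, timer resolution, OS scheduling) rather than to the socket itself.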
A typical glitch looks like this:
time: 49
time: 51
time: 49
time: 51
time: 307
time: 0
time: 0
time: 0
time: 0
time: 1
time: 41
time: 51
time: 49
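A trace like this cannot distinguish packets that were delayed in flight from packets that sat in the receiver's socket buffer. One way to tell them apart is to stamp each datagram with the sender's System.nanoTime() and compute one-way latency on receipt, which is valid here because both ends run on the same machine and share a clock. A sketch under that assumption (class name, port, and payload layout are mine):

```java
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;

public class OneWayLatency {
    // Sends one timestamped 40-byte datagram over loopback and returns
    // the measured one-way latency in milliseconds.
    public static long measureOnce(int port) throws Exception {
        InetSocketAddress addr = new InetSocketAddress("127.0.0.1", port);
        try (DatagramChannel rx = DatagramChannel.open().bind(addr);
             DatagramChannel tx = DatagramChannel.open()) {
            ByteBuffer out = ByteBuffer.allocate(40);
            out.putLong(System.nanoTime()); // send timestamp in the payload
            out.position(40);               // pad to 40 bytes
            out.flip();
            tx.send(out, addr);

            ByteBuffer in = ByteBuffer.allocate(1024);
            rx.receive(in);                 // blocks until the datagram arrives
            in.flip();
            long sentAt = in.getLong();
            return (System.nanoTime() - sentAt) / 1_000_000;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("one-way latency ms: " + measureOnce(9877));
    }
}
```

If the glitch packets show large, steadily decreasing per-packet latencies, they were queued and then drained in a burst; if their latencies are all near zero, the delay is on the sending side instead.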
I would appreciate it if someone could examine this with a fresh pair of eyes. The only other close case I could find is here: http://www.coderanch.com/t/476539/android/mobile/udp-datagramsocket-receive-glitches. I should note, though, that my code does not interact with Android in any way.