It works perfectly fine (the server notices when the client disconnects) when launched without Mono on Windows. When using Mono on Windows or Linux it just doesn't work: the server doesn't notice that the client has disconnected.
TcpClient socket = new TcpClient();
socket.Connect(host, port); // host/port are placeholders; GetStream() throws if the client is not connected
NetworkStream stream = socket.GetStream();
I've observed that TcpClient also doesn't notice when the server closes the connection from its end. Attempting to read from or write to the stream eventually times out, but the `Connected` property never changes and nothing else indicates that the connection has been lost.
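Until this is fixed, a common workaround (a sketch using the standard `Socket` API, not anything Mono-specific) is to probe the underlying socket directly: when the peer has performed an orderly close, `Poll(0, SelectMode.SelectRead)` reports the socket readable while `Available` is 0. Note this detects a received FIN; it won't help with the RST-then-timeout case described below.

```csharp
using System;
using System.Net.Sockets;

static class SocketProbe
{
    // Heuristic: a socket that selects as readable but has zero bytes
    // available has received a FIN from the peer (or was never connected).
    public static bool IsDisconnected(Socket s)
    {
        try
        {
            return s.Poll(0, SelectMode.SelectRead) && s.Available == 0;
        }
        catch (SocketException)
        {
            return true; // hard error (e.g. connection reset)
        }
    }
}
```

Calling this before each read gives the server a chance to notice the disconnect even when `Connected` is stale.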
I could reproduce this on Mono 3.10.
If the server sends an RST, the client doesn't notice that the connection is closed; reads time out without `Connected` being set to false.
This makes for very bad network performance in Mono.
We need good network performance!
I hope someone looks at this soon.
This bug has existed long enough...
Have you tried setting the `socket.LingerState` property? The default behavior when calling `socket.Close()` is to simply call `close` on the socket and let the OS deal with lingering.
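For reference, here is a sketch of configuring `LingerState` before closing. A zero-second linger makes `Close()` abort the connection with an RST instead of the normal FIN/lingering shutdown; whether that changes what the remote end observes under Mono is exactly what's in question here.

```csharp
using System.Net.Sockets;

TcpClient socket = new TcpClient();
// Enabled = true with a 0-second timeout tells the OS to discard any
// unsent data and send an RST immediately when the socket is closed.
socket.LingerState = new LingerOption(true, 0);
// ... connect and use the socket, then:
socket.Close();
```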
If you can still reproduce this with the latest version of Mono, please attach the output of a run with the `MONO_LOG_MASK=io-layer` and `MONO_LOG_LEVEL=debug` environment variables defined. This should give us some information about the different calls that happen on the socket.
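For example, the variables can be set for a single run like this (`Server.exe` is a placeholder for whatever binary reproduces the issue):

```shell
MONO_LOG_MASK=io-layer MONO_LOG_LEVEL=debug mono Server.exe 2> mono-io.log
```

The debug log goes to stderr, so redirecting it to a file makes it easy to attach here.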