Hi everyone. I'm facing an unfamiliar situation here.
I'm trying to build a self-updating program using Lacewing as the underlying network protocol, but I've hit a problem when two testers (who are connected over the internet) tried to use it.
The binary could be considered large (about 2-3 MB), and initially I was sending it "all in one go". This works fine locally and over LAN, but it "times out" over the internet. (I'm not actually sure how to detect a timeout or an unreachable host, so I use a 10-second counter before telling the user it gave up.) The server also sends a hash and the expected byte count as a form of verification. The testers got hash verification errors, or it simply "timed out", every time.
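For context, the verification step works roughly like this (a simplified sketch, not my actual Lacewing code; the function names and the choice of SHA-256 are just illustrative):

```python
# Sketch of the length + hash verification: the server announces the
# expected byte count and a hash before the transfer, and the client
# checks both once it thinks it has everything.
import hashlib

def make_manifest(data: bytes) -> tuple[int, str]:
    """Server side: length and hash sent ahead of the binary."""
    return len(data), hashlib.sha256(data).hexdigest()

def verify(received: bytes, expected_len: int, expected_hash: str) -> bool:
    """Client side: reject the update if either check fails."""
    return (len(received) == expected_len
            and hashlib.sha256(received).hexdigest() == expected_hash)

binary = b"\x7fELF" + b"\x00" * 1000  # stand-in for the real update binary
length, digest = make_manifest(binary)
assert verify(binary, length, digest)
assert not verify(binary[:-1], length, digest)  # truncated transfer is caught
```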
I have since updated this to send the binary in separate chunks, using this thread as a guide, appending each chunk to a file as it arrives on the client. This slowed the process down by a few seconds over LAN and locally, but I assumed it would correct the problem. I have also added an "x KB of y KB" readout and a progress bar so the user can see the progress. Each chunk is 16384 bytes (16 KB) in size.
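The chunked approach is roughly equivalent to this sketch (again illustrative, not my actual code; in the real client each piece is appended to a file rather than a buffer):

```python
# Sketch of the chunked transfer: the binary is split into 16 KB pieces,
# and the client appends each piece in order, reporting progress from the
# running total and the size the server announced up front.
CHUNK_SIZE = 16384

def split_chunks(data: bytes) -> list[bytes]:
    """Server side: slice the binary into CHUNK_SIZE pieces."""
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

def reassemble(chunks: list[bytes], expected_len: int) -> bytes:
    """Client side: append chunks in order, printing the progress readout."""
    out = bytearray()
    for chunk in chunks:
        out += chunk  # the real client appends to a file here
        print(f"{len(out) // 1024} KB of {expected_len // 1024} KB")
    return bytes(out)

data = bytes(range(256)) * 200  # ~50 KB stand-in for the binary
assert reassemble(split_chunks(data), len(data)) == data
```

Note that the "y KB" half of the readout only makes sense if the announced total size actually arrives before the chunks do, which may be relevant to the symptom below.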
However, when this was tested again, it oddly reported "26 KB of 0 KB" to one tester before it "timed out" again. The server front end (on my side) looked like it was sending normally, although it looped a few times (which doesn't happen locally) before the server program crashed altogether.
I'm not sure what's going on here. I presume the packets are being corrupted or re-sent. Has anyone had experience with sending larger (> 2 MB) binaries with Lacewing? Things seem to be perfectly fine locally or over LAN, but not over the internet.
I'm pretty sure the internet connection can't be at fault here; it has ~2.3 MB/s (17.70 Mbps) upload.
I'll probably try simulating a slow connection, lag, and throttling tomorrow to see if I can reproduce anything. I have also discovered that both the server and the client will freeze (and possibly crash) if transfers are too large.
Thanks in advance!