Remote TCP Socket question


Hi all,
I'm trying to develop an application around this idea: get a GPS position every second and send it to a remote server over a TCP socket.

But I have observed that with a TCP socket (not in listening mode) I must wait for a DataRequest before I can send data, so I have to send the stored GPS positions when the DataRequest callback fires (also respecting maxlen).

So I don't know what the minimum and maximum interval between two DataRequest calls is in the general case, because I have to synchronise the GPS polling timer with the DataRequest send callback.
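One common way to avoid depending on the DataRequest timing at all is to decouple the two sides with a queue: the GPS timer only enqueues encoded fixes, and the DataRequest callback drains as much of the queue as maxlen allows. Then it doesn't matter whether DataRequest fires faster or slower than the 1 s polling rate. A minimal sketch of that pattern (the method names `on_gps_fix` and `on_data_request` are hypothetical stand-ins for your SDK's timer and socket callbacks, not part of any real API):

```python
from collections import deque

class GpsSendBuffer:
    """Decouples the 1 s GPS polling timer from the socket's
    DataRequest callback by queueing encoded fixes in between."""

    def __init__(self):
        self.pending = deque()  # encoded position records, oldest first

    def on_gps_fix(self, lat, lon):
        # Called by the GPS timer every second: just enqueue, never send.
        self.pending.append(f"{lat:.5f},{lon:.5f}\n".encode("ascii"))

    def on_data_request(self, maxlen):
        # Called whenever the stack is ready for outgoing data.
        # Return at most maxlen bytes; anything left over stays queued
        # for the next DataRequest.
        out = bytearray()
        while self.pending and len(out) + len(self.pending[0]) <= maxlen:
            out += self.pending.popleft()
        # If a single record is larger than maxlen, split it across calls.
        if not out and self.pending:
            head = self.pending.popleft()
            out += head[:maxlen]
            self.pending.appendleft(head[maxlen:])
        return bytes(out)
```

With this, a slow DataRequest just means several fixes go out in one chunk, and a fast one means some calls return an empty buffer (send nothing); no assumption about the interval between calls is needed.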