I’ve gone down this same path, but I can’t explain the delay I need to insert before the EOF pattern to make it work. Here is the conversation with no delay (unprintable ASCII is shown as <hex>):
tx: AT&K3<d><a>
rx: <d><a>OK<d><a>
tx: AT+KCNXCFG=1,"GPRS",m2m-east.telus.iot<d><a>
rx: <d><a>OK<d><a>
tx: AT+KCNXTIMER=1,60,1,60<d><a>
rx: <d><a>OK<d><a>
tx: AT+KUDPCFG=1,0,,0<d><a>
rx: <d><a>+KUDPCFG: 1<d><a>
rx: <d><a>+KCNX_IND: 1,4,1<d><a>
rx: <d><a>+KCNX_IND: 1,1,0<d><a>
rx: <d><a>+KUDP_IND: 1,1<d><a>
tx: AT+KUDPSND=1,pool.ntp.org,123,48<d><a>
rx: <d><a>CONNECT<d><a>
tx: <1b><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0>--EOF--Pattern--
rx: <d><a>OK<d><a>
rx: <d><a>+KUDP_NOTIF: 1,8<d><a>
rx: <d><a>+KUDP_DATA: 1,48<d><a>
rx: <d><a>+KUDP_RCV: 1,"192.99.2.8",123,48<d><a>
tx: AT+KUDPRCV=1,48<d><a>
rx: <d><a>+CME ERROR: 923<d><a>
tx: AT+KUDPCLOSE=1<d><a>
rx: <d><a>OK<d><a>
Note the “+KUDP_NOTIF: 1,8”, which indicates that not enough characters were sent. If I add a delay of 250 ms or more before sending the EOF sequence, I don’t get this notification and everything runs fine. I am monitoring the modem’s pins independently of the driver, so I can see exactly what crosses the wire. The link runs at 115200.
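For reference, here is roughly what my send path looks like with the workaround in place (a POSIX-style sketch; the helper names and error handling are simplified, not my actual driver code):

#include <string.h>
#include <unistd.h>

#define EOF_PATTERN   "--EOF--Pattern--"
#define EOF_DELAY_US  (500 * 1000)   /* double the 250 ms empirical minimum */

/* Send the 48-byte SNTP request once KUDPSND reports CONNECT.
 * First byte 0x1B = LI 0, VN 3, Mode 3 (client); the rest are zero. */
static int send_ntp_request(int uart_fd)
{
    unsigned char pkt[48] = { 0x1B };   /* remaining bytes initialize to 0 */

    if (write(uart_fd, pkt, sizeof pkt) != (ssize_t)sizeof pkt)
        return -1;

    /* Without this pause the modem raises +KUDP_NOTIF: 1,8
     * (not enough characters sent); this is the delay in question. */
    usleep(EOF_DELAY_US);

    if (write(uart_fd, EOF_PATTERN, strlen(EOF_PATTERN)) < 0)
        return -1;

    return 0;
}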
Here is the same conversation, with the only change being a 500 ms delay added before sending the EOF pattern:
tx: AT&K3<d><a>
rx: <d><a>OK<d><a>
tx: AT+KCNXCFG=1,"GPRS",m2m-east.telus.iot<d><a>
rx: <d><a>OK<d><a>
tx: AT+KCNXTIMER=1,60,1,60<d><a>
rx: <d><a>OK<d><a>
tx: AT+KUDPCFG=1,0<d><a>
rx: <d><a>+KUDPCFG: 1<d><a>
rx: <d><a>OK<d><a>
rx: <d><a>+KCNX_IND: 1,4,1<d><a>
rx: <d><a>+KCNX_IND: 1,1,0<d><a>
rx: <d><a>+KUDP_IND: 1,1<d><a>
rx: <d><a>CONNECT<d><a>
tx: AT+KUDPSND=1,pool.ntp.org,123,48<d><a><1b><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0><0>
// don't send anything for at least 250ms
tx: --EOF--Pattern--
rx: <d><a>OK<d><a>
rx: <d><a>+KUDP_DATA: 1,48<d><a>
rx: <d><a>+KUDP_RCV: 1,"209.115.181.107",123,48<d><a>
tx: AT+KUDPRCV=1,48<d><a>
rx: <d><a>CONNECT<d><a>
rx: <1c><2><3><e8><0><0><9>#<0><0><5>t<ce>l<0><83><e0><d9><d9><a5><cb><ad><d3><<0><0><0><0><0><0><0><0><e0><d9><db>+<fb>u<e8><15><e0><d9><db>+<fb>x<90><1>--EOF--Pattern--
tx: AT+KUDPCLOSE=1<d><a>
rx: <d><a>OK<d><a>
Having to insert an empirically derived delay doesn’t sit well with me; in my released code I’ve doubled it just to add a safety margin. The cell modem is an HL8548 with firmware 5.5.18.
The modem conversations above show every character exchanged, either as printable ASCII or as a <hex> code; some output has been split onto separate lines for ease of human parsing. So the letter a is printed as ‘a’, but a line feed, which is ASCII code 10, is printed in hex as <a>. All printable ASCII characters, including space (32-127), are printed literally.
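For completeness, the traces were produced with a trivial per-byte logger along these lines (illustrative only, not my exact code):

#include <stdio.h>

/* Print one byte the way the traces above show it: printable ASCII
 * (32-127, including space) literally, anything else as <hex>. */
static void log_byte(unsigned char c)
{
    if (c >= 32 && c <= 127)
        putchar(c);
    else
        printf("<%x>", c);
}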
Can anybody offer guidance on the nature of this delay? Thanks.