RS-232 performance

I implemented a serial interface using FCM, as per https://forum.sierrawireless.com/t/simple-uart-echo-program/4739/1 (thanks josepvr); the AT command version in the examples was :laughing:

I then buffer the data from the UART receive events; this buffer is read by a timer event running at the maximum rate, every 18 ms.
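
Roughly, the receive path looks like this: the FCM data handler copies whatever arrives into a ring buffer, and the cyclic timer drains it. A minimal sketch only; the buffer size and names are mine, and the hook-up to adl_fcmSubscribe() / adl_tmrSubscribe() is left out rather than guessing at the exact ADL prototypes:

```c
/* Minimal sketch of the receive path described above: the FCM data
 * handler copies whatever FCM delivers into a ring buffer, and the
 * ~18 ms cyclic timer drains it.  Buffer size and names are mine;
 * the glue to adl_fcmSubscribe()/adl_tmrSubscribe() is omitted. */
typedef unsigned char  u8;
typedef unsigned short u16;

#define RX_BUF_SIZE 512          /* elastic buffer between FCM and my code */

static u8  rx_buf[RX_BUF_SIZE];
static u16 rx_head = 0;          /* next write position (FCM side)  */
static u16 rx_tail = 0;          /* next read position (timer side) */

/* called from the FCM data handler with each chunk FCM delivers */
static void uart_rx_store(u16 len, u8 *data)
{
    u16 i;
    for (i = 0; i < len; i++) {
        rx_buf[rx_head] = data[i];
        rx_head = (u16)((rx_head + 1) % RX_BUF_SIZE);
        /* no overflow handling here -- a real version must notice
         * rx_head catching up with rx_tail */
    }
}

/* called from the cyclic timer handler every ~18 ms */
static u16 uart_rx_read(u8 *out, u16 max)
{
    u16 n = 0;
    while (rx_tail != rx_head && n < max) {
        out[n++] = rx_buf[rx_tail];
        rx_tail = (u16)((rx_tail + 1) % RX_BUF_SIZE);
    }
    return n;                    /* number of bytes consumed this tick */
}
```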

If I run at 9600 baud I expected to see ~18 bytes per timer period; however, I seem to see bursts of precisely 56 bytes at random intervals.
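
(For reference: 9600 baud with ~10 bits per character is about 960 characters per second, so an 18 ms period should carry roughly 960 × 0.018 ≈ 17 bytes.)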

If I then change to 115 kbaud I still seem to see bursts of precisely 56 bytes at random intervals.

Is this normal? :neutral_face:

Yes:

No cigar, awneil, I did read the manual.

It seems I have two devices behaving differently:

One receives 56 bytes at a time, significantly slower than the incoming rate; it also often loses the first byte and misbehaves at adl_fcmSubscribe().

The other device reliably receives 120 bytes at more or less the incoming rate.

So it seems to be a firmware issue

Both seem to have some logic whereby they only pass incoming data on when either the buffer is full or some undisclosed inter-character timeout expires.
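
In pseudo-C, the apparent rule would be something like this. This is only my guess: the names and the timeout value are made up, and only the 56/120 buffer sizes come from what I observed:

```c
/* Guess at the module's apparent forwarding rule: hold bytes until the
 * internal buffer fills (56 or 120 bytes depending on device) or no new
 * byte arrives within some inter-character timeout.  Names and the
 * timeout value are made up; only the buffer sizes are observed. */
#define FWD_BUF_SIZE        56      /* 120 on the other device             */
#define INTER_CHAR_TIMEOUT  20      /* ms -- the real value is undisclosed */

static unsigned char fwd_buf[FWD_BUF_SIZE];
static unsigned int  fwd_len = 0;
static unsigned long last_rx_ms = 0;

static void deliver_to_application(unsigned char *data, unsigned int len)
{
    /* hand the accumulated block on to the FCM subscriber (stub) */
    (void)data; (void)len;
}

/* called for every byte taken off the UART */
static void on_uart_byte(unsigned char b, unsigned long now_ms)
{
    fwd_buf[fwd_len++] = b;
    last_rx_ms = now_ms;
    if (fwd_len == FWD_BUF_SIZE) {                  /* buffer full -> flush */
        deliver_to_application(fwd_buf, fwd_len);
        fwd_len = 0;
    }
}

/* called periodically by the firmware */
static void on_idle_poll(unsigned long now_ms)
{
    if (fwd_len > 0 && (now_ms - last_rx_ms) >= INTER_CHAR_TIMEOUT) {
        deliver_to_application(fwd_buf, fwd_len);   /* timeout -> flush */
        fwd_len = 0;
    }
}
```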

Yes, I think that is the logic.

The problem is that I need to implement exactly the same logic on my side; however, I now also need to compensate for the fact that they are already doing it.

This means my timeout must be greater than the character time of a full buffer: ~56 ms for a 56-byte buffer and ~120 ms for a 120-byte buffer at 9600 baud, and it has to track any future buffer or baud-rate changes :exclamation:
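
In other words, the receive-side timeout has to be derived from the baud rate and the module's buffer size rather than hard-coded. A small sketch; the 10 bits per character assumes 8N1 framing, and the margin is my own guess:

```c
/* Receive-side timeout: the module may sit on a full internal buffer for
 * buffer_size character times before passing anything on, so our timeout
 * must be at least that long.  Assumes 8N1 framing (~10 bits/character);
 * the safety margin is an arbitrary choice. */
static unsigned long rx_timeout_ms(unsigned long baud,
                                   unsigned long module_buf_size)
{
    unsigned long buf_time_ms = (module_buf_size * 10UL * 1000UL) / baud;
    unsigned long margin_ms   = 10;
    return buf_time_ms + margin_ms;
}

/* e.g. rx_timeout_ms(9600, 56)  ->  58 + 10 =  68 ms
 *      rx_timeout_ms(9600, 120) -> 125 + 10 = 135 ms */
```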

ooh, the joys of programming someone else's API :imp:

Why?

Can't you just have FCM writing to a buffer using whatever logic it likes, and your side reading from that buffer using whatever logic it likes?

One of the key uses of "buffers" is to be "elastic" enough to take up the slack like this…