Casting u8 to ascii

Hi guys,

What is the accepted standard way of converting a u8 buffer to an ascii string, and vice versa? i.e. in WIP, when you send/receive data it is done as u8, but all the debug and string functions take ascii :(

I would have assumed there was a function defined in the ADL, but for the life of me I can't find one :(

I understand it is unsafe to do a straight cast to ascii since I am assuming ascii is s8.

If I can guarantee from a logical perspective that the data is in fact ASCII, is it safe to do so though? i.e. the data stream will always be ASCII characters.

Why would you assume that?

Don’t assume - look it up and confirm it for sure!

If you know what is sending the data, and you know that it is always valid ASCII text data, you could make that assumption.

WIP neither knows nor cares about the meaning of the data - it’s all just bytes to WIP.

Only you can know, from the application perspective, if you have any special case that will guarantee this.

You do know what specifically constitutes a string in the ‘C’ programming language…?

It’s a general ‘C’ thing - nothing specifically to do with ADL.

Awneil,

Thanks for your response.

I understand ‘C’ strings just fine, even though I am a Java developer and have been spoiled for a long time! :)

When I say I am assuming, I should point out that I have referred to wm_types.h and can see that u8 is a uint8_t when using GCC, and that ascii is defined as char. My understanding was that char is defined as a u8 at the lower level. This is where I need to assume, as I have never come across this before and am struggling. I understand there is the possibility it could be defined as a signed type as well, however.

What you’re saying is: if I have an ascii array, I can pass that to WIP and disregard the signedness warning I receive from the compiler? And vice versa, if I can guarantee the response is an ASCII char array, I can safely cast the u8 buffer to ascii with the usual (ascii *)buffer syntax?

Thanks for your help

Correct.

Incorrect: char is a standard part of the ‘C’ programming language;
so it’s the other way around - u8 is defined in terms of char (typically as unsigned char)!

See:
c-faq.com/decl/exactsizes.html
c-faq.com/decl/int16.html
keil.com/forum/14769/

Also:
keil.com/forum/16968/
keil.com/forum/5112/

Yes - although it would be better to use an explicit cast so that you don’t get the warning.

The ascii definition just exists because the standard ‘C’ library uses plain char in the string function prototypes - see: keil.com/forum/14769/

As you can see, this is a general ‘C’ thing - nothing specifically to do with Sierra Wireless or Open-AT or ADL.

Ah OK, it’s making a bit more sense now.

Explicit casting is what I was getting at.

Regarding the u8/char relationship, I was under the impression that historically it was the other way around. Either way, looks good - I can work with this.

Thanks for your help

Matt

No: char has always been part of the ‘C’ language; u8 is not, and never has been, part of the language.