play generated voice stream in a call



I need to play an audio stream in a GSM call.
I looked at the PCM play and record example, but I cannot see how to upload the audio stream to play.
Is there a way to do that?

  1. start a call
  2. wait for answer
  3. upload custom audio stream
  4. play uploaded audio stream

Maybe using a Lua script?

thanks for the help!



I’ve just finished implementing this. A bit tricky, but works like a charm.

You essentially have to record the audio you want to play as 8 kHz or 16 kHz, 16-bit, signed, mono PCM, then find a method of loading the raw data onto the Q26xx.

As it turns out, the PCM format is the same as a mono WAV file with all the headers stripped off. I wrote a small app to strip the headers off a WAV file and then write the contents out as a constant C array, i.e.

const u8 pcm[] = { 0x00, /* ... */ };

I then manually added the resulting file to my M2M studio project, and used the array as the source for the PCM playback.
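A minimal sketch of the header-stripping idea (my own illustration, not Dave’s actual app; it assumes the canonical 44-byte RIFF header with no extra chunks, whereas a robust tool would walk the chunk list):

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical helper: given a canonical PCM WAV file image in memory,
 * return a pointer to the raw sample data and its length.
 * ASSUMPTION: the common 44-byte header layout (RIFF/fmt/data with no
 * extra chunks). Real-world WAV files can carry additional chunks. */
static const unsigned char *strip_wav_header(const unsigned char *wav,
                                             size_t wav_len,
                                             size_t *pcm_len)
{
    const size_t HEADER = 44;
    if (wav_len < HEADER ||
        memcmp(wav, "RIFF", 4) != 0 ||
        memcmp(wav + 8, "WAVE", 4) != 0)
        return NULL;              /* not a canonical WAV image */
    *pcm_len = wav_len - HEADER;  /* everything after the header is PCM */
    return wav + HEADER;
}
```

The returned bytes are what would be dumped out as the `const` C array.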

ciao, Dave



OK, but I generate the audio stream dynamically.
With your approach the stream is uploaded once, as a constant compiled into the plugin, but I cannot re-upload the plugin before each call.

thanks anyway!



OK, so that’s even easier.

Generate your stream into RAM [i.e. streambuffer = adl_memGet(sizeof(u8) * 2 * streamlength);], then index through streambuffer in the same way the PCM playback sample does.
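The "index through the stream buffer" idea can be sketched in plain C, with the ADL specifics (adl_memGet, the IRQ subscription) stubbed out. Names like feed_next_chunk, stream_state_t and PLAY_BUF_SIZE are illustrative, not from the SDK:

```c
#include <stddef.h>
#include <string.h>

#define PLAY_BUF_SIZE 320          /* e.g. 20 ms of 8 kHz 16-bit audio */

typedef struct {
    const unsigned char *stream;   /* dynamically generated PCM in RAM */
    size_t size;                   /* total bytes in the stream */
    size_t read_index;             /* next byte to hand to the codec */
} stream_state_t;

/* What the Low-Level IRQ handler would do on each buffer-empty event:
 * copy the next slice of the generated stream into the fixed-size
 * playback buffer. Returns bytes copied; 0 means stream finished. */
static size_t feed_next_chunk(stream_state_t *s, unsigned char *play_buf)
{
    size_t remaining = s->size - s->read_index;
    size_t n = remaining < PLAY_BUF_SIZE ? remaining : PLAY_BUF_SIZE;
    memcpy(play_buf, s->stream + s->read_index, n);
    s->read_index += n;
    return n;
}
```

In the real application this logic sits inside the adl_audioPlayStream() low-level IRQ handler, as in the SDK sample.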

How you generate the sample is another issue though. You’ll either have to buy or implement your own text to speech engine…

ciao, Dave


thanks Dave,

My idea was to use a tool like Festival on a Linux box.
The problem is sending the stream to the Wavecom box and then using it in the plugin…
It seems that a Lua script can handle the upload through the IP stack, but then it is not clear how to use it in a call.




This looks quite interesting - I hadn’t come across it during my (obviously not exhaustive) search for text-to-speech apps. I’ll have to have a look at it. For the moment, I’m doing trials with the 8k voices from

That’s next on my list when I stomp on some other bugs.

I haven’t used Lua (everything done in OpenAT) - but it looks like you are more than 50% of the way there if you can get your clip from the Linux server to the Wavecom device.

My thoughts:

  1. Create clip on Linux box & convert to 8kHz (or 16kHz) RAW PCM
  2. Download to Wavecom Target using Lua (or HTTP, or something). Save it in FLASH, and remember the start address of your buffer
  3. Modify the sample provided with OpenAT to play back your downloaded buffer instead of the recorded buffer that the sample uses. I’ve got no idea how to do this from Lua though - if Lua doesn’t natively support PCM playback, you are going to have to get access to the Low and High Level Interrupt handlers that are used in an OpenAT adl_audioPlayStream() subscription.

Just watch you don’t run out of memory - raw PCM is uncompressed, and 8kHz 16-bit playback uses (2 x 8000) = 16000 bytes per second of audio…
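The memory budget is worth checking before downloading a clip; a trivial helper makes the arithmetic explicit (8000 samples/s × 2 bytes/sample for 8 kHz 16-bit mono):

```c
/* Back-of-envelope RAM/FLASH budget for raw PCM audio.
 * 8 kHz x 2 bytes (16-bit) mono = 16 000 bytes per second. */
static unsigned long pcm_bytes(unsigned long sample_rate_hz,
                               unsigned bytes_per_sample,
                               unsigned long seconds)
{
    return sample_rate_hz * bytes_per_sample * seconds;
}
```

So a 60-second announcement at 8 kHz already needs close to a megabyte of storage.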

ciao, Dave


Thanks for the help!
In the next few days I will try to upload the stream; if I make some progress I will post the results.

Sounds dangerous… I will be careful…

I think I have to call into the OpenAT plugin for playback… the Lua docs for the Wavecom API are not very extensive.




Cool. I’ll be watching out for your notes.

ciao, Dave


Hi Dave,

now I have finished my application!
One problem remains: my stream is mono, 8 kHz, 16-bit, recorded with the ‘espeak’ program, but my conversion from WAV to PCM is buggy.
I used sox for the conversion, but when I try to play the result the whole Open AT application crashes and the Fastrack resets.
I send/store the stream byte by byte from the PC to the Fastrack; maybe there is a little/big-endian problem?
sox reports the converted file as:

Channels       : 1
Sample Rate    : 8000
Precision      : 16-bit
Duration       : 00:00:01.50 = 12002 samples
Sample Encoding: 16-bit Unsigned Integer PCM
Endian Type    : little
Reverse Nibbles: no
Reverse Bits   : no

What I don’t understand is why the whole app crashes: if the stream were badly encoded I should just hear noise, but instead I hear a click in the headset and then the crash (which I don’t hear :laughing:).
I store the data received from the socket in a dynamically allocated buffer, then in the low IRQ handler I copy the samples like this:

bool appPlayLowIrqHandler(adl_irqID_e Source,
		adl_irqNotificationLevel_e NotificationLevel, adl_irqEventData_t *Data) {
	int size = min(BufferSizePlay, (listen_play_buffer.size
			- listen_play_buffer.read_index));
	wm_memcpy(StreamBufferPlay, listen_play_buffer.buffer
			+ listen_play_buffer.read_index, size);
	listen_play_buffer.read_index += size;
	if (listen_play_buffer.read_index < listen_play_buffer.size) {
		// Set BufferReady flag to TRUE
		*((adl_audioStream_t *) Data->SourceData)->BufferReady = TRUE;
		// Do NOT call the high level handler
		return FALSE;
	} else {
		// Call the high level handler
		return TRUE;
	}
}
Is there a way to understand what happens? I don’t think I run out of memory; all the adl_memGet calls I make return valid pointers.
The debugger doesn’t work because the UART is used by the PPP bearer, and the logs are lost after the Fastrack resets.

ciao, Luca



Well done.

I initially did so as well, and had some interesting issues - it seems that PCM ain’t always PCM. In the end I wrote my own app to simply strip off the RIFF header info from the WAV file and used that. I was also doing some compression on the PCM - but that’s another issue.
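One common "PCM ain’t always PCM" gotcha is signedness: sox reported the file above as 16-bit *unsigned* PCM, while a target expecting *signed* samples needs the sign bit flipped on each one. A sketch of that conversion (assuming the samples are already in host byte order; whether your target actually needs it is something to verify):

```c
#include <stddef.h>
#include <stdint.h>

/* Convert 16-bit unsigned PCM to 16-bit signed PCM in place.
 * Unsigned silence is 0x8000; signed silence is 0. The mapping is an
 * offset of 0x8000 per sample, i.e. flipping the top bit. */
static void pcm_unsigned_to_signed(uint16_t *samples, size_t count)
{
    size_t i;
    for (i = 0; i < count; i++)
        samples[i] ^= 0x8000;  /* 0x8000 (unsigned silence) -> 0 (signed) */
}
```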

Been there, done that as well. The ‘Click’ is the first “play buffer” of your audio being played back, then your app is crashing in the Low IRQ handler when you try to load the next section of PCM into the playback buffer.
I’ve only had a cursory glance at your code, but when I was having the same issue I ended up adding HEAPS of trace statements to both the IRQ handlers. Remember that the IRQ traces don’t come out on the normal trace level in TMT.
My problem was (and I suspect yours is as well), that the compiler was not doing the pointer offset calculations the way I expected it to do when filling the playback buffer with the next sample to playback. This mal-formed pointer maths ended-up pointing the buffer somewhere it shouldn’t be and BANG, a reboot. Solution was to explicitly cast pointer variables when doing the wm_memcpy() call…
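The pointer-maths trap Dave describes can be shown in miniature: adding a byte count to a u16* advances by *elements* (2 bytes each), silently doubling the offset, while casting to a byte pointer first makes the arithmetic do what was intended. Names here are illustrative, not from the sample code:

```c
#include <stddef.h>
#include <stdint.h>

/* Return the address byte_index bytes into a 16-bit sample buffer.
 * WRONG for a byte offset:  buf + byte_index
 *   (scales by sizeof(uint16_t), landing 2x too far into the buffer)
 * RIGHT: cast to a byte pointer before adding the offset. */
static const uint8_t *sample_bytes_at(const uint16_t *buf, size_t byte_index)
{
    return (const uint8_t *)buf + byte_index;
}
```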

If you are using the TMT or M2M Studio, I would have thought that the CMUX operation on the UART would have let you connect PPP to a COM port and TMT to the debug stream. Can you use USB or COM2? I do agree that it’s frustrating not being able to get the debug logs out of the module after a reboot though.

ciao, Dave


Of course, this can cause the segfault… and now I understand the strange way the memcpy is done in the sample PBM_buffer.c…

Yes, but is there an explanation of how to get/make a cable with the two connections?

It would be enough to have an API to access the resident memory… :wink:

many many thanks!

ciao, Luca



You would need 2 separate cables:

  1. Your current cable for your current connection;
  2. An additional cable to connect UART2 or the USB on your Fastrack to a suitable port on your PC.

For debugging, I would recommend UART2 over USB - due to (potential) issues with restarting the USB whenever the Fastrack restarts…

You are aware of the Backtrace API calls, aren’t you…?

Are you also aware that you can share a UART between the application and TMT Traces?
See: viewtopic.php?f=53&t=2509&p=11123&hilit=external+com#p11123


On my Fastrack I have only the D-Sub HD connector and the supplied cable, which has the serial connector and the headset plug on one side.
For 2 serial connections I would need 2 serial connectors coming out of the Fastrack; I suppose there is a special cable for that (splitting 2 serial ports out of the D-Sub HD), or there are other versions of the Fastrack with 2 connectors.

… no, I’m not aware :blush: . Where can I find the documentation about that? I found a ‘Fastrack_Supreme_Development_Kit_Quick_Starting_Guide.pdf’, but it doesn’t seem to be the one from the post.
Anyway, I know that I can assign one UART to the bearer and the other to tracing (in my case both are currently on UART1):

			WIP_NET_DEBUG_PORT_UART1, /* WIP traces on UART1 */


You’d need to add an “IESM” plug-in module to get the extra UART and USB.

For development & testing, I would certainly recommend that you have the extra UART & USB available for this very reason! 8)

It is (or was) in the “Tools Manual” - See the linked post.

I don’t know where the PDF docs are kept with M2MStudio - it might be easiest to download an older SDK, and find it in there.
Or just speak to your Distributor.


(logical location :stuck_out_tongue: )
or download the latest SDK, unzip the exe, and the docs are to be found in the ‘normal’ locations:
…\Wavecom Open AT Software Suite v2.30\OS\doc



Yes, I’ll buy it NOW!

Ah sorry, the post mentioned a ‘Tools Manual’, not the ‘User Guide’.


Sorry, my bad as well. I was just giving an example of the location where to find documentation.

more about the MUX can be found at
…\Wavecom Open AT Software Suite v2.30\Drivers\CMUX\DISK1\program files\Wavecom\MuxConfTool\Help



I tried returning TRUE from the Low IRQ handler without copying any samples, and the app still crashes… it seems the sound stream and the memcpy aren’t the culprits.
All the handles I get are valid pointers (audio, speaker, Low/High IRQ), and so is the buffer.
Could there be a conflict with WIP?



First of all: EUREKA!
The problem was a debug function I called from the LowIrqHandler; without that call there is no segfault and no reboot, just my sound stream in the headphones!
The debug function is called many times from the other functions in the program (WIP event handler, call handler, HighIrqHandler…), but in the low IRQ handler… it kills the program.

void DEBUGV(const char *fmt, va_list argp) {
	char buf[256];
	sprintf(buf, "%d) ", count);
	log_size += strlen(buf);
	if (log_size > LOG_BUFFER_SIZE) {
		log_size = strlen(buf);
		log_buffer[0] = '\0';
	}
	strcat(log_buffer, buf);
	vsprintf(buf, fmt, argp);
	log_size += strlen(buf);
	if (log_size > LOG_BUFFER_SIZE) {
		log_size = strlen(buf);
		log_buffer[0] = '\0';
	}
	strcat(log_buffer, buf);
}

void DEBUG(const char *fmt, ...) {
	va_list argp;
	va_start(argp, fmt);
	DEBUGV(fmt, argp);
	va_end(argp);
}


bool appPlayLowIrqHandler(adl_irqID_e Source, adl_irqNotificationLevel_e NotificationLevel, adl_irqEventData_t *Data) {
	DEBUG("appPlayLowIrqHandler\n"); // here it segfaults
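For comparison, here is a bounded sketch of the same logging idea (my own rewrite, not the original code): vsnprintf can never overrun the scratch buffer, and the scratch buffer is static rather than 256 bytes of IRQ stack. Even so, formatting this heavy generally does not belong in a Low-Level IRQ handler at all. LOG_BUFFER_SIZE and log_buffer mirror the names in the snippet above:

```c
#include <stdarg.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

#define LOG_BUFFER_SIZE 1024

static char log_buffer[LOG_BUFFER_SIZE];
static size_t log_size;

static void debug_append(const char *fmt, ...)
{
    static char scratch[128];              /* static: stays off the stack */
    va_list argp;
    int n;

    va_start(argp, fmt);
    n = vsnprintf(scratch, sizeof scratch, fmt, argp);
    va_end(argp);
    if (n < 0)
        return;
    if ((size_t)n >= sizeof scratch)       /* output was truncated */
        n = (int)sizeof scratch - 1;
    if (log_size + (size_t)n >= LOG_BUFFER_SIZE) {
        log_size = 0;                      /* buffer full: restart the log */
    }
    memcpy(log_buffer + log_size, scratch, (size_t)n + 1);
    log_size += (size_t)n;
}
```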

The stream converted with sox to .raw with the unsigned option seems to be correct.

thanks to all for the help!

next step dtmf recognition :slight_smile:


IRQ handlers - particularly the Low-Level IRQ handler - are not just like any other part of the program!

There are specific ADL restrictions on certain things that can’t be done in IRQ handlers, but you also need to take special care generally - just like you would in writing an interrupt handler on a “real” microcontroller…!
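The standard workaround on any such platform is to do almost nothing in the IRQ itself and defer the real work (tracing, formatting, allocation) to task context. A platform-neutral sketch of that pattern (on Open AT the "later" part would run from an ADL event handler; names and sizes are illustrative):

```c
#include <stdint.h>

/* Single-producer/single-consumer event ring: the IRQ handler pushes a
 * small event code, task context pops and does the heavy work later.
 * ASSUMPTION: exactly one producer (the IRQ) and one consumer (the
 * task), so no further locking is needed for this simple ring. */
#define EVT_RING_SIZE 16               /* must be a power of two */

static volatile uint32_t evt_ring[EVT_RING_SIZE];
static volatile unsigned evt_head, evt_tail;

/* Called from the IRQ handler: O(1), no library calls, no allocation. */
static int evt_push(uint32_t code)
{
    unsigned next = (evt_head + 1) & (EVT_RING_SIZE - 1);
    if (next == evt_tail)
        return 0;                      /* ring full: drop the event */
    evt_ring[evt_head] = code;
    evt_head = next;
    return 1;
}

/* Called from task context: drain and handle events at leisure. */
static int evt_pop(uint32_t *code)
{
    if (evt_tail == evt_head)
        return 0;                      /* nothing pending */
    *code = evt_ring[evt_tail];
    evt_tail = (evt_tail + 1) & (EVT_RING_SIZE - 1);
    return 1;
}
```

With this in place, the Low-Level IRQ handler only ever calls evt_push(), and all sprintf-style formatting happens safely outside interrupt context.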