I need to play an audio stream in a GSM call.
I looked at the PCM play and record example, but I cannot see how to upload the audio stream to play.
Is there a way to do that?
I’ve just finished implementing this. A bit tricky, but it works like a charm.
You essentially have to record the audio you want to play as 8kHz or 16kHz, 16bit, signed, Mono PCM, then find a method of loading the raw data onto the Q26xx.
As it turns out, the PCM format is the same as a mono WAV file with all the headers stripped off. I wrote a small app to strip the headers off a WAV file and then write the contents out as a constant C array, i.e.
const u8 pcm[] = { 0x00, ... };
I then manually added the resulting file to my M2M studio project, and used the array as the source for the PCM playback.
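For reference, a minimal sketch of that kind of header-stripping tool (plain desktop C, not Open AT code). It assumes the canonical 44-byte RIFF header and illustrative file/array names, so treat it as a starting point rather than the exact app I used:

/* strip_wav.c - hedged sketch: dump a mono 16-bit WAV's sample data as a C array.
 * Assumes the common 44-byte canonical RIFF header; real WAV files can carry
 * extra chunks, so check your files first. */
#include <stdio.h>

int main(int argc, char **argv)
{
    FILE *in, *out;
    int c, n = 0;

    if (argc != 3) {
        fprintf(stderr, "usage: %s input.wav output.c\n", argv[0]);
        return 1;
    }
    in  = fopen(argv[1], "rb");
    out = fopen(argv[2], "w");
    if (!in || !out) {
        fprintf(stderr, "cannot open files\n");
        return 1;
    }

    fseek(in, 44, SEEK_SET);                 /* skip the canonical RIFF header */
    fprintf(out, "const unsigned char pcm[] = {\n");
    while ((c = fgetc(in)) != EOF) {
        fprintf(out, "0x%02X, ", c);
        if (++n % 12 == 0)                   /* keep the generated lines readable */
            fprintf(out, "\n");
    }
    fprintf(out, "\n};\nconst unsigned int pcm_len = %d;\n", n);

    fclose(in);
    fclose(out);
    return 0;
}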
OK, but I generate the audio stream dynamically.
You upload the stream as a constant with the whole plugin once, but I cannot upload the plugin again before each call.
Generate your stream into RAM [i.e. streambuffer = adl_memGet(sizeof(u8) * 2 * streamlength)]; then index through the streambuffer in the same way as the PCM playback sample does (rough sketch below).
How you generate the sample is another issue though. You’ll either have to buy or implement your own text to speech engine…
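A rough sketch of that idea (the names streambuffer, stream_init, stream_append, chunk and chunkLen are illustrative, not taken from the SDK sample):

#include "adl_global.h"

/* Hedged sketch: build a dynamically generated stream in RAM instead of a
 * constant array. streamlength is the number of 16-bit samples. */
static u8  *streambuffer = NULL;
static u32  write_index  = 0;

void stream_init(u32 streamlength)
{
    streambuffer = adl_memGet(sizeof(u8) * 2 * streamlength);
    write_index  = 0;
}

/* Call this each time a chunk of generated audio arrives (socket, TTS output...) */
void stream_append(u8 *chunk, u32 chunkLen)
{
    wm_memcpy(streambuffer + write_index, chunk, chunkLen);
    write_index += chunkLen;
}

/* The Low-level IRQ handler then copies BufferSizePlay bytes at a time from
 * streambuffer into the playback buffer, exactly as the PCM playback sample
 * does with its recorded buffer. */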
My idea was to use tools like Festival on a Linux box.
The problem is sending the stream to the Wavecom box and then using it in the plugin…
It seems that a Lua script can handle the upload through the IP stack, but then it is not clear how to use it in a call.
This looks quite interesting - I hadn’t come across it during my (obviously not exhaustive) search for text to speech apps. I’ll have to have a look at it. For the moment, I’m doing trials with the 8k voices from cepstral.com
That’s next on my list when I stomp on some other bugs.
I haven’t used Lua (everything done in OpenAT) - but it looks like you are more than 50% of the way if you can get your clip from the linux server to the Wavecom device.
My thoughts:
Create clip on Linux box & convert to 8kHz (or 16kHz) RAW PCM
Download to Wavecom Target using Lua (or HTTP, or something). Save it in FLASH, and remember the start address of your buffer
Modify the sample provided with OpenAT to play back your downloaded buffer instead of the recorded buffer that the sample uses. Got no idea how to do this from Lua though - if Lua doesn’t natively support PCM playback, you are going to have to get access to the Low and High Level Interrupts that are called in an OpenAT adl_audioPlayStream() subscription.
Just watch you don’t run out of memory - raw PCM is uncompressed, and 8kHz 16-bit mono playback uses (2 x 8000) = 16,000 bytes per second of audio…
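To put rough numbers on that last point (back-of-the-envelope only):

/* 16-bit mono PCM at 8 kHz costs 2 bytes x 8000 samples = 16 000 bytes per
 * second, so even a short announcement adds up quickly. */
#define PCM_SAMPLE_RATE  8000u                             /* samples per second */
#define PCM_BYTES_PER_S  (2u * PCM_SAMPLE_RATE)            /* = 16 000 bytes/s   */
#define CLIP_SECONDS     10u                               /* example clip       */
#define CLIP_BYTES       (CLIP_SECONDS * PCM_BYTES_PER_S)  /* = 160 000 bytes    */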
Now I have finished my application!
One problem remains: my stream is mono, 8 kHz, 16-bit, recorded from the ‘espeak’ program, but my conversion from WAV to PCM is buggy.
I used sox for the conversion, but when I try to play the result the whole Open AT application crashes and the Fastrack resets.
I send/store the stream byte by byte from the PC to the Fastrack; maybe there is a little/big endian problem?
The file converted is:
Channels : 1
Sample Rate : 8000
Precision : 16-bit
Duration : 00:00:01.50 = 12002 samples
Sample Encoding: 16-bit Unsigned Integer PCM
Endian Type : little
Reverse Nibbles: no
Reverse Bits : no
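One way I could rule the byte order out (a rough sketch, with an illustrative helper name) would be to rebuild each 16-bit sample explicitly from the little-endian byte pairs received on the socket, so the result does not depend on the target CPU’s endianness:

#include "adl_global.h"

/* sox reports the raw file as little-endian, so the low byte arrives first. */
static s16 pcm_sample_from_le(const u8 *p)
{
    return (s16)((u16)p[0] | ((u16)p[1] << 8));   /* low byte first */
}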
What I don’t understand is why the whole app crashes: if the stream were badly encoded I should just hear noise, but instead I hear a click in the headset and then the crash (I don’t hear anything after that).
I store the data received from the socket in a dynamically allocated buffer, then in the Low IRQ handler I copy the samples in this way:
bool appPlayLowIrqHandler(adl_irqID_e Source,
                          adl_irqNotificationLevel_e NotificationLevel,
                          adl_irqEventData_t *Data) {
    int size = min(BufferSizePlay,
                   (listen_play_buffer.size - listen_play_buffer.read_index));
    wm_memcpy(StreamBufferPlay,
              listen_play_buffer.buffer + listen_play_buffer.read_index, size);
    listen_play_buffer.read_index += size;
    if (listen_play_buffer.read_index < listen_play_buffer.size) {
        // Set BufferReady flag to TRUE
        *((adl_audioStream_t *) Data->SourceData)->BufferReady = TRUE;
        // Do NOT call the high level handler
        return FALSE;
    } else {
        // Call High level handler
        return TRUE;
    }
}
Is there a way to understand what happens? I don’t think I’m running out of memory; all the adl_memGet calls I make return valid pointers.
The debugger doesn’t work because the UART is used by the PPP bearer, and the logs are lost after the Fastrack resets.
I initially did so as well, and had some interesting issues - it seems that PCM ain’t always PCM. In the end I wrote my own app to simply strip off the RIFF header info from the WAV file and used that. I was also doing some compression on the PCM - but that’s another issue.
Been there, done that as well. The ‘Click’ is the first “play buffer” of your audio being played back, then your app is crashing in the Low IRQ handler when you try to load the next section of PCM into the playback buffer.
I’ve only had a cursory glance at your code, but when I was having the same issue I ended up adding HEAPS of trace statements to both the IRQ handlers. Remember that the IRQ traces don’t come out on the normal trace level in TMT.
My problem was (and I suspect yours is as well) that the compiler was not doing the pointer offset calculations the way I expected when filling the playback buffer with the next sample to play back. This malformed pointer maths ended up pointing the buffer somewhere it shouldn’t be and BANG, a reboot. The solution was to explicitly cast pointer variables when doing the wm_memcpy() call…
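Something along these lines (just a sketch, reusing your variable names for illustration):

/* Cast explicitly so the read offset is applied as a byte offset, rather than
 * whatever pointer arithmetic the compiler infers for the buffer's type. */
wm_memcpy( (void *) StreamBufferPlay,
           (void *) ((u8 *) listen_play_buffer.buffer + listen_play_buffer.read_index),
           size );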
If you are using the TMT or M2M Studio, I would have thought that the CMUX operation on the UART would have let you connect PPP to a COM port and TMT to the debug stream. Can you use USB or COM2? I do agree that it’s frustrating not being able to get the debug logs out of the module after a reboot though.
On my Fastrack I have only the D-Sub HD connector and the supplied cable, which has the serial connector and the headset plug on one end.
For two serial connections I need two serial connectors coming out of the Fastrack; I suppose there is a special cable to do that (bringing two serial ports out of the D-Sub HD), or there are other versions of the Fastrack with two connectors.
… no, I wasn’t aware of that. Where can I find the doc about it? I found a ‘Fastrack_Supreme_Development_Kit_Quick_Starting_Guide.pdf’, but it doesn’t seem to be the one mentioned in the post.
Anyway, I know that I can assign one UART to the bearer and the other to tracing (in my case both are currently on UART1):
r = wip_netInitOpts(WIP_NET_OPT_SOCK_MAX, 12,
                    WIP_NET_OPT_DEBUG_PORT, WIP_NET_DEBUG_PORT_UART1, /* WIP traces on UART1 */
                    WIP_NET_OPT_LOG_EVENTS, TRUE,
                    WIP_NET_OPT_END);
You’d need to add an “IESM” plug-in module to get the extra UART and USB.
For development & testing, I would certainly recommend having the extra UART & USB available for this very reason! 8)
It is (or was) in the “Tools Manual” - See the linked post.
I don’t know where the PDF docs are kept with M2MStudio - it might be easiest to download an older SDK, and find it in there.
Or just speak to your Distributor.
[m2m-workspace-dir]\.metadata\.plugins\com.wavecom.openat.ide.spm.core\com.wavecom.openat.ide.spm.lib.os.model\6.30.0.00\doc
(logical location)
Or download the latest SDK, unzip the exe, and the docs are to be found in the ‘normal’ locations:
…\Wavecom Open AT Software Suite v2.30\OS\doc
I tried returning TRUE in the Low IRQ handler without copying any sample, and the result is that the app still crashes… so it seems the sound stream and the memcpy aren’t the culprits.
All the handles I get are valid pointers (audio, speaker, Low/High IRQ), and the buffer is too.
Could there be conflicts with WIP?
First of all: EUREKA!
It works!
The problem was a debug function I was calling in the LowIrqHandler. Without calling it: no segfault, no reboot, just my sound stream in the headphones!
The debug function is called several times in the other functions of the program (WIP event handler, call handler, HighIrqHandler…), but in the Low IRQ handler it kills the program.
IRQ handlers - particularly the Low-Level IRQ handler - are not just like any other part of the program!
There are specific ADL restrictions on certain things that can’t be done in IRQ handlers, but you also need to take special care generally - just like you would in writing an interrupt handler on a “real” microcontroller…!
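As a general pattern (a hedged sketch, not tested code): do the bare minimum in the low-level handler and push any diagnostics out to the high-level handler, which runs in a less restricted context, e.g.:

/* Keep the low-level handler minimal: record what you want to know in a
 * variable and report it later from the high-level handler. The counter
 * name is illustrative. */
static volatile u32 lowIrqEntries = 0;

bool appPlayLowIrqHandler(adl_irqID_e Source,
                          adl_irqNotificationLevel_e NotificationLevel,
                          adl_irqEventData_t *Data)
{
    lowIrqEntries++;                 /* cheap bookkeeping, no service calls */
    /* ... fill the playback buffer as before ... */
    return FALSE;
}

bool appPlayHighIrqHandler(adl_irqID_e Source,
                           adl_irqNotificationLevel_e NotificationLevel,
                           adl_irqEventData_t *Data)
{
    TRACE(( 1, "low-level handler entered %d times", lowIrqEntries ));
    return TRUE;
}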