
Wave is not visible on iOS #60

Open
Saintenr opened this issue Apr 29, 2021 · 4 comments

Hello kopiro and team,

First of all, I would like to say thank you for setting up such a great project.

Good work. I'm already looking forward to the new version.

Unfortunately, I have the problem that on iOS devices (no matter which browser is used) the SiriWave is not displayed. The div is rendered, but the line does not move.

I also don't get any errors in the developer console.

Any idea what this could be?

kopiro (Owner) commented Apr 29, 2021

Hej! Does the same page render correctly on desktop?

Saintenr (Author) commented Apr 29, 2021

Hey

It works perfectly on desktop. It also works without problems on Android.

[screenshot]

On the iPhone or iPad, only the grey div and the flat line are displayed, but no waves.

For better understanding, here are my code blocks:

```ts
navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then((stream) => {
  const siriWave = this.createVoiceVisualization();
  this.styleVoiceVisualization(siriWave, stream, context);
});
```

```ts
createVoiceVisualization() {
  const siriWave = new SiriWave({
    container: document.getElementById("fi-wave-container"),
    width: 250,
    height: 75,
    style: "ios9",
    speed: 0.0,
    amplitude: 0.0,
    // autostart: true
  });
  return siriWave;
}
```

```ts
public styleVoiceVisualization(siriWave: SiriWave, stream: MediaStream, context: AudioContext) {
  let source: MediaStreamAudioSourceNode | undefined;
  try {
    console.log("style voice inner try{}");
    console.log(stream);

    // Create a source for the sound input.
    source = context.createMediaStreamSource(stream);

    // Create a processor node.
    const processor = context.createScriptProcessor(1024, 1, 1);

    // Create an analyser node.
    const analyser = context.createAnalyser();

    // Set fftSize to 4096.
    analyser.fftSize = 4096;
    // Array for frequency data.
    const myDataArray = new Float32Array(analyser.frequencyBinCount);

    // Connect source -> analyser -> processor -> destination.
    source.connect(analyser);
    analyser.connect(processor);
    processor.connect(context.destination);

    // Start the SiriWave.
    siriWave.start();

    // Handle each new chunk of audio data from the source.
    processor.onaudioprocess = function (e) {
      let amplitude = 0;
      let frequency = 0;

      // Copy frequency data from the analyser into myDataArray.
      analyser.getFloatFrequencyData(myDataArray);

      // Find the highest frequency bin that is louder than -100 dB.
      myDataArray.forEach((item, index) => {
        if (item > -100) {
          frequency = Math.max(index, frequency);
        }
      });

      // Multiply the bin index by the bin resolution and scale it down to use as the speed.
      frequency = ((1 + frequency) * 11.7185) / 24000;
      // Set the speed of the SiriWave.
      siriWave.setSpeed(frequency);

      // Find the maximum amplitude in the current buffer.
      e.inputBuffer.getChannelData(0).forEach((item) => {
        amplitude = Math.max(amplitude, Math.abs(item));
      });
      amplitude = Math.abs(amplitude * 17);

      if (amplitude < 1 && amplitude > 0.1) {
        // Minimum scale.
        amplitude = 1;
      }
      if (amplitude > 3) {
        // Maximum scale.
        amplitude = 3;
      }

      // If the amplitude is non-negative, use it, otherwise fall back to 0.0.
      if (amplitude >= 0) {
        siriWave.setAmplitude(amplitude);
      } else {
        siriWave.setAmplitude(0.0);
      }
    };
  } catch (e) {
    console.log(e);
  }
}
```

Saintenr (Author) commented May 5, 2021

Hey, sorry, I found out it's not a problem with the SiriWave, but maybe someone can help me anyway (if not, just close the issue :( ).

I think it's because I want to use the MediaStream in two functions.

Since Apple doesn't allow this, it mutes the "first" stream, which is my processing for the SiriWave.

Then the stream is sent to Watson Speech to Text. This is the second stream, which is marked as active, and that is why the first one is muted.

So it's an Apple thing, but maybe someone has an idea how I can combine the two?

Thank you very much,

cheers Saintenr
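
For anyone hitting the same wall, below is a minimal sketch, not code from this thread, of one way the two consumers could share a single microphone stream: call getUserMedia once, connect the source to both an AnalyserNode (for the SiriWave) and a MediaStreamAudioDestinationNode inside one AudioContext, and hand the destination node's stream to the speech-to-text client instead of making a second getUserMedia call. `sendToSpeechToText` is a hypothetical placeholder for however audio is sent to Watson.

```ts
// Sketch only: one getUserMedia stream fanned out to two consumers.
import SiriWave from "siriwave";

// Hypothetical stand-in for however audio is handed to Watson Speech to Text.
declare function sendToSpeechToText(stream: MediaStream): void;

async function startSharedMicrophone(siriWave: SiriWave): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });

  // Older iOS Safari only exposes the prefixed constructor.
  const AudioCtx = window.AudioContext || (window as any).webkitAudioContext;
  const context: AudioContext = new AudioCtx();

  // One source node for the microphone...
  const source = context.createMediaStreamSource(stream);

  // ...feeding an analyser that drives the SiriWave...
  const analyser = context.createAnalyser();
  source.connect(analyser);

  // ...and a destination node whose .stream is passed to speech-to-text,
  // so only one consumer ever touches the original getUserMedia stream.
  const destination = context.createMediaStreamDestination();
  source.connect(destination);

  siriWave.start();
  sendToSpeechToText(destination.stream);
}
```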

acosme commented Apr 12, 2024

> Hey, sorry, I found out it's not a problem with the SiriWave, but maybe someone can help me anyway. […]

Thanks for sharing this! In my case, I moved the code out of the function that loads and plays the audio, and it started working again.
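
A rough sketch of that workaround, with illustrative names and not acosme's actual code: the long-lived objects (AudioContext, SiriWave instance, microphone stream) are created once, outside the function that loads and plays the audio, and reused on every call.

```ts
// Sketch only: hoist setup out of the load/play function and reuse it.
import SiriWave from "siriwave";

const audioContext = new AudioContext();
const siriWave = new SiriWave({
  container: document.getElementById("fi-wave-container") as HTMLElement,
  width: 250,
  height: 75,
  style: "ios9",
});

let micStream: MediaStream | null = null;

async function loadAndPlay(): Promise<void> {
  // iOS keeps the AudioContext suspended until a user gesture; resume it here.
  if (audioContext.state === "suspended") {
    await audioContext.resume();
  }
  // Request the microphone only once and reuse the stream afterwards.
  if (!micStream) {
    micStream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
  }
  // ...wire micStream into the analyser / speech-to-text path here...
  siriWave.start();
}
```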
