SoundManager 2 and iPhone

The following code works in my desktop browser but not in Safari on my iPhone: it displays the alert() but doesn't play the audio. What am I doing wrong?
soundManager.url = '/assets/backend/swf/';
soundManager.preferFlash = false;
soundManager.useHTML5Audio = true;

soundManager.onready(function () {
    function playSong() {
        soundManager.createSound({
            id: 'song_<?=$song->ID?>',
            url: '/backend/song/play/<?=$song->ID?>',
            type: 'audio/mp3'
        }).play();
    }

    $('#play').click(function () {
        alert('playSong()');
        playSong();
    });
});

From what I've seen, the iPhone doesn't allow "auto-played" sounds; it requires playback to be started by a user interaction.
In your case, the iOS browser probably objects to the fact that you're creating the sound AND THEN playing it back immediately.
Give this variant a try and see if it works for you:
soundManager.url = '/assets/backend/swf/';
soundManager.preferFlash = false;
soundManager.useHTML5Audio = true;

soundManager.onready(function () {
    // Create the sound here (don't call play)
    soundManager.createSound({
        id: 'song_<?=$song->ID?>', // It is not cool to put PHP here, read below!
        url: '/backend/song/play/<?=$song->ID?>',
        type: 'audio/mp3'
    });

    $('#play').click(function () {
        var soundId = 'song_<?= $song->ID ?>';
        soundManager.play(soundId);
    });
});
As a side note: I'd advise you AGAINST mixing JS and PHP in that way. Check the SoundManager2 documentation for an example of how to "augment" a regular MP3 link like this:
Click here to play blabla
into something that plays the MP3 on click.
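A minimal sketch of that pattern, assuming plain MP3 links marked with a class of sm2-link (the class name and id scheme are my own, not from the SM2 docs):

soundManager.onready(function () {
    // Progressively enhance every MP3 link: clicking plays it via SM2
    // instead of navigating to the file.
    $('a.sm2-link').each(function (index) {
        var link = this;
        var sound = soundManager.createSound({
            id: 'sm2link_' + index,  // derived client-side, no PHP involved
            url: link.href,          // the link itself already points at the MP3
            type: 'audio/mp3'
        });
        $(link).click(function (event) {
            event.preventDefault();  // keep the browser from following the link
            sound.play();
        });
    });
});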
Regards and good luck!

Related

How to implement an audio listening stream in Flutter Web?

I'm making a Flutter Web App which has to access the microphone and stream the audio data as an array of integers for further processing.
I already succeeded in doing this in plain JavaScript.
Things I've tried:
The flutter_sound library, but I couldn't get it to work. I also can't find any working examples for that library.
dart:web_audio seems to be a thing, but apparently you can't even import it yet in normal Flutter apps.
dart:js is what I'm trying right now. I was able to create an AudioContext with var audioContext = JsObject(context['AudioContext']);. However, after that I don't know how to translate the JavaScript code below into Dart. Here is what I'm doing in JavaScript:
function initAudio() {
    try {
        audioCtx = new AudioContext();
        const GotAudioStream = function (stream) {
            const audioSource = audioCtx.createMediaStreamSource(stream);
            const audioProcessor = audioCtx.createScriptProcessor(bufSize, 1, 1);
            audioSource.connect(audioProcessor);
            audioProcessor.connect(audioCtx.destination);
            audioStarted = true;
            audioProcessor.onaudioprocess = function (e) {
                checkAudioBuffer(e.inputBuffer);
            };
        };
        navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(GotAudioStream);
    }
    catch (err) {
        console.log(err);
    }
}
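One possible bridge, sketched on the JavaScript side only: keep the capture code above in JS and hand each buffer to a global callback that the Dart side installs via dart:js (window.onAudioChunk and initAudioBridge are hypothetical names of my own, not from any library):

// Dart assigns a function to this hook via dart:js,
// e.g. context['onAudioChunk'] = (samples) { ... };
window.onAudioChunk = null;

window.initAudioBridge = function (bufSize) {
    var audioCtx = new AudioContext();
    navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(function (stream) {
        var source = audioCtx.createMediaStreamSource(stream);
        var processor = audioCtx.createScriptProcessor(bufSize, 1, 1);
        source.connect(processor);
        processor.connect(audioCtx.destination);
        processor.onaudioprocess = function (e) {
            if (window.onAudioChunk) {
                // Plain arrays cross the JS/Dart boundary more predictably than typed arrays.
                window.onAudioChunk(Array.from(e.inputBuffer.getChannelData(0)));
            }
        };
    });
};

Dart could then start the stream with context.callMethod('initAudioBridge', [4096]).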
Does anyone have experience with the dart:js library or another idea on how to implement a simple (live!) audio stream in Flutter Web?
Regards,
Kaisky

YouTube iframe in Ionic app generates play icon background

I'm developing an app, using Ionic 3, that plays a YouTube video. Since I want an embedded video, I use an iframe whose src is the video's URL.
When I test on an Android device, I get this before the video starts playing.
Is there a way to avoid that background, or to customize it?
Testing with "ionic serve" makes the background completely black, so it only happens when running on an Android device.
Why not use a temporary <img>? As soon as the user clicks the image, the <iframe> tag is toggled on with autoplay = true (see the sketch below).
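A minimal sketch of that idea, assuming a cover image with id "cover" and an <iframe> with id "yt-frame" (the ids and video URL are placeholders):

var cover = document.getElementById('cover');
var frame = document.getElementById('yt-frame');

cover.addEventListener('click', function () {
    // Swap the thumbnail for the player; autoplay=1 starts playback immediately.
    frame.src = 'https://www.youtube.com/embed/VIDEO_ID?autoplay=1';
    frame.style.display = 'block';
    cover.style.display = 'none';
});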
I recommend using angular youtube-player. You can detect when the video is ready to play. If the video is not ready yet, just display an image or a spinner.
Here is an example:
HTML:
<img class="video-loading-cover" src="assets/images/home/ytcover.png" height="550" width="1400" [hidden]="isVideoLoaded" alt="">
<youtube-player #youTubePlayer (ready)="playVideo($event)" (stateChange)="onPlayerStateChange($event)" (error)="hideVideo()"
width="100%" height="540px" [playerVars]="playerVars" [videoId]="videoId"></youtube-player>
TS:
@ViewChild('youTubePlayer') youTubePlayer: YT.Player;
isVideoLoaded: boolean;
videoId = 'your video id';

// optional
playerVars: YT.PlayerVars = {
    autoplay: AutoPlay.AutoPlay,
    loop: Loop.Loop,
    playlist: 'yourPlaylist',
    controls: Controls.Hide,
    enablejsapi: JsApi.Enable,
    origin: window.location.origin,
    rel: RelatedVideos.Hide,
    iv_load_policy: IvLoadPolicy.Hide,
    autohide: AutoHide.HideAllControls,
    showinfo: ShowInfo.Hide
};

ngOnInit() {
    const tag = document.createElement('script');
    tag.src = 'https://www.youtube.com/iframe_api';
    document.body.appendChild(tag);
}

onPlayerStateChange(el) {
    this.youTubePlayer.mute();
    switch (el.data) {
        case -1:
        case 2:
            this.youTubePlayer.playVideo();
            break;
        case 1:
            this.isVideoLoaded = true;
            break;
    }
}

hideVideo(): void {
    this.isVideoLoaded = false;
}

playVideo(event): void {
    this.youTubePlayer.mute();
    setTimeout(() => {
        this.youTubePlayer.playVideo();
    }, 10);
}
Note that in my example, while the YouTube video is loading, the (error)="hideVideo()" function is called and an image is displayed. Once the video is available, the image is hidden automatically and the video plays.

Problems with WebAudio

I'm creating a research experiment that uses the Web Audio API to record audio files spoken by the user.
I came up with a solution for this using recorder.js and everything was working fine... until I tried it yesterday.
I am now getting this error in Chrome:
"The AudioContext was not allowed to start. It must be resumed (or
created) after a user gesture on the page."
And it refers to this link: Web Audio API policy.
This appears to be a consequence of Chrome's new policy outlined at the link above.
So I attempted to solve the problem by using resume() like this:
var gumStream; // stream from getUserMedia()
var rec; // Recorder.js object
var input; // MediaStreamAudioSourceNode we'll be recording

// shim for AudioContext when it's not available
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioContext = new AudioContext(); // new audio context to help us record

function startUserMedia() {
    var constraints = { audio: true, video: false };
    audioContext.resume().then(() => { // This is the new part
        console.log('context resumed successfully');
    });
    navigator.mediaDevices.getUserMedia(constraints).then(function (stream) {
        console.log("getUserMedia() success, stream created, initializing Recorder.js");
        gumStream = stream;
        input = audioContext.createMediaStreamSource(stream);
        rec = new Recorder(input, { numChannels: 1 });
        audio_recording_allowed = true;
    }).catch(function (err) {
        console.log("Error");
    });
}
Now in the console I'm getting:
Error
context resumed successfully
And the stream is not initializing.
This happens in both Firefox and Chrome.
What do I need to do?
I just had this exact same problem! And technically, you helped me find this answer: my error message wasn't as complete as yours for some reason, and the link to those policy changes had the answer :)
Instead of resuming, it's best practice to create the audio context after the user has interacted with the document (if you have a look at padenot's first comment of 28 Sept 2018 on this thread, he explains why in the first bullet point).
So instead of this:
var audioContext = new AudioContext(); // new audio context to help us record

function startUserMedia() {
    audioContext.resume().then(() => { // This is the new part
        console.log('context resumed successfully');
    });
}
Just set the audio context like this:
var audioContext;

function startUserMedia() {
    if (!audioContext) {
        audioContext = new AudioContext();
    }
}
This should work, as long as startUserMedia() is executed after some kind of user gesture.
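Putting both pieces together, here's a minimal sketch of the gesture-gated flow (the #record button id is a placeholder; the Recorder.js usage is carried over from the question's code, not prescribed by the policy):

var audioContext;
var rec;

function startUserMedia() {
    // Created lazily, inside the click handler, so the context starts in the
    // "running" state instead of "suspended".
    if (!audioContext) {
        var AudioContext = window.AudioContext || window.webkitAudioContext;
        audioContext = new AudioContext();
    }
    navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(function (stream) {
        var input = audioContext.createMediaStreamSource(stream);
        rec = new Recorder(input, { numChannels: 1 });
        rec.record();
    });
}

// The user gesture: nothing touches the Web Audio API until this fires.
document.getElementById('record').addEventListener('click', startUserMedia);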

inline html5 video on iphone

I want to play an HTML5 video on the iPhone, but whenever I try to, the iPhone automatically pops the video out into fullscreen as soon as '.play()' is called. How do I play the video inline, without the iPhone changing its UI, like these sites do:
http://www.easy-bits.com/iphone-inline-video-autostart
http://www.takeyourdose.com/en (When you click "Start the 360 experience")
Edit: Here's my code:
<!DOCTYPE html>
<html lang="en">
<head>
    <title>iPhone Test</title>
    <meta charset="utf-8">
</head>
<body>
    <button onclick="document.getElementById('vid').play()">Start</button>
    <video id="vid">
        <source src="/videos/tutorial.mp4" type="video/mp4">
        Your browser does not support the video tag.
    </video>
</body>
</html>
I'm working on a solution to this until Apple allows "webkit-playsinline" to actually play videos inline.
I started a library here: https://github.com/newshorts/InlineVideo
It's very rough, but the basic gist is that you "seek" through the video instead of playing it outright. So instead of calling:
video.play()
You instead set up a loop using requestAnimationFrame or setInterval, and on each tick advance:
video.currentTime = __FRAME_RATE__
So the whole thing might look like this. In your HTML:
<video controls width="300">
    <source src="http://www.w3schools.com/html/mov_bbb.mp4">
</video>
<canvas></canvas>
<button>Play</button>
and in your JS (make sure to include jQuery):
var video = $('video')[0];
var canvas = $('canvas')[0];
var ctx = canvas.getContext('2d');
var lastTime = Date.now();
var animationFrame;
var framesPerSecond = 25;

function loop() {
    var time = Date.now();
    var elapsed = (time - lastTime) / 1000;

    // render
    if (elapsed >= ((1000 / framesPerSecond) / 1000)) {
        video.currentTime = video.currentTime + elapsed;
        $(canvas).width(video.videoWidth);
        $(canvas).height(video.videoHeight);
        ctx.drawImage(video, 0, 0, video.videoWidth, video.videoHeight);
        lastTime = time;
    }

    // if we are at the end of the video, stop
    var currentTime = (Math.round(parseFloat(video.currentTime) * 10000) / 10000);
    var duration = (Math.round(parseFloat(video.duration) * 10000) / 10000);
    if (currentTime >= duration) {
        console.log('currentTime: ' + currentTime + ' duration: ' + video.duration);
        return;
    }

    animationFrame = requestAnimationFrame(loop);
}

$('button').on('click', function () {
    video.load();
    loop();
});
http://codepen.io/newshorts/pen/yNxNKR
The real driver for Apple changing this will be the recent release of WebGL for iOS devices, enabled by default. Basically, there are going to be a whole bunch of people looking to use video textures, and technically, right now, that can't be done.
On iOS 10 / Safari 10 you can now add the playsinline attribute to the HTML5 video element, and it will just play inline.
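For example (a sketch; the src is a placeholder, and keeping the older webkit-playsinline attribute alongside the new one is a common fallback for WebViews that still honor it):

<video src="movie.mp4" playsinline webkit-playsinline controls></video>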
If you create an audio element and a video element, you can play the audio via user interaction and then seek the video, rendering it to a canvas. This is something quick that I came up with (tested on iPhone, iOS 9):
var canvas = document.getElementById("canvas");
var ctx = canvas.getContext('2d');
var audio = document.createElement('audio');
var video = document.createElement('video');

function onFrame() {
    ctx.drawImage(video, 0, 0, 426, 240);
    video.currentTime = audio.currentTime;
    requestAnimationFrame(onFrame);
}

function playVideo() {
    var i = 0;
    function ready() {
        i++;
        if (i == 2) {
            audio.play();
            onFrame();
        }
    }
    video.addEventListener('canplaythrough', ready);
    audio.addEventListener('canplaythrough', ready);
    audio.src = video.src = "http://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_10mb.mp4";
    audio.load();
    video.load();
}
CodePen
Test Page
Apologies for writing this as an answer instead of a comment on the main thread, but I apparently do not have enough reputation points to comment!
Anyways, I am also looking to do exactly the same thing as the OP.
I noticed that there is a particular library, krpano, coupled with the krpano videoplayer plugin, that allows video to be played INLINE on iPhone! Some demos of this in action can be found here: http://krpano.com/video/
While I would prefer a simple 2D video example over these crazy panorama videos, this is the closest I have found while scouring the web. From what I can tell, they use a normal <video> element not attached to the document:
var v = document.querySelector('video');

// remove from document
v.parentNode.removeChild(v);

// touch anywhere to play
document.ontouchstart = function () {
    v.play();
};
Video element before it's removed:
<video playsinline webkit-playsinline preload="auto" crossorigin="anonymous" src="http://www.mediactiv.com/video/Milano.mp4" loop style="transform: translateZ(0px);"></video>
But that alone doesn't seem to be enough: when the video is played, it still goes fullscreen.
How do they manage to prevent the video from going fullscreen?
EDIT: After looking at both examples, it looked like they both were leveraging the canvas element to render the video, so I went ahead and whipped up a demo showing off video rendering through the canvas element. While the demo works great, it fails to deliver on iPhone (even though the video element is completely removed from the DOM!): the video still jumps to fullscreen. I'm thinking the next step would be to apply these same principles to a WebGL canvas (that's what the krpano examples are doing), but in the meantime maybe this demo will spark some ideas in others...
http://jakesiemer.com/projects/video/index.htm

Soundcloud Streaming on Safari Mobile

I tried to use this snippet from the Soundcloud API:
<script src="http://connect.soundcloud.com/sdk.js"></script>
<script>
    SC.initialize({
        client_id: 'YOUR_CLIENT_ID'
    });

    // stream track id 293
    SC.stream("/tracks/293", function (sound) {
        sound.play();
    });
</script>
It works in every browser except mobile Safari, both on iPhone and iPad, where the music stream does not play at all.
What am I doing wrong? (I replaced the track id and client id with my own details.)
Thanks
Here is a workaround for this issue. You need to add an extra HTML element:
<div class="player-helper"></div>
JavaScript
SC.stream('/tracks/' + id, {
    useHTML5Audio: true,
    debugMode: true
}, function (sound) {
    let $eventEmitter = $('.player-helper');
    $eventEmitter.on('click', function () {
        sound.play();
    });
    $eventEmitter.trigger('click');
    $eventEmitter.off('click');
});
In mobile Safari, audio playback has the restriction that it must be triggered by a user action. So you'll have to add some sort of button or link which, when clicked, calls the sound.play() function.
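A minimal sketch of that, assuming a #play button in the page (the button id and track id are placeholders):

SC.initialize({
    client_id: 'YOUR_CLIENT_ID'
});

// Resolve the stream up front, but only start playback from the click
// handler, which counts as the user action mobile Safari requires.
SC.stream('/tracks/293', function (sound) {
    document.getElementById('play').addEventListener('click', function () {
        sound.play();
    });
});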