I'm new to iOS, tvOS, and Swift. I've been trying to play a video in a tvOS app, where the video comes from a livestream URL that works fine on the web. The TVML template and the TVJS both work, and the app even works with a single video URL (.mp4), but when I try this streaming link it doesn't work.
This is my TVJS:
App.onLaunch = function(options) {
    console.log("Hello TVML!");
    var resourceLoader = new ResourceLoaderJS(NativeResourceLoader.create());
    var initialDoc = resourceLoader.getDocument("hello.tvml");
    navigationDocument.pushDocument(initialDoc);
    initialDoc.addEventListener("play", handleEvent);
    initialDoc.addEventListener("select", handleEvent);
}

class ResourceLoaderJS {
    constructor(nativeResourceLoader) {
        this.nativeResourceLoader = nativeResourceLoader;
        this.domParser = new DOMParser();
    }

    getDocument(name) {
        var docString = this.nativeResourceLoader.loadBundleResource(name);
        return this.domParser.parseFromString(docString, "application/xml");
    }
}

function playVideo(title, url) {
    var player = new Player();
    var video = new MediaItem('video', url);
    video.title = title;
    player.playlist = new Playlist();
    player.playlist.push(video);
    player.play();
}

function handleEvent(event) {
    var buttonId = event.target.getAttribute("id");
    if (buttonId === "play") {
        playVideo("Hello TVML!", "https://new.livestream.com/accounts...");
    }
}
Yes, it is possible to play livestream URLs in an Apple TV TVML app. You just have to specify the livestream URL in the url parameter of the code. I was able to play the .m3u8 format without any modifications to the code.
You can refer to the following link for more info on creating the player:
https://developer.apple.com/library/content/samplecode/TVMLAudioVideo/Listings/client_js_application_js.html
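For example, with the playVideo() function from the question, pointing it at an HLS playlist is enough (the .m3u8 URL here is just a placeholder):

function handleEvent(event) {
    var buttonId = event.target.getAttribute("id");
    if (buttonId === "play") {
        // An HLS playlist URL works here the same way an .mp4 URL does;
        // MediaItem hands it to the native player unchanged.
        playVideo("Live Stream", "https://example.com/live/master.m3u8");
    }
}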
I'm making a Flutter web app which has to access the microphone and stream the audio data as an array of integers for further processing.
I already succeeded doing this in plain JavaScript.
Things I've tried:
The flutter_sound library, but I couldn't get it to work, and I can't find any working examples for it.
dart:web_audio seems to be a thing, but apparently you can't even import it yet in normal Flutter apps.
dart:js is what I'm trying right now. I was able to create an AudioContext with var audioContext = JsObject(context['AudioContext']);. However, after that I don't know what syntax to use to carry the JavaScript code over into Dart. Here is what I'm doing in JavaScript:
function initAudio() {
    try {
        audioCtx = new AudioContext();
        const GotAudioStream = function(stream) {
            // bufSize, checkAudioBuffer, audioCtx and audioStarted are
            // defined elsewhere in the script
            const audioSource = audioCtx.createMediaStreamSource(stream);
            const audioProcessor = audioCtx.createScriptProcessor(bufSize, 1, 1);
            audioSource.connect(audioProcessor);
            audioProcessor.connect(audioCtx.destination);
            audioStarted = true;
            audioProcessor.onaudioprocess = function(e) {
                checkAudioBuffer(e.inputBuffer);
            };
        };
        navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(GotAudioStream);
    }
    catch (err) {
        console.log(err);
    }
}
Does anyone have experience with the dart:js library, or another idea on how to implement a simple (live!) audio stream in Flutter Web?
Regards,
Kaisky
I'm developing an app, using Ionic 3, that plays a YouTube video. Since I want an embedded video, I use an iframe whose src is the video's URL.
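For reference, the embed looks roughly like this (the video ID is a placeholder):

<iframe width="560" height="315" src="https://www.youtube.com/embed/VIDEO_ID" frameborder="0" allowfullscreen></iframe>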
When I test on an Android device, I get this before the video starts playing.
Is there a way to avoid that background, or to make it personalized?
Testing with "ionic serve" makes the background completely black, so it only happens when running on an Android device.
Why not use a temporary <img>? As soon as the user clicks on the image, the <iframe> is toggled on, with autoplay = true.
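A rough sketch of that idea (the element ID and video ID are placeholders):

// Swap the cover image for the real player only when the user taps it,
// so YouTube's loading background is never visible.
document.getElementById('video-cover').addEventListener('click', function (event) {
    var frame = document.createElement('iframe');
    frame.src = 'https://www.youtube.com/embed/VIDEO_ID?autoplay=1';
    frame.allow = 'autoplay; encrypted-media';
    frame.width = '560';
    frame.height = '315';
    event.target.replaceWith(frame);
});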
I recommend using the Angular youtube-player component. You can detect when the video is ready to play; if the video is not ready yet, just display an image or a spinner.
Here is an example:
HTML:
<img class="video-loading-cover" src="assets/images/home/ytcover.png" height="550" width="1400" [hidden]="isVideoLoaded" alt="">
<youtube-player #youTubePlayer (ready)="playVideo($event)" (stateChange)="onPlayerStateChange($event)" (error)="hideVideo()"
width="100%" height="540px" [playerVars]="playerVars" [videoId]="videoId"></youtube-player>
TS:
@ViewChild('youTubePlayer') youTubePlayer: YT.Player;
isVideoLoaded: boolean;
videoId = 'your video id';
// optional
playerVars: YT.PlayerVars = {
    autoplay: AutoPlay.AutoPlay,
    loop: Loop.Loop,
    playlist: 'yourPlaylist',
    controls: Controls.Hide,
    enablejsapi: JsApi.Enable,
    origin: window.location.origin,
    rel: RelatedVideos.Hide,
    iv_load_policy: IvLoadPolicy.Hide,
    autohide: AutoHide.HideAllControls,
    showinfo: ShowInfo.Hide
};
ngOnInit() {
    // Load the YouTube IFrame API script once.
    const tag = document.createElement('script');
    tag.src = 'https://www.youtube.com/iframe_api';
    document.body.appendChild(tag);
}

onPlayerStateChange(el) {
    this.youTubePlayer.mute();
    switch (el.data) {
        case -1: // unstarted
        case 2:  // paused
            this.youTubePlayer.playVideo();
            break;
        case 1:  // playing
            this.isVideoLoaded = true;
            break;
    }
}

hideVideo(): void {
    this.isVideoLoaded = false;
}

playVideo(event): void {
    this.youTubePlayer.mute();
    setTimeout(() => {
        this.youTubePlayer.playVideo();
    }, 10);
}
Note that in my example, while the YouTube video is loading, the (error)="hideVideo()" callback fires and the image is displayed. When the video becomes available, the image is hidden automatically and the video plays.
I'm developing a hybrid application using Ionic.
Most of the features work in a mobile web browser, so the final product can be used from an http address as well. But there are certain code segments/features (such as vibration and background alerts) which will obviously be for the app version, and those features will only be available in the app version.
What's a good/recommended way to detect the current situation in the code base, so that I can write logic such as if (isRunningAsApp) {do this} else {do that}?
Would it be just checking window.location.href, and if you get something that starts with http:, then it is a mobile web page, otherwise it is an app?
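Something like this is what I have in mind (Cordova apps are typically served from file:// rather than http(s)):

// Naive check: treat anything not served over http(s) as the packaged app.
var isRunningAsApp = !/^https?:/.test(window.location.href);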
This is from Ionic's documentation:
angular.module('PlatformApp', ['ionic']).controller('PlatformCtrl', function($scope) {
    ionic.Platform.ready(function() {
        // will execute when device is ready, or immediately if the device is already ready.
    });

    var deviceInformation = ionic.Platform.device();

    var isWebView = ionic.Platform.isWebView();
    var isIPad = ionic.Platform.isIPad();
    var isIOS = ionic.Platform.isIOS();
    var isAndroid = ionic.Platform.isAndroid();
    var isWindowsPhone = ionic.Platform.isWindowsPhone();

    var currentPlatform = ionic.Platform.platform();
    var currentPlatformVersion = ionic.Platform.version();

    ionic.Platform.exitApp(); // stops the app
});
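For the isRunningAsApp check from the question, isWebView() is the relevant call; a rough sketch (the two feature functions are placeholders):

ionic.Platform.ready(function() {
    // isWebView() is true inside Cordova's WebView (the packaged app),
    // false when the same code runs in an ordinary mobile browser.
    if (ionic.Platform.isWebView()) {
        enableAppOnlyFeatures(); // placeholder: vibration, background alerts, ...
    } else {
        useBrowserFallbacks();   // placeholder: web-friendly alternatives
    }
});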
I've tried this code to play a video on my iPhone with Starling 1.7:
var nc:NetConnection = new NetConnection();
nc.connect(null);

var file:File = File.applicationDirectory.resolvePath("video.mp4");
var ns:NetStream = new NetStream(nc);
ns.play(file.url);

var texture:Texture = Texture.fromNetStream(ns, 1, function():void {
    addChild(new Image(texture));
});
(code taken directly from the Starling blog)
It works in the simulator and on Android... but not on iPhone/iPad.
The webcam works both in the simulator and on the iPhone:

var camera:Camera = Camera.getCamera();
var texture2:Texture = Texture.fromCamera(camera, 1, function():void {
    addChild(new Image(texture2));
});
So it seems to be an encoding issue with the video, but how should I encode a video so it plays as a video texture on iOS? The same video.mp4 works if I play it without Stage3D.
Thanks
To fix this issue on iOS, you need to call play() after Texture.fromNetStream():
var nc:NetConnection = new NetConnection();
nc.connect(null);

var file:File = File.applicationDirectory.resolvePath("video.mp4");
var ns:NetStream = new NetStream(nc);

var texture:Texture = Texture.fromNetStream(ns, 1, function():void {
    addChild(new Image(texture));
});

// play() comes after the texture has been created
ns.play(file.url);
I was experimenting with the Web Audio API:
var context = new webkitAudioContext();
//alert(context);
//alert(context.createOscillator);
var oscillator = context.createOscillator();
oscillator.connect(context.destination);
oscillator.noteOn(0);
but I get no sound, so I was wondering what I'm missing.
The alert(context) that is commented out prints [object AudioContext], but the following alert prints undefined, and when I try alert(context.decodeAudioData) it prints that it is a function.
Thank you for the help
Your problem here is that the AudioContext differs between browsers. For a fixed version see this jsfiddle.
var context;

function initAudioContext() {
    try {
        if (!window.AudioContext) {
            if (!window.webkitAudioContext) {
                bad_browser(); // defined in the fiddle; reports an unsupported browser
                return;
            }
            // Older WebKit browsers only expose the prefixed constructor.
            window.AudioContext = window.webkitAudioContext;
        }
        context = new AudioContext();
    }
    catch (e) {
        console.log('Web Audio API is not supported in this browser');
    }
}

initAudioContext();

var oscillator = context.createOscillator();
oscillator.connect(context.destination);
oscillator.noteOn(0); // noteOn() was later renamed to start() in the final spec
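In current browsers the unprefixed API works directly, with start() in place of the old noteOn():

// Current Web Audio API: unprefixed constructor and start() instead of noteOn().
// Recent browsers only allow an AudioContext to produce sound after a user
// gesture, so run this from e.g. a click handler.
var context = new AudioContext();
var oscillator = context.createOscillator();
oscillator.connect(context.destination);
oscillator.start(0);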