DFP Video Creative - google-dfp

I need help with the video creative for DFP (for publishers). I see a lot of discussion about building ads that run before a video.
However, I simply want a 1-minute video to play as the creative, instead of a static image.
I followed the steps in DFP to create an ad unit with both a normal size and a VAST size. The 'Generate Tags' result is:
https://pubads.g.doubleclick.net/gampad/ads?sz=300x250|300x250|300x600&iu=/1028***/GGB_video_1&impl=s&gdfp_req=1&env=vp&output=vast&unviewed_position_start=1&url=[referrer_url]&description_url=[description_url]&correlator=[timestamp]
It's not clear what to do with this link. Obviously, it's not formatted as a normal DFP tag set. Note: my creative is a YouTube video.
Or is it preferable to create a normal ad unit and use an HTML5 creative for the video?
Thank you.

First, you need to check whether your VAST tag is working and serving the video creative. You can use the VAST Inspector.
Second, you will need a video player. The video player uses the VAST URL to build the video creative with the IMA SDK 3.
Google provides a list of video partners:
https://support.google.com/dfp_premium/answer/186110?hl=en
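Note that the generated tag contains macros like [referrer_url], [description_url], and [timestamp] that must be filled in at request time. Many players do this for you; a hedged sketch of doing it by hand in JavaScript (the tag URL is abbreviated here):

var vastTag = 'https://pubads.g.doubleclick.net/gampad/ads?...&correlator=[timestamp]'
    // Cache-buster: a fresh value per request so ad responses are not cached.
    .replace('[timestamp]', Date.now())
    .replace('[referrer_url]', encodeURIComponent(document.referrer))
    .replace('[description_url]', encodeURIComponent(location.href));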
If you use Sambatech, there is an example using its JavaScript API:
player = new SambaPlayer("player", {
    height: 360,
    width: 640,
    ph: "",
    m: "",
    playerParams: {
        enableShare: true,
        wideScreen: false,
        autoStart: false,
        ad_program: "YOUR_VAST_TAG_URL", // place your VAST tag URL here
        html5: true,
        sambatech: true
    },
    events: {
        onLoad: "eventListener",
        onStart: "eventListener",
        onFinish: "eventListener",
        onResizeActive: "eventListener",
        onResizeDeactive: "eventListener",
        onPause: "eventListener"
    }
});
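If you are not using a partner player, a rough sketch of wiring the VAST tag into Google's HTML5 IMA SDK 3 directly looks like this (element IDs and the tag URL are placeholders, and a complete setup also needs an ADS_MANAGER_LOADED listener plus initialization on a user gesture; see the IMA SDK docs):

// Assumes the IMA SDK script is already loaded on the page.
var videoElement = document.getElementById('video');
var adContainer = document.getElementById('ad-container');

// The ad display container wraps the element the SDK renders the creative into.
var adDisplayContainer = new google.ima.AdDisplayContainer(adContainer, videoElement);
var adsLoader = new google.ima.AdsLoader(adDisplayContainer);

// Point the request at the VAST tag generated by DFP.
var adsRequest = new google.ima.AdsRequest();
adsRequest.adTagUrl = 'YOUR_VAST_TAG_URL';
adsLoader.requestAds(adsRequest);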

Related

Flutter - Audio Player

Hello, I am new to Flutter.
I am trying to play audio files from a URL or the network, but I'm not sure which package to use because
when I searched Google it showed many, and I don't know which one to pick.
If possible, can you show an example of how to create something like the image below?
I want to create an audio player like this.
Kindly help...
Thanks in advance!
An answer that shows how to do everything in your screenshot would probably not fit in a Stack Overflow answer (audio code, UI code, and how to extract audio wave data), but I will give you some hopefully useful pointers.
Using the just_audio plugin you can load audio from these kinds of URLs:
https://example.com/track.mp3 (any web URL)
file:///path/to/file.mp3 (any file URL with permissions)
asset:///path/to/asset.mp3 (any Flutter asset)
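For a single track, a minimal sketch (the URL is a placeholder) looks like this:

import 'package:just_audio/just_audio.dart';

Future<void> playSingleTrack() async {
  final player = AudioPlayer();
  // Works with any of the URL kinds listed above.
  await player.setUrl('https://example.com/track.mp3');
  player.play();
}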
You will probably want a playlist, and here is how to define one:
final playlist = ConcatenatingAudioSource(children: [
  AudioSource.uri(Uri.parse('https://example.com/track1.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track2.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track3.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track4.mp3')),
  AudioSource.uri(Uri.parse('https://example.com/track5.mp3')),
]);
Now to play that, you create a player:
final player = AudioPlayer();
Set the playlist:
await player.setAudioSource(playlist);
And then as the user clicks on things, you can perform these operations:
player.play();
player.pause();
player.seekToNext();
player.seekToPrevious();
player.seek(Duration(milliseconds: 48512), index: 3);
player.dispose(); // to release resources once finished
For the screen layout, note that just_audio includes an example app; since it has many similarities to your own proposed layout, you may get some ideas by looking at its code.
Finally, for the audio wave display, there is another package called audio_wave. You can use it to display an audio wave, but the problem is that no plugin I'm aware of gives you access to the waveform data. If you really want a waveform, you could use a fake one (if it's just meant to visually indicate position progress); otherwise, you or someone else will need to write a plugin that decodes an audio file into a list of samples.
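To illustrate the fake-waveform idea, here is a minimal sketch using only Flutter's own CustomPainter (no extra package; the bar heights are pseudo-random placeholders, not real audio data):

import 'dart:math';
import 'package:flutter/material.dart';

/// Draws a fake waveform: evenly spaced vertical bars with pseudo-random
/// heights. Swap the random heights for real sample data if you obtain it.
class FakeWaveformPainter extends CustomPainter {
  final int barCount;
  FakeWaveformPainter({this.barCount = 60});

  @override
  void paint(Canvas canvas, Size size) {
    final paint = Paint()
      ..color = Colors.blueGrey
      ..strokeWidth = 2;
    final random = Random(42); // fixed seed keeps the shape stable on repaint
    final barWidth = size.width / barCount;
    for (var i = 0; i < barCount; i++) {
      final x = i * barWidth + barWidth / 2;
      final barHeight = size.height * (0.2 + 0.8 * random.nextDouble());
      canvas.drawLine(Offset(x, (size.height - barHeight) / 2),
          Offset(x, (size.height + barHeight) / 2), paint);
    }
  }

  @override
  bool shouldRepaint(FakeWaveformPainter oldDelegate) => false;
}

// Usage: CustomPaint(size: Size(300, 48), painter: FakeWaveformPainter())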

Overlay text and images to YouTube live stream from Flutter app

I am looking into creating a Flutter mobile app that live streams to YouTube using the YouTube Live Streaming API. I have checked the API and found that it does not offer a way to overlay text and images onto the livestream. How would I achieve this using Flutter?
I imagine this involves using the Stack widget to overlay content on top of the user's video feed. However, this would somehow need to be encoded into the video stream that is sent to YouTube.
This type of work is usually done with FFmpeg.
See this discussion for more info: https://video.stackexchange.com/questions/12105/add-an-image-overlay-in-front-of-video-using-ffmpeg
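For example, a typical image-overlay command looks roughly like this (file names and coordinates are placeholders):

ffmpeg -i input.mp4 -i logo.png -filter_complex "overlay=10:10" output.mp4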
FFmpeg for mobile devices is made available by this project:
https://github.com/tanersener/mobile-ffmpeg
And then, as usual, there is a Flutter package, flutter_ffmpeg, that exposes these features to Flutter:
https://pub.dev/packages/flutter_ffmpeg
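As a rough sketch under the flutter_ffmpeg API (file paths are placeholders), running the same overlay from Dart could look like this:

import 'package:flutter_ffmpeg/flutter_ffmpeg.dart';

final FlutterFFmpeg _ffmpeg = FlutterFFmpeg();

Future<void> burnInOverlay() async {
  // Burns logo.png into input.mp4 at position (10,10); paths are placeholders.
  final rc = await _ffmpeg.execute(
      '-i input.mp4 -i logo.png -filter_complex "overlay=10:10" output.mp4');
  print('FFmpeg exited with return code $rc');
}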
TLDR: You can use CameraController (camera package) and a Canvas in Flutter for drawing the text. Unfortunately, CameraController.startImageStream is not documented in the API docs, and has been an open GitHub issue for over a year.
Every time the camera plugin gives you a video frame via controller.startImageStream((CameraImage img) { /* your code */ }), you can draw the image onto the canvas, draw the text, capture the result, and call the YouTube API. You can see an example of using the video buffer in the Tensorflow Lite package, or read more info at this issue.
On this same canvas, you can draw whatever you want, like drawArc, drawParagraph, drawPoints. It gives you ultimate flexibility.
A simple example of capturing the canvas contents is below; in it I had previously saved the drawing strokes in state. (You should use details about the text instead, and just pull the latest frame from the camera.):
import 'dart:typed_data';
import 'dart:ui' as ui;
import 'package:flutter/material.dart';
import 'package:image/image.dart' as img;

Future<img.Image> getDrawnImage() async {
  ui.PictureRecorder recorder = ui.PictureRecorder();
  Canvas canvas = Canvas(recorder);
  canvas.drawColor(Colors.white, BlendMode.src);
  // StrokesPainter and InheritedStrokesHistory are classes from my own app;
  // substitute a painter that draws the camera frame and your text overlay.
  StrokesPainter painter = StrokesPainter(
      strokes: InheritedStrokesHistory.of(context).strokes);
  painter.paint(canvas, deviceData.size); // deviceData holds the screen size
  ui.Image screenImage = await (recorder.endRecording().toImage(
      deviceData.size.width.floor(), deviceData.size.height.floor()));
  ByteData imgBytes =
      await screenImage.toByteData(format: ui.ImageByteFormat.rawRgba);
  return img.Image.fromBytes(deviceData.size.width.floor(),
      deviceData.size.height.floor(), imgBytes.buffer.asUint8List());
}
I was going to add a link to an app I made which lets you draw and save a screenshot of the drawing to your phone gallery (it also uses Tensorflow Lite), but the code is a little complicated. It's probably best to clone it and see what it does if you are struggling with capturing the canvas.
I initially could not find the documentation on startImageStream (and had forgotten I had used it for Tensorflow Lite), so I was going to suggest using MethodChannel.invokeMethod and writing iOS/Android-specific code. Keep that in mind if you find any limitations in Flutter, although I don't think Flutter will limit you in this problem.

How to set poster image in JW player from flv file?

How can I set a poster image using a particular video frame from the FLV file?
I don't have an image file available, so I wonder if it's possible to set a poster image on the fly from the video source.
No, that's not possible with JW Player.
Edit: A little more explanation.... JW Player doesn't actually "play" or "process" the video in any way. It's just a steering and styling script - it feeds the video to the browser's <video> tag if the browser can handle it, or to the Flash plugin. The player provides its own control bar, advertising capabilities, and so on, but when it comes to the video file itself, the player isn't touching it. So, there's no way to do things like extracting certain frames. The player isn't ffmpeg.
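As an aside: if you can run FFmpeg on a server, extracting a poster frame yourself is a one-liner (file names and the timestamp are placeholders), and the resulting image can then go into the player's normal image option:

ffmpeg -i myVideo.flv -ss 00:00:03 -vframes 1 poster.jpg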
Here is a little bit of a hack you can use:
<div id="myElement"></div>
<script>
jwplayer("myElement").setup({
  file: "/uploads/myVideo.mp4",
  autostart: true,
  mute: true,
  controls: false
});
</script>
<script>
setTimeout(function() {
  jwplayer().pause();
  jwplayer().setMute(false);
  jwplayer().setControls(true);
}, 3000);
</script>
What this does is basically autostart the player, muted and with no controls; then, after three seconds, it pauses the player and restores the controls and volume. Essentially it grabs a frame from the first few seconds of the video. Keep in mind this is a bit of a hack and isn't great for bandwidth.
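If your player version supports event listeners (the .on API in JW7), a slightly more reliable variant pauses as soon as the first frame renders instead of waiting a fixed time; a sketch under that assumption:

<script>
jwplayer("myElement").setup({
  file: "/uploads/myVideo.mp4",
  autostart: true,
  mute: true,
  controls: false
});
// Pause as soon as the first frame has rendered, then restore controls and sound.
jwplayer().on('firstFrame', function() {
  jwplayer().pause();
  jwplayer().setMute(false);
  jwplayer().setControls(true);
});
</script>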

Play Youtube Videos with Swift

Hey, I want to display a YouTube video in my app, but when I run it there is only a black screen.
func playVideo() {
    var audioplayer : MPMoviePlayerController!
    var url: NSURL = NSURL(string: "https://youtu.be/7n1KPclvGQY.mp4")!
    var MPMoviePlayerViewController = MPMoviePlayerController(contentURL: url)
    MPMoviePlayerViewController.view.frame = CGRect(x: 20, y: 100, width: 200, height: 150)
    MPMoviePlayerViewController.movieSourceType = MPMovieSourceType.File
    self.view.addSubview(MPMoviePlayerViewController.view)
    MPMoviePlayerViewController.prepareToPlay()
    MPMoviePlayerViewController.play()
    MPMoviePlayerViewController.pause()
}
There's no really nice way to do this. There is this library, but it breaks YouTube's TOS.
It's best to call [[UIApplication sharedApplication] openURL:...], which will open the video in Safari, or in the YouTube app if they have it installed.
If it's content you own, perhaps try hosting the video yourself and playing it with the same player code you already have?
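A minimal sketch of the external approach (Swift of that era; the video ID is a placeholder):

func openInYouTube() {
    // Opens in Safari, or hands off to the YouTube app if it is installed.
    let url = NSURL(string: "https://www.youtube.com/watch?v=VIDEO_ID")!
    UIApplication.sharedApplication().openURL(url)
}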
Use a custom library for playing YouTube videos:
https://github.com/larcus94/LBYouTubeView
It is a subclass of MPMoviePlayerViewController.
LBYouTubeView is just a small view that is able to display YouTube videos in an MPMoviePlayerController. You even have the choice between the high-quality and standard-quality streams.
It just loads the HTML code of YouTube's mobile website and looks for the video data in the script tag.
LBYouTubeView doesn't use UIWebView, which makes it faster and look cleaner.

HTML video plays with sound only through earphones - iOS?

I have developed a Sencha/PhoneGap application that plays videos already bundled in the application.
When I play any other video, such as a music video, through the native player, audio is heard perfectly from the stereo speakers at the bottom.
I am simply using the xtype: 'video' component to play the video. Here's my code:
{
    xtype     : 'video',
    x         : 0,
    y         : 0,
    left      : '0px',
    top       : '0px',
    width     : 175,
    height    : 98,
    url       : 'path of video',
    posterUrl : 'placeholder.png'
}
The problem I am facing is that the video only plays with sound when earphones are connected, and sometimes the audio can also be heard from the top speaker (the one used during calls).
I have checked the ringer and volume controls; everything is at maximum.
My app supports iOS versions 5.0 and above.
Could this be an iOS-related issue?
Kindly provide some direction.
Thanks
Please first add the AudioToolbox and AVFoundation frameworks if they are not added already, then put these imports in the class that runs at app launch:
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
Then add these two lines where your app launches:
// Force audio out of the main (bottom) speaker instead of the receiver.
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
Maybe it will work now.
On one of the Sencha forums, I found this link, which helped me.
Although it specifically mentions the ringer-off scenario, it also mentions, in a generic way, that we need to set the audio session to playback for sound to work.
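A minimal sketch of doing that with AVAudioSession (standard Apple API; wire it up at app launch):

#import <AVFoundation/AVFoundation.h>

// The Playback category routes audio to the main speaker and keeps
// sound on even when the ringer switch is set to silent.
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];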
Cheers |m|