Where do I put the server address where the data will be stored? - iPhone

I am creating an app that captures video and uploads it to a server. I have written the code to capture video, and for the upload I am studying this code, but I could not figure out where to put my server address so the data is stored there.
Also, is it possible to upload the video using FTP? In my company the PHP and .NET developers use FTP, and I want to know whether my iPhone app can upload video over FTP.

In the WebService.m file there's a placeholder that says "your URL string"; replace that with your actual server URL.
If you want to use FTP, you can do that as well using a networking library that supports FTP; one in particular is the very popular libcURL library.
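If the upload goes over plain HTTP, a minimal sketch of what that looks like is below. The upload URL is a placeholder, videoPath is assumed to be the path of the captured file, and the content type and body encoding should match whatever your server-side script actually expects:

    #import <Foundation/Foundation.h>

    // Placeholder endpoint: replace with your actual server URL.
    NSURL *uploadURL = [NSURL URLWithString:@"http://www.example.com/upload"];
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
    [request setHTTPMethod:@"POST"];
    [request setValue:@"application/octet-stream" forHTTPHeaderField:@"Content-Type"];
    [request setHTTPBody:[NSData dataWithContentsOfFile:videoPath]]; // videoPath = captured video file

    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (error) {
            NSLog(@"Upload failed: %@", error);
        } else {
            NSLog(@"Upload finished, server returned %lu bytes", (unsigned long)[data length]);
        }
    }];

Note that loading the whole video into memory with dataWithContentsOfFile: is only sensible for short clips; for large files you would stream the request body from disk instead.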

Related

Stream audio/video from an iPhone app using HTTP Live Streaming

I am trying to stream music/video on an iPhone using HTTP Live Streaming. I read the Apple docs on HTTP Live Streaming (http://developer.apple.com/library/mac/#documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html), and I get how it works.
What they don't say is how one would use the iPhone as a server. Do I have to bundle the tools (mediastreamsegmenter, variantplaylistcreator) with my iOS app and run them as an NSTask, or is there some kind of native support for streaming media files?
If you really want to stream from an iPhone app, you can't do it with the iPhone acting as the server. You need a separate server that the iPhone app sends data to. You can use the camera or the microphone in the app to capture live content and send the data asynchronously to the server; the server, using mediastreamsegmenter and variantplaylistcreator, converts the data into .ts segments and appends them to the end of the .m3u8 playlist. Meanwhile, another iPhone app can act as a client and watch the live content you are streaming from the first app.
In my experience this is the only way to achieve that. Hope that helps.
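For reference, the live .m3u8 playlist the segmenter keeps appending to is just a rolling list of transport-stream segments, roughly like this (segment names and durations here are made up for illustration):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:120
    #EXTINF:10.0,
    segment120.ts
    #EXTINF:10.0,
    segment121.ts
    #EXTINF:10.0,
    segment122.ts

The client app simply points a player at the playlist URL and re-fetches it as new segments are appended.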

Encoding an audio (mp3, mp4, m4a, ogg) file for Smooth Streaming with Windows Azure Media Services

I want to encode an audio file (mp3, mp4, m4a, ogg) for streaming and play the encoded file smoothly using an HTML5 player.
Here is what I am doing now: I upload a file and encode it on Windows Azure Media Services using the preset "AAC Good Quality Audio". This encodes the file into .mp4 format, and I then create a SAS locator to play it. That works well, but the problem is that the user can also download the file, which I don't want to allow.
If I create an OnDemandOrigin locator for the same encoded asset, it gives me a 404 error, which means it cannot be played.
Below are the steps I used to upload the file to Azure Media Services:
Created an empty asset.
Uploaded the file into the asset.
Created a new job with a task to encode the audio file.
The file encodes successfully, but when I generate the origin URL and browse to it, I get a 404 error.
My questions:
Is the "AAC Good Quality Audio" preset the right one for my task?
How can I prevent the user from downloading the file if I use a SAS locator?
Is it possible to play the encoded file using an origin locator?
Can I encode audio files for Smooth Streaming? If so, which player should I use so the encoded file plays in all browsers and on iOS and Android devices?
If you need further details, please feel free to ask.
Awaiting your response.
Thanks
If your user is able to listen to the audio you're publishing, they will also be able to download the file. You cannot prevent this; at best you can make it difficult, but not impossible. More to the point, Media Services in its current incarnation has no way for you to do authorization of any kind, so the only tool you've got is time-bombed SAS locators.
The typical solution for this problem is to use DRM. Media Services supports PlayReady encryption, but you either need to have a PlayReady server or purchase PlayReady as a service (there is currently a service in the Azure Marketplace that provides PlayReady for a monthly price).
See the following article on how to protect assets with Microsoft PlayReady technology.
Origin locators are something you would use to publish a Smooth Streaming or HLS asset. They are not useful for regular media files, since an origin locator is internally something equivalent to an IIS Media Services endpoint. Regular media files can just as well be hosted in Blob Storage and referred to via the SAS locator.
There is currently no single format that will play across all devices and operating systems. You can get Smooth Streaming to work on most Windows and Mac computers (possibly Linux, too), either with Silverlight or with the Smooth Streaming Plugin for the Flash-based OSMF. For iOS devices you will need to encode to HLS and use the HTML5 video tag. Microsoft Media Platform will support MPEG-DASH, a recently ratified ISO/IEC standard for dynamic adaptive streaming over HTTP. More details on how to use the DASH preview feature can be found here.
If you want Smooth Streaming for audio only, it looks like you will have to create a video asset with an empty video stream, although there is a UserVoice request to add support for audio-only in the future.

Mac/iPhone: Streaming a video file to the iPhone

I have an HTTP streaming link that gives me an .flv streaming feed. I want to convert that and access it in my iPhone program. How can I do that? I would like desktop software like VLC that takes this streaming feed URL, converts it to an iPhone-supported format, and streams it again to the iPhone. I tried VLC with H.264 and MPEG-1 audio, but it doesn't seem to produce a supported format, so my iPhone program doesn't play the video.
Could someone please guide me on how to set up desktop software that can stream an iPhone-supported file?
Thanks in advance.
I think even the great VLC can't convert FLV on the fly (or do much with FLV at all). As far as streaming goes, you'll probably be limited to the local network (Wi-Fi). I'd start with the simple way: create an ad-hoc file server on the desktop, then use AVPlayer's initWithURL: method to play that video.
On the desktop, you could query the IP address of the computer and ask the user to enter that URL (along with an optional port assignment and file component, like http://192.168.0.2:2234/streamingVideo.mp4) on the iDevice, then convert it to an NSURL.
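On the iPhone side that boils down to something like the following sketch; the address and file name are whatever the user typed in, and the AVPlayerLayer part assumes you want the video visible inside a view controller's view:

    #import <AVFoundation/AVFoundation.h>

    // URL the user entered, pointing at the ad-hoc server on the desktop.
    NSURL *videoURL = [NSURL URLWithString:@"http://192.168.0.2:2234/streamingVideo.mp4"];
    AVPlayer *player = [[AVPlayer alloc] initWithURL:videoURL];

    // Attach the player to a layer so the video actually shows up on screen.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    [player play];

It's common to also keep the player in a property so you can pause or stop playback later.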
What exactly is the HTTP streaming link? This matters a lot, because in order to stream to the iPhone you need to use HTTP Live Streaming, which requires some different bits than a typical Flash media (or, more properly, RTMP) server. Typically you need two different streaming architectures, or some expensive boxes.

iPhone: coding a data compression or zipping agent?

I'm creating a simple service for uploading photographs from an iPhone to a web server.
However, before the request is sent, I want the app to compress the pictures (in a custom format or otherwise) in the background.
Any pointers on how I could go about doing this?
Check out the NSDataCategory posted to CocoaDev. It does exactly what you're looking to do.
http://www.cocoadev.com/index.pl?NSDataCategory
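If you go with that category, the usage ends up being roughly the sketch below. The -gzipDeflate method name is assumed from that CocoaDev page, so check the actual method names in the version you download; the background queue is just there so compression doesn't block the UI:

    #import <UIKit/UIKit.h>

    // photo is assumed to be the captured UIImage.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.8); // JPEG-encode the picture first
        NSData *compressed = [imageData gzipDeflate];              // category method, name assumed
        dispatch_async(dispatch_get_main_queue(), ^{
            // hand `compressed` to your upload request here
        });
    });

Bear in mind that JPEG data is already compressed, so gzip on top of it usually saves very little; for photographs the bigger win is often just lowering the JPEG quality factor.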
I use ziparchive to unzip content downloaded from a server. It also has functionality to create zip files on an iOS device and might be what you are looking for.
http://code.google.com/p/ziparchive/

How do I stream an audio file from the server to the iPhone?

I need to stream an audio file which is saved on my server. Is it possible to stream that file in order to play it on my iPhone? Or is there some other way to play an audio file from the server on the iPhone? Please help me.
Thanks,
Shibin
This link was useful to me: http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
He's got a project linked from that page: http://projectswithlove.com/projects/iPhoneStreamingPlayer.zip
In this project, the interesting lines are in iPhoneStreamingPlayerViewController.m; lines 82-89 start streaming the audio from a URL.
I've managed to get this running on my iPhone and tested it using an mp3 on another server, and it works fine. However, I've not picked through the code, so I can't help you any more than this, sorry!
Sam
NB: To get the project to compile I had to change the SDK to 3.0. If you right-click on the project name, choose Get Info, and change the option called Base SDK to iPhone Device 3.0, it should work.
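For orientation, the usage in that project boils down to roughly the following; the class and method names come from the linked AudioStreamer project, so verify them against the version you download, and the URL is a placeholder:

    #import "AudioStreamer.h" // class from the linked project

    // Placeholder URL of the mp3 sitting on your server.
    NSURL *streamURL = [NSURL URLWithString:@"http://www.example.com/audio/track.mp3"];
    AudioStreamer *streamer = [[AudioStreamer alloc] initWithURL:streamURL];
    [streamer start]; // begins buffering and playing the stream

    // ...later, when you want to stop playback:
    [streamer stop];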
There are a couple of ways to get the file playing on the iPhone, but the first problem is that you need to decide how to serve the file from your server.
One great way is to share the file out via HTTP using a web server. If the server is Windows, look into IIS. If it's a Mac or Linux box, Apache is your friend.
Once you've got the serving going, here are the options on the iPhone:
1) Use iPhone Safari to navigate to http://your-server/your-folder/the-file.ext. If the serving is set up correctly, it'll open the media player and stream the file.
2) Write an iPhone application that uses the MediaPlayer or AVFoundation framework (MPMoviePlayerController or AVPlayer) to play the file. Non-trivial, but there are plenty of samples.
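A hedged sketch of option 2 using MPMoviePlayerController (the URL is a placeholder for wherever your web server ends up serving the file):

    #import <MediaPlayer/MediaPlayer.h>

    // Placeholder URL for the file served by IIS/Apache as described above.
    NSURL *audioURL = [NSURL URLWithString:@"http://your-server/your-folder/the-file.mp3"];
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:audioURL];
    player.movieSourceType = MPMovieSourceTypeStreaming; // stream rather than download the whole file
    [player prepareToPlay];
    [player play];

Keep the player in a property (not a local variable) so it isn't deallocated while the audio is still playing.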