Mobile-friendly photo upload / crop - filepicker.io

We're using filepicker for users to choose a profile photo. We need to crop the photo to a square. We had been doing this automatically, but now want to allow users to choose the crop region.
We tried using jCrop as you do on your homepage, but it didn't work well on mobile. Can you point us to any canonical examples of pick, crop, and store that work well on mobile?

Mobile cropping gets a bit tricky, as many cropping libraries don't have good support for touchmove events, or render without using WebGL and are therefore not very responsive. Aviary (http://www.aviary.com/) may be your best bet.
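Whatever picker/cropper you end up with, the final step is usually the same: take the square region the user selected and produce a cropped image to store. Below is a minimal sketch, not specific to filepicker or Aviary, that applies a square crop region with an HTML canvas; the CropRegion shape, the cropSquare helper, and the output size are illustrative assumptions.

```typescript
// Apply a user-chosen square crop to an already-loaded image and return a
// JPEG blob ready to upload. Uses only standard DOM/canvas APIs.
interface CropRegion {
  x: number;     // top-left of the selection, in image pixels
  y: number;
  size: number;  // side length of the square selection, in image pixels
}

function cropSquare(img: HTMLImageElement, region: CropRegion, outputSize = 512): Promise<Blob> {
  const canvas = document.createElement("canvas");
  canvas.width = outputSize;
  canvas.height = outputSize;
  const ctx = canvas.getContext("2d");
  if (!ctx) throw new Error("2D canvas not supported");

  // Draw only the selected square region, scaled to the output size.
  ctx.drawImage(img, region.x, region.y, region.size, region.size, 0, 0, outputSize, outputSize);

  return new Promise((resolve, reject) => {
    canvas.toBlob(b => (b ? resolve(b) : reject(new Error("toBlob failed"))), "image/jpeg", 0.9);
  });
}
```

The resulting blob can then be sent to your storage backend (or handed back to whatever upload service you use) in place of the automatically cropped image.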

Related

Can I create an AR experience using Facebook that will play entire songs and videos?

I'm new to the Facebook AR Studio and wanted to know if I can add entire songs and videos to a marker, and whether that marker can be a simple logo or graphic.
AR Studio does not support markers at the moment. As far as I know, there is no information about when this feature will be available.
Regarding playing songs and videos, the only limitation I think you will find is the size of the effect. Facebook asks developers to limit the effect file size to 2MB.

What are the basic steps I should follow to make a photorealistic avatar?

Can you help me find the right step-by-step process for rendering a photorealistic avatar on the web?
Requirements:
I need an avatar for a men's tailor-made suit that renders photorealistic fabric.
I need the avatar to rotate in a circular motion seamlessly.
I need the avatar to be usable in a web browser, i.e. rendered on the web.
How can I make it API-based?
Would it be easier if I just made this an app, or would the browser need to download a plugin to render it smoothly?
Thanks in advance.
Use DAZ 3D, ZBrush, Mudbox, or your skill in another 3D package. If you're good, your avatar will be good.
Make an animation of your avatar turning 360 degrees and loop it in Unity.
Use Unity to build a WebGL app.
You can communicate from the browser to your WebGL app and design the API however you like (a sketch is below). If you explain your API needs, I can be more precise.
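As a rough illustration of the browser-to-Unity part, here is a sketch assuming a Unity WebGL build that exposes the standard SendMessage call on the instance the Unity loader returns. The "AvatarController" object and its method names are hypothetical; you would define them in your own Unity scripts.

```typescript
// The Unity WebGL loader gives you an instance object with SendMessage,
// which forwards a call to a named GameObject/method inside the build.
declare const unityInstance: {
  SendMessage(gameObject: string, method: string, value?: string | number): void;
};

// Tell the avatar which fabric to show and start the looping 360-degree turn.
function showSuit(fabricId: string): void {
  unityInstance.SendMessage("AvatarController", "SetFabric", fabricId);
  unityInstance.SendMessage("AvatarController", "StartTurnAnimation");
}
```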
Thank you so much for your insights. The API needs are:
The garment should be rendered on the server.
The website will send an API request with parameters to the server; the rendering server will parse the parameters and render the garment based on them. Sample parameters are fabric, collar style, and cuff.
Thank you in advance.
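For that server-side flow, the client call could look something like the sketch below. Only the parameter names (fabric, collar style, cuff) come from your description; the /render endpoint, the host, and the response shape are assumptions you would replace with your own.

```typescript
// Hypothetical request to the rendering server described above.
interface RenderRequest {
  fabric: string;
  collarStyle: string;
  cuff: string;
}

async function requestGarmentRender(params: RenderRequest): Promise<string> {
  const res = await fetch("https://render.example.com/render", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(params),
  });
  if (!res.ok) throw new Error(`Render failed: ${res.status}`);

  // Assume the server answers with a URL to the rendered image or asset.
  const { assetUrl } = await res.json();
  return assetUrl;
}
```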

How can you enable the iOS 6.0 panorama camera within an application?

I'm working on an app for iOS 6 that utilizes the camera, but I've been unable to figure out how to enable the options which would allow the user to take a panoramic shot. I've got everything working as far as taking a normal photo goes, but would like to give the user the option to use the new panorama feature in iOS 6 in the app.
I've scoured the net and not been able to find any information (SO, Apple Dev Center, etc). I'm unable to determine whether this is even a possibility or not. Can we use this, and if so, how?
This is not possible. Access to the camera is only possible via UIImagePickerController or through the AVFoundation framework, and neither of these provides this functionality.
AVFoundation gives you the raw data stream from the camera, and you have to handle it yourself. Unless you want to implement your own panorama stitching, this isn't appropriate.
Whilst there are settings in UIImagePickerController to adjust the flash, switch between video and still capture, and choose the rear or front camera, there is none to enable the native panorama feature.
In a similar vein to HDR photographs, panoramic capture is only available within the built-in Camera app.

iPhone User Interface steps online demo

I've designed the user interface of an iPhone app and I wish to show an online demo of it, consisting for the moment of a series of static images representing the main steps of the app.
In your opinion, what is the best way to do this simulation?
I'm thinking of something like a series of single web pages, optimized for mobile, each containing a single image linking to the next step, but I was wondering whether a more elegant and sophisticated solution exists, with transition effects for example, or other features.
I hope I was clear enough :)
Any help will be sincerely appreciated.
Thanks in advance for your attention.
This sounds like a good use for Briefs (Briefs App Website). It pretty much allows you to create an interface and step through it as if it were an application. I believe you'll need a developer account to run the app that reads the brief on your phone (since it wasn't able to be released in the App Store).
An alternative to static images would be to make a video. I use the iShowU video screen capture tool and set it to record the iPhone/iPad simulator window. I then run through the screens, type inputs, etc. In addition to recording the video, the program records my voice as I narrate the app's features.
As to transition effects, the video will capture whatever transition animations are in your program.
In the end you have a video that you could give your user, put on YouTube, or whatever.
You can do this easily and for free on AppDemoStore. You just have to upload the app screenshots and then add hotspots which are used for the navigation through the demo.
AppDemoStore also offers the sophisticated features you are asking for:
iPhone specific transition effects such as slide up/down/left/right, fade and flip
gesture icons for the hotspots
text boxes and callouts
multiple hotspots on a screen in order to create a simulation of the app (and not just a linear demo)
Here's a sample demo: http://www.appdemostore.com/demo?id=1699008
Moreover, the demos created on AppDemoStore run in any browser and on any mobile device, and can be embedded in your webpage or blog (just as you would with a YouTube video). With the free account, you can create up to 10 demos with unlimited screenshots and all the features specified above.
Regards,
Daniel

How to add a tag overlay to a photo in iOS, like Facebook does

I was wondering if anyone had an idea as to how the people-tagging feature works in Facebook's iPhone app, i.e. in the app you can touch the photo and then associate that touch point with a Facebook friend. Specifically, I was wondering whether this is just as simple as associating coordinates on the image with a data object (a Facebook friend in this case) on the iPhone, or whether they are doing some smarter image recognition in the background to work out which other areas of the photo may also belong to that person, i.e. does the tag extend beyond the point touched on the screen? If the latter is the case, is anyone familiar with the techniques used?
Thanks in advance
Dave
I don't think they are using face recognition algorithms on the iPhone, since that is processor-intensive, especially if you have hundreds of friends. If you want to do face recognition and you have the faces of the people you want to search for, you should do it on the server: after you take or import a photo, send it to your server, which searches for the faces and returns JSON with points for the faces and data for the matched users (a sketch of such a response is below). Then build your UI to present it on the screen for the user.
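To make the shape of that flow concrete, here is a sketch of what the server response and the client-side hit test could look like. Every field name here is a hypothetical example, not Facebook's API; the simpler alternative (no recognition at all) is just storing the raw tap coordinate with the friend's ID.

```typescript
// Hypothetical response from a server-side face detection/recognition step.
interface DetectedFace {
  x: number;              // face centre, normalized 0..1 of image width
  y: number;              // face centre, normalized 0..1 of image height
  radius: number;         // rough face radius, normalized
  matchedUserId?: string; // present when the server recognized a friend
}

interface TagResponse {
  faces: DetectedFace[];
}

// Map a tap to the detected face it falls on, or null if it hits nothing,
// in which case the tag is simply the raw coordinate the user touched.
function faceAtTap(tap: { x: number; y: number }, response: TagResponse): DetectedFace | null {
  for (const face of response.faces) {
    if (Math.hypot(tap.x - face.x, tap.y - face.y) <= face.radius) return face;
  }
  return null;
}
```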
Edit
If you want to use face recognition on the iPhone, try this: Face recognition iOS