In the iOS Human Interface Guidelines, under the Live Photos section, Apple says this:
"Make sure that users can distinguish a Live Photo from a traditional
still photo. It’s especially important to help users make this
distinction when they can share the photo. The best way to show users
that they’re viewing a Live Photo is to display a little movement that
gives a hint of the experience. In cases where a hint isn’t possible,
you can display the system-provided badge on the Live Photo. A Live
Photo never displays a playback button that looks like a video
playback button."
I'm a little confused about how to provide that hint of movement in my Live Photo. How do I do it?
You don't need to dissect a Live Photo into still frames and construct an animated UIImage, or dig out the Live Photo's movie file... It's much simpler.
Display your user's Live Photo content in a PHLivePhotoView.
Call startPlaybackWithStyle: and pass
.Hint for the playback style to get the "hint" that the HIG is talking about.
There's no step three.
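A minimal sketch of those two steps in Swift (the modern API spelling is `startPlayback(with:)`; the function name and the idea of triggering the hint when the view appears are my own illustration, not part of the HIG):

```swift
import UIKit
import PhotosUI

// Sketch: display a Live Photo and play the brief "hint" motion the HIG
// describes, e.g. when the view first appears on screen.
func showHint(for livePhoto: PHLivePhoto, in container: UIView) {
    let liveView = PHLivePhotoView(frame: container.bounds)
    liveView.livePhoto = livePhoto
    container.addSubview(liveView)

    // .hint plays just a short flicker of movement;
    // .full would play the whole Live Photo with sound.
    liveView.startPlayback(with: .hint)
}
```

In a photo grid you would typically call this from something like `viewDidAppear` or when a cell scrolls on screen, so the user sees the motion cue without having to interact first.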
Related
I'm adding a thumbnail photo gallery to my iPhone app, which will include a photo browser like the one in the iPhone Photos app plus a detail view for each photo. So far the only thing I've found that provides this is Three20 -- the thing is that Three20 is huge and has a complex-looking API that, as far as I can tell, isn't a standard Apple-style interface.
I've also seen this post Open source photo viewer for IPhone which mentions a couple of other photo browsers -- but these don't have thumbnail galleries built in.
My question is: does anybody know of a good combined photo-browser and thumbnail-gallery library that is easy to use, not excessively large, and easy to customize? Caching is also crucial.
I have found an answer to my own question: KTPhotoBrowser, which fits the bill exactly -- just a thumbnail gallery and photo browser. Combined with SDWebImage, which handles image caching, it appears to be a perfect, simple solution that doesn't require 2 MB of code!
Update: it's quick to install, works with a tab bar controller (example app provided!), and works like a charm. It may not have all the features of Three20, but it's a lot easier to use and has a much smaller footprint. Tab bar controller woes made the final decision: I spent several hours trying to get the Three20 photo viewer to play nice with my tab controller app -- and failed! After only half an hour of futzing I got KTPhotoBrowser working, using their tab sample app as a reference point.
I know Three20 looks huge, but from my point of view it is the best. I also had problems getting started, but I was able to make it work. I have sample code here on Stack Overflow and on GitHub. Check it out; you will see that it is exactly what you need.
I've designed the User Interface of an iPhone app and I wish to show an online demo of that consisting for the moment of a series of static images representing the main steps of the app.
What do you think is the best way to do this simulation?
You know, something like a series of single web pages, optimized for mobile, each containing a single image that links to the next step -- but I was wondering whether a more elegant and sophisticated solution exists, with transition effects, for example, or other features.
I hope I was clear enough :)
Any help will be sincerely appreciated.
Thanks in advance for your attention.
This sounds like a good use for Briefs (see the Briefs App website). It pretty much allows you to create an interface and step through it as if it were an application. I believe you'll need a developer account to run the app that reads the brief on your phone (since it wasn't able to be released in the App Store).
An alternative to static images would be to make a video. I use the iShowU video screen capture tool and set it to record the iPhone/iPad simulator window. I then run through the screens, type inputs, etc. In addition to recording the video, the program records my voice as I narrate the app's features.
As to transition effects, the video will capture whatever transition animations are in your program.
In the end you have a video that you could give your user, put on YouTube, or whatever.
You can do this easily and for free on AppDemoStore. You just have to upload the app screenshots and then add hotspots which are used for the navigation through the demo.
AppDemoStore also offers the sophisticated features you are asking for:
iPhone-specific transition effects such as slide up/down/left/right, fade, and flip
gesture icons for the hotspots
text boxes and callouts
multiple hotspots on a screen in order to create a simulation of the app (and not just a linear demo)
Here's a sample demo: http://www.appdemostore.com/demo?id=1699008
Moreover, demos created on AppDemoStore run in any browser or mobile device and can be embedded in your web page or blog (as you would with a YouTube video). With the free account, you can create up to 10 demos with unlimited screenshots and all the features listed above.
Regards,
Daniel
I'm using sample code from Apple (SquareCam), and I would like to figure out how I can take the photos captured in that app and view them in a library (like the Photos app) without including all the other photos the user has taken elsewhere, such as with the regular camera.
I'm not too great at making apps yet, so I'm pretty noobish.
I've gotten as far as opening the photo library, but nothing else: no viewing individual photos, and no options such as emailing or messaging.
I don't think you can do that. It's either all photos or no photos...
You can, of course, save the photo in your own documents directory and write your own photo viewer -- there's quite a bit of code and several frameworks around that do most of the job; see, e.g., the answers to this SO question:
Open source photo viewer for IPhone
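The "save in your own documents directory" approach can be sketched in a few lines of Swift. The `Photos` subdirectory and the helper names below are assumptions for illustration, not part of SquareCam:

```swift
import Foundation

// Sketch: persist captured image data into the app's own Documents
// directory, so a custom viewer sees only this app's photos and not
// the user's whole camera roll.
func savePhotoData(_ data: Data, named name: String, in documentsURL: URL) throws -> URL {
    let photosDir = documentsURL.appendingPathComponent("Photos", isDirectory: true)
    // Create the Photos subdirectory on first use.
    try FileManager.default.createDirectory(at: photosDir, withIntermediateDirectories: true)
    let fileURL = photosDir.appendingPathComponent(name).appendingPathExtension("jpg")
    try data.write(to: fileURL, options: .atomic)
    return fileURL
}

// List the photos this app has saved, sorted by filename.
func listSavedPhotos(in documentsURL: URL) throws -> [URL] {
    let photosDir = documentsURL.appendingPathComponent("Photos", isDirectory: true)
    return try FileManager.default
        .contentsOfDirectory(at: photosDir, includingPropertiesForKeys: nil)
        .filter { $0.pathExtension == "jpg" }
        .sorted { $0.lastPathComponent < $1.lastPathComponent }
}
```

In the app itself, `documentsURL` would come from `FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)`, and the viewer would simply load the files that `listSavedPhotos` returns.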
I was wondering if anyone had an idea as to how the people-tagging feature works in Facebook's iPhone app, i.e., in the app you can touch the photo and then associate that touch point with a Facebook friend. Specifically, I was wondering whether this is as simple as associating coordinates on the image with a data object (a Facebook friend, in this case), or whether some smarter image recognition runs in the background to work out which other areas of the photo may also belong to that person -- i.e., does the tag extend beyond the point touched on the screen? If the latter is the case, is anyone familiar with the techniques used?
Thanks in advance
Dave
I don't think they are using face-recognition algorithms on the iPhone, since that is processor-intensive, especially if you have hundreds of friends. If you want to do face recognition against the faces of people you want to search for, you should do it on the server: after you take or import a photo, send it to your server, which searches for the faces and returns JSON with points for the faces and data for matched users. Then build your UI to present it on screen for the user.
Edit
If you want to try face recognition on the iPhone itself, see this: Face recognition on iOS
How to change wallpaper in iPad programmatically?
You can't do this. The user decides what wallpaper they want, and has one single place to change it. You can, however, add new items to the user's saved photos list, which they can choose from when they change their wallpaper.
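Adding to the saved photos list is a one-liner in UIKit; a minimal sketch, assuming you already have a `UIImage` in hand:

```swift
import UIKit

// Sketch: save an image into the user's Saved Photos album so they can
// pick it as wallpaper themselves from the Settings app.
func addToSavedPhotos(_ image: UIImage) {
    // The last three parameters are an optional completion target,
    // selector, and context pointer; passing nil means "fire and forget".
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}
```

Note that saving to the photo album prompts the user for photo-library permission the first time it is called.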
Apparently, it is possible; see this link:
You can save pictures to your iPad's photo album or directly to wallpaper -- a neat feature.
Quite likely it's a private API that only Apple can use in their iAds, but I'm still interested in how they accomplish it.
As far as I know, iAds are basically just little HTML5 web sites, so the call would be part of the HTML code. It shouldn't be too difficult to figure out what's going on.