Translate image into plain text - iOS App [closed]

I want to create an app where the user takes a photo and the app translates that image into text based on what the user photographed. For example: if the user took a picture of a book, the app would translate it into plain text.
Where would I start with something like this?

If you 'just' want to read an image of text into a string, look for OCR.
If you really want a computer to describe what's in the picture: with some background in computer vision, I feel qualified to state that that's not possible with current technology.
So, if you want that, what are your options? You could do what MealSnap does and use cheap micro-labor from Mechanical Turk.
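If you go the OCR route, note that Apple's Vision framework (iOS 13+, well after these answers were written) now does on-device text recognition. A minimal hedged sketch; the helper name is illustrative, not a library API:

```swift
import Vision
import UIKit

// Minimal OCR sketch with Apple's Vision framework (iOS 13+).
// recognizeText is an illustrative helper name, not an API the answers mention.
func recognizeText(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { completion(""); return }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion("")
            return
        }
        // Take the top candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```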

Check out IQEngines. It combines computer vision and crowdsourcing to figure out what's in the image.
We use it at Voxy (a language learning company) to help users create flashcards when they're learning English and don't know what the word is.

I have used an OCR SDK in one of my apps - http://www.abbyy.com/Default.aspx?DN=5b0ab341-0c6e-4119-a824-c652b9e888f4
And here is a REST-based API for OCR - http://www.wisetrend.com/wisetrend_ocr_cloud.shtml
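Calling a cloud OCR service like the ones above usually comes down to an HTTP upload of the image. The endpoint, header, and response handling below are placeholders, not the actual ABBYY or WiseTrend API; a rough URLSession sketch:

```swift
import Foundation
import UIKit

// Rough sketch of posting an image to a cloud OCR endpoint with URLSession.
// The URL, auth header, and response format are placeholders; consult the
// provider's real API documentation for the actual parameter names.
func uploadForOCR(_ image: UIImage, completion: @escaping (String?) -> Void) {
    guard let jpegData = image.jpegData(compressionQuality: 0.8),
          let url = URL(string: "https://ocr.example.com/v1/recognize") else { // placeholder
        completion(nil)
        return
    }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
    request.setValue("Bearer YOUR_API_KEY", forHTTPHeaderField: "Authorization") // placeholder

    URLSession.shared.uploadTask(with: request, from: jpegData) { data, _, error in
        guard error == nil, let data = data,
              let text = String(data: data, encoding: .utf8) else {
            completion(nil)
            return
        }
        completion(text) // real services typically return JSON rather than plain text
    }.resume()
}
```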

Related

How do I get started on developing a Pandora-like app on iOS? [closed]

I am a newbie iOS developer. Could anybody point me in the right direction for developing a Pandora-like app?
However, unlike Pandora, the user should be able to view/play from the catalog or use a recommendation engine. At this moment I am not worried about the recommendation engine. Basically, I want the app to present a collection of sound files and play a selected MP3 or an entire album (play, pause, shuffle, repeat, etc.). I want the melody catalog to be easily maintainable. I guess I will not store files locally; they will have to be streamed from the server (HTTP Live Streaming?). The list will keep getting bigger, and I will be the only person adding tracks.
What frameworks/libraries/documentation should I read up on?
Anyway, I learn better by doing, so even though this is not an easy task, I decided to take it on.
Thanks!
Good choice... learning by doing.
What I found useful in getting started with iOS was the online Stanford class and the Apple documentation concerning audio:
Stanford Youtube Course
Apple Documentation
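For the streaming-playback part of the question specifically, a minimal sketch using AVFoundation's AVPlayer, which can play a remote MP3 or an HLS playlist URL directly; the URL below is a placeholder:

```swift
import AVFoundation

// Minimal sketch: stream a remote MP3 (or HLS playlist) with AVPlayer.
// The URL is a placeholder for wherever your catalog serves its tracks.
final class TrackPlayer {
    private var player: AVPlayer?

    func play(urlString: String) {
        guard let url = URL(string: urlString) else { return }
        // Configure the audio session for playback before starting.
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)

        player = AVPlayer(url: url)
        player?.play()
    }

    func pause() { player?.pause() }
}

// Usage (placeholder URL):
// let player = TrackPlayer()
// player.play(urlString: "https://example.com/tracks/song.mp3")
```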

iPhone App - Coffee Cup Recognition [closed]

I want to build an iOS application that recognizes patterns in a cup of coffee.
For example http://en.wikipedia.org/wiki/File:Coffereading.jpg .
This image recognition script could run on the client side (iPhone, eventually) or on the server side.
The goal of the app is to take a picture of a cup of coffee, analyse the patterns, compare them with images/patterns already stored in a database, and return the most appropriate match.
I have no experience in this field, and after doing some research I found some libraries that might help me do this: OpenCV, Kooaba, SnapTell, and server-side libraries like AForge.NET.
This confused me a lot. I want to know whether this is possible using libraries like the ones above (or any others), and how much time/effort would be needed to modify one of them in order to achieve my goal.
You can also read more about fortune reading (tasseography) under this link: http://en.wikipedia.org/wiki/Tasseography
In short: it can be done with OpenCV. You can use it to recognize the coffee cup (circle and/or square detection). After that you have to take a look at "feature detection" (SURF is a good way to go) to match the coffee patterns.
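The answer above points at OpenCV feature matching (SURF). As a hedged alternative sketch that stays on iOS, Apple's Vision framework (iOS 13+) can compare a photo against stored reference images via feature prints; the helper names are illustrative, and this is not a drop-in substitute for proper SURF keypoint matching:

```swift
import Vision
import UIKit

// Compute a Vision feature print for an image; smaller distances between
// feature prints mean more visually similar images.
func featurePrint(for image: UIImage) throws -> VNFeaturePrintObservation? {
    guard let cgImage = image.cgImage else { return nil }
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

// Distance between two feature prints; the caller would pick the reference
// image (from the pattern database) with the smallest distance to the photo.
func distance(between a: VNFeaturePrintObservation,
              and b: VNFeaturePrintObservation) throws -> Float {
    var result: Float = 0
    try a.computeDistance(&result, to: b)
    return result
}
```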
Yes, it can be done. But it's a huge topic and very complex.
Your best bet is to license third party code that will handle all the difficulties for you.
Take a look at this library available for iOS by Metaio:
metaio | Software | Augmented Reality

Take multiple pictures and edit them to form a video [closed]

Can you automatically take multiple pictures and then edit them in real time in an iPhone app? Say you want to make a video of a man and add a beard to him: can you do that while streaming, with the moustache moving according to where his face is detected, at roughly 5-15 frames per second?
I guess you can.
It would involve, at the very least, tracking some facial features or markers added for the purpose. However, this is such a vast and complex field that you'll hardly get a single piece of advice here that will get you going.
If you really mean it, I'd suggest looking into Augmented Reality libraries; there are a few out there. Most of them work by tracking a special pattern rather than arbitrary features, though, so be prepared for a big load of work.
Check this SO question for a few first hints; you'll find more information on the topic easily through the search engine of your choice.
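For the face-tracking part specifically, here is a minimal sketch using Apple's Vision framework (which postdates this question) to find face rectangles in a single camera frame; the function name is illustrative, and the overlay drawing is left out:

```swift
import Vision
import CoreVideo

// Minimal sketch: detect face rectangles in one camera frame
// (e.g. a CVPixelBuffer delivered by AVCaptureVideoDataOutput),
// so an overlay such as a moustache image could be positioned over the face.
func detectFaces(in pixelBuffer: CVPixelBuffer,
                 completion: @escaping ([CGRect]) -> Void) {
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else {
            completion([])
            return
        }
        // boundingBox is in normalized coordinates (0...1, origin bottom-left);
        // the caller converts these to view coordinates before drawing.
        completion(faces.map { $0.boundingBox })
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```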

Creating a login function and search function in iOS [closed]

I've been having some difficulty designing a search and login function for my app. I have a list of videos with titles, and I need to create a search function (much like the YouTube app) to search through the list. What would be the best way to go about this: search the videos before they get fetched and display the results, or search through the array after it has been fetched?
I also need to connect to an Apache server to log in within my app and display user-specific content.
I haven't been able to find any samples on the matter.
Thanks for reading!
Your question is too vague. There are many paths that you can take. Your videos can be searchable on your remote apache server, or your iOS app can download the list of videos and search locally. This depends on the number of videos.
The best way is to have a table view with a list and a search field on top. To get to this table view, the iOS app asks the user to log in. If the credentials are valid, it shows the table/search views.
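For the "fetch the list, then search locally" option, a minimal sketch; the Video type is hypothetical and stands in for whatever your server returns:

```swift
import Foundation

// Hypothetical model for an already-fetched video list.
struct Video {
    let title: String
    let url: URL
}

// Case- and diacritic-insensitive title filter, similar to what you would
// run inside a UISearchController's updateSearchResults(for:) callback.
func search(_ query: String, in videos: [Video]) -> [Video] {
    let trimmed = query.trimmingCharacters(in: .whitespaces)
    guard !trimmed.isEmpty else { return videos }
    return videos.filter {
        $0.title.range(of: trimmed, options: [.caseInsensitive, .diacriticInsensitive]) != nil
    }
}
```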

Cinema app on iPhone [closed]

I'm very new to the field of iPhone development. I need to make an app in which the customer purchases a ticket for a movie by selecting the location, movie, show timing, and seat.
Can anyone help me out with implementing the ticket-purchasing part?
Thanks
Although you haven't asked a specific question and you say nothing about your experience in other programming languages, I can give you some hints:
Look at the WWDC videos on http://developer.apple.com
Understand the Model-View-Controller paradigm
Make sure you really understood the Model-View-Controller paradigm
Start by designing your user interface first. Take your time for this. Make mockups on paper, with each sheet of paper corresponding to a screen; play with them, show them to some colleagues, and watch whether they understand what to do on each screen. You can also do this in Photoshop, Illustrator, or OmniGraffle, but in the early stages paper is a simple option.
Build a mockup of your app on the phone itself. Using UIImageViews you can put the digitally made screens on the phone and see how it looks. Show this version to your customers. If they like the design and features, you can go on to coding.
Code everything
If you want to write a serious application it is good to invest a lot in design. The coding itself is much easier if you know how the app will look in the end, because then you can design the underlying model so that it works well with your UI.
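As a starting point for the Model part of Model-View-Controller in this ticket-purchasing flow, here is a minimal sketch; all type names (Cinema, Show, Booking) are hypothetical and only illustrate how the model might be kept separate from the UI:

```swift
import Foundation

// Hypothetical model types for the ticket-purchasing flow described above.
struct Cinema {
    let name: String
    let city: String
}

struct Show {
    let movieTitle: String
    let cinema: Cinema
    let startTime: Date
    var availableSeats: [String]   // e.g. ["A1", "A2", ...]
}

struct Booking {
    let show: Show
    let seat: String
}

enum BookingError: Error {
    case seatUnavailable
}

// Reserves a seat for a show, returning a Booking or throwing if it's taken.
// A real app would do this against a server, not on a local struct.
func book(seat: String, for show: inout Show) throws -> Booking {
    guard let index = show.availableSeats.firstIndex(of: seat) else {
        throw BookingError.seatUnavailable
    }
    show.availableSeats.remove(at: index)
    return Booking(show: show, seat: seat)
}
```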