I have been struggling for the past week with how to compile the Tesseract OCR engine for iPhone. I have gone through some links, but I couldn't find the proper way. Can anyone walk me through a step-by-step procedure? Thanks in advance.
I have to agree with you that it is not an easy task, but the links below finally helped me achieve it. Hoping this helps:
http://tinsuke.wordpress.com/2011/11/01/how-to-compile-and-use-tesseract-3-01-on-ios-sdk-5/
http://tinsuke.wordpress.com/2011/02/17/how-to-cross-compiling-libraries-for-ios-armv6armv7i386/
That probably won't be enough. I know nothing about the Tesseract OCR library, but you will need the include directives, plus you must specify the directory where the Tesseract header files are installed via a compiler switch (usually -I), and (possibly) link with the Tesseract library file(s).
That would be enough to compile the sources if the header files are installed into common include directories. However, if the header files do not contain the definitions (not just the declarations) of all functions/variables, then there will be one or more library files (like libtesseract.a and/or libtesseract.so) that must be linked in order to build your binary.
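For illustration, a command-line build might look like this (the paths here are assumptions; adjust them to wherever your Tesseract headers and libraries actually live):
clang++ MyOCR.mm -I/usr/local/include -L/usr/local/lib -ltesseract -o MyOCR
The -I switch points the compiler at the headers, -L points the linker at the library directory, and -ltesseract links against libtesseract.a/.so.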
Related
I have a fresh Java Card and two .cap files!
One of them installed on the card successfully (and was deleted successfully!), but the other one failed to upload and install. I have put the output of "GPJ" below. What's wrong with the .cap file, and how can I fix it?
I made the cap file with Eclipse, and its source code is at this link.
And this is the output of 'gpj':
Thank you.
There is a wide range of possibilities for this error, as smart card return codes are not very specific. My thoughts would be:
Check for the correct Java Card version (the CAP file must be compiled for a version supported by your card; a quick way to check this is shown after this list).
Your code may use an API call that is not supported by your card (it is the manufacturer's decision which algorithms are supported).
Try reading the GlobalPlatform specification for these detailed questions.
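Since a CAP file is just a ZIP archive, one quick (hedged) way to inspect the format version it was built for is to dump its Header component; the member path inside the archive varies per package, so the wildcard below is an assumption:
unzip -p yourapplet.cap '*/Header.cap' | xxd | head -2
The two bytes immediately after the DECAFFED magic are the minor and major CAP file format version, which must be one your card's Java Card runtime accepts.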
I have a project with lots of files, around 4,500 files in total, which I have separated into 40 library projects. Now my problem is that it takes too much time to compile. I increased the memory for Flash Builder, which gave me a small improvement. I am sure there are many files that are not used in my project, so I want to remove them with a plugin for Flash Builder / Eclipse, because it is too much of a headache to check manually whether an "xyz.as/mxml" file is used in any other file, and if it is used in, say, "abx.as/mxml", whether "abx.as/mxml" is itself useful or used anywhere else. If you have any idea or hint, please share it. Thanks in advance.
One thing you can try is to use the size report of your Flex compilation: flex compiler options
This way you will have an idea of which classes are really used in your libraries, and therefore which ones aren't, because the Flex compiler only links in the classes your compiled SWF really needs.
It's not ideal, but it can avoid a lot of manual-process pain.
Add this param to your compiler:
-link-report output.xml
This information will help you.
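If you compile from the command line rather than from Flash Builder, the equivalent (assuming the standard mxmlc compiler and a hypothetical Main.mxml) would be:
mxmlc -link-report=output.xml Main.mxml
The generated XML lists every definition the compiler actually linked into the SWF, so any class absent from the report is a candidate for removal.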
I'm not sure if it would work for Flash Builder, but for Java there is the UCDetector.
I have RestClientLibrary and UserFunctionsLibrary.
UserFunctionsLibrary needs RestClientLibrary in order to function.
When I compile these down to libRestClientLibrary.a and libUserFunctionsLibrary.a how will they be able to interact with each other?
In Xcode I currently have the User Header Search Paths set to find the .h files, and I have linked UserFunctionsLibrary against the RestClientLibrary binary. However, when the libraries are distributed, other users may have different setups and such, and I can't see that it will work.
Thanks for any insight you can give me.
Those .a files are just library files. They will need to be linked together to actually be used. The linker will handle resolving all the symbols from RestClientLibrary into UserFunctionsLibrary.
As far as other users, they will have to configure their system in a way such that both libraries are passed to the linker.
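For example, a consumer building from the command line would simply pass both archives to the linker (the paths and target names here are assumptions):
clang main.m -framework Foundation -L/path/to/libs -lUserFunctionsLibrary -lRestClientLibrary -o MyApp
Note that order can matter with static libraries: an archive should appear after the code that references its symbols, which is why RestClientLibrary comes last.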
I am trying to create an app that will allow me to stream video FROM the iPhone TO a server. My current theory on how to do this is to create a series of FFmpeg files and send them to the server. As far as I can tell, I have compiled the FFmpeg library correctly for the iPhone.
I followed these instructions here. A series of executable files appeared in the folder, so I'm assuming it worked.
My question is: now what? How do I get these into an app? How do I make calls to these executable files? And most importantly, will this even work the way I want it to?
You have built the ffmpeg binary, which can run on an iPhone. You cannot run executables from an app on a (non-jailbroken) phone. So you would have to compile the library and link against that. Then, from your app, call the relevant functions directly, mimicking what the ffmpeg program does.
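As a minimal sketch of calling the library directly (assuming the headers and static libs from your FFmpeg build are on your search paths; these are the pre-4.0 API names from roughly that era):
#include "libavformat/avformat.h"

// Roughly what `ffmpeg -i somefile` does before any transcoding:
// open the input and dump its streams/codecs to the log.
int probe_file(const char *path) {
    AVFormatContext *fmt = NULL;
    av_register_all();                       // one-time global init (pre-4.0 API)
    if (avformat_open_input(&fmt, path, NULL, NULL) != 0)
        return -1;                           // could not open input
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;                           // could not read stream metadata
    }
    av_dump_format(fmt, 0, path, 0);         // prints stream/codec info to stderr
    avformat_close_input(&fmt);
    return 0;
}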
Although this issue is quite old, this could help other users in the future:
Just take a look at the source code here http://dev.wunderground.com/support/wunderradio/wunderradio.1.9lgpl.zip
Good luck
OK, you said that you successfully compiled FFmpeg for iPhone, right? As mvds said, you can't use them as executable files. So in order to use these libraries after compilation finishes, you need to copy all the generated .a libs to your project (just as when you add other libraries or frameworks). These libraries are:
libavcodec.a
libavfilter.a
libavutil.a
libswscale.a
libavdevice.a
libavformat.a
libswresample.a
Then you have to configure your project:
Click on your project -> Build Settings
Search for "Header Search Paths" and add the folder location of your libraries (the location can be absolute or relative)
Click on Build Phases -> Link Binary With Libraries -> Add Other, and add all the .a files
Voila! Now you can import and use the FFmpeg libraries in your project:
#include <avcodec.h>   // resolves if "Header Search Paths" includes the libavcodec folder itself; with only the FFmpeg root on the path, use "libavcodec/avcodec.h"
#include ...
// More C and/or Objective-C Code
To access individual uncompressed frames you can use the captureOutput:didOutputSampleBuffer:fromConnection: delegate method of AVCaptureVideoDataOutput (there are plenty of examples), and then somehow encode them to H.264, maybe using an AVFrame. As far as I know, FFmpeg can also stream over RTSP for live streaming, but it seems the documentation for that is close to zero :(
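For the encoding step, here is a hedged sketch of pushing one already-filled AVFrame through the H.264 encoder, using the old encode API of that era (function and constant names changed across FFmpeg versions, and the encoder is only present if your build included x264):
#include "libavcodec/avcodec.h"

// Encode one raw YUV420P frame into a compressed H.264 packet.
int encode_one_frame(AVFrame *frame, int width, int height) {
    avcodec_register_all();                        // one-time global init (pre-4.0 API)
    AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    if (!codec)
        return -1;                                 // FFmpeg was built without an H.264 encoder
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    ctx->width = width;
    ctx->height = height;
    ctx->time_base = (AVRational){1, 30};          // 30 fps
    ctx->pix_fmt = AV_PIX_FMT_YUV420P;             // what the H.264 encoder expects
    if (avcodec_open2(ctx, codec, NULL) < 0)
        return -1;

    AVPacket pkt;
    av_init_packet(&pkt);
    pkt.data = NULL;                               // let the encoder allocate the buffer
    pkt.size = 0;
    int got_packet = 0;
    if (avcodec_encode_video2(ctx, &pkt, frame, &got_packet) == 0 && got_packet) {
        // pkt.data / pkt.size is the compressed payload; shipping it to the
        // server (over a socket, an RTSP library, etc.) is up to your app
        av_free_packet(&pkt);
    }
    avcodec_close(ctx);
    av_free(ctx);
    return 0;
}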
To answer your final question:
and most importantly will this even work the way i want it to?
The answer is yes, it can work. I found two libraries that do just what you want to achieve:
http://www.foxitsolutions.com/iphone_h264_sdk.html
http://ios-rtmp-library.com
Both use FFmpeg the same way you are suggesting. This question is a little old, but I have found many users trying to achieve this, so I have a question: did you have success doing this? Can you share your experience or recommendations?
I have been trying to integrate GSL (the GNU Scientific Library) into an iPhone project using Xcode.
The challenge is that GSL keeps its modules in different folders, yet their header files reference each other in the installed form, #include <gsl/gsl_something.h>, instead of #include "gsl_something.h" or #include <gsl_something.h>.
At least with the plain gsl_something.h form I could use Xcode's recursive header file search to find it. But with the <gsl/...> form, it's basically file not found.
Does anyone have an easy way to deal with the hierarchical GSL structure so it compiles in Xcode?
I can do it the tedious way of fixing all the #include lines, but I'm hoping there is a better alternative; that way, I can more easily update GSL when the community makes changes.
NOTE: I found that Xcode needs to find the right header file locations (make sure to adjust your TARGET build settings, not just your Project build settings).
You may be able to use grep and sed to replace all of the #include lines in the GSL package.
Using grep and sed to find and replace a string
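For example, run from the top of the GSL source tree, something like this one-liner could rewrite the installed-style includes into plain quoted ones (a sketch for the BSD sed that ships with macOS; back up the tree first):
grep -rl '#include <gsl/' . | xargs sed -i '' 's|#include <gsl/\(.*\)>|#include "\1"|'
With the gsl/ prefix stripped, Xcode's recursive header search should then locate each header regardless of which module folder it lives in.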