I have to use two native libraries: one is my own and the other is 3rd-party. As long as I used them in separate projects, everything was OK. But now that both are in one project, I'm getting the exception Ljava/lang/UnsatisfiedLinkError.
I'm using Eclipse.
I found out that if I place the existing (prebuilt) library in libs/armeabi, Eclipse starts compiling the native code and fails. If I rebuild the JNI part from the command line, the compilation succeeds, but the 3rd-party library disappears from libs/armeabi. Really stupid.
So how do I tell Eclipse to use an existing .so library along with a library that must be built? The libraries are independent.
The NDK allows for linking with prebuilt user libraries, using the PREBUILT_SHARED_LIBRARY variable.
Assuming that the library you need to link against is librandom.so, create a libs folder inside the jni subfolder of your project:
mkdir -p jni/libs
cp librandom.so jni/libs
Then, just create a jni/libs/Android.mk file:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := random
LOCAL_SRC_FILES := librandom.so
include $(PREBUILT_SHARED_LIBRARY)
You can create a section for each prebuilt library, all placed in jni/libs.
Next, you just need to include the above file in your jni/Android.mk to get things to work. The NDK docs recommend doing this at the end of the Android.mk rather than in the middle:
include $(LOCAL_PATH)/libs/Android.mk
However, you'll need to do this before the module that requires this library.
For linking, add the following to the section of the module that links against the prebuilt library:
LOCAL_SHARED_LIBRARIES := random
Then, when you run ndk-build, it will copy this library into libs/armeabi/ before building the module, and you're good to go.
Note: this does not solve problems with required headers. You'll still need to add the location of the library's headers to LOCAL_C_INCLUDES in the module that requires them.
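Putting the pieces together, the section of jni/Android.mk for the module that uses the prebuilt library could look roughly like this (the module name, source file and header path are illustrative, not taken from the question):
MY_LOCAL_PATH := $(call my-dir)

# pull in the prebuilt definition(s) so the 'random' module is known
include $(MY_LOCAL_PATH)/libs/Android.mk

# the include above changes what my-dir/LOCAL_PATH refer to, so restore it
LOCAL_PATH := $(MY_LOCAL_PATH)

include $(CLEAR_VARS)
LOCAL_MODULE           := mynative
LOCAL_SRC_FILES        := mynative.c
LOCAL_C_INCLUDES       := $(LOCAL_PATH)/libs/include
LOCAL_SHARED_LIBRARIES := random
include $(BUILD_SHARED_LIBRARY)
Here $(LOCAL_PATH)/libs/include is just a guess at where the third-party headers might live; point LOCAL_C_INCLUDES at wherever they actually are.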
This is what I have done for the moment. I will not accept (in the Stack Overflow sense) my own (this) answer because it is unsatisfactory.
I have created a new project and copied all the Java files there. Then I copied the .so library built by the old project, together with the 3rd-party library, into libs/armeabi.
That's monstrous. But it works. For the moment. The worst thing is that the version control is torpedoed.
Using the really easy-to-follow instructions for building a NuGet package for an assembly, with an associated package of sources for the symbol server, found on David Ebbo's blog in "The easy way to publish NuGet packages with sources", I have indeed created a pair of packages: binary and sources.
However, the sources package is incomplete. The reason is that the sources come from two class library .csproj projects, and I used ILMerge to combine the output of the second into the first. (*) So, using the minimal .nuspec described in that post and pointing nuget.exe at the .csproj of the "main" library, the binary package is fine, but of course the sources package only has the sources of the "main" library, not also those of the library that was ILMerged into it.
How do I fix this (and get the sources for both projects included in the symbol package, but only the binary of the "main" project in the binary package)?
FYI, the actual nuget.exe command line was: nuget pack CommandLineLexing.csproj -Build -Symbols -Properties Configuration=Release.
(*) The reason I'm doing this, in case you're interested, is that the second library is a cut-down version of my accumulated "C# utilities" library - you know, a bunch of extension methods and other helpers - trimmed down so it only has the bare minimum needed for this particular project. Since it is cut down, I don't want a separate assembly for it that might eventually get confused with the full assembly (it has the same name and no strong name). So I used ILMerge to fold the utility methods into the main assembly (and also mark them internal).
This is not going to be easy, I'm afraid.
NuGet symbol packages are simply your regular package, with PDBs, augmented with the source files.
Assuming you already know you can get a merged PDB with ILMerge/ILRepack (/debug), that part is probably working fine; I'm assuming your issue is that only the source files from the current project get included.
You could simply post-process your .symbols.nupkg (which is just a zip) and include the source files from your other (merged) project in the src folder (you can even try that manually).
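For example, something along these lines (a rough PowerShell 5+ sketch; the package and project names are made up, and I haven't verified that SymbolSource accepts a repackaged archive):
# work on a copy, since Expand-Archive expects a .zip extension
Copy-Item MyLib.1.0.0.symbols.nupkg MyLib.symbols.zip
Expand-Archive MyLib.symbols.zip -DestinationPath pkg

# add the sources of the merged project under src\
Copy-Item ..\MyUtils -Destination pkg\src\MyUtils -Recurse

# repack and rename back to a .symbols.nupkg
Compress-Archive -Path pkg\* -DestinationPath MyLib.fixed.zip
Rename-Item MyLib.fixed.zip MyLib.1.0.0.symbols.nupkg
The important part is simply that the merged project's .cs files end up under src\ inside the package before you push it.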
Though if you run srctool.exe -r MyMergedAssembly.pdb, you'll see different root paths, whereas usually (for a non-merged project) they all share a common prefix.
It may work if SymbolSource copes with having multiple path prefixes in your PDB; that I haven't tried.
I also failed to find any documentation regarding their processing of symbol packages. We can assume they use the pdbstr.exe tool to update the srcsrv section of the PDB file so that source loading is redirected to their website, but whether they support multiple roots can only be determined by testing.
If you upload your augmented symbol package to SymbolSource, you can download the updated PDB using a URL similar to http://srv.symbolsource.org/pdb/Public/Castle.Core.pdb/4C81FC30DF584853B9869EAB2FA7D9891/Castle.Core.pd_ (then decompress it to a .pdb file).
Then you can use both srctool.exe file.pdb and pdbstr.exe -r -s:srcsrv -p:file.pdb to verify their work.
I've just been exposed to a large non-trivial CMake/Eclipse based C++ project. One of the build targets is Windows/nmake based. In the final step of building an executable, the linker throws LNK1104: cannot open file 'python27.lib'. This is correct, because Python 2.7 hasn't been installed.
Problem is, I cannot find any reference to this library on cl.exe's command line. A grep over the whole project directory (including Eclipse's .metadata directory) doesn't turn up anything plausible either, and deleting all the CMake-generated build files didn't help.
The real question is whether MSVC libraries (import or static ones) have any mechanism to implicitly request additional libraries during the link step. There are a few pre-compiled ones in the mentioned project. I simply need the vocabulary to begin a more qualified search for the cause of the error.
I found the answer here:
Puzzling dependency of Boost.Python 1.54 (debug build) to Python27.lib on Windows
Basically, the culprit is a #pragma comment(lib, ...) directive inside the Boost headers (auto-linking).
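To illustrate the mechanism (a generic example, not the actual Boost code): any header that your translation units include can inject a library dependency that never shows up on the cl.exe or link.exe command line:
// some_header.h - illustrative only
#pragma once

// MSVC records this as a "default library" directive in the object file,
// so python27.lib gets requested at link time even though it never appears
// on the compiler or linker command line.
#pragma comment(lib, "python27.lib")
Running dumpbin /directives on the object files or pre-compiled static libraries shows these embedded requests; "auto-linking" and "default libraries" are the search terms you were looking for.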
Since I first saw a dist/ directory in many open source projects, usually on GitHub, I've been wondering what it means.
With dist, vendor, lib, src, and many other folder names that we see quite often, I sometimes wonder how I should name my own folders.
Correct me if I'm wrong!
src: Contains the sources. Sometimes only the pure sources, sometimes the minified version as well; it depends on the project.
vendor: Contains other dependencies, like other open source projects.
lib: Good question, it's really close to vendor actually; depending on the project, we can see one or the other, or both...
dist: From what I saw, it contains the "production" files, the ones we should use if we want to use the library.
Why is open source so confusing? Isn't it possible to make things clearer? At least per language, since some languages use specific names.
To answer your question:
/dist means "distributable", the compiled code/library.
Folder structure varies by build system and programming language. Here are some standard conventions:
src/: "source" files to build and develop the project. This is where the original source files are located, before being compiled into fewer files to dist/, public/ or build/.
dist/: "distribution", the compiled code/library, also named public/ or build/. The files meant for production or public use are usually located here.
There may be a slight difference between these three:
build/: a compiled version of your src/, but not a production-ready one.
dist/: a production-ready compiled version of your code.
public/: usually the files that end up being served to and run in the browser; this may be the built JS and also include some HTML and CSS.
assets/: static content like images, video, audio, fonts etc.
lib/: external dependencies (when included directly).
test/: the project's tests scripts, mocks, etc.
node_modules/: includes libraries and dependencies for JS packages, used by Npm.
vendor/: includes libraries and dependencies for PHP packages, used by Composer.
bin/: files that get added to your PATH when installed.
Markdown/Text Files:
README.md: A help file that covers setup and tutorials, and documents the project. README.txt is also used.
LICENSE.md: any rights given to you regarding the project. LICENSE or LICENSE.txt are variations of the license file name, having the same contents.
CONTRIBUTING.md: how to help out with the project. Sometimes this is addressed in the README.md file.
Specific (these could go on forever):
package.json: defines libraries and dependencies for JS packages, used by Npm.
package-lock.json: specific version lock for dependencies installed from package.json, used by Npm.
composer.json: defines libraries and dependencies for PHP packages, used by Composer.
composer.lock: specific version lock for dependencies installed from composer.json, used by Composer.
gulpfile.js: used to define functions and tasks to be run with Gulp.
.travis.yml: config file for the Travis CI environment.
.gitignore: Specification of the files meant to be ignored by Git.
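To tie a few of these together, a typical npm-based JavaScript project might be laid out like this (an illustrative sketch, not a prescription):
my-project/
├── src/                original, readable source
├── dist/               built/minified output for consumers
├── test/               test scripts and mocks
├── node_modules/       installed dependencies (not committed)
├── package.json
├── package-lock.json
├── .gitignore
└── README.md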
To answer your original question about the meaning of the dist folder:
The short form dist stands for distributable and refers to a directory where the files are stored that can be used directly by others, without the need to compile or minify the source code that is being reused.
Example: if you want to use the source code of a Java library that someone wrote, then you need to compile the sources first to make use of it. But if the library author puts the already compiled version into the repository, you can just go ahead. Such an already compiled version is saved in the dist directory.
Something similar applies to JavaScript modules. Usually JavaScript code is minified and obfuscated for use in production. Therefore, if you want to distribute a JavaScript library, it's advisable to put the plain (not minified) source code into a src (source) directory and the minified and obfuscated version into the dist (distributable) directory, so others can grab the minified version right away without having to minify it themselves.
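In practice, a JavaScript library often points consumers at the dist build through its package manifest. A package.json might look roughly like this (the field names are standard npm fields; the file names, and the choice of UglifyJS as the minifier, are just examples):
{
  "name": "example-lib",
  "version": "1.0.0",
  "main": "dist/example-lib.min.js",
  "files": ["dist"],
  "scripts": {
    "build": "uglifyjs src/example-lib.js -c -m -o dist/example-lib.min.js"
  }
}
Consumers who require('example-lib') then get the minified dist build, while the readable code stays in src/.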
Note: Some developers use names like target, build or dest (destination) instead of dist. But the purpose of these folders is identical.
Summary of the folders:
bin: binaries
contrib: contributions to the project
dist: distributables -- see the two quoted definitions below
doc/s: documentation
include: headers (C/C++)
lib: libraries (C/C++)
man: manual (man) pages (Unix/Linux), cf. man(1)
src: source
"/dist means "distributable", the compiled code/library." ref.
"The shortform dist stands for distributable and refers to a directory where files will be stored that can be directly used by others without the need to compile or minify the source code that is being reused." ref.
Actually! "dist folder" is the result you get after modifying a source code with "npm run build" or "ng build" or "ng build --prod" for production.
Meanwhile! After getting "dist folder" there might still be few things that you still need to do depending on your project type ✌️
I have inherited a Flash Builder 4.6 project, but I cannot get Eclipse/Flash Builder to compile it, and now my 60-day trial has lapsed.
I got fed up with being sent in circles by Eclipse/FB, and now I want to try to sanitise/understand the process by building manually with the Flex SDK 4.6.
Trouble is, I do not know where to start. There are mxmlc.exe and compc.exe, and there are projects within the workspace with inter-dependencies.
A simple example using Ant exists where the chap is only compiling from a single source file.
In my workspace I have a main project folder with sub-folders like src containing .as, .mxml and .png files, and sub-folders like "assets". In the src root folder there is the main .mxml file, which maps to the final compiled executable.
The project also has "libs", "bin" and "bin-release" folders.
The referenced projects are similar.
One of the referenced projects is "flerry".
I want a single Windows Perl/BAT script that will compile this for me.
Any ideas on where to look?
mxmlc.exe is used to compile the application, compc.exe is used to compile libraries. If you have code library dependencies, you'll first need to compile these and then compile your application.
The documentation on compiling is pretty good and can be found here: http://help.adobe.com/en_US/flex/using/WS2db454920e96a9e51e63e3d11c0bf69084-7fcc.html
I would suggest you give this a go and post specific questions if you run into issues.
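As a rough starting point, a minimal .bat can compile the library projects first with compc and then the application with mxmlc. This is only a sketch: FLEX_HOME, the project paths and the main MXML name are guesses that you will need to adapt to your workspace (the flerry project is used as the example library):
@echo off
rem Adjust to wherever the Flex SDK 4.6 is unpacked
set FLEX_HOME=C:\flex_sdk_4.6

rem 1) Compile a referenced library project into a .swc
"%FLEX_HOME%\bin\compc" -source-path ..\flerry\src -include-sources ..\flerry\src -output libs\flerry.swc

rem 2) Compile the main application, linking against the .swc files in libs\
"%FLEX_HOME%\bin\mxmlc" -source-path src -library-path+=libs -output bin-release\Main.swf src\Main.mxml
Repeat step 1 for each referenced project, in dependency order, before compiling the application.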
As I'm quite new to Java, I would like to know the proper procedure for installing new libraries (those that are not available in my Linux distribution's repositories).
Where should I place them, and how do I install them?
For instance, I downloaded openCsv (http://opencsv.sourceforge.net/), and I have no idea how to install it.
Java libraries don't really need to be 'installed' like other applications. All you need to do is put the jar file in a specific location and add it to your classpath. How you do that depends on the Linux distro you are using. If you are making a web application in Eclipse, you can drop the .jar file into the WebRoot/WEB-INF/lib folder, and it will be bundled with your project.
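For a plain command-line project, that boils down to something like the following (the jar and class names are just examples; on Linux the classpath separator is ':'):
# compile against the library (run from the directory containing CsvDemo.java)
javac -cp lib/opencsv-2.3.jar CsvDemo.java

# run with both the library and the current directory on the classpath
java -cp lib/opencsv-2.3.jar:. CsvDemo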
Be sure that the path where you place the libraries is included in the $CLASSPATH environment variable.
For Eclipse: Project -> Properties -> Java Build Path -> Add JARs...
It's up to you really - I use /opt/javalib, but you might consider a directory in /usr/local as well.
You can store them wherever you wish. You can store them within the JRE distribution directories, but I wouldn't recommend that.
Instead I would store them per project (so you can easily have different versions for each project - some libraries have different names for each version, some don't) and adopt a standard such as a lib/ directory. That way you can have standard build scripts (Ant etc.) that operate the same way for every project (if you're using Maven, there's a standard place per project - src/main/resources).
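For instance, a minimal per-project Ant build that picks up every jar dropped into lib/ could look like this (project, directory and target names are placeholders):
<project name="myproject" default="compile">
  <!-- every jar in lib/ ends up on the compile classpath -->
  <path id="project.classpath">
    <fileset dir="lib" includes="*.jar"/>
  </path>

  <target name="compile">
    <mkdir dir="bin"/>
    <javac srcdir="src" destdir="bin" classpathref="project.classpath"
           includeantruntime="false"/>
  </target>
</project>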
You could use Maven to manage dependencies on those libraries.
Maven will automatically download all needed JAR files and put them in a local repository (the location is configurable).
This makes upgrading to new versions of various libraries very easy as you just declare the version you want and Maven does the rest.
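For example, to pull in opencsv you would declare a dependency in your pom.xml along these lines (check Maven Central for the current coordinates and version; the ones below are from the older net.sf.opencsv releases):
<dependency>
  <groupId>net.sf.opencsv</groupId>
  <artifactId>opencsv</artifactId>
  <version>2.3</version>
</dependency>
Maven then downloads the jar (and any transitive dependencies) into your local repository and puts it on the classpath when it builds your project.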
Beware: Maven is something to get used to and the initial learning curve is steep.
The rewards come once everything is set up properly and Maven takes care of compiling, packaging, distribution, site creation, release management, etc.