Is there a difference in linking a standard and a custom dynamic library? - operating-system

I don't understand how a standard library like libc is linked. I use the MinGW compiler.
I see that there is no libc.dll file in its bin folder, so how is libc linked?
How does the compiler know the difference between a custom and a standard dynamic library?

We use build tools because they are a practical way to compile code and create executables, deployables etc.
For example, consider a large Java application consisting of thousands of separate source files split over multiple packages. Suppose that it has a number of JAR file dependencies, some of them libraries that you have developed, and others external libraries that can be downloaded from standard places.
You could spend hours manually downloading external JAR files and putting them in the right place. Then you could manually run javac for each of the source files, and jar multiple times, each time in the correct directory with the correct command-line arguments ... and in the correct order.
And each time you change the source code ... repeat the relevant parts of the above process.
And make sure you don't make a mistake that will cause you to waste time chasing test failures, runtime errors, etc. caused by not building correctly.
Or ... you could use a build tool that takes care of it all. And does it correctly each time.
In summary, the reasons we use build tools are:
They are less work than doing it by hand.
They are more accurate.
The results are more reproducible.
I want to know why the compiler can't do it.
Because compilers are not designed to perform the entire build process, just as your oven is not designed to cook a three-course meal and serve it up at the dinner table.
A compiler is (typically) designed to compile individual source code files. There is more to building than doing that.

Related

Does the same Scala code sometimes produce different bytecode?

I have a jar file containing compiled Scala classes which I'd like to version somehow. Sometimes I can use a git SHA inserted into the METADATA, but that's not always the case.
As a "fallback" I thought of computing an md5 over the bytecode itself (the *.class files). For that I'll need to make sure that the same code given to Maven's package phase produces the same bytecode. I found out here that:
Different versions of Scala may produce slightly different bytecode.
which is fine by me. I'm asking whether there are more variables involved, mainly time, i.e., whether the jar creation date is somehow present in the bytecode.
Thanks!
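For what it's worth, here is a minimal sketch of that fallback, assuming a Maven-style target/classes output directory (an assumption, not something from the question). TypeScript/Node is used purely for illustration. The class-file format itself has no timestamp field, while jar (zip) entries do carry modification times, so hashing only the .class bytes in a fixed order sidesteps the creation-date concern:

// Sketch only: digest the .class files (in sorted order) so the result
// ignores jar/zip metadata such as entry timestamps. The directory layout
// (target/classes) is an assumption, not part of the original question.
import { createHash } from "crypto";
import { readdirSync, readFileSync, statSync } from "fs";
import { join } from "path";

function collectClassFiles(dir: string): string[] {
  const files: string[] = [];
  for (const entry of readdirSync(dir)) {
    const full = join(dir, entry);
    if (statSync(full).isDirectory()) {
      files.push(...collectClassFiles(full));
    } else if (entry.endsWith(".class")) {
      files.push(full);
    }
  }
  return files.sort(); // deterministic order, independent of filesystem order
}

function digestClasses(dir: string): string {
  const md5 = createHash("md5");
  for (const file of collectClassFiles(dir)) {
    md5.update(file);               // include the (relative) path
    md5.update(readFileSync(file)); // and the bytecode itself
  }
  return md5.digest("hex");
}

console.log(digestClasses("target/classes"));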

TypeScript: Can't reference a static field in a certain context [duplicate]

I'm in the process of moving a fairly large TypeScript project from internal modules to external modules. I'm doing this because I want to create one core bundle which, if and when required, can load other bundles. A second requirement which I'm keeping in mind is that I'd like to be able to run the bundles (with some modifications if required) on the server with Node.js as well.
I first tried using AMD and require.js to build the core bundle, but I came across an issue with circular dependencies. After reading that this is common with require.js and that CommonJS is more often advised for large projects, I tried that. But now that it's set up together with browserify, I have the exact same issue coming up at the exact same place when I run the compiled bundle.
I have something like 10 base classes that heavily rely on each other and form multiple circular dependencies. I don't see any way to remove all of them.
A simplified setup to explain why I don't think I can remove the circular dependencies:
Triples are made of 3 Resources (subject, predicate, object)
Resource has TripleCollections to keep track of which triples it is used in
Resource has multiple functions which rely on properties of Triple
Triple has some functions which handle TripleCollections
TripleCollection has multiple functions which use functions of Triple
TripleCollection.getSubjects() returns a ResourceCollection
ResourceCollection.getTriples() returns a TripleCollection
Resource keeps track of the objects of its triples in ResourceCollections
ResourceCollection uses multiple functions of Resource
I've read multiple related issues here on SO (this one being most helpful), and from what I can gather I only have a few options:
1) Put all the base classes with circular dependencies in 1 file.
This means it will become one hell of a file. And imports will always need aliases:
import core = require('core');
import BaseA = core.BaseA;
2) Use internal modules
The core bundle worked fine (with its circular dependencies) when I used internal TypeScript modules and reference files. However, if I want to create separate bundles and load them at run time, I'm going to have to use shims for all modules with require.js.
Though I don't really like all the aliasing, I think I will try option 1 now, because if it works I can keep going with CommonJS and browserify, and later on I can also more easily run everything on the server in Node. I'll resort to option 2 if option 1 doesn't work out.
Q1: Is there some possible solution I haven't mentioned?
Q2: What is the best setup for a TypeScript project with one core bundle which loads other bundles (which build upon the core) on demand, which seemingly cannot avoid circular dependencies, and which preferably can run on both the client and the server?
Or am I just asking for the impossible? :)
Simply put (perhaps simplistically, but I don't think so), if you have ModuleA and ModuleB and both rely on each other, they aren't modules. They are in separate files, but they are not acting like modules.
In your situation the classes are highly interdependent: you can't use any one of those classes without needing them all. So unless you can refactor the program to make the dependencies one-way rather than two-way (for example), I would treat the group of classes as a single module.
If you do put all of these in a single module, you may still be able to break it into modules by moving some of the dependencies, for example (and all of these questions are largely rhetorical as they are the kind of question you would need to ask yourself), what do Triple and Resource share on each other? Can that be moved into a class that they could both depend on? Why does a Triple need to know about a TripleCollection (probably because this represents some hierarchical data)? There may only be some things you can move, but any dependency removed from this current design will reduce complexity. For example, if the two-way relationship between Triple and Resource could be removed.
It may be that you can't substantially change this design right now, in which case you can put it all in one module and that will largely solve the module loading issue and it will also keep code together that is likely to change for the same reason.
The summary of all of this is:
If you have a two way module dependency, make it a one-way module dependency. Do this by moving code to make the dependency one way or by moving it all into a single larger module that more honestly represents the coupling between the modules.
So my view on your options is slightly different from the ones in your question...
1) Try refactoring the code to reduce coupling to see if you can keep smaller modules
Failing that...
2) Put all the base classes with circular dependencies in 1 file.
I wouldn't place the internal modules option on the list - I think external modules are a much better way of managing a large program.
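As a rough sketch of what that single-module option could look like (the class bodies below are placeholders standing in for the real Triple/Resource API, not a proposal for it):

// core.ts -- one external module that owns the tightly coupled classes.
// Because they live in the same file, the mutual references are ordinary
// same-module references rather than circular imports.
export class Resource {
  // which triples this resource is used in
  collections: TripleCollection[] = [];
}

export class Triple {
  constructor(public subject: Resource,
              public predicate: Resource,
              public object: Resource) {}
}

export class ResourceCollection {
  resources: Resource[] = [];
}

export class TripleCollection {
  triples: Triple[] = [];
  getSubjects(): ResourceCollection {
    const result = new ResourceCollection();
    result.resources = this.triples.map(t => t.subject);
    return result;
  }
}

// consumer.ts -- importing the module, with an optional alias to cut down
// on the "core." prefix the question mentions.
import core = require('./core');
import Triple = core.Triple;
const t = new Triple(new core.Resource(), new core.Resource(), new core.Resource());

If the design later becomes one-way, the classes can be split back out of core.ts without consumers changing much, since they already go through the one module.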

Compiler optimization to reduce bytes of executable code

Is it possible for a compiler (for example javac) to scan your whole project for unused methods and variables before compilation and then compile the project without those unused methods and variables, so that you end up with fewer bytes of executable code?
If this were a compiler optimization, I would create one huge library that contains all my helper methods, import it in all my projects, and not worry that its size could affect the size of my software.
I understand this could be impossible if you do not have the source code of the libraries you are using (importing), but I am speaking of the case where you have the source code.
Is there a tool/IDE plugin that does something similar? I would think this could also be done in a step ahead of the compilation.
Java's compiler doesn't do this natively, but you can use a tool like ProGuard or any number of other Java optimizers to remove unused code.
But, in your case, why don't you just compile your big software library once and put it on your classpath? That way you don't have to duplicate it at all.

Identifying circular dependency between static libraries through script

I have a list of binaries that link some static libraries. It was identified that a bunch of these libraries are circularly dependent. We never ran into trouble because we enclosed these static libraries between -Wl,--start-group and -Wl,--end-group.
Having understood that this is a bad practice, I'm trying to clean the system.
I've come up with a perl script that tells me how these libraries are dependent on each other, like so:
libchld.a depends on libprnt.a, libgprnt.a
libprnt.a depends on libncle.a, libgprnt.a
and so on.
Now, I should sort these topologically, so that all dependencies point in one direction.
And then, if I find a set of cyclically dependent libraries while sorting topologically, I will enclose only those inside a --start-group and --end-group (rather than enclosing the entire bunch of libraries), thereby cleaning up the system.
Are there Perl modules that already do this type of sorting?
Sort::Topological
Graph::Directed
are the ones I'm trying to check, but they don't appear to handle the case where the graph is circular.
Having understood that this is a bad practice, I'm trying to clean the system.
It's a bad practice because you are not using proper layering, not because it's somehow bad for the linker.
Therefore, cleaning up the link line without re-arranging the libraries into a proper hierarchy with no circular dependencies is a pointless exercise.
And if you do rearrange the libraries, then their proper order will be easy to understand and you wouldn't need to use Perl for that.
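If the cyclic groups still need to be identified mechanically first, computing strongly connected components does exactly that: every component with more than one member is a set of libraries that depend on each other. Here is a minimal sketch of the idea (TypeScript purely for illustration, and with a hypothetical back edge added so the toy data actually contains a cycle; the Perl Graph modules mentioned above are worth checking for an equivalent operation):

// deps mirrors the script's example output; the libncle.a -> libchld.a edge
// is hypothetical, added only so this toy graph contains a cycle.
const deps: { [lib: string]: string[] } = {
  "libchld.a": ["libprnt.a", "libgprnt.a"],
  "libprnt.a": ["libncle.a", "libgprnt.a"],
  "libncle.a": ["libchld.a"],
  "libgprnt.a": [],
};

// Tarjan's strongly-connected-components algorithm: nodes in a component
// of size > 1 are mutually (circularly) dependent.
function stronglyConnectedComponents(graph: { [node: string]: string[] }): string[][] {
  let counter = 0;
  const stack: string[] = [];
  const onStack: { [node: string]: boolean } = {};
  const index: { [node: string]: number } = {};
  const lowlink: { [node: string]: number } = {};
  const components: string[][] = [];

  function visit(v: string): void {
    index[v] = lowlink[v] = counter++;
    stack.push(v);
    onStack[v] = true;
    for (const w of graph[v] || []) {
      if (index[w] === undefined) {
        visit(w);
        lowlink[v] = Math.min(lowlink[v], lowlink[w]);
      } else if (onStack[w]) {
        lowlink[v] = Math.min(lowlink[v], index[w]);
      }
    }
    if (lowlink[v] === index[v]) {
      const component: string[] = [];
      let w: string;
      do {
        w = stack.pop() as string;
        onStack[w] = false;
        component.push(w);
      } while (w !== v);
      components.push(component);
    }
  }

  for (const v of Object.keys(graph)) {
    if (index[v] === undefined) visit(v);
  }
  return components;
}

for (const group of stronglyConnectedComponents(deps)) {
  if (group.length > 1) {
    console.log("needs --start-group/--end-group (or refactoring):", group.join(", "));
  }
}

Tarjan's algorithm also emits the components in reverse topological order, so the same pass gives both the cycle groups and a valid link order for everything outside them.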

Speeding up compilation in Xcode

I have a rather large project where compilation takes more than an hour on a Mac with an i5 processor.
Just changing one little piece of code in one place makes the complete, lengthy compilation necessary.
Is there any way to reduce this time?
I was thinking about "precompiling of classes" or "pre-linking" if there is anything like that.
Even uploading a little app to a device takes 10 seconds.
PS: Can anyone share experience on whether Xcode 4.3 is faster on the new Retina Macs in this context?
Many thanks!
1) Use a precompiled header and remove any imports of those files (UIKit, Foundation, Cocoa, etc.) which Xcode adds when you create classes.
2) Add reasonably stable user header files to the .pch as well, to reduce the precompilation work.
In your classes, do most of the imports in the implementation file (.m), not the headers. Use forward declarations when appropriate. See '@class vs. #import' and 'Importing header in Objective-C'.
You might consider moving a stable and well confined part of your main project into a separate project and include it as a static library in the main project.
Recently I removed a few libraries that I had been referencing as .a files and moved their code in with the main project's code. The speed increased amazingly: compilation used to take 15 minutes, now it takes 15 seconds. Indexing used to take all day to finish (in time for shut-down), but now it is really fast. The libraries were on a network drive, which may have been exacerbating the problem.