Finding the latest build version of a library - GitHub

I think I am missing something, but I want to add a library from GitHub to my Android project, and I don't see anywhere on the GitHub page the latest built version of the library so I can include it in my Gradle file. I have to go to Maven or JitPack manually and search for it. Is there a shortcut? Am I missing something?
Thanks

There is a Lint check which allows Android Studio to query the latest versions available.
First you will have to activate this Lint check:
Go to Settings, then Editor > Inspections, and search for Newer Library Version Available and check it.
Then run a code analysis with Analyze > Run Inspection by Name..., type "newer", and select Newer Library Version Available.
Run the inspection on the wanted scope (module only, whole project, etc.).
You will then see which libraries have a newer version available.
PS
As stated in the Lint description of this feature, you should not leave this check activated, because it may slow down your code analysis (querying the repositories can take time).

You can use the + annotation to get a dynamic version. It can be used for the major, minor, or patch part of the version. For example:
// Major
compile group: 'org.mockito', name: 'mockito-core', version: '+'
// Minor
compile group: 'org.mockito', name: 'mockito-core', version: '2.+'
// Patch
compile group: 'org.mockito', name: 'mockito-core', version: '2.18.+'
But it's not good practice to use such dynamic dependency resolution:
Dependencies can unexpectedly introduce behavior changes to your app. Read your changelogs carefully!
The same source built on two different machines can differ. How many times have you said "but it works on my machine?"
Similarly, builds built on the same machine but at different times can differ. I've wasted so much time on builds that worked one minute
then broke the next.
Past builds cannot be reproduced perfectly. This makes it difficult to revert safely.
There are security implications if a bad actor introduces a malicious version of a dependency.
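If you want repeatable builds, pin an exact version and bump it deliberately after reading the changelog. A minimal sketch, reusing the Mockito coordinates from above (2.18.3 is just an example; use whatever exact version you have actually tested):
// Pin the exact, tested version; upgrade it explicitly and re-run your tests.
compile group: 'org.mockito', name: 'mockito-core', version: '2.18.3'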

Related

how to determine the current and future electron version(s) of vscode at extension build time

I'm collaborating on a vscode extension that uses a native module (@serialport) which needs to be included / pre-compiled for each platform/electron-version combination.
If we only include the current versions, it frequently breaks when vscode updates the electron version. Some platforms can natively re-compile; others can only do so after a (very) lengthy install of rather complex toolchains that IMO should not be required for end-users.
So we want to include the relevant prebuilds,
and for that we need to look ahead in time ...
I'm looking for a reliable method to determine the electron versions used by vscode
- current version
- and the future (insider) version
- in addition it may be good to include a prior version to allow for backward compatibility
I have found that master/.yarnrc has the current (or next imminent) version;
today it is 4.2.7, while the current vscode release uses 4.2.5.
Prior versions can be read from the version history of master/.yarnrc.
But what about the future / insider version?
What is a good method/location to determine that programmatically, i.e. which branch has the insiders version?
Probable answer, based on the hints below and some more probing:
the next version is in master: github.com/microsoft/vscode/blob/master/.yarnrc
version 1.36.1 is in github.com/microsoft/vscode/blob/1.36.1/.yarnrc
version x.y.z is in github.com/microsoft/vscode/blob/x.y.z/.yarnrc
which only leaves the in-between versions/tags to be discovered.
intended approach:
during the build, collect the relevant electron versions, i.e. "3.1.8", "4.2.5", "6.0.0-beta.0"
determine the ABI used by these versions using node-abi:
var getAbi = require('node-abi').getAbi;
getAbi('$version', 'electron')
use prebuild-install to download the relevant native prebuilds bindings, and include these as part of the extension
.\node_modules\.bin\prebuild-install.cmd --runtime electron --target $version --arch $arch --platform $platform --tag-prefix @serialport/bindings@
copy the bindings files for all ABI-arch-platform combinations to a folder, and include that folder in the vscode extension package
at load time, determine the ABI version of the running instance of vscode/electron, and dynamically load the module from the ABI/platform folder
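A sketch of that load-time step (the folder layout under native/ is hypothetical; process.versions.modules reports the ABI of the running Node/Electron instance):
// loader.js - pick the prebuilt binding matching the running Electron instance
const path = require('path');
const abi = process.versions.modules;   // ABI of the current runtime
const dir = `electron-v${abi}-${process.platform}-${process.arch}`;
const binding = require(path.join(__dirname, 'native', dir, 'bindings.node'));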
alternative / additional approach:
- as a last-ditch effort, the code could try a just-in-time download of the prebuilt binding file for the current platform, but this might run into permissions/malware-scanner problems, as that is essentially downloading executable code from an external GitHub repo.
current script code to download the bindings:
https://github.com/Josverl/pymakr-vsc/blob/fix/SerialMultiPlatform/scripts/mp-download.ps1
okay, so reverse logic then indicates that:
next version is in master https://raw.githubusercontent.com/microsoft/vscode/master/.yarnrc
version 1.36.1 is in https://raw.githubusercontent.com/microsoft/vscode/1.36.1/.yarnrc
version x.y.z is in https://raw.githubusercontent.com/microsoft/vscode/x.y.z/.yarnrc
which only leaves the in-between versions/tags to be discovered.

Bower versioning best practice?

I am wondering how I can ensure that my bower version configuration will keep working in the future.
E.g. I have already touched multiple projects which use either
">=1.0.0" or
"~1.0.0"
Afaik,
">=" means all versions at or above 1.0.0 are fine
"~" means all patch updates on 1.0.x are fine
To be more specific:
"dependencies": {
"angular": ">=1.3.0",
"bootstrap": ">=3.2.0",
"jquery": "~2.1.0",
}
On the day of writing this code, the following versions were resolved:
angular: 1.3.1
bootstrap: 3.2.0
jquery: 2.1.0
Today you will get:
angular: 1.4.0
bootstrap: 3.3.4
jquery: 2.1.4
From the developer's point of view, these features are fine at the beginning of development: you don't have to mess around with the painful dependency management of libs and versions. But as soon as the project gets tested, the versions should be fixed to defined ones.
I have already touched multiple projects which got broken after a very short period of three months, because the libs got updated to different versions which either were incompatible with each other or had broken features. So either the build stopped working or, even worse, issues arose on the client side.
What is the best practice for avoiding such version issues in long-term projects?
At the moment there is none, if your only option is Bower. A lockfile à la Composer or a shrinkwrap mechanism à la npm is in the works; however, it seems to have stalled, as there are currently not enough contributors/maintainers to test the feature and maintain it in the long run.
UPDATE:
Since we now have Yarn, you can opt to use that, which uses a lockfile mechanism as the default behaviour. The only caveat is that it uses the npm registry, which means that some packages either haven't been registered there yet or have been namespaced (like Google's Polymer), which you might have to watch out for.
My go-to method is using exact versions:
don't let your dependency tool decide which version is best for you, because it (and other people) is usually wrong.
What I mean by that (and I have seen this plenty on Bower) is that one day you get version A.B.C and the next day you might get A.D.F, and A.D.F conflicts with some other dependency you have. This can introduce all sorts of problems.
Best is to handle all your upgrades yourself and test them yourself.
I have yet to see a project where UI and JavaScript testing were automated in such a way that this was done reliably.
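Applied to the bower.json from the question, exact pinning would look like this (versions taken from the snapshot listed above):
"dependencies": {
  "angular": "1.3.1",
  "bootstrap": "3.2.0",
  "jquery": "2.1.0"
}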

Rust library development workflow

When developing a library in Rust (+ Cargo), how do I achieve the fast recompile/test cycle?
When developing an app, it's easy, I:
Make changes in the code
Switch to the terminal and run cargo run
See the compiler feedback
But now I want to extract parts of my app as a library and publish it on GitHub.
I would like to continue developing my app, but now with this library as a dependency. I'm going to develop both the library and the app in parallel.
How do I get the same quick feedback now?
Both the library and the app will be developed on the same machine, I would like to make changes to the library, update the app correspondingly and see the compiler feedback.
I'm guessing I could use my library as a dependency in Cargo.toml and run cargo update each time I want to update my app's dependencies, but this would be somewhat slow, because it would have to download the code from GitHub and recompile all dependencies every time.
You can use this somewhat undocumented feature of Cargo. Add the following line to the ~/.cargo/config file (or to /path/to/your/binary/project/.cargo/config to limit the effect to your binary project):
paths = ["/path/to/your/library"]
From now on, every Cargo package (or those under the /path/to/your/binary/project root) which depends on your library will use /path/to/your/library as its source code, regardless of what is specified in that package's manifest, so you can keep the Git repo URL in your program's manifest. Hopefully this feature will be documented in the future.
Update
This is now documented in the Cargo guide.
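For illustration, the combined setup described above might look like this (crate name, URL, and paths are hypothetical):
# Cargo.toml of the app: keep pointing at the published GitHub repo
[dependencies]
mylib = { git = "https://github.com/you/mylib" }

# .cargo/config in the app's root: use the local checkout instead while developing
paths = ["/home/you/src/mylib"]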

How to validate an upgrade before installation

We have an Eclipse feature that is licensed, and the license is handled by our own code. The user can go to our update site and upgrade his feature. The problem we face is when the user's license needs to be updated before he can use the new upgrade.
What I want to do is validate the feature version against the user's license and warn the user that his license needs to be updated before he installs.
I thought I would do this using a custom Eclipse p2 touchpoint action, validateLicense.
Example:
My code is called, and in it I validate the version against the user's license. If the validation fails, I warn the user, and he can then cancel the installation.
So my first question is:
Did I get this right, or is there some other way to do this?
My second question is pretty basic:
Where do I tell eclipse to run my code?
I have looked here at the Eclipse help where they explain what it is, but I don't understand where to put the information to run my code. Is it in the feature.xml?
Lastly:
Is there an example of how to create and use p2 touchpoints?
I implemented a custom action as shown here, and I have a system that seems to work. I left out the "touchpoint" extension, as it's unnecessary in my case, but the rest is the same.
My action is executed during the install phase of my feature (instructions.install), but maybe the configure phase could work too. The collect phase did not work.
The action is executed during the installation process, after the download has already been performed. Ideally it would run before the download, but it's not a big issue for me. Returning an error status from the action cancels the install. It leaves some downloaded files around, but they do not get activated and are probably removed later by p2's garbage collector.
I also managed to do some more interesting things. My actions plugin has a dependency (optional and non-greedy) on my main plugin. So the install works like this:
Actions plugin is downloaded
Custom action is executed
The action detects whether my main plugin is already installed and if yes, it calls into it to retrieve licensing info. The main plugin has to expose an API for the action. The action also checks main plugin's version to detect whether the API is there or not.
The action can now decide whether to proceed or cancel the install. It can even interact with the user using Display#syncExec (this is what the code in the checkTrust phase does, so I think it's safe). If needed, the action could also detect whether the install is headless.
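For reference, the hookup happens through touchpoint instructions in a p2.inf file. A minimal sketch, where the action name and namespace are hypothetical (check the exact syntax against the p2 documentation):
instructions.install = com.example.licensing.validateLicense();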
Some gotchas:
The action itself must be versioned. It's the version you declare in the plugin.xml and p2.inf files, and it's different from the plugin's version. I just replace 1.0.0 with the same version my plugin has. This way the latest version of the actions plugin is always downloaded before being executed. This is great, because now any changes to the licensing rules can be implemented in the actions plugin.
The actions API changed between Eclipse 3.5 and 3.6. I will probably drop support for 3.5, as it's pretty old anyway.
The actions plugin should probably be signed; mine is. The system seems almost too powerful to me, as just pointing Eclipse to an update site gets it to execute downloaded code.
I still need to test how this works with different versions of Eclipse and other IDEs. I saw a strange (non-blocking) error with 3.6. However the results are promising and it looks like the system might actually work.
Touchpoints are executed at installation time, which means that the resolution (validation) has already happened, so I'm not sure they would help. What about creating an Installable Unit (IU) (or Eclipse feature) that represents the license the user has installed? Then you would add a dependency from your product to that license.
For example, create an IU called com.mycompany.license (1.0.0). You would create another one called com.mycompany.license (2.0.0). When you installed a license, the appropriate IU would be added to the profile.
Now, when you go to install your product, the new version of the product would require license version 2.0.0. If this license was not installed, the resolution would fail.
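Expressed as p2.inf metadata on the product, that dependency might look like this (reusing the IU name from the example; the range pins major version 2):
requires.1.namespace = org.eclipse.equinox.p2.iu
requires.1.name = com.mycompany.license
requires.1.range = [2.0.0,3.0.0)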
Does this make sense? Do you think this would help?

P2 headless update not working

I have taken the org.eclipse.equinox.p2.examples.rcp.prestartupdate project and adapted it for use in my RCP application. I then set up an update repository that gets updated as part of my nightly build.
When I open my application, it goes through the motions as if it is updating - it finds the update site, correctly generates an uninstall and an install operand for each bundle, and says that it finished with no errors. The problem is that the plugins never actually get installed in the plugins folder, even though the profile gets updated (a subsequent run states there are no updates). The next time my build runs, it correctly identifies that there are updates, but the same thing happens again.
I have spent days debugging, and the only thing that looks out of the ordinary (not that I fully understand what is going on) is that during the final configure phase none of the TouchpointData objects have any instructions, so it doesn't look like configure is doing what it should.
I really have no clue where to look next and would like to see if anyone else has any ideas.
Update:
I finally figured out what was going on.
The problem started when I built my product without generating the metadata repository. When building through Eclipse, I didn't check "Generate metadata repository" in the Export Product wizard, because I didn't need a p2 repository, just the product. The problem is that without checking that box the product does not install as p2-enabled, causing side effects such as not generating a profile, among other things.
I tried to compensate for this by manually creating a profile in code, which I have since found out is a really bad idea. My original problems were caused by my profile not being set up correctly.
Once I started exporting the product with "Generate metadata repository" checked the update started correctly installing the new plugins.
The problem I have now is that although the plugins are being installed correctly, the executable is getting trashed and I cannot launch my application any more. I am building my update site through Hudson, and the binary folder which is present when I use the Eclipse Export Product wizard is missing. I am assuming that is what is going wrong now.
Any ideas why the binaries would not be building in my headless PDE build?
Figured this out also. I had assumed that all I needed was the individual launcher plugins for the platforms I wanted to build on. Since I was trying to understand the process, I was copying over plugins one by one to the build server. It turns out that to include the platform-specific binaries in the build, you need to have the org.eclipse.equinox.executable feature from the delta pack. Once I added that to the build, the binaries started showing up in the output. With the binaries, the update mechanism works exactly as intended.
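For reference, one way to pull those binaries into a headless PDE build is to include the delta pack's launcher feature in the product's top-level feature.xml (a sketch; version 0.0.0 lets PDE substitute the actual version at build time):
<includes
      id="org.eclipse.equinox.executable"
      version="0.0.0"/>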