What is the sweet spot PC configuration to download and build android AOSP (as of 2022)?
I am new to PC building and don't have much knowledge about PCs.

You can see the complete system requirements and recommendations at https://source.android.com/setup/build/requirements
A 64-bit environment is required for Android 2.3.x (Gingerbread) and higher versions, including the master branch. You can compile older versions on 32-bit systems.
At least 250 GB of free disk space to check out the code and an extra 150 GB to build it. If you conduct multiple builds, you need additional space.
At least 16 GB of available RAM is required, but Google recommends 64 GB.
As of June 2021, Google is using 72-core machines with 64 GB of RAM internally, which take about 40 minutes for a full build (and just a few minutes for incremental builds, depending on exactly which files were modified). By contrast, a 6-core machine with a similar amount of RAM takes 3 hours.
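As a rough pre-flight check against the figures quoted above, a small script along these lines can report free disk space, RAM, and core count before you start a sync. This is only a sketch for a Linux host; the thresholds are simply the numbers mentioned in this answer, not anything the AOSP tooling itself enforces.

import os
import shutil

# Thresholds taken from the figures quoted above (adjust as needed).
MIN_DISK_GB = 250 + 150       # checkout plus a single build
MIN_RAM_GB = 16
RECOMMENDED_RAM_GB = 64

def free_disk_gb(path="."):
    """Free space on the filesystem holding `path`, in GB."""
    return shutil.disk_usage(path).free / 1e9

def total_ram_gb():
    """Total physical RAM in GB (POSIX only)."""
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES") / 1e9

if __name__ == "__main__":
    print(f"Free disk: {free_disk_gb():.0f} GB (want at least {MIN_DISK_GB} GB)")
    print(f"RAM:       {total_ram_gb():.0f} GB "
          f"(minimum {MIN_RAM_GB} GB, recommended {RECOMMENDED_RAM_GB} GB)")
    print(f"CPU cores: {os.cpu_count()} (more cores roughly means faster full builds)")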

Related

react-native-windows build takes 100% usage of disk

I am running out of options on what to do with this issue.
In VS Code, when I build a Windows app, e.g. with react-native run-windows, the disk usage is always at 100% and it takes 10 to 15 minutes to build the project.
The disk I am using is D:\ and it's mostly free; it only contains the projects I am building and developing.
I have added D:\ as an exclusion so Windows 10 won't scan the drive, but it's no use.
Is there something I am missing?

Dymola 2018 performance on Linux (xubuntu)

The issue I experience is that when running simulations (the same IBPSA/AixLib-based models) on Linux, I get a significant performance drop (simulation time roughly doubles) compared to a Windows 8 machine. Below you will find the individual specs of the two machines. In both cases I use the Cvode solver with equal settings. Compilation is done with VC 14.0 (Win) or GCC (Xubuntu).
Is this issue familiar to anyone, or can anyone help with what the reason might be?
Win 8:
Intel Xeon @ 2.9 GHz (6 logical processors)
32 GB RAM
64-Bit
Xubuntu 16.04 VM:
Intel Xeon @ 3.7 GHz (24 logical processors)
64 GB RAM
64-Bit
Thanks!
In addition to the checklist in the comments, also consider enabling hardware virtualization support if not already done.
In general, GCC tends to produce slower code than Visual Studio. To turn on optimization, one could try adding the following line:
CFLAGS=$CFLAGS" -O2"
at the top of insert/dsbuild.sh.
The reason for not having it turned on by default is to avoid lengthy compilations and bloated binaries. For industrial-sized models these are actual issues.

Why should someone update to the 64-bit version of VS Code?

All the info I can find basically says that the 64-bit version is available, but I can't find anything on what exactly that means. How is it different from the 32-bit version?
From the release notes:
Large file support - Improved performance for large files, no size limit on 64-bit machines.
64-bit Windows builds - Use your computer's full address space with the 64-bit builds.
For example, to open large files (the 32-bit version currently has a limit of 300 MB; previously it was 50 MB) you'll need the 64-bit version.

Installshield 2013 Professional/Premier

Our IT department wants to take advantage of deploying MSI packages/images through LANDesk Management Suite. We have been deploying images on workstations using USB SmartDeploy. I've been asked to find an MSI packaging tool that can handle packaged programs with large file sizes.
Is InstallShield 2013 Professional/Premier able to build an MSI package out of package installers with huge file sizes (e.g. a Microsoft Office 2010 or AutoCAD package)? And if so, what's the limit (number of programs/file size) on what you can fit in the MSI package?
Replies and suggestions are all welcome.
Thanks
Authoring large Windows Installer packages (.MSI) is a nuanced discussion. Strictly speaking, an MSI can be up to 2 GB. However, I personally never build an MSI larger than about 200 MB. The 2 GB is a hard limit; the 200 MB is a judgement call to deal with the UAC/DEP scanning activities that Windows performs, which can really slow an installer down.
You get around these limits by building MSIs with external .CAB files instead of embedded CABs. This way the MSI itself stays small for the initial scanning.
Another MSI limit is 32K files. Once you go past that it gets trickier, but it is still possible. The real problem, though, becomes one of scalability: really large installers tend to be harder to maintain (upgrade strategy, testing), take a very long time to build, and take a large amount of time to install (one of Windows Installer's strengths is also its weakness, as all the infrastructure for knowing what's installed and managing it really takes a toll on installation performance).
If you are using setup.exe to launch your installers, Windows also has a hard limit on the size of an EXE. The total size of the EXE with streamed data must be less than 4 GB or it will not launch. While an MSI's 2 GB limit makes this seem far off, adding multiple large PRQs to the installer can rapidly exceed the 4 GB limit. You can also get around this limit by building the setup.exe with external PRQ files.
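To illustrate the limits above (the 2 GB hard limit, the ~200 MB judgement call for embedded CABs, and the roughly 32K file count), here is a hedged sketch of a script that scans a payload directory before you author the package. The path and thresholds are placeholders for this example, not anything InstallShield itself provides.

import os
import sys

# Limits discussed above: 2 GB hard MSI limit, ~200 MB practical ceiling
# for embedded CABs, and roughly 32K files per package.
HARD_LIMIT_BYTES = 2 * 1024**3
PRACTICAL_LIMIT_BYTES = 200 * 1024**2
MAX_FILES = 32_000

def scan(payload_dir):
    """Return (total_bytes, file_count) for everything under payload_dir."""
    total, count = 0, 0
    for root, _dirs, files in os.walk(payload_dir):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
            count += 1
    return total, count

if __name__ == "__main__":
    # Point this at the directory of files you intend to package.
    total, count = scan(sys.argv[1] if len(sys.argv) > 1 else ".")
    print(f"{count} files, {total / 1024**2:.0f} MB total")
    if total > HARD_LIMIT_BYTES or count > MAX_FILES:
        print("Exceeds MSI limits: use external CABs or split the package.")
    elif total > PRACTICAL_LIMIT_BYTES:
        print("Over ~200 MB: consider external CABs to keep the MSI itself small.")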

Should I use Eclipse 32bits or 64bits on my new machine?

This might be a dumb/naive question, and if it is please excuse me :)
I have a brand new machine with the following specs:
Intel Core i7 2600 @ 3.4 GHz
RAM 8 GB
Windows 7
This machine has a 64-bit architecture.
On my previous machine, I used to install 32-bit versions of Eclipse and run them with a 32-bit JRE, and my current Eclipse setup works perfectly on the new machine.
I tried installing a 64-bit version of Eclipse and running it with a 64-bit JRE, and I am wondering whether there are any compelling reasons to switch to this kind of setup or to stick with my existing install. I guess that I would have to reinstall all the plugins, and I might find that some of them are not compatible with the 64-bit version of Eclipse.
So far, the 64-bit version seems to need quite a bit more RAM than the 32-bit version, which is something I expected, but nothing seems to have improved.
Thanks for your advice!
In general I use 64-bit Eclipse without problems, but there can be issues around plug-ins, such as:
Adobe Flash Builder only works with 32-bit
The Subversion plug-in Subclipse needs a native 64-bit version of Subversion installed separately
There may be more but those are the ones I've encountered in the past.
Moving to 64-bit gives you access to more addressable memory, but it won't speed anything up; in fact, it might reduce performance in some cases (though nothing I consider significant for what I do).
Well, the only thing that will improve is that you are able to use the advantages of 64-bit. Other than that, I'm not aware of any improvement.
For example, what's better with 64-bit is that if you have a very large project set, you will be able to handle it more comfortably. For more information on 64-bit, please look here.
If you want to be on the edge of technology, your choice would of course be the 64-bit setup.
About the RAM: this is expected, because some of the data types are now 64-bit and are therefore larger to store in memory.
For most plugins you will get a 64-bit version or an alternative, and so far, for what I've used, it has always worked.
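To make the RAM point above concrete: references and pointers grow from 4 bytes to 8 bytes in a 64-bit process, which is why the same workspace occupies more memory. The sketch below shows that effect for whatever Python interpreter happens to run it; it is only an analogy for the JVM underneath Eclipse, not a measurement of Eclipse itself.

import struct
import sys

# Pointer width of the running interpreter: 8 bytes in a 64-bit build,
# 4 bytes in a 32-bit build.
pointer_bytes = struct.calcsize("P")
bits = 64 if sys.maxsize > 2**32 else 32
print(f"Pointer size: {pointer_bytes} bytes ({bits}-bit process)")

# A list of one million object references: each slot is one pointer, so the
# list body alone is roughly twice as large in a 64-bit process as in a
# 32-bit one.
refs = [None] * 1_000_000
print(f"List of 1,000,000 references: {sys.getsizeof(refs) / 1024**2:.1f} MB")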