What are the benefits of using a tool like Chef vs. using a makefile/shell script for deployment? [closed] - deployment

I have heard good things about Chef and was curious about the benefits before I devote time to learning a new tool. I'm not looking to turn this into an opinion thread; I'm looking for a list of the additional features it offers over a makefile/shell script.

Chef, and Ansible/Puppet/Salt as well (collectively called CAPS), are all built around the same model: you describe the desired state of the system and the tool makes it happen.
A script or Makefile is generally procedural: run this, then run that, and so on. That means you have to keep a mental model of the system from each step to the next, and if that model ever deviates from the real system (e.g., a directory whose owner you're trying to set doesn't exist), your script usually breaks.
Some steps make this easy. yum install and apt-get install are internally idempotent: you can run them every time, and if the package is already installed they simply do nothing.
CAPS systems take that principle (idempotence) and apply it to all management tasks. For the most part this results in less brittle configuration management, because you only tell the tool what the end result should look like and it figures out the delta from the current state.
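To make the contrast concrete, here is a minimal sketch (every path, user, and package name is invented for illustration) of the guards a plain shell script needs around each step just to stay safely re-runnable; a CAPS tool's resources carry these checks for you, so a recipe only states the end state:

```bash
#!/usr/bin/env bash
# Sketch: making procedural deploy steps re-runnable by hand.
# Every path, user, and package name below is a made-up example.
set -euo pipefail

# useradd is not idempotent: a second run fails because the user already
# exists, so the script has to check before acting.
id -u deploy >/dev/null 2>&1 || useradd --system deploy

# mkdir -p and chown are safe to repeat, so no guard is needed here.
mkdir -p /srv/myapp
chown deploy:deploy /srv/myapp

# Package installs are already idempotent, as noted above: running this a
# second time does nothing if the package is present.
apt-get install -y nginx

# Blindly appending to a config file would duplicate the line on every run,
# so check for it first. A CAPS tool expresses this as "this line/template
# must be present" and works out the delta itself.
grep -qxF 'client_max_body_size 20m;' /etc/nginx/conf.d/upload.conf 2>/dev/null \
  || echo 'client_max_body_size 20m;' >> /etc/nginx/conf.d/upload.conf
```

Every one of those guards is a place where the script's mental model can drift from the real machine; the declarative tools exist largely to take that bookkeeping off your hands.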

Related

To Run an MSI with no user logged on and default folder in Users\*\Appdata\Roaming [closed]

I have an MSI that creates its default folder in Users\*\AppData\Roaming, but the machines on which we have to execute the MSI are all newly built and no users are logged on. Is there a VBS script, or any other suggestion, that can execute this?
You might not realize it, but your question is pretty broad; there is an abundance of methods you could use to achieve this. The easiest would likely be to scrape the host FQDNs, iterate over the list, and use PowerShell's Invoke-Command to run the install action.
A preferred method would be some sort of configuration management, be it Chef, Ansible, Puppet, etc., though that requires implementation and architecture work, which will necessitate some planning.
A package manager, such as Chocolatey or NuGet, would likely be helpful too.

Which one is best to consume Restful WebServices for Xamarin.Forms? [closed]

I want to go with a Xamarin.Forms project. I am a bit confused about how to consume a REST API in this project. Performance matters.
There are many options available, but can anybody please suggest which would be best for Xamarin.Forms (.NET Standard)?
The Microsoft HTTP libraries, or third-party libraries like Refit, RestSharp, PortableRest, etc.?
Any suggestions are appreciated.
All of these options are viable. The performance differences between these libraries will be marginal, so it mostly comes down to what you feel comfortable with.
I like to use Refit because it takes a lot of redundant code out of your hands; you only have to focus on the contract. All the code for the actual calls is generated at compile time (and thus won't impact your performance at runtime).
Also have a look at how well the library is maintained and whether it is still active. If you choose one that has been inactive for a while, chances are you will end up relying on older software versions, which might not be what you want.

Do libraries exist for building operating systems? [closed]

So I'm curious about this. I assume building an operating system is a monumental task, especially with all the back-end work an OS involves. If I wanted to rework the front end of an operating system but take advantage of an existing architecture/back end, what would be the best resources to use? Also, can you point to any examples of well-designed operating system front ends that aren't really mainstream? It seems like everyone uses the same few large, well-known OSes.
Yes, you can, but as you said it's a huge, huge task. I'm not sure about Windows or macOS, but on Linux you have options: you can download a kernel from https://www.kernel.org/ and write applications around it.
If your goal is to build applications around the kernel, look at Linux application development resources, and check out the existing Linux desktop environments (https://en.wikipedia.org/wiki/Desktop_environment#History_and_common_use) to see which one fits.

how to assess the quality of CPAN Perl modules? [closed]

A huge number of Perl modules are available on CPAN.
What is your preferred way to assess the quality of a Perl module, prior to testing it manually?
It seems like a rather easy question, but it has real impact: many modules are buggy and waste time. I don't intend to name any of them, to avoid unnecessary defamation.
MetaCPAN Metrics: Rating, Bugs, Last Updated
To get easy access to these, try searching on metacpan.org instead of cpan.org. It displays the number of open bug reports and the average rating on the left-hand side, tells you when the module was last updated, and gives you an overview of how quickly changes are being made to it.
You can also easily see the list of dependencies on the right, so you can look for any less-than-stellar modules in the groundwork. It doesn't give you any data you can't get on cpan.org, but it does put it all in one place.
Obviously, if you're working on critical infrastructure, nothing will replace an old-fashioned code review, since you need to be confident not only in the quality of your program but in your understanding of its plumbing. Still, those are the metrics I usually look at first.
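If you want the same signals without opening a browser, here is a rough sketch against MetaCPAN's public API; the endpoint shape and the field names are my assumptions about the current API, and Moose is only an example distribution:

```bash
# Pull a few of the metrics discussed above (latest version, release date,
# abstract) for one distribution. Requires curl and jq; "Moose" is just an
# example, and the field names are assumptions about the API response.
curl -s https://fastapi.metacpan.org/v1/release/Moose \
  | jq '{version: .version, released: .date, abstract: .abstract}'
```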

How to maintain a small repository of bash/python scripts [closed]

For the past several years, I've been writing small (single-file, 1-500 line) scripts (mostly bash and Python) to automate random tasks (usually scientific data analysis). Most of these end up being one-offs, but sometimes I want to go back and revisit or change something, or I end up with a rather unwieldy script that would benefit from some sort of version control. I should note that all of these scripts are written solely by me and don't necessarily need to be shareable.
Which version control system (SVN, CVS, git, Mercurial, ...) has the simplest command structure/syntax for my use case? More importantly, the machines I connect to are behind rather finicky Kerberos walls, so I'm not looking for any sophisticated server-based implementation.
I found this thread from 2010 asking a similar question, though it didn't really discuss specific options, just whether or not I should be using a single repository.
In short, which version control system allows for a simple same-directory approach with minimal bells and whistles (only checkouts and commits needed)?
Should I set up some sort of subversion/CVS/git repository and just throw everything in?
Yes.
For your use case, I suppose SVN may be the best choice: with URL-based access to every object in the repository, you can quickly and easily get at any single file at any revision, and with your linear history, SVN's "not the best" merging isn't a problem. A local file:///-based repository requires a minimum of maintenance, and you can use a single repository with a flat tree (all files in /trunk).
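As a rough sketch of how little setup that takes (the repository path and the script name below are placeholders), a local file:// repository with a flat trunk looks like this:

```bash
# Create a local repository with a conventional trunk and check it out.
# ~/repos/scripts and analyze_run.py are placeholder names.
svnadmin create ~/repos/scripts
svn mkdir -m "create trunk" file://"$HOME"/repos/scripts/trunk
svn checkout file://"$HOME"/repos/scripts/trunk ~/scripts

# Day-to-day use is just add + commit from the working copy.
cd ~/scripts
svn add analyze_run.py
svn commit -m "first version of analysis script"
```

No server, no daemon, and nothing that has to pass through a Kerberos wall: everything stays on the local filesystem.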