We are using a Variscite VAR-SOM-AM33 platform for our project, and our software platform is based on OpenEmbedded/Yocto.
To keep the hardware running the latest software, the devices are connected to the internet. So far, we have been following the OE recipes, generating ipk packages, and applying software updates via opkg.
However, the process is not satisfactory because some of the recipes are poorly written (they fail to uninstall/install during the upgrade process). What robust techniques/solutions are available for OE/Yocto based systems?
Thanks in advance.
I'd like to add SWUpdate to the list of packages that you should consider. It was recommended in a 2016 paper by the Konsulko Group for Automotive-Grade Linux. That paper mentions a few other options, and provides an analysis of the various tools, so it's probably worth a read. Quoting from the paper:
It is our recommendation that the reference AGL software update strategy make use of SWUpdate in a dual copy configuration and integrate OSTree support. This allows recovery from a corrupt partition for the exception case, but also optimizes the common case where small, incremental updates can be quickly applied or rolled back as needed to [meet] OEM policy.
I don't completely agree with the paper. For example, they wrote off Mender.io because it lacks community support, but IMO the Automotive-Grade Linux group is influential enough to create popularity from scratch. Still, it's a good paper, and the fact that they settled on SWUpdate was interesting to me. I was already leaning toward it because the author, sbabic, is involved in U-Boot software development, and we use U-Boot to burn new images into our device.
At the moment I'm unsatisfied with all of the current options, but mostly because I want extra functionality. I'll probably settle on a custom system which incorporates one or more of the aforementioned packages. Unfortunately that's not the kind of definitive answer that SO prefers, but I hope that it was helpful.
I'm working on a metadata layer to integrate the Software Updater (swupd) from Clear Linux with the Yocto Project / OpenEmbedded Core.
swupd performs whole of OS updates, rather than package-based updates, using binary deltas to only update the files which change and to do so in an efficient manner.
I recently wrote some documentation (within the docs/Guide.md file in the meta-swupd repo) about adopting the "Clear Linux Way" to utilise meta-swupd from an OE/YP based distro. A wikified version of that guide, including a link to the layer git repository, is available on the Yocto Project wiki:
https://wiki.yoctoproject.org/wiki/Meta-swupd
I also have a sample layer on Github which demonstrates use of the layer (this is also the distro layer I test much of meta-swupd with):
https://github.com/incandescant/meta-myhouse
About mender.io: I have recently talked to them regarding their open-source updater.
They already have the client side developed and are working on the server side. They use HTTP and JSON. This is their Git repository; it only supports BeagleBone and QEMU at the moment.
The way mender.io works is: there is one persistent data partition plus U-Boot, and two rootfs partitions (active and backup) to update. When there is an update on the server, users are notified to pull it down and issue a mender -rootfs image update command. If the upgrade is successful, the user then issues a mender -commit command; if there is no mender -commit, the device rolls back to the previous rootfs on the next reboot. Mender currently only supports updates of the kernel and rootfs.
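To make the dual-rootfs idea concrete, here is a minimal Python sketch of that commit/rollback flow. All helper, flag and partition names are hypothetical; this is not Mender's actual implementation.

```python
# Conceptual sketch of an A/B (dual-rootfs) update with rollback.
# All helper and flag names here are hypothetical; this is NOT Mender's real code or API.

def write_image(partition, image):
    # Stand-in for actually flashing the image to the given partition.
    print(f"writing {image} to {partition}")

def set_bootloader_flag(flag, value):
    # Stand-in for setting a U-Boot environment variable.
    print(f"setting bootloader flag {flag}={value}")

class DualRootfsUpdater:
    def __init__(self):
        self.active = "rootfs_a"   # partition we are currently booted from
        self.backup = "rootfs_b"   # partition new updates are written to
        self.pending = False       # an update was installed but not yet committed

    def install_update(self, image):
        """Write the new image to the inactive partition and allow one trial boot."""
        write_image(self.backup, image)
        set_bootloader_flag("try_once", self.backup)  # boot the new rootfs exactly once
        self.pending = True

    def commit(self):
        """Like 'mender -commit': make the freshly booted partition the permanent default."""
        self.active, self.backup = self.backup, self.active
        set_bootloader_flag("boot_default", self.active)
        self.pending = False

    def reboot_without_commit(self):
        """If nothing was committed, the bootloader falls back to the old rootfs."""
        set_bootloader_flag("boot_default", self.active)
        self.pending = False

# Example: install an update, then either commit it or let the rollback happen.
updater = DualRootfsUpdater()
updater.install_update("core-image-v2.ext4")
updater.commit()   # omit this and the next reboot returns to rootfs_a
```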
The main role of mender.io is to ensure that the mass-distributed image upgrade process can recover from errors. On the server side, mender.io has developed a management server that tracks the mass-distributed devices by UUID.
Not to advertise, but please try out mender.io and give feedback so that the software can mature.
Mender Introduction pdf
Well, you can either use package-based upgrades, like you do now. In that case, you'll need to test and verify everything locally before you push any updates to the field. Obviously, you'll likely need to improve a number of recipes. (And I assume that you upstream those improvements, right?)
The alternative is to use image-based upgrades, either with full images (see for instance the discussion at Stack Overflow: Embedded Linux mechanism for deploying firmware updates) or with swupd.
Note: I got distracted while writing this answer, so look at the answer from joshuagi; he explains a lot more of swupd.
I think there are two problems here. We (OpenEmbedded) do need to be careful that we do not break package-based updates.
Also, there are image-based updates like swupd (mentioned above) and swupdate, described at: https://sbabic.github.io/swupdate/swupdate.html
meta-updater provides support for OSTree-based updates to OE systems.
OSTree is interesting because it provides a half-way house between full image updates (which are large and tricky to handle correctly) and package based updates (which are tricky to make robust). It has a 'git-like' object representation of a root filesystem, and uses chroot and hard links to atomically switch between file system images.
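To illustrate that "git-like" model purely conceptually (this is not OSTree's actual on-disk format or API), here is a small Python sketch of a content-addressed object store whose files are hard-linked into deployment trees and switched atomically:

```python
# Conceptual sketch of the OSTree idea: a content-addressed object store,
# deployments built from hard links, and an atomic switch between them.
# This is NOT OSTree's real on-disk format or API, just an illustration.
import hashlib
import os

REPO = "repo/objects"        # object store: one file per unique content hash
DEPLOYMENTS = "deployments"  # one directory tree per filesystem "commit"
CURRENT = "current"          # symlink pointing at the active deployment

def store_object(data):
    """Store bytes under their SHA-256 hash; identical content is kept only once."""
    digest = hashlib.sha256(data).hexdigest()
    path = os.path.join(REPO, digest)
    os.makedirs(REPO, exist_ok=True)
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(data)
    return path

def deploy(commit_id, files):
    """Build a deployment tree in which every file is a hard link into the object store."""
    for relpath, data in files.items():
        obj = store_object(data)
        target = os.path.join(DEPLOYMENTS, commit_id, relpath)
        os.makedirs(os.path.dirname(target), exist_ok=True)
        os.link(obj, target)  # hard link: unchanged files cost no extra space

def switch(commit_id):
    """Atomically point 'current' at a deployment; rollback is just pointing it back."""
    tmp = CURRENT + ".tmp"
    os.symlink(os.path.join(DEPLOYMENTS, commit_id), tmp)
    os.replace(tmp, CURRENT)  # rename is atomic, so there is no half-switched state

# Two "OS versions" that share an unchanged file:
deploy("v1", {"etc/issue": b"OS 1.0\n", "usr/bin/tool": b"tool build 1"})
deploy("v2", {"etc/issue": b"OS 2.0\n", "usr/bin/tool": b"tool build 1"})
switch("v2")   # upgrade; switch("v1") would roll back
```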
(Disclosure: I'm a contributor to meta-updater)
The posts here were written years ago. Nowadays RAUC also seems to be a promising alternative to mender.io.
A few friends and I run a little app creation business in our spare time. Our current development environment is three MacBook laptops running just Snow Leopard, four Asus laptops dual-booting Windows 7 and Ubuntu, and a rubbish test server box that is similar to our VPS.
Our setup currently works okay-ish, with a few minor issues: not knowing which version of the software we are working on, caused by continually switching operating systems, and lost productivity from being too lazy to switch the laptop we are working on, having to unplug it and plug in the new one, including the second monitor, keyboard and mouse.
Our system is far from professional and we are looking to upgrade. This is because we wish to increase our staff and we have some cash saved up, so why not. The phone platforms we are targeting are iOS, Android and Windows Phone 7. Our servers are written in PHP and serve JSON. So my question is basically: how do you guys manage with all these multiple operating systems?
iOS requires Mac OS X
Android can use any of them
The PHP/JSON server work requires Linux or Mac OS X
Windows Phone 7 requires Windows
Do you guys use some form of virtualization?
Or do you try those frameworks that compile to a native binary for each phone, such as Unity?
There are many many different ways to solve this and you may have to find what works best for you. Here are some suggestions though.
Using the macbooks, set up bootcamp so you can dual boot to OSX or Windows. This will mean you can use the Macbook for all development without having to bother swapping monitors, etc. Doing this will leave your other Windows laptops spare which you can use for the next suggestion....
Set up a central repository for your sourcecode. Use one of the servers you have, or re-purpose one of the other machines and install a decent source code repository system. CVS, Git, etc. There's plenty of resources about these. This will allow you to keep your code in one place so it won't matter which machine you are working on - you can always get the most recent code. Plus it will help you track your code changes. Oh, and don't forget having it all in one place will be much easier for backups (you do do backups, don't you....?)
Don't fall into the trap of upgrading hardware just because you have some money floating around. You may just need to use the hardware you have more wisely. You mention what you have is "far from professional". You don't need the latest, greatest hardware and software to do development. I've done iOS development on a 4-year-old MacBook Pro, used an 8-year-old PC as a server for web and database, and still use Windows XP every day.
Depending on how many of you there are, you may not have enough Macbooks. If this is the case, then perhaps you have some who are specialists in the server-side stuff (ie they don't do iOS development and so don't need the Macs).
Virtualisation - using VMware or similar tools is an excellent way of getting more from what you have. For example, you could have a couple of test servers that aren't very heavily utilised. Using virtualisation, you could put both of these servers onto one machine. This will then free up the other box for something else. It also makes it very easy to backup (you are doing backups, aren't you...?) an entire server and recover it back to the exact state in the case of a hardware failure. You can also very easily create a server tailored for each client/project and switch between them quickly without having to maintain lots of other stuff (think if you had a web server configured for one project and you then work on another project that needs a different configuration and you change it, then you need to change it back, etc).
EDIT: Update in response to comments.
If using Bootcamp isn't an option, then consider running a Windows and/or Linux virtual machine inside OSX. Depending on the spec of your macbooks and as long as you don't need very low-level hardware access on Windows, then this would probably work as well and not need to switch in and out using BootCamp. Same goes for the Linux virtual machine. I'm a big fan of using Virtual Machines on development environments as it allows you to copy around and switch in and out servers without having to rely on physical hardware connections. And you can very easily return to a known state with the server configuration and data.
With regards the source control "in the cloud". I'm not a fan of this approach. It's my source code and I want to control it. I don't want to be reliant on some other company and I don't want to hope I've read some Terms and Conditions correctly and I'm not handing over my code to some other company to do what they want with it. Aside from that, what happens if your internet access goes down and you absolutely must get some coding done for a customer? If you are relying on another service, then you are risking problems. Yes, it has advantages for multi-site, they do the backups for you, etc. But it really isn't a problem unless you have lots of developers spread all across the world. And even then it isn't necessarily a problem. You could always do a backup of your code to some package file, encrypt it and then throw that up in the cloud for a backup storage (as well as burning it to disc, writing to another external hard drive and storing them off-site). But I certainly wouldn't want to rely on an external source control unless I was doing open source stuff.
There's sooooo much more to these subjects and there are many other subjects you will probably encounter along the way of building up your business.
One of the most important things about software development is to keep it organised and to get that organisation part done at the start. If you are just each keeping a copy of the code on local drives, then changing code and hoping that you haven't changed the same file as someone else, then this will just lead to pain. The source control aspect is key from the start.
Oh, and did I mention backups?
I would also consider the IDE you're using as part of the equation. For instance, a good cross-platform IDE (like Qt 4+) and a centralised code repository on a server will go a long way towards mitigating your working problems. Eclipse, NetBeans and Qt 4+ are cross-platform and will work with all 3 systems. Virtualisation, as you mentioned, is an option, but first I would decide on the IDE platforms to use before worrying about your dev infrastructure setup.
Bro, I'm not a pro, but you have two options:
Either multi-boot your system by installing multiple OSes... (obviously, you need a separate MacBook)
Or use virtual machines like VMware etc.
Personally, I haven't heard much about libraries like Unity etc.
Go for dedicated systems and not just libraries.
Programmers that actually promote their products to production need an installer. (Pre-emptive "programming related" justification.)
For deploying a new suite of internal corporate apps and services, I'm trying to decide between using WIX and the InstallShield Express edition that comes with Visual Studio 2010.
I've looked, but haven't found a feature matrix that highlights the features that are not in the express edition. I expect WIX to be generally quite capable, but more difficult to use, and have heard of situations that WIX doesn't support well.
Has anyone found a feature matrix, or have other recommendations on the long-term best way to manage internal deployments?
I find that wix is a great choice (in spite of the very very steep learning curve) if you need to manage installers in a complex environment because
setup definitions are stored in an XML format
it gives you full control over the underlying Windows Installer technology; the XML schema typically closely follows the Windows Installer database schema (which is also the main reason why the learning curve is so steep)
It is easy to integrate into your automated build
Parts of the setup can be generated automatically
It allows you to define small reusable modules and manage complex dependencies between them.
no cost or licensing issues (before wix we all had to use a single "Installshield PC")
Why the XML format is an advantage: this allows you to fully leverage code versioning systems like subversion or mercurial. Reviewing changes, examining history or even merging changes across branches is a breeze. Compare that to installshield projects which are opaque binary blobs.
What I mean by managing complex dependencies: in our case we have a big pool of reusable component libraries with a complex set of dependencies between them, and many applications that were built on top of that. Before wix, this was a nightmare when a new dependency was introduced somewhere: ALL setups had to be updated.
Now with wix, we have a ComponentGroup for each library, organized into a couple wixlibs. Each component group references other component groups that it depends on with a ComponentGroupRef. Application setup developers only need to reference the component groups of direct dependencies, and wix will do the rest by following the references. As a result, introducing a new dependency only requires making a single local change. Our automated builds and wix do the rest to regenerate all the setups.
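As a rough illustration of what "wix will do the rest by following the references" means, here is a small Python sketch of transitive ComponentGroup resolution; the group names are made up and this is not WiX's actual code.

```python
# Conceptual sketch of how transitive ComponentGroupRef resolution behaves
# (not WiX's real code). Each group lists only its direct dependencies;
# the linker follows the references, so a setup only names what it uses directly.

component_groups = {
    "App.Setup":      ["Lib.Reporting", "Lib.Networking"],  # hypothetical names
    "Lib.Reporting":  ["Lib.Core"],
    "Lib.Networking": ["Lib.Core", "Lib.Crypto"],
    "Lib.Crypto":     [],
    "Lib.Core":       [],
}

def resolve(group, seen=None):
    """Return every component group pulled into the setup, directly or indirectly."""
    if seen is None:
        seen = set()
    if group not in seen:
        seen.add(group)
        for dep in component_groups[group]:
            resolve(dep, seen)
    return seen

print(sorted(resolve("App.Setup")))
# If Lib.Networking later gains a dependency on a new group, only Lib.Networking's
# definition changes; every setup that references it picks the new group up
# automatically on the next build.
```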
InstallShield Express is for basic deployments (it's nothing but a glorified WinZip). You can also check my favorite, Advanced Installer. They also have a free express edition, but I think both of them will be of no use to you, because if you need to do anything with IIS, MS SQL, Active Directory, the GAC, etc., you will need "enterprise level" editions. WiX is free but the learning curve is so steep that it's not worth learning. I regret ever learning it.
If you need this just for internal deployments and cannot spend $1,000 on an installer, just create your own "installation" project from scratch. The System.EnterpriseServices.Internal namespace contains some useful wrappers for IIS, the GAC, etc., and System.Configuration.Install.ManagedInstallerClass can help you deploy Windows services. In other words, you can write your own program from scratch that handles all the necessary steps for deploying your primary product. Many companies don't use commercial installers for their flagship products; they make their own.
The feature matrix for Install shield can be found here:
http://www.flexerasoftware.com/products/installshield/features.htm
However, for the IIS section (I assume you need IIS based on the link to my earlier question) all it says is "Limited". It is up to you to guess what Limited means, but I am betting it will not support an enterprise level deployment.
Can anyone explain in simple terms what the difference is between configuration management and version control? From the descriptions I've been able to find on various websites, it seems like configuration management is just a fancy term for putting your config files in a source control repository. But others lead me to believe there is a more involved explanation.
Version control is necessary but not sufficient for configuration management. Version control happens in some central or distributed repository, but says nothing about where any particular version is deployed or used.
Configuration management worries about how to take what is in version control and deploy that consistently to the appropriate places, primarily QA and production, but in a large enough development operation developers as well.
For example, you may keep all of your SQL queries in version control, including your table modification scripts, but that doesn't control when those scripts are deployed to the appropriate database server and kept in sync with the deployment of any other code that relies on that database structure.
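As a toy example of that gap: the scripts live in version control, but a separate deployment step has to decide which of them a given database has already received. A minimal Python sketch (the paths and table name are invented) might look like this:

```python
# Toy illustration: version control holds the SQL scripts, but a configuration
# management step decides which ones a given database has actually received.
# Paths and the tracking-table name are invented for the example.
import pathlib
import sqlite3

def apply_pending_migrations(db_path, scripts_dir):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS applied_migrations (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM applied_migrations")}

    for script in sorted(pathlib.Path(scripts_dir).glob("*.sql")):  # e.g. 001_create_users.sql
        if script.name in applied:
            continue  # this server already has this change
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO applied_migrations (name) VALUES (?)", (script.name,))
        conn.commit()
        print("applied", script.name)

# apply_pending_migrations("qa.db", "sql/migrations")    # QA server
# apply_pending_migrations("prod.db", "sql/migrations")  # production, possibly later
```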
Configuration management includes, but is not limited to, version control.
Configuration management is everything that you need to manage in terms of a project. This includes software, hardware, tests, documentation, release management, and more. It identifies every end-user component and tracks every proposed and approved change to it from Day 1 of the project to the day the project ends.
Version control is specifically applied to computer files. This includes documents, spreadsheets, emails, source code, and more.
Version control is saving files and keeping different versions of them, so you can see the change over time.
Configuration management generally refers to an overall process which keeps track of what version of the code is on what server and how the servers are set up (including the install scripts to do so, in many places). It is the process of what happens after the code goes into source control and how it gets deployed to the servers/desktops, etc.
Configuration management is an ambiguous term.
In software, it tends to be a superset of version control, with emphasis on the entire process to produce a result in a repeatable and predictable manner.
In computing maintenance, it is related to the maintenance of the configuration settings and hardware/firmware/software versions of entire networks and sets of attached computing machines (including servers, clients, routers...).
In hardware manufacturing, it represents even a superset of the two above, including the hardware pieces and software modules needed to obtain a product, with the description of the process to manufacture them, and sometimes even the entire schemas and configurations of the production lines themselves.
In addition to everything said above I'd like to recommend Bob Aiello's book named "Configuration Management Best Practices" - http://www.amazon.com/dp/0321685865 .
It covers all aspects of Software Configuration Management including version control.
Version control is the control of deliverables whereas configuration management is managing the entire process leading to produce the deliverables. Configuration management involves change management, project management, etc., which generally are not managed by simple version control.
Roughly speaking, version control means you can check out the source for any particular version. Configuration management means you can build and deploy and probably test any particular version.
This can be helpful.
Versions and configurations
Versions:
Ability to maintain several versions of an object.
Commonly found in many software engineering and concurrent engineering environments.
Merging and reconciliation of various versions is left to the application program
Some systems maintain a version graph
Configuration:
A configuration is a collection of compatible versions of the modules of a software system (one version per module)
Version control is one of the features of a SCM system.
From the subversion user guide:
http://svnbook.red-bean.com/en/1.7/svn-book.html
"Some version control systems are also software configuration management (SCM) systems. These systems are specifically tailored to manage trees of source code and have many features that are specific to software development—such as natively understanding programming languages, or supplying tools for building software. Subversion, however, is not one of these systems. It is a general system that can be used to manage any collection of files. For you, those files might be source code—for others, anything from grocery shopping lists to digital video mixdowns and beyond."
I'm currently working at a small web development company, we mostly do campaign sites and other promotional stuff. For our first year we've been using a "server" for sharing project files, a plain windows machine with a network share. But this isn't exactly future proof.
SVN is great for code (it's what we use now), but I want to have the comfort of versioning (or at least some form of syncing) for all or most of our files.
What I essentially want is something that does what subversion does for code, but for our documents/psd/pdf files.
I realize subversion handles binary files too, but I feel it might be a bit overkill for our purposes.
It doesn't necessarily need all the bells and whistles of a full version control system, but something that removes the need for incremental naming (Notes_1.23.doc) and lessens the chance of overwriting something by mistake.
It also needs to be multiplatform, handle large files (100 mb+) and be usable by somewhat non technical people.
SVN is great for binaries, too. If you're afraid you can't compare revisions, I can tell you that it is possible for Word docs, using Tortoise.
But I do not know what you mean by "expanding the versioning". SVN is not a document management system.
Edit:
but I feel it might be a bit overkill for our purposes
If you are already using SVN and it fulfils your purposes, why bother with a second system?
If you have a windows 2003 server, you can have a look at Sharepoint Services 3.0 (http://technet.microsoft.com/en-us/windowsserver/sharepoint/bb684453.aspx).
It can do version control for documents, and has nice integration with Office, starting with Office XP, though Office 2003 and 2007 are better. Office and PDF files can be indexed (via the Adobe IFilter) and searched. You can also add IFilters to search metadata in your documents.
Regarding large files, by default the max filesize is 50MB, but it can be configured.
We've just moved over to Perforce and have been really happy with it. It's a commercial product, but it's so powerful and easy to use that it's worth the price per seat IMHO.
A decent folder structure and naming scheme?
VCS don't really handle images and such very well - would it be possible to have the code in a VCS (SVN/Git/Mercurial etc), along-side a sensible folder structure for the binary-assets (source photos, Photoshop PSD files, Illustrator files and so on)?
It wouldn't handle syncing, but a central file-server would achieve the same thing.
It would require some enforcing and kitten-herding to get people to name things properly, but I think having a version folder for each asset (like someproject/asset/header_logo/v01/header_logo_v01.psd) will basically be like a VCS, but easier to move between different revisions (no vcs checkout blah -r 234 when a client decides they preferred v02 to v03).
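If you go that route, a tiny helper script can take some of the kitten-herding out of it; this is just a sketch that assumes the someproject/asset/<name>/vNN layout described above.

```python
# Sketch: create the next vNN folder for an asset, following the
# someproject/asset/<name>/vNN/ convention described above.
import os
import re

def next_version_dir(asset_dir):
    """Find the highest existing vNN folder and create the next one."""
    existing = []
    if os.path.isdir(asset_dir):
        existing = [d for d in os.listdir(asset_dir) if re.fullmatch(r"v\d+", d)]
    next_num = max((int(d[1:]) for d in existing), default=0) + 1
    new_dir = os.path.join(asset_dir, "v{:02d}".format(next_num))
    os.makedirs(new_dir)
    return new_dir

print(next_version_dir("someproject/asset/header_logo"))  # e.g. .../v01 first time, then v02, ...
```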
Your question is interesting because you're specifying that it be suitable for a small office. At the enterprise level, I would recommend something along the lines of EMC Documentum's eRoom, but obviously that's going to be way more than you need, and more than you want cost-wise as well. I'm not sure of the licensing details on this, but I've heard that if your office has MS Office, you have access to SharePoint, which might work well for you. I'm also sure there are a lot of SaaS implementations of this kind of stuff, so you may want to look at that, keeping in mind that the servers will not be hosted by you, so if the material is extremely sensitive, that's obviously not the proper route.
You might want to consider using a Mac as your server and using Time Machine to backup your shared folders. Doing this gives you automatic backups and allows you to share through Samba so everyone can have a network drive on their computer. A Mac server is probably overkill. A Mac Mini would do for a small office or a repurposed desktop machine.
You might also consider Amazon's S3 service to do offline backups. Since it's a pay-as-you-go service this can scale with use, and if you feel you want to move to something else you can always download your data and take it somewhere else.
Windows Vista features local file versioning in its file system, which can be useful, but is limited in terms of teamwork. However, if somebody overwrites somebody else's file, a new version is stored as it should be.
Also consider KnowledgeTree. Have a look at it, some demos/screenshots are available at
http://www.knowledgetree.com/
It has a free open source Community Edition - so it's cost effective. We haven't tried it, but we chose this one over other systems for a small business looking for a document versioning solution.