Is there something similar to Futon for CouchDB?
Also, where is the source code for Futon?
Is it FOSS?
The source code for Futon is included in the CouchDB project. It is pretty much a self-contained application, so you will pick up a lot of knowledge and experience by reading it. (Of course, it is only useful with CouchDB; that is why its source lives inside the CouchDB project.)
If you download the CouchDB source code, Futon is in the share/www/ folder: https://github.com/apache/couchdb/tree/1.1.1/share/www
I am looking for an easier way to manage my Eclipse code snippets. I know and have used Eclipse's template and snippet features, but as far as I have found, they can only be exported and imported as XML files.
Since I use many versions of Eclipse and keep migrating between machines, managing the snippets is a hassle. I am looking for an UltiSnips-like way to manage these snippets/templates. Is there one?
I also looked at Snip2Code, but it didn't appeal to me because I sometimes work offline. Also, I want much greater control over the snippets using version control.
Oh well! I couldn't find anything that matched my set of requirements, so I've ended up creating my own.
Situation:
I need several swf/exe output files compiled in FlashDevelop from several projects. More than 60% of the ActionScript 3.0 source is common to all projects; the rest is project-specific. How can I organize that in FlashDevelop? I want a "one click to build all" setup without duplicating the common codebase (so when I need to fix something I do not have to copy-paste the fix into several files).
All sources are under development and will change very often.
A straightforward solution is to make an external classpath, for instance:
c:\dev\shared_src\
c:\dev\project1\
c:\dev\project2\
Then configure each project:
Project Properties > Classpath
Add Classpath > select '../shared_src'
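For reference, that setting ends up as a classpath entry inside the .as3proj project file. Roughly like this (sketched from memory, element names may vary slightly between FlashDevelop versions; the path is the example above):
  <classpaths>
    <class path="src" />
    <class path="../shared_src" />
  </classpaths>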
PS: of course you should keep everything under source control.
Using svn:externals you could structure your repository so that the common parts are stored just once in the source control system, and changes can be synchronised with a single commit-and-update cycle.
For example, imagine that you have ^/ProjectA and ^/ProjectB, each of which requires ^/Common as a subdirectory.
Using svn:externals, pull ^/Common into both projects.
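With a reasonably recent client (svn 1.5 or later, which supports the ^/ relative URL syntax), setting the property looks roughly like this; the paths are only illustrative:
cd ProjectA
svn propset svn:externals "^/Common Common" .
svn commit -m "Pull shared code in as an external"
svn update
After the update, Common appears inside ProjectA as a working copy of ^/Common, and the same property can be set on ProjectB.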
The exact steps depend on the version of svn you use and on any client you use (such as TortoiseSVN); refer to the relevant edition of the svn book for specifics.
How easy this is to implement depends quite a lot on how cleanly the common code is already separated in your application. Pulling in directories as directories is much more practical than trying to pull files into an existing directory, and unfortunately wildcards in file paths are not supported.
However, based on your description of your aim, this is the most straightforward solution I can imagine.
Hope this helps.
This is probably a really silly question, but my workplace is kind of new to embedded Linux and we aren't really sure how we should put our code under source control.
We'll be getting a package from Freescale, and if it's anything like our OMAP package it'll probably be pretty big. Is it a good idea to just check everything in, or split it up into different repos? Should we leave some stuff out?
We do have some experience with Windows CE; we never really checked everything in, just the stuff we used in the board support package, and checked it out over the WINCE600 folder.
There are too many variables to give a definitive answer.
The right approach depends on:
your application and how heavy your changes are
your build system
You're talking about Freescale and embedded Linux, so you may want to look at OpenEmbedded. If you use that or any similar solution, you don't need to put anything under version control except your own sources and your recipes.
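As a rough sketch of that idea (all the names here are invented), the repository would then contain only your own material:
our-product.git/
    apps/            - our own applications and libraries
    recipes/         - our OpenEmbedded recipes / build metadata
    build-scripts/   - wrapper scripts to reproduce the image build
while the vendor BSP and upstream sources are fetched by the build system rather than checked in.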
But if you're customizing the way you build the system, then I'd definitely put the sources and the build scripts under version control.
Anyone out there using LiveCode in a multi-developer project?
Either way, can someone recommend a good source control system / plugin to use?
We've looked at MagicCarpet, but since it is no longer developed we wish to use something else.
Thanks
I'm working on a solution to this problem that exports the stack file as a structured directory of script, JSON and image files, which will diff and merge nicely in most VCSs. It is not yet available, but the intention is that it will be open source. My goal is to demonstrate it at the RunRevLive conference in May.
Here's the repo for lcVCS: https://github.com/montegoulding/lcVCS
I have put a git library stack on revOnline (libVersionControl) that exports to structured XML files that git can handle. It works as far as it goes, but I hope Monte's solution will supersede this effort.
revOnline link to stack
Yes, our team has been using LiveCode with multiple developers. Since the LiveCode community is still young, good source control tools are hard to come by. Our solution has been to break code into modules (stack files). When there are updates to merge into the main codebase, we clone our existing codebase and merge the code changes manually, using line-by-line comparison in a text editor. This is not a fun process, but it is much less painful than it sounds.
If I were to redesign the system, we would simply use Git (GitHub.com etc.). There is no reason this would not work with LiveCode stacks.
We use LiveCode in a small team with Subversion.
We don't have a perfect solution, but it is very lightweight: we all use a custom extension to the standard toolbar which, among other things, has a 'save+backup' button. When we started using Subversion, we added code to this button that saves an XML sidecar file for the stack. The file contains all the scripts, custom properties, and optionally fields (controlled by a user property in each stack). In our case almost all of our work is in scripts, so this works for us.
The effect is that each time we commit to SVN, we always commit two files, the LiveCode stack and the accompanying sidecar file; the latter works fine for diffing etc.
Where this lets us down is that we don't have any solution for merging. If we were working on larger systems more actively, I expect we'd look at turning the sidecar format into a complete folder of files. For now, however, this makes the situation workable (and generating the sidecar file takes no noticeable time).
Happy to share code if that would be useful.
I know of a tool that's being worked on that is going to really help in this regard. When its author showed it to me it already looked very functional, but I'm not sure when he will share it with the community.
So the point is, it's just a matter of time before people's efforts come together into a turnkey solution for this.
A project I'm about to start will produce documentation.
What is the best practice for this?
Should the documents live with the code and assets, or should there be a separate documentation store?
Edit
I'd like a wiki, but I will need to print the documents etc. It's a university project.
It really depends on your team. Where I work, we keep documentation in a wiki which is linked into our team website. For shipping documentation, the wiki can be exported and run through a parser that "fancifies" the look and feel for customers.
Storing the documentation with the code (typically in your source repository) is not a bad idea. Just make sure to keep them separated; for example, keep a docs folder at the same level as your src folder in the repository. This way you can quickly ship the current documentation, easily track revisions, and anybody new to the project can jump straight in without having to go to multiple locations for information.
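For illustration only (the folder names are just examples), that kind of layout might look like:
project/
    src/      - application code
    docs/     - requirements, design notes, user guide
    build/    - build scripts
This keeps the documentation versioned in lockstep with the code it describes while still leaving it clearly separated.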
Storing it in source control is fine.
This is an interesting question. Basically, what others are saying about generated documentation is right: source files, templates, etc. should be stored in source control, and the documentation generated during your build process.
As far as requirements/specs/etc. documentation goes, I have worked both ways, and I very much prefer SharePoint or a wiki/document portal designed for document sharing and versioning. The reason is that most non-developer folks aren't comfortable working with source control systems, and you don't gain any of the advantages of intelligent merging if you are using a binary format like Word. Plus it's nice to have internet-based access so that a distributed team can reference and work on the docs without people having to install extra software.
Here's a 2017 summary of the options and my experience:
(extreme 1) Completely external (e.g. a wiki, Google Docs, LaTeX, MS Word, MS OneDrive)
People aren't bothered about keeping it up to date (half of them don't even know where to find the page that needs updating, since it's so far out of the trenches).
Wiki platforms are “captive user interfaces”: your data gets stored in their proprietary schemas and is not easy to examine with a simple text editor (Confluence is even worse in that you no longer have access to the plain-text content at all).
(extreme 2) Completely internal (e.g. javadoc)
Pollutes the source code, and is usually too low-level to be of any use. Well-written source code is still the best form of low-level documentation.
However, I feel package-info.java files are underutilized.
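As a small illustration (the package name and description here are made up), a package-info.java gives package-level javadoc a natural home:
/**
 * Billing: invoice generation and payment reconciliation.
 * Most callers should go through the public service classes in this package.
 */
package com.example.billing;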
(balance) Colocated documentation (e.g. README.md)
A good halfway solution, with the benefits of version control. If a single README.md file is not enough, consider a doc/ folder. The only drawback I've seen is deciding whether to put helpful graphics (e.g. PNG files) under source control and risk bloating the repo.
One interesting way to avoid this problem is to use plain-text diagram tools (I find Grapheasy and Text Diagram to be a breath of fresh air).
Plain text can still be read easily even if your rendering engine changes as the years go by.
GitHub's success is in no small part thanks to the README.md located in the root of each project.
One tiny disadvantage of this approach, though, is that your continuous integration system will trigger a new build each time you edit the README.md file.
If you are writing versioned user documentation associated with each release of the product, then it makes sense to put the documentation in source control along with its associated product release.
If you are writing internal developer documentation, use automated source code documentation (javadoc, doxygen, .NET annotations, etc.) for source-level documentation and a project wiki for design-level documentation.
I think most of us in the industry are not really following best practices, and of course it also depends a lot on your situation.
In an agile environment with a very iterative release process, you will want to "travel light". In this particular case, Jason's suggestion of a separate wiki works really well.
In a waterfall/big-bang model, you have a better opportunity to produce a decent documentation update with each new release. You will also need to clearly document which version of the requirements was agreed on, and keep records of every tiny change you make to the requirements (because of the effects it has on subsequent stages). In that case the documentation is often best kept together with the version-controlled source code.
Are you using any sort of auto-documentation, or is it completely manual? Assuming you are using an auto-documentation system, the documentation is more or less generated on the fly and is part of the code itself.
To me (assuming that it's possible with whatever language you are using), this would be the preferred way of handling it, as you wouldn't need to maintain a separate documentation source at all.