Best practices for deploying data to a custom folder

Sometimes when we issue an upgrade to our application, we need to install some files into the application's Data folder. We want to make it possible for users to move this folder to a place of their liking. But how should we deal with this at install time?
I was thinking of deploying to the user's AppData folder and have the application somehow check there for new files at startup.
Any advice or references would be very welcome!
We use InnoSetup for a VB6 application if that matters for your answer.

Generally the best solution I've found is to allow the user to move the folder from within the application.
This allows the application to keep track of where its data is being kept (by adding a reference to it in a file or registry entry which it accesses at load time) and to access it seamlessly in the future.
Your update routines can then also access this information to determine where to place the update files.
Alternatively, make the folder name as distinctive as possible and add a search routine that looks for the directory in a number of sensible places at load time. Then state in your manual that the data folder can be moved to one of those locations ONLY.
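The question is about a VB6 app, but the lookup logic itself is language-neutral; here is a minimal Perl sketch of the combined approach (the pointer-file name, the data-folder name, and the search locations are all assumptions for illustration, not a convention):

use strict;
use warnings;
use File::Spec;
use File::HomeDir;

# Hypothetical pointer file the application writes whenever the user
# moves the data folder from within the app.
my $pointer = File::Spec->catfile(File::HomeDir->my_data, 'MyApp', 'datadir.txt');

sub find_data_dir {
    # Preferred: the location the application recorded at move time.
    if (-f $pointer) {
        open my $fh, '<', $pointer or die "Cannot read $pointer: $!";
        my $dir = <$fh>;
        close $fh;
        if (defined $dir) {
            chomp $dir;
            return $dir if -d $dir;
        }
    }

    # Fallback: search a fixed list of sensible places for the
    # distinctively named folder.
    for my $base (File::HomeDir->my_home, File::HomeDir->my_data) {
        my $candidate = File::Spec->catdir($base, 'MyApp-Data');
        return $candidate if -d $candidate;
    }
    die "MyApp-Data folder not found in any expected location\n";
}

Your updater then calls the same routine, so the install-time and run-time views of the data folder can never drift apart.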

Wouldn't the users just run an update or patch package? I'm not sure why they'd want or need to see such files. It's pretty rare for commercial software to offer users the option of where to store program settings and other internal-use files.
Give some thought to this before putting a lot of stuff into users' roaming profiles. You might want LocalAppData instead.
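For what it's worth, on Windows the two locations resolve roughly like this (a Perl sketch; the USERPROFILE fallback is an assumption for systems where %LOCALAPPDATA% is not set):

use strict;
use warnings;
use File::Spec;

# Roaming profile: follows the user from machine to machine.
my $roaming = $ENV{APPDATA};

# Local profile: stays on this machine; better for caches and large files.
my $local = $ENV{LOCALAPPDATA}
    // File::Spec->catdir($ENV{USERPROFILE} // '.', 'AppData', 'Local');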

Export to Java application deletes files

When I wanted to export the model I was working on as a Java application, I encountered an error regarding the databases I had loaded into the model. When I clicked OK on the error, I realized that all the files in the folder where I wanted to create the Java application had been deleted. That folder was the desktop, by the way.
Right now all the files (I mean all of them!) on my desktop are deleted, and they don't even show up in the recycle bin. How can we fix this situation? How can AnyLogic have the authority to delete every file in that folder? Why is this authority not shared with me, and why was there no warning beforehand?
When you work with software in general, you need to have version control in place that will allow you to recover your information. These problems occur, and if AnyLogic has access to your computer, it's because you granted it the permission it needs. If you make your desktop your project folder, then I would say you are partly to blame... why would you do that?
Using Git, as Ben commented, is always a good idea, but it requires you to be conscious about when you commit a version.
What I do is keep all my projects in a Dropbox folder. The good thing is that Dropbox automatically saves every file in the folder; this has saved my life multiple times, and I suggest you do something like that in the future. So on one hand you have the autosave features, which are useful, but sometimes you erase everything by mistake and autosave doesn't help, whereas Dropbox keeps the files no matter what.

symlink or an alternative solution on GCS

I am currently using Google Cloud Storage to store some files. From the PHP side, I am able to use these files just fine. Now I want to extend this functionality to store four known-good versions of these files so that I can change the file path through a symlink (or any alternative, if that's not an option) on the PHP side in case the latest set of files gets corrupted. I am wondering how to go about this.
I appreciate any suggestions that you might have.
Cloud Storage offers object versioning as a feature that you need to enable. With versioning, you can save a file under the same name and the system archives the previous version while serving the new one. In this case, if there were a corruption, you would go into the Cloud Shell and retrieve the previous copy.
If you do not wish to go that route, I suggest saving four copies with distinct names (e.g. fileName[number]). This way, you would take the newest file, extract the number from its name, and create your new file name based on that number.
In both methods, you are able to roll back to a previous version.
Cloud Storage does not allow for symlinks.
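For the distinct-names scheme, the suffix arithmetic is simple; here is a Perl sketch of the idea (the names are illustrative, and the actual bucket listing and upload would happen through your PHP/GCS client, not shown here):

use strict;
use warnings;

# Names already in the bucket, e.g. obtained from a listing call.
my @existing = qw(config1 config2 config3);

# The highest numeric suffix is the newest good version.
my ($newest) = sort { $b <=> $a } map { /(\d+)$/ ? $1 : 0 } @existing;

# Wrap around after four slots so the oldest copy gets overwritten.
my $next = $newest % 4 + 1;

print "read from config$newest, write the next version as config$next\n";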

Extend local history to save more data on changes

We are evaluating a project in which Eclipse should be extended to enable more analysis of code changes. As these changes are already stored by Eclipse in the local history, we don't want to save all the data again in separate files on the system, and we are searching for a good way to build efficiently on top of it.
More details:
Java files change multiple times over the lifecycle of a project. To build some statistics about developer habits, we want to store additional data such as last-modified time, username, some user input, and a few more things. This data should be displayed in a compare view with annotations, but a normal comparison view should also remain available.
We haven't found any extension points or classes to extend that could make this possible, so we are asking here in the hope of finding an answer or a tip for our project.
Thanks to all in advance.

Is there a way to get the system configuration files folder within a Perl script?

Tried searching for this a number of ways and have not yet found an answer ...
Background
I am working on a legacy Perl application that has a lot of hard-coded values in it which should be configurable depending on where the app is installed. So, obviously, I am looking to externalize these values into a configuration file that may be located in one of a few "expected" locations. That is, using a traditional approach of checking for the configuration file in:
the current working directory,
the user's home directory (or a sub-folder therein), and
the system configuration directory (or a sub-folder therein)
where the first one found wins.
Where I am at
Perused the CPAN site a bit and found the Config::Any package, which looks promising. I can give it a list of files to use:
use Config::Any;

my $config = Config::Any->load_files({
    files   => [qw(sample.conf /home/william/.config/sample.conf /etc/sample.conf)],
    use_ext => 0,
});
This will check for the existence of each of these files, and, if found, load the contents into an array reference of hash references. Not bad, but I still have to hard-code the locations where I search for my sample.conf file. Here, I assume that I am working on a Linux system, and that the location for the configuration file for all users of the application is /etc/. I could always add /usr/local/etc/ as well, but regardless, this is not system-agnostic.
I can locate the user home folder using File::HomeDir for searching there, and it works correctly regardless of the system on which the application is running. So is there a similar package that would provide the /etc/ folder (or its equivalent on other platforms)?
Questions
Is there a way to do this without having to know what particular OS I am on? (Perl package or code snippet)
What is the "Perl best practice" way of accomplishing this? I cannot imagine that no one else has run into this previously.
Unless you plan to run your code on non-Unix hosts, the conventional directory layout and the Filesystem Hierarchy Standard give you a fairly large set of well-known places you can rely on.
In any case, nothing prevents you from dynamically building the file search list to account for platform oddities and the platform-specific ways to obtain these directories (e.g. File::HomeDir::Win32 vs. File::HomeDir).
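To make that concrete, here is one way to build the candidate list for the Config::Any call from the question (a sketch; the Windows environment variables and the /usr/local/etc fallback are assumptions about where such files conventionally live, not something a module guarantees):

use strict;
use warnings;
use Config::Any;
use File::Spec;
use File::HomeDir;

sub candidate_config_files {
    my ($name) = @_;
    my @dirs = ( File::Spec->curdir() );    # current working directory first

    if ($^O eq 'MSWin32') {
        # Per-user and machine-wide locations on Windows.
        push @dirs, $ENV{APPDATA}     if $ENV{APPDATA};
        push @dirs, $ENV{PROGRAMDATA} if $ENV{PROGRAMDATA};
    }
    else {
        # XDG config dir if set, then the usual FHS locations.
        push @dirs, $ENV{XDG_CONFIG_HOME}
            // File::Spec->catdir(File::HomeDir->my_home, '.config');
        push @dirs, '/usr/local/etc', '/etc';
    }

    return map { File::Spec->catfile($_, $name) } @dirs;
}

my $config = Config::Any->load_files({
    files   => [ candidate_config_files('sample.conf') ],
    use_ext => 0,
});

Because load_files preserves the order of the list, "first one found wins" falls out naturally from the order in which the directories are pushed.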

In SourceSafe, is there a way to protect crucial files from being modified by some developers?

I'd rather not just keep a bunch of files permanently checked out to me, but I'd like a way to prevent some really critical files from being changed without my being aware of it.
PS Yes I know SourceSafe is terrible.
You can only set access rights on a project-by-project basis.
This is done by running the SourceSafe Administration application and then going to Tools > Rights By Project. Select the required project and then give users the required privileges.
To protect a subset of files, place them in a separate project, and thereby protect the subset.
When you go into SourceSafe, if you set the working folder of the subproject to be the same as the parent's, then when you do a Get Latest etc., all the files will end up in the same folder. If you want the protected files to be in a separate folder, set the working folder accordingly.
It's been a while since I've had to use SourceSafe, but I don't think it has this kind of functionality built in.
Can you set up a separate repository/instance that excludes the users who shouldn't be allowed to modify them?
Or failing that, just keep the files always checked out on your machine :P
Check them out with an exclusive lock.