SQL CE deployment

I have a small Windows app and am trying to use SQL CE for the local datastore. I have had a couple of problems deploying it. I am using ClickOnce deployment.
First question:
In the Publish properties -> Application Files I have it set to Data File (Auto), Required, Include. However, it doesn't seem to be included: when I navigate to the location that ClickOnce installs to, the file isn't there.
Second:
ClickOnce creates a new directory under User\Local\Apps containing the app files and the SDF file. When I update the app and release a new version, I don't want to start with a new database, yet all the data in the existing database will be lost. That just doesn't seem to make sense.
What is the procedure around this?

How to automatically pick up re-uploaded files with modifications in WildFly 10 without a server restart?

I am using a WildFly 10 server. In the UI I provide an option for users to upload images or JSP files, and the user can make use of these files in another section of the application later.
At any point in time I allow only one entry with a particular name. If the user tries to upload a file with a name that already exists, I overwrite the existing file with the new one.
In this scenario I am facing the following problem:
I have uploaded an image with the name image1.png.
Now if I change some other image's name to image1.png and upload it, the new image is not visible until I restart the server.
It looks like the older image has been cached by the server, and it is still referring to the cached copy. When I restart the server, the cache is refreshed with the new content of the file.
Is there any way I can immediately see the changes in the UI whenever I re-upload a modified file?
I am using a custom folder on my server to store the uploaded files.
Is there a way to enable deployment directory scanning for this particular directory only?
You don't have to restart the server; redeploying the application should work.
You can define another deployment scanner, or change the directory scanned by the existing scanner: http://wildscribe.github.io/WildFly/16.0/subsystem/deployment-scanner/scanner/index.html
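For example, a rough jboss-cli sketch of adding a second scanner for a custom directory (the scanner name, path, and interval below are placeholders, not values from your installation):
# hypothetical scanner watching /opt/myapp/uploads for exploded content
/subsystem=deployment-scanner/scanner=uploads:add(path="/opt/myapp/uploads", scan-interval=5000, auto-deploy-exploded=true)
Note that a scanner treats the contents of that directory as deployments, so this only helps if the uploaded files live inside something WildFly can deploy (e.g. an exploded web application).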
Another solution would be to create overlays: http://wildscribe.github.io/WildFly/16.0/deployment-overlay/index.html
Thirdly, with exploded deployments WildFly already provides the functionality you have developed: https://wildfly.org/news/2017/09/08/Exploded-deployments/ (note that all jboss-cli operations can be called using the HTTP REST API).

Packaging SF service into a single file

I am working through how to automate the build and deploy of my Service Fabric app. Currently I'm working on the package step, and while it is creating files within the pkg subfolder, it is always creating a folder hierarchy of files, not a true package in a single file. I would swear I've seen a .SFPKG file (or something similarly named) that has everything in one file (a zip, maybe?). Is there some way to create such a file with msbuild?
Here's the command line I'm using currently:
msbuild myservice.sfproj "/p:Configuration=Dev;Platform=AnyCPU" /t:Package /consoleloggerparameters:verbosity=minimal /maxcpucount
I'm concerned about not having a single file because it seems inefficient in sending a new package up to my clusters, and it's harder for me to manage a bunch of files on a build automation server.
I believe you read about the .sfpkg at
https://azure.microsoft.com/documentation/articles/service-fabric-get-started-with-a-local-cluster
Note that internally we do not yet support provisioning a .sfpkg file. This is a feature that will be coming soon (date TBD). Instead, we upload each file in the application package.
Update (SF 6.1 - April 2018)
Since 6.1 it is possible to create a ZIP file (*.sfpkg) and upload it to an external store. Service Fabric executes a GET operation to download the sfpkg application package. For more info see https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-package-apps#create-an-sfpkg
NOTE: This only works with external provisioning, the Azure image store still doesn't support sfpkg files.
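If you are on 6.1 or later, the .sfpkg is simply the contents of the package folder zipped up with the extension changed; here is a minimal PowerShell sketch, assuming the pkg\Dev output folder produced by the msbuild command above and a made-up package name:
# zip the packaged output and rename it to .sfpkg (paths are placeholders)
Compress-Archive -Path .\pkg\Dev\* -DestinationPath .\MyService.zip -Force
Rename-Item .\MyService.zip MyService.sfpkg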

Web2py transfer site

I have a Web2py site that I want to transfer to another computer. I'll do an SQL dump for the (external) database, but does anyone have experience of transferring the Web2py site itself? Which files do I need to copy to the new machine?
Thanks everyone.
You should be fine just copying the application folder. You can exclude the /cache, /errors, and /sessions subfolders. Make sure you restore the database before running the application (or if you want web2py to re-create the database tables, make sure migrations are enabled and do not copy the contents of the /databases folder).
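For example, a minimal rsync sketch of that copy (application name, paths, and host are placeholders; add an --exclude for databases/ only if you want web2py to re-create the tables via migrations):
# copy the app folder, skipping the cache/errors/sessions subfolders
rsync -av --exclude='cache/' --exclude='errors/' --exclude='sessions/' \
    /path/to/web2py/applications/myapp/ user@newhost:/path/to/web2py/applications/myapp/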

How to deploy a parent-child SSIS package to a server

Hi all, I am very new to SSIS. I have an SSIS package developed by someone else; this package reads data from flat files and stores it in a database after mapping.
Flow:
1) The first package extracts records from the flat file and stores them in a table.
2) Then it calls the child package using an Execute Package Task.
3) Then the child package does some calculations and updates the database table.
SSIS uses an environment variable to get the database information.
Everything is working fine, but now I want to deploy this package to my client's server.
Question: Do I need to copy the files from the bin folder and paste them onto the client's machine?
What I tried: I copied the files from the bin folder and placed them on my local computer. Then I created a job in MSSQL and ran it. The package ran perfectly. But later I changed the location of my project and the problem started: the job stopped working.
Issue: The error says the location of the child package is not available (since I moved my project files).
Kindly suggest what to do.
I am going to make several assumptions here, so please correct me if I get any wrong.
The problem, I am guessing, is that on your Package.dtsx the connection manager is currently linked to the package location within the project folder. You want to change it to another location, but the package in the connection manager is still pointing to the project location.
If I were you I would do the following:
Create a string variable
PackageFolderPath - C:\CurrentPackagePath\DBPackage.dtsx
Now what you want to do is go to the package connection manager, and under its properties add an expression for ConnectionString with the following: @[User::PackageFolderPath]. If you evaluate the expression, it should give you the location you set up in your variable.
Please note, however, that if you want this to work on the development system, point the variable at the project location.
Once you have that set up, copy the files across to the new server, and in the SQL Agent job go to the Set Values tab and add the following:
\Package.Variables[User::PackageFolderPath].Properties[Value]
Under Value, put wherever the child package is now located.
This should now pick up the new location of the package when it is run.
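For reference, you can test the same value override from the command line with dtexec before wiring it into the Agent job; the file names and paths below are placeholders, not your actual locations:
rem hypothetical paths for the parent and child packages on the new server
dtexec /F "D:\Deployed\ParentPackage.dtsx" /SET \Package.Variables[User::PackageFolderPath].Properties[Value];"D:\Deployed\DBPackage.dtsx"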
A better way to do this would be to use the deployment utility and an XML configuration on the package; however, this approach should work.

What is the best deployment practice when using MODX?

It is convenient to have a DEVELOPMENT version of the application on your local machine, deploy it to a STAGE server for testing (optional), and then deploy it to the PRODUCTION server. You can do this relatively easily when there is a clear separation of code and data in the project (for example, if we store all the code and settings in project files and the data in the database).
MODX stores templates, snippets, etc. in the database. Yes, we can move this code to static files and then use a version control system to track changes to these items. But these items still have corresponding rows in the database, which means we must update the database just as before whenever we add or remove items.
It also looks like we can run into trouble if we just copy the files of extensions instead of installing them through the package manager (because extensions often have their own tables in the DB).
Another problem is that applications on DEV and PROD have different settings stored in files (configs) and in the database (e.g., user accounts).
I still do not see a clear way to organize an iterative DEV-STAGE-PROD development cycle. So, my questions are:
Which files and database tables should (or must) I copy when deploying?
In which mode (replace, ignore) should I do that?
What is the easiest and fastest way to do that?
My biggest concern here is having to deal with the database.
P.S. I'm talking about "Revolution" version of MODX if it matters.
The database should not store any path information at all. Previous versions did, in the modx_workspaces table, but that has since disappeared [as of 2.2.4, I believe].
If you are concerned about the URL changes [dev.mysite.com / stage.mysite.com / production...], don't be; this is all in the .htaccess file [there used to be a site_url system setting, but it also seems to have disappeared].
The only file you need to worry about is core/config/config.inc.php ~ create three different files with the different paths, or just replace it when you migrate.
My process for moving/updating/migrating MODX sites is:
clear the cache!!
tar cvfz httpdocs.tar.gz httpdocs/
mysqldump -u <username> -p the_database > export.sql
move the files, tar xvfz & import the database.
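On the target server the reverse is roughly as follows (the database user and the per-environment config file name are placeholders for whatever you use):
# unpack the files and import the dump
tar xvfz httpdocs.tar.gz
mysql -u dbuser -p the_database < export.sql
# swap in the config for this environment (assumes you keep per-environment copies of config.inc.php as suggested above)
cp core/config/config.inc.php.production core/config/config.inc.php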
It's a good idea to check the modx_workspaces table, and if you have used an older version of Gallery, check that as well, but most plugins & developers seem to be used to NOT storing path information in code & DB tables.
Of course, if you have hardened your installation there are a few more steps, but nothing major. [See the "Hardening MODX" article on rtfm.modx.com.]
I think what you're looking for is this plugin (depending on your version of MODX):
https://github.com/digitalbutter/MODX-Mirror
https://github.com/digitalbutter/FEM
All Chunks, Snippets, etc. are located on disk. Any changes made to the files will trigger the appropriate database changes without the need to do a complete SQL import/re-import. This allows for any version control system / distributed development environment / automated deployment.