I have an Umbraco API project that uses the UmbracoCms library v7.15.3. The project works properly, but for some reason, after I copied the project to another computer, the API stopped working there and requests no longer reach the controller.
Is there a reason for something like this to happen?
Umbraco gives you a web application which you need to install and set up somewhere before it will work.
You probably used their installer initially which created an IIS website and a database for you. You don't need to re-run the installer, but you do need to copy the database and all the files and re-create the website in IIS on the other computer.
Simply moving files around is not enough.
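One common symptom after a copy is that the site can no longer reach its database, so requests die before your controllers run. As a hedged sketch, this is the web.config entry to check, with placeholder server and database names (umbracoDbDSN is the connection string name Umbraco 7 uses):

    <connectionStrings>
      <!-- must point at the copied database on the new machine -->
      <add name="umbracoDbDSN"
           connectionString="Server=NEWPC\SQLEXPRESS;Database=UmbracoDb;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>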
This is a project I've been working on off and on for months and I feel like I'm pretty close, but I just can't seem to get past the final hurdle.
The goal is to develop an organization extension library that contains both internal and 3rd party code that we frequently rely on.
History
As a test project, I started with Apache POI because it is already in wide use in our environment. I have a plug-in and feature built from just the POI .jars that allows me to build our current POI applications, as long as I add the plug-in (from my workspace) to my build path. The apps work on the servers because we have already distributed the POI .jars by manually copying them.
The next step is taking that plug-in and getting it into an update site so that all of the servers and developers can synchronize on one version. I found and followed these two excellent blog articles (which I wish had existed when I started this project):
http://www.dalsgaard-data.eu/blog/wrap-an-existing-jar-file-into-a-plug-in/
http://www.dalsgaard-data.eu/blog/deploy-an-eclipse-update-site-to-ibm-domino-and-ibm-domino-designer/
One caveat: the articles are written for Domino 9 and we are running 8.5.3 here, but that only matters in the last (installation) step.
Current
This brings us to the problem. All of the above seems to have worked great up to a point: I can install my feature into my Designer client from the Eclipse update site and it works great. However, the install fails when I import the feature into our updatesite.nsf database. This means that while the developers can all install from the update site if I put it on a network drive, that doesn't deploy updates to our servers.
The problem is that when I try to install from the .nsf update site, the Eclipse Updater just hangs. I've let it go for well over an hour and eventually Notes becomes completely unresponsive.
So the question is: is there anything I might have done wrong, either in the development of the plug-in or in the server configuration, that might be causing this issue?
Additional Info
I'm looking at the OSGi console and that is largely unhelpful. I am getting the following error as I'm trying to install:

    SEVERE Could not access digest on the site: no protocol: 0/5B004DDD5E38F3FF85257CAF004C72C7/$file/digest.zip ::class.method=unknown ::thread=Worker-7 ::loggername=org.eclipse.update.core
I could generate dumps if that would be useful.
Security is also locked down fairly tight here. It could be a security issue - is there a way to troubleshoot that? Once I get to the hang I'm just stuck guessing.
I know this post is over five years old, but for those who find it while trying to resolve the error

    SEVERE Could not access digest on the site: no protocol:

the cause is the update site project not having the URL of the Domino updatesite.nsf added to the Archives tab of its site.xml.
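For illustration, a hedged sketch of what the resulting site.xml might contain once the Archives tab has an entry; the server URL, archive path, and feature ID are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <site>
       <!-- The Archives tab entry: maps the plug-in path to the absolute URL
            of the updatesite.nsf, so lookups no longer use a protocol-less
            relative path (the cause of the "no protocol" error). -->
       <archive path="plugins/com.example.poi_1.0.0.jar"
                url="http://dominoserver.example.com/updatesite.nsf/"/>
       <feature url="features/com.example.poi.feature_1.0.0.jar"
                id="com.example.poi.feature" version="1.0.0"/>
    </site>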
I found the updatesite.nsf also needs to be anonymously accessible, as no credentials are prompted for or passed through to the Domino server hosting the database (at least from DDE; YMMV from Eclipse). So if anonymous connections are blocked on the Domino server, you are out of luck.
To develop a plug-in you really want to have 3 projects:
the plug-in
the feature
the update site
Of course a feature can contain more than one plug-in (and probably should), and an update site can contain more than one feature (and probably should). Once you have an update site project, it features a handy "Build All" button that compiles the plug-in, feature, and update site in one go. That button is what you really want.
You can point your Domino Designer (or local Domino server) to the feature directory using a setting: add a plain-text .link file to framework/rcp/eclipse/links containing the path to your install site, and it will pick up the features and plug-ins from there. After a build you need to restart Designer/the server to activate the updated feature.
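For example, a .link file is a single line of plain text (the path here is a placeholder):

    path=C:/build/com.example.updatesite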
For the Domino server, the approach using an updatesite.nsf and the corresponding notes.ini setting makes the most sense (to me). An HTTP restart is required. Lazy people script the whole thing.
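As a sketch, on 8.5.2 and later the notes.ini line is the dynamic-bundles setting (the database name is whatever your update site NSF is called):

    OSGI_HTTP_DYNAMIC_BUNDLES=updatesite.nsf

and then restart HTTP from the server console:

    restart task http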
I still don't have a great answer for this, but I believe the issue is related to the environment here. I don't have the authority to change the environment, even if I were able to conclusively demonstrate it is the cause of this problem, so it is a moot point. All I can say is that at least one administrator computer had no issue installing from the update site.
For me, the solution for distributing the update site is to put it on a network drive and have everyone install it from there. The server has no problem using it from the updatesite.nsf.
I'm going to deploy an application with ClickOnce and wondered about the best technique for dealing with the database. Since ClickOnce can install SQL Server for me, I was wondering how I can determine that SQL Server is installed, so that when the application is first run it sees it and creates the SQL Server database.
I would not use a Windows Installer to distribute this application, because there will be multiple update versions and it is easier with ClickOnce.
Which flavor of SQL Server are you using? Here's some helpful information.
If it is SQL Server Express, you need to add the database to your project, set the build action to 'Content', and set 'Copy to Output Directory' to 'Copy if newer'. This will ensure the database is included in the deployment.
Next, go into the Prerequisites dialog and select SQL Server Express. When the user runs the setup.exe, it will check whether it is installed and, if not, will install it.
If you want the newest version of SQL Server Express, you can find out how to create a bootstrapper package here -- Microsoft doesn't provide one, but this article provides the XML you need and links to the SQL Server Express downloads.
If you are using SQL Server CE, you need to add the database (*.sdf) to your project and set the properties as noted above. However, you do not need to publish this as a prerequisite; you can just include the DLLs in your project as noted here.
When you publish a new version, if the database has changed, ClickOnce will put the new database in the DataDirectory and put the old one in the \pre subfolder of the DataDirectory, and you have to write code to handle that. This sounds appealing, but I think it's dangerous: if you so much as open your database to look at the structure, the date/time stamp changes, ClickOnce thinks the database is new and publishes it, and you will get calls from your customers about missing data unless you handle this.
So I usually recommend you copy the database to LocalApplicationData when the user first installs your application, and handle any updates to the structure programmatically after that. There is an article about how to do that here.
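Below is a minimal sketch of that first-run copy, assuming a SQL Server CE database file; it is not the linked article's code, and the company, app, and file names are placeholders:

    using System;
    using System.IO;

    // Hedged sketch: copy the deployed database to LocalApplicationData on
    // first run, then always open it from there.
    static class LocalDatabase
    {
        public static string EnsurePresent()
        {
            string dataDir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                "MyCompany", "MyApp");
            string target = Path.Combine(dataDir, "MyData.sdf");

            if (!File.Exists(target))
            {
                Directory.CreateDirectory(dataDir);
                // The application directory holds the copy that was deployed
                // with build action 'Content'.
                string source = Path.Combine(
                    AppDomain.CurrentDomain.BaseDirectory, "MyData.sdf");
                File.Copy(source, target);
            }
            return target; // point your connection string at this path
        }
    }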
I am working on my first Drupal project on XAMPP on my MacBook. It's a prototype and has received positive feedback from my client.
I am going to deploy the project to a Linux VPS in two weeks. Is there a better way than redoing everything on the server from scratch?
install Drupal
download modules (CCK, Views, Date, Calendar)
create the Contents
...
Thanks
A couple of tips:
Use source control, NOT FTP/etc., for the files. It doesn't matter what you use; we tend to spin up an Unfuddle.com Subversion account for each client so they have a place to log bugs as well, but the critical first step is getting the full source tree of your site into version control. When changes are made on the testing or staging server, you see if they work, you commit, then you update on the live server. Rollbacks and deployment get a lot, lot simpler. For clusters of multiple webheads you can repeat the process, or rsync from a single 'canonical' server.
If you use SVN, though, you can also use CVS checkouts of Drupal and other modules/themes, and the SVN/CVS metadata will be able to live beside each other happily.
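The basic loop from that first tip, sketched with placeholder hosts and paths:

    # commit what you verified on the testing/staging server...
    svn commit -m "Theme fix, verified on staging"
    # ...then update the checkout on the live server
    ssh deploy@live.example.com 'cd /var/www/site && svn update'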
For bulky folders like the files directory, use a symlink in the 'proper' location to point to a server-side directory outside of the webroot. That lets your source control repo include all the code and a symlink, instead of all the code and all the files users have uploaded.
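For example (paths are placeholders):

    # keep user uploads outside the webroot/repo and link them in
    mkdir -p /srv/site-files
    ln -s /srv/site-files /var/www/site/sites/default/files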
Databases are trickier; cleaning up the dev/staging DB and pushing it to live is easiest for the initial rollout but there are a few wrinkles when doing incremental DB updates if users on the live site are also generating content.
I did a presentation on Drupal deployment best practices last year. Feel free to check the slides out.
Features.module is an extremely powerful tool for managing Drupal configuration changes.
Content Types, CCK settings, Views, Drupal Variables, Contexts, Imagecache presets, Menus, Taxonomies, and Permissions can all be rolled into a feature, which can be checked into version control. From there, deploying a new site, or pushing changes to an existing one, is easily managed with the Features UI or Drush.
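For instance, a typical Features round trip with Drush might look like the sketch below; the feature name and paths are placeholders, and the commands are from the Drupal 6/7-era Features module:

    # refresh the feature module with the site's current config
    drush features-update my_site_config
    svn commit -m "Update exported content types and views" \
        sites/all/modules/features/my_site_config
    # on the target site, after updating the code:
    drush features-revert my_site_config -y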
Make sure you install Strongarm.module for exporting Drupal config that gets stored in your variables table. You can also export static content/nodes (i.e. about us, FAQs, etc.) into Features by installing uuid_features.module.
Hands down, this is the best way to work with other developers on the same site, and to move your site from Development to Testing to Staging and Production.
We've had an extensive discussion on this at my workplace, and the way we finally settled on was pushing code updates (including modules and themes) from development to staging to production. We're using Subversion for this, and it's working well so far.
What's particularly important is that you automate a process for pushing the database back from production, so that your developers can keep their copies of the database as close to production as possible. In a mission-critical environment, you want to be absolutely certain a module update isn't going to hose your database. The process we use is as follows:
Install a module on the development server.
Take note of whatever changes and updates were necessary. If there are any hitches, revert and do again until you have a solid, error-free process.
Test your changes! Repeat your testing process as a normal, logged-in user, and again as an anonymous user.
If the update process involved anything other than running update.php, then write a script to do it.
Copy the production database to your staging server, and perform the same steps immediately. If it fails, diagnose the failure and return to step 1. Otherwise, continue.
Test your changes!
BACK UP YOUR PRODUCTION DATABASE and TAKE NOTE OF THE REVISION YOU HAVE CHECKED OUT FROM SVN.
Put your production Drupal in maintenance mode, run "svn update" on your production tree, and go through your update process.
Take Drupal out of maintenance mode and test everything (as admin, as a regular user, and anonymously).
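A hedged sketch of those last three steps as shell commands, assuming a Drupal 6-era site managed in SVN with Drush available on the production host (credentials and database names are placeholders, and site_offline is the Drupal 6 variable name; Drupal 7 uses maintenance_mode):

    mysqldump -u produser -p proddb > backup-$(date +%F).sql  # back up production
    svn info | grep Revision                                  # note the checked-out revision
    drush vset site_offline 1                                 # put the site in maintenance mode
    svn update                                                # pull the tested code
    drush updatedb -y                                         # run update.php from the command line
    drush vset site_offline 0                                 # back online; now test everything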
And that's it. One thing you can never really expect for a community framework such as Drupal is to be able to move your database from testing to production after you go live. From then on, all database moves are from production to testing, which complicates the deployment process somewhat. Be careful! :)
We use the Features module extensively to capture features and then install them easily at the production site.
I'm surprised that no one has mentioned the Deploy module. Here is an excerpt from its project page:
... designed to allow users to easily stage content from one Drupal site to another. Deploy automatically manages dependencies between entities (like node references). It is designed to have a rich API which can be easily extended to be used in a variety of content staging situations.
I don't work with Drupal, but I do work with Joomla a lot. I deploy by archiving all the files in the web root (tar and gzip in my case, but you could use zip) and then uploading and expanding that archive on the production server. I then take a SQL dump (mysqldump -u user -h host -p databasename > dump.sql), upload that, and use the reverse command to insert the data (mysql -u produser -h prodDBserver -p prodDatabase < dump.sql). If you don't have shell access you can upload the files one at a time and write a PHP script to import dump.sql.
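Scripted end to end, that workflow looks roughly like this (hosts, paths, and credentials are placeholders):

    tar -czf site.tar.gz -C /var/www/html .               # archive the web root
    mysqldump -u user -h host -p databasename > dump.sql  # dump the database
    scp site.tar.gz dump.sql produser@prodhost:~/         # upload both
    # then, on the production server:
    tar -xzf site.tar.gz -C /var/www/html
    mysql -u produser -h prodDBserver -p prodDatabase < dump.sql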
Any version control system (Git, SVN) + the Features module will deploy Drupal code + custom settings (content types, custom fields, module dependencies, views, etc.).
As the Deploy module is still in development, you may like to use the Node export module in Drupal 7 to deploy your content/nodes.
If you're new to deployment (and/or Drupal), then be sure to do everything in one lump.
You have to be quite careful once there are users affecting content while you are working on another copy.
It is possible to leave the tables that relate to actual content, taxonomy, users, etc. rather than their structure, and push only the ones relating to configuration. However, this adds an order of magnitude of complexity.
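As a hedged illustration of that split (the table names are illustrative only; the exact configuration tables vary by Drupal version and installed modules):

    # dump only configuration tables, leaving content/user tables untouched
    mysqldump -u devuser -p devdb system variable > config.sql
    # load them into the live database
    mysql -u produser -p proddb < config.sql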
Apologies if deployment is old hat to you and this is thus vaguely insulting.
A good strategy that I have found and am currently implementing is to use a combination of the deploy module to migrate my content, and then drush along with dbscripts to merge and update the core and modules. It takes care of database merging even if you have live content, security and module updates, and I currently have mine set up to work with svn.
I'm a Drupal newbie. Is it possible to set everything up locally and then deploy Drupal to the server? I mean things like putting in the content, setting up the modules, etc., and then pushing it all to the production server?
Of course.
copy all the files
edit the database credentials (sites/default/settings.php)
export the database content via mysqldump or phpMyAdmin (supposing you use MySQL)
import the database content at the target server
I've done it several times.
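A minimal sketch of those steps (hosts, paths, and credentials are placeholders):

    rsync -az /var/www/drupal/ deploy@prod.example.com:/var/www/drupal/  # copy all the files
    mysqldump -u devuser -p devdb > drupal.sql                           # export the database
    mysql -h prod-db.example.com -u produser -p proddb < drupal.sql      # import at the target
    # finally, edit sites/default/settings.php on the target so the
    # database credentials point at the production database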