I have a PHP Azure project which I have to manage with PowerShell cmdlets. One of these, Publish-AzureServiceProject, doesn't seem to be detecting file changes, so these are not updated in the cloud (even though no errors are displayed).
I have remote desktop'd into the machines and the code definitely hasn't been updated in weeks.
If I deploy to the local emulator, it is fine, but that is much more obvious because it displays "removing old package" and "creating local package". The cloud package definitely contains the latest files, so the packaging is working fine.
Can anyone tell me how to force the publish to update the files on the cloud and, more importantly, why this is not happening? Also, if I force the update, will it deploy to a new box and get a new IP address?
Thanks.
It seems to work now.
I removed and reinstalled the Azure libraries on my machine, created a new project from scratch and copied the original files over into it. I have not included diagnostics (not sure if that's an issue) and I have modified the Publish-AzureServiceProject script to select the subscription each time before it publishes.
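In case it helps anyone, the subscription-selection step I added looks essentially like this (a minimal sketch using the classic Azure PowerShell cmdlets; the subscription name is a placeholder for your own):

    # Make the intended subscription the current one before every publish
    # ("My Primary Subscription" is a placeholder).
    Select-AzureSubscription -SubscriptionName "My Primary Subscription"

    # Publish the service project from its root folder; -Launch opens the site afterwards.
    Publish-AzureServiceProject -Launch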
It is possible that the subscription confusion was not helping (I have two Azure subscriptions, and it might have used the wrong one at some point and done something weird). There may also have been a conflict between the various versions of the Azure SDK I have installed over the past six months, but at the moment all is good.
A related article on my blog here: Problems with PHP Azure
Thanks for the interest
The problem
So I am running into an interesting issue. I have been tasked to change a query in a simple SSIS package in Visual Studio 2015, something I have done multiple times in the last six months.
After changing the package and deploying it (to an installation of SQL Server 2016, without errors!) I noticed that the execution of the package (scheduled with SSMS) generates the same result as the pre-update package, meaning the requested changes hadn't taken effect. Of course, as a test, I executed the package directly from VS2015 and got the result I wanted.
Ever since, I have been running tests and trying to find a solution. The problem seems to lie with the receiving side of the deployment process.
What I have tried
Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed, but the package didn't show up, so I had to restore an old version of the project.
Deploy the package from multiple different computers with access to VS2015 and the source code. No change...
Deploy the package to a new (empty) SSMS project: package does not appear in the project. This leads me to believe that the old package is kept when I publish the new version to the existing project in SSMS.
Regenerating/rebuilding the package in VS2015; frankly this was never necessary and probably doesn't do anything for an SSIS package, but it may help you get an idea of my skill level.
In the past we have had issues with the encryption level blocking the deployment of packages. I have verified these settings and found no issues.
I have verified if any updates have recently been installed to the database server, which does not seem to be the case.
I have (of course) tried to google the issue, which is tricky due to the lack of errors. I have found the following links, which describe the same or a similar issue, but their solutions haven't helped:
https://dba.stackexchange.com/questions/259672/ssis-package-not-being-deployed
Deployed SSIS Package not reflecting changes made to package
What is still left to try
Rebuild project from scratch to see if that version is deployable.
Unfortunately I don't have a lot of experience with this subject and no colleagues or contacts to ask for help.
Thanks in advance.
My workaround
After quite a bit of time attempting to solve the issue I have resorted to working around the problem by manually importing the .ispac file into the database. While this is not the prettiest of solutions, at least it's a workable one. If anyone has any other ideas I'll gladly hear them, but for now the issue isn't nearly as pressing as it was.
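If anyone wants to script that same import rather than click through SSMS, the sketch below is roughly what it looks like using the Integration Services management assembly; the server name, catalog folder, project name and file path are placeholders for my environment:

    # Sketch: deploy a built .ispac into the SSIS catalog (SSISDB) without the VS wizard.
    # Server, folder, project and path names are placeholders.
    [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null

    $conn   = New-Object System.Data.SqlClient.SqlConnection "Data Source=MYSQLSERVER;Initial Catalog=master;Integrated Security=SSPI;"
    $ssis   = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices $conn
    $folder = $ssis.Catalogs["SSISDB"].Folders["MyFolder"]

    # Read the built project and deploy it; this adds a new project version in the catalog.
    [byte[]]$ispac = [System.IO.File]::ReadAllBytes("C:\Builds\MyProject\bin\Development\MyProject.ispac")
    $folder.DeployProject("MyProject", $ispac) | Out-Null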
From your post: "Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed but the package didn't show up".
Are you 100% sure you are deploying it to the same project, on the same server and database? Are you refreshing after you deploy?
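One way to check that (a sketch that assumes the project deployment model with an SSIS catalog; the server name is a placeholder) is to query SSISDB right after a deploy. If last_deployed_time doesn't change and no new version appears, the deployment never reached that server:

    # Sketch: list the projects and versions currently in the SSIS catalog.
    # Requires the SqlServer (or older SQLPS) module for Invoke-Sqlcmd.
    $query = "SELECT f.name AS folder_name, p.name AS project_name, p.last_deployed_time, " +
             "v.object_version_lsn, v.created_time " +
             "FROM catalog.projects AS p " +
             "JOIN catalog.folders AS f ON f.folder_id = p.folder_id " +
             "JOIN catalog.object_versions AS v ON v.object_id = p.project_id " +
             "ORDER BY p.name, v.object_version_lsn DESC;"
    Invoke-Sqlcmd -ServerInstance "MYSQLSERVER" -Database "SSISDB" -Query $query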
I'm trying to set up an integration between my GitHub and VSTS accounts.
I'm creating a build definition to build and deploy my code to my azure web app. From all the things I've read online (I've been trying to make this work for at least some 3h now) this should be so simple I'm reconsidering calling myself a developer... =/
I've added the Service Endpoint:
But it doesn't show up at the build definition:
The other service endpoints for Azure work fine, I'm able to set them at the build definition, but the GitHub one just won't work! I've tried signing out, using Chrome's private window, removing and re-adding the endpoint, using IE; nothing makes that damn thing show up as an option in the dropdown.
Does anyone have any ideas?
EDIT - ScreenCast
http://screencast-o-matic.com/watch/cDif6XiUSZ
After three days and no answer from MS I kept trying to figure it out, and it seems this account is running on an old version (?).
I have another account that's working fine and I started to compare the two; here's why I think that:
Different versions
No "online" column and a different header (Account profile vs Account/acc_name)
Perhaps you are right that your account is running on an old server.
I created one in India South and experienced the same behavior.
Then I deleted it and created one in South US and I can see my GitHub repository.
We've got some legacy on-premises apps that we're evaluating moving off-site, and we are weighing all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit a build is performed and deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, Classic ASP files do not have a "build" process; like a PowerShell script, they are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to do this without setting up a solution in Visual Studio, a solution that wouldn't be used for anything else.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted definition file. Your ASP files would be treated as 'content' in this case.
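As a rough sketch of that route (the role name, service name, SDK path and file names below are placeholders, so adjust them to your installation):

    # Package a folder of Classic ASP / static content with cspack, then push the package
    # with the classic service management cmdlets. All names and paths are placeholders.
    $cspack = "C:\Program Files\Microsoft SDKs\Azure\.NET SDK\v2.5\bin\cspack.exe"

    & $cspack "ServiceDefinition.csdef" `
        "/role:WebContentRole;.\WebContentRole" `
        "/out:HelloService.cspkg"

    # Create a new deployment on an existing cloud service
    # (Set-AzureDeployment -Upgrade updates one that already exists).
    New-AzureDeployment -ServiceName "helloservice" `
        -Package .\HelloService.cspkg `
        -Configuration .\ServiceConfiguration.cscfg `
        -Slot Production -Label "content drop"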
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your web roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the deployment overhead for simple things, like a new CSS file or an image, by simply copying it over the old-school way.
Again, not sure if this helps you, but old-school techniques work just as well. :-)
This is a project I've been working on off and on for months and I feel like I'm pretty close, but I just can't seem to get past the final hurdle.
The goal is to develop an organization extension library that contains both internal and 3rd party code that we frequently rely on.
History
As a test project, I started with Apache POI because that is already in wide use in our environment. I have a plug-in and feature built just from the POI .jars that allows me to build our current POI applications as long as I add the plug-in (from my workspace) to my build path. The apps work on the servers because we have already distributed the POI .jars by manually copying them.
The next step is taking that plug-in and getting it into an updatesite so that all of the servers and developers can synchronize on one version. I found and followed these two excellent blog articles (that I wish existed when I started this project):
http://www.dalsgaard-data.eu/blog/wrap-an-existing-jar-file-into-a-plug-in/
http://www.dalsgaard-data.eu/blog/deploy-an-eclipse-update-site-to-ibm-domino-and-ibm-domino-designer/
One caveat: the articles are written for Domino 9 and we are running 8.5.3 here, but that only matters in the last (installation) step.
Current
This brings us to the problem. All of the above seems to have worked great up to a point. I can install my feature to my Designer client from the Eclipse update site and it works great. However, the install fails when I import that site into our updatesite.nsf database. This means that while the developers can all install from the update site if I put it on a network drive, it doesn't deploy updates to our servers.
The problem is that when I try to install from the .nsf update site, the Eclipse Updater just hangs. I've let it go for well over an hour and eventually Notes becomes completely unresponsive.
So the question is, is there anything I might have done wrong, either in the development of the plug-in or server configuration that might be causing this issue?
Additional Info
I'm looking at the OSGi console and that is largely unhelpful. I am getting the following error as I'm trying to install: SEVERE Could not access digest on the site: no protocol: 0/5B004DDD5E38F3FF85257CAF004C72C7/$file/digest.zip ::class.method=unknown ::thread=Worker-7 ::loggername=org.eclipse.update.core
I could generate dumps if that would be useful.
Security is also locked down fairly tight here. It could be a security issue - is there a way to troubleshoot that? Once I get to the hang I'm just stuck guessing.
This has been edited for clarity and to update information
I know that this post is over 5 years old, but...
For those who find this and are trying to resolve the error
SEVERE Could not access digest on the site: no protocol:
it is due to the URL of the Domino updatesite.nsf not having been added to the Archives tab of the update site project's site.xml.
I found the updatesite.nsf also needs to be anonymously accessible, as no credentials are prompted for or passed through to the Domino server hosting the updatesite.nsf database (at least from DDE); YMMV from Eclipse. So if anonymous connections are blocked on the Domino server you will be out of luck.
To develop a plug-in you really want to have 3 projects:
the plug-in
the feature
the update site
Of course a feature can contain more than one plug-in (and probably should), and an update site can contain more than one feature (and probably should). Once you have an update site project it features a handy "Build All" button that makes sure plug-in, feature and update site get compiled in one go. And that button is what you really want.
You can point your Domino Designer (or local Domino server) to the feature directory using a setting: add a plain-text .link file to framework/rcp/eclipse/links that contains the path to your install site, and it then picks up the features and plug-ins from there. After a build you need to restart Designer/the server to activate the updated feature.
For the Domino server, the approach using an updatesite.nsf and the respective notes.ini setting makes the most sense (to me). An HTTP restart is required. Lazy people script the whole thing.
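To make that concrete, the two pieces look roughly like this; the paths and file names are placeholders, and the notes.ini variable name is the one used in the linked deployment article, so verify it against your Domino version. The .link file (e.g. framework/rcp/eclipse/links/myorg.link) contains a single line pointing at the update site on disk:

    path=C:\work\org.myorg.updatesite

and on the Domino server the notes.ini entry points the OSGi runtime at the update site database:

    OSGI_HTTP_DYNAMIC_BUNDLES=updatesite.nsf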
I still don't have a great answer for this, but I believe the issue is related to the environment here. I don't have the authority to change the environment, even if I were able to conclusively demonstrate it is the cause of this problem, so it is a moot point. All I can say is that at least one administrator computer had no issue installing from the update site.
For me, the solution for distributing the update site is to put it on a network drive and have everyone install it from there. The server has no problem using it from the updatesite.nsf.
I was looking for some insight into what happens to existing workspaces and files that people already have checked out after an upgrade to TFS 2010. Surprisingly enough, I cannot find any satisfactory information on this. (I am talking about upgrading on new hardware, by the way: a fresh TFS instance with upgraded databases.)
I've checked the TFS installation guide and searched the web, but all I could find were upgrade scenarios for the server side. Nobody even mentions what happens to source control clients.
I've created a virtual machine to test the upgrade process. The upgrade was successful and all my files and workspaces exist on the new server too. The problem is: the new TFS installation has a new instance ID. When I redirected the clients to the new server, the client seemed unable to match files and file states in the workspace with the ones on the new server. This makes me wonder if it will be possible to keep working after the production upgrade.
As I mentioned above I can not find anything on this, it would be great if anyone could point me to some paper or blog post about this.
Thanks in advance...
When you do an upgrade your server ID should stay the same. You may need to change it if you want to clone your environment.
In your test scenario you are creating a clone of the TFS server rather than a straight upgrade.
ChangeServerID
You are probably running into problems because this has been run on your test environment to facilitate it running on the same network as your production TFS server.
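For reference, the command that the ChangeServerID article describes is run on the application tier against the configuration database, roughly like this (the SQL instance and database names are placeholders for your environment):

    # Run from an elevated prompt in the TFS tools folder on the application tier.
    # Instance and database names are placeholders.
    TFSConfig ChangeServerID /SQLInstance:MYSQLINSTANCE /DatabaseName:Tfs_Configuration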
All workspaces and shelvesets remain unchanged, and people will be able to continue working immediately. Even checked-out files are OK and will be picked up correctly.
I would recommend upgrading the server first, and keep the clients as 2008 (using the Forward Compatibility Pack), and then upgrading the clients to 2010 as and when the projects are upgraded.