A project was recently deleted from the Draft and Published databases by mistake. Several versions are still listed in the Archive database. How can I restore that project to the Published database and make it available in PWA?
I tried restoring the project via Administrative Restore. I got a message saying the restore job would be queued. I checked the queue and the job finished successfully - but the project does not appear in Project Center. I'm wondering if Administrative Restore is only meant for restoring specific versions of projects that are still "live".
Thanks for any help and please ask if more information might help.
Have you checked to see if the project is accessible through Project Professional? Also, have you checked the simple things that are easy to forget, such as making sure that "sub projects" is selected in the ribbon, or that the restored project wasn't marked as closed and is therefore being excluded from the views? When I have restored a project previously, key pieces of the Project Details were sometimes missing or incorrect and the project did not appear as expected in the views. Once the project is opened, the details can be reapplied and the project appears in PWA as expected.
You could also try "ProjTool." It would be worth seeing whether that program recognizes the project as available and lets you publish it from there.
The problem
So I am running into an interesting issue. I have been tasked with changing a query in a simple SSIS package in Visual Studio 2015, something I have done multiple times in the last 6 months.
After changing the package and deploying it (to an installation of SQL Server 2016, without errors!) I noticed that executing the package (scheduled with SSMS) generates the same result as the pre-update package, meaning the requested changes hadn't taken effect. Of course, as a test, I executed the package directly from VS2015 and got the result I wanted.
Ever since, I have been running tests and trying to find a solution. The problem seems to lie with the receiving side of the deployment process.
What I have tried
Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed, but the package didn't show up, so I had to restore an old version of the project.
Deployed the package from multiple different computers with access to VS2015 and the source code. No change...
Deployed the package to a new (empty) project in SSMS: the package does not appear in the project. This leads me to believe that the old package is kept when I publish the new version to the existing project in SSMS.
Regenerated/rebuilt the package in VS2015; frankly, this was probably never necessary and likely doesn't do anything for an SSIS package, but it may give you an idea of my skill level.
In the past we have had issues with the encryption level blocking the deployment of packages. I have verified these settings and found no issues.
I have verified whether any updates were recently installed on the database server, which does not seem to be the case.
I have (of course) tried to google the issue, which is tricky due to the lack of errors. I found the following links, which describe the same or a similar issue, but their solutions haven't helped:
https://dba.stackexchange.com/questions/259672/ssis-package-not-being-deployed
Deployed SSIS Package not reflecting changes made to package
What is still left to try
Rebuild project from scratch to see if that version is deployable.
Unfortunately I don't have a lot of experience with this subject and no colleagues or contacts to ask for help.
Thanks in advance.
My workaround
After quite a bit of time spent attempting to solve the issue, I have resorted to working around the problem by manually importing the .ispac file into the database. While this is not the prettiest of solutions, at least it's a workable one. If anyone has any other ideas I'll gladly hear them, but for now the issue isn't nearly as pressing as it was.
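For anyone who wants to script that manual import rather than click through a wizard, below is a minimal sketch using the IntegrationServices management API. The server, folder, project names and the file path are placeholders for my environment; adjust them to yours.

```powershell
# Load the SSIS management assembly (installed with SSMS / the SQL Server feature pack)
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null

# Connect to the instance that hosts the SSISDB catalog
$connection = New-Object System.Data.SqlClient.SqlConnection "Data Source=MYSERVER;Initial Catalog=master;Integrated Security=SSPI;"
$ssis = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices $connection

# Locate the catalog folder the project should live in
$folder = $ssis.Catalogs["SSISDB"].Folders["MyFolder"]

# Read the .ispac that VS2015 built and deploy it as a project
[byte[]]$ispac = [System.IO.File]::ReadAllBytes("C:\Builds\MyProject.ispac")
$folder.DeployProject("MyProject", $ispac) | Out-Null
```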
From your post: "Deleted the package from the existing project in SSMS and redeployed. Deployment again seemed to succeed, but the package didn't show up."
Are you 100% sure you are deploying it to the same project, on the same server and database? Are you refreshing after you deploy?
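One way to take refreshing and the UI out of the equation is to ask SSISDB directly when each project was last deployed; if last_deployed_time doesn't move after a deploy, the package really is landing somewhere else. A sketch, assuming the SqlServer module and a placeholder server name:

```powershell
# List projects in the catalog with their last deployment time
Invoke-Sqlcmd -ServerInstance "MYSERVER" -Database "SSISDB" -Query @"
SELECT f.name AS folder_name,
       p.name AS project_name,
       p.last_deployed_time,
       p.deployed_by_name
FROM catalog.projects AS p
JOIN catalog.folders AS f ON f.folder_id = p.folder_id
ORDER BY p.last_deployed_time DESC;
"@
```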
VS2015 Community is not showing SQLite in the list of available data sources in one place but is showing it in another.
If I click the New Connection button in Server Explorer and then click Change, I get the following list of Data Sources:
If I add a new item to my project > choose Entity Model > from existing database > New Connection, I get the following list of Data Sources:
How can I get SQLite in the New Connection data sources list?
Background
The problem started when my existing EDMX failed to load with the infamous error message
The Operation could not be completed: Invalid Pointer
This error can be fixed by deleting the ComponentModelCache folder, as described in this post. This method has worked for me in the past, but not this time. I finally decided to recreate the EDMX from scratch. Since then I have been facing this issue.
A few things that might give some hint:
I have recently installed VS2017 Community side-by-side with VS2015. VS2017 can open the existing EDMX just fine, but cannot do Update from Database, so I came back to VS2015.
I uninstalled and reinstalled the System.Data.SQLite provider several times, thinking that this might be a registration issue. It didn't do any good.
Note that VS2017 support is not there yet on System.Data.SQLite's download page. I'm using the last available version that supports VS2015 (version number 1.0.104.0).
The good news is that the issue is finally fixed, at least for VS2015. The bad news is that I don't know exactly what did the trick, so I'll list everything I tried and maybe it will help someone in the future. These steps are not in any particular order.
Uninstall all SQLite packages from NuGet.
Uninstall Entity Framework package too.
Reinstall all these packages.
Remove and reinstall the latest version of SQLite provider (1.0.104.0 as of this writing).
Use VS2015 only. VS2017 is currently not supported by SQLite provider.
Clear ComponentModelCache folder and restart Visual Studio.
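If it helps, the cache-clearing step is scriptable; a small sketch for a default VS2015 (14.0) install, run with Visual Studio closed:

```powershell
# Delete VS2015's MEF component cache; Visual Studio rebuilds it on the next start
$cache = Join-Path $env:LOCALAPPDATA "Microsoft\VisualStudio\14.0\ComponentModelCache"
if (Test-Path $cache) {
    Remove-Item $cache -Recurse -Force
}
```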
I found an easy solution: just install the extension below from the marketplace and SQLite will be available in the data sources list.
SQLite/SQL Server Compact Toolbox
I am facing the puzzling fact that update site information fails to be refreshed despite my forcing a reload in Preferences > Install/Updates > Available Software Sites.
I have a local update site (file:/ protocol, on Windows) and an online update site (https://) that I use as staging/test update sites for an open source project that I am maintaining.
I build the update site using an update site project that is stored locally and wiped clean each time I build it. When I have tested the new release in a different Eclipse instance and I have validated my changes, I then upload the entire update site to my server. Then, just to simulate what a user would do, I update the plugin in another Eclipse instance that runs on a different physical machine.
Yesterday I built another version, 2.2.0.201702052007, and uploaded it to my server. The previous version was 2.2.0.201702042059.
The problem I have is that the Eclipse instances (Mars.2 and Neon) on my development machine keep reporting the previous version, despite my reloading the update site information. However, the other machine sees the new version without a problem.
This is what I've tried:
Reloading the information of the update site: each time, I get a confirmation message saying "information for [...] has been reloaded from the server", but it turns out that it hasn't been: I still see the older feature version.
Accessing the update site from a different Eclipse instance on a different machine: I see the new version.
Loading the update site's site.xml file from a browser: I see the new version.
Using FileZilla to download the entire update site to a local folder and unzipping content.jar and artifacts.jar so that I can read the XML files embedded in those JARs (see the snippet after this list): I see no trace of the older version.
Removing the update site, restarting Eclipse and adding the update site again: the problem was still there.
As a last resort, I removed all files of the update site from the server: Eclipse still reported successfully reloading the information from the server.
I shut down the httpd service on the VPS. Eclipse kept reporting success until I restarted Eclipse, after which the reload failed. But once the web server was back online, Eclipse never actually sent a request to it and kept saying there was no update site! As a consequence, the online update site now appears empty, and restarting Eclipse does not change that.
[EDIT] Even more incomprehensible, the Reload button reports success even when there's no network connection to the update site (network interface disabled).[/EDIT]
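For the record, the JAR inspection above doesn't strictly need FileZilla plus a manual unzip: content.jar and artifacts.jar are ordinary ZIP files wrapping content.xml and artifacts.xml, so something like this sketch (paths and version string are placeholders) does the same check:

```powershell
# Extract the p2 metadata and look for a specific feature version
Add-Type -AssemblyName System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::ExtractToDirectory("C:\temp\updatesite\content.jar", "C:\temp\updatesite\content")
Get-Content "C:\temp\updatesite\content\content.xml" | Select-String "2.2.0.201702042059"
```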
There seems to be a cache somewhere in the provisioning framework, between the UI and my server, that reports outdated information and feature versions in spite of explicit requests to reload that very information.
Is there any file or folder that I can delete to have the provisioning framework reset itself? If possible, I would altogether disable its cache.
I've found out that Oomph apparently plays a part in the update site information retrieval process.
Anyway, I could recover normal operation (for now) and have the information properly reloaded by first deleting the appropriate files in C:\Users\...\.eclipse\org.eclipse.oomph.p2\cache.
By “the appropriate files”, I am referring to the fact that files in that folder are named after the URLs of repositories known to your Eclipse instances.
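A sketch of that cleanup on Windows, with Eclipse closed first (the cache path is the one above; the name matching relies on those URL-derived file names, so substitute your repository's host):

```powershell
# Remove the cached metadata files for one repository from Oomph's p2 cache
$cacheDir = Join-Path $env:USERPROFILE ".eclipse\org.eclipse.oomph.p2\cache"
Get-ChildItem $cacheDir -File |
    Where-Object { $_.Name -like "*example.org*" } |  # your update site's host here
    Remove-Item -Force
```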
This is a project I've been working on off and on for months and I feel like I'm pretty close, but I just can't seem to get past the final hurdle.
The goal is to develop an organization extension library that contains both internal and 3rd party code that we frequently rely on.
History
As a test project, I started with Apache POI because it is already in wide use in our environment. I have a plug-in and a feature built just from the POI .jars that allow me to build our current POI applications as long as I add the plug-in (from my workspace) to my build path. The apps work on the servers because we have already distributed the POI .jars by manually copying them.
The next step is taking that plug-in and getting it into an update site so that all of the servers and developers can synchronize on one version. I found and followed these two excellent blog articles (which I wish had existed when I started this project):
http://www.dalsgaard-data.eu/blog/wrap-an-existing-jar-file-into-a-plug-in/
http://www.dalsgaard-data.eu/blog/deploy-an-eclipse-update-site-to-ibm-domino-and-ibm-domino-designer/
One caveat: the articles are written for Domino 9 and we are running 8.5.3 here, but that only matters in the last (installation) step.
Current
This brings us to the problem. All of the above seems to have worked great up to a point. I can install my feature into my Designer client from the Eclipse update site and it works great. However, the install fails when I import that site into our updatesite.nsf database. This means that while the developers can all install from the update site if I put it on a network drive, it doesn't deploy updates to our servers.
The problem is that when I try to install from the .nsf update site, the Eclipse Updater just hangs. I've let it go for well over an hour and eventually Notes becomes completely unresponsive.
So the question is: is there anything I might have done wrong, either in the development of the plug-in or in the server configuration, that might be causing this issue?
Additional Info
I'm looking at the OSGi console and it is largely unhelpful. I am getting the following error while trying to install: SEVERE Could not access digest on the site: no protocol: 0/5B004DDD5E38F3FF85257CAF004C72C7/$file/digest.zip ::class.method=unknown ::thread=Worker-7 ::loggername=org.eclipse.update.core
I could generate dumps if that would be useful.
Security is also locked down fairly tight here. It could be a security issue - is there a way to troubleshoot that? Once I get to the hang I'm just stuck guessing.
This has been edited for clarity and to update information
I know this post is over 5 years old, but for those who find it and are trying to resolve the error
SEVERE Could not access digest on the site: no protocol:
it is caused by the URL of the Domino updatesite.nsf not having been added to the Archives tab of the update site project's site.xml.
I also found that the updatesite.nsf needs to be anonymously accessible, as no credentials are prompted for or passed through to the Domino server hosting the updatesite.nsf database (at least from DDE); YMMV from Eclipse. So if anonymous connections are blocked on the Domino server, you will be out of luck.
To develop a plug-in you really want to have 3 projects:
the plug-in
the feature
the update site
Of course a feature can contain more than one plug-in (and probably should), and an update site can contain more than one feature (and probably should). Once you have an update site project, it features a handy "Build All" button that makes sure the plug-in, feature and update site get compiled in one go. And that button is what you really want.
Using a setting in your Domino Designer (or local Domino server), you can point it to the feature directory: add a plain-text .link file to framework/rcp/eclipse/links that contains the path to your install site, and the features and plug-ins are picked up from there. After a build you need to restart Designer/the server to activate the updated feature.
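As a sketch, the .link file is a single path= line; something like this creates it (the Notes program directory and the site path are assumptions for your machine):

```powershell
# Create a .link file so the Notes/Designer OSGi runtime finds the local site
$linkDir = "C:\Program Files (x86)\IBM\Notes\framework\rcp\eclipse\links"
New-Item -ItemType Directory -Force -Path $linkDir | Out-Null
# Forward slashes are the safe choice inside the .link file
Set-Content -Path (Join-Path $linkDir "com.example.poi.link") -Value "path=C:/updatesite"
```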
For the Domino server, the approach using an updatesite.nsf and the corresponding notes.ini setting makes the most sense (to me). An HTTP restart is required. Lazy people script the whole thing.
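For completeness, a sketch of that server-side setup; the notes.ini setting name is taken from the deployment article linked earlier, and the notes.ini path is an assumption for your install, so verify both against your Domino version:

```powershell
# Point Domino's OSGi runtime at the update site NSF (setting name per the linked article)
Add-Content -Path "C:\Domino\notes.ini" -Value "OSGI_HTTP_DYNAMIC_BUNDLES=updatesite.nsf"
# Then, from the Domino server console:
#   restart task http
```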
I still don't have a great answer for this, but I believe the issue is related to the environment here. I don't have the authority to change the environment, even if I were able to conclusively demonstrate it is the cause of this problem, so it is a moot point. All I can say is that at least one administrator computer had no issue installing from the update site.
For me, the solution for distributing the update site is to put it on a network drive and have everyone install it from there. The server has no problem using it from the updatesite.nsf.
I have a PHP Azure project which I have to manage with PowerShell cmdlets. One of these, Publish-AzureServiceProject, doesn't seem to be detecting file changes, so these are not updated in the cloud (even though no errors are displayed).
I have remote desktop'd into the machines and the code definitely hasn't been updated in weeks.
If I deploy to the local emulator, it is fine but this is much more obvious because it displays "removing old package" and "creating local package". The cloud package definitely contains the latest files, so the packaging is working fine.
Can anyone tell me how to force the publish to update the files on the cloud and more importantly, why this is not happening? Also, if I force the update, will it deploy to a new box and get a new IP Address?
Thanks.
It seems to work now.
I removed and reinstalled the Azure libraries on my machine, created a new project from scratch and copied the original files over into it. I have not included diagnostics (not sure if that's relevant) and I have modified the Publish-AzureServiceProject script to select the subscription each time before it publishes.
It is possible that the subscription confusion was not helping (I have two Azure subscriptions and it might have used the wrong one at some point and done something weird), and it is also possible that there was some conflict between various versions of the Azure SDK, since I have been using it for over six months. But at the moment, all is good.
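For what it's worth, the subscription pinning amounts to something like this sketch (names are placeholders, and the cmdlets are from the classic Azure PowerShell tools of that era, so check the parameters against your version):

```powershell
# Pin the subscription explicitly, then publish
Select-AzureSubscription -SubscriptionName "My Primary Subscription"
Publish-AzureServiceProject -ServiceName "myphpapp" -Location "North Europe" -Launch
```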
A related article on my blog here: Problems with PHP Azure
Thanks for the interest