We're using Eclipse (with PDT) for several projects. The main load seems to come from the DLTK indexing process, which consumes quite a lot of resources on each client.
It would offload the clients considerably if a server could perform this indexing task (so DLTK indexing could be switched off on the clients).
So I wonder: is it possible to create and use a central (H2) database/repository for indexing the projects for multiple clients?
Though I have not attempted this, it seems entirely possible to configure H2's connection profile to use an external URL. Follow these steps for configuring H2, but point the connection URL at the server instead of the local machine.
After transferring the database and pointing the driver's connection URL at the server, you could disable DLTK indexing on every machine except the server. I also recommend having a look at the article suggested in this answer.
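To make that concrete, here is a minimal sketch of what the shared connection could look like, assuming the H2 database is started in TCP server mode on the indexing machine. The host name "indexserver", port 9092 and the database path are placeholders, not values from your setup:

```java
// Minimal sketch: connecting to a shared H2 index database over TCP.
// "indexserver", port 9092 and the database path are placeholders --
// adjust them to wherever the H2 TCP server is actually running.
import java.sql.Connection;
import java.sql.DriverManager;

public class RemoteIndexConnection {
    public static void main(String[] args) throws Exception {
        Class.forName("org.h2.Driver");
        // The same URL would go into the Eclipse connection profile.
        String url = "jdbc:h2:tcp://indexserver:9092/~/dltk-index";
        try (Connection conn = DriverManager.getConnection(url, "sa", "")) {
            System.out.println("Connected: " + conn.getMetaData().getURL());
        }
    }
}
```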
I'm working on a website using PhpStorm. For a long time I developed it locally, but then I got hosting and a remote FTP server.
I created a new project in PhpStorm with the settings for the remote host, and I found that deploying code takes a long time (over a minute) before I can see the result, which is quite uncomfortable when debugging.
Is there any way to work with the code on a local server and, when I think the project is ready for deployment, just send it to the remote server?
I understand that I could work in two different projects and just deploy the "ready" version to the server via FTP, but maybe there is a more comfortable way?
There are several answers to this question, most of them opinion-based, but I will try to keep mine objective.
Case 1
A big corporation gives every developer a sandbox to test their code on, and requires every developer to keep their code on that sandbox.
Using mounted drives can be extremely slow, especially when PhpStorm is indexing.
Case 2
An easy way to keep an automatic backup of your code is to use the built-in (S)FTP(S) upload/deploy.
Solution
In both cases you can use the automatic deployment feature that uploads every change to the server; that way the deploy doesn't take over a minute, and the change is usually on the server before you know it.
I cannot recommend using this deployment for production, as it will not pass through your version control, SAT, security setups, etc. For that I would suggest something like Rocketeer.
EDIT:
As for two projects: you can define two different deployment servers, use the default one for your testing (with automatic upload), and select the other one from the deployment menu when you want to publish.
I want to start developing with a team using a Neo4j DB, a Spring Boot backend and an AngularJS frontend.
For that, I want to have a Maven Repository and a Jenkins.
To enable my team to use this, I want to have some kind of server at home that can provide remote (secured) access to the Maven repo, Jenkins and the Neo4j DB, and that can host the AngularJS frontend communicating with the Spring backend.
I don't really know where to start. After some googling I found NAS devices, but I'm not sure whether they suit my requirements.
I've found tutorials for configuring a VPN but there may be a simpler way.
What would you recommend?
So, after some more asking around and googling, I found two possible solutions that I want to try out in the future:
The first seems to be a NAS (I've only read about Synology), although it does not seem to be intended for my requirements. However, there are packages available in the DiskStation OS that allow the installation of Jenkins, a Maven repo and Docker, which in turn makes it possible to host a Neo4j DB. I was told I should be cautious, because only the x86 DiskStations support Docker. At this point I'm not entirely sure what that implies, but since I'm posting an answer, I don't want to keep this knowledge to myself.
I didn't really find anything on hosting applications.
The second solution seems to be to build a home server. In my current understanding, a spare PC at home should suffice for that. All the steps involved should be available here (German).
I didn't find anything about hosting applications here either, but since this is a "real" system, I'm pretty sure it's possible.
I'm going to try the second one out and will keep you updated, as long as I don't forget :)
I can't wrap my head around how I'm supposed to use ColdFusion Builder 3 (akin to Eclipse).
Up until this point, I've been using Dreamweaver 5, which is getting 'long-in-the-tooth', and I wanted to give CF Builder a try.
So, in Dreamweaver, it's pretty simple: you set up connections to servers using credentials... There's a Local path, which is the local copy of your code, and the webroot of the Server, which is the 'live' copy of your code. Basically, you make a change to the local copy and PUT the change to the Server. Easy peasy lemon squeezy, right?
But, how does this translate to ColdFusion Builder 3?
Just to give you an idea of our infrastructure: we have Development and Production. Each of these boxes has multiple web instances, for example Accounting, Human Resources, IT. Each of those web instances could have multiple applications... I'm only concerned about my instance, IT, on both the Production and Development servers.
Is a workspace supposed to represent an instance on a web server?
In CFBuilder, should I configure 1 server per web app?
Is a project supposed to represent a web app?
Am I supposed to use drive mappings to the inetpub wwwroot for access to web applications? Is it even considered kosher to have a drive mapping to the web root? \\server\c$\inetpub\wwwroot
Where do I keep my local copy of my code?
How do I move items from Development to Production?
My main confusion is with workspaces, projects, and servers... My intent is to debug and 'view page in browser' from CFBuilder... However, when you set up a server, under Server Mapping and URL Prefix, you're supposed to indicate the Local and Remote paths, and this is not directly related to the physical location of the project... and as I've mentioned, there are multiple instances, multiple applications, and the development box is not my local machine, it's a remote server...
I would really like to know how others have made this work for them.
I really don't mind this question, even though it's not directly code related, because I've been using ColdFusion Builder (CFB) for years and there just isn't enough good documentation out there. I now enjoy a great experience with CFB thanks to blog posts and sharing experiences with other devs :)
My setup: CFB3 running on Windows 8.1, dev server running on a Virtual Machine so it is treated as "remote server" just like yours. I also update remote staging and production servers (although not directly from CFB).
First, let's set some reasonable expectations: Dreamweaver and CFB are very different in that CFB focuses on programming and Dreamweaver on design. CFB is built on Eclipse and therefore benefits from most Eclipse plugins.
Your question is specifically about how to set up your projects in CFB using 2 remote servers (dev and prod). It's different for everyone but I'll share my setup with you. (sidenote: My projects are also stored in Git repositories - 1 repo for every app)
Starting from the top: A workspace in CFB deals with your whole Eclipse application, not just your projects. The most important things kept in this directory are snippets and plugins. You do NOT need to keep your project files in here. This is merely the main directory where all of your settings are kept. You are not required to have more than one workspace (I only have one). Why would you need more than one? You may be a multifaceted programmer who needs to keep separate workspaces using separate tools (like different plugins, snippets, window layouts...).
To answer your next question (1 server per web app): all you need to do is configure your dev servers in the "CF Servers" tab. You need to add one server per web instance for every instance that you'd like to test on. Hopefully, your dev server has RDS enabled (very helpful for remote database and file viewing, just like in Dreamweaver). During configuration, don't worry about Mappings or Virtual Host Settings (I have another recommendation later). Once configured, you'll be able to assign that server to a project.
Drive mappings: I would never recommend mapping to the webroot of a shared dev server. If you were to use that drive map as your local directory, your changes would be made directly to the development server. What you want to do is create a new project by right-clicking in the Navigator area and selecting Import > Other > FTP. Follow the steps, choose anywhere on your local drive to store the files, then choose "New project" at the end (this will add the .project file necessary for CFB to control the project).
Once the project is created, right click on it, select ColdFusion Project and choose the CFML Dictionary version you'll be using (CF11, 10, 9...). Then, select ColdFusion Server Settings and choose the dev server. This is necessary for testing.
What you now have is a local directory with your app and eclipse knows about the remote server. In order to synchronize, you right click on the project, go to Team and synchronize from there. For detailed information about synchronization over FTP, see the help section "Guide to WebDAV and FTP".
Moving to production is not as simple as it was in Dreamweaver. The FTP configuration information only allows for 1 connection (thus giving you a list of files synchronized between your project and the dev server). Therefore, you'll need a third party FTP client to synchronize between your local project and your prod server.
As promised, my last entry will be about the "debugging", which is why I said to skip the mappings and virtual host settings in the CF Server config. I really, really recommend using a third-party paid plugin called FusionDebug (http://www.fusion-debug.com/). This plugin facilitates the setup and allows you to step into all of your code (which doesn't work so well in native CFB). There's a 30-day trial and I recommend you try before you buy (or license for a year in this case!).
I have a WebLogic server configured in Eclipse with a local database as the data source. When debugging issues it would be nice to be able to connect to the database the test group is using. I thought I would be able to clone the default "myserver" in the default mydomain and create new data sources which point to the test group's database. I've done this, but now I'm attempting to figure out how to start this new server and deploy my application to it through Eclipse.
I don't really care how it works; I just need to be able to easily switch between the two data sources, either through the WebLogic admin console or through Eclipse via multiple servers. Being able to clone the current server would be nice, since its configuration is rather complex, or to just switch the data sources out.
Any ideas on how to accomplish this would be much appreciated.
The JNDI name must be different, because the application connects through the JNDI name; every data source should have a unique JNDI name.
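As an illustration, this is roughly how an application resolves a data source by its JNDI name. "jdbc/localDS" and "jdbc/testGroupDS" are placeholder names; each data source you configure in the WebLogic admin console must expose its own unique JNDI name:

```java
// Minimal sketch of resolving a WebLogic data source by JNDI name.
// "jdbc/localDS" and "jdbc/testGroupDS" are placeholder names -- each
// data source configured in the admin console needs its own JNDI name.
import java.sql.Connection;
import javax.naming.InitialContext;
import javax.sql.DataSource;

public class DataSourceLookup {
    public static Connection open(boolean useTestGroupDb) throws Exception {
        String jndiName = useTestGroupDb ? "jdbc/testGroupDS" : "jdbc/localDS";
        InitialContext ctx = new InitialContext();
        DataSource ds = (DataSource) ctx.lookup(jndiName);
        return ds.getConnection();
    }
}
```

Switching between the two databases then means either looking up a different JNDI name, or keeping one JNDI name and retargeting it to the other database in the admin console.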
Our lab has a single machine that is open to http connections from outside. However, this machine is quite weak (little memory, slow CPU). We have other machines that are much stronger, but they are behind a firewall and cannot be accessed from outside the lab.
I am writing a GWT app whose server is very demanding. Is it possible to install the server on a strong computer and the client on the weak computer, and have them connect using RPC? I assume it requires some changes in the web.xml file, but what exactly?
Theoretically I can just wrap the demanding part in a separate TCP/IP server and have the GWT server contact it, but I would like to know whether it is possible to do this directly in GWT.
The GWT client is downloaded from the server and runs inside a web browser as JavaScript code. I don't quite understand which part of the GWT app you would want to run on a separate server.
If your GWT servlet (the RPC service implementation) is accessing external resources, like a database or Web services, you could move those resources to a separate server.
Another option is to install a reverse proxy on the "weak" server that would forward specific requests to a stronger server behind the firewall. The proxying could be done by Apache (httpd) on the "weak" server (using mod_proxy). Then Tomcat would only need to be installed on the stronger machine, and would take care of most of the processing.
I have tried to do this but was only successful in splitting a GWT project into 3 parts (Client, RPC, Server) as Eclipse projects. In the end you will end up with one big WAR file, and it'll be deployed in one place (unless someone else was successful at really separating the code).
A solution you can use is to set up another server that will do all the server-side processing (your strong machine) and have the GWT servlets act as a proxy: they accept the requests from the client, forward the data to the other server for processing, and wait for the response.
How you do it is up to you. You could use web services, a direct socket connection, JMS, etc.
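For illustration only, here is a minimal sketch of that proxy idea, assuming a plain HTTP hand-off. The ComputeService interface, the ComputeServiceImpl class and the http://strong-machine:8081/compute URL are all hypothetical names; your real RPC interface and the address of the strong machine will differ:

```java
// Rough sketch of a GWT RPC servlet that forwards work to a stronger
// machine over plain HTTP. ComputeService, ComputeServiceImpl and the
// BACKEND URL are hypothetical placeholders, not part of the real app.
import com.google.gwt.user.server.rpc.RemoteServiceServlet;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ComputeServiceImpl extends RemoteServiceServlet implements ComputeService {

    // Strong machine behind the firewall; reachable from the weak server only.
    private static final String BACKEND = "http://strong-machine:8081/compute";

    @Override
    public String compute(String input) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(BACKEND).openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(input.getBytes("UTF-8"));
            }
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                StringBuilder response = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) {
                    response.append(line);
                }
                return response.toString();
            }
        } catch (Exception e) {
            throw new RuntimeException("Backend call failed", e);
        }
    }
}

// Hypothetical RPC interface the browser client calls, shown here so the
// sketch is self-contained.
interface ComputeService extends com.google.gwt.user.client.rpc.RemoteService {
    String compute(String input);
}
```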
Depends on your setup.
GWT ACRIS - Please see this link.
EJB - One approach could be to keep business objects on remote machines as EJBs and have your servlets access them over RMI/JNDI.
Spring - Another simple way to do it is with Spring Remoting. See this link.
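As a rough illustration of the Spring Remoting option, here is what the client side (on the weak machine) could look like, assuming the strong machine exposes the same interface with HttpInvokerServiceExporter. ComputeService, the bean name and the service URL are placeholder names, not values from your project:

```java
// Minimal Spring Remoting sketch (client side, on the weak machine).
// ComputeService and the service URL are placeholders; the strong machine
// would export the matching service with HttpInvokerServiceExporter.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.remoting.httpinvoker.HttpInvokerProxyFactoryBean;

@Configuration
public class RemotingClientConfig {

    @Bean
    public HttpInvokerProxyFactoryBean computeService() {
        HttpInvokerProxyFactoryBean proxy = new HttpInvokerProxyFactoryBean();
        proxy.setServiceUrl("http://strong-machine:8081/remoting/ComputeService");
        proxy.setServiceInterface(ComputeService.class);
        return proxy;
    }
    // Elsewhere in the servlet code you would simply inject ComputeService
    // and call it as if it were local.
}

// Shared interface, visible to both the weak and the strong machine.
interface ComputeService {
    String compute(String input);
}
```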