Just doing some preliminary research on a project. Do any frameworks like Django or Ruby on Rails offer a way of creating a web app which allows you to sell web hosting to clients and sell domain names? I have looked at sites like HostGator but want to make something more customized.
My clients wouldn't need all the bells and whistles of a normal host, and I'd like to maintain some sort of branding.
You can build whatever you want on top of a framework - CMS or otherwise. Ruby on Rails, Symfony, or any of the other MVC frameworks would let you build a web application that does what you want. You would, of course, need to code it. There'd be integration or construction of a shopping cart, website management, and DNS management involved.
You'd probably have more luck using an application that's already designed for this purpose - a very common one for hosting is cPanel. For domain names it's much more complicated, because you would need to be accredited with the TLD registries. You'd probably have better luck reselling for GoDaddy or one of the hundred other sites that sell domain names.
I'm new to building APIs. I made my first one using an MVC framework: CodeIgniter, with Chris Kacerguis's REST implementation.
I'm not really sure this was the best thing to do, because I suspect the framework is not that "slim" or lightweight for API-only purposes.
I plan to build a mobile app, an admin panel and a website, so that all three can consume the API's services.
Is it a bad idea to have the API, the website and the admin in the same project? What are the pros and cons, and what is the best architectural approach?
Otherwise I will have one CodeIgniter project for the API and another CodeIgniter project for the website and admin.
thanks
You can create folders inside the "controllers" folder to organize your project and keep using the same project/environment configuration:
controllers/Home.php
controllers/api/MyApi.php
controllers/admin/Admin.php
Edit: You will share models and libraries too.
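For illustration, here is a minimal sketch of a controller in the api folder - the user_model and its get_all() method are hypothetical names used only for the example:

<?php
// controllers/api/MyApi.php - minimal sketch; user_model / get_all() are hypothetical
defined('BASEPATH') OR exit('No direct script access allowed');

class MyApi extends CI_Controller {

    public function users()
    {
        // The same model can be loaded from the site and admin controllers too,
        // since they all live in the same CodeIgniter project.
        $this->load->model('user_model');
        $users = $this->user_model->get_all();

        // Return JSON for the API consumers (mobile app, admin, website).
        $this->output
             ->set_content_type('application/json')
             ->set_output(json_encode($users));
    }
}

The controllers in controllers/ and controllers/admin/ would load the same models, so nothing is duplicated.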
In my project I implemented two types of controller - REST and API. The admin JS GUI works with the REST controllers, while the rest of the world works with the API. You can do it simply with the Silex framework, a little brother of Symfony.
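If you go the Silex route, a minimal sketch could look like this - the /rest and /api paths and the getOrders() helper are hypothetical, just to show the two kinds of controllers side by side:

<?php
// index.php - minimal Silex sketch; getOrders() is a hypothetical data-access helper
require_once __DIR__.'/vendor/autoload.php';

$app = new Silex\Application();

// Endpoint consumed by the admin JavaScript GUI.
$app->get('/rest/orders', function () use ($app) {
    return $app->json(getOrders());
});

// Endpoint exposed to the rest of the world.
$app->get('/api/orders', function () use ($app) {
    return $app->json(getOrders());
});

$app->run();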
The purpose of building a REST API is that you only have to build one project for your business model. This allows you to construct any number of applications on any platform, only requiring you to consume the API in different ways. This essentially separates/decouples the user interface from the business logic.
You should create a separate project for the REST API, and each UI should also be a separate project. This allows you to change the underlying code, language and platform in any one of the projects without breaking any of the others, as long as the API signatures remain the same.
For example, you could have a live version of your website built using CodeIgniter while developing another separate project using AngularJS. When your AngularJS project is complete you would simply swap out the project on your server (or create an entirely new website or server), still allowing you to use the other if required. Additionally, you may decide that you would like to move the API onto a different platform, language or database; you can develop it and swap the implementation when finished, causing no changes to any of your UI projects assuming you have not changed the API signatures.
I have started to work at a new company. We have a REST service (XML exchange with an external system) and a web site. The REST service runs on a subdomain, for example rest.mycompany.com; the company site is mycompany.com. The site and the REST service work like this:
REST -> DB <- SITE. This means the REST service is not part of the site; it's an independent system. The REST service and the site work with one database and share 90% of the same code (models, mappers etc.). The problem for me is the duplicated code, and I wonder why the REST service can't be part of the site (an import/export controller, XML parser and one logging system). On the other hand, it may be better to have separate systems in terms of security and high load - separated traffic for each subdomain?
The site and the REST service work like this: REST -> DB <- SITE. This means the REST service is not part of the site; it's an independent system. They work with one database and share 90% of the same code (models, mappers etc.).
That's a big problem, especially since one system might introduce a bug (inconsistent data) which only shows up in the other system. That's quite hard to debug.
The problem for me is the duplicated code, and I wonder why the REST service can't be part of the site (an import/export controller, XML parser and one logging system)?
The REST service and the website are just UI layers. The actual business logic should be moved to a third project (a class library / module / lib) which both UI layers use - see the sketch after this answer.
On the other hand, it may be better to have separate systems in terms of security and high load - separated traffic for each subdomain?
I would stick with separate sites - not for performance, but because they have distinct responsibilities.
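As a rough sketch of that third project, assuming a hypothetical OrderImporter class, XML layout and orders table - both the REST service and the site would require this one file instead of each keeping its own copy of the logic:

<?php
// lib/OrderImporter.php - shared business logic used by both the REST service and the site.
// The class name, XML layout and orders table below are all hypothetical.
class OrderImporter
{
    private $pdo;

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;
    }

    // Parse the external XML and store the orders - the code that is currently
    // duplicated in both systems lives here exactly once.
    public function importFromXml($xml)
    {
        $doc = new SimpleXMLElement($xml);
        $count = 0;
        foreach ($doc->order as $order) {
            $stmt = $this->pdo->prepare(
                'INSERT INTO orders (external_id, total) VALUES (?, ?)'
            );
            $stmt->execute(array((string) $order->id, (float) $order->total));
            $count++;
        }
        return $count;
    }
}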
In my next planned project I am supposed to implement an online classroom website in which I want to share a blackboard (say, a simple text area) between the tutor and the participating online students (all of whom are logged in through the website). Whatever text the tutor writes on the blackboard has to be seen by the other participants in real time.
I want to use Java as my development platform.
To implement this I started googling PUSH technology. I read that XMPP servers and XMPP libraries can be used to implement near-real-time applications like collaborative applications, multiparty games, IM applications, etc.
But I also read about using BlazeDS for real-time and low-latency web applications.
My questions are:
What is the difference between these Java-based XMPP implementations and BlazeDS? Isn't the final goal of both technologies to achieve low-latency web apps using PUSH technology?
What is the difference between Comet servers and XMPP servers? Do they just differ in the way they implement PUSH technology, or in anything else?
I am confused. Could someone please explain these a little more, so that I know what I am doing before I do it, and where to start next in my project?
Thanks
To answer your first question: No, XMPP's goal is not simply to "achieve low-latency web apps using PUSH technology". XMPP far pre-dates the "realtime web", and is a much more extensive platform than basic comet/push servers.
Features in XMPP that won't typically be in plain push servers include:
Support for non-web clients (including existing desktop clients)
A choice of authentication methods from strong credential-based mechanisms such as SCRAM-SHA-1 to temporary anonymous accounts
Support for federation (communication with other XMPP servers)
Lots of chat-centric capabilities (contact lists, offline messaging, file transfers) - these can be a bonus or worthless to you, but most features like these can be disabled in some XMPP servers.
If you are just looking for a component to push messages to website visitors in realtime, and you're using Java which BlazeDS integrates tightly with, then BlazeDS is probably your best choice.
If your push system is part of a larger realtime platform which may involve multiple clients (including non-web for example), multiple languages, and even possibly multiple servers (e.g. you could allow users to use existing XMPP accounts to receive updates) then XMPP is more than likely worth looking into.
We are considering purchasing DotNetNuke (or Sitefinity) on pretty short notice and there is a question I have that I am having trouble finding a quick answer to. (I have a separate but similar post with Sitefinity as the focus, if you can answer that better or in addition.)
We are currently not using any CMS at all and we have some custom development that will not go away just because we go with a CMS for some or most of our site.
Our custom development is C# ASPX with Site Master and nested Site Master pages. These custom apps do not own their own top level in our web site, but are part of a branch, typically one or two levels down (for example, http://www.contoso.com/branch/app/default.aspx).
How is DotNetNuke typically configured in a CMS/custom “mixed mode”? For example, is DotNetNuke installed at the “top” of the web site, or “where needed” down in the web site?
How does this relate when mixing CMS and custom web applications?
Does the CMS interface allow for adding these custom apps or do you just go to the web server and add them to the structure?
It appears from reading other posts, we can create our own custom c# modules and have CMS editors “drop in” the modules on the pages. Can someone confirm that for me?
If I did not provide enough detail, please feel free to ask for more.
DNN is most commonly installed at the root of the website, but that is not required. It is sometimes run as an application in a virtual directory that is part of a larger site.
It is possible to add .aspx pages at the correct location within the DNN site. The UrlRewrite handler will initially look at all such requests, and assuming that existing pages and friendly-URL handlers don't think they "own" the .aspx page, DNN will stop processing the request and hand it to your page. There is no specific way to "register" these pages with DNN. I wouldn't generally recommend this approach, but it does work and can make sense in specific situations.
Alternately, you can write your own DNN modules. Existing code can usually be adapted quite easily by converting it into an .ascx user control that inherits from PortalModuleBase. Code that wants to take advantage of core DNN features, e.g. membership or permissions, will of course need to be modified to use the DNN APIs.
The DNN module approach is generally the best option, but the details of your situation may make one of the other approaches more appropriate for you. Basically, as long as your site is laid out so that it is clear which requests are destined for DNN and which are not, you can mix and match with other ASP.NET code as needed.
One thing that causes trouble in a mixed configuration is configuration inheritance.
If DNN is the root application, you'll either have to remove problematic http modules and handlers in the application's web.config or disable inheritance with a location setting in the DNN's (root) web.config:
<location path="." inheritInChildApplications="false">
  <system.web>
    ...
  </system.web>
</location>
Maintaining the inheritance between the application's web.config and the DNN web.config is fragile.
Changes in the DNN web.config can cause the application in the virtual directory to fail. In addition to removing each http module and handler, you'll need to at least add DNN's App_Code directories to the application configuration.
On the other hand, the location setting does not always play well with DNN modules, especially if they have .aspx pages in addition to controls inheriting from PortalModuleBase. Personally, I've never got the location setting to work well enough with DNN.
See also
How to disable web.config Inheritance for Child Applications in Subfolders in ASP.NET?
How do I stop web.config inheritance
Avoid web.config inheritance in child web application using inheritInChildApplications
I am in the process of scoping the development of an iPhone app for a client. Among other things, the app will allow users to browse through and place orders on specific (tangible) products.
The client has a website that currently does a similar thing and due to their limited budget and the fact that the website runs on a third-party proprietary platform which they have no control over, we are investigating possible alternatives to building a web service.
On the website, user registration and authentication, as well as order placing is done through POST requests via secure HTTP. The response is always a formatted HTML page which will contain strings indicating whether the request was successful or not, and if there was an error, what the error is etc.
So provided I can replicate the POST requests on the phone, and parse the HTML responses to read the results of each request, do you think this is an acceptable alternative to building a web service to handle this?
Apart from the possibility of pages changing (which we can manage) and the fact that I will probably have to download and parse a relatively large HTML response, are there any other drawbacks to this solution and is there anything else that I might be missing?
Many thanks in advance for your thoughts.
Cheers,
Rog
You could create an intermediary server that communicates with the client's server, and expose on it some REST web services with JSON responses (small overhead and easy to handle) that will be consumed by the iPhone app.
So, you're going to parse HTML and formulate POSTs off a third-party server, and pray that they don't even so much as rename a form field.
Your question is in two parts:
Do I think that a miracle is an acceptable solution? I don't.
Do I think that, aside from the fact that a miracle is required, there are any other drawbacks? None that I can think of.
You didn't ask, but this is a terrible course of action. Two suggestions.
I spy an assumption that the providers of the third-party platform aren't interested in enabling third-party applications by providing an API. They have a very good business reason for this, which is that it promotes platform lock-in. Reach out to their support department and have a talk with them.
You have to sell the client on building an intermediary web service. To at least try to mitigate the damage that changes on this third-party platform can do to your app, I recommend that you build and operate a proxy that receives requests from your applications and forwards them to the third-party platform. You should build into this client-server protocol a means for returning "we are in maintenance mode, go away" messages to apps, for that inevitable day when the third-party server changes something that breaks your app (they swapped the billing and shipping address pages, for instance) and you have to rush an update through Apple to deal with it.
The proxy could be written in something more flexible and easy to bash stuff out in, such as PHP, Python, Perl, or Ruby. It could be hosted at Amazon in a micro instance.
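A rough PHP sketch of such a proxy endpoint - the third-party URL, form fields and success marker are all hypothetical, and the maintenance flag is the "go away" switch mentioned above:

<?php
// proxy.php - minimal sketch of the intermediary proxy for the iPhone app.
// The upstream URL, field names and marker text below are hypothetical.
$maintenance = false;   // flip this when the third-party site breaks the app

header('Content-Type: application/json');

if ($maintenance) {
    header('HTTP/1.1 503 Service Unavailable');
    echo json_encode(array('status' => 'maintenance', 'message' => 'Please try again later'));
    exit;
}

// Forward the app's POST to the third-party platform.
$ch = curl_init('https://thirdparty.example.com/orders/place');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query($_POST),
    CURLOPT_RETURNTRANSFER => true,
));
$html = curl_exec($ch);
curl_close($ch);

// Scrape the success/error string out of the HTML here, on the server,
// so the app only ever sees a small, stable JSON contract.
$ok = ($html !== false && strpos($html, 'Your order has been placed') !== false);
echo json_encode(array('status' => $ok ? 'ok' : 'error'));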
p.s. This question is inappropriately tagged objective C.
HTML is the worst because of parsing (1-2 seconds per page), memory, and changes, but you already know that. Check in advance that ALL the data you need is exposed in the HTML.
If you use an intermediary server you are moving work elsewhere and you have another server to maintain. I would only do that if memory is an issue. Check How To Choose The Best XML Parser for Your iPhone Project for memory/performance/XPath support. libxml2 is a good option, but it depends on your needs. And maybe you'll want to check the ASIHTTPRequest features before using the SDK.
I think using JSON would help reduce the parsing time. By building a REST service that, when sent a GET request, returns the correct information in an easily handled form, you could display the output a lot faster than by parsing straight HTML.
I prefer JSON over XML, but everyone has their personal preference. You should look at a few very good libraries that are built specifically for parsing both XML and JSON.
For XML I recommend using the built-in libxml parser, although this can sometimes prove very difficult to use. A simple Google search will bring up a heap of results relating to which parser should be used depending on what task is to be completed.
As for a JSON parser, I recommend SBJSON. I am currently using it in one of the biggest projects I have undertaken and it is definitely working perfectly for my use.
If you need a good way to connect to a RESTful web service, you should try LRResty.
Don't go for a parsing solution on the iPhone for 4 reasons:
The server can change its design and break your application (App Store submission takes a long time). It can also detect, based on the user agent, that the requests are sent from an application; you would have to update the application to change that.
Some of the requests might be made through JavaScript, so you not only have to parse (X)HTML but also handle JavaScript requests (which may, but don't have to, take the form of XMLHttpRequest calls).
Long-term evolution of the mobile market: maybe your client wants (or will want) an application for Android, BlackBerry, Bada OS (Samsung), Symbian (Nokia/Ovi Store), Java Mobile or Windows Phone 7?
And of course the network traffic, memory and CPU needed to parse HTML (look at the time it takes the browser to do it).
Regarding the traffic: if the application will not have huge traffic you can host the proxy yourself, or you can find a provider to host it for you. I guess you won't need more than a couple of megabytes of storage, but possibly more traffic. For less than €100/year you can find hosts with unlimited traffic (like the OVH Pro plan or Infomaniak). But if you want to go with Java, have a look at Google App Engine: you only pay if your traffic is significant and your application uses many CPU cycles; if not, you don't have to pay. And it's hosted on Google's servers, so it's reliable.
If the client is open to it, you could consider the PayPal API.