I'm starting work on a small web-based request system that needs to implement a two-step approval process (mainly C#, most likely MVC3).
I have come up with a simple workflow diagram and can easily write the code to move from step to step. However, I'm having trouble coming up with a way to persist workflow-related "tasks" or "steps" in the database.
For a request, an admin approves/denies it when it comes in, and then it goes into a "working" state. At completion, the user is asked to approve/deny the "QA" step.
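To make that concrete, here's a rough sketch of the shape I have in mind (a hedged sketch only; every name is a placeholder):

    using System;

    public enum RequestStatus
    {
        Submitted,   // awaiting the admin's approve/deny
        Denied,
        Working,     // admin approved, work in progress
        QaPending,   // work finished, awaiting the user's QA approve/deny
        Completed
    }

    public class Request
    {
        public int Id { get; set; }

        // Persisted as a plain int (or string) column on the request row.
        public RequestStatus Status { get; set; }

        public void AdminApprove()
        {
            if (Status != RequestStatus.Submitted)
                throw new InvalidOperationException("Request is not awaiting admin approval.");
            Status = RequestStatus.Working;
        }
    }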
Can anyone provide some directions on this?
I don't want to use Windows Workflow Foundation, since I don't have time to learn yet another new framework. And Google is not being too helpful: any mention of the word "workflow" keeps bringing up results for WF.
Learn WF.
Why? All you're doing by inventing your own framework is forcing the next people who maintain your code (or those who work with you) to "learn yet another new framework," as you put it. Do you want to write all that new documentation, or do you just want to point someone to MSDN?
WF4 is actually pretty easy to learn, way better than WF3. Before you throw away the idea completely, at least read this:
A Developer's Introduction to Windows Workflow Foundation (WF) in .NET 4
http://msdn.microsoft.com/en-us/library/ee342461.aspx
If you're building a web-based system, you can use Windows Server AppFabric to provide the persistence and monitoring layer and a WCF workflow service to host the workflow. The time you spend learning the basics of this will be far less than the time you'd spend inventing your own system. I guarantee it. You can install AppFabric via the Web Platform Installer tool, along with SQL Express for persistence.
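To give you a taste, here is a minimal WF4 hosting sketch with SQL persistence. ApprovalWorkflow is a placeholder activity, and the connection string assumes the workflow persistence database has been created with the scripts that ship with .NET 4 (or by AppFabric):

    using System;
    using System.Activities;
    using System.Activities.DurableInstancing;

    class Host
    {
        static void Main()
        {
            // Placeholder: in practice this is your two-step approval workflow.
            var app = new WorkflowApplication(new ApprovalWorkflow());

            // The instance store writes workflow state to SQL Server, so a
            // long-running approval survives process restarts for free.
            app.InstanceStore = new SqlWorkflowInstanceStore(
                @"Server=.\SQLEXPRESS;Database=WFPersistence;Integrated Security=True");

            // When the workflow goes idle (e.g. waiting for an approval),
            // persist it and unload it from memory.
            app.PersistableIdle = e => PersistableIdleAction.Unload;

            app.Run();
            Console.ReadLine(); // keep the demo host alive
        }
    }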
I'm using Swift, and the app needs to send a photo and matching text (that a user submits) to a server so that I can download the photo/text.
Would an existing FTP server that I have set up for my website work for this application? Or would it make more sense to do something with a web portal?
During my research, I'm finding options like Backendless, Alamofire, Gold Racoon, and various others. They seem like overkill for the simple task I'm looking to accomplish. Is there some minimal service out there that can automate my simple need? If not, which of these options would you recommend for my situation?
Ideally the setup would be free, but I'd be willing to spend up to $100 or $10/month if the service fits.
I'm new to app development, so I'm feeling overwhelmed with the options and not sure how to begin researching. I hope beginner questions aren't frowned upon here; I would really appreciate any advice on what I should begin learning to achieve my goal of sending a photo+text from an iOS app to a place where I can access them.
If there are other questions I should be asking to achieve this, please let me know.
For your case, there are two main routes to consider:
1. BYOS (Bring Your Own Server)
With this option, you are responsible for creating and maintaining your own server.
You can use various services such as DigitalOcean and Amazon for this.
On top of this, you would be responsible for creating and maintaining your own database. You would also need to write server-side code in addition to the client-side code (the app) so the two can communicate.
The advantage is that you control virtually everything, but I think it is pretty clear how much work this is. A minimal sketch of the server side follows below.
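For a sense of what "write server-side code" means in practice, here is a minimal sketch of an upload endpoint that accepts a photo plus its matching text (ASP.NET Core is used purely as an example; the route and field names are made up):

    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("api/uploads")]
    public class UploadsController : ControllerBase
    {
        // Accepts multipart/form-data containing a photo and a caption.
        [HttpPost]
        public async Task<IActionResult> Post(IFormFile photo, [FromForm] string caption)
        {
            if (photo is null || photo.Length == 0)
                return BadRequest("No photo supplied.");

            // Store the image and its matching text side by side on disk.
            var dir = Path.Combine("uploads", Guid.NewGuid().ToString("N"));
            Directory.CreateDirectory(dir);

            using (var file = System.IO.File.Create(Path.Combine(dir, "photo.jpg")))
                await photo.CopyToAsync(file);

            await System.IO.File.WriteAllTextAsync(Path.Combine(dir, "caption.txt"), caption ?? "");
            return Ok();
        }
    }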
2. BaaS (Backend as a Service), highly recommended
With this route, you simply write the app and let another service handle the server side of things. One of the most common is Firebase. Most folks, including startups, go down this route.
There are a bunch of other services out there.
Two Cents:
Whatever option or service you decide to use, I would recommend you make sure that:
a. The service has a good track record.
You do not want something that might get shut down in the next couple of months. I know this is hard to predict, but you can make reasonable initial guesses.
b. Make sure its community is vibrant.
The last thing you want as a newbie is to be stuck with no one to help you. Research around and see the different questions people ask and whether or not answers exist.
I am working on a website of 3,000+ pages that is updated on a daily basis. It's built on an open-source CMS, but we cannot simply continue to apply hot fixes on a regular basis. We need to replace the entire system, and I anticipate needing to do so again every 1-2 years. We don't have the staff to build a replacement system while the current one is being worked on, as that duplicates effort. We also cannot have a "code freeze" while we work on the new site.
So, this amounts to changing the tire while driving. Or fixing the wings while flying. Or all sorts of analogies.
This brings me to a concept called "continuous migration." I read this article here: https://www.acquia.com/blog/dont-wait-migrate-drupal-continuous-migration
The writer's suggestion is to use a CDN like Fastly, the idea being that a CDN allows you to switch between a legacy system and a new system on a per-URL basis. In theory, this sounds like an approach that would work. The article claims you can do this with Varnish, but that Fastly makes the job easier. I don't work much with Varnish, so I can't verify those claims.
I also don't know if this is a good idea or if there are better alternatives. I looked at Fastly's pricing scheme, and I simply cannot translate it into a specific price point. These cryptic cloud-service pricing plans don't make sense to me, and I don't know what kind of bandwidth the website uses; another agency manages the website's servers.
Can someone help me understand whether using an online CDN would be better than using something like Varnish? Are there free or cheaper solutions? Can someone tell me what this amounts to, approximately, on a monthly or annual basis? Are there other, better ways to roll out a new website on a phased basis for a large site?
Thanks!
I may not have exact answers to your question, but maybe this helps a little.
I don't think the CDN itself is what gives you the advantage; the advantage is having more than one system.
Changes to the code
In professional environments I'm used to having three different CMS installations. The first is the development system, usually on my PC. That system is used to develop extensions, fix bugs and so on, supported by unit tests. The code is committed to a revision control system (like SVN, CVS or Git). A continuous integration system checks the commits to the RCS. When a feature is implemented (or some bugs are fixed), a named tag is created. This tagged version is then installed on a test system where developers, customers and users can test the implementation. After a successful test, exactly this tagged version is installed on the production system.
At first sight this looks time-consuming, but it isn't, because most of the steps can be automated. The biggest advantage is that the customer can test the change on a test system, and it is very unlikely that an error occurs only on your production system. (A precondition is that your systems are built on a similar/identical environment.)
Changes to the content
If your code changes the way your content is processed, it is an advantage when your CMS has strong workflow support. Then you can easily add a step to your workflow which decides whether the content is old and has to be migrated for the current document. This way you have a continuous migration of the content; a sketch follows below.
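A minimal sketch of such a workflow step (all names invented here): when a document is loaded, the step compares a stored schema version against the current one and migrates the content forward before it is processed.

    public class Document
    {
        public int SchemaVersion { get; set; }
        public string Body { get; set; }
    }

    public static class ContentMigrator
    {
        public const int CurrentVersion = 2;

        // Extra workflow step: old documents are migrated lazily, one
        // version at a time, whenever they are requested.
        public static Document EnsureCurrent(Document doc)
        {
            while (doc.SchemaVersion < CurrentVersion)
                MigrateOneStep(doc);
            return doc;
        }

        static void MigrateOneStep(Document doc)
        {
            switch (doc.SchemaVersion)
            {
                case 0:
                    // e.g. rewrite legacy markup to the new tag format
                    doc.Body = doc.Body.Replace("[old-tag]", "[new-tag]");
                    break;
                case 1:
                    // e.g. move inline image references into a media library
                    break;
            }
            doc.SchemaVersion++;
        }
    }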
HTH
Varnish is a cache rather than a CDN. It intercepts page requests and delivers a cached version if one exists.
A CDN will serve content (images, JS, other resources, etc.) from an off-server location, typically in the cloud.
Pricing for cloud-based solutions is often very cryptic, as it's quite complicated technology.
I would be careful with continuous migration. I've done both methods in the past (continuous and full migrations) and I have to say, continuous is a pain. It means double the admin time for everything, and assumes your requirements are the same at all points in time.
Unfortunately, I would say you're better off with a proper rebuild on a 1-2 year basis than with a continuous migration, but obviously you know your situation best.
I would suggest you also consider a hybrid approach: build yourself an export tool that keeps all of your content in a transferable state like CSV/XML/JSON, so you can just import it into a new system when ready. This means you can incorporate new build requests when you need them in a new system (what's the point of a new system if it does exactly the same as the old one), and you get to keep all your content. Plus you don't need to build and maintain two CMSs all the time.
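As an illustration of the export idea (the field names are mine, and JSON is just one of the possible formats):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text.Json;

    // One neutral record per CMS page, independent of any particular CMS.
    public record PageExport(string Slug, string Title, string Html, DateTime Updated);

    public static class Exporter
    {
        // Dump every page to a single JSON file that a future system
        // (or a one-off import script) can read back in.
        public static void WriteAll(IEnumerable<PageExport> pages, string path)
        {
            var json = JsonSerializer.Serialize(pages,
                new JsonSerializerOptions { WriteIndented = true });
            File.WriteAllText(path, json);
        }
    }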
I'm admittedly unsure whether this post falls within the scope of acceptable SO questions. If not, please advise whether I might be able to adjust it to fit or if perhaps there might be a more appropriate site for it.
I'm a WinForms guy, but I've got a new project where I'll be making web service calls for a Point of Sale system. I've read about how CRUD operations are handled in RESTful environments, where GET/PUT/POST/etc. represent their respective CRUD counterparts. However, I've just started working on a project where I need to submit my requirements to a developer who'll be building a web API for me to use, and he tells me that this isn't how the big boys do it.
Instead of making web requests to create a transaction followed by requests to add items to it (the object-based approach I'm accustomed to), I will instead use a service-based approach: make a 'prepare' checkout call to see the subtotal, tax, total, etc. for the transaction with the items currently on it, and then, when I'm ready to actually process the transaction, make a 'complete' checkout call.
I quoted a couple words above because I'm curious whether these are common terms that everyone uses or just ones that he happened to choose to explain the process to me. And my question is, where might I go to get up to speed on the way the 'big boys' like Google and Amazon design their APIs? I'm not the one implementing the API, but there seems to be a little bit of an impedance mismatch in regard to how I'm trying to communicate what I need and the way the developer is expecting to hear my requirements.
I'm not sure about the specifics of your application, though your general understanding seems OK. There are always corner cases that test the norm, though.
I would advise that you listen to your dev team on how things should be implemented and just provide the "whats" (requirements). They should be trusted to know best practice and your company's own interpretation and standards (right or wrong). If they don't meet your requirements (ease of use, or the API can't easily be reused as requirements expand), then you can review why with an architect or dev manager.
However, if you are interested and want to debate and perhaps understand, check out Atlassian's best practice here: https://developer.atlassian.com/plugins/servlet/mobile#content/view/4915226.
FYI: Atlassian makes leading dev tools that are used in very large companies. Note also that these best practices came out of refactoring, meaning they've been through the mill and know what has worked and what hasn't.
FYI2 (edit): Reading between the lines of your question, I think your dev is basically describing how transactions are managed in REST. That is, you don't typically begin, add, then end; instead, everything that is transactional is rolled into a transaction wrapper and POSTed to the server as a single transaction.
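To illustrate the shape of that (a sketch only; the endpoint, field names and 'prepare'/'complete' modes are my assumptions, not a real API), the client builds the whole checkout locally and POSTs it in one request:

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Net.Http.Json;
    using System.Threading.Tasks;

    public record LineItem(string Sku, int Quantity, decimal UnitPrice);

    // The whole transaction travels as one wrapper object.
    public record CheckoutRequest(string Mode, List<LineItem> Items);

    public static class PosClient
    {
        static readonly HttpClient Http =
            new HttpClient { BaseAddress = new Uri("https://pos.example.com/") };

        // Mode "prepare" returns subtotal/tax/total without committing;
        // mode "complete" actually processes the transaction.
        public static Task<HttpResponseMessage> Checkout(string mode, List<LineItem> items)
            => Http.PostAsJsonAsync("api/checkout", new CheckoutRequest(mode, items));
    }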
We're an SME with SAP implemented. We're trying to use the transactional data in SAP to build another system in PHP for our trucking division, for graphical reports and the like. This is because we don't have in-house ABAP development expertise, and any SAP modifications are expensive.
Presently, I've managed to achieve our objectives with read-only access to our Quality DB2 server and any writes go to another DB2 server. We've found the CPU usage on the SELECT statements to be acceptable and the user is granted access only to specific tables/views.
SAP's Quality DB2 -> PHP -> Different DB2 client
I'd like your opinion on whether it is safe to read from production the same way. Implementing all of this again via the RFC connector seems very painful. A master-slave config is an option for us, but again it will involve external consultancy.
EDIT
Forgot to mention that our SAP guys don't want to build even reports for another six months; they want to leave the system intact. That is why we're building this in PHP on top.
If you don't have ABAP expertise, get it - it's not that hard, and you'll get a lot of stuff "for granted" (as in "provided by the platform") that you'd otherwise have to implement manually - like user authentication, authority management and software logistics (moving stuff from the development to the production repository). See these articles for a short (although biased) introduction. If you still need an external PHP application, fine - but you really should give ABAP a try first. For web applications, you might want to look into Web Dynpro ABAP. Using the IGS built-in chart engine with the BusinessGraphics element, you'll get a ton of chart types for free. You can also integrate PDF forms created with Adobe LiveCycle Designer.
Second, while "any SAP modifications are expensive" might be a sound policy, what you're suggesting isn't a modification. It's add-on development, and it's neither expensive nor more complex than any other programming language and/or environment out there. If you can't or don't want to implement your own application entirely on the existing infrastructure, at least use a decent interface - web services, RFC, whatever. From an ABAP point of view, RFC is always the easiest option, but you can use SOAP or REST as well, although you'll have to implement the latter manually. It's not that hard either.
NEVER EVER access the SAP database directly. Just don't. You'll have to implement all the constraints like client dependency or checks for validity dates and cancellation flags for yourself - that's hardly less complex than writing a decent interface, and it's prone to break every time the structure is changed. And if at some point you need to read some of the more complex contents like long texts, you're screwed - period. Not to mention that most internal or external auditors (if that happens to be an issue with your company and/or legal requirements) don't like direct database access to a system as critical as this one, which again can cause lots of trouble from people you really don't want to mess with. It's just not worth it.
We are in the process of selecting a workflow solution for a company that uses Microsoft products end to end. Given the news on WF4, in that it seems to be essentially a rewrite of previous versions, is it a wise move to back the current version or should we be looking elsewhere?
I.e., is the current version so bad that it would be unwise to try to use it?
Having just launched a project using .NET 3.5 and Workflow, I'd say the current release of WF is good enough to use and run with. It has helped us get a product out quickly (we have the usual feature creep, and requirements change weekly). However, I have a list of complaints:
The workflow designer will drive you insane because it is so slow (in certain circumstances) and re-arranges your state machines as it sees fit.
There is no built-in upgrade strategy for keeping your old workflows running once you do a bug fix release. If you are going to use WF, think carefully early on about how to do upgrades.
Integrating with WCF (the Send and Receive activities) hides the WorkflowRuntime from you; this makes it very difficult to understand what is going on under the hood.
It's not easy to unit test them. There are ideas out there, but none seemed particularly easy when we started: WorkFlow Unit Testing
I like the ideas and potential of workflow-based development; however, I am not in a hurry to repeat this experience and would probably do without it for long-running processes. One place I would use it again would be in a short, complicated process (like a rules engine for working out prices).
Maybe it is a little late for you, but now that WF 4.0 has been released in beta, other people asking the same question can consider backing the 4.0 horse instead of the 3.5 one.
This goes some way to fixing the following problems:
•The workflow designer will drive you insane because it is so slow (in certain circumstances) and re-arranges your state machines as it sees fit.
[Designer Perf Improved]
•It's not easy to unit test them. There are ideas out there, but none seemed particularly easy when we started: WorkFlow Unit Testing
[I think it's a little easier now; some of the introductory workflow samples include plenty of unit testing - see the sketch below]
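For instance, a WF4 test can run an activity synchronously through WorkflowInvoker and assert on its outputs (AddTax here is a made-up activity with a Subtotal in-argument and a Total out-argument):

    using System.Collections.Generic;
    using System.Activities;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class AddTaxTests
    {
        [TestMethod]
        public void Adds_ten_percent_tax()
        {
            // WorkflowInvoker runs the activity on the calling thread,
            // so no WorkflowRuntime plumbing is needed in the test.
            var outputs = WorkflowInvoker.Invoke(new AddTax(),
                new Dictionary<string, object> { { "Subtotal", 100m } });

            Assert.AreEqual(110m, (decimal)outputs["Total"]);
        }
    }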
My understanding is that Microsoft will provide backwards compatibility and/or a migration strategy to the new WF, so I would guess that you are safe to use it. However, I have heard from other developers in my organization that the current version of WF is extremely painful to use. If you have the budget (and depending on the complexity of your workflows), you may want to consider K2: http://www.k2.com/en/index.aspx
I, as a workflow developer, think that the current version is painful to use. This is not surprising, as it is v1.0 software from Microsoft :)
I think you should first consider your expectations of a workflow product. Do you have a well-defined list of expectations for WF? Actually, I am curious what such a list would contain. Maybe we can help in more detail on each topic.
I don't know why people have such negative impressions of WF. Sure, it has its drawbacks, but I thought it was pretty useful. The one major issue I have with it is the lack of support for upgrading existing workflows (bullet #2 in gbanfill's list).
Another point in favour of the current version is that "Dublin" (Microsoft's new app server) will be built on WCF & WF .NET 4.0 but will gladly host 3.5 workflows. So you will be able to migrate to it without a rewrite.
Just a quick note to mention that Visual Studio 2010 CTP contains a new updated WF designer as part of the Oslo objective.