Is going for a BCS the right move for me?

I'm at a fork in the road. I need somebody to give me some advice from their personal journey in IT.
At the moment, I have a two-year college diploma in Computer Programming and about two years of professional experience in software. I'm currently freelancing my programming skills, and am enjoying a nice income and the rewards of flexibly working on a variety of projects with different cool people.
I'm young (21 years old) and passionate about software, technology, the internet, and business. I know that if I ever want to delve deeper into the software industry, I might have a hard time doing so without a Bachelor's in Computer Science.
On one hand, I think I'm better off getting my BCS while I'm still young and malleable, and the thought of learning even more in my field is really exciting to me. On the other hand, it means another 3-4 years of studying, and putting off vacations and wealth accumulation for a long time.
Considering that I'm already pretty successful with my college diploma, do you think it's a good idea for me to go get my BCS? Will it open up many more doors in the future?

I know that if I ever want to delve deeper into the software industry, I might have a hard time doing so without a Bachelor's in Computer Science.
I completely disagree with this statement. If you want to "delve deeper into the software industry", your drive, determination, and will to learn will be your limiting factors. I've known many CS master's degree candidates who, frankly, sucked. I've known CS majors who were very skilled and talented. I've known English majors who were excellent developers. The distinguishing factor between those who could and those who could not was a desire to learn.
I once thought that I needed a degree, but ultimately found it to be a distraction. I even wrote about it here: http://jasonleveille.com/blog/2009/10/the-cs-masters-degree-distraction. I don't know if a degree is or isn't right for you. You have to make that decision. Good luck figuring it out.

As far as helping your résumé for getting a good job... I consider my Stack Overflow profile about as powerful as my educational background from one of the top universities in Canada. Just a thought.
I do think a diploma will help you, though, in two main ways:
In learning ability/thought discipline
As résumé flair for getting jobs
But experience is king for many companies. Personally, if I were in your shoes, I'd probably continue to do some contract work and take a class or two at a time.
If you have the will now though, and the money, and the time, go for the education full time and take on some projects part time. You may not be as flexible in the future.

My journey is a bit different from yours as I did my Bachelor's straight out of high school. I graduated with a high enough average to get a few scholarships and ended up with CS as one of my majors and Combinatorics & Optimization as my other one. I graduated in 1997 and this was about the time of the dot-com boom so while it did take a few months I did eventually get a job.
Could some of your college credits count towards a Bachelor's degree? Have you considered whether there is a Bachelor of Information Systems program that may suit you better than a general CS program? Those are a couple of questions I'd put out there. I think the key is finding the program that works for you: while some places may have a Bachelor's degree as a requirement, there will likely be places that don't, so I second Jason's reply on that point.

How to motivate team to work on legacy products

We are a team working on legacy code that is quite old and written in languages from the early days of programming. The team members are trained in the latest technologies but are now assigned to legacy code, and they are not happy. How can we motivate them to work on the legacy code as well?
Send your team to meet users and watch them use the software. They should find out the most critical problems users have with that software.
Getting to know users makes the work more real: your team will know that adding new functionality or eliminating bugs will help a real person. That should motivate programmers to get a boring job done.
Cash alone cannot make developers happy. You should also provide a good environment so that they can focus on their work.
Another thing: no technology is inherently bad, legacy, or too old. If your company needs to maintain it, you must keep it going. But keep up all your standards for design, coding, testing, code review, interactive sessions, etc.
You can also motivate them by converting the legacy code to a new platform for better performance and maintainability. I think every company does that at least once, because they want to compete with other products on the market.
Also, provide some sessions on other technologies that are used in your company but that they don't know or use. Let them get deeply into things, and give them proper time and support for problem solving. The main goal is to deliver on time with less rework and fewer bugs.
Provide some rewards for their work and keep them happy about it.
Thanks. I really like "Send your team to meet users and watch them use the software."
If I have to motivate my team, I will ask my developers to visit the users and find out how happy they are with the product.
I would really like to take on the challenge of making it better than what exists.
Do you have some scope for retiring the legacy code in the foreseeable future? If so, "we only need to keep this going until..." might sweeten the pill.
Are the team members experienced in the languages/environments the legacy code is written in? If not, it might be simple reluctance to do something that they don't know how to operate. Scheduling in some time for them to gain at least a passing familiarity might be in order; provided it's not too much of a paradigm shift from the latest technology, it shouldn't be all that hard.
Are the team members only allowed to work on the legacy code team, or can their time be split between different projects? I don't think anyone is going to be happy about spending a 40-hour week on FORTRAN debugging. But if you have to spend a few hours on the legacy code knowing that you can take breaks during the day to work on something you actually enjoy, it's a little less painful.
And I'll reiterate what was said before about making sure the team members have time to learn and gain experience with the old technologies before throwing them in there. Try to make the training enjoyable too. Our legacy code training was set up as a competition to see who could come up with the fastest/shortest/most complete/etc solution to interesting problems rather than having to look exclusively at the code we were to be working on. Really, that could be applied to the team's plan even if you don't have time set aside for training. Add a little competition to the task at hand or allow a little bit of time for challenging and competitive side projects.
How are they being rewarded for working on these legacy products? Do you know what motivates them? Some people may prefer timely recognition and praise, while others may expect cash, or acknowledgment that this isn't necessarily what they signed up for when they took the job. I'd be tempted to suggest 1:1 meetings to find out what would make them happier. Is it more money? More flexibility in time off? Training in the legacy technologies? Affirmation that they are doing good work on these ancient systems? "Initial programming days" makes me think of mainframes and other really old tools where one may wonder, "How much longer will this really run?"
Cash isn't the answer. Free food, soft drinks, whatever, that only goes so far at alleviating the drudgery of legacy code work. What about trying to change their perspective?
"Anyone can do good work with modern code that has a nice IDE with refactoring built in, a ton of resources just one Google search away, but we proud few, we band of brothers, we are good enough to do this with ancient procedural languages. We'll tame this awful mess of code and do it with one hand behind our backs and create processes and tools to make sure the next poor bastards won't have it so bad."
I would say the simplest way to get a positive response from developers toward legacy coding is to make the old new in some way.
Have a session or two to identify what it is that the legacy code does, and then get an idea of what it would take to do it anew on a new architecture. The "new architecture" part is key, because 9/10 times, it's the architecture that is dreaded (spaghetti code, pre-standard conventions, etc...).
If you can't get your re-write estimates approved, then at least work out a plan to get your refactoring of legacy code into the daily maintenance. At the very least your developers will feel as though they are working towards something, and something new at that, instead of just monkey-wrenching the old decay that no one wants to even remember.
Just my 2¢.
You can, for example, try to do fancy things on the testing side: try out mocking frameworks, etc.
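For instance, a minimal sketch of the kind of test a mocking framework enables, assuming Python and its standard unittest.mock module (the legacy_report function and its query string here are hypothetical stand-ins, not anyone's real code):

    # Characterization test for a hypothetical legacy function, using a mock
    # to stand in for the slow external database it normally talks to.
    from unittest import TestCase, main
    from unittest.mock import Mock

    def legacy_report(db):
        """Hypothetical legacy code: formats the row count of a table."""
        rows = db.query("SELECT COUNT(*) FROM orders")
        return "orders: %d" % rows[0]

    class LegacyReportTest(TestCase):
        def test_report_formats_count(self):
            db = Mock()
            db.query.return_value = [42]   # fake the database's answer
            self.assertEqual(legacy_report(db), "orders: 42")
            db.query.assert_called_once_with("SELECT COUNT(*) FROM orders")

    if __name__ == "__main__":
        main()

Pinning down the current behavior this way also gives the team a safety net before they touch anything.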
Also try to emphasize that handling legacy code is good experience if you want to become a solid programmer, since every technology eventually becomes legacy.
Extra cash? :) Don't know anything else...
Even when legacy code is written in newer technologies it's not always a pleasure to work on, let alone code from the "initial technologies"... I guess the only motivating thing is discovering how programming was done in those days...
The time spent motivating the team, learning the legacy code, and fixing it half-heartedly could just as easily be used to build the same thing on a new platform, given the amount of resources, IDEs, expertise, frameworks, etc. available for free. The good news is that you already have the system in place; you just need to match its behavior on the new platform, unlike building something new for a product whose behavior and user experience you don't know.

Encouraging good development practices for non-professional programmers? [closed]

In my copious free time, I collaborate with a number of scientists (mostly biologists) who develop software, databases, and other tools related to the work they do.
Generally these projects are built on a one-off basis, used in-house, and eventually someone decides "oh, this could be useful to other people," so they release a binary or slap a PHP interface onto it and shove it onto the web. However, they typically can't be bothered to make their source code or dumps of their databases available for other developers, so in practice, these projects usually die when the project for which the code was written comes to an end or loses funding. A few months (or years) later, some other lab has a need for the same kind of tool, they have to repeat the work that the first lab did, that project eventually dies, lather, rinse, repeat.
Does anyone have any suggestions for how to persuade people whose primary job isn't programming that it's of benefit to their community for them to be more open with the tools they've built?
Similarly, any advice on how to communicate the idea that version control, bug tracking, refactoring, automated tests, continuous integration and other common practices we professional developers take for granted are good ideas worth spending time on?
Unfortunately, a lot of scientists seem to hold the opinion that programming is a dull, make-work necessary evil and that their research is much more important, not realising that these days, software development is part of scientific research, and if the community as a whole were to raise the bar for development standards, everyone would benefit.
Have you ever been in a situation like this? What worked for you?
Software Carpentry sounds like a match for your request:
Overview
Many scientists and engineers spend much of their lives programming, but only a handful have ever been taught how to do this well. As a result, they spend their time wrestling with software, instead of doing research, but have no idea how reliable or efficient their programs are.
This course is an intensive introduction to basic software development practices for scientists and engineers that can reduce the time they spend programming by 20-25%. All of the material is open source: it may be used freely by anyone for educational or commercial purposes, and research groups in academia and industry are actively encouraged to adapt it to their needs.
Let me preface this by saying that I'm a bioinformatician, so I see the things you're talking about all the time. There's some truth to the fact that many of these people are biologists-turned-coders who just don't have the exposure to best practices.
That said, the core problem isn't that these people don't know about good practices, or don't care. The problem is that there is no incentive for them to spend more time learning software engineering, or to clean up their code and release it.
In an academic research setting, your reputation (and thus your future job prospects) depends almost entirely on the number and quality of publications that you've contributed to. Publications on methods or new algorithms are not given as much respect as those that report new biological findings. So after I do a quick analysis of a dataset, there's very little incentive for me to spend lots of time cleaning up my code and releasing it, when I could be moving on to the next dataset and making more biological discoveries.
I'll also note that the availability of funding for computational development is orders of magnitude less than that available for doing the biology. In a climate where only 10% of submitted grants are getting funded, scientists don't have the luxury of taking time to clean and release their code, when doing so doesn't help them keep their lab funded.
So, there's the problem in a nutshell. As a bioinformatician, I think it's perverse and often frustrating.
That said, there is hope for the future. With second- and third-generation sequencing in particular, biology is moving into the realm of high-throughput discovery, where data mining and solid computational pipelines become integral to the success of the science. As that happens, you'll see more and more funding for computational projects, and more and more real software engineering happening.
It's not exactly simple, but demonstration by example would probably drive the point home most effectively: find a task the researcher needs done, find someone who did take the time to make a tool with source available, and point out how much time the researcher could save by having that tool. Then point out that they could give back to the community in the same fashion.
In effect, what you are asking them to do is become professional developers (with their copious free time), in addition to their chosen profession. Their reluctance is understandable.
Does anyone have any suggestions for how to persuade people whose primary job isn't programming that it's of benefit to their community for them to be more open with the tools they've built?
Give up. Seriously, this is like teaching a pig to sing. (I can say this because I used to be a physicist so I know what they're like.)
The real issue is that your colleagues are rewarded for scientific output measured in publications, not software. It's hard enough in computer science to get recognized for building software; in the other sciences, it's nearly impossible.
You can't sell good development practice to your biology friends on the grounds that "it's good for you." They're going to ask "should I invest effort in learning about good software practice, or should I invest the same effort to publish another biology paper?" No contest.
Maybe framing it in terms of academic/intellectual responsibility would help, to a degree - sharing your source is, in many ways, like properly citing your sources or detailing your research methodology. There are similar arguments to be made for some of the "professional software developer" behaviors you'd like to encourage, though I think releasing the code is probably an easier sell on these grounds than other things which could require significantly more work.
Actually, asking any busy project team to include in their schedule time for making their software suitable for adoption by another team is extremely hard in my experience.
Doing extra work for the public good is a big ask.
I've seen a common pattern of "harvesting" after the project is complete, reflecting that immediate coding for reuse tends to get lost in the urgency of the day.
The only avenue I can think of is if the reuse is within an organisation with a budget for a "hunter gatherer", someone whose reason for being there is IT.
You may have more of a win with things such as unit tests, because they have an immediate payback for the development.
For one thing, could we please stop teaching biologists Perl? Teaching non-professional programmers a write-only language is practically guaranteed to lead to unmaintainable, throw-away code. Python fills the same niche, is just as easy to learn (it's even used to teach kids programming!), and is much more readable.
Draw parallels with statistics. Stats is a crucial part of scientific research, and one where the only sensible advice is: either learn to do it properly, or get an expert to do it for you. Incorrectly-done stats can completely undermine a paper, just as badly-written code can completely undermine a public database or web resource.
PS: This blog is very good, but getting them to read it will be an uphill struggle: Programming for Scientists
Chris,
I agree with you to a degree, but in my experience what ends up happening is that in their eagerness to publish you end up with too many "me too" codes and methods, which don't really add to the quality of science. If there was a little more thought about open sourcing code and encouraging others to contribute (without necessarily getting publications out of it) then everyone would benefit.
Definitely agree that a separation between the scientific programmers and the software engineers is a good thing, especially for production applications. But even for scientific programming, the quality of my code would have been so much better if I had followed good practices at the time.
In my experience the best way of getting people to program cleanly is to show a good example when you're working with them.
eg: "I never spend hopeless days debugging my code because the first things I code are automated unit tests that will pinpoint problems when they are small and easily detectable"
or: "I'm very bad at keeping track of versions of things, but sometimes my new code does break what did work before. So I use svn/git/dropbox to keep track of things for me"
In my experience that kind of statement can raise the interest of "biologists that learned how to script".
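To make the unit-test statement concrete, here is a minimal sketch in Python (my own illustration; gc_content is a made-up example, not code from any of the posts above):

    # A tiny self-checking helper: problems get caught while they are small,
    # instead of during a hopeless day of debugging.
    def gc_content(seq):
        """Fraction of G and C bases in a DNA sequence."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    def test_gc_content():
        assert gc_content("GGCC") == 1.0
        assert gc_content("atat") == 0.0
        assert abs(gc_content("ATGC") - 0.5) < 1e-9

    test_gc_content()
    print("all tests passed")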
And if you need to collaborate on a bigger project, make it clear that you have more experience and that everything will go more smoothly if things are done your way.
Regarding publication of code, current practice is indeed frustrating. I would like to see a new journal like Source Code for Biology and Medicine, where code is peer-reviewed and can be published, but that has no (or very low) publication costs. Putting code on SourceForge or elsewhere is indeed not "scientifically worth it" because it doesn't add a line to your publication list, and most code is not revolutionary enough to warrant paying $1,000 for publication in Source Code for Biology and Medicine or PLoS ONE...
You could have them use a content management system, like Joomla. That way they only push content and not code.
I wouldn't so much persuade as I would streamline the process. Document it clearly, make video tutorials and bundle some kind of tool chain that makes it ridiculously easy to get source repositories set up without requiring them to become experts in something that isn't their main field.
Take a really good programmer who already knows best practices, and ask your scientists to teach him what they need and what they do; eventually the programmer will have the minimum domain knowledge (I suspect it takes between 1 and 3 years, depending on the domain) to do what the scientists ask for.
Developers are always learning another domain of competency, because most of their programs are not for developers, so they need to know what the "client" does.
To be devil's advocate, is teaching scientists to be good software engineers the right thing to do? Software in research is usually very purpose specific - sometimes to the point where a piece of code needs to run successfully only once on a single data set. The results then feed into a publication and the goal is met. And there's a high risk that your technique or algorithm will be superseded by a better one in short order. So, there's a real risk that effort spent producing sparkling code will be wasted.
When you're frustrated by wading through a swamp of ill-formed Perl code, just think that the code you're looking at is one of the rare survivors. Mountains of such code have been written, used a few times, then discarded, never to see the light of day again.
I guess I'm just saying there's a big place in research for smelly heinous one-off prototype code. There are good reasons why such code exists. It may not be pretty, but if it gets the job done, who cares? We can always hire a software engineer to write the production-ready version later, IF it turns out to be justified, and let our scientists move on.

Has anyone been to the Pragmatic Studio iPhone course, and if so, is it worth it?

I am thinking of going to the Pragmatic Studio iPhone course, but am a little wary due to the price. I currently do contract work and therefore don't have a company that will pay the course costs for me, other than my own, of course.
After travel, food, hotel, and currency conversion, the course runs to over $3,500.
Is it worth it?
I'm honestly not a fan of coding bootcamps or short duration courses in terms of cost/benefit.
If money wasn't an issue I would say go for it but since you are paying your own way and brought up the money aspect I'm going to assume that at least some consideration must be taken before dropping $3.5k.
I find that I can honestly absorb much more material and at greater depth by simply reading a book and attempting exercises or small projects myself. Certainly there are some tips & tricks that can be imparted to you through actual instruction that you might not glean right away from self-learning but those things won't be the bulk of the knowledge you have to learn.
These programs remind me of short-duration certification bootcamps: expensive and of little substance. You have to be realistic: you are there for 4 days. You are going to learn how to set up the development environment and be shown some canned examples to follow along with, illustrating the accelerometer, photos, and other iPhone basics.
Is that really worth it to you? To me, a serious professional developer would simply sit down with a few of the cocoa/iPhone books and start banging out some chapters on their time off.
So, in my opinion, save yourself some money: spend $90 on some books and take a few days off work.
I hemmed and hawed about the BigNerd Ranch bootcamp and decided against it.
Classrooms are great, and you can really learn a lot, but I think I would benefit more from an advanced iPhone bootcamp now than from one for newbies to the platform back then.
Knowing how stuff works makes it easier to learn how similar stuff works. Having a class on hardcore multi-threading, UI best practices, various networking patterns, and in-depth CoreAnimation...that is a bootcamp you could sell videos of, let alone pack to the rafters right now.
What I found was doing real projects and trying to make stuff teaches you far more than a classroom setting would, and faster. Get your hands dirty, and start making mistakes.
I personally have not been, but I hear Bill Dudney is great. It sounds expensive, but maybe you can write it off for tax purposes. That being said, any time you get a chance to learn something, it's probably worth it. If you think you would like to make commercial iPhone apps AND you think you can make at least $3,500 doing it, then I say go for it. If you think you're just going to be doing this as a hobby, or you think you'll make less than $3,500, then I totally agree with Simucal.
P.S. Maybe you could write a tutorial on how to make iPhone apps after this. Also the networking section looks fun!

The Framework/IDE Knowledge Trap [closed]

We don't teach children calculus first. We first teach them arithmetic, then algebra, then geometry, then analytic geometry, then finally calculus.
Why, then, do we teach our computer scientists frameworks and IDEs first? Some curricula do force students to learn computer science fundamentals, but the vast majority of graduates that I see could not compose a framework of their own to save their lives.
Where then is the next generation of tool builders?
How can we promote the understanding necessary to create frameworks and development environments?
This is of course a generality. Not all education is lacking, but it seems to be the majority and it brings down the quality of our profession as a whole.
I think the analogy is a bit off. A better analogy would be "We don't teach our kids to use calculators to add and subtract, why teach programmers to use an IDE to program?"
Get rid of HR departments that require X years experience in Y. The universities are just tailoring their course to the HR department's requirements.
I employ graduates who can code in something (I really don't care what language) and who can learn.
I see your point, although I think the math analogy doesn't quite fit. You have to know basic arithmetic to be able to get anything done in any other math discipline.
When I began programming frameworks were mostly unheard of. If you wanted a binary tree, by God, you went and wrote one. In C or Assembler. That was basically it, so to get anything done at all you had to know a lot.
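For anyone who never had to, here is a minimal sketch of the sort of binary search tree you hand-rolled (in Python for brevity here, rather than the C or Assembler of the era):

    # A hand-rolled binary search tree: the kind of thing you wrote yourself
    # before a framework shipped one for you.
    class Node:
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None

    def insert(root, key):
        """Insert key and return the (possibly new) root."""
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root

    def contains(root, key):
        while root is not None:
            if key == root.key:
                return True
            root = root.left if key < root.key else root.right
        return False

    root = None
    for k in [5, 2, 8, 1, 9]:
        root = insert(root, k)
    assert contains(root, 8) and not contains(root, 7)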
Today, Frameworks and IDEs and designers make it possible for "noobs" to create actually pretty brilliant things without knowing the first thing about how to build a framework, or a compiler, or manage memory allocation.
The real issue is, what about all the dingbats that think they are awesome, great programmers because they used Frontpage or Access? Managers have a hard time telling the difference between that kind of programmer and one that really knows software development as a discipline.
So, specifically, why is it that way? Because everyone wants a job and nobody hires programmers that know how to build a binary tree. They want programmers that know .Net or J2EE, etc.
I would argue that there is probably enough work out there for 9-to-5 programmers who can start at the framework level and go up from there. The truly good ones, mostly the program-as-a-career and/or program-as-a-hobby types, are going to pick up whatever knowledge they missed in college over time anyway. You can't force everyone to be a wonderful programmer no matter what curriculum you teach. Inquisitive students are going to learn the fundamentals whether they're taught in class or entirely on their own.
There are tool makers and tool breakers. And of course there are tools, but let's not go there.
If you have a good look at an automotive workshop, you will see a lot of funny little tools that you don't see on the shelves in hardware stores. Like the ones for pushing back brake caliper pistons. Or the clamps for compressing valve stems so you can get the collets out with one hand while talking to your mates about nailing the new secretary (instead of watching them fly across the room when the spring slips out from your screwdriver).
These were designed by mechanics. They're really effective, generally small and cheap, and totally incomprehensible until you've seen them in action.
Most of the profound changes in automotive technology were bottom-up, but top-down is also needed. Individual mechanics can't make fundamental technology changes like the switch from cast iron to alloy heads. A new broom sweeps clean, an old broom knows the corners. You need both.
But I digress: the point is that the mechanics couldn't design these tools if they lacked fundamental skills and knowledge. My father built me an entire motorcycle from scrap iron when I was a kid. As an adult, because I lack his skills and knowledge and modes of thought, I can barely maintain the bike I bought from Honda, much less take to it with an oxy like Mr T in a creative frenzy.
With code, I am as my father was with steel. Donald Knuth is my constant companion, and when the wireless protocol for our GPS loggers needs to be implemented in .NET it's me they come to see. The widget monkeys wouldn't know where to start.
I think the problem is in fact the GUI paradigm in general.
Microsoft made using computers much easier when they popularized the graphical user interface. They brought this interface metaphor (the desktop, the file) to the domain of programming as well, and very effectively too, with their Visual Basic tool.
But just as the GUI obscures what happens "under the hood", so does the IDE obscure the manipulation of bits and bytes. The question is, of course, the risk-to-reward ratio: how much understanding do programmers lose in exchange for productivity?
A cursory look at "The Art of Computer Programming" might show why IDEs are useful: "The ultimate packing density is achieved when we have 1-bit items, because we can cram 64 of them into a single 64-bit word. Suppose, for example, that we want a table of all odd prime numbers less than 1024, so that we can easily decide the primality of a small integer. No problem; only eight 64-bit numbers are required:
p0 = 011101101101001100101101001001001100101100101001000101101101000000
p1 = . . ."
Programming is really hard, you can see how an IDE might help. :^)
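Out of curiosity, the packing trick the quote describes is easy to reproduce. Here is a minimal sketch (my own code, in Python rather than anything from the book, so the bit conventions differ from Knuth's):

    # Pack "is this odd number prime?" for all odd numbers below 1024 into
    # eight 64-bit words: 512 candidates, one bit each.
    def sieve(limit):
        """Plain Sieve of Eratosthenes: set of primes below limit."""
        flags = [True] * limit
        flags[0] = flags[1] = False
        for n in range(2, int(limit ** 0.5) + 1):
            if flags[n]:
                for m in range(n * n, limit, n):
                    flags[m] = False
        return {n for n in range(limit) if flags[n]}

    words = [0] * 8
    for p in sieve(1024):
        if p % 2 == 1:                 # store only the odd primes
            index = p // 2             # odd number 2k+1 maps to bit k
            words[index // 64] |= 1 << (index % 64)

    def is_odd_prime(n):
        """Primality of a small odd integer: one shift and one mask."""
        index = n // 2
        return bool((words[index // 64] >> (index % 64)) & 1)

    assert is_odd_prime(1021) and not is_odd_prime(1023)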
Learning the abstraction is easier than learning the details when it comes to programming. It's harder to teach someone to hand-code assembler to print "Hello World" than it is to have them throw together a form with a button on it that shows a "Hello World" message when the button is clicked.
You didn't know how to build the engine of a car before learning to drive, did you? Because it's not necessary in order to drive. In the same vein, you don't need to learn how a linked list or binary tree works in order to maintain a list of names and search them.
There will always be those who want to get under the hood and learn the "why" of things, but I don't think it's required to get things done.
I always screen applicants by asking difficult questions that they can only answer if they understand how something really works. I think it is a real shame that colleges and universities are teaching people framework-based development without focusing on core software principles. I agree that what matters more than anything is someone who understands how programming works and has the drive to learn everything they can about it.
Most universities I know of have an introduction to computer programming course that teaches basic programming concepts. Unfortunately it is impossible to teach programming without actually writing code.
The problem is that some prefer to teach this course using an OO language such as Java or C#, and so the students must use Visual Studio (or the Java equivalent).
It is very hard to explain the basic concepts when the IDE forces you to work in a certain way.
I think the first language students learn should be a procedural language such as C. That way there are fewer layers of abstraction between them and the basic CS concepts.
Agree with cfeduke.
I looked at the work for the same CS courses I did from 2 years previously, and it was way harder. From 5 years previously, way, way harder.
The CS bar is being lowered more and more, presumably because there are more and more jobs that don't require any working knowledge of any of the complicated CS subjects. There are huge numbers of jobs for people to just cut code.
Traditionally, people who wanted to be programmers did CS courses, and as coding has gotten easier this is still the case.
What really needs to happen is for CS to stop being a requirement for professional software development. Instead there needs to be another curriculum that focuses more on getting people out the door and cutting code.
That would leave CS to be the course for your next generation of tool builders.

Neural Networks or Human-computer interaction

I will be entering my third year of university in my next academic year, once I've finished my placement year as a web developer, and I would like to hear some opinions on the two modules in the Title.
I'm interested in both; however, I want to pick the one that will be more relevant to my career and that I can apply to systems I develop.
I'm doing an Internet Computing degree; it covers web development, networking, database work, and programming. Though I had set myself on becoming a web developer, I'm not so sure about that any more, so I'm trying not to limit myself to that area of development.
I know HCI would help me as a web developer, but do you think it's worth it? Do you think Neural Network knowledge could help me realistically in a system I write in the future?
Thanks.
EDIT:
I thought it would be useful to follow-up with what I decided to do and how it's worked out.
I picked Artificial Neural Networks over HCI, and I've really enjoyed it. Having a peek into cognitive science and machine learning has ignited my interest for the subject area, and I will be hoping to take on a postgraduate project a few years from now when I can afford it.
I have got a job which I am starting after my final exams (which are in a few days) and I was indeed asked if I had done a module in HCI or similar. It didn't seem to matter, as it isn't a front-end developer position!
I would recommend taking the module if you have it as an option, as well as any module covering biological computation; it will open up more doors should you want to go on to postgraduate research in the future.
The worthiness depends on three factors:
How familiar are you with the topic already?
How good is the course/class you want to take?
Which are you more interested in?
Especially for HCI, there is a broad range of "common sense" information you could easily obtain from reading a good book or a range of articles published on the internet. On the other hand, there are indeed many deeper insights, mostly obtained from psychology studies. If the course is done right, you can learn a lot about the topic and the real considerations for developing an interface.
For Neural Networks, one has to say that this is a typical hype topic. The main question is which application domain the course applies neural networks to. You can be quite sure that you won't program or use any neural networks for web development. On the other hand, if the course is done right, it could be a good opportunity to broaden your knowledge and deepen your understanding of the theory of computer science. This highly depends on how the course is laid out, though.
HCI is a topic which helps your career as a web developer, but only if you feel incompetent in that topic (then it is a must) or if the course is done very well. Neural Networks has more potential to be really interesting, hardcore computer science, where you gain a deeper understanding of something. If you are interested in NNs, you should not pass up the opportunity to get an education that is not narrowly concentrated on the domain of web development; after all, you may find more interest in other areas (it is always good to know other directions you might like to go in the future).
Neural networks sound cool until you read the fine print:
"In modern software implementations of artificial neural networks the approach inspired by biology has more or less been abandoned for a more practical approach based on statistics and signal processing."
This is something that has mystified me for years. Here you have an amazingly complex and powerful control system (real-world biological neural networks), and an academic discipline that appears to be about modeling these systems in software but that has in reality abandoned that activity.
If you're doing web development, your time is probably better spent in the HCI course.
Go with what interests you the most. The HCI stuff will be much easier to pick up later as needed, you'll likely never get another chance to learn about neural networks!
For prospective employers (at least the good ones!) you need to show a passion and excitement about what you do. I'd sooner hire someone who can enthusiastically talk about neural networks than someone who has an extra credit in HCI.
Unless you want to do the research end of the world, i.e., get a Master's/PhD, go HCI.
I studied Neural Computation at university when I studied AI. I now run my own company. The number of times since I studied that I have used my NN skills equals zero. I'm glad I did it, as it was quite fascinating, but I would have found HCI much more useful from the position I'm in now. I think you'd pick up a lot more insight relevant to the software industry from an HCI course, but if you think your experience should be more on the esoteric, almost arty side of development, go for NN.
Which sounds like more fun? Or, equivalently, which will you work harder at? Pick that one.
I did two courses in NN and some other AI courses. It's fun to poke around with that stuff, and I actually managed to use it in some of the things I've done, like face recognition; it's also useful in some other areas, e.g. if you want to plot your lab data. I have never used NNs in my web development career, though, and while I'm sure they could be used for something, what it really boils down to is finding a client or employer willing to pay for it when you can just take the straight path. So I would rather read a book about it if I weren't that hardcore about it.
Fundamentals of Neural Networks doesn't require too much knowledge of math, and was what I used in my first course.
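To give a feel for how little math the fundamentals require, here is a minimal sketch (my own, not from that book) of a single perceptron learning the logical AND function:

    # A single perceptron trained with the classic perceptron rule:
    # nudge the weights toward every misclassified example.
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = [0.0, 0.0]   # one weight per input
    b = 0.0          # bias
    rate = 0.1       # learning rate

    def predict(x):
        return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

    for _ in range(20):                  # a few passes over the data
        for x, target in examples:
            error = target - predict(x)  # -1, 0, or +1
            w[0] += rate * error * x[0]
            w[1] += rate * error * x[1]
            b += rate * error

    assert all(predict(x) == target for x, target in examples)

AND is linearly separable, so the perceptron rule is guaranteed to converge here; the interesting part of a course is seeing why that guarantee breaks down for functions like XOR.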
As a programmer-to-be you need knowledge of neural networks. If parallel processing is the way to go in hardware, then future programmers must be knowledgeable about neural networks. Don't forget that NNs work better with noisy or imprecise data, while other systems may not. Note that most data we use for analysis are sample data, a fraction of the whole, and you can imagine what happens if some of the sample is way off. So you need knowledge of NNs if you want to last in the computer programming field.