Preparing for a Scientific Programmer/HPC programmer Interview - hpc

Could you recommend sources/literature to prepare for Scientific Programmer/HPC programmer Interview?
Thanks!

"Introduction to High-Performance Scientific Computing" by Victor Eijkhout is a very good book to start with (and is relatively up to date). You can find it freely on his personal home page.
As this book focuses mainly on quite theoretical/abstract concepts, you will probably need to complement it with practical HPC programming. MPI and OpenMP are two programming standards used massively in HPC applications. As a result, I strongly advise you to learn how to program with both, especially through practical exercises.
There are plenty of resources you can find on the internet for both. If you don't know how to start, please look here for MPI and here for OpenMP; a minimal combined example is sketched below.
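For a first hands-on step, here is a minimal sketch in C combining both standards: each MPI rank reports itself, and an OpenMP parallel region does the same for the threads inside that rank. This is only an illustrative hello-world of my own, not taken from the tutorials linked above, and the exact compiler wrapper and launcher names depend on your MPI distribution.

    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);               /* start the MPI runtime */

        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* this process's id */
        MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of processes */

        /* Inside each MPI process, open an OpenMP thread team. */
        #pragma omp parallel
        printf("MPI rank %d of %d, OpenMP thread %d of %d\n",
               rank, size, omp_get_thread_num(), omp_get_num_threads());

        MPI_Finalize();                       /* shut the runtime down */
        return 0;
    }

With a typical MPICH or Open MPI installation this would be built with something like `mpicc -fopenmp hello.c -o hello` and launched with `mpirun -np 4 ./hello`.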

Related

The state of programming and compiling for multicore systems

I'm doing some research on multicore processors; specifically I'm looking at writing code for multicore processors and also compiling code for multicore processors.
I'm curious about the major problems in this field that would currently prevent a widespread adoption of programming techniques and practices to fully leverage the power of multicore architectures.
I am aware of the following efforts (some of these don't seem directly related to multicore architectures, but seem to have more to do with parallel-programming models, multi-threading, and concurrency):
Erlang (I know that Erlang includes constructs to facilitate concurrency, but I am not sure how exactly it is being leveraged for multicore architectures)
OpenMP (seems mostly related to multiprocessing and leveraging the power of clusters)
Unified Parallel C
Cilk
Intel Threading Building Blocks (this seems to be directly related to multicore systems; makes sense as it comes from Intel. In addition to defining certain programming constructs, it also seems to have features that tell the compiler to optimize the code for multicore architectures)
In general, from what little experience I have with multithreaded programming, I know that programming with concurrency and parallelism in mind is definitely a difficult concept. I am also aware that multithreaded programming and multicore programming are two different things. In multithreaded programming you are ensuring that the CPU does not remain idle (on a single-CPU system; as James pointed out, the OS can schedule different threads to run on different cores, but I'm more interested in describing the parallel operations from the language itself, or via the compiler). As far as I know, you cannot truly do parallel operations on a single core. In multicore systems, you should be able to perform truly parallel operations.
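To make that concrete: with a directive-based standard such as OpenMP, the parallelism is expressed in the source and the compiler/runtime map it onto the available cores, rather than the programmer juggling threads by hand. A minimal sketch in C of my own (not part of the original question):

    #include <omp.h>
    #include <stdio.h>

    #define N 1000000

    static double a[N];

    int main(void)
    {
        for (int i = 0; i < N; i++)
            a[i] = i * 0.5;

        double sum = 0.0;

        /* The directive asks the compiler/runtime to split the loop
           iterations across the available cores and to combine the
           per-thread partial sums at the end. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < N; i++)
            sum += a[i];

        printf("sum = %f\n", sum);
        return 0;
    }

Built with a flag such as `gcc -fopenmp`, the same loop simply runs serially if the directive is ignored, which is part of the appeal of expressing parallelism at this level.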
So it seems to me that currently the problems facing multicore programming are:
Multicore programming is a difficult concept that requires significant skill
There are no native constructs in today's programming languages that provide a good abstraction to program for a multicore environment
Other than Intel's TBB library I haven't found efforts in other programming-languages to leverage the power of multicore architectures for compilation (for example, I don't know if the Java or C# compiler optimizes the bytecode for multicore systems or even if the JIT compiler does that)
I'm interested in knowing what other problems there might be, and if there are any solutions in the works to address these problems. Links to research papers (and things of that nature) would be helpful. Thanks!
EDIT
If I had to condense my question down to one sentence, it would be this: What are the problems that face multicore programming today and what research is going on in the field to solve these problems?
UPDATE
It also seems to me that there are three levels where multicore needs to be concerned:
Language level: Constructs/concepts/frameworks that abstract parallelization and concurrency and make it easy for programmers to express the same
Compiler level: If the compiler is aware of what architecture it is compiling for, it can optimize the compiled code for that architecture.
OS level: The OS optimizes the running process and perhaps schedules different threads/processes to run on different cores.
I've searched on ACM and IEEE and have found a few papers. Most of them talk about how difficult it is to think concurrently and also how current languages don't have a proper way to express concurrency. Some have gone so far as to claim that the current model of concurrency that we have (threads) is not a good way to handle concurrency (even on multiple cores). I'm interested in hearing other views.
I'm curious about the major problems in this field that would currently prevent a widespread adoption of programming techniques and practices to fully leverage the power of multicore architectures.
Inertia. (BTW: that's pretty much the answer to all "what does prevent the widespread adoption" questions, whether that be models of parallel programming, garbage collection, type safety or fuel-efficient automobiles.)
We have known since the 1960s that the threads+locks model is fundamentally broken. By ~1980, we had about a dozen better models. And yet, the vast majority of languages that are in use today (including languages that were newly created from scratch long after 1980), offer only threads+locks.
The major problems with multicore programming are the same as with writing any other concurrent application, but whereas it used to be uncommon to have multiple CPUs in a computer, it is now hard to find any modern computer with only one core, so taking advantage of multicore, multi-CPU architectures brings new challenges.
But this is an old problem: whenever computer architectures advance beyond what compilers can handle, the fallback solution seems to be to move back toward functional programming, since that paradigm, if strictly followed, yields very parallelizable programs; you don't have any global mutable variables, for example.
But not all problems can be solved easily with FP, so the goal then is how to make other programming paradigms easy to use on multicores.
The first issue is that many programmers have avoided writing good multithreaded applications, so there isn't a large pool of well-prepared developers; many have learned habits that will make this kind of coding harder.
But, as with most changes to the CPU, you can look at how to change the compiler, and for that you can look at Scala, Haskell, Erlang and F#.
For libraries, you can look at the Parallel Extensions framework from Microsoft as a way to make concurrent programming easier.
My copy is at work, but recently either IEEE Spectrum or IEEE Computer had articles on multicore programming issues, so look at what IEEE and ACM have published on these issues to get more ideas as to what is being investigated.
I think the biggest impediment will be the difficulty of getting programmers to change their language, as FP is very different from OOP.
One place for research, besides developing languages that will work well this way, is how to handle multiple threads accessing memory; as with much in this area, Haskell seems to be at the forefront in testing ideas here, so you can look at what is going on with Haskell.
Ultimately there will be new languages, and it may be that we get DSLs that abstract more away from the developer, but how to educate programmers on this will be a challenge.
UPDATE:
You may find Chapter 24, "Concurrent and multicore programming", of Real World Haskell of interest: http://book.realworldhaskell.org/read/concurrent-and-multicore-programming.html
One of the answers mentioned the Parallel Extensions for the .NET Framework, and since you mentioned C#, it's definitely something I would investigate. Microsoft has done some interesting things there, though I have to think many of their efforts seem more suited for language enhancements in C# than a separate and distinct library for concurrent programming. But I think their efforts are worth applauding, and respect that we're early here. (Disclaimer: I used to be the marketing director for Visual Studio about 3 years ago)
The Intel Threading Building Blocks are also quite interesting (Intel recently released a new version, and I'm excited to head down to Intel Developer Forum next week to learn more about how to use it properly).
Lastly, I work for Corensic, a software quality startup in Seattle. We've got a tool called Jinx that is designed to detect concurrency errors in your code. A 30-day trial edition is available for Windows and Linux, so you might want to check it out. (www.corensic.com)
In a nutshell, Jinx is a very thin hypervisor that, when activated, slips in between the processor and operating system. Jinx then intelligently takes slices of execution and runs simulations of various thread timings to look for bugs. When we find a particular thread timing that will cause a bug to happen, we make that timing "reality" on your machine (e.g., if you're using Visual Studio, the debugger will stop at that point). We then point out the area in your code where the bug was caused. There are no false positives with Jinx. When it detects a bug, it's definitely a bug.
Jinx works on Linux and Windows, and in both native and managed code. It is language and application platform agnostic and can work with all your existing tools.
If you check it out, please send us feedback on what works and doesn't work. We've been running Jinx on some big open source projects and already are seeing situations where Jinx can find bugs 50-100 times faster than simply stress testing code.
The bottleneck of any high-performance application (written in C or C++) designed to make efficient use of more than one processor/core is the memory system (caches and RAM). A single core usually saturates the memory system with its reads and writes, so it is easy to see why adding extra cores and threads can cause an application to run slower. If a queue of people can pass through a door one at a time, adding extra queues will not only clog the door but also make the passage of any one individual through the door less efficient.
The key to any multi-core application is optimization of, and economizing on, memory accesses. This means structuring data and code to work as much as possible inside their own caches, where they don't disturb the other cores with accesses to the common cache (L3) or RAM. Once in a while a core needs to venture there, but the trick is to reduce those situations as much as possible. In particular, data needs to be structured around and adapted to cache lines and their sizes (currently 64 bytes), and code needs to be compact and not call and jump all over the place, which also disrupts pipelines.
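As a small illustration of structuring data around cache lines (a sketch of my own, assuming the 64-byte line size mentioned above), per-thread counters are commonly padded so that two cores never write to the same line:

    #include <omp.h>
    #include <stdio.h>

    #define CACHE_LINE 64   /* assumed line size, as discussed above */
    #define NTHREADS   4

    /* Each counter is padded out to a full cache line, so an update from
       one core never invalidates the line another core is writing to. */
    struct padded_counter {
        long value;
        char pad[CACHE_LINE - sizeof(long)];
    };

    static struct padded_counter counters[NTHREADS];

    int main(void)
    {
        #pragma omp parallel num_threads(NTHREADS)
        {
            int id = omp_get_thread_num();
            for (long i = 0; i < 10000000L; i++)
                counters[id].value++;   /* each thread touches only its own line */
        }

        long total = 0;
        for (int t = 0; t < NTHREADS; t++)
            total += counters[t].value;
        printf("total = %ld\n", total);
        return 0;
    }

Without the padding, the counters would share a line and every increment on one core would invalidate that line in the other cores' caches, which is exactly the kind of memory-system traffic the answer above warns about.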
My experience is that efficient solutions are unique to the application in question. The generic guidelines (above) are a basis on which to construct code, but the tweaks resulting from profiling conclusions will not be obvious to those who were not themselves involved in the optimization work.
Look up fork/join frameworks and work-stealing runtimes. These are two names for the same, or at least related, approaches: recursively subdivide large tasks into lightweight units, such that all available parallelism is exploited, without having to know in advance how much parallelism there is. The idea is that it should run at serial speed on a uniprocessor, but get a linear speedup with multiple cores.
Sort of a horizontal analogue of cache-oblivious algorithms if you look at it right.
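A rough sketch of that recursive-subdivision idea in C using OpenMP tasks (my own illustration; real fork/join frameworks such as Cilk or Java's ForkJoinPool differ in their scheduling details): a range is split until the pieces are small enough to compute serially, and the runtime hands spawned subtasks to whichever cores are idle.

    #include <omp.h>
    #include <stdio.h>

    #define CUTOFF 10000   /* below this size, just compute serially */

    /* Recursively split [lo, hi): each half becomes a task that the
       OpenMP runtime may run on any available core. */
    static long sum_range(const long *a, long lo, long hi)
    {
        if (hi - lo <= CUTOFF) {
            long s = 0;
            for (long i = lo; i < hi; i++)
                s += a[i];
            return s;
        }
        long mid = lo + (hi - lo) / 2;
        long left, right;
        #pragma omp task shared(left)
        left = sum_range(a, lo, mid);
        #pragma omp task shared(right)
        right = sum_range(a, mid, hi);
        #pragma omp taskwait          /* join: wait for both halves */
        return left + right;
    }

    int main(void)
    {
        enum { N = 1000000 };
        static long a[N];
        for (long i = 0; i < N; i++)
            a[i] = 1;

        long total = 0;
        #pragma omp parallel
        {
            #pragma omp single        /* one thread seeds the task tree */
            total = sum_range(a, 0, N);
        }
        printf("total = %ld\n", total);
        return 0;
    }

On one core this degenerates to a plain recursive sum; with more cores the independent subtasks can proceed in parallel, which is the serial-speed / linear-speedup behaviour described above.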
But I'd say the main problem facing multicore programming is that the great majority of computations remain stubbornly serial. There's just no way to throw multiple cores at those computations and make them stick.

Lisp Community - Quality tutorials/resources [closed]

Like many other people interested in learning Lisp, I feel the resources available are not the best for beginners and eventually prevent many new people from learning it. Do you feel some sort of community could be created, with a website, forum or something, that provides good (as in quality) resources/tutorials for Lisp users, possibly translated into several languages? That way beginners who don't have the necessary skills for writing tutorials could help by translating them. Is it a bad idea or is it something that could be accomplished? Give me some feedback or flame me :D
There are two popular dialects of Lisp - Common Lisp and Scheme. Both have excellent books/tutorials and implementations available online for free. Beginners can start with Scheme which is simpler. Some resources for learning Scheme:
Free Books:
Teach Yourself Scheme in Fixnum days. (pdf)
The Scheme Programming Language.
Structure and Interpretation of Computer Programs.
How To Design Programs
Online communities/resources:
The latest Scheme standard.
Scheme Cookbook.
Scheme Requests for Implementation
Scheme Related Research
http://www.schemers.org/
http://groups.csail.mit.edu/mac/projects/scheme/
A Scheme implementation suitable for beginners is PLT Scheme.
Free Books to learn Common Lisp:
Practical Common Lisp
On Lisp
Common Lisp HyperSpec (Reference)
Common Lisp: A Gentle Introduction to Symbolic Computation
Online communities/resources for Common Lisp:
The Common Lisp Cookbook
http://common-lisp.net/
CLiki
The Common Lisp Directory
Popular Common Lisp implementations: SBCL, CLISP, Clozure CL, Allegro CL
Lisp has been around for a long time, there are many (fragmented) communities. There's really no way to "create" a common community, especially from the outside.
Paul Graham would be a likely (IMNO, N=naive) person to potentially unite a large Lisp community, given his popularity among younger programmers, as well as his background in Lisp (writing On Lisp). However, he has chosen to create yet another dialect of Lisp, Arc.
Many folks have written about the fragmentation of the Lisp community, or Lisp's inability to "catch on". Some examples: here, here, here, and here. So, while your idea is a good one, it is probably fruitless.
That being said, don't let me stop you from rising up and being such a uniting figure in the Lisp community.
As far as existing tutorials, the Emacs Wiki is a good starting place for learning Emacs Lisp. And for an introduction to Scheme - as well as a good introduction to programming in general, there's the classic Structure and Interpretation of Computer Programs.
I find those two resources to be good starting points for learning Emacs Lisp and Scheme. I haven't played with Arc, but presumably there would be some good tutorials on learning Arc - because it is designed in part to be a good language for creating basic web apps.
Here's a forum: Lisp Forum, and here's a community: Planet Lisp
Here's a pretty decent post you might find helpful, How to Learn Lisp.
One of the strengths of Lisp is that being a mature language there are a number of really great books on the subject.
Actually, there are quite a few free CL books available online:
"Common Lisp: A Gentle Introduction to Symbolic Computation" covers the basics, but might be too gentle, depending on your level.
"Successful Lisp" is quite comprehensive, and IMHO the best online resource for learning CL, if you have already programmed a little in another language.
"Practical Common Lisp" aims to reach experienced programmers and surely is one of the best Lisp books available -- one of the few which explicitly try to explain "real world usage".
"On Lisp" is an interesting read for advanced CL programmers, mostly covering macros.
Besides those, there is the indispensable Hyperspec, an HTML-ized version of the standard, and CLtL2, which was the pre-ANSI de facto standard (still valuable, since many people find it more accessible than the Hyperspec; at least it sometimes shows things from another perspective).
Finally, there is the Lisp Forum and c.l.lisp. Though there is much noise on c.l.l., you can get very insightful answers there and learn from the masters. As a newbie, one should try to post thoughtful questions on c.l.l., and have a thick skin.
Download, install, run http://download.plt-scheme.org/drscheme/.
Read its "Guide".
My thoughts, as a newcomer to Lisp, would be to recommend Clojure (over the past six months I have played with Scheme and Emacs Lisp; I have only been playing with Clojure for the past couple of days).
Running on the JVM means that most people already have most of the Clojure environment; they only need the .jar files and a plugin for their editor or IDE (Java ones, anyway) of choice. So getting up and running is easier than with Scheme or CL.
Most new programmers are at the very least familiar with Java, which Clojure of course utilizes pretty well, meaning that while they are learning they can focus on Lisp and a bit less on libraries. There are a lot of concepts they are much better off focusing on.
On the downside, Java does have a lot of stigma against it. But Clojure has a lot going for it, and I believe a good future ahead. Programming Clojure is, IMHO, very accessible, and both The Joy of Clojure and Clojure in Action are coming sometime soon.
Another great book to learn Scheme and its programming style; especially when you come from the OO world, it's better to start from scratch:
How to design programs

Why people don't use LabVIEW for purposes other than data acquisition and virtualization? [closed]

This is marked as a subjective question, I hope I won't get too many down votes though.
LV seems to offer a nice graphical alternative to traditional text-based programming. As I understand it, it's not just a virtualization/data-acquisition programming language. Nonetheless, it seems to have that paradigm pegged to its creator's name.
My question comes up because it doesn't seem to be widely used for multi-purpose applications. I'm not a LV-expert of any kind, I'm more like a learner. I'm still getting used to LV.
Labview is fantastic if you have National Instruments hardware, and want to do something like acquire, plot and log the data.
When you start interfacing to custom devices the wiring between modules gets complicated having to do all the string manipulation work for input and output to a device.
At my place of work, we found that we got annoyed with having to make massive, complicated VI's to interface to devices and started writing them in .NET and interfacing them to Labview.
In the end we ended up scrapping Labview all together and using the NI Measurement Studio for Visual Studio to give us all the lovely looking NI controls (waveform plot, tank, gauges, switches etc) with the flexibility of C#.
In summary, even with a couple of 24" screens, sometimes the wiring for Labview code can get too complex and becomes impossible to comment, debug, and make extensible for any future changes. I suggest taking a look at Measurement Studio for Visual Studio and using your favourite .NET language with the pretty NI controls.
My two experiences with "graphic alternative[s] to traditional text based programming" have been dreadful. I find such languages to be slow to use, hard to edit, and inexpressive. Debugging them is a nightmare. And they offer no real advantages.
To be sure, it has been quite a long time since I looked at one, but the opinions of others I've asked about them have been only lukewarm, so I have never taken the time to look again. Reasons to look again are welcome and will be taken on board...
Labview can be used to author large, complex software projects. Labview is unquestionably much more fun to use than a syntax-based language. I have programmed mathematically dense, dynamic simulations using Labview. Newer versions of Labview include a lot of exciting features, especially for utilizing multiple processors. I like Labview very much. But I don't recommend it to anyone.
Unfortunately, it's an absolute nightmare for anything other than simple acquisition and display. It may one day be sufficiently developed to be considered as a viable alternative to text based languages. However, the developers at NI have consistently opted to ignore the three fundamental problems that plague labview.
1) It is unstable and riddled with bugs. There are thousands of bugs that have been posted to the labview support forums that are yet to be fixed. Some of these are quite serious, such as memory leaks, or mathematical errors in basic functions.
2) The documentation is atrocious. More often than not, when you look for help with a labview function in the local help file you'll find a sentence that merely restates the name of the item you are trying to find some detail on. e.g. A user looks up the help file on the texture filter mode setting and the only thing written in the help file is "Texture Filter Mode- selects the mode used for texture filtering." Gee, thanks. That clears things right up, doesn't it? The problem goes much deeper than that: quite often, when you ask a technical representative from National Instruments to provide critical details about labview functionality or the specific behavior of mathematical functions, they simply don't know how the functions in their own library work. This may sound like an exaggeration, but trust me, it's not.
3) While it's not impossible to keep graphical code clean and well documented, Labview is designed to make these tasks both difficult and inefficient. In order to keep your code from becoming a tangled, confusing mess, you must routinely (every few operations) employ structures like clusters, and sub-vis and giant type defined controls (which can stretch over multiple screens in a large project). These structures eat memory and destroy performance by forcing labview to make multiple copies of data in memory and perform gratuitous operations- all for the sake of keeping the graphical diagram from looking like rainbow colored spaghetti with no comments or text anywhere in sight. Programming in labview is like playing pictionary with the devil. Imagine your giant software project written as a wall sized flowchart with no words on it at all. Now imagine that all the lines cross each other a thousand times so that tracing the data flow is completely impossible. You have just envisioned the most natural and most efficient way to program in labview.
Labview is cool. Labview is getting better with each new release. If National Instruments keeps improving it, it will be great one day as a general programming language. Right now, it's an extremely bad choice as a software development platform for large or logically complex projects.
I have been writing in LabVIEW for almost 20 years now. I develop automated test systems. I have developed RF, vision, high-speed digital and many different flavors of mixed-signal test systems. I was a "C" programmer before I switched to LabVIEW.
It's true that you can build some programs quickly in LabVIEW, but just like any other language it takes a lot of training to learn to build a large application that is clean, easy to maintain, and built from reusable code. In 20 years I have never had a LabVIEW bug stop me from finishing a project.
Back in the day, NIWEEK would have a software shootout every year. LabVIEW and LabWindows (NI's version of "C") programmers would both be given the same problem and have a race to see which group finished first. Each and every year all the LabVIEW programmers were done way before the first LabWindows person finished. I have challenged many of my dedicated text-based programming friends to shootouts and they all admit they don't stand a chance, even if I let them define the software problem.
So, I feel LabVIEW is a great programming tool. It's definitely the way to go if you're interfacing with any type of NI hardware. It's not the answer for everything, but I'm sure there are many people not using it just because they don't consider LabVIEW a "real programming language". After all, we just wire a bunch of blocks together, right? I do find it funny how many text-based programmers turn their noses up at it, as they are so proud of the mess of text code they have created that only they can understand. A good programmer in any language should write code that others can easily read. Writing overly complex code that is impossible to follow does not make the programmer a genius; it means the programmer is a "complicator" (someone who can take a simple problem and complicate it). I believe in the KISS principle (KEEP IT SIMPLE, STUPID).
Anyway, there's my two cents worth!
I thought LabVIEW was a dream for FPGA programming. Independent executable blocks just... work. In general, I use LabVIEW for various tasks interfacing with my DAQ and FPGA hardware, but that's about it. It seems (again to me) that this is LabVIEW's strong point and the reason it was built, but outside that arena it feels "cumbersome." As far as getting things done, it's like any other language with a learning curve - once you figure it out it's not too bad for getting work done. I've seen several people give up before that thinking the learning curve was permanent or something.
Picking up a 30" monitor made a huge difference.
I know one thing that people dislike is the version control integration.
Edit: LabVIEW/hardware is hella expensive for "just for fun" use. I dropped $10K on their hardware (student prices) and got the software for free from school for making toys around the house.
Our company has been using LabVIEW for the last 10 years for measuring, monitoring and reporting on our subject (trains).
Recently we have started using LabVIEW as a GUI for databases with lots of data; the power of LabVIEW with the recent new features (classes, XControls) allows us to create these kinds of GUIs for a fraction of the development cost on other platforms, and without needing external programmers at consultancy rates.
Ton
I first started using Labview in a college physics lab. Initially, I thought it was slow and cumbersome when compared to other text-based languages. It was too difficult to create complex logic and code became sloppy real fast (wires everywhere).
Then, a few years later, I learned about using sub-vi's and bundles. What a difference! At this point, I was using labview for very high level functions. I was taking raw input from a camera, using all kinds of image filters and processing to ultimately parse out the lines in a road so that a vehicle could drive itself down this road with no driver - it was for the DARPA URBAN CHALLENGE. I was also generating maps from text waypoint data, making high-level parsing functions, and a slew of other applications that had nothing to do with processing data from input devices. It was really a lot of fun. and FAST.
After leaving college, I am now back to using text-based languages. I've been using: PHP, Javascript, VBA, C#, VBscript, VB.net, Matlab, Epson RC+, CodeIgniter, various APIs, and I'm sure some others. I often get very frustrated at the amount of syntax I have to memorize in order to program with any significant speed. I find it annoying to have to switch schools of thought based on the language I am using... when all programming languages essentially do the same thing! I need a second monitor just to have the help up at all times so I can find the syntax for the same functions in different languages. I miss LabVIEW very much; it's too bad it's so expensive, otherwise I would use it for everything.
Graphics-based programming, I think, has huge potential. By not being constrained by syntax, you can focus on logic instead of code. LabVIEW itself may still be in its infancy in terms of support and debugging, but I believe conceptually it beats out the competition. It's simply a more intuitive way to program.
We use LabVIEW for running our end-of-line test equipment and it is ideal for data acquisition and control. Typically measuring 15 to 80 differential voltages and controlling environmental chambers, mass flow controllers and various serial devices, LabVIEW is more than capable.
Interfacing with custom devices can be simplified greatly by using the NI instrument driver wizard to create reusable VIs, interfacing with custom DLLs if needed. On a number of projects we have created such drivers for custom hardware, and once created they are reusable in future projects with no modification.
Using event-driven structures, user interfaces are responsive, and we regularly use LabVIEW applications to interface with a database.
Whatever programming environment you choose it's the process of designing the application that matters most. I agree that you can create some really horrible and unreadable block diagrams in LabVIEW but then you can also create unreadable code in Visual studio. With just a little thought and planning a LabVIEW block diagram can be made to fit on a single 24" monitor with plenty of space to add comments.
I would use LabVIEW over Visual Studio for most projects.
But people do use LabVIEW for purposes other than data acquisition and virtualization. Of course LabVIEW is mainly used in labs and production environments because that is (or was) one of NI's main customer targets.
However, you can do a lot of different things with LabVIEW, like programming a robot that performs a lot of image analysis and then tweets the results. Have a look at videos from NI Week 2009 on YouTube and you'll see how powerful this tool is. For instance, it is possible to write code and deploy it to ARM MCUs (see this Dev Monkey article from 2009.08.10).
And finally check this LabVIEW DIY group
I have been using LabVIEW for about two years for developing automation. Given due care and proper design, we sure can develop maintainable and really good-looking applications in LabVIEW. I think this is the same for all the other languages out there. I have seen equally bad code in LabVIEW, primarily from people who use it only to develop quick and dirty working automation. IMHO graphical programming is a lot easier to code and understand if rightly done. But that said, I feel text-based programming 'feels' more powerful!
LabVIEW is primarily marketed for industrial automation, has inherent support for a lot of NI hardware, and you can get third-party hardware working with it pretty quickly. I think that is the reason you see it only in the automation field. Moreover, it is pretty costly and you are locked in with NI, as you cannot even open your code if you do not buy the software from them!
I've been thinking about this question for decades (yes, since 1989...)
Like all programming languages, LabVIEW is a high-level tool used to manipulate the flow of electrons. Unless you are a purist and refuse to use anything other than a breadboard and wires; transistors, integrated circuits and programming languages are probably a good thing if you wish to build something of any consequence.
But like all high-level tools, just wielding one does not make you a professional craftsman. Back in the day of soldering irons, op-amps and UARTs it required a large amount of careful study before you could create a system that actually functioned. The modern realm of text-based languages is so overly dominated by syntax that the programmer must get it just right before it will compile and run. In order to write code that works, the programmer must increase their skill level to create systems much larger than "Hello World".
LabVIEW is not dominated by syntax, but by Data Flow. Back in the day, reaching for your flow charting template and developing the diagram of a well-balanced information system was the art and beauty part of the job. Only after you had the reviewed flowchart in hand would you even consider slogging through the drudgery of punching out the code. (yes... punch cards)
LabVIEW is a development system that allows the programmer to use flow charting tools to diagram the complete information system and press "run"..... LabVIEW "punches out the code" and compiles it for you. No need to fight through the syntax of text language A or language B.
With such a powerful tool, novices can build large, working programs rapidly -- implying some level of professional craftsmanship since it runs at all. However, if the system does not perform elegantly, or the source code diagram is a mess, it is not the fault of LabVIEW.
People often point to "LabVIEW is only good for developing large data acquisition systems." Perhaps those people should consider the professionalism of the scientists and engineers that are working in data acquisition. If they know enough to get the actual wires right for the sensors and transducers, it may be a good bet that they are expert at developing LabVIEW wiring diagrams as well.
I do use LabVIEW at home, as it is part of Lego Mindstorms, which my son loves. And I really like composing systems that way.
However, in my work (embedded systems), it is generally too restrictive. But here too, I'm trying to move up in abstraction:
- control and state behavior: model-based design (e.g. Rhapsody)
- data algorithms etc.: Simulink
Sometimes a graphical model can require more clicks than a piece of code. But this also includes the work a good programmer need to do in design & documentation; not just the code typing. The graphical notation takes many hassles away and is generally much faster if the tool is powerful enough for the complexity at hand. So I expect these kinds of tools will gain more popularity in the next years as they mature and people get familiar with them.
I have used LabVIEW for some 10 years. It's brilliant for scientific programming, i.e. like Matlab or Simulink but 10 times better. If you are having problems then you are doing something wrong. It takes time to learn, like any language. As for using .NET instead: are these people even on the same planet? Why would you go to the trouble of writing everything from scratch when you can, say, pull up an FFT and use already written code? .NET is fine for simple programs but not so good for scientific processing. Yes, you can do it, but not without oodles of add-ons for graphics etc. Programming in G is far easier than text-based programming for scientific problems. You can of course program in C if you are interfacing, and use the DLL. Now there are things that I would not use LabVIEW for: speech recognition, for example, may be a bit messy at present. More to the point though, why do people like programming in outdated text form when there is an easy alternative? It is as if people want to make things complicated so as to justify their job in some way. Simplify, simplify!
Somebody said that LabVIEW is only used in the automation field. Simply not right at all. It has applications in digital signal processing, control systems, communications, web-based applications, mathematics, image processing and so on. It started as a data acquisition method, and NI invented the name Virtual Instrumentation, but it has gone far beyond that now. It is a scientific programming language with a second-to-none graphical interface. It is way beyond Simulink, and if you like Matlab then it has a type of Matlab scripting built in for those who like such ways of programming. It is evolving all the time. The one thing I found difficult was writing code for the CompactRIO: tricky, but far easier than the alternative. It's expensive but you get a quality product. I personally have not found any bugs in ordinary programming. It is an engineer's language, but anybody could use it to program.

What's the best way to learn LISP? [closed]

I have been programming in Python, PHP, Java and C for a couple of years now, and I just finished reading Hackers and Painters, so I would love to give LISP a try!
I understand it's totally different from what I know and that it won't be easy. Also I think (please correct me if I'm wrong) there's way less community and development around LISP. So my question is: what's the best way to learn LISP?
I wouldn't mind buying books or investing some time. I just don't want it to be wasted.
The "final" idea would be to use LISP for web development, and I know that's not so common so... I know it's good to plan my learning before picking the first book or tutorial and spending lots of time on something that may not be the best way!
Thank you all for your answers!
edit: I read Practical Common Lisp and it was long, hard, interesting, and it definitely got me rolling in Lisp. After that I read The Little Schemer, and it was short, fun and very, very good for my overall programming. So my recommendation would be to read The Little Schemer first (it's a couple of hours and it's worth it); then, if you decide Lisp (or Scheme or whatever dialect) is not what you were looking for, you will still have a very fun new way of thinking about recursion!
Try reading Practical Common Lisp, by Peter Seibel.
My personal favorite is Abelson & Sussman Structure and Interpretation of Computer Programs.
It uses Scheme, which is a nice and clean dialect of Lisp.
If you like a more practical approach maybe you should pick some Lisp framework for web design
(I have no idea if such a beast exists) and jump right in.
You might want to start with The Little Schemer as a warm-up. It's not a practical book about writing production Lisp programs, but it's a great book for learning how to think in Lisp.
MIT has made available an entire LISP course in DIVX and MPEG format. I highly recommend it.
http://groups.csail.mit.edu/mac/classes/6.001/abelson-sussman-lectures/
There is now a book out called 'Land of LISP' that teaches LISP programming through writing 80's style text games. I'm reading it now, and it's very well written and doesn't take itself too seriously, which I like.
There are several options here. First of all, Scheme and Common Lisp are fairly different in rather deep ways (like scoping); you should pick one to start with and stick with it for a while. I'm a Common Lisp fan, but that may be one of those vi-vs-EMACS religious questions.
For Scheme, go for Kent Dybvig's Scheme Programming Language, followed by SICP.
For Common Lisp, as well as Practical Common Lisp, I'd recommend David Lamkins's Successful Lisp. Successful Lisp is also available online for free.
After that, look at Lisp in Small Pieces by Queinnec, and Norvig's Lisp in AI book.
Marty Hall has a nice list at Johns Hopkins.
Updated: I don't mean stick to it forever, just that trying to learn both at once would be confusing.
Pick up The Land of Lisp by Conrad Barski. It is a fun filled introduction to Lisp programming using cartoons and games.
I'd recommend Project Euler as an excellent source of small bite-sized problems you can use to teach yourself any new programming language.
Ansi Common Lisp by Paul Graham is a good book.
I think it might be out of print, so your best bet to get it via Amazon.
I got the book for a "Natural Language Processing" class I took my sophomore year in college.
We had to write the programing projects in LISP, and so I needed to learn Lisp quickly.
The book helped me quite a bit.
Once I had a problem. I didn't know lisp. So I decided to download LISP in a box.
Then I found myself with an Emacs install without any help or documentation.
Then I had two problems.
For serious learners, I'd recommend PAIP from Norvig. It is an excellent resource to learn both Lisp and AI.
Berkeley offers CS61a in podcast format. This is an intro to CS class based around SICP. It's a more modern version than the 1982 videos MIT has available.
I'm working my way through Lisp right now and have come across "the book" to learn Lisp. It was suggested by Rainer Joswig
The book is called Common Lisp: A Gentle Introduction to Symbolic Computation and can be downloaded as a PDF. The author begins with a UML like approach to Lisp in the first chapter and gradually introduces more and more Lisp syntax.
I've also looked at Practical Common Lisp and I think that the author glosses over a lot of required information, even for a seasoned programmer. This book doesn't seem to do that (I'm not completely finished, but I have found it useful enough to suggest).
One more thing: you'll need an environment to work in. I've found Lisp in a Box to work well. It runs on Windows and Linux and uses Emacs.
I got attracted to LISP by its JVM dialect, Clojure. Clojure is a sort of great LISP, since:
it has a "simplified syntax", that is, fewer parentheses are required, and there is a cleverly designed set of collections available
it is JVM based, so there is a stable, performant runtime underneath; in addition, the whole Java ecosystem (libraries, e.g. database drivers, build tools, IDEs) is within our reach
Noir is a good web framework, and apps can be deployed on Java web and application servers
In other words, Clojure can be used in production right here, right now.
When it comes to resources, there are at least 4 books and plenty of online resources:
Books: "The joy of Clojure" - very insightful but can be difficult, so it's best read together with "Programming Clojure".
Online tutorials: Mark Volkmann tutorial is great
see also https://stackoverflow.com/questions/599519/which-tutorial-on-clojure-is-best
4Clojure website contains a number of simple programming tasks, so one can play with Clojure and see other people solutions
I enjoyed reading Practical Common LISP and ANSI Common LISP.
On LISP looks interesting, but at $190 seems a little expensive for a book.
For web development you might want to have a look at hunchentoot, a web server written in common lisp.
I found working through the exercises in "The Little Schemer" really helped hone the recursive, pattern-matching side of my thinking and made working in XSLT considerably easier.
I recommend Gentle Introduction to Symbolic Computation or Practical Common Lisp first, based on your programming experiences. For practicing, I use Allegro CL 8.2 Free Express Edition in Windows. It turns practicing into a lot of fun.
Sort of a difficult question to answer ... I think it all depends on your learning style.
I learned LISP in my A.I. and Expert Systems classes in college, but that's how I learn ... I'm not a great book learner, I prefer to have someone explain it to me in a class setting.
LISP is definitely a unique language and it requires a new train of thought if you're used to conventional C, Java, PHP programming.
Best of luck to you !
I found reading the book SICP really helped me learn. I used Steel Bank Common Lisp (SBCL) and had good success with it.
Good luck
I don't know that there's anything special about Lisp that makes it different from learning any other language. You just need to start using it and trying out its features.
One option might be to try a simple project.
Another option, that's specific for Lisp, would be to write an Emacs extension that assists you in your regular work.
Read these books in order: Gentle Introduction to Symbolic Computation, ANSI Common Lisp by Paul Graham and then move onto Practical Common Lisp. Or skip ACL and use it as a reference while working your way through PCL.

Resources to develop an operating system [closed]

I'm planning to write an operating system and I don't know very much about operating systems. Are there any good resources or books to read in order for me to learn? What are your recommendations?
We used Andrew Tanenbaum's Modern Operating Systems at the university I attended. I highly recommend it for its clear explanations of the tradeoffs inherent in many of the design decisions that you'll run up against. This book is a little bit more "fair and balanced" than the Minix book.
I also recommend this book because, despite his net-famous flame war with Linus Torvalds, few of his biases come through in the book. Also, he's a pretty decent writer, and the book is actually entertaining.
Operating Systems: Design and Implementation (Prentice Hall Software Series)
This book is written by Tanenbaum, the main guy behind Minix, which is what inspired Linux. It provides good overviews of basic OS concepts like memory management, file systems, processes, etc. The concepts in this book are intimately tied to examples from the Minix OS, which is a good thing.
I think you should start by something like that.
Similar threads on this very site:
OS Development
Building a new operating system
How to get started in operating system development
Operating System Concepts is the book we used at university. It's quite ugly BUT the information inside is well explained (from basic memory management, to how the OS decides what to execute, to how to avoid deadlock). Pretty wide coverage.
The OSDev website is rich in information if you want to start coding your own OS, too.
While old, these books are very good:
Operating System Design with Xinu
Operating System Design-Internetworking With XINU, Vol. II
Take a look at HelenOS, a from-scratch microkernel-based OS that aims to be a fully modern OS. Disclaimer: I'm a contributor; I'm working on its shell from scratch.
HelenOS has been ported to ia32/64, SPARC, ARM and more. It's very well designed and easy to read. It's still in its infancy but shows one possible design that really takes advantage of the microkernel approach and solves many issues of a microkernel implementation (such as IPC).
It also includes scripts that automatically set up the proper toolchain needed for cross compiling. It's very easy to build and runs very well in most simulators (e.g. QEMU) or on bare metal.
I would also study L4, Minix3 and the GNU HURD (based on Mach), the latter being an illustration of design pitfalls when trying to leverage a microkernel.
If you want to go the monolithic route, just study Linux.
I'd highly recommend taking a look at the MIT Operating Systems class. It's got lots of useful references, and a bunch of lab exercises which you can play around with (including automated grading scripts, so you don't have to be an MIT student to do them).
I used Operating Systems and Middleware: Supporting Controlled Interaction when I was in college. It is probably one of the best textbooks on the subject.
Operating Systems Design and Implementation
The design and implementation of the FreeBSD OS
Just off the top of my head.
Developing Your Own 32-Bit Operating System by Richard A. Burgess. It went into great detail about boot loaders, setting up those strange memory and process management registers, etc. It was a great read back in 1996 when I thought I'd take a crack at writing a simple OS from scratch, but it may be dated by now, dealing only with the first few generations of Pentium-class CPUs.
If I remember correctly, the Powerup to Bash Prompt HOWTO contained a lot of information that looked like it would be useful for this. So did older versions of the Linux From Scratch HOWTO, but in recent versions that has been removed.
You'll also find a lot of good information in Understanding the Linux Kernel.
I would recommend looking at embedded operating systems and building an embedded OS. It will deal with the core concepts without the overhead of a modern desktop CPU.
I wrote a multitasking embedded OS last spring as a final project, it's easier than you might think.
You should look into MINIX 3. This is an operating system that was written in, I believe, less than 10,000 lines. You can get a very good idea of how an OS works with the aid of one of Tanenbaum's books and understanding how MINIX 3 works. You could go straight to Linux, but I think this is a useful task and really helps you see how it really doesn't take that many lines to build a working OS.
http://www.minix3.org/
Apart from books, there are many sites that teach OS development.
BrokenThorn Entertainment is one such website that teaches OS dev from the basics.