What is the best way to visualize abstract concepts (algorithm/data structure)?

What's the best way to "see what is happening" in an algorithm or data structure? If it's something like binary search, I just imagine a bunch of boxes in a row and throw half of them out each time. Is there something more powerful that will let us grok something as abstract as an algorithm or data structure?
Clarification: I'm looking for something a little more general. Example: to visualize time, some people use a clock in their head, but that's slow, whereas a globe gives a more natural feel, and if you are trying to get a 'feel' for how an algorithm works you can imagine two objects moving in different directions on that globe.

In general, animations are excellent for visualizing processes that occur over time, such as the execution of algorithms.
For example, check out these animations: Animated Sort Algorithms
Here's an animation that shows how the data structures behave during MergeSort.
Now, whether you want to spend time hooking up your algorithm to some kind of animated visualization is a different question!

Algorithm animation was a big research area in the 1990s. Marc H. Brown, who was then at the Digital Systems Research Center, did a large amount of interesting work. The source code to his Zeus animation system is still available, and it would not be hard to get it set up. I played with Zeus years ago and if I remember correctly it ships with dozens of animations.
They held a couple of 'festivals' where non-experts got together to animate new algorithms. You can see one of the reports (with still images) from the 1993 festival. YouTube has one of their videos, Visualizing Combinatorial Structures.

Describing something in terms of another thing is called analogy. You just did it with binary search being a bunch of boxes. Build on the student's prior knowledge.
For instance, trees can be thought of as linked lists with multiple "next" nodes, or they could be explained to the uninitiated as something similar to a hierarchy.
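In code the analogy is almost literal; a minimal Python sketch (purely illustrative):

class ListNode:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node           # exactly one "next"

class TreeNode:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []  # zero or more "next" nodes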
As for concrete methods of explanation, graphs and state machines can be easily visualized with Graphviz. A very basic directed graph can be expressed very simply:
digraph G {
    A -> B;
    B -> C;
    C -> D;
    D -> B;
}
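If you prefer to generate such diagrams programmatically, the third-party graphviz Python package (which also needs the Graphviz binaries installed) can emit the same graph; a small sketch:

from graphviz import Digraph

g = Digraph("G")
g.edges([("A", "B"), ("B", "C"), ("C", "D"), ("D", "B")])
g.render("cycle", format="png")  # writes cycle.png next to the script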


Programming an intelligent game-playing bot

I am taking part in a programming competition where the objective is writing a bot that can play a specific game.
The objective of the game is to earn a certain number of points. You control multiple airships that you move around, capture islands, and navigate drones that carry treasure. You play against one opponent, turns happen simultaneously, and there is a time limit. You can move multiple ships and drones in one turn. You can program your bot in Python, Java or C#.
The exact details don't matter, just that each ship has around 15 options each turn (moving and shooting) and overall you have around 10,000 different options for each turn (different configurations of airship movements and shooting).
Up until now I approached this competition naively and haven't done anything exceptionally clever (for example: if near enemy, shoot). I have read about minimax algorithms, and I would really like to apply one here (or something similar); you can assume that I can tell the value of a state. My problem is the mass of options for each turn, which creates an enormous branching factor that doesn't let me get very deep.
Question 1: Is there a better, applicable approach to this problem? Perhaps deep learning or something similar?
Question 2: Is there a way to minimize the branching factor? I've read about alpha-beta and similar algorithms, but nothing seems to do the job.
Any help would be much appreciated.
The minimax algorithm seems natural for these kinds of problems. First, the game is modelled in an abstract way, and then a solver is used to find the path from the current situation to a game state which maximizes the number of points. A similar approach to minimax is GOAP, which was implemented in the 1970s for Shakey the robot under the name STRIPS. But GOAP and minimax have two problems: first, an abstract model of the game is needed (perhaps in PDDL or in the Game Description Language), and second, the state space is too big.
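To make the minimax part concrete, here is a minimal alpha-beta sketch in Python. The game interface (legal_moves, apply_move, is_terminal, evaluate) is hypothetical and would have to be written for the competition's rules; note also that the game has simultaneous turns, so alternating maximizing and minimizing levels is only an approximation. Beyond the alpha-beta cutoffs shown, a common way to tame a huge branching factor is to score moves with a cheap heuristic and only expand the top k of them at each node (beam search).

def alphabeta(state, depth, alpha, beta, maximizing, game):
    # game is a hypothetical interface you would implement yourself.
    if depth == 0 or game.is_terminal(state):
        return game.evaluate(state)
    moves = game.legal_moves(state)
    # Optional beam pruning: moves = sorted(moves, key=cheap_heuristic, reverse=True)[:k]
    if maximizing:
        value = float("-inf")
        for move in moves:
            child = game.apply_move(state, move)
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False, game))
            alpha = max(alpha, value)
            if alpha >= beta:  # cutoff: the opponent will never allow this branch
                break
        return value
    else:
        value = float("inf")
        for move in moves:
            child = game.apply_move(state, move)
            value = min(value, alphabeta(child, depth - 1, alpha, beta, True, game))
            beta = min(beta, value)
            if beta <= alpha:
                break
        return value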
A better alternative to planning is to use a Behavior Tree. That's a static program which describes the behavior of an agent. No solver is needed, and no complete model of the game is needed. Instead, a bottom-up approach is used, with multiple edit-compile-run iterations to find the optimal behavior tree (test-driven development). To implement such a programming approach, a so-called "reactive planner" has to be implemented first, which is another word for a realtime scheduler. That's a module which maps a behavior tree onto a Gantt chart so that each action executes at a specific moment in time. As an introduction, the Unity3D engine is a good starting point, which has a full behavior-tree implementation out of the box.
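Here is a minimal behavior-tree sketch in Python to show how little machinery the core idea needs (the class and field names are illustrative, not from Unity3D or any particular library):

SUCCESS, FAILURE, RUNNING = "success", "failure", "running"

class Sequence:
    # Ticks children in order; fails (or keeps running) on the first child
    # that does not succeed.
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != SUCCESS:
                return status
        return SUCCESS

class Selector:
    # Ticks children in order; succeeds (or keeps running) on the first child
    # that does not fail.
    def __init__(self, *children):
        self.children = children
    def tick(self, blackboard):
        for child in self.children:
            status = child.tick(blackboard)
            if status != FAILURE:
                return status
        return FAILURE

class Condition:
    def __init__(self, predicate):
        self.predicate = predicate
    def tick(self, blackboard):
        return SUCCESS if self.predicate(blackboard) else FAILURE

class Action:
    def __init__(self, effect):
        self.effect = effect
    def tick(self, blackboard):
        self.effect(blackboard)
        return SUCCESS

# "If an enemy is near, shoot; otherwise head for the nearest island."
tree = Selector(
    Sequence(Condition(lambda bb: bb["enemy_near"]),
             Action(lambda bb: bb["orders"].append("shoot"))),
    Action(lambda bb: bb["orders"].append("move_to_island")),
)

blackboard = {"enemy_near": True, "orders": []}
tree.tick(blackboard)
print(blackboard["orders"])  # ['shoot']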

Should I use Object Oriented Programming in MATLAB?

I have an issue where I need to handle a lot of figures in MATLAB, and the code is starting to get messy. Different kinds of plot objects are added to the code in different stages, and some have legends and some do not. The problem is that there are no NULL legends: as soon as an object is created, so is a legend. However, until legend(handles,...) is called they are not shown. This means that if things are plotted and some need a legend entry and some do not, a lot of handles need to be passed around.
Now the file is starting to be quite long, about 1500 lines, with some globals that span many functions in the file, and so on. To prevent the "Do not use globals" comments from pouring in: yes, I know globals are normally unnecessary, but the code was like that when I laid my hands on it. However, the code is getting more and more messy, and I am thinking about using Object Oriented Programming (OOP) to handle the figures.
The idea is to have custom figure objects that handle themselves, making the code more readable and split up into smaller blocks. A sketch of the design (in MATLAB class syntax):
classdef Figure
    properties (Access = private)
        MainFrame
        SubFrame
        Lines
        Legends
        Title
        XLabel
        YLabel
    end
    methods
        % To be defined; for example formatting, plotting, editing the title, ...
    end
end
The complete design is not really thought through yet, but the point of this question is really about using OOP in MATLAB. From what I have seen so far, it is not really used very much. Is there a reason for this? Could anyone give pros and cons of OOP in MATLAB? Is OOP recommended in MATLAB or not?
I have added the information about my issue since I understand that OOP is more suited to large, complex problems, so an answer would preferably weigh the drawbacks against the complexity of the problem (for example: never use OOP in MATLAB, do it only when you have complex problems, do it whenever you like, ...).
Okay, the question is about OOP in MATLAB, but isn't the real question about OOP in MATLAB in your organisation?
By that I mean: think about who is going to use, develop, and maintain the code going forward.
Background: I have used OOP for my own toolbox (because it's complex and large enough to warrant it, and I develop and maintain it). However, in consultancy jobs for the majority of my clients I create functions (which in some instances call my toolbox), because when the job is finished they get the source code, and the majority are (much) more comfortable working with functions than with classes.
In summary, I decide whether to use OOP based on the job specifics and on where the code will be used (developed and maintained) in the future.
So back to your topic: I would consider where you think the code is going to go and who will develop and maintain it. Will they be comfortable with classes, or will they be more comfortable with functions?
FYI: last year I was talking to MathWorks and they said that they run multiple "Intro to MATLAB" courses per week, but only one "MATLAB Classes" course per quarter! That gives you an indication of the level of MATLAB class use in industry.

Where is the jplephem ephemerides API documented?

I am working on what is likely a unique use case: I want to use Skyfield to do some calculations on a hypothetical star system. I would do this by creating my own ephemeris and using that instead of the actual one. The problem I am finding is that I cannot find documentation on the API for replacing the ephemerides with my own.
Is there documentation? Is Skyfield flexible enough to do what I am trying?
Edit:
To clarify what I am asking, I understand that I will have to do some gravitational modeling (and I am perfectly willing to configure every computer, tablet, cable box and toaster in this house to crunch on those numbers for a few days :), but before I really dive into it, I wanted to know what the data looks like. If it is just a module with a number of named numpy 2d arrays... that makes it rather easy, but I didn't see this documented anywhere.
The JPL-issued ephemerides used by Skyfield, like DE405, DE406, and DE421, simply provide a big table of numbers for each planet. For example, Neptune's position might be specified in 7-day increments, where for each 7-day period from the beginning to the end of the ephemeris the table provides a set of polynomial coefficients that can be used to estimate Neptune's position at any moment within that 7-day period. The polynomials are designed, if I understand correctly, so that their first and second derivatives mesh smoothly with the previous and following 7-day polynomials at the moment where one ends and the next begins.
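As a toy illustration of that table layout (made-up numbers, not real ephemeris data; the real DE files store Chebyshev coefficients per segment and per coordinate, if I understand correctly), evaluating one coordinate for one segment looks roughly like this in Python:

import numpy as np
from numpy.polynomial import chebyshev

seg_start, seg_days = 2451545.0, 7.0            # hypothetical segment (Julian dates)
coeffs_x = np.array([4.2, -0.3, 0.01, -0.002])  # made-up coefficients for x, in au

def x_position(jd):
    t = 2.0 * (jd - seg_start) / seg_days - 1.0  # map the segment onto [-1, 1]
    return chebyshev.chebval(t, coeffs_x)

print(x_position(2451548.5))  # x midway through the segment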
The JPL generates these huge tables by taking the positions of the planets as we have recorded them over human history, taking the rules by which we think an ideal planet would move given gravitational theory, the drag of the solar wind, the planet's own rotation and dynamics, its satellites, and so forth, and trying to choose a "real path" for the planet that agrees with theory while passing as close to the actual observed positions as it can.
This is a big computational problem that, I take it, requires quite a bit of finesse. If you cannot match all of the observations perfectly — which you never can — then you have to decide which ones to prioritize, and which ones are probably not as accurate to begin with.
For a hypothetical system, you are going to have to start from scratch by doing (probably?) a gravitational dynamics simulation. There are, if I understand correctly, several possible approaches that are documented in the various textbooks on the subject. Whichever one you choose should let you generate x,y,z positions for your hypothetical planets, and you would probably instantiate these in Skyfield as ICRS positions if you then wanted to use Skyfield to compute distances, observations, or to draw diagrams.
Though I have not myself used it, I have seen good reviews of:
http://www.amazon.com/Solar-System-Dynamics-Carl-Murray/dp/0521575974
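To make the "start from scratch" step above concrete, here is a minimal sketch of direct N-body gravity integrated with a leapfrog scheme (all units, masses, and initial conditions are made up; this is not Skyfield code). The result is exactly the kind of x,y,z table you would then feed into Skyfield:

import numpy as np

G = 1.0  # gravitational constant in the toy unit system

def accelerations(pos, masses):
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        diff = pos - pos[i]                       # vectors from body i to the others
        dist3 = np.sum(diff ** 2, axis=1) ** 1.5
        dist3[i] = np.inf                         # no self-interaction
        acc[i] = G * np.sum(masses[:, None] * diff / dist3[:, None], axis=0)
    return acc

def leapfrog(pos, vel, masses, dt, steps):
    pos, vel = pos.copy(), vel.copy()
    history = [pos.copy()]
    acc = accelerations(pos, masses)
    for _ in range(steps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift
        acc = accelerations(pos, masses)
        vel += 0.5 * dt * acc          # half kick
        history.append(pos.copy())
    return np.array(history)           # shape (steps + 1, n_bodies, 3)

# A star at the origin and one planet on a roughly circular orbit.
masses = np.array([1.0, 1e-3])
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
vel = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
table = leapfrog(pos, vel, masses, dt=0.01, steps=1000)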

Integrating NetLogo and Java: when should we think about this integration as a good option?

I just came to know about these excellent tutorials:
http://scientificgems.wordpress.com/2013/12/11/integrating-netlogo-and-java-part-1/
http://scientificgems.wordpress.com/2013/12/12/integrating-netlogo-and-java-2/
http://scientificgems.wordpress.com/2013/12/13/integrating-netlogo-and-java-3/
Their example concerns the computation needed for patch diffusion and shows how to access patch variables from Java and change them in NetLogo.
I was wondering if anyone has any ideas or comments on when we should think of writing an extension to make our model work better? I am new to NetLogo itself, but I think it's good to know what options I might not be aware of :)
I think looking through the extensions listed at https://github.com/NetLogo/NetLogo/wiki/Extensions, both the ones that we (on the NetLogo team) have built ourselves and the ones that have come from the user community, gives you a pretty good idea of the range of things extensions can be good for.
Some broad categories:
data structures (tables, arrays, matrices, priority queues...)
algorithms (networks, statistics, discrete event scheduling, diffusion, ...)
integration with other tools (R, SQL databases, MatLab, ...)
media (sound playback, sound synthesis, images, movies, speech, ...)
new device types (Gogo, Arduino, WiiMote...)
visualization (ray-tracing, sprites, Java2D drawing, ...)
not necessarily exhaustive!

simple speech recognition methods

Yes, I'm aware that speech recognition is fairly complicated (as an understatement). What I'm looking for is a method for distinguishing between maybe 20-30 phrases. An ability to split words (discrete speech is fine) would be nice, but isn't required. The software will be user-dependent (i.e. for use by me). I'm not looking for existing software, but for a good way of going about doing this myself. I've looked into various existing methods, and it seems like splitting the sound into phonemes, while common, is somewhat excessive for my needs.
For some context, I'm just looking for a way to control some aspects of my computer with a few simple voice commands. I'm aware that Windows already has speech recognition software, but I'd like to go about this myself as a learning exercise. Commands would be simple, like "Open Google" or "Mute". What I had in mind (not sure if this is a good idea) is that some commands would be compound. So "Mute" would just be "Mute", whereas the "Open" command could be recognized individually and then have its suffixes (Google, Photoshop, etc.) recognized with another network/model/whatever. But I'm not sure if looking for prefixes/word breaks in this way would produce better results than having to deal with an increased number of individual commands.
I've been looking into perceptrons, Hopfield networks (though they're somewhat obsolete, from what I understand), and HMMs, and while I understand the ideas behind these (I've implemented ANNs before), I don't really know which is best suited to this task. I'm assuming that learning vector quantization models would also be appropriate, but I can't really find much literature to this end. Any guidance/resources would be greatly appreciated.
There are some open source projects in speech recognition:
HTK (Hidden Markov Model Toolkit)
Sphinx
Both have decoder, training, and language model toolkits: everything needed to build a complete and robust speech recognizer.
VoxForge has acoustic and language models for both open source speech recognition toolkits.
Some time ago, I read a whitepaper about a limited-vocabulary system which used a simple recognition process. The system divided each utterance into a small number of bins (6 in time and 4 in magnitude, if I remember correctly, for 24 total), and all it did was count the number of audio sample measurements in each bin. A fuzzy logic rule base then interpreted each utterance's 24 bin counts and generated an interpretation.
I imagine that (for some applications) a simple matching process might work just as well, in which the 24 bin counts of the current utterance are simply matched against those of each of your stored prototypes, and the one with the least overall difference is the winner.
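That matching process is simple enough to sketch in a few lines of Python (the 6x4 layout follows the half-remembered whitepaper above, so treat the details as illustrative; prototypes is a hypothetical dict mapping each stored command name to its 24-bin feature vector):

import numpy as np

def bin_counts(samples, time_bins=6, mag_bins=4):
    mags = np.abs(samples.astype(float))
    mags /= mags.max() or 1.0                    # normalize magnitudes to [0, 1]
    counts = np.zeros((time_bins, mag_bins))
    for t, chunk in enumerate(np.array_split(mags, time_bins)):
        idx = np.minimum((chunk * mag_bins).astype(int), mag_bins - 1)
        for m in idx:
            counts[t, m] += 1                    # count samples landing in each cell
    return counts.ravel()                        # 24-element feature vector

def classify(samples, prototypes):
    feats = bin_counts(samples)
    return min(prototypes,
               key=lambda name: np.abs(prototypes[name] - feats).sum())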