CoDeSys 2.3 - coils won't activate upon receiving a signal

Like I've stated in the title, I have a problem with coils not changing their state. I'm a newbie when it comes to programming PLCs, so it's probably something basic. Could someone take a quick look? coil doesn't react

Related

STM32F103 blue pill ADC example

After searching for a very long time (more than three months) in all the main places to get info, and after reading the chip's datasheet, I would like to ask the STM32 specialists in here whether there is an example of using the ADC, ideally with DMA, from the Arduino IDE. I have seen some incomplete snippets here and for other compiler/IDE environments, but maybe I have not yet had the luck of finding the right info (that even I can understand) for what I need.
Your help is much appreciated.
I want to sample audio data: one channel at 30 kHz or more, 12 bits, and every time 16 samples have been taken, an interrupt should fire to handle the data sitting in an array.
I have seen the pigOscope code (it uses analogRead) and the info about analogRead, which states that this call is not meant for higher sampling speeds. So that got me into a bit of a conflict with myself... who can break me out of my endless brain loop?
Greetings ... Eric.
"I have seen the pigOscope code (it uses analogread)"
I wrote the Pig-o-scope code, with a lot of input from others at stm32duino.com, and if you take the time to read the code, which I will grant you is somewhat simplistic, you will discover that analogRead is only used for triggering. The code uses DMA to do the high-speed transfer.
I completely agree with the comment that you don't need the Arduino IDE; you could "borrow" the DMA code and tailor it to your needs. However, if you want a quick-and-dirty coding and prototyping environment, there is nothing wrong with using the Arduino IDE. Take a trip to the stm32duino.com site and you will see that I, along with a lot of the other developers, use the Arduino IDE, and Eclipse, and Atollic, and roll our own batch files, use vi, etc.
It all depends on what you are trying to do, and in many cases using the Arduino IDE gets you to a working result a lot faster than learning an entirely new IDE just for one task.
But then again, I'm firmly on the side of vi in the vi/emacs wars, so what the heck do I know. Just don't use nano. ;¬)
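For what it's worth, here is a rough register-level sketch of the same idea: a free-running ADC feeding a circular DMA buffer, with an interrupt every 16 samples. It is not the Pig-o-scope code. It assumes the official STM32 Arduino core (where Arduino.h exposes the CMSIS register names), a blue pill running at 72 MHz, and PA0 (ADC1 channel 0) as the input, so treat the names and numbers as a starting point to adapt rather than a drop-in solution.

```cpp
// Minimal sketch: ADC1 free-running, DMA1 channel 1 filling a 16-sample
// circular buffer, transfer-complete interrupt once per full buffer.
// Assumptions: official STM32 Arduino core, STM32F103 blue pill @ 72 MHz, PA0 input.
#include <Arduino.h>

#define N_SAMPLES 16
static volatile uint16_t adc_buf[N_SAMPLES];
static volatile uint32_t buffers_done = 0;

// DMA1 channel 1 is hard-wired to ADC1 on the F103.
extern "C" void DMA1_Channel1_IRQHandler(void)
{
  if (DMA1->ISR & DMA_ISR_TCIF1) {        // transfer complete: 16 fresh samples
    DMA1->IFCR = DMA_IFCR_CTCIF1;         // acknowledge the flag
    buffers_done++;                       // real handling of adc_buf[] goes here
  }
}

void setup()
{
  // Clocks: GPIOA + ADC1 on APB2, DMA1 on AHB; ADC clock = 72 / 6 = 12 MHz (max 14 MHz).
  RCC->APB2ENR |= RCC_APB2ENR_IOPAEN | RCC_APB2ENR_ADC1EN;
  RCC->AHBENR  |= RCC_AHBENR_DMA1EN;
  RCC->CFGR    |= RCC_CFGR_ADCPRE_DIV6;

  // PA0 as analog input: CNF = 00, MODE = 00.
  GPIOA->CRL &= ~(GPIO_CRL_CNF0 | GPIO_CRL_MODE0);

  // DMA: peripheral = ADC1->DR, memory = adc_buf, 16-bit on both sides,
  // memory increment, circular mode, interrupt on transfer complete.
  DMA1_Channel1->CPAR  = (uint32_t)&ADC1->DR;
  DMA1_Channel1->CMAR  = (uint32_t)adc_buf;
  DMA1_Channel1->CNDTR = N_SAMPLES;
  DMA1_Channel1->CCR   = DMA_CCR_MINC | DMA_CCR_CIRC |
                         DMA_CCR_PSIZE_0 | DMA_CCR_MSIZE_0 |
                         DMA_CCR_TCIE | DMA_CCR_EN;
  NVIC_EnableIRQ(DMA1_Channel1_IRQn);

  // ADC1: a single conversion (channel 0) with the longest sample time (239.5 cycles).
  // Free-running like this gives roughly 12 MHz / 252 ≈ 48 kHz; for an exact 30 kHz
  // you would trigger each conversion from a timer instead of using continuous mode.
  ADC1->SMPR2 |= ADC_SMPR2_SMP0;          // 0b111 = 239.5-cycle sample time on channel 0
  ADC1->SQR3   = 0;                       // first (and only) conversion: channel 0
  ADC1->CR2   |= ADC_CR2_ADON;            // wake the ADC
  delayMicroseconds(10);                  // let it settle before calibrating
  ADC1->CR2   |= ADC_CR2_CAL;             // self-calibration
  while (ADC1->CR2 & ADC_CR2_CAL) {}
  ADC1->CR2   |= ADC_CR2_DMA | ADC_CR2_CONT;
  ADC1->CR2   |= ADC_CR2_ADON;            // second ADON write starts converting
}

void loop()
{
  // Nothing to do here: the DMA interrupt reports every 16 samples.
}
```

The transfer-complete interrupt is what gives you the "interrupt every 16 samples" behaviour asked about above, and swapping continuous mode for a timer-triggered conversion is the usual way to pin the rate at exactly 30 kHz.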

Why don't serial ports work properly in Unity?

I need help, I'm desperate
For two weeks I have been working on my project, which uses serial port communication (a PIC serial board). I managed to set the connection up, but I cannot get any data from the COM port. I've read some forums, and the cause of the problem seems to be the incomplete implementation of the System.IO.Ports classes.
When I try to get data from the COM port, the SerialDataReceivedEventHandler (the method that handles the DataReceived event of a SerialPort object) is never called. I tried to resolve it but couldn't find a definitive solution. I thought about trying an external DLL, but a friend told me the problem would persist; in fact I did try it and got the same result: SerialDataReceivedEventHandler does not work. Someone also recommended using a secondary thread, although I don't really understand how to do that.
I wrote a program in Visual C# and everything works fine there, which puzzles me.
I need to find a solution, some idea, or good documentation. If someone knows something about this, please help.
I need to understand the cause of this to continue.
Unity is based on Mono, and Mono does not completely implement the SerialPort class; in particular, the notifications (such as the DataReceived event) are not implemented.
That's why it works in Visual Studio but not in Unity.
Here are the differences between the Mono and the full .NET implementations of the SerialPort class:
Extract from http://www.mono-project.com/archived/howtosystemioports/#limitations
"Limitations
At the time of this writing, there are a few limitations that one must take note of:
1) There is no event notification for received serial data. If you want to receive data, one must set a timeout and watch for received data by polling ReadByte() when you think there might be data.
2) One must Read data in byte[] format only – there is no char[] support. You must do your own reading of bytes and translate that into your encoding.
3) DiscardNull, ParityReplace, ReceivedBytesThreshold are not implemented."
I think it happens because Unity is based on Mono rather than .NET, and on a pretty old version of it at that. You couldn't use LINQ on iOS devices for a long time because of AOT bugs, and the localisation implementation is buggy (or at least it was in the previous versions of Unity I worked with). I wasn't even able to find the source of System.IO.Ports in Unity's Mono fork, so it's surprising it compiles at all.

Can an NXT brick really be programmed with Enchanting and adaption of Scratch?

I want my students to use Enchanting, a derivative of Scratch, to program Mindstorms NXT robots to drive a pre-programmed course, follow a line, and avoid obstacles (two-state, five-state, and proportional line following). Is Enchanting developed enough for middle-school students to program these behaviors?
I'm the lead developer on Enchanting, and the answer is: Yes, definitely.
The video demoing Enchanting 0.0.4 shows how to make a proportional line follower (and you could extend it to use a PID controller, if you wish). If you download the latest version, 0.2.2, it includes a sample showing a two-state line follower (and you can see a video and download the code here). You, or, with some instruction and playing around, a middle-schooler, can readily create a program to handle n states, and, especially if you follow a behaviour-oriented approach, avoid obstacles at the same time.
As far as I know, yes and no.
What Scratch does with its sensor board, Lego WeDo, and the S4A (Scratch for Arduino) version, as well as, I believe, with the NXT, is basically use its remote sensor protocol: it exchanges messages on TCP port 42001.
A client written to interface that port with an external system allows messages and sensor data to be exchanged. Scratch can pick up sensor state and pass info to actuators roughly every 75 ms, according to the S4A discussion (a minimal client sketch follows below).
But that isn't the same as programming the controller: we control the system remotely, which is already very nice, but we're not downloading a program into the controller (the NXT brick) that the robot can use to act independently once it is disconnected.
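To make the remote sensor protocol concrete, here is a minimal client sketch in C++ using POSIX sockets. Everything specific in it is an example or assumption - Scratch 1.4 running locally with remote sensor connections enabled, and made-up sensor/broadcast names; only the port number and the 4-byte big-endian length prefix come from the protocol itself.

```cpp
// Minimal Scratch 1.4 remote-sensor-protocol client (TCP port 42001).
// Each message is a 4-byte big-endian length followed by an ASCII payload,
// e.g.  sensor-update "distance" 42   or   broadcast "obstacle".
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>
#include <string>

static bool send_scratch_message(int fd, const std::string& msg)
{
    uint32_t len = htonl(static_cast<uint32_t>(msg.size()));     // length prefix
    if (send(fd, &len, sizeof len, 0) != (ssize_t)sizeof len) return false;
    return send(fd, msg.data(), msg.size(), 0) == (ssize_t)msg.size();
}

int main()
{
    // Connect to a locally running Scratch with remote sensor connections enabled.
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(42001);
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);
    if (connect(fd, (sockaddr*)&addr, sizeof addr) != 0) {
        perror("connect");
        return 1;
    }

    // Push a sensor value and fire a broadcast that a Scratch script can react to.
    send_scratch_message(fd, "sensor-update \"distance\" 42");
    send_scratch_message(fd, "broadcast \"obstacle\"");

    // Messages coming back from Scratch (broadcasts and global-variable updates)
    // arrive on the same socket with the same 4-byte length prefix.
    close(fd);
    return 0;
}
```

As the answer above says, this only remote-controls the robot from the PC; nothing is downloaded to the NXT brick itself.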
Have you looked into 12blocks (http://12blocks.com/)? I have been using it for the Propeller and it's great, and it has an NXT option (which I have not tested).
It's an old post, but I'll answer anyway.
Enchanting looks interesting and still seems to be an active project.
I would actually take the original Scratch (1.4), as it's more familiar and reliable.
It's easy to interface hardware with Scratch using the remote sensor protocol. I use a simple serial interface (over a USB adapter) which provides 3 digital inputs and 3 digital outputs. With that, it's possible to implement projects such as traffic lights and light/water/heat sensors, using only LEDs, resistors, reed contacts, photo-transistors, switches, and PTC thermistors.
The cost is under $5.
For motor-based projects like factory belts, elevators, etc., not much more is required: a battery and a couple of transistors, relays, or a motor driver.

Text-chat XMPP message stanzas never make it to the network - iPhone project using libjingle

I thought it'd be better to rephrase; my earlier formulation of my question could have been better focused. The trouble with presence notifications, while real and still recurring, is kind of minor: the things I haven't figured out yet I can fake or work around. That's much less of a concern than getting basic text chat messages moving between users.
Again, I'm still fairly new with this combo of tech flavors - using libjingle as the heart (and liver and kidneys...) of an iPhone XMPP/Jabber app. The sign-on part works, and the presence notification/roster updating business actually seems to be working more or less as it should, but XMPP text chat message stanzas seem to vanish before making it out on the wire. Capturing the network traffic and perusing the packets, I can see the exchanges for sign-on, etc., that got us going, but then nothing for chat.
I've checked and double-checked how I'm putting together the stanzas, attributes, namespaces, etc., and everything looks jim-dandy to me. I can see that a message is getting queued in libjingle's internal infrastructure, but nothing ever shows up on the Ethernet.
I'm hoping somebody who has played around with this stuff before might remember a similar stumbling block and can offer a hint, suggestion, or pointer in the right direction.
Thanks for any help.
Mike

Should functional testing simulate UI events or check for preconditions?

I have been struggling with this question since I noticed that many functional testing frameworks (like Selenium for the web or UISpec for iOS) actually simulate UI events while testing. Couldn't it be sufficient just to check for preconditions, e.g. that the target and selector for a button are set correctly, and then fire the selector manually? Why do I need to simulate touches? The downside is that you have to know more about the UI elements you're testing (you have to know what makes them behave correctly), but since I am the one writing the tests, maybe that doesn't matter?
Could anyone shed some light on this?
Simulating touches can be useful for uncovering crashes caused by obscure or unplanned user behaviour - a particularly common one is having two items pressed simultaneously. It also allows you to create potentially quite esoteric tests: for example, feeding random user input for a sustained period of time to try to crash or break your application in ways you wouldn't expect. How far you take this depends on your app and how important that kind of robustness is to you.
Your alternative approach also has some disadvantages when it comes to multi-touch. While it would be fairly straightforward to fire a button's selector through some sort of automatic test rather than simulating user input, what happens if you have an app that deals with swiping, pinching, or other multi-finger gestures? In those cases the desired result may not be as black and white as the on/off of a button: you may have many shades of grey and differing output that requires validation.
Simulated UI testing actually has quite a long history - there's an interesting story (well, interesting to me) about the original MacPaint and how a random UI input test was able to assist in reproducing obscure or difficult crashes here: http://www.folklore.org/StoryView.py?story=Monkey_Lives.txt