File Sort (SRT) and Average (AVE) instructions

Moosetracks

I am storing values into an array, and I want to find an average of the highs and an average of the lows.

Something like this...
After 50 samples average the 10 biggest and average the 10 smallest.

The number of samples starts at 0 and grows up to 600. I want to have these averages update with each new sample after we have collected 50 samples.

For me this screams structured text, but that isn't an option. My original desire was to have the number to average be a percentage of the number of samples collected, but the AVE instruction won't let me use a tag as the length. I suppose I could have a couple of different AVE instructions with different lengths and then use different ones depending on the sample size. Hopefully that all makes sense.
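Roughly, this is the idea I'm chasing, written as structured-text-style pseudocode just to show the intent (ST isn't available to me here, and Sorted, Samples_Taken, LowAvg and HighAvg are made-up names):

Code:
(* Pseudocode sketch only - assumes Sorted[0..Samples_Taken-1] already holds *)
(* the collected samples sorted ascending (e.g. from an SRT instruction).    *)
IF Samples_Taken >= 50 THEN
    LowSum := 0.0;
    HighSum := 0.0;
    FOR i := 0 TO 9 DO
        LowSum  := LowSum  + Sorted[i];                       (* 10 smallest *)
        HighSum := HighSum + Sorted[Samples_Taken - 1 - i];   (* 10 biggest  *)
    END_FOR;
    LowAvg  := LowSum  / 10.0;
    HighAvg := HighSum / 10.0;
END_IF;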

After some trial and error this is what I have come up with. How would you guys do this? Is there anything wrong with what I have?

(Attached screenshot: 2022-06-29 07_47_56-Studio31 - VMware Workstation.jpg)
 
Use a MOV to the .LEN of your AVE control tag to bypass the inability to enter a tag for it in the actual instruction.

EDIT: Beaten to it :)
 
Ah ok. Thank you both! I figured there surely had to be a way to change the length on the fly. What do you think of the rest of my code? Is there a better way?
 
Is there a better way?

There are always worse ways, and often there are different ways. "Better" is in the eye of the beholder.

You don't say if you are looking for this to be a rolling average or a Last-X-Samples average. If you're doing a rolling average (i.e., once you hit 600 samples, you want to average samples 2 thru 601) you'll run into trouble because the counter will keep ticking up, even with the DN bit set. That could potentially mean that you'll eventually be wanting sample 901 when your stack is only 900.

Instead of doing the "COPY FIFO" (twice, because you want to have your list starting at word 0), why not just do an indirect MOV (MOV Local_Max Load_Cell_History[Samples_to_Take2.ACC])?

You'd need to set (OTL) a flag (FIFO_is_Full) and create a DINT "Samples_Taken" that holds either the counter's ACC or its PRE, depending on whether or not the counter has been Done at least once, so you know that the array is full and you should be taking "600" samples.
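In ST-flavored pseudocode the idea would be roughly this (FIFO_is_Full and Samples_Taken are names I just made up):

Code:
(* Sketch only - not tested, names invented.                                  *)
Load_Cell_History[Samples_to_Take2.ACC] := Local_Max;   (* indirect write     *)

IF Samples_to_Take2.DN THEN
    FIFO_is_Full := TRUE;                     (* latch once the array fills   *)
END_IF;

IF FIFO_is_Full THEN
    Samples_Taken := Samples_to_Take2.PRE;    (* full: use all 600 samples    *)
ELSE
    Samples_Taken := Samples_to_Take2.ACC;    (* still filling: use the count *)
END_IF;
(* for a true rolling buffer the counter's ACC would also have to wrap back   *)
(* to 0 at 600 so new samples overwrite the oldest ones                       *)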
 
That looks pretty good.

@Aardwizz raises a good point about stopping the CTU, which you could do by putting the [MOV 0 LocalMax] and an [XIO Samples_to_take2.DN] before the CTU, so the CTU would be the last item on that branch.

It could be done more efficiently but the code would be ugly to the point of being illegible, and there would be no improvement to the result; the only reason to do that would be if there was a scan time watchdog issue.

I might:

  • Add comments,
  • Use a [CLR LocalMax] instead of [MOV 0 ...],
  • Use a FIFO and FFU/FFL instructions,
    • or copy down instead of up to eliminate the need for the JunkDump buffer,
  • Use an INT instead of a CTU, although since the .DN bit is needed maybe not,
but those are personal preferences, and it reads nicely as-is and looks pretty straightforward.

And as others have noted, you can do summat like [DIV Samples_to_take.ACC 10 AvgControl.Length] [LES AvgControl.Length 1] [MOV 1 AvgControl.Length] to make the length of the averaged sequences roughly 10% of the accumulated samples, with a minimum of 1.
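Written out in ST terms, that length calculation amounts to something like this (sketch only; the CONTROL member is .LEN):

Code:
(* Keep the AVE length at roughly 10% of the samples collected, minimum 1.   *)
AvgControl.LEN := Samples_to_take.ACC / 10;
IF AvgControl.LEN < 1 THEN
    AvgControl.LEN := 1;
END_IF;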
 
I want it to be an average of the smallest XX samples and an average of the largest XX samples.

I like the idea of copying using the indirect MOV. Thank you for that suggestion. That is more elegant than the COP-COP-MOV.

With the counter, I have its DN bit stopping this whole part of the sequence, so it won't be able to activate after it hits its preset. So once it gets to 600 samples it stops everything. I didn't send the entire rung.

Thank you so much for your suggestions.

I used to use a lot of FIFOs, but I hate that they fill from the bottom, so I started using the COP-COP-MOV so the data goes from newest on top to oldest on bottom. What did you mean by the copy down?

It definitely needs comments for someone to understand the logic.

Thank you for your feedback!

 
I used to use a lot of FIFOs, but I hate that they fill from the bottom, so I started using the COP-COP-MOV so the data goes from newest on top to oldest on bottom. What did you mean by the copy down?


This:
COP array[1] array[0] 599
And insert new values at offset array[599]; it saves one COP, but you would have to keep track of the decreasing tail i.e. decrement an INT from 599 to 0.
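In ST-style pseudocode, something like this (Tail and NewSample are invented names):

Code:
(* Copy-down sketch: oldest value falls off array[0], newest goes in at      *)
(* array[599]; Tail marks where valid data starts while the array fills.     *)
COP(array[1], array[0], 599);        (* shift everything down one element    *)
array[599] := NewSample;
IF Tail > 0 THEN
    Tail := Tail - 1;                (* array[Tail..599] holds valid data    *)
END_IF;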

You could also use a CTD to do that with a preset of 0 and .ACC initialized to 600, but then the code starts getting non-intuitive: commenting to explain the two COPs and the JunkDump is probably cleaner than commenting for copying down.

Six of one ...


The advantage to the FIFO/FFL/FFU is that it handles all the housekeeping for you: no counter, no COPs, control.DN triggers the FFU when FIFO is full, etc. I would code it up to see what it looks like; if it's cleaner and easier to understand, maybe that is more important than personal preference about how it works?
 
We have a load cell on an arm that comes out and pushes up against product as it travels through. The arm is made of flexible material. The whole fixture is in a rough environment and gets tweaked. When it does, the load cell either thinks it's already pushing or it pushes harder than we want. The pusher is a hydraulic cylinder and is position controlled by a motion controller. It uses the set points from the load cell to determine how far to go out.

I have been making manual changes to the set points whenever things get tweaked for several years. I thought I would try and write code that does what I do when I adjust it. Creating a button on a touch screen that will let an operator adjust set points.

They will put it in "teach" mode and put it in the two positions by hand and hold it there in each position for about 10 seconds. Then I will run my code, and as long as the range between push mode and retract mode is over a certain value, I will move those two values into set points.

The reason for the sampling and averages might be overkill? I could possibly just use a min and max value. However, when you move the arm manually it has a tendency to create some spikes as you push and pull it. My thought was, with the 600 samples taken every 100 ms, I would do something like this... Take 600 samples, sort them all from high to low. Knock off the highest 20 and the lowest 20 (20 samples would be about 2 seconds of maneuvering the guide into position). Then I would take what was left, average the 20 highest and 20 lowest, and see how that works.
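Something like this, in pseudocode (Sorted is again assumed to be ascending; the names are made up):

Code:
(* 600 samples sorted ascending in Sorted[0..599] - sketch only.             *)
(* Ignore Sorted[0..19] and Sorted[580..599] as movement spikes, then:       *)
LowSum := 0.0;
HighSum := 0.0;
FOR i := 0 TO 19 DO
    LowSum  := LowSum  + Sorted[20 + i];      (* 20 lowest of what's left    *)
    HighSum := HighSum + Sorted[579 - i];     (* 20 highest of what's left   *)
END_FOR;
LowSetpoint  := LowSum  / 20.0;
HighSetpoint := HighSum / 20.0;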

Also, with all the collected data I can look at the array of data after it's done and see if I think it made a good decision, and adjust all the variables as needed to improve the routine.

Lastly I wanted something that does both high and low with one "train" button. I could put a train out and train in, but then it would be a two person operation.

I haven't tried any of this yet, so it may be a total flop, but this is my "why".

Thanks everyone for your suggestions. I feel like with each question I ask on here I am becoming a better programmer. I have been programming for years, but the ideas I have gotten from this and other questions have been very valuable.


I am curious about the requirements. Why? Just because? It seems that if the data is noisy then all you are going to get is the extremes of the noise.
 
We have a load cell on an arm that comes out and pushes up against product as it travels through.
Through what? I have a lot of experience, many decades, with hydraulic systems. I am a member of the International Hydraulic Fluid Power Hall of Fame.



The arm is made of flexible material.
This is a problem. The oil in the cylinder also compresses.





The whole fixture is in a rough environment and gets tweaked.
What environment? Over 35+ years I think I have seen just about everything.



When it does, the load cell either thinks it's already pushing or it pushes harder than we want.
Is this due to a slip/stick action / variable friction?





The pusher is a hydraulic cylinder and is position controlled by a motion controller.
What hydraulic motion controller? I design, make and sell hydraulic motion controllers. I am familiar with many of my competitors too.





It uses the set points from the load cell to determine how far to go out.
What is it pushing against? Again, is there variable friction?



I have been making manual changes to the set points whenever things get tweaked for several years. I thought I would try and write code that does what I do when I adjust it. Creating a button on a touch screen that will let an operator adjust set points.
Yes, you can kludge something that may work. Chances are you are not the first to have this problem and it has already been solved.



They will put it in "teach" mode and put it in the two positions by hand and hold it there in each position for about 10 seconds.
So do you have position feedback, or are these positions just chosen manually?



Then I will run my code and as long as the range between push mode and retract mode is over a certain value I will move those two values into set points.
What values?




The reason for the sampling and averages might be overkill?
Getting data that shows how your system works statistically is good.




I could possibly just use a min and max value. However when you move the arm manually it has a tendency to create some spikes as you push and pull it.
Why the spikes? Again, I suspect the friction is variable and you are suffering from what is commonly called slip/stick or Stribeck friction.


My thought was, with the 600 samples taken every 100 ms, I would do something like this... Take 600 samples, sort them all from high to low. Knock off the highest 20 and the lowest 20 (20 samples would be about 2 seconds of maneuvering the guide into position). Then I would take what was left and average the 20 highest and 20 lowest and see how that works.

You still aren't clear about what you will do with your high and low numbers.



I think you need to tell us what the source of your "noise" is.
Is it really "noise", or is it variable friction?
 
Like Peter, I still don't understand the rationale, but if the data are normally distributed, you could calculate a running average and a running standard deviation, and estimate your max/min spread from that.


Then you don't have to bother with the sorting. Accuracy would not be great, but it might be good enough.
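A running mean and standard deviation can be kept with just a few tags and no stored history at all (Welford's method); a sketch, with made-up tag names:

Code:
(* Update once per new sample - n, Mean, M2, StdDev are invented tag names.  *)
n := n + 1;
Delta := Sample - Mean;
Mean  := Mean + Delta / n;                   (* running average              *)
M2    := M2 + Delta * (Sample - Mean);       (* running sum of squared devs  *)
IF n > 1 THEN
    StdDev := SQRT(M2 / (n - 1));            (* sample standard deviation    *)
END_IF;
(* e.g. treat Mean +/- 2*StdDev as the expected high/low spread              *)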
 
