Leaderboard

Popular Content

Showing content with the highest reputation on 11/17/2021 in all areas

  1. Mike & Andrew - I came up with a kludgy, resource-intensive solution, but it works. Let's say $tau is a continuous signal specifying the reactor residence time in minutes. Create a new formula that calculates the average of $tau over 10-minute intervals, then rounds it to the nearest 10-minute value:

$tenminutes = periods(10min)
$tau_avg = $tau.aggregate(average(), $tenminutes, startKey())
$tau_rounded = round($tau_avg/10)*10

Now, create a formula that defines a separate signal applying the exponential filter to the reaction parameter (in my case, monomer concentration) for each possible value of $tau_rounded, then splice these signals together based on the current value of $tau_rounded:

//conditions
$tau10 = $tau_rounded == 10
$tau20 = $tau_rounded == 20
$tau30 = $tau_rounded == 30

//signals
$CC2_T10 = $CC2SS.exponentialFilter(10min, 1min)
$CC2_T20 = $CC2SS.exponentialFilter(20min, 1min)
$CC2_T30 = $CC2SS.exponentialFilter(30min, 1min)

//final spliced signal: exponential filter with varying values of tau
$CC2SS //default is the steady-state C2 concentration
  .splice($CC2_T10, $tau10)
  .splice($CC2_T20, $tau20)
  .splice($CC2_T30, $tau30)
//extend as needed to cover the expected range of values of $tau_rounded

Edit: of course, this method has some obvious limitations. It will not be accurate during periods when the residence time and concentration are both changing (reactor startup, process upset, etc.), and there will be a discontinuity in the signal at each "step" in the residence time as it jumps between the exponential signals. Perhaps an agileFilter could be applied at the end to smooth it out. Still, it's a decent approximation (better than nothing), and whenever the residence time is constant, the output matches that of the exponentialFilter for the current residence time.
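The splice approach above can be sketched outside Seeq as well. Here is a rough Python/NumPy equivalent (hypothetical function names; the discrete first-order update `y[i] = y[i-1] + dt/(tau+dt) * (x[i] - y[i-1])` stands in for Seeq's exponentialFilter, and the mask assignment stands in for splice):

```python
import numpy as np

def exp_filter(x, tau, dt=1.0):
    """First-order exponential filter with time constant tau (same units as dt).
    A simple stand-in for Seeq's exponentialFilter, not its exact algorithm."""
    alpha = dt / (tau + dt)
    y = np.empty(len(x), dtype=float)
    y[0] = x[0]
    for i in range(1, len(x)):
        y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
    return y

def spliced_filter(x, tau_rounded, taus=(10, 20, 30), dt=1.0):
    """Compute one filtered signal per discrete tau, then splice them together:
    wherever tau_rounded equals a given tau, use that tau's filtered output."""
    filtered = {t: exp_filter(x, t, dt) for t in taus}
    out = np.asarray(x, dtype=float).copy()  # default: the unfiltered signal
    for t in taus:
        mask = tau_rounded == t
        out[mask] = filtered[t][mask]
    return out
```

As in the formula above, the output is only meaningful where tau_rounded holds one of the pre-computed values, and each change in tau_rounded produces a step between filter outputs.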
    1 point
  2. I know this is an old thread, but I am including what I did in case posterity finds it useful. I am more or less working on the same issue, but with a somewhat noisier and less reliable signal. I found the above a helpful starting point, but had to do a bit of tweaking to get something reliable that didn't require tuning.

The top lane is the raw signal, from which I removed all the dropouts, filled in any gaps with a persisted value, and did some smoothing with agileFilter to get the cleansed level on the next lane.

For the value-decreasing condition I used a small positive threshold (since there were some small periods of the level fluctuating, while the tank being refilled produced a very large positive slope) and a merge to eliminate any gaps in the condition shorter than 2 hours (since all the true fills lasted several hours).

For the mins and maxes I did not use the grow function on the condition as was done above; instead I just used relatively wide max durations and trusted that the cleansing I did on the value-decreasing condition was good enough. I was then able to use the combineWith and running delta functions on the mins and maxes, and filter to separate the deliveries from the usage.

One additional set of calculations I added was to filter out all the periods of deliveries, by converting the delta function to a condition and removing all the data points in conditions that started positive from the cleansed signal. I then subtracted a running sum of the delta function over a quarter, yielding a signal without the effect of any of the deliveries over each quarter. I could then aggregate the delta of that signal over days and quarters to get the daily and quarterly consumption figures.

Chart showing all the calculated signals for this example. The top lane is the raw signal. The next lane shows the cleansed signal with the mins and maxes between deliveries. The middle lane combines the mins and maxes, takes the running deltas, and then filters them into delivery and usage numbers. The next lane removes the deliveries from the cleansed signal and does a running sum of the consumption over the quarter. The last two lanes are daily and quarterly deltas in those consumption figures.

Calculation for identifying the periods in which the chemical level is decreasing: I used a small positive threshold and removed two-hour gaps, which allowed the condition to span the whole time between deliveries. I aggregated the cleansed signal over those decreasing periods to find the min and max values, used the combineWith and running delta functions to get the net deltas of consumption and deliveries, and filtered on positive and negative values to separate deliveries from consumption. Finally, I removed the delivery numbers from the cleansed signal to get a running sum of consumption over a quarter, and aggregated the deltas in that consumption history over days and quarters to calculate the daily and quarterly consumption.
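The core min/max-and-delta idea can be illustrated outside Seeq. Below is a hypothetical pandas sketch (not Seeq Formula): it uses a simplified decreasing condition with no gap-merging or agile smoothing, takes the level at the start (local max) and end (local min) of each decreasing run, differences the combined extrema, and splits positive deltas into deliveries and negative deltas into usage:

```python
import pandas as pd

def split_deliveries_usage(level, slope_threshold=0.0):
    """Given a cleansed tank-level series, label runs where the level is not
    rising faster than slope_threshold, collect each run's start (max) and
    end (min) values, then difference the combined extrema in time order:
    positive deltas are deliveries, negative deltas are usage."""
    decreasing = level.diff().fillna(0) <= slope_threshold
    # label contiguous runs of the decreasing/not-decreasing flag
    run_id = (decreasing != decreasing.shift()).cumsum()
    extrema = []
    for _, seg in level.groupby(run_id):
        if decreasing.loc[seg.index[0]]:
            extrema.append((seg.index[0], seg.iloc[0]))    # local max at run start
            extrema.append((seg.index[-1], seg.iloc[-1]))  # local min at run end
    ex = pd.Series(dict(extrema)).sort_index()  # combined mins and maxes
    deltas = ex.diff().dropna()                 # running delta between extrema
    deliveries = deltas[deltas > 0]
    usage = -deltas[deltas < 0]
    return deliveries, usage
```

On real data the cleansing, threshold choice, and two-hour gap merge described above would be needed first; this sketch assumes an already-clean signal.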
    1 point
This leaderboard is set to Los Angeles/GMT-07:00