
Shamus Cunningham


Everything posted by Shamus Cunningham

  1. Can you see the forecast signal ever crossing your empty threshold? You can visually mark your lower limit on the trend by creating a scalar value in Formula.
  2. There is also a way to complete this inside of Seeq Formula. As a warning: as rates change over the years these formulas can get longer and longer as you splice new rate schedules in against old ones. Below is a formula for a rate schedule with a winter rate and summer off/mid/high peak rates. You could expand this to weekends and weekdays as well if needed.
     $Summer = periods(6months, 1year, "2020-05-01T00:00:00", "US/Pacific")
     $Winter = periods(6months, 1year, "2020-11-01T00:00:00", "US/Pacific").setProperty('Rate', 0.10)
     $SummerHighPeak = (shifts(16, 5, "US/Pacific") and $Summer).setProperty('Rate', 0.18)
     $SummerMidPeak = (shifts(14, 2, "US/Pacific") and $Summer).setProperty('Rate', 0.16)
     $SummerOffPeak = ($Summer - ($SummerHighPeak or $SummerMidPeak)).setProperty('Rate', 0.12)
     $AllPeriods = CombineWith($Winter, $SummerHighPeak, $SummerMidPeak, $SummerOffPeak)
     $AllPeriods.toSignal('Rate').setUnits('$')
  3. You can also re-create the same effect as the past() operator in offline situations, or in older versions, by modifying the formula to remove the period of time where there is data in the original source signal.
     $LowerLimit = 0%
     $AboveLowerLimit = $forecast > $LowerLimit
     $AboveLowerLimit - isvalid($original)
  4. Another common question through the support portal this morning that is of general interest. To help with this example I am going to create a quick polynomial prediction using data from Area C in the example set. Our target is going to be to predict compressor power as a function of all of the input weather signals. If you wanted to re-create this prediction model in Excel or another tool, you need the coefficients from block #1 in the screenshot above and the y-intercept from block #2 in the screenshot. Inside the Workbench tool you will see rounded values for each of the coefficients and intercepts, but the full values are available when you copy them to the clipboard by clicking the little button highlighted in red. To fill out the example in Excel, the formula will look like the following:
     $temperature^2 * -0.000230 + $temperature * 0.0607 + $WB^2 * 0.000646 + $WB * -0.101 + 6.5946
     A final point to mention here: for multi-variable regressions with many input signals it is important to take a minute and evaluate the p-values listed in the coefficient table. If the p-value for any coefficient is above 0.05 it is best practice to rethink whether that signal needs to be included in the model at all, or whether you may need to perform data cleansing or re-alignment to create a better performing model.
     Good blog post on p-values - https://medium.com/analytics-vidhya/understanding-the-p-value-in-regression-1fc2cd2568af
     Great reference post on how to optimize regression models using time shifting -
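     If you want to double-check the exported model outside of Seeq or Excel, a minimal Python sketch like the one below reproduces the same polynomial; the coefficients are the rounded values from the formula above, and the function name and example inputs are my own placeholders:
     def predict_compressor_power(temperature, wet_bulb):
         # rounded coefficients and intercept copied from the regression output above
         return (-0.000230 * temperature**2 + 0.0607 * temperature
                 + 0.000646 * wet_bulb**2 - 0.101 * wet_bulb + 6.5946)

     # example: evaluate the model at a 75 degF dry bulb and 60 degF wet bulb reading
     print(predict_compressor_power(75, 60))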
  5. Sam, you can pretty easily forecast the value from now until your empty setpoint and then display the result as a capsule on the screen, as well as a time/duration in a Scorecard/Table, and have all the results update in real time. For my example I am going to use some example data, but this should look very similar for your use case.
     Step 1 - Create a forecasted value. This function may not work exactly as you expect on historical data, depending on your datasource.
     $signal.forecastLinear(1.5h, 5d) //Train on the last 1.5 hours of data and project 5 days into the future
     Step 2 - Create a condition that captures the time between now() and when you fall below your "empty" threshold. This step will only work for online data as it is using the past() operator.
     $LowerLimit = 0%
     $AboveLowerLimit = $fv > $LowerLimit
     $AboveLowerLimit - past()
     Step 3 - Create a Scorecard to quantify the time between now and empty and display it in a table. This uses the "Condition" mode in the Tables view and a Condition scorecard type.
     Content Verified DEC2023
  6. In Seeq version 52 and beyond we have introduced scheduled notebooks, which opens up a new world of signals calculated purely in Python and pushed into the Seeq system on a schedule. These scheduled executions can be really powerful, but they can also place huge loads on the entire Seeq system if your notebooks are not designed with scheduled execution in mind. Attached is a notebook with some of our current thoughts on best practices to minimize the load on the system and avoid requesting and pushing the same sets of data over and over again with each notebook execution.
     General calculation plan:
     1. Query our calculated signal between now and now minus the lookback length (1 day)
     2. Find the last timestamp of our calculated signal
     3. Query the input signals to our calculation for the time range between that last timestamp and now
     4. Run our Python script on our input signals to generate a new range of values for our calculated signal
     5. Push results back into Seeq for the range between the last timestamp and now
     6. Schedule the next execution (15 min)
     Scenarios/methods you may want to avoid include pulling a month of data for all input signals, calculating values, and then pushing results back on a 15-minute schedule. This will result in needlessly reading and writing the same data points over and over again.
     Example Scheduled Signal Calculation (1).ipynb
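     As an illustration only (not the attached notebook), a minimal SPy sketch of that incremental plan could look like the following; the signal names, lookback, and the placeholder calculation are all assumptions you would replace with your own:
     import pandas as pd
     from seeq import spy

     CALC_NAME = 'My Calculated Signal'   # placeholder name of the pushed signal
     now = pd.Timestamp.now(tz='UTC')

     # 1-2. Look back one day at what has already been pushed and find its last timestamp
     #      (assumes the calculated signal already exists from an earlier run)
     calc_item = spy.search({'Name': CALC_NAME})
     already_pushed = spy.pull(calc_item, start=now - pd.Timedelta('1d'), end=now)
     last_ts = already_pushed.index.max() if not already_pushed.empty else now - pd.Timedelta('1d')

     # 3. Pull only the new range of the input signal(s)
     inputs = spy.search({'Name': 'My Input Signal'})   # placeholder input signal
     new_data = spy.pull(inputs, start=last_ts, end=now)

     # 4-5. Run the calculation on the new samples only and push just that range back
     result = new_data.mean(axis=1).to_frame(CALC_NAME)  # placeholder calculation
     spy.push(data=result)
     # 6. A scheduling call (e.g. spy.jobs.schedule('every 15 minutes')) would repeat this notebook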
  7. Great question today from the support channel that I wanted to share along with a helpful spreadsheet. As much as I spend my days getting people out of Excel, sometimes it comes in really handy for creating signals from lookup tables. Attached to this post is an Excel sheet I put together where you can input a rate schedule and pricing tiers for summer/winter and on/mid/off peak pricing. The last sheet in the Excel file is set up to be exported to CSV, which can then be quickly imported back into Seeq for use in analysis and integration anywhere it would be helpful.
     Step 1 - In the Excel sheet, fill out the Lookup Tables sheet to match your utility rate schedule
     Step 2 - Export the "Output Table - Save as CSV" sheet to a new CSV file
     Step 3 - Import to Seeq using the Import CSV tool, making sure to fill out your time zone, step interpolation, and unit of measure ($); under optional settings you may want to add the "Lenient daylight savings" option to ignore those pesky clock changes
     Electric Tier Pricing Worksheet.xlsx
  8. Starting in Seeq version R52, Data Lab notebooks can be run on a schedule, which opens up a world of new interesting possibilities. One of those possibilities is to create a simple script that pulls data from a web API source and pushes it into the Seeq data cache on a schedule. This can be a great way to prototype a data connection prior to building a full-featured connector using the Connector SDK. This example notebook pulls from the USGS, which has information on river levels, temperatures, turbidity, etc., and pushes those signals for multiple sites into the Seeq system. The next logical step would be to make a notebook to organize these signals into an asset tree. Curious to see what this inspires others to do and to connect to. If there are additional public resources of interest, put them in the thread for ideas.
     USGS Upload Example.ipynb
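     For anyone who just wants the shape of the idea without opening the notebook, here is a minimal sketch (not the attached example) of pulling one USGS series and pushing it into Seeq; the site number, parameter code, and JSON parsing below are assumptions worth checking against the USGS water services documentation:
     import pandas as pd
     import requests
     from seeq import spy

     # USGS Instantaneous Values service; 00010 is assumed to be water temperature
     resp = requests.get('https://waterservices.usgs.gov/nwis/iv/',
                         params={'format': 'json', 'sites': '09380000',
                                 'parameterCd': '00010', 'period': 'P7D'})
     resp.raise_for_status()

     # Assumed layout of the WaterML-JSON response: a list of {dateTime, value} samples
     samples = resp.json()['value']['timeSeries'][0]['values'][0]['value']
     df = pd.DataFrame(samples)
     df.index = pd.to_datetime(df['dateTime'])
     df = df[['value']].astype({'value': float}).rename(
         columns={'value': 'USGS 09380000 Water Temperature'})

     spy.push(data=df)   # a scheduled notebook (R52+) could repeat this on an interval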
  9. Sometimes you want to clean up and remove an asset tree built using Seeq Data Lab from a workbook so that you can restart or do something else. Below is the quickest way to remove an unwanted tree.
     Step 1 - In Workbench, click on the info icon for the asset tree
     Step 2 - Copy the ID value to your clipboard. This may be located in a different place or under the advanced subsection in prior versions of Seeq
     Step 3 - Open the API Reference page from the menu in the upper right corner
     Step 4 - Navigate down to the Trees/assets/{id} delete endpoint, expand it, paste in the ID you copied in Step 2, and then click the Try it Out button
     You should receive a 200 confirmation response code and your tree will no longer show up in your Workbench analysis.
  10. Great question through the support portal today that I am going to try to generalize for everyone. This requires joining two conditions, and I am going to show how to do it using the simple tools and then how to combine it all together into one formula.
      Step 1 - Create a Value Search which matches your product code ABC. One trick here is to search for "ABC*", which will return string results for anything starting with ABC and ending with any other series of characters. The * is part of the regex notation you can use when searching strings inside the Value Search tool - https://support.seeq.com/space/KB/146637020/Regex%20Searches
      Step 2 - Create a condition for each batch using Formula and the toCondition() function. This will create a capsule every time the value of our string signal changes.
      $mp.toCondition()
      Step 3 - Combine the conditions together using the Composite Condition tool and the intersection join
      Alternate single formula - combine the steps in a multi-line formula:
      $AllBatches = $mp.toCondition()
      $ProductRuns = $mp.contains("ABC")
      $AllBatches and $ProductRuns
  11. Yasmin, this should be pretty straightforward using the integral function in Formula.
      Step 1 - Create a condition that marks the start and end time for when you want to calculate the integral. This could be done with a Value Search or a manual condition
      Step 2 - Use the integral function to calculate the new signal
      $signal.integral($condition.removeLongerThan(5d)).convertUnits('kWh')
  12. Question through the support channel. Great use case which can be captured in two quick steps.
      Step 1 - Create a signal in the Formula tool with your custom aggregation. In this case we are going to use the periods() function to create capsules every 15 minutes and the delta() function to plot the change in density from the start of each 15-minute capsule to its end. An alternative is to use the range() function, which would return the absolute min-to-max range of any values within the 15-minute interval.
      $myInterval = 15min
      $delta = $signal.aggregate(delta(), periods($myInterval), durationkey())
      abs($delta)
      Step 2 - Create a Value Search to identify the periods of interest where the density change signal exceeds your desired threshold
  13. A follow-up on this post. As of Seeq version R48 there is no longer a limit on the number of capsules which can be displayed in chain view, so if you were interested in signal scrunching as a workaround, that is no longer needed. https://support.seeq.com/space/KB/736035141/What's+New+in+R22.0.48
      With a couple of our new functions this calculation is a lot simpler. It requires a Manual Condition with a single capsule so that we know the starting point we will calculate delays from.
      $delay = timesince($ManualCondition, 1min, $Running)
      $signal.within($running).move(-$delay, 30days)
  14. Final variation on this theme: how do I calculate a running time since the last sample in my signal?
      $cond = $VariableSampleRateSignal.toCondition().setMaximumDuration(30d)
      $cond.timeSince(1min)
  15. Unfortunately, currently (R53) the forecastLinear() operator does not allow a calculated scalar to be passed in as a parameter to the function. However, I came up with this two-step workaround that should help get you started on your workflow.
      Step 1 - Create a scalar for the time interval since your last step change. In my example here I am going to use the time since the Example -> Area A -> Compressor Stage signal last changed values. You will need to create a similar calculation for your workflow.
      $cond = $cs.toCondition().removeLongerThan(40h) //Create a condition with a capsule that starts each time you want to reset your training range
      $currentCap = $cond.toGroup(capsule(now()-1min, now())).pick(1) //Pick the capsule closest to "Now"
      $currentCap.duration().convertUnits("min") //Plot the duration of the most recent capsule as a scalar value
      Step 2 - Manually generate the linear forecast
      $trainingCapsule = capsule(now()-$Step1DurationScalar, now()+4h) //Create the training range and the distance you want to project the signal into the future -- 4h in this example
      $training = group($trainingCapsule) //Convert the capsule into a group for use in the regression tool
      $timeSince = timeSince(condition(30d, $trainingCapsule), 1min) //Create a signal which is a time counter since the start of your mode of interest -- 1min is the sampling rate of the new signal
      $model = $signal.regressionModelOLS($training, false, $timeSince).predict($timeSince) //Train the model and project it into the future
      $signal.forecastSplice($model) //Splice the model into the original signal
  16. Unfortunately, in the trend view there is not a way to stack the bars or create a legend. These two options are available in the Histogram tool, but given your setup with a number of different signals I don't think there is a good way to configure your display for that tool. I will pass along your suggestions to our development team.
  17. Quick method to calculate the median of multiple related signals over a time period:
      $allSignals = combinewith($p1fr.move(1ns), $p2fr.move(2ns), $p3fr.move(3ns), $p4fr.move(4ns))
      $allSignals.aggregate(median(), hours(), durationkey())
      The first part of the formula combines all the samples from your signals of interest into a new single signal which has all of the data points from the first four signals. One trick with the combinewith() function is that it can't combine samples that occur at the exact same timestamp. Since I am using data from a CSV file for this example, where the timestamps for all 4 signals are exactly the same, I use the move() function to slightly offset each of the signals by a nanosecond so that they all combine smoothly. The final part of the formula is the aggregation, where we calculate the median value of our new combined signal, which has all the samples from our prior 4 signals. In this example I am calculating an hourly median, but this can be changed to daily() or a calculated condition that makes the most sense for your application.
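      If it helps to see the same idea outside of Seeq, here is a small pandas sketch (my own illustration, not part of the original post) that pools samples from several columns and takes an hourly median; the column names and random data are placeholders:
      import numpy as np
      import pandas as pd

      idx = pd.date_range('2023-01-01', periods=96, freq='15min')
      flows = pd.DataFrame(np.random.rand(96, 4), index=idx,
                           columns=['P1 Flow', 'P2 Flow', 'P3 Flow', 'P4 Flow'])

      # Pool every sample from all four columns into one series (the combinewith() analogue),
      # then aggregate the pooled samples into an hourly median
      pooled = flows.stack().droplevel(1).sort_index()
      hourly_median = pooled.resample('1h').median()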
  18. The formula above is only focused on getting the single value of the time period between the very last sample and "Now". It uses the past() operator, which was added in Seeq version 51, so the error you are seeing is probably because you are on an earlier version.
      On your separate question about a signal that represents the time since the last sample: this formula will give you a continuous signal that calculates things as you outlined in your table. You can replace the durationkey() parameter with either startkey() or endkey() depending on whether you want the time between samples to be plotted at the beginning or end of the interval.
      $CapsEachSample = $signal.toCapsules() //Create a capsule for the time between each sample in the original signal
      $CapsEachSample.aggregate(totalduration(), $CapsEachSample, durationkey()).convertUnits("min") //Convert the duration of the capsules into their own signal
  19. One additional option here is to hold the last valid value until a new one appears. The formula below allows you to mix step-interpolation behavior for gaps into an otherwise linearly interpolated signal.
      $gaps = $signal.isNotValid().removeLongerThan(30d).move(-1ns, 0)
      $holdSignal = $signal.aggregate(startValue(), $gaps, durationkey())
      $signal.validValues().splice($holdSignal, $gaps)
  20. A generally interesting question I want to document this morning.
      Step 1 - Create a condition from the last sample till "Now"
      $caps = $MySignal.toCapsules()
      $timeSinceSample = past().subtract($caps)
      $timeSinceSample
      Step 2 - Create a Scorecard to capture the duration of the condition created in Step 1. Make sure the maximum capsule duration is longer than the longest gap you would expect in your signal. In this example it is 180 days.
  21. Another interesting question through the support portal this morning. The user has a signal where each discrete point represents the run length, recorded at the end of the run. They would like to translate this signal into a condition so it can be used for further analysis. The original signal looks a bit like this.
      To create the matching condition we need to use a transform and the toCapsules() function:
      $Signal.toCapsules(                    //For each sample in the signal create a capsule
        $sample ->                           //A variable name of your choice to reference each individual sample in your signal
          capsule(                           //Open the capsule
            $sample.key() - $sample.value(), //Capsule start definition (key = timestamp of the sample minus the value of the sample)
            $sample.key()                    //Capsule end definition (the timestamp of the sample)
          ), 5d)                             //Maximum length of any given capsule in the condition
  22. A great question came in today through the support portal and I wanted to share the current best practice workflow.
      In the admin panel, for long-running requests you can click on the (+) icon and get some important additional information. For our workflow the most important of these categories is the formula parameters. This will take the form of "series = <id>" or something similar.
      Copy the ID from this field and open up the API Reference from the upper right hand corner. If there are multiple IDs in the parameter field, pick the first.
      In the API Reference docs, navigate to the Items/{id} endpoint, paste in your ID from the step above, and click the "Try it Out" button. Scroll down to the Scoped To section of the response and copy the <id>. This is the ID of the workbook that the calculation is scoped to.
      The final step is to copy the workbook ID from the step above into the get Workbooks/{id} endpoint. The response from this endpoint will contain a bunch of information, but the most important pieces will probably be the document owner, workbook location and workbook name.
  23. This question came in through the support channel today and I thought it was a good opportunity to explain some of the nuances in the Scorecard tool and give a couple of potential workarounds.
      Question: In this example we are using two simple Value Searches on the Example -> Area A dataset
      Running -> Compressor Stage !~ "off"
      Stage 2 Running -> Compressor Stage ~ "stage 2"
      Unfortunately, in Seeq there is not an easy way to directly perform math or formula-like operations on Scorecard Metrics created using the "Simple" option like you see above. The "Simple" metrics are reactive to your display range and so have more limitations compared to the "Condition" and "Continuous" options. In order to create this calculation we need to first decide on a calculation range that matches our engineering or business need. In this case we want to calculate this value per month.
      Option 1 - Convert to Signal from Condition calculations
      1. Create a monthly condition using the Periodic Condition tool
      2. Create monthly total duration in each mode calculations using the Signal from Condition tool
      3. Use Formula to create your % time running calculation: ($Stage2Duration/$RunningDuration).convertUnits("%")
      4. Create the Scorecard using the Condition option
      Option 2 - Create a continuous calculation signal
      The main downside of Option 1 is that you can't see at any given point in time how the trailing 30 days of running have been performing, as you could with the original simple scorecard metrics. The following formula will build a calculation of the % time in Stage 2 out of total running time for the trailing 30 days, calculated every hour.
      $aggPeriods = periods(30days, 1h)
      $RunningDuration = $Running.removeLongerThan(1d).aggregate(totalduration(), $aggPeriods, endkey())
      $stage2Duration = $Stage2Running.removeLongerThan(1d).aggregate(totalduration(), $aggPeriods, endkey())
      ($stage2Duration/$RunningDuration).convertUnits("%")
      This calculation can be added in a simple scorecard metric as well, and as long as the simple scorecard duration is 30 days it will give you the desired results.
  24. To follow up, if you want to put the folder and workbook names into variables the code looks like this:
      FolderPath = 'Folder level 1 >> Folder level 2'
      WorkbookName = 'My Workbook'
      spy.push(data=statusExpanded, workbook=f'{spy.workbooks.CORPORATE} >> {FolderPath} >> {WorkbookName}')
  25. If you have a signal in Seeq that is an integer representing an 8-bit, 16-bit, or 32-bit set of states, here is some example code you can use to convert the integer signal into 8/16/32 separate 0/1 signals that can be used to define machine states elsewhere in the Seeq platform.
      Import Python packages:
      from seeq import spy
      import pandas as pd
      import numpy as np
      Use spy.search to find your integer signal of interest - see the spy.search() docs for more information:
      IntegerSignal = spy.search( <your method here> )
      Pull the signal for your time period of interest. This is a good place to add parameters so that this operation can be run on a schedule, which is available in R52+:
      IntegerSignalDF = spy.pull(IntegerSignal, start='2021-06-01', end='2021-06-03')
      IntegerSignalDF
      Create a function to split your integer column into a number of different 1/0 columns:
      # https://stackoverflow.com/questions/43738541/converting-one-int-to-multiple-bool-columns-in-pandas
      def num2bin(nums, width):
          return ((nums[:, None] & (1 << np.arange(width - 1, -1, -1))) != 0).astype(int)
      Apply the function to your integer DataFrame. You will also need to name the columns with the statuses they represent. The num2bin function takes the number of columns (8/16/32) as an input:
      statusExpanded = pd.DataFrame(num2bin(IntegerSignalDF.TestBinary.values, 32),
                                    index=IntegerSignalDF.index,
                                    columns=['Status 1', 'Status 2', 'Status 3', 'Status 4',
                                             'Status 5', 'Status 6', 'Status 7', 'Status 8',
                                             'Status 9', 'Status 10', 'Status 11', 'Status 12',
                                             'Status 13', 'Status 14', 'Status 15', 'Status 16',
                                             'Status 17', 'Status 18', 'Status 19', 'Status 20',
                                             'Status 21', 'Status 22', 'Status 23', 'Status 24',
                                             'Status 25', 'Status 26', 'Status 27', 'Status 28',
                                             'Status 29', 'Status 30', 'Status 31', 'Status 32'])
      Push the new signals back into your desired location inside Seeq:
      spy.push(data=statusExpanded)
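      As a quick sanity check of the bit ordering (my own addition, not from the original post), expanding the integer 5 with 8 status columns should set only the columns for bit values 4 and 1, i.e. Status 6 and Status 8:
      print(num2bin(np.array([5]), 8))
      # [[0 0 0 0 0 1 0 1]]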