# Shamus Cunningham

Super Seeqer

• Days Won: 17

Shamus Cunningham had the most liked content!

## Personal Information

• Company: Seeq
• Title: Principal Analytics Engineer
• Level of Seeq User: Seeq Super-User


1. ## Clean Signal by Using Percentile Calculation in Training Range

This is a solution to a question that came in through the support channel that I thought would be of general interest. The question was how to designate a fixed training range for a signal, calculate upper and lower limits from the 3rd and 97th percentiles of the signal over that range, and then apply those limits to the entire history of the signal.

This requires a two-step process. First, create scalar values for the upper and lower limits. Then use those limits to clean the signal with the remove() function.

Step 1) Calculate the scalar values for the 97th and 3rd percentiles. In the example below, the training range start and end dates are hard-coded into the formulas for simplicity.

```
$trainingRangeStart = '2022-10-01T00:00:00Z'
$trainingRangeEnd = '2022-10-31T00:00:00Z'
$trainingCondition = condition(capsule($trainingRangeStart, $trainingRangeEnd))
$calcPercentile = $signal.aggregate(percentile(97), $trainingCondition, startKey())
$calcPercentile.toScalars(capsule($trainingRangeStart, $trainingRangeEnd)).average()
```

Use a similar formula for the lower limit:

```
$trainingRangeStart = '2022-10-01T00:00:00Z'
$trainingRangeEnd = '2022-10-31T00:00:00Z'
$trainingCondition = condition(capsule($trainingRangeStart, $trainingRangeEnd))
$calcPercentile = $signal.aggregate(percentile(3), $trainingCondition, startKey())
$calcPercentile.toScalars(capsule($trainingRangeStart, $trainingRangeEnd)).average()
```

Step 2) Clean the signal using the new scalar values for the upper and lower limits:

```
$signal
  .remove(isGreaterThan($upper))
  .remove(isLessThan($lower))
```
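If you want to prototype the same percentile-clamp logic outside Seeq, here is a rough Python/pandas sketch of the idea. This is not Seeq syntax, and the signal, dates, and sampling rate are all invented for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical signal: hourly samples over a few months.
idx = pd.date_range("2022-09-01", "2022-12-01", freq="h")
rng = np.random.default_rng(0)
signal = pd.Series(rng.normal(50, 5, len(idx)), index=idx)

# Step 1: compute 3rd/97th percentile limits over the fixed training range.
training = signal["2022-10-01":"2022-10-31"]
lower = training.quantile(0.03)
upper = training.quantile(0.97)

# Step 2: mirror remove(isGreaterThan($upper)) / remove(isLessThan($lower))
# by masking samples outside the limits across the full history.
cleaned = signal.where((signal >= lower) & (signal <= upper))
```

Samples outside the limits become NaN rather than being deleted, which is the pandas equivalent of removing them from the signal.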
2. ## Timestamp for last sample in a signal

A newer and simpler method for this using the Table view and conditions is linked below
3. ## Create Table View of string signal and duration at each value

Here is an example of how to convert a string signal into a table where each row contains the start/end time and total duration of each period during which the string signal held a value.

Step 1: Convert your string signal into a condition inside of Formula:

```
$signal.toCondition()
```

This formula creates a new capsule every time the string signal changes value, regardless of how many sample points share the same string value.

Step 2: Create a table view of the condition. Select the "Tables and Charts" view and the "Condition" mode.

Step 3: Add capsule properties as values to the table. To add the "Value" property, which is the value from the string signal, type "Value" into the Capsule property statistics table. You can also select the duration here.

Final Product
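As a cross-check of the logic, the same run-length idea can be sketched in Python/pandas (the sample data is made up, and here the duration is measured between the first and last sample of each run):

```python
import pandas as pd

# Hypothetical string signal sampled every minute.
idx = pd.date_range("2023-01-01", periods=9, freq="min")
state = pd.Series(["A", "A", "B", "B", "B", "A", "C", "C", "A"], index=idx)

# Analogue of $signal.toCondition(): start a new "capsule" whenever the value changes.
run_id = (state != state.shift()).cumsum()

# Build the table: value, start, end, and duration of each run.
table = pd.DataFrame({
    "value": state.groupby(run_id).first(),
    "start": state.index.to_series().groupby(run_id).first(),
    "end": state.index.to_series().groupby(run_id).last(),
})
table["duration"] = table["end"] - table["start"]
```

Each row of `table` then corresponds to one capsule in the Seeq condition view.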
4. ## How do I replicate the pattern of a signal from the rolling latest cycle?

This seems like a use case that would greatly benefit from being able to share screens and view the raw signals. Could you please schedule a time at office hours for one of our Analytics Engineers to work through this scenario with you? https://outlook.office365.com/owa/calendar/SeeqOfficeHours@seeq.com/bookings/
5. ## PowerBI Scheduled Refresh Best Practices

The following is a mini-guide on best practices for setting up a scheduled refresh of an OData export in Power BI.

Step 1: Configure your exports using the "Auto-Update" option. This is an important step: it makes sure that whenever the Power BI service requests data from Seeq, the data range window updates to "now". https://support.seeq.com/space/KB/112868662/OData+Export#Configuring-the-OData-Export

Step 2: Create an access key and authenticate with it in Power BI to begin building your dashboards. https://support.seeq.com/space/KB/740721558

Step 3: Publish to your Power BI workspace of choice.

Step 4: In your Power BI workspace, open the settings for your dataset.

Step 5: For each export included in your dataset, click the "edit credentials" link and enter your access key information from Step 2. Click "Sign In" to verify the information.

Step 6: Configure your Scheduled Refresh cadence and click Apply.
6. ## Create constant signal as Max/Min of signal

The first response, with the hard-coded dates, will give you the answer you are looking for as long as you do not anticipate adding new capsules to the "Data Valid" condition in the future. The part of the formula that limits the scope of the search is the $signal.within($ValidData) section. This means that only data that falls within capsules of the $ValidData condition AND within the capsule("2020-01-01T00:00:00Z", "2022-07-28T00:00:00Z") date range is included in the calculation.
7. ## Create constant signal as Max/Min of signal

It is possible to create a moving window for the SearchArea:

```
$SearchArea = capsule("2020-01-01T00:00:00Z", now())
```

However, there could be some performance impact if there are a lot of downstream calculations dependent on this value. Since this value would need to be continuously re-evaluated, Seeq will not be able to cache the result, so this number, as well as any calculations which depend on it, will show up as dotted lines indicating that the results are subject to change. If this is just for a visualization, or the number of data points is not that large, it may not be a problem; however, if you are seeing performance issues, consider moving the SearchArea back to a fixed range.
8. ## Create constant signal as Max/Min of signal

I think what you are going for will look like the formula below, where $SearchArea is the total range within which any of your valid data capsules could fall (you can be very conservative with these dates). This formula will work if you have multiple valid data range capsules, as long as they all fall within the $SearchArea.

```
$SearchArea = capsule("2020-01-01T00:00:00Z", "2022-07-28T00:00:00Z")
$Signal.within($ValidData).maxValue($SearchArea).toSignal()
```
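A rough Python/numpy sketch of what within() followed by maxValue() is doing may help make the scoping clear (the signal, capsules, and dates are all invented for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical daily signal and "valid data" capsules.
idx = pd.date_range("2020-01-01", periods=100, freq="D")
signal = pd.Series(np.arange(100, dtype=float), index=idx)
valid_capsules = [
    (pd.Timestamp("2020-01-10"), pd.Timestamp("2020-01-20")),
    (pd.Timestamp("2020-02-01"), pd.Timestamp("2020-02-15")),
]

# Analogue of $Signal.within($ValidData): keep only samples inside valid capsules.
mask = pd.Series(False, index=idx)
for start, end in valid_capsules:
    mask |= (idx >= start) & (idx <= end)

# Analogue of .maxValue($SearchArea).toSignal(): the max over the search area
# becomes a single constant value.
search = (idx >= pd.Timestamp("2020-01-01")) & (idx <= pd.Timestamp("2020-03-01"))
constant_max = signal[mask & search].max()
```

Only samples that are inside BOTH a valid capsule and the search area contribute to the result, which mirrors the AND described above.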
9. ## Simple Graph Deviation

Ray,

There are probably two ways to approach this. I would try each out and see which works best to capture what you are looking for.

Method 1: Use derivative() to find the instantaneous rate of change, then search for periods when that instantaneous value is high for an extended period of time.

Step 1: Create the derivative signal. You may also optionally want to apply some simple smoothing to your raw signal to accommodate any spikes in the data. In the example below I am using 2 min smoothing and the agileFilter() function, but this should be tuned to your data. You could also add the abs() function to the end of this formula if you are interested in any type of rate-of-change event, not just positive increases.

```
$TankLevel.agileFilter(2min).derivative('h')
```

Step 2: Use Value Search to find periods when the derivative is above your target value of 4 for a specified period of time (30 minutes in the demo below).

Method 2: Directly calculate the rate of change over an hour at a specified sampling rate. The formula below calculates the delta between the signal at a point in time and one hour later. The periods() function on the second line sets up the sampling interval, so this is evaluated every 10 minutes. The startKey() parameter places the value of the difference at the start of the 1-hour period; this could be adjusted to endKey() or middleKey() depending on your needs. Finally, the toStep() function makes this a step-interpolated signal, but you could remove this line if you would like a linearly interpolated value. For this example the step interpolation helps tell the story of the delta evaluation at distinct moments in time.

```
$TankLevel.aggregate(delta(), periods(1hour, 10min), startKey())
  .toStep()
```

Finding periods of high rate of change would then be the same as the Value Search step in Method 1.
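Method 2 can be prototyped outside Seeq as well. Here is a hedged Python/pandas sketch, where shift() and resample() stand in for delta() and periods(), and the tank-level data is synthetic:

```python
import numpy as np
import pandas as pd

# Hypothetical tank level sampled every minute, rising roughly 2 units/hour.
idx = pd.date_range("2023-01-01", periods=360, freq="min")
level = pd.Series(np.linspace(0.0, 12.0, len(idx)), index=idx)

# Analogue of aggregate(delta(), periods(1hour, 10min), startKey()):
# the change over the next hour (60 one-minute samples), placed at the
# window start, evaluated every 10 minutes.
hourly_change = level.shift(-60) - level
delta = hourly_change.resample("10min").first()
```

Searching `delta` for values above a threshold then plays the role of the Value Search step described above.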
10. ## Display Current Values of Signals in an Organizer

Quick guide to adding a table of current signal values to an Organizer Topic.

Create the table in Workbench:
• Add signals of interest to the display and then switch to Tables & Charts mode.
• Add a new column with the Last Value statistic.
• (Optional) Remove the Average column and rename the Last column to "Current Value".

Add the table to Organizer and set up the date range:
• Insert the table into your document, either by pasting in the URL of the workbench or by navigating to the document.
• Create a date range and update schedule and attach it to the table. The range for the data can be anything from 1 day to 1 hour, depending on how frequently your signals are sampled; make sure to set a duration that will always include at least one data point.
• Set the update rate for the document. In this example 1-hour updates are shown, but you can change the update frequency if a more live view is desired.
11. ## Delaying or Shifting the Time of a Signal

Brett,

There is not currently a direct equivalent function that would allow you to move a capsule by a variable amount. However, below is a formula that does the same thing in a couple of steps. It comes with a couple of caveats:

• If you have capsule properties on the first calculation, they will not be transferred over to the delayed condition.
• This formula will delay the start and end of the capsule by the same amount, as defined by the value of your delay signal at the capsule start. You could probably extend this to do more complex transformations if needed.

```
$step1 = $condition.aggregate(totalDuration("min"), $condition, startKey(), 0s)
$step2 = $step1.move($timeShiftSignal, 2h)
$step2.toCapsules($sample -> capsule($sample.key(), $sample.key() + $sample.value()), 30d)
```

Let me know if this helps get you on the right track. Also, I am curious to understand more about your use case so that we can help improve the built-in functions in the future.

Shamus
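To make the idea concrete, here is a small Python/pandas analogue of "shift each capsule by the delay value in effect at its start" (the capsules and the delay signal are invented for illustration):

```python
import pandas as pd

# Hypothetical capsules (start/end pairs) and a delay signal.
capsules = pd.DataFrame({
    "start": pd.to_datetime(["2023-01-01 00:00", "2023-01-01 06:00"]),
    "end": pd.to_datetime(["2023-01-01 01:00", "2023-01-01 08:00"]),
})
delay = pd.Series(
    [pd.Timedelta("30min"), pd.Timedelta("2h")],
    index=pd.to_datetime(["2023-01-01 00:00", "2023-01-01 05:00"]),
)

# Look up the delay in effect at each capsule start (the last delay sample
# at or before it), then move both the start and the end by that amount.
shift = capsules["start"].map(delay.asof)
shifted = capsules.assign(start=capsules["start"] + shift,
                          end=capsules["end"] + shift)
```

Both endpoints move by the same amount, matching the caveat above that the start and end are delayed together.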
12. ## Create your own formula v50

Eric, There is not currently a way to add stats to the capsule panel. What kind of statistics are you interested in adding?
13. ## The starting time of the average signal

The Scorecard tool does not have all of the timezone options in it and instead uses the default server time. The easy workaround is to create a monthly condition with your desired timezone using the Periodic Condition tool, and then switch your Scorecard to a "Condition" type using that new monthly condition.
14. ## Set property for capsules

I am sure there are probably a few ways to do this, but here is a solution I came up with:

```
$conditionToSamples = $condition.aggregate(count(), $condition, durationKey())
$countingRange = condition(capsule('2020-01-01T00:00Z', '2023-01-01T00:00Z'))
$countSignal = $conditionToSamples.runningSum($countingRange)
$condition.setProperty('Capsule Count', $countSignal, average())
```

Step-by-step outline:
• $conditionToSamples - take your input condition and turn it into a signal with a value of 1 whenever the condition exists.
• $countingRange - the range we are going to count these capsules over; in this example, the beginning of 2020 to the beginning of 2023.
• $countSignal - create a signal that counts up those values for each capsule, starting at the start date.
• Finally, set the value of the $countSignal as a property on your original condition.
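The counting trick can be sanity-checked with a few lines of Python/pandas (the capsule dates are invented for illustration):

```python
import pandas as pd

# Hypothetical condition: capsules with start/end timestamps inside the counting range.
capsules = pd.DataFrame({
    "start": pd.to_datetime(["2021-07-15", "2021-03-01", "2022-02-02"]),
    "end": pd.to_datetime(["2021-07-20", "2021-03-05", "2022-02-10"]),
})

# Analogue of the running-sum approach: each capsule contributes 1, and the
# cumulative count (ordered by start time) becomes the 'Capsule Count' property.
capsules = capsules.sort_values("start").reset_index(drop=True)
capsules["Capsule Count"] = range(1, len(capsules) + 1)
```

Each capsule ends up carrying its ordinal position within the counting range, which is what the running sum produces in Seeq.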
15. ## Removeoutliers function on signals with Gaps

Today in Office Hours I ran into an interesting problem when using the removeOutliers() function on a signal that also had gaps in the data. If you use the function directly on a signal of this type, it will not detect the outlier point as you might expect. However, there is a quick workaround that I will detail below.

The signal looks like the one above, where the outlier was right after a data gap. To work around this problem, we chained together a couple of functions in Formula:

```
$gaps = $signal.isValid()
$signal.validValues().removeOutliers().within($gaps)
```

Step 1: Create a condition, $gaps, that captures only the periods of time that contained valid data in the original signal.

Step 2: Use the validValues() function to ignore the gaps in the original signal, then run the removeOutliers() function, and finally add back in the gaps by using the within() function.
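The same "drop the gaps, detect outliers, restore the gaps" pattern can be sketched in Python/pandas. Note that a simple median-absolute-deviation rule stands in here for Seeq's removeOutliers(), and the signal is synthetic:

```python
import numpy as np
import pandas as pd

# Hypothetical signal with a gap (NaN run) followed by an outlier.
idx = pd.date_range("2023-01-01", periods=12, freq="h")
signal = pd.Series(
    [10, 11, 10, np.nan, np.nan, 50, 10, 11, 10, 9, 10, 11],
    index=idx, dtype=float,
)

# Analogue of $signal.validValues().removeOutliers().within($gaps):
# work on the valid samples only, flag outliers, then restore the gaps.
valid = signal.dropna()
med = valid.median()
mad = (valid - med).abs().median()
cleaned_valid = valid.where((valid - med).abs() <= 3 * max(mad, 1e-9))
cleaned = cleaned_valid.reindex(signal.index)  # original gaps stay NaN
```

Running the outlier test on `valid` rather than `signal` is the key step; it keeps the gap from masking the outlier that immediately follows it.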