
Shamus Cunningham

Super Seeqer
  • Posts

    56
  • Joined

  • Last visited

  • Days Won

    18

Everything posted by Shamus Cunningham

  1. Starting in Version 50, Seeq includes a Corporate drive, which is a great place to store shared documents. In order to push content to the Corporate drive using the folder path structure, you need to include a bit of special code for the base "Corporate" folder:

     spy.push(data=csv_file, workbook=f'{spy.workbooks.CORPORATE} >> My Folder Name >> WorkbookName')
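     For context, here is a minimal end-to-end sketch of that push. The CSV file name, folder name, and workbook name are placeholders, and it assumes the seeq-spy module is installed and spy.login() has already been run.

     import pandas as pd
     from seeq import spy

     # Read the data to push; spy.push expects a DataFrame with a datetime index
     # (ideally timezone-aware)
     csv_file = pd.read_csv('my_data.csv', index_col=0, parse_dates=True)

     # Prefixing the path with spy.workbooks.CORPORATE lands the workbook in the Corporate drive
     spy.push(
         data=csv_file,
         workbook=f'{spy.workbooks.CORPORATE} >> My Folder Name >> WorkbookName'
     )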
  2. Here is a nice, simple example from a question emailed in this morning. This can be accomplished in a couple of minutes using a combination of Value Search, to create a condition that captures our "Pump Running" state, and Chain View, to filter our trends to only those "Pump Running" time periods.

     Step 1 - Create a condition that captures your event of interest. In our simple example that will be whenever the Stage tag is != "OFF". This step can be much more complex and use the full range of tools in the "Identify" category to refine your search to exactly the periods relevant to your analysis.

     Step 2 - Switch to Chain View and zoom to the range you want to look at. You can also now add any other signals that might be of interest to view in the context of your "Running" condition.
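     If you would rather do the same filtering in a script, here is a rough SPy sketch. The 'Stage' and 'Pump Speed' signal names, the date range, and the grid are all placeholders for your own tags and time window.

     import pandas as pd
     from seeq import spy

     # Look up the two signals of interest (names are placeholders)
     items = spy.search({'Name': 'Stage'})
     items = pd.concat([items, spy.search({'Name': 'Pump Speed'})])

     # Pull both on a common grid so the rows line up
     data = spy.pull(items, start='2021-06-01', end='2021-06-08', grid='1min')

     # Keep only the rows where the stage is not OFF -- the scripted analog of
     # viewing the trends in Chain View filtered to the "Pump Running" condition
     running = data[data['Stage'] != 'OFF']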
  3. Using some of the new Asset Group features in R52, we can easily create unique prediction training ranges for each asset. In this example we are going to create three separate training ranges for three assets.

     Step 1 - Find the assets you want to model with your prediction and add them to a new asset group.

     Step 2 - Create a Manual Condition for each asset with the desired training range for that particular asset. These training ranges can be as simple as single capsules or as complex as a Manual Condition combined with a mode-of-operation condition.

     Step 3 - Add the manual conditions to your asset group as a new "Training Range" column.

     Step 4 - Create the prediction using the signals and the condition from the asset group. The key point in the prediction tool is to make sure that you select a wide training window that encompasses the ranges of all the individual assets. Under the advanced section, select the training range condition we created in Step 3. You now have a Temperature Model signal that you can "swap" across all assets in your group, with a different training range on each of the assets.
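     If you prefer to script Step 2 rather than click through the Manual Condition tool, training capsules can also be pushed from SPy as one condition per asset. This is only a sketch; the condition names, dates, and maximum duration are placeholders.

     import pandas as pd
     from seeq import spy

     # One training capsule per asset (names and dates are placeholders)
     training_ranges = {
         'Area A Training Range': ('2021-01-01', '2021-02-01'),
         'Area B Training Range': ('2021-03-01', '2021-04-01'),
         'Area C Training Range': ('2021-05-01', '2021-06-01'),
     }

     for name, (start, end) in training_ranges.items():
         capsules = pd.DataFrame([{
             'Capsule Start': pd.Timestamp(start, tz='UTC'),
             'Capsule End': pd.Timestamp(end, tz='UTC'),
         }])
         # Each push creates a condition that can then be added to the asset
         # group as the "Training Range" column in Step 3
         spy.push(data=capsules, metadata=pd.DataFrame([{
             'Name': name,
             'Type': 'Condition',
             'Maximum Duration': '60d',
         }]))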
  4. Currently in Seeq (June 2021) there is not a method to export ungridded signal data. There are a couple of development items in the works which will help in the future, but for the moment this is the best workaround for oData connections. As a reminder, if you just need the data in Excel, there is an ungridded raw timestamp option available for the Excel export.

     Let's use the following setup as our example. We have a condition which represents my batches and a simple temperature signal that I want to average for each batch.

     Step 1 - Create my average temperature per batch KPI using the Signal from Condition tool. In this example we are selecting the start of the batch to place the KPI timestamp.

     Step 2 - Re-grid my discrete points onto a known fixed grid. This step will lose some fidelity with the batch start timestamps and is the main drawback of this method. Pick a re-grid interval that ensures you will not have two points inside the same interval (hours, minutes, days, weeks).

     $AveCycleTemp.aggregate(average(), hours(), startKey())

     Step 3 - Set up the export on the same new grid by selecting Custom grid period and days, minutes, or hours to match your re-grid interval from Step 2. Select the OData Sample Table endpoint when you create the export.

     Step 4 - One final optional step to clean things up in your BI tool is to remove all the null points when you import the oData feed. The example below shows the process in the Power BI Power Query Editor, but there should be similar steps in other tools.
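     If you have access to Seeq Data Lab or SPy, a scripted equivalent of Steps 2-4 is sketched below. The KPI name, date range, and grid are placeholders, and it assumes the Signal from Condition result from Step 1 already exists.

     from seeq import spy

     # Find the per-batch KPI created in Step 1 (name is a placeholder)
     kpi = spy.search({'Name': 'Average Cycle Temperature'})

     # Pull it onto the same fixed grid used for the oData export in Steps 2 and 3
     data = spy.pull(kpi, start='2021-05-01', end='2021-06-01', grid='1h')

     # Drop the empty grid intervals, mirroring the null removal in Step 4
     data = data.dropna()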
  5. This is a question that came into our Support system today that I thought a number of our users might find interesting.

     Question - I have a signal that is calculating a number of equipment events and I want to create a scorecard with our total compliance to our procedures, showing % compliance per week / month / year.

     For this example we are going to use the Area A -> Compressor Power example data and create a scorecard judging our compliance with our policy that the compressor has to turn on every day.

     Step 1 - Create a metric that measures whether you met your compliance requirement. In this case I will use the Signal from Condition tool to calculate the max compressor power per day.

     Step 2 - Turn your KPI into a daily 0-100% score in Formula. In our case we are going to say that as long as the max kW per day was greater than 1 kW, we are going to consider that a complying event. This step will be different for your calculations, but the end result needs to be a 0-1 signal with a sample for each compliance period (days in this example). Using the pattern $signal.min(1.toSignal()) will clip the signal at 1. The second part of the formula converts things into 0-100%, which will make our scorecards look nice: ($signal*100).setUnits("%")

     Step 3 - Create the scorecard and average all of your 0-100% samples over the reporting period of interest. The reporting period can be created using the Periodic Condition tool. To create different weekly or yearly compliance, simply create another condition in the Periodic Condition tool and generate another scorecard. Multiple scorecards can then be combined into a Topic to create a live updating report that looks a bit like the following, using scorecard thresholds to color the values based on your preferences.
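     The same scoring logic is easy to prototype in pandas if you want to sanity-check the numbers outside Seeq. The sketch below assumes a daily max-power series has already been pulled (for example with spy.pull), and the sample values are placeholders.

     import pandas as pd

     # Max compressor power per day, in kW (placeholder data standing in for Step 1)
     daily_max_kw = pd.Series(
         [3.2, 0.0, 5.1, 4.8, 0.0, 2.9, 6.0],
         index=pd.date_range('2021-06-01', periods=7, freq='D'),
     )

     # Step 2 equivalent: clip at 1 and scale to a 0-100 % daily score
     daily_score = daily_max_kw.clip(upper=1.0) * 100

     # Step 3 equivalent: average the daily scores over each reporting period (weekly here)
     weekly_compliance = daily_score.resample('W').mean()
     print(weekly_compliance)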
  6. For particularly large oData exports it can sometimes be helpful to increase the default Power BI timeout period. This is easily done in the Advanced Editor of the Power Query Editor.

     1) To access the Power Query Editor, click the options menu on your table and select Edit Query.

     2) In Power Query, select the Advanced Editor option to edit the advanced query parameters.

     3) In the OData.Feed() function, add the optional Timeout parameter. The example below is for a 90-minute timeout:

     Source = OData.Feed("https://explore.seeq.com:443/odata.svc/Export_67ExampleDataSignals_DataSet", null, [Implementation="2.0", Timeout=#duration(0, 0, 90, 0)])