
All Activity


  1. Yesterday
  2. I am setting up a variety of datasources that connect to a SQL database. Is there a way to delete them if I no longer want them around? Also, when I edit the configuration of a datasource, such as its SQL query or Name value, the datasource does not appear to update within the Workbench or Data Lab after I save my edits. Why might this be happening?
  3. Last week
  4. Hi Tranquil Oshan, I have a workaround, but as you can see from the steps below, it may not be ideal either. Can you share what you hope to achieve with this polynomial fitting of min and max? If I can understand your objective, maybe there's another way we can approach it.
     Step 1: Create the binning condition using Value Search. In this example I have 8 conditions with a bin size of 5.
     Step 2: Calculate the max and min value during each bin.
     combineWith(
       $t.aggregate(maxValue(), $a.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $b.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $c.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $d.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $e.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $f.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $g.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(maxValue(), $h.removeLongerThan(5d), startKey(), 0s))
     combineWith(
       $t.aggregate(minValue(), $a.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $b.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $c.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $d.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $e.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $f.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $g.removeLongerThan(5d), startKey(), 0s),
       $t.aggregate(minValue(), $h.removeLongerThan(5d), startKey(), 0s))
     Step 3: Create the X-axis using the setProperty() function with the 'Bin' value corresponding to each max value.
     combineWith(
       $t.aggregate(maxValue(), $a.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 5),
       $t.aggregate(maxValue(), $b.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 10),
       $t.aggregate(maxValue(), $c.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 15),
       $t.aggregate(maxValue(), $d.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 20),
       $t.aggregate(maxValue(), $e.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 25),
       $t.aggregate(maxValue(), $f.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 30),
       $t.aggregate(maxValue(), $g.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 35),
       $t.aggregate(maxValue(), $h.removeLongerThan(5d), startKey(), 0s).toCapsules().setProperty('Bin', 40))
       .toSignal('Bin', startKey()).toDiscrete()
     Step 4: Switch to the XY Plot.
     Step 5: Click the f(x) button and create a New Prediction for the min and max polynomials. Expand the training window; in this example the training window is set to 1 year (7/6/2022 11:26 - 6/9/2023 17:26).
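     For readers who export the data out of Seeq, the bin / min-max / polynomial-fit sequence above can be sketched in plain Python. This is an illustrative sketch with synthetic data, not Seeq Formula; the bin width of 5 and the quantile outlier filter mirror the steps and requirements discussed in this thread:

     ```python
     import numpy as np
     import pandas as pd

     # Synthetic x/y samples standing in for the exported signal values.
     rng = np.random.default_rng(0)
     x = rng.uniform(0, 40, 500)
     y = 0.05 * x**2 - x + rng.normal(0, 2, 500)
     df = pd.DataFrame({"x": x, "y": y})

     # Optional outlier filter based on quantiles (per the original question).
     low, high = df["y"].quantile([0.01, 0.99])
     df = df[df["y"].between(low, high)]

     # Steps 1-2: bin the x-axis in widths of 5 and take min/max of y per bin.
     df["bin"] = (df["x"] // 5) * 5 + 5   # label each bin by its right edge: 5, 10, ..., 40
     span = df.groupby("bin")["y"].agg(["min", "max"]).reset_index()

     # Step 5: fit a quadratic through the per-bin max and min values.
     max_fit = np.polynomial.Polynomial.fit(span["bin"], span["max"], deg=2)
     min_fit = np.polynomial.Polynomial.fit(span["bin"], span["min"], deg=2)
     ```

     In Seeq itself, the XY Plot prediction tool performs the equivalent of the two polynomial fits.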
  5. I need to accomplish the following:
     - Bin the x-axis
     - Span the y-axis for each bin (max-min)
     - Filter outliers based on quantiles
     - Finally, fit a polynomial curve based on the max and min values of the span (joining the mid values in each bin)
     But I can't find an appropriate way to do it.
  6. Hello Emil, I believe this is a simple table and you are displaying raw tag values, right? You can change the number of digits for a signal using the signal's item properties. Have a good day, --Selmane
  7. I would like to adjust the height of each lane in a Worksheet. For example it could be done by setting a percentage, or by a dragging option between two lanes.
  8. Hi, I would like to see the possibility to change the format of the predefined statistics in a Table. For example, I would like to choose the number of decimals for "Avg", "S.D.", and "% Duration", and for "Duration" I would like the option to show only years, months, days, hours, minutes, and/or seconds.
  9. Gotcha. The good news is that coming in R63 will be a way to force string signal values to be upper, lower, or title case. The bad news is that R63 is not yet released, though it's right around the corner. If your signal value is consistent, one way to tackle this would be to do a replace on the string to standardize the case:
     $string_signal
       .replace('BLU','Blu')
       .replace('LB','lb')
     You can then create a condition using .toCondition(), which will span the entire duration over which the signal value doesn't change:
     $replaced_string.toCondition()
     The result will be a single condition for that string state. As noted, this will be simpler and more robust starting in R63, where step 1 can be replaced by $string_signal.lower(), which will force the signal value to be all lower case (alternatively, .upper() and .toTitleCase() will also be available for your use). Let me know if this helps.
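     As an illustration of what the replace-then-toCondition combination does, here is a hedged pandas sketch (hypothetical timestamps and values): normalize the case, then collapse each run of unchanged values into a single interval, analogous to one capsule per constant-value stretch:

     ```python
     import pandas as pd

     # Hypothetical step-signal samples whose value only differs by case.
     s = pd.Series(
         ["BLU LB", "Blu lb", "BLU LB", "RED LB", "red lb"],
         index=pd.date_range("2024-01-01", periods=5, freq="h"),
     )

     # Step 1: standardize case (what .lower() will do natively in R63).
     norm = s.str.lower()

     # Step 2: collapse runs of unchanged values into one interval per run,
     # analogous to .toCondition() creating one capsule while the value is constant.
     df = norm.rename("value").rename_axis("time").reset_index()
     df["run"] = (df["value"] != df["value"].shift()).cumsum()
     capsules = df.groupby("run").agg(
         start=("time", "first"), end=("time", "last"), value=("value", "first")
     ).reset_index(drop=True)
     ```

     With these sample values, the five raw samples collapse into two intervals, one per distinct (case-insensitive) state.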
  10. Patrick, thanks for your swift reply. My goal in this case is to combine the two variations of the same capsule into one continuous capsule.
  11. Hi protonshake - To convert a step signal into capsules, you can use the .toCapsules() or .toCondition() formula. The former will create a capsule between each sample, regardless of whether the value has changed, whereas .toCondition() will create a continuous capsule while the value is not changing. Regarding your input signal and changing case, is the end goal for the above to have a single capsule for the duration shown (i.e., ignore the signal case), or do you want them separated out each time the source signal case changes?
  12. Hello everyone, I am attempting to correlate data to a product code that is in Step interpolation however, the way the step signal comes into Seeq is rather strange. There are peaks and valleys but the text value is similar (capitalization differences) at both the peak and the valley. Is there a way to smooth this signal out in Seeq to make each step a single capsule?
  13. Hi Saha, from your example it seems you want to filter your signal for a specific value to create a condition. In general it is easier to do this using a Value Search. The syntax you want to use only works on conditions, but you are using it on a signal. Therefore you get the syntax error. Can you provide some more details on what you are trying to achieve? Regards, Thorsten
  14. So are you saying I can't use the keep() function for this condition? I have a requirement to do that and then combine the variables via combineWith(), but I am stuck here.
  15. Earlier
  16. That's the number of unique items, and it includes lots of things:
     - Worksheets
     - Worksteps (these are the individual worksheet "states" that define the displayed items and all other worksheet configuration at a particular point in time)
     - Calculated items (signals/conditions/scalars/metrics/histograms) -- this is referred to as the "item inventory"
     - Journal text
     - Journal images
     The things that generally take a long time are the Calculated items. You can try specifying `include_inventory=False` during your push to see if that's a lot faster.
  17. I am pushing 8 worksheets to a workbook and it is taking nearly 3 hours. I notice the count is 321. Is that what it thinks the number of worksheets is, or is it items, or something else? Regardless, this is a much higher number than I think it should be, which might explain why it is taking so long.
  18. Hello Saha, the keep() formula can be used with signals, but it is generally better used with conditions when you want to filter on a specific capsule property. From the screenshot you sent, I understand you want to find periods of interest when the Recipe signal equals *REGN668 C3P1? I would suggest you use the Tools >> Identify >> Value Search tool, then fill in the different fields of the tool with the signal you have and the value you are looking for. If you still encounter issues, I would suggest you book a session with an AE in Seeq Office Hours: https://info.seeq.com/office-hours. You can talk with a Seeq Analytics Engineer who will guide you on how to achieve this. Let me know if this is helpful. Thanks, Selmane
  19. Hi, I am trying to build a formula using the keep() function, but I am getting an error. Any help would be appreciated; I am new to Seeq. This is what I am using:
     $rn.keep('Recipe_NQ', isEqualTo('*REGN668 C3P1'))
  20. Currently, If the labels for cursors are on top of the lane labels, I cannot read what the value is on the cursor. Could the labels for cursors be set so they are on top of the lane labels?
  21. Tamer, I sent you a personal message asking a few questions on this topic. Regards, Teddy
  22. I would like to install a demo version of Seeq on a local Windows machine for testing and POC purposes, along with the Seeq Ignition module. Is it possible to do that? Is there a trial version available to download? Thanks.
  23. Hi Edmund, I have a suggestion that may be helpful. You can create separate conditions for each color threshold - Concern, Investigate, and Shutdown - and put them all in one lane. Then, place the signal under the same lane. By doing this, you can monitor the signal movement on the trend, and the condition color will be shown in the background of the trend. Please refer to the example screenshot below for a better understanding. Let me know your thoughts on this.
  24. Hi Kin, I played with the Treemap approach for a little bit and have built several of those to track other unit metrics, but it doesn't really work for this situation. It's a relatively complex equipment setup that warrants the need to see the signal trends. I'm just targeting an MDI/visual management flavored solution that incorporates both the signals and easily discernable color thresholds. Green: no concern, yellow: investigate, red: shutdown, etc. One of the main reasons is because most of the distribution that receives this report does not use SEEQ and would not be able to click around for more information. They rely purely on the auto-generated PDF. Do you have any ideas on how to potentially manipulate the axis settings so the y-axis bounds are only driven by the min/max of the displayed signal and not the scorecard thresholds? It would be cool if SEEQ had a checkbox next to each threshold in the scorecard metric tab that would let you disable "always display threshold on trend". One can dream! 😇
  25. Hi Edmund, Have you ever considered using Treemap for condition-based monitoring? With Treemap, you can prioritize the conditions you want to monitor and focus on high-priority events when you have lots of process parameters to keep track of. Plus, you can easily drill down and check out what happened with high-priority events over a trend view. For more information on Treemap, please refer to this article.
  26. Hi Joseph, Kindly raise a support ticket via this link. A Seeq representative will assist you.
  27. Tulip is one of the leading frontline operations platforms, providing manufacturers with a holistic view of quality, process cycle times, OEE, and more. The Tulip platform provides the ability to create user-friendly apps and dashboards to improve the productivity of your operations without writing any code. Integrating Tulip and Seeq allows Tulip app and dashboard developers to directly include best-in-class time series analytics into their displays. Additionally, Seeq can access a wealth of contextual information through Tulip Tables. Accessing Tulip Table Data in Seeq Tulip table data is an excellent source of contextual information as it often includes information not gathered by other systems. In our example, we will be using a Tulip Table called (Log) Station Activity History. This data allows us to see how long a line process has been running, the number of components targeted for assembly, actually assembled, and the number of defects. The easiest way to bring this into Seeq is as condition data. We will create one condition per station and each column will be a capsule property. 
This can be achieved with a scheduled notebook:

import requests
import json
import pandas as pd

# This method gets data from a Tulip Table and formats the DataFrame into a Seeq-friendly structure
def get_data_from_tulip(table_id, debug):
    url = f"https://{TENANT_NAME}.tulip.co/api/v3/tables/{table_id}/records"
    headers = {"Authorization": AUTH_TOKEN}
    params = {
        "limit": 100,
        "sortOptions": '[{"sortBy": "_createdAt", "sortDir": "asc"}]'
    }
    all_data = []
    data = None
    while True:
        # Used for paginating the requests
        if data:
            last_sequence = data[-1]['_sequenceNumber']
            params['filters'] = json.dumps(
                [{"field": "_sequenceNumber", "functionType": "greaterThan", "arg": last_sequence}])
        # Make the API request
        response = requests.get(url, headers=headers, params=params)
        if debug:
            print(json.dumps(response.json(), indent=4))
        # Check if the request was successful
        if response.status_code == 200:
            # Parse the JSON response
            data = response.json()
            all_data.extend(data)
            if len(data) < 100:
                break  # Exit the loop once the last page has been fetched
        else:
            print(f"API request failed with status code: {response.status_code}")
            break
    # Convert the JSON data to a pandas DataFrame
    df = pd.DataFrame(all_data)
    df = df.rename(columns={'id': '_id'})
    df.columns = df.columns.str.split('_').str[1]
    df = df.drop(columns=['sequenceNumber', 'hour'], errors='ignore')
    df['createdAt'] = pd.to_datetime(df['createdAt'])
    df['updatedAt'] = pd.to_datetime(df['updatedAt'])
    df = df.rename(columns={'createdAt': 'Capsule Start', 'updatedAt': 'Capsule End',
                            'duration': 'EventDuration'})
    df = df.dropna()
    return df

# This method builds the Seeq metadata for one station's condition
def create_metadata(station_data, station_name):
    metadata = pd.DataFrame([{
        'Name': station_name,
        'Type': 'Condition',
        'Maximum Duration': '1d',
        'Capsule Property Units': {'status': 'string', 'id': 'string',
                                   'station': 'string', 'duration': 's'}
    }])
    return metadata

# This method splits the DataFrame by station. Each station will become a condition in Seeq.
def create_dataframe_per_station(all_data, debug):
    data_by_station = all_data.groupby('station')
    if debug:
        for station, group in data_by_station:
            print(f"DataFrame for station: {station}")
            print("Number of rows:", len(group))
            display(group)
    return data_by_station

# This method sends the data to Seeq
def send_to_seeq(data, metadata, workbook, quiet):
    spy.push(data=data, metadata=metadata, workbook=workbook,
             datasource="Tulip Operations", quiet=quiet)

data = get_data_from_tulip(TABLE_NAME, False)
per_station = create_dataframe_per_station(data, False)
for station, group in per_station:
    metadata = create_metadata(group, station)
    send_to_seeq(group, metadata, 'Tulip Integration >> Bearing Failure', False)

The above notebook can be run on a schedule with the following command:

spy.jobs.schedule('every 6 hours')

This will pull the data from the Tulip Table into Seeq to allow for quick analysis. The notebook above requires you to provide a tenant, API key, and table name, and it uses this REST API method to get the records. Once provided, the data will be pulled into a datasource called Tulip Operations and scoped to a workbook called Tulip Integration. We can now leverage the capsule properties to start isolating interesting periods of time. For example, using the formula

$ea.keep('status', isEqualTo('RUNNING'))

where $ea is the Endbell Assembly condition from the Tulip Table, we can create a new condition keeping only the capsules where the state is RUNNING. Once a full analysis is created, Seeq content can be displayed in a Tulip App as an iFrame, allowing for the combination of Tulip and Seeq data. Data can also be pushed back to Tulip using the create record API, which allows Tulip Dashboards to contain Seeq content.
  28. Hello, I'm trying to look at an Organizer's content, but when the Organizer is opened or refreshed using the daily or monthly refresh, the content within it does not load and produces a "content capture timed out" error. However, the analyses the content is inherited from display fine in the workbench. Does anyone know why this is happening or what could be causing the issue?