Francesco

Members
  • Posts: 9
  • Joined
  • Last visited

Francesco's Achievements

Apprentice (3/14)

  • One Month Later
  • Week One Done
  • Reacting Well
  • First Post
  • Conversation Starter

Recent Badges

0 Reputation

  1. Hello, I have created a program whose task is to pull data from a server, apply some calculations to obtain calculated values with associated metadata, and finally push the values and metadata to the Seeq Workbench. In the Workbench, the pushed signals are organized into different Worksheets. The program is run automatically every hour using spy.jobs, and the values are 'stored' in the Workbench so as to create a time series of hourly points. A scorecard metric was created for every pushed signal in order to visualize the data in 'Calendar' mode in the Worksheet. While the system works properly most of the time, it sometimes happens that all signals get deleted from some Worksheet for apparently no reason. It happens to random Worksheets at random moments (the first time was a few months ago and the last time was a week ago). The signals get deleted from the Worksheet only, not from the Seeq database, so the problem can be solved by searching for the pushed signals one by one in the 'Data' panel of the Workbook and selecting them again. Of course, restoring the Worksheet is tedious work given the high number of signals. Below is a diagram of the data flow; maybe it can help find the problem. Thank you, Francesco
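For context, the hourly push pattern this post describes can be sketched as follows. The signal names, values, and workbook/worksheet names are illustrative assumptions, and the spy.push call itself is left commented out because it requires a live Seeq server:

```python
import pandas as pd

# One new hourly sample per calculated signal (names and values are
# placeholders; real values would come from the upstream calculations).
timestamp = pd.Timestamp('2024-01-01 10:00:00', tz='UTC')
data = pd.DataFrame(
    {'Calculated KPI 1': [42.0], 'Calculated KPI 2': [17.5]},
    index=[timestamp],
)

# Re-pushing every hour with the same workbook/worksheet names appends the
# new points to the same stored signals (requires a live Seeq server):
# spy.push(data=data, workbook='My Workbook', worksheet='KPIs')

print(data.shape)  # one row, two signals
```

Scheduling this notebook with spy.jobs, as the post says, then produces one new point per signal per hour.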
  2. Hello Nuraisyah Rosli, thank you for the response; the method you suggested is working. Once the scorecard metrics are pushed into Seeq, can they be organized into an asset tree? Thank you.
  3. Hello Nuraisyah Rosli, every signal is computed with a different calculation, so I think an asset tree would help mainly by giving a more organized output and arrangement of the signals. The only common 'calculation' is the creation of the periodic condition. Thank you, Francesco
  4. Hello, I am currently working on a project in DataLab in which I have to manage a large number of signals (around 150). The final output should be a table like the one below, but with the last value of all 150 signals. However, in order to obtain such a table I have to create a periodic condition (hourly) in the Workbench and add one signal at a time by clicking on 'Columns'. Since this would be tedious work, I was trying to create a periodic condition in DataLab and apply it to all the signals. Therefore, my questions are: is it possible to create a periodic condition in Seeq DataLab and push it into the Workbench? Is it possible to somehow apply this periodic condition to all the signals in order to obtain the shown table? Also, the table must be updated every hour. I looked at the documentation but couldn't find anything. Thank you
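One way to define the hourly condition in DataLab, assuming that pushing a metadata row containing a Seeq Formula is acceptable here (the item and workbook names are illustrative, and the spy.push call needs a live server, so it is shown commented out):

```python
import pandas as pd

# A condition can be defined purely by a Seeq Formula in a metadata row;
# hours() yields an hourly periodic condition.
condition_metadata = pd.DataFrame([{
    'Name': 'Hourly',
    'Type': 'Condition',
    'Formula': 'hours()',
}])

# Requires a live Seeq server:
# spy.push(metadata=condition_metadata, workbook='My Workbook')

print(condition_metadata.iloc[0]['Formula'])
```

Once pushed, the same condition item can be reused across all 150 signals instead of being recreated per worksheet.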
  5. Hello, I am using Seeq DataLab to create a script that updates the value of a variable every hour. I am then pushing the value into Seeq to show its time series in the Workbench. My question is: how long is the pushed variable retained in the Seeq database? Is it deleted after a certain period of time or under some conditions (available memory for each user, etc.)? Thank you.
  6. Hello Joe, thank you for the answer. In the end, the following lines of code worked for me:

     my_data = spy.pull(
         items=my_signals,
         start='6/6/2023 10:00 AM',
         end='6/7/2023 10:00 AM',
         calculation="$signal.aggregate(totalized(), condition(capsule('2023-06-06T10:00Z', '2023-06-07T10:00Z')), startKey())",
         grid=None
     )

     This is the result:

     On the other hand, your solution returned the following error:
  7. I am trying to apply the function totalized() when pulling data in Seeq DataLab, over data pulled between 6/6/2023 10:00 AM and 6/7/2023 10:00 AM. I am trying to use "$signal.totalized($capsule)" as the calculation parameter, but I do not know how to specify the capsule. I was not expecting to need to specify the capsule, since I am already defining the start and end times in pull(). Here is my code:

     my_data = spy.pull(
         items=my_signals,
         start='6/6/2023 10:00 AM',
         end='6/7/2023 10:00 AM',
         calculation="$signal.totalized()",
         grid=None
     )

     Of course, it returns an error:

     I had the same issue when trying to apply the average() formula. What should I do? Thank you!
  8. Hello Jhon, thank you for the answer. I was looking to compute the average of the signal extracted over a period of one hour. For this purpose, the following lines of code did the job:

     # .replace() rounds down to the last full hour
     end_date = pd.Timestamp.now(tz='Europe/Rome').replace(microsecond=0, second=0, minute=0)
     start_date = end_date - pd.Timedelta('1h')

     # compute the hourly average
     calculation = "$signal.aggregate(average(), Hours('Europe/Rome'), middleKey())"

     my_data = spy.pull(
         items=my_signals,
         start=str(start_date),
         end=str(end_date),
         calculation=calculation,
         grid=None
     )

     After adjusting, I obtained the following DataFrame, where the column represents the average value of the signal between 8:00 and 9:00 am in this case.
  9. Hello, I am new to Seeq and I am trying to pull data from OSIsoft PI in Seeq DataLab. The data are extracted over a capsule of 1 hour, and I would like to display only the average value of the signal over the capsule. I am using the 'calculation' parameter, assigning calculation = "$signal.average($condition)" in the spy.pull() function. However, "$signal.average()" requires $condition as input. How do I specify the $condition? Or should I compute the average of the signal using the built-in formulas in Python? Thank you
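The fix the later posts in this thread converge on, supplying the condition through aggregate() instead of calling average($condition) directly, can be sketched as below. A fixed timestamp stands in for pd.Timestamp.now() to keep the example deterministic, and the spy.pull call is commented out because it needs a live Seeq server:

```python
import pandas as pd

# Round an example "now" down to the top of the hour, then take the
# preceding one-hour window.
now = pd.Timestamp('2023-06-07 09:25:00', tz='Europe/Rome')
end_date = now.replace(minute=0, second=0, microsecond=0)
start_date = end_date - pd.Timedelta('1h')

# aggregate() supplies the condition that average() needs: one capsule
# per hour, keyed at the middle of each capsule.
calculation = "$signal.aggregate(average(), Hours('Europe/Rome'), middleKey())"

# Requires a live Seeq server and a my_signals DataFrame from spy.search():
# my_data = spy.pull(items=my_signals, start=str(start_date),
#                    end=str(end_date), calculation=calculation, grid=None)

print(start_date, end_date)
```

This avoids having to hand a $condition item to average() at all: the hourly capsules are generated inside the calculation string itself.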