SBC

Members
  • Posts

    22
  • Joined

  • Last visited



  1. In a particular use case of the Prediction tool, the end user is interested in the regression fit on the data in the current display range only, i.e., the regression training window should automatically track the current display range instead of requiring a manual update every time the display range changes. For example, this would be useful in a typical XY plot analysis with a curve fit overlaid on the data points. Is there any simple trick with capsules/conditions that can be used to make the training window always default to the current display/investigation range? Alternatively, can we use a date/time formula so the user can adjust it from the Details pane?
  2. Thanks Mark for confirming that it is not a limitation to have multiple jobs - I will double-check and make sure there are no other issues with the notebook/schedule command stopping it from executing.
  3. I have two notebooks that have to be run at a cadence of every 6 months - the first notebook's output (image files) is used by the second notebook (to push the image files to an Organizer topic). I scheduled two jobs with separate spy.jobs.schedule calls using the URL of the corresponding notebook, staggering the schedule times to be an hour apart. Attached below are some code snippets showing the two cells of code that scheduled the jobs separately (see also the scheduling sketch after this list). However, when the scheduled time was reached, Data Lab executed the second notebook only (the Job_Results folder contained the execution snapshot of the second notebook corresponding to Job2_notebook_url only). Is this a limitation of the current Data Lab architecture, i.e., can only a single notebook be scheduled to run per Data Lab project?
  4. I am looking to update an Organizer topic every few months in a year (for example, quarterly = 3 months, biannually = 6 months). Organizer in version R57.x does not seem to show this option (I believe this was a feature available in previous versions). Please advise.
  5. I am working collaboratively with another team member on a Workbench analysis and a Data Lab project - both of which are currently shared with full access rights ('Manage'). Can the spy.push syntax use a shared folder path? (I understand there is also a workbook ID input argument supported in spy.push, but I wanted to confirm whether the path-based method is feasible for shared workbenches - see the sketch after this list.)
  6. Is there detailed documentation of the available keyword arguments for the Metric attribute properties? For example, it was not straightforward to know which keyword to pick to get Statistic = Value at End (the underlying AggregationFunction was named differently: 'endValue'). In other words, can we assume all the attribute properties will have the same names as the GUI drop-down menu items? Please see the attached screenshots illustrating the question:
  7. Thanks Sanman - the SDK reference will be really helpful. In the 'Step-by-step...' code for the asset tree push, is there a way to prevent existing worksheets from getting archived? (I originally thought it never did that, but I am seeing it happen on my current workbook.)
  8. Hello Sanman - I had a follow-up question on this same subject. I am able to 'unarchive' a worksheet by using spy.pull, knowing the specific worksheet ID (fortunately, because it was used in an Organizer topic). I want to confirm whether there is a spy function that can retrieve 'archived' worksheets inside a workbook. Scenario: when I am using the 'Step-by-step...' code to push an asset tree into an existing workbook, it archives the pre-existing worksheets. I ran into this in one of my use cases (for the Temp Mixing Points asset tree I was showing you a while ago). Unarchiving worksheets whenever the asset tree needs an update from the end users can be a critical step in my Data Lab workflow. Adding some code snippets on how I am doing it today (see also the unarchiving sketch after this list)... Thank you.
  9. I am frequently getting the following error message (screenshot attached) while updating my KPI metrics table. Is there any specific root cause for this, or is it just a random glitch in the data queries? Thank you.
  10. Can the above method be used to push a custom Python plot (for example, one generated using Python visualization libraries) to a workbench? I have seen some advanced topics covering how to push one to an Organizer topic (which requires an HTML type), but I was wondering whether it can be done for a workbench.
  11. Thank you Sanman - I believe there is a difference in speed when the calculated signal is pushed as an asset attribute (and cached for each asset) vs. keeping it as a worksheet formula (the cached formula has to re-populate when swapping assets).
  12. I need to calculate a simple difference (delta) between two time series signals feeding from a continuous historian into Seeq over the past history (10 years) and keep it available in Seeq for other aggregate calculations. This will have to be done over several assets in an asset tree (similar to the Example tree above). Currently, my workbooks try to calculate 'just in time' whenever we open the workbook, which causes a lot of lag in getting the outputs because the two signals first have to be pulled and the delta calculated. Will taking the above approach speed up my worksheet calculation performance if the delta value is pre-pushed with Seeq Data Lab on a regular basis (say, daily)? I am also planning to combine this with the asset tree signal push script shared by a Seeq team member in another post (see the delta-push sketch after this list).
  13. Hi Thorsten, thank you very much - these functions are great solutions to my questions (and more...).
  14. I have a continuous time signal (say T) and a capsule defined by a value search (e.g., a simple capsule Y > 120, where Y is another signal). I want to get the histogram of only the values of T that satisfy the capsule condition, i.e., get hist(T) from data pulled for all capsules where Y(t) > 120 (see the filtering sketch after this list).
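
Scheduling sketch for question 3: a minimal example of how two notebooks in the same Data Lab project might be scheduled independently with spy.jobs.schedule, assuming the datalab_notebook_url parameter and Quartz-style cron expressions. The URLs, times, and cron strings below are placeholders, not the original poster's values.

```python
from seeq import spy

# Placeholder notebook URLs; replace with the URLs of the two notebooks in your project.
job1_notebook_url = 'https://seeq.example.com/data-lab/<PROJECT_ID>/notebooks/Generate_Images.ipynb'
job2_notebook_url = 'https://seeq.example.com/data-lab/<PROJECT_ID>/notebooks/Push_To_Organizer.ipynb'

# Run the image-generating notebook at 02:00 on Jan 1 and Jul 1 (every 6 months).
spy.jobs.schedule('0 0 2 1 1,7 ?', datalab_notebook_url=job1_notebook_url)

# Run the Organizer-push notebook an hour later so the image files already exist.
spy.jobs.schedule('0 0 3 1 1,7 ?', datalab_notebook_url=job2_notebook_url)
```

Since multiple jobs per project are confirmed not to be a limitation (reply 2 above), each call should register its own job against the notebook named in datalab_notebook_url.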
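
Sketch for question 5: the two ways a target workbook might be referenced in spy.push, by workbook ID (which the post already notes is supported) and by a folder path with ' >> ' separators. The metadata, path, and ID here are hypothetical, and whether the path form resolves a workbook in a shared folder should be verified against your SPy version.

```python
from seeq import spy
import pandas as pd

# Hypothetical calculated item to push; replace with your own metadata.
metadata = pd.DataFrame([{
    'Name': 'Shared Calc Example',
    'Type': 'Signal',
    'Formula': 'sinusoid()'
}])

# Option 1: reference the shared workbook by its ID (placeholder GUID).
spy.push(metadata=metadata,
         workbook='2AA139EA-0000-0000-0000-000000000000',
         worksheet='From Data Lab')

# Option 2 (assumed form): reference the workbook by its folder path.
spy.push(metadata=metadata,
         workbook='Shared Team Folder >> Process Analysis >> KPI Workbench',
         worksheet='From Data Lab')
```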
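
Unarchiving sketch for questions 7 and 8: one way to restore an archived worksheet when its ID is known is to clear its 'Archived' property through the Seeq SDK that SPy exposes. This is a sketch under the assumption that ItemsApi.set_property is available in your SDK version; the worksheet ID is a placeholder, and it presumes the ID is already known (e.g., recovered from the Organizer topic, as described in the post).

```python
from seeq import spy
from seeq import sdk

# Placeholder: the ID of the worksheet that was archived by the asset tree push.
archived_worksheet_id = '1B2C3D4E-0000-0000-0000-000000000000'

# Use the authenticated SPy client to reach the Items API and set the
# 'Archived' property back to false, which restores the worksheet.
items_api = sdk.ItemsApi(spy.client)
items_api.set_property(
    id=archived_worksheet_id,
    property_name='Archived',
    body=sdk.PropertyInputV1(value=False)
)
```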
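
Delta-push sketch for question 12: one way to avoid the just-in-time lag is to define the delta once as a server-side calculated signal with spy.push, so the workbook no longer builds it on open. The signal names, path, and workbook name are hypothetical; the 'Formula'/'Formula Parameters' metadata columns follow the standard SPy pattern for pushing calculated items.

```python
from seeq import spy
import pandas as pd

# Hypothetical source signals on one asset; adjust names/path to your tree.
sig_a = spy.search({'Name': 'Temperature Inlet',
                    'Path': 'Example >> Cooling Tower 1 >> Area A'})
sig_b = spy.search({'Name': 'Temperature Outlet',
                    'Path': 'Example >> Cooling Tower 1 >> Area A'})

# Define the delta as a calculated signal on the server; Seeq can cache its
# results, so the worksheet does not recompute the difference from scratch.
metadata = pd.DataFrame([{
    'Name': 'Temperature Delta',
    'Type': 'Signal',
    'Formula': '$a - $b',
    'Formula Parameters': {'a': sig_a.iloc[0]['ID'], 'b': sig_b.iloc[0]['ID']}
}])
spy.push(metadata=metadata, workbook='Delta Precalc Example')
```

The same Formula / Formula Parameters pattern can be repeated per asset inside the asset tree push script mentioned in the post; alternatively, if stored values are required, the delta samples themselves could be pulled, computed in pandas, and pushed as data on a daily schedule.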
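
Filtering sketch for question 14: a pull-and-filter approach in Data Lab that pulls T and Y on a common grid, keeps only the rows where Y > 120, and histograms the remaining T values. The item names, date range, and grid below are placeholders.

```python
from seeq import spy
import pandas as pd

# Hypothetical items standing in for T and Y; replace with your own searches.
t_item = spy.search({'Name': 'Temperature'}).head(1)   # the signal T
y_item = spy.search({'Name': 'Pressure'}).head(1)      # the signal Y

# Pull both signals on the same grid so each row pairs T(t) with Y(t).
df = spy.pull(pd.concat([t_item, y_item]),
              start='2024-01-01', end='2024-06-30',
              grid='1min', header='Name')

# Keep only timestamps inside the capsules (Y > 120), then histogram T there.
t_in_capsules = df.loc[df['Pressure'] > 120, 'Temperature']
t_in_capsules.hist(bins=50)
```

An alternative that stays entirely inside Seeq is a Formula such as $t.within($y > 120), which keeps only the T samples falling inside the condition; the resulting signal can then be aggregated with the Workbench Histogram tool.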