Robert Rhodes

  1. @Ken How, that solution is working like a charm, but there seems to be a limit of 10 signals once I open the workbook. Any idea whether there is a limit on the push, or should I be doing something differently?
  2. @Ken How, when I do spy.search and specify estimate_sample_period over a date range, a signal that has data comes back with a valid sample period, and a signal that does not comes back with NaT. After that step, dataframe.dropna(subset=['Estimated Sample Period']) removes the NaT rows, and I am left with a dataset of good tags that I want to insert into a Workbench analysis after doing the convert (a sketch of this search step follows the list). The search I am using is:
     spy.search({'Name': 'Area ?_Compressor Stage', 'Datasource Name': 'Example Data'},
                estimate_sample_period=dict(Start='2019-01-01', End='2019-01-30'))
  3. I am trying to push a curated dataframe of existing tags that was queried with spy.search and then filtered with dropna on 'Estimated Sample Period' to remove stale tags. Any ideas on how to do this? I would also like to apply the formula function .convertUnits('eu') in the process (a sketch of this push step follows the list). Any tips, tricks, or ideas would be appreciated.
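
For reference, a minimal sketch of the search-and-filter step described in post 2 above, assuming the standard SPy import; the search query and date range are the ones quoted in the post:

    from seeq import spy

    # Search for the example signals and have SPy estimate each signal's
    # sample period over the window of interest. Signals with no data in
    # that window come back with NaT in 'Estimated Sample Period'.
    results = spy.search(
        {'Name': 'Area ?_Compressor Stage', 'Datasource Name': 'Example Data'},
        estimate_sample_period=dict(Start='2019-01-01', End='2019-01-30')
    )

    # Drop the stale tags, i.e. rows whose sample period could not be estimated.
    good_tags = results.dropna(subset=['Estimated Sample Period'])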
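And a sketch of the push step asked about in post 3, continuing from the good_tags dataframe in the previous sketch. It builds one calculated signal per good tag that wraps the original signal in convertUnits('eu') and pushes the batch to a workbook. The '$signal' parameter name, the ' (eu)' name suffix, and the 'Converted Tags' workbook/worksheet names are illustrative placeholders, not anything prescribed by SPy:

    import pandas as pd
    from seeq import spy

    # Build push metadata: for each good tag, a formula that converts the
    # original signal's units, with the formula parameter pointing at the
    # tag's Seeq ID from the search results.
    metadata = pd.DataFrame([
        {
            'Name': row['Name'] + ' (eu)',                    # placeholder naming scheme
            'Formula': "$signal.convertUnits('eu')",
            'Formula Parameters': {'$signal': row['ID']},
        }
        for _, row in good_tags.iterrows()
    ])

    # Push the calculated signals into a workbook/worksheet of your choosing.
    spy.push(metadata=metadata, workbook='Converted Tags', worksheet='Converted Tags')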