

Posted (edited)

Hi,

 

I have a local asset tree scoped to a workbook, and I want to pull in certain signals from different assets starting from a specific branch path.
 

I've been talking a lot with the AI Assistant and reading the SPy documentation, but it's time to switch to humans 🙂

In this example I'm trying to pull all temperature values across a set of 10 production lines under the path 'Dyka group >> Steenwijk >> Extrusie >> Per lijntype >> U3'.

I would expect the code below to work, since I set the recursive option to True.

However, I don't get any signals in return. When I visualize the existing_tree from that workbook, I do see that all 10 assets under this path contain the sensor 'Massa Temperatuur'.

Does this only work for global asset trees, or can I also specify the asset tree ID somewhere in the search?

 

from seeq import spy
import pandas as pd

# Define the time range for the last 3 days
end_time = pd.Timestamp.now()
start_time = end_time - pd.Timedelta(days=3)

# Specify the workbook ID (because the asset tree is not global)
workbook_id = 'B735E74B-E09F-4D13-BFB2-1AAD999993EE'
tree_id = '0EF7FE5B-2F9F-EA20-8B73-E2657B59C913'  # you can find the asset tree ID by clicking the 'i' button on the top level of your asset tree in the Data pane in Seeq Workbench
existing_tree = spy.assets.Tree(tree_id)



# Search for all 'Massa Temperatuur' sensors under the parent 'U3' in the specified asset tree
sensors = spy.search({
    'Path': 'Dyka group >> Steenwijk >> Extrusie >> Per lijntype >> U3',
    'Name': 'Massa Temperatuur',
    'Workbook': workbook_id,
    'Type': 'Signal'
}, recursive=True)  # set recursive=True to search all sub-assets as well

# Pull the "Massa Temperatuur" signal data for the last 3 days
sensor_data = spy.pull(sensors, start=start_time, end=end_time)

 

I know I could also just add the 10 signals to a worksheet and pull them from the worksheet URL, but I want to do this at a much larger scale for a lot more signals.

 

Kr,

 

Jan

Edited by JDLom
  • Seeq Team
Posted (edited)

Hi Jan-- you just need to use the `spy.search(workbook=workbook_id)` function argument instead of specifying a `Workbook` dictionary member.

If you had executed your `spy.search` call in its own cell, you should have received a warning that starts with "The following properties are not indexed..."
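
For example (a sketch reusing the variables from your snippet; untested):

# Pass the workbook as a function argument instead of a 'Workbook' query property
sensors = spy.search({
    'Path': 'Dyka group >> Steenwijk >> Extrusie >> Per lijntype >> U3',
    'Name': 'Massa Temperatuur',
    'Type': 'Signal'
}, workbook=workbook_id, recursive=True)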

Edited by Mark Derbecker
Posted

Thanks Mark, I had to use 'Scoped To' apparently.
 

Searching and pulling multiple signals from different assets under a certain part of the tree is not possible, I guess?
Is the only way to do this to search for one signal name at a time, loop over a list of signal names, append the results together, and then pull them all in?

 

 

  • Seeq Team
Posted

Ah yes-- `Scoped To` is more restrictive: you will only get results for that workbook. (No global items.)

> searching and pulling multiple signals from different assets under a certain part of the tree is not possible I guess?

It should work fine with `recursive=True`. Is it still not returning anything?
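
For example, something like this should return every signal under that branch (a sketch; scoped_to_id is a placeholder for whatever ID your items are scoped to):

# Search for all signals under the branch, without naming a specific signal
# (scoped_to_id is a placeholder -- use the ID your tree is scoped to)
all_signals = spy.search({
    'Path': 'Dyka group >> Steenwijk >> Extrusie >> Per lijntype >> U3',
    'Type': 'Signal',
    'Scoped To': scoped_to_id
}, recursive=True)

# The whole result set can then be pulled in one call (start/end as in your first snippet)
signal_data = spy.pull(all_signals, start=start_time, end=end_time)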

Posted

Yes indeed; because it is a local asset tree, using the workbook ID doesn't work, only 'Scoped To' does.

 

A search apparently only handles one signal name at a time, so I had to loop over a list of signal names one by one:

from seeq import spy
import pandas as pd

# Define the search criteria
search_criteria = {
    'Path': 'Dyka group >> Steenwijk >> Extrusie >> Per lijntype >> U3',
    'Scoped To': '0EF7FE5A-A0DB-EC80-B68C-6E089FB1EA48'
}

# List of names to search for
names = ['Metergewicht','Product naam','Trekbank Snelheid']

# Initialize an empty list to store search results
all_search_results = []

# Perform the search for each name
for name in names:
    search_criteria['Name'] = name
    search_results = spy.search(search_criteria)
    all_search_results.append(search_results)

# Concatenate all search results into a single DataFrame
all_search_results_df = pd.concat(all_search_results, ignore_index=True)

# Display the search results
all_search_results_df
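
From there the results can be pulled in one call, as in my first snippet (a sketch reusing the same 3-day window):

# Pull the data for all matched signals over the last 3 days
end_time = pd.Timestamp.now()
start_time = end_time - pd.Timedelta(days=3)
all_data = spy.pull(all_search_results_df, start=start_time, end=end_time)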

 

 

  • Seeq Team
Posted

That's a reasonable approach; you could also try using a regex, i.e.:

all_names_regex_fragment = '|'.join(names)
search_criteria['Name'] = f'/({all_names_regex_fragment})/'

But the regex might get prohibitively long if you end up with a lot of names.
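
For completeness, that regex query can then be run in a single call (a sketch, assuming the `search_criteria` dict from your snippet above):

# One search call instead of a loop over the names
all_search_results_df = spy.search(search_criteria)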
