Everything posted by Johannes Sikstrom

  1. Just for posterity: Seeq/Mark identified this as a bug. I was given a temporary workaround and the bug will be addressed in a future version. The method in the first post was basically correct. "This Development Request has been addressed in future Seeq version(s): 64.0.0, 59.3.5, 61.1.12, 60.3.7, 58.8.7, 63.0.5, 62.0.13"
  2. Hi, I'm having issues pulling raw samples without the default grid. First I create a calculation on one truck: I search for one truck and do a calculation push().

     ```python
     truck1034_signals = spy.search({
         "Path": "path >> to >> truck >> G52_TRC1034"
     })

     Payload_resamp_calc = spy.push(metadata=pd.DataFrame([{
         'Type': 'Signal',
         'Name': 'Payload Resamp',
         'Formula': "$p.resample($c.toNumber()).remove($l2.isNotEqualTo('KR_KID2'))",
         'Formula Parameters': {
             '$c': truck1034_signals[truck1034_signals['Name'] == 'CycleID'],
             '$p': truck1034_signals[truck1034_signals['Name'] == 'Payload'],
             '$l2': truck1034_signals[truck1034_signals['Name'] == 'LoadDestination'],
         }
     }]), workbook='SPy Documentation Examples >> spy.pull')
     ```

     Then I search for more trucks, in this case just 6 trucks for testing. (In reality it's about 60 trucks I want to get.)

     ```python
     # This finds 6 trucks
     truck_assets = spy.search({
         "Name": "G52_TRC103",
         "Path": "path >> to >> truck >> Hauling"
     })

     start = pd.Timestamp("2023-11-17 01:45", tz='CET')
     end = pd.Timestamp("2023-11-17 03:15", tz='CET')

     df = spy.pull(truck_assets, start=start, end=end, calculation=Payload_resamp_calc, header='Name')
     df.head()
     ```

     Running this with the default grid works and gives the result I expect. However, I want raw samples, not the default grid of 15 min. When I add the grid=None option to the pull I get this error:

     ```python
     df = spy.pull(truck_assets, start=start, end=end, calculation=Payload_resamp_calc, header='Name', grid=None)
     ```

     ```
     SPy Error: Pull cannot include conditions when no signals are present with shape='samples' and grid=None
     ```

     Why is it not working without the grid? I tried it for just one truck and it still does not work with grid=None. Is it due to the calculation in some way?
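     In case it helps narrow this down, one check is to pull only the single pushed calculation itself with grid=None, leaving out the asset-swap calculation argument entirely, to see whether raw samples come back when only a signal is requested. A minimal sketch, untested as written here:

     ```python
     # Minimal sketch (assumption, not verified): pull just the pushed calculated
     # signal with grid=None and no 'calculation' asset-swap argument, to check
     # whether the error is tied to the asset pull rather than the formula itself.
     raw_one_truck = spy.pull(Payload_resamp_calc, start=start, end=end, grid=None, header='Name')
     raw_one_truck.head()
     ```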
  3. Hi, yes! Those changes seem to do the trick. I now have this, which works:

     ```python
     data_for_seeq_FCRD_Up = pd.DataFrame({'FCR-D Up Price': prices_upp}, index=pd.to_datetime(dates_upp))
     data_for_seeq_FCRD_Down = pd.DataFrame({'FCR-D Down Price': prices_ned}, index=pd.to_datetime(dates_ned))

     # Combine the data for both signals into a single DataFrame
     data_for_seeq_combined = pd.concat([data_for_seeq_FCRD_Up, data_for_seeq_FCRD_Down], axis=1)

     # Create a single metadata DataFrame for both signals
     metadata_combined = pd.DataFrame({
         'Name': ['FCR-D Up Price', 'FCR-D Down Price'],  # Signal names
         'Value Unit Of Measure': 'EUR/MW',               # Units of measure for both signals
         'Type': 'Signal'
         # Add other required fields for metadata as per your requirements
     }).set_index('Name', drop=False)

     # Push the combined data and metadata to Seeq in one operation
     push_results_combined = spy.push(data=data_for_seeq_combined, metadata=metadata_combined,
                                      workbook='Aitik >> FCR Aitik Monitor', worksheet='FCR Prices',
                                      datasource='Mimer SVK API')
     ```

     I thought I had tried all combinations, but I suspect the missing 'Type': 'Signal' was the culprit. In my earlier attempts to add Type I used just the short form 'Si' (for 'Signal'), and that does not work, which is why I had removed the Type assignment from the DataFrame. Thank you!
  4. Hi (updated since first post), I'm trying to push() 2 external signals plus their unit-of-measure metadata in a single push(). So far I can make it work with 2 separate pushes, one for the data and one for the metadata, like in the code below:

     ```python
     # Create the DataFrames
     data_for_seeq_FCRD_Up = pd.DataFrame({'FCR-D Up Price': prices_upp}, index=pd.to_datetime(dates_upp))
     data_for_seeq_FCRD_Down = pd.DataFrame({'FCR-D Down Price': prices_ned}, index=pd.to_datetime(dates_ned))

     # Combine the data for both signals into a single DataFrame
     data_for_seeq_combined = pd.concat([data_for_seeq_FCRD_Up, data_for_seeq_FCRD_Down], axis=1)

     push_result = spy.push(data=data_for_seeq_combined,
                            workbook='Aitik >> FCR Aitik Monitor', worksheet='FCR Prices')

     # 'push_result' now contains two rows, one for each signal.
     # Update the 'Value Unit Of Measure' for both signals.
     push_result.loc[push_result['Name'] == 'FCR-D Up Price', 'Value Unit Of Measure'] = 'EUR/MW'
     push_result.loc[push_result['Name'] == 'FCR-D Down Price', 'Value Unit Of Measure'] = 'EUR/MW'

     # Push the combined metadata back to Seeq
     push_results_meta_combined = spy.push(metadata=push_result,
                                           workbook='Aitik >> FCR Aitik Monitor', worksheet='FCR Prices')

     # Print the combined push results
     print(push_results_meta_combined)
     ```

     When I try to structure the metadata and send it in the same push() as the data, it looks like this:

     ```python
     data_for_seeq_FCRD_Up = pd.DataFrame({'FCR-D Up Price': prices_upp}, index=pd.to_datetime(dates_upp))
     data_for_seeq_FCRD_Down = pd.DataFrame({'FCR-D Down Price': prices_ned}, index=pd.to_datetime(dates_ned))

     # Combine the data for both signals into a single DataFrame
     data_for_seeq_combined = pd.concat([data_for_seeq_FCRD_Up, data_for_seeq_FCRD_Down], axis=1)

     # Create a single metadata DataFrame for both signals
     metadata_combined = pd.DataFrame({
         'Name': ['FCR-D Up Price', 'FCR-D Down Price'],  # Signal names
         'Value Unit Of Measure': ['EUR/MW', 'EUR/MW']    # Units of measure for both signals
         # Add other required fields for metadata as per your requirements
     })
     metadata_combined.index = metadata_combined['Name']

     print(data_for_seeq_combined.head())
     print(metadata_combined.head())

     push_results_combined = spy.push(data=data_for_seeq_combined, metadata=metadata_combined,
                                      workbook='Aitik >> FCR Aitik Monitor', worksheet='FCR Prices',
                                      datasource='Mimer SVK API')
     ```

     The data and metadata DataFrames look like this:

     ```
                                FCR-D Up Price  FCR-D Down Price
     2023-11-16 00:00:00+01:00       23.066646          9.116699
     2023-11-16 01:00:00+01:00       22.670452          9.076252
     2023-11-16 02:00:00+01:00       21.816880          8.906874
     2023-11-16 03:00:00+01:00       21.654117          8.916882
     2023-11-16 04:00:00+01:00       26.594917          8.885253

                                   Name  Value Unit Of Measure
     Name
     FCR-D Up Price      FCR-D Up Price                 EUR/MW
     FCR-D Down Price  FCR-D Down Price                 EUR/MW
     ```

     The error I keep getting is:

     ```
     SPy Error: Items with no valid type specified cannot be pushed unless they are calculations. "Formula" column is required for such items.
     ```

     So when I send them in separate pushes it works, but when I send data and metadata together the type is not valid for the signals? What is happening here?
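     For anyone landing here later: as my follow-up in item 3 shows, what resolved this for me was giving the metadata an explicit 'Type' column. A minimal sketch of just that change (my reconstruction; the full working code is in item 3):

     ```python
     # Minimal sketch of the fix (see item 3 for the full working version):
     # give every metadata row an explicit, fully spelled-out 'Type' so
     # spy.push does not treat the rows as untyped items needing a Formula.
     metadata_combined = pd.DataFrame({
         'Name': ['FCR-D Up Price', 'FCR-D Down Price'],
         'Value Unit Of Measure': 'EUR/MW',
         'Type': 'Signal',   # written out in full; the abbreviation 'Si' did not work for me
     }).set_index('Name', drop=False)
     ```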
  5. Yes, that works like a charm! I'd never used resample before; that's useful!
  6. I have a signal for the production during the LAST hour, and I want to create an "Hourly production" signal from it. Since the samples in the raw data (a step signal) are not always right on the whole hour, I want to take the value of this signal on the half hours and shift it 90 minutes back to the previous whole hour. So the value of the step signal at 14:30 should be placed at exactly 13:00 in my new signal, 15:30 -> 14:00, and so on. How would I go about that?
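     The reply that solved this is not among my posts here, but as item 5 says it involved resample(). Roughly, the shape of the formula was along these lines; this is my reconstruction rather than the verbatim answer, the search term and workbook are placeholders, and the exact alignment of resample() keys should be checked against the Seeq Formula documentation:

     ```python
     # Rough reconstruction (not the verbatim accepted answer): shift the
     # "production last hour" signal 90 minutes into the past so the 14:30
     # sample lands on 13:00, then resample on an hourly period.
     prod_signal = spy.search({'Name': 'Production Last Hour'})  # placeholder search

     hourly_production = spy.push(metadata=pd.DataFrame([{
         'Type': 'Signal',
         'Name': 'Hourly Production',
         'Formula': "$prod.move(-90min).resample(1h)",
         'Formula Parameters': {'$prod': prod_signal},
     }]), workbook='My Workbook >> Hourly Production')  # placeholder workbook
     ```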
  7. Ah, nice, that worked to move it to the formula interface! I'd never used the API before. It does seem to have broken some of the dependent analyses somehow, though that could be related to a max duration or something like that, and that shouldn't change just from adding a property. Anyway, I think I will just replace them manually this time, although that's also a bit involved. But if there are a lot of dependencies, this is going to be a quicker approach than duplicating. Thank you!
  8. Hi, I have a condition that is used in a lot of subsequent analyses in a workbench. It was created using the "Identify" tool (not Formula). I wish to add a property to this condition. I can do this in a new formula, or by duplicating the condition to a formula, but my issue with that is that I then need to go into every place the condition is referenced and change it to the new one. Instead of "duplicate to formula" I would want a "convert to formula" feature, which would maintain the hierarchy but unlock the possibility to add properties or otherwise modify that very condition further. Can this be achieved somehow?
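     Item 7 above is my follow-up after this worked through the API. The gist, as I understand it, was to adjust the condition's item properties through the SDK so it becomes editable as a Formula. A rough sketch of the kind of call involved, where everything (the ID, and whether clearing the tool's 'UIConfig' property is the right and safe step) is an assumption to verify for your Seeq version, ideally with Seeq support, before touching a production item:

     ```python
     # Rough sketch only: the condition ID is a placeholder, and clearing
     # 'UIConfig' as the mechanism for unlocking Formula editing is an
     # assumption, not a documented recipe.
     from seeq import spy, sdk

     condition_id = '00000000-0000-0000-0000-000000000000'  # placeholder ID of the Identify condition

     items_api = sdk.ItemsApi(spy.client)

     # List the item's current properties to see what the point-and-click tool stored
     item = items_api.get_item_and_all_properties(id=condition_id)
     print([p.name for p in item.properties])

     # Remove the tool configuration so the item can be edited in Formula
     items_api.delete_property(id=condition_id, property_name='UIConfig')
     ```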