Are you using a local or a remote (AKS) FarmVibes.AI cluster?
Local
Bug description
I'm encountering an issue while running the `SAM - Automatic Segmentation` workflow in the `farm_ai/segmentation/auto_segment_s2` pipeline. The workflow fails during the `automatic_segmentation` operation with the following error message:
```
RuntimeError: Failed to run op automatic_segmentation in workflow run id 821b713b-1298-470d-9b27-fdeeffc7db70 for input with message id 00-821b713b1298470d9b27fdeeffc7db70-9512c4362ec3e544-01. Error description: <class 'RuntimeError'>: Traceback (most recent call last):
  File "/opt/conda/lib/python3.11/site-packages/vibe_agent/worker.py", line 142, in run_op
    return factory.build(spec).run(input, cache_info)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/vibe_agent/ops.py", line 110, in run
    items_out = self.storage.store(run_id, stac_results, cache_info)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/vibe_agent/storage/local_storage.py", line 159, in store
    catalog.save(stac_io=self.stac_io)
  File "/opt/conda/lib/python3.11/site-packages/pystac/catalog.py", line 796, in save
    child.save(stac_io=stac_io)
  File "/opt/conda/lib/python3.11/site-packages/pystac/catalog.py", line 812, in save
    item.save_object(
  File "/opt/conda/lib/python3.11/site-packages/pystac/stac_object.py", line 366, in save_object
    stac_io.save_json(dest_href, self.to_dict(include_self_link=include_self_link))
  File "/opt/conda/lib/python3.11/site-packages/pystac/stac_io.py", line 252, in save_json
    txt = self.json_dumps(json_dict, *args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/conda/lib/python3.11/site-packages/pystac/stac_io.py", line 120, in json_dumps
    return orjson.dumps(json_dict, option=orjson.OPT_INDENT_2, **kwargs).decode(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: Type is not JSON serializable: ChipWindow
```
Workflow Details:
Pipeline: `farm_ai/segmentation/auto_segment_s2`
Run Name: `SAM - Automatic Segmentation`
Run ID: `821b713b-1298-470d-9b27-fdeeffc7db70`
Run Status: `failed`
Run Duration: `00:04:52`
Key Task: `s2_automatic_segmentation`
Status: `failed`
Start Time: `2024/08/08 10:38:15`
End Time: `2024/08/08 10:43:06`
Duration: `00:04:50`
Steps to reproduce the problem
1. Trigger the `SAM - Automatic Segmentation` workflow on the `farm_ai/segmentation/auto_segment_s2` pipeline.
2. Observe the failure during the `automatic_segmentation` operation.
Expected Behavior:
The workflow should complete the segmentation task without errors.
Actual Behavior:
The workflow fails with a `TypeError` indicating that a `ChipWindow` object is not JSON serializable.
Environment:
FarmVibes.AI
Python Version: 3.11
Operating System: Ubuntu (in a cluster environment)
Additional Context:
This issue seems to arise from an attempt to serialize a `ChipWindow` object while saving the STAC catalog of the op's results. Any guidance or fixes would be greatly appreciated.
In which step did you encounter the bug?
Notebook execution