An experiment on how to inject custom model definitions: #598
bruno-f-cruz started this conversation in Ideas · 1 comment · 3 replies
- Is it possible to do something like this without creating the intermediate `task_schema` instance?

  ```python
  task_schema = TaskSchema(param1=1, param2='hello')
  behavior_simulation = BehaviorStimulation(input_parameters=task_schema.dict(), ...)
  ```
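One way to answer the question: pydantic accepts the nested model inline, so the named intermediate is not required. A minimal sketch, assuming `TaskSchema` and `BehaviorStimulation` are simple stand-in pydantic models (not the real aind-data-schema classes):

```python
# Hypothetical stand-ins for the classes named in the discussion; the real
# aind-data-schema definitions have different fields.
from typing import Any, Dict

from pydantic import BaseModel


class TaskSchema(BaseModel):
    param1: int
    param2: str


class BehaviorStimulation(BaseModel):
    input_parameters: Dict[str, Any]


# Inline the intermediate model instead of binding it to task_schema first.
stim = BehaviorStimulation(
    input_parameters=TaskSchema(param1=1, param2="hello").dict()
)
print(stim.input_parameters)  # {'param1': 1, 'param2': 'hello'}
```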
-
In our current experiment, we define task logic parameters under a custom json schema:
https://github.com/AllenNeuralDynamics/aind-vr-foraging/blob/main/src/DataSchemas/aind-vr-foraging-task.json
The goal of this experiment was to find a way to "inject" this schema definition into the current Session class provided by aind-data-schema. The obvious target would be the generic field at `src/aind_data_schema/stimulus.py`, line 107 (commit 8e1eaa3):
Since the field already accepts a `Dict[str, Any]`, this should be compatible with any json schema definition, even without knowledge of that specific schema (i.e. it would just deserialize to the raw `Dict` type). However, what would be great is to be able to type this `Dict` to something more constrained. Here's what I propose: define the task logic parameters as a pydantic model, e.g. by inheriting from `AindModel`, or even by generating it from raw json schemas using tools like https://docs.pydantic.dev/latest/integrations/datamodel_code_generator/, and then subclass `BehaviorStimulation`, inheriting from the original base class. This guarantees that the child class will be compatible with the deserialization process of the parent, assuming we are only overloading the aforementioned property. Notice how the Session schema model definition never really gets updated, so everything should be backward- and forward-compatible so long as the type of this field remains a `Dict[str, Any]`. Alternatively, one could explore the idea of making the input to this property a `Union[BaseModel, Dict[str, Any]]`.
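The subclassing pattern described above can be sketched as follows. All names here (`BehaviorStimulation`, `input_parameters`, `VrForagingTask`) are illustrative stand-ins, not the actual aind-data-schema definitions:

```python
# Hedged sketch of the proposed pattern: narrow the generic Dict field in a
# child class while remaining deserializable by the untouched parent.
from typing import Any, Dict

from pydantic import BaseModel


class BehaviorStimulation(BaseModel):  # stand-in for the base class
    input_parameters: Dict[str, Any]


class VrForagingTask(BaseModel):  # task-specific parameter schema
    reward_volume: float
    n_trials: int


class VrForagingStimulation(BehaviorStimulation):
    # Override the generic Dict with a concrete, fully typed model.
    input_parameters: VrForagingTask


typed = VrForagingStimulation(
    input_parameters=VrForagingTask(reward_volume=2.0, n_trials=100)
)

# Round-trip: serializing the child still yields a payload the parent
# accepts, since the nested model collapses back into a plain dict.
untyped = BehaviorStimulation(**typed.dict())
print(untyped.input_parameters)  # {'reward_volume': 2.0, 'n_trials': 100}
```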
My current plan is to generate, on each update of the task logic schema, a new json schema that results from merging the untouched Session object with a new, automatically generated pydantic model (using the datamodel-code-generator tool) that gets injected into the full schema. As a result, for each experiment I should be able to provide a fully compatible Session object while, at the same time, having a fully spec'ed json schema that I can use to automatically generate code and validate task logic in Bonsai, or even during analysis.
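The merging step of this plan can be illustrated with plain dicts. The schema fragments below are simplified stand-ins, not the real Session or task schemas; the point is only that the generated task schema replaces the untyped object property while the rest of the Session definition stays untouched:

```python
# Illustrative sketch of injecting a generated task schema into a
# (simplified, hypothetical) Session json schema.
import json

session_schema = {  # minimal stand-in for the Session json schema
    "title": "Session",
    "type": "object",
    "properties": {
        "input_parameters": {"type": "object"},  # the generic Dict field
    },
}

task_schema = {  # e.g. emitted by pydantic / datamodel-code-generator
    "title": "AindVrForagingTask",
    "type": "object",
    "properties": {
        "reward_volume": {"type": "number"},
        "n_trials": {"type": "integer"},
    },
    "required": ["reward_volume", "n_trials"],
}

# Deep-copy the base schema, then swap the untyped object for the
# fully spec'ed task schema.
merged = json.loads(json.dumps(session_schema))
merged["properties"]["input_parameters"] = task_schema
print(merged["properties"]["input_parameters"]["title"])  # AindVrForagingTask
```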