As of version 2.0.0-beta.7, there is no simple way for users to mock service outputs for testing. Consequently, they currently rely on workarounds such as using ModelReaderWriter.Read<T> to create a mock from JSON as illustrated below:
```csharp
ChatCompletion completion = ModelReaderWriter.Read<ChatCompletion>(BinaryData.FromObjectAsJson(new
{
    id = "1234",
    choices = new object[]
    {
        new
        {
            finish_reason = "stop",
            index = 0,
            message = new
            {
                content = "It's a nice day today!",
                role = "assistant"
            }
        }
    },
    created = DateTimeOffset.Now.ToUnixTimeSeconds(),
    model = "model",
    system_fingerprint = "N/A",
    @object = "N/A",
    usage = (object)null
}));
```
To improve the experience around mocking, we plan to fully implement and expose the `OpenAIModelFactory` static class.
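A model factory would let tests construct read-only response models directly, without round-tripping through hand-written JSON. The following is a hypothetical sketch of how such a factory might be used; `OpenAIModelFactory.ChatCompletion` and its parameter names are illustrative assumptions, not the final API shape:

```csharp
// Hypothetical usage: the OpenAIModelFactory method and parameter names
// below are illustrative only; the shipped API may differ.
ChatCompletion completion = OpenAIModelFactory.ChatCompletion(
    id: "1234",
    finishReason: ChatFinishReason.Stop,
    content: "It's a nice day today!",
    role: ChatMessageRole.Assistant,
    createdAt: DateTimeOffset.Now,
    model: "model",
    systemFingerprint: "N/A");
```

Compared with the `ModelReaderWriter.Read<T>` workaround above, this approach is strongly typed, so renamed wire properties or schema changes would surface as compile errors in tests rather than silently producing malformed mocks.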