
TypeScript or Babel #70

Open
sublimator opened this issue May 4, 2024 · 20 comments

@sublimator
Contributor

Which way for commonjs?

@sublimator
Contributor Author

sublimator commented May 6, 2024

It seems the JSDoc comments are actually incorrect a lot of the time. For example, chatStream takes a single object param, but the comment describes multiple params:

/**
   * A chat endpoint that streams responses.
   * @param {*} model the name of the model to chat with, e.g. mistral-tiny
   * @param {*} messages an array of messages to chat with, e.g.
   * [{role: 'user', content: 'What is the best French cheese?'}]
   * @param {*} tools  a list of tools to use.
   * @param {*} temperature the temperature to use for sampling, e.g. 0.5
   * @param {*} maxTokens the maximum number of tokens to generate, e.g. 100
   * @param {*} topP the cumulative probability of tokens to generate, e.g. 0.9
   * @param {*} randomSeed the random seed to use for sampling, e.g. 42
   * @param {*} safeMode deprecated use safePrompt instead
   * @param {*} safePrompt whether to use safe mode, e.g. true
   * @param {*} toolChoice the tool to use, e.g. 'auto'
   * @param {*} responseFormat the format of the response, e.g. 'json_format'
   * @return {Promise<Object>}
   */
  chatStream = async function* ({
    model,
    messages,
    tools,
    temperature,
    maxTokens,
    topP,
    randomSeed,
    safeMode,
    safePrompt,
    toolChoice,
    responseFormat,
  })

This should be more like:

/**
 * A chat endpoint that streams responses using a generator function.
 * This method uses an object to pass named parameters.
 * 
 * @param {Object} config - The configuration object for the chat stream.
 * @param {string} config.model - The name of the model to chat with, e.g., 'mistral-tiny'.
 * @param {Array<{role: string, content: string}>} config.messages - An array of messages for the chat, each containing a 'role' and 'content'.
 * @param {Array<string>} config.tools - A list of tools to use during the chat session.
 * @param {number} config.temperature - The temperature to use for sampling, typically between 0 and 1, e.g., 0.5.
 * @param {number} config.maxTokens - The maximum number of tokens to generate, e.g., 100.
 * @param {number} config.topP - The cumulative probability of token selection, e.g., 0.9.
 * @param {number} config.randomSeed - The random seed to use for sampling, e.g., 42.
 * @param {boolean} config.safePrompt - Whether to use safe mode in prompts, e.g., true.
 * @param {string} config.toolChoice - The tool to use, specified by the user, e.g., 'auto'.
 * @param {string} config.responseFormat - The format of the response, e.g., 'json'.
 * @return {AsyncGenerator<Object>} - A generator that yields the chat response as an object.
 */
async function* chatStream({
  model,
  messages,
  tools,
  temperature,
  maxTokens,
  topP,
  randomSeed,
  safeMode, // If still needed, otherwise remove it.
  safePrompt,
  toolChoice,
  responseFormat,
}) {
  // ...implementation unchanged
}

If you were to choose to move to TypeScript, the parameters could be documented as part of the interface:

/**
 * Configuration options for the chat stream function.
 */
interface ChatParams {
  /** the name of the model to chat with, e.g. mistral-tiny */
  model: string;

  /** an array of messages to chat with, e.g.
   * [{role: 'user', content: 'What is the best French cheese?'}]
   */
  messages: Array<{ role: string, content: string }>;

  /** a list of tools to use. */
  tools: string[];

  /** the temperature to use for sampling, e.g. 0.5 */
  temperature: number;

  /** the maximum number of tokens to generate, e.g. 100 */
  maxTokens: number;

  /** the cumulative probability of tokens to generate, e.g. 0.9 */
  topP: number;

  /** the random seed to use for sampling, e.g. 42 */
  randomSeed: number;

  /** deprecated use safePrompt instead */
  safeMode?: boolean; // Including as optional, assuming it's still part of the configuration but deprecated.

  /** whether to use safe mode, e.g. true */
  safePrompt: boolean;

  /** the tool to use, e.g. 'auto' */
  toolChoice: string;

  /** the format of the response, e.g. 'json_format' */
  responseFormat: string;
}
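
For illustration, a rough sketch of how the interface could then be consumed (note: ChatChunk and the yielded shape here are placeholders for the example, not the client's actual types):

// Hypothetical streamed-chunk shape, used only for this sketch.
interface ChatChunk {
  choices: Array<{ delta: { role?: string; content?: string } }>;
}

/**
 * A chat endpoint that streams responses.
 * @param params - the chat configuration, documented once on ChatParams.
 */
async function* chatStream(params: ChatParams): AsyncGenerator<ChatChunk> {
  // ...request/stream parsing elided; each parsed chunk would be yielded as it arrives.
  yield { choices: [{ delta: { role: 'assistant', content: '...' } }] };
}

That way every parameter is documented once, on the interface, and editors pick the docs up via the type rather than the JSDoc tags.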

@fuegoio
Contributor

fuegoio commented May 7, 2024

I think I'd prefer to use TypeScript and have a compilation step. We could use JS + JSDoc like some other projects that ditched TS, but I think at this scale it is not worth it, especially as TS compilation time is going to be really quick.

We will need a rollup / esbuild step at some point anyway, I think.

@sublimator
Contributor Author

sublimator commented May 7, 2024 via email

@JamieLee0510

Actually, I just created a TypeScript version of this repo, mistral-ts.
Maybe it could help with contributing or refactoring?

@fuegoio
Contributor

fuegoio commented May 7, 2024

> Actually, I just created a TypeScript version of this repo, mistral-ts. Maybe it could help with contributing or refactoring?

Really nice, thanks a lot! I think @sublimator wanted to do the TS version, but we could start from your repo. I like the usage of tsup, but I would prefer to use swc instead of Babel for performance and long-term maintenance.

If you can create a PR to import your work I think we can start from there, or we can just use it as inspiration!

@JamieLee0510

> > Actually, I just created a TypeScript version of this repo, mistral-ts. Maybe it could help with contributing or refactoring?
>
> Really nice, thanks a lot! I think @sublimator wanted to do the TS version, but we could start from your repo. I like the usage of tsup, but I would prefer to use swc instead of Babel for performance and long-term maintenance.
>
> If you can create a PR to import your work I think we can start from there, or we can just use it as inspiration!

Love to do it~
After switching to swc, I will PR it.

@sublimator
Contributor Author

tsup looks like a really nice way to build both commonjs and esm!
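
For instance, a minimal tsup config for dual output could be as small as this (just a sketch; the entry path is an assumption, not the repo's actual layout):

// tsup.config.ts - minimal sketch; the entry point here is assumed
import { defineConfig } from 'tsup';

export default defineConfig({
  entry: ['src/client.ts'], // assumed entry file
  format: ['cjs', 'esm'],   // emit both commonjs and esm bundles
  dts: true,                // emit .d.ts type declarations alongside
  sourcemap: true,
  clean: true,
});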

I see that mistral-ts is including a version of code that has been updated recently, fixing the node-fetch logic. I believe it fixed usage in non-window browser environments such as service workers, and added a _fetch hook point to the client, used in the tests.

For me personally, my preference would be to merge #72 (which includes contributions of others, and is also set up to close other PRs/issues) and then move ahead from there, continuing in a line.

Even then, I think it would probably be quite easy to reuse a lot of the config and TypeScript code from /tests in mistral-ts, so if you opened a PR with that it would be very helpful, and I could include some of the commits (as done with #72).

I'm happy to move ahead with it.

There was also PR #41 for TypeScript, but it seemed to languish. So really, whatever is easy for Mistral. In practice, that seems to be Mr @fuegoio at the moment.

@JamieLee0510

I have created PR #73 in the tsup version; I just encountered some issues while migrating to swc+rollup 🥲
Will keep working on it.

If there are any references, that would help a lot~

@sublimator
Contributor Author

@JamieLee0510

Great, I'll cherry-pick some commits from that once #72 is sorted out.

@fuegoio
Contributor

fuegoio commented May 8, 2024

> I have created PR #73 in the tsup version; I just encountered some issues while migrating to swc+rollup 🥲 Will keep working on it.
>
> If there are any references, that would help a lot~

@JamieLee0510 You can take inspiration from the openai-node repository; they are using SWC but not tsup, unfortunately (they have a custom build script for commonjs + esm). Thanks a lot for your contribution :) Try to also base your work on the latest main updates!

> Great, I'll cherry-pick some commits from that once #72 is sorted out.

@sublimator I'll merge #72 asap as I think it goes in the right direction and can be the source of truth for the types. You could work on TSing like @JamieLee0510, but that may seem to be duplicate work; let's maybe wait a bit.

@JamieLee0510

JamieLee0510 commented May 8, 2024

@fuegoio no problem~
However, I had checked the openai-node repository while working on mistral-ts; maybe they are using tsc-multi instead of swc to compile?

Or is there something I'm missing? The file I read was [openai-node/scripts/utils/make-dist-package-json.cjs](https://github.com/openai/openai-node/blob/master/scripts/utils/postprocess-files.cjs).

@sublimator
Contributor Author

sublimator commented May 8, 2024

@fuegoio
That works for me! Thanks.
I'm happy the time spent on #72 was not in vain.

(could have sworn I sent an email (edit: message) like this from email already?)

@sublimator
Contributor Author

Can tsup simply use TypeScript (i.e. no Babel or swc)? (At least for now.)
It's a pretty tiny project really?

@sublimator
Contributor Author

Oh, it uses esbuild by default, right? Why not just use the default? KISS?

@fuegoio
Contributor

fuegoio commented May 8, 2024

Yes, that's fine by me actually! I thought of swc to replace Babel as I thought it was somewhere in the chain. But using only what is necessary is the ideal, yes. @JamieLee0510 Why do you use Babel in your repository?

@JamieLee0510

Oh, the Babel usage is for dealing with a test-case running issue, not for the compiling parts.

@fuegoio
Contributor

fuegoio commented May 8, 2024

Well, in this case let's just roll with tsup I think, and try not to have Babel as a dependency; that's all I ask, ahah.

@JamieLee0510

Haha, I should have recognized this point sooner.

@sublimator
Contributor Author

Oh right, yeah! Babel is to make ts-jest faster, eh?

You can probably get around to using swc for jest in another PR :)
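
Something along these lines, assuming jest stays as the runner (the transform pattern is the one from the @swc/jest docs):

// jest.config.ts - sketch of swapping babel-jest/ts-jest for @swc/jest
import type { Config } from 'jest';

const config: Config = {
  testEnvironment: 'node',
  transform: {
    // Let SWC compile TS/JS test files so Babel can stay out of the dependency tree
    '^.+\\.(t|j)sx?$': '@swc/jest',
  },
};

export default config;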

@JamieLee0510

@fuegoio
I have updated PR #73 by syncing the latest commits.
Could you kindly help check it?
