Implementing support for embeddings #62

Open
someshsingh22 opened this issue May 15, 2020 · 2 comments
Labels: Core, Priority: Medium, question (Further information is requested)

Comments

@someshsingh22
Member

This module needs to be planned well, as we will need it for all our future implementations and it will be required at every step. Currently, there are many libraries available for this, common ones being Flair and Torchtext. We need a pipeline that will convert any textual input to vectors/matrices for further computation.
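For illustration, a minimal sketch of what such a text-to-tensor pipeline could look like if we wrapped Flair's embedding API; the `TextToTensor` class and `encode` method below are hypothetical names for this sketch, not existing code in the library:

```python
from typing import List

import torch
from flair.data import Sentence
from flair.embeddings import WordEmbeddings


class TextToTensor:
    """Hypothetical wrapper that turns raw text into a token-embedding matrix."""

    def __init__(self, embedding_name: str = "glove"):
        # Flair downloads the requested pre-trained embeddings on first use.
        self.embedding = WordEmbeddings(embedding_name)

    def encode(self, text: str) -> torch.Tensor:
        # Tokenize the input and attach an embedding vector to every token.
        sentence = Sentence(text)
        self.embedding.embed(sentence)
        # Stack per-token vectors into a (num_tokens, dim) matrix.
        return torch.stack([token.embedding for token in sentence])


if __name__ == "__main__":
    encoder = TextToTensor("glove")
    matrix = encoder.encode("Implementing support for embeddings")
    print(matrix.shape)  # e.g. torch.Size([4, 100]) for 100-dim GloVe
```

The same interface could later be backed by other embedding sources (e.g. transformer models) without changing downstream code.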

someshsingh22 added the question (Further information is requested), Priority: High, and Core labels on May 15, 2020
someshsingh22 changed the title from "Implementic support for embeddings" to "Implementing support for embeddings" on May 15, 2020
@Sharad24
Member

Sharad24 commented May 15, 2020 via email

@someshsingh22
Member Author

Yes, you can load transformer embeddings using torchtext easily, but it's not in the library as of now. So we will need a separate module that has torchtext's own methods as well as others.
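As a rough sketch of the "torchtext's own methods" side of such a module, the snippet below uses torchtext's pre-trained GloVe vectors to turn a token list into a matrix; the `embed_tokens` helper is just an illustrative name, not an existing function in the library:

```python
import torch
from torchtext.vocab import GloVe

# Downloads the pre-trained GloVe vectors on first use.
vectors = GloVe(name="6B", dim=100)


def embed_tokens(tokens):
    # get_vecs_by_tokens returns a (len(tokens), dim) tensor,
    # with zero vectors for out-of-vocabulary tokens.
    return vectors.get_vecs_by_tokens(tokens, lower_case_backup=True)


matrix = embed_tokens(["implementing", "support", "for", "embeddings"])
print(matrix.shape)  # torch.Size([4, 100])
```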
