
Add embeddings_ layer and supporting utility functions #3021

Open · Cydral wants to merge 13 commits into master
Conversation

@Cydral (Contributor) commented Sep 27, 2024

This PR introduces the embeddings_ layer class and adds two new utility functions, embeddings() and embeddings_gradient(), to tensor_tools.h. These additions extend dlib's deep-learning tooling, particularly for building LLM-style networks.

Key changes:

  • Implemented the embeddings_ layer: this new layer class provides efficient token embedding for neural network architectures.
  • Added an embeddings() function to tensor_tools.h: this utility projects the token indices in an input tensor onto rows of an embedding table, giving an efficient lookup and transformation of token representations (see the first sketch after this list).
  • Added an embeddings_gradient() function to tensor_tools.h: this function computes and applies gradients for the embedding table, supporting learning-rate adjustment and optional frequency-based scaling, and is designed for thread-safe parallel processing (see the second sketch after this list).
  • Included detailed function declarations with requires/ensures contracts in tensor_tools.h, documenting the usage and behavior of the utility functions.
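
To make the lookup concrete, here is a minimal CPU-only sketch of what such an embedding projection does. The function name, signature, and tensor layout below are illustrative assumptions for this comment, not the exact API added by the PR:

```cpp
#include <dlib/dnn.h>

void embeddings_sketch(
    dlib::resizable_tensor& dest,  // output: one embedding row per token
    const dlib::tensor& src,       // input: token indices stored as floats
    const dlib::tensor& embs       // table: shape (1,1,vocab_size,embedding_dim)
)
{
    DLIB_CASSERT(src.nc() == 1);   // assume one token index per position
    const long dim        = embs.nc();
    const long num_tokens = src.num_samples()*src.k()*src.nr();
    dest.set_size(src.num_samples(), src.k(), src.nr(), dim);

    const float* s = src.host();
    const float* e = embs.host();
    float* d       = dest.host();

    // Copy the table row matching each token index into the output.
    for (long i = 0; i < num_tokens; ++i)
    {
        const long idx = static_cast<long>(s[i]);
        DLIB_CASSERT(0 <= idx && idx < embs.nr());
        for (long c = 0; c < dim; ++c)
            d[i*dim + c] = e[idx*dim + c];
    }
}
```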

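And a matching sketch of the backward side. Again, the name, signature, and the single global mutex are assumptions; the PR describes its version as thread-safe with optional frequency-based scaling, and the real implementation may organize its locking and scaling differently:

```cpp
#include <dlib/dnn.h>
#include <dlib/threads.h>
#include <mutex>
#include <unordered_map>

void embeddings_gradient_sketch(
    const dlib::tensor& prev,      // token indices from the forward pass
    const dlib::tensor& gradient,  // upstream gradient, one row per token
    dlib::tensor& table,           // embedding table being trained
    float learning_rate,
    bool scale_by_freq             // optional frequency-based scaling
)
{
    const long dim        = table.nc();
    const long num_tokens = prev.num_samples()*prev.k()*prev.nr();

    // Count how often each token occurs so frequent tokens get smaller steps.
    std::unordered_map<long,long> freq;
    const float* p = prev.host();
    if (scale_by_freq)
        for (long i = 0; i < num_tokens; ++i)
            ++freq[static_cast<long>(p[i])];

    const float* g = gradient.host();
    float* t       = table.host();
    std::mutex m;  // one lock for brevity; keeps concurrent row updates safe

    dlib::parallel_for(0, num_tokens, [&](long i)
    {
        const long idx = static_cast<long>(p[i]);
        const float s  = scale_by_freq ? 1.0f/freq.at(idx) : 1.0f;
        std::lock_guard<std::mutex> lock(m);
        for (long c = 0; c < dim; ++c)
            t[idx*dim + c] -= learning_rate*s*g[i*dim + c];
    });
}
```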
@Cydral changed the title from "Implement embeddings_ layer and add supporting utility functions to tensor tools" to "Add embeddings_ layer and supporting utility functions" on Sep 27, 2024
@davisking (Owner) commented:
Is this PR ready for review? It's got conflicts with git master. So not sure if I should be reviewing it yet or not.

@Cydral (Contributor, Author) commented Sep 30, 2024

> Is this PR ready for review? It's got conflicts with git master. So not sure if I should be reviewing it yet or not.

Yes, you can go ahead. I hope these conflicts, which (as I mentioned in another thread) come from the several branches I created from my fork to work on the new dlib layers in parallel, won't overwrite useful changes to certain layers. That said, if there are compilation problems I will of course fix them, but all the layers released so far are ready for review and integration into the master branch. Thanks in advance.
