Regarding the attention value defined by Eq. 7 in the paper: are the two vectors used in the computation the hidden vectors at time step t and time step j, respectively (both denoted by h)? If so, why does the later explanation describe it as the similarity between the hidden state ht and the previously clicked item hj? What vector does hj refer to here?
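For context, a minimal sketch of how this style of attention score is commonly computed, assuming both h_t and h_j are RNN hidden states (h_j being the hidden state at the step where item j was clicked) and an additive form alpha_tj = v^T sigmoid(A1 h_t + A2 h_j); the names v, A1, A2 are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def attention_weights(h, t, v, A1, A2):
    """Attention of hidden state h[t] over earlier hidden states h[j], j <= t.
    Uses an additive score v^T sigmoid(A1 h_t + A2 h_j), then a softmax over j.
    (v, A1, A2 are hypothetical learned parameters for illustration.)"""
    scores = np.array([
        v @ (1.0 / (1.0 + np.exp(-(A1 @ h[t] + A2 @ h[j]))))
        for j in range(t + 1)
    ])
    # softmax normalization so the weights over j sum to 1
    e = np.exp(scores - scores.max())
    return e / e.sum()

# toy example: 5 time steps, hidden size 4
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))          # hidden states h_0 .. h_4
v = rng.normal(size=4)
A1 = rng.normal(size=(4, 4))
A2 = rng.normal(size=(4, 4))
alpha = attention_weights(h, t=4, v=v, A1=A1, A2=A2)
print(alpha)  # one weight per earlier step j, summing to 1
```

Under this reading, hj is not an item embedding but the encoder's hidden state at the step where item j was clicked, so "similarity to the previously clicked item hj" is shorthand for similarity to the hidden representation produced at that click.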