- In Chapter 2, "文本数据处理" (Working with Text Data): the sentence "具体来说,我们关注的是基于变transomer架构的仅解码器模式的大型语言模型(LLMs)" is mistranslated (the garbled "变transomer" should refer to the Transformer architecture).
- In Section 2.7, the last link is broken: https://github.com/datawhalechina/llms-from-scratch-cn/tree/main/ch02/03_bonus_embedding-vs-matmul
- In Section 2.8, "词位置编码" (Encoding Word Positions): the external heading does not match the internal heading; it would also be better to present code in markdown code blocks rather than blockquotes (see the sketch below).
- In Section 3.1, "图 3.4 在变压器模型出现之前," and in Section 3.2, "图 3.6 自注意力是变压器中的一种机制。。。": "transformer" has been literally translated as "变压器" (electrical transformer).
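A minimal sketch of the formatting change suggested for 2.8 (the snippet content is hypothetical, not taken from the book):

~~~markdown
Instead of quoting code with `>`:

> token_ids = tokenizer.encode(raw_text)

use a fenced code block so it renders in monospace with syntax highlighting:

```python
token_ids = tokenizer.encode(raw_text)
```
~~~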
OK, we will revise the relevant content in a follow-up update. Thanks for pointing these out. If you run into any other issues, feel free to raise them at any time.