
hadoop DistributedCache should be shared #22

Open
cjyu opened this issue May 14, 2015 · 2 comments

Comments

@cjyu

cjyu commented May 14, 2015

Data distributed via Hadoop's DistributedCache should be shared.
Before each task starts, an initialization step runs, and it includes downloading the DistributedCache data.
On each node this operation can only proceed sequentially, so multiple tasks have to wait for it one after another.
If the data has already been downloaded, it is not downloaded again.
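The behavior described above (one download per node, shared by all tasks, no repeated downloads) can be illustrated with a small sketch. This is not Hadoop/YARN code; the `NodeLocalCache` class and the paths are hypothetical, made up purely to model the caching rule:

```python
# Minimal sketch (not real Hadoop code) of per-node DistributedCache
# behavior: localization happens once per node, and a file that is
# already in the node-local cache is never downloaded again.

class NodeLocalCache:
    def __init__(self):
        self.cache = {}        # uri -> node-local path
        self.downloads = 0     # count of actual downloads performed

    def localize(self, uri):
        """Download uri into the local cache unless it is already there."""
        if uri not in self.cache:
            self.downloads += 1  # simulate the one real download
            self.cache[uri] = "/local/cache/" + uri.split("/")[-1]
        return self.cache[uri]

node = NodeLocalCache()
# Three tasks on the same node all request the same cached file.
paths = [node.localize("hdfs:///share/dict.txt") for _ in range(3)]
print(node.downloads)   # 1 -- downloaded once, shared by all tasks
print(len(set(paths)))  # 1 -- every task sees the same local path
```

The key point matches the comment: the cost of localization is paid once per node, not once per task.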

@JerryLead
Owner

Thanks for pointing this out. I still have a few questions; I'll consult you when I reorganize the documentation in a few days.

@lxyscls

lxyscls commented Apr 10, 2019

"Then, before a task is run, the node manager copies the files from the distributed filesystem to a local disk—the cache—so the task can access the files. The files are said to be localized at this point. From the task's point of view, the files are just there, symbolically linked from the task's working directory."

(Hadoop: The Definitive Guide, Fourth Edition, ch. 9, Distributed Cache)

My understanding is that the NodeManager does the downloading and links the files into the task's working directory via symlinks, so there will not be multiple copies of the same distributed cache file.
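The symlink mechanism quoted from the book can be sketched as follows. This is a hypothetical illustration (the directory names and file contents are made up), showing one cached copy on local disk with a symlink per task working directory:

```python
# Sketch of localization via symlinks: a single on-disk cached copy,
# linked into each task's working directory.
import os
import tempfile

root = tempfile.mkdtemp()

# The NodeManager keeps one real copy in the node-local cache.
cached = os.path.join(root, "cache", "dict.txt")
os.makedirs(os.path.dirname(cached))
with open(cached, "w") as f:
    f.write("shared data")

# Each task gets a symlink in its own working directory.
for task in ("task_0", "task_1"):
    workdir = os.path.join(root, task)
    os.makedirs(workdir)
    # From the task's point of view, the file is "just there".
    os.symlink(cached, os.path.join(workdir, "dict.txt"))

# Both tasks read the same single on-disk copy.
print(open(os.path.join(root, "task_0", "dict.txt")).read())  # shared data
```

Disk usage stays constant no matter how many tasks on the node reference the file, which is exactly why no duplicate copies exist.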
