
26.4. Running Hadoop Jobs Across Multiple Red Hat Gluster Storage Volumes


If you are already running Hadoop jobs on a volume and want to enable Hadoop on additional existing Red Hat Gluster Storage volumes, follow the steps in the Enabling Existing Volumes for use with Hadoop section of the Deploying the Hortonworks Data Platform on Red Hat Gluster Storage chapter in the Red Hat Gluster Storage 3.1 Installation Guide. If you do not have an additional volume and want to add one, first complete the procedures in the Creating volumes for use with Hadoop section and then those in the Enabling Existing Volumes for use with Hadoop section. This configures the additional volume for use with Hadoop.
Specifying volume-specific paths when running Hadoop Jobs

When you specify a path in a Hadoop job, the full URI of the path is required. For example, if you have a volume named VolumeOne and must pass in a file called myinput.txt located in a directory named input, you would specify it as glusterfs://VolumeOne/input/myinput.txt. The same format applies to output paths. The example below reads data from a path on VolumeOne and writes it to a path on VolumeTwo.

# bin/hadoop jar /opt/HadoopJobs.jar ProcessLogs glusterfs://VolumeOne/input/myinput.txt glusterfs://VolumeTwo/output/
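
The same glusterfs:// URIs can also be used with the Hadoop file system shell. The commands below are a minimal sketch, assuming the VolumeOne and VolumeTwo volumes from the example above are already enabled for Hadoop; the directory and file names are illustrative only.

# bin/hadoop fs -ls glusterfs://VolumeOne/input/
# bin/hadoop fs -cp glusterfs://VolumeOne/input/myinput.txt glusterfs://VolumeTwo/staging/myinput.txt
# bin/hadoop fs -cat glusterfs://VolumeTwo/output/part-r-00000

The first command lists the input directory on VolumeOne, the second copies the input file to a staging directory on VolumeTwo, and the third prints a typical reducer output file from the job's output directory on VolumeTwo.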

Note

The very first Red Hat Gluster Storage volume that is configured for use with Hadoop is the Default Volume. This is usually the volume name you specified when you went through the Installation Guide. The Default Volume is the only volume that does not require a full URI and may be referenced with a relative path. Thus, assuming your default volume is called HadoopVol, both glusterfs://HadoopVol/input/myinput.txt and /input/myinput.txt are processed the same way when providing input to a Hadoop job or using the Hadoop CLI.
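
For example, assuming HadoopVol is the default volume, the following two commands list the same directory; this is a sketch, and the directory name is illustrative.

# bin/hadoop fs -ls /input
# bin/hadoop fs -ls glusterfs://HadoopVol/input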