Example usage for org.apache.hadoop.hdfs.server.protocol DatanodeStorageReport getDatanodeInfo

Introduction

On this page you can find an example usage of org.apache.hadoop.hdfs.server.protocol DatanodeStorageReport getDatanodeInfo.

Prototype

public DatanodeInfo getDatanodeInfo() 

Usage

From source file:ch.cern.db.hdfs.DistributedFileSystemMetadata.java

License:GNU General Public License

/** Returns a map from datanode hostname to its number of storage directories (disks). */
public HashMap<String, Integer> getNumberOfDataDirsPerHost() {
    HashMap<String, Integer> disksPerHost = new HashMap<>();

    try {
        @SuppressWarnings("resource")
        DFSClient dfsClient = new DFSClient(NameNode.getAddress(getConf()), getConf());

        DatanodeStorageReport[] datanodeStorageReports = dfsClient
                .getDatanodeStorageReport(DatanodeReportType.ALL);

        for (DatanodeStorageReport datanodeStorageReport : datanodeStorageReports) {
            disksPerHost.put(datanodeStorageReport.getDatanodeInfo().getHostName(),
                    datanodeStorageReport.getStorageReports().length);
        }
    } catch (IOException e) {
        LOG.warn("Number of data directories (disks) per node could not be collected "
                + "(requires higher privileges).", e);
    }

    return disksPerHost;
}
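The method above needs a running HDFS NameNode and sufficient privileges to call getDatanodeStorageReport. The aggregation it performs, however, can be exercised standalone; the sketch below mirrors the hostname-to-disk-count mapping with plain Java types (the Report class, hostnames, and counts are invented stand-ins for DatanodeStorageReport, not Hadoop APIs):

```java
import java.util.HashMap;
import java.util.Map;

public class DisksPerHostSketch {

    // Stand-in for one DatanodeStorageReport: the datanode's hostname
    // (DatanodeInfo.getHostName()) plus the number of storage directories
    // it reported (getStorageReports().length). Values are invented.
    static class Report {
        final String hostName;
        final int numStorages;

        Report(String hostName, int numStorages) {
            this.hostName = hostName;
            this.numStorages = numStorages;
        }
    }

    // Same shape as getNumberOfDataDirsPerHost(): one map entry per datanode,
    // keyed by hostname, valued by its storage-directory count.
    static Map<String, Integer> disksPerHost(Report[] reports) {
        Map<String, Integer> disks = new HashMap<>();
        for (Report r : reports) {
            disks.put(r.hostName, r.numStorages);
        }
        return disks;
    }

    public static void main(String[] args) {
        Report[] reports = {
                new Report("dn1.example.org", 12),
                new Report("dn2.example.org", 8),
        };
        Map<String, Integer> disks = disksPerHost(reports);
        System.out.println(disks.get("dn1.example.org") + " " + disks.get("dn2.example.org")); // prints "12 8"
    }
}
```

Note that if two datanodes share a hostname, later entries overwrite earlier ones; the original method has the same behavior.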