Example usage for org.apache.hadoop.hdfs DFSClient getDatanodeStorageReport


Introduction

This page shows example usage of the org.apache.hadoop.hdfs DFSClient method getDatanodeStorageReport.

Prototype

public DatanodeStorageReport[] getDatanodeStorageReport(DatanodeReportType type) throws IOException 
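A minimal sketch of calling this method, assuming a reachable HDFS NameNode; the host name and port below are placeholders, and the program needs the Hadoop client libraries on the classpath and a live cluster to run against.

```java
import java.io.IOException;
import java.net.InetSocketAddress;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.hdfs.protocol.HdfsConstants.DatanodeReportType;
import org.apache.hadoop.hdfs.server.protocol.DatanodeStorageReport;
import org.apache.hadoop.hdfs.server.protocol.StorageReport;

public class StorageReportExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Placeholder endpoint: replace with your NameNode's RPC address and port.
        InetSocketAddress nameNode = new InetSocketAddress("namenode.example.com", 8020);
        try (DFSClient client = new DFSClient(nameNode, conf)) {
            // One report per datanode; each report carries per-storage (per-disk) details.
            DatanodeStorageReport[] reports = client.getDatanodeStorageReport(DatanodeReportType.LIVE);
            for (DatanodeStorageReport report : reports) {
                for (StorageReport storage : report.getStorageReports()) {
                    System.out.printf("%s %s: %d/%d bytes used%n",
                            report.getDatanodeInfo().getHostName(),
                            storage.getStorage().getStorageID(),
                            storage.getDfsUsed(), storage.getCapacity());
                }
            }
        }
    }
}
```

Passing DatanodeReportType.LIVE restricts the report to datanodes currently registered with the NameNode; DatanodeReportType.ALL (as in the example below) also includes dead nodes.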

Usage

From source file: ch.cern.db.hdfs.DistributedFileSystemMetadata.java

License: GNU General Public License

public HashMap<String, Integer> getNumberOfDataDirsPerHost() {
    HashMap<String, Integer> disksPerHost = new HashMap<>();

    try {
        @SuppressWarnings("resource")
        DFSClient dfsClient = new DFSClient(NameNode.getAddress(getConf()), getConf());

        DatanodeStorageReport[] datanodeStorageReports = dfsClient
                .getDatanodeStorageReport(DatanodeReportType.ALL);

        for (DatanodeStorageReport datanodeStorageReport : datanodeStorageReports) {
            disksPerHost.put(datanodeStorageReport.getDatanodeInfo().getHostName(),
                    datanodeStorageReport.getStorageReports().length);
        }
    } catch (IOException e) {
        LOG.warn("Number of data directories (disks) per node could not be collected "
                + "(requires higher privileges).", e);
    }

    return disksPerHost;
}