Example usage for org.apache.hadoop.mapreduce TaskInputOutputContext getJobID

Introduction

On this page you can find example usage for org.apache.hadoop.mapreduce TaskInputOutputContext getJobID.

Prototype

public JobID getJobID();

Document

Get the unique ID for the job.
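
For orientation, here is a minimal sketch (not taken from the indexed sources) of calling getJobID() from a Mapper. In the org.apache.hadoop.mapreduce API a Mapper's Context is a TaskInputOutputContext, so the method is available directly on the task context. The class name JobIdLoggingMapper and the word-count-style key/value types are illustrative only.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.JobID;
import org.apache.hadoop.mapreduce.Mapper;

public class JobIdLoggingMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        // The job ID is the same for every task of the job, e.g. "job_1700000000000_0042".
        JobID jobId = context.getJobID();
        System.out.println("Running inside job " + jobId);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Ordinary map output; getJobID() in setup() above is the call being demonstrated.
        context.write(value, new LongWritable(1L));
    }
}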

Usage

From source file: com.moz.fiji.mapreduce.util.SerializeLoggerAspect.java

License: Apache License

/**
 * Logic to write the profiling content for a single method signature to a file on HDFS.
 * The format of the file is as follows: Job Name, Job ID, Task Attempt, Function Signature,
 * Aggregate Time (nanoseconds), Number of Invocations, Time per call (nanoseconds)'\n'
 *
 * @param out The {@link OutputStreamWriter} for writing to the file.
 * @param context The {@link TaskInputOutputContext} for this job.
 * @param signature The method signature for the profile.
 * @param loggingInfo The profiling information for the method.
 * @throws IOException If the writes to HDFS fail.
 */
private void writeProfileInformation(OutputStreamWriter out, TaskInputOutputContext context, String signature,
        LoggingInfo loggingInfo) throws IOException {
    // Ensure floats are not written in scientific notation: instead of 1.0E3,
    // we want 1000.000.
    NumberFormat nf = NumberFormat.getInstance();
    nf.setGroupingUsed(false);
    nf.setMinimumFractionDigits(1);
    nf.setMaximumFractionDigits(3);

    out.write(context.getJobName() + ", " + context.getJobID() + ", " + context.getTaskAttemptID() + ", "
            + signature + ", " + loggingInfo.toString() + ", " + nf.format(loggingInfo.perCallTime()) + "\n");
}
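
The profiling writer above embeds the job ID in each record it emits. As a follow-on sketch (a hypothetical helper, not part of SerializeLoggerAspect), the same two context calls can be combined into a per-task HDFS path so that concurrently running task attempts never write to the same file:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.TaskInputOutputContext;

public final class ProfilePaths {
    private ProfilePaths() { }

    /**
     * Builds a path such as baseDir/job_1700000000000_0042/attempt_1700000000000_0042_m_000003_0.
     * Both IDs come straight from the task context, so the result is unique per task attempt.
     */
    public static Path profileFile(Path baseDir, TaskInputOutputContext<?, ?, ?, ?> context) {
        Path jobDir = new Path(baseDir, context.getJobID().toString());
        return new Path(jobDir, context.getTaskAttemptID().toString());
    }
}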