List of usage examples for org.apache.hadoop.mapreduce.RecordReader subclasses
From source file com.ikanow.infinit.e.data_model.custom.InfiniteFileInputReader.java
public class InfiniteFileInputReader extends RecordReader<Object, BSONObject> {
    private static Logger _logger = Logger.getLogger(InfiniteFileInputReader.class);
    protected SourceFileConfigPojo _fileConfig;
From source file com.ikanow.infinit.e.data_model.custom.InfiniteShareInputReader.java
public class InfiniteShareInputReader extends RecordReader<Object, BSONObject> {
    private static Logger _logger = Logger.getLogger(InfiniteShareInputReader.class);
    protected SourceFileConfigPojo _fileConfig;
From source file com.inmobi.conduit.distcp.tools.mapred.lib.DynamicRecordReader.java
/**
* The DynamicRecordReader is used in conjunction with the DynamicInputFormat
* to implement the "Worker pattern" for DistCp.
* The DynamicRecordReader is responsible for:
* 1. Presenting the contents of each chunk to DistCp's mapper.
* 2. Acquiring a new chunk when the current chunk has been completely consumed,
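The "Worker pattern" described above can be sketched in plain Java, with no Hadoop dependencies: the reader drains its current chunk, then acquires the next chunk from a shared pool until the pool is empty. The queue-based pool and String records below are hypothetical stand-ins for DistCp's chunk files, not the real DynamicRecordReader API.

```java
import java.util.Iterator;
import java.util.List;
import java.util.Queue;

// Minimal sketch of the worker pattern: drain the current chunk,
// then acquire a new one from the shared pool when it is consumed.
class WorkerPatternReader {
    private final Queue<List<String>> chunkPool; // stand-in for the shared chunk store
    private Iterator<String> current;            // records of the chunk being consumed

    WorkerPatternReader(Queue<List<String>> chunkPool) {
        this.chunkPool = chunkPool;
    }

    /** Next record, acquiring a new chunk once the current one is exhausted. */
    String nextRecord() {
        while (current == null || !current.hasNext()) {
            List<String> chunk = chunkPool.poll(); // acquire a new chunk
            if (chunk == null) {
                return null;                       // no chunks left: end of input
            }
            current = chunk.iterator();
        }
        return current.next();
    }
}
```

Because fast workers keep polling the pool for more chunks, slow workers end up processing fewer chunks, which is the load-balancing point of the pattern.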
From source file com.inmobi.messaging.consumer.databus.mapreduce.DatabusRecordReader.java
public class DatabusRecordReader extends RecordReader<LongWritable, Message> {
    private LineRecordReader lineReader;

    public DatabusRecordReader() {
        lineReader = new LineRecordReader();
From source file com.intel.genomicsdb.GenomicsDBRecordReader.java
@InterfaceAudience.Public
@InterfaceStability.Stable
public class GenomicsDBRecordReader<VCONTEXT extends Feature, SOURCE>
        extends RecordReader<String, VCONTEXT> {
    private final GenomicsDBFeatureReader<VCONTEXT, SOURCE> featureReader;
    private CloseableTribbleIterator<VCONTEXT> iterator;
From source file com.knewton.mapreduce.SSTableRecordReader.java
/**
* Abstract record reader class that handles keys and values from an sstable. It's subclassed by a
* row record reader ({@link SSTableRowRecordReader}), passing an entire row as a key/value pair and
* a disk atom record reader ({@link SSTableColumnRecordReader}) passing individual disk atoms as
* values. Used in conjunction with {@link SSTableInputFormat}
*
From source file com.knewton.mrtool.io.JsonRecordReader.java
/**
* JSON record reader that reads files containing json objects and returns back a deserialized
* instantiated object of type V. The json deserialization happens with gson. Note that this record
* reader doesn't yet take advantage of splittable compressed codecs.
*
* @author Giannis Neokleous
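The line-oriented JSON reading described above can be sketched in plain Java with a pluggable deserializer standing in for Gson: one JSON object per line, each line handed to a function that produces a value of type V. The class and method names below are hypothetical stand-ins, not the real JsonRecordReader API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.function.Function;

// Sketch: reads one JSON object per line and maps it to a value of type V
// via a deserializer function (Gson's fromJson would fill this role).
class LineJsonReader<V> {
    private final BufferedReader in;
    private final Function<String, V> deserialize;
    private V currentValue;

    LineJsonReader(BufferedReader in, Function<String, V> deserialize) {
        this.in = in;
        this.deserialize = deserialize;
    }

    /** Advances to the next line; returns false at end of input. */
    boolean nextKeyValue() {
        try {
            String line = in.readLine();
            if (line == null) {
                return false;
            }
            currentValue = deserialize.apply(line); // e.g. gson.fromJson(line, clazz)
            return true;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    V getCurrentValue() {
        return currentValue;
    }
}
```

Keeping the deserializer pluggable makes the iteration logic testable without any JSON library on the classpath.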
From source file com.linkedin.cubert.io.CombinedFileRecordReader.java
/**
* Generic record reader for {@code CombineFileSplit}.
* <p>
* This record reader extracts individual {@code FileSplit} from the
* {@code CombineFileSplit}, and creates record readers for each of the FileSplits. The
* record readers are created sequentially (a new one is created when the current record
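The sequential-reader behavior described above can be sketched in plain Java: a combined split holds several individual splits, and a fresh reader is opened for each one only when the previous reader is exhausted. The split and reader types below are generic stand-ins, not Hadoop's CombineFileSplit or FileSplit classes.

```java
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;

// Sketch: iterate records across all splits of a combined split,
// creating the per-split readers lazily, one at a time.
class CombinedReader<S, R> {
    private final Iterator<S> splits;            // the individual splits
    private final Function<S, Iterator<R>> open; // creates a reader per split
    private Iterator<R> current;

    CombinedReader(List<S> combinedSplit, Function<S, Iterator<R>> open) {
        this.splits = combinedSplit.iterator();
        this.open = open;
    }

    /** Next record across all splits; empty splits are skipped transparently. */
    R next() {
        while (current == null || !current.hasNext()) {
            if (!splits.hasNext()) {
                return null;                     // all splits consumed
            }
            current = open.apply(splits.next()); // new reader only when needed
        }
        return current.next();
    }
}
```

Creating readers lazily means at most one underlying file is held open at a time, which matters when a combined split covers many small files.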
From source file com.linkedin.cubert.io.MultiMapperRecordReader.java
/**
* A wrapper over a {@code RecordReader} that encapsulates the multi mapper index.
* <p>
* The primary responsibility of this wrapper class is to handle the {@code initialize}
* method. The initialize method is given a {@code MultiMapperSplit} by hadoop, and this
* class will extract the actual InputSplit out of the MultiMapperSplit and initialize the
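The unwrapping behavior described above can be sketched in plain Java: at initialize time the wrapper records the multi-mapper index and hands only the actual split to the wrapped reader. The Reader interface, MultiSplit class, and Object split payload below are hypothetical stand-ins for the Hadoop types.

```java
// Sketch of a delegating wrapper that unwraps a composite split
// before initializing the underlying reader.
interface Reader {
    void initialize(Object split);
}

// A composite split carrying the mapper index plus the real split.
class MultiSplit {
    final int mapperIndex;
    final Object actualSplit;

    MultiSplit(int mapperIndex, Object actualSplit) {
        this.mapperIndex = mapperIndex;
        this.actualSplit = actualSplit;
    }
}

class UnwrappingReader implements Reader {
    private final Reader delegate;
    private int mapperIndex = -1;

    UnwrappingReader(Reader delegate) {
        this.delegate = delegate;
    }

    @Override
    public void initialize(Object split) {
        MultiSplit ms = (MultiSplit) split;  // the framework hands us the composite split
        mapperIndex = ms.mapperIndex;        // remember which mapper this split belongs to
        delegate.initialize(ms.actualSplit); // the delegate sees only the real split
    }

    int getMapperIndex() {
        return mapperIndex;
    }
}
```

The wrapped reader stays completely unaware of the multi-mapper machinery, which is the point of the wrapper.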
From source file com.linkedin.cubert.io.rubix.RubixRecordReader.java
public class RubixRecordReader<K, V> extends RecordReader<K, V> {
    private InputStream in;
    private K key;
    private long length;
    private final int bytesRead = 0;
    private long offset = 0;