List of usage examples for the org.apache.hadoop.io.Writable interface
From source file com.linkedin.cubert.io.rubix.RubixInputSplit.java
public class RubixInputSplit<K, V> extends InputSplit implements Writable, Configurable {
    final static int MAX_LOCATIONS = 5;
    private K key;
    private Path filename;
    private long offset;
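A custom InputSplit like this typically serializes its fields in write() and restores them in readFields(). The sketch below is not the RubixInputSplit code itself but a minimal illustration of that pattern; the class name SimpleFileSplit is invented for the example, only the path and offset fields are handled, and the real class's treatment of the generic key K and of locations is omitted.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.InputSplit;

public class SimpleFileSplit extends InputSplit implements Writable {
    private Path filename;
    private long offset;

    public SimpleFileSplit() {}                        // no-arg constructor required for reflection

    public SimpleFileSplit(Path filename, long offset) {
        this.filename = filename;
        this.offset = offset;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        Text.writeString(out, filename.toString());    // path as a length-prefixed string
        out.writeLong(offset);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        filename = new Path(Text.readString(in));
        offset = in.readLong();
    }

    @Override
    public long getLength() { return 0; }              // length and locations are stubbed in this sketch

    @Override
    public String[] getLocations() { return new String[0]; }
}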
From source file com.linkedin.cubert.io.virtual.VirtualInputSplit.java
/**
* The input splits for the Virtual storage.
*
* @author Vinitha Gankidi
*
* @param <K>
From source file com.linkedin.haivvreo.AvroGenericRecordWritable.java
/**
 * Wrapper around an Avro GenericRecord. Necessary because Hive's deserializer
 * will happily deserialize any object - as long as it's a writable.
 */
public class AvroGenericRecordWritable implements Writable {
    GenericRecord record;
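One way to make an Avro GenericRecord writable is to serialize its schema as a string and the record itself with Avro's binary encoding. The sketch below illustrates that idea only; it is not the haivvreo implementation (which targets older Hive/Avro versions and may obtain the schema differently), and the class name GenericRecordWritableSketch is invented.

import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class GenericRecordWritableSketch implements Writable {
    private GenericRecord record;

    @Override
    public void write(DataOutput out) throws IOException {
        Text.writeString(out, record.getSchema().toString());       // schema first
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(baos, null);
        new GenericDatumWriter<GenericRecord>(record.getSchema()).write(record, encoder);
        encoder.flush();
        byte[] bytes = baos.toByteArray();
        out.writeInt(bytes.length);                                  // then the binary-encoded record
        out.write(bytes);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        Schema schema = new Schema.Parser().parse(Text.readString(in));
        byte[] bytes = new byte[in.readInt()];
        in.readFully(bytes);
        Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        record = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
    }
}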
From source file com.liveramp.cascading_ext.bloom.BloomFilter.java
/**
 * This bloom filter implementation is based on the org.apache.hadoop.util.bloom.BloomFilter
 * implementation, but was modified to allow 64 bit hashes and larger bloom filters.
 */
public class BloomFilter implements Writable {
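A bloom filter backed by a long[] bit array lends itself to a straightforward Writable implementation: write the number of hash functions, then the bit words. The sketch below is a minimal illustration under those assumptions, not the cascading_ext code; the class name LongBloomFilterSketch is invented.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Writable;

public class LongBloomFilterSketch implements Writable {
    private int numHashes;
    private long[] bits;          // 64 bits per word, allowing very large filters

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(numHashes);
        out.writeInt(bits.length);
        for (long word : bits) {
            out.writeLong(word);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        numHashes = in.readInt();
        bits = new long[in.readInt()];
        for (int i = 0; i < bits.length; i++) {
            bits[i] = in.readLong();
        }
    }
}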
From source file com.m6d.hive.protobuf.Pair.java
public class Pair implements Writable {
    private Writable key;
    private Writable value;

    public Pair() {
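A pair of arbitrary Writables cannot be deserialized without knowing the component types, so one common approach is to write each component's class name before the component and instantiate reflectively when reading. The sketch below shows that approach only; the real com.m6d.hive.protobuf.Pair may handle type resolution differently, and the class name WritablePairSketch is invented.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class WritablePairSketch implements Writable {
    private Writable key;
    private Writable value;

    @Override
    public void write(DataOutput out) throws IOException {
        Text.writeString(out, key.getClass().getName());
        key.write(out);
        Text.writeString(out, value.getClass().getName());
        value.write(out);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        key = newWritable(Text.readString(in));
        key.readFields(in);
        value = newWritable(Text.readString(in));
        value.readFields(in);
    }

    // Instantiate a Writable from its class name via its no-arg constructor.
    private static Writable newWritable(String className) throws IOException {
        try {
            return (Writable) Class.forName(className).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IOException("Cannot instantiate " + className, e);
        }
    }
}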
From source file com.manning.hip.ch4.joins.improved.impl.OutputValue.java
/**
* This abstract class serves as the base class for the values that
* flow from the mappers to the reducers in a data join job.
* Typically, in such a job, the mappers will compute the source
* tag of an input record based on its attributes or based on the
* file name of the input file. This tag will be used by the reducers
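The tagging idea described above can be illustrated with a small Writable that carries a source tag alongside its data, so the reducer can tell which input a record came from. This is a hedged sketch, not the book's actual OutputValue class; the name TaggedValueSketch and the choice of an int tag with a Text payload are assumptions for the example.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class TaggedValueSketch implements Writable {
    private int sourceTag;        // e.g. 0 for one input file, 1 for the other
    private Text data = new Text();

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(sourceTag);
        data.write(out);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        sourceTag = in.readInt();
        data.readFields(in);
    }

    public int getSourceTag() {
        return sourceTag;
    }
}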
From source file com.marcolotz.lung.io.outputFormat.SeriesDataWritable.java
/**
 * Contains all the metadata of a processed DICOM series. A series here means a
 * group of images obtained in the same exam procedure.
*
* @author Marco Aurelio Lotz
From source file com.marcolotz.MapperComponents.ImageMetadata.java
/**
 * After the mapping process, there is no need to keep the Blob itself in the
 * processed image object, just a few of its attributes. This class is used in
 * the Reduce phase as a simple representation of the processed image.
*
* @author Marco Aurelio Lotz
From source file com.marcolotz.MapperComponents.MetaNodesCandidates.java
/**
* This class holds the important metadata from the blob generated in the Map
* phase, in order to make the JSON serialization possible.
*
* @author Marco Aurelio Lotz
*
From source file com.marcolotz.ReducerComponents.ReducedValueWritable.java
/**
 * In this application, a list of MappedValueKeys is generated and later
 * serialized into JSON format. The key value is the same for the mapper and
 * the reducer, since its data should not change if the images are in the same
 * series.
*
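A reduce-side value that accumulates a list of entries for later JSON serialization can implement Writable by writing a count followed by the entries. The sketch below is a minimal illustration of that idea, not the actual ReducedValueWritable; the name ReducedValueSketch and the use of String entries are assumptions for the example.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;

public class ReducedValueSketch implements Writable {
    private List<String> values = new ArrayList<String>();

    public void add(String value) {
        values.add(value);
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeInt(values.size());                  // count, then each entry
        for (String v : values) {
            Text.writeString(out, v);
        }
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        int n = in.readInt();
        values = new ArrayList<String>(n);
        for (int i = 0; i < n; i++) {
            values.add(Text.readString(in));
        }
    }
}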