Usage examples for the org.apache.hadoop.io.Writable interface
From source file com.datasalt.utils.io.WritableIntBloomFilter.java
/**
 * A Writable version of {@link BloomFilter}, ready to be used by Hadoop.
 * It has a fixed 32-bit length, so it can be serialized as an int.
 *
 * @author pere
 */
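The excerpt above describes a Writable whose entire state fits in 32 bits. As a hedged sketch of that pattern only (not the DataSalt implementation, whose internals are not shown here), the fixed-width idea reduces to writing and reading exactly one int through DataOutput/DataInput. A local Writable interface mirroring Hadoop's two-method contract is declared so the example compiles without Hadoop on the classpath; `IntBitsWritable` and `roundTrip` are illustrative names:

```java
import java.io.*;

public class FixedIntWritableDemo {
  // Minimal stand-in for org.apache.hadoop.io.Writable (same two methods),
  // declared locally so this sketch runs without Hadoop on the classpath.
  interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
  }

  // Hypothetical 32-bit state: the whole object is one int, mirroring the
  // "fixed 32-bit length" idea from the excerpt above.
  static class IntBitsWritable implements Writable {
    int bits;

    public void write(DataOutput out) throws IOException {
      out.writeInt(bits);                   // always exactly 4 bytes
    }

    public void readFields(DataInput in) throws IOException {
      bits = in.readInt();
    }
  }

  // Serialize to a byte stream and read back, as Hadoop's machinery would.
  static int roundTrip(int value) throws IOException {
    IntBitsWritable src = new IntBitsWritable();
    src.bits = value;
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    src.write(new DataOutputStream(buf));
    IntBitsWritable dst = new IntBitsWritable();
    dst.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
    return dst.bits;
  }

  public static void main(String[] args) throws IOException {
    System.out.println(roundTrip(0b1010)); // prints 10
  }
}
```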
From source file com.datasalt.utils.mapred.joiner.JoinDatum.java
@SuppressWarnings("rawtypes") public class JoinDatum<T extends TBase> implements Writable { public enum Source { OLD, NEW }
From source file com.datasalt.utils.mapred.joiner.MultiJoinDatum.java
/**
 * This is the class that will be serialized as the value when using the {@link MultiJoiner}.
 * The MultiJoiner API will serialize any object here by using the Hadoop Serialization API.
 *
 * @author pere
 */
From source file com.datascience.hadoop.ListWritable.java
/**
 * Hadoop Writable for lists containing writable fields.
 * <p>
 * This is a small {@link org.apache.hadoop.io.Writable} implementation intended for storing sorted lists of values.
 * Internally, a {@code T[]} pool is used to store and recycle lists of values of the given {@code type} for efficiency.
 */
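The usual layout for a collection Writable like the one described above is a length prefix followed by each element's own serialized bytes; `readFields` must also reset any previous state, because Hadoop reuses Writable instances. The following is a sketch of that pattern under stated assumptions, not the com.datascience.hadoop code (in particular, its `T[]` pooling is omitted); `IntElem` and `IntListWritable` are illustrative names, and Writable is declared locally so no Hadoop dependency is needed:

```java
import java.io.*;
import java.util.*;

public class ListWritableDemo {
  // Local stand-in for org.apache.hadoop.io.Writable.
  interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
  }

  // Illustrative element type (not Hadoop's IntWritable).
  static class IntElem implements Writable {
    int value;
    IntElem() {}
    IntElem(int v) { value = v; }
    public void write(DataOutput out) throws IOException { out.writeInt(value); }
    public void readFields(DataInput in) throws IOException { value = in.readInt(); }
  }

  // Hypothetical list writable: element count first, then each element.
  static class IntListWritable implements Writable {
    final List<IntElem> elems = new ArrayList<>();

    public void write(DataOutput out) throws IOException {
      out.writeInt(elems.size());            // length prefix
      for (IntElem e : elems) e.write(out);  // delegate to each element
    }

    public void readFields(DataInput in) throws IOException {
      int n = in.readInt();
      elems.clear();                         // instances are reused: reset state
      for (int i = 0; i < n; i++) {
        IntElem e = new IntElem();
        e.readFields(in);
        elems.add(e);
      }
    }
  }

  static List<Integer> roundTrip(int... values) throws IOException {
    IntListWritable src = new IntListWritable();
    for (int v : values) src.elems.add(new IntElem(v));
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    src.write(new DataOutputStream(buf));
    IntListWritable dst = new IntListWritable();
    dst.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
    List<Integer> out = new ArrayList<>();
    for (IntElem e : dst.elems) out.add(e.value);
    return out;
  }

  public static void main(String[] args) throws IOException {
    System.out.println(roundTrip(3, 1, 2)); // prints [3, 1, 2]
  }
}
```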
From source file com.datatorrent.stram.util.AbstractWritableAdapter.java
/**
 * Adapter for Hadoop RPC to implement Writable using Java serialization.
 *
 * @since 0.3.2
 */
public abstract class AbstractWritableAdapter implements Writable, Serializable {
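The adapter idea above, implementing the Writable contract by delegating to standard Java serialization, can be sketched as follows. This shows the general pattern only, not DataTorrent's AbstractWritableAdapter: the serialized object is length-prefixed so `readFields` knows how many bytes to consume. `SerializableWritable` and `roundTrip` are hypothetical names, and Writable is again declared locally:

```java
import java.io.*;

public class JavaSerializationWritableDemo {
  // Local stand-in for org.apache.hadoop.io.Writable.
  interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
  }

  // Hypothetical adapter: carries any Serializable payload and fulfils the
  // Writable contract by length-prefixing the Java-serialized bytes.
  static class SerializableWritable<T extends Serializable> implements Writable {
    T payload;

    public void write(DataOutput out) throws IOException {
      ByteArrayOutputStream buf = new ByteArrayOutputStream();
      ObjectOutputStream oos = new ObjectOutputStream(buf);
      oos.writeObject(payload);
      oos.flush();                 // ObjectOutputStream buffers internally
      byte[] bytes = buf.toByteArray();
      out.writeInt(bytes.length);  // length prefix for readFields
      out.write(bytes);
    }

    @SuppressWarnings("unchecked")
    public void readFields(DataInput in) throws IOException {
      byte[] bytes = new byte[in.readInt()];
      in.readFully(bytes);
      try {
        payload = (T) new ObjectInputStream(new ByteArrayInputStream(bytes)).readObject();
      } catch (ClassNotFoundException e) {
        throw new IOException(e);  // surface as IOException per the Writable contract
      }
    }
  }

  static String roundTrip(String value) throws IOException {
    SerializableWritable<String> src = new SerializableWritable<>();
    src.payload = value;
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    src.write(new DataOutputStream(buf));
    SerializableWritable<String> dst = new SerializableWritable<>();
    dst.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));
    return dst.payload;
  }

  public static void main(String[] args) throws IOException {
    System.out.println(roundTrip("hello")); // prints hello
  }
}
```

A trade-off worth noting: Java serialization is convenient for RPC adapters like this, but is bulkier and slower than hand-written `write`/`readFields`, which is why most core Hadoop types implement the methods directly.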
From source file com.digitalpebble.behemoth.BehemothDocument.java
/**
 * Implementation of a Document using Hadoop primitives. A BehemothDocument
 * consists of a URL, content type, binary content, metadata and
 * Annotations.
 */
public class BehemothDocument implements Writable {
From source file com.digitalpebble.behemoth.io.warc.WritableWarcRecord.java
public class WritableWarcRecord implements Writable {

  WarcRecord record = null;

  public WritableWarcRecord() {
    record = new WarcRecord();
From source file com.digitalpebble.behemoth.tika.TextArrayWritable.java
public class TextArrayWritable implements Writable {
  // Hmm, is this the best way to do this?
From source file com.dinglicom.clouder.mapreduce.input.FileSplit.java
/**
 * A section of an input file. Returned by
 * {@link InputFormat#getSplits(JobContext)} and passed to
 * {@link InputFormat#createRecordReader(InputSplit, TaskAttemptContext)}.
 */
//@InterfaceAudience.Public
//@InterfaceStability.Stable
public class FileSplit extends InputSplit implements Writable {
From source file com.ebay.erl.mobius.core.model.ResultWrapper.java
/**
 * <p>
 * This product is licensed under the Apache License, Version 2.0,
 * available at http://www.apache.org/licenses/LICENSE-2.0.
 */