Example usage for org.apache.spark.api.java.function.Function2

Introduction

On this page you can find example usages of the org.apache.spark.api.java.function.Function2 interface, excerpted from open-source projects.

Usage

From source file BwaSingleAlignment.java

public class BwaSingleAlignment extends BwaAlignmentBase
        implements Function2<Integer, Iterator<String>, Iterator<String>> {

    public BwaSingleAlignment(SparkContext context, Bwa bwaInterpreter) {
        super(context, bwaInterpreter);
    }
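The snippet above implements `Function2<Integer, Iterator<String>, Iterator<String>>`, the shape Spark's `JavaRDD.mapPartitionsWithIndex` expects: the function receives a partition index and an iterator over that partition's records, and returns an iterator over the results. As a minimal sketch of that shape (using a local stand-in for the Spark interface so it compiles and runs without Spark on the classpath; the class and method names are illustrative, not from the source above):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class PartitionFunctionSketch {

    // Local stand-in for org.apache.spark.api.java.function.Function2,
    // so this sketch runs without Spark on the classpath.
    interface Function2<T1, T2, R> {
        R call(T1 v1, T2 v2) throws Exception;
    }

    // A per-partition function with the same shape as BwaSingleAlignment:
    // it gets the partition index and an iterator over the partition's
    // records, and returns an iterator over the processed records.
    static final Function2<Integer, Iterator<String>, Iterator<String>> TAG_WITH_PARTITION =
            (index, records) -> {
                List<String> out = new ArrayList<>();
                while (records.hasNext()) {
                    out.add("partition-" + index + ": " + records.next());
                }
                return out.iterator();
            };

    // Convenience wrapper: applies the function and collects the result.
    public static List<String> apply(int index, List<String> records) throws Exception {
        List<String> out = new ArrayList<>();
        Iterator<String> it = TAG_WITH_PARTITION.call(index, records.iterator());
        while (it.hasNext()) {
            out.add(it.next());
        }
        return out;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(apply(0, List.of("read1", "read2")));
    }
}
```

Returning an iterator rather than a materialized list is what lets Spark stream a partition through such a function without holding all results in memory at once.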

From source file BwaPairedAlignment.java

/**
 * Class to perform the alignment over a split from the RDD, producing
 * an RDD that contains the resulting SAM records from the alignment.
 * @author José M. Abuín
 */
public class BwaPairedAlignment extends BwaAlignmentBase

From source file co.cask.cdap.etl.spark.streaming.function.ComputeTransformFunction.java

/**
 * Function used to implement a SparkCompute stage in a DStream.
 *
 * @param <T> type of object in the input rdd
 * @param <U> type of object in the output rdd
 */

From source file co.cask.cdap.etl.spark.streaming.function.DynamicAggregatorAggregate.java

/**
 * Serializable function that can be used to perform the aggregate part of an Aggregator. Dynamically instantiates
 * the Aggregator plugin used to ensure that code changes are picked up and to ensure that macro substitution occurs.
 *
 * @param <GROUP_KEY> type of group key
 * @param <GROUP_VAL> type of group val
 */

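Beyond per-partition and aggregator plumbing, `Function2` is also the shape of Spark's two-argument reduce functions: `JavaPairRDD.reduceByKey` takes a `Function2<V, V, V>` that combines two values for the same key. A minimal sketch of that shape (with a local stand-in for the Spark interface, so it runs without Spark on the classpath):

```java
public class ReduceFunctionSketch {

    // Local stand-in for org.apache.spark.api.java.function.Function2.
    interface Function2<T1, T2, R> {
        R call(T1 v1, T2 v2) throws Exception;
    }

    // The classic reduce shape used by JavaPairRDD.reduceByKey(Function2<V, V, V>):
    // combine two values for the same key into one.
    static final Function2<Integer, Integer, Integer> SUM = (a, b) -> a + b;

    public static int combine(int a, int b) throws Exception {
        return SUM.call(a, b);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(combine(2, 3));
    }
}
```

Because Spark may apply such a function in any order across partitions, it should be associative and commutative, as addition is here.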
From source file co.cask.cdap.etl.spark.streaming.function.DynamicAggregatorGroupBy.java

/**
 * Serializable function that can be used to perform the group by part of an Aggregator. Dynamically instantiates
 * the Aggregator plugin used to ensure that code changes are picked up and to ensure that macro substitution occurs.
 *
 * @param <GROUP_KEY> type of group key
 * @param <GROUP_VAL> type of group val
 */

From source file co.cask.cdap.etl.spark.streaming.function.DynamicJoinMerge.java

/**
 * Serializable function that can be used to perform the merge part of a Joiner. Dynamically instantiates
 * the relevant Joiner plugin to ensure that code changes are picked up and to ensure
 * that macro substitution occurs.
 *
 * @param <JOIN_KEY> type of join key
 */

From source file co.cask.cdap.etl.spark.streaming.function.DynamicJoinOn.java

/**
 * Serializable function that can be used to perform the joinOn part of a Joiner. Dynamically instantiates
 * the relevant Joiner plugin to ensure that code changes are picked up and to ensure
 * that macro substitution occurs.
 *
 * @param <JOIN_KEY> type of join key
 */

From source file co.cask.cdap.etl.spark.streaming.function.DynamicTransform.java

/**
 * Serializable function that can be used to perform a flat map on a DStream. Dynamically instantiates
 * the Transform plugin used to perform the flat map to ensure that code changes are picked up and to ensure
 * that macro substitution occurs.
 *
 * @param <T> type of input object
 */

From source file co.cask.cdap.etl.spark.streaming.function.StreamingBatchSinkFunction.java

/**
 * Function used to write a batch of data to a batch sink for use with a JavaDStream.
 * Note: this does not use the foreachRDD(VoidFunction2) method, because Spark 1.3 does not have VoidFunction2.
 *
 * @param <T> type of object in the rdd
 */
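As the note above explains, Spark 1.3 lacks `VoidFunction2`, so per-batch sinks instead implement `Function2<JavaRDD<T>, Time, Void>` and return `null`. A minimal sketch of that shape (the batch is modelled as a `List<String>` and the `Time` as a `long`, and the Spark interface is replaced by a local stand-in, purely so the sketch runs without Spark on the classpath):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSinkSketch {

    // Local stand-in for org.apache.spark.api.java.function.Function2.
    interface Function2<T1, T2, R> {
        R call(T1 v1, T2 v2) throws Exception;
    }

    // Records "written" by the function; a stand-in for a real batch sink.
    static final List<String> SINK = new ArrayList<>();

    // Per-batch sink shape: the function consumes the batch and its time,
    // performs side effects, and returns null because the result type is Void.
    static final Function2<List<String>, Long, Void> WRITE_BATCH =
            (batch, time) -> {
                for (String record : batch) {
                    SINK.add(time + "/" + record);
                }
                return null; // Void return: side effects only
            };

    // Resets the stand-in sink, writes one batch, and returns what was written.
    public static List<String> writeBatch(List<String> batch, long time) throws Exception {
        SINK.clear();
        WRITE_BATCH.call(batch, time);
        return new ArrayList<>(SINK);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(writeBatch(List.of("a", "b"), 1L));
    }
}
```

In later Spark versions, `JavaDStream.foreachRDD(VoidFunction2)` expresses the same idea without the dummy `null` return.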

From source file com.anhth12.lambda.BatchUpdateFunction.java

/**
 *
 * @author Tong Hoang Anh
 */
public class BatchUpdateFunction<K, M, U> implements Function2<JavaPairRDD<K, M>, Time, Void> {