Example usage for org.apache.mahout.cf.taste.impl.common FastByIDMap clone

Introduction

This page shows example usage for org.apache.mahout.cf.taste.impl.common FastByIDMap clone.

Prototype

@Override
    public FastByIDMap<V> clone() 

Usage

From source file:norbert.mynemo.core.evaluation.PersonnalRecommenderEvaluator.java

License:Apache License

/**
 * If the exhaustive evaluation is off, the training percentage behaves normally. If the exhaustive
 * evaluation is on, the training percentage has two different behaviors:
 * <ul>
 * <li>percentage < 0.5: the percentage behaves normally, and the evaluation is not exhaustive.</li>
 * <li>
 * <p>
 * 0.5 <= percentage: the preferences of the target user are split into several test sets. Each set
 * contains approximately the same number of user preferences. For example, if
 * <code>percentage=1</code>, then the number of sets equals the number of preferences of the
 * target user, each set containing one preference. If <code>percentage=0.5</code>, then two sets
 * are built, each containing half of the preferences. If <code>percentage=0.9</code>, then ten
 * sets are built, each containing one tenth of the preferences. Each set is then tested as if the
 * evaluation were not exhaustive.
 * </p>
 * <p>
 * Thus, <code>1</code> provides the most precise result, but the computation may be intensive.
 * </p>
 * </li>
 * </ul>
 */
@Override
public double evaluate(RecommenderBuilder recommenderBuilder, DataModelBuilder dataModelBuilder,
        DataModel dataModel, double trainingPercentage, double evaluationPercentage) throws TasteException {

    Timer timer = Timer.createStartedTimer();

    // clear the previously computed errors
    errorStats.clear();
    squaredErrorStats.clear();
    predictionRequestNumber = 0;

    // all training preferences except the target user's
    FastByIDMap<PreferenceArray> baseTrainingPreferences = buildBaseTrainingPreferences(dataModel,
            evaluationPercentage);

    List<List<Preference>> testSets = buildTestSets(dataModel, trainingPercentage);

    // the idea is to generate a recommendation for each preference of the
    // target user.
    for (List<Preference> currentTestSet : testSets) {
        // add the preferences of the target user
        FastByIDMap<PreferenceArray> currentTrainingPreferences = baseTrainingPreferences.clone();
        addUserPreferences(dataModel, currentTrainingPreferences, currentTestSet);

        DataModel currentTrainingModel = (dataModelBuilder == null)
                ? new GenericDataModel(currentTrainingPreferences)
                : dataModelBuilder.buildDataModel(currentTrainingPreferences);

        Recommender currentRecommender = recommenderBuilder.buildRecommender(currentTrainingModel);

        evaluate(currentTrainingModel, currentRecommender, currentTestSet);
    }

    duration = timer.stop().getDuration();

    return getEvaluationSummary(metric);
}
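
The `clone()` call above is what keeps each loop iteration independent: every test set gets its own copy of the base training map, so `addUserPreferences` cannot leak preferences into later iterations. The same copy-per-iteration pattern can be sketched with a plain `java.util.HashMap` standing in for `FastByIDMap` (which would require the Mahout jar on the classpath); note that, like `FastByIDMap.clone()`, this copy is shallow, so the map structure is independent while the values themselves are shared:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ClonePerIteration {
    public static void main(String[] args) {
        // Base preferences shared by all iterations, keyed by user ID
        // (a stand-in for baseTrainingPreferences in the evaluator).
        Map<Long, String> basePreferences = new HashMap<>();
        basePreferences.put(1L, "prefsOfUser1");
        basePreferences.put(2L, "prefsOfUser2");

        // Hypothetical target-user IDs, one "test set" per iteration.
        for (List<Long> testSet : List.of(List.of(10L), List.of(11L))) {
            // Fresh copy per iteration, mirroring baseTrainingPreferences.clone().
            Map<Long, String> current = new HashMap<>(basePreferences);
            for (Long userId : testSet) {
                current.put(userId, "targetUserPrefs");
            }
            // Mutations of 'current' stay local to this iteration.
        }

        // The base map is untouched after all iterations.
        System.out.println(basePreferences.size()); // prints 2
    }
}
```

With `FastByIDMap<PreferenceArray>`, cloning the map per test set is much cheaper than rebuilding it from the `DataModel`, which is why the evaluator clones once per iteration rather than calling `buildBaseTrainingPreferences` again.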