List of usage examples for org.apache.lucene.analysis.TokenStream (subclass usage)
From source file org.apache.blur.lucene.security.analysis.DocumentVisibilityTokenStream.java
public class DocumentVisibilityTokenStream extends TokenStream {
  private static final String UTF_8 = "UTF-8";
  private final String _visiblity;
  private final CharTermAttribute _tokenAtt;
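Whatever a TokenStream subclass emits, callers consume it through the same contract: reset(), a loop over incrementToken(), then end() and close(). A minimal consumer sketch (the helper class below is illustrative, not part of the Blur source):

import java.io.IOException;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public final class ConsumeTokenStream {
  // Prints every term a TokenStream produces, following the required
  // reset() / incrementToken() / end() / close() life cycle.
  public static void printTerms(TokenStream stream) throws IOException {
    CharTermAttribute termAtt = stream.addAttribute(CharTermAttribute.class);
    try {
      stream.reset();                    // must be called before the first incrementToken()
      while (stream.incrementToken()) {  // false once the stream is exhausted
        System.out.println(termAtt.toString());
      }
      stream.end();                      // lets the stream record final state (e.g. end offset)
    } finally {
      stream.close();                    // releases underlying resources
    }
  }
}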
From source file org.apache.jackrabbit.core.query.lucene.SingletonTokenStream.java
/**
 * <code>SingletonTokenStream</code> implements a token stream that wraps a
 * single value with a given property type. The property type is stored as a
 * payload on the single returned token.
 */
public final class SingletonTokenStream extends TokenStream {
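On the current attribute-based TokenStream API the same idea looks roughly like the sketch below. This is an approximation of the pattern, not Jackrabbit's implementation, which targets an older Lucene API and encodes the JCR property type into the payload bytes; the class and field names here are invented for illustration:

import java.io.IOException;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.PayloadAttribute;
import org.apache.lucene.util.BytesRef;

// Emits exactly one token carrying the wrapped value, with a one-byte
// payload that records the value's property type.
public final class SingleValueTokenStream extends TokenStream {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final PayloadAttribute payloadAtt = addAttribute(PayloadAttribute.class);
  private final String value;
  private final BytesRef payload;
  private boolean consumed;

  public SingleValueTokenStream(String value, byte propertyType) {
    this.value = value;
    this.payload = new BytesRef(new byte[] { propertyType });
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (consumed) {
      return false;
    }
    clearAttributes();
    termAtt.setEmpty().append(value);
    payloadAtt.setPayload(payload);
    consumed = true;
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    consumed = false;
  }
}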
From source file org.apache.mahout.common.lucene.IteratorTokenStream.java
/** Used to emit tokens from an input string array in the style of TokenStream */
public final class IteratorTokenStream extends TokenStream {
  private final CharTermAttribute termAtt;
  private final Iterator<String> iterator;

  public IteratorTokenStream(Iterator<String> iterator) {
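A sketch of the same iterator-backed pattern against the current attribute API (names invented; the Mahout original may differ in detail). Because the iterator cannot be rewound, such a stream is single-use:

import java.io.IOException;
import java.util.Iterator;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

// Emits one token per element of the supplied iterator.
public final class SimpleIteratorTokenStream extends TokenStream {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final Iterator<String> iterator;

  public SimpleIteratorTokenStream(Iterator<String> iterator) {
    this.iterator = iterator;
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!iterator.hasNext()) {
      return false;
    }
    clearAttributes();
    termAtt.setEmpty().append(iterator.next());
    return true;
  }
}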
From source file org.apache.pylucene.analysis.PythonTokenStream.java
public class PythonTokenStream extends TokenStream {
  private long pythonObject;

  public PythonTokenStream() {
  }
From source file org.apache.solr.legacy.LegacyNumericTokenStream.java
/**
* <b>Expert:</b> This class provides a {@link TokenStream}
* for indexing numeric values that can be used by {@link
* org.apache.solr.legacy.LegacyNumericRangeQuery}.
*
* <p>Note that for simple usage, {@link org.apache.solr.legacy.LegacyIntField}, {@link
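A sketch of how such a stream is typically attached to a field at index time. The precision step, field name, and FieldType settings below are illustrative assumptions, and the exact Field/FieldType plumbing varies between Lucene and Solr versions:

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.FieldType;
import org.apache.lucene.index.IndexOptions;
import org.apache.solr.legacy.LegacyNumericTokenStream;

public final class NumericIndexingSketch {
  public static Document buildDocument(int value) {
    // The stream produces the trie-encoded terms that LegacyNumericRangeQuery expects.
    LegacyNumericTokenStream stream = new LegacyNumericTokenStream(8).setIntValue(value);

    FieldType type = new FieldType();
    type.setTokenized(true);
    type.setOmitNorms(true);
    type.setIndexOptions(IndexOptions.DOCS);

    Document doc = new Document();
    doc.add(new Field("price", stream, type));
    return doc;
  }
}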
From source file org.apache.uima.lucas.indexer.analysis.AnnotationTokenStream.java
/**
*
* AnnotationTokenStream represents a TokenStream which extracts tokens from feature values of
* annotations of a given type from a JCas object. Each token has the start and end offset from the
* annotation object. This class supports only the following UIMA JCas types of features:
* <ol>
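The Lucene side of that idea, exposing per-annotation offsets through OffsetAttribute, can be sketched as below. The Span class stands in for a UIMA annotation (begin/end offsets plus covered text); the UIMA/JCas extraction itself is omitted and all names are invented:

import java.io.IOException;
import java.util.Iterator;
import java.util.List;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

// Stand-in for a UIMA annotation: covered text plus begin/end offsets.
final class Span {
  final String text;
  final int begin;
  final int end;

  Span(String text, int begin, int end) {
    this.text = text;
    this.begin = begin;
    this.end = end;
  }
}

// Emits one token per span, carrying the span's offsets on OffsetAttribute.
public final class SpanTokenStream extends TokenStream {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final OffsetAttribute offsetAtt = addAttribute(OffsetAttribute.class);
  private final List<Span> spans;
  private Iterator<Span> it;

  public SpanTokenStream(List<Span> spans) {
    this.spans = spans;
    this.it = spans.iterator();
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!it.hasNext()) {
      return false;
    }
    Span span = it.next();
    clearAttributes();
    termAtt.setEmpty().append(span.text);
    offsetAtt.setOffset(span.begin, span.end);
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    it = spans.iterator();
  }
}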
From source file org.apache.uima.lucas.indexer.analysis.TokenStreamConcatenator.java
/**
 * A TokenStreamStringConcatenator takes a {@link java.util.Collection Collection} of
 * {@link org.apache.lucene.analysis.TokenStream tokenstreams} and concats them.
 */
public class TokenStreamConcatenator extends TokenStream {
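A minimal concatenation sketch, assuming it is enough to copy only the term text from each delegate; a production implementation would also carry offsets, position increments, and the other attributes (names below are invented):

import java.io.IOException;
import java.util.List;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

// Replays the given streams one after another, copying only the term text.
public final class SimpleConcatenatingTokenStream extends TokenStream {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final List<TokenStream> parts;
  private int index;

  public SimpleConcatenatingTokenStream(List<TokenStream> parts) {
    this.parts = parts;
  }

  @Override
  public boolean incrementToken() throws IOException {
    while (index < parts.size()) {
      TokenStream current = parts.get(index);
      if (current.incrementToken()) {
        CharTermAttribute sourceTerm = current.addAttribute(CharTermAttribute.class);
        clearAttributes();
        termAtt.setEmpty().append(sourceTerm);
        return true;
      }
      current.end();  // delegate is exhausted; move on to the next one
      index++;
    }
    return false;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    for (TokenStream part : parts) {
      part.reset();
    }
    index = 0;
  }

  @Override
  public void close() throws IOException {
    super.close();
    for (TokenStream part : parts) {
      part.close();
    }
  }
}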
From source file org.apache.uima.lucas.indexer.analysis.TokenStreamMerger.java
/**
* A TokenStreamMerger merges a {@link java.util.List list} of
* {@link org.apache.lucene.analysis.TokenStream token streams} by the means of
* their token offsets. Adapts positionIncrement of tokens if their startOffset
* is exactly the same.
*/
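The mechanism that merging relies on is PositionIncrementAttribute: a token whose position increment is 0 is stacked at the same position as the previous token. A small sketch of that mechanism (invented names, not the UIMA Lucas implementation):

import java.io.IOException;

import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;

// Emits two tokens that share one position: the second token's position
// increment is 0, which is how tokens with identical start offsets can be
// stacked on top of each other.
public final class StackedTokensStream extends TokenStream {
  private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
  private final PositionIncrementAttribute posIncrAtt = addAttribute(PositionIncrementAttribute.class);
  private int state;

  @Override
  public boolean incrementToken() throws IOException {
    clearAttributes();
    if (state == 0) {
      termAtt.setEmpty().append("original");
      posIncrAtt.setPositionIncrement(1);  // advances to a new position
    } else if (state == 1) {
      termAtt.setEmpty().append("merged");
      posIncrAtt.setPositionIncrement(0);  // stacked at the same position
    } else {
      return false;
    }
    state++;
    return true;
  }

  @Override
  public void reset() throws IOException {
    super.reset();
    state = 0;
  }
}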