Package edu.berkeley.nlp.lm.cache
Interface ContextEncodedLmCache
- All Superinterfaces: Serializable
- All Known Implementing Classes: ContextEncodedDirectMappedLmCache
Method Summary
- int capacity()
- float getCached(long contextOffset, int contextOrder, int word, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
  Should return Float.NaN if the requested n-gram is not in the cache.
- void putCached(long contextOffset, int contextOrder, int word, float prob, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
Method Details
getCached
float getCached(long contextOffset, int contextOrder, int word, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
Should return Float.NaN if the requested n-gram is not in the cache.
- Parameters:
  contextOffset, contextOrder, word, hash, outputPrefix
- Returns:
  the cached probability, or Float.NaN if the requested n-gram is not in the cache
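As a usage illustration, the sketch below shows the check-then-fill pattern a caching wrapper might follow around getCached and putCached: probe the cache, treat Float.NaN as a miss, compute the probability, and store it for later look-ups. The Scorer callback, the scoreWithCache helper, and the mixHash scheme are assumptions made for this example; only the ContextEncodedLmCache methods are taken from the interface.

import edu.berkeley.nlp.lm.ContextEncodedNgramLanguageModel.LmContextInfo;
import edu.berkeley.nlp.lm.cache.ContextEncodedLmCache;

public final class CacheUsageSketch {

    /** Hypothetical callback that computes the probability on a cache miss. */
    public interface Scorer {
        float score(long contextOffset, int contextOrder, int word, LmContextInfo outputPrefix);
    }

    /**
     * Probe the cache first; a Float.NaN result signals a miss, in which case the
     * probability is computed and stored so the next identical query is a hit.
     */
    public static float scoreWithCache(ContextEncodedLmCache cache, Scorer scorer,
                                       long contextOffset, int contextOrder, int word,
                                       LmContextInfo outputPrefix) {
        // The hash scheme here is an assumption; a real wrapper chooses its own.
        int hash = mixHash(contextOffset, contextOrder, word);
        float cached = cache.getCached(contextOffset, contextOrder, word, hash, outputPrefix);
        if (!Float.isNaN(cached)) {
            return cached; // cache hit
        }
        float prob = scorer.score(contextOffset, contextOrder, word, outputPrefix);
        cache.putCached(contextOffset, contextOrder, word, prob, hash, outputPrefix);
        return prob;
    }

    private static int mixHash(long contextOffset, int contextOrder, int word) {
        long h = contextOffset * 31L + contextOrder;
        h = h * 31L + word;
        return (int) (h ^ (h >>> 32)) & Integer.MAX_VALUE; // non-negative hash
    }
}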
putCached
void putCached(long contextOffset, int contextOrder, int word, float prob, int hash, ContextEncodedNgramLanguageModel.LmContextInfo outputPrefix)
capacity
int capacity()
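To make the role of capacity() concrete, here is a hedged sketch of a toy direct-mapped implementation in which the caller-supplied hash selects one of capacity() slots and a new entry overwrites whatever occupies that slot. It ignores outputPrefix for brevity and is only an illustration under those assumptions, not the actual ContextEncodedDirectMappedLmCache.

import java.util.Arrays;

import edu.berkeley.nlp.lm.ContextEncodedNgramLanguageModel.LmContextInfo;
import edu.berkeley.nlp.lm.cache.ContextEncodedLmCache;

/** Toy direct-mapped cache: one slot per hash bucket, overwritten on collision. */
public final class ToyDirectMappedLmCache implements ContextEncodedLmCache {

    private static final long serialVersionUID = 1L;

    private final long[] contextOffsets;
    private final int[] contextOrders;
    private final int[] words;
    private final float[] probs;

    public ToyDirectMappedLmCache(int numSlots) {
        contextOffsets = new long[numSlots];
        contextOrders = new int[numSlots];
        words = new int[numSlots];
        probs = new float[numSlots];
        Arrays.fill(words, -1);          // no valid word id yet
        Arrays.fill(probs, Float.NaN);   // NaN marks an empty slot
    }

    @Override
    public int capacity() {
        return probs.length; // number of slots addressable through the hash
    }

    @Override
    public float getCached(long contextOffset, int contextOrder, int word, int hash,
                           LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity());
        boolean hit = words[slot] == word && contextOffsets[slot] == contextOffset
            && contextOrders[slot] == contextOrder;
        // A miss (or an empty slot) is signalled by Float.NaN, per the interface contract.
        return hit ? probs[slot] : Float.NaN;
    }

    @Override
    public void putCached(long contextOffset, int contextOrder, int word, float prob, int hash,
                          LmContextInfo outputPrefix) {
        int slot = Math.floorMod(hash, capacity());
        contextOffsets[slot] = contextOffset;
        contextOrders[slot] = contextOrder;
        words[slot] = word;
        probs[slot] = prob;
        // outputPrefix is ignored in this toy version; a fuller cache would also store
        // the output context so it can be returned on a hit.
    }
}

A direct-mapped layout keeps each look-up to a single array probe at the cost of evicting an entry on every hash collision.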