We propose a new method for entity set expansion that achieves highly accurate extraction by suppressing semantic drift while requiring only a small amount of interactive information. In addition to contextual information, we use this interactive information to re-train topic models based on interactive unigram mixtures. Although topic information extracted from a corpus in an unsupervised manner is effective for reducing semantic drift, the granularity of the topic models sometimes mismatches that of the target entities. Interactive unigram mixtures ease this mismatch between topics and target entities using only a few interactively supplied words. We incorporate the resulting interactive topic information into a two-stage discriminative system to achieve stable set expansion. Experiments confirm that the proposed method improves the accuracy of the set expansion system over the baselines examined.
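To make the idea of biasing a unigram-mixture topic model with a few interactive words concrete, the following is a minimal sketch (not the authors' implementation): a mixture-of-unigrams model trained with EM in which interactively supplied seed words receive extra pseudo-counts in one topic. All names and parameter values (seed_words, seed_boost, n_topics, smoothing) are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter
import numpy as np

def train_unigram_mixture(docs, vocab, n_topics=3, seed_words=None,
                          seed_boost=5.0, smoothing=0.1, n_iter=50, rng=None):
    """docs: list of token lists; seed_words: tokens tied to topic 0 (hypothetical setup)."""
    rng = rng or np.random.default_rng(0)
    word_id = {w: i for i, w in enumerate(vocab)}
    V, D, K = len(vocab), len(docs), n_topics

    # Document-word count matrix.
    counts = np.zeros((D, V))
    for d, doc in enumerate(docs):
        for w, c in Counter(doc).items():
            if w in word_id:
                counts[d, word_id[w]] = c

    # Dirichlet-style pseudo-counts; interactive seed words favour topic 0.
    prior = np.full((K, V), smoothing)
    for w in (seed_words or []):
        if w in word_id:
            prior[0, word_id[w]] += seed_boost

    pi = np.full(K, 1.0 / K)                  # topic proportions
    phi = rng.dirichlet(np.ones(V), size=K)   # per-topic word distributions

    for _ in range(n_iter):
        # E-step: document-topic responsibilities, computed in log space.
        log_r = np.log(pi)[None, :] + counts @ np.log(phi).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: re-estimate proportions and word distributions with the prior.
        pi = r.mean(axis=0)
        phi = r.T @ counts + prior
        phi /= phi.sum(axis=1, keepdims=True)
    return pi, phi
```

In a pipeline following the approach described above, the per-document topic responsibilities (or features derived from the learned word distributions) could then be passed as topic features to a downstream discriminative classifier, which is roughly where the two-stage system would plug in; that wiring is omitted here.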