Prototype

class sklearn.naive_bayes.MultinomialNB(alpha=1.0, fit_prior=True, class_prior=None)

Parameters

alpha : float, optional (default=1.0)

Additive (Laplace/Lidstone) smoothing parameter (0 for no smoothing).

fit_prior : boolean, optional (default=True)

Whether to learn class prior probabilities or not. If false, a uniform prior will be used.

class_prior : array-like, size (n_classes,), optional (default=None)

Prior probabilities of the classes. If specified, the priors are not adjusted according to the data.
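The interaction of `fit_prior` and `class_prior` can be seen directly on the fitted attribute `class_log_prior_`. A minimal sketch, using made-up toy data with an imbalanced class distribution:

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy data (assumed for illustration): 4 samples, three of class 0, one of class 1
X = np.array([[2, 1], [1, 0], [3, 2], [0, 4]])
y = np.array([0, 0, 0, 1])

# Default (fit_prior=True): priors learned from class frequencies -> log([3/4, 1/4])
clf = MultinomialNB().fit(X, y)
print(clf.class_log_prior_)

# fit_prior=False: uniform prior regardless of the data -> log([1/2, 1/2])
clf_uniform = MultinomialNB(fit_prior=False).fit(X, y)
print(clf_uniform.class_log_prior_)

# class_prior overrides both: priors are fixed, not adjusted to the data
clf_fixed = MultinomialNB(class_prior=[0.9, 0.1]).fit(X, y)
print(clf_fixed.class_log_prior_)
```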

Notes on alpha

The parameters $\theta_y = (\theta_{y1}, \ldots, \theta_{yn})$ are estimated by a smoothed version of maximum likelihood, i.e. relative frequency counting:

$$\hat{\theta}_{yi} = \frac{N_{yi} + \alpha}{N_y + \alpha n}$$

where $N_{yi} = \sum_{x \in T} x_i$ is the number of times feature $i$ appears in a sample of class $y$ in the training set $T$, and $N_y = \sum_{i=1}^{n} N_{yi}$ is the total count of all features for class $y$.

The smoothing prior $\alpha \ge 0$ accounts for features not present in the learning samples and prevents zero probabilities in further computations. Setting $\alpha = 1$ is called Laplace smoothing, while $\alpha < 1$ is called Lidstone smoothing.
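The smoothed estimate above can be reproduced by hand and checked against the fitted `feature_log_prob_` attribute. A small sketch with assumed toy counts:

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Toy counts (assumed data): 3 samples, 2 classes, 3 features
X = np.array([[3, 0, 1],
              [2, 0, 0],
              [0, 1, 4]])
y = np.array([0, 0, 1])

alpha = 1.0
clf = MultinomialNB(alpha=alpha).fit(X, y)

# Reproduce theta_hat_{yi} = (N_yi + alpha) / (N_y + alpha * n) manually
n_features = X.shape[1]
for c in [0, 1]:
    N_yi = X[y == c].sum(axis=0)   # per-feature counts within class c
    N_y = N_yi.sum()               # total feature count for class c
    theta = (N_yi + alpha) / (N_y + alpha * n_features)
    assert np.allclose(np.log(theta), clf.feature_log_prob_[c])

print("manual smoothed estimates match feature_log_prob_")
```

Note that feature 2 never occurs in class 0, yet its smoothed probability is nonzero (1/9 here), which is exactly what the `alpha` prior guarantees.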

 

Example

>>> import numpy as np
>>> X = np.random.randint(5, size=(6, 100))
>>> y = np.array([1, 2, 3, 4, 5, 6])
>>> from sklearn.naive_bayes import MultinomialNB
>>> clf = MultinomialNB()
>>> clf.fit(X, y)
MultinomialNB(alpha=1.0, class_prior=None, fit_prior=True)
>>> print(clf.predict(X[2:3]))
[3]

  
