Machine Learning Series - Introduction (2): Generative / Discriminative Models

1, The machine learning task

  • The task of machine learning is to predict the label Y from the attributes (features) X, i.e., to estimate the probability P(Y | X)
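
For reference, the two quantities contrasted in the next two sections are linked by Bayes' rule (a standard identity, added here for clarity):

```latex
P(Y \mid X) \;=\; \frac{P(X, Y)}{P(X)}
            \;=\; \frac{P(X \mid Y)\,P(Y)}{\sum_{Y'} P(X \mid Y')\,P(Y')}
```

A discriminative model estimates the left-hand side directly; a generative model estimates the joint P(X, Y) (or P(X | Y) and P(Y)) and recovers P(Y | X) from it.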

 

 

2, Discriminative model

  • A discriminative model learns P(Y | X), so for an unseen example X the label Y can be obtained from P(Y | X) directly, i.e., it can be discriminated right away
  • For a binary classification task, the model actually produces a score; when the score is greater than a threshold the example is classified as the positive class, otherwise as the negative class (see the sketch after this list)
  • As shown on the left of the figure, the model in effect learns the decision boundary directly
  • The discriminative model is called a "discriminative" model because it discriminates the label from the input
  • Common discriminative models
    • Linear regression
    • Support vector machines (SVM)
    • Logistic regression (LR)
    • Neural networks (NN)
    • Gaussian processes
    • Conditional random fields (CRF)
    • Classification and regression trees (CART)
    • Boosting
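
As a concrete illustration of the score-vs-threshold view above, here is a minimal sketch of a discriminative model: a logistic regression trained with plain gradient descent on synthetic data (the data and names are illustrative, not from the original post). It models P(Y = 1 | X) directly and turns the score into a class with a threshold.

```python
# A minimal sketch of a discriminative model: logistic regression models
# P(Y = 1 | X) directly and turns the score into a class with a threshold.
# Data and parameter values here are illustrative, not from the original post.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary data: class 0 centred at (-1, -1), class 1 at (+1, +1)
X = np.vstack([rng.normal(-1, 1, size=(50, 2)), rng.normal(1, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the logistic loss: learn P(Y = 1 | X) directly
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)            # current estimate of P(Y = 1 | X)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

# Prediction: the score P(Y = 1 | X) is compared with a threshold (0.5 here);
# above the threshold -> positive class, otherwise -> negative class.
x_new = np.array([0.5, 0.8])
score = sigmoid(x_new @ w + b)
print("P(Y=1 | x_new) =", score, "-> class", int(score > 0.5))
```

Note that nothing in this sketch models how X itself is distributed; the model only learns the mapping from X to a score for Y.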

 

3, Generative model

  • A generative model learns the joint probability P(X, Y); for an unseen example X, it computes the joint probability of X with each candidate label, and the label with the larger probability wins
  • As shown on the right of the figure, there is no explicit boundary: for the unseen example (the red triangle), the model computes the joint probability under each of the two classes, compares them, and takes the class with the higher probability as the prediction
  • The generative model is called a "generative" model because it predicts based on the joint probability
    • The joint probability can be understood as the probability (or, put differently, the basis) with which the sample (X, Y) is "generated"
  • Specifically,
    • In machine learning, X is known, and Y has to be selected from a set of candidate labels
    • The sample could have been "generated" as (X, Y_1), (X, Y_2), (X, Y_3), ......
    • How the actual data was "generated" depends on the joint probability P(X, Y)
    • The final prediction is therefore the label under which the sample is "generated" with the greatest probability
  • Anyone familiar with Naive Bayes knows that, for an input X, one computes the joint probability with each class and takes the largest as the prediction (see the sketch after this list)
  • Common generative models
    • Gaussian discriminant analysis (GDA)
    • Naive Bayes
    • Gaussian mixture models (GMM)
    • K-nearest neighbors (KNN)
    • Hidden Markov models (HMM)
    • Bayesian networks
    • Sigmoid belief networks
    • Markov random fields (MRF)
    • Deep belief networks (DBN)
    • Latent Dirichlet allocation (LDA)
    • Mixture of experts (MoE)
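
To make the Naive Bayes remark above concrete, here is a minimal sketch of the generative route: estimate P(Y) and P(X | Y) from counts (with Laplace smoothing), form the joint probability P(X, Y) for every class, and predict the class whose joint probability is largest. The tiny dataset and feature layout are made up purely for illustration.

```python
# A minimal Naive Bayes sketch of the generative route: estimate P(Y) and
# P(X | Y) from counts, form the joint P(X, Y_i) = P(Y_i) * P(X | Y_i) for
# every class, and predict the class with the largest joint probability.
# The tiny dataset below is made up for illustration only.
from collections import Counter, defaultdict

# Each sample: (two binary features, label)
data = [
    ((1, 0), "A"), ((1, 1), "A"), ((1, 0), "A"),
    ((0, 1), "B"), ((0, 0), "B"), ((0, 1), "B"),
]

labels = [y for _, y in data]
prior = {c: n / len(data) for c, n in Counter(labels).items()}  # P(Y)

# Per-class, per-feature value counts -> used for P(x_j | Y)
cond = defaultdict(lambda: defaultdict(Counter))
for x, y in data:
    for j, v in enumerate(x):
        cond[y][j][v] += 1

def p_feature(y, j, v):
    """P(x_j = v | Y = y) with Laplace smoothing over the 2 possible values."""
    counts = cond[y][j]
    return (counts[v] + 1) / (sum(counts.values()) + 2)

def joint(x, y):
    """P(X = x, Y = y) under the naive feature-independence assumption."""
    p = prior[y]
    for j, v in enumerate(x):
        p *= p_feature(y, j, v)
    return p

# For a new input, compute the joint probability with every class and
# take the largest one as the prediction.
x_new = (1, 1)
scores = {y: joint(x_new, y) for y in prior}
print(scores)
print("prediction:", max(scores, key=scores.get))
```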

 

4, An analogy

  • Task: determine whether a given animal is a goat or a sheep
  • Discriminative model:
    • Learn a model from the historical data
    • Extract the features of this animal and feed them to the model to predict whether it is a goat or a sheep
    • In other words, the model gives the probability of goat vs. sheep directly from the features
  • Generative model:
    • First learn a model of goats from the features of goats
    • Then learn a model of sheep from the features of sheep
    • Extract the features of the new animal, plug them into the goat model to see how probable they are under it, and plug them into the sheep model to see how probable they are under it
    • Whichever class's model gives the larger probability, the animal belongs to that class
    • In other words, the generative model tries every class and finally takes the most probable one (a concrete sketch follows)
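
One concrete way to read this analogy (an assumed model, for illustration only; the post does not name one) is to fit a simple Gaussian to each class over a single feature, say body weight, and then ask under which class model a new animal's feature is more probable:

```python
# An illustrative sketch (assumed model): fit one Gaussian per class over a
# single hypothetical feature (body weight), then see which class model
# "generates" a new animal's feature with higher probability.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
goat_weights = rng.normal(30, 5, size=100)   # hypothetical goat training data
sheep_weights = rng.normal(60, 8, size=100)  # hypothetical sheep training data

# "Learn a model of each class": one Gaussian fitted to each class's feature
goat_model = norm(goat_weights.mean(), goat_weights.std())
sheep_model = norm(sheep_weights.mean(), sheep_weights.std())

# Class priors P(Y), equal here because the training sets have equal size
p_goat = p_sheep = 0.5

# A new animal: plug its feature into each class model and compare
x_new = 45.0
score_goat = p_goat * goat_model.pdf(x_new)     # proportional to P(X, goat)
score_sheep = p_sheep * sheep_model.pdf(x_new)  # proportional to P(X, sheep)
print("goat:", score_goat, "sheep:", score_sheep)
print("prediction:", "goat" if score_goat > score_sheep else "sheep")
```

Multiplying the class-conditional density by the class prior gives the joint quantity P(X, Y), which is exactly what the generative model compares across classes.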

 

 

Reference blogs:

https://www.zhihu.com/question/20446337 (What is the difference between a "discriminative model" and a "generative model" in machine learning?)

https://blog.csdn.net/u010358304/article/details/79748153 (Discriminative models vs. generative models)

https://www.nowcoder.com/questionTerminal/e7ac0572b29a490da333d2c7ff8623ac?orderByHotValue=0&done=0&pos=1&mutiTagIds=631&onlyReference=false (Discriminative and generative models)

 
