gh9

Reputation: 10703

Perceptron: why use it for things that are linearly separable?

Suppose we stick with an X and Y axis, the X axis being time and the Y axis being test scores, where more time equates to higher test scores. You can use a binary classification algorithm to predict success. Wouldn't two if statements do the same thing?

if time > some_value:
    print("User will probably pass")

Another scenario: I have papayas, and they have two features, squishiness and color.

if squishiness > 7 and color == "green":  # squishiness on a scale of 1-10
    print("It is a good papaya")
else:
    print("It isn't ripe")

I don't understand the value of a perceptron in these scenarios.

Upvotes: 0

Views: 104

Answers (4)

user8662125

Reputation: 21

Adding to the other answers: what if the target is non-linear in nature? Perceptron-based learning helps to model complex functions, on which a linear classifier would perform badly.
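As a hedged sketch of that point (my own example, not from this answer): the XOR pattern is not linearly separable, so a single linear unit cannot fit it, but a small multi-layer network built from perceptron-like units can. The layer size, solver and seed below are arbitrary choices.

import numpy as np
from sklearn.neural_network import MLPClassifier

# XOR-like target: no single line separates the classes,
# so a linear classifier fails, but a small MLP can fit it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

clf = MLPClassifier(hidden_layer_sizes=(8,), activation="tanh",
                    solver="lbfgs", max_iter=5000, random_state=0)
clf.fit(X, y)
print(clf.predict(X))  # ideally recovers [0, 1, 1, 0]; a different seed may be needed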

Upvotes: 0

bogatron

Reputation: 19179

You've created two ad hoc scenarios in which a simple rule like that works (because your linear boundaries align with your feature axes). But in general, your decision tree would need to be much more complex (even infinitely deep) to perfectly model a linear decision boundary.

Consider the case where the true decision boundary is

test_score = 3 * time

where all points above the line are "will probably pass" and all points below are "will probably fail". The number of if statements you would need grows with the number of samples. On the other hand, a single Perceptron node can easily model that case.
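To make that concrete, here is a minimal sketch (my own illustration, not part of the original answer) of a single perceptron trained on synthetic (time, test_score) points whose true boundary is test_score = 3 * time; the data and names are made up for the example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: columns are (time, test_score); label +1 above the line, -1 below.
X = rng.uniform(0, 10, size=(200, 2))
y = np.where(X[:, 1] > 3 * X[:, 0], 1, -1)

# One weight per feature plus a bias term.
w = np.zeros(2)
b = 0.0

# Classic perceptron rule: update only on misclassified points.
for _ in range(100):
    for xi, yi in zip(X, y):
        if yi * (np.dot(w, xi) + b) <= 0:
            w += yi * xi
            b += yi

print("learned weights and bias:", w, b)
print("training accuracy:", (np.sign(X @ w + b) == y).mean())

A single learned weight vector replaces what would otherwise be a long cascade of axis-aligned if statements.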

The main point is that Perceptrons model linear decision boundaries that do not have to align with your feature axes. So in many practical cases, you can use a single Perceptron (or similarly, logistic regression) to model a decision boundary that would be much more complex (and/or less accurate) than simple feature threshold rules (which basically correspond to a decision tree).

Upvotes: 1

mlvalidated

Reputation: 9

A linear model will do, yes. You could also do these classification tasks with Perceptrons. You use ANNs and other algorithms based on Perceptrons when:

  • Input is high-dimensional, discrete or real-valued (e.g. raw sensor input)
  • Output is discrete or real-valued, or a vector of values
  • Possibly noisy data
  • Form of target function is unknown
  • Human readability of result is unimportant
  • Examples: speech recognition, image classification

Using them for less complex tasks is overkill; you are right.

Upvotes: 0

lejlot

Reputation: 66815

The value is simple - a perceptron, or any other learning algorithm, learns the rules; the alternative is to design them by hand, as you did. And how would you find these optimal values if the combination involves not 2 but 100 factors? What if the rules are not "clean", but require accepting some false predictions to maximise the probability of correct ones?
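As a hedged illustration (my own sketch with made-up data, not lejlot's code): with 100 factors you could not hand-tune the thresholds, but a perceptron can still learn a linear rule from labeled examples.

import numpy as np
from sklearn.linear_model import Perceptron

rng = np.random.default_rng(1)

# Hypothetical data: 1000 examples described by 100 factors.
n_samples, n_factors = 1000, 100
X = rng.normal(size=(n_samples, n_factors))

# Hidden "true" rule: a weighted combination of all factors, unknown in practice.
true_w = rng.normal(size=n_factors)
y = (X @ true_w > 0).astype(int)

clf = Perceptron(max_iter=1000, tol=1e-3).fit(X, y)
print("training accuracy:", clf.score(X, y))

The learned coefficients in clf.coef_ play the role of the hand-chosen thresholds in the papaya rule.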

In general you are totally right - for simple, linearly separable data in a low-dimensional space there is no point in using ML. In fact, no one uses the good old perceptron for anything anymore; it was just a proof of concept, which gave rise to a huge number of complex and powerful statistical learning methods.

Upvotes: 0
