  • Thank you for the answer. My apologies for the confusion, but when you mentioned a neural network, which part of it did you mean? Inputs are weighted, summed, and then passed to a squashing function in the hidden layer, which introduces the non-linearity (is this process reiterated to optimize the input weights?). From your TensorFlow Playground link I saw that the activation functions' outputs are weighted too. Commented Jul 8, 2018 at 10:01
  • About the first part of your comment: neural nets can have a single neuron or multiple neurons arranged in different layers. Are you familiar with the MLP (multi-layer perceptron)? Commented Jul 8, 2018 at 10:12
  • No, unfortunately I'm not. I'm trying to understand how a single-neuron separator works before moving on to the multi-layer perceptron. I thought the inputs were weighted by an optimal linear function (found via the normal equations if the data set is small), then summed and passed to a squashing function whose output is classified as -1, 0, or 1; is this correct? (A sketch of this forward pass is given after these comments.) I'm guessing multi-layer perceptrons are the way to clear up the confusion. Thanks a lot. Commented Jul 8, 2018 at 10:27
  • @ShellRox I understand, since I've had those difficulties too :) Welcome to our community. I recommend first taking a look here, here, and here. Commented Jul 8, 2018 at 10:32
  • I understood linear regression in depth from Professor Gilbert Strang's linear algebra course, but neural networks obviously involve a different concept. Thanks for the links and the course; they will be very helpful! Commented Jul 8, 2018 at 10:49
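
For readers following the thread, here is a minimal sketch of the single-neuron forward pass described above: inputs are weighted, summed with a bias, and passed through a squashing function. The tanh activation, the sign-based decision rule, and all numeric values are illustrative assumptions, not the exact setup discussed in the answer.

```python
# Minimal single-neuron forward pass (illustrative sketch, not the
# original poster's code). Assumes a tanh squashing function and a
# sign-based decision rule; all numeric values are hypothetical.
import numpy as np

def neuron_output(x, w, b):
    """Weighted sum of the inputs plus a bias, squashed by tanh into (-1, 1)."""
    z = np.dot(w, x) + b   # linear combination of the inputs
    return np.tanh(z)      # non-linear squashing function

# Two inputs with hand-picked (hypothetical) weights and bias
x = np.array([0.5, -1.2])
w = np.array([0.8, 0.3])
b = 0.1

y = neuron_output(x, w, b)   # a value in (-1, 1)
decision = np.sign(y)        # threshold to -1, 0, or +1
print(y, decision)
```

The learning step the comments ask about, i.e. how the weights and bias are chosen, is handled by a training procedure such as gradient descent; the sketch above covers only the forward computation.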