Hello World,
This is Saumya, and I am here to help you understand and implement Linear
Regression in more detail. We will discuss various problems we may encounter
while training our model, along with some techniques to solve those problems.
There won't be any more programming in this post, although you can try out
whatever is discussed in this blog yourself.

So now, first of all, let's recall what we studied about Linear Regression in our previous blog. We first discussed some notation for machine learning in general, then the hypothesis function, hθ(x⁽ⁱ⁾) = θ₀x₀ + θ₁x₁. Further, we discussed training the model using the training set by running the gradient descent algorithm over it, and we also discussed the Cost Function.

Now, before we begin, I want to talk about the Cost Function in brief. The cost function, as we defined it, is J(θ) = (1/(2m)) ∑ᵢ₌₁ᵐ (hθ(x⁽ⁱ⁾) − y⁽ⁱ⁾)². If we were to describe the cost function in words, it is the function whose value penali…
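Although this post stays away from programming, the recap above can be sketched in a few lines of Python so the formulas are concrete. This is a minimal illustration, not the previous post's actual code: it assumes the features are stacked in a matrix `X` whose first column is all ones (so that x₀ = 1 plays the role of the bias term), and the names `hypothesis` and `cost` are my own.

```python
import numpy as np

def hypothesis(theta, X):
    # h_theta(x) = theta0 * x0 + theta1 * x1, vectorized over all m examples
    return X @ theta

def cost(theta, X, y):
    # J(theta) = (1 / (2m)) * sum over i of (h_theta(x_i) - y_i)^2
    m = len(y)
    residuals = hypothesis(theta, X) - y
    return residuals @ residuals / (2 * m)

# Toy data: x0 is the bias column of ones, and y = 2 * x1 exactly
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

print(cost(np.array([0.0, 2.0]), X, y))  # perfect fit, so the cost is 0.0
print(cost(np.array([0.0, 0.0]), X, y))  # a bad fit is penalised with a large cost
```

Notice that the perfect parameters θ = (0, 2) drive J(θ) to zero, while θ = (0, 0) is penalised heavily, which is exactly the behaviour the definition above is getting at.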