From 2ace6e8deb934359e6389117542bc3cfb78237a0 Mon Sep 17 00:00:00 2001
From: Navan Chauhan
Date: Tue, 21 Jan 2020 21:43:53 +0530
Subject: Publish deploy 2020-01-21 21:43

---
 posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

(limited to 'posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html')

diff --git a/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html b/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
index 06e7974..0e3dd19 100644
--- a/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
+++ b/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
@@ -366,4 +366,4 @@ plt.show()

Results and Conclusion

You just learnt Polynomial Regression using TensorFlow!
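As a recap, here is a minimal sketch of the kind of fit this post builds. It assumes TensorFlow 2.x eager mode with GradientTape; the post's own code may use a different (e.g. session-based) API, and the quadratic data, the coefficient names a, b, c, the learning rate, and the epoch count are illustrative choices, not taken from the post.

import numpy as np
import tensorflow as tf

# Synthetic quadratic data with a little noise (illustrative, not the post's dataset)
x = np.linspace(-1.0, 1.0, 50).astype(np.float32)
y = (2.0 * x**2 + 0.5 * x + 1.0 + np.random.normal(0.0, 0.05, 50)).astype(np.float32)

# Trainable coefficients of y_hat = a*x^2 + b*x + c
a = tf.Variable(0.0)
b = tf.Variable(0.0)
c = tf.Variable(0.0)

optimizer = tf.optimizers.SGD(learning_rate=0.1)

for epoch in range(500):
    with tf.GradientTape() as tape:
        y_hat = a * x**2 + b * x + c
        loss = tf.reduce_mean(tf.square(y_hat - y))  # mean squared error
    grads = tape.gradient(loss, [a, b, c])
    optimizer.apply_gradients(zip(grads, [a, b, c]))

print(a.numpy(), b.numpy(), c.numpy())  # should land near 2.0, 0.5 and 1.0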

Notes

Overfitting

> Overfitting refers to a model that models the training data too well. Overfitting happens when a model learns the detail and noise in the training data to the extent that it negatively impacts the performance of the model on new data. This means that the noise or random fluctuations in the training data is picked up and learned as concepts by the model. The problem is that these concepts do not apply to new data and negatively impact the model's ability to generalize.

Source: Machine Learning Mastery

Basically, if you train your machine learning model on a small dataset for a very large number of epochs, the model learns all the deformities/noise in the data and starts treating them as a normal part of the signal. When it later sees new data, it treats whatever does not match those memorised quirks as noise, which hurts the model's accuracy on unseen data. The small sketch below makes this concrete.
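This demo is not from the post; the sine-wave data, polynomial degrees, and random seed are assumptions made purely for illustration. Fitting polynomials of increasing degree to just ten noisy samples shows the training error collapsing while the error on unseen data grows.

import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 10)                   # only ten noisy samples
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, 10)
x_test = np.linspace(0.0, 1.0, 100)                   # clean data the model never saw
y_test = np.sin(2 * np.pi * x_test)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)     # least-squares polynomial fit
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE = {train_mse:.4f}, test MSE = {test_mse:.4f}")

# The degree-9 polynomial passes through all ten training points (train MSE near 0)
# but swings wildly between them, so it typically has by far the largest test MSE.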

Tagged with: