author     Navan Chauhan <navanchauhan@gmail.com>    2020-09-15 15:53:28 +0530
committer  Navan Chauhan <navanchauhan@gmail.com>    2020-09-15 15:53:28 +0530
commit     103beb518fc01535d1a5edb9a8d754816e53ec2c (patch)
tree       a815ec9445fb4d30500c06d24359505da6fcd1da /posts/2019-12-16-TensorFlow-Polynomial-Regression
parent     13a4f4238adeee4ff3f7233ff82a8058ca2ffcb3 (diff)
Publish deploy 2020-09-15 15:53
Diffstat (limited to 'posts/2019-12-16-TensorFlow-Polynomial-Regression')
-rw-r--r--  posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html  10
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html b/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
index b9322b0..75b14ec 100644
--- a/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
+++ b/posts/2019-12-16-TensorFlow-Polynomial-Regression/index.html
@@ -1,9 +1,9 @@
-<!DOCTYPE html><html lang="en"><head><meta charset="UTF-8"/><meta name="og:site_name" content="Navan Chauhan"/><link rel="canonical" href="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><meta name="twitter:url" content="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><meta name="og:url" content="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><title>Polynomial Regression Using TensorFlow | Navan Chauhan</title><meta name="twitter:title" content="Polynomial Regression Using TensorFlow | Navan Chauhan"/><meta name="og:title" content="Polynomial Regression Using TensorFlow | Navan Chauhan"/><meta name="description" content="Polynomial regression using TensorFlow"/><meta name="twitter:description" content="Polynomial regression using TensorFlow"/><meta name="og:description" content="Polynomial regression using TensorFlow"/><meta name="twitter:card" content="summary"/><link rel="stylesheet" href="/styles.css" type="text/css"/><meta name="viewport" content="width=device-width, initial-scale=1.0"/><link rel="shortcut icon" href="/images/favicon.png" type="image/png"/><link rel="alternate" href="/feed.rss" type="application/rss+xml" title="Subscribe to Navan Chauhan"/><meta name="twitter:image" content="https://navanchauhan.github.io/images/logo.png"/><meta name="og:image" content="https://navanchauhan.github.io/images/logo.png"/></head><head><script>var _paq=window._paq=window._paq||[];_paq.push(['trackPageView']),_paq.push(['enableLinkTracking']),function(){var a='https://navanspi.duckdns.org:6969/analytics/';_paq.push(['setTrackerUrl',a+'matomo.php']),_paq.push(['setSiteId','2']);var e=document,t=e.createElement('script'),p=e.getElementsByTagName('script')[0];t.type='text/javascript',t.async=!0,t.src=a+'matomo.js',p.parentNode.insertBefore(t,p)}();</script></head><head><script src="https://www.googletagmanager.com/gtag/js?id=UA-108635191-1v"></script><script>window.dataLayer = window.dataLayer || [];function gtag(){dataLayer.push(arguments);}gtag('js', new Date());gtag('config', 'UA-108635191-1');</script></head><body class="item-page"><header><div class="wrapper"><a class="site-name" href="/">Navan Chauhan</a><nav><ul><li><a href="/about">About Me</a></li><li><a class="selected" href="/posts">Posts</a></li><li><a href="/publications">Publications</a></li><li><a href="/assets/résumé.pdf">Résumé</a></li><li><a href="https://navanchauhan.github.io/repo">Repo</a></li></ul></nav></div></header><div class="wrapper"><article><div class="content"><span class="reading-time">17 minute read</span><span class="reading-time">Created on December 16, 2019</span><span class="reading-time">Last modified on June 1, 2020</span><h1>Polynomial Regression Using TensorFlow</h1><p><strong>In this tutorial you will learn about polynomial regression and how you can implement it in Tensorflow.</strong></p><p>In this, we will be performing polynomial regression using 5 types of equations -</p><ul><li>Linear</li><li>Quadratic</li><li>Cubic</li><li>Quartic</li><li>Quintic</li></ul><h2>Regression</h2><h3>What is Regression?</h3><p>Regression is a statistical measurement that is used to try to determine the relationship between a dependent variable (often denoted by Y), and series of varying variables (called independent variables, often denoted by X ).</p><h3>What is Polynomial Regression</h3><p>This is a form of Regression Analysis where the relationship between Y and X is denoted as the nth degree/power of X. 
Polynomial regression even fits a non-linear relationship (e.g when the points don't form a straight line).</p><h2>Imports</h2><pre><code><div class="highlight"><span></span><span class="kn">import</span> <span class="nn">tensorflow.compat.v1</span> <span class="k">as</span> <span class="nn">tf</span>
+<!DOCTYPE html><html lang="en"><head><meta charset="UTF-8"/><meta name="og:site_name" content="Navan Chauhan"/><link rel="canonical" href="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><meta name="twitter:url" content="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><meta name="og:url" content="https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regression"/><title>Polynomial Regression Using TensorFlow | Navan Chauhan</title><meta name="twitter:title" content="Polynomial Regression Using TensorFlow | Navan Chauhan"/><meta name="og:title" content="Polynomial Regression Using TensorFlow | Navan Chauhan"/><meta name="description" content="Polynomial regression using TensorFlow"/><meta name="twitter:description" content="Polynomial regression using TensorFlow"/><meta name="og:description" content="Polynomial regression using TensorFlow"/><meta name="twitter:card" content="summary"/><link rel="stylesheet" href="/styles.css" type="text/css"/><meta name="viewport" content="width=device-width, initial-scale=1.0"/><link rel="shortcut icon" href="/images/favicon.png" type="image/png"/><link rel="alternate" href="/feed.rss" type="application/rss+xml" title="Subscribe to Navan Chauhan"/><meta name="twitter:image" content="https://navanchauhan.github.io/images/logo.png"/><meta name="og:image" content="https://navanchauhan.github.io/images/logo.png"/></head><head><script>var _paq=window._paq=window._paq||[];_paq.push(['trackPageView']),_paq.push(['enableLinkTracking']),function(){var a='https://navanspi.duckdns.org:6969/analytics/';_paq.push(['setTrackerUrl',a+'matomo.php']),_paq.push(['setSiteId','2']);var e=document,t=e.createElement('script'),p=e.getElementsByTagName('script')[0];t.type='text/javascript',t.async=!0,t.src=a+'matomo.js',p.parentNode.insertBefore(t,p)}();</script></head><head><script src="https://www.googletagmanager.com/gtag/js?id=UA-108635191-1v"></script><script>window.dataLayer = window.dataLayer || [];function gtag(){dataLayer.push(arguments);}gtag('js', new Date());gtag('config', 'UA-108635191-1');</script></head><body class="item-page"><header><div class="wrapper"><a class="site-name" href="/">Navan Chauhan</a><nav><ul><li><a href="/about">About Me</a></li><li><a class="selected" href="/posts">Posts</a></li><li><a href="/publications">Publications</a></li><li><a href="/assets/résumé.pdf">Résumé</a></li><li><a href="https://navanchauhan.github.io/repo">Repo</a></li></ul></nav></div></header><div class="wrapper"><article><div class="content"><span class="reading-time">17 minute read</span><span class="reading-time">Created on December 16, 2019</span><span class="reading-time">Last modified on September 15, 2020</span><h1>Polynomial Regression Using TensorFlow</h1><p><strong>In this tutorial you will learn about polynomial regression and how you can implement it in TensorFlow.</strong></p><p>In this tutorial, we will perform polynomial regression using five types of equations:</p><ul><li>Linear</li><li>Quadratic</li><li>Cubic</li><li>Quartic</li><li>Quintic</li></ul><h2>Regression</h2><h3>What is Regression?</h3><p>Regression is a statistical method used to determine the relationship between a dependent variable (often denoted by Y) and a series of varying variables (called independent variables, often denoted by X).</p><h3>What is Polynomial Regression?</h3><p>This is a form of regression analysis where the relationship between Y and X is modelled as an nth-degree polynomial in X. 
Polynomial regression can even fit a non-linear relationship (e.g. when the points don't form a straight line).</p><h2>Imports</h2><pre><code><div class="highlight"><span></span><span class="kn">import</span> <span class="nn">tensorflow.compat.v1</span> <span class="k">as</span> <span class="nn">tf</span>
<span class="n">tf</span><span class="o">.</span><span class="n">disable_v2_behavior</span><span class="p">()</span>
<span class="kn">import</span> <span class="nn">matplotlib.pyplot</span> <span class="k">as</span> <span class="nn">plt</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
-</div></code></pre><h2>Dataset</h2><h3>Creating Random Data</h3><p>Even though in this tutorial we will use a Position Vs Salary datasset, it is important to know how to create synthetic data</p><p>To create 50 values spaced evenly between 0 and 50, we use NumPy's linspace funtion</p><p><code>linspace(lower_limit, upper_limit, no_of_observations)</code></p><pre><code><div class="highlight"><span></span><span class="n">x</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">linspace</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">50</span><span class="p">,</span> <span class="mi">50</span><span class="p">)</span>
+</div></code></pre><h2>Dataset</h2><h3>Creating Random Data</h3><p>Even though in this tutorial we will use a Position vs Salary dataset, it is important to know how to create synthetic data.</p><p>To create 50 values spaced evenly between 0 and 50, we use NumPy's linspace function:</p><p><code>linspace(lower_limit, upper_limit, no_of_observations)</code></p><pre><code><div class="highlight"><span></span><span class="n">x</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">linspace</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">50</span><span class="p">,</span> <span class="mi">50</span><span class="p">)</span>
<span class="n">y</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">linspace</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="mi">50</span><span class="p">,</span> <span class="mi">50</span><span class="p">)</span>
</div></code></pre><p>We use the following function to add noise to the data, so that our values do not all lie on a perfectly straight line:</p><pre><code><div class="highlight"><span></span><span class="n">x</span> <span class="o">+=</span> <span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">uniform</span><span class="p">(</span><span class="o">-</span><span class="mi">4</span><span class="p">,</span> <span class="mi">4</span><span class="p">,</span> <span class="mi">50</span><span class="p">)</span>
<span class="n">y</span> <span class="o">+=</span> <span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">uniform</span><span class="p">(</span><span class="o">-</span><span class="mi">4</span><span class="p">,</span> <span class="mi">4</span><span class="p">,</span> <span class="mi">50</span><span class="p">)</span>
@@ -22,7 +22,7 @@
<span class="o">|</span> <span class="n">Senior</span> <span class="n">Partner</span> <span class="o">|</span> <span class="mi">8</span> <span class="o">|</span> <span class="mi">300000</span> <span class="o">|</span>
<span class="o">|</span> <span class="n">C</span><span class="o">-</span><span class="n">level</span> <span class="o">|</span> <span class="mi">9</span> <span class="o">|</span> <span class="mi">500000</span> <span class="o">|</span>
<span class="o">|</span> <span class="n">CEO</span> <span class="o">|</span> <span class="mi">10</span> <span class="o">|</span> <span class="mi">1000000</span> <span class="o">|</span>
-</div></code></pre><p>We convert the salary column as the ordinate (y-cordinate) and level column as the abscissa</p><pre><code><div class="highlight"><span></span><span class="n">abscissa</span> <span class="o">=</span> <span class="n">df</span><span class="p">[</span><span class="s2">&quot;Level&quot;</span><span class="p">]</span><span class="o">.</span><span class="n">to_list</span><span class="p">()</span> <span class="c1"># abscissa = [1,2,3,4,5,6,7,8,9,10]</span>
+</div></code></pre><p>We take the Salary column as the ordinate (y-coordinate) and the Level column as the abscissa (x-coordinate):</p><pre><code><div class="highlight"><span></span><span class="n">abscissa</span> <span class="o">=</span> <span class="n">df</span><span class="p">[</span><span class="s2">&quot;Level&quot;</span><span class="p">]</span><span class="o">.</span><span class="n">to_list</span><span class="p">()</span> <span class="c1"># abscissa = [1,2,3,4,5,6,7,8,9,10]</span>
<span class="n">ordinate</span> <span class="o">=</span> <span class="n">df</span><span class="p">[</span><span class="s2">&quot;Salary&quot;</span><span class="p">]</span><span class="o">.</span><span class="n">to_list</span><span class="p">()</span> <span class="c1"># ordinate = [45000,50000,60000,80000,110000,150000,200000,300000,500000,1000000]</span>
</div></code></pre><pre><code><div class="highlight"><span></span><span class="n">n</span> <span class="o">=</span> <span class="nb">len</span><span class="p">(</span><span class="n">abscissa</span><span class="p">)</span> <span class="c1"># no of observations</span>
<span class="n">plt</span><span class="o">.</span><span class="n">scatter</span><span class="p">(</span><span class="n">abscissa</span><span class="p">,</span> <span class="n">ordinate</span><span class="p">)</span>
@@ -32,7 +32,7 @@
<span class="n">plt</span><span class="o">.</span><span class="n">show</span><span class="p">()</span>
</div></code></pre><img src="/assets/gciTales/03-regression/1.png"/><h2>Defining Stuff</h2><pre><code><div class="highlight"><span></span><span class="n">X</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">placeholder</span><span class="p">(</span><span class="s2">&quot;float&quot;</span><span class="p">)</span>
<span class="n">Y</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">placeholder</span><span class="p">(</span><span class="s2">&quot;float&quot;</span><span class="p">)</span>
-</div></code></pre><h3>Defining Variables</h3><p>We first define all the coefficients and constant as tensorflow variables haveing a random intitial value</p><pre><code><div class="highlight"><span></span><span class="n">a</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">randn</span><span class="p">(),</span> <span class="n">name</span> <span class="o">=</span> <span class="s2">&quot;a&quot;</span><span class="p">)</span>
+</div></code></pre><h3>Defining Variables</h3><p>We first define all the coefficients and the constant term as TensorFlow variables, each with a random initial value:</p><pre><code><div class="highlight"><span></span><span class="n">a</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">randn</span><span class="p">(),</span> <span class="n">name</span> <span class="o">=</span> <span class="s2">&quot;a&quot;</span><span class="p">)</span>
<span class="n">b</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">randn</span><span class="p">(),</span> <span class="n">name</span> <span class="o">=</span> <span class="s2">&quot;b&quot;</span><span class="p">)</span>
<span class="n">c</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">randn</span><span class="p">(),</span> <span class="n">name</span> <span class="o">=</span> <span class="s2">&quot;c&quot;</span><span class="p">)</span>
<span class="n">d</span> <span class="o">=</span> <span class="n">tf</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">randn</span><span class="p">(),</span> <span class="n">name</span> <span class="o">=</span> <span class="s2">&quot;d&quot;</span><span class="p">)</span>
@@ -304,4 +304,4 @@
<span class="n">plt</span><span class="o">.</span><span class="n">title</span><span class="p">(</span><span class="s1">&#39;Quintic Regression Result&#39;</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">legend</span><span class="p">()</span>
<span class="n">plt</span><span class="o">.</span><span class="n">show</span><span class="p">()</span>
-</div></code></pre><img src="/assets/gciTales/03-regression/6.png"/><h2>Results and Conclusion</h2><p>You just learnt Polynomial Regression using TensorFlow!</p><h2>Notes</h2><h3>Overfitting</h3><blockquote><p>&gt; Overfitting refers to a model that models the training data too well.Overfitting happens when a model learns the detail and noise in the training data to the extent that it negatively impacts the performance of the model on new data. This means that the noise or random fluctuations in the training data is picked up and learned as concepts by the model. The problem is that these concepts do not apply to new data and negatively impact the models ability to generalize.</p></blockquote><blockquote><p>Source: Machine Learning Mastery</p></blockquote><p>Basically if you train your machine learning model on a small dataset for a really large number of epochs, the model will learn all the deformities/noise in the data and will actually think that it is a normal part. Therefore when it will see some new data, it will discard that new data as noise and will impact the accuracy of the model in a negative manner</p></div><span>Tagged with: </span><ul class="tag-list"><li><a href="/tags/tutorial">Tutorial</a></li><li><a href="/tags/tensorflow">Tensorflow</a></li><li><a href="/tags/colab">Colab</a></li></ul></article></div><footer><p>Made with ❤️ using <a href="https://github.com/johnsundell/publish">Publish</a></p><p><a href="/feed.rss">RSS feed</a></p></footer></body></html> \ No newline at end of file
+</div></code></pre><img src="/assets/gciTales/03-regression/6.png"/><h2>Results and Conclusion</h2><p>You have just learnt how to perform polynomial regression using TensorFlow!</p><h2>Notes</h2><h3>Overfitting</h3><blockquote><p>Overfitting refers to a model that models the training data too well. Overfitting happens when a model learns the detail and noise in the training data to the extent that it negatively impacts the performance of the model on new data. This means that the noise or random fluctuations in the training data is picked up and learned as concepts by the model. The problem is that these concepts do not apply to new data and negatively impact the models ability to generalise.</p></blockquote><blockquote><p>Source: Machine Learning Mastery</p></blockquote><p>Basically, if you train your machine learning model on a small dataset for a really large number of epochs, the model will learn all the deformities/noise in the data and treat them as a normal part of the pattern. Then, when it sees new data, anything that does not match what it has memorised gets treated as noise, which negatively impacts the model's accuracy.</p></div><span>Tagged with: </span><ul class="tag-list"><li><a href="/tags/tutorial">Tutorial</a></li><li><a href="/tags/tensorflow">Tensorflow</a></li><li><a href="/tags/colab">Colab</a></li></ul></article></div><footer><p>Made with ❤️ using <a href="https://github.com/johnsundell/publish">Publish</a></p><p><a href="/feed.rss">RSS feed</a></p></footer></body></html> \ No newline at end of file
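As a small, self-contained illustration of this note (not taken from the post), fitting plain NumPy polynomials of increasing degree to the ten salary points shows the training error collapsing towards zero; the high-degree curve is effectively memorising the data rather than learning a trend:

import numpy as np
from numpy.polynomial import Polynomial

# the ten (level, salary) points from the dataset above
levels = np.arange(1, 11, dtype=float)
salaries = np.array([45000, 50000, 60000, 80000, 110000, 150000,
                     200000, 300000, 500000, 1000000], dtype=float)

for degree in (1, 3, 5, 9):
    fit = Polynomial.fit(levels, salaries, degree)
    mse = np.mean((fit(levels) - salaries) ** 2)
    print(f"degree {degree}: training MSE = {mse:,.0f}")

# The degree-9 curve passes (almost) exactly through every point, so its
# training error is near zero -- but that says nothing about how well it
# will predict a level it has never seen.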