From aae00025bd8bff04de90b22b2472aed8a232f476 Mon Sep 17 00:00:00 2001
From: Navan Chauhan
Date: Tue, 26 Mar 2024 18:21:29 -0600
Subject: post testing latex extra

---
 ...4-03-26-Derivation-of-the-Quadratic-Equation.md | 55 +++++++++++++
 ...-03-26-Derivation-of-the-Quadratic-Equation.png | Bin 0 -> 21809 bytes
 docs/feed.rss | 60 +++++++++++---
 ...-03-26-Derivation-of-the-Quadratic-Equation.png | Bin 0 -> 21809 bytes
 docs/index.html | 24 +++---
 ...3-21-Polynomial-Regression-in-TensorFlow-2.html | 10 +--
 ...03-26-Derivation-of-the-Quadratic-Equation.html | 90 +++++++++++++++++++++
 docs/posts/index.html | 11 +++
 docs/tags/mathematics.html | 11 +++
 9 files changed, 232 insertions(+), 29 deletions(-)
 create mode 100644 Content/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.md
 create mode 100644 Resources/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png
 create mode 100644 docs/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png
 create mode 100644 docs/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html

diff --git a/Content/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.md b/Content/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.md
new file mode 100644
index 0000000..0435a6c
--- /dev/null
+++ b/Content/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.md
@@ -0,0 +1,55 @@
+---
+date: 2024-03-26 15:36
+description: Quick derivation of the quadratic equation by completing the square
+tags: mathematics
+---
+
+# Quadratic Formula Derivation
+
+The standard form of a quadratic equation is:
+
+$$
+ax^2 + bx + c = 0
+$$
+
+Here, $a, b, c \in \mathbb{R}$, and $a \neq 0$
+
+We begin by dividing both sides by the coefficient $a$
+
+$$
+\implies x^2 + \frac{b}{a}x + \frac{c}{a} = 0
+$$
+
+We can rearrange the equation:
+
+$$
+x^2 + \frac{b}{a}x = - \frac{c}{a}
+$$
+
+We can then use the method of completing the square. ([Maths is Fun](https://www.mathsisfun.com/algebra/completing-square.html) has a really good explanation for this technique)
+
+$$
+x^2 + \frac{b}{a}x + (\frac{b}{2a})^2 = \frac{-c}{a} + (\frac{b}{2a})^2
+$$
+
+On the LHS, we recognize the expanded form of $(x + d)^2$, i.e. $x^2 + 2x\cdot d + d^2$, with $d = \frac{b}{2a}$
+
+$$
+\implies (x + \frac{b}{2a})^2 = \frac{-c}{a} + \frac{b^2}{4a^2} = \frac{-4ac + b^2}{4a^2}
+$$
+
+Taking the square root of both sides
+
+$$
+\begin{align*}
+x + \frac{b}{2a} &= \pm \frac{\sqrt{-4ac + b^2}}{2a} \\
+x &= \frac{\pm \sqrt{-4ac + b^2} - b}{2a} \\
+&= \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
+\end{align*}
+$$
+
+This gives you the world-famous quadratic formula:
+
+$$
+x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
+$$

diff --git a/Resources/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png b/Resources/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png
new file mode 100644
index 0000000..2464364
Binary files /dev/null and b/Resources/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png differ
diff --git a/docs/feed.rss b/docs/feed.rss
index 12e9f8d..12b90df 100644
--- a/docs/feed.rss
+++ b/docs/feed.rss
@@ -4,8 +4,8 @@ Navan's Archive Rare Tips, Tricks and Posts https://web.navan.dev/en
- Thu, 21 Mar 2024 14:27:28 -0000
- Thu, 21 Mar 2024 14:27:28 -0000
+ Tue, 26 Mar 2024 18:20:37 -0000
+ Tue, 26 Mar 2024 18:20:37 -0000
 250
@@ -557,9 +557,7 @@ creating a DOS
-

-$$
-y = ax^3 + bx^2 + cx + d
-$$

+y = ax^3 + bx^2 + cx + d

Optimizer Selection & Training

@@ -584,9 +582,7 @@ $$

Our loss function is Mean Squared Error (MSE):

-

-$$
-= \frac{1}{n} \sum_{i=1}^{n}{(Y_i - \hat{Y_i})^2}
-$$

+= \frac{1}{n} \sum_{i=1}^{n}{(Y_i - \hat{Y_i})^2}

Where $\hat{Y_i}$ is the predicted value and $Y_i$ is the actual value
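For concreteness, here is a minimal sketch of that calculation (my own illustration with made-up numbers, not the post's listing):

```python
import tensorflow as tf

y_true = tf.constant([1.0, 2.0, 3.0, 4.0])   # made-up actual values Y_i
y_pred = tf.constant([1.1, 1.9, 3.2, 3.8])   # made-up predictions Y_hat_i

# MSE = (1/n) * sum((Y_i - Y_hat_i)^2)
mse = tf.reduce_mean(tf.square(y_true - y_pred))
print(float(mse))  # ~0.025
```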

@@ -726,7 +722,7 @@ $$

How would you modify this code to use another type of nonlinear regression? Say,

-

-$$ y = ab^x $$

+y = ab^x

Hint: Your loss calculation would be similar to:
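The post's actual hint snippet falls outside this hunk. As a rough sketch of what such a loss could look like in TensorFlow 2 (the names `X`, `Y`, `a`, and `b` are placeholders I am assuming, not taken from the post):

```python
import tensorflow as tf

# Placeholder data for illustration; roughly follows y = 2 * 3^x
X = tf.constant([0.0, 1.0, 2.0, 3.0])
Y = tf.constant([2.0, 6.0, 18.0, 54.0])
a = tf.Variable(1.0)
b = tf.Variable(1.0)

# Loss for the exponential model y = a * b^x, using the same MSE idea as above
Y_pred = a * tf.pow(b, X)
loss = tf.reduce_mean(tf.square(Y - Y_pred))
```

The only change from the polynomial case is that the prediction comes from $y = ab^x$ instead of the cubic model.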

@@ -3186,6 +3182,52 @@ values using the X values. We then plot it to compare the actual data and predic ]]> + + + https://web.navan.dev/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html + + + Quadratic Formula Derivation + + + Quick derivation of the quadratic equation by completing the square + + https://web.navan.dev/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html + Tue, 26 Mar 2024 15:36:00 -0000 + Quadratic Formula Derivation + +

The standard form of a quadratic equation is:

+ax^2 + bx + c = 0

Here, $a, b, c \in \mathbb{R}$, and $a \neq 0$

+ +

We begin by dividing both sides by the coefficient a

+\implies x^2 + \frac{b}{a}x + \frac{c}{a} = 0

We can rearrange the equation:

+x^2 + \frac{b}{a}x = - \frac{c}{a}

We can then use the method of completing the square. (Maths is Fun has a really good explanation for this technique)
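As a concrete illustration of the technique before it is applied to the general equation below (this numeric example is mine, not from the original post):

$$
x^2 + 6x + 5 = 0 \implies x^2 + 6x + 9 = -5 + 9 \implies (x + 3)^2 = 4
$$

so that $x + 3 = \pm 2$, giving $x = -1$ or $x = -5$.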

+x^2 + \frac{b}{a}x + (\frac{b}{2a})^2 = \frac{-c}{a} + (\frac{b}{2a})^2

On the LHS, we recognize the expanded form of $(x + d)^2$, i.e. $x^2 + 2x\cdot d + d^2$, with $d = \frac{b}{2a}$

+\implies (x + \frac{b}{2a})^2 = \frac{-c}{a} + \frac{b^2}{4a^2} = \frac{-4ac + b^2}{4a^2}

Taking the square root of both sides

+x + \frac{b}{2a} = \pm \frac{\sqrt{-4ac + b^2}}{2a}, \quad x = \frac{\pm \sqrt{-4ac + b^2} - b}{2a} = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}

This gives you the world-famous quadratic formula:

+x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
+]]>
+
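As a quick sanity check of the derivation above, here is a worked example of my own (not part of the original post), applying the formula to $2x^2 + 5x - 3 = 0$ with $a = 2$, $b = 5$, $c = -3$:

$$
x = \frac{-5 \pm \sqrt{5^2 - 4(2)(-3)}}{2(2)} = \frac{-5 \pm \sqrt{49}}{4} = \frac{-5 \pm 7}{4}
$$

so $x = \frac{1}{2}$ or $x = -3$, and both values satisfy the original equation.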
+ https://web.navan.dev/posts/2022-08-05-Why-You-No-Host.html diff --git a/docs/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png b/docs/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png new file mode 100644 index 0000000..2464364 Binary files /dev/null and b/docs/images/opengraph/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.png differ diff --git a/docs/index.html b/docs/index.html index 0a3070a..c462f4b 100644 --- a/docs/index.html +++ b/docs/index.html @@ -50,6 +50,17 @@

Recent Posts

For all posts go to Posts diff --git a/docs/posts/2024-03-21-Polynomial-Regression-in-TensorFlow-2.html b/docs/posts/2024-03-21-Polynomial-Regression-in-TensorFlow-2.html index 7a25daf..ab46ec7 100644 --- a/docs/posts/2024-03-21-Polynomial-Regression-in-TensorFlow-2.html +++ b/docs/posts/2024-03-21-Polynomial-Regression-in-TensorFlow-2.html @@ -107,9 +107,7 @@ -

-$$
-y = ax^3 + bx^2 + cx + d
-$$

+y = ax^3 + bx^2 + cx + d

Optimizer Selection & Training

@@ -134,9 +132,7 @@ $$

Our loss function is Mean Squared Error (MSE):

-

-$$
-= \frac{1}{n} \sum_{i=1}^{n}{(Y_i - \hat{Y_i})^2}
-$$

+= \frac{1}{n} \sum_{i=1}^{n}{(Y_i - \hat{Y_i})^2}

Where $\hat{Y_i}$ is the predicted value and $Y_i$ is the actual value

@@ -276,7 +272,7 @@ $$

How would you modify this code to use another type of nonlinear regression? Say,

-

-$$ y = ab^x $$

+y = ab^x

Hint: Your loss calculation would be similar to:

diff --git a/docs/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html b/docs/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html new file mode 100644 index 0000000..6f02f7c --- /dev/null +++ b/docs/posts/2024-03-26-Derivation-of-the-Quadratic-Equation.html @@ -0,0 +1,90 @@ + + + + + + + + + Quadratic Formula Derivation + + + + + + + + + + + + + + + + + + + + + +
+
+ +
+ +

Quadratic Formula Derivation

+ +

The standard form of a quadratic equation is:

+ax^2 + bx + c = 0

Here, $a, b, c \in \mathbb{R}$, and $a \neq 0$

+ +

We begin by dividing both sides by the coefficient a

+\implies x^2 + \frac{b}{a}x + \frac{c}{a} = 0

We can rearrange the equation:

+x^2 + \frac{b}{a}x = - \frac{c}{a}

We can then use the method of completing the square. (Maths is Fun has a really good explanation for this technique)

+x^2 + \frac{b}{a}x + (\frac{b}{2a})^2 = \frac{-c}{a} + (\frac{b}{2a})^2

On the LHS, we recognize the expanded form of $(x + d)^2$, i.e. $x^2 + 2x\cdot d + d^2$, with $d = \frac{b}{2a}$

+\implies (x + \frac{b}{2a})^2 = \frac{-c}{a} + \frac{b^2}{4a^2} = \frac{-4ac + b^2}{4a^2}

Taking the square root of both sides

+x + \frac{b}{2a} = \pm \frac{\sqrt{-4ac + b^2}}{2a}, \quad x = \frac{\pm \sqrt{-4ac + b^2} - b}{2a} = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}

This gives you the world-famous quadratic formula:

+x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
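To cross-check the final formula numerically, here is a short Python sketch (my own illustration, not part of the post; `quadratic_roots` is a hypothetical helper name):

```python
import cmath  # complex square root, so a negative discriminant still yields roots

def quadratic_roots(a: float, b: float, c: float):
    """Roots of ax^2 + bx + c = 0 via x = (-b ± sqrt(b^2 - 4ac)) / 2a."""
    if a == 0:
        raise ValueError("a must be non-zero for a quadratic equation")
    root = cmath.sqrt(b * b - 4 * a * c)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

print(quadratic_roots(2, 5, -3))  # ((0.5+0j), (-3+0j))
```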
If you have scrolled this far, consider subscribing to my mailing list here. You can subscribe to either a specific type of post you are interested in, or subscribe to everything with the "Everything" list.
+ +
+ +
+
+ + + + + \ No newline at end of file diff --git a/docs/posts/index.html b/docs/posts/index.html index d886b19..40b6a92 100644 --- a/docs/posts/index.html +++ b/docs/posts/index.html @@ -52,6 +52,17 @@