From 103beb518fc01535d1a5edb9a8d754816e53ec2c Mon Sep 17 00:00:00 2001 From: Navan Chauhan Date: Tue, 15 Sep 2020 15:53:28 +0530 Subject: Publish deploy 2020-09-15 15:53 --- .googlecb0897d479c87d97 5.html.icloud | Bin 0 -> 178 bytes feed.rss | 328 ++------------------- index.html | 2 +- .../index.html | 10 +- .../index.html | 2 +- .../index.html | 4 +- .../index.html | 10 +- posts/2019-12-22-Fake-News-Detector/index.html | 4 +- .../index.html | 4 +- posts/2020-03-08-Making-Vaporwave-Track/index.html | 2 +- .../index.html | 4 +- .../index.html | 6 +- .../index.html | 4 +- .../index.html | 6 +- .../index.html | 282 +----------------- posts/index.html | 2 +- sitemap.xml | 2 +- tags/arjs/index.html | 2 +- tags/augmentedreality/index.html | 2 +- tags/javascript/index.html | 2 +- tags/tutorial/index.html | 2 +- 21 files changed, 66 insertions(+), 614 deletions(-) create mode 100644 .googlecb0897d479c87d97 5.html.icloud diff --git a/.googlecb0897d479c87d97 5.html.icloud b/.googlecb0897d479c87d97 5.html.icloud new file mode 100644 index 0000000..20e97ce Binary files /dev/null and b/.googlecb0897d479c87d97 5.html.icloud differ diff --git a/feed.rss b/feed.rss index c9ddea3..6637b94 100644 --- a/feed.rss +++ b/feed.rss @@ -1,4 +1,4 @@ -Navan ChauhanWelcome to my personal fragment of the internet. Majority of the posts should be complete.https://navanchauhan.github.io/enTue, 15 Sep 2020 15:40:39 +0530Tue, 15 Sep 2020 15:40:39 +0530250https://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSIntroduction to AR.js and Natural Feature TrackingAn introduction to AR.js and NFThttps://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSSat, 1 Aug 2020 15:43:00 +0530Introduction to AR.js and Natural Feature Tracking

AR.js

AR.js is a lightweight library for Augmented Reality on the Web, with features like Image Tracking, Location-based AR and Marker Tracking. It is the easiest option for cross-browser augmented reality.

The same code works for iOS, Android, Desktops and even VR Browsers!

It was initially created by Jerome Etienne and is now maintained by Nicolo Carpignoli and the AR-js Organisation

NFT

Usually for augmented reality you need specialised markers, like this Hiro marker (notice the thick non-aesthetic borders 🤢)

This is called marker based tracking, where the code knows what to look for. NFT, or Natural Feature Tracking, converts normal images into markers by extracting 'features' from them; this way you can use any image of your liking!

I'll be using my GitHub profile picture

Creating the Marker!

First we need to create the marker files required by AR.js for NFT. For this we use Carnaux's repository 'NFT-Marker-Creator'.

$ git clone https://github.com/Carnaux/NFT-Marker-Creator +Navan ChauhanWelcome to my personal fragment of the internet. Majority of the posts should be complete.https://navanchauhan.github.io/enTue, 15 Sep 2020 15:53:16 +0530Tue, 15 Sep 2020 15:53:16 +0530250https://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSIntroduction to AR.js and Natural Feature TrackingAn introduction to AR.js and NFThttps://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSSat, 1 Aug 2020 15:43:00 +0530Introduction to AR.js and Natural Feature Tracking

AR.js

AR.js is a lightweight library for Augmented Reality on the Web, with features like Image Tracking, Location-based AR and Marker Tracking. It is the easiest option for cross-browser augmented reality.

The same code works for iOS, Android, Desktops and even VR Browsers!

It was initially created by Jerome Etienne and is now maintained by Nicolo Carpignoli and the AR-js Organisation

NFT

Usually for augmented reality you need specialised markers, like this Hiro marker (notice the thick non-aesthetic borders 🤢)

This is called marker based tracking, where the code knows what to look for. NFT, or Natural Feature Tracking, converts normal images into markers by extracting 'features' from them; this way you can use any image of your liking!

I'll be using my GitHub profile picture

Creating the Marker!

First we need to create the marker files required by AR.js for NFT. For this we use Carnaux's repository 'NFT-Marker-Creator'.

$ git clone https://github.com/Carnaux/NFT-Marker-Creator Cloning into 'NFT-Marker-Creator'... remote: Enumerating objects: 79, done. @@ -68,283 +68,9 @@ Generator started at 2020-08-01 16 [info] Saving to asa.iset... [info] Done. [info] Generating FeatureList... -[info] Start for 72.000000 dpi image. -[info] ImageSize = 309560[pixel] -[info] Extracted features = 24930[pixel] -[info] Filtered features = 6192[pixel] - 544/ 545.[info] -[info] Done. -[info] Max feature = 305 -[info] 1: ( 22,474) : 0.211834 min=0.212201 max=0.583779, sd=36.253441 -[info] 2: (259,449) : 0.365469 min=0.373732 max=0.667143, sd=64.356659 -[info] 3: (244,492) : 0.368801 min=0.373514 max=0.644463, sd=52.414131 -[info] 4: (542,503) : 0.388110 min=0.393117 max=0.659145, sd=21.867199 -[info] 5: (544,451) : 0.426580 min=0.431487 max=0.697276, sd=24.540915 -[info] 6: (486,334) : 0.593511 min=0.565134 max=0.800069, sd=31.706526 -[info] 7: (217,283) : 0.602713 min=0.553285 max=0.815628, sd=11.092167 -[info] 8: ( 44,420) : 0.612274 min=0.550906 max=0.832009, sd=29.664345 -[info] 9: (522,343) : 0.615029 min=0.569004 max=0.796405, sd=34.439430 -[info] 10: ( 57,476) : 0.621610 min=0.568849 max=0.816438, sd=41.452328 -[info] 11: (407,335) : 0.626746 min=0.601339 max=0.802741, sd=22.136026 -[info] 12: (483,375) : 0.636573 min=0.552658 max=0.851101, sd=53.539089 -[info] 13: ( 54,509) : 0.637408 min=0.563383 max=0.804955, sd=34.774330 -[info] 14: ( 22,386) : 0.642944 min=0.630736 max=0.852005, sd=29.959364 -[info] 15: (459,434) : 0.649170 min=0.567012 max=0.817146, sd=44.087994 -[info] 16: (510,409) : 0.667462 min=0.572251 max=0.808130, sd=49.187576 -[info] 17: (330,270) : 0.690323 min=0.625252 max=0.836476, sd=24.105335 -[info] 18: (544,270) : 0.695668 min=0.550262 max=0.841321, sd=53.076946 -[info] 19: (443,489) : 0.696738 min=0.557579 max=0.868091, sd=27.418671 -[info] 20: (439,373) : 0.706379 min=0.658029 max=0.856492, sd=52.750744 -[info] 21: (381,264) : 0.712895 min=0.567250 max=0.829908, 
sd=21.462694 -[info] 22: (114,344) : 0.726579 min=0.574026 max=0.873275, sd=19.631178 -[info] 23: (450,339) : 0.730613 min=0.622663 max=0.840786, sd=36.808407 -[info] 24: (187,316) : 0.737529 min=0.568579 max=0.856549, sd=35.841721 -[info] 25: (155,451) : 0.741329 min=0.617655 max=0.848432, sd=50.381092 -[info] 26: (425,406) : 0.770987 min=0.674625 max=0.908930, sd=39.619099 -[info] 27: (520,308) : 0.773589 min=0.646116 max=0.856012, sd=31.303595 -[info] 28: (239,244) : 0.784615 min=0.655032 max=0.943640, sd=25.512465 -[info] 29: (415,277) : 0.784977 min=0.745286 max=0.898037, sd=24.985357 -[info] 30: (278,244) : 0.796536 min=0.713171 max=0.940000, sd=36.488716 -[info] 31: (536,235) : 0.825348 min=0.654568 max=0.901623, sd=54.036903 -[info] 32: (341,310) : 0.828034 min=0.796073 max=0.928327, sd=57.174885 -[info] 33: (355,438) : 0.833364 min=0.616488 max=0.944241, sd=57.199963 -[info] 34: (330,215) : 0.852530 min=0.778738 max=0.960263, sd=31.844889 -[info] 35: (307,163) : 0.859535 min=0.750590 max=0.963522, sd=41.524643 -[info] 36: ( 43,246) : 0.865821 min=0.715005 max=0.967188, sd=17.746605 -[info] 37: (207,171) : 0.873648 min=0.753025 max=0.956383, sd=22.992336 -[info] 38: ( 75,383) : 0.877258 min=0.809668 max=0.943998, sd=24.749569 -[info] 39: ( 77,438) : 0.893997 min=0.853110 max=0.953184, sd=37.195824 -[info] 40: (186,231) : 0.897896 min=0.893945 max=0.959936, sd=53.592140 -[info] --------------------------------------------------------------- -[info] Start for 59.184002 dpi image. -[info] ImageSize = 209216[pixel] -[info] Extracted features = 16664[pixel] -[info] Filtered features = 4219[pixel] - 447/ 448.[info] -[info] Done. 
-[info] Max feature = 205 -[info] 1: ( 24,404) : 0.263453 min=0.272001 max=0.579902, sd=30.270309 -[info] 2: (181,415) : 0.286756 min=0.296179 max=0.570393, sd=51.832920 -[info] 3: (229,375) : 0.299946 min=0.301300 max=0.620830, sd=63.595726 -[info] 4: (443,403) : 0.395126 min=0.407708 max=0.635656, sd=21.330490 -[info] 5: (224,412) : 0.444073 min=0.451129 max=0.679228, sd=50.032726 -[info] 6: (402,276) : 0.562894 min=0.550990 max=0.783724, sd=31.768101 -[info] 7: ( 22,324) : 0.597796 min=0.553165 max=0.803201, sd=25.844311 -[info] 8: (408,318) : 0.606647 min=0.558857 max=0.803414, sd=50.661160 -[info] 9: (378,394) : 0.616479 min=0.558946 max=0.824612, sd=26.079950 -[info] 10: (384,361) : 0.668396 min=0.603888 max=0.820440, sd=36.232616 -[info] 11: (337,278) : 0.671341 min=0.586483 max=0.837408, sd=23.079739 -[info] 12: ( 52,371) : 0.679165 min=0.648876 max=0.842554, sd=35.683979 -[info] 13: (440,260) : 0.683035 min=0.577838 max=0.836932, sd=36.886761 -[info] 14: (444,227) : 0.687587 min=0.567562 max=0.837861, sd=49.854889 -[info] 15: (149,266) : 0.688923 min=0.572425 max=0.832697, sd=31.967720 -[info] 16: (290,212) : 0.714381 min=0.573321 max=0.825159, sd=25.429075 -[info] 17: (374,309) : 0.720491 min=0.711943 max=0.874688, sd=58.918808 -[info] 18: (103,283) : 0.723728 min=0.559241 max=0.835176, sd=17.688787 -[info] 19: (235,200) : 0.742138 min=0.745569 max=0.912951, sd=36.019238 -[info] 20: (128,360) : 0.770635 min=0.551060 max=0.871743, sd=51.743370 -[info] 21: (297,368) : 0.794845 min=0.553557 max=0.908358, sd=48.856777 -[info] 22: (348,343) : 0.798785 min=0.662930 max=0.928792, sd=40.917496 -[info] 23: (195,204) : 0.801663 min=0.621057 max=0.936245, sd=25.684557 -[info] 24: (325,221) : 0.810848 min=0.756911 max=0.898076, sd=30.086334 -[info] 25: (276,253) : 0.825425 min=0.812713 max=0.926472, sd=58.497112 -[info] 26: ( 57,409) : 0.838737 min=0.829799 max=0.922687, sd=45.331120 -[info] 27: (174,164) : 0.846327 min=0.738286 max=0.963738, sd=31.914589 -[info] 28: 
(440,191) : 0.859450 min=0.752551 max=0.938949, sd=62.600094 -[info] 29: (271,176) : 0.865594 min=0.832825 max=0.961079, sd=36.463100 -[info] 30: (247,141) : 0.869905 min=0.719910 max=0.944092, sd=39.328327 -[info] 31: ( 62,315) : 0.889240 min=0.820342 max=0.949085, sd=28.934418 -[info] 32: ( 55,200) : 0.896143 min=0.724967 max=0.973851, sd=18.574352 -[info] --------------------------------------------------------------- -[info] Start for 46.974373 dpi image. -[info] ImageSize = 132076[pixel] -[info] Extracted features = 10582[pixel] -[info] Filtered features = 2654[pixel] - 355/ 356.[info] -[info] Done. -[info] Max feature = 125 -[info] 1: (147,328) : 0.253711 min=0.261744 max=0.546386, sd=49.037407 -[info] 2: ( 23,318) : 0.326023 min=0.332772 max=0.553814, sd=29.970749 -[info] 3: (180,307) : 0.332172 min=0.353050 max=0.582633, sd=55.894489 -[info] 4: (339,229) : 0.568601 min=0.561106 max=0.788570, sd=35.519234 -[info] 5: ( 34,277) : 0.618809 min=0.583962 max=0.811560, sd=31.459497 -[info] 6: (120,210) : 0.655516 min=0.553481 max=0.808650, sd=30.417620 -[info] 7: (299,226) : 0.660352 min=0.551423 max=0.810235, sd=40.050533 -[info] 8: (289,291) : 0.672981 min=0.584721 max=0.865715, sd=34.681435 -[info] 9: ( 85,221) : 0.735781 min=0.557358 max=0.837869, sd=15.401685 -[info] 10: (192,150) : 0.748029 min=0.762064 max=0.911963, sd=36.248280 -[info] 11: (348,194) : 0.750950 min=0.647856 max=0.880163, sd=44.824394 -[info] 12: ( 27,244) : 0.779110 min=0.741316 max=0.889470, sd=27.621294 -[info] 13: (192,110) : 0.801694 min=0.737249 max=0.930409, sd=40.227238 -[info] 14: (142,131) : 0.807086 min=0.639763 max=0.952103, sd=32.719967 -[info] 15: (218,190) : 0.819924 min=0.789691 max=0.922567, sd=52.960388 -[info] 16: (265,213) : 0.829599 min=0.755986 max=0.906421, sd=30.495857 -[info] 17: (348,159) : 0.833503 min=0.770736 max=0.922161, sd=62.732380 -[info] 18: (241,296) : 0.846388 min=0.551359 max=0.934936, sd=41.934254 -[info] 19: (263,178) : 0.888409 min=0.791312 
max=0.931626, sd=35.446648 -[info] --------------------------------------------------------------- -[info] Start for 37.283585 dpi image. -[info] ImageSize = 82908[pixel] -[info] Extracted features = 6477[pixel] -[info] Filtered features = 1665[pixel] - 281/ 282.[info] -[info] Done. -[info] Max feature = 76 -[info] 1: (126,255) : 0.291458 min=0.293245 max=0.532445, sd=44.819416 -[info] 2: (259,179) : 0.544833 min=0.552487 max=0.779788, sd=34.632847 -[info] 3: ( 22,217) : 0.577221 min=0.572050 max=0.765973, sd=29.686686 -[info] 4: (101,164) : 0.634161 min=0.551986 max=0.799318, sd=26.766178 -[info] 5: (263,212) : 0.644384 min=0.564216 max=0.795845, sd=42.637730 -[info] 6: ( 22,258) : 0.680414 min=0.692315 max=0.791998, sd=35.050774 -[info] 7: (220,182) : 0.698840 min=0.592309 max=0.862061, sd=37.508007 -[info] 8: (229,257) : 0.703305 min=0.553369 max=0.864329, sd=24.709126 -[info] 9: (151,118) : 0.726578 min=0.738549 max=0.906151, sd=38.828815 -[info] 10: (177,214) : 0.781371 min=0.563572 max=0.832300, sd=54.721115 -[info] 11: (225,215) : 0.788058 min=0.755838 max=0.899616, sd=50.707241 -[info] 12: (158,155) : 0.794013 min=0.781587 max=0.916502, sd=48.264225 -[info] 13: ( 63,176) : 0.841771 min=0.763143 max=0.907695, sd=19.686169 -[info] 14: (206,145) : 0.844404 min=0.752984 max=0.916683, sd=35.973286 -[info] 15: (270,142) : 0.845205 min=0.795885 max=0.935199, sd=59.147652 -[info] 16: (149, 75) : 0.874711 min=0.796581 max=0.958077, sd=47.111187 -[info] 17: (113,104) : 0.877415 min=0.769141 max=0.966519, sd=51.069527 -[info] --------------------------------------------------------------- -[info] Start for 29.592001 dpi image. -[info] ImageSize = 52192[pixel] -[info] Extracted features = 4027[pixel] -[info] Filtered features = 1050[pixel] - 223/ 224.[info] -[info] Done. 
-[info] Max feature = 50 -[info] 1: (102,201) : 0.289298 min=0.304870 max=0.535742, sd=46.188416 -[info] 2: (210,148) : 0.562558 min=0.576227 max=0.774472, sd=41.276623 -[info] 3: ( 78,129) : 0.612243 min=0.551623 max=0.784806, sd=23.271040 -[info] 4: (177,133) : 0.688757 min=0.576350 max=0.879510, sd=33.073624 -[info] 5: (201,187) : 0.689806 min=0.643513 max=0.833690, sd=35.062008 -[info] 6: (114, 82) : 0.729714 min=0.720887 max=0.903299, sd=42.064465 -[info] 7: (124,118) : 0.735968 min=0.746138 max=0.899143, sd=45.898678 -[info] 8: ( 23,164) : 0.779643 min=0.747459 max=0.878366, sd=34.013752 -[info] 9: (140,163) : 0.809162 min=0.637383 max=0.881929, sd=50.335152 -[info] 10: ( 23,198) : 0.819792 min=0.807587 max=0.883169, sd=39.944038 -[info] 11: (210,113) : 0.867805 min=0.835419 max=0.952952, sd=62.521526 -[info] --------------------------------------------------------------- -[info] Start for 23.487186 dpi image. -[info] ImageSize = 32930[pixel] -[info] Extracted features = 2542[pixel] -[info] Filtered features = 663[pixel] - 177/ 178.[info] -[info] Done. -[info] Max feature = 26 -[info] 1: ( 90,150) : 0.521553 min=0.522064 max=0.695849, sd=47.276417 -[info] 2: (150,127) : 0.588753 min=0.553176 max=0.839270, sd=40.334232 -[info] 3: ( 63,104) : 0.657959 min=0.625311 max=0.845079, sd=26.239153 -[info] 4: ( 86, 71) : 0.716036 min=0.701616 max=0.897777, sd=40.668495 -[info] 5: (124, 92) : 0.795828 min=0.799467 max=0.919533, sd=46.250336 -[info] 6: ( 22,140) : 0.823488 min=0.788265 max=0.904299, sd=37.023449 -[info] 7: (158, 94) : 0.846220 min=0.807598 max=0.945632, sd=57.189617 -[info] --------------------------------------------------------------- -[info] Start for 18.641792 dpi image. -[info] ImageSize = 20727[pixel] -[info] Extracted features = 1597[pixel] -[info] Filtered features = 415[pixel] - 140/ 141.[info] -[info] Done. 
-[info] Max feature = 17 -[info] 1: ( 66,105) : 0.595532 min=0.553885 max=0.773757, sd=42.355804 -[info] 2: (114, 96) : 0.636754 min=0.558845 max=0.877043, sd=41.727524 -[info] 3: ( 68, 63) : 0.740243 min=0.674251 max=0.907678, sd=48.359558 -[info] 4: ( 25, 97) : 0.770776 min=0.551998 max=0.915293, sd=33.076458 -[info] 5: (102, 62) : 0.878442 min=0.879749 max=0.958104, sd=59.505066 -[info] --------------------------------------------------------------- -[info] Start for 14.796000 dpi image. -[info] ImageSize = 13104[pixel] -[info] Extracted features = 1028[pixel] -[info] Filtered features = 266[pixel] - 111/ 112.[info] -[info] Done. -[info] Max feature = 9 -[info] 1: ( 48, 83) : 0.625048 min=0.567549 max=0.834230, sd=42.692223 -[info] 2: ( 89, 73) : 0.670262 min=0.553308 max=0.882925, sd=45.027554 -[info] 3: ( 52, 50) : 0.786427 min=0.743476 max=0.927114, sd=55.718193 -[info] --------------------------------------------------------------- -[info] Start for 11.743593 dpi image. -[info] ImageSize = 8277[pixel] -[info] Extracted features = 679[pixel] -[info] Filtered features = 166[pixel] - 88/ 89.[info] -[info] Done. -[info] Max feature = 4 -[info] 1: ( 60, 64) : 0.633060 min=0.550807 max=0.857935, sd=42.695072 -[info] 2: ( 26, 63) : 0.702131 min=0.685312 max=0.867015, sd=38.862392 -[info] 3: ( 40, 30) : 0.833010 min=0.834262 max=0.945118, sd=62.075710 -[info] --------------------------------------------------------------- -[info] Start for 9.320896 dpi image. -[info] ImageSize = 5254[pixel] -[info] Extracted features = 398[pixel] -[info] Filtered features = 106[pixel] - 70/ 71.[info] -[info] Done. -[info] Max feature = 4 -[info] 1: ( 47, 48) : 0.706897 min=0.668202 max=0.875970, sd=47.628330 -[info] --------------------------------------------------------------- -[info] Start for 7.398000 dpi image. -[info] ImageSize = 3248[pixel] -[info] Extracted features = 248[pixel] -[info] Filtered features = 65[pixel] - 55/ 56.[info] -[info] Done. 
-[info] Max feature = 1 -[info] 1: ( 34, 33) : 0.794624 min=0.780241 max=0.925466, sd=59.612782 -[info] --------------------------------------------------------------- -[info] Start for 5.871797 dpi image. -[info] ImageSize = 2024[pixel] -[info] Extracted features = 161[pixel] -[info] Filtered features = 41[pixel] - 43/ 44.[info] -[info] Done. -[info] Max feature = 1 -[info] --------------------------------------------------------------- -[info] Start for 4.660448 dpi image. -[info] ImageSize = 1295[pixel] -[info] Extracted features = 108[pixel] -[info] Filtered features = 26[pixel] - 34/ 35.[info] -[info] Done. -[info] Max feature = 1 -[info] --------------------------------------------------------------- -[info] Start for 3.699000 dpi image. -[info] ImageSize = 812[pixel] -[info] Extracted features = 65[pixel] -[info] Filtered features = 17[pixel] - 27/ 28.[info] -[info] Done. -[info] Max feature = 0 -[info] --------------------------------------------------------------- -[info] Done. -[info] Saving FeatureSet... -[info] Done. -[info] Generating FeatureSet3... 
-[info] (568, 545) 72.000000[dpi] -[info] Freak features - 405[info] ========= 405 =========== -[info] (467, 448) 59.184002[dpi] -[info] Freak features - 401[info] ========= 401 =========== -[info] (371, 356) 46.974373[dpi] -[info] Freak features - 385[info] ========= 385 =========== -[info] (294, 282) 37.283585[dpi] -[info] Freak features - 486[info] ========= 486 =========== -[info] (233, 224) 29.592001[dpi] -[info] Freak features - 362[info] ========= 362 =========== -[info] (185, 178) 23.487186[dpi] -[info] Freak features - 232[info] ========= 232 =========== -[info] (147, 141) 18.641792[dpi] -[info] Freak features - 148[info] ========= 148 =========== -[info] (117, 112) 14.796000[dpi] -[info] Freak features - 113[info] ========= 113 =========== -[info] (93, 89) 11.743593[dpi] -[info] Freak features - 81[info] ========= 81 =========== -[info] (74, 71) 9.320896[dpi] -[info] Freak features - 51[info] ========= 51 =========== -[info] (58, 56) 7.398000[dpi] -[info] Freak features - 36[info] ========= 36 =========== + +... + [info] (46, 44) 5.871797[dpi] [info] Freak features - 23[info] ========= 23 =========== [info] (37, 35) 4.660448[dpi] @@ -513,7 +239,7 @@ Serving HTTP on 0.0.0.0 port 8000 if __name__ == "__main__": install() -
]]>
https://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSCompiling AutoDock Vina on iOSCompiling AutoDock Vina on iOShttps://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSTue, 2 Jun 2020 23:23:00 +0530Compiling AutoDock Vina on iOS

Why? Because I can.

Installing makedepend

makedepend is a Unix tool used to generate dependencies of C source files. Most modern programs do not use it anymore, but then again AutoDock Vina's source code hasn't been changed since 2011. The first hurdle came when I saw that there was no makedepend command, nor was there any package for it on any development repository for iOS. So, I tracked down the original source code for makedepend (https://github.com/DerellLicht/makedepend). According to the repository, this is actually the source code for the makedepend utility that came with some XWindows distribution back around Y2K. I am pretty sure there is a problem with my current compiler configuration, because I had to manually edit the Makefile to provide the path to the iOS SDKs using the -isysroot flag.

Editing the Makefile

Original Makefile ( I used the provided mac Makefile base )

BASE=/usr/local +
]]>
https://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSCompiling AutoDock Vina on iOSCompiling AutoDock Vina on iOShttps://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSTue, 2 Jun 2020 23:23:00 +0530Compiling AutoDock Vina on iOS

Why? Because I can.

Installing makedepend

makedepend is a Unix tool used to generate dependencies of C source files. Most modern programs do not use it anymore, but then again AutoDock Vina's source code hasn't been changed since 2011. The first hurdle came when I saw that there was no makedepend command, nor was there any package for it on any development repository for iOS. So, I tracked down the original source code for makedepend (https://github.com/DerellLicht/makedepend). According to the repository, this is actually the source code for the makedepend utility that came with some XWindows distribution back around Y2K. I am pretty sure there is a problem with my current compiler configuration, because I had to manually edit the Makefile to provide the path to the iOS SDKs using the -isysroot flag.

Editing the Makefile

Original Makefile ( I used the provided mac Makefile base )

BASE=/usr/local BOOST_VERSION=1_41 BOOST_INCLUDE = $(BASE)/include C_PLATFORM=-arch i386 -arch ppc -isysroot /Developer/SDKs/MacOSX10.5.sdk -mmacosx-version-min=10.4 @@ -522,7 +248,7 @@ Serving HTTP on 0.0.0.0 port 8000 BOOST_LIB_VERSION= include ../../makefile_common -

I installed Boost 1.68.0-1 from Sam Bingner's repository. ( Otherwise I would have had to compile boost too 😫 )

Edited Makefile

BASE=/usr +

I installed Boost 1.68.0-1 from Sam Bingner's repository. ( Otherwise I would have had to compile boost too 😫 )

Edited Makefile

BASE=/usr BOOST_VERSION=1_68 BOOST_INCLUDE = $(BASE)/include C_PLATFORM=-arch arm64 -isysroot /var/sdks/Latest.sdk @@ -541,15 +267,15 @@ include ../../makefile_common std::cerr << "\n\nParse error on line " << e.line << " in file \"" << e.file.native_file_string() << "\": " << e.reason << '\n'; ~~~~~~ ^ 2 errors generated. -

Turns out native_file_string was deprecated in Boost 1.57 and replaced with just string

Error 3 - Library Not Found

This one still boggles me because there was no reason for it not to work. As a workaround, I downloaded the DEB, extracted it, and used that path for compiling.

Error 4 - No Member Named 'native_file_string' Again.

But this time it was in another file, and I quickly fixed it

Moment of Truth

Obviously it was working on my iPad, but would it work on another device? I transferred the compiled binary and

The package is available on my repository and only depends on Boost. (Both Vina and Vina-Split are part of the package.)

]]>
https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLWorkflow for Lightning Fast Molecular Docking Part OneThis is my workflow for lightning fast molecular docking.https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLMon, 1 Jun 2020 13:10:00 +0530Workflow for Lightning Fast Molecular Docking Part One

My Setup

  • macOS Catalina ( RIP 32bit app)
  • PyMOL
  • AutoDock Vina
  • Open Babel

One Command Docking

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt && vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt +

Turns out native_file_string was deprecated in Boost 1.57 and replaced with just string

Error 3 - Library Not Found

This one still boggles me because there was no reason for it not to work. As a workaround, I downloaded the DEB, extracted it, and used that path for compiling.

Error 4 - No Member Named 'native_file_string' Again.

But this time it was in another file, and I quickly fixed it

Moment of Truth

Obviously it was working on my iPad, but would it work on another device? I transferred the compiled binary and

The package is available on my repository and only depends on Boost. (Both Vina and Vina-Split are part of the package.)

]]>
https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLWorkflow for Lightning Fast Molecular Docking Part OneThis is my workflow for lightning fast molecular docking.https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLMon, 1 Jun 2020 13:10:00 +0530Workflow for Lightning Fast Molecular Docking Part One

My Setup

  • macOS Catalina ( RIP 32bit app)
  • PyMOL
  • AutoDock Vina
  • Open Babel

One Command Docking

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt && vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt

To run this command you simply copy the SMILES structure of the ligand you want and it automatically takes it from your clipboard, generates the 3D structure in the AutoDock PDBQT format using Open Babel, and then docks it with your receptor using AutoDock Vina, all with just one command.

Let me break down the commands

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt

pbpaste and pbcopy are macOS commands for pasting from and copying to the clipboard. Linux users may install the xclip and xsel packages from their respective package managers and then insert these aliases into their .bash_profile, .zshrc, etc.

alias pbcopy='xclip -selection clipboard'
alias pbpaste='xclip -selection clipboard -o'
$(pbpaste)

This is used in bash to substitute the output of a command into the command line. In this scenario, we are using it to get the contents of the clipboard.
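As a quick illustration of how the substitution behaves, here is a minimal sketch; echo stands in for pbpaste so it runs without a clipboard:

```shell
# $(...) is replaced by the command's standard output before the
# surrounding command runs. 'echo CCO' stands in for pbpaste here.
smiles="$(echo 'CCO')"          # with a real clipboard: smiles="$(pbpaste)"
echo "Ligand SMILES: $smiles"   # prints: Ligand SMILES: CCO
```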

The rest of the command is a normal Open Babel command to generate a 3D structure in PDBQT format and then save it as test.pdbqt

&& -

This tells the terminal to only run the next part if the previous command runs successfully without any errors.

vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt -

This is just the docking command for AutoDock Vina. In the next part, I will show how to use PyMOL and a plugin to directly generate the coordinates in Vina format --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 without needing to type them manually.

]]>
https://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosCompiling Open Babel on iOSCompiling Open Babel on iOShttps://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosSun, 31 May 2020 23:30:00 +0530Compiling Open Babel on iOS

Since my summer vacation started today, I had the brilliant idea of trying to run Open Babel on my iPad. To give a little background, I had tried to compile AutoDock Vina using a cross-compiler, but I had failed miserably.

I am running the Checkr1n jailbreak on my iPad and the Unc0ver jailbreak on my phone.

But Why?

Well, just because I can. This is literally the only reason I tried compiling it and also partially because in the long run I want to compile AutoDock Vina so I can do Molecular Docking on the go.

Let's Go!

How hard can it be to compile Open Babel, right? It is just simple software with clear and concise build instructions. I just need to use cmake to build and then make to install.

It was 11 in the morning. I installed clang, cmake and make from Sam Bingner's repository, fired up ssh, downloaded the source code, and ran the build command.

Fail No. 1

I couldn't even get cmake to run. I did a little digging around StackOverflow and found that I needed the iOS SDK; sure, no problem. I waited for Xcode to update and transferred the SDKs to my iPad

scp -r /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk root@192.168.1.8:/var/sdks/ -

Then I told cmake that this was the location of my SDK 😠. Successful! Now I just needed to use make.

Fail No. 2

It was giving the error that thread-local-storage was not supported on this device.

[ 0%] Building CXX object src/CMakeFiles/openbabel.dir/alias.cpp.o +

This tells the terminal to only run the next part if the previous command runs successfully without any errors.
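The short-circuit is easy to see in isolation; a minimal sketch:

```shell
# '&&' runs the right-hand command only if the left-hand one exits with status 0.
echo "conversion ok" && echo "docking runs"   # both lines print
false && echo "never printed"                 # 'false' fails, so this is skipped
echo "the script itself keeps going"
```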

vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt +

This is just the docking command for AutoDock Vina. In the next part, I will show how to use PyMOL and a plugin to directly generate the coordinates in Vina format --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 without needing to type them manually.

]]>
https://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosCompiling Open Babel on iOSCompiling Open Babel on iOShttps://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosSun, 31 May 2020 23:30:00 +0530Compiling Open Babel on iOS

Since my summer vacation started today, I had the brilliant idea of trying to run Open Babel on my iPad. To give a little background, I had tried to compile AutoDock Vina using a cross-compiler, but I had failed miserably.

I am running the Checkr1n jailbreak on my iPad and the Unc0ver jailbreak on my phone.

But Why?

Well, just because I can. This is literally the only reason I tried compiling it and also partially because in the long run I want to compile AutoDock Vina so I can do Molecular Docking on the go.

Let's Go!

How hard can it be to compile Open Babel, right? It is just simple software with clear and concise build instructions. I just need to use cmake to build and then make to install.

It was 11 in the morning. I installed clang, cmake and make from Sam Bingner's repository, fired up ssh, downloaded the source code, and ran the build command.

Fail No. 1

I couldn't even get cmake to run. I did a little digging around StackOverflow and found that I needed the iOS SDK; sure, no problem. I waited for Xcode to update and transferred the SDKs to my iPad

scp -r /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk root@192.168.1.8:/var/sdks/ +

Then I told cmake that this was the location of my SDK 😠. Successful! Now I just needed to use make.

Fail No. 2

It was giving the error that thread-local-storage was not supported on this device.

[ 0%] Building CXX object src/CMakeFiles/openbabel.dir/alias.cpp.o [ 1%] Building CXX object src/CMakeFiles/openbabel.dir/atom.cpp.o In file included from /var/root/obabel/ob-src/src/atom.cpp:28: In file included from /var/root/obabel/ob-src/include/openbabel/ring.h:29: @@ -589,7 +315,7 @@ THREAD_LOCAL OB_EXTERN OBAromaticTyper aromtyper; make[2]: *** [src/CMakeFiles/openbabel.dir/build.make:76: src/CMakeFiles/openbabel.dir/atom.cpp.o] Error 1 make[1]: *** [CMakeFiles/Makefile2:1085: src/CMakeFiles/openbabel.dir/all] Error 2 make: *** [Makefile:129: all] Error 2 -

Strange, but that is alright; there is nothing that hasn't been answered on the internet.

I did a little digging around and could not find a solution 😔

As a temporary fix, I disabled multithreading by going and commenting the lines in the source code.

Packaging as a deb

This was pretty straight forward, I tried installing it on my iPad and it was working pretty smoothly.

Moment of Truth

So I airdropped the .deb to my phone and tried installing it, the installation was succesful but when I tried obabel it just abborted.

Turns out because I had created an install target of a seprate folder while compiling, the binaries were refferencing a non-existing dylib rather than those in the /usr/lib folder. As a quick workaround I transferred the deb folder to my laptop and used otool and install_name tool: install_name_tool -change /var/root/obabel/ob-build/lib/libopenbabel.7.dylib /usr/lib/libopenbabel.7.dylib for all the executables and then signed them using jtool

I then installed it and everything went smoothly, I even ran obabel and it executed perfectly, showing the version number 3.1.0 ✌️ Ahh, smooth victory.

Nope. When I tried converting from SMILES to pdbqt, it gave an error saying plugin not found. This was weird.

So I just copied the entire build folder from my iPad to my phone and tried runnig it. Oops, Apple Sandbox Error, Oh no!

I spent 2 hours around this problem, only to see the documentation and relaise I hadn't setup the environment variable 🤦‍♂️

The Final Fix ( For Now )

export BABEL_DATADIR="/usr/share/openbabel/3.1.0" +

Strange but it is alright, there is nothing that hasn't been answered on the internet.

I did a little digging around and could not find a solution 😔

As a temporary fix, I disabled multithreading by commenting out the offending lines in the source code.

Packaging as a deb

This was pretty straightforward; I tried installing it on my iPad and it worked pretty smoothly.

Moment of Truth

So I airdropped the .deb to my phone and tried installing it. The installation was successful, but when I ran obabel it just aborted.

Turns out that because I had created an install target in a separate folder while compiling, the binaries were referencing a non-existent dylib rather than the one in the /usr/lib folder. As a quick workaround, I transferred the deb folder to my laptop and used otool and install_name_tool (install_name_tool -change /var/root/obabel/ob-build/lib/libopenbabel.7.dylib /usr/lib/libopenbabel.7.dylib) on all the executables, and then signed them using jtool.

I then installed it and everything went smoothly, I even ran obabel and it executed perfectly, showing the version number 3.1.0 ✌️ Ahh, smooth victory.

Nope. When I tried converting from SMILES to pdbqt, it gave an error saying plugin not found. This was weird.

So I just copied the entire build folder from my iPad to my phone and tried running it. Oops, Apple Sandbox Error, Oh no!

I spent 2 hours on this problem, only to look at the documentation and realise I hadn't set up the environment variable 🤦‍♂️

The Final Fix ( For Now )

export BABEL_DATADIR="/usr/share/openbabel/3.1.0"
export BABEL_LIBDIR="/usr/lib/openbabel/3.1.0"

This was the tragedy of trying to compile something without knowing enough about compiling. It is 11:30 as of writing this. Something as trivial as this should not have taken me so long. Am I going to try to compile AutoDock Vina next? 🤔 Maybe.

Also, if you want to try Open Babel on your jailbroken iDevice, install the package from my repository (you need to run the above-mentioned final fix :p). This was tested on iOS 13.5; I cannot tell whether it will work on other versions or not.

Hopefully, I will add some more screenshots to this post.

Edit 1: Added Screenshots, had to replicate the errors.

]]>
https://navanchauhan.github.io/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOSFixing X11 Error on macOS Catalina for AmberTools 18/19Fixing Could not find the X11 libraries; you may need to edit config.h, AmberTools macOS Catalinahttps://navanchauhan.github.io/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOSMon, 13 Apr 2020 11:41:00 +0530Fixing X11 Error on macOS Catalina for AmberTools 18/19

I was trying to install AmberTools on my macOS Catalina Installation. Running ./configure -macAccelerate clang gave me an error that it could not find X11 libraries, even though locate libXt showed that my installation was correct.

Error:

Could not find the X11 libraries; you may need to edit config.h
to set the XHOME and XLIBS variables.
Error: The X11 libraries are not in the usual location!
To build Amber without XLEaP, re-run configure with '-noX11':
./configure -noX11 --with-python /usr/local/bin/python3 -macAccelerate clang
Configure failed due to the errors above!

I searched on Google for a solution. Sadly, there was not even a single thread which had a solution about this error.

The Fix

Simply reinstalling XQuartz using Homebrew fixed the error: brew cask reinstall xquartz

If you do not have XQuartz installed, you need to run brew cask install xquartz

]]>
https://navanchauhan.github.io/publications/2020-03-17-Possible-Drug-Candidates-COVID-19Possible Drug Candidates for COVID-19COVID-19, has been officially labeled as a pandemic by the World Health Organisation. This paper presents cloperastine and vigabatrin as two possible drug candidates for combatting the disease along with the process by which they were discovered.https://navanchauhan.github.io/publications/2020-03-17-Possible-Drug-Candidates-COVID-19Tue, 17 Mar 2020 17:40:00 +0530Possible Drug Candidates for COVID-19

This is still a pre-print.

Download paper here

]]>
https://navanchauhan.github.io/publications/2020-03-14-generating-vaporwaveIs it possible to programmatically generate Vaporwave?This paper is about programmaticaly generating Vaporwave.https://navanchauhan.github.io/publications/2020-03-14-generating-vaporwaveSat, 14 Mar 2020 22:23:00 +0530Is it possible to programmatically generate Vaporwave?

This is still a pre-print.

Download paper here

Recommended citation:

APA

Chauhan, N. (2020, March 15). Is it possible to programmatically generate Vaporwave?. https://doi.org/10.35543/osf.io/9um2r

MLA

Chauhan, Navan. “Is It Possible to Programmatically Generate Vaporwave?.” IndiaRxiv, 15 Mar. 2020. Web.

Chicago

Chauhan, Navan. 2020. “Is It Possible to Programmatically Generate Vaporwave?.” IndiaRxiv. March 15. doi:10.35543/osf.io/9um2r.

Bibtex

@misc{chauhan_2020,
  title={Is it possible to programmatically generate Vaporwave?},
  DOI={10.35543/osf.io/9um2r},
  publisher={IndiaRxiv},
  author={Chauhan, Navan},
  year={2020},
  month={Mar}
}
]]>
https://navanchauhan.github.io/posts/2020-03-08-Making-Vaporwave-TrackMaking My First Vaporwave Track (Remix)I made my first vaporwave remixhttps://navanchauhan.github.io/posts/2020-03-08-Making-Vaporwave-TrackSun, 8 Mar 2020 23:17:00 +0530Making My First Vaporwave Track (Remix)

I finally completed my first quick and dirty vaporwave remix of "I Want It That Way" by the Backstreet Boys

V A P O R W A V E

Vaporwave is all about A E S T H E T I C S. Vaporwave is a music genre that emerged as a parody of Chillwave, shared more as a meme than as a proper musical genre. Of course, this changed as the genre matured.

How to Vaporwave

The first track considered to be actual Vaporwave is Ramona Xavier's Macintosh Plus; this set the guidelines for making Vaporwave:

  • Take a 1980s RnB song
  • Slow it down
  • Add Bass and Treble
  • Add again
  • Add Reverb ( make sure it's wet )

There you have your very own Vaporwave track.

( Now, there are some tracks being produced which are not remixes and are original )

My Remix

Where is the Programming?

The fact that there are defined steps for producing Vaporwave gave me the idea that Vaporwave can actually be made programmatically. Stay tuned for when I publish the program I am working on (generating A E S T H E T I C artwork and remixes).

]]>
https://navanchauhan.github.io/posts/2020-03-03-Playing-With-Android-TVTinkering with an Android TVTinkering with an Android TVhttps://navanchauhan.github.io/posts/2020-03-03-Playing-With-Android-TVTue, 3 Mar 2020 18:37:00 +0530Tinkering with an Android TV

So I have an Android TV; this post covers everything I have tried on it.

Contents

  1. Getting TV's IP Address
  2. Enable Developer Settings
  3. Enable ADB
  4. Connect ADB
  5. Manipulating Packages

IP-Address

These steps should be similar for all Android-TVs

  • Go To Settings
  • Go to Network
  • Advanced Settings
  • Network Status
  • Note Down IP-Address

The other option is to go to your router's server page and get connected devices

Developer-Settings

  • Go To Settings
  • About
  • Continuously click on the "Build" option until it says "You are a Developer"

Enable-ADB

  • Go to Settings
  • Go to Developer Options
  • Scroll until you find ADB Debugging and enable that option

Connect-ADB

  • Open Terminal (Make sure you have ADB installed)
  • Enter the following command adb connect <IP_ADDRESS>
  • To test the connection run adb logcat

Manipulating Apps / Packages

Listing Packages

  • adb shell
  • pm list packages

Installing Packages

  • adb install -r package.apk

Uninstalling Packages

  • adb uninstall com.company.yourpackagename
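All of the adb invocations above can also be scripted; a minimal Python sketch (the wrapper name and the injectable runner are my own, not from the post):

```python
import subprocess

def adb(*args, runner=subprocess.run):
    # Thin wrapper over the adb CLI; `runner` is injectable so the
    # command construction can be tested without a device attached.
    return runner(["adb", *args], capture_output=True, text=True)

# Example calls mirroring the steps above (IP address is a placeholder):
# adb("connect", "192.168.1.8")
# adb("shell", "pm", "list", "packages")
# adb("install", "-r", "package.apk")
# adb("uninstall", "com.company.yourpackagename")
```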
]]>
https://navanchauhan.github.io/posts/2020-01-19-Connect-To-Bluetooth-Devices-Linux-TerminalHow to setup Bluetooth on a Raspberry PiConnecting to Bluetooth Devices using terminal, tested on Raspberry Pi Zero Whttps://navanchauhan.github.io/posts/2020-01-19-Connect-To-Bluetooth-Devices-Linux-TerminalSun, 19 Jan 2020 15:27:00 +0530How to setup Bluetooth on a Raspberry Pi

This was tested on a Raspberry Pi Zero W

Enter in the Bluetooth Mode

pi@raspberrypi:~ $ bluetoothctl

[bluetooth]# agent on

[bluetooth]# default-agent

[bluetooth]# scan on

To Pair

While being in bluetooth mode

[bluetooth]# pair XX:XX:XX:XX:XX:XX

To Exit out of bluetoothctl anytime, just type exit
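The same interactive session can also be driven non-interactively by piping commands into bluetoothctl's stdin; a hedged sketch (helper names are mine, and exact behaviour may vary by BlueZ version):

```python
import subprocess

def bluetoothctl_script(*commands):
    # bluetoothctl reads newline-separated commands from stdin.
    return "\n".join(commands) + "\n"

def pair(mac, run=subprocess.run):
    # Mirrors the interactive session above: agent on -> default-agent -> pair.
    script = bluetoothctl_script("agent on", "default-agent", f"pair {mac}", "exit")
    return run(["bluetoothctl"], input=script, text=True, capture_output=True)
```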

]]>
https://navanchauhan.github.io/posts/2020-01-16-Image-Classifier-Using-TuricreateCreating a Custom Image Classifier using Turicreate to detect Smoke and FireTutorial on creating a custom Image Classifier using Turicreate and a dataset from Kagglehttps://navanchauhan.github.io/posts/2020-01-16-Image-Classifier-Using-TuricreateThu, 16 Jan 2020 10:36:00 +0530Creating a Custom Image Classifier using Turicreate to detect Smoke and Fire

For setting up Kaggle with Google Colab, please refer to my previous post

Dataset

Mounting Google Drive

import os
from google.colab import drive
drive.mount('/content/drive')

Downloading Dataset from Kaggle

os.environ['KAGGLE_CONFIG_DIR'] = "/content/drive/My Drive/"

Completed 384/395
Completed 395/395
0.9316455696202531

We just got an accuracy of 94% on Training Data and 97% on Validation Data!
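The accuracy figures reported above are just the fraction of predictions that match the ground truth; as a quick illustration of the metric itself (the labels here are made up, not from the dataset):

```python
def accuracy(predictions, labels):
    # Fraction of predictions matching the ground-truth labels.
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)

print(accuracy(["fire", "smoke", "fire", "fire"],
               ["fire", "smoke", "smoke", "fire"]))  # 0.75
```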

]]>
https://navanchauhan.github.io/posts/2020-01-15-Setting-up-Kaggle-to-use-with-ColabSetting up Kaggle to use with Google ColabTutorial on setting up kaggle, to use with Google Colabhttps://navanchauhan.github.io/posts/2020-01-15-Setting-up-Kaggle-to-use-with-ColabWed, 15 Jan 2020 23:36:00 +0530Setting up Kaggle to use with Google Colab

In order to be able to access Kaggle Datasets, you will need to have an account on Kaggle (which is Free)

Grabbing Our Tokens

Go to Kaggle

Click on your User Profile and Click on My Account

Scroll Down until you see Create New API Token

This will download your token as a JSON file

Copy the File to the root folder of your Google Drive

Setting up Colab

Mounting Google Drive

import os
from google.colab import drive
drive.mount('/content/drive')

After this, click on the URL in the output section, log in and then paste the Auth Code.

Configuring Kaggle

os.environ['KAGGLE_CONFIG_DIR'] = "/content/drive/My Drive/"

Voila! You can now download Kaggle datasets
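With KAGGLE_CONFIG_DIR pointing at the folder holding kaggle.json, datasets can be fetched via the kaggle CLI; a sketch (the dataset slug is a placeholder, and the runner is injectable only so the command can be tested offline):

```python
import os
import subprocess

# Point the kaggle CLI at the folder that holds kaggle.json (path from the post).
os.environ['KAGGLE_CONFIG_DIR'] = "/content/drive/My Drive/"

def download_dataset(slug, runner=subprocess.run):
    # Equivalent to: kaggle datasets download -d <owner/dataset>
    return runner(["kaggle", "datasets", "download", "-d", slug])

# download_dataset("owner/some-dataset")  # placeholder slug
```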

]]>
https://navanchauhan.github.io/posts/2020-01-14-Converting-between-PIL-NumPyConverting between image and NumPy arrayShort code snippet for converting between PIL image and NumPy arrays.https://navanchauhan.github.io/posts/2020-01-14-Converting-between-PIL-NumPyTue, 14 Jan 2020 00:10:00 +0530Converting between image and NumPy array
import numpy
import PIL

# Convert PIL Image to NumPy array

try:
    img.save(destination, "JPEG", quality=80, optimize=True, progressive=True)
except IOError:
    PIL.ImageFile.MAXBLOCK = img.size[0] * img.size[1]
    img.save(destination, "JPEG", quality=80, optimize=True, progressive=True)
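A quick self-contained check of the array/image round trip (the 4x4 red image is just a stand-in for a real file opened with PIL.Image.open):

```python
import numpy
import PIL.Image

img = PIL.Image.new("RGB", (4, 4), (255, 0, 0))   # stand-in for PIL.Image.open(...)
arr = numpy.array(img)                            # -> ndarray of shape (4, 4, 3)
back = PIL.Image.fromarray(arr)                   # -> back to a PIL Image

assert arr.shape == (4, 4, 3)
assert numpy.array_equal(numpy.array(back), arr)  # round trip is lossless
```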
]]>
https://navanchauhan.github.io/posts/2019-12-22-Fake-News-DetectorBuilding a Fake News Detector with TuricreateIn this tutorial we will build a fake news detecting app from scratch, using Turicreate for the machine learning model and SwiftUI for building the apphttps://navanchauhan.github.io/posts/2019-12-22-Fake-News-DetectorSun, 22 Dec 2019 11:10:00 +0530Building a Fake News Detector with Turicreate

In this tutorial we will build a fake news detecting app from scratch, using Turicreate for the machine learning model and SwiftUI for building the app

Note: These commands are written as if you are running a jupyter notebook.

Building the Machine Learning Model

Data Gathering

To build a classifier, you need a lot of data. George McIntire (GH: @joolsa) has created a wonderful dataset containing the headline, body and whether it is fake or real. Whenever you are looking for a dataset, always try searching on Kaggle and GitHub before you start building your own

Dependencies

I used a Google Colab instance for training my model. If you also plan on using Google Colab, I recommend choosing a GPU instance (it is free); this allows you to train the model on the GPU. Turicreate is built on top of Apache's MXNet framework, so for us to use the GPU we need to install a CUDA-compatible MXNet package.

!pip install turicreate
!pip uninstall -y mxnet
!pip install mxnet-cu100==1.4.0.post0

If you do not wish to train on GPU or are running it on your computer, you can ignore the last two lines

Downloading the Dataset

!wget -q "https://github.com/joolsa/fake_real_news_dataset/raw/master/fake_or_real_news.csv.zip"

Exporting the Model

model_name = 'FakeNews'
coreml_model_name = model_name + '.mlmodel'
exportedModel = model.export_coreml(coreml_model_name)

Note: To download files from Google Colab, simply click on the files section in the sidebar, right click on filename and then click on download

Link to Colab Notebook

Building the App using SwiftUI

Initial Setup

First we create a single view app (make sure you check the use SwiftUI button)

Then we copy our .mlmodel file to our project (Just drag and drop the file in the Xcode Files Sidebar)

Our ML model does not take a string directly as an input; rather, it takes a bag of words. The bag-of-words model is a simplifying representation used in NLP, in which text is represented as the bag (multiset) of its words, disregarding grammar and word order but keeping multiplicity.
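As a quick illustration of the idea before the Swift version, the same bag-of-words transform in a few lines of Python (stdlib only; the regex tokenisation here is a crude stand-in for NSLinguisticTagger):

```python
import re
from collections import Counter

def bow(text):
    # Lowercase, pull out word tokens, and count multiplicity.
    words = re.findall(r"[a-z']+", text.lower())
    return dict(Counter(words))

print(bow("the cat sat on the mat"))  # {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}
```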

We define our bag of words function

func bow(text: String) -> [String: Double] {
    var bagOfWords = [String: Double]()
    let tagger = NSLinguisticTagger(tagSchemes: [.tokenType], options: 0)

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

Dataset

Creating Random Data

Even though in this tutorial we will use a Position Vs Salary dataset, it is important to know how to create synthetic data

To create 50 values spaced evenly between 0 and 50, we use NumPy's linspace function

linspace(lower_limit, upper_limit, no_of_observations)

x = np.linspace(0, 50, 50)
y = np.linspace(0, 50, 50)

We use the following function to add noise to the data, so that our values do not lie on a perfect line.

x += np.random.uniform(-4, 4, 50)
y += np.random.uniform(-4, 4, 50)

| Senior Partner | 8  | 300000  |
| C-level        | 9  | 500000  |
| CEO            | 10 | 1000000 |

We convert the salary column as the ordinate (y-coordinate) and level column as the abscissa

abscissa = df["Level"].to_list() # abscissa = [1,2,3,4,5,6,7,8,9,10]
ordinate = df["Salary"].to_list() # ordinate = [45000,50000,60000,80000,110000,150000,200000,300000,500000,1000000]
n = len(abscissa) # no of observations
plt.scatter(abscissa, ordinate)
plt.ylabel('Salary')
plt.xlabel('Position')
plt.title('Salary vs Position')
plt.show()
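Before hand-rolling gradient descent in TensorFlow, it is worth noting that NumPy can fit the same kind of polynomial in closed form, which makes a handy cross-check (degree 4 is my choice here, not the post's):

```python
import numpy as np

abscissa = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
ordinate = [45000, 50000, 60000, 80000, 110000, 150000,
            200000, 300000, 500000, 1000000]

coeffs = np.polyfit(abscissa, ordinate, deg=4)  # least-squares quartic fit
fitted = np.polyval(coeffs, abscissa)           # fitted salary per level
```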

Defining Stuff

X = tf.placeholder("float")
Y = tf.placeholder("float")

Defining Variables

We first define all the coefficients and the constant as TensorFlow variables having a random initial value

a = tf.Variable(np.random.randn(), name = "a")
b = tf.Variable(np.random.randn(), name = "b")
c = tf.Variable(np.random.randn(), name = "c")
d = tf.Variable(np.random.randn(), name = "d")

plt.title('Quintic Regression Result')
plt.legend()
plt.show()

Results and Conclusion

You just learnt Polynomial Regression using TensorFlow!

Notes

Overfitting

> Overfitting refers to a model that models the training data too well. Overfitting happens when a model learns the detail and noise in the training data to the extent that it negatively impacts the performance of the model on new data. This means that the noise or random fluctuations in the training data is picked up and learned as concepts by the model. The problem is that these concepts do not apply to new data and negatively impact the model's ability to generalise.

Source: Machine Learning Mastery

Basically, if you train your machine learning model on a small dataset for a really large number of epochs, the model will learn all the deformities/noise in the data and will actually think that it is a normal part of the data. Therefore, when it sees new data, it will discard that new data as noise, which will impact the accuracy of the model in a negative manner.

]]>
https://navanchauhan.github.io/posts/2019-12-10-TensorFlow-Model-PredictionMaking Predictions using Image Classifier (TensorFlow)Making predictions for image classification models built using TensorFlowhttps://navanchauhan.github.io/posts/2019-12-10-TensorFlow-Model-PredictionTue, 10 Dec 2019 11:10:00 +0530Making Predictions using Image Classifier (TensorFlow)

This was tested on TF 2.x and works as of 2019-12-10

If you want to understand how to make your own custom image classifier, please refer to my previous post.

If you followed my last post, then you created a model which took an image of dimensions 50x50 as an input.

First we import the following if we have not imported these before

import cv2
import os

Then we read the file using OpenCV.

image=cv2.imread(imagePath)

The cv2.imread() function returns a NumPy array representing the image. Therefore, we need to convert it before we can use it.

image_from_array = Image.fromarray(image, 'RGB')

model.add(layers.Dropout(0.2))
model.add(layers.Dense(2, activation="softmax")) # 2 represents output layer neurons
model.summary()
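Putting the conversion together: the model from the previous post expects 50x50 input plus a leading batch dimension, so the preprocessing can be sketched like this (the function name and the 0-1 normalisation are my own assumptions, not from the post):

```python
import numpy as np
from PIL import Image

def preprocess(img, size=(50, 50)):
    # Resize to the model's input size and add a leading batch dimension.
    img = img.resize(size)
    arr = np.array(img, dtype=np.float32) / 255.0
    return np.expand_dims(arr, axis=0)  # shape (1, 50, 50, 3)

batch = preprocess(Image.new("RGB", (128, 96)))
# `batch` can now be handed to something like model.predict(batch)
```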

Compiling Model

We use the Adam optimiser as it is an adaptive learning rate optimisation algorithm that's been designed specifically for training deep neural networks, which means it changes its learning rate automatically to get the best results

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Training Model

We train the model for 10 epochs on the training data and then validate it using the testing data

history = model.fit(X_train, y_train, epochs=10, validation_data=(X_test, y_test))

We have achieved 98% Accuracy!

Link to Colab Notebook

]]>
https://navanchauhan.github.io/posts/2019-12-08-Splitting-ZipsSplitting ZIPs into Multiple PartsShort code snippet for splitting zips.https://navanchauhan.github.io/posts/2019-12-08-Splitting-ZipsSun, 8 Dec 2019 13:27:00 +0530Splitting ZIPs into Multiple Parts

Tested on macOS

Creating the archive:

zip -r -s 5 oodlesofnoodles.zip website/

5 is the size of each split file in MB (kb and gb can also be specified)

For encrypting the zip:

zip -er -s 5 oodlesofnoodles.zip website

Extracting Files

First we need to collect all parts, then

zip -F oodlesofnoodles.zip --out merged.zip
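If zip's split archives are not an option, the same split-and-merge trick works on any file with a few lines of Python (stdlib only; helper names are mine, and the chunk size here is in bytes rather than megabytes):

```python
def split_file(path, chunk_size):
    # Write path.part0, path.part1, ... each at most chunk_size bytes.
    parts = []
    with open(path, "rb") as f:
        index = 0
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            part = f"{path}.part{index}"
            with open(part, "wb") as out:
                out.write(chunk)
            parts.append(part)
            index += 1
    return parts

def join_files(parts, destination):
    # Concatenate the parts back, in order, into one file.
    with open(destination, "wb") as out:
        for part in parts:
            with open(part, "rb") as f:
                out.write(f.read())
```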
]]>
https://navanchauhan.github.io/posts/2019-12-04-Google-Teachable-MachinesImage Classifier With Teachable MachinesTutorial on creating a custom image classifier quickly with Google Teachanle Machineshttps://navanchauhan.github.io/posts/2019-12-04-Google-Teachable-MachinesWed, 4 Dec 2019 18:23:00 +0530Image Classifier With Teachable Machines

Made for Google Code-In

Task Description

Using Glitch and the Teachable Machines, build a Book Detector with Tensorflow.js. When a book is recognized, the code would randomly suggest a book/tell a famous quote from a book. Here is an example Project to get you started: https://glitch.com/~voltaic-acorn

Details

  1. Collecting Data

Teachable Machine allows you to create your dataset just by using your webcam! I created a dataset consisting of three classes (three books) and grabbed approximately 100 pictures for each book/class

  2. Training

Training on Teachable Machines is as simple as clicking the train button. I did not even have to modify any configurations.

  3. Finding Labels

Because I originally entered the entire name of the book and its author's name as the label, the class name got truncated (Note to self, use shorter class names :p ). I then modified the code to print the modified label names in an alert box.

  4. Adding a suggestions function

I first added a text field on the main page and then modified the JavaScript file to suggest a similar book whenever the model predicted with an accuracy >= 98%

  5. Running!

Here it is running!

Remix this project:

https://luminous-opinion.glitch.me
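The suggestion function described above boils down to thresholding the model's top prediction. The real project implements this in JavaScript in the browser; the same logic sketched in Python, with made-up class names and suggestions:

```python
def suggest(predictions, suggestions, threshold=0.98):
    """Return a suggestion for the top class if the model is confident enough."""
    label, confidence = max(predictions.items(), key=lambda kv: kv[1])
    if confidence >= threshold:
        return suggestions.get(label)
    return None  # not confident enough: stay quiet

preds = {"Book A": 0.99, "Book B": 0.007, "Book C": 0.003}
similar = {"Book A": "Try 'Book D' next!", "Book B": "Try 'Book E' next!"}
print(suggest(preds, similar))
```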

]]>
https://navanchauhan.github.io/publications/2019-05-14-Detecting-Driver-Fatigue-Over-Speeding-and-Speeding-up-Post-Accident-ResponseDetecting Driver Fatigue, Over-Speeding, and Speeding up Post-Accident ResponseThis paper is about Detecting Driver Fatigue, Over-Speeding, and Speeding up Post-Accident Response.https://navanchauhan.github.io/publications/2019-05-14-Detecting-Driver-Fatigue-Over-Speeding-and-Speeding-up-Post-Accident-ResponseTue, 14 May 2019 02:42:00 +0530Detecting Driver Fatigue, Over-Speeding, and Speeding up Post-Accident Response

Based on the project showcased at Toyota Hackathon, IITD - 17/18th December 2018

Edit: It seems like I haven't mentioned Adrian Rosebrock of PyImageSearch anywhere. I apologize for this mistake.

Download paper here

Recommended citation:

ATP

Chauhan, N. (2019). &quot;Detecting Driver Fatigue, Over-Speeding, and Speeding up Post-Accident Response.&quot; <i>International Research Journal of Engineering and Technology (IRJET), 6(5)</i>.

BibTeX

@article{chauhan_2019, title={Detecting Driver Fatigue, Over-Speeding, and Speeding up Post-Accident Response}, volume={6}, url={https://www.irjet.net/archives/V6/i5/IRJET-V6I5318.pdf}, number={5}, journal={International Research Journal of Engineering and Technology (IRJET)}, author={Chauhan, Navan}, year={2019}}
]]>
https://navanchauhan.github.io/posts/2019-05-05-Custom-Snowboard-Anemone-ThemeCreating your own custom theme for Snowboard or AnemoneTutorial on creating your own custom theme for Snowboard or Anemonehttps://navanchauhan.github.io/posts/2019-05-05-Custom-Snowboard-Anemone-ThemeSun, 5 May 2019 12:34:00 +0530Creating your own custom theme for Snowboard or Anemone

Contents

  • Getting Started
  • Theme Configuration
  • Creating Icons
  • Exporting Icons
  • Icon Masks
  • Packaging
  • Building the DEB

Getting Started

Note: Without the proper folder structure, your theme may not show up!

  • Create a new folder called themeName.theme (Replace themeName with your desired theme name)
  • Within themeName.theme folder, create another folder called IconBundles (You cannot change this name)

Theme Configuration

  • Now, inside the themeName.theme folder, create a file called Info.plist and paste the following
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    …
    <string>Icons</string>
</dict>
</plist>

  • Replace PackageName with the name of the Package and replace ThemeName with the Theme Name

Now, you might ask what is the difference between PackageName and ThemeName?

Well, if for example you want to publish two variants of your icons, one dark and one white, but you do not want the user to install them separately, you would name the package MyTheme and include two themes, Blackie and White, thus creating two entries. More about this in the end

Creating Icons

  • Open up the Image Editor of your choice and create a new file having a resolution of 512x512

Note: Due to IconBundles, we just need to create the icons in one size and they get resized automatically :ghost:

Want to create rounded icons? Create them squared only, we will learn how to apply masks!

Exporting Icons

Note: All icons must be saved as *.png (Tip: This means you can even create partially transparent icons!)

  • All Icons must be saved in themeName.theme>IconBundles as bundleID-large.png
Finding BundleIDs

Stock Application BundleIDs

NameBundleID
App Storecom.apple.AppStore
Apple Watchcom.apple.Bridge
Calculatorcom.apple.calculator
Calendarcom.apple.mobilecal
Cameracom.apple.camera
Classroomcom.apple.classroom
Clockcom.apple.mobiletimer
Compasscom.apple.compass
FaceTimecom.apple.facetime
Filescom.apple.DocumentsApp
Game Centercom.apple.gamecenter
Healthcom.apple.Health
Homecom.apple.Home
iBookscom.apple.iBooks
iTunes Storecom.apple.MobileStore
Mailcom.apple.mobilemail
Mapscom.apple.Maps
Measurecom.apple.measure
Messagescom.apple.MobileSMS
Musiccom.apple.Music
Newscom.apple.news
Notescom.apple.mobilenotes
Phonecom.apple.mobilephone
Photo Boothcom.apple.Photo-Booth
Photoscom.apple.mobileslideshow
Playgroundscome.apple.Playgrounds
Podcastscom.apple.podcasts
Reminderscom.apple.reminders
Safaricom.apple.mobilesafari
Settingscom.apple.Preferences
Stockscom.apple.stocks
Tipscom.apple.tips
TVcom.apple.tv
Videoscom.apple.videos
Voice Memoscom.apple.VoiceMemos
Walletcom.apple.Passbook
Weathercom.apple.weather

3rd Party Applications BundleID Click here

Icon Masks

  • Getting the Classic Rounded Rectangle Masks

In your Info.plist file add the following value between <dict> and </dict>:

```
IB-MaskIcons
…
```

Packaging

  • Create a new folder outside themeName.theme with the name you want to be shown on Cydia, e.g. themeNameForCydia
  • Create another folder called DEBIAN in themeNameForCydia (It needs to be uppercase)
  • In DEBIAN create an extensionless file called control and edit it using your favourite text editor

Paste the following in it, replacing yourname, themename, Theme Name, A theme with beautiful icons! and Your Name with your details:

```
…
Section: Themes
…
```

  • Important Notes:
    • The package field MUST be lower case!
    • The version field MUST be changed every time you update your theme!
    • The control file MUST have an extra blank line at the bottom!

  • Now, create another folder called Library in themeNameForCydia

Building the DEB

1) Install Homebrew /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" (Run this in the terminal)

2) Install dpkg, by running brew install dpkg

There is a terrible thing called .DS_Store which, if not removed, will cause a problem during either build or installation

  • To remove this we first need to open the folder in the terminal
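The .DS_Store cleanup can also be scripted instead of done by hand. A hedged Python sketch (the demo runs on a throwaway directory tree standing in for themeNameForCydia):

```python
import os
import tempfile

def remove_ds_store(root):
    """Walk the tree and delete every .DS_Store file, returning the paths removed."""
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name == ".DS_Store":
                path = os.path.join(dirpath, name)
                os.remove(path)
                removed.append(path)
    return removed

# Demo on a throwaway tree (stands in for themeNameForCydia)
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "DEBIAN"))
for rel in (".DS_Store", os.path.join("DEBIAN", ".DS_Store"), os.path.join("DEBIAN", "control")):
    open(os.path.join(root, rel), "w").close()

removed = remove_ds_store(root)
print(len(removed))  # 2
```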


👋 Hi!

Welcome to my personal fragment of the internet. Majority of the posts should be complete.

Latest content

5 minute read · Created on May 5, 2019 · Last modified on September 15, 2020

Creating your own custom theme for Snowboard or Anemone

Contents

  • Getting Started
  • Theme Configuration
  • Creating Icons
  • Exporting Icons
  • Icon Masks
  • Packaging
  • Building the DEB

Getting Started

Note: Without the proper folder structure, your theme may not show up!

  • Create a new folder called themeName.theme (Replace themeName with your desired theme name)
  • Within themeName.theme folder, create another folder called IconBundles (You cannot change this name)

Theme Configuration

  • Now, inside the themeName.theme folder, create a file called Info.plist and paste the following
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    …
    <string>Icons</string>
</dict>
</plist>

  • Replace PackageName with the name of the Package and replace ThemeName with the Theme Name

Now, you might ask what is the difference between PackageName and ThemeName?

Well, if for example you want to publish two variants of your icons, one dark and one white, but you do not want the user to install them separately, you would name the package MyTheme and include two themes, Blackie and White, thus creating two entries. More about this in the end

Creating Icons

  • Open up the Image Editor of your choice and create a new file having a resolution of 512x512

Note: Due to IconBundles, we just need to create the icons in one size and they get resized automatically :ghost:

Want to create rounded icons? Create them squared only, we will learn how to apply masks!

Exporting Icons

Note: All icons must be saved as *.png (Tip: This means you can even create partially transparent icons!)

  • All Icons must be saved in themeName.theme>IconBundles as bundleID-large.png
Finding BundleIDs

Stock Application BundleIDs

NameBundleID
App Storecom.apple.AppStore
Apple Watchcom.apple.Bridge
Calculatorcom.apple.calculator
Calendarcom.apple.mobilecal
Cameracom.apple.camera
Classroomcom.apple.classroom
Clockcom.apple.mobiletimer
Compasscom.apple.compass
FaceTimecom.apple.facetime
Filescom.apple.DocumentsApp
Game Centercom.apple.gamecenter
Healthcom.apple.Health
Homecom.apple.Home
iBookscom.apple.iBooks
iTunes Storecom.apple.MobileStore
Mailcom.apple.mobilemail
Mapscom.apple.Maps
Measurecom.apple.measure
Messagescom.apple.MobileSMS
Musiccom.apple.Music
Newscom.apple.news
Notescom.apple.mobilenotes
Phonecom.apple.mobilephone
Photo Boothcom.apple.Photo-Booth
Photoscom.apple.mobileslideshow
Playgroundscome.apple.Playgrounds
Podcastscom.apple.podcasts
Reminderscom.apple.reminders
Safaricom.apple.mobilesafari
Settingscom.apple.Preferences
Stockscom.apple.stocks
Tipscom.apple.tips
TVcom.apple.tv
Videoscom.apple.videos
Voice Memoscom.apple.VoiceMemos
Walletcom.apple.Passbook
Weathercom.apple.weather

3rd Party Applications BundleID Click here

Icon Masks

  • Getting the Classic Rounded Rectangle Masks

In your Info.plist file add the following value between <dict> and </dict>:

```
IB-MaskIcons
…
```

Packaging

  • Create a new folder outside themeName.theme with the name you want to be shown on Cydia, e.g. themeNameForCydia
  • Create another folder called DEBIAN in themeNameForCydia (It needs to be uppercase)
  • In DEBIAN create an extensionless file called control and edit it using your favourite text editor

Paste the following in it, replacing yourname, themename, Theme Name, A theme with beautiful icons! and Your Name with your details:

```
…
Section: Themes
…
```

  • Important Notes:
    • The package field MUST be lower case!
    • The version field MUST be changed every time you update your theme!
    • The control file MUST have an extra blank line at the bottom!

  • Now, create another folder called Library in themeNameForCydia

Building the DEB

1) Install Homebrew /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)" (Run this in the terminal)

2) Install dpkg, by running brew install dpkg

There is a terrible thing called .DS_Store which, if not removed, will cause a problem during either build or installation

  • To remove this we first need to open the folder in the terminal

2 minute read · Created on December 4, 2019 · Last modified on September 15, 2020

Image Classifier With Teachable Machines

Made for Google Code-In

Task Description

Using Glitch and the Teachable Machines, build a Book Detector with Tensorflow.js. When a book is recognized, the code would randomly suggest a book/tell a famous quote from a book. Here is an example Project to get you started: https://glitch.com/~voltaic-acorn

Details

  1. Collecting Data

Teachable Machine allows you to create your dataset just by using your webcam! I created a dataset consisting of three classes (three books) and grabbed approximately 100 pictures for each book/class

  2. Training

Training on Teachable Machines is as simple as clicking the train button. I did not even have to modify any configurations.

  3. Finding Labels

Because I originally entered the entire name of the book and its author's name as the label, the class name got truncated (Note to self, use shorter class names :p ). I then modified the code to print the modified label names in an alert box.

  4. Adding a suggestions function

I first added a text field on the main page and then modified the JavaScript file to suggest a similar book whenever the model predicted with an accuracy >= 98%

  5. Running!

Here it is running!

Remix this project:

https://luminous-opinion.glitch.me

Tagged with:
4 minute read · Created on December 8, 2019 · Last modified on September 15, 2020

Creating a Custom Image Classifier using Tensorflow 2.x and Keras for Detecting Malaria

Done during Google Code-In. Org: Tensorflow.

Imports

%tensorflow_version 2.x # This tells Colab that you want to use TF 2.0; ignore if running on a local machine
from PIL import Image # We use the PIL library to resize images
import numpy as np
…
model.add(layers.Dropout(0.2))
model.add(layers.Dense(2, activation="softmax")) # 2 is the number of output-layer neurons
model.summary()


Compiling Model

We use the Adam optimiser as it is an adaptive learning rate optimisation algorithm designed specifically for training deep neural networks: it adjusts its learning rate automatically to get the best results

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

Training Model

We train the model for 10 epochs on the training data and then validate it using the testing data

history = model.fit(X_train, y_train, epochs=10, validation_data=(X_test, y_test))
17 minute read · Created on December 16, 2019 · Last modified on September 15, 2020

Polynomial Regression Using TensorFlow

In this tutorial you will learn about polynomial regression and how you can implement it in TensorFlow.

In this, we will be performing polynomial regression using 5 types of equations:

  • Linear
  • Quadratic
  • Cubic
  • Quartic
  • Quintic

Regression

What is Regression?

Regression is a statistical measurement that is used to try to determine the relationship between a dependent variable (often denoted by Y) and a series of varying variables (called independent variables, often denoted by X).

What is Polynomial Regression

This is a form of regression analysis where the relationship between Y and X is modelled as the nth degree/power of X. Polynomial regression can even fit a non-linear relationship (e.g. when the points don't form a straight line).
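As a quick illustration of the idea, here is a least-squares polynomial fit using NumPy's polyfit on synthetic data (this is not the TensorFlow approach developed below, just a compact way to see what fitting an nth-degree relationship means):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 50, 50)
y = 3 * x**2 + 2 * x + 1 + rng.uniform(-4, 4, 50)  # quadratic signal plus noise

coeffs = np.polyfit(x, y, deg=2)   # least-squares fit; highest power first
y_hat = np.polyval(coeffs, x)      # fitted curve
print(np.round(coeffs, 2))         # should land close to [3, 2, 1]
```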

Imports

import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

Dataset

Creating Random Data

Even though in this tutorial we will use a Position Vs Salary datasset, it is important to know how to create synthetic data

To create 50 values spaced evenly between 0 and 50, we use NumPy's linspace funtion

linspace(lower_limit, upper_limit, no_of_observations)

x = np.linspace(0, 50, 50) +

Dataset

Creating Random Data

Even though in this tutorial we will use a Position Vs Salary dataset, it is important to know how to create synthetic data

To create 50 values spaced evenly between 0 and 50, we use NumPy's linspace function

linspace(lower_limit, upper_limit, no_of_observations)

x = np.linspace(0, 50, 50)
y = np.linspace(0, 50, 50)
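With a smaller number of observations, the even spacing (with both endpoints included) is easy to see:

```python
import numpy as np

# 5 evenly spaced values from 0 to 50, endpoints included
print(np.linspace(0, 50, 5))  # 0, 12.5, 25, 37.5, 50
```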

We use the following function to add noise to the data, so that our values are scattered around the line rather than lying exactly on it

x += np.random.uniform(-4, 4, 50)
y += np.random.uniform(-4, 4, 50)

…

| Senior Partner | 8 | 300000 |
| C-level | 9 | 500000 |
| CEO | 10 | 1000000 |

We convert the salary column as the ordinate (y-cordinate) and level column as the abscissa

abscissa = df["Level"].to_list() # abscissa = [1,2,3,4,5,6,7,8,9,10] +

We take the salary column as the ordinate (y-coordinate) and the level column as the abscissa

abscissa = df["Level"].to_list() # abscissa = [1,2,3,4,5,6,7,8,9,10] ordinate = df["Salary"].to_list() # ordinate = [45000,50000,60000,80000,110000,150000,200000,300000,500000,1000000]
n = len(abscissa) # number of observations
plt.scatter(abscissa, ordinate)
…
plt.show()

Defining Stuff

X = tf.placeholder("float")
Y = tf.placeholder("float")


Defining Variables

We first define all the coefficients and constant as tensorflow variables having a random initial value

a = tf.Variable(np.random.randn(), name = "a")
b = tf.Variable(np.random.randn(), name = "b")
c = tf.Variable(np.random.randn(), name = "c")
d = tf.Variable(np.random.randn(), name = "d")
…
plt.title('Quintic Regression Result')
plt.legend()
plt.show()


Results and Conclusion

You just learnt Polynomial Regression using TensorFlow!

Notes

Overfitting

> Overfitting refers to a model that models the training data too well. Overfitting happens when a model learns the detail and noise in the training data to the extent that it negatively impacts the performance of the model on new data. This means that the noise or random fluctuations in the training data is picked up and learned as concepts by the model. The problem is that these concepts do not apply to new data and negatively impact the model's ability to generalise.

Source: Machine Learning Mastery

Basically, if you train your machine learning model on a small dataset for a really large number of epochs, the model will learn all the deformities/noise in the data and will treat them as a normal part of it. Then, when it sees new data, it will discard parts of that data as noise, which impacts the accuracy of the model in a negative manner
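A tiny numerical illustration of this note, using NumPy polynomial fits on made-up data rather than TensorFlow: a model with enough degrees of freedom to memorise the noise drives its training error to essentially zero, yet typically does worse on points between the training samples than the "right" low-degree model:

```python
import numpy as np

rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 12)
x_mid = (x_train[:-1] + x_train[1:]) / 2           # unseen points between samples
y_train = 2 * x_train + 1 + rng.normal(0, 0.1, 12) # noisy linear data
y_true_mid = 2 * x_mid + 1                         # true values at the unseen points

def errors(deg):
    c = np.polyfit(x_train, y_train, deg)
    train = np.mean((np.polyval(c, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(c, x_mid) - y_true_mid) ** 2)
    return train, test

train_lo, test_lo = errors(1)    # the "right" degree for linear data
train_hi, test_hi = errors(11)   # enough freedom to memorise the noise
```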

Tagged with:
\ No newline at end of file diff --git a/posts/2019-12-22-Fake-News-Detector/index.html b/posts/2019-12-22-Fake-News-Detector/index.html index 4f1197c..3ffb14a 100644 --- a/posts/2019-12-22-Fake-News-Detector/index.html +++ b/posts/2019-12-22-Fake-News-Detector/index.html @@ -1,4 +1,4 @@ -Building a Fake News Detector with Turicreate | Navan Chauhan
7 minute readCreated on December 22, 2019Last modified on June 1, 2020

Building a Fake News Detector with Turicreate

In this tutorial we will build a fake news detecting app from scratch, using Turicreate for the machine learning model and SwiftUI for building the app

Note: These commands are written as if you are running a jupyter notebook.

Building the Machine Learning Model

Data Gathering

To build a classifier, you need a lot of data. George McIntire (GH: @joolsa) has created a wonderful dataset containing the headline, body and wheter it is fake or real. Whenever you are looking for a dataset, always try searching on Kaggle and GitHub before you start building your own

Dependencies

I used a Google Colab instance for training my model. If you also plan on using Google Colab then I reccomend choosing a GPU Instance (It is Free) This allows you to train the model on the GPU. Turicreat is built on top of Apache's MXNet Framework, for us to use GPU we need to install a CUDA compatible MXNet package.

!pip install turicreate +Building a Fake News Detector with Turicreate | Navan Chauhan
7 minute readCreated on December 22, 2019Last modified on September 15, 2020

Building a Fake News Detector with Turicreate

In this tutorial, we will build a fake news detecting app from scratch, using Turicreate for the machine learning model and SwiftUI for building the app.

Note: These commands are written as if you are running a jupyter notebook.

Building the Machine Learning Model

Data Gathering

To build a classifier, you need a lot of data. George McIntire (GH: @joolsa) has created a wonderful dataset containing the headline, body and whether it is fake or real. Whenever you are looking for a dataset, always try searching on Kaggle and GitHub before you start building your own

Dependencies

I used a Google Colab instance for training my model. If you also plan on using Google Colab then I recommend choosing a GPU Instance (It is Free) This allows you to train the model on the GPU. Turicreate is built on top of Apache's MXNet Framework, for us to use GPU we need to install a CUDA compatible MXNet package.

!pip install turicreate !pip uninstall -y mxnet !pip install mxnet-cu100==1.4.0.post0

If you do not wish to train on GPU or are running it on your computer, you can ignore the last two lines

Downloading the Dataset

!wget -q "https://github.com/joolsa/fake_real_news_dataset/raw/master/fake_or_real_news.csv.zip" @@ -39,7 +39,7 @@

Exporting the Model

model_name = 'FakeNews' coreml_model_name = model_name + '.mlmodel' exportedModel = model.export_coreml(coreml_model_name) -

Note: To download files from Google Volab, simply click on the files section in the sidebar, right click on filename and then click on downlaod

Link to Colab Notebook

Building the App using SwiftUI

Initial Setup

First we create a single view app (make sure you check the use SwiftUI button)

Then we copy our .mlmodel file to our project (Just drag and drop the file in the XCode Files Sidebar)

Our ML Model does not take a string directly as an input, rather it takes bag of words as an input. DescriptionThe bag-of-words model is a simplifying representation used in NLP, in this text is represented as a bag of words, without any regatd of grammar or order, but noting multiplicity

We define our bag of words function

func bow(text: String) -> [String: Double] { +

Note: To download files from Google Colab, simply click on the files section in the sidebar, right click on filename and then click on download

Link to Colab Notebook

Building the App using SwiftUI

Initial Setup

First we create a single view app (make sure you check the use SwiftUI button)

Then we copy our .mlmodel file to our project (just drag and drop the file into the Xcode file sidebar)

Our ML Model does not take a string directly as an input; rather, it takes a bag of words as an input. The bag-of-words model is a simplifying representation used in NLP, in which text is represented as a bag (multiset) of its words, disregarding grammar and word order but keeping multiplicity.
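To make the idea concrete, here is a minimal bag-of-words sketch in Python (illustrative only — the app itself implements this in Swift, and the tokenisation here is naive whitespace splitting):

```python
from collections import Counter

def bow(text: str) -> dict:
    # Lowercase, split on whitespace, and count word multiplicities.
    tokens = text.lower().split()
    return dict(Counter(tokens))

print(bow("the cat sat on the mat"))
# → {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}
```

Note that word order is lost ("cat the mat sat on the" produces the same counts), which is exactly the simplification the bag-of-words model makes.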

We define our bag of words function

func bow(text: String) -> [String: Double] { var bagOfWords = [String: Double]() let tagger = NSLinguisticTagger(tagSchemes: [.tokenType], options: 0) diff --git a/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colab/index.html b/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colab/index.html index d7b1e62..afd97b8 100644 --- a/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colab/index.html +++ b/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colab/index.html @@ -1,5 +1,5 @@ -Setting up Kaggle to use with Google Colab | Navan Chauhan
1 minute readCreated on January 15, 2020Last modified on June 1, 2020

Setting up Kaggle to use with Google Colab

In order to be able to access Kaggle Datasets, you will need to have an account on Kaggle (which is Free)

Grabbing Our Tokens

Go to Kaggle

Click on your User Profile and Click on My Account

Scroll Down untill you see Create New API Token

This will download your token as a JSON file

Copy the File to the root folder of your Google Drive

Setting up Colab

Mounting Google Drive

import os +Setting up Kaggle to use with Google Colab | Navan Chauhan
1 minute readCreated on January 15, 2020Last modified on September 15, 2020

Setting up Kaggle to use with Google Colab

In order to be able to access Kaggle Datasets, you will need to have an account on Kaggle (which is Free)

Grabbing Our Tokens

Go to Kaggle

Click on your User Profile and Click on My Account

Scroll Down until you see Create New API Token

This will download your token as a JSON file

Copy the File to the root folder of your Google Drive

Setting up Colab

Mounting Google Drive

import os from google.colab import drive drive.mount('/content/drive')

After this click on the URL in the output section, login and then paste the Auth Code

Configuring Kaggle

os.environ['KAGGLE_CONFIG_DIR'] = "/content/drive/My Drive/" -

Voila! You can now download kaggel datasets

Tagged with:
\ No newline at end of file +

Voila! You can now download Kaggle datasets

Tagged with:
\ No newline at end of file diff --git a/posts/2020-03-08-Making-Vaporwave-Track/index.html b/posts/2020-03-08-Making-Vaporwave-Track/index.html index 92c2148..6dbaa38 100644 --- a/posts/2020-03-08-Making-Vaporwave-Track/index.html +++ b/posts/2020-03-08-Making-Vaporwave-Track/index.html @@ -1 +1 @@ -Making My First Vaporwave Track (Remix) | Navan Chauhan
2 minute readCreated on March 8, 2020Last modified on June 1, 2020

Making My First Vaporwave Track (Remix)

I finally completed my first quick and dirty vaporwave remix of "I Want It That Way" by the Backstreet Boys

V A P O R W A V E

Vaporwave is all about A E S T H E T I C S. Vaporwave is a type of music genre that emmerged as a parody of Chillwave, shared more as a meme rather than a proper musical genre. Of course this changed as the genre become mature

How to Vaporwave

The first track which is considered to be actual Vaporwave is Ramona Xavier's Macintosh Plus, this unspokenly set the the guidelines for making Vaporwave

  • Take a 1980s RnB song
  • Slow it down
  • Add Bass and Trebble
  • Add again
  • Add Reverb ( make sure its wet )

There you have your very own Vaporwave track.

( Now, there are some tracks being produced which are not remixes and are original )

My Remix

Where is the Programming?

The fact that there are steps on producing Vaporwave, this gave me the idea that Vaporwave can actually be made using programming, stay tuned for when I publish the program which I am working on ( Generating A E S T H E T I C artwork and remixes)

Tagged with:
\ No newline at end of file +Making My First Vaporwave Track (Remix) | Navan Chauhan
2 minute readCreated on March 8, 2020Last modified on September 15, 2020

Making My First Vaporwave Track (Remix)

I finally completed my first quick and dirty vaporwave remix of "I Want It That Way" by the Backstreet Boys

V A P O R W A V E

Vaporwave is all about A E S T H E T I C S. Vaporwave is a music genre that emerged as a parody of Chillwave, shared more as a meme than as a proper musical genre. Of course, this changed as the genre matured.

How to Vaporwave

The first track which is considered to be actual Vaporwave is Ramona Xavier's Macintosh Plus; this set the guidelines for making Vaporwave

  • Take a 1980s RnB song
  • Slow it down
  • Add Bass and Treble
  • Add again
  • Add Reverb (make sure it's wet)

There you have your very own Vaporwave track.

( Now, there are some tracks being produced which are not remixes and are original )

My Remix

Where is the Programming?

The fact that there are defined steps for producing Vaporwave gave me the idea that Vaporwave can actually be made using programming. Stay tuned for when I publish the program I am working on (generating A E S T H E T I C artwork and remixes)

Tagged with:
\ No newline at end of file diff --git a/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOS/index.html b/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOS/index.html index 44f9312..d1c7421 100644 --- a/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOS/index.html +++ b/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOS/index.html @@ -1,4 +1,4 @@ -Fixing X11 Error on macOS Catalina for AmberTools 18/19 | Navan Chauhan
2 minute readCreated on April 13, 2020Last modified on June 1, 2020

Fixing X11 Error on macOS Catalina for AmberTools 18/19

I was trying to install AmberTools on my macOS Catalina Installation. Running ./configure -macAccelerate clang gave me an error that it could not find X11 libraries, even though locate libXt showed that my installation was correct.

Error:

Could not find the X11 libraries; you may need to edit config.h +Fixing X11 Error on macOS Catalina for AmberTools 18/19 | Navan Chauhan
2 minute readCreated on April 13, 2020Last modified on September 15, 2020

Fixing X11 Error on macOS Catalina for AmberTools 18/19

I was trying to install AmberTools on my macOS Catalina Installation. Running ./configure -macAccelerate clang gave me an error that it could not find X11 libraries, even though locate libXt showed that my installation was correct.

Error:

Could not find the X11 libraries; you may need to edit config.h to set the XHOME and XLIBS variables. Error: The X11 libraries are not in the usual location ! To search for them try the command: locate libXt @@ -13,4 +13,4 @@ Error: The X11 libraries are not in the usual location ! To build Amber without XLEaP, re-run configure with '-noX11: ./configure -noX11 --with-python /usr/local/bin/python3 -macAccelerate clang Configure failed due to the errors above! -

I searcehd on Google for a solution on their, sadly there was not even a single thread which had a solution about this error.

The Fix

Simply reinstalling XQuartz using homebrew fixed the error brew cask reinstall xquartz

If you do not have xquartz installed, you need to run brew cask install xquartz

Tagged with:
\ No newline at end of file +

I searched on Google for a solution. Sadly, there was not a single thread with a solution to this error.

The Fix

Simply reinstalling XQuartz using Homebrew fixed the error: brew cask reinstall xquartz

If you do not have XQuartz installed, you need to run brew cask install xquartz

Tagged with:
\ No newline at end of file diff --git a/posts/2020-05-31-compiling-open-babel-on-ios/index.html b/posts/2020-05-31-compiling-open-babel-on-ios/index.html index af3ad63..b98c35c 100644 --- a/posts/2020-05-31-compiling-open-babel-on-ios/index.html +++ b/posts/2020-05-31-compiling-open-babel-on-ios/index.html @@ -1,5 +1,5 @@ -Compiling Open Babel on iOS | Navan Chauhan
5 minute readCreated on May 31, 2020Last modified on June 25, 2020

Compiling Open Babel on iOS

Due to the fact that my summer vacations started today, I had the brilliant idea of trying to run open babel on my iPad. To give a little background, I had tried to compile AutoDock Vina using a cross-compiler but I had miserably failed.

I am running the Checkr1n jailbreak on my iPad and the Unc0ver jailbreak on my phone.

But Why?

Well, just because I can. This is literally the only reason I tried compiling it and also partially because in the long run I want to compile AutoDock Vina so I can do Molecular Docking on the go.

Let's Go!

How hard can it be to compile open babel right? It is just a simple software with clear and concise build instructions. I just need to use cmake to build and the make to install.

It is 11 AM in the morning. I install clang, cmake and make from the Sam Bingner's repository, fired up ssh, downloaded the source code and ran the build command.`clang

Fail No. 1

I couldn't even get cmake to run, I did a little digging arond StackOverflow and founf that I needed the iOS SDK, sure no problem. I waited for Xcode to update and transfered the SDKs to my iPad

scp -r /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk root@192.168.1.8:/var/sdks/ -

Them I told cmake that this is the location for my SDK 😠. Succesful! Now I just needed to use make.

Fail No. 2

It was giving the error that thread-local-storage was not supported on this device.

[ 0%] Building CXX object src/CMakeFiles/openbabel.dir/alias.cpp.o +Compiling Open Babel on iOS | Navan Chauhan
5 minute readCreated on May 31, 2020Last modified on September 15, 2020

Compiling Open Babel on iOS

Due to the fact that my summer vacations started today, I had the brilliant idea of trying to run open babel on my iPad. To give a little background, I had tried to compile AutoDock Vina using a cross-compiler but I had miserably failed.

I am running the Checkr1n jailbreak on my iPad and the Unc0ver jailbreak on my phone.

But Why?

Well, just because I can. This is literally the only reason I tried compiling it and also partially because in the long run I want to compile AutoDock Vina so I can do Molecular Docking on the go.

Let's Go!

How hard can it be to compile Open Babel, right? It is just a simple piece of software with clear and concise build instructions. I just need to use cmake to build and then make to install.

It is 11 AM. I installed clang, cmake and make from Sam Bingner's repository, fired up ssh, downloaded the source code and ran the build command.

Fail No. 1

I couldn't even get cmake to run. I did a little digging around StackOverflow and found that I needed the iOS SDK. Sure, no problem. I waited for Xcode to update and transferred the SDKs to my iPad.

scp -r /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk root@192.168.1.8:/var/sdks/ +

Then I told cmake that this was the location of my SDK 😠. Success! Now I just needed to run make.

Fail No. 2

It was giving the error that thread-local-storage was not supported on this device.

[ 0%] Building CXX object src/CMakeFiles/openbabel.dir/alias.cpp.o [ 1%] Building CXX object src/CMakeFiles/openbabel.dir/atom.cpp.o In file included from /var/root/obabel/ob-src/src/atom.cpp:28: In file included from /var/root/obabel/ob-src/include/openbabel/ring.h:29: @@ -39,6 +39,6 @@ THREAD_LOCAL OB_EXTERN OBAromaticTyper aromtyper; make[2]: *** [src/CMakeFiles/openbabel.dir/build.make:76: src/CMakeFiles/openbabel.dir/atom.cpp.o] Error 1 make[1]: *** [CMakeFiles/Makefile2:1085: src/CMakeFiles/openbabel.dir/all] Error 2 make: *** [Makefile:129: all] Error 2 -

Strange but it is alright, there is nothing that hasn't been answered on the internet.

I did a little digging around and could not find a solution 😔

As a temporary fix, I disabled multithreading by going and commenting the lines in the source code.

Packaging as a deb

This was pretty straight forward, I tried installing it on my iPad and it was working pretty smoothly.

Moment of Truth

So I airdropped the .deb to my phone and tried installing it, the installation was succesful but when I tried obabel it just abborted.

Turns out because I had created an install target of a seprate folder while compiling, the binaries were refferencing a non-existing dylib rather than those in the /usr/lib folder. As a quick workaround I transferred the deb folder to my laptop and used otool and install_name tool: install_name_tool -change /var/root/obabel/ob-build/lib/libopenbabel.7.dylib /usr/lib/libopenbabel.7.dylib for all the executables and then signed them using jtool

I then installed it and everything went smoothly, I even ran obabel and it executed perfectly, showing the version number 3.1.0 ✌️ Ahh, smooth victory.

Nope. When I tried converting from SMILES to pdbqt, it gave an error saying plugin not found. This was weird.

So I just copied the entire build folder from my iPad to my phone and tried runnig it. Oops, Apple Sandbox Error, Oh no!

I spent 2 hours around this problem, only to see the documentation and relaise I hadn't setup the environment variable 🤦‍♂️

The Final Fix ( For Now )

export BABEL_DATADIR="/usr/share/openbabel/3.1.0" +

Strange, but it is alright; there is nothing that hasn't been answered on the internet.

I did a little digging around and could not find a solution 😔

As a temporary fix, I disabled multithreading by commenting out the relevant lines in the source code.

Packaging as a deb

This was pretty straightforward; I tried installing it on my iPad and it worked pretty smoothly.

Moment of Truth

So I airdropped the .deb to my phone and tried installing it. The installation was successful, but when I ran obabel it just aborted.

Turns out, because I had specified a separate install-target folder while compiling, the binaries were referencing a non-existent dylib rather than the one in the /usr/lib folder. As a quick workaround, I transferred the deb folder to my laptop and used otool and install_name_tool (install_name_tool -change /var/root/obabel/ob-build/lib/libopenbabel.7.dylib /usr/lib/libopenbabel.7.dylib) on all the executables, and then signed them using jtool

I then installed it and everything went smoothly, I even ran obabel and it executed perfectly, showing the version number 3.1.0 ✌️ Ahh, smooth victory.

Nope. When I tried converting from SMILES to pdbqt, it gave an error saying plugin not found. This was weird.

So I just copied the entire build folder from my iPad to my phone and tried running it. Oops, Apple Sandbox Error, Oh no!

I spent 2 hours on this problem, only to look at the documentation and realise I hadn't set up the environment variables 🤦‍♂️

The Final Fix ( For Now )

export BABEL_DATADIR="/usr/share/openbabel/3.1.0" export BABEL_LIBDIR="/usr/lib/openbabel/3.1.0"

This was the tragedy of trying to compile something without knowing enough about compiling. It is 11:30 as of writing this. Something as trivial as this should not have taken me so long. Am I going to try to compile AutoDock Vina next? 🤔 Maybe.

Also, if you want to try Open Babel on your jailbroken iDevice, install the package from my repository (you need to run the above-mentioned final fix :p). This was tested on iOS 13.5; I cannot tell if it will work on other versions or not.

Hopefully, I will add some more screenshots to this post.

Edit 1: Added Screenshots, had to replicate the errors.

Tagged with:
\ No newline at end of file diff --git a/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOL/index.html b/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOL/index.html index 27d01c9..c2726cf 100644 --- a/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOL/index.html +++ b/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOL/index.html @@ -1,8 +1,8 @@ -Workflow for Lightning Fast Molecular Docking Part One | Navan Chauhan
2 minute readCreated on June 1, 2020Last modified on June 2, 2020

Workflow for Lightning Fast Molecular Docking Part One

My Setup

  • macOS Catalina ( RIP 32bit app)
  • PyMOL
  • AutoDock Vina
  • Open Babel

One Command Docking

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt && vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt +Workflow for Lightning Fast Molecular Docking Part One | Navan Chauhan
2 minute readCreated on June 1, 2020Last modified on September 15, 2020

Workflow for Lightning Fast Molecular Docking Part One

My Setup

  • macOS Catalina ( RIP 32bit app)
  • PyMOL
  • AutoDock Vina
  • Open Babel

One Command Docking

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt && vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt

To run this command, you simply copy the SMILES string of the ligand you want; it is automatically taken from your clipboard, the 3D structure is generated in the AutoDock PDBQT format using Open Babel, and the ligand is then docked with your receptor using AutoDock Vina, all with just one command.

Let me break down the commands

obabel -:"$(pbpaste)" --gen3d -opdbqt -Otest.pdbqt

pbpaste and pbcopy are macOS commands for pasting from and copying to the clipboard. Linux users may install the xclip and xsel packages from their respective package managers and then add these aliases to their .bash_profile, .zshrc, etc.

alias pbcopy='xclip -selection clipboard'
alias pbpaste='xclip -selection clipboard -o'
$(pbpaste)

In bash, $( ) performs command substitution: the construct is replaced by the standard output of the command inside it. In this scenario we are using it to get the contents of the clipboard.
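Since pbpaste is macOS-only, command substitution can be tried in isolation with echo standing in for it (the SMILES string below is just an example, ethanol):

```shell
# $(...) is replaced by the standard output of the command inside it
# before the rest of the line runs; echo stands in for pbpaste here.
smiles="$(echo 'CCO')"
echo "ligand: $smiles"
```

The shell runs the inner command first, captures its output, and substitutes it into the outer command — exactly what happens when obabel receives the clipboard contents.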

The rest of the command is a normal Open Babel command to generate a 3D structure in PDBQT format and then save it as test.pdbqt

&& -

This tells the termianl to only run the next part if the previous command runs succesfuly without any errors.

vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt +

This tells the terminal to only run the next part if the previous command runs successfully without any errors.
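In isolation (the directory name below is made up for the example):

```shell
# '&&' runs the right-hand command only if the left-hand one exited with
# status 0 -- so in the post, vina only starts after obabel succeeds.
mkdir -p docking_demo && echo "conversion ok, docking next"
```

Had the first command failed, the second would be skipped entirely, which prevents Vina from being run on a ligand file that was never generated.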

vina --receptor lu.pdbqt --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 --ligand test.pdbqt

This is just the docking command for AutoDock Vina. In the next part, I will show how to use PyMOL and a plugin to directly generate the coordinates in Vina format --center_x -9.7 --center_y 11.4 --center_z 68.9 --size_x 19.3 --size_y 29.9 --size_z 21.3 without needing to type them manually.

Tagged with:
\ No newline at end of file diff --git a/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOS/index.html b/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOS/index.html index 9553d47..e1b18af 100644 --- a/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOS/index.html +++ b/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOS/index.html @@ -1,4 +1,4 @@ -Compiling AutoDock Vina on iOS | Navan Chauhan
3 minute readCreated on June 2, 2020

Compiling AutoDock Vina on iOS

Why? Because I can.

Installing makedepend

makedepend is a Unix tool used to generate dependencies of C source files. Most modern programes do not use this anymore, but then again AutoDock Vina's source code hasn't been changed since 2011. The first hurdle came when I saw that there was no makedepend command, neither was there any package on any development repository for iOS. So, I tracked down the original source code for makedepend (https://github.com/DerellLicht/makedepend). According to the repository this is actually the source code for the makedepend utility that came with some XWindows distribution back around Y2K. I am pretty sure there is a problem with my current compiler configuration because I had to manually edit the Makefile to provide the path to the iOS SDKs using the -isysroot flag.

Editting the Makefile

Original Makefile ( I used the provided mac Makefile base )

BASE=/usr/local +Compiling AutoDock Vina on iOS | Navan Chauhan
3 minute readCreated on June 2, 2020Last modified on September 15, 2020

Compiling AutoDock Vina on iOS

Why? Because I can.

Installing makedepend

makedepend is a Unix tool used to generate dependencies of C source files. Most modern programs do not use this anymore, but then again AutoDock Vina's source code hasn't been changed since 2011. The first hurdle came when I saw that there was no makedepend command, neither was there any package on any development repository for iOS. So, I tracked down the original source code for makedepend (https://github.com/DerellLicht/makedepend). According to the repository this is actually the source code for the makedepend utility that came with some XWindows distribution back around Y2K. I am pretty sure there is a problem with my current compiler configuration because I had to manually edit the Makefile to provide the path to the iOS SDKs using the -isysroot flag.

Editing the Makefile

Original Makefile ( I used the provided mac Makefile base )

BASE=/usr/local BOOST_VERSION=1_41 BOOST_INCLUDE = $(BASE)/include C_PLATFORM=-arch i386 -arch ppc -isysroot /Developer/SDKs/MacOSX10.5.sdk -mmacosx-version-min=10.4 @@ -7,7 +7,7 @@ BOOST_LIB_VERSION= include ../../makefile_common -

I installed Boost 1.68.0-1 from Sam Bingner's repository. ( Otherwise I would have had to compile boost too 😫 )

Editted Makefile

BASE=/usr +

I installed Boost 1.68.0-1 from Sam Bingner's repository. ( Otherwise I would have had to compile boost too 😫 )

Edited Makefile

BASE=/usr BOOST_VERSION=1_68 BOOST_INCLUDE = $(BASE)/include C_PLATFORM=-arch arm64 -isysroot /var/sdks/Latest.sdk @@ -26,4 +26,4 @@ include ../../makefile_common std::cerr << "\n\nParse error on line " << e.line << " in file \"" << e.file.native_file_string() << "\": " << e.reason << '\n'; ~~~~~~ ^ 2 errors generated. -

Turns out native_file_string was deprecated in Boost 1.57 and replaced with just string

Error 3 - Library Not Found

This one still boggles me because there was no reason for it to not work, as a workaround I downloaded the DEB, extracted it and used that path for compiling.

Error 4 - No Member Named 'nativefilestring' Again.

But, this time in another file and I quickle fixed it

Moment of Truth

Obviously it was working on my iPad, but would it work on another device? I transfered the compiled binary and

The package is available on my repository and only depends on boost. ( Both, Vina and Vina-Split are part of the package)

Tagged with:
\ No newline at end of file +

Turns out native_file_string was deprecated in Boost 1.57 and replaced with just string

Error 3 - Library Not Found

This one still boggles me because there was no reason for it to not work, as a workaround I downloaded the DEB, extracted it and used that path for compiling.

Error 4 - No Member Named 'nativefilestring' Again.

But, this time in another file and I quickly fixed it

Moment of Truth

Obviously it was working on my iPad, but would it work on another device? I transferred the compiled binary and

The package is available on my repository and only depends on boost. ( Both, Vina and Vina-Split are part of the package)

Tagged with:
\ No newline at end of file diff --git a/posts/2020-08-01-Natural-Feature-Tracking-ARJS/index.html b/posts/2020-08-01-Natural-Feature-Tracking-ARJS/index.html index d6e1f73..9bdad25 100644 --- a/posts/2020-08-01-Natural-Feature-Tracking-ARJS/index.html +++ b/posts/2020-08-01-Natural-Feature-Tracking-ARJS/index.html @@ -1,4 +1,4 @@ -Introduction to AR.js and Natural Feature Tracking | Navan Chauhan
20 minute readCreated on August 1, 2020

Introduction to AR.js and Natural Feature Tracking

AR.js

AR.js is a lightweight library for Augmented Reality on the Web, coming with features like Image Tracking, Location based AR and Marker tracking. It is the easiest option for cross-browser augmented reality.

The same code works for iOS, Android, Desktops and even VR Browsers!

It weas initially created by Jerome Etienne and is now maintained by Nicolo Carpignoli and the AR-js Organisation

NFT

Usually for augmented reality you need specialised markers, like this Hiro marker (notice the thick non-aesthetic borders 🤢)

This is called marker based tracking where the code knows what to look for. NFT or Natural Feature Tracing converts normal images into markers by extracting 'features' from it, this way you can use any image of your liking!

I'll be using my GitHub profile picture

Creating the Marker!

First we need to create the marker files required by AR.js for NFT. For this we use Carnaux's repository 'NFT-Marker-Creator'.

$ git clone https://github.com/Carnaux/NFT-Marker-Creator +Introduction to AR.js and Natural Feature Tracking | Navan Chauhan
7 minute readCreated on August 1, 2020Last modified on September 15, 2020

Introduction to AR.js and Natural Feature Tracking

AR.js

AR.js is a lightweight library for Augmented Reality on the Web, coming with features like Image Tracking, Location based AR and Marker tracking. It is the easiest option for cross-browser augmented reality.

The same code works for iOS, Android, Desktops and even VR Browsers!

It was initially created by Jerome Etienne and is now maintained by Nicolo Carpignoli and the AR-js Organisation

NFT

Usually for augmented reality you need specialised markers, like this Hiro marker (notice the thick non-aesthetic borders 🤢)

This is called marker based tracking, where the code knows what to look for. NFT, or Natural Feature Tracking, converts normal images into markers by extracting 'features' from them; this way you can use any image of your liking!

I'll be using my GitHub profile picture

Creating the Marker!

First we need to create the marker files required by AR.js for NFT. For this we use Carnaux's repository 'NFT-Marker-Creator'.

$ git clone https://github.com/Carnaux/NFT-Marker-Creator
Cloning into 'NFT-Marker-Creator'...
remote: Enumerating objects: 79, done.
@@ -68,283 +68,9 @@ Generator started at 2020-08-01 16
 [info] Saving to asa.iset...
 [info] Done.
 [info] Generating FeatureList...
-[info] Start for 72.000000 dpi image.
-[info] ImageSize = 309560[pixel]
-[info] Extracted features = 24930[pixel]
-[info] Filtered features = 6192[pixel]
[... remaining removed "-[info]" log lines elided: FeatureList generation for each scale from 72.000000 dpi down to 3.699000 dpi, followed by FeatureSet and FeatureSet3/Freak feature generation ...]
+
+...
+
 [info] (46, 44) 5.871797[dpi]
 [info] Freak features - 23[info] ========= 23 ===========
 [info] (37, 35) 4.660448[dpi]
diff --git a/posts/index.html b/posts/index.html
index 1e7a78f..299404f 100644
--- a/posts/index.html
+++ b/posts/index.html
@@ -1 +1 @@
-Posts | Navan Chauhan

Posts

Tips, tricks and tutorials which I think might be useful.

\ No newline at end of file
+Posts | Navan Chauhan

Posts

Tips, tricks and tutorials which I think might be useful.

\ No newline at end of file
diff --git a/sitemap.xml b/sitemap.xml
index a48f507..396a16e 100644
--- a/sitemap.xml
+++ b/sitemap.xml
@@ -1 +1 @@
-https://navanchauhan.github.io/aboutdaily1.02020-07-16https://navanchauhan.github.io/postsdaily1.02020-08-01https://navanchauhan.github.io/posts/2010-01-24-experimentsmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-05-05-Custom-Snowboard-Anemone-Thememonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-04-Google-Teachable-Machinesmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-08-Image-Classifier-Tensorflowmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-08-Splitting-Zipsmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-10-TensorFlow-Model-Predictionmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regressionmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-22-Fake-News-Detectormonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-14-Converting-between-PIL-NumPymonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colabmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-16-Image-Classifier-Using-Turicreatemonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-19-Connect-To-Bluetooth-Devices-Linux-Terminalmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-03-03-Playing-With-Android-TVmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-03-08-Making-Vaporwave-Trackmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOSmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosmonthly0.52020-06-25https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLmonthly0.52020-06-02https://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSmonthly0.52020-06-02https://navanchauhan.github.io/posts/2020-07-01-Install-rdkit-colabmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSmonthly0.52020-08-01https://navanchauhan.github.io/posts/hello-worldmonthly0.52020-06-01https://navanchauhan.github.io/publicationsdaily1.02020-03-17https://navanchauhan.github.io/publications/2019-05-14-Detecting-Driver-Fatigue-Over-Speeding-and-Speeding-up-Post-Accident-Responsemonthly0.52020-03-14https://navanchauhan.github.io/publications/2020-03-14-generating-vaporwavemonthly0.52020-03-15https://navanchauhan.github.io/publications/2020-03-17-Possible-Drug-Candidates-COVID-19monthly0.52020-03-18
\ No newline at end of file
+https://navanchauhan.github.io/aboutdaily1.02020-07-16https://navanchauhan.github.io/postsdaily1.02020-08-01https://navanchauhan.github.io/posts/2010-01-24-experimentsmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-05-05-Custom-Snowboard-Anemone-Thememonthly0.52020-09-15https://navanchauhan.github.io/posts/2019-12-04-Google-Teachable-Machinesmonthly0.52020-09-15https://navanchauhan.github.io/posts/2019-12-08-Image-Classifier-Tensorflowmonthly0.52020-09-15https://navanchauhan.github.io/posts/2019-12-08-Splitting-Zipsmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-10-TensorFlow-Model-Predictionmonthly0.52020-06-01https://navanchauhan.github.io/posts/2019-12-16-TensorFlow-Polynomial-Regressionmonthly0.52020-09-15https://navanchauhan.github.io/posts/2019-12-22-Fake-News-Detectormonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-01-14-Converting-between-PIL-NumPymonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-15-Setting-up-Kaggle-to-use-with-Colabmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-01-16-Image-Classifier-Using-Turicreatemonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-01-19-Connect-To-Bluetooth-Devices-Linux-Terminalmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-03-03-Playing-With-Android-TVmonthly0.52020-06-01https://navanchauhan.github.io/posts/2020-03-08-Making-Vaporwave-Trackmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-04-13-Fixing-X11-Error-AmberTools-macOSmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-05-31-compiling-open-babel-on-iosmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-06-01-Speeding-Up-Molecular-Docking-Workflow-AutoDock-Vina-and-PyMOLmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-06-02-Compiling-AutoDock-Vina-on-iOSmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-07-01-Install-rdkit-colabmonthly0.52020-09-15https://navanchauhan.github.io/posts/2020-08-01-Natural-Feature-Tracking-ARJSmonthly0.52020-09-15https://navanchauhan.github.io/posts/hello-worldmonthly0.52020-06-01https://navanchauhan.github.io/publicationsdaily1.02020-03-17https://navanchauhan.github.io/publications/2019-05-14-Detecting-Driver-Fatigue-Over-Speeding-and-Speeding-up-Post-Accident-Responsemonthly0.52020-03-14https://navanchauhan.github.io/publications/2020-03-14-generating-vaporwavemonthly0.52020-03-15https://navanchauhan.github.io/publications/2020-03-17-Possible-Drug-Candidates-COVID-19monthly0.52020-03-18
\ No newline at end of file
diff --git a/tags/arjs/index.html b/tags/arjs/index.html
index 667be64..63da6b1 100644
--- a/tags/arjs/index.html
+++ b/tags/arjs/index.html
@@ -1 +1 @@
-Navan Chauhan

Tagged with AR.js

Browse all tags
\ No newline at end of file
+Navan Chauhan

Tagged with AR.js

Browse all tags
\ No newline at end of file
diff --git a/tags/augmentedreality/index.html b/tags/augmentedreality/index.html
index d9d175f..1f4f39c 100644
--- a/tags/augmentedreality/index.html
+++ b/tags/augmentedreality/index.html
@@ -1 +1 @@
-Navan Chauhan

Tagged with Augmented-Reality

Browse all tags
\ No newline at end of file
+Navan Chauhan

Tagged with Augmented-Reality

Browse all tags
\ No newline at end of file
diff --git a/tags/javascript/index.html b/tags/javascript/index.html
index df30132..3a77458 100644
--- a/tags/javascript/index.html
+++ b/tags/javascript/index.html
@@ -1 +1 @@
-Navan Chauhan

Tagged with JavaScript

Browse all tags
\ No newline at end of file
+Navan Chauhan

Tagged with JavaScript

Browse all tags
\ No newline at end of file
diff --git a/tags/tutorial/index.html b/tags/tutorial/index.html
index 1d03fdf..c400942 100644
--- a/tags/tutorial/index.html
+++ b/tags/tutorial/index.html
@@ -1 +1 @@
-Navan Chauhan

Tagged with tutorial

Browse all tags
\ No newline at end of file
+Navan Chauhan

Tagged with Tutorial

Browse all tags
\ No newline at end of file
-- cgit v1.2.3