What if we could do away with all the complexities of a neuron and instead model neural networks with logic gates? Fundamentally, logic gates are not differentiable, but with some modifications we can make them differentiable. We can also let the network learn which logic gate to use via a differentiable categorical distribution over gate choices. This interesting NeurIPS 2022 paper shows that logic gate networks achieve much faster inference than standard neural networks at similar accuracy. Scaling them up remains an issue, though, and we discuss some approaches that could potentially help in the next phase of improvements.
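To make the two relaxations concrete, here is a minimal sketch (my own illustration, not the paper's code): each binary gate is replaced by its real-valued relaxation on inputs in [0, 1], and the choice of gate becomes a softmax-weighted mixture, so gradients can flow both through the gates and into the gate-selection logits.

```python
import math

# Real-valued relaxations of a few two-input gates (inputs in [0, 1]).
# With hard 0/1 inputs these reduce exactly to the boolean gates.
GATES = [
    lambda a, b: a * b,              # AND
    lambda a, b: a + b - a * b,      # OR
    lambda a, b: a + b - 2 * a * b,  # XOR
    lambda a, b: 1 - a * b,          # NAND
]

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def gate_neuron(a, b, logits):
    """One 'neuron': a softmax mixture over candidate gates.

    The output is differentiable in a, b, and the logits, so the
    network can learn which gate each neuron should become.
    """
    weights = softmax(logits)
    return sum(w * g(a, b) for w, g in zip(weights, GATES))

# When one logit dominates, the neuron behaves like the selected
# hard gate; at inference time it can be replaced by that gate alone.
out = gate_neuron(1.0, 1.0, [10.0, 0.0, 0.0, 0.0])  # close to AND(1, 1) = 1
```

At inference time, each neuron's distribution is collapsed to its argmax gate, which is why the resulting network runs as pure (and very fast) boolean logic.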
DiffLogic Code Implementation: github.com/Felix-Petersen/dif...
De Morgan's Laws: en.wikipedia.org/wiki/De_Morg...
Universal Logic Gates: www.electronics-tutorials.ws/....
Gated Linear Units (GLU): arxiv.org/abs/1908.07442
1:48 Perceptron and Logic Gates
16:08 Differences between Perceptron and Logic Gates
20:10 What Logic Gates to model?
23:26 Logic Gates Network Overall Architecture
36:02 Difficulty in training Logic Gates
37:17 Relaxation 1: Real-valued Logics
38:33 Relaxation 2: Distribution over the choice of parameter
43:55 Training Setup
45:05 Configuring output for classification
59:04 Exponential Growth of Gates
1:01:43 My thoughts on how to model biological neurons
AI and ML enthusiast. Likes to think about the essences behind breakthroughs in AI and explain them in a simple and relatable way. Also an avid game creator.
Online AI blog: delvingintotech.wordpress.com/.
Try out my games here: simmer.io/@chongmin