Research Reading: Replacing Back-Propagation With The Forward-Forward Algorithm

Speaker

Jonathan Bechtel

Details

For this research reading, we'll discuss the latest optimization method proposed by Geoffrey Hinton: the forward-forward algorithm.

It's a technique that, if successful, could replace back-propagation for neural-network training. The gist of the approach is that instead of calculating gradients of a model's weights with a backward pass after each forward pass, you run two separate forward passes: a positive pass on real data and a negative pass on "negative data" that can be generated by the neural network itself, with each layer updating its weights locally from the two passes.

The potential benefit of this approach is two-fold:
- A forward pass is much more computationally lightweight than a backward pass.
- The negative data for the second forward pass could be generated and stored offline, so models could run continuously in real time while ingesting large amounts of training data, making training much more composable.
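To make the idea concrete, here is a minimal sketch of a single forward-forward layer. This is a simplified illustration, not Hinton's reference implementation: the layer sizes, learning rate, threshold, and toy positive/negative data are all illustrative assumptions. It uses the paper's sum-of-squares "goodness" measure, trained locally so that positive data scores above a threshold and negative data below it, with no gradient flowing between layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # "Goodness" of a layer's activations: sum of squared activities.
    return (h ** 2).sum(axis=1)

class FFLayer:
    """One layer trained locally with a forward-forward-style objective."""

    def __init__(self, n_in, n_out, lr=0.03, threshold=2.0):
        # Small random weights; lr and threshold are illustrative choices.
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr
        self.threshold = threshold

    def _normalize(self, x):
        # Normalize inputs so only their direction carries information.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return np.maximum(0.0, self._normalize(x) @ self.W)  # ReLU

    def train_step(self, x_pos, x_neg):
        # Push goodness of positive data above the threshold and
        # goodness of negative data below it, using only local gradients.
        for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
            xn = self._normalize(x)
            h = np.maximum(0.0, xn @ self.W)
            g = goodness(h)
            # p = probability the sample is classified "positive".
            p = 1.0 / (1.0 + np.exp(-(sign * (g - self.threshold))))
            # Gradient of log p w.r.t. h (ReLU mask is implicit: h is 0
            # wherever the pre-activation was negative).
            grad_h = (sign * (1.0 - p))[:, None] * 2.0 * h
            self.W += self.lr * (xn.T @ grad_h) / len(x)

# Toy demo: positive data lives in the first five input dimensions,
# negative data in the last five.
x_pos = np.zeros((64, 10)); x_pos[:, :5] = 1.0 + 0.05 * rng.normal(size=(64, 5))
x_neg = np.zeros((64, 10)); x_neg[:, 5:] = 1.0 + 0.05 * rng.normal(size=(64, 5))

layer = FFLayer(10, 16)
for _ in range(200):
    layer.train_step(x_pos, x_neg)

# After training, positive data should score higher goodness than negative.
g_pos = goodness(layer.forward(x_pos)).mean()
g_neg = goodness(layer.forward(x_neg)).mean()
```

Because each layer's objective is purely local, layers can be stacked and trained greedily in sequence, which is what removes the need for a backward pass through the whole network.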

Event type: Research Reading Group
Preparation: Read the Paper
May 24, 2023, 7:00 pm - 8:00 pm
In Person
330 7th Ave, 2nd Floor, NY NY 10001

While this event is FREE, tickets are required & space is limited!


About the speaker

Jonathan Bechtel

Data Scientist


Upcoming Events

Experimental Design with Text Message Data
How to conduct experiments with text at scale
Online | August 23, 2023 | Speaker: Laura Zheng

Causal Inference and Machine Learning: The Current Frontier
How to Combine Counterfactuals with Pattern Recognition
In Person | August 10, 2023 | Speaker: Gerard Torrats-Espinosa

DSML Group Private Dinner
Join other DSML Group attendees for a private dinner
In Person | August 2, 2023 | Speaker: Jonathan Bechtel

Research Reading: Prototyping Conversational LLMs With Alpaca
Fast and Easy Chatbots
In Person | July 26, 2023 | Speaker: Jonathan Bechtel

Using Machine Learning in E-Commerce w/ Rokt
Join us at Rokt on-site to learn how a leader in e-commerce technology uses ML to solve problems.
In Person | July 12, 2023 | Speaker: Yan Xu

Using Machine Learning to Study the Structure of CryptoMarkets
What is the underlying graph structure of the crypto financial market?
In Person | June 28, 2023 | Speaker: Jonathan Bechtel