It's somewhat funny how things started with time series, RNNs, and LSTMs, and arrived at Transformers (which parallelize far better on GPUs) in just a few years. Today's huge models perform amazing tasks, while using a lot of resources. As great as it is to use those large Transformers, I hope someday we (or an AI) will find a better way than these "predictive machines" to create the next type of AI, one that hopefully won't require so much overhead. (Some of these models approach the number of neurons in the human brain, yet the brain does so much more, which makes me question current efficiency; transformers are not brains.) That aside, this course is a great gift.