- Videos: 143
- Views: 1,474,390
Jon Krohn
Joined 13 Jul 2013
Dr. Jon Krohn is Chief Data Scientist at Nebula, author of the bestselling book Deep Learning Illustrated, and host of the SuperDataScience podcast. Jon is renowned for his compelling lectures, which he offers in person at Columbia University, New York University, and leading industry conferences.
Copies of Deep Learning Illustrated are available at bit.ly/iTkrohn. Use KROHN during checkout for 35% off!
Also available from Amazon at amzn.to/32TB6rB
To keep up with the latest from Jon, sign up for his newsletter here at jonkrohn.com, follow him on Twitter @JonKrohnLearns, and on LinkedIn at linkedin.com/in/jonkrohn.
Generative AI with Large Language Models: Hands-On Training feat. Hugging Face and PyTorch Lightning
TOPIC SUMMARY
Module 1: Introduction to Large Language Models (LLMs)
- A Brief History of Natural Language Processing (NLP)
- Transformers
- Subword Tokenization
- Autoregressive vs Autoencoding Models
- ELMo, BERT and T5
- The GPT (Generative Pre-trained Transformer) Family
- LLM Application Areas
Module 2: The Breadth of LLM Capabilities
- LLM Playgrounds
- Staggering GPT-Family progress
- Key Updates with GPT-4
- Calling OpenAI APIs, including GPT-4
Module 3: Training and Deploying LLMs
- Hardware Options (e.g., CPU, GPU, TPU, IPU, AWS chips)
- The Hugging Face Transformers Library
- Best Practices for Efficient LLM Training
- Parameter-efficient fine-tuning (PEFT) with low-rank adaptation (LoRA)
- Open-...
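To give a concrete flavor of the Module 3 material, here is a minimal sketch of parameter-efficient fine-tuning with LoRA via the Hugging Face transformers and peft libraries. This is an illustration under stated assumptions, not code from the training itself: the base model ("gpt2") and the LoRA hyperparameters are placeholders chosen for demonstration.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # illustrative stand-in for any causal LLM
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA freezes the pretrained weights and injects small trainable
# low-rank matrices, so only a tiny fraction of parameters is updated.
lora_config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% trainable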
Views: 24,556
Videos
Getting Value from Artificial Intelligence - Jon Krohn at Hg Capital "Digital Forum" 2023
2.4K views · 1 year ago
In February 2023, I delivered this keynote on "Getting Value from A.I." to open the second day of Hg Capital's "Digital Forum" in London. With a focus on B2B SaaS applications, over 45 minutes I covered:
1. What Deep Learning A.I. is and How it Works
2. Tasks that are Replaceable with A.I. vs Tasks that can be Augmented
3. How to Effectively Implement A.I. Research into Production
The audience ...
Four Major Weightlifting PRs (DL, OHS, PJ, FS)
2K views · 2 years ago
New year, same process... and incrementally more results (in this case, weightlifting PRs). By sticking to my never-miss-a-workout habit since Autumn 2020, these most recent weightlifting personal records - shown in the video and all done on different days in recent weeks - are:
• Deadlift: 455 lb. (15 lb. over May 2022)
• Overhead Squat: 200 lb. (10 lb. over May 2022)
• Push Jerk: 230 lb. (10 lb....
#shorts Plotting a System of Linear Equations
314 views · 2 years ago
My "Machine Learning Foundations" RUclips series covers the foundational subjects you need to excel at ML. In the second episode of the series I walk through how to plot a system of linear equations in Python! Full video here: ruclips.net/video/ibTYANFwrNc/видео.html #ML #algebra #linearalgebra
#Shorts System of Linear Equations - Topic 1 of Machine Learning Foundations
272 views · 2 years ago
In this clip I walk through a system of linear equations problem. The full video is the first video of my Machine Learning Foundations series, where I introduce the basics of Linear Algebra and how Linear Algebra relates to Machine Learning, as well as providing a brief lesson on the origins and applications of modern algebra. There are eight subjects covered comprehensively in the ML Foundation...
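As a hedged companion sketch (using the same illustrative equations as the plot above, not necessarily the video's numbers), the system can also be solved numerically by rewriting both equations in the form Ax = b with x = [t, d]:

import numpy as np

# 2.5t - d = 0    (robber)
# 3t   - d = 15   (sheriff)
A = np.array([[2.5, -1.0],
              [3.0, -1.0]])
b = np.array([0.0, 15.0])

t, d = np.linalg.solve(A, b)
print(t, d)  # 30.0 75.0: the lines intersect after 30 minutes, 75 km out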
The Staggering Pace of Technological Change in One Lifetime
971 views · 2 years ago
My first TED-format talk is live! In it, I use (A.I.-generated!) visuals to color how A.I. will transform the world in our lifetimes, with particular emphases on climate change, food security, and healthcare innovations. This clip is the opening hook of the talk. For the full video, head over to: jonkrohn.com/TEDx
Big Olympic Lift PRs: 255# Clean & 185# Snatch
1.2K views · 2 years ago
One year into disciplined commitment to a CrossFit training program, the gains have mostly been slow and modest, but they suddenly started accumulating to great effect. My previous clean PR was just 235 pounds, so this is a whopping 20-pound jump to 255#. Similarly surprising (you can see from my reaction in the video!): the 185-pound snatch was an enormous 15# jump over my previous PR of 170#.
380# Back Squat & 170# Snatch: New All-Time PRs
1.1K views · 2 years ago
Last week I set new all-time PRs for two lifts: the back squat and snatch. I failed both lifts on my first attempt. However, I was able to overcome mental hurdles and succeeded on the second attempt at hitting new all-time PRs in both lifts: a 380# back squat (compared to 375# in December) and a 170# snatch (compared to 160# in December).
440-pound Deadlift: First Lift 2x Bodyweight
1.8K views · 2 years ago
I weigh 220 pounds so this 440-pound deadlift is my first ever of twice my bodyweight and a new all-time PR. My previous PR was 405 pounds in May 2021; delighted to crush that figure a year later :)
Exercises on Event Probabilities - Topic 98 of Machine Learning Foundations
3.6K views · 2 years ago
#MLFoundations #Probability #MachineLearning In my series of videos on Probability Theory we’ve already covered events, sample spaces, multiple observations, and combinatorics. This video features four exercises - some using paper and pencil, some using Python code - to test and cement your understanding of the topics so far. There are eight subjects covered comprehensively in the ML Foundation...
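In the same spirit as those exercises (this particular one is an assumed example, not necessarily one of the video's four), here is a quick Python check of an event probability:

# P(at least one heads in n fair coin flips) = 1 - P(all tails)
n = 3
p_all_tails = 0.5 ** n
p_at_least_one_heads = 1 - p_all_tails
print(p_at_least_one_heads)  # 0.875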
Combinatorics - Topic 97 of Machine Learning Foundations
3.3K views · 2 years ago
#MLFoundations #Probability #MachineLearning Combinatorics is a field of mathematics devoted to counting that can be helpful for studying probabilities. In this video, we use examples with real numbers to bring the field of combinatorics to life and relate it to probability theory. There are eight subjects covered comprehensively in the ML Foundations series and this video is from the fifth subje...
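A small sketch of the kind of real-number example the video describes (the numbers here are assumptions, not necessarily the video's):

from math import comb

# number of ways to get exactly 2 heads in 5 coin flips: "5 choose 2"
ways = comb(5, 2)
# each of the 2**5 equally likely sequences has probability 1/32
p = ways / 2 ** 5
print(ways, p)  # 10 0.3125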
Multiple Independent Observations - Topic 96 of Machine Learning Foundations
3K views · 2 years ago
#MLFoundations #Probability #MachineLearning In this video, we consider probabilistic events where we have multiple independent observations - such as flipping a coin two or more times instead of just once. There are eight subjects covered comprehensively in the ML Foundations series and this video is from the fifth subject, "Probability & Information Theory". More detail about the series and a...
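A minimal sketch of the idea (an assumed example, not necessarily the video's): for independent observations, the joint probability is the product of the individual probabilities.

p_heads = 0.5
for n_flips in (1, 2, 3):
    # probability that every one of n_flips independent fair flips is heads
    print(n_flips, p_heads ** n_flips)  # 0.5, 0.25, 0.125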
Events and Sample Spaces - Topic 95 of Machine Learning Foundations
4.4K views · 2 years ago
#MLFoundations #Probability #MachineLearning In this video, we learn about some of the most fundamental atoms of probability theory: events and sample spaces. There are eight subjects covered comprehensively in the ML Foundations series and this video is from the fifth subject, "Probability & Information Theory". More detail about the series and all of the associated open-source code is availab...
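A tiny illustrative sketch of these atoms, assuming a fair six-sided die as the example:

# the sample space is the set of all possible outcomes;
# an event is any subset of the sample space
sample_space = {1, 2, 3, 4, 5, 6}
event_even = {2, 4, 6}

# with equally likely outcomes, P(event) = |event| / |sample space|
p_even = len(event_even) / len(sample_space)
print(p_even)  # 0.5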
What Probability Theory Is - Topic 94 of Machine Learning Foundations
7K views · 2 years ago
#MLFoundations #Probability #MachineLearning This video is a quick introduction to what Probability Theory is! There are eight subjects covered comprehensively in the ML Foundations series and this video is from the fifth subject, "Probability & Information Theory". More detail about the series and all of the associated open-source code is available at github.com/jonkrohn/ML-foundations The pla...
A Brief History of Probability Theory - Topic 93 of Machine Learning Foundations
11K views · 2 years ago
#MLFoundations #Probability #MachineLearning This video is a quick introduction to the fascinating history of Probability Theory. There are eight subjects covered comprehensively in the ML Foundations series and this video is from the fifth subject, "Probability & Information Theory". More detail about the series and all of the associated open-source code is available at github.com/jonkrohn/ML-...
Probability & Information Theory - Subject 5 of Machine Learning Foundations
19K views · 2 years ago
Probability & Information Theory - Subject 5 of Machine Learning Foundations
My Favorite Calculus Resources - Topic 92 of Machine Learning Foundations
3.1K views · 2 years ago
Finding the Area Under the ROC Curve - Topic 91 of Machine Learning Foundations
2.7K views · 2 years ago
Definite Integral Exercise - Topic 90 of Machine Learning Foundations
1.8K views · 2 years ago
Numeric Integration with Python - Topic 89 of Machine Learning Foundations
1.8K views · 2 years ago
Definite Integrals - Topic 88 of Machine Learning Foundations
1.8K views · 2 years ago
Indefinite Integral Exercises - Topic 87 of Machine Learning Foundations
1.5K views · 3 years ago
The Integral Calculus Rules - Topic 86 of Machine Learning Foundations
1.8K views · 3 years ago
What Integral Calculus Is - Topic 85 of Machine Learning Foundations
2.3K views · 3 years ago
The ROC Curve (Receiver-Operating Characteristic Curve) - Topic 84 of Machine Learning Foundations
3K views · 3 years ago
The Confusion Matrix - Topic 83 of Machine Learning Foundations
2K views · 3 years ago
Binary Classification - Topic 82 of Machine Learning Foundations
2.3K views · 3 years ago
Integral Calculus - The Final Segment of Calculus Videos in my ML Foundations Series
1.8K views · 3 years ago
Exercise on Higher-Order Partial Derivatives - Topic 81 of Machine Learning Foundations
1.5K views · 3 years ago
thank you so much sir
Valuable? Indeed! Very well-explained, comprehensive and concise. What I love the most is how you explain, which makes every concept easily comprehensible (the same as in your book).
Sir, I understood the explanation very well, to my understanding. Like the police-and-robber equation, are we trying to find the best dose? And what is the forgetfulness score for the dose? I am just trying to understand what 1.76 and -0.469 are in real-world terms.
thank you sir
Looks as if AI is there.
thanks a lot sir
Dear Dr. Jon Krohn, My name is MD. Golam Rabbany, and I am from Bangladesh. I aspire to become an AI engineer. Although I currently lack a personal computer, I am actively preparing by practicing mathematics on Khan Academy and improving my English communication skills through YouTube tutorials. I frequently reflect on my future study plans and eagerly anticipate the day I can begin studying Machine Learning (ML) with my own computer. However, through my research, I've realized that mathematics can be a significant challenge for many, including myself. I find your videos, and those of other related channels, incredibly helpful. Your in-depth knowledge of ML and the underlying mathematics is evident. Therefore, I humbly request that you consider creating a video series based on a book like "Mathematics for Machine Learning" by Marc Peter Deisenroth, or any other book you deem suitable. I would particularly appreciate it if these videos were "pen and paper friendly," allowing viewers to follow along and learn effectively even without access to advanced software or computational resources. I am eager to embark on my mathematics learning journey with your guidance. Sincerely, MD. Golam Rabbany
Hello @JonKrohnLearns, are there coding exercises too, so that we can cement the knowledge in our brains, aside from the pen-and-paper exercises? Thanks for the response!
very good lectures. Thank you so much
2025 Thank you sir❤ Will be studying
New sub sir❤
Great tutorial, learned a lot
What is the use of all these operations?
Wow 😳. Thankyou so much🎉
Amazing 🤩😍
Hi @JonKrohnLearns, thank you for this series! It seems very promising for beginners; however, I am having issues following along with the Python coding, mainly that the notebooks in your repository include an older version of TensorFlow, which makes reproducing your code very challenging. Are there any plans to update these? Thanks!
Computer science students need this!
Just wanted to point it out. Indians had been practicing mathematics long before other civilizations even existed. Please fact-check before providing information about history.
Hey Mr. Jon! I would like to ask whether you will ever continue recording your videos. Please try to place your cursor somewhere near the center of the screen. The part at 9:23 is covered by subtitles, and on Udemy the subtitles cannot be moved from the bottom.
Thank you, bro, you have shown where we use matrix multiplication. I am satisfied and want to learn more in the field of AI and ML.
It's probably too late to ask, but: the cops-and-robbers example makes sense to me, but in the house example, looking at, say, the bx1 term representing distance from school, what is unknown about it? Are we looking for a single value of x1 that applies to all the equations in the system, the way we had a single distance and time in the cops-and-robbers example? And what would it even mean?
Thanks for this video
Imagine teaching a class of 14 thousand people. You're a GOAT.
😮
Heyyy, this feels like a wonderful series; this will be my stop for MML. I hope that by the time I reach this video, you will have resumed the series!
You've helped so many people with this course, especially me!
I hope your pillow is cold on both sides tonight.
Thank you for everything you do Dr. Krohn ❤😊
Excellent! However, the second-order derivative of distance over time must be velocity, not acceleration :)
Your explanation is crystal clear! The way you explained the eigenvectors not only mathematically but also through NumPy and PyTorch made me learn many things. Thanks :)
Thank you for sharing your knowledge. Brilliant.
Why didn't we set the y in (yhat - true y) to 0 when we were taking the derivative of yhat? Is it related to what you said earlier, that we still mention y even though it is irrelevant while the other variable is being differentiated? Or is it because of the comparison of yhat and true y?
Really good!!!
Watched both linear algebra and calculus; it helped a lot.
Excellent. Thank you for sharing your knowledge.
Hi, I think your answer to the linear algebra question is a bit ambiguous. Q1: it is 25 minutes from the sheriff's perspective (t + 5), but it's 30 minutes from the system's perspective. It could be worded as "how long does the robber keep away from the sheriff?", since this fits t = 0 from the robber's perspective and makes it (t - 5).
Have you taught up to Optimization in your Udemy course? If not, then for God's sake, please do continue. I am just your videos away from being an ML engineer. Please take some time out of your busy schedule and complete the course. Me begs 🙏
My goal is to do it with no supportive gear... then squat clean it! Lol 😅 That would not be merely awesome... no... it would be purely badass! 😂
Damn, I'm 17 and Brazilian, and yet I understood your content really well. Thanks!!!
Thank you, sir, for the class! Until now we studied without understanding the concept, but now you have made it clear.
import numpy as np

# first matrix
matrix_1 = np.array([[1, 0, 0],
                     [0, 1, 0],
                     [0, 0, 1]])

# take out the columns as individual vectors
v1 = matrix_1[:, 0]
v2 = matrix_1[:, 1]
v3 = matrix_1[:, 2]
print(v1)
print(v2)
print(v3)

# if the dot product of any two vectors is 0, then the two vectors are orthogonal to each other
print(np.dot(v1, v2))
print(np.dot(v1, v3))
print(np.dot(v2, v3))

# if the norm is 1, then it is a unit-norm vector
print(np.linalg.norm(v1))
print(np.linalg.norm(v2))
print(np.linalg.norm(v3))

# now with the second matrix
matrix_2 = np.array([[2/3, -2/3, 1/3],
                     [1/3, 2/3, 2/3],
                     [2/3, 1/3, -2/3]])

# take out the columns as individual vectors
v1 = matrix_2[:, 0]
v2 = matrix_2[:, 1]
v3 = matrix_2[:, 2]
print(v1)
print(v2)
print(v3)

# if the dot product of any two vectors is 0, then the two vectors are orthogonal to each other
print(np.dot(v1, v2))
print(np.dot(v1, v3))
print(np.dot(v2, v3))

# if the norm is 1, then it is a unit-norm vector
print(np.linalg.norm(v1))
print(np.linalg.norm(v2))
print(np.linalg.norm(v3))
I can't find the GitHub repository.
Loved these. Brilliant course.
👌👌👌👌well explained
This blows my mind. 😮
I'm watching it again and it's still mind-blowing that you can decompose a huge matrix, and when you use the greatest singular value with its corresponding vectors, you get the most significant feature of an image.
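A minimal sketch of the idea in the comment above (illustrative only, not the video's code): keep just the greatest singular value and its corresponding singular vectors to form a rank-1 approximation.

import numpy as np

A = np.random.rand(64, 64)  # stand-in for a grayscale image matrix
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# reconstruct using only the greatest singular value and its vectors
rank1 = s[0] * np.outer(U[:, 0], Vt[0, :])
print(rank1.shape)  # (64, 64): a rank-1 approximation of A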
I have found treasure. Thank you Mr. Jon
this is the best course in ML ever, thanks a lot!!
Very good!
It might be due to my level of English, but at the start of the video there's the statement "a determinant is a special scalar value that we can calculate for any given matrix," while the first slide mentions square matrices as the only ones for which the determinant is defined. Many sources (including Wikipedia) also mention just square matrices, but Google returns some guides on how to compute determinants of non-square matrices. 😁
Very good. Thank You.