An introduction to two basic ideas in machine learning through the lens of learning curves: overfitting and underfitting.
Qn. Can we use the concept of overfitting to understand how good our training data is? E.g. what if we cannot overfit our training data? Can we say our data is not good enough to train the model?
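That idea does exist as a common debugging practice (not from the video itself): deliberately try to overfit a tiny subset of your data with a high-capacity model. If training loss won't go to near zero even then, something is wrong with the data or the pipeline. A minimal sketch, using synthetic data and an unconstrained decision tree as the high-capacity model:

```python
# Sanity check: a sufficiently flexible model should drive training
# error to (near) zero on a tiny dataset it is allowed to memorize.
# If it can't, the labels may be noisy/conflicting or the pipeline buggy.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))                     # tiny synthetic training set
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=20)

model = DecisionTreeRegressor()                  # unconstrained tree: very high capacity
model.fit(X, y)

train_r2 = model.score(X, y)                     # R^2 on the memorized data
print(train_r2)                                  # expect ~1.0: perfect memorization
```

If this check fails on your real data (training score stays far from perfect even with an unconstrained model), that usually points at duplicated inputs with conflicting labels or a broken preprocessing step rather than at the model.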
What does it mean when the test loss is lower than the training loss?
oh no,… someone must have complained about the sound right?🤨… I'm going to miss the loud videos, I loved them louder. It always woke up my attention😒😒
The best 17 minutes I've seen in years.
Thanks for this video. I would say validation set, cross-validation sets or resamples instead of testing. But the main ideas are the same. I only use the holdout set once for the last fit and some people can misinterpret some concepts here.
Superb explanation of what the problem is and how to approach solving it.
That beat 💓
Absolutely brilliant!
Great quality of information and really precise. So helpful for a beginner like me
Hey everyone! Hope you enjoyed the video and it's helpful for you! Make sure to SMASH that like button and Subscribe to help the channel grow. The next video will be sickkkkk! See you then.
Great Insights. Very helpful, Thank you.
Excellent analogy 👏 Thank you very much Santiago!! Your videos are so cool and to the point !!
Straight to the point. I honestly like how you talk more about theory and analysis rather than code.
What an amazing explanation ! It shows how well you yourself understood it – so glad I found you on Twitter !!
Hey! Solid video, had a slight recommender; balance the audio out a bit, I think L is 10-20% louder than R.
Hey man, I'm not understanding ROC curves in logistic regression.
YES!!!! You just solved a problem I ran into years ago!
So top suggestion for a ML book
This is one of the best Machine Learning channels I've seen. Thanks Santiago 🙏🏻 You have a new subscriber. I came here from Twitter, your content there is super good❤️🙏🏻
Please keep making explanatory videos with simple language, so anyone can understand.
Thank you again🙏🏻
Great explanations, especially useful when you work on your own dataset rather than the Kaggle ones.
great explanation
Indian dominance in IT is so high now that white guys make educational videos with an Indian accent.
Understood every bit of it, well done brother ❤❤
Great explanation with lots of illustrations, simply a very good job, keep going.
read about islam man
Can't describe how helpful and beautiful this video is, simply Amazing.
Best explanation ever on YouTube! Keep it up man!
My left ear loved this video.
I have read a lot of articles and watched a few tutorials. But THIS is the perfect explanation for beginners in the ML field. Thank you very much.
Thank you! That helped a lot!!
This was very helpful but how do we define 'high' and 'low' loss? It's relative I assume but is there some rule of thumb?
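Right, it is relative. One common heuristic (my suggestion, not something stated in the video) is to compare your loss against a trivial baseline predictor: a loss near the baseline is "high" no matter its absolute value, and a loss well below it is "low". A sketch with synthetic data, where `DummyClassifier` predicts class frequencies:

```python
# Loss values are only meaningful relative to a reference point.
# Heuristic: compare the model's loss to a trivial baseline predictor.
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: always predict the training-set class frequencies.
baseline = DummyClassifier(strategy="prior").fit(X_tr, y_tr)
model = LogisticRegression().fit(X_tr, y_tr)

base_loss = log_loss(y_te, baseline.predict_proba(X_te))
model_loss = log_loss(y_te, model.predict_proba(X_te))

# "Low" loss = clearly below the baseline; near-baseline loss is "high".
print(base_loss, model_loss)
```

For balanced binary classification the baseline log loss sits around ln(2) ≈ 0.69, which gives you a concrete anchor for judging your own numbers.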
I believe you have taught so many artificial brains that you know how to get information into even the slowest human brains out there. You repeat the fundamentals with a different tone, put letters on screen, and give it time to absorb, ensuring there is no overfitting or underfitting in my learning today.
Great job and you got a new subscriber.
Amazing explanation, thanks.
Like this so much 👍👍👍
This is too good❤❤
That was easily the best explanation of learning curves. I have seen each of those, except the perfect curve, but I will keep trying!
Please tell us what the maximum difference between validation loss and training loss should be so that the model is not overfitting. My model shows training loss of 1.46 × 10^(-5) and validation loss of 0.018. So is the model overfitting? Anyone, please reply.
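There is no official maximum gap; what matters is the relative picture, and a near-zero training loss paired with a validation loss orders of magnitude larger is the classic overfitting signature. A rough sketch of that reasoning using the numbers quoted in the comment (the thresholds below are an assumption, not a standard rule):

```python
# Rough heuristic (an assumption, not an official rule): a validation
# loss orders of magnitude above a near-zero training loss usually
# signals overfitting -- the model has memorized the training set.
train_loss = 1.46e-5   # values quoted in the comment above
val_loss = 0.018

gap_ratio = val_loss / train_loss
print(gap_ratio)       # ~1233: validation loss is >1000x the training loss

likely_overfitting = train_loss < 1e-3 and gap_ratio > 10
print(likely_overfitting)
```

By that heuristic, yes, the quoted model looks like it is overfitting; whether a validation loss of 0.018 is acceptable in absolute terms still depends on the task and the loss scale.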