BIG-O AI: INTRODUCTION TO MACHINE LEARNING

Are you eager to learn Machine Learning but don’t know where to start? Do you feel overwhelmed by the vast amount of material out there and unsure of the most effective learning path?

The “Big-O AI Introduction to Machine Learning” course will take you from the fundamentals to mastering classic ML algorithms (KNN, Linear Regression, SVM, Decision Tree) through 16 lessons combining theory and hands-on practice. Each session includes an explanation of how the model works and a practical exercise, such as spam detection, house price prediction, and data analysis with Netflix and Tesla datasets — along with two Kaggle Challenges.

The course concludes with an introduction to Neural Networks and Large Language Models (LLMs), giving you the foundation to confidently explore advanced deep learning.

Special feature: A career-oriented talkshow with ML/Data Science experts from top tech companies.

By the end of the course, you will have learned how to:

  • Prepare and visualize data: Understand how to clean and prepare datasets while leveraging tools and techniques to visualize data effectively.

  • Apply fundamental Machine Learning algorithms: Gain a solid understanding of how essential ML algorithms work and how to apply them.

  • Optimize models: Learn advanced techniques such as hyperparameter tuning, ensemble methods, and more to enhance model performance.

  • Understand Neural Networks: Grasp the principles of neural networks, providing you with a strong foundation to explore Deep Learning.

Programming language used in this course: Python, one of the most widely used programming languages today.

Tuition Fee: Special offer for the first 5 early registrants. For details of tuition fees, please see the attached link below.

In addition, for more detailed advice about the AI course, you can contact our fanpage: Big-O Coding.

You can view the opening time and class timetable, and register, via this link.

SUITABLE AUDIENCES (STUDENTS)

  • Prerequisites: Learners should have basic Python programming skills.

  • Optional background: You may already be able to use machine learning models via library calls, yet still lack a deep understanding of how those models actually work.

  • If this Big-O AI class is not a good fit for you, please call us at 0937.401.483 for advice on the next open classes.

COURSE ILLUSTRATION EXERCISES

  • The exercises are divided into two types: multiple-choice questions to reinforce theoretical knowledge, and hands-on programming tasks to build coding skills.
  • They range from implementing algorithms from scratch (to deeply understand their mechanics) to applying models to real-world problems such as sentiment analysis, financial prediction, and anomaly detection.
  • Highlight: Two Kaggle Challenges simulate a professional working environment, enabling learners to apply their knowledge to complex datasets and develop the problem-solving skills of real-world Data Scientists.

TIME AND LOCATION OF THIS COURSE

  • Duration: 2.5 months (10 weeks)
  • Format: Online via Zoom.
  • Number of students per class: 25 to 30 maximum.
  • Each class has 1 main teacher and 5 teaching assistants.
  • In particular, weekly Office Hours are held for students who need to review a lesson or catch up with the course progress.

WHAT MAKES THE COURSES AT BIG-O CODING DIFFERENT

1. TEACHING PROGRAM:

  • The programs are taught by Algorithm experts with many years of experience (see also Teaching Staff).
  • Students have the chance to meet people who have gone before them and succeeded, and to hear them share their Algorithm learning and working experiences.
  • In addition to the main lecturer, each class has 5 teaching assistants who manage the class and its own forum, ensuring that every student’s questions are answered quickly, anytime and anywhere.

2. OBJECTIVES AFTER THE COURSE:

  • A complete, solid foundation of Machine Learning knowledge.
  • Master the complete machine learning development workflow: from data preprocessing and pipeline construction to model evaluation and professional-level optimization.
  • Gain a solid understanding of core ML models and be able to implement them from scratch — ready to confidently explain how they work to recruiters or colleagues.
  • Build a strong foundation to step into entry-level roles such as Junior Data Scientist or ML Engineer, or to transition your career into the AI field.
  • Understand the big picture of modern AI — from fundamental algorithms to Neural Networks and LLMs — equipping you to stay up-to-date with technology trends and pursue advanced topics through self-learning.

AI COURSE SYLLABUS

Learn the fundamental steps of data preparation for machine learning tasks: explore the dataset, handle common data types (strings, numerics, missing values), and visualize the data using Python libraries such as matplotlib and seaborn.
Practical exercise: Explore and visualize data from Netflix TV shows.
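
For a feel of this step, here is a minimal sketch of exploring and plotting a dataset with pandas, matplotlib, and seaborn; the file name netflix_titles.csv and its "type" column are assumptions for illustration, not necessarily the course's actual dataset.

    import pandas as pd
    import seaborn as sns
    import matplotlib.pyplot as plt

    # Load the (hypothetical) Netflix titles file and inspect types and missing values
    df = pd.read_csv("netflix_titles.csv")
    df.info()  # column types and missing-value counts

    # Drop rows missing a key field, then plot how many Movies vs. TV Shows there are
    df = df.dropna(subset=["type"])
    sns.countplot(data=df, x="type")
    plt.title("Netflix titles by type")
    plt.show()
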
Introduction to the concept of classification, how the K-Nearest Neighbors (KNN) algorithm works, and how to measure distances between data points.
Practical exercise: Apply KNN to the problem of detecting outlier data points.
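
As a rough illustration of the idea (not the course's own code), the sketch below classifies a new point by Euclidean distance to its k nearest neighbours; the toy coordinates are made up.

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x_new, k=3):
        # Euclidean distance from the new point to every training point
        distances = np.linalg.norm(X_train - x_new, axis=1)
        # Majority vote among the labels of the k nearest neighbours
        nearest = np.argsort(distances)[:k]
        return Counter(y_train[nearest]).most_common(1)[0][0]

    X_train = np.array([[1.0, 2.0], [2.0, 1.0], [8.0, 9.0], [9.0, 8.0]])
    y_train = np.array([0, 0, 1, 1])
    print(knn_predict(X_train, y_train, np.array([8.5, 8.0])))  # predicts class 1
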
Learn about the three core components of a machine learning system: data, model, and the training process. Introduction to evaluation metrics for classification and regression tasks. Overview of techniques for evaluating a machine learning model.
Practical exercise: Develop a machine learning model for the task of spam email detection.
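
To illustrate the evaluation side, here is a minimal sketch of common classification metrics using scikit-learn (an assumed library choice); the labels and predictions are invented toy values.

    from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

    y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = spam, 0 = not spam
    y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # hypothetical model predictions

    print("accuracy :", accuracy_score(y_true, y_pred))
    print("precision:", precision_score(y_true, y_pred))  # of predicted spam, how much was spam
    print("recall   :", recall_score(y_true, y_pred))      # of actual spam, how much was caught
    print("f1-score :", f1_score(y_true, y_pred))
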
Introduction to linear functions and the simple linear regression model. Learn how to measure error using the Mean Squared Error (MSE) loss function and explore optimization methods using calculus. Analyze model outputs to support data-driven decision making.
Practical exercise: Predict individual medical costs using univariate or multivariate linear regression. Apply Lasso Regression for feature selection.
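
A minimal sketch of fitting a simple linear regression and measuring MSE, assuming scikit-learn and a tiny made-up dataset; the Lasso step mirrors the feature-selection idea.

    import numpy as np
    from sklearn.linear_model import LinearRegression, Lasso
    from sklearn.metrics import mean_squared_error

    X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])   # a single input feature
    y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])             # roughly y = 2x

    model = LinearRegression().fit(X, y)
    print("slope:", model.coef_[0], "intercept:", model.intercept_)
    print("MSE  :", mean_squared_error(y, model.predict(X)))  # mean of squared errors

    # Lasso adds an L1 penalty that can shrink uninformative coefficients to exactly zero
    lasso = Lasso(alpha=0.1).fit(X, y)
    print("Lasso coefficient:", lasso.coef_[0])
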
Learn the Gradient Descent optimization method in machine learning: the concept of derivatives, how to compute gradients, and the parameter update mechanism. Analyze the impact of learning rate, number of epochs, and batch size. Introduction to variants such as Stochastic and Mini-batch Gradient Descent.
Practical exercises:
  • Implement Gradient Descent using PyTorch's automatic differentiation.
  • Apply the Gradient Descent algorithm to optimize a model for predicting Tesla stock prices.
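
A minimal sketch of the update loop with PyTorch's automatic differentiation, fitting y ≈ w·x + b on made-up data; the learning rate and epoch count are illustrative.

    import torch

    x = torch.tensor([1.0, 2.0, 3.0, 4.0])
    y = torch.tensor([3.0, 5.0, 7.0, 9.0])       # underlying relationship: y = 2x + 1

    w = torch.zeros(1, requires_grad=True)
    b = torch.zeros(1, requires_grad=True)
    lr = 0.05                                     # learning rate

    for epoch in range(500):
        loss = ((w * x + b - y) ** 2).mean()      # MSE loss
        loss.backward()                           # autograd computes the gradients
        with torch.no_grad():
            w -= lr * w.grad                      # gradient descent update
            b -= lr * b.grad
            w.grad.zero_()                        # clear gradients for the next epoch
            b.grad.zero_()

    print(w.item(), b.item())                     # should approach 2 and 1
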
Learn the Logistic Regression algorithm and how to apply it to binary and multiclass classification problems.
Practical exercise: Use Logistic Regression to predict the likelihood of loan default.
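
A minimal sketch of binary classification with Logistic Regression, assuming scikit-learn and a made-up income vs. default toy dataset.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.array([[20], [25], [30], [45], [60], [80]])   # e.g. monthly income (hypothetical)
    y = np.array([1, 1, 1, 0, 0, 0])                     # 1 = defaulted, 0 = repaid

    clf = LogisticRegression().fit(X, y)
    print(clf.predict([[28]]))         # predicted class for a new borrower
    print(clf.predict_proba([[28]]))   # class probabilities from the sigmoid output
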
Explore the Naive Bayes model through probability concepts: random variables, conditional probability, and Bayes’ theorem. Apply Naive Bayes to classification tasks with various types of data.
Practical exercise: Build a model for sentiment analysis from text data.
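
A minimal sketch of Naive Bayes sentiment classification on a tiny invented corpus, assuming scikit-learn's MultinomialNB with bag-of-words counts.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    texts = ["great movie, loved it", "terrible and boring",
             "loved the acting", "boring plot, waste of time"]
    labels = [1, 0, 1, 0]                         # 1 = positive, 0 = negative

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(texts)           # word-count features
    clf = MultinomialNB().fit(X, labels)

    print(clf.predict(vectorizer.transform(["loved the movie"])))  # likely positive
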
Introduction to feature extraction and creation techniques to enhance model performance: feature transformation, interaction terms, and advanced encoding methods. Learn how to use pipelines to build a complete machine learning model.
Practical exercise: Build a model to predict house prices.
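
A minimal sketch of a preprocessing-plus-model pipeline, assuming scikit-learn; the column names and values are invented for illustration.

    import pandas as pd
    from sklearn.pipeline import Pipeline
    from sklearn.compose import ColumnTransformer
    from sklearn.preprocessing import OneHotEncoder, StandardScaler
    from sklearn.linear_model import LinearRegression

    df = pd.DataFrame({"area": [50, 80, 120, 60],
                       "district": ["A", "B", "A", "C"],
                       "price": [150, 240, 390, 170]})

    preprocess = ColumnTransformer([
        ("num", StandardScaler(), ["area"]),                             # scale numeric features
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["district"]),   # encode categorical features
    ])
    model = Pipeline([("prep", preprocess), ("reg", LinearRegression())])
    model.fit(df[["area", "district"]], df["price"])
    print(model.predict(pd.DataFrame({"area": [100], "district": ["B"]})))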

Participate in a Kaggle challenge to build a model for Loan Approval Prediction.

Learn the concept of optimal linear separation using maximum margin. Introduction to handling non-linear relationships with kernel methods. Explore the connection between Support Vector Machines (SVM) and hinge loss.
Practical exercise: Investigate how different hyperparameters affect the predictive performance of an SVM model.
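
A minimal sketch of comparing SVM hyperparameters (C, kernel, gamma) with a grid search, assuming scikit-learn and its built-in iris dataset.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    param_grid = {"C": [0.1, 1, 10],             # margin softness
                  "kernel": ["linear", "rbf"],   # linear vs. non-linear decision boundary
                  "gamma": ["scale", 0.1]}       # RBF kernel width

    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)
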
Understand how Decision Trees work, from splitting data based on feature attributes to applying the model in various machine learning tasks.
Practical exercise: Use Decision Trees for feature engineering and build a model to detect malicious software.
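
A minimal sketch of a Decision Tree classifier and its feature importances, assuming scikit-learn and using its built-in breast-cancer dataset as a stand-in for malware data.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
    print("test accuracy:", tree.score(X_test, y_test))
    print("most important feature index:", tree.feature_importances_.argmax())
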
Learn how to optimize models by tuning hyperparameters. Explore the three main ensemble methods: bagging, boosting, and stacking, along with key algorithms associated with each approach.
Practical exercise: Build a model to predict the likelihood of diabetes.
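
A minimal sketch of bagging and boosting plus hyperparameter tuning, assuming scikit-learn and a synthetic classification dataset in place of real medical data.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.model_selection import GridSearchCV, cross_val_score

    X, y = make_classification(n_samples=500, n_features=10, random_state=0)

    # Tune tree count and depth for a Random Forest (a bagging method)
    grid = GridSearchCV(RandomForestClassifier(random_state=0),
                        {"n_estimators": [100, 300], "max_depth": [3, None]}, cv=5)
    grid.fit(X, y)
    print("best forest params:", grid.best_params_)

    # Compare against a boosting model with default settings
    print("boosting CV accuracy:",
          cross_val_score(GradientBoostingClassifier(random_state=0), X, y, cv=5).mean())
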
Introduction to unsupervised learning and popular clustering algorithms such as K-Means and Hierarchical Clustering. Visualize clustering results and evaluate clustering quality using the silhouette score.
Practical exercise: Apply K-Means to the image compression problem.
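
A minimal sketch of K-Means and the silhouette score on made-up 2-D data, assuming scikit-learn.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.5, (50, 2)),    # one blob around (0, 0)
                   rng.normal(5, 0.5, (50, 2))])   # another blob around (5, 5)

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print("cluster centres:\n", kmeans.cluster_centers_)
    print("silhouette score:", silhouette_score(X, kmeans.labels_))  # near 1 = well separated
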
Learn the dimensionality reduction technique Principal Component Analysis (PCA) and how to use it to improve model performance and visualize data.
Practical exercise: Apply feature extraction using PCA for the face recognition task.
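
A minimal sketch of PCA for dimensionality reduction, assuming scikit-learn and its small built-in digits dataset as a stand-in for face images.

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA

    X, _ = load_digits(return_X_y=True)    # 64 pixel features per image
    pca = PCA(n_components=10).fit(X)      # keep the 10 main directions of variation
    X_reduced = pca.transform(X)

    print("reduced shape:", X_reduced.shape)
    print("variance explained:", pca.explained_variance_ratio_.sum())
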
Learn how neural networks work, including their basic structure and fundamental concepts such as forward propagation and backward propagation. Introduction to the training process and the functioning of modern large language models.
Practical exercise: Build a neural network for the handwritten digit classification task.
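
A minimal sketch of a small feed-forward network trained with forward and backward propagation, assuming PyTorch and scikit-learn's 8x8 digits dataset to keep the example self-contained.

    import torch
    import torch.nn as nn
    from sklearn.datasets import load_digits

    X, y = load_digits(return_X_y=True)
    X = torch.tensor(X, dtype=torch.float32) / 16.0    # scale pixel values to [0, 1]
    y = torch.tensor(y, dtype=torch.long)

    model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)   # forward propagation + loss
        loss.backward()               # backward propagation (gradients)
        optimizer.step()              # parameter update

    print("training accuracy:", (model(X).argmax(dim=1) == y).float().mean().item())
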
Participate in a Kaggle challenge to build a model for Insurance Cost Prediction (Insurance Regression).

Hear experience-sharing sessions from Machine Learning Engineers and Data Scientists from top tech companies.