Machine Learning, AI & GenAI
✦ Gen AI Ready
A 7-week deep dive — from classical ML and deep learning to transformers, diffusion models, and building AI agents. Theory meets hands-on projects at every step.
Duration: 7 Weeks
Level: Intermediate+
Sessions: Live
Live Cohort: ML, AI with GenAI
₹2,499 (38% off ₹4,000)
Course Overview
A structured 7-week programme covering Machine Learning, Deep Learning, and Generative AI.
What You'll Learn
- Linear Algebra, Statistics & ML Foundations
- Supervised & Unsupervised Learning (SVM, K-Means, Decision Trees)
- Deep Learning (CNNs, RNNs, LSTMs, Backpropagation)
- Transformers & Self-Attention Mechanisms
- Prompt Engineering & RAG Pipelines
- Diffusion Models, Vision Transformers & AI Agents
What You'll Achieve
- Build & deploy ML models end to end
- Design deep learning architectures from scratch
- Create RAG pipelines & fine-tune LLMs
- Build a capstone AI agent system
Why choose this track?
Everything you need to break into ML, AI, and Generative AI.
- Continuous assignments
- Doubt-clearing sessions
- Mock interviews
- 3 real-world projects
Course Curriculum
A weekly roadmap — from ML foundations and deep learning to transformers, GenAI, and building AI agents. A few short code sketches follow the topic list, to give a feel for the hands-on work.
- Vectors, dot product & cross product
- Matrix operations & transformations
- Probability distributions & inferential statistics
- Cost function & gradient descent
- Linear Regression — line fitting & model evaluation (MSE, R²)
- Feature scaling & normalization
- Support Vector Machines — hyperplane, margin & kernel trick
- K-Means clustering — centroid updates & the elbow method
- Decision Trees — entropy, information gain & Gini impurity
- Random Forests & ensemble methods
- Dimensionality Reduction — PCA, eigenvectors & eigenvalues
- Validation — K-Fold cross-validation & bias-variance trade-off
- AI ethics, fairness & responsible ML
- Perceptron & multi-layer architecture (MLP)
- Activation functions (ReLU, sigmoid, tanh)
- Forward propagation & loss functions
- Backpropagation — chain rule & computational graphs
- SGD, momentum & learning rate schedules
- Adam optimizer — intuition & math
- Dropout & batch normalization
- Convolution, kernels & feature maps
- Pooling, stride & padding
- CNN architectures (LeNet, VGG, ResNet)
- Image classification project
- Recurrent Neural Networks — hidden state & vanishing gradients
- Gated Recurrent Units (GRU)
- LSTMs — forget, input & output gates
- Time-series forecasting project
- Attention mechanism — Query, Key, Value
- Multi-head attention & positional encoding
- Transformer architecture (encoder-decoder)
- Vision Transformers (ViT) — image patches as tokens
- Comparing ViT and CNN performance
- Transfer learning with pre-trained models
- Zero-shot & few-shot prompting
- Chain-of-thought reasoning & prompt patterns
- Retrieval-Augmented Generation (RAG)
- Vector databases & embeddings
- Fine-tuning vs in-context learning
- Forward & reverse diffusion — denoising
- U-Net architecture & Stable Diffusion
- Agent architecture — planning & tool use
- Building sub-agents (search, code, write)
- Orchestration & feedback loops
- End-to-end RAG pipeline project
- 🚀 Final Project: Build & deploy a complete AI agent system
- 🔧 Bonus: Git, GitHub & portfolio deployment
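
To give a feel for the hands-on side of the early material, here is a minimal sketch of line fitting with gradient descent, scored with MSE and R². The data, learning rate, and step count are made up for illustration; this is not course code.

```python
import numpy as np

# Toy data (assumption): a noisy line with true w = 3, b = 2
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=100)

w, b = 0.0, 0.0        # parameters to learn
lr = 0.01              # learning rate (illustrative value)

for _ in range(2000):
    y_hat = w * x + b                  # prediction (forward pass)
    error = y_hat - y
    grad_w = 2 * np.mean(error * x)    # d(MSE)/dw
    grad_b = 2 * np.mean(error)        # d(MSE)/db
    w -= lr * grad_w                   # gradient descent update
    b -= lr * grad_b

y_hat = w * x + b
mse = np.mean((y_hat - y) ** 2)
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"w={w:.2f}  b={b:.2f}  MSE={mse:.3f}  R2={r2:.3f}")
```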
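The transformer material centres on the attention mechanism (Query, Key, Value). The sketch below computes single-head scaled dot-product attention over a toy sequence in plain NumPy; the sizes and random weights are placeholder assumptions, not anything from the course.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8          # toy sizes (assumptions)

X = rng.normal(size=(seq_len, d_model))  # embeddings for a 4-token sequence
W_q = rng.normal(size=(d_model, d_k))    # learned projections in a real model,
W_k = rng.normal(size=(d_model, d_k))    # random placeholders here
W_v = rng.normal(size=(d_model, d_k))

Q, K, V = X @ W_q, X @ W_k, X @ W_v      # queries, keys, values

scores = Q @ K.T / np.sqrt(d_k)          # how well each query matches each key
scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
weights = np.exp(scores)
weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys: rows sum to 1

output = weights @ V                     # attention-weighted mix of value vectors
print(weights.round(2))                  # (4, 4) attention matrix
print(output.shape)                      # (4, 8)
```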
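The GenAI topics include Retrieval-Augmented Generation. The sketch below shows only the retrieval step, with toy bag-of-words vectors standing in for learned embeddings and an in-memory list standing in for a vector database; the documents and query are invented for illustration.

```python
import numpy as np

# Tiny "document store" (made-up snippets)
docs = [
    "gradient descent minimises the loss function",
    "transformers use self attention over tokens",
    "k-means assigns points to the nearest centroid",
]
vocab = sorted({word for d in docs for word in d.split()})

def embed(text):
    """Toy embedding: unit-normalised word-count vector over the shared vocabulary."""
    counts = np.array([text.split().count(word) for word in vocab], dtype=float)
    norm = np.linalg.norm(counts)
    return counts / norm if norm else counts

doc_vecs = np.stack([embed(d) for d in docs])   # stands in for a vector database

query = "how does attention work in transformers"
scores = doc_vecs @ embed(query)                # cosine similarity (unit vectors)
best = int(np.argmax(scores))
print("retrieved context:", docs[best])         # would be passed to the LLM as context
```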