ARTIFICIAL INTELLIGENCE COURSE - DURATION: 3 MONTHS (180 HOURS)
MODULE 1: Python Foundations for AI
- Getting Started with VS Code
- Creating Python Environments (virtualenv, Anaconda, etc.)
- Python Syntax and Semantics
- Variables and Data Types in Python
- Lists, Tuples, Sets, Dictionaries
- List Comprehension & Real-World Use Cases
- Operators (Arithmetic, Logical, Relational, etc.)
- Python Control Flow (see the sketch after this module):
  - Conditional Statements (if, elif, else)
  - Loops (for, while, break, continue)
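A minimal, self-contained sketch tying the Module 1 topics together (data types, list comprehension, conditionals, loops); the temperature readings are invented for illustration:

```python
# Invented sample data: temperature readings in Celsius.
readings = [21.5, 19.0, 25.3, 30.1, 17.8]

# List comprehension: convert every reading to Fahrenheit in one expression.
fahrenheit = [c * 9 / 5 + 32 for c in readings]
print(fahrenheit)

# Conditionals and loops: classify each reading.
for c in readings:
    if c >= 30:
        label = "hot"
    elif c >= 20:
        label = "mild"
    else:
        label = "cool"
    print(f"{c:.1f} C -> {label}")

# A set keeps only the distinct labels that occurred.
distinct = {"hot" if c >= 30 else "mild" if c >= 20 else "cool" for c in readings}
print(sorted(distinct))
```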
MODULE 2: Python Programming - Intermediate
- Functions in Python:
  - Defining Functions
  - Lambda, map(), filter(), reduce()
- Python Standard Library Modules: os, random, re, datetime
- File Handling (read, write, append modes)
- Exception Handling
- Object-Oriented Programming (OOP, sketched after this module):
  - Class, Object, Inheritance
  - Polymorphism, Abstraction, Encapsulation
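A short sketch of the Module 2 topics using only the standard library; the file name demo.txt is a hypothetical scratch file:

```python
from functools import reduce

# Functional tools: keep the even numbers, square them, then sum.
nums = [1, 2, 3, 4, 5, 6]
evens = filter(lambda n: n % 2 == 0, nums)
squares = map(lambda n: n * n, evens)
total = reduce(lambda a, b: a + b, squares)  # 4 + 16 + 36 = 56

# OOP: a tiny hierarchy showing inheritance and polymorphism.
class Animal:
    def __init__(self, name):
        self.name = name

    def speak(self):  # overridden by subclasses (polymorphism)
        raise NotImplementedError

class Dog(Animal):
    def speak(self):
        return f"{self.name} says woof"

# File and exception handling: write, then read back defensively.
try:
    with open("demo.txt", "w") as f:  # hypothetical scratch file
        f.write(str(total))
    with open("demo.txt") as f:
        print(f.read())
except OSError as exc:
    print(f"file error: {exc}")

print(Dog("Rex").speak())
```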
MODULE 3: Natural Language Processing (NLP)
- Introduction to NLP
- NLTK (Natural Language Toolkit)
- Tokenization, Stemming, Lemmatization
- Removing Stopwords, Named Entity Recognition (NER)
- Text Vectorization Techniques (see the sketch after this module):
  - One-Hot Encoding (OHE)
  - Bag of Words (BoW)
  - N-Grams
  - TF-IDF
  - Word Embeddings: Word2Vec
- Text Preprocessing (lowercasing, punctuation removal, etc.)
- Text Classification and Sentiment Analysis
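A preprocessing and vectorization sketch, assuming nltk and scikit-learn are installed; depending on the NLTK version, the tokenizer data is packaged as punkt or punkt_tab, so both downloads are attempted:

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer
from nltk.tokenize import word_tokenize
from sklearn.feature_extraction.text import TfidfVectorizer

# One-time corpus downloads; the package name varies across NLTK versions.
for pkg in ("punkt", "punkt_tab", "stopwords"):
    nltk.download(pkg, quiet=True)

docs = ["Cats are running faster than dogs.",
        "Dogs run in the park every morning."]

# Preprocessing: lowercase, tokenize, drop stopwords, stem.
stop_words = set(stopwords.words("english"))
stemmer = PorterStemmer()
for doc in docs:
    tokens = [stemmer.stem(t) for t in word_tokenize(doc.lower())
              if t.isalpha() and t not in stop_words]
    print(tokens)

# Vectorization: TF-IDF over the raw documents.
vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(docs)
print(vectorizer.get_feature_names_out())
print(tfidf.toarray().round(2))
```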
MODULE 4: Deep Learning with Neural Networks
- Introduction to Deep Learning
- Understanding Neural Network Architecture
- Artificial Neural Networks (ANN)
- Forward Propagation, Backpropagation
- Cost Functions & Optimizers
- Recurrent Neural Networks (RNN) – Theory + Practical:
  - Long Short-Term Memory (LSTM)
  - Gated Recurrent Unit (GRU)
  - Bidirectional RNNs
- CNN Overview (for completeness)
- Hands-on Deep Learning with Keras/TensorFlow (see the sketch below)
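A hands-on sketch assuming TensorFlow/Keras is installed: a small LSTM classifier trained on invented sequence data, easily swapped for the other recurrent architectures listed above:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Invented toy data: 200 sequences of 10 timesteps with 1 feature each,
# labelled 1 when the values sum past a threshold.
X = np.random.rand(200, 10, 1).astype("float32")
y = (X.sum(axis=(1, 2)) > 5).astype("float32")

# A minimal recurrent network: one LSTM layer feeding a sigmoid output.
model = keras.Sequential([
    layers.Input(shape=(10, 1)),
    # Swap in layers.GRU(16) or layers.Bidirectional(layers.LSTM(16))
    # to try the other architectures covered in this module.
    layers.LSTM(16),
    layers.Dense(1, activation="sigmoid"),
])

# Cost function and optimizer, as discussed above.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
model.summary()
```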
MODULE 5: Attention Mechanism & Transformers
- Attention Mechanism Basics
- Encoder-Decoder Architecture
- Transformers Overview (BERT, GPT, T5)
- Positional Encoding
- Transfer Learning for NLP (fine-tuning BERT/GPT models)
- Pretrained Models with Hugging Face Transformers (see the sketch after this module)
- Text Summarization, Translation, and Q&A Applications
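A sketch of a pretrained Transformer in practice, assuming the transformers library is installed; the checkpoint named here is one common summarization model, not the only choice, and the first run downloads its weights:

```python
from transformers import pipeline

# Summarization with a pretrained encoder-decoder model.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

text = ("The attention mechanism lets a model weigh every input token "
        "against every other token, which is the core idea behind the "
        "Transformer architecture used by BERT, GPT, and T5.")
print(summarizer(text, max_length=30, min_length=10)[0]["summary_text"])
```

The same pipeline API covers translation and question answering by changing the task name and checkpoint.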
MODULE 6: Generative AI (Gen AI) & LLMs
- What is Generative AI?
- Large Language Models (LLMs) Introduction:
  - GPT, LLaMA, Claude, Mistral (brief overview)
- Components of LLMs:
  - Encoder, Decoder, Tokenizer, Attention Layers
- Model Evolution (GPT-1 → GPT-4 → GPT-4o)
- Tools & Platforms:
  - Ollama
  - Hugging Face
  - FAISS (Vector Store)
- RAG (Retrieval-Augmented Generation), sketched after this module
- Prompt Engineering
- Building Chatbots
- Building LLM Applications with LangChain
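A minimal retrieval step for RAG, assuming faiss-cpu and sentence-transformers are installed; the chunks are invented, all-MiniLM-L6-v2 is one common small embedding model, and the final LLM call is left as a printed prompt rather than tied to any one provider:

```python
import faiss
from sentence_transformers import SentenceTransformer

# Tiny document store (invented snippets standing in for real chunks).
chunks = [
    "FAISS is a library for efficient similarity search over dense vectors.",
    "RAG retrieves relevant chunks and feeds them to an LLM as context.",
    "LangChain provides abstractions for chaining LLM calls and tools.",
]

# Embed the chunks and index them with FAISS (exact L2 search).
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(chunks).astype("float32")
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

# Retrieval step: embed the query, fetch the closest chunk.
query = "How does retrieval-augmented generation work?"
q_vec = model.encode([query]).astype("float32")
_, ids = index.search(q_vec, 1)
context = chunks[ids[0][0]]

# Generation step (sketched): the prompt an LLM would receive.
prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```

In a LangChain version, the same pattern appears as a vector store plus a retriever chained to an LLM.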