40 Intermediate AI Tutorials

1. Deep Learning Basics

Deep learning uses multi-layered neural networks for complex tasks.

Example: Simple deep learning model.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
model = Sequential([Dense(10, activation='relu', input_dim=2), Dense(1)])
print("Deep learning model created.")

Deep learning model created.

Note: Deep learning excels in tasks like image and speech recognition.
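
Conceptually, each Dense layer above is a matrix multiply plus a bias, followed by an activation. A minimal NumPy sketch of the same forward pass, using random untrained weights purely for illustration:

```python
import numpy as np

# Hypothetical untrained weights mirroring Dense(10, relu) -> Dense(1).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 10)), np.zeros(10)   # hidden layer: 2 -> 10
W2, b2 = rng.normal(size=(10, 1)), np.zeros(1)    # output layer: 10 -> 1

x = np.array([[0.5, -0.2]])          # one sample with 2 features
hidden = np.maximum(0, x @ W1 + b1)  # ReLU activation
output = hidden @ W2 + b2            # linear output
print(output.shape)                  # (1, 1): one prediction per sample
```

Training then adjusts W1, b1, W2, b2 by gradient descent, which Keras handles for you.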

2. Convolutional Neural Networks

CNNs are used for image processing tasks.

Example: Basic CNN setup.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D
model = Sequential([Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1))])
print("CNN for image tasks.")

CNN for image tasks.

Note: CNNs extract spatial features from images.

3. Recurrent Neural Networks

RNNs handle sequential data like time series or text.

Example: Simple RNN layer.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import SimpleRNN
model = Sequential([SimpleRNN(10, input_shape=(None, 1))])
print("RNN for sequential data.")

RNN for sequential data.

Note: RNNs are ideal for time-dependent data.

4. Transfer Learning

Transfer learning reuses pre-trained models for new tasks.

Example: Using a pre-trained model.

from tensorflow.keras.applications import VGG16
base_model = VGG16(weights='imagenet', include_top=False)
print("Using VGG16 for transfer learning.")

Using VGG16 for transfer learning.

Note: Transfer learning saves training time.

5. Feature Selection

Feature selection identifies the most relevant variables.

Example: Selecting features with scikit-learn.

from sklearn.feature_selection import SelectKBest
selector = SelectKBest(k=2)
print("Selecting top 2 features.")

Selecting top 2 features.

Note: Reduces model complexity and overfitting.
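
A complete, self-contained version of the selection step, on a made-up dataset (the data and the `f_classif` scoring choice are illustrative assumptions):

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Toy data: 4 samples, 3 features; the middle feature carries no class signal.
X = np.array([[1.0, 5.0, 0.1],
              [2.0, 6.0, 0.0],
              [8.0, 5.0, 1.0],
              [9.0, 6.0, 0.9]])
y = np.array([0, 0, 1, 1])

selector = SelectKBest(score_func=f_classif, k=2)
X_new = selector.fit_transform(X, y)
print(X_new.shape)  # (4, 2): only the 2 highest-scoring features remain
```

`selector.get_support()` reveals which columns survived.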

6. Hyperparameter Tuning

Hyperparameter tuning optimizes model settings for better performance.

Example: Grid search for tuning.

from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC
param_grid = {'C': [0.1, 1, 10]}
grid = GridSearchCV(SVC(), param_grid)
print("Tuning SVM parameters.")

Tuning SVM parameters.

Note: Use GridSearchCV or RandomizedSearchCV.
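
Extending the snippet above with a tiny invented dataset shows the full search loop; `best_params_` holds the winning setting:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Two well-separated toy classes.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
              [5, 5], [5, 6], [6, 5], [6, 6]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

param_grid = {'C': [0.1, 1, 10]}
grid = GridSearchCV(SVC(), param_grid, cv=2)  # 2 folds to suit the tiny dataset
grid.fit(X, y)                                # fits one model per C per fold
print(grid.best_params_)
```

After fitting, `grid.best_estimator_` is a ready-to-use model trained with the best setting.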

7. Ensemble Methods

Ensemble methods combine multiple models for better predictions.

Example: Random Forest.

from sklearn.ensemble import RandomForestClassifier
model = RandomForestClassifier(n_estimators=100)
print("Random Forest ensemble model.")

Random Forest ensemble model.

Note: Boosting and bagging improve accuracy.
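
Fitting and predicting with the forest on a toy 1-D dataset (data invented for illustration; `random_state` pins the result):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Two clearly separated groups along a single feature.
X = np.array([[0], [1], [10], [11]])
y = np.array([0, 0, 1, 1])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
print(model.predict([[0.5], [10.5]]))  # [0 1]
```

Each of the 100 trees votes; the forest returns the majority class.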

8. Data Augmentation

Data augmentation increases dataset size with transformations.

Example: Image augmentation.

from tensorflow.keras.preprocessing.image import ImageDataGenerator
datagen = ImageDataGenerator(rotation_range=20)
print("Augmenting images with rotation.")

Augmenting images with rotation.

Note: Common for image and text data.

9. Regularization Techniques

Regularization prevents overfitting by penalizing complexity.

Example: L2 regularization.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.regularizers import l2
model = Sequential([Dense(10, kernel_regularizer=l2(0.01))])
print("L2 regularization applied.")

L2 regularization applied.

Note: Use L1 or L2 for simpler models.

10. Advanced Data Preprocessing

Advanced preprocessing handles complex data cleaning tasks.

Example: Encoding categorical data.

from sklearn.preprocessing import OneHotEncoder
encoder = OneHotEncoder(sparse_output=False)
encoded = encoder.fit_transform([['red'], ['blue']])  # categories are sorted: ['blue', 'red']
print(encoded)

[[0. 1.]
 [1. 0.]]

Note: Use encoding for categorical variables.

11. Cross-Validation

Cross-validation assesses model performance robustly.

Example: K-fold cross-validation.

from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression
model = LinearRegression()
X, y = [[1], [2], [3], [4]], [10, 20, 30, 40]  # perfectly linear toy data
scores = cross_val_score(model, X, y, cv=2)  # R^2 needs at least 2 test samples per fold
print(scores)

[1. 1.]

Note: Use 5 or 10 folds for reliable results.

12. NLP Tokenization

Tokenization splits text into meaningful units.

Example: Advanced tokenization.

import nltk
from nltk.tokenize import word_tokenize
nltk.download('punkt')  # one-time download of tokenizer data
tokens = word_tokenize("AI is transforming industries.")
print(tokens)

['AI', 'is', 'transforming', 'industries', '.']

Note: Tokenization is the first step in NLP.

13. Word Embeddings

Word embeddings represent words as vectors for NLP.

Example: Using pre-trained embeddings.

from gensim.models import Word2Vec
model = Word2Vec([['AI', 'is', 'great']], vector_size=10, window=5, min_count=1)
print(model.wv['AI'])

(Vector representation of 'AI')

Note: Use GloVe or Word2Vec for embeddings.

14. Text Classification

Text classification assigns labels to text data.

Example: Sentiment classifier.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
vectorizer = TfidfVectorizer()
model = LogisticRegression()
print("Text classifier setup.")

Text classifier setup.

Note: Use TF-IDF for text feature extraction.
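
Wiring the vectorizer and classifier together end to end, on a hypothetical four-document training set:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented mini corpus: 1 = positive, 0 = negative.
texts = ["great product", "really great service",
         "terrible experience", "really terrible product"]
labels = [1, 1, 0, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["great service", "really terrible"]))
```

A real classifier needs hundreds of labeled documents or more, but the mechanics are identical.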

15. Image Preprocessing

Image preprocessing prepares images for AI models.

Example: Resizing images.

from tensorflow.keras.preprocessing.image import img_to_array, load_img
img = load_img('image.jpg', target_size=(224, 224))
img_array = img_to_array(img)
print("Image resized to 224x224.")

Image resized to 224x224.

Note: Normalize pixel values for better results.

16. Object Detection Basics

Object detection identifies and locates objects in images.

Example: YOLO setup.

print("YOLO model for object detection.")

YOLO model for object detection.

Note: Use pre-trained models like YOLO or Faster R-CNN.
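
Beyond the placeholder above, the metric every detector relies on is intersection-over-union (IoU), which scores how well a predicted box overlaps a ground-truth box. A minimal helper, assuming (x1, y1, x2, y2) corner coordinates:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)   # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(round(iou((0, 0, 10, 10), (5, 5, 15, 15)), 3))  # 0.143
```

Predictions with IoU above a threshold (commonly 0.5) count as correct matches.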

17. Model Interpretability

Model interpretability explains AI model decisions.

Example: SHAP values.

import shap
from sklearn.ensemble import RandomForestClassifier
model = RandomForestClassifier().fit([[0], [1]], [0, 1])  # any fitted tree-based model
explainer = shap.TreeExplainer(model)
print("SHAP explains model predictions.")

SHAP explains model predictions.

Note: Use SHAP or LIME for interpretability.

18. Time Series Forecasting

Time series forecasting predicts future values based on past data.

Example: ARIMA model.

from statsmodels.tsa.arima.model import ARIMA
model = ARIMA([1, 2, 3, 4], order=(1, 0, 0))
print("ARIMA for time series.")

ARIMA for time series.

Note: Use ARIMA or LSTM for forecasting.
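
For intuition about the AR part of ARIMA: an AR(1) model regresses each value on its predecessor. A hand-rolled least-squares fit on a toy doubling series (statsmodels does this properly, adding differencing and MA terms on top):

```python
import numpy as np

series = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # toy series that doubles each step
x_prev, x_next = series[:-1], series[1:]

# Least-squares estimate of phi in x_t = phi * x_{t-1}
phi = (x_prev @ x_next) / (x_prev @ x_prev)
forecast = phi * series[-1]
print(phi, forecast)  # 2.0 32.0
```

The fitted coefficient recovers the doubling pattern, and the one-step forecast extends it.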

19. Autoencoders

Autoencoders compress and reconstruct data for learning.

Example: Simple autoencoder.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Encoder compresses inputs to 2 dimensions; decoder expands back to 10.
model = Sequential([Dense(10, activation='relu'), Dense(2), Dense(10)])
print("Autoencoder for data compression.")

Autoencoder for data compression.

Note: Used for denoising or dimensionality reduction.

20. GANs Introduction

GANs generate new data using a generator and discriminator.

Example: Basic GAN setup.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
generator = Sequential([Dense(128, activation='relu')])
print("GAN generator model.")

GAN generator model.

Note: GANs are used for image generation.

21. Advanced Model Evaluation

Advanced evaluation uses metrics like ROC-AUC for performance.

Example: ROC-AUC score.

from sklearn.metrics import roc_auc_score
score = roc_auc_score([1, 0], [0.9, 0.1])
print(score)

1.0

Note: Use ROC-AUC for classification tasks.

22. Handling Imbalanced Data

Imbalanced data techniques address skewed class distributions.

Example: Oversampling with SMOTE.

from imblearn.over_sampling import SMOTE
smote = SMOTE()
print("SMOTE balances dataset.")

SMOTE balances dataset.

Note: Use SMOTE or undersampling for balance.
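
The snippet above only constructs the sampler; `smote.fit_resample(X, y)` then returns a balanced dataset. If imblearn is unavailable, the simplest alternative is plain random oversampling of the minority class, sketched here with NumPy (SMOTE goes further by interpolating synthetic points between minority neighbors):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0], [1], [2], [3], [4], [10]])
y = np.array([0, 0, 0, 0, 0, 1])               # class 1 is the minority

minority = np.where(y == 1)[0]                 # indices of minority samples
n_extra = int((y == 0).sum() - len(minority))  # copies needed to balance
extra = rng.choice(minority, size=n_extra, replace=True)

X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print(np.bincount(y_bal))                      # [5 5]
```

Always resample only the training split, never the test set.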

23. Dimensionality Reduction

Dimensionality reduction simplifies data while retaining information.

Example: PCA.

from sklearn.decomposition import PCA
pca = PCA(n_components=2)
print("PCA reduces dimensions.")

PCA reduces dimensions.

Note: Use PCA or t-SNE for visualization.
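
Running the reduction end to end on synthetic data shows the shape change:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # 100 samples, 5 features (synthetic)

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)             # (100, 2)
```

`pca.explained_variance_ratio_` reports how much variance each kept component retains.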

24. Advanced Clustering

Advanced clustering groups data with complex algorithms.

Example: DBSCAN clustering.

from sklearn.cluster import DBSCAN
dbscan = DBSCAN(eps=0.5, min_samples=5)
print("DBSCAN for complex clusters.")

DBSCAN for complex clusters.

Note: DBSCAN handles non-spherical clusters.
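
Running DBSCAN on made-up points shows its distinctive behavior: points too far from any dense region get the noise label -1:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two tight groups plus one distant outlier.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1],
              [20.0, 20.0]])

dbscan = DBSCAN(eps=0.5, min_samples=3)
labels = dbscan.fit_predict(X)
print(labels)  # two clusters (0 and 1) plus -1 for the outlier
```

Unlike k-means, DBSCAN needs no cluster count up front; `eps` and `min_samples` control density instead.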

25. Model Deployment with Flask

Flask deploys AI models as web services.

Example: Flask app for model.

from flask import Flask
app = Flask(__name__)
@app.route('/')
def predict():
    return "Model prediction"
print("Flask app running.")

Flask app running.

Note: Use Flask for lightweight deployment.

26. AI Pipeline Automation

Pipeline automation streamlines AI workflows.

Example: Pipeline with scikit-learn.

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
pipeline = Pipeline([('scaler', StandardScaler()), ('svm', SVC())])
print("Automated AI pipeline.")

Automated AI pipeline.

Note: Pipelines ensure consistent preprocessing.

27. Attention Mechanisms

Attention mechanisms focus on relevant data in sequences.

Example: Attention layer.

from tensorflow.keras.layers import Attention
attention = Attention()
print("Attention layer for sequences.")

Attention layer for sequences.

Note: Used in NLP and vision tasks.
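
At its core, attention computes softmax(Q K^T / sqrt(d)) V: each query is compared against every key, and the resulting weights blend the values. A NumPy sketch with one query and two key/value pairs (toy numbers):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d)) V, the core attention step."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted sum of values

Q = np.array([[1.0, 0.0]])                 # one query
K = np.array([[1.0, 0.0], [0.0, 1.0]])     # two keys
V = np.array([[10.0, 0.0], [0.0, 10.0]])   # two values
out = scaled_dot_product_attention(Q, K, V)
print(out)  # leans toward the first value, since the query matches the first key
```

Keras's Attention layer wraps this computation (with trainable refinements) for you.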

28. Transformers Intro

Transformers use attention for advanced NLP tasks.

Example: Transformer model.

from transformers import BertModel
model = BertModel.from_pretrained('bert-base-uncased')
print("BERT transformer model.")

BERT transformer model.

Note: Use Hugging Face for transformer models.

29. Reinforcement Learning Basics

Reinforcement learning trains agents through rewards.

Example: Q-learning setup.

import numpy as np
q_table = np.zeros((10, 2))
print("Q-learning table initialized.")

Q-learning table initialized.

Note: Used in games and robotics.
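
One full Q-learning update on that table follows the rule Q(s,a) += alpha * (r + gamma * max Q(s', .) - Q(s,a)). A single hypothetical transition:

```python
import numpy as np

q_table = np.zeros((10, 2))      # 10 states, 2 actions
alpha, gamma = 0.1, 0.9          # learning rate and discount factor

# Hypothetical transition: in state 0, action 1 earned reward 1.0, leading to state 5.
state, action, reward, next_state = 0, 1, 1.0, 5
td_target = reward + gamma * q_table[next_state].max()
q_table[state, action] += alpha * (td_target - q_table[state, action])
print(q_table[0, 1])             # 0.1
```

Repeating this update over many episodes propagates reward information backward through the table.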

30. AI Model Optimization

Model optimization improves performance and efficiency.

Example: Model pruning.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
model = Sequential([Dense(10)])
print("Model before pruning.")

Model before pruning.

Note: Pruning reduces model size.

31. Handling Missing Data

Handling missing data ensures robust model training.

Example: Imputation with mean.

from sklearn.impute import SimpleImputer
imputer = SimpleImputer(strategy='mean')
data = imputer.fit_transform([[1, None], [2, 3]])
print(data)

[[1. 3.]
 [2. 3.]]

Note: Use mean, median, or KNN imputation.

32. Sentiment Analysis

Sentiment analysis classifies text as positive or negative.

Example: Simple sentiment classifier.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
vectorizer = TfidfVectorizer()
model = LogisticRegression()
print("Sentiment analysis setup.")

Sentiment analysis setup.

Note: Use labeled datasets for training.

33. Named Entity Recognition

NER identifies entities like names and places in text.

Example: NER with spaCy.

import spacy
nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is in California.")
print([(ent.text, ent.label_) for ent in doc.ents])

[('Apple', 'ORG'), ('California', 'GPE')]

Note: Use spaCy or NLTK for NER.

34. Advanced Visualization

Advanced visualization creates insightful AI data plots.

Example: Heatmap with Seaborn.

import seaborn as sns
import numpy as np
data = np.random.rand(10, 10)
sns.heatmap(data)
print("Heatmap visualization.")

Heatmap visualization.

Note: Use Seaborn for complex visualizations.

35. Model Debugging

Model debugging identifies and fixes errors in AI models.

Example: Checking model predictions.

from sklearn.linear_model import LinearRegression
model = LinearRegression()
model.fit([[1], [2]], [10, 20])
print(model.predict([[3]]))

[30.]

Note: Debug with validation data and metrics.

36. AI in the Cloud

Cloud platforms host and scale AI models.

Example: AWS SageMaker setup.

print("AWS SageMaker for model training.")

AWS SageMaker for model training.

Note: Use AWS, GCP, or Azure for cloud AI.

37. Fine-Tuning Models

Fine-tuning adapts pre-trained models to specific tasks.

Example: Fine-tuning BERT.

from transformers import BertForSequenceClassification
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
print("Fine-tuning BERT.")

Fine-tuning BERT.

Note: Fine-tune with small datasets for efficiency.

38. AI Security Basics

AI security protects models from attacks like adversarial examples.

Example: Adversarial robustness.

print("Add noise to inputs to test robustness.")

Add noise to inputs to test robustness.

Note: Use adversarial training for security.

39. Multi-Modal AI

Multi-modal AI combines text, images, and other data.

Example: Multi-modal input.

print("Combine text and image inputs for AI.")

Combine text and image inputs for AI.

Note: Used in advanced applications like CLIP.

40. Building an AI Project

Building an AI project involves planning, coding, and deployment.

Example: Simple project pipeline.

from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
pipeline = Pipeline([('scaler', StandardScaler()), ('clf', LogisticRegression())])
print("AI project pipeline.")

AI project pipeline.

Note: Plan data, model, and deployment steps.
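
Training and using the pipeline end to end, on an invented toy dataset:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X = np.array([[1.0], [2.0], [10.0], [11.0]])
y = np.array([0, 0, 1, 1])

pipeline = Pipeline([('scaler', StandardScaler()),
                     ('clf', LogisticRegression())])
pipeline.fit(X, y)                          # scaling and fitting in one call
print(pipeline.predict([[1.5], [10.5]]))    # [0 1]
```

Because scaling lives inside the pipeline, deployment needs only this one object.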
