Lectures
-
1. Introduction and KNN
- A computer program is said to learn from experience $E$ with respect to some class of tasks $T$ and performance measure $P$ if its performance at tasks in $T$, as measured by $P$, improves with experience $E$
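
Since this lecture introduces KNN, here is a minimal k-nearest-neighbors classifier sketch; the NumPy implementation, Euclidean distance, and majority vote are illustrative assumptions, not details taken from the notes.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Label each test point by majority vote among its k nearest
    training points under Euclidean distance."""
    preds = []
    for x in X_test:
        dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
        nearest = np.argsort(dists)[:k]               # indices of the k closest points
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])       # majority vote among neighbors
    return np.array(preds)
```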
-
1. Pixels and Linear Filters
1. Light comes out from the sources
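
Since the lecture covers linear filters, a minimal 2-D filtering (cross-correlation) sketch; the zero padding, odd kernel size, and box-filter example are illustrative assumptions.

```python
import numpy as np

def filter2d(image, kernel):
    """Apply a linear filter by cross-correlating `kernel` (odd-sized)
    with `image`, using zero padding so the output keeps the same size."""
    kh, kw = kernel.shape
    padded = np.pad(image, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="constant")
    out = np.zeros(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)  # weighted sum of the window
    return out

# Example: 3x3 box (mean) filter
box = np.ones((3, 3)) / 9.0
```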
-
2. Frequency
-
2. Perceptron
- A set of examples is `linearly separable` if there exists a linear decision boundary that can separate the points
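
A minimal sketch of the perceptron learning rule, which finds such a boundary whenever one exists; the ±1 labels, zero initialization, and epoch cap are illustrative assumptions.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Perceptron learning rule; labels y must be +1 / -1.
    Stops early once every example is classified correctly."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        updated = False
        for x_i, y_i in zip(X, y):
            if y_i * (np.dot(w, x_i) + b) <= 0:   # misclassified (or on the boundary)
                w += y_i * x_i                    # nudge the boundary toward x_i
                b += y_i
                updated = True
        if not updated:                            # no mistakes in a full pass: done
            break
    return w, b
```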
-
3. Template Matching
- $1 \rightarrow D$
-
4. Denoising and Compression
- $FFT$ can be used to efficiently implement $SSD$ (sum of squared differences)
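
A sketch of that idea: expanding $SSD = \sum I^2 - 2\sum I\,T + \sum T^2$ turns the per-window computation into two FFT-based convolutions. SciPy's `fftconvolve` and a 2-D grayscale image/patch are assumptions for illustration.

```python
import numpy as np
from scipy.signal import fftconvolve

def ssd_map(image, templ):
    """SSD between `templ` and every template-sized window of `image`,
    computed with FFT-based convolutions."""
    image = np.asarray(image, dtype=float)
    templ = np.asarray(templ, dtype=float)
    cross = fftconvolve(image, templ[::-1, ::-1], mode="valid")           # cross-correlation term
    win_sq = fftconvolve(image ** 2, np.ones_like(templ), mode="valid")   # per-window sum of I^2
    return win_sq - 2.0 * cross + np.sum(templ ** 2)
```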
-
5. Light
- We perceive color through the complex interaction of multiple factors
-
5. Naive Bayes
- **Dataset $\mathcal{D}$**: i.i.d. (independent and identically distributed) samples drawn from some unknown distribution
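
Under the usual conditional-independence assumption, the resulting classification rule takes the standard form (the notation here is illustrative):

$$
\hat{y} = \arg\max_{y} \; P(y) \prod_{j=1}^{d} P(x_j \mid y)
$$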
-
6. Linear Regression
-
6. Texture Synthesis
- Create new samples of a given texture
-
7. Graph Cut-Based Segmentation
- A good region is similar to the foreground color model and dissimilar from the background color model
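
One standard way to write that intuition as an energy minimized by a graph cut (the exact notation is illustrative, not taken from the notes):

$$
E(L) = \sum_{p} U_p(L_p) + \lambda \sum_{(p, q) \in \mathcal{N}} V_{pq}(L_p, L_q)
$$

where $U_p$ scores pixel $p$ under the foreground/background color models and $V_{pq}$ penalizes neighboring pixels that receive different labels.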
-
7. SVM
-
8. Empirical Risk Minimization (ERM)
-
8. Histogram Equalization
- Reassign pixel values so that the number of pixels with each value is more evenly distributed
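
A minimal sketch of that remapping for an 8-bit grayscale image; the NumPy lookup table built from the normalized cumulative histogram is the standard recipe, shown here as an illustration.

```python
import numpy as np

def equalize(image):
    """Histogram-equalize an 8-bit grayscale image: map each gray level
    through the normalized cumulative histogram so levels spread over [0, 255]."""
    hist = np.bincount(image.ravel(), minlength=256)     # counts per gray level
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())    # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)           # old gray level -> new gray level
    return lut[image]
```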
-
9. Gradient Descent
- Function $f: \R^d \rightarrow \R$ is `convex` if $\forall x_1, x_2 \in \R^d$ and $0 \le t \le 1$: $f(t x_1 + (1 - t) x_2) \le t f(x_1) + (1 - t) f(x_2)$
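
A minimal gradient-descent sketch to accompany the convexity definition; the fixed step size, step count, and the quadratic example are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad_f, x0, lr=0.1, steps=100):
    """Iterate x <- x - lr * grad_f(x). For a convex, suitably smooth f
    and a small enough step size, this converges toward a global minimum."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad_f(x)
    return x

# Example: minimize f(x) = ||x||^2, whose gradient is 2x
x_min = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
```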
-
9. Image Compositing
- Small segmentation errors are noticeable
-
10. Bias-Variance Tradeoff
- $\mathcal{D} = \{(x^{(i)}, y^{(i)})\}_{i = 1}^N$
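
For squared error with $y = f(x) + \varepsilon$ and $\operatorname{Var}(\varepsilon) = \sigma^2$, the expected error at a point $x$ decomposes in the standard way (expectations over the draw of $\mathcal{D}$ and the noise):

$$
\mathbb{E}\left[ \left( y - \hat{f}_{\mathcal{D}}(x) \right)^2 \right]
= \left( \mathbb{E}_{\mathcal{D}}[\hat{f}_{\mathcal{D}}(x)] - f(x) \right)^2
+ \mathbb{E}_{\mathcal{D}}\left[ \left( \hat{f}_{\mathcal{D}}(x) - \mathbb{E}_{\mathcal{D}}[\hat{f}_{\mathcal{D}}(x)] \right)^2 \right]
+ \sigma^2
$$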
-
10. Image Warping
-
11. Cross Validation and Model Selection
- Let's say we have the following two models:
-
11. Image Morphing
- Affine transformation
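
In homogeneous coordinates the affine case takes the standard six-parameter form (the notation is illustrative):

$$
\begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
=
\begin{bmatrix} a_{11} & a_{12} & t_x \\ a_{21} & a_{22} & t_y \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
$$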
-
12. Pinhole Camera
- Angles and lengths are lost
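
The perspective projection behind that loss, for focal length $f$: the division by depth $Z$ is what prevents lengths and angles from being preserved (standard pinhole equations):

$$
x = f \, \frac{X}{Z}, \qquad y = f \, \frac{Y}{Z}
$$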
-
13. Single-view Metrology and Cameras
-
14. Color Spaces
**RGB** stands for **Red–Green–Blue**, an **additive color model** used in displays, cameras, and digital imaging.