🔦 Updated PatchMatch and Lightfield pages
amilworks committed Apr 1, 2024
1 parent a598669 commit b1a079f
Showing 11 changed files with 68 additions and 107 deletions.
5 changes: 3 additions & 2 deletions ece-178-notes/docs/Notes/light-fields.mdx
@@ -126,7 +126,7 @@ Here, the terms are defined as follows:

This transform streamlines the process of generating digitally refocused images by aggregating the contributions of light rays across multiple focal planes. By adjusting the depth parameter $\alpha_i$ and weighting by the aperture function $A_i(u, v)$, this method simulates how a conventional camera's focus mechanism selectively aggregates light rays to form an image focused at a particular depth.
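
As a concrete illustration (not from the original notes), here is a minimal shift-and-add sketch of this idea in Python for a single depth parameter `alpha`; the notes' multi-plane sum would repeat this for each $\alpha_i$ with weights $A_i(u, v)$. The 4-D array layout `L[u, v, s, t]` of grayscale sub-aperture images, the uniform default aperture, and the omitted $1/\alpha^2$ scale factor are all simplifying assumptions for the example.

```python
import numpy as np
from scipy.ndimage import shift

def refocus(light_field, alpha, aperture=None):
    """
    Digitally refocus a 4-D light field L[u, v, s, t] at depth parameter alpha
    by shifting each sub-aperture image toward the chosen focal plane and
    averaging (a discrete shift-and-add version of synthetic photography).
    """
    n_u, n_v, height, width = light_field.shape
    if aperture is None:
        aperture = np.ones((n_u, n_v))  # uniform synthetic aperture A(u, v)

    image = np.zeros((height, width))
    for u in range(n_u):
        for v in range(n_v):
            # Offset of this view relative to the aperture center, scaled by
            # (1 - 1/alpha) so that alpha = 1 reproduces the captured focus.
            du = (u - n_u // 2) * (1.0 - 1.0 / alpha)
            dv = (v - n_v // 2) * (1.0 - 1.0 / alpha)
            image += aperture[u, v] * shift(light_field[u, v], (du, dv), order=1)

    return image / aperture.sum()
```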

### Mathematical Rigor and Implications
### Math Stuff

- **Summation over Discrete Focal Planes**: The discrete summation across $N$ focal planes approximates the integration over a continuous depth range in the synthetic photography equation. This approximation is central to the practical application of digital refocusing, as it permits the computation of refocused images from a finite set of captured light field data.
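
  One common way to write this discrete approximation (my notation, loosely following Ng's refocusing formula; the symbols in the surrounding notes may differ slightly) is

  $$
  E(s, t) \approx \sum_{i=1}^{N} \sum_{(u, v)} A_i(u, v)\, L\!\left(u,\ v,\ u + \frac{s - u}{\alpha_i},\ v + \frac{t - v}{\alpha_i}\right),
  $$

  where the inner sum over aperture samples $(u, v)$ replaces the continuous aperture integral and the outer sum aggregates the contributions of the $N$ focal planes.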

@@ -154,8 +154,9 @@ The synthetic photography equation is formulated based on the geometric optics a

- **For Perspective Shifts**: By altering the part of the light field that is integrated, you can simulate viewing the scene from a slightly different angle than where the camera was physically located, mimicking a shift in the observer's position.
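
A hedged sketch of the same idea in code, reusing the `light_field` array layout assumed in the refocusing sketch above (the function name and averaging window are illustrative, not from the notes):

```python
def view_from(light_field, u0, v0, radius=0):
    """
    Approximate a perspective shift by integrating only the rays that pass
    near one aperture position (u0, v0). With radius=0 this is simply the
    corresponding sub-aperture image of L[u, v, s, t]; a small radius
    averages a narrow bundle of views around that position.
    """
    u_slice = slice(max(u0 - radius, 0), u0 + radius + 1)
    v_slice = slice(max(v0 - radius, 0), v0 + radius + 1)
    return light_field[u_slice, v_slice].mean(axis=(0, 1))
```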

I hope this all makes sense. I am writing all of this at 4AM and I am not sure if I am making any sense. I will come back to this later and make it more clear.
I hope this all makes sense. <span class="highlight">I am writing all of this at 4AM and I am not sure if I am making any sense. I will come back to this later and make it more clear.</span>

Here is a sentence with a <span class="highlight">highlighted phrase</span> in it.



112 changes: 9 additions & 103 deletions ece-178-notes/docs/Notes/patch-match.mdx
@@ -98,103 +98,7 @@ function RandomSearch(NNF, x, y, imageA, imageB, patchSize)
searchRadius = searchRadius / 2
```
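
Below is a rough Python sketch of this random-search step, not the notes' reference implementation. `patch_distance_at` is a hypothetical helper assumed to return the sum of squared differences between the k x k patches centered at the given target and source coordinates, and the NNF is assumed to store (row, col) source coordinates per target pixel.

```python
import numpy as np

def random_search(nnf, x, y, target, source, k, radius=None, alpha=0.5, rng=np.random):
    """
    PatchMatch random-search step for one target pixel (x, y): sample candidate
    offsets around the current best match at exponentially shrinking radii and
    keep any candidate that lowers the patch distance.
    """
    source_h, source_w = source.shape[:2]
    if radius is None:
        radius = max(source_h, source_w)

    best_y, best_x = nnf[y, x]
    # patch_distance_at(...) is assumed to exist (see the note above).
    best_dist = patch_distance_at(target, source, x, y, best_x, best_y, k)

    while radius >= 1:
        # Sample a candidate uniformly in a window of the current radius,
        # clamped to valid source coordinates.
        cand_x = int(np.clip(best_x + rng.uniform(-radius, radius), 0, source_w - 1))
        cand_y = int(np.clip(best_y + rng.uniform(-radius, radius), 0, source_h - 1))
        dist = patch_distance_at(target, source, x, y, cand_x, cand_y, k)
        if dist < best_dist:
            best_dist, best_x, best_y = dist, cand_x, cand_y
        radius *= alpha  # halve the window each iteration, as in the pseudocode

    nnf[y, x] = (best_y, best_x)
```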

### Brute-Force Implementation in Python

:::warning Under construction

Please do not use this yet; the implementation is not finished, and I will update it as soon as it is done.

:::

```python
import numpy as np
from skimage import io

def patch_distance(patch1, patch2, mask):
    """
    Compute the squared L2 distance between two patches, using a mask to
    exclude invalid pixels.

    Args:
    - patch1: First patch, shape (k, k, channels).
    - patch2: Second patch, shape (k, k, channels).
    - mask: A boolean (k, k) mask indicating valid pixels (True = valid).

    Returns:
    - The squared L2 distance over the valid pixels of the two patches.
    """
    diff = patch1 - patch2
    valid_diff = diff[mask]  # selects the valid pixels across all channels
    distance = np.sum(valid_diff ** 2)
    return distance

def extract_patch(image, x, y, k, padding='reflect'):
    """
    Extract the k x k patch centered at (x, y) from the image, padding the
    borders so that patches near the image edges are well defined.

    Args:
    - image: The source image, shape (H, W, channels).
    - x, y: Center coordinates (column, row) of the patch.
    - k: Size of the patch (assumed to be odd).
    - padding: np.pad mode used at the boundaries (e.g. 'reflect', 'edge').

    Returns:
    - The extracted patch.
    - A mask indicating valid pixels within the patch.
    """
    half = k // 2
    padded_image = np.pad(image, ((half, half), (half, half), (0, 0)), mode=padding)
    # In the padded image, the patch centered at (x, y) starts at row y, column x.
    patch = padded_image[y:y + k, x:x + k]
    # All pixels are valid because the patch is taken from the padded image.
    mask = np.ones((k, k), dtype=bool)
    return patch, mask

def bruteForceNNF(target_image, source_image, k):
    """
    Compute the nearest neighbor field (NNF) from target_image to source_image
    using a brute-force search over every source position.

    Args:
    - target_image: The target image, shape (H, W, channels).
    - source_image: The source image, shape (H, W, channels).
    - k: The patch size (odd).

    Returns:
    - NNF: An (H, W, 2) array containing, for each target pixel, the (row, col)
      coordinates of the best-matching patch center in the source image.
    """
    target_h, target_w, _ = target_image.shape
    source_h, source_w, _ = source_image.shape
    NNF = np.zeros((target_h, target_w, 2), dtype=np.int32)

    for y in range(target_h):
        for x in range(target_w):
            min_distance = np.inf
            min_coords = (0, 0)
            target_patch, target_mask = extract_patch(target_image, x, y, k)

            # Exhaustively compare against every patch center in the source image.
            for sy in range(source_h):
                for sx in range(source_w):
                    source_patch, _ = extract_patch(source_image, sx, sy, k)
                    distance = patch_distance(target_patch, source_patch, target_mask)

                    if distance < min_distance:
                        min_distance = distance
                        min_coords = (sy, sx)

            NNF[y, x] = min_coords

    return NNF

# Example usage
target_image = io.imread('target.png') / 255.0
source_image = io.imread('source.png') / 255.0
k = 11  # Patch size

NNF = bruteForceNNF(target_image, source_image, k)
print("NNF computed.")
```

### Thought Process and Optimization Opportunities

@@ -210,7 +114,6 @@

---

The implementation above is very rough and far too slow, and I still need to improve it, but hopefully it gives you some insight into how the brute-force approach works.
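
This is not in the original notes, but as one concrete example of how the inner search could be sped up, the double loop over source positions can be replaced by a single broadcasted NumPy expression, at the cost of one large temporary array; this is the "vectorization" idea that comes up in the problems below:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def best_match_vectorized(target_patch, source_image, k):
    """
    Score every k x k source patch against one (k, k, C) target patch in a
    single broadcasted expression and return the best top-left position.
    """
    # View of every k x k patch: shape (H-k+1, W-k+1, C, k, k), no copying.
    patches = sliding_window_view(source_image, (k, k), axis=(0, 1))
    # Reorder the target patch from (k, k, C) to (C, k, k) to match the view.
    tp = np.moveaxis(target_patch, -1, 0)
    # Squared L2 distance of every candidate (allocates one big temporary).
    distances = ((patches - tp) ** 2).sum(axis=(2, 3, 4))
    best = np.unravel_index(np.argmin(distances), distances.shape)
    return best, distances[best]
```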
{/*
```python {5,28,46,79}
import numpy as np
@@ -356,12 +259,8 @@ What is the time complexity of the brute-force implementation of the PatchMatch

### Problem 2

Which of the following is a potential optimization strategy for the PatchMatch algorithm?
Explain how the PatchMatch algorithm works. What are the key differences between the Brute Force approach and the PatchMatch algorithm?

- [ ] Vectorization
- [ ] Dynamic Programming
- [ ] Parallel Processing
- [ ] All of the above

### Problem 3

@@ -382,9 +281,16 @@ In the context of the Patch Match algorithm, "coherence" refers to a specific pr
- [ ] Coherence is the process by which the algorithm removes all noise from an image before computing the nearest-neighbor field (NNF), ensuring that only coherent (noise-free) patches are analyzed.
- [ ] The term describes the algorithm's ability to maintain temporal consistency in video sequences, ensuring that patch matches do not flicker or change abruptly from one frame to the next.

### Problem 5

Which of the following is a potential optimization strategy for the PatchMatch algorithm?

- [ ] Vectorization
- [ ] Dynamic Programming
- [ ] Parallel Processing
- [ ] All of the above


