[Teaser image]

Project 2: Hybrid Images

Due

February 14, 10am. Please turn in your project via this form.

Overview

The goal of this part of the assignment is to create hybrid images using the approach described in the SIGGRAPH 2006 paper by Oliva, Torralba, and Schyns. Hybrid images are static images that change in interpretation as a function of viewing distance. The basic idea is that high-frequency content tends to dominate perception when it is available, but at a distance only the low-frequency (smooth) part of the signal can be seen. By blending the high-frequency portion of one image with the low-frequency portion of another, you get a hybrid image that leads to different interpretations at different distances.

Details

Here, we have included two sample images (of President Mark Wright and a bear). I've mostly aligned these images so that their eyes are in the same place. The alignment of the images is very important for the final results.

  1. First, you'll need to get a few pairs of images that you want to make into hybrid images. You can use the sample images for debugging, but you should use your own images in your results. Then, you will need to write code to low-pass filter one image, high-pass filter the second image, and add (or average) the two images. For a low-pass filter, Oliva et al. suggest using a standard 2D Gaussian filter. For a high-pass filter, they suggest using the impulse filter minus the Gaussian filter (which can be computed by subtracting the Gaussian-filtered image from the original). The cutoff frequency of each filter should be chosen with some experimentation; in this case, that means you need to experiment with how much to blur each image (see the filtering sketch after this list).
  2. For your favorite result, you should also illustrate the process through frequency analysis. Show the log magnitude of the Fourier transform of the two input images, the filtered images, and the hybrid image. In MATLAB, you can compute and display the 2D Fourier transform with imagesc(log(abs(fftshift(fft2(gray_image))))), and in Python with plt.imshow(np.log(np.abs(np.fft.fftshift(np.fft.fft2(gray_image))))) (a self-contained version of this snippet appears after this list).
  3. Try creating a variety of types of hybrid images (change of expression, morph between different objects, change over time, etc.). The site has several examples that may inspire you.
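
As a concrete starting point for step 1, here is a minimal sketch in Python. It assumes grayscale float images in [0, 1] and uses scipy.ndimage.gaussian_filter rather than any filtering code provided with the course; the function name, filenames, and sigma values below are placeholders for you to replace and tune.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hybrid_image(im1, im2, sigma_low, sigma_high):
        """Low-pass im1, high-pass im2, and combine them into a hybrid image."""
        low = gaussian_filter(im1, sigma_low)           # Gaussian low-pass of im1
        high = im2 - gaussian_filter(im2, sigma_high)   # impulse minus Gaussian (high-pass) of im2
        return np.clip(low + high, 0.0, 1.0)            # add (or average) the two; clip for display

    # Hypothetical usage; the image files and sigma values are assumptions to tune by eye.
    # import matplotlib.pyplot as plt
    # im1 = plt.imread('bear.jpg').mean(axis=2) / 255.0
    # im2 = plt.imread('wright.jpg').mean(axis=2) / 255.0
    # hybrid = hybrid_image(im1, im2, sigma_low=8, sigma_high=4)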
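
And for step 2, a self-contained version of the Python one-liner above. The small constant added inside the log is an assumption to avoid log(0) at frequencies with zero magnitude.

    import numpy as np
    import matplotlib.pyplot as plt

    def show_log_spectrum(gray_image, title=''):
        """Display the log magnitude of the centered 2D Fourier transform."""
        spectrum = np.log(np.abs(np.fft.fftshift(np.fft.fft2(gray_image))) + 1e-8)
        plt.imshow(spectrum, cmap='gray')
        plt.title(title)
        plt.axis('off')
        plt.show()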

Evaluation

The first 60 points (out of 100) are for demonstrating that you can correctly create hybrid images. 20 points are for the quality of the written description, 10 points are for the aesthetic quality of the best results (for example, if you take two random images and put them together you will get a hybrid image, but it will look terrible and not be compelling; make them compelling!), and 10 points are for bells and whistles. Groups need to do |group size| - 1 bells and whistles, and you are welcome to try variations other than those listed below.

Bells & Whistles (Extra Points)

Try using color to enhance the effect. Can you make one image grayscale and the other color? Does it work better to use color for the high-frequency component, the low-frequency component, or both? (5 pts)

Can you make a hybrid video that looks compelling, so that up close you see one video (for example, of an object moving) and from far away you see a different video scene?

Can you write code to automatically generate a movie (like those on this page) that highlights the change in perception as you zoom in?

Can you make a three-part hybrid image that shows three distinct images from different distances?