Developer Reference for Intel® Integrated Performance Primitives

Histogram of Oriented Gradients (HOG) Descriptor

Histogram of oriented gradients (HOG) is a feature descriptor used to detect objects in computer vision and image processing. The HOG technique counts occurrences of gradient orientations in localized portions of an image, such as a detection window or region of interest (ROI).

The HOG descriptor algorithm is implemented as follows:

  1. Divide the image into small connected regions called cells, and for each cell compute a histogram of gradient directions or edge orientations for the pixels within the cell.
  2. Discretize the gradient orientations within each cell into angular bins.
  3. Each pixel of a cell contributes a vote, weighted by its gradient magnitude, to the corresponding angular bin.
  4. Consider groups of adjacent cells as spatial regions called blocks. The grouping of cells into a block is the basis for grouping and normalization of histograms.
  5. The normalized group of histograms represents the block histogram. The set of these block histograms represents the descriptor.

The following figure demonstrates the algorithm implementation scheme:

[Figure: HOG algorithm implementation scheme]

Computation of the HOG descriptor requires the following basic configuration parameters:

  - detection window size
  - block size
  - block stride (step between adjacent blocks)
  - cell size
  - number of angular bins

According to [Dalal05], the recommended values for the HOG parameters are:

  - detection window: 64×128 pixels
  - cell: 8×8 pixels
  - block: 16×16 pixels (2×2 cells)
  - block stride: 8×8 pixels
  - number of bins: 9, covering unsigned gradient orientations from 0° to 180°

The Intel® IPP implementation does not assume any fixed default set of parameter values. The IppiHOGConfig structure defines the HOG parameters used in Intel IPP functions.

There are some limitations to the values of basic configuration parameters:

#define IPP_HOG_MAX_CELL   (16)  /* max size of cell */
#define IPP_HOG_MAX_BLOCK  (64)  /* max size of block */
#define IPP_HOG_MAX_BINS   (16)  /* max number of bins */

See Also