This application shows an example of the use of Multisensory (i.e., a buffer of metadata synchronized with the media stream) in Kurento. The Media Pipeline implemented in this application consists of a WebRtcEndpoint connected first to a KmsDetectFaces Media Element, which detects faces in the media stream and publishes the timing and position of each detected face to the metadata buffer. This information is then read by another Media Element, KmsShowFaces, which draws a square at the time and position published in the metadata buffer. To run this demo, follow these steps:
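The core idea of the pipeline above, a detector stage that publishes timestamped metadata into a shared buffer, and a consumer stage that reads it back in sync with the frames, can be sketched in plain Python. This is a minimal conceptual illustration, not the actual Kurento API: the `FaceMeta`, `detect_faces`, and `show_faces` names are hypothetical stand-ins for KmsDetectFaces and KmsShowFaces.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class FaceMeta:
    """Hypothetical metadata entry: timestamp plus a face bounding box."""
    pts: float   # presentation timestamp of the frame the face was found in
    x: int
    y: int
    w: int
    h: int

def detect_faces(frames, buffer: Queue):
    """Stand-in for KmsDetectFaces: publish one metadata entry per frame.

    A real detector would run face detection on the frame; here we pretend
    a single face was found at a fixed position, to show the data flow.
    """
    for pts, _frame in frames:
        buffer.put(FaceMeta(pts=pts, x=10, y=20, w=40, h=40))

def show_faces(frames, buffer: Queue):
    """Stand-in for KmsShowFaces: read metadata back in step with the frames.

    Because the buffer is synchronized with the stream, each entry's
    timestamp matches the frame being rendered, so the square is drawn
    at the right time and position.
    """
    drawn = []
    for pts, _frame in frames:
        meta = buffer.get()
        assert meta.pts == pts  # metadata stays aligned with the stream
        drawn.append((pts, (meta.x, meta.y, meta.w, meta.h)))
    return drawn

if __name__ == "__main__":
    # Two dummy frames at 25 fps (0.04 s apart)
    frames = [(0.00, "frame0"), (0.04, "frame1")]
    buffer = Queue()
    detect_faces(frames, buffer)
    for entry in show_faces(frames, buffer):
        print(entry)
```

In the real application, this producer/consumer relationship runs inside the Media Server rather than in client code: KmsDetectFaces writes to the metadata buffer as media flows through it, and KmsShowFaces, connected downstream, reads it to overlay the squares.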