In the future, it may be possible to try on clothes even when the shop is closed, thanks to
interactive systems built around semi-transparent mirrors, presented at ACM UIST 2014, a human-computer interaction conference.

The work, led by Professor Sriram Subramanian, Dr Diego Martinez Plasencia and Florent Berthaut from the University of Bristol’s Department of Computer Science, builds on a mirror’s ability to map a reflection to one unique point behind the mirror, independent of the observer’s location.
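This property follows from elementary geometry: the virtual image of a point is its reflection across the mirror plane, which depends only on the point and the plane, not on where the viewer stands. A minimal sketch (not the authors' implementation; the function name and coordinates are illustrative):

```python
import numpy as np

def reflect_point(p, plane_point, plane_normal):
    """Reflect point p across the plane through plane_point with the given normal."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                      # ensure a unit normal
    d = np.dot(np.asarray(p, dtype=float) - plane_point, n)
    return p - 2.0 * d * n                         # mirror image of p

# Mirror lying in the z = 0 plane; a fingertip 0.3 m in front of it.
fingertip = np.array([0.1, 0.2, 0.3])
virtual = reflect_point(fingertip, np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(virtual)  # [ 0.1  0.2 -0.3] -- the same point behind the glass for every observer
```

Because no observer position appears in the computation, every visitor sees the fingertip's reflection at the same spot behind the glass, which is what makes shared multi-user interaction possible.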

People standing in front of a semi-transparent mirror see the reflection of their fingers overlapping the exact same point behind the glass. If this glass forms the front of a museum cabinet, every visitor would see which exhibits their reflection is touching, and pop-up windows could show additional information about the pieces being touched. Visitors could also interact with exhibits by focusing their eyes on them. By pointing directly at an exhibit with their reflection, instead of pointing at it through the glass, people could easily discuss its features with other visitors.


Credit: University of Bristol

Combining this approach with different display technologies offers interesting possibilities for interaction systems. By placing a projector on top of the cabinet, fingertips could work as little lamps to illuminate and explore dark and sensitive objects. When a hand’s reflection cuts through an object, projections on visitors’ hands could be used to reveal the inside of the object, visible to any user.
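One way to realise this, sketched here under assumptions not spelled out in the article (a tracked hand position, an exhibit modelled as an axis-aligned box, and a mirror in the z = 0 plane), is to reflect the tracked hand across the mirror plane and test whether the virtual point falls inside the exhibit's volume:

```python
import numpy as np

def reflect_z0(p):
    """Virtual image of point p for a mirror lying in the z = 0 plane."""
    return np.array([p[0], p[1], -p[2]])

def inside_aabb(p, lo, hi):
    """True if point p lies inside the axis-aligned box spanned by lo and hi."""
    return bool(np.all(p >= lo) and np.all(p <= hi))

# Hypothetical exhibit volume behind the glass (z < 0), in metres.
exhibit_lo = np.array([-0.1, -0.1, -0.4])
exhibit_hi = np.array([ 0.1,  0.1, -0.2])

hand = np.array([0.0, 0.0, 0.3])  # tracked hand in front of the mirror
if inside_aabb(reflect_z0(hand), exhibit_lo, exhibit_hi):
    # At this point the system would project a cut-away view onto the hand.
    print("hand reflection intersects the exhibit")
```

The paper's actual sensing pipeline is not described here; this only illustrates the geometric test that decides when the projected "reveal" should appear.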

The researchers have also demonstrated artistic installations that combine this approach with volumetric displays. Musicians would record loops on their digital mixers, and these loops would appear to float above the mixer. Musicians could then grab these representations to play them or tweak them with different musical effects.


Reflective optical combiners like beam splitters and two-way mirrors are used in AR to overlap digital contents on the users’ hands or bodies. Augmentations are usually unidirectional, either reflecting virtual contents on the user’s body (Situated Augmented Reality) or augmenting the user’s reflection with digital contents (AR mirrors). But many other novel possibilities remain unexplored. For example, users’ hands, reflected inside a museum AR cabinet, can allow visitors to interact with the artifacts exhibited. Projecting on the user’s hands as their reflection cuts through the objects can be used to reveal objects’ internals. Augmentations from both sides are blended by the combiner, so they are consistently seen by any number of users, independently of their location or even the side of the combiner through which they are looking. This paper explores the potential of optical combiners to merge the space in front of and behind them. We present this design space, identify novel augmentation and interaction opportunities, and explore the design space using three prototypes.

Martinez, a researcher in human-computer interaction in the Bristol Interaction and Graphics (BIG) group, said: “This work offers exciting interactive possibilities that could be used in many situations. Semi-transparent surfaces are everywhere around us, in every bank and shop window. One example is when people can’t access a shop because it’s closed. However, their reflection would be visible inside the shop window, and that would enable them to try clothes on using their reflection, pay for the item using a debit/credit card and then have it delivered to their home.

“The possibility to blend together the spaces in front and behind the semi-transparent mirror could mean a whole new type of interactive experience. While projectors can only augment the surface of objects, combining them with reflections allows people to reveal what’s inside the object or even purely virtual objects floating around them.”

Paper: ‘Through the combining glass’, Diego Martinez Plasencia, Florent Berthaut, Abhijit Karnik and Sriram Subramanian, Proceedings of ACM UIST 2014, the 27th ACM User Interface Software and Technology Symposium.