The Holoflex is a prototype Android smartphone running Android 5.1 Lollipop on a 1.5 GHz Qualcomm Snapdragon 810 CPU with 2 GB of RAM and a dedicated GPU.
It features a bendable 1920 x 1080 FOLED screen capable of projecting glasses-free 3D images to multiple users at the same time.
The Holoflex pulls off that trick thanks to a 3D-printed fisheye lens filter mounted on the display. The filter is made up of an array of over 16,000 individual fisheye lenses, each 12 pixels across, and each lens in the array effectively acts as a single pixel. That turns the screen into a 160 x 104 resolution light-field display which lets users inspect a 3D object from any angle simply by rotating the phone.
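The arithmetic behind that resolution trade-off works out roughly like this (a sketch using only the figures reported for the Holoflex; the variable names are my own):

```python
# Native resolution of the bendable FOLED panel.
PANEL_W, PANEL_H = 1920, 1080

# Effective resolution after the fisheye-lens filter is applied.
EFFECTIVE_W, EFFECTIVE_H = 160, 104

# Each lens acts as one "pixel" of the light-field display,
# so the lens count is just the effective resolution.
lens_count = EFFECTIVE_W * EFFECTIVE_H
print(lens_count)  # 16640 -- the "over 16,000" figure

# Horizontally, each lens covers a patch of native pixels:
print(PANEL_W // EFFECTIVE_W)  # 12 -- the "12 pixels across" figure
```

In other words, the display trades almost all of its spatial resolution for angular resolution: the 12-pixel patch under each lens is what lets different viewing angles see different images.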
You can see the screen in action in this video:
Does anyone have a clue why this would be useful? I can’t see a use, myself.
3D modeling on a computer screen is an incredibly useful tool for visualization and design. Once the bugs are worked out of VR headsets like the Oculus Rift and smartphone-based units like Google Cardboard, it's going to be useful there as well.
But this strikes me more as a gimmick than a useful tool. The resolution is too low, and the interface too quirky, for the Holoflex to be all that useful.
To put it simply, the Holoflex is hampered by the Human Media Lab's ongoing obsession with flexing a device as an input method. We've seen it in their previous prototypes; sometimes it works, but in the case of the Holoflex it's just a pointless gimmick.
An interface based on a three-finger manipulation of the image would be more useful, practical, and functional, wouldn't you agree?