Drael646464
New member
How about table projection and a Kinect-type sensor to know what you're pushing on the table! Right down to on-table typing!
I'm just blabbing
That's how this tech works. The Sony table device I linked a picture of above uses infrared to detect what you're touching. So it projects the screen, you tap what you want, and you type on the virtual keyboard.
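Just to illustrate the idea (this is a rough sketch, not how Sony's device actually does it, and all the names and numbers are made up), the basic loop is: the IR sensor sees where your fingertip is, the software maps that position onto the projected screen, and once the finger is close enough to the table it counts as a tap.

```python
# Illustrative sketch only: turning an IR-detected fingertip into a touch
# event on a projected screen. All function names and values are hypothetical.

def ir_point_to_screen(ir_x, ir_y, ir_width, ir_height, screen_width, screen_height):
    """Map a fingertip position from IR-sensor coordinates to projected-screen
    pixels (assumes the projection fills the sensor's field of view, which a
    real device would calibrate for)."""
    return (ir_x / ir_width * screen_width,
            ir_y / ir_height * screen_height)

def is_tap(fingertip_height_mm, threshold_mm=5):
    """Treat the finger as 'touching' once the IR depth reading says it is
    within a few millimetres of the table surface."""
    return fingertip_height_mm <= threshold_mm

# Example: a fingertip seen at (320, 240) by a 640x480 IR sensor, 3 mm above
# the table, becomes a tap in the middle of a 1280x720 projected screen.
x, y = ir_point_to_screen(320, 240, 640, 480, 1280, 720)
if is_tap(3):
    print(f"tap at ({x:.0f}, {y:.0f})")  # -> tap at (640, 360)
```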
I've seen the same tech used for a prototype wrist device, and also for a commonly sold infrared portable keyboard. Apparently it's quite good for typing once you get used to it.
I imagine the hinge on the phone gives it the necessary height.
Here's the Sony prototype device in action:
https://www.youtube.com/watch?time_continue=21&v=POlKHK7JQoM
Here's the infrared keyboard:
https://www.youtube.com/watch?v=GMPAgAegFzc
I think the key will be improving the projection quality and the infrared motion detection over existing technologies. To make it good, they'll really need to hone it.
I bet they wait until late 2017 or 2018. Or they could time it by gauging how far UWP apps have grown by then, since C-Shell uses UWP. But if Win32 still gets integrated and emulates well in the C-Shell UI, then I say they should release one before 2018.
The projector could make Windows on ARM's Win32 emulation much more useful to the everyday person on a phone, since you could actually run your, IDK, iTunes or whatever on the go without needing a dock.