Teachable moment - that short window of time in which someone wonders about something and, if presented with the answer, will readily and painlessly absorb it and retain it.

Teachable Moments(tm): software, or a physical or virtual device, mounted in, built into, or associated with something one has with one pretty much all the time - i.e., most likely one's cellphone.

Google was a first approximation - good for finding answers about anything, but not as easily or specifically as needed. TM needs the next level: take in the full context of what the person is doing, where they are, and what they've been asking previously; apply some AI; parse a natural-language question, spoken or tersely typed; verify that you have the right question, and correct it if necessary (but minimize that); then provide the specific answer AND volunteer a little more context AND provide links to related material. Hopefully the answer will always be "almost satisfying" - in the sense that it provides the literal answer desired, but tempts the person to follow links, ask more questions, etc. The idea is to extend the teachable moment as long as the person will tolerate it, rather than kill their curiosity with the precise answer.
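The pipeline above - context in, question interpreted and confirmed, literal answer plus volunteered extras and links out - could be sketched roughly as follows. This is a minimal illustrative skeleton; every name and structure here is an assumption, not a real system.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the "almost satisfying" answer pipeline:
# the literal answer comes first, then a little extra context and
# follow-up links meant to extend the teachable moment.

@dataclass
class Context:
    location: str                 # where the person is
    activity: str                 # what they're doing
    recent_questions: list        # what they've asked before

@dataclass
class Answer:
    literal: str                  # the precise answer asked for
    extra_context: str            # a little more, volunteered
    links: list = field(default_factory=list)  # temptations to explore

def interpret(question: str, ctx: Context) -> str:
    """Parse a terse spoken/typed question against the user's context.
    A real system would apply NLP plus the context; here we normalize."""
    return question.strip().rstrip("?").lower()

def answer(question: str, ctx: Context) -> Answer:
    topic = interpret(question, ctx)
    # Stub: a real back end would look the topic up; we fabricate strings.
    return Answer(
        literal=f"Answer to '{topic}' (you are {ctx.activity} at {ctx.location})",
        extra_context=f"Related background on {topic}...",
        links=[f"more-on-{topic}", f"history-of-{topic}"],
    )

ctx = Context(location="museum", activity="looking at a mosaic",
              recent_questions=[])
a = answer("How were Roman mosaics made?", ctx)
print(a.literal)
print(a.links)
```

The key design point is that `links` is a first-class part of the answer, not an afterthought: the response is built to invite the next question.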

Wikipedia is a better example, as it entices one to ask additional questions and provides many links to explore further. Provided with an offline image and an AI front end to detect context, it might be adequate. Other encyclopedias might do this for profit, selling subscriptions to parents of young kids, but it's hard to see how they'd compete with free, other than perhaps snob appeal - "OUR answers are vetted by dozens of experts." That hasn't worked too well for them so far, but there will probably always be some who care enough, or want to pretend they do.

While the near future lies with being "always connected", once in a while someone will find himself offline. At such times, he will still want access to relevant information. And since gigabytes of flash or other non-volatile memory will be cheap, it'll make sense to cache truncated data from any information service one has accessed in the past: information you've already accessed, information closely linked to it, plus some information brought down recently based on your environmental context (physical location, social situation, historical ties, commercial ties), ready just in case you ask, or in case your filter AI decides to alert you to something important. Most people will have "where's the nearest toilet" no more than two button presses deep... ("Just go past the Barstuck(tm) Cocoa Plantation, turn right next to Binder(tm) Books and look for the sign. On your way out, stop into Binder's to check out Novella by Daniel Smupp, and Barstuck's for a 5% discount on our new Mint Bananas! cocoa!") That's one clue the AI can use to decide which information is important to keep updated.
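The caching idea above amounts to scoring each piece of information by how likely it is to be wanted while disconnected - past access, link proximity, and contextual relevance - and keeping the highest-scoring items that fit in the flash budget. A minimal sketch, with entirely made-up weights and fields:

```python
# Hedged sketch of the offline cache: score items, then greedily keep
# the best ones within a byte budget. Weights are illustrative only.

def cache_score(item):
    # 'accessed'      - times the user has viewed it
    # 'link_distance' - hops from something the user accessed
    # 'context_match' - 0..1 relevance to current location/social/
    #                   commercial context
    return (2.0 * item["accessed"]
            + 1.0 / (1 + item["link_distance"])
            + 3.0 * item["context_match"])

def pick_cache(items, budget_bytes):
    """Greedy selection: highest-scoring items until the budget is spent."""
    chosen, used = [], 0
    for it in sorted(items, key=cache_score, reverse=True):
        if used + it["size"] <= budget_bytes:
            chosen.append(it)
            used += it["size"]
    return chosen

items = [
    {"name": "nearest-toilet-map", "accessed": 5, "link_distance": 0,
     "context_match": 1.0, "size": 10},
    {"name": "cruise-ship-deck-plan", "accessed": 0, "link_distance": 3,
     "context_match": 0.1, "size": 50},
    {"name": "local-bookstore-guide", "accessed": 1, "link_distance": 1,
     "context_match": 0.8, "size": 30},
]
print([it["name"] for it in pick_cache(items, budget_bytes=45)])
# → ['nearest-toilet-map', 'local-bookstore-guide']
```

The frequently asked "nearest toilet" item scores highest and always makes the cut; the cruise deck plan, never accessed and contextually irrelevant, is evicted first when space runs out.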

Probably there'll be standard "context packages" one can download - not so much information as useful structure that will draw in the information you need. So if you've never done the world-travelling thing, or taken a cruise, you'd pull down a package related to those, and in combination with your own context, it would start pulling in a database of stuff you may soon want to know, even if you don't yet know you will.
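A "context package" in this sense is structure, not data: a set of topic slots that, combined with personal context, generate the prefetch queries the device should run. A toy sketch (package contents and query format are assumptions):

```python
# Illustrative "context package": a list of topic slots the device fills
# by fetching, once the package is combined with personal context.

CRUISE_PACKAGE = {
    "slots": ["ports_of_call", "shipboard_medical", "tipping_customs",
              "currency_exchange", "emergency_contacts"],
}

def prefetch_queries(package, personal_context):
    """Combine a package's topic slots with personal context into
    concrete queries the cache can fetch ahead of time."""
    return [f"{slot} near {personal_context['destination']}"
            for slot in package["slots"]]

queries = prefetch_queries(CRUISE_PACKAGE, {"destination": "Lisbon"})
print(queries[0])  # → ports_of_call near Lisbon
```

The point is that the package author only has to know what a cruise traveller will eventually wonder about; the traveller's own context supplies the specifics.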
