In my model of ambient learning support, called AICHE, I have described several processes that are important for learning with ambient and embedded information technology. The main processes I foresaw in 2009 were aggregation, enrichment, synchronisation, and framing. See the complete paper here.
Every now and then I see new apps or models popping up that support this idea, but yesterday I saw a really nice fit: a quite elegant, interactive way of developing dual-screen apps for iOS with AirPlay for Apple TV.
Brightcove introduced a cloud-hosting service for apps that especially supports the development of dual-screen applications, a model that has recently appeared in several new hardware and software concepts. New home entertainment consoles such as the Nintendo Wii U, which adds a tablet with additional meta-information and controller options, or the introduction of "My Xbox Live for iPhone" for the Xbox 360, follow a comparable model: synchronise the different pieces of screen real estate and adapt the information on each screen to the function that device supports best.
I think this is a general trend in entertainment and mobile technologies that can be pretty useful for learning support. In the video introducing the Brightcove App Cloud for dual-display apps, educational applications are even presented as one model. The example is displaying multiple-choice questions on a tablet while watching TV.
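To make the idea concrete, here is a minimal sketch of that synchronisation pattern: the primary screen (the TV) reports its playback position, and the second screen (the tablet) shows whichever quiz questions become due as playback passes their cue points. This is purely illustrative and assumes a simple cue-point model; it does not reflect Brightcove's actual App Cloud API, and all names are hypothetical.

```python
# Illustrative second-screen sync sketch (hypothetical, not Brightcove's API):
# quiz questions are attached to cue points in the video timeline, and the
# tablet displays a question once the TV's playback position passes its cue.

from dataclasses import dataclass

@dataclass
class CuePoint:
    time_s: float       # playback time (seconds) that triggers this cue
    question: str       # multiple-choice question shown on the tablet
    choices: list       # answer options

def due_cues(cues, last_time, now):
    """Return cues whose trigger time falls in the interval (last_time, now]."""
    return [c for c in cues if last_time < c.time_s <= now]

cues = [
    CuePoint(30.0, "Which AICHE process links the two screens?",
             ["Aggregation", "Enrichment", "Synchronisation", "Framing"]),
    CuePoint(90.0, "What does the second screen add?",
             ["Meta-information", "Audio", "Subtitles"]),
]

# Simulate the TV reporting playback positions; the tablet shows the
# questions that became due between two consecutive position updates.
shown = []
last = 0.0
for now in (10.0, 45.0, 120.0):
    for cue in due_cues(cues, last, now):
        shown.append(cue.question)
    last = now

print(len(shown))  # prints 2 — both cues fire as playback passes 30s and 90s
```

The point of the interval check is robustness: playback position updates arrive at coarse intervals, so each cue fires exactly once even if no update lands precisely on its timestamp.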
BTW, from the AppleInsider article: "A recent survey by Razorfish and Yahoo of more than 2,000 smartphone users found that 80 percent of respondents use their mobile device while watching TV."