In Spike Jonze’s Academy Award-nominated movie Her, sullen future hipster Joaquin Phoenix purchases a bespoke OS tailored to his personality, with which he subsequently falls in love. A new patent application filed by Apple shows that the company has a similar interest in future versions of iOS – not for love, but for money.
Citing a “need in the art for improved targeted content delivery,” Apple patent application 13/556023 documents a system to “infer and/or derive a user’s mood and then use the inferred mood in identifying targeted content that is likely to be of interest to the user.”
The patent specifies that the system would be able to infer mood based upon physical characteristics, behavioral characteristics, or spatial-temporal characteristics. In other words, future versions of Siri might not have the sultry voice or emotional depth of Scarlett Johansson, but they might be able to suggest some retail therapy after noticing your dejected tone of voice and the fact that you haven’t left the apartment in two days.
We’ve collected some of the more interesting parts of the patent after the jump, but you can read the whole thing via the source link.
“Mood-associated characteristics can define characteristics indicative of a user’s mood at a point in time. Mood-associated characteristics can be any subset of user characteristics. For example, mood-associated characteristics can be physical characteristics, behavioral characteristics, and/or spatial-temporal characteristics. Mood-associated physical characteristics can include heart rate; blood pressure; adrenaline level; perspiration rate; body temperature; vocal expression, e.g. voice level, voice pattern, voice stress, etc.; movement characteristics; facial expression; etc. Mood-associated behavioral characteristics can include sequence of content consumed, e.g. sequence of applications launched, rate at which the user changed applications, etc.; social networking activities, e.g. likes and/or comments on social media; user interface (UI) actions, e.g. rate of clicking, pressure applied to a touch screen, etc.; and/or emotional response to previously served targeted content.”
“The system 100 can also be configured to include a mood analysis system 108 that can generate an inferred mood for one or more users. The mood analysis system 108 can receive current mood-associated data and based on a relationship between the current mood-associated data and at least one baseline mood profile, the mood analysis system 108 can generate an inferred mood. The current mood-associated data can specify one or more data items. Each mood-associated data item can be any mood-associated data specific to the user for which an inferred mood is to be generated, such as current and/or recent mood-associated characteristic data. In some cases, a mood-associated data item can indicate a user’s emotional response to a previously served item of invitational content, e.g. happy, at ease, stressed, angry, etc. The user’s emotional response can be evaluated in a variety of ways, such as by monitoring the user’s vitals, through facial expression recognition, or based on how the user is interacting with the user interface.”
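The patent stops at describing the flow – current mood-associated data is compared against a baseline mood profile to produce an inferred mood – without specifying any algorithm. As a rough illustration only, that flow could look something like the toy sketch below; every name, value, and threshold here is invented, not drawn from the filing.

```python
# Hypothetical sketch of the patent's mood-inference flow: compare current
# mood-associated data against a baseline mood profile and emit an inferred
# mood. All characteristics, numbers, and thresholds are invented for
# illustration; the application describes no concrete algorithm.

# A baseline mood profile: typical resting values for a few
# mood-associated characteristics named in the patent text.
BASELINE = {"heart_rate": 70.0, "voice_stress": 0.2, "click_rate": 1.5}

def infer_mood(current: dict) -> str:
    """Return a coarse inferred mood from deviation against the baseline."""
    # Relative deviation per mood-associated characteristic we have data for.
    deviations = {
        key: (current[key] - base) / base
        for key, base in BASELINE.items()
        if key in current
    }
    # Average deviation across available characteristics.
    elevation = sum(deviations.values()) / len(deviations)
    if elevation > 0.3:
        return "stressed"
    if elevation < -0.3:
        return "calm"
    return "neutral"

# Elevated vitals and frantic clicking suggest stress under this toy model.
print(infer_mood({"heart_rate": 95.0, "voice_stress": 0.5, "click_rate": 2.4}))
# → stressed
```

A real system in the patent's mold would presumably learn per-user baselines over time and weight characteristics rather than averaging them equally, but the comparison-to-baseline shape is the same.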