Before long, our devices will know more about us than we do.
At Google I/O 2016, the world got a brief first look at the Google Awareness APIs; now the company has made them available to developers.
These APIs allow apps to collect a substantial amount of information about users, including location, weather, user activity, and nearby Bluetooth beacons. The goal of opening up this program is to give apps enough context about users to predict which features they’re likely to take advantage of before the users themselves realize it.
According to a post on Google’s company blog today, the Awareness APIs offer two main ways for apps to gather contextual information about their users.
The first of these is the Snapshot API, which lets an app request information about a user’s current situation. The second is the Fence API, which allows apps to react when a user’s situation changes to match a certain set of conditions.
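To give a sense of the shape of these calls, here is a minimal sketch of how an Android app might use the two APIs via the Google Play services `com.google.android.gms.awareness` package. The fence key `"walking_fence"`, the `pendingIntent`, and the connected `googleApiClient` are illustrative placeholders; a real app would also need the appropriate permissions declared.

```java
// Snapshot API: a one-off query for the user's current detected activity.
Awareness.SnapshotApi.getDetectedActivity(googleApiClient)
        .setResultCallback(new ResultCallback<DetectedActivityResult>() {
            @Override
            public void onResult(@NonNull DetectedActivityResult result) {
                if (result.getStatus().isSuccess()) {
                    ActivityRecognitionResult ar = result.getActivityRecognitionResult();
                    Log.d("Awareness", "Most probable activity: "
                            + ar.getMostProbableActivity());
                }
            }
        });

// Fence API: register a fence that fires whenever the user starts walking.
AwarenessFence walkingFence =
        DetectedActivityFence.during(DetectedActivityFence.WALKING);

Awareness.FenceApi.updateFences(
        googleApiClient,
        new FenceUpdateRequest.Builder()
                // Key and PendingIntent here are placeholders for illustration.
                .addFence("walking_fence", walkingFence, pendingIntent)
                .build());
```

The contrast mirrors the article’s description: the snapshot call answers “what is the user doing right now?”, while the fence registration asks the system to notify the app later, whenever the stated condition becomes true.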
Some of the companies Google has partnered with are already using the Awareness APIs to enhance experiences for their customers. For example, Trulia, an online real estate site, is using the Fence API to suggest nearby open houses to users.
Another example is SuperPlayer Music, which uses both the Snapshot API and the Fence API to suggest music that matches the user’s mood.
While this has the potential to make apps much more useful, the idea of unknowingly allowing apps to track user context in real time may make many people uncomfortable. However, several reports have stated that developers should be able to keep these features from getting too carried away.
Related reading: Google announces native Places API for Android and iOS