Revelations about the access and use of personal data, including Cambridge Analytica’s Facebook data harvesting, have proved that digital conversations may not be private and that user data can be used for a variety of unexpected purposes.
Not surprisingly, with this increased awareness, smartphone users have flocked to the likes of WhatsApp and Telegram, perceived as safe havens for privacy thanks to their end-to-end encryption.
However, even with end-to-end encrypted apps, it is still possible for data harvesters and snoopers to access private data via an unlikely source. By connecting to the cloud, some mobile keyboards used for streamlining and personalising typing can access and use data from your device. Passwords, credit card details and personal conversations that you type could all leave your device via many keyboard apps.
Such data can be leaked whenever keyboard apps sync with the cloud. ‘Smart Suggestions’ pose a further risk: to offer more intuitive suggestions, some virtual keyboards upload information as you type, which can also expose your personal details.
There have been significant examples of data leaks in recent years. The personal data of over 31 million users of the AI.type virtual keyboard app leaked online in 2017 after the company failed to secure the database’s server. In 2016, users of another keyboard app, SwiftKey, reported that their keyboards were suggesting the email addresses and search phrases of other users. The bug was traced to SwiftKey’s cloud sync service, which had to be suspended.
And in 2017, while being investigated for serving intrusive ads, GO Keyboard, a widely used custom Android keyboard app, was found to be collecting extensive user data, including Google account information and even the user’s location.
GO Keyboard was also found to be executing external code and communicating with dozens of third-party trackers and ad networks; estimates of the number of affected users ranged anywhere from 200 million to 1 billion.
Even Google’s own Gboard keyboard app gives the company another avenue to harvest its users’ search queries, regardless of whether it is used in conjunction with end-to-end encrypted apps.
Despite these problems, third-party keyboard apps have grown in popularity, mainly due to the improved usability, new features, innovative design themes and smart text prediction that they offer.
This means that the onus is on keyboard providers to regain the trust of their users, particularly in light of Next-Service Prediction (NSP) – the latest innovation.
This new smart technology suggests restaurants, bars, cafes, shops or even brands based on what the user is typing, allowing users to instantly pull in content and information from the web and to reach different apps, all within a single chat.
For example, offering to “grab a drink” with a friend could bring up suggestions of local bars, while suggesting a “meeting sometime next week” with a colleague could trigger your phone’s calendar. But as such smart NSP algorithms are designed to comprehensively learn and predict user behaviour, particular care must be taken to ensure data privacy.
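To make the idea concrete, here is a minimal, purely illustrative sketch of how a keyword-based next-service predictor could run entirely on the device. The trigger phrases, categories and suggest() helper are invented for this example and do not reflect how any particular keyboard, Fleksy included, actually implements NSP.

```python
# Hypothetical sketch of on-device, keyword-based next-service prediction.
# The trigger phrases and categories below are invented for illustration;
# real NSP engines rely on far richer language models.

TRIGGERS = {
    "grab a drink": "local bars",
    "grab a coffee": "nearby cafes",
    "get some food": "restaurants",
    "meeting sometime next week": "calendar app",
}

def suggest(text: str) -> list[str]:
    """Return service suggestions for the text typed so far.

    Runs entirely on the device: the typed text never leaves this
    function, which is the privacy property argued for in this article.
    """
    lowered = text.lower()
    return [service for phrase, service in TRIGGERS.items() if phrase in lowered]

if __name__ == "__main__":
    print(suggest("Want to grab a drink on Friday?"))                 # ['local bars']
    print(suggest("Shall we plan a meeting sometime next week?"))     # ['calendar app']
```

The design point is that the matching happens locally, so the raw text of a conversation never has to be uploaded to a cloud service in order to produce a useful suggestion.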
Fleksy is currently the only keyboard app that does not share its user data, which is retained on the device so that it can’t be leaked via the cloud. I expect this to become a standard security measure that users demand from keyboard apps in the future. Due to its privacy credentials, Fleksy has been licensed to smartphone manufacturers and governments around the world, at a time when many institutions are growing sceptical about the security of encrypted apps such as Telegram and WhatsApp.
In light of growing data privacy concerns among governments, security agencies and regular smartphone users, brands must now take steps to renew trust. More and more users are both aware of and concerned by privacy issues, and as a result are becoming less willing to ignore what happens to their data behind the curtain. The days of ticking the T&Cs box without reading the terms are disappearing, and if brands want to survive and compete, they need to ensure customer data is kept private.
In the meantime, as a user, take a closer look at the messaging and email apps you’re using. The first thing to check is whether they offer proper end-to-end encryption. One good alternative to WhatsApp and Telegram, whose limitations I touched on earlier, is Signal, which has strong encryption credentials to ensure the privacy of your conversations.
Review the free services offered by any app and understand what data you’re giving away in return. If you use Google as your search engine, you expose your personal data and search behaviour. There are alternatives, such as Qwant, that respect your privacy.
You need to do your research. Find out just how secure the keyboard you are using actually is. And stay informed about the steps your apps are taking to protect your data.