The Future of Mobile App Security…Keyboards


The recently announced security flaw found in SwiftKey, the immensely popular mobile on-screen keyboard replacement, brings to the fore an idea that I have talked about many times on my tech podcast, Sovryn Tech. (By the way, on Android devices I recommend using the “Hacker’s Keyboard”, which can be found in the Google Play Store and the F-Droid app store.)

It’s a pretty simple idea, too, and maybe I’m not understanding something about the way Android and iOS are set up, because I’m baffled that no one else has implemented it: “app-specific keyboards”.

What “app-specific keyboards” means is this: maybe you’re using some wonderful apps from…oh…how about Open Whisper Systems? Either their iOS Signal suite or perhaps Android’s TextSecure? Fine choices, particularly if you are interested in your privacy and security. So you’re using TextSecure, you think everything is perfectly fine, and you’re typing away some very important activist information to your female hacktivist friend. But wait…what if you’re using SwiftKey? And what if SwiftKey is keylogging everything you type? And even if you could trust SwiftKey (which you can’t), what if their servers, holding all of that keylogging data, were cracked into by some alphabet-soup government organization? All of that important information is now available to eyes it was never intended for.

What if, however, you decentralized the keyboard? What if each app had a different keyboard, secured under the purview of that app’s developer? What if your keystrokes were encrypted client-side as solidly as your messages are in an app like TextSecure, because TextSecure had its own built-in encrypted keyboard that only TextSecure could use? Could that solve the keylogging problem when it really matters? If implemented correctly, I think so.
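To make the idea concrete, here is a minimal sketch of what an in-app keyboard could look like on Android: a plain view that the app itself builds and handles, so keystrokes go straight to the app’s own code and never pass through a third-party input method. The InAppKeyboard class and its layout are purely my own illustration, not anything from TextSecure or Open Whisper Systems.

```kotlin
// A minimal sketch (my own illustration, not anything from TextSecure) of an
// "in-app keyboard": a plain Android view that the app itself builds and
// handles, so keystrokes go straight to the app's code and never pass through
// a third-party input method that could log them.
import android.content.Context
import android.widget.Button
import android.widget.LinearLayout

class InAppKeyboard(
    context: Context,
    private val onKey: (Char) -> Unit  // the app decides what happens to each keystroke
) : LinearLayout(context) {

    // A bare three-row letter layout; a real keyboard would add shift, space, etc.
    private val rows = listOf("qwertyuiop", "asdfghjkl", "zxcvbnm")

    init {
        orientation = LinearLayout.VERTICAL
        rows.forEach { row ->
            val rowLayout = LinearLayout(context).apply {
                orientation = LinearLayout.HORIZONTAL
            }
            row.forEach { ch ->
                rowLayout.addView(Button(context).apply {
                    text = ch.toString()
                    // Each tap is delivered directly to the app's own handler;
                    // no system or replacement keyboard ever sees it.
                    setOnClickListener { onKey(ch) }
                })
            }
            addView(rowLayout)
        }
    }
}
```

The messaging screen would add this view itself and decide what to do with each character, for example appending it to a buffer that is encrypted locally before anything is stored or sent.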

Of course, there are other advantages to this as well. If you have app-specific keyboards, then you can have app-specific keys. For example, the Facebook app could have a “Like” key or a “Tag” key. These aren’t the best examples, but they show there is potential for this idea beyond the security and privacy aspect, and that could be very appealing to even the most malicious and largest of app developers (such as Facebook).
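Building on the hypothetical sketch above, an app-specific key is just another button the app bolts onto its own keyboard; the “Like” example here is purely illustrative and not any real Facebook API.

```kotlin
import android.widget.Button

// Hypothetical extension of the InAppKeyboard sketch above: because the app
// owns its keyboard, it can add its own action keys ("Like", "Tag", and so on).
fun InAppKeyboard.addActionKey(label: String, onAction: () -> Unit) {
    addView(Button(context).apply {
        text = label
        setOnClickListener { onAction() }  // app-defined action rather than a character
    })
}
```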

I can already see the emails now: “Brian, what’s next? Are you going to tell people to make their own processors out of their own sand in the backyard?” Feel free to read this blog post so you don’t have to waste your time waiting for my response.

The notion that everything you type on your mobile device could be getting keylogged and stored on servers isn’t crazy. Google itself warns you when you start using a new keyboard on your Android device that what you type may be recorded by an unknown third party (and if you think using the Google Keyboard is somehow safer…well…c’mon, it’s Google…they make a living off of using your information). I think that in the near future, app-specific keyboards are going to be a very real thing, and a very real privacy concern. It has been a concern on PCs for decades, and trusted, open-source virtual keyboards are often used in high-risk situations (log in to your email at riseup.net and notice the very convenient option of an on-screen virtual keyboard in case your physical keyboard is being logged).

Does the NSA, FBI, GCHQ, CIA, or even the local police have ways around this idea of app-specific keyboards? Sure, I imagine so (SEE: insect drones). And that can be said for just about everything you do to try and rein in your digital security and privacy. But that doesn’t mean you shouldn’t take every reasonable and plausible measure to make it expensive as Hell for them to try and invade your privacy and security.

Carpe lucem!

 
