Wickr: Redefining the Messaging Platform, an Interview with Co-Founder, Chris Howell

[Wickr logo]

In the modern era, messaging applications are a constant target for attackers, who exploit vulnerabilities to disclose the sensitive information of nation states and to expose inappropriate insider-employee behavior. Organizations face constant pressure to prioritize cybersecurity and upgrade their infrastructure to the latest and greatest defensive technologies. However, the messaging tools those same organizations rely on often are the last to be secured, if they are secured at all. This is where Wickr comes in. Wickr is an instant-messaging application and platform offering end-to-end encryption and content-expiring messages. Its parent company of the same name takes security seriously and has built a product to showcase that. I was able to chat with co-founder and CTO, Chris Howell, who was gracious enough to provide me with more information on what Wickr can achieve, how it works and who would benefit from it.

Petros Koutoupis: Please introduce yourself and tell us about your role at Wickr.

Chris Howell: I'm co-founder/CTO and responsible for technical strategy, security and product design. You can read my full bio here.

Petros: What do you see as a weak point in today's messaging apps?

Chris: By far, at least when it comes to security, the weak point of virtually all messaging apps to date (and all other apps and services, really) is that they're built with the assumption that users will have to trust the service. The problem with that way of thinking is: can we really trust the service? That's not to say there are necessarily bad people running them, but how many breaches (for example, Equifax 2017) or abuses (for example, Snapchat 2019) do we need to see to answer that question? Once the service is built that way, messaging users generally suffer in two ways. First, at some key point on the way to the recipient, messages are readable by some number of people beyond the recipient. Now, the service typically will point to various security certifications and processes to make us feel okay about that, but in most cases where humans are involved, what can happen will happen, and whatever controls are put in place to limit access to user data amount to little more than a pinky promise—which, when broken, of course leaves the user with a loss of privacy and security. Second, having been so trusted, the service typically prioritizes "virality" and its own growth over the users' need to control their own data, leading to behavior like scanning message content for marketing purposes, retaining messages longer than necessary, and abusing contacts to aid the growth of the service.

Petros: How does Wickr help address that?