Episode 010: Citizen Cybersecurity with Jesse Pollak


Jay Bobo | Sam Livingston-Gray | Mandy Moore

Guest Starring: Jesse Pollak

Show Notes:

00:16 – Welcome to “Whose Line of Code is it Anyway?…” …we mean, “Greater Than Code!”

01:37 – Getting Started with Computer-ing & Security

PGP = Pretty Good Privacy
Filippo Valsorda: I’m giving up on PGP

09:28 – Clef and Two-factor Authentication (2FA)

12:33 – Citizen Cybersecurity Due to the Rise of Mass Surveillance

Quincy Larson: How to encrypt your entire life in less than an hour
Signal by Whisper Systems

17:27 – Evaluating Service Providers


22:29 – Password Managers and Encrypting Data at Rest (Security by Default)

Noah Zoschke: Encryption at Rest (Convox Article)

25:30 – Tools and Resources

NaCl: Networking and Cryptography library (“Salt”)
Bouncy Castle
Amazon Web Services (AWS)

28:20 – Two-factor Authentication, YubiKey

Instant 2FA

32:58 – Putting Trust in Security and the Organizations That Provide It; Centralization

38:06 – Developer Unions

We are (currently) listener supported!
Support us via Patreon!
Thank you, Sophie Déziel, for your support!

42:58 – “Citizens are buying a lot of IoT devices that are being used for DDoS attacks. As citizens, are we responsible to some extent for them occurring regardless of our technical ability at the time of purchase?” – Yiorgos (George) Adamopoulos; What about retailers?

47:56 – “What are your thoughts on “benevolent” malware that looks for vulnerable devices and patches them without asking for permission from the device’s owner?” – Wesley Ellis


Jesse: We as a society have a responsibility to look after the people on the edges, and look after the people who don’t have the tools or don’t have the resources to do security themselves.

Mandy: Learning about security is important, even for a newbie.

Jay: We can’t just build the thing, we have to make sure that it’s usable and we have to make sure that beyond the fact that it works, that it’s going to be adopted by people and that it’s meaningful and helpful.

Sam: “Stop calling me a consumer. I am neither a gaping mouth nor an open wallet. I am a citizen interacting in a community.” – Jeme Brelin

Please leave us an iTunes Review!


SAM:  Hello and welcome to ‘Whose Line of Code is it Anyway?’ where everything is made up and the story points don’t matter.

JAY:  No, no, no, no, Sam.

SAM:  What? Too soon?

JAY:  It’s Greater Than Code because people matter.

SAM:  Oh, right. So story points don’t matter but people do? Is that what you’re saying?

JAY:  No? Yes? I’m confused.

SAM:  [Laughs] I’m confused too.

JAY:  Well, it’s Greater Than Code and I’m Jay Bobo, one of your panelists. I’m handing it off to Mandy.

MANDY:  Hi everybody. I am Mandy and I am pleased to welcome our guest today, Jesse Pollak. He is a builder and a writer who has gained attention for his thoughtful analysis of startup culture and progress. A hackNY fellow and college dropout, Jesse was an engineer at BuzzFeed before he started Clef, and he has launched products that have reached hundreds of thousands of people.

Leading the product team at Clef, he works every day to make sure the entire internet is safer and easier to use. Welcome, Jesse. How are you doing today?

JESSE:  I’m doing good. Thanks for the kind introduction — better than I could have said it. I’m doing really good. It’s a sunny day in Oakland, California and the sun is like seeping below this line of clouds, and it’s orange and I’m looking out from the 12th floor of our office and it’s really, really beautiful. I’m just lucky to be here.

SAM:  That sounds great. We’d like to start our show by learning a little bit more about our guest. How did you get started in this whole computering thing?

JESSE:  In the whole computering thing…some people would say it’s an early start, but I consider it a pretty late start. I didn’t write my first line of code until my first semester of college, so that was the Fall of 2011. Before that, I always had an interest in technology. I took apart iPhones and jailbroke them, but I never realized that there was a layer underneath that whole system I’d been interacting with, where you could actually program the things that were going on.

In my first semester of college, I started uncovering that and started realizing that I didn’t want to be an environmental engineer. I actually wanted to write code and build things for people. That was the beginning of an awakening, and that semester, I also met my two now-co-founders; a year later, we started Clef together, and then a year after that, I dropped out of school. That was three and a half years ago. It’s been a very rapid acceleration into doing this.

SAM:  Well, that’s really interesting. It sounded like you were going to school for an Environmental Engineering degree. Did you take a programming class or was this like extracurricular hacking?

JESSE:  Well, it was a combination of things. In my last semester of high school, I’d gotten into Pomona, which is a little liberal arts school that doesn’t have an engineering program. It was where I decided to go, but every other school I applied to had been an engineering school. I ended up wanting to go to Pomona for reasons including being able to play soccer there.

In my last semester of high school, I took a math class, and part of that math class was just copying and pasting some snippets of code into MATLAB and messing around with that. I got really interested in that and in just what that was, because I’d never seen anything like it before. Then my first semester of college, I took one CS class, but I also was like, “How do I do this outside of class?”

And so I started building a little Ruby on Rails app, which went on to be something called ‘5C ride share,’ which was the first thing I ever built. It just coordinated probably 20,000 or 30,000 rides to and from the airport for students at the college where I went. And I also set up an independent study, which I got college credit for, where I could learn how to use Rails.

I wasn’t exposed to this until the end of my senior year of high school, and then my first semester of college, I was like, “Let’s see what this is really about,” and it was a rapid introduction.

SAM:  Cool. I’m always interested in how people came into this because I started working with computers in a very user-y sort of way as a teenager and then I didn’t really get into programming until I was 26 or so. Then a couple of years later, I got the CS degree. I feel like I did it backwards and I’m always curious about how other people came into this sort of thing.

JESSE:  Yeah, it’s fascinating, and I think one of the beautiful things about this industry (and obviously, there’s so much more work to do) is that there are so many different paths you could take to get to being an engineer, to get to being a designer, to get to being anything in the technology industry.

For me, one of the things I really love watching and talking about is exactly what you say: “How did people get here?” And how do we figure out ways to strengthen those paths for people who might not normally stumble into them, so that more people can flow through the diverse paths that are available?

MANDY:  Yeah, I’m doing it now. It’s really scary and really overwhelming and really exciting at the same time. I’m a Ruby newbie.

SAM:  Yay!

MANDY:  I’m using Codecademy three or four times a week, four hours at a time. I’m just trying to figure out what the heck I’m doing. But it’s fun so far. It’s a journey.

JESSE:  Yeah, it’s definitely a journey. Especially in the beginning, there’s just so much time of just banging our head against walls of like, “Why isn’t this thing running?”


SAM:  “How do I even?”

JESSE:  “How do I even?” Yeah.

JAY:  That’s a really interesting story. Could you also maybe talk a little bit about that? You came through the door by picking up and building your own web application, but at some point, you started venturing toward more security stuff. How did that come about? How and why did that interest you?

JESSE:  The way it came about was through Clef. My co-founder, B, our CEO, was writing a thesis on the concept of usable security, and really, it turned into Clef. He was exploring, with a few professors, how mobile phones and the rise of mobile phones allow us to encode more usable security standards, so that we were empowering people to take the habits that they already have and use those in a secure way, rather than expecting them to change their habits or do something new.

He was working on that thesis, and the two of us were actually in New York for a summer together when I was working at BuzzFeed and he was working at a little company called H. Bloom that sells flowers. We just spent a lot of time hanging out, and at the end of the summer, he was like, “Why don’t you come work on this with me, and instead of it just being a thesis, instead of it just being academics, let’s see if we can actually build something.”

He had a little prototype of Clef. The first time I saw the prototype of Clef, I was like, “That’s the future of the internet.” This is exactly where I want to be and this is exactly what I want to do. Then I think a lot of the security things fell naturally from that: understanding how important security is to the world, how important it is to our day-to-day lives, and what it really takes, from an engineering and product perspective, to make things that people will actually use that will keep them secure. I think the thing we’re all so used to seeing in this world is security as a concept, while in our day-to-day lives, those security systems are just out of reach, or way out of reach, for so many people.

For us, from the beginning, we were really looking at it the other way: we think this is what the future is going to be like, and this is the way we think people should be using the internet. Now, how do we wrap the security protocols into that? How do we expose those security protocols in a way that’s really, really usable and really, really accessible, so that everyone can have that future rather than just a select few?

SAM:  Yeah, that’s a really good point. I think I probably generated my first PGP key about 20 years ago, and I’ve never used PGP in actual practice. It’s just this academic thing, and I’m like, “Oh, that’s neat,” then throw it away.

JESSE:  Yeah, I was reading a really interesting blog post yesterday by Filippo Valsorda; it was on Reddit or Hacker News or one of those, or on Twitter. He was talking about how he was someone who had done everything for PGP. He had a master key, and he used it to generate short-term, short-lived subkeys that he could rotate, and he published them in all the places. He went and did face-to-face meetups to do key exchanges, so that you could build a trust network without having to trust the internet infrastructure to build that network.

At the end of it, he was just like, “I get two emails a year that are encrypted. Any email I get that’s encrypted can be bypassed by me sending responses like, ‘Hey, I’m on vacation. I don’t have access to my master key. Can you just send it in plain text?’ Or, ‘I lost access to this key. Here’s the new key for me’.”

He summed it up and he was like, “Look, the system has been around for so long, and there are some good parts about it and it does a lot of really cool things, but it’s fundamentally unusable and therefore, it’ll never be a standard and therefore, it’s just not worth doing.” So he was advocating what I think a lot of people are advocating now, not just because of PGP but for other reasons: just use Signal, and use short-term encrypted communication rather than trying to maintain and continue to use long-term, persistent encrypted identities or cryptographic identities.

JAY:  So you’ve mentioned Clef, could you talk a little bit more about what Clef is as it deals with trying to create security tools that are easy for the masses to use?

SAM:  Mere mortals.


JESSE:  Clef is what we call the future of two-factor authentication. I guess I should explain two-factor authentication before I move on. Two-factor authentication is when you use two different things to identify yourself when you log into a website, so those two things can be something you are, something you know, or something you have.

Something you are is like a fingerprint, something you know is like a password, and something you have is like a phone. The reason two-factor authentication is becoming a standard is because when you verify two of those things, like a phone and a fingerprint, you’re much more likely to be who you say you are than when you’re just verifying one of them.

The standard for two-factor in the world right now is pretty much a password, which we’re all used to using, plus a verification of phone ownership. That can be done through getting a text message where you type in a code, or using a Google Authenticator app. Then there are some other things that are like that but different, like using a YubiKey, which plugs into your computer and you tap, and it’s like a standalone cryptographic device.
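The authenticator-app codes Jesse mentions are typically TOTP (RFC 6238): the site and your phone share a secret at setup, and both independently derive a short code from the current 30-second time window. A minimal stdlib-Python sketch of that standard (this is the generic algorithm, not Clef’s protocol):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Both sides compute this independently, and a code is only valid for one time step; it’s also why the shared secret has to be migrated when you switch phones.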

With Clef, really, what we’re trying to do and what our business has been is this: we see two-factor as a really, really important thing. But in the consumer environments where it’s deployed, like Facebook and Twitter, because it’s hard to use and because it adds an additional burden on top of the already really frustrating system of passwords, it has really low adoption: less than 1% in a standard environment.

What we’re trying to do in Clef is figure out how you build something that has the security of two-factor and keeps people safe like two-factor, but is also something that people want to use, something that’s easy to use, and something that they’re going to choose over passwords. For the four and a half years we’ve been building Clef, our entire thesis has been that: how do we take systems that already exist and are secure, and make them really, really easy to use? You can download it at GetClef.com. You can use it on a bunch of websites on the internet, like Bitcoin sites. If you have a WordPress site, that’s a really popular thing: our WordPress plugin powers logins on almost a million WordPress sites on the internet.

What it looks like is you walk up to a computer, you have a Clef app on your phone, you hold up your phone, you log into the website. We verify the phone using a private key that’s stored on the phone and we verify your fingerprint using Touch ID or the same thing on Android.

It’s two different factors. We’re using something you are and something you have, versus something you know and something you have. But it’s the same level of security, and it’s so much easier to use that we’re finding people want to choose this over passwords. That’s a remarkable difference, because when people want to use something, they’re going to use it. But when people don’t want to use something, they’re not going to use it. The only security that matters is the security you use, so for us, that’s it. That’s all we care about, that’s all we focus on, and everything else kind of falls to the side.
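Under the hood, a phone-as-factor login like the one Jesse describes is a challenge-response protocol: the server sends a fresh random nonce, the phone proves possession of an enrolled key by signing it, and the server verifies. Clef uses an asymmetric private key on the phone; since Python’s standard library has no public-key crypto, this sketch substitutes an HMAC shared secret to show the shape of the exchange (an illustrative simplification, not Clef’s implementation):

```python
import hashlib
import hmac
import secrets

def new_challenge():
    """Server side: issue a fresh random nonce so old responses can't be replayed."""
    return secrets.token_bytes(32)

def sign_challenge(device_key, challenge):
    """Phone side: prove possession of the enrolled key without revealing it."""
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def verify_response(device_key, challenge, response):
    """Server side: recompute the expected response and compare in constant time."""
    expected = hmac.new(device_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

The fingerprint check (Touch ID or its Android equivalent) happens locally on the phone, gating whether the device will sign at all; the server never sees biometric data.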

MANDY:  Due to current events in our country right now, security seems to be on everybody’s mind at the moment. People are talking about the need, more than ever right now, for citizen cybersecurity, correlated to the rise in mass surveillance. Can you speak to that?

JESSE:  Yeah, just to clarify those words, because those are big words. I think what people are really talking about is this: given the election of Trump, and given what I think many political observers see as our country moving toward a more fascist or more controlling government, there’s potential for oppression, potential for silencing of voices, potential for jailing of journalists, and all those things we see in countries that we’ve historically looked on as others and asked, “How could that be possible?” Given that shift in our country, I think a lot of people are talking about what the individual needs to do to keep themselves safe. Especially people who are actively resisting, people who are going up against Trump or going up against the white supremacist side of our political spectrum and saying, “Hey, this isn’t okay. This isn’t normal. This isn’t real. We need to fight this.” The topic of how those people keep themselves safe, how they keep their communication safe, how they make sure they aren’t silenced, is really important. So that’s just a little clarification.

SAM:  Yeah, I just wanted to point out that these are not abstract, theoretical concerns either. Since the election, we’ve seen a dramatic spike in hate crimes. Many of us who are white, male, privileged persons are starting to experience the threats to personal safety that a lot of more marginalized communities have been dealing with all along, and those communities are dealing with even more of that. This is not like, “Hey, let’s all talk about security because it’s interesting.” People’s lives are on the line here.

JESSE:  Yeah.

MANDY:  So what are some things that we can do to keep people safe?

JESSE:  There’s a whole lot of things. There have been some really great blog posts, which I can link to in [inaudible], but I think the key things to consider are the privacy of your communications and the privacy of your activities, and how security relates to that. When we’re thinking about the privacy of communications, when you want to be talking to people and communicating in private or in secret, historically people have felt that things like text messages or phone calls are private communications.

Even if there’s some fear that those might be listened in on, I think there’s generally been a sense among the greater public that when you’re talking to someone, when you’re texting with someone, you can rely on the fact that only the two of you are going to be privy to that conversation.

One important thing there’s been a lot of encouragement around, which is something I also encourage, is switching those communication channels to channels that are fully encrypted, versus channels that are unencrypted. The cell phone company could say, “We’re not going to look at your messages. We’re not going to listen to your calls,” which they don’t say, because they are listening to your calls, they’re looking at your messages, and they’re giving them to the government.

Even if they were to say that, the way that information travels through the internet in an ‘unencrypted format’ means that the government, which is really good at hacking our internet systems and really good at pressuring companies into giving them whatever information they want, can get those communications. Even if a company says, “We don’t want to give it,” the government is going to get it.

When we switch to encrypted channels with tools like Signal (which you can download on iOS or Android, put in your phone number, and start texting and calling people over an encrypted channel), what that means is that as it travels over the internet, it can’t be stolen, it can’t be broken, it can’t be taken and read by the government. That’s actually private.

When you’re talking about important things and when you’re talking about resisting and when you’re talking about the resistance to white supremacy and the resistance to fascism and there are potential side effects and potential harm that can come to you, making sure that you’re using encrypted channels like Signal is really, really important.

One of the things I’ve been, I would say, skeptical about but excited by ties back to what we were talking about earlier around usable security: we are starting to see people like WhatsApp, and people like Facebook, take the things that have been pioneered in security-first projects like Signal, wrap them back into the software that billions of people are using, and provide that encryption by default. There are still flaws, and there are still, I think, fears around the centralization that occurs there, the centralization that comes with using a platform like WhatsApp or Facebook. But that standard and that default being encoded, I think, is really, really important. That’s my spiel on messaging, I would say.

MANDY:  As someone who isn’t a security expert, how do I evaluate various service providers?

JESSE:  If you’re looking to do messaging, download Signal. If you want to do anonymous browsing, which is important, use Tor. Don’t evaluate them yourself. You will be unable to. Even for people who are security experts, even people who are journalists who write about this, even people who are cryptographers, it takes so much diligence, so much knowledge, and so much investigation to actually determine what’s going on. So instead of trying to do it yourself, look to experts (someone might call me an expert, but I definitely won’t call myself one). Look at what the people at Whisper Systems are saying, and Moxie Marlinspike; look at what Bruce Schneier is saying. Trust the people who know everything there is to know about this, and don’t try to make the decision for yourself.

JAY:  See now, I feel like that works for developers, for people who have some knowledge of what’s happening within the technology industry. But how do I do that for, let’s say, my little cousin, who lives in the inner city? How do I have this conversation with dear old mom and dad? Because I’m still trying to get them to stop using the same password for 200-some accounts.

“Mom, you can’t use the same password for your Chase bank account as you use for your email,” you know what I mean? That’s really the thing. I don’t know if you would agree, but let me know: I’m very much of the position that developers are a very privileged class, and it’s kind of our responsibility to develop tools for people that are secure by default, opt-out rather than opt-in, when it gets into stuff like security, because it has all types of implications.

We’re talking about stuff from stop-and-frisk, to cops tapping kids’ phones, to a whole completely different set of issues when you’re getting into things like search-and-seizure. What’s your opinion on that? Where we are politically says that we’re very much in a place where it’s definitely a possibility that the law is going to be on the other side. We may not stand in the same position that the president-elect’s administration will, so it’s kind of our responsibility to put some of this stuff out there. Would you agree with that? What are some things our listeners can do within our workplaces to address that possible responsibility?

JESSE:  I think that goes back to what I mentioned earlier about Facebook rolling out encrypted messaging in Messenger and WhatsApp rolling out encrypted messaging by default for all of WhatsApp. I think those things happened because inside of Facebook, there were one or two people, or a team of people, who were like, “We think this matters. We think that people deserve this. We think that this is important for the world.” And those people went and collaborated with Whisper Systems at WhatsApp, because they knew Whisper Systems and Signal are the people who are the best at this in the world, and they wanted to take what they know and bring it into their platform as a default.

I think every day as developers, we have the opportunity to make decisions that make our software and the tools we build either more secure and more private, or less secure and less private. Invariably, there’s a spectrum there, and we won’t always be able to choose the more secure, more private option due to budget constraints or whatever the constraints may be.

But I think having that in our head that we are the people who are making these decisions, we are the people who are building the infrastructure and tools that every other person in the world is using and we want to be making that more secure. We want to be making that more safe, we want to make that more private. I think taking that attitude is really important.

In terms of what you’re saying about communicating with your mom, that speaks to another thing, which is our responsibility, as people who hold a relatively large amount of knowledge about this, to be educators to people who hold a relatively smaller amount of knowledge about this. You’re right. Your mom is never going to go read Bruce Schneier. She’s never going to look at Whisper Systems. Your little cousin, maybe, and maybe your mom… I don’t know. I don’t want to rule that possibility out.

But you can and then you can go to your mom and you say, “Hey, mom. These are what I recommend you do and this is why,” and coming from a friend or coming from a family member or coming from a peer, that’s going to be heard and it’s going to be received and it’s going to be internalized in a major way, whereas reading it on CNN or whatever, it’s going to just go through someone’s head and come out the other side. I think we have some responsibility about that.

Just from talking earlier before the show, it sounds like you’re already doing that: organizing crypto parties and setting up places where people can come and opt into education around this stuff. I think it’s really, really important.

MANDY:  As an introduction, do you recommend just telling somebody about a password manager, like LastPass or 1Password? Is that one way you can get somebody started?

JESSE:  Yeah, password managers make everyone’s life easier and more secure, just as a default. I use 1Password. It’s one of my favorite tools in the world because it’s seamless and it works really well, and it helps me generate 35-character passwords that I’m not going to give away. It helps me —

SAM:  Plenty of entropy there.

JESSE:  Plenty of entropy. No one’s going to be guessing it. I think yeah, absolutely, that’s a really good point. Switching to a password manager is one of the single biggest things you can do for your account security, and security is necessary for privacy. Those two things are interlinked in a way that is unmistakable and totally important. You cannot have a private life, you cannot keep your communication secret, you cannot keep your activities private unless you practice good security. That’s a hard thing, but it’s the harsh reality of the internet today.
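The entropy being joked about is quantifiable: a password of n characters drawn uniformly from an alphabet of k symbols has n·log2(k) bits of it. A hypothetical sketch of what a generator like 1Password’s does, using only Python’s stdlib (the `secrets` module is the CSPRNG; the 35-character default just mirrors Jesse’s example):

```python
import math
import secrets
import string

# Roughly the printable-ASCII set a password manager draws from.
ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length=35, alphabet=ALPHABET):
    """Pick each character uniformly with a CSPRNG (secrets, not random)."""
    return "".join(secrets.choice(alphabet) for _ in range(length))

def entropy_bits(length, alphabet_size):
    """Bits of entropy in a uniformly random password."""
    return length * math.log2(alphabet_size)
```

Thirty-five characters over the 94 printable symbols is about 229 bits, comfortably beyond brute-force guessing; a typical human-chosen password carries only a few dozen bits.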

SAM:  Riffing on that for just a second, it seems to me that as people with a lot of power in the software industry, another of our responsibilities is not only making available encryption of data in flight but we also have to be very careful about how we steward the data that we store and make sure that we are encrypting the data at rest as well because otherwise, we just find ourselves building these giant troves of highly-attractive captive data.

JESSE:  Totally. Again, going back to the concept of security by default, there are companies out there building tools that make that sort of encryption at rest and securing of data at rest really easy. If you’re using AWS and you want your database encrypted by default, you just check a box in RDS. It’s all done for you: your keys are rotated, you have a full audit log, everything’s just there for you. You could run your own Postgres or MySQL server.

But with AWS, not only are they going to handle all the scaling, they’re also going to do all of that encryption for you. I think that’s really, really important. So figuring out: do we choose tools that allow us to encode secure practices by default? I think that’s really important. Also, do we build tools for other developers that make that possible?

There’s a blog post from Convox, which is a company that builds an open source platform on top of some Amazon tools to make it really easy to deploy web applications. They wrote a great blog post on how to use AWS KMS, which is Amazon’s key management service, to easily encrypt and decrypt data. I think that sort of education by developers, and that sort of exposure to tools that make these secure practices easy to use in our software, is exactly what we need more of in this industry.
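The pattern that KMS (and the Convox post) implements is called envelope encryption: encrypt each record with a fresh data key, wrap that data key with a master key held by the key-management service, and store only the ciphertext plus the wrapped key. Here is a toy stdlib-only sketch of the shape of it, with HMAC-SHA256 used as a keystream generator standing in for a real cipher (illustrative only; for real data you would use KMS with an authenticated cipher like AES-GCM, not this):

```python
import hashlib
import hmac
import secrets

def _stream_xor(key, nonce, data):
    """Toy CTR-style stream cipher: HMAC-SHA256(key, nonce||counter) keystream.
    XOR is its own inverse, so the same call encrypts and decrypts."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        ks = hmac.new(key, nonce + block.to_bytes(8, "big"), hashlib.sha256).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ k for b, k in zip(chunk, ks))
    return bytes(out)

def envelope_encrypt(master_key, plaintext):
    data_key = secrets.token_bytes(32)                   # fresh key per record
    n1, n2 = secrets.token_bytes(16), secrets.token_bytes(16)
    ciphertext = _stream_xor(data_key, n1, plaintext)    # encrypt the record
    wrapped_key = _stream_xor(master_key, n2, data_key)  # "KMS Encrypt" the data key
    return ciphertext, n1, wrapped_key, n2

def envelope_decrypt(master_key, ciphertext, n1, wrapped_key, n2):
    data_key = _stream_xor(master_key, n2, wrapped_key)  # "KMS Decrypt" the data key
    return _stream_xor(data_key, n1, ciphertext)
```

The point of the indirection: the master key never touches bulk data, so it can live in an HSM, be rotated, and leave an audit trail of every data-key unwrap.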

JAY:  That’s interesting. I have a question about that. As far as tools and resources go, what are some tools and resources that you’ve come across in making this move and creating a tool that handles two-factor authentication? What information are you consuming to make sure that you and your team are ahead of the curve, so to speak?

JESSE:  One of the things we try to do as a team, and I think it’s generally a good practice for pretty much everyone, is we don’t try to do cryptography, we don’t try to implement new security systems, and we don’t try to do anything that someone hasn’t been doing for 20 years. All of Clef is built on standards that have been around for 20 years, and the only new thing we do is wrap those standards in technology like mobile phones, and in technology like something called the Clef Wave, which is one of our transmission techniques. But we don’t do new security.

So we use libraries like NaCl, or Salt, which is a basic cryptographic library (not [inaudible], but one level higher than [inaudible]) that you can use to encrypt and decrypt data, and that does asymmetric encryption and asymmetric key signing. Using those libraries, like NaCl or Salt, or Bouncy Castle (there are a variety of them across all the different platforms), is really, really important.

Another thing we do is we’re really, really heavily invested as strong users of AWS. I was talking about this earlier, but we think that they do a really good job of making tools easier. We also think that they do a lot of really sane, smart, and secure things around how they move and store data.

Whenever we have the opportunity, rather than building something ourselves or outsourcing it to third-party software or a service, we’re going to go with AWS and the tool that they provide, because we trust them. We know they take security very seriously. We know they do things like encrypting our data at rest, and that gives us a sense of confidence that we often don’t even have in our own code, just because we are fallible.

We know we are fallible, and we always want to be on our toes to think: how could we mess this up? How can we take this off of our plate so we can’t mess it up, so that someone who’s working on this problem full time can do it right? I’d say use defaults, use libraries that have been around for a long time, and then trust parties who can do a better job than you, because you’re more likely to mess it up than someone who’s working on it full time.

I think that kind of goes counter to what a lot of people might think because a lot of people are like, “You need to build this yourself,” Or, “You can’t rely on a third-party.” But I think it’s really important to find third parties that can do these things right and then offload the responsibilities of security to them.

SAM:  Find some giants whose shoulders you can stand upon?

JESSE:  Exactly.

SAM:  So Jesse, you brought up two-factor authentication earlier and we talked a little bit about that. But I’m just going to go out there and possibly look really stupid when I say this and I’m okay with that. I got a new phone, which I usually do around Black Friday every two years and I transferred all of my stuff over from the old phone to the new phone and I wiped the old phone.

Then I discovered that Google Authenticator settings don’t transfer, so I got locked out of all of the stuff that I had set it up for. I was annoyed by that, so I ordered this YubiKey, and actually I’m holding it up to the camera here — our listeners can’t see that. But it’s on my key chain.

After I had ordered it, I was thinking, “Well, how’s that going to help me for authenticating to my mobile apps because I can’t plug this thing into my phone.” So I thought, “Well, maybe I’ll just use it for my work AWS account.” It turns out AWS doesn’t support YubiKey [Laughs]. So if I’m sitting out here making all these less than entirely informed decisions, like what hope does anybody else have?

But actually, that’s not even my question. My question is: how is this so complicated and what can we do about it? How do it work?

MANDY:  “How do I do the thing?”

JESSE:  YubiKey is [inaudible]. Mine is this.

SAM:  Oh, tiny.

JESSE:  It is very tiny. It stays in my computer. You just got a message from my YubiKey in the chat. I don’t know if you all see that but —

SAM:  I’m not trying to pronounce that.

JESSE:  — Because it’s just gibberish for listeners. Because it’s in my computer when you tap it, it tries to do its authentication so that it often results just in me bumping my computer and sending gibberish to my friends via text or my work colleagues via Slack. It’s like one of the more annoying things in the world.

With YubiKey, there are not that many places you can use it, which is perhaps unsurprising. But you can use it on Google. You can use it on GitHub. I use it on a software-as-a-service tool called Sentry that supports it. But for the most part, I think the calculus of the business, especially for consumer applications, is that not that many people are going to go out and spend $25 or $50 on a dedicated device that does two-factor.

If you assume that 1% of all consumer users are going to enable two-factor at all, you’ve got to assume that it’s something like 0.000001% of people who are going to go out and spend $25, rather than just putting in their phone number. For day-to-day use, a YubiKey for a regular person is pretty useless.

For developers, I think there’s some use, particularly in more custom-y use cases. I know at Facebook, they really heavily use YubiKeys to bind developer private keys to specific devices, with the YubiKey attached to a computer. That means they can know for certain that the private key wasn’t stolen off of that device and isn’t being used from somewhere else.

I think in that use case, having a dedicated hardware device that isn’t vulnerable to malware or to interception, the way a text message might be, is really important. But for a day-to-day person, there’s just not that much to do with it. We have a bunch of them in the office here and they aren’t very heavily used. I think I’m the most avid YubiKey user.

In terms of developers, we’re building a new product right now alongside Clef. It’s called ‘Instant 2FA’ and it’s basically a toolkit that lets any developer set up two-factor authentication on a website in around 30 minutes, whereas normally it might take a couple of weeks. We’ve built a really cool product that I’m very much in love with. It makes it dead simple.

One of my favorite properties of it is the way it’s built. We are able, as a company and as providers, to basically do continuous improvement of the product and roll things out to developers with them just checking a box. Right now, we support only TOTP, which is what the Google Authenticator app uses.

If you have the app on your phone, it cycles through codes and you type one in. But we’re going to be rolling out SMS soon, and all it will take for a developer to enable that is a checkbox. We’ll be rolling out YubiKey soon, and all it will take for a developer to enable that is a checkbox, too.
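For the curious, the TOTP scheme Jesse describes here (the rotating codes in the Google Authenticator app, standardized as RFC 6238) can be sketched with Python’s standard library. This is an illustrative implementation of the public spec, not Instant 2FA’s actual code; the function name and parameters are ours.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of 30-second steps since the Unix epoch.
    counter = int(time.time() if for_time is None else for_time) // step
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret is the ASCII string "12345678901234567890".
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59, digits=8))  # "94287082", matching the RFC's test vector
print(totp(secret))                         # the current 6-digit code for this secret
```

Each code is only valid within its 30-second window, which is why the server and phone just need the shared secret and a roughly synchronized clock, with no network connection between them.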

I’m really excited about that because it means that as better two-factor technologies and better authentication technologies come through and become available to consumers, because we’re a service provider that’s focused entirely on doing this one thing, we can build those tools into the platform, and then every developer that’s using our platform gets access to them. That sort of rollout of cool new security technology is important.

JAY:  I’m a little bit of a skeptic. I want to throw this out. One of the things that I’m concerned about is that Mom and [inaudible] Dad are not going to be air-gapping computers anytime soon. No one’s going to be pressing stuff to read-only DVDs or whatever.

I think you made a good point earlier getting into this: we have to rest on the shoulders of giants for some of this stuff. But understand that that’s also where the weakness is. If we say, “Hey, everybody go use Google,” then as long as Google is not evil, I guess we’re okay.

This is one of the things that we sometimes get into, or sometimes when you go in and you say, “I’m utilizing LastPass or 1Password.” In the end, where you probably have more security is when you have a lot of different options, so you don’t have one point of failure. Could you talk a little bit about that? Because it’s pretty much saying, in some cases, “Well, I hope AWS meets my needs,” like you’re thinking about [inaudible] at all times. There’s no secret sort of —

SAM:  NSA backdoor?

JAY:  Exactly. Like an NSA backdoor that’s in place, because this is real stuff that’s happening. Could you speak to that a little bit?

JESSE:  I think I made a [inaudible] to this before the show started. My opinion here, I think, is different from a lot of other people’s. I think there are two sides. It’s a spectrum, right? On one side, there’s the question of how much we want to centralize these sorts of things, because we think that when people are doing it and focusing on it full time, they’re going to do a better job than us. But that comes with the downside that when you centralize things, you create a central point of attack, where, for instance, the government could break into AWS and steal everyone’s data, and we would all be done. That’s one side of the spectrum: full centralization.

In fact, there are no startups, there are no open source libraries; it’s just Google doing all of our web services for us, and we’re all just like, “God, this is horrible. We have [inaudible].” Or on the flip side, it’s not horrible and they figure out a way to be good people or whatever. Who knows if either of those things is possible?

Then on the other side of the spectrum, you have people who are saying, “We shouldn’t trust anyone. We should verify everything ourselves. We should write all of our own software. We should use only open source libraries. We should make sure that everything we do is fully trustless and decentralized.” At that end especially, you have people, and you see tools like blockchains (Ethereum, Bitcoin, and Zcash), where the ethos is that everything needs to be public, everything needs to be decentralized, everything needs to be fully at that edge of the spectrum.

With those two ends of the spectrum, obviously, for the mass internet, you can’t actually sit at either end. You have to find somewhere in between. For me, I generally land in the middle, and I think that can frustrate people who know a lot about security, because they’re like, “How can you trust an organization like Google? How can you trust an organization like AWS? How can you trust an organization like Clef? Or Instant 2FA?”

I think part of me sitting in the middle is definitely influenced by building an organization like that. Through the experience of building an organization like this, I’ve seen first-hand how hard it is for developers to do things right from a security perspective. It’s hard, it’s expensive, and because I talk to hundreds of businesses every week, I’ve seen how little it takes for businesses to just not do it.

If a business adds two-factor or doesn’t add two-factor, the decision there is: do we want to spend two developer-weeks? Almost always the answer is going to be no, unless they get breached or unless 50 customers ask. So if the standard and the default is just, “No, we’re not going to do this,” maybe we can argue that there might be an ideal world where we’re all doing decentralized, trustless encryption of our data as developers, but that’s nowhere near the world we live in.

The world we live in is one where developers will never do anything about security unless they are forced to, for the most part. So figuring out how we build tools that make it so easy that they can’t not do it, that they can’t not have security, I don’t think that comes without some sort of centralization. I think it requires people working on this problem every day and figuring out, “How do I talk to customers? How do I talk to developers? How do I figure out ways to build tools that they’ll actually use?” Because otherwise, no one’s going to use anything.

JAY:  Wow. So I went through [inaudible] more divisive —

SAM:  Bring it.

JAY:  — And this is the reason why developers need unions.

SAM:  Oh, okay.

JAY:  That’s a whole other topic, but I think what you’re talking about, Jesse, gets back into the whole theme of responsibility once again. Then you have developers who kind of see the way the world is, and they have business interests. I think that we’re at this time right now where we’re talking about the effects of automation on jobs. We’re talking about things like basic income. We’re talking about the [inaudible] secure tools to protect privacy. They have implications for the masses.

The only way, I think, that we as developers, who are a privileged class, can say, “The rise of AI is important and we should be very thoughtful about how AI is utilized, especially as it gets better.” Who is going to speak for that, to a certain degree? Where does that power come from? Like, you say, “Mr. CEO or Mr. President, I think that we should implement two-factor.”

“Oh, well, actually, like you already mentioned, no, we have other priorities.”

I think some of these other discussions are continually happening as well. That’s probably a conversation for another time, but I just wanted to go out there and put in my pitch for why we need to bring back unions, in a completely different way and a very 2016/2017 model, so that the privileged classes can advocate for others.

That’s not really a question. It’s more of a statement. I don’t know if you agree with that at all.

JESSE:  I’m skeptical that, if we had unionized workforces among developers, that would actually translate to that sort of change around encoding better security practices. I’m skeptical of that; that’s it. I think that unionizing developers is a really important thing, and I think that the decline in unionization in this country over the last 30 years, and then the incessant destruction and tearing down of unions by the right wing in this country over the last 10 years, is a travesty.

My dad works for the AFL-CIO. I grew up in a family that’s always been very pro-union, and seeing that happen, watching union membership massively decline and seeing what that does to individuals and to communities, I think it’s a travesty. If there are ways we can bring back more unionization and bring back more rights to workers, I’m all for that, all in support. But I am skeptical whether that would translate to security.

I think unionization is primarily a tool to improve the working conditions and rights of the workers, rather than the products they’re making. There need to be other forces to improve the outcome of the end product.

SAM:  Yeah. I’ve been hearing a lot of political rhetoric recently about bringing back more manufacturing jobs. I think a lot of that misses the point: manufacturing jobs weren’t great because they were manufacturing jobs, they were great because they were unionized, and the unions bargained for much better conditions, which built up the middle class, which made everybody else better off.

When you said that, Jay, I also had a little bit of that same skepticism that you expressed, Jesse. I had to stop and think about it for a second, since I’m used to thinking about unions as a mechanism for collective bargaining, which then, as you said, Jesse, comes into improving worker conditions.

It seems like if you’re concerned with professional standards, what you want is some increased form of professionalization. Civil engineering, for example, has standards boards, certification bodies, and codes of professional ethics, which I really feel we should have in software.

It seems like that kind of an organization might be more effective at enforcing these kinds of professional standards. But then it occurred to me that a union is possibly another way to back that up as well. It just may not be the union’s primary concern, but the union can do that too.

JESSE:  Yeah, you have a really good take.

JAY:  Yeah, definitely. This could be a whole other show but —

SAM:  Yes, please.

JESSE:  Bring me back.

JAY:  Some of these insights, like collective bargaining and the ability to, oh, I’m going to sound like a Marxist out here, stop the means of production, are necessary because this stuff affects people. We can [inaudible] some of the stuff, but talking about civil rights and liberties will not. But anyway, that’s a conversation for another time.


SAM:  We’re going to take some time to thank another one of our $10-level Patreons, Sophie Déziel from Montréal. Sophie is a Ruby on Rails developer and Montréal.rb organizer. You can find her @SophieDeziel on Twitter. Thank you Sophie and thank you to all of our awesome contributors.

If you’d like to support us, please visit Patreon.com/GreaterThanCode and that link will be in the show notes.

JAY:  Our first question is from George Adamopoulos: “Citizens are buying a lot of Internet of Things devices that are being used for Distributed Denial of Service attacks. As citizens, are we responsible to some extent for them occurring, regardless of our technical ability at the time of purchase?”

JESSE:  I would say no. If you’re an individual and you don’t know anything about computers and you just go to the store and buy a light switch that is connected to the internet, you cannot be expected to understand the technology behind it. You can’t be expected to patch or fix the technology behind it. I just don’t think that’s a viable way to solve this problem.

Kind of going back to what we were talking about earlier about the professional union and developer responsibility, I think it falls to the company, and to the people at the company, to say, “We’re not going to build software, we’re not going to ship software, to millions and millions of homes around the world that’s going to compromise the security of those homes or compromise the security of the internet as a whole.” So I definitely don’t think that people are responsible for that.

JAY:  But what if we expanded that question to include retailers: Amazon, Best Buy. Do you think they have a responsibility? I know that some of them stopped selling the Samsung bomb phone, right?


JAY:  — The Samsung bomb. Before that, Samsung was like, “Oh, we want to do a recall.” I forget what their process was. Do you think that they have that responsibility, the retailers?

JESSE:  That’s more blurry. I don’t know; I think I need to think about that question a little bit more. It’s like, how much do we want marketplaces and distributors to… I mean, I guess marketplaces and distributors already do a fair amount of selection of the things that they make available to end consumers, so maybe that’s a selection criterion. Probably a good selection criterion.

But I think I’d also be a little bit nervous about… actually, no, I’m not nervous about that. Since they are already doing a fair amount of auditing of what goes on Amazon and what Best Buy sells, I think the security of those devices should absolutely be one of the things that they make into [inaudible], and I think that’s really the only way to make this be about the bottom line, rather than just being about ethics.

If Amazon and Best Buy and the companies that are selling these IoT devices at the end of the line say, “Look, we’re not going to sell devices that compromise the security of the internet,” then the manufacturers are going to have to change what’s going on. If they do sell those devices, then it’s going to continue to be the status quo, unless individuals in those companies step up.

But I think figuring out forces that allow us to exert bottom-line pressure, like, “We are going to make you less money unless you do security,” I think those are really important. So I’d say that there is a responsibility on those retailers to do that.

SAM:  That’s an example of a retailer like Amazon being able to exert market pressure on the manufacturers of those devices. But who exerts the market pressure on Amazon to make that an attractive decision for them?

JESSE:  Yeah, now we’re going back to me saying no. I’m glad I didn’t say a definitive no. That makes me think more about it. Exerting market pressure: I guess it would be good if people exerted market pressure, but do I think people have a responsibility to do that? There’s just so much education required in order to understand those mechanics.

As an educated consumer, if I’m a developer and I know what’s going on, I wouldn’t buy and I shouldn’t buy the sorts of IoT devices that are used in these attacks. But if I’m a day-to-day person, I don’t think you can assign responsibility to people who don’t understand the things that they’re doing.

SAM:  At the risk of alienating the last listener who didn’t already drop off when Jay came out as a Marxist, this sounds like a classic failure of a free market. I obviously don’t know enough about markets and economics; maybe there is a way that we can exert this. But at least at the level that I’m at, it seems like a clear case for some other entity to come in and perhaps regulate the market and say, “No, you actually cannot sell these.”

JESSE:  Yeah, and continuing on that free-market train of thought, there are definitely people who can speak more eloquently about this than me, but I think it also speaks to the whole fallacy of consumer choice: the idea that we as individuals have the power, through the choices we make, to influence the world and the structures, being capitalism and how those corporations work. That idea is widely proliferated, but for the most part, it’s a fallacy; it’s not real. Instead, we are beholden to these structures that exist, that empower corporations to continue their pursuit of making as much money as possible without concern for safety, without concern for security, unless it makes them more money. I think this question kind of plays to that idea as well.

MANDY:  Another listener question comes from Wesley Ellis. He asks, “What are your thoughts on ‘benevolent’ malware that looks for vulnerable devices and patches them without asking for permission from the device’s owner?”

JESSE:  I don’t think I can give an educated thought here. I would like to defer to people who are smarter or more informed about this than me. My sense is that there’s always going to be a push and pull, in software and in systems that are accessible to the entire internet, between people doing good (white hats) and people doing bad (black hats).

We’ve seen a lot of people doing bad with the DDoS attacks that are happening with IoT devices, and I think there is an opportunity for groups of people to coordinate and thoughtfully figure out how we can do good. That might come through pushing retailers, like we were talking about earlier, to not sell those devices, or through working inside of companies to improve security standards, but maybe it will come through shipping malware that patches all these devices. If that’s something that’s coordinated, well thought out, public, and something that people are on board with, then I think it’s a good idea.

I’d be wary of the lone cowboy or cowgirl going out and patching things willy-nilly, because… I don’t know. If you’re a black hat, that sort of isolation is important, because you’re trying to protect yourself from people who are trying to stop you from doing bad things. But if you’re a white hat, there is no reason not to be coordinating, making things public, discussing, and having an open conversation around: what are we going to do? What are the good things we can do? I would definitely push for that path much more than any individual trying to do this alone.

SAM:  Yeah, on its face, that scenario of malware that comes in and fixes your broken stuff is almost literally straight out of a Spider Robinson science fiction novel. It sounds wonderful. “Please, patch my router so I don’t have to.” I’m not even [inaudible], though.

At the end of every show, we like to reflect on what we’ve learned or felt or what we’re going to take away from this. Jesse, since you have a hard stop, let’s have you go first. What are you going to take away from this conversation?

JESSE:  I don’t know if you all have been following the news, but Friday of last week, in Oakland, there was a really big warehouse fire that ended up leading to the deaths of 36 people, a lot of whom are one degree of separation from me. I’ve been reflecting a lot, obviously, holding those people in [inaudible], their families [inaudible], their friends [inaudible], but also reflecting on the structural things that allowed something like that to happen.

One of the things we’ve been discussing on the show today is the balance between the responsibility of the individual in a society and the responsibility of the structures in the society and the organizations in the society to look out for the individual.

Post-fire, there’s been a lot of conversation about the way the building was occupied: how there were people living there, and it was not up to fire code, and there were rickety staircases. There’s been, at times, talk of: how could these people do this? How could they put themselves at risk? How could they think this was a good idea?

One of the things that lots of people who I trust are saying, and that I feel, is that in Oakland [inaudible] and all across the country, we have massive affordable housing crises. That means that people who are marginalized, people who aren’t conforming to the Silicon Valley stereotype, perhaps, are pushed to the edges. They don’t have places to live and they need to find ways to live, so they end up living in warehouses, in suboptimal conditions, and that’s not the fault of the people there. We can’t blame them for that. Instead, we need to look at the systems.

I think it’s a similar thing with security. What we’ve talked about all of today is that we can’t expect people to just go out and magically make security happen, because that’s just not the reality of the world. As we’ve been talking, I’ve been reflecting on the parallels between those two things: how we as a society have a responsibility to look after the people on the edges, and look after the people who don’t have the tools or don’t have the resources to do security themselves, or to get the best housing for themselves. We need to figure out how we put structures in place, how we put tools in place, that allow those people to be lifted up, allow them to be safe, allow them to be secure, allow them to be private. All of us have the responsibility to do that.

MANDY:  I’ll piggyback on that. That’s kind of why I agreed to do this episode, not knowing much about security and being a brand new developer. Security is something that we should be aware of as we’re learning, and it matters for people everywhere, especially marginalized groups, now more than ever, with everything that’s going on in our country. It’s something that we should be thinking about and something that we should be paying attention to.

JAY:  My takeaway from this episode is that security is important. I mean, no duh, right? But it’s something that we need to talk about: what our responsibility is as developers to our companies, to ourselves obviously, and also to the people that we care about. I also want to echo something that was said a couple of times: we have to find better ways for people to secure their own privacy and their own communications.

That’s just something else. We can’t just build the thing. We have to also make sure that it’s usable, and we have to make sure that, beyond the fact that it works, it’s going to be adopted by people. It’s going to be something that’s obviously meaningful and helpful for them. That’s my takeaway from today.

MANDY:  Awesome.

SAM:  In addition to the superficial “Yes, I need to learn more about security” sense that I’m taking away from this, I find myself at the end of the show thinking of a quote from somebody I interacted with briefly, like 15 years ago, that has stuck in my head ever since. The person’s name was Jeme Brelin, and he said, “Stop calling me a consumer. I am neither a gaping mouth nor an open wallet. I’m a citizen interacting in a community.”

I think that’s a good reminder to me that we aren’t only interacting in a capitalist society. I don’t think it’s one or the other; I think we can participate in both. Maybe I’m naive for saying that, but I think on a day-to-day basis we do participate in both. It’s a good reminder to me that my wallet is not the only form of power that I have. I have a voice, and I have leverage I can extend to help other people. It’s a good reminder to maybe pay a little bit more attention to helping other people stay safe with skills that I sort of take for granted, so thank you for bringing that back up again.

MANDY:  Thank you.

SAM:  Thanks very much for joining us, Jesse. This has been a wonderful chat.

JESSE:  Oh, thanks for having me. It’s been a real pleasure.

SAM:  With that listeners, we will see you next week.

This episode was brought to you by the panelists and Patrons of >Code. To pledge your support and to join our awesome Slack community, visit patreon.com/greaterthancode. Managed and produced by @therubyrep of DevReps, LLC.
