Tech'ed Up

Bye, Trolls • Tracy Chou (Block Party)

March 23, 2023 • bWitched Media

Block Party CEO & Founder, Tracy Chou, joins Niki remotely to talk about what her team is doing to make the internet a safer place and why she’s passionate about their mission. Her app puts control back in the hands of social media users through block lists, lock-out filters, and other tips and tricks to combat trolls. The conversation covers why the social media giants aren’t incentivized to do this work themselves, what action is needed for third-party app innovation to thrive, and what our options as users (and people) are - and should be.

"…it [API access] is actually a win-win-win where the platforms don't have to invest their product and engineering resources in building out the surface area of safety product. And it's some stuff that would never make sense for them to build." -Tracy Chou

○ Follow Tracy on LinkedIn
○ Learn more about Block Party
○ Follow Block Party on Twitter
○ Follow Niki on Twitter


Intro:

[music plays]

Niki: Today, I’m talking to Tracy Chou, who’s building an app that allows users (also known as people) to combat online trolls. She and I taped this interview before Twitter blocked free access for developers to their API. Since that change, Twitter has been wonky at best and our lord and savior, Elon Musk, has been running the company like a toddler having a tantrum. That means Block Party and many other third-party apps have a harder time building on the platform… but they aren’t out of the fight yet.

As a human on the interwebs myself, this topic feels personal and I’m rooting for Tracy and the work she and her team are doing!

Transcript:

Niki: Today's guest is Tracy Chou. She's calling in remotely and we will talk about an app that she's building to help deal with trolls on the internet. 

Tracy, welcome to Tech’ed Up.

Tracy: Thanks so much for having me. 

Niki: So, I wanna talk through kind of three parts in this conversation. One, your professional background, which is really, really interesting.

It's technical. You're an engineer. Your parents were engineers. You have a Master's in AI. And then, as part of that, your work in tech. And then, I wanna talk about Block Party, which is the company you founded and the solutions you're trying to create for those of us on social media. Which I think maybe you have a little more positive attitude [chuckling] about social media than I do. I give it an F minus: total hellscape.

And then, [Tracy: laughs] I mean, facts. And then finally, a lot of the people who listen to this podcast work in tech or are tech adjacent, and you have some specific asks to make your company kind of like get to the next level and really embed the product into social media in a way that is useful to users and solves a problem many of us have, which is dealing with trolls. 

Tell me a little bit about your background and your journey through Silicon Valley. 

Tracy: I had a very charmed Silicon Valley upbringing. My parents were both software engineers. I grew up in the valley. My high school invested in the seed round of Snap, [Niki: that's] which is a very funny little trivia detail.

Niki: That’s unbelievable! 

[both chuckle]

Tracy: Yes! I studied electrical engineering and computer science at Stanford. During school, I had internships at Google and Facebook. It was just, like, straight into the valley. When I graduated, I went to work in the start-up scene. So, I was at Quora as employee number five. I joined Pinterest when it was about ten people.

Yeah. When I first started working, I was the only female engineer on my team. I was the only one for a little while. I would go visit friends at other startups and other tech companies and I'd always be paying attention to how many other women were on their engineering teams and what kind of diversity they had in general. And so, I was always tracking this in the back of my mind, like, which teams look more diverse and which ones are, are really struggling. 

And so, I was mentally noting these numbers but also realizing that nobody was officially tracking any of them. And it was actually listening to Sheryl Sandberg, who was at that time at Facebook, talking about how the numbers were dropping precipitously that I had this realization: we actually didn't have any numbers! And this despite being in the tech industry, which is extremely data-driven, where, as an engineer, I've been forced to do A/B testing on everything. Everything has to be justified by the metrics.

And so, I wrote this Medium post called, “Where are the numbers?” Not expecting that anything would actually happen, but did start to see more interest from the community in doing something about it. So, start-ups started publishing some of their numbers and then Google was the, the first of the big companies to take up this mantle. So, very grateful to Google for being that first mover and basically pulling the rest of the industry along. 

Now that the whole industry has basically put out its numbers, we can see that it's not an issue with one specific company, which I think was some of the fear originally as well: that if one company moved, if one company released their data and looked bad, then they would just be a target for maybe discrimination lawsuits or other kinds of, like, bad PR. But now we can see it's an industry-wide problem. We can try to organize at the industry level.

I would say the visibility has definitely helped. I think it's elevated the issue and it's made it possible for people outside the industry to realize that it is an issue. Since that data is out there, it's indisputable. It helps to keep us accountable to whether anything we're doing is actually improving the situation. 

I would like us to have made more progress, but at least we know [Niki: we know!] what level of progress we made or not. [chuckles] 

Niki: Absolutely. I agree and I do wanna move from this into your personal experience and how that has influenced the things you build as an engineer. But the last thing I'll say is on kind of the diversity reports, you absolutely need the numbers to have any accountability.

And my personal experience is not that the companies I've worked for haven't done their level best to try to recruit, but that they can't retain, and you can't fill a bucket with a hole in the bottom.

Let's talk about the block button and blocking and your personal experience with that. 

Tracy: Yeah. My career basically starts with blocking. The first thing I worked on when I joined Quora as a new grad was the block button. Even though we were only a few thousand users on the site at that time, there was already somebody who was harassing me and I wanted to make it stop. So, happily, as an engineer, I was empowered to do so by building the block button and made him the first person ever blocked on Quora.

And then, I went into working on a lot of other moderation tools and admin tools. But one of the big takeaways I had from that experience was that if I hadn't been there at that time and hadn't had those experiences, which were more likely to happen to me as a young woman on the internet, the company would not have prioritized building in those safety tools.

So that was a, a big link to the diversity questions of like, if we don't have diversity in our teams, we're not gonna have these perspectives that are so important for the products that we need to build. 

Niki: That's right. I think that one of the things, and we can talk about this, we've talked about it before, is if you're not viscerally feeling what it's like to be harassed on the internet, then it's; you don't get that, for me at least, that cold feeling in your stomach that makes you feel like, “This has to be fixed, or I'm just gonna not be visible on social media. I'm just gonna opt out of this.” 

But if you're feeling it the way you have and a little bit the way I have, then you're gonna, you're gonna understand, like, this has to be fixed because this is so impacting my personal use case. Whereas people who don't experience it, it just doesn't feel that way to them. They don't, they might know it exists in an abstract way or a rational way, but they're not as motivated, I think. 

Tracy: Absolutely. I had this experience from doing the diversity activism work around the data, where I built more of a profile for myself online, mostly on Twitter. And I got a lot of harassment for this.

I was getting everything from low-grade sexism and racism, where the insults were not that bad but having to deal with them all the time is still very annoying, to really sustained stalking and harassment by people I didn't know, but they would fly around the world to try to find me in person, so I'd have to go to the police to file reports.

I was worried for my physical safety, and I've also had the occasional firestorms of coordinated attacks from sites like Reddit and 4chan. So, it's terrible!  Like, it's emotionally devastating. I felt like my life has been upturned many times by this. 

Niki: And the fact that you have had this happen and are still so public and present. And then we'll talk about Block Party and what it, how it's gonna address some of this.

I work in and around crypto, which means like, “How many scary clown emojis can one woman get?” And the answer is, like, “Infinite!”

So it's, it's mild to nothing. [chuckling] But, but I do have the opposite experience of you, which is I do have someone who in real life is fixated on me and that's moved into the cyber world, and there's nothing I can do. 

You just get the incoming and it's completely disconcerting, especially because it started in real life. And so, what that has caused for me is just locking down how people can communicate with me, making it hard to get in touch with me, which isn't great for my business, but it's something I feel I have to do to protect myself. And that's one person who's impacting my entire experience on social media.

So, tell us a little bit about how Block Party works and how it can address some of these issues. And also let's also talk about why the companies themselves aren't building these solutions. 

Tracy: Yeah. So, just to quickly react to some of the things you were saying, I think it is that trade-off I find so frustrating, between all the good stuff that you can get from being online, whether it's visibility for your work and your business, and then having to deal with all the negativity, the toxicity, or the harassment. Like, a lot of people do opt out of the good because they don't wanna deal with the bad. And that was a lot of the founding impetus for Block Party.

And the other part of it was for me, wanting to assert agency and not wanting to feel like I was just under attack and vulnerable, and there was nothing I could do, and that I was helpless. And so, being able to assert agency felt very important to me to, like, flip that frame. So, with Block Party, the goal is to make it so that everybody can participate online and be in control of their experience. They can partake in the good stuff while getting rid of the bad.  

The way that our classic product works, the first product we built, was on top of Twitter. There's a few key functions. One is automatic filtering, so if you think about, like, a spam folder, but for social, where things are just automatic, automatically getting cleared out, put into a folder you can go look at later if and when you want to, but you just have a better experience. Things are cleaner. You're not dealing with seeing the bad stuff before you can take some action on it. 

Historically, people would say, “Well, you can always block and mute.” It's like, “Well, the damage has already been done. ‘Cause I've had to see it to know that I want to take some action on it.” So, what we can do is automatically filter that out. 

The other key function we have is for the preemptive protection. So, things like mass blocking, if you wanna block all the people who've liked or retweeted a tweet that is harassing you or spreading misinformation about you.

These two combined, sort of, like, the day-to-day better experience then, like, in more extreme situations, being able to take action, kinda, like, just make that whole experience of being online better. There's additional functionality, but on top of that, too, so, things like engaging your community.

There are often other people who want to be able to help you, but the way that platforms are designed generally is that you are the only person that can do anything about attacks coming in towards you. So, we allow you to add a helper onto your account who can then take actions on your behalf. So, they can block on your behalf, or they can actually fish something out that didn't need to be filtered and unmute them. 

And we also have things like built-in documentation, even for deleted tweets or tweets from suspended accounts, which can be really useful if you do need to compile a report to file it with the platform or take it to law enforcement.

So, just really thinking about that whole experience from the perspective of the end-user or the person. 
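
(For the technically curious: a rough sketch, in Python, of the mass-blocking flow Tracy describes above. The endpoint shapes follow Twitter's v2 API as it existed before the access changes mentioned in the intro; the token and IDs are placeholders, and this is an illustration of the idea rather than Block Party's actual code.)

```python
import requests

API = "https://api.twitter.com/2"
# Placeholder: a user-context OAuth token is required; auth setup is omitted here.
HEADERS = {"Authorization": "Bearer <USER_CONTEXT_TOKEN>"}

def block_likers(my_user_id: str, tweet_id: str) -> None:
    """Block every account that liked a given tweet, page by page."""
    params = {"max_results": 100}
    while True:
        page = requests.get(
            f"{API}/tweets/{tweet_id}/liking_users", headers=HEADERS, params=params
        )
        page.raise_for_status()
        body = page.json()
        for user in body.get("data", []):
            # Block on the signed-in user's behalf, one liker at a time.
            requests.post(
                f"{API}/users/{my_user_id}/blocking",
                headers=HEADERS,
                json={"target_user_id": user["id"]},
            ).raise_for_status()
        next_token = body.get("meta", {}).get("next_token")
        if not next_token:
            break
        params["pagination_token"] = next_token
```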

Niki: The person, yes! The end user, who is a person!!  [Tracy: Yes]

So, I wanna go back and just recap three things you said. So, one is almost like a spam folder, which everything would go into, but then you choose when you're ready to look at it. Cuz one of the things for me is this sort of, like, out of nowhere, I get a new follower request from this person, and it's disconcerting in the moment, but if you can decide, “Okay, I'm ready to look at this folder. Some of the things will be good, some will be bad,” you have more control.

A helper, which is clutch, because they aren't gonna have the same visceral reaction to this attack that you're personally gonna have cuz it's aimed at you, which I love that idea.

And then, finally, thinking through these systemic, like the mass blocking. For me, I would love to see a feature where I could automatically block anyone whose account was created within the last, say, three months. Because it's, people just create new, and new, and new accounts.  You can block all day long, but if they're, if they're brand new, I sort of know that it's not somebody I need to be interacting with. 

And so, I think, I love these ideas. One thing you didn't mention is machine learning. [Tracy: Mm-hmm] Tell me why?

Tracy: Yeah. So I do have a background in machine learning, and I have previously built machine learning models for recommendations and for trust and safety.

We chose not to use machine learning or any AI in our initial implementation of Block Party because we didn't need to. The heuristics that we use for our filtering work really well, and it's things as simple as what you were describing: if somebody's just created their account within the last seven days, they don't have any followers, they have no profile photo, they're probably not somebody you really need to engage with. And these heuristics, they're good enough for us right now.

The other thing that's really key for us is helping users to understand what's happening. So, the understandability of what's going on with the filtering. One of the questions that is coming up with a lot of the, the AI models now is, “What is actually happening behind the scenes?” And we have a lot of difficulty understanding what's going on. 

With our systems, with Block Party, it's as straightforward as, like, “We muted this person automatically because they have less than a hundred followers.” And users can see that and they can understand, “Oh, if this was a false positive, maybe I wanna turn off this filter. And I, I know what's going on. Like, I know the rules that are going into it.”
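
(A minimal sketch of the kind of rule-based filter Tracy describes, in Python. The thresholds and fields are hypothetical, not Block Party's actual rules; the point is that every decision comes with a reason the user can read.)

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Account:
    # Minimal stand-in for the account metadata a platform API exposes.
    age_days: int
    follower_count: int
    has_profile_photo: bool

def filter_reason(account: Account) -> Optional[str]:
    """Return a human-readable reason to auto-mute, or None to let the account through."""
    if account.age_days < 7:
        return "account was created within the last 7 days"
    if not account.has_profile_photo:
        return "account has no profile photo"
    if account.follower_count < 100:
        return "account has fewer than 100 followers"
    return None  # passes every heuristic

# A brand-new, zero-follower account gets filtered, and the reason can be shown to the user.
print(filter_reason(Account(age_days=2, follower_count=0, has_profile_photo=False)))
```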

Niki: And also, because this is a consumer-facing product, and you just said this, giving people agency and control. If, if it's just AI, you don't really know everything, you might overcorrect or be blocking things that aren't really your specific, I mean, I would love to block scary clown emojis. That'd be great, [chuckling] for me personally. 

Tracy: And you can do that, actually! Yeah. We, we actually built out emoji blocking.

Niki: Oh, amazing!  

Tracy: Like, some experience I had as well dealing with crypto spam where it's like, oh, there, there's some emoji that appear a lot and I really just wanna get rid of all of it. 

Niki: Like, it's literally the poop emoji for me; poop emoji and the scary clown. I'm just like, “I don't; there's nothing good that's gonna come from those two emoji.” [chuckling] 

Okay, I'm definitely getting this! I need this. 

I don't get it a lot anymore because, frankly, like, I just don't enjoy participating in Twitter as much as some other social media platforms, partly because of things aimed at me, which again are one one-thousandth of what you've dealt with, one one-millionth of what you've dealt with.

And then also because, like, in 2023, the year of our Lord Elon Musk, it seems to be getting worse actually, like the desire from Twitter to change things. But maybe rather than just indicting Elon Musk thinking through what you said before, like why don't they just build these things or create features people can use?

Tracy: What I find really instructive here is always to look at the incentives, and particularly for corporations, like what are the business incentives? And for social media platforms, their goal is to drive up engagement and growth. That is the number one thing they're gonna optimize everything around. 

And then, when it comes to solving user safety, of course there will be lip service to, like, [in a deep voice] “This is important. Y’know. We are working very hard on it,” but if you look at the incentives, it's just never going to be a top priority, even if we're talking about things like being able to block the users who have fewer followers and new users.

A platform would never want to build that in because they want to encourage new user growth. Like, they don't want new users to sign up and then not be able to interact with anybody. So, that goes against all of their growth incentives to build in this kind of safety functionality, but for people who've been around for a while and are experiencing harassment, they want this tooling to be able to improve their experience.

So, we're looking at the incentive design here. It only really makes sense as a third party or somebody who's much more aligned with the end users as we are, as Block Party or like other third-party developers, to build out the solutions for people that really address their problems versus what makes sense for the platform overall when they're optimizing engagement.

Niki: And to your point, they have to optimize engagement because when they give their earnings report, one of the key numbers is new user growth, to your point exactly. I mean, I have sort of suspected this, I don't know if it's true. I know Elon Musk thought it was true. Now I don't know what he thinks, [chuckling] but it's that I would feel like, “Okay, how many of these new users are just trolls, or bots, or, like, troll farms?”

Because you, you need the numbers to go up for your earnings call, so there's no incentive to shut them down because you're held to this, like, almost impossible up and to the right expectation. And so, to your point, there's no financial incentive to do it. It's not that people maybe don't care. It's that why would they do it? Why would they put any ENG resources into that? 

You're the CEO and founder of Block Party. You've devoted a big chunk of your professional life to coming up with these solutions to help people have, like, a less horrific [chuckling] experience on the internet and give them some agency. What do you need? What do you need maybe from Washington or in general to help this app work better?

Tracy: Yeah. First, let me reframe the problem, which is that when it comes to addressing social media harm, most people think of two possible classes of solutions. One is demanding that platforms do better. The second is demanding that governments step in. And Block Party belongs in the third class of solutions that I want more people to know about, which is to enable the people who want to build solutions to be able to do that.

So, we are a whole other set of, like, possibilities here. What we can't do right now is integrate very deeply with most platforms because they don't have the level of openness that Twitter has historically had, which is imminently about to change.

With the platform so closed off, it really limits what these third-party developers who are trying to build out other solutions can do. There's still enough demand for these safety solutions and private solutions that there are companies attempting to build in this space, but it's just a lot harder when there's not the API access where we're able to build these solutions programmatically.

And so, what can happen on the policy front, platform front, is there can be legislation that requires a level of openness with these APIs which then creates the market for innovation on top of the social media platforms, allowing this whole layer of solutions to exist. 

Or platforms can also just recognize that it's really good for the user experience to enable these different applications to be built. So, with Twitter, the product that we've built with Block Party, we've been able to show that users legitimately have a better experience. They have a drastically better experience dealing with harassment, and spam, and all that if we can build these tools for them. 

And so, it is actually a win-win-win where the platforms don't have to invest their product and engineering resources in building out the surface area of safety product. And it's some stuff that would never make sense for them to build.

So, things like the documentation functionality we built for Block Party, like, I just can't imagine a platform prioritizing their very expensive engineering resources to go build that cuz it's not core to the platform.

So, the platforms benefit from having somebody else build this product. Users get a better experience. They have more choice. People may want to experience platforms in different ways. And with these third-party developers, like, we can provide that full range of optionality versus just whatever the platform has decided is the default for everybody.

And then obviously, like, y’know, we are very excited to be able to be in this space of providing solutions. So, the big takeaway is, like, more openness and that can come either from the platforms proactively opening up APIs or policy-side pushing for this as a requirement. 

Niki: So, quickly, for our listeners who aren't super technical: APIs. So, if you're a developer and you're building a tool, can you just explain what an API is and how it works?

Tracy: Yeah. So, API stands for Application Programming Interface, and it means that a third party can, like, take actions programmatically on your behalf. So, for example, one thing that Block Party does is automatically mute people for you.

What you'd be able to do, as, like, an end user, is you can mute somebody. What is exposed to the API is this muting functionality, so Block Party can do that for you. So, what we need to be able to build out these solutions is access to all the data that you would have as a user to do those things programmatically for you. 
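
(As a concrete example of that exposed muting functionality: a short Python sketch of a programmatic mute against the shape of Twitter's v2 mute endpoint. The user IDs and token are placeholders, and authentication setup is omitted.)

```python
import requests

def mute_user(my_user_id: str, target_user_id: str, token: str) -> None:
    """Mute target_user_id on behalf of my_user_id via the v2 mute endpoint."""
    resp = requests.post(
        f"https://api.twitter.com/2/users/{my_user_id}/muting",
        headers={"Authorization": f"Bearer {token}"},
        json={"target_user_id": target_user_id},
    )
    resp.raise_for_status()  # surfaces a rejected or unauthorized request
```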

Niki: This isn't a, sort of, picking winners or losers. This is just creating; the idea would be legislation just kind of insisting on openness to developers to build on the platform. It doesn't really take away from the platform itself. And if you have data that show that people have a better experience, ideally that's good for the platforms themselves, but we know it's really hard to get anything done in Congress, [chuckles] and especially right now, a little bit of sand, a lot of sand in the gears. 

And so, what would be easier and simpler for you is if the platforms, if Instagram and if Twitter, well, Twitter's now restricting their API. They used to be quite generous about it, but if these platforms just allowed the opportunity for people like you to be able to build on, on top of their existing products so that the end user, which, again, we'll say are people, people can have more choice and have a better experience unlike people like me, who just basically opt out most of the time.

I would probably be online more if I felt a little less [pause] grossed out by the experience. [chuckling] So that's kind of, is that in a, in a nutshell?

Tracy: Yeah. That's it! Yeah. And I would also add, we've seen other examples of app ecosystems that have made the experience better. So, if you look at Apple and Google, like, on phones, you have so many different applications you can use that the companies themselves have not built, but they've enabled third-party developers to build out all this different functionality that makes your life as an end user, as a consumer, much better. 

Niki: Yes, definitely! And I feel like; we'll end on this, I think that, again, I'm, I'm really impressed at your tenacity and your just sort of stick-to-it-iveness with, like, you're gonna be online, and you're gonna do you, and you are building solutions.

So, I think what you're doing is, it's impressive, it's important. And let us know how people can sign up for a Block Party or find it.

Tracy: Yeah, so we are blockpartyapp.com, or you can find us on Twitter at blockpartyapp_. [Niki: Okay] We're working on getting a better handle, but 

Niki: [interrupts, chuckling] Best wishes to, best wishes to you! There's no people left-

[cross talk] 

We'll root for you to get a better handle. And in the meantime, I'm gonna drop that into the show notes so people can find it.

And on a final note, I'll say, I know you're hiring. You're looking for talent. So, anybody listening to this who's like, “I would actually love to get in and work on a tech solution like this.” You're hiring, so we'll point them in your direction. 

Tracy: Thank you. Yes. We'd love to have more help in making the internet safer for everybody. 

Niki: Very good. Thank you so much, Tracy, again for taking the time to come talk to us. 

Tracy: Thanks for having me on your show.

Outro:  

Niki: Be sure to tune in for our next episode with Debbie Taylor Moore. She’s a legend in the cybersecurity business and I’ve been eager to host her ever since I heard her speak on a panel at the Consumer Electronics Show in Vegas. We connect in the studio and talk about the unthinkable: What would you do if the entire electrical grid went down? How should you be preparing yourself?

This is our 50th episode and, as always, I want to thank you for listening, subscribing to the podcast, and telling your friends about this show.