Clubhouse app raises security, privacy concerns

It’s got $100 million, loads of personal data, questionable privacy practices, and no CSO: What enterprise security needs to know before employees join hot new social app Clubhouse.


Social media app Clubhouse has been on the market for less than a year, and it’s already facing privacy-related court filings and fallout from an exploited data leak in which a user recorded private conversations, user login information, and metadata and shared them on another website.

What is Clubhouse?

Frequented by celebrities like Oprah Winfrey, Elon Musk, Ashton Kutcher and others, Clubhouse provides chat rooms that users can join for open conversation. Since famous people are on the service, marketers want to be there, too, with marcomm notables like Guy Kawasaki and Seth Godin joining the club. Brands like Milk Bar, Kool-Aid, and Politico have created profiles and, in December, Clubhouse even started an official influencer program.

Clubhouse security risks

As marketers increasingly get on board, the security risk their usage presents to the enterprise also grows. Marketing has always been a difficult department for many security teams to cooperate with: Its role is to share data, while security’s is intrinsically to protect data and in some cases hold it back. As the two departments work together to keep information safe, what do security professionals need to know about Clubhouse?

Like Facebook in its early days, Clubhouse’s user base is driven by exclusivity. In addition to having famous people on board, the app is currently iPhone only. To join, people must have an invite from an existing user. To invite new users, existing ones give Clubhouse access to the address books on their phones—whether the people loaded in those contact lists want Clubhouse to have their information or not. Clubhouse then creates shadow profiles that show how many connections these not-yet users have, prompting people on the platform to invite them: “Don’t you want John to join the club? He already has nine friends!”

This, of course, has raised security concerns Clubhouse has yet to answer. The startup has the private cell phone numbers of some of the wealthiest, best-known people in the world, yet it does not appear to have a chief security officer. It records conversations that happen in the app and temporarily stores them on servers in China through its partnership with Shanghai-based startup Agora.

In Europe, the company is facing a cease-and-desist order filed January 27 by consumer watchdog group the Federation of German Consumer Organisations for violating the General Data Protection Regulation (GDPR). On February 21, Clubhouse suffered a data leak when a user manipulated the system to stream chat room conversations on a third-party website.

“[Clubhouse] is an anti-privacy super-spreader vector with backend infrastructure in China. It’s not safe to be in the [App Store] and should be taken down [and] rearchitected to be made safe for its users (and non-users),” Debra Farber tweeted February 14.

Privacy lead for Amazon Prime Video, Farber has quickly become an independent expert on Clubhouse security—or the lack thereof. She’s not the only industry leader speaking out against the app: Lourdes Turrecha, founder of privacy evangelism initiative The Rise of Privacy Tech—where Farber is an advisor—has accused Clubhouse of leveraging fear of missing out (FOMO) to overcome significant privacy concerns. Daragh Brien, managing director at Castlebridge, an information governance and privacy advisory firm, has gone so far as to call the app a “minimum liable product”—playing off the startup term “minimum viable product”—defining it as “a product that has done so little to address regulatory or user safety needs that it is a lawsuit in a box and investors should tread carefully.”

These are not glowing words, yet when marketing departments see the allure of Oprah Winfrey and Ashton Kutcher glistening before them, harsh tweets from security and privacy experts aren’t likely to turn them away. “While there are some risks...I’m not sure if the restriction would do any good,” says Vykintas Maknickas, CSO for Nord Security. He’s operating under the assumption that “Clubhouse functionalities are designed with good intentions in mind [but] some of them may get abused”—like what happened with that chat room data leak. He cautions against assuming Clubhouse will commit the same privacy violations Facebook has simply because both are social networks.

[Editor's note: On March 14, Clubhouse CEO Paul Davison announced that it would no longer require access to members' phone contacts and that users can request that Clubhouse delete previously collected contacts.] 

Risks of employees using Clubhouse

When it comes to the chat room recordings, Maknickas says, “[U]sers shouldn’t be more concerned than when some of their employees are speaking at [a] conference.” Whether it’s within the Clubhouse app or leaked to a third-party site, speakers are still sharing information publicly.

Consider partnering with public relations to develop a policy that helps speakers promote the company brand while keeping its data safe. “If company policy is to review all the talking points and guide what needs to be dismissed, then yes, there might be more difficulty to coordinate the information flow. But the same can be applied to the majority of social networks and Clubhouse by itself doesn’t bring any more concerns than others,” Maknickas says.

Yet according to Marc Gilman, general counsel and vice-president of compliance for risk detection company Theta Lake, “[G]iven the security concerns, absolutely anything beyond the sharing or discussion of public information would be totally off limits.” At a conference, attendees are listed and vetted. Some even operate under Chatham House Rule, which restricts the sharing of information post-event. Clubhouse, on the other hand, encourages all users to jump into any conversation they like with no sharing restrictions. “Without a mechanism for pre-vetting recorded content using supporting technologies, there are risks that conversations could veer into complaints or contain sensitive PII [personally identifiable information] or confidential information that would be very problematic,” Gilman says—“particularly those in regulated fields [which] need to consider the broader reputational risks associated with these activities.”

Even those not in regulated industries may have non-disclosure agreements (NDAs) that protect client PII like phone numbers, addresses, birthdays, and all the other data Clubhouse users store in their contacts. Marketers who use their personal phone for work may not think about the fact that those are company contacts they’re uploading. The shadow profiles Clubhouse then creates for these clients will show your company’s employee as a connection—a connection that could be seen by any competitors who also have the client in their phone, posing possible competitive intelligence concerns.

How to minimize risk of employees using Clubhouse

Employers with bring-your-own-device (BYOD) policies should take extra care that sharing contact information doesn’t violate NDAs or other legal and security policies. That starts with education on the risks of using Clubhouse, accompanied by clear policies on its use.

One way to avoid exposing contacts and data is to control, as much as you can, the data Clubhouse can access from employees’ personal accounts. Most enterprise mobile device management (MDM) systems let you segment personal and work data on both user- and company-owned devices, as does Apple’s unified management framework. If neither is an option for your organization, MDM apps are available in the App Store.
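As one illustration of that segmentation, Apple’s MDM protocol includes Restrictions payload keys that keep managed (work) and unmanaged (personal) contact stores apart on supervised iOS devices. The configuration profile fragment below is a sketch, not a complete profile: surrounding payload metadata (identifiers, UUIDs, version) is omitted, and these keys take effect only on supervised devices running iOS 12 or later.

```xml
<!-- Restrictions payload fragment (com.apple.applicationaccess).
     Blocks unmanaged (personal) apps such as Clubhouse from reading
     contacts held in managed (work) accounts, and blocks managed apps
     from writing into personal contact stores. Supervised iOS 12+. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>allowUnmanagedToReadManagedContacts</key>
    <false/>
    <key>allowManagedToWriteUnmanagedContacts</key>
    <false/>
</dict>
```

With these restrictions in place, a Clubhouse install on the personal side of the device cannot harvest work contacts during its invite flow, which is the exposure path described above.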

Clubhouse is also building an Android version, so plan ahead by requiring Android users to enable their phones’ work profile feature, which keeps work apps and data in a separate, managed container.

With more than 10 million downloads at the time of writing, Clubhouse doesn’t show signs of slowing. The startup is valued at $1 billion with more than $100 million invested. Two trust and safety analyst jobs are posted on the company site. Their descriptions, however, indicate that these are more user-facing communication roles, as opposed to actual data security positions.

Clubhouse’s lead investor, Andreessen Horowitz, did not respond to a request for comment. Clubhouse sent a do-not-reply email encouraging CSO to join Clubhouse.

Copyright © 2021 IDG Communications, Inc.
