Dating guru resurrects Robin Sage by social engineering TS/SCI holders on LinkedIn

LinkedIn is still the "safest," most-trusted social media site to connect with people, right? One DEF CON presentation proves it could be the riskiest network of all

LAS VEGAS (DEF CON) — Jordan Harbinger, co-founder of The Art of Charm, a dating and social dynamics instruction school, isn't a hacker. But he used his basic knowledge of the social scene to social engineer people with Top Secret / Sensitive Compartmented Information (TS/SCI) clearances on LinkedIn.

Social media is a major asset to people and organizations; at the same time, it can pose a serious risk, because once something is online, it remains there forever. When it comes to people, social media can be a social engineer's dream come true. Instant access to data, combined with easy human interaction, can be a recipe for disaster for those who are quick to befriend a friendly face.

Years ago, the Robin Sage experiment demonstrated how easily a person can be connected, either directly or indirectly (the friend-of-a-friend scenario), with someone who may not be what they claim to be.

When the Robin Sage experiment concluded, hundreds of seasoned security professionals, officials and staffers from the Department of Defense, as well as others from various three-letter agencies, were linked to a false persona, and many of them violated OPSEC (operations security) and divulged somewhat sensitive, if not personal, information. They told a complete stranger the type of things that a malicious actor could later use against them.

[Related: The Robin Sage experiment]

At DEF CON on Friday, in the social engineering village, Jordan Harbinger resurrected Robin Sage in a way, explaining how he used LinkedIn, a cleverly created recruiter profile, and his assistant's image on Facebook to get military and intelligence workers (all with some type of clearance) to discuss themselves and the types of projects they were working on. He then discussed how all of that information, collected bit by bit, can be used to gain access to such a person. Interestingly enough, the idea for his experiment came from a discussion he had with students.

During his talk, Harbinger explained that some of his clients, Top Secret cleared missile scientists, had discussed their projects in front of him. Nothing overly secret, but given the nature of their work, most of what they do falls under a cloak of secrecy. Initially confused by the disclosures, he wondered why they had happened in the first place.

"Then I realized, that the same trust triggers that we use with women, with people, and friendships, are the same triggers you can use to get sensitive information from people," Harbinger explained to CSO in an interview.

To test this theory, he went to LinkedIn, and from there "it took on a life of its own, it was so easy."

The reason his experiment worked is that he followed the basics of human interaction. For example, "telling somebody personal information about yourself, makes them think that it's safe to open up, and it usually is."

[Also from DEF CON 2013: Attendees demonstrate social engineering prowess in CTF contest]

"If you do that on a social network with people, and you ask them for help and you become vulnerable, a lot of times their trust triggers are switched on," he explained.

At that point, those whose trust triggers have been activated don't see the information sharing as a violation of policy or security; they view it as helping someone in need, demonstrating one of the core traits a social engineer exploits: human trust and the drive to help.

Asking people directly for details about their project may not yield any results; when an engineering student asks for help, however, the story is completely different. The Top Secret cleared engineers will then "throw them a bone" and share basic details about what they're working on. It's through this passive elicitation that little nuggets of information add up to a much larger picture.

Harbinger was able to join a group on LinkedIn by posing as a recruiter looking for people with clearances. He was amazed at how little verification was required to prove who he was; he needed little more than an email to the group owner before he was added to the list. Once in the group, he started interacting with members.

"Most of the stuff that was in the group was not that interesting, almost all of it [was] job offers," Harbinger explained.

Interacting with the group was easier than expected. Because he was working with small numbers (about 50 people in all), LinkedIn's restrictions on adding connections didn't apply; the site allows a small number of connections to be made without knowing the other person's email address as part of its basic service. Given that he was already in the group, the members Harbinger focused on added him almost immediately. From there he started messaging them about fake job offers.

Some of the people he spoke with, to their credit, were practicing some form of security and would not discuss where they worked or what they were working on. But there was a solution for that, in the form of Facebook. Using the data collected on LinkedIn, it wasn't hard for Harbinger to find many of his contacts on the world's largest social network. He created a profile of an aspiring engineer who just happened to be a smart, attractive woman (played by an image of Harbinger's assistant), and started collecting friends.

Those who were hesitant to share details before did so now, as they thought they were helping a beginner: one who was smart, knew the field she wanted to work in, and wasn't shy about asking for pointers and advice. From there, it was easy to look up email addresses and Yelp reviews to determine not only where a person lives and works, but also their favorite eateries and entertainment spots.

Such data, when combined with what is already known thanks to LinkedIn, can lead to the discovery of additional information that bad actors can use, either to launch an attack against a company or to conduct espionage. It was Robin Sage all over again.

While Harbinger didn't take things too far, both for ethical reasons and for legal ones, it wouldn't be hard for someone else to do what he did, assuming they're not doing it already.

"If I wanted to take this further, and I had a budget, it wouldn't cost me anything to hire a smart woman and seduce some scientist," he said.

And that's the point, proven once again: when it comes to the human element of the security chain, it can and will be broken. Even those trained to avoid questions, to keep some things secret, have a basic instinct that compels them to trust complete strangers on some level.

Once that level is discovered, then all an attacker, con artist, or other malicious actor needs is focus and time. Eventually, they'll get what they're after — or they will move on to another human link in the chain and start again until the desired results are achieved.
