Google’s video-sharing platform is allegedly censoring the security community – even those who are hacking for good.
PoisonTap is a hacking tool that helps a user gain access to a locked device. With a few pieces of hardware – plus around 10 minutes of your time – you can learn how to make your own by visiting a YouTube channel.
“It’s always been a bit of a hobby,” said Dale Ruane, a self-taught hacker who created the channel, DemmSec, in 2012, and is currently employed as a pen tester.
“It’s only after I went to university that I realized that there’s actually a career here.”
Ruane, 23, started hacking 10 years ago after he found tools online that completely altered the way he thought computers were meant to function.
The only difference then was that Ruane didn’t have YouTube, which has since grown into an educational resource for infosec rookies and a platform for creators to hone their craft.
“When I was in secondary school, there was nothing out there,” he told The Daily Swig. “I couldn’t learn any of this without going somewhere that looked a bit strange or dodgy or wasn’t necessarily 100% legitimate.”
The intention of DemmSec is to provide tutorials on hacking tools and other aspects of a legal, practical passion that Ruane never had access to as a child.
But while he presents this information with a strong moral code, the fact that something like PoisonTap could be used for malicious purposes – say, stealing a person’s information – has resulted in the demonetization and censorship of channels like Ruane’s. It happens regularly, he says.
“The average person will just assume a hacking channel is criminal,” said Ruane. “The generic response [from YouTube] is that it breaches community guidelines, but they’re fairly closed about any other reasoning.”
How YouTube defines acceptable content has certainly underlined the difficulties in striking a balance between safety and transparency in today’s digital age – an issue that the Google subsidiary has grappled with consistently since 2012, amid efforts to remove videos deemed potentially problematic to advertisers.
Facing more recent pressure from governments to crack down on terrorist propaganda, this year YouTube released its first report into the number of videos it deletes using a combination of automated tools and human moderators – a total of 8.3 million were taken down between October and December 2017.
The largest share of this content (30%) was flagged for being sexually explicit, followed by spam (26%), abusive speech (16%), and violence (14%).
“So, some guy, who’s getting paid €500, or an algorithm, which is actually really stupid, is going to determine if my video is appropriate hacking or not,” Chris Abou-Chabke, founder of Black Hat Ethical Hacking, told The Daily Swig.
“Tell me that you understand the mindset of a hacker so that you can judge my video. The whole system is flawed.”
Black Hat Ethical Hacking is a group that provides hacker-for-hire services for businesses, such as data recovery or penetration testing, and additionally offers courses for those wanting to get into offensive security.
They believe that in order to secure an organization’s infrastructure, you need to think like the criminal who wants to break in.
“A hacker is really misunderstood and misrepresented,” he said. “And the way they represent black hat as being criminal is wrong.
“Black hat is not criminal – it’s actually one of the elite hackers, which separates them from the kids. A criminal is a criminal no matter what they do.”
Abou-Chabke launched a YouTube channel for Black Hat Ethical Hacking in 2015. When it was taken down two years later, he lost almost 3,000 subscribers.
He relaunched the channel and soon became an expert in YouTube’s lengthy appeal process, through which videos may be reinstated if they are found not to violate the platform’s broadly written community guidelines – or the law.
YouTube removed two videos on Abou-Chabke’s channel demonstrating distributed denial-of-service (DDoS) attacks, but reinstated them after he explained how his tutorials reflected the skills required to properly evaluate security systems.
“If I am demonstrating a zero-day attack on Google or Yahoo or a public IP, that’s probably not a good idea because you need written permission from the owner in order to hack them for the purpose of finding vulnerabilities,” he said.
Abou-Chabke explained that if he is simulating a DDoS attack, for instance – an offensive tactic that disrupted 33% of businesses in 2017 – he needs to ensure that he’s presenting the information on a machine that he owns, or one that he has permission to use.
Describing an attack on a specific target, perhaps Facebook, also skirts the line of wrongdoing, particularly if Facebook has a serious security flaw that hasn’t been patched.
“I wouldn’t feel guilty if somebody did something bad with this knowledge because I’ve said in my disclaimers that the information should be used to prevent such an attack,” said Abou-Chabke. “I’m not in any way promoting it [criminal behaviour].”
Public safety threat?
In March of this year, YouTube enraged a significant segment of its user base following a decision to tighten restrictions on content featuring firearms.
This included videos showing how to manufacture, modify, and sell guns or gun-related accessories, notably in regard to the customization of an automatic pistol. The move came in the wake of the mass shooting that had occurred in Parkland, Florida, just over a month before – a tragedy that took the lives of 17 people.
Now with the front-page headlines increasingly dedicated to threats of cyberwarfare, and government attempts to get rid of end-to-end encryption for messaging apps, it should come as no surprise that hackers have found themselves in the same boat as responsible gun owners.
“I hold essentially the same view for all videos on YouTube,” said Ruane. “When it comes to tutorials, and the kind of area that I sit in, where it’s educational videos or how-to videos, all I’m teaching people is how to use the tool.
“How they go out and use that tool is not part of what we should have to deal with, but asking YouTubers to accept these restrictions is [acting] as if we are in the wrong.”
YouTube, which will expand its human moderator team to 10,000 by the end of 2018, did not respond to a request for comment on its stance on ethical hacking, or on whether its staff are trained with both the technical and cultural know-how to ascertain what constitutes a threat to public safety.
The company, however, has frequently worked with global police forces and governments, whether in counteracting extremism or in tackling violence such as knife and gang crime in the UK.
When pressed about whether hacking tutorials had ever come under its radar, a spokesperson from the UK’s Department for Digital, Culture, Media & Sport told The Daily Swig: “Online platforms should immediately remove content not complying with their own policies.
“We want to make the UK the safest place in the world to be online.”
The UK has put forward a five-year £1.9 billion investment aimed at the prevention of cybercrime as part of its National Cyber Security Strategy launched in 2016, amid rising reports of fraud and computer misuse in Britain.
Calls to get more young people into cybersecurity careers have highlighted the government’s commitment to its digital infrastructure.
It is equally committed to stopping teenagers from becoming involved in cybercrime, something which the UK’s National Crime Agency (NCA) believes is fuelled by the availability of free hacking tools online.
Videos promoting illicit activity certainly exist on YouTube, but when it comes to open information versus censorship, Ruane has a different point of view.
“I feel as though censorship of this type of content is harmful to the community as a whole,” he said, admitting that people who aren’t aware of ethical hacking often ask if he can be hired to spy on their partners.
“I think a lot of [the] time it takes someone to say that you shouldn’t be doing that, and if there’s no mentor in their lives, then they’re obviously not going to receive that information and be more likely to turn to black hat websites to obtain information, and get sucked into illegal activities.”
‘I’m a professional’
YouTube does deserve some credit for the transparent, albeit irregularly enforced, policies it implements on its community of billions – internationally recognized security conferences, for example, rarely appear to have the same problems with their recorded talks and channels.
In a 2018 report on free expression within major digital platforms, YouTube was even praised by the Electronic Frontier Foundation (EFF) for its open interaction with content creators.
But, as the world’s largest video sharing site has just launched its own education initiative, the platform needs to take a more prominent role in correcting the widespread misconceptions of hacking and the community that surrounds it.
“The vast majority of people on YouTube are trying to make videos that totally abide by the community guidelines and laws,” said Ruane.
“My preferred solution would be if there was some kind of facility of verifying, so that I could say, look, this is what I do, I’m a professional.”