Virus Bulletin Conference Papers (13-14) (update)

[Now with additional content re the 2013 paper]

David Harley, Martijn Grooten, Steve Burn and Craig Johnston: My PC has 32,539 Errors: how Telephone Support Scams really Work; Virus Bulletin Conference Proceedings, 2012. Copyright is held by Virus Bulletin Ltd, but is made available on this site for personal use free of charge by permission of Virus Bulletin.

Fake security products, pushed by variations on black hat SEO and social media spam, constitute a highly adaptive, longstanding and well-documented area of cybercriminal activity. By comparison, low-tech Windows support scams receive far less attention from the security industry, probably because they’re seen as primarily social engineering and not really susceptible to a technical ‘anti-scammer’ solution. Yet they’ve been a consistent source of fraudulent income for some time, and have quietly increased in sophistication. In this paper, we consider:
1. The evolution of the FUD and Blunder approach to cold-calling support scams, from ‘Microsoft told us you have a virus’ to more technically sophisticated hooks such as deliberate misinterpretation of output from system utilities such as Event Viewer and ASSOC.
2. The developing PR-oriented infrastructure behind the phone calls: the deceptive company websites, the flaky Facebook pages, the scraped informational content and fake testimonials.
3. Meetings with remarkable scammers: scammer and scam victim demographics, and scammer techniques, tools and psychology, as gleaned from conversational exchanges and a step-through remote cleaning and optimization session.
4. The points of contact between the support scam industry, other telephone scams, and mainstream malware and security fakery.
5. A peek into the crystal ball: where the scammers might go next, some legal implications, and some thoughts on making their lives more difficult.

And a teaser for number 14:

Virus Bulletin 2013

With Lysa Myers: “Mac Hacking: the way to better testing?” (To be made available after the Virus Bulletin conference in October 2013.)


Anti-malware testing on the Windows platform remains highly controversial, even after almost two decades of regular and frequent testing using millions of malware samples. Macs have fewer threats and there are fewer prior tests on which to base a testing methodology, so establishing sound mainstream testing is even trickier. But as both Macs and Mac malware increase in prevalence, the importance of testing the software intended to supplement the internal security of OS X increases too.

What features and scenarios make Mac testing so much trickier? We look at the ways in which Apple’s intensive work on enhancing OS X security internally with internal detection of known malware has actually driven testers back towards the style of static testing from which Windows testing has moved on. And in what ways might testing a Mac be easier? What can a tester do to make testing more similar to real-world scenarios, and are there things that should reasonably be done that would make a test less realistic yet more fair and accurate? This paper looks to examine the testing scenarios that are unique to Macs and OS X, and offers some possibilities for ways to create a test that is both relevant and fair.

And there is a little more information in an interview with Lysa and me here: VB2013 speaker spotlight.

Small Blue-Green World
ESET Senior Research Fellow

About David Harley

Musician/singer/songwriter; independent author/editor
This entry was posted in ChainMailCheck, conference papers, David Harley, ESET, Mac Virus, VB Conference Papers, Virus Bulletin.
