Conference item
The anatomy of online deception: what makes automated text convincing?
- Abstract:
- Technology is rapidly evolving, and with it come increasingly sophisticated bots (i.e. software robots) that automatically produce content to inform, influence, and deceive genuine users. This is a particular problem for social media networks, where content tends to be extremely short, informally written, and full of inconsistencies. Motivated by the rise of bots on these networks, we investigate the ease with which a bot can deceive a human. In particular, we focus on deceiving a human into believing that an automatically generated sample of text was written by a human, and we analyse which factors affect how convincing the text is. To accomplish this, we train a set of models to write text about several distinct topics, simulating a bot's behaviour; the generated text is then evaluated by a panel of judges. We find that: (1) typical Internet users are twice as likely as security researchers to be deceived by automated content; (2) text that disagrees with the crowd's opinion is more believably human; (3) light-hearted topics such as Entertainment are significantly easier to deceive with than factual topics such as Science; and (4) automated text on Adult content is the most deceptive regardless of a user's background.
- Publication status:
- Published
- Peer review status:
- Peer reviewed
- Files:
- Accepted manuscript (PDF, 142.9 KB)
- Publisher copy:
- 10.1145/2851613.2851813
- Publisher:
- Association for Computing Machinery
- Host title:
- SAC '16 Proceedings of the 31st Annual ACM Symposium on Applied Computing
- Pages:
- 1115-1120
- Publication date:
- 2016-01-01
- Acceptance date:
- 2015-11-30
- DOI:
- 10.1145/2851613.2851813
- ISBN:
- 9781450337397
- Keywords:
- Pubs id:
- pubs:614783
- UUID:
- uuid:a22ff288-8f27-457a-860b-c06337ee3d73
- Local pid:
- pubs:614783
- Source identifiers:
- 614783
- Deposit date:
- 2016-04-10
Terms of use
- Copyright holder:
- ACM
- Copyright date:
- 2016
- Notes:
- Copyright © 2016 ACM. This is the accepted manuscript version of the conference paper. The final version is available online from ACM at: https://doi.org/10.1145/2851613.2851813