Last October, users of the Qiui Cellmate—an erection-preventing cage that can be locked and unlocked via an app—received an unusual message: “Your cock is mine now.” The hacker who had gained control of the clunky metal devices announced that they would be holding penises hostage until a Bitcoin ransom was paid.
A combination of known flaws in Qiui’s software had led to this bizarre juncture: the system both failed to fully anonymize users, allowing hostile third parties to contact some of them directly, and exposed the device’s locking mechanism to outside control. When British security researchers discovered the vulnerabilities, months earlier, they notified the company, but little action was taken. So they blogged about their findings for potential users—and hackers—to see.
Thankfully, no one appears to have actually been wearing their Cellmate during the attack, but the breach isn’t an outlier. As our use of connected devices has expanded, so, too, have the opportunities for security violations. A few years ago, a Texas couple woke up to a strange voice coming from their Wi-Fi-enabled baby monitor and threatening to kidnap their child. Meanwhile, hospitals, government agencies, and even nuclear power plants have all become regular targets for ransomware. An ineffectual cock-cage infiltration may not seem as urgent a matter as a hijacked health care sector, but the growing number of sex-toy breaches demonstrates that fully embracing the Internet of Things can put our most private moments under threat. As Swedish media scholar Jenny Sundén writes in the journal Sexualities, “When sexual play becomes data, safe sex becomes a matter of keeping one’s data safe.”
During the pandemic, when the proverbial six inches expanded to six feet of social distance and many romances moved online, sales of smart sex toys surged. Canadian-German company Wow Tech Group, for example, reported a 200 percent year-over-year increase in online sales in April 2020. These internet-enabled devices, called “teledildonics,” can be controlled remotely whether their users are in the same bedroom, out in public, or separated by an ocean. Today, vibrators pair with apps, and video games can connect to smart sex devices, allowing, for example, a user’s butt plug to vibrate at charged moments of first-person-shooter play.
Teledildonics open new avenues for sexual expression and intimacy, but they also risk a variety of breaches, from the nonconsensual gathering, release, or use of personal data to the easy discoverability of Wi-Fi or Bluetooth signals by other networked devices nearby. This is partly because niche producers of smart sex toys often have less capacity to engineer robust user protections than massive tech companies like Apple, which increasingly uses security as a selling point for its products. In 2017, Canadian sex-toy brand We-Vibe paid a $4 million settlement after its eavesdropping vibrator collected intimate data on users’ body temperatures and preferred vibration intensities without their consent. Given the lack of consumer research regarding sexual experience, teledildonics companies covet this kind of information to improve and better market their products. By tracking their users, companies can learn all sorts of valuable details about their customers without ever having to persuade them to fill out any tedious forms—and they could sell the data to third-party advertisers, as Facebook and Google do.
Devices that record data also risk exposing private adventures, as when users discovered that Lovense Remote, the app controlling Lovense’s smart vibrators, was capturing surreptitious audio of their sessions. The recordings resulted from a bug and were likely never transferred off users’ devices, but once data has been recorded by a networked toy, there is always a chance that it could escape.
“There are two reasons why you manufacture an Internet of Things device,” says Kirsten Thompson, a partner at Dentons Canada and the national lead of the law firm’s transformative technologies and data strategy group. “One is because internet connectivity is critical to the function of the device. Or, two, you want to get valuable information, personal or otherwise, via this device. Right now in the market, it is likely that 75 percent or higher of those devices are in the latter category.”
“My dishwasher is internet enabled,” she adds, “and I can only ask myself why.”
While the data generated by teledildonic devices is always at risk of exploitation by third parties, there is still an onus on companies to ensure that their intended data uses are clearly laid out for users to agree to (or not). Some legal experts say that user consent to digital experiences should align with concepts of sexual consent. “The idea that consent in a sexual context should be active and ongoing, it should be understood, reasonably informed—those concepts inform all types of consent,” Thompson explains. The law will not recognize consent to a data use outlined in a deceptive or unreasonable privacy policy. As a result, Thompson says, more companies are designing products that inform users of their data uses through frequent (but ideally nonintrusive) pop-ups and dialogue boxes with clear, minimal text. The idea is to ensure that users are really consenting rather than just ignoring pages of terms and conditions and clicking Agree, thereby allowing their data to be gathered, transmitted, sold, and used for purposes well beyond the immediate function of the service or product.
Jack Lamon, a worker-owner of Toronto feminist sex shop Come As You Are, says there’s a larger issue at hand than just companies gathering anonymized user information. For him, more troubling consent questions arise with teledildonics that connect to massive multiplayer online games. Users have set up smart sex toys to respond to the actions of other players: every time the user gets hit with a sword, for example, the toy vibrates, “engaging people in nonsexual play that has a sexual outcome for the user without anyone else’s knowledge,” Lamon explains. “That feels pretty messed up to me.”
Our courts may not be fully equipped to manage these new challenges. Part of the issue is that the common law legal system Canada inherited from England evolved hundreds of years ago, in a very different world, at a much slower pace. “It was not a society that underwent a wholesale data revolution in the span of ten years,” says Thompson. “So, while the common law system of precedent is very valuable for incremental change, it may not be fast enough to keep up with the changes we’re seeing.” As for the privacy cases that do make it to court—which are often related to digital harm inflicted upon individuals, such as instances of cyberbullying or revenge porn—Thompson says the judgments aren’t always nuanced enough to provide the guidance companies need and the protections consumers desire.
The risks associated with teledildonics have led some digital thinkers to analogize the hazards of intimate data breaches to the history of public sex, specifically within queer communities.
In previous centuries, the idea of the single-family household as the only acceptable venue for sexuality developed alongside the establishment of state-sanctioned heterosexual marriage. Kicked out of the straight home, practices that didn’t conform found room outside, in public places like bathhouses and washrooms. While these spaces became sources of pride and connection for queer communities, they were also regularly targeted by the police, resulting in frequent arrests of LGBTQ2+ people. Things began to shift in 1967, after the US Supreme Court established the “reasonable expectation of privacy” in Katz v. United States. The ruling found that the FBI, which had wiretapped a telephone booth to gather evidence regarding a suspected illegal gambling operation, had no right to do so without a warrant. If a person is entitled to privacy in a phone booth, advocates later argued, surely they had the same right in other public spaces too. The case clarified decades of confusion regarding the legality of electronic surveillance and continues to underpin modern digital privacy protections.
Contemporary questions surrounding privacy are complex, with concerns extending from physical spaces into digital ones. In many ways, the online world today resembles the public spaces that were at the heart of privacy debates in the ’60s, with some now asking how much seclusion we should expect when using networked devices. In a Daily Dot article, writer Ana Valens points out that, with smart devices tracking their habits all the time, users “are all having public sex now.”
Though networked devices may inadvertently put their users’ private lives on display, Jenny Sundén insists that this “sex in public, as it were, can be made safe.” She proposes detaching the idea of intimacy from privacy altogether—a sentiment introduced by Simon Fraser University new media theorist Wendy Hui Kyong Chun. In her 2016 book, Updating to Remain the Same: Habitual New Media, Chun famously called our leaky devices “chatty and promiscuous” and asked what would happen if, “rather than pushing for a privacy that is no privacy—a security that fosters insecurity . . . we fought for the right to be exposed—to take risks and to be in public—and not be attacked?” Perhaps, with everyone’s data always on the brink of going public, a new front in the long fight of LGBTQ2+ people for the right to safe private sex outside the home is the demand for freedom from harm if intimacies are exposed.
This philosophical turn may reflect the experiences of teledildonic users who don’t feel threatened by the possibility that their data may leak. “I think that this is exactly the same as people using Facebook,” says Lamon. “A lot of people just don’t care.”
Despite the risks, many continue to use networked sex toys because they feel, correctly or not, that these devices—like social media platforms—give them more than they take. So long as users are able to give fully informed and active consent to a company’s data uses, and so long as companies make reasonable efforts to protect their users, perhaps the next step is not to expect that data will never leak but to demand that no one be harmed if it is.