Full name
Birth date
Birthplace
Citizenship status
Nationality
Passport/ID number
Passport issue & expiration dates
Nationally registered gender
ID photo
Personal signature
Parents’ full names
Fingerprints
Additional country-specific details (e.g. emergency contact information for UK citizens)
This is bad. IDs shouldn't be stored on your server once you've confirmed the age/identity/whatever of the user.
The law requires the identity records to be kept with copies of each work containing explicit depictions sufficient to match the performer to the identification. If you do online real-time live streaming of depictions, how do you do that if they aren't on the same system?
Just a thought, but you don’t store them in an unprotected S3 bucket.
If you’re required to handle this kind of sensitive information — for any purpose, but I would say this is even more true for anything tied to anything of a sexual nature or anything tied to health — you should take every precaution that that information is secure, encrypted when possible, and that your systems are regularly audited for vulnerabilities.
If the onus of properly protecting sensitive data — whether you’re legally required to hold it or not — is too much or too expensive, you shouldn’t be in that type of business. Period.
I can chime in on this, as I was a systems administrator for one of those shady “Get sex tonight” dating websites.
The law of having to hold the identification of the models is pretty much the only law you need to uphold. These sites are created from shills of shells.
You have one company, from there you then have two or three other companies which create white label brands utilising one of the shell companies.
So the brand “bangshelia” would be registered with a shell company “adult dating world”. This brand would then be bought by a shill and branded as “purple dating” but registered by a company called “dating purple”.
You would then affiliate with “adult dating tonight” where they would provide the data for your brand.
To get anywhere near close to chasing the company, you need to launch a lawsuit against the active site (the company it's registered to), then the company behind that, and from there the company providing the database. Which all costs money.
As my CEO did: you pack up shop on one of those brands and rebrand yourself. Claim you're not associated with the previous company by registering another director.
There are no resources available to attempt to make anything secure. The website of the company I used to work for kept all passwords in plain text in a database that was accessible online publicly for other affiliates to borrow to create their own “brand”.
It took me a couple reads, but do you mean "shell" (as in a shell company that does nothing but exist to protect or obscure the operations of another company) instead of "shill" (a person who markets for someone while hiding that they are associated with them)?
Sorry about that, I’ve rehashed what I wrote and it should now make more sense.
It’s shill of shells. The CEO makes one company, another “investor” buys a brand of the original company and registers it in their name and then sells that on.
I watched "Laundromat" (*movie about the Panama Papers), towards the end of the movie, Antonio Banderas says: "In... In retrospect, uh... I think we... we could have spent more on cyber-security."
> Just a thought, but you don’t store them in an unprotected S3 bucket.
> If you’re required to handle this kind of sensitive information ... you should take every precaution that that information is secure, encrypted when possible, and that your systems are regularly audited for vulnerabilities.
None of that solves the problem, though.
You can protect an S3 bucket, but anyone who wants it simply checks it periodically.
Then one day someone is working with the data and changes the permissions to get access, and before they know it, it's downloaded in minutes. It just takes one attacker to get lucky and catch it before your audits do.
And S3 is not unusual; it's just an Internet service like any other, and they all have this problem. Even if you have your own data center on a private network, it's only private until someone needs to transmit something and opens a route.
It's ultimately a human problem, there's no magic that can prevent it.
> If the onus ... you shouldn’t be in that type of business. Period.
Ending an unjustified opinion with "period" highlights the fact that you didn't justify the opinion. And it sounds fatuous.
If you're changing the permissions to public just to work with the data, there's your problem. Proper access policies solve all of this - the bucket should never have had public access enabled in the first place, as there is literally zero need for it; just sloppy work.
> Just a thought, but you don’t store them in an unprotected S3 bucket.
That's pretty much a good rule of thumb for, well, everything.
(Sure, there are exceptions. But there's a reason AWS not only doesn't enable public access to buckets by default, but by default has superseding policies in place which block setting normal policies to allow public access; it's almost always the wrong choice.)
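For reference, those superseding policies are S3's Block Public Access settings, which can be applied per bucket from the CLI. This is a config fragment, not a runnable demo (it needs AWS credentials), and the bucket name is a placeholder:

```shell
# Turn on all four Block Public Access settings for one bucket
# (bucket name is a placeholder). These reject public ACLs and
# public bucket policies outright, overriding per-object settings.
aws s3api put-public-access-block \
  --bucket example-records-bucket \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Confirm the settings are in effect:
aws s3api get-public-access-block --bucket example-records-bucket
```

The same configuration can also be applied account-wide, which is the safer default when no bucket should ever be public.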
I've never had anything to do with this topic and have no expertise in legal matters, but it sounds like you are referring to the Child Protection and Obscenity Enforcement Act [1]. I didn't read the entire thing, but just searched for the parts about keeping records, mostly § 75.2 [2]. Maybe I am interpreting the letter of the law a bit naively, but I cannot find anything that would require keeping this information anywhere close to the servers used for streaming. Quite the contrary: it seems to me that you could keep the required records on paper if you wanted to - more or less a copy of the ID, the name and aliases used by the performer, and a list of works the performer appears in, for example the URL a stream is available at. But maybe I missed something important.
There are rules on how the information has to be indexed and cross referenced; while you might be able to design a system of electronic-beside-hardcopy records that you could make a plausible argument for meeting all of the requirements, it's straightforward to do it if your data and media are in the same electronic system.
Considering that getting it wrong is a federal felony, the incentive set by the law is clear even if, arguably, unintended.
> Considering that getting it wrong is a federal felony, the incentive set by the law is clear even if, arguably, unintended.
I don’t think you’re wrong, but I would argue that if getting it wrong is a felony, not properly protecting the data should be a felony as well. I realize it isn’t, but incidents like this showcase just how poorly the laws around this type of thing are written. If you can’t properly protect the data, you shouldn’t be in this type of business.
Of course, in this case, this is a site registered in Andorra. So who even knows what those laws required in the first place.
> I don’t think you’re wrong, but I would argue that if getting it wrong is a felony, not properly protecting the data should be a felony as well.
It probably should be; I suspect that it's not because making involvement in porn risky for adult participants (and thereby discouraging it), while not the central focus of the law, isn't actually undesirable to lawmakers.
> Of course, in this case, this is a site registered in Andorra. So who even knows what those laws required in the first place.
My understanding is that the US applies its rules to anything of an adult nature sold into or among the US states, regardless of origin, though in practice applying it to foreign entities with little US exposure is difficult; but it certainly wouldn't be the only law that applies, to the extent it might apply.
Eh, maybe it's irresponsible but I don't think that it needs to be a felony or that those people shouldn't be in business. People just shouldn't expect every shady website to have perfect data practices.
You shouldn’t need to ask whether your employer is shady — and if you do ask, the answer should be that it isn’t, not someone telling you afterwards that it should’ve been obvious to you before you started.
Not being a legal scholar, I won’t make a distinction between felony and misdemeanor.
What I will say is that just as computers are a force-multiplier for getting important stuff done, they are also a force-multiplier for causing harm. As the old saying goes: “to err is human, to really foul up requires a computer”.
This leak did not endanger just one or two people, which would be bad enough, but 4,000. Even if the remedy is limited to a fine sufficient for each affected person to change their name and address, it is still a more serious harm than normal intuition can grapple with, because of how many people were involved.
I wish that being in the sex industry was socially neutral for men and women and that nobody would be assaulted or insulted for it. I don’t know why that isn’t the case already, but I do recognise that it isn’t — and given that it isn’t, this leak is still extremely likely to result in someone getting hurt.
Also not being a legal scholar: the distinction is that a felony is a much more serious crime and lands people in jail for long periods of time. Which is why I think this could be an overreaction when a data leak may be the result of a mistake or ignorance. I understand that the nature of the data is sensitive, but I don't think it makes sense to ham-handedly punish people for something they aren't directly responsible for. Like someone being hypothetically hurt by someone else because of a data leak they may not even have known about.
I strongly disagree. I’d expect the shadiest of shady websites to have the best security, but as demonstrated, the people that scooped up tonnes of my financial history without my permission, then charge me for access, and sell that data to other financial institutions that use that data to determine my access to credit, not only have terrible security, but aren’t really punished for it either.
Any time required human labour is ignored, it makes the startup deceptively more attractive to VCs, and it may end up bordering on fraud if done with intent.
In principle this problem should be solvable with a zero-knowledge proof system.
The law should allow for schemes that prove a performer is of legal age, identified, etc., but that don't retain individual private details. Even if the site were completely compromised, all the attackers would gain is a single bit of data on each performer. The data would be worthless for personal identification.
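A minimal sketch of that attestation idea, well short of an actual zero-knowledge proof: a hypothetical ID-checking service verifies the documents once, then issues a signed statement containing only a pseudonymous ID and an "of legal age" bit, and the site stores the attestation rather than the documents. HMAC stands in for a real asymmetric signature here, and all names are made up.

```python
import hashlib
import hmac
import json
import uuid

# Held only by the (hypothetical) ID-checking service; in a real design
# the site would verify an asymmetric signature, never share a secret.
VERIFIER_KEY = b"verifier-secret"

def issue_attestation(is_of_age: bool) -> dict:
    """Verifier checks the real ID offline, then returns only a pseudonym + one bit."""
    claim = {"performer_id": str(uuid.uuid4()), "of_age": is_of_age}
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["sig"] = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def check_attestation(claim: dict) -> bool:
    """Site checks the signature; it never sees the underlying documents."""
    payload = json.dumps(
        {"performer_id": claim["performer_id"], "of_age": claim["of_age"]},
        sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["sig"]) and claim["of_age"]
```

If the site's database leaks under this scheme, attackers learn pseudonyms and a single bit per performer, nothing that identifies anyone.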
Yeah, the law (for very good reason) is written so that you can never hide behind "I don't have it now / it's not here / it wasn't me at the time", but in the age of live streaming and data leaks, it's basically only a matter of time until such leaks happen.
You store them wherever you like, encrypted with a public key. The important thing is the decryption key, which never has to leave the airgapped laptop or the printout in a vault. The law doesn't say the files have to be in a particular format or instantly viewable, any more than it says they need to be stored in an unprotected S3 bucket.
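As a toy illustration of that split, here is textbook RSA with hardcoded Mersenne primes: wildly insecure (no padding, tiny modulus) and for shape only, where a real system would use something like libsodium sealed boxes or `age`. The point is that the server only ever holds the public key, while the private exponent lives on the airgapped machine.

```python
# Toy textbook RSA: the public key (e, n) lives on the server, the
# private exponent d stays offline. INSECURE demo only.
p = 2**127 - 1          # Mersenne primes, hardcoded for the demo
q = 2**89 - 1
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))   # computed once, then stored offline

def encrypt_online(record: bytes) -> int:
    """Runs on the server: needs only (e, n), can never decrypt."""
    assert len(record) < n.bit_length() // 8
    return pow(int.from_bytes(record, "big"), e, n)

def decrypt_offline(ciphertext: int) -> bytes:
    """Runs on the airgapped machine holding d."""
    m = pow(ciphertext, d, n)
    return m.to_bytes((m.bit_length() + 7) // 8, "big")
```

The compromise of the online system then yields only ciphertexts; the records remain readable to an inspector who walks the ciphertexts over to the offline key.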
A custodian can hold the records, but they still have to have all of the required records at the moment you sell any depiction if you don't want to commit a federal felony, and if your depictions are created live in real-time, that's going to be quite challenging to do offline.
I don't have it here / I don't have it now / It's not me who keeps them / It wasn't me at the time / I don't know
Just like banking regulations that seem stupid, they come from real abuses and the solutions available at the time. I doubt laws signed in 1988 had Internet scale and access in mind, nor could they have predicted it.
Indeed. Especially when in cases like this you’re talking about fingerprints, passport information, gender, parents’ names, marital information — not to mention standard PII fare — all coupled with the fact that nude photos/videos can be tied to this data too.
2257 was passed well over a decade ago, with minimal public dispute. The only concerning thing is if the models don’t fully understand what the law entails. All affiliate publishers have to have that information under the law.
Also, because of face recognition there really is zero anonymity at this point. It would be very misleading to represent to someone otherwise.
The lawyers don’t even understand how the law says data is supposed to be segregated [1], so how are the performers supposed to know?
Also, there is a very big difference between citing facial recognition tech (which, while rapidly becoming more common, still requires a degree of skill and has a real cost to use) as a reason for “zero anonymity” and having public records with direct ties to nude photos and videos leaked. It’s even more misleading to represent that these have the same risk profile.
It's one data leak that will have life-changing consequences for a lot of people.
Age verification and privacy aren't mutually exclusive. The law can be amended in a way to remain effective without exposing people to such risks.
Back when the law was designed data breaches and being able to find everything online wasn't a thing so it wasn't a problem anyone thought of. Now it's time to fix the law to address the new risks.
AFAIK (IANAL) this contradicts laws such as California's adult entertainment laws. I believe the studio is required to maintain records of the cast. (Thanks to the Traci Lords incident).
And given that this is a site paying their webcammers, the other information is presumably back-up identification in case of account recovery or some such.
I don't think GDPR covers people who have an actual written and signed financial contract with the service in question. How could any business function without the information needed on who to pay?
It's funny that there are tons of requirements before you are allowed to store credit cards (PCI DSS), which are easily changed and expire anyway, yet none for storing an ID photo or other highly sensitive personal data, much of which is unchangeable (like date of birth).
As always, the powerful (banks) can rig the system in their favor.
A credit card number is managed/owned/valid within the payment networks. Since they manage it, they get to dictate what means of protection are required in order for you to engage in business on those networks.
Government IDs are usually property of the body which issues them. That means each state can (and IIRC, does) have rules about what you can and cannot use that information for. Therefore you'd have to know the rules for each state or issuing body.
That's not to say that common sense _ought_ to dictate that these things should happen anyway, but who thinks to look up various state government rules on handling a Driver's License?
Yes, but then again it's much simpler: leak credit card data because you didn't store it the way they require, and you'll never touch a card again. But leak personal user data, and nothing (really) bad will happen to you.
Firstly disclaimer that I work at Very Good Security.
ec109685 is right and this is precisely what VGS does. Most companies collect sensitive data (PII/PCI) to achieve a business objective, like verifying identity, age, creditworthiness or to authorize payment. Storing and securing this data is not their primary objective and possession of data comes with the burden of compliance and regulations. VGS acts as a proxy aliasing sensitive datasets as the data flows between systems. When it is time to use this data, VGS acts as a proxy replacing the aliases with real values as data has to be exchanged with third parties.
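The aliasing pattern described above can be sketched in a few lines. This is my own toy, not VGS's actual design: the proxy swaps sensitive values for opaque tokens on the way in and back on the way out, so downstream systems only ever store tokens.

```python
import uuid

class TokenVault:
    """Toy aliasing proxy: the vault is the only place real values live."""

    def __init__(self):
        self._store = {}

    def tokenize(self, value: str) -> str:
        # An opaque alias; it reveals nothing about the underlying value.
        token = "tok_" + uuid.uuid4().hex
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Called only at the moment a third party needs the real value.
        return self._store[token]

vault = TokenVault()
alias = vault.tokenize("P1234567")  # e.g. a passport number (made up)
# Application databases, logs, and analytics all store `alias`,
# never the passport number itself.
```

A breach of the application database then yields only tokens; the blast radius is reduced to the vault, which can be locked down far more tightly than the rest of the stack.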
When dealing with short attributes, combining several other parts of the form to create a composite alias partitioned over the different parts of the payload gets the job done. This would expand the size of the short integer. Adjacent encodings in Chinese, Korean and Japanese characters further expand possibilities.
Could you not also just throw in 1-2 kB of random data with your short data? Your app can then just disregard all text after a null byte or something similar.
I'm not a cryptographer but that sounds like a good strategy when encrypting very short strings to resist cryptanalysis.
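A sketch of that padding idea, as an assumption about how one might do it rather than a vetted scheme (modern AEAD ciphers with random nonces make most of this unnecessary): append a separator byte and random filler before encryption, then strip everything after the separator on decryption. It only works if the real payload never contains the separator byte.

```python
import os

SEP = b"\x00"  # assumes the real payload is text and never contains a NUL byte

def pad(data: bytes, total: int = 2048) -> bytes:
    """Append a NUL then random filler so every plaintext is the same length."""
    assert SEP not in data and len(data) < total
    return data + SEP + os.urandom(total - len(data) - 1)

def unpad(padded: bytes) -> bytes:
    """Discard everything after the first NUL byte."""
    return padded.split(SEP, 1)[0]
```

The padded blob is what gets encrypted, so ciphertext lengths no longer reveal whether the field held "42" or a full address.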
I had several legitimate businesses ask me for that. Thankfully they were fine with obscuring all but the last 4 digits of the card.
I don't get why they do it though. Faking a card photo is trivial so anyone trying to defraud them will just do so. These guys can manufacture physical replicas of cards to cash them out at ATMs, do the companies really think they won't be capable of altering a picture?
I'm assuming someone somewhere told them to do it and nobody took a minute to actually look into it deeper and realise how easy it is to exploit (and thus useless at preventing fraud).
To pay for enrollment in their iOS dev program (to publish iOS apps) Apple asked me to FAX my credit card data to a USA phone number so that they can charge it manually. Told me that that was the only way of enrollment from my country of residence at that time. Things changed now and they allow online payment but that was also ridiculous. They literally had a form on their website that I printed and filled in my CC data to appropriate boxes. Then had to find a fax machine to send it.
So Caribbean hotel, adult streaming site and Apple can be compared as on the same tech competency level for online payment without blinking. Okay, didn't expect that one today.
You'd be surprised, but established brands like Arcteryx ask their customers to email pictures of their passport and credit card. Email. Pictures. Yes, you read that right. Given how stupid such a request is (from data security and privacy points of view), I'm sure they treat that data properly. /s
Yes, but unless you go into a major porn studio in person and they scan everything there, it's likely to just stay in the digital system. You could say the same thing about printed copies: do you really want them printing out everyone's photos and trusting them to delete the online versions (what if you just sent them via email)? Do you trust them to properly secure a filing cabinet? You know it's easy to pick the lock on one of those, too.
Generally I agree, but I can understand why they did it. If they're accused of allowing underage models, they'll be in a spot they definitely don't want to be in.
There was a case years ago of a man who was on trial for child porn. It took the actress physically coming into the courtroom and showing her ID before the case was dropped.
It's just a really scary place to be in terms of society and the law, so I can understand making the decision to do the 'wrong' thing in this instance.
If you're sufficiently paranoid to want to hold onto this data, it should at least be kept in cold storage. And it's not expensive; I can walk over to Best Buy right now and pick up a hard drive big enough to store this data with cash I happen to have in my wallet.
> right now and pick up a hard drive big enough to store this data with cash I happen to have in my wallet.
You do know that the legal recordkeeping requirements mandate specific indexing and cross-referencing, that the age records must legally include copies of portions of every covered piece of media sufficient to match the performer to the photos in the age records, and that they must also cross-reference every individual full work?
If your media is live streamed, it doesn't seem to me likely that you could meet the requirements with an offline system.
It would work if the law is applied in such a way that a list of media retrieved by querying the media server with the performer ID is acceptable as the listing of depictions required to be part of the age verification records. It's pretty clear from the combination of statute and regulation that either hardcopy or digital records can be used, but it's not at all obvious that a hybrid of that particular form satisfies the requirements.

From an information management perspective, I can see it making sense; from what I've seen of the administration of legal rules, I can see where it might not. Especially in an industry that politicians (and lead US federal prosecutors are all a brand of politician) like to score points against, I would expect people to be extremely conservative about uncertain legal exposure; you don't want a setup where you have a decent but uncertain argument if it goes to court, you want one that would never give anyone a reason to think it was a viable pretext for legal action against you in the first place.
Would printing out screenshots of the video constitute copies of portions?
If so, there's a simple solution, albeit one that involves hiring a decent-sized staff of file clerks: whenever a model uploads a video, have a printer start spewing screenshots and have a file clerk grab a selection of photos that both identify the video and clearly show the model's face, and put them in that model's file in the cabinet.
There could be a regulatory requirement that law enforcement must have automated access to the resource at any time. Wouldn't be the first time the government built a road to hell with good intentions.
Well, no - this data could be relevant ten years from now, if they’re accused of allowing an underage model after the fact. They need this data to protect themselves, and they need to have it accessible for the rest of their lives.
This is quite incorrect. Most jurisdictions have record retention policies that explicitly state that you're allowed to dispose records after a certain period of time (usually based on record type). After that, even if the record would be relevant to a case, you cannot be held liable for having disposed of it.
There are many, but capital offenses have no statute of limitations, and there are many potential capital offenses. However, that shouldn't really be an issue here: any business could commit a capital offense, but we don't require every business to record every bit of information about everything they do in perpetuity just because they might commit one someday. Innocent until proven guilty.
I think that there is a limit on the required period of record retention but I keep failing to find it.
But I don't think it is short, and it at least extends as long as you are selling the material to which it relates, since it is expressly illegal to sell depictions without the records.
Often the law that stipulates you need the record, also stipulates how long you need it. Good data hygiene is to retain in it for as long as necessary, and no longer.
It does not. They use old footage of models indefinitely and can be accused at any time of distributing such materials even after they’ve stopped distributing it when clients are found in possession.
The actress had to fly from Spain to the United States to testify all because an “expert” said she appeared to be a minor in the material in question, iirc.
That data needs to be in a filing cabinet in case law enforcement are investigating accusations of a crime, likely with a warrant. There's absolutely no reason to be keeping such incredibly sensitive information, which presents a severe risk to the person's livelihood and life, on a public facing server because no one - absolutely no one - has any reason to be querying it from the public internet.
Note that compliance with the age verification laws is required, but does not (even now: they didn't exist at the time of the Traci Lords incident) give you “safe harbor” from laws criminalizing sex with minors, child pornography, etc. Not all of the applicable laws require knowing that the minor is a minor for a violation to occur.
Some implementations and interpretations (iirc, Great Britain for one) don't even require that the person is a minor, only the appearance and illusion of, including animation/illustration.
These include scans of documents that prove the models' ages, things like ID cards, birth certificates, and passports. Also included were performer release forms and profile information. This is particularly bad given the sensitive nature of the work and the need to maintain the personal privacy and safety of the X-rated web stars. And because the records come from virtually every inhabited part of the world, there is also the risk that LGBTQ+ performers in some areas could face persecution.
I can't remember which community this was, but I remember some forum that had a category dedicated to something similar (with consent).
People would post pictures of the view from their hotel window when travelling and the other members would compete in being the first to find the exact location.
When I say exact, I mean exact. They had to find from which window the picture was taken.
It amazed me how some people could figure it out in a few hours and sometimes even less. Only post pictures online if you are alright with being doxxed from them.
I think some commenters might be missing that in many countries like China (where the article shows a passport photo) engaging in pornography is illegal. This leak has the opportunity to ruin lives in many ways.
That's easy for you to say as someone that I'm assuming isn't trying to become a cam model. These people wouldn't have been able to get the job without sending the verifying information, and you don't know their circumstances. Saying "just don't do the thing," without nuance or context is just ignorant.
This last point of 'understanding' is what makes this issue so hard to address for any non-technical userbase. Most people don't even really think about the fact that the "Cloud" is really just someone else's computer. Deloitte did a study recently where they concluded that although privacy is users' #1 concern, most people don't change their behavior. When you combine that with people who don't understand how technology works--but need to get paid--I don't see a good solution where they have any incentive not to take risks. The companies creating these sites don't have an incentive to put up a banner saying "You might not want to sign up for this, we might leak your data." Perhaps some legal disclosure of the risks when providing different levels of PII, just like we have Surgeon General warnings?
I'm going to use an analogy to explain why I responded in the first place.
We tell kids "just don't do drugs" for lots of reasons. No one would ever suggest with a straight face that we shouldn't tell children this because some of them will end up doing the drugs anyway due to all manner of reason, including their circumstances.
What? I absolutely would suggest with a straight face that we shouldn't tell our kids that. It's more complicated than that, and lumping all drugs together seems to do more harm than good, judging by the failure of the DARE program.
Would a redacted document with just the number be enough? The government can still figure it out if there's an investigation, but at least leaks are somewhat contained; you can't do much with a passport number alone unless you have another leaked DB that maps these numbers to other details, and even then it slightly increases the effort required for someone to identify you (they can't just Ctrl+F your name in the data dump).
The problem is that anyone can claim they take security seriously (and I'm sure this site did as well), but as a user there's no way to tell whether it's actually true. There's also the risk that the data being secure now becomes less secure later on when the company decides to cut costs.
Totally agree with you. I wish folks used better technical solutions that made a breach like this impossible. It’s the ethical thing to do - this breach will directly cause people to come to physical harm.
From reading the other comments, my understanding is that you need to keep the document itself, whereas most KYC companies will verify the document (and potentially other factors such as credit history) but then discard it and only give you a pass/fail status code.
This is correct. Fintech (and gambling, which I am intimately familiar with) companies are required to keep the submitted KYC documents on file for several years from the last customer interaction/activity.
You can't even delete dud uploads. If a customer is involved in fraud or money laundering investigation, every document they have ever uploaded is evidence. So is the type, time and timing of different uploads: in fact, the uploading of a bad document is itself a valid and potentially valuable data point. Multiple uploads in tight sequence with duds in the mix? Hello...
The submitted KYC documentation is TOXIC. It is essentially an archive to impersonate customers. Hell, I consider the material so dangerous that we built a dedicated protection system to guarantee the fraud potential of our archive would be seriously limited even if the whole archive leaked[0].
I always redact the PII part. So, for example, I can't sign up at all if the verification requires proof of age, because the birthdate will be redacted.

So far, in the tiny number of places where I've needed to send a DL or something, no one has complained that it's pretty much just my name and picture. I imagine the staff checking aren't paid enough to care.
I stayed in one AirBnB where there was a book on a shelf in the living room. I opened it, and it was the name, passport number, date, and some other information I can't recall for every past guest.
It's a risk vs reward scenario. The data leaking from Airbnb is bad, but nowhere near as bad as the data leaking from a porn site. You might be willing to take the risk with the former but not the latter.
The main damage here isn't the data leaking (it's already out there thanks to countless other breaches anyway), it's the data leaking and the association with the porn site.
What makes it even worse is that this is an industry where unstable individuals frequently become obsessed with, fall in love with, or stalk the models in question due to the context in which they work. This leak is going to be life-ruining (and potentially dangerous) for many of them.
Yes. I don’t say this lightly: this is the sort of data that if used the wrong way could wind up with someone being killed. This is a situation where the real-life consequences could literally be death.
"We were able to access Pussycash’s S3 bucket because it was completely unsecured and unencrypted. Using a web browser, the team could access all files hosted on the database."
This is terrible. The sensitive nature of this information could have a significant impact on the victims. This is the type of data that, coupled with the sensitive nature of the sites' content, could pose significant safety risks — and I don't say that lightly.
Moreover, the jurisdiction of this place (Andorra) leaves a lot of open questions about what recourse (if any) there could be, from either a punitive or criminal standpoint.
It's amazing how often you find companies/individuals asking for personal information/documents over email and SMS. I've been on the hunt for a flat for the last few weeks. Being in NZ, I used the TradeMe platform, where many of the landlords/listing owners asked for passport scans, employment letters, bank statements, etc. via email before even looking at the property.
No mention of how that information will be secured and how it will be discarded.
They've likely received hundreds of messages with personal information, all stored in Gmail inboxes. What happens to them after they're sighted - I wouldn't know.
I don't know what the 'adult' industry is like but I suspect there's some sites that verify their models by similar email/SMS mediums.
A friend of mine who did a few movies in the 2000s told me she felt much more violated by the paperwork for the required record keeping than she did having sex on camera.
Absolutely zero, considering their business is already shady as fuck powered by spammers spamming the affiliate links, nasty ads and no doubt lots of dark patterns. If the law cared they would already be in trouble for something else.
Besides that, Equifax got away with a slap on the wrist even though it was a highly publicised case.
This case will have zero attention outside of HN and the likes so I'd be very surprised if it even makes it to court.
There's also the issue that bringing the case to court will attract more attention to the leak and potentially force the plaintiffs to state their real details on the record, so while it's definitely unfair to let the website operators go unpunished, maybe leaving the mess alone and hoping the dust settles is the best course of action.
Nothing. The jurisdiction this place operates in is fuzzy at best and I feel confident the owners will just abscond and go off to the next place.
Plus, the people who are the victims here are not the people who typically have the money to pursue this (and may live in countries where pursuing anything would cause more harm).
Marginalized victims, opaque legal jurisdiction/laws, and little/no incentive to go after the owners means that I feel confident absolutely nothing will happen.
Think about it: Equifax doxxed more than half the US population and got away with a slap on the wrist and was rewarded with even more government contracts.
Given the requirements to keep this data, I'd be curious what data-model would make sense to prevent leaking it. I honestly can't think of anything practical that's better than "encrypt at rest, better hygiene with database credentials". After all, the webserver needs the ability to submit this data, and logically people are going to want to be able to review their own data on the web in order to update it.
Each person's records could be given an ID (say, a UUID) and stored encrypted in a database. Strong access controls over the decryption keys, etc. Then each video gets metadata with the UUIDs of the people in it.
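A minimal sketch of that data model (names and types are illustrative assumptions, not from the source): opaque random identifiers stand in for UUIDs, the identity paperwork is stored only as ciphertext, and video metadata references performers solely by ID.

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"fmt"
)

// PerformerRecord holds one person's identity paperwork, encrypted at rest.
// The plaintext never sits alongside the ID that public metadata refers to.
type PerformerRecord struct {
	ID            string // random, meaningless identifier
	EncryptedDocs []byte // ciphertext of the identity documents
}

// Video metadata references performers only by their opaque IDs,
// so leaking the metadata alone reveals nothing about identities.
type Video struct {
	Title        string
	PerformerIDs []string
}

// newID generates a random 128-bit identifier (a stand-in for a UUID).
func newID() string {
	b := make([]byte, 16)
	if _, err := rand.Read(b); err != nil {
		panic(err)
	}
	return hex.EncodeToString(b)
}

func main() {
	rec := PerformerRecord{ID: newID(), EncryptedDocs: []byte("...ciphertext...")}
	vid := Video{Title: "clip-001", PerformerIDs: []string{rec.ID}}
	fmt.Println(len(rec.ID), vid.PerformerIDs[0] == rec.ID) // prints: 32 true
}
```

The point of the indirection is that the video catalogue and the encrypted records can live in separate systems with separate access controls.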
Also it never needs to be updated. Once you've proved you're old enough to legally appear you never need to do so again, at least as long as no time-travel shenanigans are possible.
In US law the records have recency requirements, content-sample requirements for each work the model appears in, and indexing and cross-referencing requirements, and must at the time of each depiction include all names, nicknames, and aliases the model has ever used in any context (a set which can expand over time), so, no, they will require updates after being stored.
Who are you obligated to provide the records to? When you get a records request from such a party, how long do you have to respond?
As long as there aren't too many parties who can ask for the records, and you don't have to provide them on very short order, I think the approach I'd take is to encrypt each document separately using a public key system, and not keep the plaintext. Each encrypted document would be assigned an ID. Indexes and cross references would refer to those IDs.
Since each document is separately encrypted, new and updated documents can be added to the collection without having to decrypt earlier documents.
The private key would be kept on a system that is not online. When a request for records is received, the indexes and cross references could be consulted to determine the IDs of the relevant documents, which could then be taken via flash drive to the system holding the private key, where they could be decrypted and turned over to the requesting party (presumably law enforcement).
For the system with the private key, I'd consider using cheap Linux tablets. Maybe three of them. One for the CEO, one for the CTO, and one kept by the company's lawyer. The tablets are meant to get locked into a safe and stay there except when the company is responding to a records request.
> Who are you obligated to provide the records to? When you get a records request from such a party, how long do you have to respond?
You are required to maintain the records at your place of business or with an identified custodian, with specified content, indexing, and cross-reference structure, and to identify where they are stored along with any depictions sold or distributed. The records must be made immediately available for inspection, on demand, by inspectors authorized by the US Attorney General (which, as I understand it, will generally be any US law enforcement agency that asks for such authority) during normal business hours. Those hours are either 9-5 local time or, for inspections at the producer's place of business, the producer's actual working hours, which must be provided to inspectors; if those total fewer than 20 regular working hours per week, the producer must give notice of at least 20 hours per week during which the records are available for on-demand inspection, even if those aren't otherwise working hours.
> As long as there aren't too many parties who can ask for the records, and you don't have to provide them on very short order,
I don't think either of those qualifications actually holds, especially the second.
FTA: PussyCash never replied to any of our attempts to contact them regarding the data leak, including their Data Protection Officer. ImLive finally responded to one of our emails, stating that they would take care of it and pass on the information to the PussyCash tech team.
Even if you're trolling, I suppose you're technically correct in that this is "good" for stalker-rapists, and maybe analysts of the cam-modeling industry.
I am only half trolling. In your "nothing good will come out of it" you seem to imply that "good" means "good for society" or something like that. A lot of people in the world understand "good" as "personally profitable and/or enjoyable". So yeah, you can say that nothing good will come out of using this data, but a whole lot of people out there are going to disagree with that.
As a society we have decided that there is objective good and objective bad. While in your view this may all be an amusing intellectual experiment, society at large considers stalking, and the other crimes that could arise here, to be objectively bad whether or not the perp derives some pleasure from it.
Nothing good comes out from playing videogames either, or archiving pictures of WWII guns, or ...
Collecting information (which is public by the way) is just yet another hobby, which can be used for good (training an ML algorithm), bad (annoying someone), or just be neutral (sitting in someone's archive). It does not mean that you use it to ruin someone's life.
Also, there is nothing wrong with stalking people, as long as you do not interact with them, let them know, or affect them in any way.
Alongside the risk of exposing people's private peccadilloes and the danger that presents, there's a huge risk of identity fraud, bank fraud, sim-swapping etc. with all this data.
Particularly when it comes to all the copies of Government Photo IDs (Passports, Drivers licence, etc.)
To be clear, I mean using solely the term "adult". Using the term "adult pornography" is fine.
But to answer your question: Because it is destructive and wrong, no matter the age of the viewer. Calling it "adult" not only suggests that it is fine for adults, but entices minors to it under the false impression that they will be more adult by viewing or participating in it.
> They boast 66 million registered members on their webcam chat arena, ImLive, alone.
Let's say each registered user pays $1 a month for access to this one site they run. That's $66 million/month in revenue. Enough to secure data and comply with privacy laws.
"Eschew flamebait. Don't introduce flamewar topics unless you have something genuinely new to say. Avoid unrelated controversies and generic tangents."
The problem is that the site didn't collect the data because they wanted to, they did it because the law requires it. GDPR (and presumably equivalent privacy regulations) explicitly has an exemption for data you are legally required to keep.
Regarding fines, they wouldn't undo the damage of the leak either. I don't think this kind of leak can be mitigated with any amount of money, short of giving all the people involved a new identity and forcing them to start a new life somewhere else (and even then, they can still be recognised by their physical appearance).
I don't follow the distinction you're making about GDPR. I don't think anyone is saying they shouldn't have this information, just that they should make at least a modicum of effort to secure it.
GDPR doesn't say you don't need to properly secure data even if you're legally required to collect it.
GDPR solves this problem as much as any legislation can.
And this is why I recommend using fake details & IDs when signing up to sensitive services like this. Not an ideal situation and I'm not blaming the victims here, just stating what I would do if I had no choice but to sign up for such a site. Given the life-changing consequences of a leak and the risk of harm (stalkers showing up at home, or being an LGBT performer in a location where the government doesn't approve of that) the consequences of being caught with a fake ID are tame in comparison.
Ideally there should be a way for the websites to fulfil their legal obligations regarding age verification without actually handling any ID data themselves. Maybe a government-provided oAuth style service where you are redirected there, authenticate with the government (no extra risk there, they already have the data) and then they return a signed blob to the website asserting that you are of legal age without actually disclosing any details.
That may be the overt motivation, but the record-keeping requirements include things unrelated to that purpose which make the records more dangerous (including every name, nickname, and alias the performer has ever used). Combined with the extension of the rules to “secondary producers” (redistributors), who are permitted to get copies from primary distributors, the law ensures that a treasure trove of easily abusable information about adult performers is widely dispersed.
Would an ID with everything redacted but the picture and birthdate pass? It should be sufficient to fulfil the site's legal requirements while mitigating risk of the data leaking - you can't leak what you can't have.
> Would an ID with everything redacted but the picture and birthdate pass? It should be sufficient to fulfil the site's legal requirements
No, it would not. ”...the records shall also include a legible hard copy or legible digitally scanned or other electronic copy of a hard copy of the identification document examined and, if that document does not contain a recent and recognizable picture of the performer, a legible hard copy of a picture identification card.” 28 CFR 75.2(a)(1); and there's a lot more besides, see https://www.law.cornell.edu/cfr/text/28/75.2 and 28 CFR 75 generally, as well as 18 USC § 2257.
Would a site be willing to take the risk on accepting a blatantly doctored ID? The consequences for allowing an underage performer on the site are extremely severe. Decades in jail labeled as a pedophile and spending the rest of your life on the sex offender registry.
It's no surprise at all that the sites demand an extraordinary amount of PII about the performers before they are allowed to post a single image.
Shame the punishments for leaking PII are nowhere near as severe.
I've seen a lot of cases where the potential penalties of not checking IDs or blatant financial crime are severe, and yet the jobs are outsourced to people not paid enough to care, not given the right tools to investigate inconsistencies, or encouraged by management to "look the other way" so I wouldn't be surprised if the same happens here.
Asking for a lot of PII is one thing; actually verifying that the PII is accurate is another. That gap can be exploited to regain a slight bit of privacy.
The moral, legal and technical imperative to protect this data is 100% on the company storing this data. Even if the onus of protecting sensitive personal information were passed on to the performers making a living from this site, they would still need to show their full legal name on a redacted ID (which makes finding the address trivial).
The owners of this site should be ordered to pay restitution for the damages it has caused to all the performers impacted by this leak. If there are no consequences for things like this, companies will continue to be poor custodians of sensitive data that we entrust to them. The most vulnerable people in society will, as usual, suffer the greatest harm.
I agree, but both the Equifax case and the lack of enforcement of the GDPR (still no sign of the million-dollar fines or even investigations) show that the powers that be clearly have no incentive to actually enforce this (well, at least until some high-profile politician's dirty laundry gets leaked).
(1) ascertain, by examination of an identification document containing such information, the performer’s name and date of birth, and require the performer to provide such other indicia of his or her identity as may be prescribed by regulations;
(2) ascertain any name, other than the performer’s present and correct name, ever used by the performer including maiden name, alias, nickname, stage, or professional name; and
(3) record in the records required by subsection (a) the information required by paragraphs (1) and (2) of this subsection and such other identifying information as may be prescribed by regulation.
In a situation like this the consequences of being caught for fake IDs are less damaging than this data breach, especially if you're an LGBT performer in certain locations.
I don't, but if I had no other choice and needed to sign up to such a site I'd consider making one (or trying a real one with sensitive info redacted and see if that passes), along with other anonymity precautions.
The law is IMO the least of your concerns here (you are not stealing or causing harm to anyone, so very little incentive for someone to look into it), the fallout when your real ID leaks like what happened here would be a much bigger concern especially for LGBT performers in certain regions.
Regarding credit cards, using a prepaid one or a service such as Privacy.com is enough, so no fakery is needed there.
Credit cards: prepaid can be detected and blocked, same as the privacy.com ones - especially when the credit card is being used to validate something. Look at any major fraud prevention software, these things are trivial.
In the real world, if you want to make money, you need to show and prove ID with matching banking details. Any inconsistencies and you don't get paid. This isn't something you can outsmart. People smarter than you and I have been thinking very long and hard about these points, much more so than the two minutes you took to think up your post. The idea is like those videos of 'primitive underground dwellings with a swimming hole on top'. Cute, creative, but terribly impractical and useless in any real world situation.
Yes that is correct. I am thankful I have other means of income meaning I don't need to model for a cam site.
> Creating a fake ID = super illegal.
Agreed. But if I'm at the desperate stage where I have no choice but to sign up to a cam site, I would prefer taking that risk than having such PII leak many years in the future and affect my career prospects (the article mentions some of the data being up to 20 years old - most of these people now have no doubt left the scene but their new life can now be screwed up by this data leaking). Neither is a good solution, but IMO the risks of the latter outweigh those of the former.
Regarding prepaid cards, yes I know they can be detected and blocked, but is there any incentive to do so? It makes sense for a performer to want to protect their privacy, so I don't see why the site would block these cards?
It's actually not always illegal to create or possess a fake ID; it depends on the state and what you do with it. In some states it's always illegal. In California, though, as an example, here's the law:
> 470b. Every person who displays or causes or permits to be displayed or has in his or her possession any driver’s license or identification card of the type enumerated in Section 470a with the intent that the driver’s license or identification card be used to facilitate the commission of any forgery, is punishable by imprisonment in a county jail for not more than one year, or by imprisonment pursuant to subdivision (h) of Section 1170.
You have to have the intent to commit a forgery. This is defined elsewhere but means using the ID to commit fraud.
So you have a novelty ID that says your name is Mickey Mouse and you are 100 years old. You show it to your friends. Or maybe you get one as a gag gift for a friend. Not illegal in California. Using a fake ID to misrepresent your age for legal purposes such as buying alcohol, tobacco, firearms, voting, or acting in porn? Very illegal.