Investigators, commands, and attorneys often throw these terms around without defining them. Because these cases cross borders and require coordination among international law enforcement agencies, a set of guidelines was created to standardize terminology so everyone speaks the same language. These are known as the Luxembourg Guidelines, named for the city where the international working group that developed the standardized CSAM terminology met and adopted it, making Luxembourg the point of origin for the terms now used worldwide.
This guide pulls out the terms that are relevant to military investigations and courts-martial, and breaks them down in plain English.
Why “Pornography” is not the Preferred Term
Pornography is a term generally reserved for material depicting adults engaged in consensual sexual acts, distributed to the general public for sexual entertainment. The concern when children are involved is that “pornography” (both the term and the activity) has become increasingly normalized, which can diminish the seriousness of, or even appear to legitimize, what is actually sexual abuse and sexual exploitation of children. The term “child pornography” might allow the inference that the acts were carried out with the consent of the child and represent legitimate sexual material. The term “Child Sexual Abuse Material” removes that ambiguity.
Why Terminology is Important
In cases involving minors and sexual content, the wrong word can cause:
- Overcharging or undercharging
- Confusion in interviews, legal documents, and trials
- Difficulty distinguishing criminal conduct from non-criminal behavior
The terms below reflect language recommended by the Luxembourg Guidelines.
Core Terms You Will Hear in a CSAM Investigation
Child Sexual Abuse Material (CSAM)
Definition: Any picture, video, or digital file that shows a real child engaged in sexual activity, or any depiction of the sexual parts of a child for sexual purposes.
This is the preferred term worldwide. “Child pornography” is outdated and misleading because it suggests consent, participation, or commercial activity.
Key point: It requires a real child. If no real minor is involved, it is not CSAM but might fall into another category of contraband.
Child Sexual Exploitation Material (CSEM)
Similar to CSAM but broader: includes any material created through the exploitation of a child, including non-explicit images used for grooming, coercion, or blackmail.
Example: A minor coerced into sending lingerie photos.
Not explicit, but still exploitation.
Child Sexual Abuse (CSA)
A general term for the underlying act, not the images.
CSA may involve:
- physical sexual contact,
- non-contact abuse (exposure, masturbation in front of a child),
- coercive interactions,
- exploitation for images.
Child Sexual Exploitation (CSE)
Abuse where the child receives something of value or is manipulated for the perpetrator’s benefit.
Examples:
- exchanging images for money or gifts
- grooming a child to create sexual content
- using threats to obtain sexual images (“sextortion”)
CSE overlaps with CSA but involves a component of advantage, gain, or manipulation.
Terms Related to the Creation of Sexual Content
Youth-Produced Sexual Content
A subset of self-generated content, typically shared between minors, often in consensual adolescent relationships.
The Luxembourg Guidelines emphasize not using criminalizing language here unless:
- an adult is involved,
- coercion is used, or
- the minor is incapable of consent (very young children).
This distinction prevents treating ill-advised teenage behavior as organized abuse.
In military investigations, youth-produced sexual content can become evidence when it is shared recklessly and intercepted by adults, obtained through sextortion schemes, or sent by a minor during an exchange with a service member. In those situations it can trigger UCMJ exposure even though the material was originally created by the child.
Indecent Images of Children
A UK/Commonwealth term often encountered in international or allied-force investigations. It is a broad category that can overlap with CSAM or CSEM.
The term is used inconsistently and is not recommended where CSAM or CSEM can be used instead, but it comes up often in these cases when people reach for delicate language.
Terms Related to Distribution or Exchange
Sextortion
Coercion using threats to obtain sexual images, money, or further compliance.
- Can involve minors or adults.
- When the victim is a minor, the coerced product becomes CSAM/CSEM.
- When the victim is an adult, the conduct can violate Articles 127, 120c, or 134.
Grooming
A deliberate pattern of behavior to gain access to or manipulate a minor for sexual purposes.
Forms include:
- emotional bonding
- gift-giving
- desensitization to sexual topics
- requests for sexual images
- isolating the child from peers/adults
Live-Streamed Child Sexual Abuse
Real-time abuse viewed online, sometimes with remote participants who direct or pay for the conduct.
Not “virtual”; requires a real child.
Terms Related to Non-Real Imagery
Computer-Generated CSAM / AI-Generated Child Sexual Abuse Images
No actual child victim is involved.
The Luxembourg Guidelines make a clear distinction:
- “AI-Generated CSAM” is a misnomer, because no child is abused in making it.
- Preferred term: Computer-generated child sexualized material.
The Luxembourg Guidelines discourage the phrase “AI-Generated CSAM” because no actual child appears in or is harmed by the material, but we use it on this site because it is the term investigators, service members, and the public most often search for, and the term most commonly used by the media.
Some conduct of this nature might violate Article 134 even if the material is not CSAM.
Fantasy Material / Fictional Depictions
Includes written stories, drawn images, or other fictional representations with no real child.
This is not CSAM and not CSA, but it is potentially relevant to prove intent, consciousness of guilt, or absence of remorse.
Terms Related to Possession, Access, and Sharing
Possession
Having CSAM on a device or storage medium, regardless of intent.
The Guidelines warn against assuming intent from mere presence of files.
Distribution
Sharing, sending, trading, posting, uploading, or streaming CSAM.
Accessing / Viewing
Intentional viewing can be charged even without proof of download.
The Luxembourg Guidelines emphasize that “viewing” should not be conflated with “possession” unless files are saved or cached.
Attempted Access
Trying to obtain CSAM, even if unsuccessful.
Production (Including Solicitation of Images)
Production is not limited to cameras, lighting, or a hands-on environment. An adult who asks a minor to “send pics,” even once, has entered the realm of production. The law treats the request itself as an attempt to create CSAM. If the minor sends an image in response to a request, the adult is now liable for the production of that image.
Terms Related to Coercion, Abuse, and Context
Revictimization
Repeated harm caused when CSAM is redistributed, possessed, or viewed.
Important concept: viewing and possession are unlawful because they perpetuate harm and provide a market/audience.
Disclosure
A child describing abuse or exploitation.
The Guidelines highlight that disclosure is often delayed, inconsistent, and nonlinear. While investigators should not automatically treat these irregular disclosures as evidence of fabrication, the irregularities might be relevant to defenses.
Consent
A minor cannot consent to creating sexual content for an adult. Terminology should not imply shared responsibility.
Terms That Should Not Be Used
Luxembourg Guidelines discourage:
- “Child pornography”
- “Kiddie porn”
- “Child prostitute”
- “Underage girls/boys” (might suggest ability to consent)
- “Pornographic images” when the subject is a minor
- “Virtual child pornography” (vague and legally inconsistent)
The language used in interviews, warrants, forensic reports, and charges drives how CSAM cases are built, interpreted, and prosecuted. Misunderstanding even a single word can change the direction of an investigation or the outcome of a case. We have spent more than twenty years defending service members in the most complex digital-evidence cases across every branch of the military, working with forensic analysts, military and civilian investigators, and expert witnesses in federal and military courts. If you are under investigation or have been questioned about any form of suspected CSAM, call 800-319-3134 for a confidential case review. We can help you understand what the terminology really means in practice and how it applies to your situation.