Privacy Commissioner issues new guidance to Social Media Platforms regarding age limits

October 16, 2025 |

As 10 December approaches, the regulators are releasing guidance. Last month the eSafety Commissioner issued its guidance. Last Friday the Privacy Commissioner issued a statement and guidance. As the Guidance makes clear, more is expected of entities in handling and, importantly, destroying data. Part 4A of the Online Safety Act 2021 sets out quite detailed obligations for social media platforms. For social media entities this will require a very thorough audit of data collection and use practices.

The Statement provides:

The Office of the Australian Information Commissioner (OAIC) has published regulatory guidance for age-restricted social media platforms and age assurance providers on compliance with the privacy provisions for the Social Media Minimum Age (SMMA) scheme, due to take effect on 10 December.

Privacy Commissioner Carly Kind said that the guidance reflects the stringent legal obligations on entities to ensure that age assurance is applied proportionately and through privacy-respecting approaches.

“Today we’re putting age-restricted social media platforms on notice,” Ms Kind said. “The OAIC is here to guard and uplift the privacy protections of all Australians by ensuring that the age assurance methods used by age-restricted social media platforms and age assurance providers are lawful.”

The OAIC co-regulates SMMA alongside eSafety. Last month, eSafety published its regulatory guidance detailing what ‘reasonable steps’ age-restricted social media platforms must take to prevent age-restricted users from having accounts, including guiding principles for the implementation of age assurance to meet SMMA obligations.

The OAIC’s guidance published today provides information for age-restricted social media platforms and third-party age assurance providers on handling personal information for age assurance purposes in the SMMA context.

“The OAIC is committed to ensuring the successful rollout of the SMMA regime by robustly applying and regulating the privacy rules contained in the legislation, in order to reassure the Australian community that their privacy is protected,” said Privacy Commissioner Carly Kind.

“eSafety has provided the rules of the game with their ‘reasonable steps.’ Now the OAIC is setting out what is out-of-bounds when it comes to the handling of personal information for age assurance in the social media minimum age context.

“Together, eSafety and the OAIC’s regulatory guidance outlines the field of play for age-restricted social media platforms and third-party age assurance providers.

“SMMA is not a blank cheque to use personal or sensitive information in all circumstances; we’ll be actively monitoring platforms to ensure they stay within the bounds by deploying age assurance proportionately and lawfully.”

Key considerations detailed in the guidance call on entities to:

    • note the additional privacy obligations in the SMMA scheme operate alongside the Privacy Act 1988 and the Australian Privacy Principles.
    • choose age-assurance methods that are necessary and proportionate, and assess the privacy impacts associated with each method.
    • minimise the inclusion of personal and sensitive information in age assurance processes.
    • note pre-existing personal information later used for SMMA purposes does not need to be destroyed where the original purposes are ongoing.
    • destroy personal information collected for SMMA purposes once purposes are met.
    • make sure that any further use of personal information collected for SMMA purposes is strictly optional, has the user’s unambiguous consent and can be easily withdrawn.
    • be transparent about the handling of personal information for SMMA purposes in privacy notices and at the moments it matters.

Together, these privacy safeguards impose stringent legal obligations on age-restricted social media platforms and age assurance providers. Failure to meet these obligations may constitute ‘an interference with the privacy of an individual’ and may trigger enforcement action.

Further OAIC resources will be released soon to help Australians understand what personal information may be handled through age assurance methods, as well as educational resources for children and families to help them navigate the changes and support conversations about children’s privacy online.

For more information and to view the guidance, visit: www.oaic.gov.au/privacy/privacy-legislation/related-legislation/social-media-minimum-age

Background

The OAIC co-regulates the Social Media Minimum Age Scheme with eSafety. Specifically, the OAIC oversees the compliance and enforcement of the privacy provisions set out in Section 63F of Part 4A of the Online Safety Act 2021, which operate in tandem with the Privacy Act 1988.

Key aspects of the guidance are:

  1. Purpose Limitation – section 63F(1). Entities that hold personal information collected for, or for purposes including, SMMA purposes must not use or disclose that information for any other purpose. There are limited exceptions under APP 6.2(b)–(e) which permit use or disclosure, or where the individual gives voluntary, informed, current, specific and unambiguous consent under section 63F(2). This standard goes beyond the general APP 6 framework: the inclusion of “unambiguous” as an element of consent precludes the use of pre-selected settings or opt-outs when seeking consent. Reuse of information is prohibited unless clearly authorised by consent or within the exceptional circumstances set out in APP 6.2(b)–(e).
  2. Information Destruction – section 63F(3). Once personal information collected for SMMA purposes has been used or disclosed for those purposes, it must be destroyed. De-identification is not permitted. The destruction must happen as soon as all SMMA purposes are met. This obligation is stricter than APP 11.2, which permits de-identification or retention for ancillary business needs. Pre-existing data used to support age assurance remains governed by APP 11.2.
  3. Enforcement. The Privacy Commissioner has the power to investigate and take action for breaches, as a breach of section 63F constitutes an “interference with the privacy of an individual” under the Privacy Act. Those actions include investigating, making determinations, and requiring remediation or compensation. Individuals may also lodge complaints directly with the Privacy Commissioner.
  4. Part 4A does not replace the APPs.  It is an overlay of stricter duties in addition to the existing APPs.  The APPs still apply in their entirety.

Under the Guidance, platforms cannot retain information “just in case” it is useful later. The OAIC can investigate and enforce directly, even against entities not previously regulated, such as small technology providers or overseas processors.

The OAIC expects age assurance solutions to be privacy by design, backed by an early-stage Privacy Impact Assessment (PIA) that examines proportionality, necessity and data minimisation. That may be a new concept for some entities. In establishing processes and procedures, the least privacy-invasive method should be used, and it should be tested through a PIA before deployment.

The OAIC recommends establishing a “ring-fenced SMMA environment” — a segregated technical and data structure where age assurance information is processed, stored and destroyed separately from other systems. Only minimal artefacts, such as a binary “16+ yes/no” result, method and timestamp, should persist. Inputs like ID scans or selfies must be deleted immediately after use.
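To make the “minimal artefacts” idea concrete, here is an illustrative sketch only: the class and function names are hypothetical (the Guidance prescribes no data structures), but it shows a flow in which the raw input is processed transiently and only the binary result, method and timestamp persist in the ring-fenced environment.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable

@dataclass(frozen=True)
class SmmaArtefact:
    """Minimal decision artefact persisted in the ring-fenced SMMA environment."""
    over_16: bool          # the binary "16+ yes/no" result
    method: str            # e.g. "facial_estimation", "id_document"
    checked_at: datetime   # timestamp of the age check

def run_age_check(selfie: bytes,
                  estimator: Callable[[bytes], bool]) -> SmmaArtefact:
    """Hypothetical flow: the input (selfie) is handed to an age estimator,
    the local reference is dropped immediately, and nothing but the minimal
    artefact is returned for storage."""
    result = estimator(selfie)  # transient processing of the input
    del selfie                  # drop the local reference; the input is not persisted
    return SmmaArtefact(over_16=result, method="facial_estimation",
                        checked_at=datetime.now(timezone.utc))
```

In a real deployment the estimator would be an on-device model or a third-party provider call; the point of the sketch is that ID scans and selfies never reach the artefact store.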

The OAIC supports inference-based and AI-driven approaches but with clear restrictions: they must be transparent, demonstrably accurate, and not rely on continuous behavioural tracking or unnecessary sensitive data such as biometric or content analysis.

The process must be transparent. That includes:

  • just-in-time notifications at the point of data collection,
  • explanations of what information is being collected, by whom, for how long, and why, and
  • privacy policies which clearly describe SMMA-specific processing and destruction practices.

Legal, product and design teams need to collaborate. Poorly designed consent or information screens — even if legally accurate — can amount to non-compliance.

Part 4A sets a higher bar for consent to secondary uses of information collected for SMMA purposes than the standard APP test. It must be:

  • voluntary,
  • informed,
  • current,
  • specific and unambiguous, and
  • able to be withdrawn.

The OAIC Guidance says that there should be:

  • no bundled or pre-ticked consents,
  • no reliance on general terms of use,
  • simple withdrawal mechanisms in dedicated privacy settings or contextually appropriate screens, and
  • consent which is purpose-specific and time-limited.
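These consent elements can be sketched as a record structure. This is an illustration only, with hypothetical names: the key design point is that a record exists only on an explicit affirmative action (so no pre-ticked default is possible), is tied to one narrow purpose, is time-limited, and can be withdrawn.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class SecondaryUseConsent:
    """One record per narrowly defined secondary purpose (hypothetical sketch)."""
    purpose: str                    # a single, specific purpose
    granted_at: datetime            # consent must be current, not historic
    expires_at: datetime            # time-limited by design
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Simple withdrawal: one call, effective immediately."""
        self.withdrawn_at = datetime.now(timezone.utc)

    def is_valid(self, now: datetime) -> bool:
        # Absence of a record means "no consent": records are only created
        # by an explicit affirmative user action, never by default.
        return self.withdrawn_at is None and now < self.expires_at
```

A withdrawal then simply invalidates the record, after which any secondary use must stop.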

Section 63F’s destruction requirement is specific and mandatory. Photos, document scans and biometric data must be destroyed as soon as the age check is complete, unless unambiguous consent to other purposes is obtained or another limited exception applies. Entities should retain only minimal decision artefacts for troubleshooting, fraud and circumvention prevention, responding to complaints and reviews, and evidencing compliance. Retention for these purposes must be transparent and should be subject to time-based limits in accordance with industry standards.

Where the same collection serves multiple purposes, each purpose must be narrowly defined with separate retention rules. Information should be destroyed once the last retention period has expired.
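As an illustration only (the Guidance prescribes no retention periods, and the purpose names and windows below are hypothetical placeholders an entity would need to justify), the “destroy once the last retention period has expired” rule reduces to taking the maximum of the per-purpose windows:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-purpose retention windows; actual figures must be
# justified against the entity's needs and industry standards.
RETENTION = {
    "complaint_handling": timedelta(days=90),
    "fraud_prevention": timedelta(days=30),
}

def destruction_due(collected_at: datetime, purposes: list[str]) -> datetime:
    """Information collected for several purposes is destroyed once the
    LAST applicable retention period has expired."""
    return collected_at + max(RETENTION[p] for p in purposes)
```

A scheduled job could then compare each record’s `destruction_due` date against the current time and delete anything past due, including caches and backups.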

Not surprisingly third-party age assurance providers are subject to the same privacy rules as platforms. 

Small businesses are also covered. Section 63F applies to any entity that holds information collected for SMMA scheme purposes, and any provider that discloses or commercialises personal information for a benefit, service or advantage will not benefit from the small business exemption under section 6D(4)(c) of the Privacy Act.

This alignment between OAIC and eSafety guidance is explicit in the OAIC Guidance: steps to comply with the SMMA scheme will not be considered by the OAIC to be “reasonable” unless they are also privacy-compliant. In practice, that means privacy and safety teams will need to work in lock-step — one without the other is unlikely to pass regulatory muster.

As a first step, affected entities should:

  • map data flows and third-party integrations;
  • conduct PIAs for all SMMA processing;
  • design destruction-on-decision workflows; and
  • align privacy notices, consent flows and audit processes with both OAIC and eSafety expectations.

The Guidance relevantly provides:

1.   Key considerations

    • Part 4A of the Online Safety Act 2021 operates alongside the Privacy Act 1988 and Australian Privacy Principles. Part 4A introduces additional, more stringent obligations on age-restricted social media platform providers and third-party age assurance providers when handling personal information for social media minimum age (SMMA) compliance purposes.
    • When choosing or offering an age assurance method (or combination of methods) ensure it is necessary for SMMA compliance purposes and proportionate to the legitimate aim of preventing age-restricted users from having accounts. Consider alternate methods and how you can use low-intrusion techniques within an age assurance method(s). Escalate to more intrusive personal information handling only as necessary.
    • Take a privacy by design approach and consider the privacy impacts associated with each age assurance method (e.g. inference, estimation and verification) and whether the circumstances surrounding the specific chosen method(s) justify the privacy risks.
    • Undertake a privacy impact assessment (PIA) when choosing an age assurance method(s) to identify potential privacy impacts at the outset and implement recommendations to manage, minimise or eliminate them. This will assist to ensure that a privacy by design approach is embedded from the start.
    • Minimise the inclusion of personal and sensitive information in age assurance processes. Only retain enough personal information in outputs to meet defined purposes, such as to explain the measures implemented for a user and to facilitate reviews or complaints, then destroy on schedule.
    • Destroy any inputs that have been collected immediately once the purposes of collection have been met. Personal information, including sensitive information, that is collected for SMMA compliance purposes (e.g. biometric information, biometric templates, identity documents) must be destroyed once all purposes have been met. Avoid purpose ‘padding’ and ensure destruction includes caches and storage.
    • Existing personal information used for age assurance does not need to be destroyed where the original purposes for its collection are ongoing. Using personal information that was collected for a non-SMMA purpose (e.g. age inference) for SMMA compliance purposes does not, by itself, put that information within the remit of s 63F of Part 4A. However, entities must comply with Australian Privacy Principle (APP) 6 to establish the basis for this type of secondary use.
    • Be thoughtful when designing consent requests for secondary uses and disclosures of personal information collected for SMMA. Secondary use and disclosure should be strictly optional and easily withdrawn. The consent request should be written and designed so users of all abilities can understand what they are being asked to agree to and change their mind.
    • Be transparent, at the moment it matters. Use APP 5 just-in-time notices to explain key information such as what is collected, why, by whom, how long it is retained, and the user’s choices (including alternative methods and review processes). APP 1 privacy policies should be updated with clear and transparent information, with clear policies and procedures to facilitate this transparency.

2.   Overview

Part 4A of the Online Safety Act 2021 (Part 4A) requires a provider of an age-restricted social media platform to take ‘reasonable steps’ to prevent age-restricted users (under 16 years) from having an account with the platform.1 The onus is on platforms to introduce systems, processes and controls that can be demonstrated to ensure that people under the minimum age cannot create and hold a social media account.

Part 4A does not prescribe what ‘reasonable steps’ platforms must take. The eSafety Commissioner (eSafety) is responsible for enforcing compliance with this obligation, and has published regulatory guidance on this topic. However, it is expected2 that at a minimum, the obligation will require platforms to implement some form of age assurance as a means of identifying whether a prospective or existing account holder is an Australian child under the age of 16 years.

Age assurance is an umbrella term for a set of processes and methods used to verify, estimate and/or infer the age or age range of an individual. This enables platform providers and third-party age assurance providers to make age-related eligibility decisions.3 These providers are collectively described as ‘entities’ in this guidance.

Part 4A is technology-neutral and does not mandate any single method or combination of methods. Whether an age assurance methodology meets the ‘reasonable steps’ requirement is to be determined objectively having regard to the suite of methods available, their relative effectiveness, costs associated with their implementation, and data and privacy implications on users, amongst other things.4 The Office of the Australian Information Commissioner (OAIC) recommends reading this guidance about entities’ privacy obligations alongside eSafety’s regulatory guidance about the reasonable steps platforms can take to comply with their safety obligations.

2.1.   What is personal information in the SMMA context?

For SMMA compliance, information involved in age assurance will likely be personal information because it is information or an opinion about an identified individual, or an individual who is reasonably identifiable. This includes situations where the information is inferred, generated or incorrect.

In practice, the personal information involved in age assurance may be one or more of the following:

    • Inputs – personal information about an individual that is collected and processed by an age assurance technology (e.g. photo, voice, document scan).
    • Outputs – the SMMA decision artefact created as part of the age assurance process (e.g. ‘16+ yes/no’ token) and linked to an account.
    • Existing personal information – information already held about an account holder.

An individual does not need to be named in the specific information for that information to be personal information. An individual can be ‘identified’ if they are distinguishable from others. For example, even if a name is not present, it may identify an individual, as it will usually be associated with a record of the user or could be linked back to the person it relates to.

Sensitive information is a subset of personal information that is generally afforded a higher level of privacy protection under the Australian Privacy Principles (APPs) than other personal information. This recognises that inappropriate handling of sensitive information can have adverse consequences for an individual or those associated with the individual.

Where there is uncertainty, the OAIC encourages entities to err on the side of caution by treating the information as personal or sensitive information and handling it in accordance with Part 4A and the Privacy Act obligations.

2.2.   Privacy obligations under the SMMA scheme

Part 4A of the Online Safety Act 2021 operates alongside the Privacy Act 1988 (Privacy Act) and APPs. Part 4A introduces additional, more stringent obligations on age-restricted social media platform providers and third-party age assurance providers when handling personal information for social media minimum age (SMMA) compliance purposes.

In summary, Part 4A privacy obligations are:

    • Purpose limitation (s 63F(1)) – An entity that holds personal information about an individual that was collected for the purpose of (or purposes including) the SMMA obligation must not use or disclose the information for any other purpose. The following exceptions apply:
      • in circumstances where APP 6.2(b), (c), (d) or (e) applies; or
      • with the voluntary, informed, current, specific and unambiguous consent of the individual (s 63F(2)).

    • Information destruction (s 63F(3)) – An entity that holds personal information about an individual that was collected for the purpose of (or purposes including) the SMMA obligation must destroy the information after using or disclosing it for the purposes for which it was collected.

Diagram 1 illustrates these obligations and references the sections of this guidance where the relevant issues are discussed.

Failure to comply with the obligations contained in s 63F is an interference with the privacy of the individual for the purposes of the Privacy Act. This brings non-compliance with s 63F within the remit of the Information Commissioner’s enforcement powers under the Privacy Act. It also entitles an individual to complain to the Information Commissioner about an alleged contravention of s 63F.

Steps to comply with the SMMA obligation will not be ‘reasonable’ unless an entity also complies with its information and privacy obligations under Part 4A, as well as the Privacy Act and the APPs.6

3.      Adopting a privacy by design approach when choosing an age assurance method or combination of methods

Age assurance methods have the potential to interfere with the privacy of individuals. Each scenario, or combination of scenarios, employs different technologies and processes and raises different privacy implications depending on how personal information is handled and the sensitivity of the personal information.

The OAIC encourages entities to adopt a ‘privacy by design’ approach when selecting an assurance method. A Privacy Impact Assessment (PIA) is a systematic assessment that identifies the privacy impact on individuals, and sets out recommendations for managing, minimising or eliminating that impact. A PIA demonstrates commitment to, and respect for, individuals’ privacy.

This guidance highlights some key privacy considerations for entities to consider, in accordance with the SMMA information lifecycle, particularly regarding collection, use, disclosure and destruction.

Other examples of privacy risks that could be captured and addressed through a PIA include:

    • Transparency – the complexity of age assurance methods can make it difficult to understand how personal information is used and how decisions about whether a user is an age-restricted user are reached. Entities should ensure they update their privacy policies (APP 1) and use notifications (APP 5) with clear and transparent information about their use of age assurance methods.
    • Accuracy and quality – issues in relation to accuracy or quality of information, particularly for inferred information (see 4.1, 4.3 and 7 below). Entities must comply with their obligation to take reasonable steps to ensure the accuracy of personal information under APP 10 when using age assurance methods.
    • Security and data breach – age assurance may increase the risks related to data breaches. This could be through unauthorised access or through attacks. It is important to consider an entity’s security obligations under APP 11 and the Part 4A destruction obligations when selecting an age assurance method.

Entities should also consider principles such as necessity and proportionality in implementing chosen technologies and methods, particularly given age assurance methods may involve the handling of personal and sensitive information such as biometric templates, behavioural signals and formal identification documents.

Entities should consider low-intrusion techniques within an age assurance method(s) and escalate to more intrusive information handling only as necessary. Entities should also consider the privacy impacts associated with each age assurance method (e.g. inference, estimation and verification) and whether the circumstances surrounding the specific chosen method(s) justify the privacy risks.

In determining whether an age assurance method is necessary, entities should consider factors including:

    • the suitability and effectiveness in addressing the SMMA obligation
    • whether the method is proportionate to the legitimate aim of preventing age-restricted users from having accounts, particularly where handling of sensitive information is proposed11
    • alternative age assurance methods available to address the SMMA obligation.

It is the responsibility of the entity to justify that the age assurance method is reasonably necessary. The fact that a particular age assurance method or combination of methods is available, convenient or desirable should not be relied on to establish necessity.

4.   Privacy guidance – collection

4.1.   New collection of information for SMMA compliance purposes

What it looks like

An entity asks a user to provide certain personal information or go through a process that allows the entity to collect personal information to determine whether the user is an age-restricted user (under 16 years) for SMMA compliance purposes.

Example – Age estimation

    • Facial age estimation that collects a single or burst of selfie photos, plus anti-spoof signals; this is processed on-device or via a third-party provider and returns a ‘16+ yes/no’ result.

Example – Age verification

    • Document check via on-device scan that reads the date of birth (DOB) from a government ID via an on-device app and returns a ‘16+ yes/no’ result.12
    • Tokenised assertion from a digital identity credential (provided by an accredited identity provider such as a bank, telco or education institution) that the user is 16+; no other identity attributes are collected.13

The OAIC provides the following practical considerations in relation to collection:

Minimise what you collect

    • Where possible, collect binary outcomes (‘16+ yes/no’) rather than DOB or exact age.
    • If scanning a document, only parse the DOB and redact or avoid non-DOB fields.

Process information temporarily

    • Use technology solutions and/or third-party age assurance providers that temporarily process personal information inputs (e.g. document images/fields, face frames, liveness videos) as part of age assurance and do not retain them.
    • Transient processing of personal information is considered a ‘collection’ where the information is included in a record.
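The “only parse the DOB” principle can be sketched as follows. This is illustrative only: the function name and the `DOB: YYYY-MM-DD` text layout are hypothetical stand-ins for whatever a real OCR pipeline produces, but the design point holds: read the one field you need, compute the binary result, and never touch the other fields on the document.

```python
import re
from datetime import date

def dob_to_16_plus(scanned_text: str, today: date) -> bool:
    """Parse ONLY the DOB field from OCR'd document text (hypothetical
    'DOB: YYYY-MM-DD' layout) and return a binary 16+ result. No other
    fields on the document are read or retained."""
    m = re.search(r"DOB:\s*(\d{4})-(\d{2})-(\d{2})", scanned_text)
    if m is None:
        raise ValueError("date of birth not found")
    y, mth, d = map(int, m.groups())
    # Standard age calculation: subtract one if the birthday hasn't occurred yet.
    age = today.year - y - ((today.month, today.day) < (mth, d))
    return age >= 16
```

The caller would persist only the boolean (plus method and timestamp), discarding both the scan and the parsed date of birth.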

4.2.   Using existing information directly to confirm the residency and age of an account holder

What it looks like

An entity uses information it already holds about a user to directly determine whether they are under 16 years. This is typically done to detect and deactivate accounts belonging to age-restricted users. Using existing information to infer the age or location of a user is discussed separately in Section 4.3.

Practical Considerations

The OAIC provides the following practical considerations when using existing information directly to comply with the SMMA obligation:

Minimise what you use

    • As long as the transparency and secondary use obligations are met, using existing information directly to confirm residency and whether the user is over 16 is a data-minimising option because it does not require a new collection or the handling of additional personal information.
    • Use only the fields that are needed to determine age or residency.

Document the APP 6 basis

    • Assess and be able to demonstrate the APP 6 basis for the use of the information.

Handle sensitive information carefully

    • Be very cautious if using existing biometric templates, images or other sensitive kinds of information for SMMA compliance purposes.
    • Ensure handling is necessary and proportionate to comply with the requirements of s 63F. If unsure, establish a clear expectation from the user and ensure a close relationship to the primary purpose of collection; otherwise obtain consent.

 

4.3.   Using existing information to infer the residency and age of an account holder

What it looks like

The entity uses information it already has about the account holder to infer whether they are under 16 years and whether they are ordinarily resident in Australia. This could involve drawing probabilistic conclusions based on behavioural patterns, contextual data, digital interactions, metadata or other information and subsequent collection of a 16+ decision artefact.

Examples include:15

Location-related signals

    • IP address, GPS or other location services
    • Device identifier, language, time settings
    • Phone number
    • App store, operating system, account settings
    • Photos, tags, connections, engagement and other kinds of content

Age-related signals

    • Age of account (e.g. the account has existed for 10 or more years)
    • Engagement with content targeted at children or early teens
    • Linguistic analysis or language processing
    • Analysis of end-user-provided information and posts
    • Visual content analysis (e.g. facial age analysis performed on photos and videos uploaded to the platform or entity)
    • Audio analysis (e.g. age estimation based on voice)
    • Connection with other end-users who appear to be under 16
    • Membership in youth-focused groups, forums or communities

Although the use of information for age inference may result in a more frictionless experience for the individual, it may also result in the collection and retention of disproportionate amounts of personal information in a way that undermines individuals’ privacy.

Practical considerations

Different cohorts of users may require different approaches. eSafety guidance confirms there is no one-size-fits-all approach that will be suitable in all circumstances. For a substantial proportion of users on long-standing platforms, it may be possible to confirm at a high level of confidence that they are 16+ years old based on the account tenure or creation date. More work, effort and personal information will be required to infer age where account tenure is short, or where the user is in a younger age threshold.

The OAIC recommends taking a risk-based approach which ensures information used for inference is proportionate and privacy impacts are minimised. This means less sensitive information is preferred over more sensitive information to achieve an acceptable inference outcome. It also means that where privacy risks are higher, entities should explore other methods for age assurance.

The OAIC provides the following proportionality considerations tailored to age inference, drawing on the factors outlined in Section 4.2 above:

Sensitivity – How sensitive is the personal information you plan to reuse, and what harm could result if it is wrong or mishandled?

    • Prefer non-sensitive, non-content signals such as metadata and system settings.
    • Treat behavioural and content data (e.g. posts, events, groups, interests, affinities, communications and other user interactions) as higher privacy risk.

Volume – How much, how often and for how long will you use personal information for inference?

    • Use event-based, point-in-time checks.
    • Avoid building long-lived behavioural profiles; only add more signals if they materially improve confidence.

Purpose – Is the reuse strictly necessary to achieve the SMMA decision and nothing more?

    • Define the outcome precisely and assess whether inference is an effective means of achieving it.
    • Use a less intrusive method if it can deliver the same outcome while using less personal information.

Relatedness – How closely is the reuse of personal information for age inference related to the original purpose?

    • Ask whether an individual would reasonably expect the personal information to be reused for age assurance purposes.

eSafety’s regulatory guidance provides further detail on assessing the reliability, accuracy, robustness and effectiveness of age inference as a method of age assurance.

To minimise privacy impacts on individuals, the OAIC recommends handling less sensitive information over more sensitive information (e.g. age analysis performed on photos and videos, or audio analysis on voice), to achieve an acceptable inference outcome. It also means that where privacy risks are higher, entities should explore other methods for age assurance.

5.   Privacy guidance – destruction

5.1.   General obligation to destroy personal information

What it looks like

When conducting age assurance activities to comply with the SMMA obligation, an entity will likely collect and handle personal information relating to current and prospective users.

Examples include:

    • Inputs (e.g. document images/text, selfies, biometric information, biometric templates) that are used for a point-in-time age check.

    • SMMA artefact (e.g. 16+ flag) that is created from inputs, existing DOB information on file or inferred from multiple data points.
    • Third-party assertion/token received from a third-party provider.
    • Documents received as part of a formal review or complaint escalation process to comply with the SMMA obligation.

Practical considerations

The OAIC provides the following practical considerations in relation to destruction:

Distinguish between inputs and outputs

    • Age assurance inputs (generally higher risk) – examples include document images/text, selfies, liveness videos, other biometric information or templates and any other personal information that is used as input for an age assurance method.

    • Process for the purpose of age assurance, then destroy immediately.
    • Do not store inputs ‘just in case’.
    • Ensure destruction covers caches and transient storage.
    • Age assurance outputs (generally lower risk) – examples include binary outcomes (16+ yes/no), methods, provider IDs, timestamps and non-linkable references/tokens; third-party assertions or tokens received from a third-party provider (such as a bank, telco or education institution).
      • Retain strictly for limited purposes – that is, evidence of compliance, troubleshooting, complaint or review handling, and dealing with fraud or circumvention.
      • Set bright-line, limited retention windows. Ring-fence the age assurance outputs.
    • To ensure compliance with the s 63F destruction obligation, the entity should create a distinct ring-fence or ‘SMMA environment’ that enables it to be fully aware of the outputs that it handles and where they are kept.
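The process-then-destroy pattern for higher-risk inputs can be sketched as follows. This is a minimal illustration, not a prescribed implementation: the field names and the facial-age-estimation method are assumptions used only to show the shape of the flow.

```python
# Illustrative sketch: process age assurance inputs for a single
# point-in-time check, keep only the minimal binary output, and destroy the
# inputs (including any transient working copy) immediately afterwards.
def run_age_check(inputs: dict) -> dict:
    """Hypothetical check: derive a binary 16+ outcome, then destroy inputs."""
    transient = dict(inputs)  # transient working copy (stands in for a cache)
    try:
        outcome = {
            "over_16": transient["estimated_age"] >= 16,  # binary output only
            "method": "facial_age_estimation",
        }
    finally:
        transient.clear()  # destruction also covers transient copies/caches
        inputs.clear()     # do not store inputs 'just in case'
    return outcome

selfie_inputs = {"selfie": b"...", "estimated_age": 19}
result = run_age_check(selfie_inputs)
print(result["over_16"], len(selfie_inputs))  # True 0 (inputs destroyed)
```

Note that the output retained is the lower-risk artefact only; none of the raw input survives the check.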

Different entities will have different implementation arrangements. For example:

    • Physical/logical separation – Combination of people, technology and processes to ensure that personal information for SMMA is separated from other parts of the entity and only interfaces with the entity in limited and controlled ways.
    • Documented boundary – To aid compliance and demonstrate accountability, the SMMA environment could be documented in a way that shows the inputs, transient processing, outputs, retention points and destruction paths.
    • Destruction readiness – The environment could be configured such that personal information for SMMA is able to be destroyed automatically and independently of other organisational data.

There may be legitimate business reasons for co-mingling personal information for SMMA with other personal information (e.g. processing them in shared pipelines or storing them in shared databases). However, this may make it harder to prove purpose limitation and to comply with the strict destruction obligation. Each entity needs to make its own assessment, considering the compliance requirements in s 63F of Part 4A.

The most straightforward path to compliance, and the one that best aligns with the intention of s 63F, is to ring-fence personal information collected for SMMA compliance purposes.

5.2.   Information destruction when there are multiple purposes

What it looks like

Section 63F(3) of Part 4A acknowledges there may be multiple purposes for which the personal information is collected, as long as compliance with the SMMA obligation is one of them. A relevant consideration for destruction is what happens in such circumstances, especially where one or more of the other purposes may require the information to be retained for longer than compliance with the SMMA obligation.

Examples include:

    • Sign-up age check – User completes facial age estimation to open an account. The same event creates a short-lived decision artefact for audit logging and review purposes.
    • One age gate, several compliance needs – A single age check is used to satisfy the entity’s obligations with respect to (i) SMMA and (ii) another jurisdiction’s age rule.

    • Know Your Customer flow for creator – An ID and selfie are captured for AML/CTF onboarding; the entity also needs to know that the creator is over 16 years to comply with the SMMA obligation.

Practical considerations

The OAIC provides the following practical considerations when considering destruction in the context of multiple purposes:

Avoid ‘purpose padding’

    • Consistent with Chapter 6 of the APP Guidelines, purposes must be construed narrowly and not be so general in nature that they comprise a function or activity of an entity. Do not include broad, speculative or open-ended purposes as part of collection for age assurance (e.g. product improvement, research).

Additional purposes must be genuine. Merely asserting that the collection is for other purposes does not allow you to retain the information collected for longer than compliance with the SMMA obligation.

Develop a retention matrix

    • Where the information collected (e.g. SMMA artefact) serves multiple purposes, ensure that each purpose has a defined retention period and destroy the information once the last retention period has expired.
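A retention matrix of the kind described can be sketched as a simple mapping from purpose to retention window, with destruction scheduled for when the last window expires. The purposes and windows below are hypothetical examples, not periods endorsed by the OAIC.

```python
# Illustrative sketch: a retention matrix mapping each purpose for which an
# SMMA artefact is held to a defined retention window. The artefact is
# destroyed once the last (longest) window has expired.
from datetime import date, timedelta

retention_matrix = {
    "evidence_of_compliance": timedelta(days=365),   # hypothetical window
    "complaints_and_reviews": timedelta(days=90),    # hypothetical window
    "fraud_and_circumvention": timedelta(days=180),  # hypothetical window
}

def destruction_date(collected: date, matrix: dict) -> date:
    """Destroy only once the longest applicable retention window has passed."""
    return collected + max(matrix.values())

collected = date(2025, 12, 10)
print(destruction_date(collected, retention_matrix))  # 2026-12-10
```

In practice each purpose would also gate access during its own window; the point illustrated here is that the destruction trigger is the expiry of the last remaining purpose.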

Further partition the personal information where there are additional requirements

    • If a different legal regime (e.g. AML/CTF, overseas jurisdiction) requires retention following an age check, produce and retain separate non-SMMA artefacts or records.

5.3.   Information retention in limited circumstances

What it looks like

There are narrow situations where an entity may need to retain a minimal record after an age check to operate the service responsibly and evidence compliance.

Examples include:

    • Audit logging and evidence of compliance – Prove that a check has occurred, the outcome, how it was done, and when.
    • Troubleshooting, fraud and circumvention – Investigate errors, suspected spoofing and re-registration attempts.
    • Complaints and reviews – Respond to user/parent challenges to the age check or its outcome.

In such cases, it is sufficient that a SMMA artefact is collected and retained, which contains minimal information such as binary outcome (16+ yes/no), method, provider ID, timestamp and non-linkable reference/token.
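The minimal SMMA artefact described above can be sketched as a small immutable record. The field names here are assumptions for illustration, not a schema prescribed by the guidance; the key property is that the record holds only the outcome and audit metadata, never the raw inputs.

```python
# Illustrative sketch of a minimal SMMA artefact: binary outcome, method,
# provider ID, timestamp and a non-linkable reference. Field names are
# hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the audit record cannot be mutated later
class SmmaArtefact:
    over_16: bool        # binary outcome (16+ yes/no)
    method: str          # e.g. "facial_age_estimation"
    provider_id: str     # age assurance provider identifier
    checked_at: str      # ISO 8601 timestamp of the check
    reference: str       # non-linkable reference/token; no raw inputs retained

artefact = SmmaArtefact(True, "facial_age_estimation", "provider-01",
                        "2025-12-10T09:00:00Z", "ref-7f3a")
print(artefact.over_16)  # True
```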

Privacy considerations

The OAIC considers that tightly limited retention of personal information is acceptable and can be done in accordance with Part 4A and the Privacy Act.

All the practical considerations above regarding destruction in the context of multiple information collection purposes are applicable here. In particular, the entity should be transparent about the directly related purposes arising from the age check that involve retention for a longer period.

The one additional consideration is for entities to set time-based limits for each purpose that involves personal information for SMMA (e.g. evidence of compliance, troubleshooting, complaints and reviews). The timing should be justified by the business practice and accord with standard industry practice.

The time-limits for each purpose should determine when and how the personal information is accessed and used. Once the time period for the last allowed purpose has expired, the entity should destroy the relevant artefact.

6.   Privacy guidance – secondary use or disclosure of personal information collected for SMMA compliance purposes

What it looks like

An entity may seek to reuse age assurance inputs for other business purposes or disclose the output (e.g. 16+ artefact) to another entity.

Practical considerations

The OAIC provides the following practical considerations when seeking to use or disclose personal information used for SMMA for secondary purposes with unambiguous consent.

Consented purposes

    • Limit what you use and disclose: Use or disclose only a binary assertion (‘16+ yes’), via one-time or short-lived tokens where possible, that are specific as to purpose.

    • Make consent truly optional: Implement a separate consent flow dedicated to secondary purposes; do not bundle with the SMMA purpose. Avoid general or broad terms of use or agreement obtained through use of dark patterns. Set defaults to ‘off’.
    • Design for users of all abilities: Present icons, visuals and choices in the user interface. Offer additional clarifying information and prompts to aid comprehension. Implement easy withdrawal toggles in a dedicated privacy setting or contextually appropriate screen.
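The separate, default-off consent flow described above can be sketched as follows. The purpose names are hypothetical; the point illustrated is that secondary disclosure is gated on an explicit opt-in that is independent of the SMMA check itself.

```python
# Illustrative sketch of a consent record for secondary use that is kept
# separate from the SMMA check and defaults to 'off'. Names are hypothetical.
secondary_consent = {
    "share_16plus_with_partner": False,  # default 'off'; never bundled with SMMA
}

def may_disclose(purpose: str, consents: dict) -> bool:
    """Disclose only the binary assertion, and only with explicit opt-in;
    an unknown or unset purpose is treated as no consent."""
    return consents.get(purpose, False)

print(may_disclose("share_16plus_with_partner", secondary_consent))  # False
secondary_consent["share_16plus_with_partner"] = True  # explicit user opt-in
print(may_disclose("share_16plus_with_partner", secondary_consent))  # True
```

Withdrawal is then just flipping the flag back to False, matching the guidance that withdrawal should be as easy as consent.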

Exceptional circumstances

    • Exceptional circumstances are non-routine. However, as a matter of best practice, it is useful for entities to have processes in place to deal with them. For example:
      • Identify the presenting issue and which APP 2 exception is relevant.
      • Apply a necessity and proportionality test to determine whether use or disclosure is justified.
      • Default to using or disclosing the minimum amount of information necessary.
      • Keep a record of the decision(s) made and action(s) taken.

7.   Privacy guidance – frequency of checks

The SMMA guidance issued by eSafety observes that the measures taken by platforms to comply with the SMMA obligation should not be static. Rather, ‘[p]roviders should proactively monitor and respond to changes in their platforms’ features, functions, and end-user practices, especially where these or other changes may introduce new risks.’ Furthermore, eSafety expects platforms to take proactive steps to detect accounts held by age-restricted users on an ongoing basis.

The OAIC notes that steps taken by entities to comply with the SMMA obligation on an ongoing basis will likely handle personal information (including collection and reuse) in ways that are addressed by the preceding sections.

Ongoing compliance (e.g. recurring checks or triggers) should be proportionate and necessary to comply with the SMMA obligation. Any reuse that relies on existing personal information should have consent or another clear legal basis (APP 6). Entities should build and maintain their age assurance practices so that quality (APP 10), security and retention limitations (APP 11) are enforced by design.

 
