Compliance

DPIAs, Article 22, and Automated Dunning: What UK SaaS Needs to Know

When automated payment recovery triggers UK GDPR DPIA requirements. Article 35 thresholds, Article 22 rights, and what a dunning DPIA must cover.

Rekko Team
April 8, 2026
9 min read
dpia, gdpr, article 22, uk

Your product manager has a clever idea. Instead of running the same dunning sequence for every failed payment, the system will score customers on predicted recovery probability, tenure, engagement, and historical payment behaviour. High-score customers get a gentle 3-step sequence. Low-score customers get a more aggressive 5-step sequence that ends with a service-suspension warning. The scoring is automated and happens at the moment the Stripe webhook fires.
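To see why this counts as profiling, it helps to write the idea down. A minimal sketch of that routing logic (all class names, feature weights, and thresholds below are hypothetical illustrations, not Rekko's implementation):

```python
# Hypothetical sketch: score-based dunning path selection.
# Even simple rule-based scoring like this is profiling under UK GDPR Art. 4(4).

from dataclasses import dataclass

@dataclass
class Customer:
    tenure_months: int
    engagement_score: float   # 0.0 - 1.0, e.g. from product analytics
    past_failures: int

def recovery_score(c: Customer) -> float:
    """Rule-based prediction of recovery likelihood (illustrative weights)."""
    score = 0.5
    score += min(c.tenure_months, 24) / 24 * 0.2   # longer tenure -> higher
    score += c.engagement_score * 0.3              # active users recover more often
    score -= min(c.past_failures, 5) * 0.05        # repeat failures -> lower
    return max(0.0, min(1.0, score))

def choose_sequence(c: Customer) -> str:
    # This branch is "a decision based on automated processing": low-score
    # customers get a materially different, harsher experience.
    return "gentle-3-step" if recovery_score(c) >= 0.6 else "aggressive-5-step"
```

Note that nothing here involves machine learning; a handful of `if` statements evaluating personal aspects is enough to bring profiling rules into play.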

That change looks operational. It is actually a data protection decision. You may have just triggered UK GDPR Article 35 DPIA obligations and, depending on the consequences for the customer, Article 22 rights around solely automated decision-making. This article walks through the threshold, what a DPIA for dunning needs to cover, and how Article 22 intersects with payment recovery.

General information only, not legal advice. If you are introducing profiling or automated scoring into a dunning workflow, get a qualified DPO or data protection lawyer involved before you ship.

The two thresholds that matter

Two separate UK GDPR articles come into play when automation starts making decisions about customers.

Article 35 requires a Data Protection Impact Assessment before you start any processing likely to result in a high risk to the rights and freedoms of natural persons. Article 35(3) lists mandatory triggers, including "systematic and extensive evaluation of personal aspects... based on automated processing, including profiling, on which decisions are based that produce legal effects... or similarly significantly affect" the data subject.

Article 22 gives data subjects the right not to be subject to decisions based solely on automated processing, including profiling, which produce legal effects or similarly significantly affect them. It has three exceptions (contract, law, explicit consent) and specific safeguards even when an exception applies.

The two articles use similar language but do different work. Article 35 says "you must assess risk before you process." Article 22 says "the data subject has a right in relation to certain automated decisions." A dunning workflow can trigger one, both, or neither, depending on how it is designed.

When routine dunning does not need a DPIA

Standard dunning, the kind Rekko runs by default, usually does not need a DPIA. A typical sequence looks like this.

  • Stripe webhook fires on invoice.payment_failed
  • The system checks the customer is not opted out
  • A pre-written message is sent via email and SMS on a fixed schedule
  • The same sequence runs for every failed payment, regardless of customer attributes
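In code, that uniform flow is deliberately boring. A sketch (handler shape and field names are assumptions, not Rekko's actual code):

```python
# Hypothetical sketch of a uniform dunning trigger: the same fixed schedule
# for every customer, no scoring, no per-customer branching.

FIXED_SCHEDULE_DAYS = [0, 3, 7]  # identical for everyone

def handle_invoice_payment_failed(event: dict, opted_out: set[str]) -> list[dict]:
    """Process a Stripe invoice.payment_failed webhook payload."""
    customer_id = event["data"]["object"]["customer"]
    if customer_id in opted_out:
        return []  # respect the opt-out before doing anything else
    # Schedule identical pre-written reminders; no evaluation of personal aspects.
    return [
        {"customer": customer_id, "template": "payment-reminder", "send_in_days": d}
        for d in FIXED_SCHEDULE_DAYS
    ]
```

Because the only per-customer check is the opt-out flag, there is nothing here for Article 35(3) or Article 22 to attach to.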

There is no profiling. There is no automated decision. There is no evaluation of personal aspects. The processing is minimal, contractually necessary (see our lawful basis article), and carries low risk to data subjects. The ICO's DPIA screening guidance does not flag this kind of processing as high risk, and Article 35(3) does not bite.

You still need a Record of Processing Activities entry, a privacy notice disclosure, and appropriate security measures, but a full DPIA is not required.

When dunning does need a DPIA

The picture changes the moment you add any of the following.

Customer scoring or segmentation. If you score customers on predicted recovery likelihood, CLV, engagement, churn risk, or any other attribute and use that score to choose a dunning path, you are doing profiling. Even if the scoring is simple (rule-based, not machine learning), it is still profiling under UK GDPR Article 4(4).

Differential treatment based on the score. If high-score customers get one experience and low-score customers get a materially different one, including more frequent messages, more aggressive escalation, faster suspension, or exclusion from grace periods, that is "a decision based on the automated processing."

Significant consequences. If the decision affects the customer in a way that is more than trivial (losing access to a service they rely on, being charged fees, being reported to a credit bureau, being fast-tracked to debt collection), you are in "similarly significantly affects" territory.

Hit all three and Article 35(3)(a) plus ICO guidance on "evaluation or scoring" push you squarely into DPIA territory. The ICO's guidance adopts the European guidelines' list of nine criteria for "likely high risk" processing, and you generally only need to hit two to trigger a mandatory DPIA. Automated scoring plus significant consequences usually hits two by itself.

Other common DPIA triggers in dunning workflows include:

  • Combining data from multiple sources (Stripe, your CRM, an analytics tool, an external credit reference) to build the score
  • Processing at scale (thousands of customers per month)
  • Using data in a way that differs from what the customer would reasonably expect
  • Sharing scored data with third parties like collection agencies

What a dunning DPIA must cover

Article 35(7) sets out the minimum content for a DPIA. For a dunning workflow with scoring, a thorough DPIA should answer the following.

Description of the processing

  • What data you use (identity, transaction history, engagement signals, failed payment metadata)
  • Where it comes from (Stripe, your application database, analytics platform)
  • How the scoring model works, what features it uses, and how decisions map to actions
  • Who has access to the scores and decisions
  • How long you retain the score and the underlying data
  • Your lawful basis (usually contract, possibly legitimate interests for the scoring layer specifically)

Necessity and proportionality

  • Why you need to score rather than running a uniform sequence
  • Whether a less intrusive alternative would achieve the same recovery outcome
  • Data minimisation: are you using only the features you actually need
  • Whether the scoring distinguishes in a way that is fair and justifiable

Risks to data subjects

  • Error risk: what happens if the model misclassifies a customer
  • Fairness risk: does the model disadvantage particular groups (for example, newer customers, customers in certain regions, customers using particular payment methods)
  • Transparency risk: can customers understand what is happening to them
  • Psychological risk: more aggressive messaging toward already-stressed customers
  • Loss of service: if low-score customers are fast-tracked to suspension, they lose access to the service

Mitigations

  • Human review of high-impact decisions (for example, anything that ends with suspension)
  • Clear opt-out and objection paths
  • Explanation rights: telling the customer what factors influenced their treatment
  • Regular model review and bias testing
  • Retention limits on scores
  • Logging and auditability
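On the logging and auditability point, one workable shape (field names here are assumptions, not a prescribed schema) is an append-only decision record that captures the inputs, the score, and the outcome, so the decision can be explained and contested later:

```python
# Hypothetical append-only decision log entry: enough detail to explain,
# audit, and contest an automated dunning decision after the fact.

import json
import datetime

def log_dunning_decision(customer_id: str, features: dict,
                         score: float, action: str) -> str:
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "customer_id": customer_id,
        "features_used": features,     # supports explanation rights
        "score": score,
        "action": action,              # e.g. "gentle-3-step"
        "model_version": "rules-v1",   # supports regular model review
    }
    return json.dumps(record)  # append this line to durable, tamper-evident storage
```

Recording the model version alongside each decision is what makes "regular model review and bias testing" auditable rather than aspirational.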

Consultation

Article 35(9) encourages consulting data subjects or their representatives "where appropriate." In practice for B2B SaaS this usually means a note in the DPIA explaining why formal consultation is not proportionate, and relying instead on transparency through the privacy notice.

If residual risk remains high after mitigations, Article 36 requires you to consult the ICO before starting the processing. Most dunning DPIAs do not reach that threshold, but the option exists.

Article 22: the solely automated decision question

Article 22 is separate from the DPIA question and more narrowly focused. It gives data subjects the right not to be subject to a decision "based solely on automated processing... which produces legal effects concerning him or her or similarly significantly affects him or her."

Two conditions must be met.

  1. The decision is solely automated. If a human meaningfully reviews and can change the outcome, Article 22 does not apply.
  2. The decision produces legal effects (affects the customer's legal rights) or similarly significantly affects them (changes their circumstances or behaviour in a meaningful way).

For standard dunning, condition 2 is usually not met. Sending a reminder email is not a legal effect and is not "similarly significant." For dunning with automated scoring and automated service suspension, condition 2 can be met, because suspension changes the customer's ability to use the service they are paying for.

If Article 22 applies, you need one of three exceptions.

  • Contract. The decision is necessary for entering into or performing a contract. This often works for subscription suspension, because the contract itself says non-payment leads to suspension.
  • Law. A UK or EU law authorises the decision. Rare for dunning.
  • Explicit consent. Not a good fit for dunning for the same reasons consent is a weak basis generally.

Even under an exception, Article 22(3) requires you to implement "suitable measures to safeguard the data subject's rights and freedoms," at minimum the right to obtain human intervention, the right to express their point of view, and the right to contest the decision. For a dunning system that automatically suspends service, that means a clear path for the customer to talk to a human and get the suspension paused or reversed.

Practical pattern: keep humans in the loop for anything significant

The cleanest design is one that keeps routine messaging fully automated (no Article 22 issue) and routes any significant decision through a human.

  • Automated: sending a reminder email or SMS based on a failed payment event

  • Automated: retrying the charge on a schedule

  • Automated: flagging the account for review if scores indicate high risk

  • Human-reviewed: deciding to suspend service

  • Human-reviewed: deciding to send to collections

  • Human-reviewed: deciding to apply a non-standard sequence to a specific customer
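That split can be enforced mechanically rather than by convention: anything on the significant list is parked in a review queue instead of executing. A sketch (action names and callbacks are hypothetical):

```python
# Hypothetical sketch: route significant actions to a human review queue
# so no suspension or collections decision is solely automated.

SIGNIFICANT_ACTIONS = {"suspend_service", "send_to_collections", "custom_sequence"}

def dispatch(action: str, customer_id: str,
             execute, enqueue_for_review) -> str:
    """Execute routine actions; park significant ones for human sign-off."""
    if action in SIGNIFICANT_ACTIONS:
        enqueue_for_review(action, customer_id)  # a human decides; Art. 22 "solely
        return "queued_for_human_review"         # automated" condition is not met
    execute(action, customer_id)                 # e.g. send_reminder, retry_charge
    return "executed"
```

The review step only helps if it is meaningful: the reviewer must have authority and information to change the outcome, not just a rubber stamp.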

That split keeps you out of Article 22 for the hard parts, even if Article 35 still requires a DPIA on the scoring layer.

How Rekko approaches this

Rekko's default product is deliberately non-profiling. Every failed payment runs through the same sequence unless a human explicitly configures a different one per Stripe account or customer segment. There is no automated scoring, no profiling, no automated suspension, and no solely automated decision with significant consequences. That keeps the default deployment out of Article 22 territory and out of mandatory DPIA territory for most customers.

If you do build segmentation on top (for example, by connecting Rekko to a customer scoring tool you maintain yourself), the profiling and DPIA responsibility sits with you as controller. Rekko acts as a processor for the message delivery step and provides the transparency, logging, and opt-out infrastructure you need to mitigate risk. Our DPA covers the processor role, and the message logs give you the audit trail Article 30 and Article 22 transparency both expect.

Start your 14-day free trial, no credit card required. Or compare Rekko to Churnkey and Recurly.

Sources

  • UK GDPR Articles 22, 35, 36
  • ICO guidance on DPIAs, including the list of "likely high risk" criteria
  • ICO guidance on automated decision-making and profiling
  • Data Protection Act 2018

This article is general information. If you are introducing automated decision-making into a dunning or collections workflow, speak to a qualified data protection lawyer before you go live.
