Data Standards Advisory Committee, Meeting Minutes
Date: Wednesday 12 June 2024
Location: Held remotely via MS Teams
Time: 10:00 to 12:00
Meeting: Committee Meeting No: 61
Attendees
- Andrew Stevens, Data Standards Chair
- Jeremy Cabral, Finder
- Damir Cuca, Basiq
- Prabash Galagedara, Telstra
- Gavin Leon, CBA
- Peter Leonard, Data Synergies Pty Ltd
- Drew MacRae, Financial Rights Legal Centre
- Lisa Schutz, Verifier
- Aakash Sembey, Simply Energy
- Richard Shanahan, Tiimely
- Stuart Stoyan, MoneyPlace
- Zipporah Szalay, ANZ
- David Taylor, Westpac
- Tony Thrassis, Frollo
- Naomi Gilbert, DSB
- Elizabeth Arnold, DSB
- Ruth Boughen, DSB
- RT Hanson, DSB
- Jarryd Judd, DSB
- Terri McLachlan, DSB
- Michael Palmyre, DSB
- Hemang Rathod, DSB
- Nathan Sargent, DSB
- Rita Mohan, OAIC
- James Kelly, Treasury
- Aidan Storer, Treasury
- Alysia Abeyratne, NAB
- Jill Berry, Adatree
- Brenton Charnley, Mastercard
- Melinda Green, Energy Australia
- Colin Mapp, Independent
Chair Introduction
The Data Standards Chair (Chair) opened the meeting and thanked all committee members and observers for attending meeting # 61.
The Chair acknowledged the traditional owners of the various lands from which the committee members joined the meeting. He acknowledged their stewardship and ongoing leadership in the management of water, land and air and paid respect to their elders, past, present and those emerging. He joined the meeting from Cammeraygal land.
The Chair noted that the Technical Team completed Maintenance Iteration # 19 and held a number of NFR & InfoSec Consultative Group meetings. The Engineering team released updated versions of a number of repositories, and the CX team updated the CX Guidelines.
The Chair noted the DSB’s participation in a University of Melbourne mentoring program with master’s students. The students have produced a research report, included at Attachment B, that considers how the CDR and Digital ID reforms could support overseas visitors migrating to Australia in accessing key digital and financial services.
The Chair acknowledged and thanked CSIRO’s Data61 and the Australian Competition & Consumer Commission (ACCC) for hosting the recent Standards Assessment Framework workshops in Sydney and Melbourne.
The Chair welcomed Jeremy Cabral, COO and co-founder of Finder, to the committee.
The Chair noted that Alysia Abeyratne (NAB), Jill Berry (Adatree), Brenton Charnley (Mastercard), Melinda Green (Energy Australia) and Colin Mapp (Independent) were apologies for this meeting.
Minutes
The Chair thanked the DSAC Members for their comments on the Minutes from the 8 May 2024 meeting. The Minutes were formally accepted.
Action Items
The Chair noted that the Action Item for the ACCC to invite the Digital Platform Inquiry team to a future meeting will be carried forward.
Forward Agenda
The Chair noted that a list of proposed topics that the DSB would present to DSAC members had been included in the papers.
Working Group Update
A summary of the Working Groups was provided in the DSAC Papers and taken as read.
Technical Working Group Update
Technical Working Group update taken as read.
The Chair noted that the Deceptive Patterns Assessment would be of interest to members going forward and this will be added to the Agenda for the July meeting.
ACTION: Add Deceptive Patterns Assessment as an Agenda Item to the July meeting
Consumer Experience (CX) Working Group Update
Consumer Experience Working Group update taken as read.
Stakeholder Engagement
Jarryd Judd, the Engagement Manager from the DSB provided an update as follows:
In terms of key terminology, the DSB operates in two phases: a “build and maintain” phase and an “operate” phase. The DSB uses this nomenclature to progress work from engagement with the community through to implementation.
The “build and maintain” side is where the heavy lifting happens. The DSB seeks feedback, including via maintenance iterations and Consultative Groups, which gives it further insight into complex issues.
The “operate” side covers ongoing activities to support the community. This includes the support portal, implementation guidance, the CX Guidelines, and engineering tooling (at least nine different tools and platforms supporting engineering teams, who reach out for feedback and insight on their use).
Over the last four years, the DSB has built communication channels, starting with the implementation call, weekly newsletter and a YouTube Channel focused on pointed videos to give feedback and explain particular topics.
Some key metrics follow:
- The YouTube channel has had 3,531 views so far in 2024, compared with a total of 5,315 views in 2023.
- CX guidelines have expanded two-fold since late last year
- Addition of a new subdomain (cx.cds.gov.au), which makes the CX Guidelines much easier to reach.
- Weekly newsletter has remained steady
- LinkedIn has an engagement rate of 10% on posts. It is early days, and it would be great if members could like, share, and comment on posts.
In terms of shifts in engagement:
- The CDR support portal remained steady at 2.2K to 2.4K visitors per month, with the number of queries reduced from 1-2 per day to 1-2 per week.
- The Standards have 7.6K to 7.8K visitors per month, and the CX Guidelines have grown from 300 visitors per month in late 2023 to 800 in May 2024.
- Implementation Call attendance has dropped from 110-120 per week to 50-70 per week, with very few questions raised; this correlates directly with the reduced pace of implementation change.
- The newsletter has declined from 1,100 to 950 over the last year, but the open and click rates remain consistent.
- Social: unique impressions vary widely from 700 to 1,500. The DSB is experimenting with how best to catch people’s attention.
In terms of next steps, the DSB asked members the following:
- How would you like to engage with the DSB?
- What content would you like to see from the DSB?
- Which are the best channels to engage with you?
One member noted there are a lot of different methods to engage with the community. Because of the breadth and volume of information, one of the challenges is making sure they’re across everything. They noted the importance of understanding what forums are critical to engage with to ensure they are not missing out on important information.
One member asked how consumers find an impartial and useful description of the CDR system and why it is trustworthy and secondly, how to increase engagement with the consumer standards development process. They find it extraordinarily difficult to engage in the standards process although they live and breathe data and data governance. It’s impossible to get the general public to engage in the standards process and we need to set some achievable objectives on engagement.
One member noted they’d like to see more LinkedIn engagement as this is the only place you ever see public discourse around the CDR. They are not sure if GitHub is part of this community engagement piece but find it confusing and hard to use. It’s also difficult to know where to go for information on the CDR, as it is spread around and complex. They asked what happened to the budget for marketing the CDR. While consumers won’t engage with technical consultation, they need to understand that the CDR is safe and secure.
One member noted they like to engage with the DSB via emails, workshops and newsletters and agreed that GitHub has its difficulties. They’d like to see content described with its purpose, for example the Action Initiation Experiment, and the InfoSec & NFR Consultative Group which are worthwhile.
One member noted that in terms of channels, less is more. It would be useful to have fewer to monitor and be purposeful about how they’re used. Some channels are great for pushing out a message and the workshops are helpful for dialogue on proposals. As Decision Proposals evolve, it would be helpful to have clearer notes on what’s changed in the iteration, what feedback has been incorporated, and what has not, which helps with transparency and ease of digestion of those revisions.
One member felt that we’ve failed the public in terms of education of the CDR. We’ve talked about campaigns and funding being set aside for years. We need a grassroots education piece on what the CDR is and how and why to use it. In terms of engaging with the public, the CDR is not fully understood, and the feedback will most likely be that they don’t understand or don’t trust it. We need to be careful on public discourse.
Standards Assessment Framework
The Chair noted that Elizabeth Arnold, a Business Analyst at the DSB, would present on the Standards Assessment Framework (SAF).
The DSB noted that the engagement to date on the SAF had been summarised in the papers, which were taken as read.
The DSB noted they were pleased with the level of in-person engagement, although there has been little feedback from the self-led sessions. They have had only four bilateral sessions, one response via email, and none posted on the GitHub issue. The closing date for feedback was 12 June.
The DSB noted that a number of concerns were raised about who would be required to use the Framework and how it would be applied. The Framework is for the Chair and assessments are carried out by the DSB. However, the intent is for change requesters to be aware of the Framework and what is being considered. This will be made clear when the framework is finalised.
The Chair noted that as a Framework he would apply this in his standards decision-making process but also in the communication of decisions or seeking further consultation or feedback.
The DSB noted that some of the key findings were that there needs to be triage of proposals, and that some of the terms need clearer definitions. There was little feedback about changes to the framework itself, but more went to how the DSB uses it and documents outcomes.
Participants moved to breakout rooms to workshop Activity # 1: Rationale and Activity # 2: Implementation Feasibility, returning after 20 minutes.
A summary from each group is provided below:
Group 1
- Changes often focus on the change and solution without fully unpacking the problem. Starting with the problem is key, and linking it to the intent, KPIs and success metrics around what we want to achieve.
- Everyone raises a change for a benefit, but we need to consider how it shifts the dial, and how we measure the problem and the key result areas.
- The CDR ecosystem has three primary players - consumer, data holders (DH) and accredited data recipients (ADRs). We always talk about the consumer and DH, but the consumer only gets value from solutions that ADRs bring to market.
- From a framing perspective, we should be use case conscious. There are specific use cases where the CDR provides immense benefit, for example account verification and lending, and we need to measure on that basis. When changes are requested, we need to be able to see, from a use case perspective, how much of an impact they will have on uptake.
Group 2
- Change should be linked to priorities
- Quantification should be used to justify problem statement
- Need to set out use cases and practical implications
- Need to be clear on who the proposer is and audiences benefitting and impacted by the change
- Proposals shouldn’t require too much work up front – we should consider what is MVP to get it across the line
- Further consideration should be given to how the voice of the consumer is represented
- We need to drill down further on security, privacy, and risks
- The process through which the framework would be applied should be clear and visible to users
Group 3
- Categories of problem would be useful to assist in defining what the problem is. A flow diagram of the process at each stage, and how decisions are made, would also be useful.
- The “why” is the most important part and should be linked back to the CDR purpose and success measures.
- There is a high degree of variability in implementation impacts that is not visible to ADRs.
- More testing should be done with respect to changes from an ecosystem perspective to better understand the impacts and benefits.
- Consumers have a big part to play. Although they can’t see the service that will be implemented for them, they might have a view of types of things they’d be interested in and be offered by ADRs and DHs.
Group 4
- More work needs to be done on open APIs; we need to get the data quality problems resolved and have reliability in implementation.
- Generally, there is a need to improve consumer awareness.
- It is important to get some of the language right in the way we frame up that first set of questions. This will ensure we can directly address net benefit.
- We should ensure we are always considering how consumers are affected by the problem and prioritising consumer benefits.
Group 5
- A key theme discussed was the prospect of lifting the framework to a higher level of aggregation, to understand whether a change decreases or increases CDR competitiveness.
- Changes are often phrased from a data/technical perspective, but it is important to consider what is the problem statement from the consumer perspective (i.e. the use case)
- Queried whether PIAs should be included, noted, and monitored in the framework, and at what stage.
- The role that the Chair and the DSB team need to play in assessing proposals is complex.
- So much hinges on the quality of the problem statement and the flow-on assessment. This is something we should test and learn from.
- We should understand why the problem can’t be met by existing rules and standards, and confirm it can’t already be met without a change.
- We shouldn’t start with too much detail about the change. We should start from a strategic view first and then work down to the technical level.
The DSB thanked the DSAC for engaging in the exercise and noted that this feedback would be added to the feedback already collected via the Sydney and Melbourne workshops. The updated framework, and the business processes that implement it, will be brought to the August DSAC meeting.
Approach to Experimentation
Michael Palmyre, the CX Team Lead from the DSB provided an update on CDR Experiments as follows:
The DSB noted that the idea of experiments was relatively new for the DSB and was adopted in response to community feedback as a more speculative way to explore issues. The Account Origination Experiment was the first, and recently concluded.
The DSB noted that the feedback received was that issues are often identified too late, during implementation. Experiments are a way to explore standards, rules, policy, and hypothetical future states to support ways of facilitating use cases. They are also a good way for the CDR program to draw on expertise and insight from industry.
The DSB noted a range of outcomes can be drawn from experiments including identification of barriers to a use case (which might require amendments to existing requirements); voluntary standards that industry could implement at their discretion; or identification of a need for guidance on how to leverage existing requirements to facilitate a use case that could already be provided. Experiments could also lead to a proposition being shut down early on or prior to consultation.
The DSB sought feedback on two candidate experiments that could proceed next, detailed in the Appendix: CDR-enabled payments and CDR-enabled energy account switching.
The first, CDR-enabled payments, had been strongly requested by the community and included “native CDR payment initiation”, “facilitation to PayTo via the CDR” and “the use of CDR data to inform and enable payments outside of the CDR”.
This could focus on simple once-off payments made natively using the CDR. The key proposition is streamlining the steps required today for CDR data to inform a payment by taking advantage of the CDR consent flow. There is also potential to reduce drop-off for consumers, and remaining rails-agnostic facilitates choice, reliability and agility, allowing DHs to determine the least-cost routing option for the payment.
The second, CDR-enabled energy account switching, could touch on a few different opportunities and problem spaces, including building modular components for use cases, account opening and closure, updating customer data, customer verification, and settling outstanding payments.
The DSB noted the longer list of experiments in the Attachment, which had been identified through community feedback.
One member asked how labelling something an experiment will help consultation, which is otherwise typically a slow process.
The DSB noted that labelling it thus enables a culture of experimentation and timeboxes the work. Consultations can be lengthy; they want clearly defined hypotheses within a defined timeframe, with gates through which to answer or validate them. The timeframe will depend on the complexity of the use case, approximately 5-8 weeks per experiment. This is not something that requires sign-off like a formal consultation, but an open collaboration with industry intended to be low cost.
The Chair noted that as the program moves towards experimentation, the DSB’s collaboration with TSY and the ACCC increases. The saving isn’t in the completion of the experiment itself but in time saved through alignment of rules, standards, accreditation and enforcement. An experimentation approach could have saved significant time if undertaken before previous decisions (e.g. AEMO’s role in energy). The objective is to bring policy, draft rule making and draft standards together to reach a deeper understanding through interaction with the community.
One member asked how experiment topics get listed, noting that the Financial Counselling Australia (FCA) and other consumer-led use cases are not included.
The DSB noted that the list is based on what people have suggested to date and it is not an exhaustive list. This is the start of that process, and they are inviting further suggestions.
The Chair noted that the FCA use case is being evaluated to determine whether it can improve outcomes for financial counsellors in a hardship scenario, and is therefore different from other experiments that consider new rules and standards.
One member noted the need to be clear about the expectations of participating in an experiment as this will determine who they send to participate.
Items raised by Members for discussion
No items were raised by Members for discussion.
Treasury Update
James Kelly, the A/g Deputy Secretary from Treasury provided an update this month.
TSY noted that the consent-related elements of the draft rules package to be considered by the Minister focus on consent bundling and information simplification to improve the consumer experience. The draft rules would:
- Allow for the bundling of consents where they’re ‘reasonably needed’ to deliver a requested good or service
- Simplify the information a data recipient is required to provide a consumer at the time of consent
- Allow a data recipient to consolidate 90-day notifications to reduce consumer fatigue
- Simplify obligations regarding CDR receipts (given to a consumer after consent)
- Align all information requirements regarding supporting parties (e.g. ADRs, nominated reps, OSPs) who may access the consumer’s data – currently, the requirements vary in how this info is displayed during the consent flow.
Draft consent rules may also include amendments in relation to data deletion/de-identification and direct marketing activities.
Some of these obligations would be set in the standards. Treasury had been working with the DSB on draft standards that could be released for consultation at the same time as the draft rules. Treasury’s consultation paper will include a reference to the DSB standards consultation process.
TSY noted that there had been a lot of discussion on LinkedIn around what’s happening with the CDR and its direction, which the Minister is very aware of. The Minister plans to make a speech on the CDR in coming months, including flagging how a Strategic Assessment will be conducted in the second half of 2024.
TSY noted they had made a lot of progress on the trust brand package of work; however, this work is paused for now.
TSY noted that a lot of their current work involved discussions within government, engaging with the Attorney-General’s Department (AGD) about the Privacy Act reforms and their potential implications for the CDR down the track.
One member noted there have been some initial conversations between the Australian Banking Association (ABA) and TSY assessing potential options for reducing the ongoing compliance costs of the CDR, and assessing opportunities for optimising existing rules and standards.
TSY noted they have had ongoing discussions to understand these concerns, and the Minister has asked them to look again at the draft rules in respect of NBL in the context of reducing compliance costs. They are having targeted conversations with a wide range of parties to feed into that.
One member asked if single touch payroll will be part of the Strategic Assessment later this year, as raised in Senate Estimates conversations with the ACCC.
TSY will take this on notice.
One member asked where things are up to on the NBL timeline and next steps.
TSY noted that they are currently focusing on compliance costs and feedback on draft NBL rules and implementation considerations. Toward the end of the year, they will go back to the Minister with a set of draft rules for his consideration. If the draft rules are approved, the typical approach would be to allow at least 12 months from the making of related standards for the first obligations to come into effect.
ACCC Update
No update was provided this month by ACCC.
Meeting Schedule
The Chair advised that the next meeting would be held remotely on Wednesday 10 July 2024 from 10am to 12pm.
Other Business
No other business was raised.
Closing and Next Steps
The Chair thanked the DSAC Members and Observers for attending the meeting.
Meeting closed at 12:05