Data Standards Advisory Committee, Meeting Minutes
Date: Wednesday 11 August 2021
Location: Held remotely via WebEx
Time: 10:00 to 12:00
Meeting: Committee Meeting No: 34
Attendees
- Andrew Stevens, Data Standards Chair
- Luke Barlow, AEMO
- Jill Berry, Adatree
- Brenton Charnley, TrueLayer
- Nigel Dobson, ANZ
- Chris Ellis, Finder
- Lawrence Gibbs, Origin Energy
- Peter Giles, CHOICE
- Melinda Green, Energy Australia
- Joanna Gurry, NBN Co
- Gareth Gumbley, Frollo
- Rob Hale, Regional Australia Bank
- John Harries, Westpac
- Lisa Schutz, Verifier
- Aakash Sembey, Simply Energy
- Lauren Solomon, CPRC
- Stuart Stoyan, MoneyPlace
- Glen Waterson, AGL
- Barry Thomas, DSB
- James Bligh, DSB
- Ruth Boughen, DSB
- Rob Hanson, DSB
- Terri McLachlan, DSB
- Michael Palmyre, DSB
- Mark Verstege, DSB
- Daniel Ramos, ACCC
- Mark Staples, Data61
- Shona Watson, OAIC
- Kate O’Rourke, Treasury
- Jessica Robinson, Treasury
- Phil Schofield, Treasury
- Andrew Cresp, Bendigo and Adelaide Bank
- Damir Cuca, Basiq
Chair Introduction
The Data Standards Chair (Chair) opened the meeting and thanked all committee members and observers for attending meeting #34.
The Chair noted that Maintenance Iteration #8 is underway, and that the Workshop on Action Initiation held on 27 July 2021 (hosted by the DSB and Treasury) was well attended and fruitful in terms of the information and perspectives surfaced.
The Chair noted that the CX Checklist has been updated to ensure it is in as good a format, and as comprehensive a guideline, as it can be for people in terms of their compliance.
The Chair noted that the DSB are holding a CDR 101 Workshop, which is focussed on people who are entering the regime, and it will be looking at the fundamentals and how the CDR is structured.
The Chair also welcomed Hemang Rathod, the DSB's new API Architect, who started on 5 August 2021. His previous role was as a Senior Solutions Architect in the Legislative and Digital Transformation Initiatives Team at the Australian Energy Market Operator (AEMO).
The Chair noted that Andrew Cresp (Bendigo & Adelaide Bank) is an apology for this meeting. He also noted that Paul Franklin (ACCC) is at a Commission Meeting. Daniel Ramos has joined the meeting and will provide the ACCC update.
Minutes
The Chair thanked the Committee Members for their comments and feedback on the Minutes from the 14 July 2021 Advisory Committee meeting. The Minutes were formally accepted.
Action Items
The Chair provided the following update in regard to the Action Items:
The Data Standards Advisory Committee (DSAC) Terms of Reference have been published both in the minutes and online. Committee input into the Service Provider Directory is continuing.
The Chair acknowledged those members who have voluntarily provided statistical-level data to the Australian Competition and Consumer Commission (ACCC), which he understands has been very well received.
The Chair noted that the DSB will provide the committee with an update on the Design Challenge Sub Committee concept and an update of the feedback in relation to the Service Provider Directory (SPD).
The DSB noted that the Design Challenge Subcommittee item arose out of the Energy Sector Advisory Committee meeting back in March. At that meeting, a discussion was held about the challenge of engaging smaller players in the early (design) stages of the standards, with the outcome that a subcommittee be formed out of the DSAC to provide advice to the DSB and the Chair on better ways to engage.
The DSB noted that they will be reaching out to members after this meeting to see who would like to join the subcommittee. The subcommittee would meet on a regular basis and its purpose would be to consider ways to improve engagement from smaller players. The DSB acknowledged that the feedback they receive does tend to be weighted towards the very well-resourced participants, and that this increases the risk of some level of bias in decision-making.
One member asked if this is open to other data holders (DHs) who are not intending to become accredited data recipients (ADRs).
The DSB noted that the subcommittee could be anyone from this group that wants to contribute, so if DHs have a desire and some thoughts on how we can engage better with the smaller players, they’re welcome to provide this feedback.
The DSAC member noted that they were not asking for themselves; there is an Australian Energy Council of smaller retailers which sometimes struggle to engage, and they might be interested in joining the subcommittee.
ACTION: DSB to invite members of the DSAC to join the Design Challenge subcommittee and set up a meeting
The DSB noted that they have not received much feedback on ways to improve the SPD. There is a prominent disclaimer on the SPD highlighting that there is no warranty for the listed providers, and that it is a lightweight service to promote awareness of providers in the early stages of the regime. They acknowledged there is probably a longer-term need for something more robust that provides vetting of the providers, and the initial thinking is that this would sit with the ACCC, given its other accreditation roles.
The DSB would welcome further feedback on whether the DSB should retire the SPD and they encourage members to make further representations about whether a better SPD should exist, and where it might be housed.
Issues Raised by Members
The Chair thanked Committee Members for tabling issues for discussion at this Advisory Committee meeting. Jessica Robinson from Treasury (TSY) will present on “What Success looks like for the CDR” and Jill Berry from Adatree will present on The CDR Standards in Action.
What Success looks like for the CDR
TSY presented on What success looks like for the CDR as follows:
TSY wants to focus on seeking the committee’s views and getting input on the outcomes they’re trying to achieve by drilling down into those broad policy objectives and then to try and link it into the measures of success that they can start tracking. TSY stated that this is just an entrée into what will be an ongoing discussion.
TSY noted that it is a little ambitious to get down into the specific metrics today, but making sure there is a shared vision and understanding of the broad success outcomes is an important place to start the conversation.
TSY noted that they have been working on the Strategic Assessment piece, which has started to really shape and change their thinking about what program success looks like. They hoped the committee got a flavour of that through the consultation paper, which outlines an evolution of approach: from program success being directed at consumers accessing better value deals or supporting switching, to applying a broader lens to the full opportunity and capabilities that the CDR framework can offer the consumer, and importantly to recognising what drives economic and future prosperity.
TSY noted that they’re starting with the conceptual framework, and how they’re approaching the problem of measuring the success of the CDR. TSY said they are at the stage of describing broad objectives, outcomes, and benefits with the details around the success criteria and metrics to come down the track, and these will be linked to the strategic assessment work. The strategic assessment consultation is providing a rich amount of information that will support a more robust approach to defining and measuring success.
TSY noted that in the consultation paper for the strategic assessment, they have tried to drill down into the four key policy objectives or drivers of the CDR. The central key driver is consumer benefit, which can be difficult to define because different people attribute different meanings to it. TSY stated that being able to articulate what consumer benefit is, is important, and encouraged feedback on the practical, tangible ways that consumer benefit can be described.
TSY noted that the three key interlocking pieces that go to consumer benefit, (i) safe and secure, (ii) competition and market efficiency, and (iii) innovation, are all interdependent. TSY went on to explain that the CDR is intended to be a safe and secure platform for sharing data, while also driving competition and market efficiency as an economic reform, with the consumer at the heart of how that competition outcome is driven. TSY said that if the innovation layer is not supported, and innovation is not driven, then the rest does not occur. TSY noted that there is a really important intersection between those three interlocking pieces and the consumer-centric approach, which consultations on the strategic assessment are really highlighting.
TSY noted that in regard to the CDR objectives and outcomes, they have set out high-level descriptions of the three key drivers and also added a fourth, which is around promoting confidence in the Australian digital economy. TSY said the fourth outcome is important because it highlights the role of the CDR as enabling data sharing infrastructure that underpins a data-driven digital economy. They are:
Outcome 1: The CDR provides consumers with direct benefits through their use of CDR-enabled products and services, using safer and more secure data sharing infrastructure
Outcome 2: The CDR increases competition among participants to provide more competitive and consumer centric products and services and enhanced choice for consumers
Outcome 3: The CDR fosters innovation by participants who unlock the potential of CDR data to create new products and services for consumers, building a data driven economy, creating new jobs, and promoting the adoption of new technologies
Outcome 4: The CDR promotes confidence in the Australian digital economy by supporting the establishment of a trusted data market and digital ecosystem
TSY noted the CDR Program Objective: “To deliver an economy wide CDR that provides value for consumers, that is safe and secure, and drives competition and innovation (including in the data services sector)”. TSY noted that this is a high-level description, which automatically starts to shape the entry point into what success starts to look like.
TSY noted that in terms of what we can expect to see from the derived outcomes, it is more at the conceptual level, and they need to get into the more granular metric-based KPIs. In regard to the concept of empowerment, TSY stated it is important to focus on how things like consent play a role in empowering consumers; it is not just data sharing for data sharing's sake, but actually brings consumers closer to making more informed decisions and understanding what it is they are engaging with in terms of products and services.
TSY noted that in considering indicators of success such as the number of consumers using the CDR, thought might need to be given to the broad-based nature of participation or participation of particular cohorts. TSY stated that this is really important for them to think about in terms of the short-term success measures and balancing that against the longer term.
TSY noted that safety and security is a key objective of the system: creating the trusted data ecosystem, the sense of empowerment and consumers being in control, and the processes that help ensure that those receiving data are accredited, which helps grow trust in the system.
TSY noted that innovation is critical to the success of the program and that the policy and regulatory framework is set to foster innovation, but there can be challenges in balancing the safety and security and innovation objectives. An important consideration to grapple with as they think about the success measures is a framework that is flexible enough to help drive broad innovation, rather than being limited to very narrow use cases, while still supporting safe and secure rails.
TSY noted that competition impacts relate both to driving competition among the incumbents within sectors, and to competition within the innovation layer as well as at a cross-sector level. TSY noted that another important theme being highlighted through the strategic assessment work is providing consumers with choice around data and digital platforms. TSY noted consultations were also highlighting some of the questions starting to emerge in relation to the potential for multi-national, data-rich companies to have both potentially positive and negative competitive disruptive effects.
TSY noted, in terms of designing success metrics in relation to providing direct benefits to consumers, the importance of balancing broad-based societal benefits and the specific benefits that relate to individuals. TSY also noted informed consent as an important aspect of consumers benefiting from their data, along with supporting data literacy and digital literacy.
TSY noted that a common question posed by stakeholders is “what is the overarching value proposition”. TSY noted the temporal aspects of setting success measures, which need to capture both a bold and ambitious multi-year vision and tangible indicators of early success.
TSY noted that when framing what success is for the program, they will be thinking of an approach that seeks to do some of that long term goal setting to help ensure that they’ve got the policy and regulatory settings in place which has a view of the future world, and at the same time have clear tangible outcomes to meet milestones along the way.
TSY are considering how to think about success metrics for a vibrant innovation ecosystem and what that looks like; for example, is it lots of ADRs, or is it more about quality, with fewer ADRs but excellent products being produced? TSY said that having a range of different products and service offerings from the ADRs, and how that reaches into the broader data ecosystem, is important, and that a thriving innovation ecosystem does not necessarily mean a large number of ADRs is an effective measure. TSY posed the following question: is it broad-based population participation, or targeted initially to small and medium businesses being able to leverage the CDR to support economic transition and job creation?
TSY noted that in terms of the questions of how does an economy wide expansion of CDR contribute to program success, that there is a genuine question around how fast that expansion needs to happen and is expansion itself success or is there something more nuanced in that?
One member asked whether TSY are envisaging that the next stage is coming up with measurable KPIs, noting the presentation focussed on conceptual thinking and the broader CDR purpose and ethos. The member noted a couple of touch points on the benefits TSY mentioned being within sector, and that the member thinks there is also an ability to look at this from a broader CDR ecosystem and economy perspective, such that there should also be cross-sector benefits.
The member noted that there is commentary around driving innovation across sectors because what we are actually looking for, for the consumer, is reducing or eliminating friction. The member said that this is about being informed and able to make informed decisions, easy access to new products and services, and easy transition to a product or service from one provider to another.
The member noted that this then goes to the tangible benefits for the consumer: time, money, and experience. Some of these can be measured, but are consumers getting a better experience, and how do you quantify that? They said the quantifiable, measurable KPIs are important to have, but they also need to be aspirational and visionary, and it is important to have things that are measurable and achievable as milestones as well.
TSY thanked the member for their excellent points. TSY noted the challenge of setting quantifiable indicators, given the unknowns around where we might see products and services delivering those outcomes. TSY posed the question: how does the CDR reach in and deliver broad societal outcomes and policy objectives?
Another member commented on the point about multi-national companies that are data focussed. They stated that the challenge will be whether we can stop those companies in any way, and how we can stop them through the CDR, because that is just one aspect of the broader economy and markets those companies are operating in. They noted that it is quite pertinent for TSY to think about in terms of which DHs go first and how other players get a foothold in the market.
The member also asked if expansion should be considered as success. They said that in the energy sector, there are lots of regulators and other commentators who look at competition and it’s really important that those sorts of measures are used because it's not just about numbers of customers using it, it’s about getting the right outcomes. They noted that TSY will need to be a bit sector specific at times.
One member noted that they were looking for a theme around ubiquity or pervasiveness. They asked whether there is a build phase with metrics, and then a growth phase with quite different metrics. They said the whole point of metrics is that they give us a focus and motivate us, because what gets measured gets done. They said that right now we need participation because we are in the build phase and we need all the DHs, data recipients (DRs) and use cases. They noted that in the growth phase, if consumers are getting value, then participation rates go up. They also said that if we see consents going up, that means consumers are doing things and getting some value.
The member noted that in the earlier slides there were a lot of characteristics of success, but we also need to differentiate between characteristics and actual outcomes. They also stated that some aspirational targets, like a Net Promoter Score (NPS) for the CDR, or some kind of NPS assessment for ADRs, would make a difference, and communicating these publicly would help.
TSY noted that having some of those concrete things about the now, and starting to build success metrics for the next phase ahead, is a really good way of thinking about it.
Another member noted that it is interesting that TSY has drawn out the role in the data infrastructure of the nation, because they see the CDR as a part, but not the whole. They said if you look at those social aspirations, like Australia becoming carbon neutral, they think the boundary of the CDR is “and it was done using CDR when it was needed, and not when it wasn't”. They said the CDR has a role, but there is a bigger piece, and it comes back to the point about the CDR framework, as opposed to the CDR regime, so it can potentially play a bigger part. They noted the boundary is important: where the CDR isn't relevant, harmonious data sharing that supports that social goal crops up.
They also asked a question about objectives - is there a fifth objective which isn’t so much the consumer at the heart, but about promoting an export sector?
TSY noted that this is a key objective and Minister Hume and the Prime Minister see this as a significant potential for Australia and that the ambitious agenda would also support the export of innovation and provide a framework for others to model.
Another member asked TSY if the slides would be available to the committee. The member also stated that on the guard rails for security, that’s important but at the same time we have to allow for innovation and friction etc.
TSY stated that they are happy to share the presentation with the committee.
ACTION: TSY to provide the slide deck to the DSB for circulation to the committee
Another member noted that it is really positive that they are focusing on this in terms of outcomes of the scheme, because it is going to be critical given the complexity of the reform process we are about to enter. The member agreed that consumer outcomes should be central, and that the comments made previously about how it is actually demonstrably improving people's lives and being specific about the use cases are critical. The member said their research shows that you have to be really specific and tangible, otherwise it is too confusing for consumers in their already very complex daily lives.
The member noted that the other sorts of metrics they look at in other markets for consumer outcomes are things like whether consumers report that they can comprehend the information and the choices they have, and the extent to which they feel they are in genuine control and able to do what they want to do. The member stated the underlying infrastructure enables control to be fully in consumers' hands. The member stated the other outcomes they look at are accessibility and inclusion across the customer base, which they put under empowerment, choice and control. The member stated that the second group is around safety and security, with metrics around data breaches, scams and fraud perpetrated using CDR data. They also pointed to the number of disputes and complaints captured by the system, ensuring consumers know where to go to make a complaint, and, when things go wrong, how effectively consumers can resolve their disputes, adopting the principles of effective dispute resolution systems in other markets.
The member thought there is an urgent need to establish a digital ombudsman because this is a cross-sectoral, economy-wide scheme. They stated that a digital ombudsman would provide an economy-wide place for consumers to go to resolve disputes. We also need an effective enforcement framework so that the community has confidence that when things go wrong, the regulators are able to follow through and ensure data is being used appropriately and in line with the consent that has been given. The member declared that this is urgent and needs to be fast tracked.
The member asked in regard to building trust, have we gone about this the wrong way? They then went on to say if we’re talking about an economy wide scheme, they think the best way to implement this is to look at it through the staging and phasing of different use cases. They also said, if we're talking about data coming from many different sectors into the scheme, taking a sector-by-sector approach will result in unnecessary adjustment and duplication through time and complexity.
The member said they think we should be looking at this as stage one: a set of 20 use cases, working on those and making sure the systems are working for those use cases, so it is tangible in terms of what consumers understand they are receiving as outcomes. There can then be a pipeline of use cases over the next year, which can be brought online once you know you have enough market actors in the system that actually want to put those use cases into the CDR regime. They said it is difficult to participate meaningfully in these consultations at the moment when you are trying to get your head around four or five different sectors and the data and the specific requirements in each sector.
TSY noted that the consultation paper on the Strategic Assessment intends to raise the question of whether sector by sector is the right approach, or whether it should be data sets etc. TSY noted that the consultations they have had to date have helped illustrate the importance of having some use cases, but also of not being too wedded to particular use cases, and the importance of setting the innovation rails broad enough so that success isn't determined by picking particular use cases, getting the balance right between specific use cases and enabling new use cases and innovations.
TSY also noted that at some point they might have a discussion around cohorts because the type of use case and who benefits will be attached to a specific use case.
The Chair noted that the input on this topic provided by this member is very consistent with the DSB's input. The Chair stated that rather than pick a sector and then look at what data we designate, if we are about consumer benefit, we have to look for data and then designate the sector where the data holders hold it. The Chair said that is not a major shift from where we are, but choosing a sector because we think it needs reform due to competition is not necessarily the right solution. The Chair said what is inherent in all of those consumer benefits is whether consumers understand consent in relation to their data, have used consent in relation to their data, and can demonstrate benefits from using that consent. The Chair said if people are using consent and can point to the benefit, we have a very short-cut way of saying whether people are getting benefit from the CDR.
Another member noted that they like lead and lag indicators around success, or input and output metrics, and said they would prefer if it was broken down along some of those lines. They said that right now we need a high number of ADRs, DHs that are compliant, and some basic metrics about adoption which could be overlaid across multiple sectors. The member said you can then move onto usage and engagement: we need to see consumers starting to use these services, with the innovation point and the use cases feeding into that, and then some of those derived benefits. The member said they like frameworks that break down the inputs and lead indicators that are more activity based, which build momentum and traction, knowing that the ultimate outcomes are probably more mid to longer term.
One member noted that friction is another great lens, for example, where would consumers most likely benefit from change: mispricing, service, or the ability to switch, which would then drive a consumer benefit as a lens among other things.
The Chair noted that we need to be careful in outlining what success looks like as a vision or aspiration for the CDR. The Chair said the discussions being had here should be about what Australians, not bureaucrats or tech executives, believe the CDR should contribute to Australia and how widespread it should be, rather than about the CDR program we are involved in and its interim staging. The Chair said you can then always reconcile it back to the program.
The DSB noted that there is strong support for success metrics, but also for having them measurable, objective and tracked, and the need to solve how that data is collected. The DSB asked the committee for advice on how to engage on that question, because the DSB have done two engagements around metrics recently with a third planned. The DSB noted 90% of the feedback they have received has been “no, don't do this - it'll be expensive”, but they have had minimal feedback from the community on what kind of things should be measured. The DSB said the feedback they are getting is very much along the lines of implementation project team feedback, not about how to actually improve the regime.
One member noted that most of the metrics for the success of the scheme won't actually be technical; they will be more about things like surveys or other forms of measurement.
The CDR Standards in Action
Jill Berry from Adatree presented on The CDR Standards in Action as follows:
Berry noted that she would like to present on Adatree's experience as an active ADR. She is CEO and co-founder of Adatree and has been working on the CDR since June 2019. Berry said they have a turnkey SaaS platform for ADRs, their use case is enabling other companies' use cases, and they were accredited in February 2021 and active in May 2021.
Berry noted that she would be going through some of the issues and trends they are seeing and some solutions on how to fix them in the short and medium term. Berry said the expectation as an ADR is that once you are accredited and, more importantly, active, have met all the standards and rules to receive data, and there are active data holders, you would be able to take consents and access data fairly easily; however, the actual experience is far from this.
Berry noted that one of the issues they’re seeing is DH non-conformance. Berry said that when a new DH comes online, they are surprised when dynamic client registration (DCR) works the first time, and the consent process can initiate, but this has only happened seamlessly with around six companies. Berry said that issues are usually the norm. Berry said that as an ADR, they thought that if they met all the requirements, they would be protected by this incredibly prescriptive and regulated framework.
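For readers less familiar with the step being described, the sketch below shows the general shape of the dynamic client registration call an ADR makes against a data holder before any consent can be initiated. The endpoint URL, certificate filenames and JWT content are hypothetical placeholders; the real flow (a signed registration request embedding the software statement assertion, mutual TLS and the full set of registration claims) is defined by the CDR register and data standards, not by this snippet.

```python
# Illustrative sketch only: the general shape of a CDR dynamic client registration (DCR) call.
# The endpoint, certificate files and JWT content are hypothetical placeholders.
import requests

DH_REGISTRATION_ENDPOINT = "https://mtls.dataholder.example.au/register"  # hypothetical
registration_request_jwt = "<signed JWT embedding the software statement assertion>"

response = requests.post(
    DH_REGISTRATION_ENDPOINT,
    data=registration_request_jwt,
    headers={"Content-Type": "application/jwt"},
    # In practice this call is made over mutual TLS with the ADR's client certificate.
    cert=("adr_client.pem", "adr_client.key"),
)

if response.status_code == 201:
    # A successful registration returns the client_id used for subsequent consent flows.
    print("DCR succeeded, client_id:", response.json().get("client_id"))
else:
    # This is the point at which, per the discussion above, issues typically need to be raised.
    print("DCR failed:", response.status_code, response.text)
```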
Berry noted that when something doesn't work, you go to Service Now / JIRA on the ACCC portal and raise an issue, with each issue taking an hour to lodge; even if you have the same issue with multiple DHs, you have to raise it multiple times. Berry noted that this takes up 20% of their engineering team's effort, which has a really material impact on their company.
Berry noted that responses are inconsistent across DHs. Berry stated that they have experienced an issue being acknowledged with no resolution time given even when they have raised it as a severity 1, and that sometimes it takes a few hours, and sometimes a few days, to respond. They noted that this is not acceptable, and these severity 1 issues are not treated as such. Berry noted that there are no Service Level Agreements (SLAs). Berry said Adatree had one experience with a severity 1 where they were advised a patch would be available in 5 weeks, but after a phone call to a senior executive at the ACCC the resolution time went from 5 weeks to 5 days; Berry noted that calling a senior executive for every issue is not scalable. Berry concluded that CDR issue management is itself an issue.
Berry noted that pre-July 2020, DHs had to complete 200+ tests with early ADRs, and these were all manual, but at least it showed that you were conformant. Berry stated that DHs now go through conformance testing of around 20 tests, which is around 10% of all the tests actually required for conformance. Berry stated that this is the reason they are fielding so many issues from active DHs, because they haven't gone through the extra 180-plus tests. Berry also said that the Conformance Test Suite (CTS) does not make you conformant, so they would like it renamed to the Data Holder Testing Suite.
Berry proposed a medium-term solution of expanding the conformance testing suite to the full suite; if the ACCC made the barriers to entry higher, it would practically eliminate the conformance issues raised by ADRs. Berry said this would have the benefit of increasing the quality of data.
Berry proposed another solution of introducing standards and SLAs for incidents, issues, and resolutions and have response times for issues for both DHs and ADRs and resolution guidelines.
Lastly, Berry proposed introducing a bug bounty for all active ADRs, which would acknowledge the effort and contributions of ADRs who are helping to make the ecosystem better.
One member noted that these proposals tie into the conversations around the overall quantity versus quality of the CDR, because without a focus on quality, the integrity of the system is actually at risk, which translates to consumer confidence in the system. The member noted that these proposals need to be treated with utmost priority and urgency, both to address the current issues with existing DHs and because they point to the higher standard required of future DHs.
Another member noted that they face the same issues every day. They said they have raised this time and time again but ultimately, they are dealing with fairly complex systems within banks and things go wrong, and they need a consistent way to measure the number of issues that they’re raising, bring them into a forum, prioritise and get to the root cause which we are not doing at the moment. They noted this was a problem in the UK so it should not be surprising, and so the CDR should bring some overseas feedback into it as well.
Another member noted that when new DHs come on board and ask for help with their CDR data, they are shocked to find out there is no pre-production: it is their environment and then straight into production, which technical teams don't like. They said it would be useful to have tools and utilities to help people confirm their endpoints and confirm they are conformant and compliant. They declared that the CDR is missing the architectural, pre-production, UAT environment where DHs can prove and test things before they go live.
The DSB noted that there are a number of vendors now supporting testing of implementers, and asked whether, if the ACCC expanded the CTS to full coverage, that would impair that community.
Berry noted that they have signed twenty-six DHs to access their development environment, and they are doing that because it is good for DHs in the CDR ecosystem, but they feel it is more of a distraction for their company to help DHs than to focus on more ADRs.
The DSB noted that they have been looking at ways in which the DSB can help provide test support for implementers by providing tools and collateral independent of the CTS, obviously working with the ACCC, and asked: is the direction we're going useful?
Berry suggested that the DSB talk to the CTOs or CIOs at some of the DHs that are credit unions and mutuals, which have a tech team of between 0 and 0.5 FTE, and ask what would really help them.
The DSB noted, in regard to SLAs for responding to issues, that with the next round of consultation around ADR metrics, the idea is that the ADR metrics not be metrics about the ADR but metrics about the performance of the DHs that the ADR is interacting with. The DSB said this would be a source of information on the performance of the regime that would be impartial and voluntary, and could be a way of reducing the manual effort ADRs put into raising the issues they see, making it more systematic. The DSB asked whether they think this is a good path to investigate.
Berry noted that right now the extra costs are around handholding and raising issues for DHs and they would welcome the DH performance and quality metrics.
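To make the idea of ADR-collected DH performance and quality metrics concrete, the following is a minimal, hypothetical sketch of how an ADR might record per-data-holder response times and error counts as a by-product of its normal CDR calls, rather than hand-raising each issue. All names and structures here are illustrative only and do not reflect any agreed metrics format.

```python
# Hypothetical sketch: an ADR recording simple per-DH performance metrics
# (call counts, error counts, cumulative latency) alongside its normal CDR traffic.
import time
from collections import defaultdict

import requests

dh_metrics = defaultdict(lambda: {"calls": 0, "errors": 0, "total_ms": 0.0})

def call_data_holder(dh_brand: str, url: str, **kwargs):
    """Make a CDR call and record simple performance stats against the DH brand."""
    start = time.monotonic()
    response = requests.get(url, **kwargs)
    elapsed_ms = (time.monotonic() - start) * 1000

    stats = dh_metrics[dh_brand]
    stats["calls"] += 1
    stats["total_ms"] += elapsed_ms
    if response.status_code >= 400:
        stats["errors"] += 1  # candidate input for the DH quality reporting discussed above
    return response
```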
The Chair noted that the DSB have commenced Decision Proposal 208 in relation to the Non-Functional Requirements (NFRs) which are currently non-binding, and that this proposal is to assess whether they should be adjusted or whether they are fine to be made binding in their current form.
ACCC noted that they are leading a piece of work based on feedback broadly around how the CDR system is working. The ACCC said they would frame it in three ways: i) system performance, and the gap between actual stakeholder system performance and the expectations of DRs; ii) data quality; and iii) technical issues with DHs. They will be working with the DSB and TSY on these issues, and they are hoping to have something to present at the next DSAC meeting.
The Chair noted that we would be happy for ACCC to present on this at the next meeting.
ACTION: ACCC to present at next meeting around how the CDR system is working
ACCC noted that they are looking broadly at three areas: i) data, ii) processes and iii) tech. They said that in the case of data, they are looking at Get Metrics and the variability in the data they are getting through Get Metrics. The ACCC said that in order to get the Get Metrics data to a meaningful data set that they can use to inform specific future actions, the ACCC have a two-pronged approach in the CDR: the regulatory activities, which work off a legal timeframe, and the operational measures, which are hopefully a bit more immediate.
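For context, Get Metrics is a data holder admin endpoint in the Consumer Data Standards. The sketch below illustrates where that variable data comes from; the base URI and version header value are hypothetical, and authentication (in practice the CDR Register authenticates to this endpoint) is omitted, so this is purely illustrative rather than a working client.

```python
# Minimal illustrative sketch of querying a data holder's Get Metrics admin endpoint.
# Base URI is hypothetical; authentication is omitted for brevity.
import requests

ADMIN_BASE_URI = "https://admin.dataholder.example.au/cds-au/v1"  # hypothetical

response = requests.get(
    f"{ADMIN_BASE_URI}/admin/metrics",
    params={"period": "ALL"},   # reporting period selector
    headers={"x-v": "3"},       # requested endpoint version (illustrative value)
)
metrics = response.json()["data"]

# The variability discussed above shows up in fields such as availability,
# performance, invocation counts and error counts reported by each data holder.
print(metrics.get("availability"))
print(metrics.get("invocations"))
```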
The ACCC noted that they are watching with interest the DP 208 NFRs consultation, and also the metrics piece, as this relates to the ACCC's role in enforcing them. The ACCC said they have received some great insights from DRs on a voluntary basis, but this is not a sustainable way of managing a healthy ecosystem. The ACCC noted that, given the ecosystem is young, they do think it is a valuable data point and they are happy to keep receiving those reports in whatever level of detail is convenient.
The ACCC said they have been thinking about whether that data should be made public in a dashboard, similar to how the Open Banking Implementation Entity (OBIE) presents stats on the UK Open Banking ecosystem, and that there is no reason the metadata around the CDR shouldn't itself be treated according to the same principles as the CDR itself.
The ACCC noted, on process, that they hope to have more opportunities to provide more targeted guidance to participants around conformance standards. They said the first step will be to release a revised document that supersedes the participant test strategy released in December last year. The ACCC said they will take the opportunity to rename it the Participant Conformance Approach (PCA), in response to feedback around language, as the word “testing” has been used very loosely.
The ACCC noted that the primary updates to the PCA are to provide clarity on what the CTS is and isn't designed to do, and to reaffirm current expectations of participants for their own internal testing and their responsibilities for their own channel quality management of their IT. The ACCC said they will be outlining conformance expectations for new and active participants, giving guidance on updates to the CTS test plans and how those test plans will be retired, and highlighting the role that participant tooling might play in the broader conformance approach.
ACCC noted that it is not a bad time to consider the incident management process, including things like barriers to participation, how to raise an incident, and what kind of data is available to them to act upon; but because this is largely voluntary, and the ACCC can't compel participants in that process, there might be a role for the ACCC to consider how to improve that process.
ACCC noted that, on the tech side, at the beginning of the ecosystem they undertook a manual, relatively intensive and multilateral testing exercise, but noted this was very extensive and not sustainable. The ACCC said that after that they went to the CTS, and possibly they have swung too far the other way; they are keen to look at whether they need to be somewhere else, which might include things like opportunities to facilitate multilateral testing in a light-touch way, considering how test style might play a larger role, and, for completeness, whether the scope of the CTS should be revised. ACCC noted that they have released the mock register, they are looking at complementing that with a mock DH and ADR, and, moving forward into energy, facilitating the Australian Energy Market Operator (AEMO). The ACCC think this fills a gap, but it is not a substitute for multilateral testing.
ACCC noted that in a non-immediate sense, they are keen to apply the lessons learned from banking to improve outputs for energy both because they’re likely to see similar problems and also because they will see different problems.
Working Group Update
A summary of progress since the last committee meeting on the Working Groups was provided in the Committee Papers and was taken as read.
Technical Working Group Update
A further update was provided on the Technical Working Group by James Bligh as follows:
The DSB noted that they are heading headlong into what they are calling the candidate-level standards, which are standards that have been fully consulted on, worked through quite significantly, and are at a quality where the DSB is comfortable presenting them to the Chair as candidates to be made binding. The DSB won't be seeking to make them binding until the Rules are made. The DSB said their goal is to give as much solid lead time to implementation teams as humanly possible, so that there is certainty over what needs to be delivered, acknowledging that the standards will change in response to the rules and also in response to feedback as people start implementing them. The DSB's target date is 1 November and they have put in place all the consultations for the work that needs to be done to achieve that outcome.
The DSB noted that two consultations have already been completed, another was posted this week, and another, which they are working on with Energy Made Easy (EME) and Victorian Energy Compare (VEC), is around generic tariffs.
The DSB noted that they have brought on a new API Architect (Hemang Rathod), who is working on three Decision Proposals; Hemang has come from AEMO and will be focussing on finalising the AEMO data sets.
The DSB noted that they have got five significant consultations, three around AEMO related datasets and two retailer only datasets. The DSB said they are looking good to hit the November date, but they would like to call out that there are a lot of consultations currently.
The DSB noted that they are doing some work on maintenance and InfoSec uplift, which is a non-trivial conversation to have with the community, and that they are also looking at further action initiation workshops and consulting on the energy standards through to completion. The DSB noted that there is a lot going on and they are concerned about community consultation fatigue; they are open to feedback from the retailer community about how things are going, as the last thing they want is for the candidate standards not to be quite right because people didn't have time to engage.
The DSB noted that they have a workshop planned for 31 August, a CDR 101 Workshop, which is aimed at new entrants to the CDR, with the goal of bringing into the fold the retailers that haven't been engaged to date and getting them up to date.
The DSB noted the retailer-to-AEMO standards are a new thing for the energy sector. The key thing to call out with the energy standards is that, with the move to the peer-to-peer model, they are leveraging the banking standards significantly. The InfoSec profile, the high-level standards and the consent models will remain virtually unchanged, which is a real benefit for implementors and should make that sector easier to implement.
The DSB noted that one feature of the energy sector which is brand new is the concept of a secondary responsible data request. The DSB elaborated that this is the scenario where a retailer is contacted as a primary DH with a request for National Metering Identifier (NMI) standing data or usage, and the retailer then sends the request to the secondary DH, which in the case of the energy sector is AEMO. The DSB said the two consultations they have completed are on this topic, and the third is on NFRs.
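As an illustration of the flow just described, the sketch below shows the concept of a retailer (primary DH) passing an NMI-scoped request through to AEMO as the secondary DH. Every endpoint name, path, header and payload shape here is a hypothetical placeholder; the actual energy payloads and their non-functional requirements were still under consultation at the time of the meeting.

```python
# Conceptual sketch only of the secondary responsible data request flow:
# the retailer (primary DH) receives the ADR's request and forwards the
# NMI-scoped part to AEMO (secondary DH). All names and paths are hypothetical.
import requests

AEMO_SECONDARY_BASE = "https://secondary.aemo.example.au/cds-au/v1"  # hypothetical

def handle_standing_data_request(nmi: str, adr_request_headers: dict) -> dict:
    """Primary DH (retailer) handler: forward an NMI standing data request to AEMO."""
    response = requests.get(
        f"{AEMO_SECONDARY_BASE}/energy/servicepoints/{nmi}",  # hypothetical path
        headers={
            # The retailer would pass through or add the headers required under the
            # shared-responsibility arrangement (versioning, correlation IDs, etc.).
            "x-v": adr_request_headers.get("x-v", "1"),
        },
    )
    response.raise_for_status()
    # The payload is then returned to the ADR via the retailer's own CDR endpoint.
    return response.json()
```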
The DSB noted that on generic tariffs, this is less of an issue for the wider community and more an issue for internal activation and this is going well.
The DSB noted that for the finalisation of retailer payloads, in the last round of holistic feedback on the standards there was concern as to whether the entity relationships were correct. They said there will be some really important detailed nuance in the consultations to come.
The DSB noted that there is a lot of collaboration with the CDR Rules team, and they have a lot of consultations going in parallel on different concerns, but at the end of the day the standards need to align with the rules. The DSB said that they are trying to factor in, from a planning perspective, how to minimise any misalignment.
The DSB also wanted to thank the ACCC team who have worked hard with the DSB team to collaborate in regard to the register standards move.
Consumer Experience (CX) Working Group Update
A summary of CX Working Group Update was provided in the Committee Papers and was taken as read.
Stakeholder Engagement
A summary of stakeholder engagement including upcoming workshops, weekly meetings and the maintenance iteration cycle was provided in the Committee Papers and was taken as read.
Treasury Update
Kate O’Rourke, First Assistant Secretary and CDR Division Head at TSY, provided an update as follows:
TSY noted that, on the Strategic Assessment, the consultation process is underway. TSY have extended the timeframe by an additional 2 weeks (originally 28 days) because of the depth of interest and the complexity of the issues that are emerging.
TSY noted that they also have a Telecommunications consultation open as the government is committed to considering a possible designation of this sector. TSY stated the timeframe for this consultation is 28 days, and that they are interested in tying the feedback they’ve received from the Strategic Assessment into the Telecommunications Assessment and they have a roundtable specifically on Telecommunications on the 12 August.
TSY thanked everyone for their contributions to v3.0 of the Rules, and they are working through the 50+ submissions they have received.
TSY noted that the other rules piece that they’re hoping to open consultation on soon is in relation to energy.
TSY noted that in relation to future directions, there was a very successful workshop on Action Initiation on 27 July, with lots of policy and rules issues to contemplate coming out of it. The complexity associated with some of the wider ideas around action initiation is real, and they are considering what that means for policy design prioritisation and how it ties into the Strategic Assessment.
ACCC Update
Daniel Ramos from ACCC provided an update as follows:
ACCC noted that, in regard to onboarding, since the last DSAC the Registrar has made activation decisions for six new DHs, which brings the active DHs up to thirty-eight. That is made up of twenty-two ADIs plus sixteen additional brands, which represents about 85.7% of household deposits. There have been no new DRs since the last meeting; the number remains unchanged at six.
ACCC noted that on 30 July they published a revised rectification schedule, which is available on the website.
ACCC noted that the schedule covers ADIs which had not begun to provide the required service at the date of publishing, around 56% of ADIs, representing 11% of wholesale deposits. The rectification schedule is in line with the ACCC and Office of the Australian Information Commissioner (OAIC) joint compliance and enforcement policy for the CDR; transparency is really important in building trust and confidence in the ecosystem, and that extends to when things are not running perfectly. The schedule provides dates by which the ADIs and associated brands are expected to be compliant with their obligations. The ACCC will revise the document routinely.
The ACCC noted that, in regard to the alignment of the standards process, they are working with the DSB on a proposal to align the process of drafting, consulting on, and making the register standards and the data standards in a more joined-up way. This is Decision Proposal 206 - Register Standards.
Meeting Schedule
The Chair advised that the next meeting will be held remotely on Wednesday 8 September 2021 from 10am to 12pm.
Other Business
The Chair asked members if they had any suggestions or thoughts on metrics or other indicators to provide directly to Jessica Robinson at TSY.
ACTION: Members to provide input on metrics or indicators directly to TSY.
Closing and Next Steps
The Chair thanked the Committee Members and Observers for attending the meeting.
Meeting closed at 11:58