In a serious breach of privacy, the New South Wales (NSW) Reconstruction Authority (RA) has confirmed that a contractor uploaded personal data belonging to program applicants into ChatGPT, an AI tool not authorised for such use. The incident involved the Northern Rivers Resilient Homes Program (RHP), which is designed to assist flood-affected homeowners through buyback offers or resilience upgrades.
Between 12 and 15 March 2025, a former contractor to the RA uploaded a Microsoft Excel file containing over 12,000 rows and 10 columns of applicant information into ChatGPT. The uploaded information is understood to include names, addresses, email addresses, phone numbers, and some personal and health information.
Initial forensic review suggests that up to 3,000 individuals may have had their data exposed. The RA says no evidence currently exists that the data has been accessed by a third party or published online, though it has not ruled out that possibility.
Response, investigation, and notification
Once the RA became aware of the breach, it engaged forensic analysts and began working with Cyber Security NSW to determine the full scope. Every row of the spreadsheet is now under review to identify exactly what personal information may have been exposed.
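The RA has not described how the line-by-line review is being carried out. Purely as an illustration, an automated pre-triage pass of the kind sketched below can narrow down which columns of a spreadsheet appear to hold contact details before human reviewers examine each row; the file name, column handling, and regular expressions are assumptions for the example, not details from the investigation.

```python
# Illustrative only: the RA has not disclosed its review tooling.
# A rough pre-triage pass that counts email- and phone-like values per
# column of a spreadsheet so human reviewers can prioritise their work.
import re
import pandas as pd

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"(?:\+?61|0)[\s-]?\d(?:[\s-]?\d){7,9}")  # rough AU phone shapes

def flag_pii_columns(path: str) -> dict[str, dict[str, int]]:
    """Count email- and phone-like values in each column of an Excel sheet."""
    df = pd.read_excel(path, dtype=str).fillna("")
    report = {}
    for col in df.columns:
        emails = int(df[col].str.contains(EMAIL).sum())
        phones = int(df[col].str.contains(PHONE).sum())
        if emails or phones:
            report[col] = {"email_like": emails, "phone_like": phones}
    return report

if __name__ == "__main__":
    # "applicants.xlsx" is a placeholder path, not the actual file.
    print(flag_pii_columns("applicants.xlsx"))
```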
Affected individuals will be contacted in the coming days; the notifications are intended to specify which of their data was involved and to offer support. The RA has said it will cover reasonable costs of replacing identity documents where needed.
The NSW Privacy Commissioner has been notified as required by law. An independent review has also been launched to assess how the breach occurred and what failures in oversight or internal controls allowed it.
Impact on the Resilient Homes Program and internal safeguards
The Resilient Homes Program is aimed at helping homeowners in flood-prone regions rebuild or retrofit properties to better withstand future floods. The government offers either to buy back at-risk properties or to contribute to strengthening or raising homes.
In response to the breach, the RA says it has reviewed and strengthened its internal systems, issued clear directives to staff not to use unauthorised AI tools, and implemented safeguards to block further uploads of personal information to ChatGPT or similar platforms.
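The RA has not published the technical detail of those safeguards. One common pattern is an egress control that refuses uploads to unapproved generative-AI endpoints; the sketch below is a hypothetical illustration of that idea, with the domain list, function name, and example URLs invented for the purpose. In practice such rules usually live in a proxy or secure web gateway rather than in application code.

```python
# Hypothetical illustration of an egress policy check; not the RA's actual control.
from urllib.parse import urlparse

# Example denylist of generative-AI endpoints; a real deployment would manage
# this list centrally rather than hard-coding it.
BLOCKED_AI_HOSTS = {
    "chat.openai.com",
    "chatgpt.com",
    "api.openai.com",
}

def is_upload_allowed(url: str, approved_hosts: frozenset[str] = frozenset()) -> bool:
    """Return False for uploads destined for unapproved AI platforms."""
    host = (urlparse(url).hostname or "").lower()
    if host in approved_hosts:
        return True
    # Block each listed host and any of its subdomains.
    return not any(host == h or host.endswith("." + h) for h in BLOCKED_AI_HOSTS)

# Example: an upload helper would call this before sending any data.
assert is_upload_allowed("https://chatgpt.com/some-upload-endpoint") is False
assert is_upload_allowed("https://example.gov.au/secure-upload") is True
```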
Cyber Security NSW is actively monitoring the internet and dark web for signs the data has been leaked or made accessible.
Broader context and concerns
Data breaches involving AI platforms raise complex questions about privacy, oversight, and accountability. In this case, a government contractor’s unauthorised upload of sensitive personal data underscores the risk posed when staff, subcontractors, or third parties use AI tools without proper controls.
The breach also highlights challenges in detection and notification. Authorities say the notification process was delayed by the complexity of reviewing the large data set line by line to accurately assess its scope and confirm contact details for those affected. The public announcement came more than six months after the upload occurred.
The incident comes amid mounting scrutiny of government use of AI and data-handling practices, and it renews debate over how to regulate AI tools, enforce data security, and maintain public trust in digital systems.