AI Platform Compliance with the New Zealand Privacy Act 2020: A Comprehensive Analysis
// This report has been produced using AI tools, including ChatGPT and Perplexity. While every effort has gone into checking the statements and facts presented, there may be inaccuracies. Please contact us if you feel a correction is required. The bar graph is subjective, with no criteria provided for the scoring of the platforms; it provides an illustrative visual only.
The rapid adoption of artificial intelligence platforms across New Zealand has raised critical questions about data privacy and regulatory compliance.
This report examines the compliance of four major AI platforms (Perplexity, ChatGPT, Claude, and Microsoft Copilot) with the New Zealand Privacy Act 2020, providing organisations with essential guidance for responsible AI adoption.
Overview of the New Zealand Privacy Act 2020
The Privacy Act 2020, which came into force on December 1, 2020, represents a significant modernisation of New Zealand's privacy legislation, replacing the Privacy Act 1993. The Act establishes a comprehensive framework for protecting personal information in an increasingly digital world, with specific provisions that directly impact AI tool usage.
The legislation is built upon 13 Information Privacy Principles (IPPs) that govern how organisations collect, store, use, and share personal information. These principles apply universally to all organisations operating in New Zealand, including international companies providing AI services to New Zealand residents.
The 13 Information Privacy Principles
The Privacy Act 2020 establishes comprehensive requirements through its 13 IPPs, each directly relevant to AI platform operations:
Collection Principles (IPPs 1-4) govern how personal information may be collected, including purpose limitations, source requirements, transparency obligations, and collection methods.
Storage and Access Principles (IPPs 5-7) address security requirements, individual access rights, and correction procedures.
Use and Disclosure Principles (IPPs 8-12) establish accuracy requirements, retention limits, use restrictions, disclosure limitations, and cross-border transfer rules.
Unique Identifier Principle (IPP 13) regulates the assignment and use of unique identifiers.
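For organisations building internal compliance checklists, the grouping above can be captured as a simple lookup structure. The sketch below is illustrative only, assuming Python; the group labels and the group_for helper are this report's shorthand, not terms defined by the Act.

```python
# Illustrative mapping of the 13 Information Privacy Principles (IPPs) to the
# groupings described above. The labels and helper are shorthand for internal
# checklist use, not terminology defined by the Privacy Act 2020.
IPP_GROUPS = {
    "Collection": {"ipps": [1, 2, 3, 4], "scope": "purpose, source, transparency, and manner of collection"},
    "Storage and Access": {"ipps": [5, 6, 7], "scope": "security safeguards, access rights, and correction"},
    "Use and Disclosure": {"ipps": [8, 9, 10, 11, 12], "scope": "accuracy, retention, use, disclosure, and cross-border transfers"},
    "Unique Identifiers": {"ipps": [13], "scope": "assignment and use of unique identifiers"},
}

def group_for(ipp: int) -> str:
    """Return the grouping a given IPP number falls under."""
    for name, detail in IPP_GROUPS.items():
        if ipp in detail["ipps"]:
            return name
    raise ValueError(f"IPP {ipp} is not one of the 13 principles")

print(group_for(12))  # -> Use and Disclosure
```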
AI Governance Context in New Zealand
New Zealand has adopted a "light-touch, proportionate and risk-based approach to AI regulation," leveraging existing legal frameworks rather than creating AI-specific legislation. The government released a Public Service AI Framework in January 2025, emphasising responsible AI adoption across government agencies.
The Office of the Privacy Commissioner has issued specific guidance on AI tools, establishing clear expectations for organisations using AI systems. These expectations include conducting Privacy Impact Assessments, ensuring senior leadership approval, maintaining transparency, and implementing human oversight of AI outputs.
[Bar graph: Compliance assessment of major AI platforms with New Zealand Privacy Act 2020 requirements.]
Platform-Specific Compliance Analysis
Microsoft Copilot: Excellent Compliance
Microsoft Copilot demonstrates the strongest compliance with New Zealand Privacy Act requirements. The platform offers robust enterprise data protection features, including commercial data protection for business and educational accounts. Microsoft has established New Zealand data residency options, allowing organisations to store data locally within New Zealand's borders.
Key compliance strengths include comprehensive encryption of data at rest and in transit, strict access controls with multi-factor authentication, and admin-configurable data retention (by default, up to 18 months for consumer accounts and 30 days for enterprise).
The platform maintains SOC 2 Type II certification and offers Data Processing Addendums for GDPR compliance, demonstrating alignment with international privacy standards.
Microsoft Copilot's enterprise versions do not use customer data for model training by default, addressing concerns about unauthorised use of personal information. The platform provides granular controls for administrators to manage data sharing permissions and implement organisational privacy policies.
Claude: Excellent Compliance
Anthropic's Claude platform exhibits exceptional privacy-by-design principles, with a fundamental commitment to not using conversation data for model training purposes. This approach represents a significant departure from industry norms and directly addresses Privacy Act concerns about secondary use of personal information.
Claude implements comprehensive security measures including automatic encryption of data in transit and at rest, with strictly limited employee access to user conversations. The platform maintains a 30-day default data retention period and provides users with clear controls over their data.
The platform has achieved multiple compliance certifications including SOC 2 Type II, ISO 27001:2022, and ISO/IEC 42001:2023, demonstrating robust information security management practices. Claude's privacy policy explicitly addresses international data transfers through Standard Contractual Clauses and adequacy decisions.
Anthropic's approach to transparency and user consent exemplifies best practices for Privacy Act compliance, with clear privacy notices and granular user controls. The platform's commitment to accuracy and bias mitigation aligns with IPP 8 requirements for information accuracy.
ChatGPT: Good Compliance
OpenAI's ChatGPT demonstrates good compliance with Privacy Act requirements, though with some areas requiring attention. The platform offers enterprise-grade security features including encryption at rest and in transit, comprehensive access controls, and regular security monitoring.
ChatGPT provides users with data control options, including the ability to opt out of model training through account settings. The platform's "Temporary Chats" feature ensures conversations are not used for training purposes, while enterprise and API customers receive additional protections by default.
However, the default configuration for free and Plus users allows conversation data to be used for model improvement, requiring users to actively opt out to prevent this usage. The platform's data retention practices vary by service tier: standard chats persist until the user deletes them and are purged within 30 days of deletion, while Temporary Chats auto-delete after 30 days.
OpenAI has established data residency options in multiple regions and offers comprehensive Business Associate Agreements for healthcare customers. The platform maintains SOC 2 Type II compliance and provides Data Processing Addendums for enterprise customers.
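As noted above, enterprise and API customers receive the stronger protections by default, so organisations often route ChatGPT usage through the API rather than through consumer accounts. A minimal sketch follows, assuming the official openai Python SDK (v1 or later), an OPENAI_API_KEY environment variable, and a placeholder model name.

```python
# Minimal sketch: calling an OpenAI model via the API, the access route that
# (per this report) carries stronger default protections than consumer accounts.
# Assumes the official `openai` SDK v1+ and an OPENAI_API_KEY environment
# variable; the model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "user", "content": "Summarise IPP 12 of the New Zealand Privacy Act 2020."},
    ],
)
print(response.choices[0].message.content)
```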
Perplexity: Moderate Compliance
Perplexity AI shows moderate compliance with Privacy Act requirements, with several areas of concern. The platform collects extensive user data including search history, device information, and location data, which may exceed necessity requirements under IPP 1.
The platform's data retention policies are less clearly defined than those of its competitors: core account data is retained while the account is active, deleted accounts are purged within 30 days, and enterprise input data can be limited to 7-30 days. While Perplexity offers an opt-out option for AI training data usage, this requires active user intervention rather than privacy-by-default design.
Security measures include standard encryption practices, though the platform lacks some enterprise-grade protections available from competitors. Perplexity has faced scrutiny regarding its web crawling practices and respect for robots.txt protocols, raising concerns about data collection methods.
The platform's transparency regarding data usage and sharing practices requires improvement to fully align with Privacy Act notification requirements. Recent moves toward introducing advertising and developing its own browser raise additional privacy considerations.
[Chart: Multi-dimensional privacy compliance analysis across key Privacy Act requirements.]
Cross-Border Data Transfer Compliance
The Privacy Act's IPP 12 establishes strict requirements for cross-border data transfers, requiring organisations to ensure recipient countries provide comparable privacy protections. Microsoft Copilot's New Zealand data residency offering provides the strongest compliance with these requirements.
Claude and ChatGPT both operate globally but have implemented contractual and technical safeguards to ensure data protection during international transfers. Perplexity's global server infrastructure presents greater challenges for organisations requiring data localisation.
The Privacy Commissioner retains broad authority to prohibit cross-border transfers if recipient jurisdictions lack adequate protections, making platform selection critical for compliance. Organisations must carefully evaluate each platform's data residency options and transfer mechanisms.
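That evaluation can be turned into a simple pre-deployment screen. The sketch below is hypothetical: the PlatformProfile fields, the ipp12_transfer_permitted helper, and the example entries are illustrative placeholders, and the logic is a rough simplification of IPP 12 rather than legal advice.

```python
# Hypothetical pre-deployment screen for IPP 12 (cross-border disclosure).
# Field names, helper, and example records are illustrative placeholders;
# the check is a rough simplification of IPP 12, not legal advice.
from dataclasses import dataclass

@dataclass
class PlatformProfile:
    name: str
    nz_data_residency: bool       # can data be stored within New Zealand?
    contractual_safeguards: bool  # e.g. standard contractual clauses in place
    comparable_protections: bool  # recipient jurisdiction has comparable privacy law

def ipp12_transfer_permitted(p: PlatformProfile) -> bool:
    """Rough screen: local residency, or contractual safeguards plus comparable protections."""
    return p.nz_data_residency or (p.contractual_safeguards and p.comparable_protections)

candidates = [
    PlatformProfile("Example platform A", nz_data_residency=True, contractual_safeguards=True, comparable_protections=True),
    PlatformProfile("Example platform B", nz_data_residency=False, contractual_safeguards=True, comparable_protections=False),
]

for p in candidates:
    verdict = "proceed to full assessment" if ipp12_transfer_permitted(p) else "escalate for legal review"
    print(f"{p.name}: {verdict}")
```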
Organisational Compliance Recommendations
Organisations seeking Privacy Act compliance when implementing AI platforms should conduct comprehensive Privacy Impact Assessments before deployment. This assessment should evaluate the specific AI tool's necessity and proportionality relative to organisational objectives.
Senior leadership approval based on full risk consideration is essential, with ongoing governance structures to monitor compliance. Organisations must implement transparent disclosure of AI usage to affected individuals and maintain human oversight of automated decision-making processes.
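The expectations outlined earlier (a Privacy Impact Assessment, senior leadership approval, transparency, and human oversight) lend themselves to a structured pre-deployment record. The sketch below is a minimal, hypothetical illustration in Python, not an official PIA template; the class and field names are assumptions.

```python
# Hypothetical pre-deployment record reflecting the Privacy Commissioner's
# expectations cited in this report (PIA, senior approval, transparency,
# human oversight). Class and field names are illustrative, not an official template.
from dataclasses import dataclass

@dataclass
class AIDeploymentAssessment:
    tool_name: str
    purpose: str
    pia_completed: bool = False
    senior_leadership_approval: bool = False
    individuals_notified_of_ai_use: bool = False
    human_review_of_outputs: bool = False

    def outstanding_items(self) -> list[str]:
        """List the expectations not yet satisfied for this deployment."""
        checks = {
            "Privacy Impact Assessment": self.pia_completed,
            "Senior leadership approval": self.senior_leadership_approval,
            "Transparency to affected individuals": self.individuals_notified_of_ai_use,
            "Human oversight of AI outputs": self.human_review_of_outputs,
        }
        return [item for item, done in checks.items() if not done]

assessment = AIDeploymentAssessment(tool_name="Example AI assistant", purpose="Drafting internal correspondence")
assessment.pia_completed = True
print(assessment.outstanding_items())  # remaining governance steps before deployment
```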
For organisations handling sensitive personal information, enterprise-tier services from Microsoft Copilot or Claude provide the strongest compliance foundations. Government agencies and healthcare providers should prioritise platforms offering New Zealand data residency and comprehensive compliance certifications.
Training and awareness programs should ensure staff understand both the Privacy Act requirements and the specific privacy controls available within chosen AI platforms. Regular review and updating of privacy policies to reflect AI usage is mandatory under the Act's transparency requirements.
Conclusion
The analysis reveals significant variation in Privacy Act compliance across major AI platforms, with Microsoft Copilot and Claude demonstrating excellent compliance, ChatGPT showing good compliance with some limitations, and Perplexity requiring substantial improvement to meet New Zealand privacy standards.
Organisations must carefully evaluate their specific privacy requirements against each platform's capabilities, considering factors such as data sensitivity, user populations, and regulatory obligations. The Privacy Commissioner's guidance emphasises that privacy compliance is not optional but fundamental to responsible AI adoption in New Zealand.
As AI technologies continue evolving, organisations should maintain ongoing monitoring of both platform capabilities and regulatory expectations to ensure continued compliance. The government's commitment to light-touch regulation places greater responsibility on organisations to implement robust privacy governance frameworks.
The investment in privacy-compliant AI platforms represents not just regulatory necessity but a competitive advantage in New Zealand's trust-conscious digital economy. Organisations that prioritise privacy compliance will be better positioned to capture AI's benefits while maintaining public confidence and regulatory approval.