
feat(route): add iapp #21682

Merged

TonyRL merged 1 commit into DIYgod:master from TonyRL:feat/iapp

Apr 10, 2026

Conversation

Collaborator

@TonyRL TonyRL commented Apr 10, 2026

Involved Issue / 该 PR 相关 Issue

Close #21322

Example for the Proposed Route(s) / 路由地址示例

/iapp/news
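As a hedged illustration of what the new route produces, the sketch below shapes scraped article data into the RSS item fields visible in the generated feed (title, link, guid, pubDate, author, categories). This is not the PR's actual implementation; `toRssItem` and its input field names (`slug`, `publishedAt`, `topics`) are hypothetical placeholders.

```javascript
// Hypothetical sketch: map a scraped IAPP article record into the RSS item
// fields seen in the generated /iapp/news feed. Not the PR's actual code.
function toRssItem(article) {
    const url = `https://iapp.org/news/a/${article.slug}`;
    return {
        title: article.title,
        link: url,
        guid: url, // feed uses the article URL as a non-permalink guid
        pubDate: new Date(article.publishedAt).toUTCString(),
        author: article.author,
        category: article.topics,
    };
}

// Example using the first item from the generated feed:
const item = toRssItem({
    title: 'Alabama set to add variation to US state privacy patchwork',
    slug: 'alabama-set-to-add-variation-to-us-state-privacy-patchwork',
    publishedAt: '2026-04-08T16:00:00Z',
    author: 'Joe Duball',
    topics: ['Privacy', 'Law and regulation', 'U.S. state regulation'],
});
// item.pubDate === 'Wed, 08 Apr 2026 16:00:00 GMT'
```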

New RSS Route Checklist / 新 RSS 路由检查表

  • New Route / 新的路由
  • Anti-bot or rate limit / 反爬/频率限制
    • If yes, does your code account for it? / 如果有, 是否有对应的措施?
  • Date and time / 日期和时间
    • Parsed / 可以解析
    • Correct time zone / 时区正确
  • New package added / 添加了新的包
  • Puppeteer

Note / 说明

@github-actions github-actions Bot added the route label Apr 10, 2026
@github-actions
Contributor

Auto Review

No clear rule violations found in the current diff.

@github-actions github-actions Bot added the auto: ready to review label (manual review will come in after lint issues and merge conflicts are fixed) Apr 10, 2026
@github-actions
Contributor

Successfully generated as follows:

http://localhost:1200/iapp/news - Success ✔️
<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>IAPP - News</title>
    <link>https://iapp.org/news</link>
    <atom:link href="http://localhost:1200/iapp/news" rel="self" type="application/rss+xml"></atom:link>
    <description>Daily top stories from around the world, original reporting and thought-leadership articles to keep your finger on the pulse of privacy, AI governance and digital responsibility. - Powered by RSSHub</description>
    <generator>RSSHub</generator>
    <webMaster>contact@rsshub.app (RSSHub)</webMaster>
    <language>en</language>
    <image>
      <url>https://iapp.org/favicon.ico</url>
      <title>IAPP - News</title>
      <link>https://iapp.org/news</link>
    </image>
    <lastBuildDate>Fri, 10 Apr 2026 01:32:03 GMT</lastBuildDate>
    <ttl>5</ttl>
    <item>
      <title>Alabama set to add variation to US state privacy patchwork</title>
      <description>&lt;p&gt;Alabama is on its way to joining the U.S. comprehensive state privacy law ranks. House Bill 351, the &lt;a href=&quot;https://alison.legislature.state.al.us/files/pdf/SearchableInstruments/2026RS/HB351-enr.pdf&quot; target=&quot;_blank&quot;&gt;Alabama Personal Data Protection Act&lt;/a&gt;, cleared the state legislature 7 April in relatively seamless fashion, as no lawmaker voted against the bill in any roll call votes taken in the House or Senate.&lt;/p&gt;&lt;p&gt;If the bill is enacted by the governor, it will take effect 1 May 2027. Alabama joins &lt;a href=&quot;https://iapp.org/news/a/a-long-winding-road-oklahoma-closes-in-on-comprehensive-privacy-law&quot; target=&quot;_self&quot;&gt;Oklahoma&lt;/a&gt; in passing a bill this year and will mark the 21st state to enact a comprehensive statute.&lt;/p&gt;&lt;p&gt;Recent additions to the &lt;a href=&quot;https://iapp.org/resources/article/us-state-privacy-laws-overview&quot; target=&quot;_self&quot;&gt;state patchwork&lt;/a&gt; aligned with previously enacted legislation, leaving few compliance questions. However, Alabama&#39;s bill raises some novelties that businesses will be required to consider.&lt;/p&gt;&lt;p&gt;The bill applies to businesses that control or process the data of more than 25,000 Alabama residents or those that derive 25% of their revenue from data sales involving any number of data subjects. There are notable business exemptions, particularly around what constitutes a &quot;sale,&quot; while the definition of minors only covers children under age 13. A non-sunsetting 45-day cure provision is also included along with exclusive attorney general enforcement.&lt;/p&gt;&lt;p&gt;&quot;HB 351 is the product of two years of hard work to create a common-sense framework that protects consumers while also remaining friendly to those who do business in our state,&quot; state Rep. Mike Shaw, R-Ala., told the IAPP. 
&quot;As someone with more than 30 years as a technology professional in a regulated environment, my goal with HB 351 was to create a practical, workable law that protects the people of Alabama in the most responsible way possible.&quot;&lt;/p&gt;&lt;p&gt;In addition to his elected position, Shaw has spent two decades as the senior vice president and chief technology officer of Mutual Savings Credit Union. State lawmakers had not attempted to pass a comprehensive framework since 2021 before Shaw kickstarted a new initiative &lt;a href=&quot;https://s3.amazonaws.com/fn-document-service/file-by-sha384/8d0e7fa3a4c73e023168699fe3841e93e17c043ddae497c577ea8bcbbeb5f0b6c57e5a50d42017b4b7a401de6bc469c0&quot; target=&quot;_blank&quot;&gt;last year&lt;/a&gt;.&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Coverage thresholds&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;The bill&#39;s coverage thresholds represent some of the most nuanced applicability standards among all comprehensive state laws.&lt;/p&gt;&lt;p&gt;Alabama is just the second state to land on a minimum processing threshold of 25,000 data subjects, which is the lowest across states. But in terms of applicability versus state population, a covered entity would need to process data on approximately 4.8% of state residents, making the threshold among the hardest to achieve.&lt;/p&gt;&lt;p&gt;The sale threshold is unique in that no other state stipulates the law applies when any number of individuals&#39; data is sold. Most states attach the 25% revenue to sales of data belonging to more than 25,000 individuals.&lt;/p&gt;&lt;p&gt;Shaw said he consulted the attorney general&#39;s office and other interested parties while arriving at thresholds that would address multiple state interests.&lt;/p&gt;&lt;p&gt;&quot;This bill was all about balance:&amp;nbsp;Balancing Alabamians&#39; rights with the burden of regulation,&quot; he said. 
&quot;Balancing the need for enforcement with fairness.&amp;nbsp;In this case we are balancing what other states are doing with the unique needs of Alabama.&quot;&lt;/p&gt;&lt;p&gt;Polsinelli Shareholder Starr Drum, CIPP/E, CIPM, FIP, noted small businesses with fewer than 500 employees and nonprofits with fewer than 100 employees are exempt unless they sell personal data. There is also an exemption for defined political organizations, a provision that has proven to be a sticking point in Maine&#39;s comprehensive privacy &lt;a href=&quot;https://iapp.org/news/a/state-of-the-states-maine-comprehensive-privacy-oregon-ai-chatbot-bills-on-the-move&quot; target=&quot;_self&quot;&gt;debate&lt;/a&gt;.&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;&#39;Sale&#39; exemptions&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Statutory exemptions vary between states, but a handful of Alabama&#39;s proposed exemptions for what constitutes a sale of data are not found anywhere else. Specifically exemptions for disclosure or transfer of data for the purposes of &quot;providing analytics services&quot; or &quot;providing marketing services solely to the controller.&quot;&lt;/p&gt;&lt;p&gt;Both exemptions raise potential ambiguity in compliance, depending on how businesses might interpret their analytics and marketing practices.&lt;/p&gt;&lt;p&gt;&quot;Sale is more narrowly defined than in some comparable laws since the valuable consideration in exchange for personal data component only encompasses situations where third parties are not restricted in subsequent uses of the personal data,&quot; Drum said. &quot;This is something businesses should be mindful of during contracting.&quot;&lt;/p&gt;&lt;p&gt;Rep. Shaw&#39;s consultations on the sale definition yielded questions and concerns regardless of the approach. 
He said the &quot;cash-only&quot; characterization was &quot;too narrow and subject to loopholes,&quot; but valuable consideration &quot;had its own set of issues.&quot;&lt;/p&gt;&lt;p&gt;&quot;We tried to thread the needle a bit and find something that was broad enough to allow legitimate relationships with important partners without rendering large parts of the bill useless,&quot; he added, noting other states&#39; approaches are &quot;being tested in the wild.&quot;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Minors&#39; data&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Children&#39;s privacy provisions in the bill are on par with other states following the Children&#39;s Online Privacy Protection Act&#39;s definition of a child.&lt;/p&gt;&lt;p&gt;A number of state legislatures have begun taking steps beyond COPPA to treat children&#39;s data as sensitive under their comprehensive laws. Notably, &lt;a href=&quot;https://www.goodwinlaw.com/en/insights/publications/2025/08/alerts-practices-dpc-colorado-proposes-childrens-privacy-amendments&quot; target=&quot;_blank&quot;&gt;Colorado&lt;/a&gt;, &lt;a href=&quot;https://www.hunton.com/privacy-and-cybersecurity-law-blog/connecticut-amends-the-connecticut-data-privacy-act&quot; target=&quot;_blank&quot;&gt;Connecticut&lt;/a&gt; and &lt;a href=&quot;https://www.hunton.com/privacy-and-cybersecurity-law-blog/virginias-new-protections-for-children-go-into-effect&quot; target=&quot;_blank&quot;&gt;Virginia&lt;/a&gt; have amended their laws in recent years to enhance children&#39;s protections.&lt;/p&gt;&lt;p&gt;Alabama has other children&#39;s online safety legislation in place with the recent passage of the state&#39;s &lt;a href=&quot;https://legiscan.com/AL/bill/HB161/2026&quot; target=&quot;_blank&quot;&gt;App Store Accountability Act&lt;/a&gt;. 
The age verification law, which also requires verifiable parental consent, applies to minors under age 18.&lt;/p&gt;&lt;p&gt;Shaw said the definition of minors wasn&#39;t discussed at length; however, he said there needs to be further conversation and coordination moving forward about aligning laws to a common age group.&lt;/p&gt;&lt;p&gt;&quot;In general, I&#39;d want to avoid creating different age standards for different regulations, so expanding age would likely be part of a larger discussion,&quot; he said.&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/alabama-set-to-add-variation-to-us-state-privacy-patchwork</link>
      <guid isPermaLink="false">https://iapp.org/news/a/alabama-set-to-add-variation-to-us-state-privacy-patchwork</guid>
      <pubDate>Wed, 08 Apr 2026 16:00:00 GMT</pubDate>
      <author>Joe Duball</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/blt6c69e002a773dd0b/69d7a0c073638cd2f6cb9675/alabama-state-house-sunny-us-040926.jpg" type="image/jpeg"></enclosure>
      <category>Privacy</category>
      <category>Law and regulation</category>
      <category>U.S. state regulation</category>
      <category>iapp_original</category>
    </item>
    <item>
      <title>Notes from the Asia-Pacific region: Robust conversations at richly rewarding Summit</title>
      <description>&lt;p&gt;This year&#39;s IAPP Global Summit 2026, held just last week, featured a series of robust, potent conversations, which lingered long after attendees departed.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Highlights continuing to brew in my mind are distilled below.&amp;nbsp;&lt;/p&gt;&lt;p&gt;The conference opened with cognitive scientist Maya Shankar, whose work and book on decision-making invite us to reconsider the invisible forces shaping human behavior. Shankar&#39;s keynote was nothing short of a hand-crafted, house-blend reflection on how small interventions can stir profound change.&lt;/p&gt;&lt;p&gt;In a conversation with IAPP Vice President and Chief Knowledge Officer Caitlin Fennessy, CIPP/US, world-renowned author Salman Rushdie added a deeply human and cultural dimension. Reflecting on his experiences and observations, including life in India, Rushdie highlighted how privacy is neither universal nor evenly distributed. In the slums, privacy is a luxury. Cultural norms, social structures and economic realities shape what privacy means — and whether it is even attainable.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Boston University School of Law Professor Woodrow Hartzog&#39;s keynote was a calibrated dark roast — deep, complex and intentionally uncomfortable. He spoke candidly about privacy&#39;s ongoing identity crisis. Against the backdrop of shrinking budgets and expanding technological capability, Hartzog warned of the risks posed by surveillance systems, including facial recognition, and the creeping normalization of intrusive technologies. Unfettered artificial intelligence, he argued, threatens not just data protection, but the very fabric of social institutions, human relationships, dignity and autonomy. His prescription was striking in its simplicity, yet difficult in execution: Minimize data. Do not exploit people. 
And, perhaps most provocatively in an age of AI: dethrone efficiency.&amp;nbsp;&lt;/p&gt;&lt;p&gt;One of the most impactful moments, which also drew a large crowd, was the conversation between Prince Harry and IAPP Research and Insights Director Joe Jones. Speaking with candor shaped by personal experience, Prince Harry framed privacy not as a technical issue, but as a foundational one essential to trust and safety in modern society. He emphasized that without privacy, individuals cannot feel secure, and institutions cannot command trust.&amp;nbsp;&lt;/p&gt;&lt;p&gt;His message carried both urgency and optimism. In acknowledging the work of the privacy community, Prince Harry noted that the professionals gathered at Summit gave him hope; a sweet note among an otherwise strong, sometimes bitter discourse. It was a reminder that while the challenges are systemic, so too is the community working to address them.&lt;/p&gt;&lt;p&gt;A distinctly APAC flavor seeped into my second day, where Khaitan &amp;amp; Co. Partner Supratim Chakraborty and I had the pleasure of moderating a panel on India&#39;s groundbreaking Digital Personal Data Protection Act — with credits to PwC India Associate Director, Data Privacy Abhishek Tiwari, AIGP, CIPP/A, CIPP/E, CIPM, FIP, for bringing our session together. Tiwari could not make the trip to Summit in the end due to recent events. India is a jurisdiction that continues to evolve with intensity and complexity, in a region that embraces strong, fast-moving and deeply layered digital responsibility standards.&amp;nbsp;&lt;/p&gt;&lt;p&gt;What emerged from Summit this year is a need for adaptability — and as with crafting the perfect cortado or macchiato — balance. Too much emphasis on efficiency, and we risk eroding trust. Too little, and we fail to scale solutions in a hot, data-driven world. 
In equal measure, privacy frameworks eschew a uniform mug; they need to be filtered, slow drip, to reflect local notes and a distinct aroma.&amp;nbsp;&lt;/p&gt;&lt;p&gt;And finally, the quality of human connections will always be irreplaceable. Social interactions simmer into long-lasting friendships and a strong full-bodied community, which make Summit&#39;s conclusion slightly bittersweet, but all-around richly rewarding.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;em&gt;This article originally appeared in the Asia-Pacific Dashboard Digest, a free weekly IAPP newsletter. Subscriptions to this and other IAPP newsletters can be found &lt;/em&gt;&lt;a href=&quot;https://iapp.org/news/subscriptions&quot; target=&quot;_self&quot;&gt;&lt;em&gt;here&lt;/em&gt;&lt;/a&gt;&lt;em&gt;.&amp;nbsp;&lt;/em&gt;&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/notes-from-the-asia-pacific-region-robust-conversations-at-richly-rewarding-summit</link>
      <guid isPermaLink="false">https://iapp.org/news/a/notes-from-the-asia-pacific-region-robust-conversations-at-richly-rewarding-summit</guid>
      <pubDate>Wed, 08 Apr 2026 16:00:00 GMT</pubDate>
      <author>Charmian Aw</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/bltedd473f1502b93d2/69d5261e4b558b3692235dac/balloons-summit26-iappconferences-040726.JPG" type="image/jpeg"></enclosure>
      <category>AI governance</category>
      <category>Privacy</category>
      <category>Law and regulation</category>
      <category>opinion</category>
    </item>
    <item>
      <title>Global push for &#39;digital sovereignty&#39; risks complicating global data flows, innovation</title>
      <description>&lt;p&gt;Amid rising global tensions, the ongoing conflict in the Middle East and growing strain in the Western world related to digital trade, there is an ongoing push among some countries to implement stricter digital sovereignty measures and efforts to boost production of domestic technology stacks.&amp;nbsp;&lt;/p&gt;&lt;p&gt;From &lt;a href=&quot;https://iapp.org/news/a/notes-from-the-iapp-canada-the-question-of-data-sovereignty&quot; target=&quot;_self&quot;&gt;Canada&lt;/a&gt;, to the &lt;a href=&quot;https://iapp.org/news/a/thought-for-the-week-reflections-on-my-iapp-fireside-chat-with-max-schrems&quot; target=&quot;_self&quot;&gt;EU&lt;/a&gt;, to the &lt;a href=&quot;https://dig.watch/updates/digital-sovereignty-asia-cloud-ai&quot; target=&quot;_blank&quot;&gt;Pacific Rim&lt;/a&gt;, the ongoing period of geopolitical upheaval is compelling a number of countries to re-think what constitutes &quot;sovereignty&quot; in the digital realm. &amp;nbsp;&lt;/p&gt;&lt;p&gt;However, the term &quot;digital sovereignty&quot; in and of itself can carry different definitions in certain contexts. On one side of the debate are efforts from by some countries to break clean of the dominance of U.S. technology companies in pursuit of domestic tech production with significant data localization measures established. 
While proponents on the other end of the spectrum still envision an interconnected world of free-flowing data, albeit in a modified environment that maintains respect for data subjects&#39; privacy, with a whole host variability in between both sides of the debate.&amp;nbsp;&lt;/p&gt;&lt;p&gt;At the IAPP Global Summit 2026 in Washington, D.C., panelists during several breakout sessions sought to provide clarity and nuance to the digital sovereignty conversation, while also opining on how the current geopolitical climate will shape that debate in the years to come.&amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Defining &#39;digital sovereignty&#39;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Hunton Andrews Kurth&#39;s Centre for Information Policy Leadership Bojana Bellamy, CIPP/E, said the concept of digital sovereignty relates to countries and their domestic industries seeking a &quot;broader ability to control software, hardware and infrastructure.&quot; However, she said digital sovereignty has also become an umbrella term referring to several subcategories: Data sovereignty, advanced sovereignty, and now, sovereignty with respect developing artificial intelligence.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Bellamy said data sovereignty involves the &quot;control of data and when data designs are used.&quot; She said AI sovereignty measures are efforts by national governments to control &quot;all the elements of AI from talent, to infrastructure, to data, to the stack ... by the state.&quot; Advanced sovereignty, she said, relates to ensuring all forms of digital sovereignty align with countries&#39; existing laws and regulations concerning how a specific country asserts its sovereignty holistically.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&quot;What has changed in the last year is that many more countries are talking about digital data sovereignty, AI sovereignty, as a matter of their own industrial policy,&quot; Bellamy said. 
&quot;Also, many company boards are now considering this a priority.&quot;&lt;/p&gt;&lt;p&gt;Joining Bellamy in the same breakout session was Mastercard Chief Privacy, AI and Data Responsibility Officer was Caroline Louveaux, CIPP/E, CIPM, who said the digital economy is undergoing a massive transformation driven by geopolitical events. She said privacy and data governance professionals, as well as policymakers, are realizing that the digital economy has, by and large, matured to the point where global data flows and storage are more secure. As a result, attention is shifting away from technical security concerns and toward questions of security and control, and, specifically, which nations control the systems that enable data flows and which domestic entities within those nations are responsible for collecting and storing the data.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&quot;The key question we are moving on from is can the data be adequately protected, which is a privacy question,&quot; Louveaux said. 
&quot;Now it&#39;s a broader question about who controls the data and the systems that will shape our economy and society.&quot;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Why are governments seeking to achieve greater digital sovereignty?&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Many of the discussions surrounding data sovereignty efforts have arisen out of frustration by governments and their respective private sectors over diverging global legal frameworks that impose inconsistent requirements for how data can be transferred to other jurisdictions and then processed.&lt;/p&gt;&lt;p&gt;Despite regulatory simplification efforts, such as the EU Digital Omnibus and AI Omnibus proposals that are being negotiated, in part, to ease compliance burdens on companies, NWong Strategies Principal Nicole Wong said there is still significant regulatory activity taking place in various jurisdictions but not necessarily from where technology regulations have historically originated.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Using the example of the U.S. General Services Administration&#39;s new multiple award schedule &lt;a href=&quot;https://www.gsa.gov/buy-through-us/purchasing-programs/multiple-award-schedule&quot; target=&quot;_blank&quot;&gt;guidance&lt;/a&gt; for prospective GSA contractors and the subcontractors they employ, Wong said the document&#39;s AI disclosure requirements contains &quot;expansive definitions&quot; around what constitutes government data and what becomes government property when an entity enters a contract with the agency. 
She said this approach of regulation by &quot;individual negotiations&quot; on an agency-by-agency basis will only complicate the overall legal environment for innovators if such policies are multiplied across jurisdictions and government entities.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&quot;That to me is not deregulatory, it&#39;s essentially a different kind of regulation and a different level of where it is taking place, and if our goal is to have an open field for AI innovation, I think our regulations need to be honed toward that,&quot; Wong said. &quot;The question is whether the current regulatory scheme we&#39;re in is also creating openness and interoperability, and therefore, innovation.&quot;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Geopolitical forces acting on digital trade&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Today, the push among countries to impose stricter digital sovereignty measures poses dilemmas for the global economy that is underpinned by how easily data can flow between jurisdictions.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Perhaps no where else is this dilemma best exemplified than the ongoing digital trade dispute between the EU and U.S. that has seen U.S. President Donald Trump&#39;s administration lean on the EU to modify its digital rulebook to be less heavy-handed against U.S. technology firms when they have been found to violate EU laws, such as the EU General Data Protection Regulation, the Digital Markets Act and Digital Services Act.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Irish Data Protection Commission Commissioner Dale Sunderland said when the GDPR was first adopted, there were a number of jurisdictions around the world that sought to align their data protection frameworks to be compatible with the GDPR&#39;s new requirements. Critically, he said, the U.S. was not one of them, and it has created a major geopolitical tug of war between EU regulators and American tech companies, and their backers across the U.S. 
political establishment.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&quot;Geopolitics has changed a lot,&quot; Sunderland said. &quot;Privacy innovation is going to be critical, but if there are choices made by organizations to pull back from, it&#39;s going to make life much harder when they&#39;re working in a multi-jurisdictional context. If you come from the perspective of, &#39;let’s see what I can get away with,&#39; in each jurisdiction, that all contributes to a sense of misalignment and diverging practices.&quot;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;ASEAN offers example for maintaining data flows despite cultural, political diversity among countries&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;However, the countries party to the Association of Southeast Asian Nations, which includes Indonesia, Singapore, Thailand and Vietnam, may offer an example of how various domestic sovereignty measures can be respected, while also enabling data to flow between jurisdictions with minimal friction.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Singapore&#39;s Personal Data Protection Commission Deputy Commissioner Denise Wong said despite varying forms of government among ASEAN countries, they each respect one another&#39;s economic interests and how their data protection frameworks operate within the broader concept of national sovereignty.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&quot;(Digital sovereignty) is about understanding what our values are, what our economic interests are, and then our interaction with technology has to respect that, whether it&#39;s the cloud, AI, data or digital as a whole,&quot; Wong said. &quot;For ASEAN, it&#39;s a group of countries with very different economic contexts, very different journeys in terms of digitalization, so what does sovereignty and digital sovereignty mean in that economic and societal context? 
Part of it is about making sure there is trust in society, that technologies can be used with confidence and making sure the technology is clearly understood within existing risk management frameworks.&quot;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;What&#39;s at stake&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Ultimately, if world governments continue taking steps to turn back the clock on globalization by pursuing a sovereignty agenda in the strictest sense, it may usher in a new era where digital barriers among countries begin to resemble the hard physical borders they have erected between one another.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Google Global Head of Privacy Policy Lanah Kammourieh Donnelly said if the day comes where countries retreat inward both economically and politically, less modernized economies will be particularly vulnerable because of transnational threat actors, who will continue to operate in a borderless environment, unlike cyber defenders who will be forced to operate in national siloes created under the guise of digital sovereignty. She said &quot;digital autarky is impossible,&quot; given the highly interconnected nature of technology.&lt;/p&gt;&lt;p&gt;&quot;(If in a) generation from now where the supply chains have been fundamentally rearranged and each country is trying to produce its own full stack entirely, you will deprive yourself of the best product at every level of the stack,&quot; Kammourieh Donnelly said. &quot;Our adversaries are not localized. You don&#39;t want a single point of failure, so when we think about data localization, we immediately think about the biggest companies and their real strategic considerations, but the unintended consequences will play out in things like everyday security.&quot;&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/global-push-for-digital-sovereignty-risks-complicating-global-data-flows-innovation</link>
      <guid isPermaLink="false">https://iapp.org/news/a/global-push-for-digital-sovereignty-risks-complicating-global-data-flows-innovation</guid>
      <pubDate>Wed, 08 Apr 2026 16:00:00 GMT</pubDate>
      <author>Alex LaCasse</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/bltf144dfefef4163bd/69d7b6614c11be08cfeee12a/panel-summit26-iappconference-040926.jpg" type="image/jpeg"></enclosure>
      <category>AI governance</category>
      <category>Privacy</category>
      <category>Cybersecurity law</category>
      <category>Data security</category>
      <category>Enforcement</category>
      <category>Frameworks and standards</category>
      <category>International data transfers</category>
      <category>Law and regulation</category>
      <category>Risk management</category>
      <category>iapp_original</category>
    </item>
    <item>
      <title>A view from Brussels: A European view from IAPP Global Summit 2026</title>
      <description>&lt;p&gt;There is something special about being a European at the IAPP Global Summit in Washington, D.C. This year&#39;s conference was perfectly timed for cherry blossom lovers and there is nothing like a stroll down around the Washington Monument and across the National Mall in the early evening to process two full days of inspirational keynotes, insightful panels and hallway discussions with old friends and new acquaintances.&amp;nbsp;&lt;/p&gt;&lt;p&gt;During the two days, attendees heard reassuring messages from both EU and U.S. officials about the robustness of the EU-U.S. Data Privacy Framework architecture. Two complaints from European citizens are currently moving through redress mechanisms; one is being processed and the other is still under review. According to officials on both sides of the agreement, this shows the redress mechanism is functioning as intended. &amp;nbsp;&lt;/p&gt;&lt;p&gt;Attendees also heard about digital rules and EU General Data Protection Regulation simplification plans. In contrast, or perhaps as a staggering illustration of the complexity Brussels is precisely attempting to solve, one panel dove into the cybersecurity law framework spanning the NIS2 Directive, the Cyber Resilience Act, the Critical Entities Resilience Directive and the Digital Operational Resilience Act. Several instruments still await national implementation laws and organizations are left guessing to some extent.&amp;nbsp;&lt;/p&gt;&lt;p&gt;The panelists&#39; advice could apply beyond these cybersecurity instruments as it feels like we are experiencing constant change: determine, update, manage obligations, maintain — the virtuous cycle of compliance and governance.&amp;nbsp;&lt;/p&gt;&lt;p&gt;And then NOYB&#39;s honorary chairman Max Schrems said the following during a fireside chat with IAPP Editorial Director Jedidiah Bracy: &quot;Laws are not made for when we all get along, and everything works. 
Laws are made for when things go bad.&quot;&lt;/p&gt;&lt;p&gt;I agree in part with Schrems&#39; statement. Laws are essential when things go bad. They are an integral component of a legal system and are foundational to its legitimacy. They enable redress, enforcement, sanctions and mitigation when needed. They anchor the ability to make things right when something defaults.&amp;nbsp;&lt;/p&gt;&lt;p&gt;But the statement came across as a restrictive way to view regulation, ignoring the essential positive value it bears: indicating what good looks like; ensuring a society, group of businesses or people operate on a commonly defined field, by the same rules; and building confidence in a collective system. The value of law also rests in its mission to be a compass, a guide to the north pole of good behavior, progress and accountability.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;em&gt;This article originally appeared in the European Data Protection Digest, a free weekly IAPP newsletter. Subscriptions to this and other IAPP newsletters can be found &lt;/em&gt;&lt;a href=&quot;https://iapp.org/news/subscriptions&quot; target=&quot;_self&quot;&gt;&lt;em&gt;here&lt;/em&gt;&lt;/a&gt;&lt;em&gt;.&amp;nbsp;&lt;/em&gt;&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/a-view-from-brussels-a-european-view-from-iapp-global-summit-2026</link>
      <guid isPermaLink="false">https://iapp.org/news/a/a-view-from-brussels-a-european-view-from-iapp-global-summit-2026</guid>
      <pubDate>Wed, 08 Apr 2026 16:00:00 GMT</pubDate>
      <author>Isabelle Roccia</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/blt84a3822957892ffc/69d7c2b070b17cd46d631f70/brussels-belgium-map-eu-111325.jpg" type="image/jpeg"></enclosure>
      <category>Cybersecurity law</category>
      <category>Privacy</category>
      <category>Law and regulation</category>
      <category>opinion</category>
    </item>
    <item>
      <title>Nigeria moves toward comprehensive AI regulation</title>
      <description>&lt;p&gt;The age of artificial intelligence has unlocked new opportunities, as well as significant challenges, and many countries are facing the dilemma of whether to allow innovation to progress freely or introduce safeguards to manage AI-related risks and protect users. The European Union took the lead with its 2024 &lt;a href=&quot;https://artificialintelligenceact.eu/the-act/&quot; target=&quot;_blank&quot;&gt;AI Act&lt;/a&gt;, and several African countries — including &lt;a href=&quot;https://www.scribd.com/document/822646478/Angola-MN&quot; target=&quot;_blank&quot;&gt;Angola&lt;/a&gt;, &lt;a href=&quot;https://ai.gov.eg/SynchedFiles/en/Resources/Egyptian%20Charter%20For%20Responsible%20AI.pdf&quot; target=&quot;_blank&quot;&gt;Egypt&lt;/a&gt;, &lt;a href=&quot;https://new.kenyalaw.org/akn/ke/bill/senate/2026-02-19/the-artificial-intelligence-bill-2026/eng@2026-02-19/source&quot; target=&quot;_blank&quot;&gt;Kenya&lt;/a&gt; and &lt;a href=&quot;https://en.hespress.com/100403-moroccan-parliament-considers-legislation-to-regulate-artificial-intelligence.html&quot; target=&quot;_blank&quot;&gt;Morocco&lt;/a&gt; — have since adopted similar stances toward governing AI through national strategies, policy frameworks and regulation.&amp;nbsp;&lt;/p&gt;&lt;p&gt;In Nigeria, the Artificial Intelligence and Robotics Research Regulatory Agency Bill was introduced in &lt;a href=&quot;https://placng.org/i/wp-content/uploads/2021/07/House-of-Reps-order-paper-Thursday-15-July-2021.pdf&quot; target=&quot;_blank&quot;&gt;2021&lt;/a&gt;, but did not complete its legislative cycle. In &lt;a href=&quot;https://x.com/PLACNG/status/1712054592563716326&quot; target=&quot;_blank&quot;&gt;2023&lt;/a&gt;, the National AI and Robotics Sciences Bill and the Control of Usage of Artificial Intelligence Technology were introduced into the House of Representatives, both scaling through the first reading. 
In 2024, the National Artificial Intelligence Regulatory Authority Bill was also presented for its first reading. However, in December 2024, the bills introduced to Parliament between 2021 and 2024 were consolidated for second reading.&amp;nbsp;&lt;/p&gt;&lt;p&gt;In 2024, Nigeria also published its &lt;a href=&quot;https://ncair.nitda.gov.ng/wp-content/uploads/2024/08/National-AI-Strategy_01082024-copy.pdf&quot; target=&quot;_blank&quot;&gt;National AI Strategy&lt;/a&gt;, which highlights crucial AI governance proposals, including: developing national AI principles; establishing an AI governance regulatory body; publishing transparent terms and guidelines for responsible AI development and deployment; and developing a comprehensive risk management framework that minimizes the potential negative impacts of AI deployment and use.&amp;nbsp;&lt;/p&gt;&lt;p&gt;In 2025, Nigeria maintained momentum on AI regulation with the introduction of the National Artificial Intelligence Commission (Establishment) &lt;a href=&quot;https://www.dataguidance.com/news/nigeria-national-ai-commission-bill-passes-first&quot; target=&quot;_blank&quot;&gt;Bill&lt;/a&gt;, which proposes the establishment of a National AI Commission to oversee AI development and the Nigeria Digital Sovereignty and Fair Data Compensation Bill, 2025. Considerable progress was also made with the National Digital Economy and E-Governance Bill.&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;The National Digital Economy and E-Governance Bill&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;The National Digital Economy and E-Governance Bill is designed as a foundational statute for Nigeria&#39;s digital public infrastructure and governance ecosystem, aiming to modernize public sector administration by establishing a legal equivalence between electronic and paper-based processes. 
It applies to individuals and public institutions engaged in electronic transactions and records, and trust service providers operating within or in relation to Nigeria.&amp;nbsp;&lt;/p&gt;&lt;p&gt;The bill seeks to address key aspects of the digital economy, including electronic commerce, digital government transformation, consumer protection in online transactions and cybersecurity compliance.&lt;/p&gt;&lt;p&gt;On AI governance, it would position the National Information Technology Development Agency as the central authority responsible for regulating the digital economy, including AI systems deployed in both public and private sectors. The bill would introduce a risk-based framework for AI systems, requiring that they be designed and deployed in a manner that is fair, transparent, secure, nondiscriminatory and subject to human oversight. Developers and deployers of AI systems would be required to implement governance and risk management measures proportional to the potential impact of their systems.&amp;nbsp;&lt;/p&gt;&lt;p&gt;While the bill does not specifically classify AI risks, it would empower the NITDA to develop regulations considering: the purpose and context for use of the AI system; the nature and extent of its application; the likelihood and severity of damages and impacts that may result from its use; the reversibility of its effects on the rights of affected persons; its degree of autonomy and the possibility of human oversight or intervention; and the level of control exercised by an AI agent over the system function and outcomes.&amp;nbsp;&lt;/p&gt;&lt;p&gt;The bill would also require AI agents to cooperate with the NITDA and the Nigeria Data Protection Commission to ensure compliance with relevant data protection laws.&amp;nbsp;&lt;/p&gt;&lt;p&gt;A notable feature of the proposal is its formal establishment of a regulatory sandbox for digital and AI innovation. 
This mechanism would allow companies to test new AI-driven products and services within a controlled regulatory environment, subject to oversight and predefined safeguards. While intended to promote innovation, participation in the sandbox would also provide regulators with early visibility into emerging technologies and associated risks.&lt;/p&gt;&lt;p&gt;Enforcement powers under the bill would be extensive. It would authorize the NITDA to conduct compliance audits, accredit AI auditors, require corrective measures, suspend AI systems deemed to pose imminent risks, and impose administrative penalties. Depending on the severity of the breach, sanctions could include fines of up to NGN10 million, approximately USD7,332, or up to 2% of an entity&#39;s preceding annual gross revenue in Nigeria.&amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;The Digital Sovereignty and Fair Data Compensation Bill&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;This bill seeks to establish a framework for Nigeria&#39;s digital sovereignty, AI governance and fair compensation for the use of Nigerian data. It applies primarily to foreign digital companies generating revenue from Nigerian users, particularly those involved in online advertising, cloud computing, AI training or digital transactions.&amp;nbsp;&lt;/p&gt;&lt;p&gt;It aims to ensure data generated within Nigeria remains in Nigeria by requiring foreign digital companies to contribute fairly to the Nigerian economy, preventing unregulated extraction of Nigerian data, promoting local AI innovation and research, and strengthening national security through local data storage. 
Additionally, the bill aims to promote AI innovation and research within Nigeria, strongly advocating for AI-focused research and development centered on Nigeria, with a strong focus on financial compensation mechanisms and regulatory penalties.&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: black;&quot;&gt;The bill would require that all data belonging to Nigerian users collected by foreign digital companies be stored and processed within Nigeria, and that such companies establish local data centers or mirror servers in the country. It would also require that all cross-border data transfers be approved by the NITDA, supported by mandatory proof of compliance with Nigeria&#39;s data security laws.&amp;nbsp;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: black;&quot;&gt;The Nigeria AI Development Fund would be established under the bill and foreign companies that use Nigerian data to train AI models would be required to contribute 2% of their annual revenue from Nigeria to the fund. Additionally, any AI company operating in Nigeria would be required to ensure that at least 30% of its AI research and development using Nigerian data is conducted in the country. Financial sanctions under the bill would range from 10% of annual Nigerian turnover to NGN1 billion, approximately USD733,259.&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;span style=&quot;color: black;&quot;&gt;The bill raises potential conflicts with existing laws, particularly the Nigeria Data Protection Act. The NDPA permits cross-border data transfers based on adequacy decisions, appropriate safeguards and recognized derogations, without requiring prior authorization from a regulatory authority. 
However, privacy professionals can navigate this tension by paying close attention to the purpose of the transfer and the context of data use, ensuring that any required approvals for cross-border transfers are obtained where applicable.&lt;/span&gt;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;What AI governance professionals should watch closely&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Both bills make it clear that the NITDA will be at the forefront of AI regulation in Nigeria. While the DPC will be actively involved, the NITDA will take the lead. This will require close monitoring of the agency&#39;s activities, guidance notes and AI-related regulations. Additionally, it would be helpful for privacy professionals to seek guidance on AI risk classification, audit expectations and transparency requirements, especially after the bills take effect.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Data localization and AI training obligations may require fundamental changes to AI system architecture, data pipelines and vendor arrangements. This requires privacy and AI professionals to assess whether AI models rely on Nigerian data and whether existing cross-border processing arrangements can be reconciled with the proposed localization regime.&lt;/p&gt;&lt;p&gt;The financial implications of both bills are quite significant. Digital services taxes, AI data compensation contributions, and revenue-based penalties pose material regulatory risk that must be factored into product design, market-entry strategies and contractual arrangements. Compliance with these regulations should never be compromised, given the potential financial sanctions.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Finally, both bills signal a clear expectation that organizations deploying and developing AI systems in Nigeria adopt mature, documented AI governance frameworks. This includes formal risk assessments, human oversight mechanisms, auditability and clear internal accountability for AI-related decisions. 
Organizations that proactively embed these controls will be better positioned as Nigeria transitions from legislative experimentation to active enforcement.&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/nigeria-moves-toward-comprehensive-ai-regulation</link>
      <guid isPermaLink="false">https://iapp.org/news/a/nigeria-moves-toward-comprehensive-ai-regulation</guid>
      <pubDate>Tue, 07 Apr 2026 16:00:00 GMT</pubDate>
      <author>Victoria Adaramola</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/blt9776fb25370dc2e5/69d3e404cb1540633e596a71/tower-city-lagos-nigeria-africa-0004.jpg" type="image/jpeg"></enclosure>
      <category>AI governance</category>
      <category>AI and machine learning</category>
      <category>Law and regulation</category>
      <category>Program management</category>
      <category>analysis</category>
      <category>member_gated</category>
    </item>
    <item>
      <title>Personalization, privacy and the mystery of the red hair dye</title>
      <description>&lt;p&gt;Being in the privacy profession for a decade now, I have had my share of conversations that start with &quot;we want to provide customers with more personalized offers&quot; or worse, &quot;we want to increase our intimacy with our customers.&quot; I have a terrible poker face and can only imagine what my body language does when a conversation starts like this.&lt;/p&gt;&lt;p&gt;This morning, however, I found myself wishing I had a more &quot;intimate&quot; relationship with my online grocery provider. Hear me out.&lt;/p&gt;&lt;p&gt;Every week, like clockwork, I place my online grocery order. I have been doing this for seven years. Seven years of loyalty. Seven years of scanned barcodes, substitutions, forgotten avocados and emergency top-ups because I forgot the milk. And yet, when I open the app and scroll to &quot;personalized offers,&quot; I am confidently presented with red hair dye and pet food.&lt;/p&gt;&lt;p&gt;I do not dye my hair red.&lt;/p&gt;&lt;p&gt;I do not own a pet.&lt;/p&gt;&lt;p&gt;I have never owned a pet.&lt;/p&gt;&lt;p&gt;I have never — explicitly or implicitly — indicated to this retailer that either of these things might be relevant to me.&lt;/p&gt;&lt;p&gt;And yet, there they are. Every week.&amp;nbsp;&lt;/p&gt;&lt;p&gt;What makes this particularly odd is that, based on my order history alone, my grocery provider almost certainly knows a great deal about my household. They could reasonably infer how many people I shop for. They likely have a good sense of the age range of my children — hello school-lunch staples, farewell nappies. 
They know the types of meals we cook, how those meals change with the seasons and when life is busy versus when I optimistically plan to cook from scratch.&lt;/p&gt;&lt;p&gt;They know all of this because I&#39;ve told them, quietly, consistently, through thousands of first-party transactions over many years.&lt;/p&gt;&lt;p&gt;And yet the best they can do is suggest hair dye and pet food.&lt;/p&gt;&lt;p&gt;This is where I start to feel frustrated by how privacy and personalization are often framed as opposing forces. As if organizations must choose between rich, personalized customer experiences that are inherently invasive or privacy-respectful approaches that result in generic, low-value interactions.&lt;/p&gt;&lt;p&gt;This framing is misleading.&lt;/p&gt;&lt;p&gt;I am not asking my grocery store to infer sensitive attributes about me, combine my data with third-party sources or surprise me with insights I did not knowingly provide. I am simply asking them to use the data they already hold, collected directly from me through ordinary transactions, to provide a more useful experience in return.&lt;/p&gt;&lt;p&gt;At the moment, the value exchange feels lopsided. The retailer benefits from my data through demand forecasting, supply chain optimization and customer retention. I benefit very little. 
The offered &quot;personalization&quot; feels disconnected from my actual behavior and, over time, undermines rather than builds trust.&lt;/p&gt;&lt;p&gt;And here is the part that often gets lost in privacy discussions: I would be willing to share more information if it genuinely improved my experience.&lt;/p&gt;&lt;p&gt;I would happily state my dietary preferences.&lt;/p&gt;&lt;p&gt;I would tell them what kinds of meals I enjoy cooking.&lt;/p&gt;&lt;p&gt;I would welcome the ability to say &quot;please don&#39;t show me pet products,&quot; &quot;I don&#39;t want alcohol promotions,&quot; or &quot;I&#39;d prefer not to see children&#39;s products.&quot;&lt;/p&gt;&lt;p&gt;None of this requires invasive profiling. It requires transparency, clear purpose and meaningful choice.&lt;/p&gt;&lt;p&gt;This could be accomplished through highly privacy-respectful methods: clear preference centers, explicit opt in to personalization, the ability to turn personalization off entirely and controls over more sensitive product categories. Customers should be able to understand why they receive certain recommendations and adjust those signals over time.&lt;/p&gt;&lt;p&gt;The irony, of course, is that the data already exists. It is already held, governed, retained and linked to my account. But rather than being used to create mutual value, it largely serves the organization&#39;s interests. When customers don&#39;t see tangible benefits, they disengage, opt out or stop trusting the system altogether.&lt;/p&gt;&lt;p&gt;Privacy should not be used as the explanation for poor customer experience. When done well, privacy is an enabler of better design. 
It should result in experiences built on trust, proportionality and genuine reciprocity.&lt;/p&gt;&lt;p&gt;I still wince when I hear talk of &quot;increasing intimacy with customers.&quot; But if my grocery provider wants to start by acknowledging that seven years of weekly orders probably says more about me than an algorithm guessing I own a dog, I&#39;m open to the conversation.&lt;/p&gt;&lt;p&gt;Just … please stop with the hair dye.&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/personalization-privacy-and-the-mystery-of-the-red-hair-dye</link>
      <guid isPermaLink="false">https://iapp.org/news/a/personalization-privacy-and-the-mystery-of-the-red-hair-dye</guid>
      <pubDate>Tue, 07 Apr 2026 16:00:00 GMT</pubDate>
      <author>Leah Parker</author>
      <enclosure url="https://images.contentstack.io/v3/assets/bltd4dd5b2d705252bc/bltfbd539958aabac70/69cec7436ec44e77dbd539f5/grocery-carts-marketing-retail-0014.jpg" type="image/jpeg"></enclosure>
      <category>Privacy</category>
      <category>Adtech</category>
      <category>opinion</category>
    </item>
    <item>
      <title>Notes from the AI Governance Center: AI governance has officially been woven into the IAPP Global Summit</title>
      <description>&lt;p&gt;Something felt different at this year&#39;s IAPP Global Summit. The integration of artificial intelligence governance into the program no longer&amp;nbsp;seemed like&amp;nbsp;an addition&amp;nbsp;to a privacy conference.&amp;nbsp;This year, the AI governance sessions, questions from participants, and hallway chatter&amp;nbsp;sounded&amp;nbsp;more&amp;nbsp;informed, more specific and more nuanced. &amp;nbsp;&lt;/p&gt;&lt;p&gt;High-level musings about the uptake and impact of AI or introductory information sessions about the EU AI Act evolved&amp;nbsp;into&amp;nbsp;action-oriented AI governance panels with practical examples, guidance and&amp;nbsp;even&amp;nbsp;takeaway&amp;nbsp;frameworks, as&amp;nbsp;well as meaningful dialogue between regulators&amp;nbsp;and AI deployers&amp;nbsp;on implementation questions. &amp;nbsp;&lt;/p&gt;&lt;p&gt;While&amp;nbsp;it&#39;s&amp;nbsp;impossible to see&amp;nbsp;everything and everyone at&amp;nbsp;Summit, I was pleased&amp;nbsp;to&amp;nbsp;be&amp;nbsp;able to attend&amp;nbsp;several&amp;nbsp;sessions and&amp;nbsp;meet so many AI governance practitioners&amp;nbsp;this year.&amp;nbsp;These are my key takeaways&amp;nbsp;from the IAPP Global Summit 2026. &amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Change as opportunity&amp;nbsp;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;To kick off, Travis LeBlanc&#39;s conversation with cognitive scientist &lt;a href=&quot;https://mayashankar.com/&quot; target=&quot;_blank&quot;&gt;Maya Shankar&lt;/a&gt; highlighted a&amp;nbsp;series of stories about people who had experienced profound trauma and used those moments as an opportunity to ask a harder question:&amp;nbsp;What is truly self-defining, and what is simply circumstance? 
&amp;nbsp;&lt;/p&gt;&lt;p&gt;She shared a story about a prisoner that stuck with me.&amp;nbsp;For her book &quot;&lt;a href=&quot;https://www.goodreads.com/book/show/231404078-the-other-side-of-change&quot; target=&quot;_blank&quot;&gt;The Other Side of Change,&quot;&lt;/a&gt; Shankar interviewed a man who recounted his feelings prior&amp;nbsp;to incarceration. He contemplated the person he would become,&amp;nbsp;wondering what&amp;nbsp;this experience&amp;nbsp;would do to him.&amp;nbsp;When faced with the realities of prison,&amp;nbsp;he found&amp;nbsp;there were&amp;nbsp;more choices available to him than&amp;nbsp;anticipated.&amp;nbsp;He ended up writing&amp;nbsp;poetry and&amp;nbsp;mentoring younger inmates. &amp;nbsp;&lt;/p&gt;&lt;p&gt;Most importantly, he&amp;nbsp;made choices that led to a&amp;nbsp;positive experience&amp;nbsp;while&amp;nbsp;there,&amp;nbsp;which&amp;nbsp;ultimately&amp;nbsp;helped&amp;nbsp;his future self.&lt;/p&gt;&lt;p&gt;While this example might seem like a far departure from the day-to-day experience&amp;nbsp;of most AI governance professionals, it resonated with me since it is easy for us&amp;nbsp;to feel like we are up against a&amp;nbsp;behemoth.&amp;nbsp;This often leads&amp;nbsp;to&amp;nbsp;a&amp;nbsp;sense that we&amp;nbsp;are losing agency when the circumstances seem&amp;nbsp;too difficult, complex, or there are too many societal decisions already made for us. &amp;nbsp;&lt;/p&gt;&lt;p&gt;This person&#39;s choices are&amp;nbsp;a good&amp;nbsp;reminder of how even in less-than-ideal circumstances we can&amp;nbsp;create&amp;nbsp;agency. We have the choice to accept that our circumstances are changing,&amp;nbsp;whether we hoped for them&amp;nbsp;to&amp;nbsp;or not. 
Given our roles,&amp;nbsp;individual choices can help work toward a positive outcome&amp;nbsp;during&amp;nbsp;this significant&amp;nbsp;period of change that is upon us.&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Inspired royalty &amp;nbsp;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;A highlight for many was&amp;nbsp;the keynote by&amp;nbsp;Prince Harry, Duke of Sussex,&amp;nbsp;and&amp;nbsp;his&amp;nbsp;conversation with&amp;nbsp;the IAPP&#39;s&amp;nbsp;Joe Jones.&amp;nbsp;Harry&#39;s&amp;nbsp;lived experience&amp;nbsp;in the public eye since&amp;nbsp;birth&amp;nbsp;has given him a unique&amp;nbsp;view of&amp;nbsp;privacy.&amp;nbsp;For me, the most compelling part of his work was how his experience inspired meaningful reflection on the role and impact of social media and technology in our society.&amp;nbsp;Harry&#39;s &lt;a href=&quot;https://iapp.org/resources/article/iapp-global-summit-2026-keynote-prince-harry-the-duke-of-sussex&quot; target=&quot;_self&quot;&gt;keynote speech&lt;/a&gt; and a recap of &lt;a href=&quot;https://iapp.org/news/a/at-iapp-global-summit-2026-prince-harry-hails-digital-governance-pros-on-their-important-work&quot;&gt;his conversation&lt;/a&gt; with Jones are available&amp;nbsp;for your perusal. &amp;nbsp;&lt;/p&gt;&lt;p&gt;Two concepts stood out.&amp;nbsp;Concerns about&amp;nbsp;trust that many in the AI governance community and broader society have raised about the increased&amp;nbsp;use of technology is core to his motivation. He&amp;nbsp;is using&amp;nbsp;his platform to inspire others to&amp;nbsp;leverage&amp;nbsp;their own platforms to create change,&amp;nbsp;stating,&amp;nbsp;&quot;the question isn&#39;t whether our concept of trust is broken; it&#39;s whether we&#39;re willing to rebuild it for everyone&#39;s sake.&quot;&lt;/p&gt;&lt;p&gt;Harry drew on precedents from aviation, medicine and finance;&amp;nbsp;industries where trust was not assumed but engineered.&amp;nbsp;Trust&amp;nbsp;did not&amp;nbsp;emerge&amp;nbsp;from good intentions alone. 
It was the product of governance structures, incentive alignment and deliberate decisions to put rules in place before the worst harms&amp;nbsp;materialized.&amp;nbsp;Building on the discussion with Maya Shankar and the power of choice in the face of change, the technology sector, he argued, cannot wait for&amp;nbsp;behavior&amp;nbsp;to change on its own. &amp;nbsp;&lt;/p&gt;&lt;p&gt;Second, when asked about what makes him hopeful, he shared that it was &quot;everyone in this room.&quot; The people&amp;nbsp;governing technology, ensuring privacy, preventing cyberattacks.&amp;nbsp;It was&amp;nbsp;encouraging&amp;nbsp;to hear that those who have&amp;nbsp;devoted&amp;nbsp;their&amp;nbsp;careers to digital governance&amp;nbsp;are&amp;nbsp;not only&amp;nbsp;being&amp;nbsp;recognized&amp;nbsp;but&amp;nbsp;are&amp;nbsp;also&amp;nbsp;part of&amp;nbsp;what is inspiring hope.&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;A tool is not a moral object&amp;nbsp;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;In conversation with&amp;nbsp;IAPP Vice President and Chief Knowledge Officer&amp;nbsp;Caitlin Fennessy, CIPP/US,&amp;nbsp;legendary&amp;nbsp;author Salman Rushdie shared that a tool is not itself a moral object. This&amp;nbsp;is the concept that will&amp;nbsp;probably stay&amp;nbsp;with&amp;nbsp;me&amp;nbsp;the longest.&amp;nbsp;It again&amp;nbsp;builds on&amp;nbsp;the idea that we all have agency, that technology itself is simply a tool, and it is up to all of us — both on the&amp;nbsp;individual and collective levels — to decide how it is wielded. &amp;nbsp;&lt;/p&gt;&lt;p&gt;Rushdie spoke about his own evolution of privacy.&amp;nbsp;After&amp;nbsp;his near-fatal attack, he&amp;nbsp;said&amp;nbsp;that&amp;nbsp;his&amp;nbsp;experience&amp;nbsp;reshaped how he thinks about the&amp;nbsp;context&amp;nbsp;in which concepts like privacy&amp;nbsp;should be understood, and he&amp;nbsp;reminded&amp;nbsp;us&amp;nbsp;that privacy and other harms&amp;nbsp;cannot be assessed in the abstract. 
&amp;nbsp;&lt;/p&gt;&lt;p&gt;The right decisions about how much,&amp;nbsp;or how little,&amp;nbsp;privacy a person needs must be made with full awareness of the circumstances they are navigating. What is&amp;nbsp;appropriate in&amp;nbsp;one context is entirely wrong in another.&amp;nbsp;He then reminded us that not everyone has the same access to privacy.&amp;nbsp;He&amp;nbsp;shared the realities of&amp;nbsp;growing up in India&amp;nbsp;where&amp;nbsp;privacy is not always available to people in less fortunate circumstances. &amp;nbsp;&lt;/p&gt;&lt;p&gt;It was a good reminder that we&amp;nbsp;shouldn&#39;t&amp;nbsp;repeat the challenges of the built world in the digital world.&amp;nbsp;For more, the IAPP&#39;s Alex LaCasse reported on &lt;a href=&quot;https://iapp.org/news/a/iapp-global-summit-2026-salman-rushdie-reflects-on-notion-of-privacy-after-attempt-on-his-life&quot; target=&quot;_self&quot;&gt;Rushdie&#39;s conversation&lt;/a&gt;. &amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Digital governance professionals are&amp;nbsp;creating change&amp;nbsp;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;While the inspirational keynotes&amp;nbsp;picked up on&amp;nbsp;many themes of the conference, the people doing the work and sharing their experiences built upon these concepts. &amp;nbsp;&lt;/p&gt;&lt;p&gt;In several panels, the question of how to deal with privacy and other AI governance principles in practice was met with real-life examples. When should synthetic data be used to protect someone&#39;s privacy? If training&amp;nbsp;a model&amp;nbsp;with personal information will save a person&#39;s life, is it acceptable to use this data? 
These are no longer theoretical&amp;nbsp;questions.&amp;nbsp;Many&amp;nbsp;professionals&amp;nbsp;are starting to draw lines on what these limits are within each of their organizations.&amp;nbsp;&lt;/p&gt;&lt;p&gt;From discussions about&amp;nbsp;AI vendor contracts&amp;nbsp;to best practices when building risk assessments for AI implementation, there were similar themes. Understand the&amp;nbsp;objectives&amp;nbsp;that you are trying to achieve with these&amp;nbsp;technologies. Work across teams&amp;nbsp;to pull in the right subject matter experts at the right time, and&amp;nbsp;don&#39;t&amp;nbsp;do&amp;nbsp;compliance for&amp;nbsp;compliance&#39;s&amp;nbsp;sake.&amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Getting more granular &amp;nbsp;&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;It&#39;s&amp;nbsp;probably important&amp;nbsp;to note that&amp;nbsp;it&#39;s&amp;nbsp;not just the IAPP community that is&amp;nbsp;evolving. A&amp;nbsp;significant part of this year&#39;s depth is due to evolving governance needs. &amp;nbsp;&lt;/p&gt;&lt;p&gt;One panel&amp;nbsp;dove specifically into how AI governance implementation is changing. They spoke about the&amp;nbsp;set of triggers that digital governance teams increasingly need to track: changes in data sources, improvements in harm feedback from a wider range of sources, shifts in core functionality and model performance, better understanding of third-party risk, changes in legislation across geographical regions, and the question of timing. When&amp;nbsp;should&amp;nbsp;reviews happen and how&amp;nbsp;frequently? This discussion presented different scenarios, and one of the panelists,&amp;nbsp;Andrew&amp;nbsp;Gamino-Cheong,&amp;nbsp;shared &lt;a href=&quot;https://trustible.ai/post/what-ai-governance-looks-like-after-year-one/&quot; target=&quot;_blank&quot;&gt;some best practices&lt;/a&gt;.&lt;/p&gt;&lt;p&gt;In addition to overarching best practices and framework discussions, we wanted to get into some sector-specific discussions. 
I was pleased to host a&amp;nbsp;discussion&amp;nbsp;exploring&amp;nbsp;how financial institutions are approaching AI governance,&amp;nbsp;asking whether they are building entirely new processes, or augmenting privacy, legal and risk frameworks already in place.&amp;nbsp;More details on the discussion &lt;a href=&quot;https://www.linkedin.com/posts/ashley-casovan-b3247211_huge-thanks-to-jennifer-kosar-meera-d-and-activity-7444765356289622016--i6r?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAAJhytYBx02UNVBsYzLMUUtCShJZ-EcK_zA&quot; target=&quot;_blank&quot;&gt;here.&lt;/a&gt;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Looking ahead&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;My final takeaway was that AI governance professionals are not done with change.&amp;nbsp;I&#39;ve&amp;nbsp;started to think and write about this more, but the idea of using AI as a part of the AI governance process came up in a lot of my discussions. &amp;nbsp;&lt;/p&gt;&lt;p&gt;What does this shift mean for the future of the AI governance profession?&lt;/p&gt;&lt;p&gt;Questions about future literacy requirements for AI governance professionals and the training of the AI agents were on people&#39;s minds.&amp;nbsp;People want to understand where and how to use these AI agents, and when and for what they are better at reviewing than humans.&lt;/p&gt;&lt;p&gt;During this year&#39;s conference there were more questions than best practices. However, it&amp;nbsp;seems&amp;nbsp;clear that future conferences, likely before we get to Global Summit 2027,&amp;nbsp;will start to&amp;nbsp;provide&amp;nbsp;more examples of where AI governance professionals are working alongside agentic digital governance professionals. &amp;nbsp;&lt;/p&gt;&lt;p&gt;What does this mean&amp;nbsp;for&amp;nbsp;us? &amp;nbsp;&lt;/p&gt;&lt;p&gt;Change will be a constant for our profession, but as many of our keynotes emphasized, we have the agency to make that&amp;nbsp;change&amp;nbsp;a positive for us and for society. 
&amp;nbsp;&lt;/p&gt;&lt;h4&gt;&lt;strong&gt;Additional IAPP Global Summit 2026 posts worth reading&lt;/strong&gt;&lt;/h4&gt;&lt;p&gt;Joe Jones on &lt;a href=&quot;https://www.linkedin.com/posts/joe-jones-b1793bb6_iappsummit26-activity-7445409702991155200-isM5?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAAJhytYBx02UNVBsYzLMUUtCShJZ-EcK_zA%C2%A0%C2%A0&quot; target=&quot;_blank&quot;&gt;LinkedIn&lt;/a&gt;. &amp;nbsp;&lt;/p&gt;&lt;p&gt;New Irish Data Protection Commissioner Niamh Sweeney addresses scrutiny over her appointment, shares agency priorities, by &lt;a href=&quot;https://iapp.org/news/a/new-irish-data-protection-commissioner-niamh-sweeney-addresses-scrutiny-over-her-appointment-shares-agency-priorities&quot; target=&quot;_self&quot;&gt;Jedidiah Bracy.&lt;/a&gt;&lt;/p&gt;&lt;p&gt;FTC Commissioner Meador stresses agency preference for &#39;case-by-case&#39; enforcement, by &lt;a href=&quot;https://iapp.org/news/a/iapp-global-summit-2026-ftc-commissioner-meador-stresses-agency-preference-for-case-by-case-enforcement&quot; target=&quot;_self&quot;&gt;Joe Duball.&lt;/a&gt;&lt;/p&gt;&lt;p&gt;European Data Protection Office&amp;nbsp;on &lt;a href=&quot;https://www.linkedin.com/posts/ai-privacy-enforcement-ugcPost-7444858450842550272-bde0?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAAJhytYBx02UNVBsYzLMUUtCShJZ-EcK_zA%C2%A0%C2%A0&quot; target=&quot;_blank&quot;&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;&lt;p&gt;&#39;About bloody time&#39;: Prince Harry welcomes lawsuits against tech firms, by &lt;a href=&quot;https://www.theguardian.com/uk-news/2026/apr/01/about-bloody-time-prince-harry-welcomes-landmark-suits-against-major-tech-companies&quot; target=&quot;_blank&quot;&gt;the Guardian.&lt;/a&gt;&lt;/p&gt;&lt;p&gt;Barbara Cosgrove on &lt;a 
href=&quot;https://www.linkedin.com/posts/barbara-cosgrove_iapp2026-aigovernance-digitaltrust-activity-7447332754012221440-kL6r?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAAJhytYBx02UNVBsYzLMUUtCShJZ-EcK_zA%C2%A0&quot; target=&quot;_blank&quot;&gt;LinkedIn&lt;/a&gt;. &amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;em&gt;This article originally appeared in the AI Governance Dashboard, a free weekly IAPP newsletter. Subscriptions to this and other IAPP newsletters can be found &lt;/em&gt;&lt;a href=&quot;https://iapp.org/news/subscriptions&quot;&gt;&lt;em&gt;here&lt;/em&gt;&lt;/a&gt;&lt;em&gt;.&lt;/em&gt;&lt;/p&gt;</description>
      <link>https://iapp.org/news/a/ai-governance-has-officially-been-woven-into-the-iapp-global-summit</link>
      <guid isPermaLink="false">https://iapp.org/news/a/ai-governance-has-officially-been-woven-into-the-iapp-global-summit</guid>
      <pubDate>Tue, 07 Apr 2026 16:00:00 GMT</pubDate>
      <author>Ashley 

@TonyRL TonyRL merged commit 86bc955 into DIYgod:master Apr 10, 2026
31 of 32 checks passed
@TonyRL TonyRL deleted the feat/iapp branch April 10, 2026 01:37

Successfully merging this pull request may close these issues.

Request to add an RSS feed for https://iapp.org/news