
- Q1. You’ve grown Pundi AI’s community to over 28,000 active data owners. How does this model upend the “closed‑data” advantage of incumbent AI giants like OpenAI and Google?
- Q2. What are the biggest challenges you face in ensuring the quality, diversity, and provenance of crowdsourced data, and how does Pundi AI address them?
- Q3. There’s growing concern about synthetic or low‑quality training data polluting AI models. How does real‑world, human‑verified data factor into model performance and trust?
- Q4. Can you walk us through a concrete example where community‑sourced data materially improved an AI model’s accuracy or utility, versus a purely synthetic dataset?
- Q5. With 4.5 trillion data tokens issued so far, what key lessons have you learned about using token economics to incentivize high-quality contributions? Any unexpected outcomes?
- Q6. Dataset Partnerships: Given the scale of 122,000+ datasets, what types of collaborations (e.g., for LLM training or AI development) do you see as most promising or impactful?
- Q7. Which metric(s) do you watch most closely as a leading indicator of Pundi AI’s health and growth—and why?
- Q8. You started in payments with Pundi X before pivoting to AI data. What common lessons on decentralisation carried over, and what new challenges surprised you?
- Q9. As an Erasmus Mundus scholar and former Opera evangelist, how did your early academic and evangelism experiences shape Pundi AI’s mission and culture?
- Q10. What was the “aha” moment when you realized AI data quality and provenance would be Pundi AI’s focus—and why that, rather than another AI niche?
- Q11. By your goal, one million people should earn from AI via Pundi AI. What does the roadmap look like to reach that milestone, and what milestones in “people empowered” matter most to you?
- Q12. How does Pundi AI give real ownership of AI data back to individuals, and why does that matter for the future of AI?
- Q13. What’s the thinking behind using a bonding curve for Data Pump, and how will it shape participation, pricing, and dataset liquidity going forward?
- Q14. You’ve integrated a full OmniLayer stack including the data platform, marketplace, and data tokenization. How do these components work together to create a seamless developer experience?
- Q15. Defining Openness: How does Pundi AI’s approach to being a "truly open AI ecosystem" differ from others in the market, and what does that mean in practice?
- Q16. In five years, what does a successful "open AI ecosystem" look like for Pundi AI?
Q1. You’ve grown Pundi AI’s community to over 28,000 active data owners. How does this model upend the “closed‑data” advantage of incumbent AI giants like OpenAI and Google?
Traditional AI giants built their advantage on proprietary data, often scraped without consent from public sources. Pundi AI fundamentally flips this model. Our 28,000+ contributors voluntarily own and contribute their data, creating a transparent, ethically sourced alternative that breaks the monopoly of homogenised corporate datasets.
What makes this powerful is the diversity we unlock. Our community-sourced datasets include multilingual voice samples from rural regions and cultural contexts that are completely absent from most corporate training data. Because contributors earn value from their data through tokenised rewards, we’re building a different kind of dataset: a sustainable alternative to the extraction economy that powers today’s AI giants.
Q2. What are the biggest challenges you face in ensuring the quality, diversity, and provenance of crowdsourced data, and how does Pundi AI address them?
The biggest challenge is ensuring quality at scale while maintaining our decentralised ethos. The platform provides professional data labeling and annotation services, ensuring AI data remains open and accessible to all.
Powered by the Pundi AIFX OmniLayer, activities are securely recorded on-chain and integrated with IPFS and various blockchains, which provides immutable provenance tracking. With a browser plugin for X (formerly Twitter), users can actively contribute by tagging and annotating data for Social AI, fostering a more participatory ecosystem.
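The core idea behind this kind of provenance tracking is content addressing: a dataset is identified by a hash of its bytes, so any tampering changes the identifier. Pundi AI has not published its exact record format, so the function and field names below are illustrative assumptions; a minimal sketch, using a SHA-256 digest as a stand-in for an IPFS CID, looks like this:

```python
import hashlib
import time

def record_provenance(dataset_bytes: bytes, contributor: str) -> dict:
    """Build a provenance record for a dataset contribution.

    The content hash is a stand-in for an IPFS CID: identical bytes
    always yield the same identifier, and any change to the data
    yields a different one. A record like this could then be
    anchored on-chain to make the provenance trail immutable.
    """
    digest = hashlib.sha256(dataset_bytes).hexdigest()
    return {
        "content_hash": digest,      # deterministic, content-derived ID
        "contributor": contributor,  # who submitted the data
        "timestamp": int(time.time()),
    }
```

Because the identifier is derived from the content itself, a verifier only needs the bytes and the on-chain record to confirm that a dataset is exactly what the contributor originally submitted.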
Q3. There’s growing concern about synthetic or low‑quality training data polluting AI models. How does real‑world, human‑verified data factor into model performance and trust?
Synthetic data has fundamental limitations: it amplifies existing biases and creates artificial patterns that don’t exist in the real world. Pundi AI is a decentralised artificial intelligence ecosystem built on blockchain technology. Our mission is to empower unbiased global collaboration in AI development by enabling secure, transparent, and privacy-preserving data exchange.
Users are rewarded in digital assets for their contributions, and the data they help create goes into training open-source AI agents. This creates transparency where users can trace the provenance of training data, which is becoming essential as AI moves into high-stakes applications.
Q4. Can you walk us through a concrete example where community‑sourced data materially improved an AI model’s accuracy or utility, versus a purely synthetic dataset?
For example, our partnership with ElizaOS specifically democratises dataset access for AI agent creators, providing concrete evidence of how community data improves model performance. We also collaborate with FLock.io and the Zypher Network to use these datasets for AI agent training.
Q5. With 4.5 trillion data tokens issued so far, what key lessons have you learned about using token economics to incentivize high-quality contributions? Any unexpected outcomes?
In 2024, the community voted to rebrand the FX token to Pundi AI. This migration introduced a new tokenomics model that centres the Pundi AI token as both a governance asset and a reward currency for data contributors.
Contributors are fairly compensated through a “tag to earn” model, and the PundiAI token ($PUNDIAI) is the native cryptocurrency that fuels the ecosystem. It incentivises data contributors, powers governance decisions, and facilitates transactions within the platform.
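A simple way to think about “tag to earn” is pro-rata reward splitting: a pool of tokens is divided among contributors in proportion to their accepted contributions. Pundi AI has not published its exact reward formula, so the function below is an illustrative assumption, not the platform’s actual mechanism:

```python
def split_rewards(pool: float, contributions: dict) -> dict:
    """Split a reward pool pro rata by accepted annotation count.

    `contributions` maps a contributor name to the number of
    annotations accepted after quality review. Each contributor
    receives a share of `pool` proportional to their count.
    """
    total = sum(contributions.values())
    if total == 0:
        return {user: 0.0 for user in contributions}
    return {user: pool * n / total for user, n in contributions.items()}
```

In practice a production scheme would also weight by quality scores or reviewer consensus, but the pro-rata core is what ties rewards directly to verified contribution volume.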
Q6. Dataset Partnerships: Given the scale of 122,000+ datasets, what types of collaborations (e.g., for LLM training or AI development) do you see as most promising or impactful?
We’re seeing the strongest traction in three areas:
- Infrastructure Partners – e.g., the NVIDIA Inception Program provides enterprise validation, while ElizaOS and Swarm Network build resilient infrastructure for AI agent launches, and Hive Intelligence supports unified data access.
- AI Agent Training – e.g., our collaboration with FLock.io and Zypher Network leverages our datasets for AI agent deployments on their platforms.
- Specialised AI – e.g., our partnership with Twin3 AI focuses on decentralised digital identities, creating datasets for identity verification AI models; we also work with the Conflux network on performance improvements.
Q7. Which metric(s) do you watch most closely as a leading indicator of Pundi AI’s health and growth—and why?
We focus on metrics that track ecosystem health and sustainability rather than just growth numbers. Pundi AI’s objective is democratising AI and fulfilling the vision of creating one million job opportunities, so we monitor both participation and the quality of the economic opportunities we’re creating.
Q8. You started in payments with Pundi X before pivoting to AI data. What common lessons on decentralisation carried over, and what new challenges surprised you?
Known for our pioneering work in enabling real-world crypto payments since 2017, our foray into the artificial intelligence frontier is a strategic evolution. The core lesson from our payments background was that decentralisation works when users feel genuine ownership and control over their assets.
Powered by the Pundi AIFX OmniLayer, activities are securely recorded on-chain and integrated with IPFS and various blockchains, abstracting away data and smart contract complexities while allowing users to interact across multiple chains such as BNB Chain, Base, and Ethereum.
Q9. As an Erasmus Mundus scholar and former Opera evangelist, how did your early academic and evangelism experiences shape Pundi AI’s mission and culture?
My academic journey as an Erasmus Mundus scholar took me across Europe, living and studying in Sweden and Norway. The Master’s classes in cryptography and other computer science subjects proved immensely useful later in my career.
At Opera and the W3C, I championed interoperability and user empowerment – principles that now define Pundi AI’s mission to democratise AI data ownership. There wasn’t much Asia-focused standardisation work at the W3C then, and I was in the right place at the right time. As someone focused on standardisation, and a Chinese speaker, I had the opportunity to start the first W3C Interest Group on standardising Chinese-language needs, including how Chinese text is displayed across browsers and, at times, the interoperability of Chinese characters with Japanese and Korean ones. It was a great experience.
Just as blockchain removes intermediaries in finance (as we did with Pundi X), Pundi AI leverages token incentives to crowdsource high-quality, real-world data, shifting power from centralised entities to individuals. My experiences taught me that scalable innovation thrives when communities are incentivised to participate, a philosophy embedded in our culture of privacy-first, community-governed AI.
Q10. What was the “aha” moment when you realized AI data quality and provenance would be Pundi AI’s focus—and why that, rather than another AI niche?
The realisation came from understanding that AI’s bottleneck wasn’t algorithms or computation; it was the lack of an ethical data supply chain. Our mission is to empower unbiased global collaboration in AI development by enabling secure, transparent, and privacy-preserving data exchange.
Q11. By your goal, one million people should earn from AI via Pundi AI. What does the roadmap look like to reach that milestone, and what milestones in “people empowered” matter most to you?
Our roadmap to empowering one million people to earn from AI revolves around three key pillars: accessibility, scalability, and community governance.
Q12. How does Pundi AI give real ownership of AI data back to individuals, and why does that matter for the future of AI?
Empowering users to contribute, secure, and manage their AI data as IP is a core component of our mission. Unlike centralised AI platforms, Pundi AI distributes control and ownership among its community, fostering trust and openness.
This shifts AI from extraction to partnership, where contributors retain control and share in the value creation rather than having their data harvested without consent.
Q13. What’s the thinking behind using a bonding curve for Data Pump, and how will it shape participation, pricing, and dataset liquidity going forward?
Data Pump’s bonding curve creates a self-reinforcing cycle: early data contributors earn higher rewards as demand grows, while buyers benefit from predictable, algorithmically adjusted pricing. This mirrors Pundi X’s token incentives – rewarding participation while ensuring liquidity scales organically. By tying dataset value directly to usage, we replace opaque pricing with a transparent, community-governed model where every transaction strengthens the network
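Mechanically, a bonding curve makes the token price a deterministic function of circulating supply, so early participants buy in cheaply and later demand raises the price for everyone. Pundi AI has not published Data Pump’s exact curve, so the linear curve and constant `k` below are illustrative assumptions; a minimal sketch of the mechanism looks like this:

```python
def bonding_curve_price(supply: float, k: float = 0.0001) -> float:
    """Linear bonding curve: spot price rises with circulating supply."""
    return k * supply

def buy_cost(supply: float, amount: float, k: float = 0.0001) -> float:
    """Total cost to mint `amount` tokens when `supply` tokens
    already circulate: the integral of the price curve, i.e. the
    area under price = k * s between `supply` and `supply + amount`."""
    return k * ((supply + amount) ** 2 - supply ** 2) / 2
```

Because the curve is algorithmic, pricing is predictable and transparent: the same buy at a higher supply always costs more, which is exactly the property that rewards early contributors while keeping liquidity available at every point on the curve.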
Q14. You’ve integrated a full OmniLayer stack including the data platform, marketplace, and data tokenization. How do these components work together to create a seamless developer experience?
Powered by Pundi AIFX OmniLayer, activities are securely recorded on-chain and integrated with IPFS and various blockchains, abstracting away data and smart contract complexities while allowing users to interact on chains like Base.
Later this year, a trading platform will launch for both untrained and trained datasets, including but not limited to those from the Pundi AI Data platform. An agent training platform will follow in Q3/Q4 this year, allowing users to train agentic AI using datasets on the Pundi AI Data platform and to market-make agents and their DAO tokens on-chain.
Q15. Defining Openness: How does Pundi AI’s approach to being a “truly open AI ecosystem” differ from others in the market, and what does that mean in practice?
Pundi AI is a decentralised artificial intelligence ecosystem built on blockchain technology. Its mission is to empower unbiased global collaboration in AI development by enabling secure, transparent, and privacy-preserving data exchange.
Unlike centralised AI platforms, Pundi AI distributes control and ownership among its community, fostering trust and openness. It incentivises data contributors, powers governance decisions, and facilitates transactions within the platform.
Q16. In five years, what does a successful “open AI ecosystem” look like for Pundi AI?
Pundi AI’s objective is democratising AI, and fulfilling the vision of creating 1 million job opportunities. Success means establishing a sustainable model where users are rewarded in digital assets for their contributions, and the data they help create goes into training open-source AI agents.
In the years ahead, our partnership model will have created a self-sustaining ecosystem where universities provide education and research, infrastructure partners provide scale, and specialised AI companies provide applications, all built on ethically sourced, community-owned data. The measure of success will be when closed-data AI models become the exception rather than the rule.
-END-
Zac Cheah Bio:
Zac is a blockchain pioneer and tech entrepreneur who was named among the Top Ten Fintech Leaders in 2019 by the Singapore Fintech Association. An Erasmus Mundus scholar, Zac holds a double Master’s degree in Computing and Security from the Royal Institute of Technology in Sweden and the Norwegian University of Science and Technology. Before founding Pundi X, he served as Chief Evangelist Officer of Opera China and chaired a W3C Interest Group.
In his present role as co-founder & core contributor of Pundi AI, Zac is building a truly open AI ecosystem with the goal of empowering 1 million people to earn from AI. Cheah’s expertise spans decentralised AI data platforms, token-economy design, cross-chain interoperability, privacy-preserving frameworks, fintech, and community-driven innovation.