Consent.
Do you agree to this checkbox. Did you click accept. Was the privacy notice “clear enough”. Did the company “take consent properly”.
And look, consent matters. It is a core concept in modern privacy law. But it is also… kind of tired. And in 2026, it is starting to feel less like protection and more like paperwork. Something you “do” so you can move on and use the app, get the loan, enter the building, buy the ticket.
Which is why the more interesting conversation in Malaysia right now is not “how do we get better consent”, but something closer to.
What does it mean to treat people’s data with dignity.
Not just legality. Not just compliance. Dignity. A word that sounds philosophical until you see how digital systems actually behave when nobody is watching.
This article is about that shift. Where it might go. And what “data dignity” could look like in a Malaysian context, where the Personal Data Protection Act (PDPA) exists, enforcement has been uneven, digital adoption is huge, and AI is now sitting inside everything from hiring to marketing to credit scoring.
The Consent Problem, Honestly
Let’s say you open a food delivery app. It asks for location access. Fine. Then it asks for access to your contacts, your photos, maybe your microphone, who knows. You can deny, sure.
But the real consent moment is usually not a clean free choice. It is.
Accept or don’t use the service.
And that pattern repeats everywhere. Retail loyalty programs. Clinics. E wallets. Property management apps. Schools. Even job applications.
So the first problem is power imbalance. Consent is not truly voluntary when one side controls the gate.
Second problem. People cannot realistically understand what they are consenting to.
Privacy notices are long, full of legal phrasing, sometimes inconsistent with how the product actually works. Even if a notice is “readable”, you still do not know what downstream data sharing looks like. You do not know whether your data is being used to build profiles. Whether it will be combined with other datasets. Whether it will shape what you are offered, what you are charged, or whether you are flagged as “risky”.
Third problem. Consent is a one time ritual for an ongoing relationship.
Data use changes. Businesses pivot. Vendors change. Models get retrained. New analytics tools appear. A company that was “just storing” data suddenly starts “predicting” with it. And your consent from 18 months ago is still being treated as if it covers everything. Because it is convenient.
So yes. Consent is necessary. But on its own, it is not a strong moral foundation for the data economy. It is more like a basic permission slip.
Malaysia is not unique here, by the way. This is a global fatigue. But Malaysia is hitting it at a very specific moment, with AI adoption, cross border data flows, and a public that is more digitally aware than it was even five years ago.
So What Is “Data Dignity”, Actually
“Data dignity” is not a single legal definition in Malaysia today. You will hear different versions depending on who you ask. Lawyers. Policymakers. Civil society. Product people. Researchers.
But the idea is roughly this.
Your data is tied to your personhood. It can affect your life chances. It should be handled in a way that respects you, not just in a way that avoids getting sued.
And that changes the framing.
Instead of “did we get consent”, you ask things like:
- Was this data collection necessary, or just opportunistic?
- Does the person understand what is happening in a meaningful way?
- Are we using the data in ways that could humiliate, exclude, exploit, or manipulate?
- Are we creating hidden scoring systems that people cannot contest?
- If something goes wrong, can the person get a remedy, quickly?
- Are we building systems that treat certain groups as inherently suspicious?
Data dignity is about outcomes and relationships, not paperwork.
It is also about respecting limits. Even if you can legally collect something, should you. Even if you got consent, is the use fair.
That is the big leap.
The Malaysian Context: PDPA, Progress, and Gaps
Malaysia’s PDPA has been around since 2010, enforced from 2013. It covers personal data in commercial transactions and lays out principles like notice and choice, disclosure limits, security, retention, data integrity, and access and correction.
Those principles are good. They are not nothing.
But the PDPA also has limitations that become more obvious as digital systems get more complex.
- Coverage gaps. PDPA focuses on commercial transactions and has exemptions. That creates uneven expectations across sectors and situations.
- Consent centric behavior. In practice, many organizations treat PDPA compliance as “we have a notice and a checkbox”. The deeper principles, like retention limits and data minimisation, do not always get the same attention.
- Enforcement and culture. A law can be solid on paper but still feel optional if there are few visible consequences. Over time, a compliance culture forms around what is likely to be checked, not what is truly right.
- AI and profiling pressure. PDPA was written before the current wave of automated decision making. The principles still apply, but the law was not designed with large scale inference, prediction, and continuous profiling in mind.
What “data dignity” does is expose those weak points. It asks for more than baseline compliance, especially where data is used to influence decisions about people.
Why This Shift Is Happening Now
A few forces are pushing Malaysia toward a dignity based conversation, even if nobody calls it that formally.
1. People are seeing the consequences of data use
Data breaches are one part of it, sure. When your phone number leaks, you get the scam calls, the fake delivery messages, the impersonation attempts. That pain is immediate.
But the more subtle part is also growing. People are noticing hyper targeted ads that feel creepy. They are noticing pricing differences. They are noticing “you are not eligible” messages with no explanation. They are noticing job platforms filtering them out.
Once you feel like you are being managed by invisible data systems, consent starts to feel irrelevant.
2. AI is making “secondary use” the main use
In many organizations, the value of data is no longer the primary service. It is what you can infer later.
Your transaction history becomes a proxy for financial stress. Your location becomes a proxy for lifestyle. Your browsing becomes a proxy for intent. Your social graph becomes a proxy for influence.
Consent was designed for collection and straightforward use. AI turns everything into raw material for new uses.
3. Cross border business forces higher expectations
Malaysia trades and collaborates globally. Companies operating across regions often face stricter customer expectations, procurement requirements, and partner audits. Even when local law is less demanding in a specific area, international business norms can pull local practice upward.
And honestly, consumers do the same. People compare. They hear about stronger rights in other places. They start asking why things here feel so opaque.
4. There is a quiet reputational cost now
A decade ago, most users would shrug. Now a brand that mishandles data can get dragged for weeks. Not always fairly, but it happens.
Reputation becomes a governance tool when regulation is slow, or when enforcement is not consistent. That reputational pressure pushes companies toward principles like fairness, transparency, and restraint. Which is basically the dignity conversation, just in business language.
What Data Dignity Looks Like in Practice (Not Abstract)
If you are thinking, okay, nice idea. But what do you actually do differently.
Here are the concrete moves. This is where it stops being theory.
Collect less, on purpose
Data dignity starts with restraint.
Do you really need IC numbers for everything. Do you need date of birth or just age band. Do you need exact GPS or just rough area. Do you need persistent identifiers, or can you use short lived tokens.
In Malaysia, a lot of systems still default to over collection. Because it is easier. Because “maybe we will need it later”. Dignity says no, you do not take what you cannot justify.
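The restraint above can be made mechanical. Here is a minimal Python sketch, with hypothetical function names, of trading exact values for coarser ones: an age band instead of a date of birth, rounded coordinates instead of exact GPS, a short-lived token instead of a persistent identifier.

```python
import secrets
from datetime import date

def age_band(dob: date, today: date) -> str:
    """Reduce a full date of birth to a coarse ten-year age band."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def coarse_location(lat: float, lon: float, places: int = 2) -> tuple:
    """Round GPS to roughly 1 km precision instead of storing exact coordinates."""
    return (round(lat, places), round(lon, places))

def session_token() -> str:
    """Short-lived random token instead of a persistent identifier."""
    return secrets.token_urlsafe(16)

print(age_band(date(1990, 6, 15), date(2026, 1, 1)))  # → "30-39"
print(coarse_location(3.1390, 101.6869))              # → (3.14, 101.69)
```

None of this is hard. The point is that coarsening is a default you choose at design time, not a cleanup you do after a breach.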
Make consent granular and reversible
Not five toggles hidden behind three menus. Real choices.
Separate essential data from optional data. Separate service delivery from marketing. Separate marketing from third party sharing. Make withdrawal as easy as giving consent, not a customer service maze.
And when consent is withdrawn, treat it seriously. Stop using. Stop sharing. Stop training on it where feasible, or at least have clear policies that people can understand.
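One way to keep consent granular and reversible is to model it as one record per user per purpose, with withdrawal as a normal state change rather than a support ticket. A minimal sketch; the purpose names and field layout here are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One record per user per purpose, e.g. "service_delivery",
    "marketing", "third_party_sharing" (hypothetical purpose names)."""
    user_id: str
    purpose: str
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal is a first-class operation, kept in the record itself.
        self.withdrawn_at = datetime.now(timezone.utc)

def may_process(records: list, purpose: str) -> bool:
    """Check the current, per-purpose consent state before any use."""
    return any(r.purpose == purpose and r.is_active() for r in records)
```

The design choice that matters: every use of the data calls `may_process` at the time of use, so withdrawing consent actually changes behavior instead of updating a forgotten flag.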
Explain the “why”, not just the “what”
A dignity approach to transparency is not a 12 page privacy notice.
It is a short explanation at the moment it matters.
Why are we asking for this. What happens if you say no. Who will see it. How long will we keep it.
That is it. If you can’t explain it simply, it is a sign the use is too complex, or you are not proud of it.
Don’t hide behind “we don’t sell data”
This line is everywhere now. And it is often technically true while being practically misleading.
You might not “sell” data, but you might share it. You might “match” it. You might “enrich” it. You might let ad tech partners siphon it. You might give it to a vendor that uses it for their own models.
Data dignity means you describe the reality, not the comforting version.
Build systems people can contest
This is a huge one for Malaysia as automated decisions spread.
If a system denies someone a benefit, blocks an account, flags a transaction, rejects a job application, reduces visibility, or changes pricing.
There should be a path to challenge it. A real path. With humans involved. With time limits. With reasons given at an appropriate level.
Even if you cannot fully reveal proprietary logic, you can still provide meaningful reasons and a process.
Dignity without remedy is just branding.
Treat sensitive contexts as different
Not all data is equal, and not all moments are equal.
Healthcare data, children’s data, financial hardship signals, domestic abuse indicators, religious and political inferences, biometric identifiers. These contexts require extra care.
Malaysia is multi religious, multi ethnic, and politically diverse. That makes inference especially risky. A dignity lens would avoid building or buying systems that infer sensitive traits unless there is a very strong, socially justified reason.
And even then. Guardrails.
Pay attention to “group harms”, not just individuals
A lot of privacy compliance is individualistic. Access requests. Corrections. Consent.
But data systems often harm groups. A model that flags certain neighborhoods more. A fraud system that blocks migrant workers disproportionately. A hiring filter that screens out candidates from certain universities.
Data dignity includes fairness and anti discrimination thinking. It forces audits, bias testing, and impact assessments that look at patterns, not just complaints.
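A first-pass pattern audit does not need heavy tooling. Here is a sketch of a simple group-level disparity screen; the threshold and the “four-fifths” style rule are illustrative starting points, not a legal standard.

```python
from collections import defaultdict

def group_rates(decisions):
    """decisions: list of (group_label, approved: bool).
    Returns the approval rate per group."""
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def flag_disparity(rates, threshold=0.8):
    """Flag groups whose approval rate falls below `threshold` times the
    best group's rate, a common first-pass fairness screen."""
    best = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * best]
```

A flag here is not a verdict. It is a prompt to look at the pattern before a complaint forces you to.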
The Business Side: Why Companies Should Care (Even If They Don’t Want To)
Some leaders still see privacy as a cost center. That mindset is slowly getting expensive.
Because the cost is not just regulatory penalties. It is:
- Customer churn when trust is broken.
- Higher fraud and scam exposure when data handling is sloppy.
- Partner risk when you cannot pass due diligence.
- Internal inefficiency when data is scattered and uncontrolled.
- AI quality issues when you train models on junk, or on data you cannot properly govern.
Data dignity, framed well, becomes operational discipline. Clean data practices. Clear retention. Vendor control. Access logging. Purpose limitation.
It sounds boring. It is. But boring is what you want when you are dealing with personal data.
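Clear retention, for instance, can be as unglamorous as a scheduled sweep. A sketch with hypothetical record types and periods; real retention periods come from policy and law, not code.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per record type.
RETENTION_PERIODS = {
    "marketing_profile": timedelta(days=365),
    "transaction_log": timedelta(days=7 * 365),
}

def is_expired(record_type: str, created_at: datetime, now: datetime) -> bool:
    """A record past its retention period should be deleted or anonymised."""
    return now - created_at > RETENTION_PERIODS[record_type]

def sweep(records, now):
    """Partition records into (keep, purge) against the retention policy."""
    keep, purge = [], []
    for rec in records:
        target = purge if is_expired(rec["type"], rec["created_at"], now) else keep
        target.append(rec)
    return keep, purge
```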
What Regulators and Policymakers Could Push Next
If Malaysia is moving toward data dignity, the next steps are not just “more consent forms”. They look more like:
- Stronger expectations around data minimisation and retention in practice, not just as a principle.
- Clearer rules or guidance around profiling and automated decision making, especially where decisions have significant effects.
- More visible enforcement, because norms follow signals.
- Higher accountability for vendors and processors, since modern data ecosystems are supply chains.
- Requirements for impact assessments in high risk processing, like biometrics, surveillance-style tracking, or large scale scoring.
This does not require importing another country’s law word for word. Malaysia can build its own approach, tied to local realities. But the direction matters.
Toward outcomes. Toward fairness. Toward human respect.
A Small Reality Check
“Data dignity” can also be co opted. Companies can slap the word on a slide deck and keep doing the same things.
So the test is simple.
Does the person have less exposure, less manipulation, less uncertainty, and more control.
Do people understand what is happening to them. Can they say no without being punished. Can they get help when something goes wrong.
If the answer is yes, dignity is real. If not, it is just another privacy slogan.
Where This Leaves Us
Consent is not going away in Malaysia. It will still matter, legally and practically.
But consent alone is too thin for what data is doing now. Data is shaping access. Shaping pricing. Shaping opportunity. Shaping how institutions treat people. Quietly.
So the next phase, the more necessary phase, is dignity.
Collect less. Explain better. Share less. Keep it shorter. Build appeals. Audit for bias. Respect sensitive contexts. Stop pretending a checkbox equals understanding.
Beyond consent, basically.
And Malaysia, whether by policy, market pressure, or sheer public fatigue, seems to be heading in that direction. Not smoothly. Not all at once. But it is happening.
FAQs (Frequently Asked Questions)
What is the main limitation of consent in Malaysia’s current privacy framework?
Consent in Malaysia often feels like a one-time ritual where users must accept terms to access services, but it doesn’t always represent a truly voluntary or informed choice due to power imbalances and complex privacy notices.
How does ‘data dignity’ differ from traditional consent-based data protection?
‘Data dignity’ emphasizes respecting individuals’ personhood by handling their data with care beyond legal compliance, focusing on fairness, transparency, and avoiding harm or exploitation, rather than just obtaining consent.
What are some challenges with Malaysia’s Personal Data Protection Act (PDPA) regarding modern digital practices?
The PDPA has coverage gaps, focuses heavily on consent checkboxes, lacks strong enforcement culture, and wasn’t designed to address AI-driven profiling and continuous automated decision-making prevalent today.
Why is there a growing need for a shift towards ‘data dignity’ in Malaysia now?
With increased digital adoption, AI integration in services like hiring and credit scoring, and heightened public awareness about data misuse consequences, Malaysia is moving towards treating data with dignity rather than just legal compliance.
What questions should organizations ask to ensure they respect ‘data dignity’?
Organizations should consider if data collection is necessary, if users truly understand data use, whether usage could humiliate or exploit individuals, if hidden scoring systems exist without recourse, and if remedies are available when issues arise.
How does the concept of ‘data dignity’ impact the future of data privacy conversations in Malaysia?
It shifts the focus from mere paperwork and consent towards building trustful relationships that prioritize ethical data use, fairness, transparency, and accountability in an increasingly AI-driven Malaysian digital economy.

