LinkedIn Data and AI Training: A Growing Privacy Concern

1 October 2025


The Dutch privacy authority (Autoriteit Persoonsgegevens, AP) recently advised LinkedIn users to opt out of the platform’s new policy that allows personal data to be used for training artificial intelligence (AI) systems (AP, 2025). This move raises an important question: at what point does innovation cross the line into exploitation? Tech companies often frame data collection as progress, but in reality, users are rarely given meaningful choice or control over how their information is used.

Once information is fed into AI systems, removal becomes nearly impossible. The AP explicitly warned that individuals risk losing control over their data (AP, 2025). This is not limited to LinkedIn. Companies like Apple, Nvidia, and Anthropic have also used online content, including social media and video platforms, to train their systems, raising similar questions about transparency and permission (Gilbertson & Reisner, 2024).

The legal situation remains uncertain. In the United States, courts have begun to reassess how copyright and fair use apply to AI training, as shown in the recent Thomson Reuters v. Ross Intelligence case, where the court reversed its own earlier ruling (Levi et al., 2025). These disputes illustrate how quickly AI has outpaced existing legal frameworks, leaving users' rights exposed.

Beyond privacy and law, there are broader social risks. If companies train models on datasets skewed toward particular groups or perspectives, the outputs will reflect and amplify those biases (Dilmegani, 2025). That means decisions in hiring, finance, or even healthcare could be influenced by flawed or unrepresentative data.

The central issue is one of accountability. Companies argue that data-driven training is necessary for innovation, but innovation cannot come at the expense of trust and fairness. Opt-outs and transparency should be standard practice, not hidden in settings menus. Without stronger safeguards, AI risks being built on practices that exploit rather than respect the individuals who provide the data.

Reading and researching this made me think more critically about the AI tools I use daily, such as ChatGPT, Claude, and Perplexity. It highlighted the importance of knowing where models get their data and the potential consequences for privacy and fairness. If I were designing one, I would make it clear what data is used and give people an easy way to opt out. This experience has made me more cautious and intentional about how I interact with AI tools, and more aware of how quietly AI systems can shape our privacy and fairness.

References:

AP. (2025). AP bezorgd over AI-training LinkedIn en roept gebruikers op om instellingen aan te passen [AP concerned about LinkedIn AI training and urges users to adjust their settings]. Autoriteit Persoonsgegevens. https://autoriteitpersoonsgegevens.nl/actueel/ap-bezorgd-over-ai-training-linkedin-en-roept-gebruikers-op-om-instellingen-aan-te-passen

Levi, S. D., Feirman, J., Ghaemmaghami, M., & Morgan, S. N. (2025). Court reverses itself in AI training data case. Skadden. https://www.skadden.com/insights/publications/2025/02/court-reverses-itself-in-ai-training-data-case

Dilmegani, C. (2025). Bias in AI: Examples and 6 Ways to Fix it. AIMultiple. https://research.aimultiple.com/ai-bias/

Gilbertson, A., & Reisner, A. (2024). Apple, Nvidia, Anthropic used thousands of swiped YouTube videos to train AI. WIRED. https://www.wired.com/story/youtube-training-data-apple-nvidia-anthropic/


Palantir: Data Integration or Digital Dependency?

17 September 2025


Palantir is one of those companies that splits opinion. Some see it as a pioneer in data integration, others as a black-box operator with too much power. What is clear is that the firm has shifted from its original roots in counterterrorism toward becoming a central player in enterprise and public-sector data strategy. That transition has produced a few real-world cases worth thinking about.

One of the most cited examples is Airbus. Since 2017, Airbus has been using Palantir’s technology to run its Skywise platform (Airbus, n.d.). By connecting everything from supply chains to maintenance records, Airbus reports a 33 percent increase in production for its A350 aircraft (Palantir, n.d.). That scale of improvement shows why companies turn to Palantir: the ability to combine messy, siloed data into a single operational backbone. The same logic applies in healthcare. During the pandemic, the UK’s National Health Service (NHS) worked with Palantir to create a COVID-19 Data Store, bringing together fragmented data to manage resources more effectively (NHS England, n.d.). More recently, Palantir secured a £330 million contract to run the NHS’s new Federated Data Platform, which aims to link patient and operational data across trusts (Booth, 2025).

These successes raise hard questions. Once a company builds its daily operations on Palantir's systems, can it ever leave without serious disruption? The lock-in risk is real: you are not just outsourcing IT, you are outsourcing decision logic. Then there is the matter of transparency and trust. Palantir's history with intelligence agencies and policing continues to shape public perception. Many see handing sensitive healthcare or industrial data to a company with this track record as a risk, despite the clear efficiency gains.

The open debate is whether Palantir becomes the Microsoft of enterprise data or whether concerns over lock-in, transparency, and ethics ultimately limit its reach. In my view, the technology demonstrates clear strategic value, but unless Palantir can build stronger trust, its long-term role will remain uncertain.

References:

Airbus. (n.d.). Skywise Core [X]. Airbus Aircraft. https://aircraft.airbus.com/en/services/enhance/skywise-data-platform/skywise-core-x

Booth, R. (2025, July 8). Palantir accuses UK doctors of choosing ‘ideology over patient interest’ in NHS data row. The Guardian. https://www.theguardian.com/technology/2025/jul/08/palantir-technology-uk-doctors-patient-nhs-data

NHS England. (n.d.). NHS COVID-19 Data Store. https://www.england.nhs.uk/contact-us/privacy-notice/how-we-use-your-information/covid-19-response/nhs-covid-19-data-store/

Palantir. (n.d.). Impact | Airbus and Skywise. https://www.palantir.com/impact/airbus
