This privacy-first chatbot is taking off – here’s why and how to try it





ZDNET’s key takeaways 

  • DuckDuckGo’s privacy-first chatbot is taking off.
  • Users are increasingly concerned about how their data is used.
  • New features could also be driving growth.

Privacy concerns around chatbots are nothing new, but as AI adoption spreads, users are becoming increasingly aware of the risks. Duck.ai, the chatbot from privacy-focused search and browser company DuckDuckGo, could be benefiting from that.

Also: Stop telling AI your secrets – 5 reasons why, and what to do if you already overshared

New data from Similarweb found web traffic to Duck.ai exploded last month. Duck.ai “reached 11.1 million visits in February, up more than 300% from January,” Similarweb told ZDNET.

DuckDuckGo did not immediately respond to a request for comment.

That number is still small compared to the most dominant chatbots: Similarweb estimates that chatgpt.com reached 5.4 billion visits in February, while gemini.google.com reached 2.1 billion and claude.ai hit 290.3 million. Still, for a service that launched in beta only a year ago, that's a sharp uptick worth keeping an eye on.


Duck.ai extends to users the same privacy they have come to expect from DuckDuckGo's search engine and browser, anonymizing queries to prevent third parties from accessing chats. The chatbot does not run a bespoke LLM; it uses frontier models from Anthropic, OpenAI, and Meta, among others, but calls those providers on your behalf so as not to expose your IP address or other personal information.
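In practice, this works like an anonymizing relay: you talk to Duck.ai's servers, and those servers make the model API call with their own credentials and connection, so the provider only ever sees the relay's IP address. The Python sketch below illustrates that general pattern; the endpoint URL, model name, API key, and response shape are hypothetical placeholders, not DuckDuckGo's actual implementation.

```python
# A minimal sketch of an anonymizing chat relay (illustrative only; not
# DuckDuckGo's actual code). The relay accepts a user's prompt, forwards
# only the prompt text to the model provider from its own servers, and
# returns the reply -- the provider sees the relay's IP, not the user's.

import requests

# Hypothetical provider endpoint and key -- assumptions for illustration.
PROVIDER_URL = "https://api.example-provider.com/v1/chat"
RELAY_API_KEY = "relay-owned-key"  # the relay's own key, never the user's


def relay_chat(user_prompt: str) -> str:
    """Forward a prompt to the model provider without user-identifying data."""
    # Only the prompt text crosses this boundary: no cookies, no account ID,
    # no client IP. The outbound request originates from the relay host.
    payload = {
        "model": "frontier-model",  # whichever provider model the user picked
        "messages": [{"role": "user", "content": user_prompt}],
    }
    headers = {"Authorization": f"Bearer {RELAY_API_KEY}"}
    resp = requests.post(PROVIDER_URL, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()
    # Response shape assumed to follow the common chat-completions format.
    return resp.json()["choices"][0]["message"]["content"]
```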

(Disclosure: Ziff Davis, ZDNET’s parent company, filed an April 2025 lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Also: The permissions behind your AI Chrome extensions deserve a closer look – they may be spying on you

“In addition, we have agreements in place with all model providers that further limit how they can use data from these anonymous requests, including not using Prompts and Outputs to develop or improve their models, as well as deleting all information received once it is no longer necessary to provide Outputs (at most within 30 days, with limited exceptions for safety and legal compliance),” Duck.ai’s privacy policy says.

ZDNET’s Jack Wallen tested Duck.ai last year and found he preferred it over Perplexity at the time. 

Duck.ai’s popularity spike  

So why the sudden jump in traffic? 

Duck.ai offers two main benefits over individual proprietary chatbots like Gemini and ChatGPT: the option to toggle between many models and increased privacy protections. The latter is likely what sets Duck.ai apart, though, as Perplexity also offers access to multiple models through a single interface, in addition to its own Sonar model family.

Some users on Reddit said they enjoy Duck.ai; one poster said "it's way better than Google's," ostensibly referring to Gemini, and that it's the reason they use DuckDuckGo. Many others were lukewarm, calling it "not bad," about as disappointing as other options, or just "better than nothing." Some users dislike that Duck.ai doesn't support document uploads.

Also: Tired of seeing AI images online? DuckDuckGo lets you hide them from results now

One Reddit thread from a few months before the spike is full of users lamenting that Duck.ai is just as disappointing as other chatbots, though one user said it “is decent, particularly if you are privacy focused.” A newer thread includes complaints about usage limits without mentioning the specifics of what keeps users coming back to the chatbot.

Also: Want to know which sites are selling your data? This free privacy tool gave me answers

Positive reviews are harder to parse because Duck.ai supplies access to several frontier models rather than a bespoke option of its own. Those who do enjoy Duck.ai cite the efficacy of the specific models they chose within it, like OpenAI's GPT-5 mini. For at least one user, however, Duck.ai seemed to change how those models respond.

“A friend of mine uses regular chat-gpt and swears he gets better replies than he does with duck.ai,” one poster wrote. “I don’t know if it’s true. Maybe the privacy focused system prompts or whatever else they do in the background does change the answers.” That poster added that they “really enjoy Duck.ai.”

Increased privacy concerns

Last month, Anthropic rejected some proposed applications of its tech for weapons and mass surveillance by the Department of Defense (DoD), which retaliated by cutting its contract. OpenAI quickly swooped in, only to encounter the same debates over use.

Coverage of the incident brought renewed concerns about privacy and AI to the forefront of public conversation. Nathan Calvin, vice president of state affairs and general counsel at advocacy organization Encode AI, told ZDNET that he noticed an uptick in conversations about data brokers, privacy, and how the government obtains data since the contract incident, in both the general public and policy spaces. 

“It’s an issue that’s been around for a while, but I definitely feel like a lot of folks are taking a look at it with fresh eyes and urgency,” he noted, adding that many people “had never heard of Anthropic or Claude” before the DoD story. 

In that light, chatbots that go further to protect user data from both AI companies themselves and the government may look more appealing than before. 

Also: Copilot quietly grabs your data from other Microsoft products now – here’s how to opt out

But according to Similarweb’s graph, Duck.ai started seeing a slight uptick at the end of 2025 that then grew exponentially last month. Duck.ai added image generation to the platform in December; in mid-February, it added real-time, privacy-protected voice chat. Some Reddit users had complained prior to the release that text-to-speech was the only thing missing from their Duck.ai experience, so the release is likely a driver of the spike as well. 

In keeping with the company’s other policies, Duck.ai voice chats are anonymized and not used to train models, and neither DuckDuckGo nor OpenAI (which provides voice support through Duck.ai) stores audio. That said, Duck.ai advises users that their voice “can be a biometric identifier,” which they should consider before trying the feature. 

One user wrote on the r/ChatGPTcomplaints subreddit this month that they were trying Duck.ai "for no other reason than to hopefully rebuild my connection with 4o," referring to GPT-4o, the model OpenAI sunset in ChatGPT in February to the displeasure of many users. Other users in the same thread, however, noted their frustration with OpenAI's DoD contract, which remains a possible driver away from ChatGPT and toward any other chatbot. (Anthropic's Claude overtook OpenAI's ChatGPT as the most downloaded free app in the US directly after the contract dispute began.)

How to try Duck.ai

You can try Duck.ai yourself for free, or pay $10 per month ($100 per year if billed annually) for access to more advanced models.








Do you ever walk past a person on the street showing signs of mental illness and wonder what happened to their family? I have a brother, or at least I used to. I worry about where he is and hope he is safe. He hasn't taken my calls since 2014.

James and his brother as young children playing together before his brother became sick. James is on the right and his brother is on the left.


When I was 13, I had a very bad day. I was in the back of the car, and what I remember most was the world-crushing sound violently banging off every surface: he was pounding his fists into the steering wheel, and I worried it would break apart. He was screaming at me and my mother, and I remember the web of saliva and tears hanging over his mouth. His eyes were red, and I knew this day would change everything between us. My brother was sick.

Nearly 20 years later, I still have trouble thinking about him. By the time we realized he was mentally ill, he was no longer a minor. The police brought him to a facility for the standard 72-hour hold, where he was diagnosed with paranoid delusional schizophrenia. Concluding he was not a danger to himself or others, they released him.

There was only one problem: at 18, my brother told the facility he was not related to us and that we were imposters. When they let him out, he refused to come home.

My parents sought help and even arranged for medication, but he didn’t take it. Before long, he disappeared.

My brother’s decline and disappearance had nothing to do with the common narratives about drug use or criminal behavior. He was sick. By the time my family discovered his condition, he was already 18 and legally independent from our custody.

The last time he let me visit, I asked about his bed. I remember seeing his dirty mattress on the floor beside broken glass and garbage. I also asked about the laptop my parents had gifted him just a year earlier. He needed the money, he said—and he had maxed out my parents’ credit card.

In secret from my parents, I gave him all the cash I had saved. I just wanted him to be alright.

My parents and I tried texting and calling him; there was no response except the occasional text every few weeks. But weeks turned into months.

Before long, I was graduating from high school. I begged him to come. When I looked in the bleachers, he was nowhere to be seen. I couldn’t help but wonder what I had done wrong.

The last time I heard from him was over the phone in 2014. I tried to tell him about our parents and how much we all missed him. I asked him to be my brother again, but he cut me off, saying he was never my brother. After a pause, he admitted we could be friends. Making the toughest call of my life, I told him he was my brother—and if he ever remembers that, I’ll be there, ready for him to come back.

I’m now 32 years old. I often wonder how different our lives would have been if he had been diagnosed as a minor and received appropriate care. The laws in place do not help families in my situation.

My brother has no social media, and we suspect he traded his phone several years ago. My family has hired private investigators over the years, who have also worked with local police to try to track him down.

One private investigator’s report indicated an artist befriended my brother many years ago. When my mother tried contacting the artist, they said whatever happened between them was best left in the past and declined to respond. My mom had wanted to wish my brother a happy 30th birthday.

My brother grew up in a safe, middle-class home with two parents. He had no history of drug use or criminal record. He loved collecting vintage basketball cards, eating mint chocolate chip ice cream, and listening to Motown music. To my parents, there was no smoking gun indicating he needed help before it was too late.

The next time you think about a person screaming outside on the street, picture their families. We need policies and services that allow families to locate and support their loved ones living with mental illness, and stronger protections to ensure that individuals leaving facilities can transition into stable care. Current laws, including age-based consent rules, the limits of 72-hour holds, and the lack of step-down or supported housing options, leave too many families without resources when a serious diagnosis occurs.

Governments and lawmakers need to do better for people like my brother. As someone who thinks about him every day, I can tell you the burden is too heavy to carry alone.

James Finney-Conlon is a concerned brother and mental health advocate. He can be reached at [email protected].


