AI vs Doctor Dilemma
The growing debate: Can AI chatbots like Gemini and ChatGPT eventually replace dermatologists?
(Disclaimer: The views expressed here are my personal opinions intended for public awareness and discussion. They do not constitute medical advice. Readers are encouraged to consult qualified healthcare professionals for individual medical concerns. Read this article ad-free on Patreon.)
After a short winter break, I was eager to write about this topic, and the timing could not be better: with OpenAI's recent health model release making headlines as we begin 2026, this is the ideal moment to dissect AI's role in healthcare. Recently, I've noticed an interesting trend among some of my younger patients: their medical questions during consultations have become noticeably more refined and well informed. When I asked a few of them how they prepared, many candidly admitted to using AI chatbots. Curious to understand how widespread this behaviour might be, I asked my Instagram followers - many of whom are also my patients - whether they use AI chatbots for medical advice, and if so, which ones.
Most said yes.
Public trends: 78% of surveyed users have sought medical advice from an AI, with ChatGPT being the most common choice.
That piqued my curiosity, so I took a close look at the terms and conditions of one of the most widely used chatbots - the very documents most of us agree to without ever reading.
Here’s what I found.
What AI Chatbots Say in Their Fine Print
Do not rely on AI for professional medical advice. The terms explicitly state that users should not treat chatbot outputs as a substitute for consultation with qualified healthcare professionals.
No use for health decisions. The fine print prohibits using AI-generated information for any decision that could have a legal or medical impact - including diagnosis or treatment.
Information may be inaccurate or incomplete. AI output isn't guaranteed to be accurate, complete, or appropriate. It can be misleading or wrong, and users bear responsibility for verifying the information.
Use at your own risk. These tools are provided "as is," with no warranties on correctness or outcomes. Essentially, you are responsible for any consequences of relying on their responses.
So, just like the cigarette pack that says “Smoking is injurious to health,” or the disclaimer on trading apps that warns against F&O speculation, the warnings are clear, yet often ignored. Not every smoker faces life‑threatening harm, but when it does occur, it can alter that person’s life and their family’s future forever. Similarly, ignoring the limitations of AI in medical decision‑making may not cause issues for everyone, but when it does, the consequences can be serious and far‑reaching. That made me want to understand why people still feel the need to use AI despite being aware of these disclaimers.
Why Patients Still Use AI
When I spoke with the subset of my patients who use AI, the most common reasons were:
1. Convenience. AI chatbots are instantly available, day or night - no appointments, no waiting rooms.
2. Accessibility issues. It's easy to see the emotional appeal: if someone is anxious at midnight with a troubling symptom, an AI chatbot can feel like a reassuring lifeline - immediate, patient, and available when no one else is.
And truthfully, that’s understandable. Points 1 and 2 seem reasonable given the current healthcare scenario in India, where affordability and access remain major determinants of how people seek medical help.
In a private setup, a consultation can cost anywhere from USD 2 to USD 20 per visit (at roughly ₹90 to the US dollar), depending on where you live and the experience of the dermatologist you visit. For context, the average Indian earns less than USD 2,500 per year, making even basic outpatient visits a financial consideration. In the government sector, waiting times are notoriously long, and the doctor-patient interaction often lasts less than five minutes. Faced with these realities, turning to a free, always-on AI chatbot can feel like a practical alternative.
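To put that in perspective, here is a rough back-of-the-envelope calculation using the approximate figures above (both the exchange rate and the income estimate are rounded, so treat the result as indicative rather than exact):

USD 2 to 20 per visit × ₹90 per USD ≈ ₹180 to ₹1,800 per visit
USD 2,500 per year × ₹90 per USD ≈ ₹225,000 per year, or about ₹18,750 per month
₹1,800 ÷ ₹18,750 ≈ 9.6%

In other words, a single top-end private consultation can approach a tenth of an average month's income.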
3. Sensitive or shy questions. Many seek AI advice for concerns they hesitate to discuss openly - such as sexual health or private skin issues - because AI provides a sense of privacy and judgment-free comfort.
The Other Side of AI Convenience
However, that very sense of “privacy” may not be absolute. When users share personal medical details, images, or symptoms, that information is typically stored on remote servers. These inputs may be used to train AI models or personalise responses, unless users proactively opt out - something few people actually do.
This raises genuine data security and confidentiality concerns. Sensitive images or private health questions could be exposed if user accounts or platform servers are ever compromised. What feels like a safe, private chat might, in fact, leave behind a permanent digital trail.
Here are the key points from the Terms of Use and Privacy Policy that inform users about how their data is stored and used, particularly for services like ChatGPT (individual/consumer use):
From Terms of Use
Use of Your Content: OpenAI may use your Content (Inputs like prompts and Outputs generated by the service) to "provide, maintain, develop, and improve our Services, comply with applicable law, enforce our terms and policies, and keep our Services safe."
Model Training: Your Content may be used specifically to train and improve models. However, users can opt out: "If you do not want us to use your Content to train our models, you can opt out by following the instructions" (linked to their data controls article). Opting out may limit personalised performance.
Privacy Reference: The Terms state that the Privacy Policy "explains how we collect and use personal information" and recommend reading it.
From Privacy Policy
Collection and Storage: OpenAI collects and stores User Content (prompts, files, images, etc. submitted to ChatGPT), account information, usage data (e.g., interactions, logs, device info), and other data from service use. Data is stored as needed for providing services and legitimate purposes.
Retention: Personal data is retained "only as needed" based on purpose, sensitivity, risk, and legal requirements. For example, temporary chats in ChatGPT are kept up to 30 days for safety review and then deleted (they don't appear in history).
Usage Purposes:
To provide, analyse, and maintain services (e.g., generating responses).
To improve and develop services, including research and new features.
Specifically: "Content provided may be used to improve Services, such as training models powering ChatGPT."
For security, fraud prevention, legal compliance, and communication.
De-identified or aggregated data may be used for analysis and research.
Sharing: Data may be shared with vendors (e.g., cloud providers), affiliates, in business transfers, or with authorities for legal reasons. Shared conversations (e.g., via links) are visible to others.
Opt-Out for Training: Users can opt out of having their Content used for model training/improvement via account settings or the instructions provided. (At the time of writing, this option appears under Settings > Data Controls in ChatGPT, though menu names may change.)
Where Doctors Still Make the Difference
Real experience vs. digital diagnosis: Why a doctor's clinical judgment remains the gold standard.
There’s no denying the usefulness of AI for general understanding or reassurance after a consultation. However, in healthcare - especially dermatology - the human touch remains irreplaceable.
Dermatological diagnosis is visual and tactile. A doctor sees the lesion in three dimensions, feels its texture, and assesses it in context. AI can only interpret two-dimensional images.
Skin of colour behaves differently. As medical trainees in India, we were taught to refer to Indian textbooks and atlases because diseases often appear radically different on darker skin compared to Caucasian skin, which is predominantly represented in books written by Western authors. For instance, lichen planus, a common skin condition, appears violet on fair skin but takes on a blackish hue on darker skin - a striking example of how the same disease can look entirely different based on skin tone. Most Western AI tools today are not trained on such diverse datasets, limiting their diagnostic accuracy for subcontinental skin types.

*Visualising Lichen planus: Comparison of how the condition presents on light vs. dark skin tones. Like self-driving cars that still struggle to replace human depth perception, AI in dermatology lacks the sensory nuance of experienced clinical eyes and hands.
And unlike AI platforms, which may log your queries, images, and usage data, a doctor has access only to the information you choose to share - and is bound by professional confidentiality.
The Right Way to Use AI in Healthcare
AI can be a helpful learning companion - to understand complex topics in simple language or to clarify what you’ve already discussed with your doctor. But it should never replace professional medical evaluation. The risks include:
Inaccurate or hallucinated information.
Lack of context about your personal medical history.
Potential for harmful self-diagnosis or treatment delays.
In Summary
Think of AI as a medical encyclopedia with conversational abilities - handy and informative, but not diagnostically reliable. Your doctor, on the other hand, brings context, judgement, and empathy - the dimensions that make care truly safe and personalised.
AI can inform you.
But only your doctor can treat you.
🛠️Check for updates to this article (T13)
👉🏾Visit www.drsubramaniant.com for more Articles [FREE]
Content and Figure License Notice
This article incorporates original figures and images created by Dr Subramanian T, MD, released under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license. You may share and adapt these materials for any purpose (including commercial use) provided you give proper attribution and license any adaptations under the same terms.
https://creativecommons.org/licenses/by-sa/4.0/
Required Attribution Format for Reuse:
Author: Dr Subramanian T, MD
Source:
Title: [Insert exact article title]
URL: [Insert exact article URL]
License: CC BY-SA 4.0
Figure List:
Fig 1: ai-vs-human-expertise-debate.jpg
Fig 2: ai-medical-advice-survey-data.jpg
Fig 3: doctor-experience-vs-ai-chatbot.jpg
Fig 4*: lichen-planus-skin-tone-comparison.jpg
Third-Party Image Credits (Fig. 4):
Image 4a (Left): "Lichen planus on leg" by Warfieldian, CC BY-SA 3.0
Image 4b (Right): "Lichen planus (new photo for diagnosis)" by Masryyy, CC BY-SA 4.0
Trademark Notice:
ChatGPT® (OpenAI) and Gemini™ (Google LLC) logos are trademarks of their respective owners, used here under nominative fair use for identification and commentary purposes only.
Notes:
Images marked with * are not created by the author and are credited individually above.
Third-party content is used under "fair dealing" provisions where applicable. Obtain separate permissions for reuse.

