High-Level Overview
See-Mode Technologies is a MedTech startup developing AI-powered software for automated analysis and reporting of thyroid and breast ultrasound images. It primarily serves radiologists, sonographers, and clinicians, aiming to enhance diagnostic accuracy, reduce reporting time, and improve workflow efficiency.[1][2][3][4] The company addresses key challenges in medical imaging, such as inter-operator variability, missed nodules/lesions, and time-intensive reporting, by using deep learning to generate standardized reports based on TI-RADS (thyroid) and BI-RADS (breast) criteria.[1][3][4] Initially focused on stroke prediction via computational modeling, the company pivoted to ultrasound reporting solutions and built strong momentum: regulatory approvals across the US (FDA clearance for thyroid), Canada, Australia, New Zealand, and Singapore, plus a June 2025 acquisition by RadNet's DeepHealth division, which is integrating the technology into broader population health platforms and reports scan time reductions of up to 30% in real-world deployments.[2][3][4]
Origin Story
See-Mode Technologies was founded by Sadaf Monajemi, PhD, and Milad Mohammadzadeh (both Co-founders & Directors), who built a team blending scientists, engineers, clinicians, and machine learning experts, headquartered in Singapore with operations in Australia.[1][2][5] The idea emerged from applying cutting-edge deep learning and computational modeling to medical images, initially targeting stroke prediction, a leading global cause of death, to help doctors optimize treatments.[1] Early traction came through regulatory milestones, including 2023 approvals in Australia and New Zealand for breast and thyroid ultrasound tools, followed by FDA clearance for SMART-T (See-Mode Augmented Reporting Tool, Thyroid) in 2024, and partnerships with institutions such as RadNet, Duke University radiologists, and I-MED Radiology.[2][3][4][5] Backed by APAC VCs including MassMutual Ventures, Blackbird Ventures, Cocoon Capital, and SGInnovate, the company scaled to commercial deployment before its 2025 acquisition by RadNet, a pivotal validation of its technology.[1][2]
Core Differentiators
- Automated Detection and Reporting: Uses machine learning to localize nodules/lesions (e.g., ROIs on thyroid images), characterize them with lexicon-based descriptors (ACR TI-RADS for thyroid, BI-RADS for breast), and generate preliminary reports, reducing reporting time and the risk of missed or inconsistent findings.[3][4]
- Workflow Efficiency: Web-based, stand-alone software accelerates scans (up to 30% reduction observed at RadNet centers), enhances diagnostic confidence, and ensures consistency across operators.[2][4]
- Regulatory and Clinical Validation: FDA-cleared (thyroid), approved in multiple regions; real-world proof via deployments at 12+ RadNet sites and endorsements from clinical advisors like Duke's Dr. Benjamin Wildman-Tobriner and Butterfly Network's Dr. John Martin.[2][3][5]
- Integration-Ready: Post-acquisition, it embeds into DeepHealth's AI portfolio for breast, lung, prostate, and brain imaging, positioning the technology for high-volume care at scale.[2]
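The TI-RADS automation described above is based on the ACR scoring scheme, in which points assigned to five sonographic features sum to an overall risk level (TR1–TR5). As an illustrative sketch of that scoring logic only (not See-Mode's actual implementation), the mapping can be expressed as:

```python
# Illustrative ACR TI-RADS point scheme (per the ACR lexicon); this is a
# hypothetical sketch, not See-Mode's software.
POINTS = {
    "composition": {"cystic": 0, "spongiform": 0, "mixed": 1, "solid": 2},
    "echogenicity": {"anechoic": 0, "hyperechoic": 1, "isoechoic": 1,
                     "hypoechoic": 2, "very_hypoechoic": 3},
    "shape": {"wider_than_tall": 0, "taller_than_wide": 3},
    "margin": {"smooth": 0, "ill_defined": 0, "lobulated": 2,
               "irregular": 2, "extrathyroidal_extension": 3},
}
# Echogenic foci are additive: every applicable finding contributes points.
FOCI_POINTS = {"none": 0, "comet_tail": 0, "macrocalcifications": 1,
               "peripheral_calcifications": 2, "punctate": 3}

def tirads_level(composition, echogenicity, shape, margin, foci):
    """Sum feature points and map the total to a TR1-TR5 risk level."""
    total = (POINTS["composition"][composition]
             + POINTS["echogenicity"][echogenicity]
             + POINTS["shape"][shape]
             + POINTS["margin"][margin]
             + sum(FOCI_POINTS[f] for f in foci))
    if total == 0:
        level = "TR1"  # benign
    elif total <= 2:
        level = "TR2"  # not suspicious
    elif total == 3:
        level = "TR3"  # mildly suspicious
    elif total <= 6:
        level = "TR4"  # moderately suspicious
    else:
        level = "TR5"  # highly suspicious
    return total, level

# Example: a solid, hypoechoic, wider-than-tall nodule with smooth margins
# and no echogenic foci totals 4 points -> TR4.
print(tirads_level("solid", "hypoechoic", "wider_than_tall", "smooth", ["none"]))
```

In a system like See-Mode's, the descriptor inputs would come from the deep learning models rather than manual entry; the value of automating this step is that the point arithmetic and level cutoffs are applied identically across operators, which is the consistency benefit cited above.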
Role in the Broader Tech Landscape
See-Mode rides the AI-in-medical-imaging wave, where deep learning addresses radiologist shortages, rising imaging volumes, and precision demands amid growing detection needs for conditions such as thyroid cancer and breast lesions.[1][2][4] Timing aligns with post-2023 regulatory accelerations (FDA, ANZ, etc.) and 2025 consolidations like RadNet's acquisition, fueled by market forces such as aging populations, ultrasound's cost-effectiveness relative to MRI/CT, and AI's proven ROI in workflows.[2][3] It influences the ecosystem by standardizing reporting (TI-RADS/BI-RADS automation), enabling population health insights via DeepHealth integration, and partnering with leaders like Monash Health and Stanford, thus lowering barriers to adoption in regions such as APAC.[1][2][5]
Quick Take & Future Outlook
Post-acquisition, See-Mode's technology will expand within RadNet/DeepHealth as a core ultrasound engine, driving new AI solutions for diverse anatomies and fueling revenue growth in outpatient imaging.[2] Trends like multimodal AI (combining ultrasound with other modalities) and regulatory harmonization will amplify its reach, evolving its influence from standalone innovator to embedded platform leader in value-based care. As AI reshapes diagnostics, See-Mode exemplifies how targeted MedTech unlocks imaging's hidden insights, delivering the efficient, accurate tools clinicians need, from stroke risk assessment to cancer detection.[1][2][4]