
Tathleel SignSense

Real-time sign language recognition and translation for classrooms, clinics, service desks, and public-sector teams that need communication to feel immediate.

Arabic SL: research-focused recognition
< 250 ms: designed for instant response
API-ready: built for institutional workflows
Platform

AI that turns visual language into shared understanding.

SignSense combines computer vision, multimodal language models, and Arabic sign language research to detect gestures, interpret context, and deliver readable translations in real time.

The platform is shaped for everyday environments: a reception desk, a lecture hall, a telehealth appointment, or a government service counter where communication needs to be fast and respectful.

Privacy-aware deployment
Arabic-first translation
Edge and cloud inference
Solutions

Recognition, translation, and interfaces in one product layer.

Each module can stand alone or connect through a shared API for teams building accessible digital and physical services.
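As an illustration only (the SignSense API is not public, so the endpoint path, field names, and options below are hypothetical), a shared-API integration might look like:

```python
import json

# Hypothetical request body for a combined recognition + translation call.
# Endpoint path, field names, and module names are illustrative assumptions,
# not the documented SignSense API.
request = {
    "endpoint": "/v1/translate",
    "video_stream": "rtsp://kiosk-07/camera",   # placeholder video source
    "target_language": "ar",                    # Arabic-first output
    "modules": ["recognition", "translation"],  # modules can also run alone
}

# A response a service team might consume: translated text plus a
# confidence score from the recognition module (also an assumed schema).
response = {
    "text": "كيف أقدر أساعدك؟",
    "confidence": 0.91,
    "latency_ms": 180,
}

def usable(resp, min_confidence=0.8):
    """Gate low-confidence output before showing it on a service-desk screen."""
    return resp["confidence"] >= min_confidence

print(json.dumps(request, ensure_ascii=False))
print(usable(response))
```

The confidence gate is one way a reception desk or kiosk could decide whether to display a translation or ask the signer to repeat.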


Gesture Recognition

Detects hand shape, movement, facial expression, and signing tempo with confidence scoring.

Computer vision
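A sketch of what a confidence-scored recognition result could look like. The record fields and the scoring rule are assumptions for illustration, not the product's actual schema or method:

```python
from dataclasses import dataclass

# Hypothetical result record for one recognized sign. Field names are
# illustrative; the shipped schema is not public.
@dataclass
class GestureResult:
    gloss: str               # recognized sign label
    hand_shape: float        # per-channel confidence scores, 0..1
    movement: float
    facial_expression: float

    @property
    def confidence(self) -> float:
        # Combine channel scores with a geometric mean so one weak channel
        # is not hidden by strong ones (an assumed rule, not the shipped
        # scoring method).
        product = self.hand_shape * self.movement * self.facial_expression
        return product ** (1 / 3)

r = GestureResult(gloss="THANK-YOU", hand_shape=0.95,
                  movement=0.9, facial_expression=0.8)
print(round(r.confidence, 2))  # prints 0.88
```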

Live Translation

Converts recognized signs into Arabic or English text streams for service teams and captions.

Sign to text
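How a caption consumer might fold a recognized-sign stream into display lines. The event shape (one word per event) and the line-flush rule are assumptions, not the product's streaming protocol:

```python
def caption_stream(events, max_words=8):
    """Yield caption lines, flushing whenever a line fills up."""
    line = []
    for word in events:
        line.append(word)
        if len(line) >= max_words:
            yield " ".join(line)
            line = []
    if line:  # flush any trailing partial line
        yield " ".join(line)

# Placeholder gloss stream standing in for live recognition output.
signs = ["WELCOME", "HOW", "CAN", "I", "HELP", "YOU", "TODAY"]
for caption in caption_stream(signs, max_words=4):
    print(caption)
```

A real captioning surface would also handle corrections and timing, but the rolling-line pattern above is the core loop a service-team display needs.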

Accessible Interfaces

Embeds translation into kiosks, dashboards, web apps, and call-center tools through a clean API.

API integration
Technology

Built around multimodal signals, not isolated gestures.

Sign language carries meaning through hands, face, posture, motion, and timing. The technology layer reads those signals together.

Vision Models

Pose estimation, hand tracking, and temporal recognition tuned for signing movement.

Language AI

Context-aware translation that keeps sentence intent readable for hearing teams.

Arabic SL Data

Dataset pipelines designed for annotation quality, review, and domain adaptation.

Low Latency

Optimized inference paths for kiosk, browser, and edge-device deployments.
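The technology stages above can be framed as a latency budget against the sub-250 ms response target. The stage names mirror the layer described here; the per-stage numbers are assumptions chosen only to make the arithmetic concrete:

```python
# Illustrative per-stage latency budgets in milliseconds (assumed values,
# not measured figures from the platform).
STAGE_BUDGET_MS = {
    "pose_estimation": 60,
    "hand_tracking": 40,
    "temporal_recognition": 80,
    "translation": 50,
}

def total_budget_ms(budgets):
    """End-to-end latency if every stage spends its full budget."""
    return sum(budgets.values())

def within_target(budgets, target_ms=250):
    # Pre-deployment check: do the stage budgets fit the latency target?
    return total_budget_ms(budgets) <= target_ms

print(total_budget_ms(STAGE_BUDGET_MS))        # prints 230
print(within_target(STAGE_BUDGET_MS))          # prints True
```

Budgeting per stage is a common way to keep kiosk, browser, and edge deployments honest: if one stage grows, the check fails before users feel the lag.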

Leadership

Research leadership with a product mandate.

Tathleel is led by Dr. Hamzah Luqman, founder and CEO, and informed by work from the Arabic Sign Language Processing Lab at KFUPM. The goal is practical accessibility technology with research-grade care.

AraSLP Lab direction
Peer-reviewed AI research
Global sign-language workshops
Visit AraSLP Lab
Contact

Bring accessible communication into your next service flow.

Share where SignSense could help: education, healthcare, public service, research partnerships, or enterprise integration.

hluqman@kfupm.edu.sa
Dhahran, Saudi Arabia