Glovatrix’s AI-powered gloves translate sign language into speech and text, bridging communication gaps for the deaf, starting in workplaces, empowering lives, and shaping the future of accessibility. In an exclusive conversation with EFY’s Nidhi Agarwal, Aishwarya Karnataki, Founder and CEO of Glovatrix, discusses the journey behind the innovation, the challenges faced, and the vision to make communication universally inclusive.

Q. Can you shed some light on your firm and its range of products or services?
A. Our organisation’s goal is to help deaf and speech-impaired people who use sign language. We have developed AI-powered smart gloves that translate sign language into speech and convert spoken responses into text, helping both parties communicate more easily.
Q. Any reason for starting a startup?
A. In eighth grade, I was part of the student council and regularly interacted with children with disabilities. Most had friends who understood them, but one boy, Atharv, did not, because nobody knew sign language. I learned it and asked him, “Will you be my friend?” He was overjoyed to finally communicate. That experience inspired me. Later, while studying engineering, I developed a gesture-controlled robot and merged the idea with sign language translation. That’s how Glovatrix began.
Q. Do you think your product, 5th Sense, will be used more at work or at home?
A. Definitely in workplaces. We aim to start by collaborating with companies that employ deaf individuals. Our product currently recognises around 200 gestures, but sign language includes roughly 20,000. A deaf worker at a place like KFC only needs a small set of gestures related to their job, like ‘chicken’ or ‘spicy.’ So, we will create specific gesture libraries for each workplace, making it easier to use.
Starting with workplaces makes sense, not only because of CSR funding and HR support, but also because job-specific gesture libraries make adoption easier. As manufacturing scales up, the cost will decrease, making the product more accessible to individuals.
Q. How will this product help in learning and studying sign language? Will it create new signs?
A. We have spoken with the Indian Sign Language Research and Training Centre (ISLRTC), and they are keen to collaborate with us, not only to translate sign language but also to create new signs. When there is a need to introduce a sign, such as for ‘earphones’, experts from various fields convene to finalise it. However, they currently lack a system to verify whether a similar sign already exists elsewhere.
We suggested using our gloves to help build a digital dictionary that verifies existing signs, recommends similar alternatives, and aids in sign language education. ISLRTC has responded positively, and we are continuing discussions.
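For illustration only, here is a minimal Python sketch of how such a duplicate check could work: a candidate sign’s sensor-derived feature vector is compared against a stored dictionary using cosine similarity. The function names, the 72-value feature size, and the 0.95 threshold are assumptions for the example, not Glovatrix’s actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two gesture feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def check_new_sign(candidate: np.ndarray,
                   dictionary: dict[str, np.ndarray],
                   threshold: float = 0.95) -> list[tuple[str, float]]:
    """Return existing signs whose stored feature vectors closely match the candidate."""
    matches = []
    for name, vector in dictionary.items():
        score = cosine_similarity(candidate, vector)
        if score >= threshold:
            matches.append((name, score))
    return sorted(matches, key=lambda item: item[1], reverse=True)

# Illustrative usage with random 72-dimensional gesture signatures
rng = np.random.default_rng(0)
dictionary = {"hello": rng.normal(size=72), "thank_you": rng.normal(size=72)}
candidate = dictionary["hello"] + rng.normal(scale=0.05, size=72)
print(check_new_sign(candidate, dictionary))  # flags "hello" as a likely duplicate
```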
Q. What sensors make your gesture system work?
A. Each fingertip contains a motion sensor to track finger movement, along with an additional sensor on the wrist. All the data is transmitted to a microcontroller on the wrist, which sends it via Bluetooth to a phone. From there, it is uploaded to the cloud, where our algorithm processes it. The output is sent back to the gloves, which feature a speaker for sound and a screen for displaying text.
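As a rough sketch of this data path, the snippet below packs one motion-sensor reading into the kind of compact binary record a wrist microcontroller might stream over Bluetooth. The packet layout, field names, and sensor numbering are illustrative assumptions, not the actual firmware format.

```python
import struct
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """One reading from a single motion sensor: 3-axis acceleration plus 3-axis rotation."""
    sensor_id: int  # e.g. 0-4 for fingertips, 5 for the wrist sensor (assumed numbering)
    accel: tuple[float, float, float]
    gyro: tuple[float, float, float]

    def pack(self) -> bytes:
        """Pack the frame into a compact binary record for Bluetooth transfer."""
        return struct.pack("<B6f", self.sensor_id, *self.accel, *self.gyro)

    @classmethod
    def unpack(cls, payload: bytes) -> "SensorFrame":
        sensor_id, ax, ay, az, gx, gy, gz = struct.unpack("<B6f", payload)
        return cls(sensor_id, (ax, ay, az), (gx, gy, gz))

# The wrist microcontroller would concatenate frames from all sensors and send them
# to the phone, which forwards the stream to the cloud service.
frame = SensorFrame(0, (0.0, -1.0, 0.125), (1.5, 0.25, -0.75))
assert SensorFrame.unpack(frame.pack()) == frame
```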
Q. How are electronics involved in your device?
A. Our device contains a considerable amount of electronics and includes two types of circuit boards (PCBs): a main board on the wrist and ten custom fingertip boards. The main board features a motion sensor, an ESP32 chip, a microphone, a speaker, a vibration motor, and a screen, along with essential components like resistors and capacitors. Each fingertip board has a motion sensor, and one fingertip includes a button. Pressing the button starts gesture capture; pressing it again stops it. The system then initialises gesture analysis, helping distinguish between sign language and other movements.
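A minimal sketch of this button-driven capture flow, assuming a simple two-state toggle; the class and method names are illustrative, and the analysis step is only a placeholder.

```python
from enum import Enum, auto

class CaptureState(Enum):
    IDLE = auto()
    RECORDING = auto()

class GestureCapture:
    """Toggle gesture recording with a single fingertip button, as described above.
    The buffered frames are handed to gesture analysis when recording stops."""

    def __init__(self):
        self.state = CaptureState.IDLE
        self.frames: list[list[float]] = []

    def on_button_press(self) -> None:
        if self.state is CaptureState.IDLE:
            self.frames.clear()
            self.state = CaptureState.RECORDING
        else:
            self.state = CaptureState.IDLE
            self.analyse(self.frames)

    def on_sensor_frame(self, frame: list[float]) -> None:
        if self.state is CaptureState.RECORDING:
            self.frames.append(frame)

    def analyse(self, frames: list[list[float]]) -> None:
        # Placeholder: the real device sends the buffered frames for gesture analysis.
        print(f"Analysing {len(frames)} frames")

capture = GestureCapture()
capture.on_button_press()            # start recording
capture.on_sensor_frame([0.1] * 72)  # 72 values per reading (12 sensors x 6 values)
capture.on_button_press()            # stop and analyse
```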
Q. What special features does your device have?
A. One key feature we have added is background sound detection, which is particularly helpful for deaf users. Sounds such as doorbells, a baby crying, or a pressure cooker whistle often go unnoticed, which can cause serious issues. Our device addresses this by sending alerts for specific sounds. It doesn’t just indicate ‘noise’; it shows what the sound is. A crying baby triggers a baby icon, a cooker whistle shows a cooker icon, and a doorbell displays its own icon. We have preloaded 8–10 common sounds, including a baby crying, a cooker whistle, a fire alarm, a doorbell, and a car horn. Users can also add up to four custom sounds by recording them through the app.
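The alert mapping could look something like the sketch below, which pairs classified sound labels with display icons and caps user-added sounds at four. The label strings and class are assumptions for illustration, not the actual app logic; the audio classification itself is not shown.

```python
# Preloaded sound classes and the icon shown for each alert (illustrative labels).
PRELOADED_ICONS = {
    "baby_crying": "baby",
    "cooker_whistle": "cooker",
    "fire_alarm": "fire",
    "doorbell": "doorbell",
    "car_horn": "car",
}

class SoundAlerts:
    """Map a classified background sound to the icon displayed on the glove screen."""

    MAX_CUSTOM_SOUNDS = 4

    def __init__(self):
        self.custom_icons: dict[str, str] = {}

    def add_custom_sound(self, label: str, icon: str) -> None:
        if len(self.custom_icons) >= self.MAX_CUSTOM_SOUNDS:
            raise ValueError("Only four custom sounds can be added")
        self.custom_icons[label] = icon

    def icon_for(self, detected_label: str) -> str | None:
        """Return the icon to flash for a detected sound, or None if it is unknown."""
        return self.custom_icons.get(detected_label) or PRELOADED_ICONS.get(detected_label)

alerts = SoundAlerts()
alerts.add_custom_sound("washing_machine_beep", "washer")
print(alerts.icon_for("cooker_whistle"))        # -> "cooker"
print(alerts.icon_for("washing_machine_beep"))  # -> "washer"
```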
Q. What languages does it support?
A. Right now, the gloves support Indian Sign Language. Plans are underway to add American and British variants. Users can also choose their preferred spoken output, such as Hindi, Marathi, or English, through the app.
Q. How does AI help?
A. AI plays a key role in recognising sign language in our system. We use a machine learning (ML) model that processes data from 12 sensors, each sending 6 values, 72 in total, at every sampling instant. These values form a unique pattern for each gesture. For example, the signs for ‘hello’ and ‘how are you’ create distinct patterns. The AI identifies these patterns and recognises the gesture, even when people sign slightly differently. We have currently trained the model on nearly 200 gestures, achieving an accuracy of around 98%.
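The interview does not describe the model architecture, so the following is only a toy stand-in that shows the data shape involved: windows of 72-value sensor frames are flattened into feature vectors and matched to the nearest gesture centroid. The window length, class names, and synthetic data are assumptions.

```python
import numpy as np

FRAME_SIZE = 72          # 12 sensors x 6 values per reading
WINDOW_FRAMES = 50       # assumed number of readings kept per gesture window

def to_feature_vector(window: np.ndarray) -> np.ndarray:
    """Flatten a (WINDOW_FRAMES, FRAME_SIZE) window into one feature vector."""
    assert window.shape == (WINDOW_FRAMES, FRAME_SIZE)
    return window.reshape(-1)

class NearestCentroidGestures:
    """Toy stand-in for the trained ML model: stores a mean feature vector per gesture
    and predicts the closest one."""

    def __init__(self):
        self.centroids: dict[str, np.ndarray] = {}

    def fit(self, samples: dict[str, list[np.ndarray]]) -> None:
        for label, windows in samples.items():
            vectors = np.stack([to_feature_vector(w) for w in windows])
            self.centroids[label] = vectors.mean(axis=0)

    def predict(self, window: np.ndarray) -> str:
        vector = to_feature_vector(window)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(vector - self.centroids[label]))

# Synthetic gesture samples standing in for real glove recordings
rng = np.random.default_rng(1)
samples = {
    "hello": [rng.normal(loc=0.0, size=(WINDOW_FRAMES, FRAME_SIZE)) for _ in range(5)],
    "how_are_you": [rng.normal(loc=1.0, size=(WINDOW_FRAMES, FRAME_SIZE)) for _ in range(5)],
}
model = NearestCentroidGestures()
model.fit(samples)
print(model.predict(rng.normal(loc=1.0, size=(WINDOW_FRAMES, FRAME_SIZE))))  # -> "how_are_you"
```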
AI also assists with language translation. Sign language grammar differs from spoken English. For instance, ‘A cat ran after the mouse’ becomes ‘cat mouse ran’ in sign language, often omitting words such as ‘a’, ‘is’, and ‘the’. Using natural language processing (NLP), the AI restructures the signs into grammatically correct English. It also adapts to individual signing styles, much like interpreting different handwriting.
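As a toy illustration of the gloss-to-English step (the real system uses an NLP model rather than fixed rules), a lookup with a naive fallback might look like this; the known patterns, apart from the cat example quoted above, are assumptions.

```python
# Toy gloss-to-English step: look up known gloss patterns, otherwise fall back to the raw gloss.
KNOWN_PATTERNS = {
    ("cat", "mouse", "ran"): "A cat ran after the mouse.",
    ("you", "name", "what"): "What is your name?",
}

def gloss_to_english(gloss_tokens: list[str]) -> str:
    sentence = KNOWN_PATTERNS.get(tuple(gloss_tokens))
    if sentence is not None:
        return sentence
    # Fallback: return the raw gloss so the user still gets some output.
    return " ".join(gloss_tokens).capitalize() + "."

print(gloss_to_english(["cat", "mouse", "ran"]))   # -> "A cat ran after the mouse."
print(gloss_to_english(["water", "want"]))         # -> "Water want."
```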
Q. Does the processing happen at the edge or in the cloud?
A. The primary processing happens in the cloud, so an internet connection is required. However, in emergencies where deaf individuals may not have access to the internet, we did not want to leave them without help. To address this, we created a local gesture library containing the 26 letters of the alphabet and 10 emergency words, stored on the device itself. This library works offline. Access to the full library of 20,000 gestures requires internet connectivity.
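A minimal sketch of this cloud-first, offline-fallback behaviour; the emergency word list, function names, and cloud client interface are illustrative assumptions rather than the actual on-device library.

```python
import string

# Assumed on-device vocabulary: the 26 letters plus 10 illustrative emergency words.
LOCAL_LIBRARY = set(string.ascii_uppercase) | {
    "HELP", "DOCTOR", "POLICE", "FIRE", "PAIN", "WATER", "HOME", "STOP", "DANGER", "CALL",
}

def local_classify(features) -> str:
    # Placeholder for the small on-device model covering letters and emergency words.
    return "HELP"

def recognise_gesture(features, cloud_client=None) -> str | None:
    """Use the cloud model when reachable; otherwise fall back to the on-device library."""
    if cloud_client is not None:
        try:
            return cloud_client.classify(features)  # full ~20,000-gesture library
        except ConnectionError:
            pass                                    # offline: fall through to local path
    label = local_classify(features)                # small offline vocabulary
    return label if label in LOCAL_LIBRARY else None

print(recognise_gesture(features=[0.0] * 72))       # no cloud client -> offline path -> "HELP"
```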
Q. How does the system learn new signs?
A. When we need to add a new gesture, we provide the gloves to deaf individuals and ask them to perform it. At present, around 15 to 20 people help us, each performing the gesture 10 times, resulting in roughly 200 samples. We use this data to train the AI to recognise the new gesture.
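Sketched in Python, that collection loop might look like this, with around 20 signers each repeating a gesture 10 times; record_window() is a placeholder that returns synthetic data instead of real glove readings.

```python
import numpy as np

def record_window(signer_id: int) -> np.ndarray:
    # Placeholder: returns a synthetic sensor window instead of real glove data.
    rng = np.random.default_rng(signer_id)
    return rng.normal(size=(50, 72))

def collect_new_gesture(label: str, signers: int = 20, repetitions: int = 10):
    """Collect labelled windows for a new gesture: each signer repeats it several times."""
    samples = []
    for signer_id in range(signers):
        for _ in range(repetitions):
            window = record_window(signer_id)   # shape: (frames, 72)
            samples.append((label, window))
    return samples                              # ~200 labelled samples to retrain on

dataset = collect_new_gesture("thank_you")
print(len(dataset), "samples collected for retraining")
```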
Q. What design issues did you face, and how did you fix them?
A. We encountered several design challenges. Our first version was a full glove, which proved uncomfortable, especially in summer, and reduced users’ sense of touch, which is essential for them. We then explored a smartwatch model, which was more comfortable but introduced delays. Eventually, we developed a hybrid model: half glove, half watch, with open fingers and palm to preserve the sense of touch.
Another issue was speech-to-text conversion. Long texts are often challenging for deaf users to read because of their unfamiliarity with spoken English. To help, we added a summariser that displays just 2–3 keywords, enabling users to follow conversations through context and lip reading.
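A keyword-style summariser in the spirit of what is described could be as simple as dropping common function words and keeping the first few content words; the stopword list and behaviour here are illustrative, not the actual summariser.

```python
# Minimal keyword summariser sketch: drop common function words and keep the first
# few content words, so the glove screen shows 2-3 keywords instead of a long sentence.
STOPWORDS = {"a", "an", "the", "is", "are", "was", "were", "to", "of", "and",
             "in", "on", "for", "please", "could", "would", "you", "your", "i", "we"}

def summarise(sentence: str, max_keywords: int = 3) -> list[str]:
    words = [w.strip(".,!?").lower() for w in sentence.split()]
    keywords = [w for w in words if w and w not in STOPWORDS]
    return keywords[:max_keywords]

print(summarise("Could you please bring the spicy chicken order to table five?"))
# -> ['bring', 'spicy', 'chicken']
```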
Q. What problems did you face in building a cross-platform app, and how did you fix them?
A. The biggest challenge was receiving data from two gloves simultaneously via Bluetooth. While connecting one device is easy, connecting two devices to a single phone proved difficult. Initially, we planned to develop separate Android and iOS apps. However, our partner, Coreco Technologies, used Flutter to create a single app for both platforms. They began with a basic app to test Bluetooth connectivity and later built the full version. Now, the right glove sends data to the left glove, which then sends all the data to the phone. Each glove’s controller communicates with the other and with the app, which relays the data to the cloud.
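Conceptually, the relay topology (right glove to left glove to phone to cloud) can be sketched as below; the classes, buffers, and polling model are purely illustrative stand-ins for the actual BLE links.

```python
# Conceptual sketch of the relay described above: the right glove forwards its frames
# to the left glove, which buffers everything for the phone app; the app then pushes
# the combined stream to the cloud. Class and method names are illustrative.
class Glove:
    def __init__(self, side: str):
        self.side = side
        self.outbox: list[bytes] = []

    def send_frame(self, frame: bytes, peer: "Glove | None" = None) -> None:
        if peer is not None:          # right glove relays through the left one
            peer.outbox.append(frame)
        else:
            self.outbox.append(frame)

class PhoneApp:
    def __init__(self):
        self.cloud_queue: list[bytes] = []

    def poll(self, left_glove: Glove) -> None:
        """Drain the left glove's buffer (both hands' data) and relay it to the cloud."""
        self.cloud_queue.extend(left_glove.outbox)
        left_glove.outbox.clear()

left, right = Glove("left"), Glove("right")
app = PhoneApp()
right.send_frame(b"R-frame", peer=left)   # right hand -> left glove
left.send_frame(b"L-frame")               # left hand stays local
app.poll(left)                            # phone receives both, forwards to cloud
print(len(app.cloud_queue), "frames queued for the cloud")
```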
Q. What were the biggest challenges going from the first prototype to the final design?
A. We are currently on the eighth version of our prototype and the second version of our app. Our journey started in February 2020 with a crude prototype made using duct tape and jumper wires. Initially, we used flex sensors to detect finger movements, but they were expensive (₹800 each) and unable to capture full hand motion. We tried combining them with a wrist-mounted motion sensor, but the results were unreliable. Eventually, we switched to motion sensors on each finger, which significantly improved accuracy. We also transitioned from a bulky Arduino board to a compact, custom PCB, and later eliminated the use of Arduino altogether.
In July 2022, we experimented with a smartwatch-based model using finger rings and removable strings instead of gloves. The idea was to let users wear only the watch at home and quickly connect the strings via magnetic connectors when needed. However, limited funding and a small team made it difficult to get the connectors to work effectively. The strings also had too much slack and interfered with usability. We eventually returned to the glove design, which provided better stability.
Q. Have you received any government funding for your prototypes?
A. Yes, we have received significant government funding for prototyping, about Rs 15 million so far, all equity- and debt-free. We received Rs 1.5 million from the Startup India Seed Fund, Rs 1 million from Nidhi Prayas, and Rs 5 million from the Buyer Back Ignition grant. We also won Rs 1 million from Boeing’s Build 3.0 competition and smaller amounts from other contests, including Rs 200,000–300,000 from a UN competition. This funding has helped us sustain our work independently.
Q. How do you get deaf people and organisations to use your product?
A. We have received extensive media coverage in publications such as The Indian Express, Times of India, CNN, NDTV, and others. Several organisations, including Royal Orchid Hotels, TVS Scooters, Indigo Airlines, and KFC, are on a waiting list for our gloves. Word of mouth and media visibility have accelerated our progress. We also participate in exhibitions, job fairs, and startup events to expand our reach.
Among the deaf community, who are our users, over 10,000 people follow us on social media. One video about our product received 1.5 million views, largely due to shares from the deaf community. We also plan to start deaf-focused podcasts to strengthen this engagement.
Q. Any tie-up with schools or colleges?
A. Yes, we have partnered with a design firm called Dominix Global. They assisted with early research, including empathy mapping, and helped us better understand the thought processes of deaf users, enabling a more user-friendly product design.
Q. How many units have you sold, and how much money have you made?
A. We have not made any sales yet. Royal Orchid Hotels has placed a pre-order for 10 devices through their CSR initiative. We plan to deliver these next month. As this is a paid pilot, it may not be classified as regular revenue. So, officially, we have not recorded any sales yet.
Q. What are your growth plans for the company?
A. We plan to start by selling to businesses (B2B). Deaf waiters will initially use our gloves to take orders. Following this, we intend to seek funding to scale up operations. We aim to expand our team, particularly in marketing, sales, and research, and collaborate with partners to build a comprehensive gesture library. By the end of 2026, we plan to sell directly to consumers (B2C) and work on translating American Sign Language to voice, expanding our reach to global users.
Q. What is stopping your startup from growing really fast right now?
A. We are currently in a classic chicken-and-egg situation. Investors want to see the product in real-world use before committing funds, but we need capital to scale and deploy the product. At present, we are operating with a lean team, supported by government grants and some personal funding. Our immediate goal is to deliver the devices to Royal Orchid successfully by mid-July.
Q. How is the ecosystem helping you? Are you adding any new partners or resellers?
A. We are working with several NGOs dedicated to supporting individuals with disabilities. Over the past four years, we have built relationships with remarkable organisations and founders, such as Prateek Madhav and the AssisTech Foundation. AssisTech supports startups developing assistive technologies. These organisations have helped us reach customers, connect with investors, and gain a deeper understanding of user needs. We continue to add new partners regularly.