Just over a year ago, I wrote about Danielle Feerst and AutismSees, a startup dedicated to creating technology – apps – to help people with autism make successful public presentations. A year later, Danielle is a rising senior at Tufts University.

She hasn’t been idle. The company received support from MassChallenge, a startup accelerator; it launched a Kickstarter campaign; and it is working on expanding its application “Podium” to the Microsoft Kinect system.

Danielle says, “The Podium app is the first self-assessment tool and remedial software application for enhancing and improving eye-contact, speech, and expression communication to assist users in the quality of job interviews and presentations. The app uses evidence-based practice to allow users to score their own eye contact, expressions, and speech patterns for engagement during a job interview mock session. These areas are important in socialization. We would like to automate this scoring process. The newest version of the app will be piloted in the Aspire program at Mass General Hospital in October, 2015.”

Alas, the Kickstarter campaign hadn’t received enough funding as of this writing, so I hope that Danielle tries again and finds other ways to keep the momentum going. The Podium app could be a huge help for autistic people working on their public speaking.

As I noted a year ago, I have a tangential connection to people with autism, because of an experience I had as a 17-year-old that seems to have paralleled the autism experience.

I was tobogganing and crashed headfirst into a tree on the second turn. I fractured my skull, and was operated on for a subdural hematoma – a blood clot – that was putting pressure on my brain and causing intense pain.

I was in a coma for a few days, and at some point during that coma, I flatlined – for a total of about 15 minutes. Then I woke up.

Over the next several weeks, I noticed that something odd had happened to my mental processes. The world – or at least the people in it – had become distant and strange for me. It was as if a color movie had suddenly turned black and white.

I couldn’t tell what other people were thinking, or feeling. I knew I should be able to tell what was going on with them, but I couldn’t. Something in me had switched off – I had no idea what – and it meant that people were suddenly complete mysteries to me. It was terrifying.

So I began to study body language consciously, in a deliberate and indeed panicked attempt to figure out what people were feeling, what their intent was, what they actually meant. I focused obsessively on gesture, facial expressions, posture, the ways people revealed tension in their arms and shoulders, the way they moved closer or further away from each other, their smiles and frowns – everything, in short, that I could see that might tell me something about what they were feeling.

I was lucky – the ability to read people came back to me with work and time. Autistic people are not so lucky – they can find it very difficult to read people in the way the rest of us take for granted. As I said a year ago, I’m thrilled to see a group of young people (still) working to provide technology that may help. Thank you, Danielle, and the team at AutismSees: Alexandria Trombley, Alix Generous, Anthony Principe, Stan Laidus, and Laura Carpenter.