VMworld 2017 Thursday Keynote

MIT Media Lab – Hugh Herr

Works with prostheses, crutches, and other assistive devices. Even in ancient times people used artificial body parts – an artificial toe was found on an Egyptian mummy. In some ways there hasn't been a lot of advancement in all those years.

Hugh lost his legs to frostbite and asked his doctor whether he could drive, ride his bike, and mountain climb. The doctor said he could drive with hand controls but would never ride a bike or climb mountains again. Today, with advanced limbs, he can mountain climb, make himself taller or shorter, and even dance.

Part of the Center for Extreme Bionics, which has 4 initiatives – can only talk about a couple of them due to time.

What about amputation? Bionic limbs exist with motors and actuators, but the human can't control them. Nothing has changed in how amputation is done since WWII or longer. This is bad because muscles have lots of sensors – stretch, tension, etc. They can feel torque and more – we want to feed that back to the prosthesis.

Improvements to amputation add a small muscle at the end of the limb that can stretch and tense – this was done for the first time last year. The different amputation technique, paired with sensors, allows control of a "phantom limb" or an actual bionic foot. It even works for unconscious toe movements like climbing stairs.

The patient said, "this is just my foot – it's part of me." Further work involves injecting a virus that can help a muscle put out stronger signals for fine-grained control. This can lead to human augmentation and the ability to design your physical body. It's a brave new world, but one that needs safeguards – you can imagine scenarios of parents designing their children or governments controlling citizens' moods.

Rana el Kaliouby, working on Emotion AI

Technology + human interaction + emotion. Thinking about IQ and EQ – computers have no awareness of emotion and force humans to interact with them in unnatural ways.

“Lots of smarts but no heart – lots of cognitive intelligence but no emotional intelligence.”

There's a team trying to make Alexa more empathetic. Showed the example of teens who recorded a man drowning on their phones and did nothing – she's mourning the death of empathy.

Current ways of interacting with technology can even lower EQ. Her first work was at MIT, around autism and facial expressions. There are 45 facial muscles – computers and machine learning now replace the 100+ hours of training it takes to code facial expressions by hand.

Did a live demo reading facial expressions from a volunteer in the audience. They've been feeding in millions of images and can read 20 facial expressions and 7 emotions. Confirmed that women are more expressive than men, but in the UK they're equally expressive – interesting.

Have created an SDK that’s available on many platforms – iOS, Android, etc.

Going through examples: emotion-aware entertainment, with different movie versions – more or less romance, more or less blood, etc. – based on viewers' reactions; cars that tie emotion tracking to braking; IoT appliances that track emotion and make recommendations based on it.

What about learning apps that adapt learning patterns to students' current emotions? Experiments with a socially aware learning "stuffed animal" showed higher student interaction and less quitting.

Or tracking emotional state for medical reasons rather than just a "1-10 pain scale". Patients actually opened up more to an "emotionally aware robot doctor" than to human doctors.

There are risks here around ethics and privacy, but she believes the rewards outweigh the risks.

Peter Weinstock, ICU Doctor @ Boston Children's Hospital

Penicillin was discovered by accident; anesthesia allows saving lives via surgery that would otherwise be impossible.

A new game changer at the fundamental level of medicine for treating all patients is Life-Like Rehearsal via Medical Simulation. To me this feels like Dev/Test for doctors…or practicing your upgrade scripts before your downtime window.

Showed a baby that needed heart/lung bypass – it's hard to get good at treating infrequent conditions. Medical training is built around the apprenticeship model: "See One, Do One, Teach One." The problem is that we're practicing on the patients we treat.

Medicine may be the last major industry that does not practice before game time – time to change that. Nuclear power plants run simulations of all the things they hope never happen; same for Boeing and Airbus with airplanes. Flight simulators for pilots became common because we found that most pilots can fly planes but can't work together as a team in a crisis. Simulators can help change that.

Think about baseball: batting practice, practice swings, practice, practice, practice.

How to do this in medicine? Use 3D printing to recreate children's anatomy so doctors can practice. Before this they would actually use a red pepper, taking the seeds out – a "seed-ectomy" – to practice puncturing a part of the brain to allow drainage of fluid buildup.

Building a "trainer" in partnership with Hollywood special effects firms + 3D printing to make it lifelike (suspend disbelief) so teams can practice on it. Showed a picture that is amazingly lifelike. Now embedding these trainers into team experiences where the whole team practices in the hours before a patient's surgery – modeled after Formula 1 teams that have to work together. After each practice there's a debrief to discuss what could be done differently or better.

The impact is the ability to make the rare routine, reducing morbidity, mortality, hospital-acquired infections, and more.

Amazing closing real-world example of a child whose brain partly formed outside the skull – the family searched the country and no doctor would attempt surgery. Boston Children's Hospital worked with plastic surgeons and iterative modeling to figure out how to put the brain inside the skull and drain the fluid. It was successful – this is mind-blowing stuff.

Making the impossible now possible.
