Last keynote live blog…focused on neuroscience.
- You’re not gonna hear about cloud in this presentation – people start clapping.
- Going to hear from people doing cool things…
- Dr. Fei-Fei Li – Director at Stanford
- http://vision.stanford.edu/feifeili/
- Showing a video of a 3-year-old child describing what she sees in photos
- Has a lot to learn but is already an expert at making sense of what she sees
- Lots of advances but computers still struggle with what this 3 year old can do.
- “Even the smartest machines are still blind.”
- Working on this for the last 15 years – research field = Computer Vision & Machine Learning
- First have to teach the computer to recognize objects – the building block of our visual world
- How hard can this be? Showing a cat picture…and cats in different positions…some sitting like humans. Even a household cat can have almost infinite variations to the object models.
- No one shows children how to see – they do it through their own learning.
- So what if we think of human eyes as biological cameras? They take a picture about every 200 milliseconds…millions of pictures in the first couple of years.
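The math behind that claim is easy to sanity-check. This is my own back-of-the-envelope sketch, not numbers from the talk – the waking-hours figure in particular is an assumption:

```python
# Back-of-the-envelope arithmetic for the "biological camera" idea.
# Only the 200 ms snapshot interval comes from the talk; the rest
# are assumed round numbers.

SNAPSHOTS_PER_SECOND = 5       # one "picture" every 200 milliseconds
WAKING_HOURS_PER_DAY = 12      # assumed average for a young child
YEARS = 3                      # roughly the age of the child in the video

snapshots_per_day = WAKING_HOURS_PER_DAY * 3600 * SNAPSHOTS_PER_SECOND
total_snapshots = snapshots_per_day * 365 * YEARS

print(f"{total_snapshots:,} snapshots")  # 236,520,000 snapshots
```

Even with conservative assumptions, that's hundreds of millions of "training images" before age three.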
- This made them realize they needed a massively larger image set than they’d ever used…so they downloaded tons of images from the internet and used Amazon Mechanical Turk to tag them.
- 48,940 workers in 167 countries helped to label 1 billion images
- Coworkers told her not to do this – “go do something more useful”
- ImageNet – 15,000,000 images in 22,000 categories
- Convolutional Neural Network – in their lab it has 24M nodes, 140M parameters, and 15B connections
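For context on what a convolutional network actually computes, here's a minimal, dependency-free sketch of its basic building block – a single 2D convolution with stride 1 and no padding. Purely illustrative, and nothing like the scale of the model she described:

```python
# Toy 2D convolution: the elementary operation that networks like the
# one described above stack millions of times. Valid padding, stride 1.

def conv2d(image, kernel):
    """Slide `kernel` over `image`, summing elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            total = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
            row.append(total)
        out.append(row)
    return out

# A vertical-edge-detecting kernel applied to a tiny image with an edge
# down the middle: the output peaks exactly where the edge is.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [
    [-1, 1],
    [-1, 1],
]
print(conv2d(image, kernel))  # → [[0, 2, 0], [0, 2, 0]]
```

A trained network learns thousands of kernels like this (edges, textures, eventually whole object parts) instead of having them hand-designed.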
- Showing an interesting map around how car prices correlate with income…but also with crime and even voting patterns
- Showed a crocodile-duck image – and how the computer handles that.
- Improving computer recognition…maybe they have it up to roughly the level of a 3-year-old.
- Closed with a picture of a really happy young boy with a cake – talks about how it’s her son and all the context in the picture that she hopes someday a computer can recognize.
- Greg Gage, Founder, Backyard Brains
- Brains have 80 billion cells called neurons
- Not going to map a human brain live…but a cockroach brain.
- Pulled out a cockroach, dunked him in cold water to anesthetize him, and cut his leg off live.
- Puts pins in the leg and connects it to a “Spiker Box” – lets us listen to what the neural impulses sound like as he pokes the leg (because the leg is still sending info to the brain)
- Now a human onstage…has been at VMworld 12 years. Putting electrodes on his arm.
- Electromyogram – one neuron connected to one muscle fiber.
- Do you workout? “Sure”
- Holding a grip-strength gauge that measures how hard he can squeeze – can see it drop from 75 pounds of force, mirrored in the electrode output.
- Now hooking up to 2 people – asking the “puppet” – have you ever lost your free will? “Well, I got married 24 years ago.”
- Showed 2 people, electrodes on one arm of each person connected through a machine, one person squeezes their hand, the other person’s hand closes and they can’t control it.
- Really fun presentation…hard to blog it.
- David Eagleman, Director at Baylor
- Our reality sits in between the super micro cell stuff and the vast cosmos
- Color spectrum is a very small part of the electromagnetic radiation spectrum.
- Snakes can see infrared, honeybees can see ultraviolet. We build machines to see X-rays, cell phone signals, etc.
- Your senses limit your reality.
- Umwelt (said “umvelt”) – German word for the reality you can perceive. Each animal is locked into its own perceived reality.
- Truman show – “we accept the reality that’s presented to us”.
- Reality-expanding example – what’s it like to be born blind? If you’ve never had sight, you never miss it. Or picture a bloodhound wondering how humans must feel about missing all the things a dog can smell.
- Focus is how technology can expand our Umwelt and how that can change the experience of being human.
- “As technology expands our Umwelt, it will change the experience of being human.”
- Lots of people walking around with retinal implants and cochlear implants (sight and hearing). Scientists thought this wouldn’t work b/c incoming signals would be different than from real eyes or ears.
- The brain is locked away from the real world – only works on electrical impulses – is an incredible general computation machine.
- Talking about studies for sight substitution – input on the forehead and tongue from a camera helps blind people practically see (can even throw a ball into a basket)
- David and his research partner working on helping deaf people hear – has created a vest that translates sound into a complex set of pokes on the back.
- Showing a video of a totally deaf man – after a couple of weeks he could understand spoken words well enough to write them on a whiteboard.
- This started with sensory substitution but also looking at sensory addition.
- Showing how they scraped all #VMworld hashtag tweets, scored each as positive or negative, then reflected that on the vest – so he can feel how the conference is going.
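Roughly how that tweet-sentiment pipeline could work, as a toy sketch – the word lists, scoring rule, and "vest intensity" mapping here are all my own stand-ins, not the actual implementation:

```python
# Toy version of the #VMworld sentiment-to-vest demo. The word lists
# and the 0..1 intensity mapping are invented for illustration.

POSITIVE = {"great", "awesome", "love", "amazing"}
NEGATIVE = {"boring", "bad", "broken", "hate"}

def sentiment(tweet):
    """Score a tweet: +1 per positive word, -1 per negative word."""
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def conference_mood(tweets):
    """Average sentiment mapped to a 0..1 intensity for a hypothetical vest motor."""
    if not tweets:
        return 0.5  # neutral when there's nothing to read
    avg = sum(sentiment(t) for t in tweets) / len(tweets)
    # Clamp a per-tweet average in [-1, 1] into [0, 1].
    return max(0.0, min(1.0, (avg + 1) / 2))

tweets = [
    "#VMworld keynote is awesome",
    "love this great demo #VMworld",
    "wifi is broken again #VMworld",
]
print(conference_mood(tweets))
```

A real system would use a trained sentiment model rather than word lists, but the shape is the same: text in, one scalar out, scalar driven into the vest.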
- What about an airplane cockpit? Way too many things being channeled through your eyes that aren’t good at recognizing that kind of data. Your skin can process in parallel…so why not push that kind of information through your skin?
- Picture of someone piloting a quadcopter with a sensor grid on his back to get input on 9 different things – pitch, yaw, etc.
- The human umwelt has been unlocked
- The question is…how do you want to experience your universe?
That’s a wrap….fascinating stuff.