How do self-driving cars know what they are looking at? How does Google Photos know how to tag various things like dogs and cats in photographs? The answer is object detection, which is the subject of a TED Talk that Joseph Redmon gave in April of this year. The talk is located at https://www.ted.com/talks/joseph_redmon_how_a_computer_learns_to_recognize_objects_instantly#t-445883. Joseph works on an open source object detection program called Darknet.
Joseph's talk was fascinating, as I had no idea that computer object detection had come so far in recent years. In the back of my mind I suppose I must have known (how else could self-driving cars know what to avoid?), but watching the talk really put it into perspective. Just 10 years ago, the thought of having a computer that was able to recognize objects in real time would have seemed ridiculous. 5 years ago it was getting a little more plausible, but the assumption was that you would need a super powerful cluster of computers processing images like mad. Now it can be done with a phone, and indeed, Joseph used his phone to demonstrate the identification of random objects in real time. It was amazing.
There are all sorts of applications for this technology. The self-driving car is only the tip of the iceberg. How about a surgery assistant robot? Or assembly line product matching? Or a replacement for a service dog? The possibilities are almost limitless. If this technology can run on a simple phone today, what will it look like in 5 years? I, for one, am very excited to find out.