Making a car for blind drivers - Dennis Hong
Many believe driving is an activity solely reserved for those who can see. A blind person driving a vehicle safely and independently was thought to be an impossible task until now.
Hello, my name is Dennis Hong and we're bringing freedom and independence to the blind by building a vehicle for the visually impaired. So before I talk about this car for the blind, let me briefly tell you about another project I worked on called the DARPA Urban Challenge.
Now, this was about building a robotic car that can drive itself. You press start, nobody touches anything, and it can reach its destination fully autonomously. So in 2007, our team won half a million dollars by placing third in this competition.
Around that time, the National Federation of the Blind, or NFB, challenged the research community to develop a car that a blind person can drive safely and independently. We decided to give it a try because we thought, "Hey, how hard could it be? We already have an autonomous vehicle; we just put a blind person in it, and we're done, right?" We could not have been more wrong.
What NFB wanted was not a vehicle that can drive a blind person around, but a vehicle where a blind person can make active decisions and drive. So, we had to throw everything out the window and start from scratch. To test this crazy idea, we developed a small dune buggy prototype vehicle to test the feasibility.
In the summer of 2009, we invited dozens of blind youth from all over the country and gave them a chance to give it a spin. It was an absolutely amazing experience. But the problem with this car was it was designed to only be driven in a very controlled environment—in a flat, closed-off parking lot, with lanes defined by red traffic cones.
With this success, we decided to take the next big step and develop a real car that can be driven on real roads. So, how does it work? Well, it's a rather complex system, but let me try to explain it simply.
We have three steps: perception, computation, and non-visual interfaces. Now, obviously the driver cannot see, so the system needs to perceive the environment and gather information for the driver. For that, we use an inertial measurement unit, which measures acceleration and angular velocity, much like the human inner ear.
We fuse that information with a GPS unit to get an estimate of the location of the car. We also use two cameras to detect the lanes of the road, and three laser range finders that scan the environment for obstacles: cars approaching from the front and back, and anything that might run into the road around the vehicle.
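As a rough illustration of that fusion step (the real system is far more sophisticated; the filter below and all names in it are my own simplification, not the team's actual code), low-rate GPS fixes can be blended with high-rate dead reckoning from the IMU:

```python
# Minimal 1-D complementary filter: dead-reckon position from IMU
# acceleration, then nudge the estimate toward each GPS fix to cancel
# drift. A real vehicle would run a Kalman filter over full 3-D pose;
# this is only a sketch of the idea.

def fuse(gps_positions, imu_accels, dt=0.1, alpha=0.9):
    """gps_positions[i] may be None when no GPS fix arrived that tick."""
    pos, vel = 0.0, 0.0
    estimates = []
    for gps, accel in zip(gps_positions, imu_accels):
        vel += accel * dt        # integrate acceleration -> velocity
        pos += vel * dt          # integrate velocity -> position
        if gps is not None:      # blend dead reckoning with the fix
            pos = alpha * pos + (1 - alpha) * gps
        estimates.append(pos)
    return estimates
```

The blend weight `alpha` trades trust in the smooth-but-drifting IMU track against the noisy-but-absolute GPS fixes.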
All this vast amount of information is then fed into the computer, and the computer can do two things. First of all, it processes this information to have an understanding of the environment—these are the lanes of the road, there are obstacles—and conveys this information to the driver.
The system is also smart enough to figure out the safest way to operate the car, so we can also generate instructions on how to operate its controls. But the problem is this: how do we convey this information and these instructions to a person who cannot see, fast enough and accurately enough that they can drive?
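To give a flavor of how an instruction might be generated (the function, gain, and sign convention here are invented for illustration, not the system's actual control law), the simplest version is a proportional controller on the car's lateral offset from the lane center:

```python
def steering_command(offset_m, gain_deg_per_m=8.0, max_deg=30.0):
    """Map lateral offset from the lane center (meters, positive =
    drifting right) to a steering instruction (degrees, positive =
    steer left), clamped to the wheel's physical range."""
    angle = gain_deg_per_m * offset_m
    return max(-max_deg, min(max_deg, angle))
```

It is this kind of command, recomputed many times per second, that the non-visual interfaces then have to deliver to the driver.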
For this, we developed many different types of non-visual user interface technologies, ranging from a three-dimensional pinging sound system to a vibrating vest, a click wheel with voice commands, a leg strip, and even a shoe that applies pressure to the foot.
But today, we're going to talk about three of these non-visual user interfaces. The first is called Drive Grip: a pair of gloves with vibrating elements on the knuckles that convey steering instructions, both the direction and the intensity of the turn.
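A sketch of how such a glove mapping might look (the three-level encoding, names, and thresholds below are my own example, not the actual Drive Grip protocol):

```python
def drivegrip_pattern(steer_deg, max_deg=30.0, levels=3):
    """Map a steering instruction to (hand, intensity): which glove
    vibrates and how strongly (0..levels). Positive angles mean
    steer left under this invented convention."""
    hand = "left" if steer_deg > 0 else "right" if steer_deg < 0 else "none"
    frac = min(abs(steer_deg) / max_deg, 1.0)
    intensity = round(frac * levels)
    return hand, intensity
```

The point of a discrete encoding like this is that a driver can resolve a few vibration levels reliably, where a continuous intensity would be ambiguous.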
Another device is called Speed Strip. It's a chair, in fact a massage chair: we gutted it, rearranged the vibrating elements in different patterns, and actuate them to convey information about speed, as well as instructions on how to use the gas and brake pedals.
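In its simplest form, a seat cue like this reduces to a speed error with a deadband (the cue names and the deadband value are illustrative assumptions, not the real Speed Strip encoding):

```python
def speedstrip_cue(current_mph, target_mph, deadband=2.0):
    """Translate the speed error into a pedal cue for the seat."""
    error = target_mph - current_mph
    if error > deadband:
        return "accelerate"   # e.g. vibration pattern sweeping upward
    if error < -deadband:
        return "brake"        # e.g. vibration pattern sweeping downward
    return "hold"             # steady: no pedal change needed
```

The deadband matters: without it, the seat would chatter between "accelerate" and "brake" around the target speed.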
So over here, you can see how the computer understands the environment. Because you cannot see the vibrations, we put red LEDs on the Drive Grip so you can see what's happening. This is the sensory data, and that data is transferred to the devices through the computer.
So these two devices, Drive Grip and Speed Strip, are very effective, but the problem is these are instructional devices. This is not really freedom, right? The computer tells you how to drive, turn left, turn right, speed up, stop. We call this the backseat driver problem.
So we're moving away from these instructional devices and are now focusing more on the informational devices. A good example of this informational non-visual user interface is called AirPix. Think of it as a monitor for the blind.
It's a small tablet with many holes through which compressed air comes out, so it can actually draw images. Even though you're blind, you can put your hand over it and feel the lanes of the road and the obstacles. You can also change the frequency of the air coming out, and possibly the temperature, so it's actually a multi-dimensional user interface.
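Conceptually, each frame sent to a tablet like this is just a rasterization of the scene into a grid of on/off air jets (the grid size and function below are an assumed example, not the real AirPix format):

```python
def airpix_frame(obstacles, width=8, height=6):
    """Rasterize obstacle points (x, y in tablet coordinates) into a
    binary grid; 1 = air jet on. Points outside the grid are ignored."""
    grid = [[0] * width for _ in range(height)]
    for x, y in obstacles:
        if 0 <= x < width and 0 <= y < height:
            grid[y][x] = 1
    return grid
```

Modulating each jet's frequency or temperature, as mentioned above, would add further channels on top of this basic on/off image.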
Here, you can see the left camera and the right camera from the vehicle and how the computer interprets that and sends that information to the AirPix. For this, we're showing a simulator of a blind person driving using the AirPix. This simulator was also very useful for training the blind drivers and quickly testing different types of ideas for different types of non-visual user interfaces.
So basically, that's how it works. Just a month ago, on January 29th, we unveiled this vehicle for the very first time to the public at the world-famous Daytona International Speedway during the Rolex 24 racing event. We also had some surprises.
Let's take a look at this historic day. He's coming up to the grandstand, fellow Federationists, passing the grandstand now and heading down the stretch, following that van that's out in front of him. Well, there comes the first box, and now let's see if Mark avoids it. He does! He passes it on the right.
The third box is out, the fourth box is out, and he's perfectly making his way between the two. He's closing in on the B to make the move back. Well, this is what it's all about—this kind of dynamic display of capacity and ingenuity!
He's approaching the end of the run and makes his way in between the barrels that are set up there. I'm so happy! I'm so glad Mark's going to give me a ride back to the hotel!
Since we started this project, we've been getting hundreds of letters, emails, and phone calls from people from all around the world—letters thanking us. But sometimes you also get funny letters like this one.
Now I finally understand why there's Braille on a drive-up ATM machine, right? But sometimes I also get what I wouldn't call hate mail, but letters of really strong concern: "Dr. Hong, are you insane, trying to put blind people on the road? You must be out of your mind!"
But this vehicle is a prototype, and it's not going to be on the road until it's proven as safe as, or safer than, today's vehicles. I truly believe this can happen, but still, will society accept such a radical idea?
How are we going to handle insurance? How are we going to issue driver licenses? There are many different hurdles, besides technology challenges, that we need to address before this becomes a reality.
Of course, the main goal of this project is to develop a car for the blind, but potentially more important than this is the tremendous value of the spin-off technologies that can come from this project.
The sensors we use can see through dark, fog, and rain. Together with these new types of interfaces, we can apply these technologies to safer cars for sighted people or, for the blind, to everyday home appliances, in educational settings, in office settings.
Just imagine in a classroom a teacher writes on the blackboard, and a blind student can see what's written and read using these non-visual interfaces. This is priceless!
The things I've shown you today are just the beginning. Thank you very much.