Professor: Dr. Hubert Bray




In addition to the technologies described above, Google is also attempting to implement a new technique known as vehicular communication. Vehicular communication is a method that will allow cars on the road to share information with one another through cloud servers. Vehicles would then be able to pull this information down from the server and gain a better understanding of traffic flow in certain areas, which would ultimately help them determine the most efficient (quickest and safest) path from one place to another.
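The sketch below illustrates this idea in miniature: vehicles report local speed observations to a shared server, and another vehicle pulls the aggregated view back down to pick the faster route. The CloudTrafficServer class, the road-segment names, and the speed figures are invented for illustration only; they do not describe Google's actual design.

```python
# Toy sketch of vehicular communication through a cloud server.
# CloudTrafficServer, the segment names, and the speeds are all invented
# for illustration; they do not describe Google's real system.
from collections import defaultdict


class CloudTrafficServer:
    """Stands in for a real cloud service; tracks average speed per road segment."""

    def __init__(self):
        self._speeds = defaultdict(list)

    def report(self, segment, speed_kmh):
        """A vehicle uploads the speed it currently observes on a segment."""
        self._speeds[segment].append(speed_kmh)

    def average_speed(self, segment):
        """Another vehicle pulls the aggregated view back down."""
        samples = self._speeds[segment]
        return sum(samples) / len(samples) if samples else 0.0


cloud = CloudTrafficServer()

# Several vehicles report what they see on two possible routes.
cloud.report("I-85 North", 25.0)
cloud.report("I-85 North", 30.0)
cloud.report("US-15 North", 90.0)

# A different vehicle chooses the segment that is currently flowing fastest.
routes = ["I-85 North", "US-15 North"]
best = max(routes, key=cloud.average_speed)
print("Chosen route:", best)  # -> US-15 North
```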



Tesla:

Computer Vision:7

As mentioned above, Tesla and Google use different mechanisms to create a visual representation of a car’s environment. Both systems rely heavily on sensors; the main difference, as noted above, is Google’s use of LIDAR. Tesla’s autonomous system (known as the Tesla Autopilot system) is composed of multiple sensors placed around the car to help it understand its environment. The image below shows how a driver sees the Autopilot system:

In terms of hardware, each Tesla vehicle currently “includes a forward radar, a forward-looking camera, a high-precision digitally-controlled electric assist braking system, and 12 long-range ultrasonic sensors placed around the car” (Thompson). These ultrasonic sensors are strategically placed around the vehicle so that the car can sense when something is too close and gauge the appropriate distance at which to slow down. One important detail to understand about these sensors is that they can report inaccurate information when something interferes with them. We saw this in the fatal Tesla accident, where a man died because of an “issue with the autopilot system.” While the exact cause is unknown, many experts believe that the brightness of the sun caused the sensors/radar to mistake the truck for an overhead road sign because of its height. While this is a problem, we also need to remember that truly autonomous cars do not yet exist (no car is at SAE level 5 right now), so a driver using Tesla’s Autopilot needs to be ready to take over at all times. Going back to the sensors, the image below shows the “circle” of coverage created by the sensors placed around the car:
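As a rough sketch of the proximity behavior just described, the snippet below checks a set of ultrasonic distance readings against two thresholds and maps the closest reading to a simple driving response. The sensor names, threshold values, and responses are assumptions made for illustration; Tesla's actual control logic is not public.

```python
# Rough sketch of ultrasonic proximity handling. The thresholds, sensor
# names, and responses are invented for illustration; this is not Tesla's
# actual Autopilot logic.

SAFE_DISTANCE_M = 2.0        # assumed clearance below which the car slows down
HARD_BRAKE_DISTANCE_M = 0.5  # assumed clearance below which the car brakes hard


def proximity_response(sensor_readings_m):
    """Map the closest ultrasonic reading (in meters) to a simple response."""
    closest = min(sensor_readings_m.values())
    if closest < HARD_BRAKE_DISTANCE_M:
        return "brake_hard"
    if closest < SAFE_DISTANCE_M:
        return "slow_down"
    return "maintain_speed"


# Twelve sensors ringing the car, each reporting a distance in meters.
distances = [5.0, 4.2, 3.8, 6.1, 1.4, 5.5, 7.0, 6.6, 5.9, 4.8, 3.3, 5.2]
readings = {f"sensor_{i}": d for i, d in enumerate(distances)}
print(proximity_response(readings))  # -> slow_down (sensor_4 reads 1.4 m)
```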


Tesla’s Autopilot system ultimately takes the data gathered by the sensors and cameras and creates a digital representation of the car’s surroundings (both stationary and moving objects). The image below shows an example of what this digital representation looks like:



As you can see from the image, the digital representation is quite similar to the one created by Google’s car, which emphasizes that, despite using different methods to learn, both autonomous systems produce similar graphics to display the information they have learned.
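To make the idea of a digital representation concrete, the sketch below places a few detected objects onto a small 2-D occupancy grid centered on the car. The grid size, object positions, and labels are all invented for illustration; neither company's real representation is shown here.

```python
# Minimal sketch of a "digital representation": detected objects placed on a
# small 2-D occupancy grid around the car. Grid size, detections, and labels
# are invented for illustration.

GRID_SIZE = 11   # 11 x 11 cells, car at the center
CELL_M = 1.0     # each cell covers one square meter


def build_grid(detections):
    """detections: list of (x_m, y_m, label) positions relative to the car."""
    grid = [["." for _ in range(GRID_SIZE)] for _ in range(GRID_SIZE)]
    center = GRID_SIZE // 2
    grid[center][center] = "C"  # the car itself
    for x, y, label in detections:
        col = center + int(round(x / CELL_M))
        row = center - int(round(y / CELL_M))  # +y means "ahead of the car"
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row][col] = label[0].upper()  # mark the cell with the object's initial
    return grid


detections = [(0, 4, "vehicle"), (-2, 1, "pedestrian"), (3, -2, "sign")]
for row in build_grid(detections):
    print(" ".join(row))
```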
Decision Making:

In addition to learning about its surroundings, one of the other important parts of an autonomous system is its decision-making ability. Currently, no car is truly autonomous, as they all still have drivers; in the near future, however, these cars will be required to decide whether to stop, slow down, or swerve in an emergency situation. While Tesla and Google have not yet fully developed autonomous cars, many people are already discussing the ethical dilemmas that will result from a computerized system making decisions about life or death. While there will be some ethical challenges, I believe they are being exaggerated because of how little most people know about these systems. One common misconception about the decision-making process is that the system will be a series of if/else statements (e.g., if a person is in front of the car, turn right). In reality, these machines will rely on machine learning and pattern recognition to make decisions, ultimately allowing them to mimic the decision-making process of a human (Galceran). By using these elements of AI, autonomous cars will be able to learn and analyze how “good” drivers drive and ultimately replicate their driving ability.

Once autonomous cars reach SAE level 5, the next issue is the “Trolley Problem.” Currently, engineers from both companies say they are teaching their systems to hit a stationary object whenever there is a chance of hitting a human being. While this is easy to program, the real problems occur when the system has to decide between hitting one person or another. Engineers have stated that, through machine learning, cars will be able to detect which path would lead to the least amount of distress and follow that path (Galceran). Overall, while these problems must be considered, one needs to realize two things: first, these hypothetical situations rarely occur in the real world, and second, a computer system will most likely make better decisions in an accident than a bad driver would.
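As a toy contrast between the if/else misconception and a learned policy, the sketch below hard-codes one rule and also fits a small decision-tree classifier to hand-made examples that mimic a “good” driver’s choices. The features, training data, and action labels are invented for illustration; neither company's real decision-making system looks like this.

```python
# Toy contrast: a hard-coded rule vs. a policy learned from examples.
# Features, training data, and action labels are invented for illustration;
# this is not Tesla's or Google's actual decision-making system.
from sklearn.tree import DecisionTreeClassifier


def rule_based_action(obstacle_distance_m, obstacle_is_person, left_lane_clear):
    """The if/else caricature described above."""
    if obstacle_is_person and obstacle_distance_m < 30:
        return "swerve_left" if left_lane_clear else "brake_hard"
    return "continue"


# Each row mimics a decision a "good" human driver made in a past situation.
# Features: [obstacle distance (m), obstacle is a person (0/1), left lane clear (0/1)]
X = [
    [120, 0, 1], [80, 0, 0], [25, 1, 1], [20, 1, 0],
    [40, 1, 1], [15, 0, 1], [60, 0, 1], [10, 1, 0],
]
y = [
    "continue", "continue", "swerve_left", "brake_hard",
    "brake_hard", "brake_hard", "continue", "brake_hard",
]

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# The learned model generalizes from examples instead of following a fixed rule.
print(rule_based_action(22, True, True))  # -> swerve_left (fixed rule)
print(model.predict([[22, 1, 1]])[0])     # action inferred from the examples
```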

Economic Feasibility:

Currently, the safest small car to buy is a 2018 Honda Civic, which has an MSRP of $18,600. While Google has not yet released its car to the market, the current cost of the LIDAR unit alone is $75,000 per vehicle, already roughly four times the price of a Honda Civic (Stephen). When all other costs are added, the current price of a car from Google looks to be over $100,000, making it virtually unaffordable for almost all Americans (the average car budget for a person living in the U.S. is $17-$33K)8. Because of this steep price, Google has stated numerous times that the cost of LIDAR is going to drop significantly over the next decade, ultimately making the car more affordable. Tesla, on the other hand, is now producing cars with autonomous systems starting at an MSRP of $30,000, and it already has cars with elements of autonomous driving (SAE levels 2-4) that go for anywhere from $66,000 to $110,000. As one can see, it is far more affordable to buy a Honda Civic than an autonomous car; however, as technology continues to advance, the cost of manufacturing these cars will decrease.
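A quick back-of-the-envelope check of the price comparison above, using only the figures cited in this section:

```python
# Back-of-the-envelope check of the figures cited above.
civic_msrp = 18_600   # 2018 Honda Civic MSRP
lidar_cost = 75_000   # per-vehicle LIDAR cost cited for Google's car
tesla_low = 66_000    # low end of Tesla's current range with autonomous elements

print(f"LIDAR alone vs. Civic: {lidar_cost / civic_msrp:.1f}x")   # ~4.0x
print(f"Tesla low end vs. Civic: {tesla_low / civic_msrp:.1f}x")  # ~3.5x
```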



Conclusion/Future:

Overall, while we are still far away from having truly autonomous cars, we have made great strides in creating cars that have some elements of autonomous driving. Companies such as Google and Tesla are the frontrunners in this field, and other companies are trying to join the movement. Just recently, rumors that Apple was purchasing McLaren began to circulate, showing Apple's interest in joining the autonomous driving industry as well. Because there are so many different companies involved in this industry, there are a variety of approaches to creating an autonomous car. Common elements found in all of the cars developed up to this point include computer vision, the use of a variety of sensors and cameras, and forms of neural networks. Decision making is still an unclear process, as fully autonomous cars (SAE level 5) have not yet been released to the public; however, the use of artificial-intelligence learning methods looks to be an efficient way for these systems to make decisions. Overall, I believe that over the next few decades autonomous cars will become cheaper and safer than the cars we have today and will ultimately help us drive into a safer future.


Works Cited

Galceran, Enric, Alexander Cunningham, Ryan Eustice, and Edwin Olson. "Multipolicy Decision-Making for Autonomous Driving via Changepoint-based Behavior Prediction." Robotics: Science and Systems XI (2015): n. pag. Web. 1 Nov. 2016.

Gerla, Mario. "Internet of Vehicles: From Intelligent Grid to Autonomous Cars and Vehicular Clouds." (2014): n. pag. Web. 1 Nov. 2016.

"How an Autonomous Car Gets Around." The New York Times. The New York Times, 25 Oct. 2012. Web. 01 Nov. 2016.

Nicholls, Keith W. "High-Latitude Oceanography Using the Autosub Autonomous Underwater Vehicle." Limnology and Oceanography 53.5, Part 2. Autonomous and Lagrangian Platforms and Sensors (ALPS) (2008): 2309-320. Web. 1 Nov. 2016.

"Road Crash Statistics." Road Crash Statistics. N.p., n.d. Web. 01 Nov. 2016.

Stephen. "Elon Musk Says That the LIDAR Google Uses in Its Self-driving Car ‘doesn’t Make Sense in a Car Context’." 9to5Google. N.p., 17 Oct. 2015. Web. 01 Nov. 2016.

@teslamotors. "Tesla Press Information." Press Kit. N.p., n.d. Web. 01 Nov. 2016.

Thompson, Cadie. "Here's How Tesla's Autopilot Works." Business Insider. Business Insider, Inc, 01 July 2016. Web. 01 Nov. 2016.

Vanderbilt, Tom. "Autonomous Cars Through the Ages." Wired.com. Conde Nast Digital, n.d. Web. 01 Nov. 2016.

"What Is LIDAR." US Department of Commerce, National Oceanic and Atmospheric Administration. N.p., n.d. Web. 01 Nov. 2016.

"Where To? A History of Autonomous Vehicles." A History of Autonomous Vehicles. N.p., n.d. Web. 01 Nov. 2016.

Images:


SAE Levels Diagram:

http://www.sae.org/misc/pdfs/automated_driving.pdf


Tesla Images:

https://www.tesla.com/


Google LIDAR Diagram:

http://www.nytimes.com/interactive/2012/10/28/automobiles/how-an-autonomous-car-gets-around.html





1 Cars are able to stop by themselves when they are near an object and the human driver is not braking

2 Statistics taken from ASIRT

3 Referring to the different devices used to guide cars up to this point

4 Neural Networks are essentially computerized systems that model the brain/CNS

5 SAE LEVELS are explained in the next section

6 Both companies use different methods to create “Computer Vision”. Computer Vision is the process of creating digital data from the real world (analog data).

7 Most of the information listed was accessible on Tesla’s website

8 Figures reported by multiple financial magazines

