
Autonomous electric shuttle bus service debuts in Las Vegas

The city of Las Vegas is partnering with Navya, a French company that builds autonomous electric shuttle buses, and Keolis, a public transportation company, to offer a driverless shuttle bus pilot program. The bus will serve the newly created Innovation District along the East Fremont Street corridor downtown. This will be the first autonomous vehicle program integrated into a public transportation system anywhere in the U.S.

“The city of Las Vegas is on the cutting edge of technology and innovation, building on the strong tech foundation provided by Switch and Nellis Air Force Base,” Mayor Carolyn Goodman said. “The city of Las Vegas is making major investments to improve motorist mobility and safety, provide reliable transportation choices and showcase technology in downtown Las Vegas, and we are thrilled Keolis is taking us a step in that direction.”

“This pilot (program) marks an important milestone in bringing us closer to our shared vision of delivering public mobility solutions that are both connected and sustainable. We are pleased to be showcasing this exciting new technology here in Las Vegas with our partners at the Regional Transportation Commission of Southern Nevada as we work together to deliver energy-efficient, sustainable transit solutions that will improve the quality of life for people in Nevada and around the world,” said Clement Michel, CEO of Keolis North America, operator of the RTC’s transit system along the Las Vegas Strip and the southern part of the Las Vegas Valley.

The bus route can be premapped and preprogrammed through the Global Positioning System (GPS) on the ARMA shuttle bus to follow a set of waypoints. If an obstacle crosses in front of the moving bus or blocks its path, lidar and other sensors detect and classify it; the shuttle then stops for a moving obstacle or plots an alternate route around a stationary one.
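
In control-loop terms, the behavior described above amounts to a waypoint iterator with a sensor check at every step. The Python sketch below is purely illustrative; the route coordinates and the "drive_route" and "sense" functions are hypothetical stand-ins, not Navya's actual software.

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float

def drive_route(route, sense):
    """Visit each premapped waypoint in order.

    sense() returns "clear", "moving" or "stationary", standing in for
    the lidar and other sensors described above.
    """
    for wp in route:
        status = sense()
        if status == "moving":
            # Stop for a moving person or vehicle. A real controller would
            # wait until clear and retry; this sketch just logs and moves on.
            print(f"Stopping for moving obstacle before ({wp.lat}, {wp.lon})")
            continue
        if status == "stationary":
            # A parked object: plan a detour around it.
            print(f"Stationary obstacle: detouring around ({wp.lat}, {wp.lon})")
        print(f"Arrived at waypoint ({wp.lat}, {wp.lon})")

# Toy route with made-up downtown coordinates.
route = [Waypoint(36.1672, -115.1330), Waypoint(36.1670, -115.1300)]
drive_route(route, sense=lambda: "clear")
```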

Other drivers and pedestrians can recognize an autonomous vehicle on public Nevada roads by a red DMV license plate that features a sideways “infinity” symbol.

Top speed is 45 kilometers per hour (about 28 mph), but the average cruising speed of the ARMA with passengers is about 25 km/h (roughly 15 mph).

The 32-kilowatt-hour battery pack and electric drive motor give the shuttle about 10 to 12 hours of operation before it needs to recharge. When the battery's capacity drops below a preset threshold, the ARMA "deadheads" back to its overnight parking space and parks over a wireless inductive charging pad for about five hours. No one has to plug the vehicle in each night.
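
As a rough illustration, the recharging behavior described above reduces to a simple decision rule. The threshold value below is an assumption invented for this sketch; Navya has not published the actual figure.

```python
BATTERY_KWH = 32.0                # pack size, per the article
DEADHEAD_THRESHOLD_KWH = 4.0      # assumed cutoff, not a published spec

def next_action(remaining_kwh, at_depot):
    """Pick the shuttle's next action from its battery state."""
    if remaining_kwh <= DEADHEAD_THRESHOLD_KWH and not at_depot:
        return "deadhead"             # drive empty back to the depot
    if at_depot and remaining_kwh < BATTERY_KWH:
        return "inductive_charge"     # park over the wireless pad (~5 hours)
    return "in_service"

print(next_action(remaining_kwh=3.2, at_depot=False))  # deadhead
print(next_action(remaining_kwh=3.2, at_depot=True))   # inductive_charge
```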

The next morning, after the battery pack is topped off, the ARMA can find its way back to its scheduled route for the next day.

Navya also exhibited its ARMA technology to attendees at CES, held Jan. 5-8 in Las Vegas, highlighting the next steps in the evolution of the cognitive car.

Eight passengers at a time could board the autonomous shuttle as it followed a preprogrammed path for a round trip of about 400 yards within the Gold parking lot across from the North Hall of the Las Vegas Convention Center.

More information about Navya ARMA autonomous shuttle bus services can be found at http://navya.tech/?lang=en.

Many automotive manufacturers at CES 2017 were showing concept cars that demonstrated each company’s unique vision of how emerging technologies for drivetrain electrification, telecommunications and artificial intelligence could be applied to driverless transportation. North Hall exhibits at the Las Vegas Convention Center included the Chrysler Portal, Volkswagen ID, Toyota Concept-i, Honda NeuV and Mercedes Vision Van. Autonomous, self-driving versions of the Nissan Leaf, Tesla Motors Model S with Autopilot, Audi Q7 and Hyundai Ioniq Electric were also showcased.

Today’s vehicles are supercomputers on wheels, with hundreds of microprocessor systems working in parallel to process millions of lines of software code that continuously take in sensor data, then make instantaneous decisions that control multiple actuators while the automobile is moving.

Some of today’s smart machines are developing an artificial intelligence that is no longer programmed by humans in the conventional sense but is actually “learned” through a combination of sensor inputs, super-fast telecommunication connections, high-speed processing chips, lots of memory storage and deep neural networks.

An AI deep-learning system continuously filters the incoming data delivered by its sensors through decision trees with as many as a thousand levels of hierarchy, then loops the results back as feedback that fine-tunes the filtering after each iteration. The hoped-for result is a rudimentary awareness of a specific event or problem happening in real time, with the capability to make nuanced adjustments that continually refine sensor perception within an ever-changing environment.
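
The looped-feedback idea can be seen in miniature below: a tiny linear "filter" is nudged after every pass so its output better matches a target signal. Production systems use deep neural networks rather than one weight vector, but the adjust-after-each-iteration structure is the same.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))           # stand-in for incoming sensor data
true_w = np.array([0.5, -1.0, 2.0, 0.3])
y = X @ true_w                          # the signal the filter should recover

w = np.zeros(4)                         # the filter starts knowing nothing
for step in range(500):
    error = X @ w - y                   # how wrong the current filter is
    w -= 0.01 * (X.T @ error) / len(y)  # looped feedback: a small correction
print(np.round(w, 2))                   # converges toward [0.5, -1.0, 2.0, 0.3]
```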

Ideally, an AI machine not only senses its environment but can also perceive and understand its context over time.

SAE International has defined six levels of driving automation, from level 0, where the driver operates the vehicle exclusively, through level 3, where the vehicle drives itself under limited conditions but hands control back to the driver when it senses a problem it cannot handle. Level 5 is complete autonomy, where the vehicle handles all driving operations without any intervention from its passengers.
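
For reference, the six SAE J3016 levels can be written out as a simple enumeration; the level names below follow the SAE standard.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # the human does all the driving
    DRIVER_ASSISTANCE = 1       # steering OR speed assist, one at a time
    PARTIAL_AUTOMATION = 2      # steering AND speed, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives, hands back when it must
    HIGH_AUTOMATION = 4         # no handback needed within its design domain
    FULL_AUTOMATION = 5         # drives everywhere a human could

print(list(SAELevel))  # all six levels, 0 through 5
```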

Nvidia CEO Jen-Hsun Huang delivered the first keynote address of CES 2017 by talking about his company’s Drive PX platform and DriveWorks software, which integrate a wide range of sensors into unified processing systems, then blend them with multiple deep neural networks that continue to learn and adjust to different driving environments.
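
Conceptually, that fuse-then-blend pipeline looks something like the toy sketch below. None of these names come from DriveWorks; they are stand-ins for the general idea of concatenating per-sensor features and averaging the outputs of several networks.

```python
import numpy as np

def fuse(camera, radar, lidar):
    """Concatenate per-sensor feature vectors into one input."""
    return np.concatenate([camera, radar, lidar])

def tiny_net(weights, x):
    """A one-layer stand-in for a deep neural network."""
    return np.tanh(weights @ x)

rng = np.random.default_rng(1)
x = fuse(rng.normal(size=8), rng.normal(size=4), rng.normal(size=16))
nets = [rng.normal(size=(3, x.size)) for _ in range(4)]   # four "networks"
blended = np.mean([tiny_net(w, x) for w in nets], axis=0)
print(blended)  # a fused, blended 3-value estimate (e.g., steer/brake/throttle)
```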

“What used to be science fiction is going to become reality in the coming years,” Huang said. “By applying this technology, we can revolutionize the automobile.”

The Drive PX platform is powered by the company’s new Xavier hardware, a dedicated AI “supercomputer on a chip” that delivers 30 trillion computing operations per second while consuming just 30 watts of power, or one trillion operations per second per watt.

The DriveWorks software includes an AI Co-Pilot application that can be tailored to understand the world around and inside an automobile, using both external and in-cabin sensors coupled with deep learning.

Once an AI system has learned a user’s voice and facial features well enough to recognize each person interacting with it, new automotive applications become possible.

While the vehicle is moving, an internal camera system can monitor the driver’s gaze, head movement and facial expressions, sounding an alert if the driver’s eyes seem to be getting sleepy, or reading the driver’s lips to complement the voice recognition system when background music or conversation drowns out speech.
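
One common way a camera system decides that eyes "seem to be getting sleepy" is the eye aspect ratio (EAR), a standard measure in the driver-monitoring literature. The sketch below illustrates that generic technique, not Nvidia's AI Co-Pilot; the 0.2 threshold and two-second window are typical assumed values.

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye; low EAR = eyelid closing."""
    d = math.dist
    vertical = d(eye[1], eye[5]) + d(eye[2], eye[4])
    return vertical / (2.0 * d(eye[0], eye[3]))

def drowsy(ear_per_frame, threshold=0.2, consecutive=48):
    """Alert if EAR stays below threshold for ~2 seconds at 24 fps."""
    run = 0
    for ear in ear_per_frame:
        run = run + 1 if ear < threshold else 0
        if run >= consecutive:
            return True
    return False

print(drowsy([0.12] * 60))  # True: eyes nearly closed for 60 straight frames
print(drowsy([0.30] * 60))  # False: eyes open
```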

Nvidia also partnered with Bosch, ZF Group and Audi to demonstrate self-driving, cognitive cars in the Gold parking lot of the Las Vegas Convention Center during CES 2017. The companies hope to make AI-equipped, self-driving cars more widespread and accessible to consumers by 2020.
