Research article
First published online October 1, 2018

Application and Evaluation of the Reinforcement Learning Approach to Eco-Driving at Intersections under Infrastructure-to-Vehicle Communications

Abstract

Eco-driving behavior can improve vehicles’ fuel efficiency and minimize exhaust emissions, especially in the presence of infrastructure-to-vehicle (I2V) communications for connected vehicles. Several techniques, such as dynamic programming and neural networks, have been proposed to study eco-driving behavior. However, most require a complicated problem-solving process and cannot be applied to dynamic traffic conditions. By contrast, reinforcement learning (RL) offers great potential for self-learning in a complicated environment, seeking the optimal mapping between traffic conditions and the corresponding optimal control action of a vehicle. In this paper, a vehicle was treated as an agent that selects its maneuver, that is, acceleration, cruising, or deceleration, according to dynamic conditions while approaching a signalized intersection equipped with I2V communication. An improved cellular automaton model was used as the simulation platform. Three parameters, the distance between the vehicle and the intersection, the signal status, and the instantaneous vehicle speed, were selected to characterize the real-time traffic state. The total CO2 emitted by the vehicle on the approach to the intersection served as the reward signal, informing the vehicle how good its operation was. The Q-learning algorithm was used to optimize vehicle driving behavior for eco-driving. Vehicle exhaust emissions and traffic performance (travel time, stop duration, and stop rate) were evaluated in two cases: (1) an isolated intersection, and (2) a medium-scale realistic network. Simulation results showed that the eco-driving behavior obtained by RL not only reduces emissions but also improves traffic performance.
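The Q-learning formulation the abstract describes (state: distance to the stop line, signal status, and speed; actions: accelerate, cruise, decelerate; reward: negative emissions on the approach) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the discretization, the toy emission proxy, the fixed signal timing, and all hyperparameter values are hypothetical stand-ins, not the paper's actual cellular automaton platform or its MOVES-based CO2 model.

```python
import random
from collections import defaultdict

# Hypothetical discretization; the paper's actual state space and
# emission model are not reproduced here.
ACTIONS = (-1, 0, 1)                 # decelerate, cruise, accelerate (cells/step^2)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

Q = defaultdict(float)               # Q[(state, action)] -> value, default 0.0

def emission_proxy(speed, accel):
    """Toy stand-in for a CO2 model: idling and hard acceleration cost more."""
    return 1.0 + 0.5 * max(accel, 0) * speed + (2.0 if speed == 0 else 0.0)

def choose_action(state):
    """Epsilon-greedy action selection over the current Q estimates."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def step(dist, speed, green, action, v_max=5):
    """One simulation step: apply the maneuver, forbid crossing on red."""
    speed = min(max(speed + action, 0), v_max)
    if not green and dist - speed <= 0:   # red light: forced stop at stop line
        speed = 0
    dist = max(dist - speed, 0)
    return dist, speed

def train(episodes=2000, horizon=60, cycle=30):
    """Q-learning over repeated approaches to a fixed-time signal."""
    for _ in range(episodes):
        dist, speed, t = 20, 2, random.randrange(2 * cycle)
        for _ in range(horizon):
            green = (t % (2 * cycle)) < cycle
            state = (dist, speed, green)
            action = choose_action(state)
            dist2, speed2 = step(dist, speed, green, action)
            reward = -emission_proxy(speed2, action)   # minimize emissions
            t += 1
            green2 = (t % (2 * cycle)) < cycle
            next_state = (dist2, speed2, green2)
            best_next = max(Q[(next_state, a)] for a in ACTIONS)
            Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                           - Q[(state, action)])
            dist, speed = dist2, speed2
            if dist == 0:                  # crossed the intersection
                break
    return Q
```

After training, a greedy policy is read off as the argmax over actions at each state. The paper's version additionally embeds the agent in an improved cellular automaton network and scores reward with a CO2 estimate rather than this proxy.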



Information, rights and permissions

Published In

Issue published: December 2018

Rights and permissions

© National Academy of Sciences: Transportation Research Board 2018.

Authors

Affiliations

Junqing Shi
Department of Traffic and Transportation, College of Engineering, Zhejiang Normal University, Jinhua, Zhejiang, China
Fengxiang Qiao
Innovative Transportation Research Institute, Texas Southern University, Houston, TX
Qing Li
Innovative Transportation Research Institute, Texas Southern University, Houston, TX
Lei Yu
Yangtze River Scholar and Adjunct Professor, Xuchang University and Beijing Jiaotong University
Texas Southern University, Houston, TX
College of Engineering, Zhejiang Normal University, Jinhua, Zhejiang, China

Notes

Address correspondence to Fengxiang Qiao: [email protected]

Author Contributions

The authors confirm contribution to the paper as follows: study conception and design: Junqing Shi, Fengxiang Qiao, Lei Yu; data collection: Qing Li; analysis and interpretation of results: Qing Li, Junqing Shi, Yongju Hu; draft manuscript preparation: Lei Yu, Fengxiang Qiao, Junqing Shi, Qing Li. All authors reviewed the results and approved the final version of the manuscript.

This article was published in Transportation Research Record: Journal of the Transportation Research Board.
